🚀 Feature: re-write LlamaIndex instrumentation to use LlamaIndex CallbackManager #540

Open
1 task done
nirga opened this issue Feb 27, 2024 · 5 comments
Labels: help wanted (Extra attention is needed)

Comments

nirga (Member) commented Feb 27, 2024

Which component is this feature for?

LlamaIndex Instrumentation

🔖 Feature description

Right now, we monkey-patch classes and methods in LlamaIndex, which requires endless work and constant maintenance. LlamaIndex has a callback system that could potentially be used to create and end spans without being too coupled to the framework's internal structure.

🎤 Why is this feature needed?

Support LlamaIndex entirely and remain future-proof against internal API changes.

✌️ How do you aim to achieve this?

Look into LlamaIndex callback_manager and how other frameworks are using it.
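
For reference, this is roughly how a handler gets attached to LlamaIndex's callback system (a minimal sketch assuming the llama_index 0.10+ package layout, with the built-in LlamaDebugHandler standing in for our own handler):

```python
from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager, LlamaDebugHandler

# Any handler registered on the global CallbackManager receives
# on_event_start/on_event_end calls for LLM, embedding, retrieval,
# synthesis, etc. events across the framework.
Settings.callback_manager = CallbackManager([LlamaDebugHandler()])
```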

🔄️ Additional Information

No response

👀 Have you spent some time to check if this feature request has been raised before?

  • I checked and didn't find a similar issue

Are you willing to submit a PR?

None

nirga added the help wanted label on Feb 27, 2024

varaarul commented Apr 3, 2024

I can start looking into this!


larinam commented Apr 30, 2024

@nirga, could you fix the link to the documentation in the description and give a hint as to which pieces of this project's code should be touched by this change?

nirga (Member, Author) commented May 2, 2024

Done @larinam!
Basically, right now we wrap LlamaIndex calls to log and create spans. This means we have to keep up with their APIs and update our instrumentations every time something changes on LlamaIndex's end.

A better solution would be to register ourselves with the callbacks of the LlamaIndex engine and simply open and close spans when we get called. This is much more robust and future-proof.
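
Something along these lines is what I have in mind (a rough sketch, not the final implementation; names like `SpanCallbackHandler` are placeholders): a handler that opens an OpenTelemetry span on each event start and ends it on the matching event end.

```python
from typing import Any, Dict, List, Optional

from opentelemetry import context as context_api, trace
from llama_index.core.callbacks.base_handler import BaseCallbackHandler
from llama_index.core.callbacks.schema import CBEventType


class SpanCallbackHandler(BaseCallbackHandler):  # hypothetical name
    def __init__(self) -> None:
        super().__init__(event_starts_to_ignore=[], event_ends_to_ignore=[])
        self._tracer = trace.get_tracer("llama_index.instrumentation")  # placeholder tracer name
        # Map LlamaIndex event ids to (span, context token) so the right span
        # can be closed when the matching on_event_end arrives.
        self._spans: Dict[str, Any] = {}

    def on_event_start(
        self,
        event_type: CBEventType,
        payload: Optional[Dict[str, Any]] = None,
        event_id: str = "",
        parent_id: str = "",
        **kwargs: Any,
    ) -> str:
        # Start a span named after the LlamaIndex event type and make it current.
        span = self._tracer.start_span(f"llama_index.{event_type.value}")
        token = context_api.attach(trace.set_span_in_context(span))
        self._spans[event_id] = (span, token)
        return event_id

    def on_event_end(
        self,
        event_type: CBEventType,
        payload: Optional[Dict[str, Any]] = None,
        event_id: str = "",
        **kwargs: Any,
    ) -> None:
        # End the span opened for this event, if any.
        span, token = self._spans.pop(event_id, (None, None))
        if span is not None:
            span.end()
            context_api.detach(token)

    # Traces group a whole query/ingestion run; a fuller implementation could
    # open a parent span here and use trace_map to rebuild the event hierarchy.
    def start_trace(self, trace_id: Optional[str] = None) -> None:
        pass

    def end_trace(
        self,
        trace_id: Optional[str] = None,
        trace_map: Optional[Dict[str, List[str]]] = None,
    ) -> None:
        pass
```

A real implementation would also map event payloads to span attributes (prompts, model names, token counts) the way the existing monkey-patched instrumentation does.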


larinam commented May 11, 2024

All right @nirga, thanks. But I'll try to ask my question one more time.
The project's code base is not small; could you give a hint as to which pieces of the code should be touched by this change? Could you attach a couple of links?

nirga (Member, Author) commented May 11, 2024

In our project it's just under the LlamaIndex instrumentation. Do you want to join our Slack and discuss it there? It might be easier :)
