Can't see traces in the webUI if callback_handler passed to LLM init constructor #1864
Unanswered · gabrielfior asked this question in Support
Replies: 1 comment · 2 replies
-
The problem is with the CrewAI callback. Do the following.
Hope it helps; I am doing the same.
-
I went through the Langfuse docs on the Langchain integration, and I understand that passing the callback to `invoke`, as well as to `chain.run` and `predict`, sends traces to Langfuse. However, what I would like is to pass the callback handlers directly to the LLM init method (see code below) and have all subsequent traces logged to Langfuse. Please note the following:

-> If I execute methods like `llm.invoke`, I see the traces in the UI.
-> If I pass this LLM to a CrewAI Agent, which is then called within a CrewAI Crew, no traces are displayed in the web UI.

I expected this to work out of the box.
Guidance much appreciated!
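The behavior described above can be illustrated abstractly: a callback bound at construction time only fires along the code path that actually consults it, so if a wrapper framework drives the model through its own completion path, the bound handlers are never invoked. A minimal pure-Python sketch of this idea (the `ToyLLM` and `ToyAgent` classes are hypothetical stand-ins, not CrewAI or LangChain internals):

```python
class ToyLLM:
    """Toy model: callbacks bound at init fire only in invoke()."""

    def __init__(self, callbacks=None):
        self.callbacks = callbacks or []

    def _complete(self, prompt):
        # Raw completion path: no callback handling here.
        return f"echo: {prompt}"

    def invoke(self, prompt):
        # High-level path: notifies the constructor-bound callbacks.
        for cb in self.callbacks:
            cb(prompt)
        return self._complete(prompt)


class ToyAgent:
    """Toy framework wrapper that bypasses invoke()."""

    def __init__(self, llm):
        self.llm = llm

    def run(self, task):
        # Calls the raw completion path directly, so the
        # constructor-bound callbacks never fire.
        return self.llm._complete(task)


traces = []
llm = ToyLLM(callbacks=[traces.append])

llm.invoke("hello")            # traced
ToyAgent(llm).run("plan trip") # not traced

print(traces)  # → ['hello']
```

If CrewAI invokes the model through a path of its own rather than through the LangChain `invoke` machinery, the same effect would explain the missing traces; attaching the handler at the call site (or at whatever layer CrewAI actually executes) is then the workaround.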