Replies: 1 comment
Turning this into an issue, thanks for reporting! Feel free to subscribe to the issue to get notified when this gets fixed.
I am running something very straightforward, but the chat completion information does not show up in Langfuse when I call invoke/ainvoke/run etc.; it only appears when I stream. I don't think this should be the case. I would appreciate any insight into what I am doing wrong. I will provide reproducible code below.
Environment:
Here is the code I am running:
The code runs and I get the output, but I also get the following warning:
In Langfuse, we see that the token information and the output from the LLM are not being traced.
Where am I going wrong?
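For reference, here is a minimal sketch of the kind of setup being described: a LangChain chain traced through the Langfuse callback handler on a non-streaming `invoke` call. This is not the poster's actual code; the model name, prompt, environment variable names, and the v2-style `langfuse.callback` import path are assumptions.

```python
# Minimal sketch (not the poster's actual code): a LangChain chat model traced via
# the Langfuse callback handler. Keys, host, and model name are placeholders.
import os

from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langfuse.callback import CallbackHandler  # v2-style import; newer SDKs may differ

# Credentials are passed explicitly here; they can also be picked up from the
# LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY / LANGFUSE_HOST environment variables.
langfuse_handler = CallbackHandler(
    public_key=os.environ["LANGFUSE_PUBLIC_KEY"],
    secret_key=os.environ["LANGFUSE_SECRET_KEY"],
    host=os.environ.get("LANGFUSE_HOST", "https://cloud.langfuse.com"),
)

prompt = ChatPromptTemplate.from_messages(
    [("system", "You are a helpful assistant."), ("human", "{question}")]
)
llm = ChatOpenAI(model="gpt-3.5-turbo")  # model name is an assumption
chain = prompt | llm

# Non-streaming call: the handler is attached per invocation via the config dict.
# Per the report above, the completion and token info show up in Langfuse when
# streaming, but not for invoke/ainvoke calls like this one.
result = chain.invoke(
    {"question": "What is Langfuse?"},
    config={"callbacks": [langfuse_handler]},
)
print(result.content)
```

Passing the handler per call like this is the usual pattern for non-streaming invocations, so if the resulting trace still lacks the completion output and token counts, that points toward the behavior tracked in the issue referenced in the reply above rather than the calling code.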