Scoring a trace after the LLM chain returns #1610
-
Hello, I am trying to score a trace based on the output of a Langchain LLM chain. When I call score on the trace after the chain has returned, the score is not associated with the trace and does not show up in the web UI. However, if I call score on the trace before the LLM chain is invoked, the score does show up in the UI. How do I get around this? I am using the Python SDK like so: `trace = langfuse.trace(session_id=session_id, metadata=metadata, tags=tags)` followed by `result = llm_chain.invoke(chain_input, config={"callbacks": [handler]})`.
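For context, here is a minimal, runnable sketch of the setup described above, assuming the v2 Python SDK. The stand-in chain, the literal values, and the handler construction (which the original snippet does not show) are assumptions; only `langfuse.trace(...)`, the `callbacks` config, and scoring after `invoke` come from the snippet itself:

```python
from langchain_core.runnables import RunnableLambda
from langfuse import Langfuse

# Assumption: LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY / LANGFUSE_HOST are set
# in the environment; the client reads them automatically.
langfuse = Langfuse()

# Stand-in for the real LLM chain so the sketch runs without model credentials.
llm_chain = RunnableLambda(lambda x: x["question"].upper())

# Create the trace first, as in the original snippet.
trace = langfuse.trace(
    session_id="demo-session",
    metadata={"source": "sketch"},
    tags=["demo"],
)

# Assumption: the handler is derived from the trace so the chain run is nested
# under it (trace.get_langchain_handler() in the v2 SDK).
handler = trace.get_langchain_handler()

# Invoke the chain with the Langfuse callback handler attached.
result = llm_chain.invoke(
    {"question": "what is langfuse?"},
    config={"callbacks": [handler]},
)

# Scoring after the chain has returned - the call that did not show up in the UI.
trace.score(name="demo-score", value=1)
```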
-
Hi @pooja1423
I had a similar issue in a slightly different context. In general, Langfuse sends its network requests in the background so the current execution thread is not blocked, which gives better performance. To make sure all queued requests are actually sent before the application terminates, it is suggested to add
`langfuse.flush()`
at the end of the script or on shutdown. You can read more here. Hope this helps :)
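As a rough sketch of where the flush fits, assuming you reuse the same client that created the trace (registering it with `atexit` is just one option, not something the SDK requires):

```python
import atexit

from langfuse import Langfuse

langfuse = Langfuse()  # in practice, reuse the client that created the trace

# ... create traces, run chains, record scores ...

# Flush queued events (traces, spans, scores) before the process exits.
langfuse.flush()

# Alternatively, register the flush so it also runs on interpreter shutdown,
# e.g. when the script exits early because of an exception.
atexit.register(langfuse.flush)
```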