This issue was moved to a discussion.
You can continue the conversation there. Go to discussion →
bug: function calling output is not captured with streaming #1943
Describe the bug
When using function calling together with streaming, the tool-call output is not captured in the trace and tokens aren't counted.
To reproduce
from langfuse import Langfuse
from langfuse.decorators import observe
from langfuse.openai import openai  # OpenAI integration
from openai import OpenAI

client = OpenAI()

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    }
                },
                "required": ["location"],
            },
        },
    }
]

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the weather in boston"},
]

@observe()
def get_openai_response(tools):
    return client.chat.completions.create(
        model="gpt-3.5-turbo-16k",
        tools=tools,
        tool_choice="auto",
        messages=messages,
        stream=True,
    )

@observe()
def main():
    num_chunk = 0
    for chunk in get_openai_response(tools):
        num_chunk += 1
    print(f"num chunk = {num_chunk}")

main()
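For context on why the output goes missing: with `stream=True`, the tool-call name and arguments arrive as partial deltas spread across many chunks, so any integration that only inspects individual chunks (or `message.content`) sees nothing complete to capture. Below is a minimal, self-contained sketch of the accumulation that has to happen; the dict-shaped chunks are a stand-in for the SDK's `ChoiceDeltaToolCall` objects, not real API output.

```python
def accumulate_tool_calls(chunks):
    """Merge streamed tool-call deltas into complete calls, keyed by index."""
    calls = {}  # index -> {"name": str, "arguments": str}
    for chunk in chunks:
        for tc in chunk.get("tool_calls") or []:
            slot = calls.setdefault(tc["index"], {"name": "", "arguments": ""})
            if tc.get("name"):
                slot["name"] = tc["name"]  # name arrives once, in the first delta
            if tc.get("arguments"):
                slot["arguments"] += tc["arguments"]  # arguments arrive in pieces
    return calls

# Simulated stream: one tool call whose JSON arguments are split across chunks.
stream = [
    {"tool_calls": [{"index": 0, "name": "get_current_weather", "arguments": ""}]},
    {"tool_calls": [{"index": 0, "arguments": '{"location": '}]},
    {"tool_calls": [{"index": 0, "arguments": '"Boston, MA"}'}]},
    {"tool_calls": None},  # final chunk carries no delta
]

print(accumulate_tool_calls(stream))
# → {0: {'name': 'get_current_weather', 'arguments': '{"location": "Boston, MA"}'}}
```

If the wrapper only concatenates `delta.content` (which is `None` for tool calls), the accumulated arguments above are exactly what gets dropped from the trace.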
Additional information
No response