This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →


bug: function calling output is not captured with streaming #1943

Closed
DanrForetellix opened this issue May 1, 2024 · 4 comments

@DanrForetellix

Describe the bug

When using function calling together with streaming, the output is not captured and tokens are not counted.

To reproduce

from langfuse import Langfuse
from langfuse.decorators import observe
from langfuse.openai import openai  # OpenAI integration
from openai import OpenAI

client = OpenAI()

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    }
                },
                "required": ["location"],
            },
        },
    }
]

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the weather in Boston?"},
]

@observe()
def get_openai_response(tools):
    return client.chat.completions.create(
        model="gpt-3.5-turbo-16k",
        tools=tools,
        tool_choice="auto",
        messages=messages,
        stream=True,
    )

@observe()
def main():
    num_chunk = 0
    for chunk in get_openai_response(tools):
        num_chunk += 1
    print(f"num chunk = {num_chunk}")

main()
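Until this is supported, the streamed tool-call deltas can be aggregated manually before logging them. The sketch below shows the general aggregation pattern only: `aggregate_tool_calls` is a hypothetical helper (not part of Langfuse or OpenAI), and the dataclasses are simplified stand-ins for the `ChoiceDeltaToolCall` objects that OpenAI's streaming API yields, where the function name typically arrives in the first fragment and the JSON arguments arrive as string pieces across later fragments.

```python
from dataclasses import dataclass, field
from typing import Optional

# Simplified stand-ins for OpenAI's streamed tool-call delta objects.
@dataclass
class FunctionDelta:
    name: Optional[str] = None
    arguments: Optional[str] = None

@dataclass
class ToolCallDelta:
    index: int
    id: Optional[str] = None
    function: FunctionDelta = field(default_factory=FunctionDelta)

def aggregate_tool_calls(chunks):
    """Merge per-chunk tool-call fragments into complete calls, keyed by index."""
    calls = {}
    for tool_call_deltas in chunks:
        for tc in tool_call_deltas:
            entry = calls.setdefault(
                tc.index, {"id": None, "name": None, "arguments": ""}
            )
            if tc.id:
                entry["id"] = tc.id
            if tc.function.name:
                entry["name"] = tc.function.name
            if tc.function.arguments:
                # Argument JSON arrives as string fragments; concatenate them.
                entry["arguments"] += tc.function.arguments
    return [calls[i] for i in sorted(calls)]

# Simulated stream: name first, then argument fragments.
chunks = [
    [ToolCallDelta(0, id="call_1", function=FunctionDelta(name="get_current_weather"))],
    [ToolCallDelta(0, function=FunctionDelta(arguments='{"location": '))],
    [ToolCallDelta(0, function=FunctionDelta(arguments='"Boston, MA"}'))],
]
result = aggregate_tool_calls(chunks)
print(result)
```

The aggregated result (name plus complete argument string) is what one would pass to a tracing call once the stream is exhausted.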

Additional information

No response

@marcklingen
Member

Thanks for reporting!

@hassiebp
Contributor

Thanks for reporting, @DanrForetellix. This is a known limitation (tool calls combined with streaming). Out of curiosity, why do you choose to stream the rather concise function call result?

@DanrForetellix
Author

DanrForetellix commented May 14, 2024 via email

@hassiebp
Contributor

Thanks for the explanation, @DanrForetellix! I'll convert this issue to a discussion so that we can gauge interest from other users in supporting this use case, which will help us prioritize building a solution for it.

@langfuse langfuse locked and limited conversation to collaborators May 14, 2024
@hassiebp hassiebp converted this issue into discussion #2055 May 14, 2024



3 participants