feat: Assistants streaming #737

Open · wants to merge 5 commits into master
Conversation

@coolbaluk commented May 8, 2024

Supersedes #731

Based on the original fork by @tanzyy96, just brought up to date.

We've been running this for a couple of weeks and it has served us well.

Example usage:

stream, err := client.CreateThreadAndStream(ctx, openai.CreateThreadAndRunRequest{
	RunRequest: openai.RunRequest{
		AssistantID: AssistantID,
	},
	Thread: openai.ThreadRequest{
		Messages: Messages,
	},
})
if err != nil {
	return err
}
defer stream.Close()

for {
	resp, err := stream.Recv()
	if errors.Is(err, io.EOF) {
		break
	}
	if err != nil {
		return err
	}
	_ = resp // handle each streamed event here (see the delta-reading sketch below)
}
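To actually surface the assistant's reply, each streamed event carries delta content. A minimal sketch of draining the stream, assuming the event shape used in the comments further down (Delta.Content with Text.Value) and the usual errors, io and fmt imports:

for {
	resp, err := stream.Recv()
	if errors.Is(err, io.EOF) {
		break // the run has finished streaming
	}
	if err != nil {
		return err
	}
	// Each event may carry zero or more text fragments; print them as they arrive.
	for _, content := range resp.Delta.Content {
		fmt.Print(content.Text.Value)
	}
}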
		

codecov bot commented May 8, 2024

Codecov Report

Attention: Patch coverage is 80.00000%, with 12 lines in your changes missing coverage. Please review.

Project coverage is 97.75%. Comparing base (774fc9d) to head (322bd92).
Report is 15 commits behind head on master.

Files     Patch %   Lines
run.go    80.00%    6 missing and 6 partials ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #737      +/-   ##
==========================================
- Coverage   98.46%   97.75%   -0.72%     
==========================================
  Files          24       24              
  Lines        1364     1200     -164     
==========================================
- Hits         1343     1173     -170     
  Misses         15       15              
- Partials        6       12       +6     


@opvexe commented May 11, 2024

May I ask a question?
How do I continue the conversation in the next turn using the last thread_id?
	cc := openai.NewClientWithConfig(config)

	stream, _ := cc.CreateRunStreaming(context.Background(), "thread_VEPTeIWj1umjdFyUd0Aj4lb2", openai.RunRequest{
		AssistantID: "asst_IuAlZbLkuIgcky26QPB2TQy0",
	})

	//stream, _ := cc.CreateThreadAndStream(context.Background(), openai.CreateThreadAndRunRequest{
	//	RunRequest: openai.RunRequest{
	//		AssistantID: "asst_IuAlZbLkuIgcky26QPB2TQy0",
	//	},
	//	Thread: openai.ThreadRequest{
	//		Messages: []openai.ThreadMessage{
	//			{
	//				Role:    openai.ThreadMessageRoleUser,
	//				Content: "What did I just ask?",
	//			},
	//		},
	//	},
	//})

	defer stream.Close()

	for {
		resp, err := stream.Recv()
		if errors.Is(err, io.EOF) {
			break
		}

		t.Log("thread_id", resp.ID)

		for _, content := range resp.Delta.Content {
			t.Log(content.Text.Value)
		}
	}

@tanzyy96 commented May 12, 2024

In reply to @opvexe's question above (how to continue the conversation on an existing thread_id):

There's an API for inserting a message into a thread. Example below:

_, err := a.client.CreateMessage(ctx, threadId, openai.MessageRequest{
    Role:    openai.ChatMessageRoleUser,
    Content: messageText,
})
if err != nil {
    logger.Error("failed to create message", zap.Error(err))
    return err
}

outStream, err := a.client.CreateRunStreaming(ctx, threadId, openai.RunRequest{
    AssistantID: assistantId,
})
if err != nil {
    logger.Error("failed to create streaming run", zap.Error(err))
    return err
}
defer outStream.Close()
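Putting the pieces together, here is a hedged sketch of a small helper that continues an existing thread and collects the streamed reply. continueThread is a hypothetical name; the client methods and field names are the ones used elsewhere in this thread, and the usual context, errors, io and strings imports are assumed:

// continueThread is a hypothetical helper: it appends a user message to an
// existing thread, starts a streaming run, and returns the assistant's reply
// assembled from the text deltas.
func continueThread(ctx context.Context, client *openai.Client, threadID, assistantID, messageText string) (string, error) {
	// Add the new user message to the existing thread.
	_, err := client.CreateMessage(ctx, threadID, openai.MessageRequest{
		Role:    openai.ChatMessageRoleUser,
		Content: messageText,
	})
	if err != nil {
		return "", err
	}

	// Start a streaming run on the same thread.
	stream, err := client.CreateRunStreaming(ctx, threadID, openai.RunRequest{
		AssistantID: assistantID,
	})
	if err != nil {
		return "", err
	}
	defer stream.Close()

	// Drain the stream, collecting the text deltas as they arrive.
	var reply strings.Builder
	for {
		resp, err := stream.Recv()
		if errors.Is(err, io.EOF) {
			break
		}
		if err != nil {
			return "", err
		}
		for _, content := range resp.Delta.Content {
			reply.WriteString(content.Text.Value)
		}
	}
	return reply.String(), nil
}

Usage would then be a single call, e.g. reply, err := continueThread(ctx, client, "thread_...", "asst_...", "What did I just ask?").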

@opvexe commented May 12, 2024

Thanks very much!

@liushaobo-maker

Has this been merged yet? I really need this right now.
