Error 400 when using OpenAI API in Automations #13678
Labels: `bb-automations` (Budibase Automations related work), `bug` (Something isn't working), `env - production` (Bug found in production), `low-hanging-fruit`
Discussed in #13592
Originally posted by fueledbyEmin on May 2, 2024
I'm using the tutorial at https://docs.budibase.com/docs/openai to send a question to the OpenAI API.
But I get Error 400:

```json
{ "success": false, "response": "Error: Request failed with status code 400" }
```
From the OpenAI forums, the problem can be in the JSON payload, which should include the `max_tokens` and `temperature` params. What's more, `max_tokens` and `temperature` should be set to specific values, for example `max_tokens=64` and `temperature=0.5`.

If Budibase correctly structures OpenAI API requests in the background, what could be the problem here?
I'm self-hosting using docker-compose.
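For reference, a well-formed Chat Completions request body including those params would look like the sketch below. The model name, prompt, and parameter values are illustrative assumptions drawn from the forum advice above, not what Budibase actually sends internally:

```python
import json

# Minimal sketch of a body for POST https://api.openai.com/v1/chat/completions.
# Model name and prompt are placeholders; max_tokens/temperature use the
# values suggested on the OpenAI forums.
payload = {
    "model": "gpt-3.5-turbo",  # a wrong/retired model name can cause 400/404
    "messages": [
        {"role": "user", "content": "What is Budibase?"}
    ],
    "max_tokens": 64,
    "temperature": 0.5,
}

# Serialize to the JSON string that would be sent as the request body.
body = json.dumps(payload)
print(body)
```

If a request with this shape succeeds when sent directly (e.g. via curl) but fails through the automation, the problem is likely in how Budibase builds the payload rather than in the account or key.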
CSE Team findings

- GPT-3.5 Turbo returns a `400` response
- GPT-4 returns a `404` response

Possibly the internal model name has been deprecated in the API and may need to be updated.
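If the deprecated-model theory is right, the API's error body usually says so explicitly. A hedged sketch of surfacing that detail instead of collapsing it into "Request failed with status code 400" (the `{"error": {...}}` shape is OpenAI's documented error format; the sample body below is illustrative, not taken from Budibase's logs):

```python
import json

def describe_openai_error(status_code: int, body: str) -> str:
    """Summarize an OpenAI API error response in one readable line.

    Falls back to the raw body if it is not the expected JSON shape.
    """
    try:
        err = json.loads(body).get("error", {})
    except json.JSONDecodeError:
        return f"HTTP {status_code}: {body[:100]}"
    code = err.get("code") or err.get("type") or "unknown"
    return f"HTTP {status_code} ({code}): {err.get('message', 'no message')}"

# Illustrative 404 body, similar to what a retired model name would produce.
sample = ('{"error": {"message": "The model `gpt-4` does not exist", '
          '"type": "invalid_request_error", "code": "model_not_found"}}')
print(describe_openai_error(404, sample))
```

Logging the parsed `code`/`message` fields from the automation step would confirm whether the 400/404 responses really point at the model name.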