Is there a way I can change the OpenAI endpoint URL? #343
-
Hello, I might be misreading the docs, but I'm not sure how to configure this. I want to use a service called ConvoAI; they have an OpenAI-compatible API endpoint, and all I have to do is set my API key. The thing is... I can't find a place where I can change the default OpenAI API URL endpoint. Thanks in advance!
Replies: 9 comments 6 replies
-
Most commands in LWE support tab completion. `/model [TAB]` will show you all available options you can set on a particular provider. The default `chat_openai` provider has options for setting both base url and api key. Once you set those in the CLI, you can save those settings as a preset.
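If LWE's provider options don't cover your case, a generic fallback many OpenAI-compatible clients support is a base-URL environment variable. This is a minimal sketch, assuming a variable named `OPENAI_API_BASE` (the exact name your client honors may differ):

```python
import os

# Default OpenAI endpoint, used when no override is set.
DEFAULT_OPENAI_BASE = "https://api.openai.com/v1"

def resolve_base_url():
    # Prefer an explicit override from the environment (the variable name is
    # an assumption here; check your client's docs), fall back to the default.
    return os.environ.get("OPENAI_API_BASE", DEFAULT_OPENAI_BASE)
```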
-
#265 has nothing to do with it. The error is occurring in the thread that generates the title. By default LWE uses OpenAI's GPT-3.5 to generate a short title for all conversations. If you don't have a valid OpenAI API key, that call will fail. There is a config setting that controls title generation. Given the proliferation of OpenAI-compatible endpoints, it's probably best to add a setting for this.
-
I took a look at adding that. Probably the cleanest and easiest option for you is to set the base url and api key on the default `chat_openai` provider. You could also use a different provider plugin; LWE has quite a few other providers: https://github.com/orgs/llm-workflow-engine/repositories?q=provider

Finally, if you just want to go w/o titles, this will work to suppress the error. Adjust config as follows:

```yaml
backend_options:
  title_generation:
    provider: fake_llm
plugins:
  enabled:
    - provider_fake_llm
```

That's the testing class, and will just title everything with canned output.
-
It's working fine for me:

```yaml
backend_options:
  title_generation:
    provider: fake_llm
# other settings...
plugins:
  enabled:
    - provider_fake_llm
```

You can see the config above. Are you sure it's the same error, or is it another error? Have you run LWE with the debug flag and compared the backtrace?
-
That traceback doesn't say WHAT the error is. Kinda hard for me to debug if I don't know what the error is. You can try putting some debug statements in a few spots along that traceback; without more data I cannot help, and I cannot reproduce the issue.
-
The stack trace is like a map of the files and line numbers where the code execution was when the program crashed. So you already have a map of the actual files and locations; you'd look there for relevant variables being passed into those function calls, and add debug log statements. The LWE debug facility should probably work in all cases if you installed it as a package:

```python
from lwe import debug
# varname is a variable you want to see the value of, it can be any kind of variable.
debug.console(varname)
```

Since you'll be hacking debug statements into installed code, remember to remove them when you're done. Hope that helps.
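If you'd rather not depend on the LWE helper, a plain stdlib version of the same idea (my own sketch, not part of LWE) looks like:

```python
import logging

logging.basicConfig(level=logging.DEBUG)
log = logging.getLogger("traceback-debug")

def peek(name, value):
    # Log the variable's repr and type, then return it unchanged so the
    # call can wrap an existing expression along the traceback path.
    log.debug("%s = %r (%s)", name, value, type(value).__name__)
    return value
```

Wrapping a suspect argument in place, e.g. `some_call(peek("title", title))`, lets you see the value without restructuring the code.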
-
You can use a template with front matter, it should allow you to set a
custom title:
https://llm-workflow-engine.readthedocs.io/en/latest/templates.html#front-matter
There's an `edit-run` action for templates, that would allow you to open
the template in your CLI editor, then you type your prompt and save, and
it'll send the prompt and use the custom title.
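As a sketch, a template with front matter might look like this (the `title` key name is an assumption on my part; check the linked docs for the exact front-matter fields):

```markdown
---
title: My custom conversation title
---
Write a haiku about debugging.
```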
…On Mon, Apr 29, 2024, 9:05 PM ForeverNooob ***@***.***> wrote:
Thanks once again. I'll try to grok the basics of Python down when I can.
In the meantime, would it be an idea for a feature request to allow
setting a title even before the first request / response from the LLM?
Because then I could just set my own title and not see that error.
-
Stack trace:
-
Again, I cannot reproduce the other issue you are reporting. This code is what decides whether to use auto-title generation or not; if a title already exists, then auto-generation is skipped. As you can see from my example, the template properly inserts the provided custom title, and that title is present on the conversation. I'm really not sure what you're doing differently, but everything I see is showing this logic working fine.
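For reference, the decision described above boils down to logic like this (a simplified sketch, not LWE's actual code):

```python
def maybe_generate_title(conversation, generate):
    # Auto-title only when no title has been set yet (e.g. by a template's
    # front matter); otherwise keep the existing title.
    if conversation.get("title"):
        return conversation["title"]
    conversation["title"] = generate(conversation)
    return conversation["title"]
```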