
Escaped double quotes: \" in model responses get shown as just " in chat UI #437

Open
AlexanderLuck opened this issue Apr 6, 2024 · 6 comments
Labels
bug Something isn't working

Comments

@AlexanderLuck

What happened?

Hi,
when I let models generate code containing strings with escaped ", the CodeGPT UI seems to strip the \ from the text. For example, I get this as an answer:

```csharp
var query = $"from(bucket:"{BucketName}") |> range(start: -1y) |> filter(fn: (r) => r._measurement == "metadata_tracking" and r.measurement_name == "{measurementName}") |> last()";
```

Correct would be:

```csharp
var query = $"from(bucket: \"{BucketName}\") |> range(start: -1y) |> filter(fn: (r) => r._measurement == \"metadata_tracking\" and r.measurement_name == \"{measurementName}\") |> last()";
```

When I ask the model to fix the first response, it correctly tells me the delimiters are missing, and then posts exactly the same code with the delimiters still missing.

Relevant log output or stack trace

No response

Steps to reproduce

Ask in chat:
generate a c# string log message with other variables using a " delimited string

CodeGPT version

2.5.1

Operating System

None

@AlexanderLuck added the bug label on Apr 6, 2024
@reneleonhardt
Contributor

@carlrobertoh I can't really find handling of escaped quotes or corresponding tests in llm-client 😅
Maybe integration tests could catch that?
When I insert \\\" into some requests and responses in OllamaClientTest everything is still green... as expected 😄

There are multiple locations dealing with escape sequences (including quotes) in CodeGPT cpp files, but I can't find tests.
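A regression test along these lines might catch it. Here is a minimal sketch that exercises flexmark directly (the test class, method name, and assertion are my own illustrative assumptions, not code from the repo):

```java
import static org.junit.jupiter.api.Assertions.assertTrue;

import com.vladsch.flexmark.html.HtmlRenderer;
import com.vladsch.flexmark.parser.Parser;
import org.junit.jupiter.api.Test;

class EscapedQuoteRenderingTest {

  @Test
  void backslashesSurviveInsideCodeFences() {
    Parser parser = Parser.builder().build();
    HtmlRenderer renderer = HtmlRenderer.builder().build();

    // A model response: a C# snippet wrapped in a fenced code block.
    String response = "```csharp\nvar s = \"a \\\"b\\\"\";\n```";
    String html = renderer.render(parser.parse(response));

    // Inside a fence the backslash must be preserved verbatim
    // (the quote itself may be entity-escaped to &quot; in the HTML).
    assertTrue(html.contains("\\\"") || html.contains("\\&quot;"));
  }
}
```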

@reneleonhardt
Contributor

@AlexanderLuck Which service / model are you using?
`ollama run codellama` doesn't generate that code with the escapes using that prompt and an empty context.

@carlrobertoh
Owner

I have noticed the same. The bug is related to how the response is rendered on the screen. Each time a new message is received, it is converted into HTML (using flexmark) before being displayed to the user. Most likely, the problem lies somewhere in that process.

As a workaround, you can see the "correct" output by clicking on this icon button:
[Screenshot 2024-04-06 at 18 36 27]
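For context, this is consistent with standard Markdown escaping: outside code spans, a backslash before an ASCII punctuation character is consumed as an escape, so \" in running text renders as a bare ". A minimal sketch of that behavior with flexmark (my own demo, assuming the raw response is parsed as ordinary Markdown; not code from the plugin):

```java
import com.vladsch.flexmark.html.HtmlRenderer;
import com.vladsch.flexmark.parser.Parser;

public class EscapeDemo {

  public static void main(String[] args) {
    Parser parser = Parser.builder().build();
    HtmlRenderer renderer = HtmlRenderer.builder().build();

    // Outside a code span, Markdown consumes the backslash in \" as an
    // escape, so it disappears from the rendered HTML.
    System.out.println(renderer.render(parser.parse("say \\\"hi\\\"")));
    // -> <p>say &quot;hi&quot;</p>

    // Inside a code span the backslash is preserved verbatim.
    System.out.println(renderer.render(parser.parse("`say \\\"hi\\\"`")));
    // -> <p><code>say \&quot;hi\&quot;</code></p>
  }
}
```

If the streamed response were parsed as Markdown before the surrounding code fence is complete, the escapes would be eaten exactly as reported above; that is only a guess, though.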

@AlexanderLuck
Author

I use Claude and GPT-4; both have that issue.
Looks like a classic pre-conversion filtering issue, as @carlrobertoh describes.
It is just a minor annoyance that happens rarely, so I thought I'd report it in case somebody works on the UI at some point.

@AlexanderLuck
Author

Just checked again: it also happens with user inputs. If you put the following in chat:

```
repeat after me:

var query = $"from(bucket: "{BucketName}") ";
```

you will see that your own input is stripped of the two " as well.
Again, not a big deal, at least for me, but I thought maybe the same code filters out other things for other users as well.

@alvitawa

alvitawa commented May 31, 2024

This bug is super annoying.
I can't do any LaTeX with CodeGPT anymore because of it.
But GPT-4o in chat.openai.com seems to have the same issue; GPT-3.5 does not.
