How can I update to vLLM v0.4.1 for Llama 3 support? #66
Comments
+1
+1, looking to figure this out soon
Same issue here. There is a blocking bug on Llama 3 that has been fixed in v0.4.1.
Pretty please
Hi all, thank you for raising this issue! I have just merged the vLLM 0.4.2 update into main; you can use it by changing the Docker image in your endpoint from
Hello everyone,
I would like to update the vLLM version to v0.4.1 in order to get access to Llama 3, but I don't know how to modify the fork runpod/vllm-fork-for-sls-worker. Could you please guide me? Happy to help in any way!
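For anyone in the same situation, a minimal sketch of the version bump itself, assuming the fork pins vLLM in a `requirements.txt` (the file name and the `0.4.0` starting pin are assumptions about the repo's layout, not confirmed by the maintainers):

```shell
# Hypothetical sketch: bumping the pinned vLLM version in a clone of
# runpod/vllm-fork-for-sls-worker. We create a stand-in requirements file
# here so the snippet is self-contained; in a real clone you would edit
# the fork's own file.
printf 'vllm==0.4.0\n' > requirements.txt

# Rewrite the vllm pin to the Llama 3-capable release
sed -i 's/^vllm==.*/vllm==0.4.1/' requirements.txt

cat requirements.txt   # -> vllm==0.4.1
```

After changing the pin in your clone, you would rebuild the worker's Docker image, push it to your registry, and point your endpoint at the new image tag; the exact build and deploy steps depend on the fork's Dockerfile and your RunPod configuration.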