

Force Ollama to run on the macOS GPU core - flag requirement: ollama run llama2-uncensored - GPU<> #24

Open
akramIOT opened this issue Apr 10, 2024 · 0 comments


Is your feature request related to a problem? Please describe.

As of today there is no custom flag or argument to force Ollama to run on the Apple M1, M2, or M3 (MLX) GPU cores; by default, Ollama is running on the CPU cores.


Describe the solution you'd like

We would be interested in an enhancement that forces Ollama to run on the macOS GPU core, exposed as a flag, e.g.: ollama run llama2-uncensored - GPU<>
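As a possible interim workaround sketch (assuming Ollama's documented `num_gpu` Modelfile parameter, which controls how many model layers the llama.cpp backend offloads to the GPU, is honored on Apple Silicon via Metal), GPU offload could be requested per model rather than per invocation:

```
# Hypothetical Modelfile sketch - not a confirmed fix for this issue.
# num_gpu sets the number of layers offloaded to the GPU; a large
# value requests offloading all layers.
FROM llama2-uncensored
PARAMETER num_gpu 999
```

The derived model would then be built with `ollama create llama2-uncensored-gpu -f Modelfile` and started with `ollama run llama2-uncensored-gpu`. This is a per-model setting, though, so it does not replace the requested per-run flag.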

Describe alternatives you've considered

Additional context
