-
  I don't see documentation on how to download a model for use with PyOMlx and PyOllaMx. What is the process?
-
  PyOMlx is a model server for Apple MLX models. You can download MLX models directly from the mlx-community org on Hugging Face: https://huggingface.co/mlx-community. PyOllaMx is a chat interaction utility that can use both Ollama and MLX models (the latter via PyOMlx). For Ollama models, simply use Ollama itself to download them; you can browse the available models at https://ollama.com/library. I have a roadmap item to streamline and simplify this process by adding model discovery and download functionality directly within the PyOllaMx app itself. It's coming soon 😊
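
  To make the download step concrete, here is a minimal sketch of fetching an mlx-community model with the `huggingface_hub` Python package. The repo ID below is just an example picked from that org; the idea that PyOMlx discovers models from the local Hugging Face cache is my assumption, not something the docs confirm:

  ```python
  # Minimal sketch: download an MLX model from the mlx-community org on Hugging Face.
  # Requires `pip install huggingface_hub`. The repo_id is an example; substitute
  # any model listed at https://huggingface.co/mlx-community.
  from huggingface_hub import snapshot_download

  local_path = snapshot_download(
      repo_id="mlx-community/Mistral-7B-Instruct-v0.2-4bit",
  )

  # Files land in the Hugging Face cache (~/.cache/huggingface/hub by default),
  # which is presumably where a local MLX server like PyOMlx would look for them.
  print(f"Model downloaded to: {local_path}")
  ```

  For the Ollama side, downloading is just `ollama pull <model-name>` from a terminal, with model names taken from https://ollama.com/library.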