Deploy on Ollama

#7
by realElonMusk - opened

Great work! Two quick questions:

  1. How do I load the main model together with the mmproj GGUF on the Ollama platform for multimodal support?
  2. Alternatively, do you have any plans to upload to https://ollama.com/library? That way users could easily deploy your masterpiece with `ollama run hauhaucs/<model>`.

:D

Hey :)

https://huggingface.co/HauhauCS/Qwen3.5-9B-Uncensored-HauhauCS-Aggressive/discussions/3
Check out that bit, but in all fairness I'd normally advise just going with LM Studio: all you have to do is put both files in the same folder and it just 'works'.
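For anyone landing here later, here's a rough sketch of the Ollama route. All filenames below are placeholders, not the actual release names, and whether Ollama picks up an mmproj projector from a local GGUF import depends on your Ollama version, so treat this as a starting point rather than a guaranteed recipe:

```shell
# Sketch only — substitute the real GGUF filenames from the repo.
# Create a local Ollama model from a downloaded GGUF via a Modelfile:
cat > Modelfile <<'EOF'
FROM ./model-Q4_K_M.gguf
EOF
ollama create qwen-local -f Modelfile
ollama run qwen-local

# If your Ollama build doesn't wire up the projector, a llama.cpp
# server is a fallback where the mmproj is passed explicitly:
llama-server -m ./model-Q4_K_M.gguf --mmproj ./mmproj-model.gguf
```

With LM Studio, as noted above, none of this is needed: both files in the same folder and it loads them together.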

Got it, thanks. Sounds like deployment on Ollama is tricky. Since my dev environment is under WSL, I haven't tried LM Studio directly on Windows. Perhaps this is a chance to change that :D

realElonMusk changed discussion status to closed
