
Was hoping this would be a LM Studio alternative (for local LLMs) with a friendlier UI. I think there's a genuine need for that.

It could make available only the LLMs that your Mac is able to run.

Many Apple Silicon owners are sitting on very capable hardware without even knowing it.
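
A back-of-envelope way to do that filtering: a quantized model needs roughly params × bytes-per-weight of unified memory, plus some overhead for the KV cache and runtime, so comparing that against the Mac's RAM is enough to hide models that won't fit. A sketch with made-up catalogue numbers (the sizes and the 1.2x overhead factor are illustrative assumptions, not measured figures):

    # Which quantized models plausibly fit in this Mac's unified memory?
    # Catalogue sizes and the 1.2x overhead factor are illustrative only.
    CATALOGUE = {
        "llama3.1:8b-q4": 8e9 * 0.5,    # ~4-bit quant: ~0.5 bytes per weight
        "llama3.1:70b-q4": 70e9 * 0.5,
        "mistral:7b-q8": 7e9 * 1.0,     # ~8-bit quant: ~1 byte per weight
    }

    def models_that_fit(ram_bytes: float, overhead: float = 1.2) -> list[str]:
        return [name for name, weight_bytes in CATALOGUE.items()
                if weight_bytes * overhead <= ram_bytes]

    print(models_that_fit(ram_bytes=16 * 1024**3))  # e.g. a 16 GB M-series Mac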



Just added support for Ollama and LM Studio servers! I'm getting 66 tokens/sec and a 0.17s time to first token for Llama 3.1; it's pretty mind-blowing.

https://x.com/charliebholtz/status/1873798821526069258
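
If you want to reproduce that kind of measurement against your own setup, here's a rough sketch that streams a completion from Ollama's OpenAI-compatible endpoint and times it. The port and model tag assume a stock Ollama install (LM Studio's server speaks the same API on port 1234 by default), and counting stream chunks only approximates tokens:

    import json
    import time
    import urllib.request

    # Assumes a stock Ollama install on port 11434 with "llama3.1" pulled.
    url = "http://localhost:11434/v1/chat/completions"
    payload = {
        "model": "llama3.1",
        "stream": True,
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    }
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

    start = time.time()
    first_token_at = None
    chunks = 0
    with urllib.request.urlopen(req) as resp:
        for raw_line in resp:                      # SSE stream: "data: {...}" lines
            line = raw_line.decode().strip()
            if not line.startswith("data:") or line.endswith("[DONE]"):
                continue
            delta = json.loads(line[len("data:"):])["choices"][0]["delta"]
            if delta.get("content"):
                if first_token_at is None:
                    first_token_at = time.time()
                chunks += 1

    if first_token_at is not None and chunks > 1:
        print(f"time to first token: {first_token_at - start:.2f}s")
        rate = chunks / (time.time() - first_token_at)
        print(f"~{rate:.1f} chunks/sec (each chunk is roughly one token)")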


I don't know LM Studio but I really like OpenWebUI. Maybe worth a try.

I use it mainly because my LLM runs on a server, not my usual desktop.


On that note, I recently learned from Simon Willison's blog that if you have uv installed, you can try OpenWebUI via:

    uvx --python 3.11 open-webui serve


This is exactly what I’m building at https://www.get-vox.com - it automatically detects your local models installed via Ollama.

It is fast, native and cross-platform (built with Qt using C++ and QML).
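
For anyone curious how that kind of detection can work: Ollama exposes a small REST API, and GET /api/tags lists the models already pulled on the machine. A minimal sketch (the port and response fields assume a stock Ollama install; this isn't necessarily how Vox does it):

    import json
    import urllib.request

    def list_local_ollama_models(host: str = "http://localhost:11434") -> list[str]:
        """Return the names of models Ollama has already pulled locally."""
        # GET /api/tags is Ollama's "what do I have installed" endpoint.
        with urllib.request.urlopen(f"{host}/api/tags") as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]

    if __name__ == "__main__":
        for name in list_local_ollama_models():
            print(name)  # e.g. "llama3.1:latest"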


Try msty.app



