It makes available only the LLMs that your Mac is actually able to run.
Many Apple Silicon owners are sitting on very capable hardware without even knowing it.
https://x.com/charliebholtz/status/1873798821526069258
I use it mainly because my LLM runs on a server, not on my usual desktop.
uvx --python 3.11 open-webui serve
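For the remote-server setup mentioned above, Open WebUI can be pointed at a backend on another machine. A minimal sketch, assuming the server runs Ollama and using Open WebUI's documented `OLLAMA_BASE_URL` environment variable (the hostname here is a placeholder):

```shell
# Point Open WebUI at a remote Ollama server instead of localhost.
# "my-llm-server" is a hypothetical hostname; 11434 is Ollama's default port.
OLLAMA_BASE_URL=http://my-llm-server:11434 uvx --python 3.11 open-webui serve
```

The same variable can also be set later from the admin settings UI, but passing it on launch avoids the extra step.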
It is fast, native and cross-platform (built with Qt using C++ and QML).