ollama/llm
Daniel Hiltgen 56f754f46b Enable Ollama engine by default
This changes the default behavior to use the Ollama engine for supported
models, while retaining the ability to disable the Ollama engine and
fall back to the llama engine. Models in the OllamaEngineRequired list
will always run on the Ollama engine.
2025-12-12 11:20:35 -08:00
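The commit message above describes a three-way decision: required models always use the Ollama engine, supported models use it by default, and everything else (or an explicit opt-out) falls back to the llama engine. The sketch below illustrates that selection logic only; pickEngine, ollamaEngineRequired, the engineDisabled flag, and the example architecture names are illustrative assumptions, not the actual identifiers used in ollama/llm.

```go
package main

import "fmt"

// ollamaEngineRequired stands in for the OllamaEngineRequired list mentioned
// in the commit message: architectures that must run on the Ollama engine.
// The entries here are placeholders, not taken from the real list.
var ollamaEngineRequired = map[string]bool{
	"example-arch-a": true,
	"example-arch-b": true,
}

// pickEngine chooses between the Ollama engine (the new default) and the
// llama engine fallback. engineDisabled models a user opting out of the
// Ollama engine, e.g. via a configuration setting.
func pickEngine(arch string, supported, engineDisabled bool) string {
	if ollamaEngineRequired[arch] {
		return "ollama" // required models always run on the Ollama engine
	}
	if !supported || engineDisabled {
		return "llama" // unsupported models or an explicit opt-out fall back
	}
	return "ollama" // supported models now default to the Ollama engine
}

func main() {
	fmt.Println(pickEngine("qwen2", true, false))          // ollama
	fmt.Println(pickEngine("qwen2", true, true))           // llama
	fmt.Println(pickEngine("example-arch-a", true, true))  // ollama (required)
}
```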
llm_darwin.go Optimize container images for startup (#6547) 2024-09-12 12:10:30 -07:00
llm_linux.go Optimize container images for startup (#6547) 2024-09-12 12:10:30 -07:00
llm_windows.go win: lint fix (#10571) 2025-05-05 11:08:12 -07:00
server.go Enable Ollama engine by default 2025-12-12 11:20:35 -08:00
server_test.go llm: Don't always evict models on CPU-only systems 2025-12-02 10:58:08 -08:00
status.go logs: catch rocm errors (#12888) 2025-10-31 09:54:25 -07:00