ollama/llm
Baptiste Jamin 59241c5bee
server: add logprobs and top_logprobs support to Ollama's API (#12899)
Adds logprobs support to Ollama's API including support for Ollama's
OpenAI-compatible API. By specifying the new 'logprobs' boolean parameter
in the API, Ollama will return the log probabilities for each token generated.
'top_logprobs', an integer value up to 20, can also be specified.
When set, the API will also return that number of most likely tokens
at each token position.

Co-authored-by: Baptiste Jamin <baptiste@crisp.chat>
2025-11-11 08:49:50 -08:00
llm_darwin.go Optimize container images for startup (#6547) 2024-09-12 12:10:30 -07:00
llm_linux.go Optimize container images for startup (#6547) 2024-09-12 12:10:30 -07:00
llm_windows.go win: lint fix (#10571) 2025-05-05 11:08:12 -07:00
memory.go DRY out the runner lifecycle code (#12540) 2025-10-23 11:20:02 -07:00
memory_test.go DRY out the runner lifecycle code (#12540) 2025-10-23 11:20:02 -07:00
server.go server: add logprobs and top_logprobs support to Ollama's API (#12899) 2025-11-11 08:49:50 -08:00
server_test.go DRY out the runner lifecycle code (#12540) 2025-10-23 11:20:02 -07:00
status.go logs: catch rocm errors (#12888) 2025-10-31 09:54:25 -07:00