ollama/ml
Latest commit 98f699773a by Vadim Grinco, 2025-03-10 12:34:37 +01:00

Applied 00-fix-vulkan-building.patch

Work done by McBane87 here: https://github.com/whyvl/ollama-vulkan/issues/7#issuecomment-2660836871

Signed-off-by: Vadim Grinco <vadim@grinco.eu>
backend     Applied 00-fix-vulkan-building.patch                 2025-03-10 12:34:37 +01:00
nn          attention: Remove unnecessary contiguous operations  2025-03-01 20:53:23 -08:00
backend.go  ml: Enable support for flash attention               2025-03-01 20:53:23 -08:00