ollama/scripts

Latest commit a4770107a6 by Daniel Hiltgen (2025-11-04 10:31:22 -08:00):
vulkan: enable flash attention (#12937)
Also adjusts the Vulkan Windows build pattern to match recent changes in other backends, so incremental builds are faster.
| File | Last commit | Date |
| --- | --- | --- |
| build_darwin.sh | Align versions for local builds (#9635) | 2025-03-14 15:44:08 -07:00 |
| build_docker.sh | Update ROCm (6.3 linux, 6.2 windows) and CUDA v12.8 (#9304) | 2025-02-25 13:47:36 -08:00 |
| build_linux.sh | CI: Set up temporary opt-out Vulkan support (#12614) | 2025-10-15 14:18:01 -07:00 |
| build_windows.ps1 | vulkan: enable flash attention (#12937) | 2025-11-04 10:31:22 -08:00 |
| env.sh | build: avoid unbounded parallel builds (#12319) | 2025-09-18 14:57:01 -07:00 |
| install.sh | fix: own lib/ollama directory | 2025-03-03 13:01:18 -08:00 |
| push_docker.sh | change `github.com/jmorganca/ollama` to `github.com/ollama/ollama` (#3347) | 2024-03-26 13:04:17 -07:00 |
| tag_latest.sh | CI: clean up naming, fix tagging latest (#6832) | 2024-09-16 16:18:41 -07:00 |