docs: add Claude Code integration guide
This commit is contained in:
parent b44d9b3347
commit 515c46c176
@@ -6,7 +6,7 @@ Ollama provides compatibility with the [Anthropic Messages API](https://docs.ant
 ## Recommended models
 
-For coding use cases, models like `qwen3-coder` are recommended.
+For coding use cases, models like `glm-4.7:cloud`, `minimax-m2.1:cloud`, and `qwen3-coder` are recommended.
 
 Pull a model before use:
 
 ```shell
@@ -206,7 +206,7 @@ curl -X POST http://localhost:11434/v1/messages \
 ## Using with Claude Code
 
-[Claude Code](https://docs.anthropic.com/en/docs/claude-code) can be configured to use Ollama as its backend:
+[Claude Code](https://code.claude.com/docs/en/overview) can be configured to use Ollama as its backend:
 
 ```shell
 ANTHROPIC_BASE_URL=http://localhost:11434 ANTHROPIC_API_KEY=ollama claude --model qwen3-coder
 ```
@@ -32,7 +32,9 @@
     "codeblocks": "system"
   },
   "contextual": {
-    "options": ["copy"]
+    "options": [
+      "copy"
+    ]
   },
   "navbar": {
     "links": [
@@ -52,7 +54,9 @@
     "display": "simple"
   },
   "examples": {
-    "languages": ["curl"]
+    "languages": [
+      "curl"
+    ]
   }
 },
 "redirects": [
@@ -97,6 +101,7 @@
 {
   "group": "Integrations",
   "pages": [
+    "/integrations/claude-code",
     "/integrations/vscode",
     "/integrations/jetbrains",
     "/integrations/codex",
@@ -0,0 +1,69 @@
+---
+title: Claude Code
+---
+
+## Install
+
+Install [Claude Code](https://code.claude.com/docs/en/overview):
+
+<CodeGroup>
+
+```shell macOS / Linux
+curl -fsSL https://claude.ai/install.sh | bash
+```
+
+```powershell Windows
+irm https://claude.ai/install.ps1 | iex
+```
+
+</CodeGroup>
+
+## Usage with Ollama
+
+Claude Code connects to Ollama using the Anthropic-compatible API.
+
+1. Set the environment variables:
+
+```shell
+export ANTHROPIC_BASE_URL=http://localhost:11434
+export ANTHROPIC_API_KEY=ollama
+```
+
+2. Run Claude Code with an Ollama model:
+
+```shell
+claude --model qwen3-coder
+```
+
+Or run with environment variables inline:
+
+```shell
+ANTHROPIC_BASE_URL=http://localhost:11434 ANTHROPIC_API_KEY=ollama claude --model qwen3-coder
+```
+
+## Connecting to ollama.com
+
+1. Create an [API key](https://ollama.com/settings/keys) on ollama.com
+2. Set the environment variables:
+
+```shell
+export ANTHROPIC_BASE_URL=https://ollama.com
+export ANTHROPIC_API_KEY=<your-api-key>
+```
+
+3. Run Claude Code with a cloud model:
+
+```shell
+claude --model glm-4.7:cloud
+```
+
+## Recommended Models
+
+### Cloud models
+- `glm-4.7:cloud` - High-performance cloud model
+- `minimax-m2.1:cloud` - Fast cloud model
+- `qwen3-coder:480b` - Large coding model
+
+### Local models
+- `qwen3-coder` - Excellent for coding tasks
+- `gpt-oss:20b` - Strong general-purpose model
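As a sanity check outside Claude Code, the Anthropic-compatible `/v1/messages` endpoint referenced in the second hunk header can be exercised directly with curl. A minimal sketch, assuming Ollama is serving on `localhost:11434` and `qwen3-coder` has already been pulled; the request shape and headers follow the Anthropic Messages API conventions and are not confirmed by this diff:

```shell
# Request body in the Anthropic Messages API shape (model, max_tokens, messages).
payload='{
  "model": "qwen3-coder",
  "max_tokens": 256,
  "messages": [{"role": "user", "content": "Say hello"}]
}'

# Assumes Ollama is running locally and the model has been pulled;
# prints a note instead of failing if the server is not reachable.
curl -sS -X POST http://localhost:11434/v1/messages \
  -H "x-api-key: ollama" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d "$payload" || echo "Ollama is not reachable on localhost:11434"
```

A successful response should contain an `assistant` message; the same request against `https://ollama.com` with a real API key exercises the cloud path instead.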