
Commit b1fa491

add token caching, remove mention of other LLM providers no longer supported.

1 parent 243798f commit b1fa491

File tree: 4 files changed (+25 −10 lines changed)


docs/getting-started/index.mdx

Lines changed: 2 additions & 8 deletions
@@ -41,10 +41,6 @@ Before using MyCoder with a specific provider, you need to provide the appropriate
 export ANTHROPIC_API_KEY=your-api-key
 # or
 export OPENAI_API_KEY=your-api-key
-# or
-export MISTRAL_API_KEY=your-api-key
-# or
-export XAI_API_KEY=your-api-key
 ```

 2. Create a `.env` file in your working directory with the appropriate key:
@@ -61,10 +57,8 @@ MyCoder supports multiple AI providers:
 | Provider   | Environment Variable | Models                               |
 | ---------- | -------------------- | ------------------------------------ |
 | Anthropic  | `ANTHROPIC_API_KEY`  | claude-3-opus, claude-3-sonnet, etc. |
-| OpenAI     | `OPENAI_API_KEY`     | gpt-4o, o3-mini, etc.                |
-| Mistral AI | `MISTRAL_API_KEY`    | mistral-large, mistral-medium, etc.  |
-| xAI/Grok   | `XAI_API_KEY`        | grok-1                               |
-| Ollama     | N/A (local)          | Various local models                 |
+| OpenAI     | `OPENAI_API_KEY`     | gpt-4o, gpt-4-turbo, etc.            |
+| Ollama     | N/A (local)          | Models with tool calling support     |

 You can specify which provider and model to use with the `--provider` and `--model` options:
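For context on what the trimmed provider table means at runtime, here is a minimal illustrative sketch (not MyCoder's actual code) of checking that the right key is present for each remaining provider. The `REQUIRED_ENV_KEY` map and `checkProviderKey` helper are assumptions for illustration; the provider names and environment variables come from the table above.

```javascript
// Illustrative sketch only -- not MyCoder's implementation.
// Maps the providers left in the table to the environment variable each one needs.
const REQUIRED_ENV_KEY = {
  anthropic: 'ANTHROPIC_API_KEY',
  openai: 'OPENAI_API_KEY',
  ollama: null, // runs locally, no API key required
};

// Hypothetical helper: verify the key for the chosen provider is set.
function checkProviderKey(provider) {
  const envVar = REQUIRED_ENV_KEY[provider];
  if (envVar && !process.env[envVar]) {
    throw new Error(`Missing ${envVar} for provider "${provider}"`);
  }
}

checkProviderKey('openai'); // throws unless OPENAI_API_KEY is exported or set in .env
```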

docs/index.md

Lines changed: 1 addition & 1 deletion
@@ -14,7 +14,7 @@ Currently available as a research preview, MyCoder is built to work alongside developers

 ## Key Features

-- **AI-Powered**: Supports multiple AI providers including Anthropic, OpenAI, Mistral AI, xAI/Grok, and Ollama
+- **AI-Powered**: Supports multiple AI providers including Anthropic, OpenAI, and Ollama
 - **Extensible Tool System**: Includes tools for file operations, shell commands, web browsing, and more
 - **Parallel Execution**: Can spawn sub-agents to work on different parts of a task simultaneously
 - **Self-Modification**: Capable of modifying code, including its own codebase

docs/providers/anthropic.md

Lines changed: 19 additions & 0 deletions
@@ -58,6 +58,25 @@ Anthropic offers several Claude models with different capabilities and price points
 - Claude models have a 200K token context window, allowing for large codebases to be processed
 - For cost-sensitive applications, consider using Claude Haiku for simpler tasks

+## Token Caching
+
+MyCoder implements token caching for Anthropic's Claude models to optimize performance and reduce API costs:
+
+- Token caching stores and reuses parts of the conversation history
+- The Anthropic provider uses Claude's native cache control mechanisms
+- This significantly reduces token usage for repeated or similar queries
+- Cache efficiency is automatically optimized based on conversation context
+
+You can enable or disable token caching in your configuration:
+
+```javascript
+export default {
+  provider: 'anthropic',
+  model: 'claude-3-7-sonnet-20250219',
+  tokenCache: true, // Enable token caching (default is true)
+};
+```
+
 ## Troubleshooting

 If you encounter issues with Anthropic's Claude:
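The added section shows the `tokenCache` option with caching enabled. For comparison, turning caching off would presumably just flip that flag; this is a sketch based only on the option documented above, and the config filename is an assumption.

```javascript
// mycoder.config.js (filename assumed) -- caching disabled for comparison
export default {
  provider: 'anthropic',
  model: 'claude-3-7-sonnet-20250219',
  tokenCache: false, // disable Claude cache control for this provider (docs above say the default is true)
};
```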

docs/providers/openai.md

Lines changed: 3 additions & 1 deletion
@@ -51,13 +51,15 @@ export default {

 ## Supported Models

-OpenAI offers several models with different capabilities:
+MyCoder supports all OpenAI models that have tool/function calling capabilities. Here are some recommended models:

 - `gpt-4o` (recommended) - Latest model with strong reasoning and tool-calling capabilities
 - `gpt-4-turbo` - Strong performance with 128K context window
 - `gpt-4` - Original GPT-4 model with 8K context window
 - `gpt-3.5-turbo` - More affordable option for simpler tasks

+You can use any other OpenAI model that supports function calling with MyCoder. The OpenAI provider is not limited to just these listed models.
+
 ## Best Practices

 - GPT-4o provides the best balance of performance and cost for most MyCoder tasks
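The revised wording says any OpenAI model with function calling works, not only the listed ones. As an illustration, a config selecting an OpenAI model might look like the sketch below; the config filename, the `'openai'` provider string, and the model choice are assumptions modeled on the Anthropic example earlier in this commit.

```javascript
// mycoder.config.js (filename assumed) -- illustrative sketch, not from the diff
export default {
  provider: 'openai',   // provider string assumed to mirror the 'anthropic' example
  model: 'gpt-4o-mini', // hypothetical choice: any function-calling-capable OpenAI model should work per the note above
};
```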
