docs: LLM provider configuration #6529
Conversation
Pull Request Overview
This PR moves LLM provider configuration documentation from the AI completion guide to a dedicated configuration page to improve discoverability and reduce confusion for users setting up LLM providers.
- Extracts comprehensive LLM provider configuration from the AI completion guide
- Creates a dedicated configuration page with examples for each supported provider
- Replaces the original content with a concise reference to the new guide
Reviewed Changes
Copilot reviewed 3 out of 4 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| mkdocs.yml | Adds new LLM Providers page to the Configuration section navigation |
| docs/guides/editor_features/ai_completion.md | Removes detailed provider configuration content and redirects to new guide |
| docs/guides/configuration/llm_providers.md | Creates comprehensive new guide with provider-specific configuration examples |
```toml
chat_model = "openai/gpt-4o-mini"                      # Routes to OpenAI config
edit_model = "anthropic/claude-3-sonnet"               # Routes to Anthropic config
autocomplete_model = "ollama/codellama"                # Routes to Ollama config
autocomplete_model = "some_other_provider/some_model"  # Routes to OpenAI-compatible config
```
Copilot AI · Sep 26, 2025
This line is confusing as it duplicates the autocomplete_model key from line 30. Consider using a different model role or removing this duplicate to avoid confusion about which value would be used.
`autocomplete_model = "some_other_provider/some_model"  # Routes to OpenAI-compatible config`
```bash
ollama ls
```

3. Start the Ollama server:
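A minimal sketch of the command this step would run, assuming the stock Ollama CLI:

```bash
ollama serve
```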
Copilot AI · Sep 26, 2025
The list numbering is inconsistent. The preceding content shows numbered steps ending at 2, so this step should continue the sequence as '3.', but the surrounding steps are not numbered consistently.
5. Install the OpenAI client (`pip install openai` or `uv add openai`).

6. Start marimo:
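For the "Start marimo" step, a typical invocation of the marimo CLI would look like the following; the notebook filename is a hypothetical placeholder:

```bash
marimo edit notebook.py
```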
Copilot AI · Sep 26, 2025
The step numbering is inconsistent with the previous steps. Step 4 was visiting the URL, so these should be steps 5 and 6, but they don't align with the logical flow from the previous numbered list.
```bash
curl http://127.0.0.1:11434/v1/models
```

7. Configure `marimo.toml` (or use Settings):
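A sketch of what this configuration step might contain, assuming an Ollama section that accepts an OpenAI-compatible `base_url`; the section and key names are assumptions:

```toml
# Sketch only: section and key names are assumptions, not confirmed by this diff.
[ai.ollama]
base_url = "http://127.0.0.1:11434/v1"   # the endpoint verified with curl above
```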
Copilot AI · Sep 26, 2025
The step numbering continues to be inconsistent. This should be the final configuration step, but the numbering doesn't follow a clear sequence from the previous steps.
This adds OpenRouter as a supported provider. Docs were added here: #6529

![Screenshot 2025-09-25 at 10 08 45 PM](https://github.com/user-attachments/assets/0d8e8ba8-f5a3-42eb-a090-a6f195c5a31e)

BREAK: This is technically breaking for anyone who used the OpenAI-compatible config for OpenRouter. We now route them to the OpenRouter config, which might not be set; they will get a warning and can update their config.

Closes #6243
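For affected users, a sketch of the kind of config update this implies; the `[ai.open_router]` section name and keys are assumptions for illustration:

```toml
# Sketch only: section and key names are assumptions, not confirmed by this PR.
# OpenRouter models previously reached via the OpenAI-compatible settings now
# resolve to a dedicated OpenRouter config, which must be set to avoid the warning.
[ai.open_router]
api_key = "sk-or-..."                        # hypothetical placeholder
base_url = "https://openrouter.ai/api/v1"    # OpenRouter's public API endpoint
```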
Move the LLM provider configuration to its own page and give examples for each provider. Lots of folks have gotten tripped up trying to infer their provider's setup from a similar config.
Closes #4586