
Conversation


@mscolnick commented Sep 26, 2025

Move the LLM provider configuration to its own page and give examples for each provider. Lots of folks have gotten tripped up trying to infer their provider's settings from another provider's similar-looking config.

Closes #4586

@mscolnick requested a review from akshayka as a code owner September 26, 2025 01:58

vercel bot commented Sep 26, 2025

The latest updates on your projects.

| Project | Deployment | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| marimo-docs | Ready | Preview | Comment | Sep 26, 2025 1:59am |

Copilot AI left a comment


Pull Request Overview

This PR moves LLM provider configuration documentation from the AI completion guide to a dedicated configuration page to improve discoverability and reduce confusion for users setting up LLM providers.

  • Extracts comprehensive LLM provider configuration from the AI completion guide
  • Creates a dedicated configuration page with examples for each supported provider
  • Replaces the original content with a concise reference to the new guide

Reviewed Changes

Copilot reviewed 3 out of 4 changed files in this pull request and generated 4 comments.

| File | Description |
| --- | --- |
| mkdocs.yml | Adds new LLM Providers page to the Configuration section navigation |
| docs/guides/editor_features/ai_completion.md | Removes detailed provider configuration content and redirects to new guide |
| docs/guides/configuration/llm_providers.md | Creates comprehensive new guide with provider-specific configuration examples |


```toml
chat_model = "openai/gpt-4o-mini" # Routes to OpenAI config
edit_model = "anthropic/claude-3-sonnet" # Routes to Anthropic config
autocomplete_model = "ollama/codellama" # Routes to Ollama config
autocomplete_model = "some_other_provider/some_model" # Routes to OpenAI compatible config
```

Copilot AI Sep 26, 2025


This line is confusing as it duplicates the autocomplete_model key from line 30. Consider using a different model role or removing this duplicate to avoid confusion about which value would be used.

Suggested change (remove the duplicate line):

```diff
-autocomplete_model = "some_other_provider/some_model" # Routes to OpenAI compatible config
```

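For reference, a minimal sketch of the deduplicated snippet with the suggestion applied — key names and routing comments are taken directly from the diff above:

```toml
# Deduplicated model-role config; one value per role.
chat_model = "openai/gpt-4o-mini"         # Routes to OpenAI config
edit_model = "anthropic/claude-3-sonnet"  # Routes to Anthropic config
autocomplete_model = "ollama/codellama"   # Routes to Ollama config
# A model from any other OpenAI-compatible provider would instead be written as
# "some_other_provider/some_model", routing to the OpenAI-compatible config.
```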
```
ollama ls
```

3. Start the Ollama server:

Copilot AI Sep 26, 2025


The list numbering is incorrect. This should be '3.' but the previous steps were not numbered consistently. The preceding content shows numbered steps ending at 2, so this should continue the sequence properly.

Comment on lines +246 to +248
5. Install the OpenAI client (`pip install openai` or `uv add openai`).

6. Start marimo:

Copilot AI Sep 26, 2025


The step numbering is inconsistent with the previous steps. Step 4 was visiting the URL, so these should be steps 5 and 6, but they don't align with the logical flow from the previous numbered list.

```
curl http://127.0.0.1:11434/v1/models
```

7. Configure `marimo.toml` (or use Settings):

Copilot AI Sep 26, 2025


The step numbering continues to be inconsistent. This should be the final configuration step, but the numbering doesn't follow a clear sequence from the previous steps.

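To make the configuration step concrete, here is a hedged sketch of what the resulting `marimo.toml` entry might look like. The `[ai.ollama]` table name and `base_url` key are assumptions inferred from the routing comments earlier in this PR, not the guide's confirmed schema:

```toml
# Hypothetical sketch -- table and key names are assumptions, not confirmed config.
# Select an Ollama-prefixed model for a role, as in the diff shown earlier:
autocomplete_model = "ollama/codellama"  # Routes to the Ollama config below

[ai.ollama]
base_url = "http://127.0.0.1:11434/v1"  # the local endpoint verified with curl above
```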
@mscolnick merged commit 715b076 into main Sep 26, 2025
25 checks passed
@mscolnick deleted the ms/llm-provider-docs branch September 26, 2025 13:21
mscolnick added a commit that referenced this pull request Sep 26, 2025
This adds OpenRouter as a supported provider.

Docs were added here: #6529

![Screenshot 2025-09-25 at 10 08 45 PM](https://github.com/user-attachments/assets/0d8e8ba8-f5a3-42eb-a090-a6f195c5a31e)


BREAK: This is technically breaking for those who used the OpenAI-compatible config for OpenRouter. We now route them to the OpenRouter config, which might not be set. They will get a warning and can update their config.

Closes #6243
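
To illustrate the migration this implies, a hedged before/after sketch — the `[ai.open_ai]` and `[ai.openrouter]` table names and keys are assumptions based on the commit message, not confirmed config:

```toml
# Before (assumed): OpenRouter reached via the OpenAI-compatible config.
# [ai.open_ai]
# base_url = "https://openrouter.ai/api/v1"
# api_key = "sk-or-..."

# After (assumed): requests now route to a dedicated OpenRouter table;
# if it is unset, users get a warning and should move their settings here.
[ai.openrouter]
api_key = "sk-or-..."
```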

Labels

documentation Improvements or additions to documentation


Development

Successfully merging this pull request may close these issues.

Better support for xAI/grok for AI configuration
