CodeRunner is a cross-platform MCP (Model Context Protocol) server that executes AI-generated code in a secure, sandboxed environment. It supports macOS via Apple's native containers and Linux/Windows via Docker.
Key use case: Process your local files (videos, images, documents, data) with remote LLMs like Claude or ChatGPT without uploading your files to the cloud. The LLM generates code that runs locally on your machine to analyze, transform, or process your files.
| Without CodeRunner | With CodeRunner |
|---|---|
| LLM writes code, you run it manually | LLM writes and executes code, returns results |
| Upload files to cloud for AI processing | Files stay on your machine, processed locally |
| Install tools and dependencies yourself | Tools available in sandbox, auto-installs others |
| Copy/paste scripts to run elsewhere | Code runs immediately, shows output/files |
| LLM analyzes text descriptions of files | LLM directly processes your actual files |
| Manage Python environments and packages | Pre-configured environment ready to use |
Prerequisites:
- For macOS: Apple Silicon (M1/M2/M3/M4) and the Apple Container tool installed.
- For Linux/Windows: Docker installed and running.
```bash
git clone https://github.com/instavm/coderunner.git
cd coderunner
chmod +x install.sh
./install.sh
```
The script will detect your operating system and set up CodeRunner accordingly.
- On macOS: the MCP server will be available at `http://coderunner.local:8222/mcp`
- On Linux/Windows: the MCP server will be available at `http://localhost:8222/mcp`
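The two endpoints differ only by host, so clients can derive the URL from the platform; a small helper, assuming the port and paths shown above:

```python
def mcp_url(system: str) -> str:
    """Return the CodeRunner MCP endpoint for the given platform
    (platform.system() naming: Darwin, Linux, Windows)."""
    host = "coderunner.local" if system == "Darwin" else "localhost"
    return f"http://{host}:8222/mcp"
```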
Install the required packages for the examples:

```bash
pip install -r examples/requirements.txt
```
Configure Claude Desktop to use CodeRunner as an MCP server:
1. Copy the example configuration:

   ```bash
   cd examples
   cp claude_desktop/claude_desktop_config.example.json claude_desktop/claude_desktop_config.json
   ```

2. Edit the configuration file and replace the placeholder paths:

   - Replace `/path/to/your/python` with your actual Python path (e.g., `/usr/bin/python3` or `/opt/homebrew/bin/python3`)
   - Replace `/path/to/coderunner` with the actual path to your cloned repository

   Example after editing:

   ```json
   {
     "mcpServers": {
       "coderunner": {
         "command": "/opt/homebrew/bin/python3",
         "args": ["/Users/yourname/coderunner/examples/claude_desktop/mcpproxy.py"]
       }
     }
   }
   ```

3. Update the Claude Desktop configuration:

   - Open Claude Desktop
   - Go to Settings → Developer
   - Add the MCP server configuration
   - Restart Claude Desktop

4. Start using CodeRunner in Claude: you can now ask Claude to execute code, and it will run safely in the sandbox!
Use CodeRunner with OpenAI's Python agents library:
1. Set your OpenAI API key:

   ```bash
   export OPENAI_API_KEY="your-openai-api-key-here"
   ```

2. Run the client:

   ```bash
   python examples/openai_agents/openai_client.py
   ```

3. Start coding: enter prompts like "write python code to generate 100 prime numbers" and watch it execute safely in the sandbox!
Gemini CLI was recently launched by Google. Add CodeRunner to `~/.gemini/settings.json`:

```json
{
  "theme": "Default",
  "selectedAuthType": "oauth-personal",
  "mcpServers": {
    "coderunner": {
      "httpUrl": "http://coderunner.local:8222/mcp"
    }
  }
}
```
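If you already have a `settings.json`, the coderunner entry can be merged in rather than overwriting the file. A minimal sketch; the `httpUrl` value is the macOS endpoint from the example above:

```python
# The server entry to add, as shown in the example settings.json above.
CODERUNNER = {"httpUrl": "http://coderunner.local:8222/mcp"}

def add_coderunner(settings: dict) -> dict:
    """Return a copy of the settings with the coderunner MCP server
    added, preserving any existing mcpServers entries."""
    merged = dict(settings)
    servers = dict(merged.get("mcpServers", {}))
    servers["coderunner"] = CODERUNNER
    merged["mcpServers"] = servers
    return merged
```

Load the file with `json.load`, pass the dict through `add_coderunner`, and write it back with `json.dump`.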
Kiro was recently launched by Amazon. Add CodeRunner to `~/.kiro/settings/mcp.json`:

```json
{
  "mcpServers": {
    "coderunner": {
      "command": "/path/to/venv/bin/python",
      "args": [
        "/path/to/coderunner/examples/claude_desktop/mcpproxy.py"
      ],
      "disabled": false,
      "autoApprove": [
        "execute_python_code"
      ]
    }
  }
}
```
Code runs in an isolated container with VM-level isolation. Your host system and files outside the sandbox remain protected.
From @apple/container:

> Each container has the isolation properties of a full VM, using a minimal set of core utilities and dynamic libraries to reduce resource utilization and attack surface.
On Linux and Windows, CodeRunner uses Docker for similar containerization and security benefits.
CodeRunner consists of:
- Sandbox Container: Isolated execution environment (Apple Container or Docker) with a Jupyter kernel.
- MCP Server: Handles communication between AI models and the sandbox.
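Under the hood, MCP speaks JSON-RPC 2.0, so a tool invocation is a `tools/call` request sent to the sandbox endpoint. A sketch of what such a request body looks like; the tool name `execute_python_code` comes from the Kiro config above, while the `code` argument name is an assumption for illustration:

```python
import json

def tool_call_request(tool: str, arguments: dict, req_id: int = 1) -> str:
    """Serialize a JSON-RPC 2.0 request body for an MCP tools/call."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# The kind of request an MCP client would POST to the endpoint:
body = tool_call_request("execute_python_code", {"code": "print(2 + 2)"})
```

In practice the MCP client library (or the bundled `mcpproxy.py`) builds and sends these messages for you.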
The `examples/` directory contains:

- `openai_agents/` - Example OpenAI agents integration
- `claude_desktop/` - Example Claude Desktop integration
We welcome contributions! Please see CONTRIBUTING.md for guidelines.
This project is licensed under the Apache 2.0 License - see the LICENSE file for details.