lmstudio
Here are 10 public repositories matching this topic...
Lightweight & fast AI inference proxy for self-hosted LLM backends like Ollama, LM Studio, and others. Designed for speed, simplicity, and local-first deployments.
Updated Oct 1, 2025 - Go
Go SDK and CLI for LM Studio: manage, interact with, and run LLMs via the WebSocket API
Updated Sep 8, 2025 - Go
Go SDK and CLI for LM Studio: manage, interact with, and run LLMs via the WebSocket API
Updated May 13, 2025 - Go
Generate synthetic datasets using local LLMs via Ollama and LM Studio with Llama 3.3, DeepSeek-R1, Phi-4, Gemma 3, Mistral Small 3.1, and other major language models.
Updated Oct 6, 2025 - Go