LMDeploy is a toolkit for compressing, deploying, and serving LLMs.
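As a rough illustration of what "serving" means here, below is a minimal sketch using LMDeploy's offline pipeline API; the model name is only an assumed example and any InternLM-compatible checkpoint could be substituted.

```python
# Minimal sketch of LMDeploy's offline inference pipeline.
# The model name below is an assumption for illustration only.
from lmdeploy import pipeline

pipe = pipeline("internlm/internlm2-chat-7b")   # loads the model and builds an inference engine
responses = pipe(["Please introduce InternLM in one sentence."])
print(responses)                                # prints the generated response objects
```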
Firefly: a training toolkit for large language models, supporting Qwen2.5, Qwen2, Yi1.5, Phi-3, Llama3, Gemma, MiniCPM, Yi, Deepseek, Orion, Xverse, Mixtral-8x7B, Zephyr, Mistral, Baichuan2, Llama2, Llama, Qwen, Baichuan, ChatGLM2, InternLM, Ziya2, Vicuna, Bloom, and other models
OpenAI-style API for open large language models, letting you use LLMs just like ChatGPT. Supports LLaMA, LLaMA-2, BLOOM, Falcon, Baichuan, Qwen, Xverse, SqlCoder, CodeLLaMA, ChatGLM, ChatGLM2, ChatGLM3, etc. A unified backend interface for open-source large language models.
🐋 MindChat (漫谈): a mental-health large language model for open-ended conversation, helping users talk through life's journey and face its hardships with a smile
InternEvo is an open-source, lightweight training framework that aims to support model pre-training without the need for extensive dependencies.
A unified tool to generate fine-tuning datasets for LLMs, including questions, answers, and dialogues. ✨🤖📚💬
AgriMind: The smart LLM application that every farmer truly needs
InternLM fine-tuning
Speed benchmarking a 7B LLM on different Google Cloud VMs (using llama.cpp)