Stars
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.
Fengshenbang-LM (封神榜大模型) is an open-source large-model ecosystem led by the Cognitive Computing and Natural Language Research Center at IDEA Research Institute, serving as infrastructure for Chinese AIGC and cognitive intelligence.
Code and documentation to train Stanford's Alpaca models and generate the data.
OpenBLAS is an optimized BLAS library based on GotoBLAS2 1.13 BSD version.
A machine learning compiler for GPUs, CPUs, and ML accelerators.
Running large language models on a single GPU for throughput-oriented scenarios.
Making large AI models cheaper, faster, and more accessible.
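
For the DeepSpeed entry above, here is a minimal sketch of wrapping a PyTorch model. The model, config values, and keys are illustrative assumptions rather than recommendations; recent DeepSpeed releases expose `deepspeed.initialize`, and scripts are normally launched with the `deepspeed` CLI so the distributed backend is set up for you.

```python
# Minimal sketch, assuming a PyTorch model and a launch via the `deepspeed` CLI.
# The config values below are placeholders, not tuned recommendations.
import torch
import deepspeed

model = torch.nn.Linear(1024, 1024)  # stand-in for a real network

ds_config = {
    "train_batch_size": 32,
    "fp16": {"enabled": True},
    "zero_optimization": {"stage": 2},
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
}

# initialize() returns an engine that handles data parallelism, ZeRO partitioning,
# and mixed precision behind the usual forward/backward/step calls.
model_engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)
```

In the training loop, the engine's `model_engine(batch)`, `model_engine.backward(loss)`, and `model_engine.step()` replace the plain PyTorch forward/backward/optimizer calls.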
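For the OpenBLAS entry above, the sketch below calls `dgemm`, the double-precision general matrix multiply that OpenBLAS optimizes, through SciPy's BLAS bindings. Whether OpenBLAS actually backs the call depends on how NumPy/SciPy were built, so treat this as an interface illustration only.

```python
# Minimal sketch: calling dgemm (double-precision general matrix multiply) via
# SciPy's BLAS wrappers. OpenBLAS is a common, but not guaranteed, backend.
import numpy as np
from scipy.linalg.blas import dgemm

a = np.random.rand(4, 3)
b = np.random.rand(3, 5)

# Computes C = alpha * A @ B.
c = dgemm(alpha=1.0, a=a, b=b)
print(c.shape)  # (4, 5)
```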