The open source platform for AI-native application development.
A higher-performance OpenAI LLM service than vLLM serve: a pure C++ high-performance OpenAI LLM service implemented with GRPS + TensorRT-LLM + Tokenizers.cpp, supporting chat and function call, AI agents, distributed multi-GPU inference, multimodal capabilities, and a Gradio chat interface.
Started out as Dynamic Function Calling for OpenAI. After reviewing the LATM research paper, this has become an implementation of such a system using OpenAI and Autogen.
A simple example that demonstrates how to use the function call feature of the OpenAI API
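A minimal sketch of that pattern with the openai Python SDK (the model name and the get_weather tool below are illustrative, not taken from the repo):

    # Minimal OpenAI function-calling sketch; the tool schema is hypothetical
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    tools = [{
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool name
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }]

    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "What's the weather in Berlin?"}],
        tools=tools,
    )

    # If the model decided to call the tool, the arguments arrive as a JSON string.
    for call in response.choices[0].message.tool_calls or []:
        print(call.function.name, call.function.arguments)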
Claudetools is a Python library that enables function calling with the Claude 3 family of language models from Anthropic.
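For comparison, a minimal sketch of Claude tool use with the underlying anthropic SDK rather than claudetools itself (the get_stock_price tool and model name are illustrative):

    # Tool use via the anthropic SDK (not the claudetools API); schema is hypothetical
    import anthropic

    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

    response = client.messages.create(
        model="claude-3-5-sonnet-20240620",
        max_tokens=1024,
        tools=[{
            "name": "get_stock_price",  # hypothetical tool name
            "description": "Get the latest price for a ticker symbol",
            "input_schema": {
                "type": "object",
                "properties": {"ticker": {"type": "string"}},
                "required": ["ticker"],
            },
        }],
        messages=[{"role": "user", "content": "What is AAPL trading at?"}],
    )

    # Requested tool calls come back as content blocks of type "tool_use".
    for block in response.content:
        if block.type == "tool_use":
            print(block.name, block.input)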
akshare-gpt is an open-source tool designed to integrate Akshare as a GPT tool, enabling natural-language question answering.
YAICLI: A powerful command-line AI assistant with 25+ LLM providers. Features chat, command execution, quick queries, function calling, MCP support, and persistent history. Seamlessly integrates into workflows with smart environment detection.
✨ A large language model (LLM) agent based on code generation and function calling ✨ Ask questions in natural language; the LLM parses the database schema, runs structured multi-table queries and statistical computations over the data, and plots a variety of charts from the query results. Supports custom functions (function call) and agent invocation, with multi-agent collaboration. Uses a chain-of-thought (CoT) built on code generation. The agent can ask the user follow-up questions to handle vague or incomplete prompts.
A large-model communication platform based on Baidu ERNIE and Streamlit
Enable any LLM to call tools in the GPT format. Built on Qwen2; if it doesn't adapt to your LLM, pull requests are welcome.
Multi-Mission Tool Bench: Assessing the Robustness of LLM based Agents through Related and Dynamic Missions
The Time Complexity of Function Call in Python
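A quick, machine-dependent way to observe that per-call overhead with the standard library:

    # Measure per-call overhead of a plain Python function with timeit
    import timeit

    def noop():
        pass

    inline = timeit.timeit("x = 1", number=1_000_000)
    called = timeit.timeit("noop()", globals=globals(), number=1_000_000)

    # The call itself is O(1), but each invocation pays a constant frame-setup cost.
    print(f"inline statement: {inline:.3f}s  function call: {called:.3f}s")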
FastAPI app that takes a user's prompt, transforms it into a SQL query, and fetches the result from an Azure SQL database through an OpenAI function call.
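A condensed sketch of that flow, assuming an OpenAI tool named run_sql and a pyodbc connection; the endpoint, schema, and connection string are illustrative rather than the repo's actual code:

    # Sketch: prompt -> SQL via an OpenAI function call -> query the database
    import json
    import pyodbc
    from fastapi import FastAPI
    from openai import OpenAI

    app = FastAPI()
    client = OpenAI()

    sql_tool = {
        "type": "function",
        "function": {
            "name": "run_sql",  # hypothetical tool name
            "description": "Execute a read-only SQL query against the sales database",
            "parameters": {
                "type": "object",
                "properties": {"query": {"type": "string"}},
                "required": ["query"],
            },
        },
    }

    @app.post("/ask")
    def ask(prompt: str):
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": prompt}],
            tools=[sql_tool],
        )
        # Assumes the model chose to call the tool for this prompt.
        call = resp.choices[0].message.tool_calls[0]
        query = json.loads(call.function.arguments)["query"]
        with pyodbc.connect("DSN=AzureSQL") as conn:  # illustrative connection string
            rows = conn.cursor().execute(query).fetchall()
        return {"sql": query, "rows": [list(r) for r in rows]}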
Use tools with the ollama Python library
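A minimal sketch with the ollama Python package (the model name and add tool are illustrative, and the model must support tool calling):

    # Tool calling with the ollama Python package; tool schema is hypothetical
    import ollama

    tools = [{
        "type": "function",
        "function": {
            "name": "add",  # hypothetical tool name
            "description": "Add two integers",
            "parameters": {
                "type": "object",
                "properties": {"a": {"type": "integer"}, "b": {"type": "integer"}},
                "required": ["a", "b"],
            },
        },
    }]

    response = ollama.chat(
        model="llama3.1",
        messages=[{"role": "user", "content": "What is 2 + 3?"}],
        tools=tools,
    )

    # Models with tool support return any requested calls on message.tool_calls.
    for call in response.message.tool_calls or []:
        print(call.function.name, call.function.arguments)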