Eko (Eko Keeps Operating) - Build Production-ready Agentic Workflow with Natural Language - eko.fellou.ai
Updated Mar 11, 2025 · TypeScript
Minimalist web-searching platform with an AI assistant that runs directly from your browser. Uses WebLLM, Wllama and SearXNG. Demo: https://felladrin-minisearch.hf.space
This project collects GPU benchmarks from various cloud providers and compares them against fixed per-token costs, helping you select GPUs and cost-effective AI models for LLM inference. Includes an LLM provider price comparison, GPU-benchmark-to-price-per-token calculations, and a GPU benchmark table.
Test and evaluate LLMs and model configurations across all the scenarios that matter for your application
Multi-Agent Conversation Framework in TypeScript
It's time for a paradigm shift! The future of software is in plain English ✨
Chrome extension that interacts with content using Groq
Empower Your Productivity with Local AI Assistants
A Framework for Narrative Agents
LinguFlow, a low-code tool designed for LLM application development, simplifies the building, debugging, and deployment process for developers.
Enhanced browsing with LLMs
Official TypeScript wrapper for DeepInfra Inference API
🔬 Experiment is a feature-rich chat interface for Large Language Models (LLMs) from providers like Anthropic, OpenAI, and Mistral.
A minimalistic LLM chat UI
The world's first Dockerfile-based Autonomous Coding Agent
LLM Warehouse is a platform for showcasing LLMs, letting users explore each model and try it out through the web-app interface.