Implementation of Newton-CG algorithm with backtracking line-search, for PyTorch.
(NeurIPS 2021) Revisiting Deep Learning Models for Tabular Data
Codomain attention neural operator for single to multi-physics PDE adaptation.
A curated, comprehensive collection of open-source AI tools, frameworks, datasets, courses, and seminal papers.
Simple repository for training small reasoning models
(ICLR 2025 Spotlight) TabReD: Analyzing Pitfalls and Filling the Gaps in Tabular Deep Learning Benchmarks
ML models + benchmark for tabular data classification and regression
Code for "Mesh-Informed Neural Operator: A Transformer Generative Approach"
Build Real-Time Knowledge Graphs for AI Agents
[ICLR 2024] Scaling physics-informed hard constraints with mixture-of-experts.
(CVPR 2025) Code of "Chat2SVG: Vector Graphics Generation with Large Language Models and Image Diffusion Models"
StarVector is a foundation model for SVG generation that transforms vectorization into a code generation task. Using a vision-language modeling architecture, StarVector processes both visual and te…
(ICLR 2025) TabM: Advancing Tabular Deep Learning With Parameter-Efficient Ensembling
On Finetuning Tabular Foundation Models
ai-forever / gigachain
Forked from langchain-ai/langchain — ⚡ A toolkit for building LLM applications in Russian, with GigaChat support ⚡
Everything you need to reproduce "Better plain ViT baselines for ImageNet-1k" in PyTorch, and more
[NeurIPS 2024] Mixture of Experts for Audio-Visual Learning
[TPAMI] Frequency-aware Feature Fusion for Dense Image Prediction
[AAAI 2024] Official code for Efficient Deweather Mixture-of-Experts with Uncertainty-aware Feature-wise Linear Modulation
Official implementation of AAAI-2024 paper "Frequency-Adaptive Pan-Sharpening with Mixture of Experts"
[ICLR 2025 Spotlight] Official implementation of "Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts"
A free and open-source app that lets you gain an unfair advantage