feat: Add multi-turn conversation support and OpenResponses/Ollama providers (withceleste#111)
* feat: add multi-turn conversation support with Message and Role types
Adds first-class conversation primitives for multi-turn workflows:
- New types: `Message` (role + content) and `Role` enum (user/assistant/system/developer)
- All text entrypoints now accept a `messages=` parameter (see the usage sketch after this list):
* Namespace: `celeste.text.generate(...)`, `celeste.text.stream.generate(...)`, `celeste.text.sync.generate(...)`
* Client: `create_client(modality="text", ...).generate(...)` and `.analyze(...)`
- When `messages` are provided, they take precedence over `prompt` (which becomes optional)
- Provider normalization handles the different API formats (see the sketch after the file list):
* Anthropic: system/developer messages lifted to top-level `system`
* Google: system/developer messages lifted to `system_instruction`; assistant role becomes `model`
* Chat-style providers: `messages=[...]` arrays (Cohere/Mistral/Groq/DeepSeek/Moonshot)
* Responses-style providers: `input=[...]` arrays (OpenAI/xAI)
- Foundation for agents and multi-step workflows requiring conversation state persistence
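A rough usage sketch of the new surface, assuming `Message` and `Role` are importable from the package root (per the updated public exports) and that the response prints usefully; the entrypoints, the `messages=` parameter, and the role names are from this change, while the enum member spellings and the `model=` keyword are assumptions:

```python
import celeste
from celeste import Message, Role  # assumed re-export via src/celeste/__init__.py

history = [
    Message(role=Role.SYSTEM, content="You are a terse assistant."),
    Message(role=Role.USER, content="What is the capital of France?"),
    Message(role=Role.ASSISTANT, content="Paris."),
    Message(role=Role.USER, content="And its population?"),
]

# When messages= is provided it takes precedence over prompt, which may be
# omitted entirely.
response = celeste.text.sync.generate(
    model="some-model-id",  # hypothetical model identifier
    messages=history,
)
print(response)
```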
Files changed:
- Core types: `src/celeste/types.py`
- Text IO: `src/celeste/modalities/text/io.py`
- Text client: `src/celeste/modalities/text/client.py`
- All 9 text provider clients updated
- Namespace API: `src/celeste/namespaces/domains.py`
- Public exports: `src/celeste/__init__.py`
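For reference, a minimal sketch of the kind of per-provider normalization described above; the function names and the local stand-in `Message` type are illustrative only, not the actual implementation (the real type uses the `Role` enum):

```python
from dataclasses import dataclass


@dataclass
class Message:  # stand-in for celeste's Message type
    role: str
    content: str


def normalize_for_anthropic(messages: list[Message]) -> dict:
    """System/developer turns are lifted to a top-level `system` string;
    the remaining turns pass through as a chat-style messages array."""
    system = "\n\n".join(m.content for m in messages if m.role in ("system", "developer"))
    turns = [{"role": m.role, "content": m.content}
             for m in messages if m.role in ("user", "assistant")]
    return {"system": system or None, "messages": turns}


def normalize_for_google(messages: list[Message]) -> dict:
    """System/developer turns become `system_instruction`; the assistant
    role is renamed to `model`, as Google's API expects."""
    system = "\n\n".join(m.content for m in messages if m.role in ("system", "developer"))
    contents = [{"role": "model" if m.role == "assistant" else "user",
                 "parts": [{"text": m.content}]}
                for m in messages if m.role in ("user", "assistant")]
    return {"system_instruction": system or None, "contents": contents}
```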
* feat: add OpenResponses provider and Ollama local support
Adds support for OpenAI Responses-compatible APIs and local model hosting:
OpenResponses Provider:
- New `Provider.OPENRESPONSES` implementing the OpenAI Responses API surface
- Compatible with the `POST /v1/responses` endpoint, including SSE parsing for streamed responses
- Supports structured outputs via `output_schema` → `text.format = json_schema`
- Normalizes usage fields to Celeste's unified usage model
- Supports a `base_url=` parameter for targeting custom API gateways
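A hedged sketch of targeting an OpenAI-Responses-compatible gateway through the client API, assuming `create_client` is exported at the package root and that `output_schema` is accepted as a `generate()` keyword; the gateway URL and provider spelling are hypothetical, while `base_url=` and `output_schema` are what this commit adds:

```python
import celeste

# Any gateway implementing POST /v1/responses can be targeted directly.
client = celeste.create_client(
    modality="text",
    provider="openresponses",  # i.e. Provider.OPENRESPONSES
    base_url="https://gateway.example.com/v1",  # hypothetical gateway
)

# output_schema is forwarded to the Responses API as text.format = json_schema.
book_schema = {
    "type": "object",
    "properties": {"title": {"type": "string"}, "year": {"type": "integer"}},
    "required": ["title", "year"],
}

response = client.generate(
    prompt="Extract the title and year from: 'Dune (1965)'.",
    output_schema=book_schema,
)
print(response)
```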
Ollama Provider:
- New `Provider.OLLAMA`, a local wrapper over the OpenResponses protocol
- Default base URL: `http://localhost:11434`
- Uses `NoAuth` authentication (no headers required)
- Supports unregistered local models (parameter validation issues surface as warnings)
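A similar sketch for the local path; the model name and the assumption that it is passed to `generate()` are illustrative, while the default base URL and `NoAuth` behavior are described above:

```python
import celeste

# No API key or headers: the Ollama provider uses NoAuth and defaults to
# the local daemon at http://localhost:11434.
client = celeste.create_client(
    modality="text",
    provider="ollama",  # i.e. Provider.OLLAMA
)

# Unregistered local models are accepted; parameter validation only warns.
response = client.generate(
    prompt="Summarize this provider in one sentence.",
    model="my-local-model",  # hypothetical, not in OLLAMA_MODELS
)
print(response)
```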
Infrastructure:
- Added `NoAuth` class for local providers that don't require authentication
- Added `base_url=` plumbing on text APIs for targeting local gateways
- Supports rapid local iteration with unregistered models
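For context, a minimal sketch of what a no-op auth strategy can look like; the method name and return shape are assumptions, and only the class name `NoAuth` and its no-headers behavior come from this change:

```python
class NoAuth:
    """Auth strategy for local providers (e.g. Ollama) that need no credentials."""

    def headers(self) -> dict[str, str]:
        # Nothing to attach: requests go out without Authorization headers.
        return {}
```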
Files added:
- `src/celeste/providers/openresponses/` (full provider implementation)
- `src/celeste/modalities/text/providers/openresponses/` (text modality client)
- `src/celeste/providers/ollama/` (Ollama wrapper)
- `src/celeste/modalities/text/providers/ollama/` (Ollama text client)
Files modified:
- `src/celeste/core.py` (Provider enums)
- `src/celeste/auth.py` (NoAuth)
- `src/celeste/client.py` (base_url support)
- `src/celeste/modalities/text/models.py` (OLLAMA_MODELS)
- Provider exports updated
* fix: resolve mypy error and apply formatting fixes for messages feature
- Fix variable name collision in the anthropic client (`content` -> `prompt_content`)
- Apply ruff formatting fixes to provider clients and namespace API
* fix: add class definition to ollama client for template contract
- Convert OllamaClient from an alias to a proper class inheriting from OpenResponsesClient
- Apply ruff formatting fixes to openresponses provider files
* fix: exclude wrapper providers from template contract test
- Revert OllamaClient to a simple alias (it's just a wrapper)
- Update the test to skip wrapper providers (like ollama) that re-export another provider's client
- Wrapper providers don't need to match the template contract since they delegate to the wrapped provider
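An illustrative sketch of the arrangement the two fixes above settle on; the export path and the skip helper are assumptions, while the alias itself and the skip behavior are what the commits describe:

```python
import pytest

from celeste.modalities.text.providers.openresponses import OpenResponsesClient

# ollama's text client is a plain re-export, not a subclass: it delegates
# entirely to the OpenResponses client.
OllamaClient = OpenResponsesClient

# The template-contract test skips wrapper providers instead of requiring
# them to define their own client class.
WRAPPER_PROVIDERS = {"ollama"}


def skip_if_wrapper(provider_name: str) -> None:
    if provider_name in WRAPPER_PROVIDERS:
        pytest.skip(f"{provider_name} re-exports another provider's client")
```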
* chore: bump version to 0.9.1