
Commit d99edd3: doc: how does it compare to x

Signed-off-by: blob42 <contact@blob42.xyz>

Parent: c5d9c41

2 files changed: +128 −1 lines

README.md

Lines changed: 101 additions & 1 deletion
@@ -26,6 +26,21 @@ Although this fork introduces breaking changes and a substantial rewrite, I've t
 
 In particular, the model definition flow was carefully designed to quickly add custom model profiles for specific cases and easily switch between them or assign them to custom commands.
 
+**[How Does It Compare To X](./doc/how-does-it-compare-to.md)**
+
+**[Demo](#commands)**
+
+**[Configuration](#configuration)**
+
+**[Command Definition](#configuration)**
+
+**[Model Specification](#model-specification)**
+
+**[Templates](#templates)**
+
+**[Example Config](#example-configuration)**
+
+
 
 ## Installation
 
@@ -175,7 +190,7 @@ require("codegpt").setup({
 })
 ```
 
-### Overriding Command Configurations
+### Overriding Command Definitions
 
 The configuration table `commands` can be used to override command configurations.
 
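As a sketch of what such an override might look like (the `tests` command name is assumed from the original plugin's stock commands, and `qwen3` is a model alias as defined in the `models` table below; treat both as assumptions):

```lua
-- Sketch: override a built-in command's model and prompt template.
-- `tests` is assumed to exist as a stock command; `qwen3` refers to a
-- model alias defined in the `models` table.
require("codegpt").setup({
    commands = {
        tests = {
            model = "qwen3",
            user_message_template = "Write unit tests for the following code:\n```{{filetype}}\n{{text_selection}}\n```",
        },
    },
})
```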
@@ -205,6 +220,56 @@ commands = {
 }
 ```
 
+### Model Specification
+
+The `models` table defines the available LLM models for each provider. Models are
+organized by provider type and can inherit parameters from other models.
+
+```lua
+models = {
+    default = "gpt-3.5-turbo", -- Global default model
+    ollama = {
+        default = "gemma3:1b", -- Ollama default model
+        ['qwen3:4b'] = {
+            alias = "qwen3", -- Alias to call this model
+            max_tokens = 8192,
+            temperature = 0.8,
+            append_string = '/no_thinking', -- Custom string to append to the prompt
+        },
+    },
+    openai = {
+        ["gpt-3.5-turbo"] = {
+            alias = "gpt35",
+            max_tokens = 4096,
+            temperature = 0.8,
+        },
+    },
+}
+```
+
+#### Inheritance
+
+Models can inherit parameters from other models using the `from` field. For example:
+```lua
+["gpt-foo"] = {
+    from = "gpt-3.5-turbo", -- Inherit from openai's default
+    temperature = 0.7, -- Override temperature
+}
+```
+
+#### Aliases
+
+Use `alias` to create shorthand names for models.
+
+#### Override defaults
+
+Specify model parameters like `max_tokens`, `temperature`, and `append_string`
+to customize behavior. See the `lua/codegpt/config.lua` file for the full config specification.
+
+#### Model Selection
+
+- Call `:lua codegpt.select_model()` to interactively choose a model via UI.
+
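For quicker access, the selector can be bound to a mapping. A minimal sketch, assuming the `codegpt.select_model()` entry point quoted above (the key choice is arbitrary):

```lua
-- Sketch: map interactive model selection to <leader>cm.
vim.keymap.set("n", "<leader>cm", "<cmd>lua codegpt.select_model()<CR>",
    { desc = "codegpt: select model" })
```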
 ### UI Configuration
 
 ```lua
@@ -236,6 +301,8 @@ hooks = {
 
 ## Templates
 
+You can use macros inside the user/system message templates when defining a command.
+
 The `system_message_template` and `user_message_template` can contain template macros. For example:
 
 | macro | description |
@@ -246,6 +313,39 @@ The `system_message_template` and `user_message_template` can contain template m
 | `{{command_args}}` | Everything passed to the command as an argument, joined with spaces. |
 | `{{language_instructions}}` | The found value in the `language_instructions` map. |
 
+### Template Examples
+
+Here are a few examples to demonstrate how to use them:
+
+```lua
+commands = {
+    --- other commands
+    cli_helpgen = {
+        system_message_template = 'You are a documentation assistant to a software developer. Generate documentation for a CLI app using the --help flag style, including usage and options. Only output the help text and nothing else.',
+        user_message_template = 'App name and usage:\n\n```{{filetype}}\n{{text_selection}}```\n. {{command_args}}. {{language_instructions}}',
+        model = 'codestral',
+        language_instructions = {
+            python = 'Use a standard --help flag style, including usage and options, with example usage if needed.',
+        },
+    },
+    rs_mod_doc = {
+        system_message_template = 'You are a Rust documentation assistant. Given the provided source code, add appropriate module-level documentation that goes at the top of the file. Use the `//!` comment format and example sections as necessary. Include an explanation of what each function in the module does.',
+        user_message_template = 'Source code:\n```{{filetype}}\n{{text_selection}}\n```\n. {{command_args}}. Generate the doc using module-level rust comments `//!` ',
+    },
+
+    -- dummy command to showcase the use of chat_history
+    acronym = {
+        system_message_template = 'You are a helpful {{filetype}} programming assistant that abbreviates identifiers and variables.',
+        user_message_template = 'abbreviate ```{{filetype}}\n{{text_selection}}```\n {{command_args}}',
+        chat_history = {
+            { role = 'user', content = 'abbreviate `configure_user_script`' },
+            { role = 'assistant', content = 'c_u_s' },
+            { role = 'user', content = 'abbreviate ```lua\nlocal search_common_pattern = {}```\n' },
+            { role = 'assistant', content = 'local s_c_p = {}' },
+        },
+    },
+}
+```
+
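Assuming the fork keeps the original plugin's `:Chat <command>` entry point (an assumption; check `:h codegpt` in your install), a command defined this way would be invoked on a visual selection, with anything after the command name flowing into `{{command_args}}`:

```vim
" Sketch: select the relevant code in visual mode, then run the custom
" command by name; the trailing free-form note is a hypothetical argument.
:'<,'>Chat cli_helpgen include usage examples
```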
 ## Callback Types
 
 | name | Description |
doc/how-does-it-compare-to.md

Lines changed: 27 additions & 0 deletions
@@ -0,0 +1,27 @@
+
+## Rationale
+
+There are dozens of vim/neovim AI plugins, but I've always favored the minimalist design and efficient workflow of the original `CodeGPT.nvim` plugin. When I noticed the project was no longer actively maintained, I began patching it to meet my needs. Eventually, I decided to fork the project and clean up the codebase, adding documentation for new features in the hope of helping others who value simplicity and speed.
+
+My workflow is heavily focused on Ollama, so most testing has been done with the Ollama API and OpenAI-compatible endpoints. Contributions are always welcome.
+
+## How Does It Compare To X
+
+The goal of `codegpt-ng` is not to outperform other plugins or replicate the functionality of tools like Copilot or Cursor. Instead, it prioritizes minimalism and seamless integration with Vim's native capabilities. Review the demos to see if this approach aligns with your coding style.
+
+### Key Differentiators
+
+- **Templates**: The standout feature of `codegpt-ng` is its powerful [template system](../README.md#templates), enabling users to define custom commands quickly without writing Lua code.
+
+---
+
+### CodeCompanion
+
+CodeCompanion aims to deliver a full AI-first development experience akin to Cursor, offering a wide array of features. While this may suit some users, I find many of these features unnecessary or easily achievable in `codegpt-ng` with minimal overhead and cognitive load. For developers who prefer lightweight tools with high customizability, `codegpt-ng` is a better fit.
+
+---
+
+### Ollama.nvim
+
+- **Recommended for**: Users seeking an Ollama management interface alongside AI capabilities.
+- **codegpt-ng's advantage**: Full Ollama compatibility, with the flexibility to switch models, define custom configurations, and tailor parameter mixes for specific commands or use cases, all without requiring additional plugins or complex setups.