In particular, the model definition flow was carefully designed to make it quick to add custom model profiles for specific use cases and easy to switch between them or assign them to custom commands.

**[How Does It Compare To X](./doc/how-does-it-compare-to.md)**

**[Demo](#commands)**

**[Configuration](#configuration)**

**[Command Definition](#configuration)**

**[Model Specification](#model-specification)**

**[Templates](#templates)**

**[Example Config](#example-configuration)**

## Installation

### Overriding Command Definitions

The configuration table `commands` can be used to override command configurations.

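As an illustrative sketch (the `doc_writer` command name and its template strings are hypothetical, but the keys are the ones used throughout this README), an override might look like:

```lua
commands = {
    -- Hypothetical custom command overriding the model and templates
    doc_writer = {
        model = 'gpt35',
        system_message_template = 'You are a {{filetype}} documentation assistant.',
        user_message_template = 'Document the following code:\n{{text_selection}}',
    },
}
```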

### Model Specification

The `models` table defines the available LLM models for each provider. Models are organized by provider type and can inherit parameters from other models.

```lua
models = {
    default = "gpt-3.5-turbo", -- Global default model
    ollama = {
        default = "gemma3:1b", -- Ollama default model
        ['qwen3:4b'] = {
            alias = "qwen3", -- Alias to call this model
            max_tokens = 8192,
            temperature = 0.8,
            append_string = '/no_thinking', -- Custom string to append to the prompt
        },
    },
    openai = {
        ["gpt-3.5-turbo"] = {
            alias = "gpt35",
            max_tokens = 4096,
            temperature = 0.8,
        },
    },
}
```

#### Inheritance

Models can inherit parameters from other models using the `from` field. For example:

```lua
["gpt-foo"] = {
    from = "gpt-3.5-turbo", -- Inherit from openai's default
    temperature = 0.7,      -- Override temperature
}
```

#### Aliases

Use `alias` to create shorthand names for models.

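For instance (the `summarize` command name is hypothetical), an alias defined in the `models` table can be used anywhere a model name is expected:

```lua
commands = {
    -- "summarize" is a hypothetical custom command; "qwen3" is the alias
    -- declared for ollama's "qwen3:4b" in the models table above
    summarize = {
        model = "qwen3",
    },
}
```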
#### Override defaults

Specify model parameters such as `max_tokens`, `temperature`, and `append_string` to customize behavior. See the `lua/codegpt/config.lua` file for the full config specification.

#### Model Selection

- Call `:lua codegpt.select_model()` to interactively choose a model via the UI.

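If you switch models often, that call can be wrapped in a mapping. This is only a sketch: it assumes the global `codegpt` table used by the command above, and `<leader>sm` is an arbitrary choice of keys:

```lua
-- Illustrative mapping; <leader>sm is an arbitrary binding
vim.keymap.set("n", "<leader>sm", function()
    codegpt.select_model()
end, { desc = "CodeGPT: select model" })
```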
### UI Configuration
## Templates

You can use macros inside the user/system message templates when defining a command. The `system_message_template` and `user_message_template` can contain template macros. For example:

| macro | description |
| --- | --- |
| `{{command_args}}` | Everything passed to the command as an argument, joined with spaces. |
| `{{language_instructions}}` | The value found in the `language_instructions` map. |

### Template Examples

Here are a few examples to demonstrate how to use them:

```lua
commands = {
    -- other commands
    cli_helpgen = {
        system_message_template = 'You are a documentation assistant to a software developer. Generate documentation for a CLI app using the --help flag style, including usage and options. Only output the help text and nothing else.',
        user_message_template = 'App name and usage:\n\n```{{filetype}}\n{{text_selection}}```\n{{command_args}}. {{language_instructions}}',
        model = 'codestral',
        language_instructions = {
            python = 'Use a standard --help flag style, including usage and options, with example usage if needed.',
        },
    },
    rs_mod_doc = {
        system_message_template = 'You are a Rust documentation assistant. Given the provided source code, add appropriate module-level documentation that goes at the top of the file. Use the `//!` comment format and example sections as necessary. Include an explanation of what each function in the module does.',
        user_message_template = 'Source code:\n```{{filetype}}\n{{text_selection}}\n```\n{{command_args}}. Generate the doc using module-level rust comments `//!`',
    },

    -- dummy command to showcase the use of chat_history
    acronym = {
        system_message_template = 'You are a helpful {{filetype}} programming assistant that abbreviates identifiers and variables.',
    },
}
```

There are dozens of vim/neovim AI plugins, but I've always favored the minimalist design and efficient workflow of the original `CodeGPT.nvim` plugin. When I noticed the project was no longer actively maintained, I began patching it to meet my needs. Eventually, I decided to fork the project and clean up the codebase, adding documentation for new features in hopes of helping others who value simplicity and speed.
My workflow is heavily focused on Ollama, so most testing has been done with the Ollama API and OpenAI-compatible endpoints. Contributions are always welcome.
## How Does It Compare To X
The goal of `codegpt-ng` is not to outperform other plugins or replicate the functionality of tools like Copilot or Cursor. Instead, it prioritizes minimalism and seamless integration with Vim's native capabilities. Review the demos to see if this approach aligns with your coding style.
### Key Differentiators
- **Templates**: The standout feature of `codegpt-ng` is its powerful [template system](../README.md#templates), enabling users to define custom commands quickly without writing Lua code.

---
### CodeCompanion
CodeCompanion aims to deliver a full AI-first development experience akin to Cursor, offering a wide array of features. While this may suit some users, I find many of these features unnecessary or easily achievable with `codegpt-ng` at minimal overhead and cognitive load. For developers who prefer lightweight tools with high customizability, `codegpt-ng` is a better fit.
---
### Ollama.nvim
- **Recommended for**: Users seeking an Ollama management interface alongside AI capabilities.
- **codegpt-ng's Advantage**: Full Ollama compatibility with the flexibility to switch models, define custom configurations, and tailor parameter mixes for specific commands or use cases, all without requiring additional plugins or complex setups.