@lecheel commented Feb 28, 2025

Add Vim command mode

- Implements the `Xai` and `Gemini` structs and the `LLM` trait for interacting with the X.AI and Gemini APIs
- Includes error handling for a missing API key
- Supports streaming responses and termination
- Adds command mode with `:o`, `:w`, `:clear`, `:save`, `:quit` commands
- Implements file loading, response saving, chat clearing, and saving chat history
- Updates prompt rendering to display mode and cursor position
- Adds LLM status indicator
- Adds an `m` command to load messages from a fixed file
- Adds screenshots demonstrating the new features
- Passes config to `Chat::new`
- Adds optional output file writing
- Refactors LLM answer handling for file writing
- Removes the `gemini.rs` and `xai.rs` files
- Removes the related config structs
- Uncomments the release profile settings in Cargo.toml
- Enables LTO and symbol stripping, and sets codegen units to 1
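The release-profile bullets correspond to a `Cargo.toml` fragment like the following (standard Cargo profile keys; the exact values are inferred from the bullet text, not copied from the PR diff):

```toml
[profile.release]
lto = true           # link-time optimization across crate boundaries
strip = true         # strip debug symbols from the final binary
codegen-units = 1    # single codegen unit: slower compile, better optimization
```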
