An AI-powered Git commit message generator written in Rust that automatically stages your changes and creates Conventional Commit messages. Works against the OpenAI API, a local Ollama instance, or any OpenAI-compatible endpoint (LiteLLM, LocalAI, vLLM, ...).
git-cmt-rs automatically stages all changes with `git add .`, analyzes the diff, and generates a meaningful commit message using whichever chat-completions backend you point it at. It follows the Conventional Commits specification and provides an interactive commit experience with editor review.
Feel free to tweak the code to try different models, providers, or prompt templates. The implementation is simple and hackable.
This project is a Rust implementation of the Go-based git-cmd project found here.
- 🤖 AI-powered: Works with hosted OpenAI (default: `gpt-4.1-mini`), local Ollama, or any OpenAI-compatible proxy
- 🔌 Backend-agnostic: Switch providers by setting `OPENAI_BASE_URL`; no code changes
- 📝 Conventional Commits: Generates messages in the `type(scope): description` format
- 🎯 Smart Analysis: Understands code changes and suggests contextually appropriate messages
- ✅ Push Confirmation: Asks for y/n confirmation before pushing to remote
- ⚡ Interactive: Opens your editor for final review and editing before committing
- 📦 Auto-staging: Automatically stages all changes with `git add .` before analysis
- 🔍 Diff-aware: Analyzes changes to generate contextually appropriate messages
- 📏 Length-aware: Keeps commit messages concise (50 chars max for description)
- Rust (1.70+ recommended)
- Git
- One of: an OpenAI API key, a running Ollama instance, or another OpenAI-compatible endpoint
```shell
git clone https://github.com/AaronSaikovski/git-cmt-rs
cd git-cmt-rs
cargo build --release

# Move binary to a directory in your PATH
sudo mv target/release/git-cmt-rs /usr/local/bin/git-cmt-rs
```

- Set your OpenAI API key:

```shell
export OPENAI_API_KEY="your-api-key-here"   # bash/zsh
```

```powershell
$env:OPENAI_API_KEY = "your-api-key-here"   # PowerShell
```

- (Optional) Override the model or base URL:

```shell
export OPENAI_MODEL="gpt-4.1-mini"
export OPENAI_BASE_URL="https://api.openai.com/v1"
```

- (Optional) For strict JSON Schema enforcement (hosted OpenAI only):

```shell
export OPENAI_RESPONSE_FORMAT="json_schema"
```
git-cmt-rs speaks the OpenAI HTTP shape, so it works against Ollama's
OpenAI-compatible endpoint with no code changes:
```shell
# Start Ollama and pull a code-capable model
ollama serve &
ollama pull qwen2.5-coder

# Point git-cmt-rs at the local endpoint
export OPENAI_BASE_URL="http://localhost:11434/v1"
export OPENAI_MODEL="qwen2.5-coder"   # or llama3.2, mistral, codellama, ...
unset OPENAI_API_KEY                  # Ollama ignores auth
```

The default `OPENAI_RESPONSE_FORMAT` is `json_object`, which Ollama supports. Don't set it to `json_schema`: Ollama's `/v1` endpoint doesn't implement OpenAI's strict structured-outputs feature.
Any other OpenAI-compatible proxy works the same way:

```shell
export OPENAI_BASE_URL="http://localhost:4000/v1"   # whatever your proxy serves
export OPENAI_MODEL="your-model-name"
# Set OPENAI_API_KEY only if your proxy requires it; otherwise leave it unset.
```

If your proxy supports strict JSON Schema, you can opt in with `OPENAI_RESPONSE_FORMAT=json_schema`. If it rejects `response_format` entirely, use `OPENAI_RESPONSE_FORMAT=none`.
- Run the tool (it stages all changes automatically):

```shell
git-cmt-rs
```
- The editor opens for final review and editing of the commit message.
- Save and close the editor to create the commit.
- After commit, confirm whether to push to remote (y/n).
- If confirmed, changes are pushed; if declined, commit stays local.
The tool automatically stages all changes with `git add .` before analyzing and generating a commit message.
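The staging and diff steps boil down to shelling out to git. A minimal sketch of such a helper (the `run` name and exact error handling are assumptions, not the project's actual code):

```rust
use std::process::Command;

/// Run an external command and return its stdout as a String, or the
/// command's stderr as an error. A tool like git-cmt-rs can use this
/// for `git add .` and `git diff --cached -b`.
fn run(cmd: &str, args: &[&str]) -> std::io::Result<String> {
    let out = Command::new(cmd).args(args).output()?;
    if !out.status.success() {
        return Err(std::io::Error::new(
            std::io::ErrorKind::Other,
            String::from_utf8_lossy(&out.stderr).into_owned(),
        ));
    }
    Ok(String::from_utf8_lossy(&out.stdout).into_owned())
}

// Inside the tool this might look like:
//   run("git", &["add", "."])?;
//   let diff = run("git", &["diff", "--cached", "-b"])?;
```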
- Auto-staging: Stages all changes with `git add .`
- Diff Analysis: Reads staged changes with `git diff --cached -b` (truncated to 3072 chars if necessary)
- AI Processing: Sends the diff to the configured LLM backend (OpenAI / Ollama / proxy) with structured prompts; response format defaults to `json_object` for broad compatibility, with opt-in `json_schema` for hosted OpenAI
- Message Generation: Produces a commit object with `type`, `scope`, and `message`
- Interactive Commit: Opens your editor with the message for final review and editing
- Create Commit: Runs `git commit` with the approved message
- Push Confirmation: Asks the user to confirm the push to remote (y/n)
- Final Push: Runs `git push` if confirmed, or exits with the commit saved locally if declined
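The diff-truncation step above can be sketched as a small helper (a sketch assuming the 3072-character cap described in the pipeline; it trims on a char boundary so a multi-byte UTF-8 character is never split):

```rust
/// Cap the staged diff before sending it to the LLM, so very large
/// changesets don't blow the model's context window.
fn truncate_diff(diff: &str, max_chars: usize) -> &str {
    match diff.char_indices().nth(max_chars) {
        // Byte index of the first char past the limit: cut there.
        Some((byte_idx, _)) => &diff[..byte_idx],
        // Fewer than max_chars characters: keep the diff as-is.
        None => diff,
    }
}
```

In the tool this would be called as `truncate_diff(&diff, 3072)` before building the prompt.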
```
type(scope): description
```

- Types: `feat`, `fix`, `docs`, `style`, `refactor`, `test`, `chore`
- Scope: Optional component/module name
- Description: Clear, concise summary (max 50 chars)
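Rendering the parsed commit object as that subject line is straightforward; a sketch (field names and the hard 50-char cap applied here are assumptions about the implementation):

```rust
/// Format a Conventional Commit subject line: `type(scope): description`,
/// with the scope omitted when absent and the description capped at 50 chars.
fn format_subject(kind: &str, scope: Option<&str>, message: &str) -> String {
    let msg: String = message.chars().take(50).collect();
    match scope {
        Some(s) => format!("{kind}({s}): {msg}"),
        None => format!("{kind}: {msg}"),
    }
}
```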
```
$ git-cmt-rs
Staged all changes with `git add .`
Staged diff found; generating message for changes...
Parsed commit: type='feat', scope='auth', message='add OAuth2 login integration'
# Opens editor for final review
# Save and close editor to commit
Commit created successfully.
Push commit to remote? (y/n): y
Changes pushed successfully!
```

```
$ git-cmt-rs
Staged all changes with `git add .`
Staged diff found; generating message for changes...
Parsed commit: type='fix', scope='api', message='resolve null pointer in validation'
# Opens editor for final review
# Save and close editor to commit
Commit created successfully.
Push commit to remote? (y/n): n
Push cancelled. Commit saved locally.
```

Users can respond with `n` or `no` at the push confirmation to keep the commit local without pushing to remote:

```
$ git-cmt-rs
Staged all changes with `git add .`
...
Commit created successfully.
Push commit to remote? (y/n): n
Push cancelled. Commit saved locally.
```

- `OPENAI_API_KEY` – API key (required for hosted OpenAI; optional for Ollama and most local proxies)
- `OPENAI_MODEL` – model to use (default: `gpt-4.1-mini`)
- `OPENAI_BASE_URL` – API endpoint (default: `https://api.openai.com/v1`)
- `OPENAI_RESPONSE_FORMAT` – one of:
  - `json_object` (default) – broad compatibility (OpenAI, Ollama, most proxies)
  - `json_schema` – strict structured outputs (hosted OpenAI only)
  - `none` – omit `response_format` entirely (oldest backends)
- `EDITOR` – editor for reviewing commits (defaults to the system default)
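The three `OPENAI_RESPONSE_FORMAT` values map onto the request like this; a sketch (the function name and returning the JSON fragment as a string are illustrative assumptions, and hosted OpenAI's `json_schema` mode additionally requires a schema body not shown here):

```rust
/// Map OPENAI_RESPONSE_FORMAT to the request's `response_format` field:
/// Some(fragment) to include it, None to omit it entirely.
fn response_format(setting: &str) -> Result<Option<&'static str>, String> {
    match setting {
        "json_object" => Ok(Some(r#"{"type":"json_object"}"#)),
        "json_schema" => Ok(Some(r#"{"type":"json_schema"}"#)), // hosted OpenAI only
        "none" => Ok(None), // oldest backends: send no response_format at all
        other => Err(format!(
            "invalid OPENAI_RESPONSE_FORMAT '{other}' (valid: json_object, json_schema, none)"
        )),
    }
}
```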
- Failed to stage changes → exits if `git add .` fails
- No staged changes → exits with a helpful message if no changes exist
- Missing API key → only an issue when the configured backend requires one; against hosted OpenAI you'll see a 401 with the API's response body
- Invalid `OPENAI_RESPONSE_FORMAT` → exits with the list of valid values (`json_object`, `json_schema`, `none`)
- API failures → shows HTTP status and response body
- Invalid JSON → shows raw model output for debugging
- Commit creation failed → exits with an error message if `git commit` fails
- Push declined → exits gracefully with "Push cancelled. Commit saved locally." when the user responds with `n` or `no`
- Push failed → shows an error if `git push` fails (the commit is already saved locally)
- Invalid push confirmation input → prompts the user to answer `y`/`n` again
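The push-confirmation behavior above (accept `y`/`yes` or `n`/`no`, re-prompt on anything else) can be sketched as a small parser (function name and signature are assumptions):

```rust
/// Interpret the push-confirmation answer:
/// Some(true) = push, Some(false) = keep local, None = ask again.
fn parse_confirmation(input: &str) -> Option<bool> {
    match input.trim().to_lowercase().as_str() {
        "y" | "yes" => Some(true),
        "n" | "no" => Some(false),
        _ => None, // invalid input: caller re-prompts
    }
}
```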
- `reqwest` – HTTP client
- `serde` / `serde_json` – JSON parsing
- `tokio` – async runtime
- `anyhow` – error handling
```
├── src/main.rs    # Core logic
├── Cargo.toml     # Dependencies and metadata
└── README.md      # This file
```

```shell
cargo build
```

This project is open source. See the repository for details.
"No staged changes found"
- This occurs when there are no modified files in your working directory. Make sure you have uncommitted changes before running `git-cmt-rs`.
"OPENAI_API_KEY not set"
- Only required for hosted OpenAI. Export your key, or unset it and point `OPENAI_BASE_URL` at a local Ollama / proxy that doesn't need auth.
"LLM request failed"
- Check that the backend is reachable (`curl $OPENAI_BASE_URL/models`).
- For hosted OpenAI: verify your API key and that your account has credits.
- For Ollama: confirm `ollama serve` is running and the model is pulled (`ollama list`).
- If the backend rejects `response_format`, try `export OPENAI_RESPONSE_FORMAT=json_object` or `=none`.
Editor not opening
- Set your editor explicitly: `export EDITOR="code --wait"`
"Push cancelled. Commit saved locally." message
- This is expected behavior. The user can respond `n` or `no` at the push confirmation prompt to keep the commit local.
- The commit is already created and saved; the push is simply skipped.
- You can push manually later with `git push`.
"Push failed" error
- The commit was created successfully, but the push to remote failed (network issues, authentication, etc.)
- Your commit is safely saved locally
- You can try pushing again manually or resolve any issues before retrying