
Commit b2c533f

Merge pull request #12 from optave/feat/mcp-complete-server
feat(config): add apiKeyCommand for secure credential resolution
2 parents d831d0d + f3ab237 commit b2c533f

6 files changed (+433, −6)

.github/workflows/claude-code-review.yml (1 addition, 0 deletions)

@@ -39,6 +39,7 @@ jobs:
           plugin_marketplaces: 'https://github.com/anthropics/claude-code.git'
           plugins: 'code-review@claude-code-plugins'
           prompt: '/code-review:code-review ${{ github.repository }}/pull/${{ github.event.pull_request.number }}'
+          allowed_tools: 'Bash(gh pr *),Bash(gh api *),Bash(git diff *),Bash(git log *),Read,Glob,Grep'
           # See https://github.com/anthropics/claude-code-action/blob/main/docs/usage.md
           # or https://code.claude.com/docs/en/cli-reference for available options

CLAUDE.md (2 additions, 1 deletion)

@@ -49,7 +49,7 @@ JS source is plain JavaScript (ES modules) in `src/`. No transpilation step. The
 | `cycles.js` | Circular dependency detection |
 | `export.js` | DOT/Mermaid/JSON graph export |
 | `watcher.js` | Watch mode for incremental rebuilds |
-| `config.js` | `.codegraphrc.json` loading |
+| `config.js` | `.codegraphrc.json` loading, env overrides, `apiKeyCommand` secret resolution |
 | `constants.js` | `EXTENSIONS` (derived from parser registry) and `IGNORE_DIRS` constants |
 | `native.js` | Native napi-rs addon loader with WASM fallback |
 | `resolve.js` | Import resolution (supports native batch mode) |

@@ -64,6 +64,7 @@ JS source is plain JavaScript (ES modules) in `src/`. No transpilation step. The
 - Non-required parsers (all except JS/TS/TSX) fail gracefully if their WASM grammar is unavailable
 - Import resolution uses a 6-level priority system with confidence scoring (import-aware → same-file → directory → parent → global → method hierarchy)
 - Incremental builds track file hashes in the DB to skip unchanged files
+- **Credential resolution:** `loadConfig` pipeline is `mergeConfig → applyEnvOverrides → resolveSecrets`. The `apiKeyCommand` config field shells out to an external secret manager via `execFileSync` (no shell). Priority: command output > env var > file config > defaults. On failure, warns and falls back gracefully

 **Database:** SQLite at `.codegraph/graph.db` with tables: `nodes`, `edges`, `metadata`, `embeddings`

README.md (18 additions, 0 deletions)

@@ -366,6 +366,7 @@ See **[docs/recommended-practices.md](docs/recommended-practices.md)** for integ
 - **CI/CD** — PR impact comments, threshold gates, graph caching
 - **AI agents** — MCP server, CLAUDE.md templates, Claude Code hooks
 - **Developer workflow** — watch mode, explore-before-you-edit, semantic search
+- **Secure credentials** — `apiKeyCommand` with 1Password, Bitwarden, Vault, macOS Keychain, `pass`

 ## 🔁 CI / GitHub Actions

@@ -395,6 +396,23 @@ Create a `.codegraphrc.json` in your project root to customize behavior:
 }
 ```

+### LLM credentials
+
+Codegraph supports an `apiKeyCommand` field for secure credential management. Instead of storing API keys in config files or environment variables, you can shell out to a secret manager at runtime:
+
+```json
+{
+  "llm": {
+    "provider": "openai",
+    "apiKeyCommand": "op read op://vault/openai/api-key"
+  }
+}
+```
+
+The command is split on whitespace and executed with `execFileSync` (no shell injection risk). Priority: **command output > `CODEGRAPH_LLM_API_KEY` env var > file config**. On failure, codegraph warns and falls back to the next source.
+
+Works with any secret manager: 1Password CLI (`op`), Bitwarden (`bw`), `pass`, HashiCorp Vault, macOS Keychain (`security`), AWS Secrets Manager, etc.

 ## 📖 Programmatic API

 Codegraph also exports a full API for use in your own tools:
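The priority chain and fallback behavior the README describes can be sketched in a few lines. This is a standalone illustration, not codegraph's exported API; `resolveApiKey` is a hypothetical helper name, and `echo` stands in for a real secret manager:

```javascript
// Standalone sketch of the documented priority chain:
// command output > CODEGRAPH_LLM_API_KEY env var > file config > null.
// `resolveApiKey` is a hypothetical name, not codegraph's API.
import { execFileSync } from 'node:child_process';

function resolveApiKey(llm, env = process.env) {
  if (typeof llm.apiKeyCommand === 'string' && llm.apiKeyCommand.trim() !== '') {
    // Split on whitespace; execFileSync passes args directly (no shell).
    const [executable, ...args] = llm.apiKeyCommand.trim().split(/\s+/);
    try {
      const out = execFileSync(executable, args, { encoding: 'utf-8', timeout: 10_000 }).trim();
      if (out) return out;
    } catch (err) {
      console.warn(`apiKeyCommand failed: ${err.message}`); // warn, then fall through
    }
  }
  return env.CODEGRAPH_LLM_API_KEY ?? llm.apiKey ?? null;
}

// `echo` stands in for a real secret manager:
console.log(resolveApiKey({ apiKeyCommand: 'echo sk-from-vault' }, {})); // sk-from-vault

// A failing command falls back to the environment variable:
console.log(resolveApiKey(
  { apiKeyCommand: 'no-such-binary --get key', apiKey: 'file-key' },
  { CODEGRAPH_LLM_API_KEY: 'env-key' },
)); // env-key
```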

docs/recommended-practices.md (85 additions, 0 deletions)

@@ -332,6 +332,91 @@ codegraph search "catch exception; format error response; report failure to clie

 ---

+## Secure Credential Management
+
+Codegraph's LLM features (semantic search with LLM-generated descriptions, future `codegraph ask`) require an API key. Use `apiKeyCommand` to fetch it from a secret manager at runtime instead of hardcoding it in config files or leaking it through environment variables.
+
+### Why not environment variables?
+
+Environment variables are better than plaintext in config files, but they still leak via `ps e`, `/proc/<pid>/environ`, child processes, shell history, and CI logs. `apiKeyCommand` keeps the secret in your vault and only materializes it in process memory for the duration of the call.
+
+### Examples
+
+**1Password CLI:**
+
+```json
+{
+  "llm": {
+    "provider": "openai",
+    "apiKeyCommand": "op read op://Development/openai/api-key"
+  }
+}
+```
+
+**Bitwarden CLI:**
+
+```json
+{
+  "llm": {
+    "provider": "anthropic",
+    "apiKeyCommand": "bw get password anthropic-api-key"
+  }
+}
+```
+
+**macOS Keychain:**
+
+```json
+{
+  "llm": {
+    "provider": "openai",
+    "apiKeyCommand": "security find-generic-password -s codegraph-llm -w"
+  }
+}
+```
+
+**HashiCorp Vault:**
+
+```json
+{
+  "llm": {
+    "provider": "openai",
+    "apiKeyCommand": "vault kv get -field=api_key secret/codegraph/openai"
+  }
+}
+```
+
+**`pass` (GPG-encrypted):**
+
+```json
+{
+  "llm": {
+    "provider": "openai",
+    "apiKeyCommand": "pass show codegraph/openai-key"
+  }
+}
+```
+
+### Priority chain
+
+The resolution order is:
+
+1. **`apiKeyCommand`** output (highest priority)
+2. **`CODEGRAPH_LLM_API_KEY`** environment variable
+3. **`llm.apiKey`** in config file
+4. **`null`** (default)
+
+If the command fails (timeout, not found, non-zero exit), codegraph logs a warning and falls back to the next available source. The command has a 10-second timeout.
+
+### Security notes
+
+- The command is split on whitespace and executed with `execFileSync` (array args, no shell) — no shell injection risk
+- stdout is captured; stderr is discarded
+- The resolved key is held only in process memory, never written to disk
+- Keep `.codegraphrc.json` out of version control if it contains `apiKeyCommand` paths specific to your vault layout, or use a shared command that works across the team
+
+---

 ## .gitignore

 Add the codegraph database to `.gitignore` — it's a build artifact:
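The "no shell injection" claim in the security notes above can be checked directly: because `execFileSync` receives an argument array and never spawns a shell, metacharacters in a config value stay literal. A minimal demonstration (assumes a POSIX system with `echo` on `PATH`):

```javascript
// Demonstration: with execFileSync and array args, shell metacharacters
// in an apiKeyCommand-style string are passed through as plain text,
// never interpreted. Assumes a POSIX system with `echo` on PATH.
import { execFileSync } from 'node:child_process';

const cmd = 'echo $(whoami); rm -rf /tmp/x'; // hostile-looking config value
const [executable, ...args] = cmd.trim().split(/\s+/);

// No shell is spawned, so `$(...)` and `;` are not interpreted:
const out = execFileSync(executable, args, { encoding: 'utf-8' }).trim();
console.log(out); // $(whoami); rm -rf /tmp/x
```

One consequence of plain whitespace splitting is that an argument containing spaces cannot be quoted through; a command that needs such arguments is best wrapped in a small script and that script named as the `apiKeyCommand`.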

src/config.js (45 additions, 3 deletions)

@@ -1,6 +1,7 @@
+import { execFileSync } from 'node:child_process';
 import fs from 'node:fs';
 import path from 'node:path';
-import { debug } from './logger.js';
+import { debug, warn } from './logger.js';

 export const CONFIG_FILES = ['.codegraphrc.json', '.codegraphrc', 'codegraph.config.json'];

@@ -18,6 +19,10 @@ export const DEFAULTS = {
     defaultDepth: 3,
     defaultLimit: 20,
   },
+  embeddings: { model: 'minilm', llmProvider: null },
+  llm: { provider: null, model: null, baseUrl: null, apiKey: null, apiKeyCommand: null },
+  search: { defaultMinScore: 0.2, rrfK: 60, topK: 15 },
+  ci: { failOnCycles: false, impactThreshold: null },
 };

 /**
@@ -33,13 +38,50 @@ export function loadConfig(cwd) {
       const raw = fs.readFileSync(filePath, 'utf-8');
       const config = JSON.parse(raw);
       debug(`Loaded config from ${filePath}`);
-      return mergeConfig(DEFAULTS, config);
+      return resolveSecrets(applyEnvOverrides(mergeConfig(DEFAULTS, config)));
     } catch (err) {
       debug(`Failed to parse config ${filePath}: ${err.message}`);
     }
   }
-  return { ...DEFAULTS };
+  return resolveSecrets(applyEnvOverrides({ ...DEFAULTS }));
+}
+
+const ENV_LLM_MAP = {
+  CODEGRAPH_LLM_PROVIDER: 'provider',
+  CODEGRAPH_LLM_API_KEY: 'apiKey',
+  CODEGRAPH_LLM_MODEL: 'model',
+};
+
+export function applyEnvOverrides(config) {
+  for (const [envKey, field] of Object.entries(ENV_LLM_MAP)) {
+    if (process.env[envKey] !== undefined) {
+      config.llm[field] = process.env[envKey];
+    }
+  }
+  return config;
+}
+
+export function resolveSecrets(config) {
+  const cmd = config.llm.apiKeyCommand;
+  if (typeof cmd !== 'string' || cmd.trim() === '') return config;
+
+  const parts = cmd.trim().split(/\s+/);
+  const [executable, ...args] = parts;
+  try {
+    const result = execFileSync(executable, args, {
+      encoding: 'utf-8',
+      timeout: 10_000,
+      maxBuffer: 64 * 1024,
+      stdio: ['ignore', 'pipe', 'pipe'],
+    }).trim();
+    if (result) {
+      config.llm.apiKey = result;
+    }
+  } catch (err) {
+    warn(`apiKeyCommand failed: ${err.message}`);
+  }
+  return config;
 }

 function mergeConfig(defaults, overrides) {
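The pipeline order in the diff above (`mergeConfig → applyEnvOverrides → resolveSecrets`) is what produces the documented priority: each later stage may overwrite `llm.apiKey`. A condensed sketch re-implementing just the two override stages (illustration only; the real functions in `src/config.js` read `process.env` and call `warn`):

```javascript
// Condensed re-implementation of the two override stages, showing why
// running resolveSecrets last makes command output beat the env var.
// `echo` stands in for a real secret manager.
import { execFileSync } from 'node:child_process';

function applyEnvOverrides(config, env) {
  if (env.CODEGRAPH_LLM_API_KEY !== undefined) {
    config.llm.apiKey = env.CODEGRAPH_LLM_API_KEY;
  }
  return config;
}

function resolveSecrets(config) {
  const cmd = config.llm.apiKeyCommand;
  if (typeof cmd !== 'string' || cmd.trim() === '') return config;
  const [executable, ...args] = cmd.trim().split(/\s+/);
  try {
    const result = execFileSync(executable, args, { encoding: 'utf-8', timeout: 10_000 }).trim();
    if (result) config.llm.apiKey = result;
  } catch {
    // the real implementation warns here and keeps the previous value
  }
  return config;
}

// File says 'file-key', env says 'env-key', the command echoes 'vault-key':
const config = { llm: { apiKey: 'file-key', apiKeyCommand: 'echo vault-key' } };
resolveSecrets(applyEnvOverrides(config, { CODEGRAPH_LLM_API_KEY: 'env-key' }));
console.log(config.llm.apiKey); // vault-key
```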
