6 changes: 6 additions & 0 deletions CHANGELOG.md
@@ -4,6 +4,12 @@

All notable changes to this project will be documented in this file.

## [1.3.10] - 2026-05-14

### Added

- OpenAI provider now supports system proxy via `HTTPS_PROXY` / `HTTP_PROXY` / `NO_PROXY` environment variables

## [1.3.9] - 2026-03-16

### Added
6 changes: 6 additions & 0 deletions CHANGELOG_zh.md
@@ -2,6 +2,12 @@

[English](./CHANGELOG.md)

## [1.3.10] - 2026-05-14

### Added

- The OpenAI provider supports system proxy configuration via the `HTTPS_PROXY` / `HTTP_PROXY` / `NO_PROXY` environment variables

## [1.3.9] - 2026-03-16

### Added
4 changes: 3 additions & 1 deletion CLAUDE.md
@@ -13,7 +13,7 @@ Seven source files in `src/`:
- **index.ts** — CLI entry point (Commander.js). Routes to openai or claude provider based on config
- **config.ts** — Reads env vars into a `Config` object. `provider` field selects openai/claude. Exports `GenerateResult` type
- **git.ts** — Runs `git diff --cached` and `git commit -m`
- **llm.ts** — Sends diff to OpenAI-compatible chat completions endpoint, returns `GenerateResult` with token usage
- **llm.ts** — Sends diff to OpenAI-compatible chat completions endpoint via undici fetch with EnvHttpProxyAgent (supports HTTPS_PROXY/HTTP_PROXY/NO_PROXY), returns `GenerateResult` with token usage
- **claude.ts** — Calls `claude -p` CLI to generate commit messages (Claude reads diff + source files itself), returns `GenerateResult`
- **prompt.ts** — Builds system prompt enforcing Conventional Commits format
- **update-check.ts** — Non-blocking version check against GitHub, 24h cache in `~/.ai-commit/.update-check`
@@ -39,3 +39,5 @@ ai-commit -d # dry-run test
Required: `AI_COMMIT_API_KEY` (only for openai provider)

Optional: `AI_COMMIT_PROVIDER` (openai/claude, default openai), `AI_COMMIT_API_URL` (default DeepSeek), `AI_COMMIT_MODEL`, `AI_COMMIT_LANGUAGE` (en/zh), `AI_COMMIT_MAX_TOKENS` (500)

Proxy: `HTTPS_PROXY` / `HTTP_PROXY` (for openai provider), `NO_PROXY` (bypass)
12 changes: 12 additions & 0 deletions README.md
@@ -160,6 +160,18 @@ ai-commit --uninstall
| `AI_COMMIT_MAX_TOKENS` | Max tokens for generation | `500` |
| `AI_COMMIT_EMOJI` | Always add emoji (`true` / `false`) | `false` |

### Proxy

The OpenAI provider supports system proxies via standard environment variables:

```bash
export HTTPS_PROXY="http://127.0.0.1:7890"
# or
export HTTP_PROXY="http://127.0.0.1:7890"
```

Both `HTTPS_PROXY` / `https_proxy` and `HTTP_PROXY` / `http_proxy` are recognized. `NO_PROXY` / `no_proxy` is also supported to bypass the proxy for specific hosts.
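The bypass rule can be pictured with a small matcher. This is an illustration of the common `NO_PROXY` convention only, not undici's actual implementation; the function name is hypothetical:

```typescript
// Illustrative approximation of NO_PROXY matching -- not undici's exact logic.
// A host bypasses the proxy when it equals an entry, is a subdomain of an
// entry, or when the list contains "*".
function bypassesProxy(host: string, noProxy: string): boolean {
  const target = host.toLowerCase();
  return noProxy
    .split(",")
    .map((entry) => entry.trim().toLowerCase())
    .filter((entry) => entry.length > 0)
    .some(
      (entry) =>
        entry === "*" ||
        target === entry ||
        target.endsWith(entry.startsWith(".") ? entry : "." + entry)
    );
}
```

Under this rule, `bypassesProxy("api.internal.example.com", "localhost,example.com")` bypasses the proxy, while an unrelated host such as `api.openai.com` still goes through it.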

## Changelog

See [CHANGELOG.md](./CHANGELOG.md) for release history.
12 changes: 12 additions & 0 deletions README_zh.md
@@ -162,6 +162,18 @@ ai-commit --uninstall
| `AI_COMMIT_MAX_TOKENS` | Max tokens for generation | `500` |
| `AI_COMMIT_EMOJI` | Always add emoji (`true` / `false`) | `false` |

### Proxy

The OpenAI provider supports system proxies via standard environment variables:

```bash
export HTTPS_PROXY="http://127.0.0.1:7890"
# or
export HTTP_PROXY="http://127.0.0.1:7890"
```

Both `HTTPS_PROXY` / `https_proxy` and `HTTP_PROXY` / `http_proxy` are recognized. `NO_PROXY` / `no_proxy` is also supported to bypass the proxy for specific hosts.

## Changelog

See [CHANGELOG_zh.md](./CHANGELOG_zh.md) for the full release history.
4 changes: 3 additions & 1 deletion docs/DESIGN.md
@@ -22,7 +22,7 @@
|------|------|------|
| Language | Node.js (TypeScript) | Mature ecosystem, convenient npm distribution, cross-platform |
| CLI framework | Commander.js | Lightweight and mainstream |
| HTTP requests | node-fetch / built-in fetch | Calls the LLM API |
| HTTP requests | undici fetch + EnvHttpProxyAgent | Calls the LLM API with system proxy support |
| Interaction | Inquirer.js | Good terminal interaction experience |
| Package management | npm | Usable without installation via `npx` |

Expand Down Expand Up @@ -130,6 +130,8 @@ async function generateCommitMessageWithClaude(diff: string, config: Config): Pr
```typescript
async function generateCommitMessage(diff: string, config: Config): Promise<string>;
// Builds the request body, calls an OpenAI-compatible API, and returns the generated commit message
// Uses undici's EnvHttpProxyAgent to read the HTTPS_PROXY/HTTP_PROXY/NO_PROXY environment variables automatically
// Supports system proxies with no extra configuration
```

Request body format (OpenAI-compatible):
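The example itself is collapsed in this diff view, but it can be sketched from the fields visible in `src/llm.ts`; the model name and message contents below are placeholders, not the project's actual values:

```typescript
// Hypothetical request body in the OpenAI-compatible chat-completions shape;
// the real values come from the Config object and the staged diff.
const body = {
  model: "deepseek-chat",
  messages: [
    { role: "system", content: "<system prompt from prompt.ts>" },
    { role: "user", content: "<prepared git diff>" },
  ],
  temperature: 0.3,
};
```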
16 changes: 13 additions & 3 deletions package-lock.json

Some generated files are not rendered by default.

3 changes: 2 additions & 1 deletion package.json
@@ -28,7 +28,8 @@
],
"dependencies": {
"@inquirer/prompts": "^8.3.0",
"commander": "^12.1.0"
"commander": "^12.1.0",
"undici": "^8.2.0"
},
"devDependencies": {
"@types/node": "^20.14.0",
5 changes: 4 additions & 1 deletion src/llm.ts
@@ -1,6 +1,7 @@
import { Config, GenerateResult } from "./config";
import { getSystemPrompt } from "./prompt";
import { prepareDiffContent } from "./git";
import { EnvHttpProxyAgent, fetch as undiciFetch } from "undici";

interface ChatResponse {
choices: { message: { content: string } }[];
@@ -9,6 +10,7 @@ interface ChatResponse

export async function generateCommitMessage(diff: string, config: Config): Promise<GenerateResult> {
const content = prepareDiffContent(diff);
const dispatcher = new EnvHttpProxyAgent();

const body = {
model: config.model,
@@ -20,13 +22,14 @@
temperature: 0.3,
temperature: 0.3,
};

const response = await fetch(config.apiUrl, {
const response = await undiciFetch(config.apiUrl, {
method: "POST",
headers: {
"Content-Type": "application/json",
Authorization: `Bearer ${config.apiKey}`,
},
body: JSON.stringify(body),
dispatcher,
});

if (!response.ok) {