A global AI command bar for macOS. Hit ⌘⌥A anywhere — a full-width liquid-glass strip slides in, you type, and it either answers you inline or actually does things on your Mac.
| You type… | CommandBar does… |
|---|---|
| "What's the difference between async/await and GCD?" | Streams a concise answer inline, then auto-hides |
| "Open Figma" | Launches Figma immediately |
| "Move all PDFs from Downloads to Documents/Archive" | Shows a step-by-step plan, asks you to confirm, then runs it |
| "Start my coding environment" | Runs your saved workflow (open Xcode, Terminal, Spotify) in sequence |
| "Explain this error: EXC_BAD_ACCESS" | Gives you a short diagnosis right in the bar |
- ⌘⌥A summons the bar from anywhere — no Dock icon, no switching apps
- Full-width liquid-glass overlay (NSVisualEffectView behind-window blur)
- Streaming AI answers via Ollama — text appears as it's generated
- Action planner — multi-step confirmation with live step progress
- Saved workflows — name any sequence and trigger it by phrase
- Keyboard-first — Tab, ⎋, ↵ handle everything; no mouse required
- 100% local — no API keys, no data leaves your machine
- macOS 14 Sonoma or later
- Xcode 15+
- Ollama running locally
Download from ollama.com and pull a model:

```sh
ollama pull llama3
```

Make sure Ollama is running — it listens on `localhost:11434` by default.
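AIService.swift talks to Ollama's local HTTP API; you can hit the same endpoint directly to confirm the server and model are ready (the prompt here is just an example, and the model name assumes the `llama3` pulled above):

```sh
# One-shot, non-streaming request to the local Ollama server.
# A JSON reply with a "response" field means CommandBar will be able to connect.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Say hello in five words",
  "stream": false
}'
```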
```sh
git clone https://github.com/Abp0101/CommandBar.git
cd CommandBar
swift build
.build/debug/CommandBar
```

Or open in Xcode:

```sh
open Package.swift
```

On first launch, macOS will ask for Accessibility access (needed for the global hotkey). Go to System Settings → Privacy & Security → Accessibility and enable CommandBar.
Then press ⌘⌥A anywhere to summon the bar.
```
Sources/CommandBar/
├── CommandBarApp.swift              @main entry — Settings scene only (no Dock window)
├── AppDelegate.swift                Menu bar icon, hotkey wiring, window controller setup
├── HotkeyManager.swift              Global ⌘⌥A via Carbon RegisterEventHotKey
├── CommandBarWindowController.swift NSPanel subclass — full-width, above all windows
├── CommandBarView.swift             SwiftUI UI — state machine, text input, animations
├── ActionStep.swift                 Model for a single plan step + status enum
├── AIService.swift                  Ollama API client — intent classification + streaming answers
├── ActionExecutor.swift             NSWorkspace / NSAppleScript execution engine
├── WorkflowStore.swift              JSON-persisted reusable command sequences
├── SettingsView.swift               Preferences: General, Workflows
└── Resources/
    └── Info.plist                   LSUIElement=YES, Accessibility usage strings
```
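WorkflowStore.swift persists named command sequences as JSON. A saved workflow might look roughly like this — the field names below are illustrative, not the actual schema:

```json
{
  "name": "coding environment",
  "steps": [
    { "description": "Open Xcode" },
    { "description": "Open Terminal" },
    { "description": "Open Spotify" }
  ]
}
```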
```
idle ──[submit]──► thinking ──[is question]──► answering(text)
                      └──────[is action]────► planning(steps) ──[confirm]──► executing(i, steps) ──► (auto-dismiss)
```
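The states above map naturally onto a Swift enum with associated values — a sketch of how the state machine in CommandBarView.swift might be modeled (the actual type name and cases in the source may differ):

```swift
import Foundation

/// Sketch of the command bar's state machine.
/// Associated values carry the data each state needs to render.
enum BarState {
    case idle                                       // bar visible, waiting for input
    case thinking                                   // query submitted, awaiting classification
    case answering(text: String)                    // streaming answer text into the bar
    case planning(steps: [ActionStep])              // showing plan, waiting for confirm
    case executing(current: Int, steps: [ActionStep]) // running step `current` of the plan
}
```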
Open HotkeyManager.swift and change the key code and modifier mask:

```swift
// Key codes: kVK_ANSI_A = 0x00, kVK_ANSI_Space = 0x31, etc.
// Modifiers: cmdKey | optionKey | shiftKey | controlKey
RegisterEventHotKey(
    0x00,                         // ← change key code here
    UInt32(cmdKey | optionKey),   // ← change modifiers here
    hotKeyID, ...
)
```

Add a case to ActionExecutor.execute(step:):

```swift
} else if desc.contains("screenshot") {
    await takeScreenshot()
}
```

Then implement the handler using NSWorkspace, NSAppleScript, or a shell command via Process.
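For the Process route, a hypothetical `takeScreenshot()` handler could shell out to macOS's built-in `screencapture` tool — a sketch, with the output path and flags chosen for illustration:

```swift
import Foundation

/// Hypothetical handler: captures the full screen to a timestamped PNG on the Desktop.
func takeScreenshot() async {
    let dest = FileManager.default.homeDirectoryForCurrentUser
        .appendingPathComponent("Desktop/commandbar-\(Int(Date().timeIntervalSince1970)).png")

    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/usr/sbin/screencapture")
    process.arguments = ["-x", dest.path]   // -x: suppress the capture sound

    do {
        try process.run()
        process.waitUntilExit()
    } catch {
        print("screencapture failed: \(error)")
    }
}
```

Note that capturing the screen prompts macOS for Screen Recording permission the first time, separate from the Accessibility grant listed below.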
- GUI hotkey picker in Preferences
- Context awareness — inject selected text automatically
- Frontmost app context (pass active app to AI for smarter answers)
- Command history with fuzzy search (↑ / ↓ to cycle)
- Plugin API for custom Swift action handlers
- Workflow recorder — "watch what I do" → save as workflow
- iCloud sync for workflows across Macs
- Inline image support (screenshots as context)
| Permission | Why |
|---|---|
| Accessibility | Registering the global ⌘⌥A hotkey |
| Automation / Apple Events | Controlling apps via AppleScript |
| Network | Talking to local Ollama server |
CommandBar never reads your screen, keystrokes, or files unless you explicitly ask it to perform an action involving them.
PRs welcome! Please open an issue first for large changes.
- Fork → branch (`feature/my-thing`) → PR
- Run `swift build` before opening a PR
- Follow existing code style (no third-party dependencies unless essential)
MIT © 2025 — see LICENSE