A Swift package providing UI testing utilities for iOS and macOS apps.
| Target | Purpose |
|---|---|
| AccessibilityKit | Accessibility identifiers and query helpers |
| VisionEvalKit | Claude Vision-powered screenshot evaluation |
| UITestingBridge | Lightweight HTTP server exposing the AX tree on port 7979 |
| UIActionKit | Cross-platform UI action execution (tap, type, swipe) |
| SnapshotKit | Snapshot testing via swift-snapshot-testing |
| PerformanceKit | Performance measurement helpers |
A command-line tool that connects to a running UITestingBridge inside a simulator, fetches the accessibility tree, optionally captures a screenshot, and runs Vision-based expectation checks.
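Conceptually, the fetch step is a plain HTTP GET against the bridge. The sketch below stands up a local stub in place of UITestingBridge so it is self-contained; the `/axtree` endpoint path and the response shape are illustrative assumptions, not the bridge's documented API.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Stub standing in for UITestingBridge. The /axtree path and the payload
# shape are assumptions for illustration, not the bridge's real API.
class StubBridge(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"role": "application", "children": []}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep request logging out of stdout

server = HTTPServer(("localhost", 0), StubBridge)  # port 0 picks a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# What the CLI does conceptually: GET the AX tree and parse the JSON.
with urlopen(f"http://localhost:{server.server_port}/axtree") as resp:
    tree = json.load(resp)

print(tree["role"])  # -> application
server.shutdown()
```

Against a real simulator, the same request would go to `localhost:7979`, where UITestingBridge listens.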
```sh
# Debug build
swift build --target AppleUITester

# Release build (recommended for CI)
swift build -c release --target AppleUITester

# Install to ~/.local/bin
mkdir -p ~/.local/bin
cp .build/release/apple-ui-tester ~/.local/bin/apple-ui-tester
```

Usage:

```sh
apple-ui-tester --bundle-id <id> [options]
```
| Flag | Description |
|---|---|
| `--bundle-id <id>` | The app's bundle identifier, e.g. `com.example.MyApp` |
| Flag | Default | Description |
|---|---|---|
| `--host <host>` | `localhost` | Host where UITestingBridge is listening |
| `--port <port>` | `7979` | Port where UITestingBridge is listening |
| `--expectations <json>` | — | JSON array of expectation strings for Vision eval |
| `--screenshot` | off | Capture a screenshot before running Vision eval |
| `--output <path>` | stdout | Write the JSON report to a file instead of stdout |
| Code | Meaning |
|---|---|
| 0 | Success |
| 1 | Bridge unreachable |
| 2 | One or more Vision eval expectations failed |
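In CI, the exit code is usually all a pipeline needs to branch on. A minimal Python wrapper sketch follows; the exit-code mapping comes from the table above, while the wrapper function itself is hypothetical, not part of this package.

```python
import subprocess

# Exit codes documented by apple-ui-tester, mapped to CI outcomes.
EXIT_MEANINGS = {
    0: "success",
    1: "bridge unreachable",
    2: "one or more Vision eval expectations failed",
}

def interpret(code: int) -> str:
    """Translate an apple-ui-tester exit code into a readable outcome."""
    return EXIT_MEANINGS.get(code, f"unexpected exit code {code}")

def run_ui_check(bundle_id: str) -> str:
    """Hypothetical CI step: run the tester and report the outcome."""
    try:
        proc = subprocess.run(["apple-ui-tester", "--bundle-id", bundle_id])
    except FileNotFoundError:
        return "apple-ui-tester not installed"
    return interpret(proc.returncode)

print(interpret(2))  # -> one or more Vision eval expectations failed
```

Keeping `interpret` separate from the subprocess call makes the mapping trivially unit-testable without a simulator.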
```sh
# Boot simulator and launch app
xcrun simctl boot "iPhone 16"
xcrun simctl launch booted com.example.MyApp

# Wait a moment for UITestingBridge to start, then run
apple-ui-tester \
  --bundle-id com.example.MyApp \
  --screenshot \
  --expectations '["Main screen is visible","Navigation is accessible","No error messages shown"]' \
  --output /tmp/ux-report.json
```
```sh
cat /tmp/ux-report.json
```

```json
{
  "bundleId": "com.example.MyApp",
  "bridgeReachable": true,
  "axTree": { "role": "application", "children": [...] },
  "screenshot": { "captured": true, "path": "/tmp/ux-screenshot-1234567890.png" },
  "evalReport": {
    "sceneName": "com.example.MyApp",
    "results": [
      { "expectation": "Main screen is visible", "passed": true, "reasoning": "..." },
      { "expectation": "No error messages shown", "passed": false, "reasoning": "..." }
    ]
  },
  "errors": []
}
```

Requirements:

- Xcode 15+ with command-line tools
- `ANTHROPIC_API_KEY` environment variable set (required for `--screenshot` + `--expectations`)
- The app under test must embed UITestingBridge and call `UITestingBridge.start()` in its debug entry point:
```swift
#if DEBUG
UITestingBridge.start()
#endif
```

An MCP (Model Context Protocol) server that exposes the same UI testing capabilities over stdio JSON-RPC, allowing AI agents to drive UI tests programmatically.
```sh
# Debug build
swift build --target AppleUITesterMCP

# Release build
swift build -c release --target AppleUITesterMCP
```

Run it as a stdio MCP server: the binary reads JSON-RPC messages from stdin and writes responses to stdout. Configure it in your MCP client (e.g. Claude Code, Hermes Agent):
```json
{
  "command": ".build/release/apple-ui-tester-mcp"
}
```

The server exposes tools for fetching accessibility trees, capturing screenshots, running Vision-based evaluations, and executing UI actions via UIActionKit.
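Over the stdio transport, each tool invocation arrives as a JSON-RPC 2.0 request. The sketch below builds one in Python; `tools/call` is MCP's generic tool-invocation method, but the tool name `fetch_ax_tree` and its arguments are assumptions about this server's tool schema, not documented names.

```python
import json

def jsonrpc_request(method: str, params: dict, req_id: int = 1) -> str:
    """Serialize a JSON-RPC 2.0 request for an stdio MCP transport."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params,
    })

# "tools/call" is MCP's standard method for invoking a tool; the tool
# name and arguments below are hypothetical for this particular server.
msg = jsonrpc_request("tools/call", {
    "name": "fetch_ax_tree",
    "arguments": {"host": "localhost", "port": 7979},
})
print(msg)
```

An MCP client such as Claude Code constructs and frames these messages itself, so this shape only matters if you drive the server by hand.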
Add the package to your app target:
```swift
// Package.swift
.package(url: "https://github.com/your-org/AppleUITesting", from: "1.0.0")

// In your target dependencies
.product(name: "UITestingBridge", package: "AppleUITesting")
```

Then start the bridge on app launch (debug only):
```swift
import UITestingBridge

@main
struct MyApp: App {
    init() {
        #if DEBUG
        UITestingBridge.start()
        #endif
    }

    var body: some Scene { ... }
}
```