
Sprint/v1.2.0 #3

Merged
aaddrriann merged 3 commits into master from sprint/v1.2.0 on Apr 28, 2026

Conversation

@aaddrriann
Collaborator

Summary

  • Adds a Local provider path in the Mind tab, routing
    inference to any Ollama or OpenAI-compatible local endpoint (LM Studio,
    vLLM, xAI, etc.)
  • New OllamaAPI service handles connection testing, model listing,
    pull/delete/create, and streaming chat — with URL normalization to fix
    external provider path doubling
  • Full connection manager UI: AddConnectionModal, ConnectionRow,
    OllamaSection, and OllamaManageModal with an 11-entry vision model
    catalog
  • Stream runs to EOS with no timeout or token cap; PTT AbortSignal is
    the sole cancellation mechanism — prevents silent hangs on CPU-only
    inference
  • Manage modal polished: widened to 680px, catalog and installed model
    list both scroll with a consistent thin 5px pill scrollbar

Test plan

  • Launch dev build; navigate Mind tab → Local — OllamaSection renders
  • Add Connection → Verify returns green + model count
  • Save; row appears with toggle, Manage, Configure actions
  • Trigger PTT → Ollama responds via streaming chat
  • Add external connection with bearer token → Verify succeeds (no 404)
  • Manage modal → all three scroll regions use thin pill scrollbar
  • Quick-pull a catalog model → progress bar updates; model appears in
    list
  • Kill Ollama mid-stream → error surfaces gracefully, no crash
  • npm run typecheck — zero errors

Sprint alignment:
- Add scripts/sprint-start.sh — fetches origin/master and compares
  commit counts using --count to avoid Windows line-ending issues;
  exits 1 on divergence so callers can branch on exit code.
- Add .claude/skills/sprint-align.md (/sprint-align) — protocol for
  running the script, interpreting output, and guiding rebase/merge
  decisions without executing them autonomously.
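The divergence check described above can be sketched roughly as follows; the function name, output format, and base-branch argument are assumptions, and the real scripts/sprint-start.sh would run this after fetching origin/master:

```shell
#!/bin/sh
# Rough sketch of the sprint-start.sh divergence check; details of the
# real script are assumptions.
sprint_align() {
  base=$1
  # --count prints a single bare integer, so parsing is immune to the
  # CRLF line-ending issues that multi-line git output can hit on
  # Windows checkouts.
  ahead=$(git rev-list --count "$base"..HEAD)
  behind=$(git rev-list --count HEAD.."$base")
  echo "ahead=$ahead behind=$behind"
  # Exit 1 on divergence so callers can branch on the exit code.
  [ "$behind" -eq 0 ]
}
```

In the real script, `base` would be origin/master and the fetch would happen first, so the counts reflect the remote's current state.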

Branch lifecycle skills:
- Add .claude/skills/new-branch.md (/new-branch) — enforces correct
  base branch per type: feature/* off dev, fix/* off master,
  sprint/* and release/* off master and dev respectively.
- Add .claude/skills/merge-flow.md (/merge-flow) — validates PR target,
  blocks direct-to-master routing for non-hotfix branches, and runs a
  pre-PR checklist covering alignment, version bump, and changelog.
- Add .claude/skills/commit-msg.md (/commit-msg) — drafts structured
  commit messages: section labels, em-dash rationale, 72-char wrap,
  no Co-Authored-By, no markdown in body.

Settings and gitignore:
- Add .gitattributes — forces LF on *.sh and .claude/skills/*.md to
  prevent CRLF corruption of bash scripts on Windows checkout.
- Add .claude/settings.json — registers .claude/skills/ as the skill
  discovery path for Claude Code.
- Ignore run.bat and setup.bat — local developer convenience scripts
  with no value in version history.
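Going only by the paths named above, the .gitattributes entries would look something like this (a sketch, not the file's verbatim contents):

```
*.sh                 text eol=lf
.claude/skills/*.md  text eol=lf
```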

Type system & storage:
- Add LocalConnection interface and extend MindProvider to include
  'ollama' — routes inference to any OpenAI-compatible local endpoint
  alongside Anthropic and OpenAI.
- Extend FlickySettings with localConnections array; add IPC channels
  for CRUD, bearer key management, and model pull/delete/create.
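A plausible shape for these types, going only by the fields and features named in the PR; every field name beyond those is an assumption:

```typescript
// Sketch only: field names beyond those mentioned in the PR are guesses.
type MindProvider = "anthropic" | "openai" | "ollama";

interface LocalConnection {
  id: string;                  // prefix ID used when routing models
  name: string;
  baseUrl: string;             // e.g. http://localhost:11434
  kind: "local" | "external";  // external = OpenAI-compatible provider
  hasBearerKey: boolean;       // the key itself lives behind IPC, not here
  enabled: boolean;
  modelIds: string[];
  tags: string[];
}

// FlickySettings extended with the connections array (other settings
// fields elided):
interface FlickySettings {
  provider: MindProvider;
  localConnections: LocalConnection[];
}

const settings: FlickySettings = {
  provider: "ollama",
  localConnections: [{
    id: "lmstudio",
    name: "LM Studio",
    baseUrl: "http://localhost:1234",
    kind: "external",
    hasBearerKey: true,
    enabled: true,
    modelIds: ["qwen3-vl:8b"],
    tags: ["vision"],
  }],
};
```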

Ollama API service (new ollama-api.ts):
- testConnection tries /api/tags then falls back to /v1/models — covers
  native Ollama and external OpenAI-compat providers (xAI, LM Studio).
- normalizeBase strips trailing /v1 to prevent URL path doubling when
  callers append /v1/chat/completions.
- streamChat carries no timeout and no token cap — model runs to EOS;
  PTT AbortSignal is the sole cancellation mechanism, preventing silent
  hangs on slow CPU inference (e.g. qwen3-vl:8b at ~8 t/s).
- isVisionModel covers 16 known vision families for screenshot routing.
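The normalization and endpoint construction might look roughly like this; it is a sketch of the behavior described above, and the actual ollama-api.ts may differ:

```typescript
// Strip trailing slashes and a trailing /v1 so that appending
// /v1/chat/completions never doubles the path (the bug hit with
// external providers whose configured base URL already ends in /v1).
function normalizeBase(url: string): string {
  let base = url.trim().replace(/\/+$/, "");
  if (base.endsWith("/v1")) base = base.slice(0, -3);
  return base;
}

// Endpoints are built from the normalized base. testConnection would
// probe the native Ollama endpoint first, then the OpenAI-compat one:
const tagsUrl = (base: string) => `${normalizeBase(base)}/api/tags`;
const modelsUrl = (base: string) => `${normalizeBase(base)}/v1/models`;
const chatUrl = (base: string) =>
  `${normalizeBase(base)}/v1/chat/completions`;
```

With this, both `http://localhost:11434` and `https://api.x.ai/v1` resolve to a single, non-doubled chat-completions path.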

IPC & preload:
- Eight new handlers in index.ts for connection CRUD, bearer key ops,
  and Ollama model pull/delete/create with progress event forwarding.
- Preload exposes matching methods on window.flicky including streaming
  pull-progress, pull-complete, and pull-error event subscriptions.
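The progress-forwarding pattern can be sketched with a plain EventEmitter standing in for Electron's webContents.send; the channel names come from the PR text, everything else here is an assumption:

```typescript
import { EventEmitter } from "node:events";

// Pull lifecycle events as described in the PR; exact payload shapes
// are guesses.
type PullEvent =
  | { channel: "pull-progress"; completed: number; total: number }
  | { channel: "pull-complete"; model: string }
  | { channel: "pull-error"; message: string };

async function forwardPullEvents(
  source: AsyncIterable<PullEvent>,
  sink: EventEmitter,
): Promise<void> {
  for await (const ev of source) {
    // In the main process this would be win.webContents.send(ev.channel, ev),
    // with the preload exposing matching on* subscriptions to the renderer.
    sink.emit(ev.channel, ev);
  }
}
```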

UI — MindTab, OllamaSection, ConnectionRow, AddConnectionModal:
- MindTab gains a third "Local" tab; selecting it renders OllamaSection
  which lists enabled connections with per-row model selector.
- AddConnectionModal: External/Local type toggle, URL verify with
  latency + model count hint, bearer auth, prefix ID, model IDs, tags.
- OllamaManageModal: installed model list, 11-entry vision model
  catalog with quick-pull, manual pull-by-tag, delete, and create.
- vision-models.ts catalogs 11 vision-capable models with Ollama
  library slugs for one-click pull and Browse links.

Polish:
- Manage modal widened to 680px; catalog wrapped in 300px scrollable
  container; thin 5px pill scrollbar applied to catalog, installed
  model list, and modal body — consistent with app scrollbar style.
- Redundant vision badge removed from catalog cards; section title
  already reads "Vision-capable models".
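In an Electron (Chromium) renderer, a thin pill scrollbar like the one described is typically done with the WebKit scrollbar pseudo-elements; the selector and colors here are hypothetical:

```css
/* Hypothetical class name; the app's actual selectors will differ. */
.modal-scroll::-webkit-scrollbar {
  width: 5px;
}
.modal-scroll::-webkit-scrollbar-thumb {
  border-radius: 9999px;                 /* pill shape */
  background: rgba(255, 255, 255, 0.25);
}
.modal-scroll::-webkit-scrollbar-track {
  background: transparent;
}
```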

vercel Bot commented Apr 28, 2026

The latest updates on your projects.

Project  Deployment  Actions           Updated (UTC)
flicky   Ready       Preview, Comment  Apr 28, 2026 3:43pm


OllamaManageModal:
- onClick={handlePull} → onClick={() => handlePull()} — prevents
  MouseEvent from being passed as the optional tag parameter,
  fixing TS2322 that was failing the CI typecheck step.
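The failure mode is the standard React one: passing a handler with an optional parameter straight to onClick lets the click event flow into that parameter. A minimal reproduction, with the surrounding component details assumed:

```typescript
// handlePull's optional `tag` selects which model to pull; passed
// directly as onClick, React supplies the click event as the first
// argument, so `tag` becomes a MouseEvent and TypeScript flags TS2322.
const pulled: string[] = [];
function handlePull(tag?: string): void {
  pulled.push(tag ?? "manual-input");
}

// Broken: onClick={handlePull}          → handlePull(mouseEvent)
// Fixed:  onClick={() => handlePull()}  → tag stays undefined
const onClick = () => handlePull();
onClick();
```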

Release:
- package.json version bumped 1.0.3 → 1.2.0 to align with
  sprint/v1.2.0 Ollama local connections feature.
@aaddrriann aaddrriann merged commit c0b5aee into master Apr 28, 2026
6 checks passed
