Conversation

@la3lma commented Jan 19, 2026

Hi.

This is an extension I wrote to be able to use LLMs running locally in LMStudio. It is very useful for avoiding direct dependencies on hyperscalers, obviously.

This isn't a "pretty" PR, but if you are interested in this functionality, I can clean it up into a pretty one. Please say "please try" (or words to that effect) and I will; if not, I won't. In any case: awesome project. Love it!

@benjaminshafii (Member) commented Jan 20, 2026

Automated review (bot): thanks for pushing LMStudio support.

I think the cleanest/most maintainable approach here is to use OpenCode’s native provider system instead of adding a generic http_request bridge in OpenWork.

Why this matters (OpenWork philosophy / parity)

  • OpenWork should stay a thin UI over OpenCode primitives; model execution + provider logic belongs in OpenCode.
  • A generic Tauri HTTP proxy is a large security surface (it accepts arbitrary URLs, headers, and payloads) and duplicates logic OpenCode already has.

What OpenCode already supports

  • OpenCode’s provider catalog already includes an lmstudio provider ID (OpenAI-compatible) with a default API base URL of http://127.0.0.1:1234/v1 and npm: @ai-sdk/openai-compatible.
  • The provider layer already wires baseURL automatically (baseURL falls back to model.api.url) and only adds an Authorization header when an API key exists.
  • OpenCode exposes a native credential endpoint: PUT /auth/:providerID (operationId auth.set). The JS SDK exposes this as client.auth.set({ providerID, auth: { type: 'api', key } }).
  • OpenWork already calls client.provider.list() on connect, so LMStudio should appear automatically when running a recent OpenCode server.
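The credential endpoint above boils down to a single PUT with a small JSON body. A minimal sketch of the request shape it implies (the server base URL and key are placeholders; in practice the JS SDK's client.auth.set wraps this call for you):

```typescript
// Shape of the auth payload per the comment above: { type: 'api', key }.
type ApiAuth = { type: "api"; key: string };

// Build the PUT /auth/:providerID request described above.
// serverBase is a placeholder for wherever the local OpenCode server listens.
function buildAuthSetRequest(serverBase: string, providerID: string, auth: ApiAuth) {
  return {
    method: "PUT" as const,
    url: `${serverBase}/auth/${encodeURIComponent(providerID)}`,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(auth),
  };
}

// Example: store a (local, dummy) key for the lmstudio provider.
const req = buildAuthSetRequest("http://127.0.0.1:4096", "lmstudio", {
  type: "api",
  key: "lm-local",
});
console.log(req.url); // http://127.0.0.1:4096/auth/lmstudio
```

Note that no shell environment variable is involved; the key is handed to OpenCode directly, which is the point of the native flow.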

Suggested direction for this PR

  1. Remove the new http_request command and any direct LMStudio HTTP calls from OpenWork.
  2. Make LMStudio usable via the existing provider flow:
    • Ensure lmstudio shows up in the model picker via provider.list().
    • Add a small “Connect / Configure” affordance:
      • If a key is required, set it via client.auth.set({ providerID: 'lmstudio', auth: { type: 'api', key } }) (no shell env required).
      • If the user changed LMStudio’s port/host, set provider.lmstudio.options.baseURL in opencode.json (or via client.config.update) instead of hardcoding.
  3. If the real issue is that LMStudio is missing from the provider list for some users, that’s an upstream OpenCode version/catalog problem and should be fixed in OpenCode (not in OpenWork).
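For the changed-port/host case in step 2, the opencode.json override would look roughly like this (the baseURL shown is the catalog default from above; a user on a non-default port would substitute their own address):

```json
{
  "provider": {
    "lmstudio": {
      "options": {
        "baseURL": "http://127.0.0.1:1234/v1"
      }
    }
  }
}
```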

Happy to re-review if you pivot the implementation to the native provider path.

@benjaminshafii (Member)

Automated message: this PR adds a general-purpose HTTP tunnel in Tauri, hard-codes LMStudio as the default, and targets the old src/ layout. That diverges from OpenWork’s philosophy (thin UI over OpenCode primitives) and expands the security surface (arbitrary URLs/headers).

Recommendation: use OpenCode’s provider system for LMStudio and avoid a generic HTTP command (or strictly allowlist localhost). Please rebase onto dev and refactor accordingly.
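If the generic HTTP command were kept at all, the "strictly allowlist localhost" option is cheap to enforce before dispatching any request. A sketch of such a check (the hostname set and policy here are illustrative, not existing OpenWork code):

```typescript
// Reject any URL that is not plain HTTP(S) to a loopback host.
// Hostnames listed are an assumption of what "localhost" should mean here.
function isAllowedLocalUrl(raw: string): boolean {
  let url: URL;
  try {
    url = new URL(raw);
  } catch {
    return false; // unparsable input is never allowed
  }
  const loopback = new Set(["localhost", "127.0.0.1", "[::1]"]);
  return (
    (url.protocol === "http:" || url.protocol === "https:") &&
    loopback.has(url.hostname)
  );
}
```

A check like this would confine the tunnel to LMStudio-style local servers, though the native provider path above remains the preferred fix.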
