Conversation
src/app/configurations/page.tsx
| "phi3:3.8b", | ||
| "phi3:14b", | ||
| ], | ||
| openrouter: [ |
Will need to refactor this after my PR goes in.
Also means pricing metrics for these open router models will not be supported
Oh, let's fix that. How can we do it?
TLDR: option 3.
Option 1: we duplicate the entire object body for each model configuration onto the model that is available as a proxy via OpenRouter.
- pro: no further refactors from my PR
- con: repeated config, which only works if the provider's model and the proxied model are identical in price.
Option 2: we instead put an attribute on the object body of models that have availability via proxies
- pro: avoids repeated config
- con: breaks separation of concerns if the configs do differ in price between the provider's model and the proxied model
Option 3: we split the provider/model list config away from the config details, and instead reference the config in the original provider/model and repeat that reference in the proxies (OpenRouter).
- pro: can assign the default model config directly to the source provider's offering, then `{ ...spread, override: "specific values" }` (e.g. price) if they differ slightly, or keep the identical default model config if they're the same
- con: requires a further change to our model config layout (not an issue)
Conclusion: we go with option 3. Easier to maintain. And the pricing appears to be the same for both the provider's and the proxy's offering anyway.
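For illustration, a minimal TypeScript sketch of option 3 (all names here are hypothetical, not the actual CursorLens config shape): shared model configs are defined once, the source provider references them directly, and the OpenRouter entries spread them in, overriding only the fields that differ.

```typescript
// Hypothetical shape for per-model pricing config (not the real CursorLens types).
interface ModelConfig {
  inputCostPerMillion: number;  // USD per 1M input tokens
  outputCostPerMillion: number; // USD per 1M output tokens
}

// Single source of truth for each model's default config.
const modelConfigs: Record<string, ModelConfig> = {
  "claude-3.5-sonnet": { inputCostPerMillion: 3, outputCostPerMillion: 15 },
  "phi3:14b": { inputCostPerMillion: 0.14, outputCostPerMillion: 0.14 },
};

// Providers reference the shared config; proxies spread it and override
// only the values that differ (prices below are placeholders).
const providers: Record<string, Record<string, ModelConfig>> = {
  anthropic: {
    "claude-3.5-sonnet": modelConfigs["claude-3.5-sonnet"],
  },
  openrouter: {
    // Identical pricing: reuse the default config as-is.
    "anthropic/claude-3.5-sonnet": { ...modelConfigs["claude-3.5-sonnet"] },
    // Pricing differs slightly: spread the default, override one field.
    "microsoft/phi3:14b": {
      ...modelConfigs["phi3:14b"],
      inputCostPerMillion: 0.15,
    },
  },
};

console.log(providers.openrouter["microsoft/phi3:14b"].inputCostPerMillion);  // 0.15
console.log(providers.openrouter["microsoft/phi3:14b"].outputCostPerMillion); // 0.14
```

This keeps one config object per model, so a price change only ever happens in one place unless the proxy genuinely charges differently.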
OpenAI pricing is the same too: no additional markup for using it via the OpenRouter (OR) proxy.
And other models, like Nous Hermes, don't offer their own hosted service, so those prices are whatever OR sets.
There are other proxies offering a similar service to OR, like AI/ML API:
https://aimlapi.com/models/nous-hermes-2-mistral-dpo-7b?43d56c14_page=2&6c9afc24_page=19
Their prices are higher than OR's, so I see no point in considering them.
Already wrote it on X/Twitter, but tested it and it worked 👍 Could use Sonnet through OpenRouter. Great job, man :)
Oh, that's interesting! I'll debug and see what's wrong. I'd love to have Deepseek supported.
Yeah, I'd love to use Deepseek-Coder rather than Sonnet 3.5: similar performance but really 20x cheaper. Edit: that's on OpenRouter's side. The model can be used when you allow them to use your input data for model training.
So it's fixed now? @tm17-abcgen
yes
…config/stats to use it

Use any of the OpenRouter models with CursorLens, including `claude-3.5-sonnet` (by default, Cursor doesn't support it when it's routed through OpenRouter).