Path: /integrations/libraries/cursor
After following the instructions and adding the config to my API key, other models stopped working and I got this response:
"body": {
  "error": {
    "message": "This is not a chat model and thus not supported in the v1/chat/completions endpoint. Did you mean to use v1/completions?",
    "type": "invalid_request_error",
    "param": "model",
    "code": null
  }
},
Also, it would be nice if I could access all the models configured in Portkey from Cursor, without specifying a single model in the config. Loading multiple models would be useful.
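For reference, the error above suggests the model attached to the config is a completions-only model rather than a chat model. A minimal sketch of what I'd expect the config to look like, assuming Portkey's config schema with `override_params` and fallback `targets` (the model names here are just examples, not what my config actually contains):

```json
{
  "strategy": { "mode": "fallback" },
  "targets": [
    {
      "virtual_key": "openai-key",
      "override_params": { "model": "gpt-4o" }
    },
    {
      "virtual_key": "anthropic-key",
      "override_params": { "model": "claude-3-5-sonnet-20241022" }
    }
  ]
}
```

Even with a fallback list like this, Cursor still only sees one logical model per config, which is why being able to list all Portkey models in Cursor would help.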