Merged
zakiali
reviewed
Aug 26, 2024
src/exchange/providers/ollama.py
Outdated
    # NOTE: this is experimental, best used with 70B model or larger if you can
Collaborator
Since this is experimental, we should flag it as such and raise a warning when it is instantiated, with something like "This is an experimental provider and support may be dropped in the future".
Collaborator
Author
Yep, please take a look again.
zakiali
reviewed
Aug 27, 2024
src/exchange/providers/ollama.py
Outdated
| """ | ||
|
|
||
| def __init__(self, client: httpx.Client) -> None: | ||
| print('PLEASE NOTE: this is an experimental provider, use with care') |
Collaborator
Maybe just mention here that OllamaProvider is the provider referenced, since this happens under the hood in goose and might not be obvious.
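A minimal sketch of what the reviewer is suggesting: have the warning name `OllamaProvider` explicitly, so users of goose can tell which provider emitted it. This is illustrative only; the `client` parameter is typed loosely here to keep the example dependency-free (the real provider takes an `httpx.Client`), and `warnings.warn` is used in place of the `print` shown in the diff as one possible way to flag experimental status.

```python
import warnings


class OllamaProvider:
    """Experimental provider for a local Ollama server (illustrative sketch)."""

    def __init__(self, client: object) -> None:
        # Name the class in the message so the source of the warning is
        # obvious even when instantiation happens under the hood in goose.
        warnings.warn(
            "OllamaProvider is an experimental provider; "
            "support may be dropped in the future.",
            UserWarning,
            stacklevel=2,
        )
        self.client = client
```

Using `warnings.warn` rather than `print` also lets downstream callers silence or escalate the notice with the standard `warnings` filters.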
zakiali
approved these changes
Aug 27, 2024
lukealvoeiro
pushed a commit
that referenced
this pull request
Sep 2, 2024
lukealvoeiro
added a commit
that referenced
this pull request
Sep 2, 2024
* main:
  - fix typos found by PyCharm (#21)
  - added retry when sending httpx request to LLM provider apis (#20)
  - chore: version bump to `0.8.2` (#19)
  - fix: don't always use ollama provider (#18)
  - fix: export `metadata.plugins` export should have a valid value (#17)
  - Create an entry-point for `ai-exchange` (#16)
  - chore: Run tests for python >=3.10 (#14)
  - Update pypi_release.yaml (#13)
  - ollama provider (#7)
  - chore: gitignore generated lockfile (#8)
codefromthecrypt
pushed a commit
to codefromthecrypt/exchange
that referenced
this pull request
Oct 13, 2024
In theory ollama can work, for example:
It works better with the larger models, though, if you have a beefy enough machine to run them locally.