[Bug]: Unable to use oco with ollama running mistral #310
Comments
@jaroslaw-weber hi man, could you take a look please?
@jagadish-k could you try with other models?
Same for me, even with OCO_MODEL set to mistral.
Hi there, I am also getting this same error. I think the docs should add some more details on setting an ollama model with opencommit. I have this bare-minimum config, and ollama is running fine on port 11434 on localhost:

OCO_AI_PROVIDER='ollama'
OCO_DESCRIPTION=false
OCO_EMOJI=false
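A sketch of that same minimal config with the model pinned explicitly via OCO_MODEL (the setting mentioned in the comment above); the file location and the model name mistral are assumptions based on this report:

```sh
# ~/.opencommit -- assumed location of the opencommit config file
OCO_AI_PROVIDER='ollama'
OCO_MODEL='mistral'        # assumes the model was pulled as `mistral`
OCO_DESCRIPTION=false
OCO_EMOJI=false
```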
Guys, I have found a fix for this. You see, by default ollama listens only on the IPv4 loopback address, while the request from opencommit can resolve localhost to its IPv6 address, so the connection is refused. So to fix the issue you can run ollama on IPv6 too, or, as in my case, configure ollama to listen on all interfaces by setting the OLLAMA_HOST environment variable. @di-sukharev I can enhance the documentation on this by adding a guide for setting this up if you like, so I will submit a PR. I hope this will help, thanks.
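A minimal sketch of that workaround, assuming the stock ollama CLI and its default port 11434:

```sh
# Make ollama listen on all interfaces (IPv4 and IPv6) instead of 127.0.0.1 only
export OLLAMA_HOST=0.0.0.0:11434
ollama serve    # blocks this terminal; run it in its own shell

# Sanity check: both loopback addresses should now answer
curl http://127.0.0.1:11434/api/tags
curl 'http://[::1]:11434/api/tags'
```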
Opencommit Version
3.0.11
Node Version
18.15.0
NPM Version
9.5.0
What OS are you seeing the problem on?
Mac
What happened?
I am unable to use opencommit to generate a commit message for my staged files using locally running ollama.
I get the following error:
Expected Behavior
I expect opencommit to work with a locally running ollama instance.
Current Behavior
I am running ollama with mistral in one terminal:

ollama run mistral

In the other terminal, where I have the staged files, I was able to curl to it.
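The exact curl command was not captured in this report; a typical smoke test against a local ollama instance, assuming the mistral model from above, looks like this:

```sh
# Ask the local ollama server for a one-off completion (non-streaming)
curl http://localhost:11434/api/generate \
  -d '{"model": "mistral", "prompt": "Say hello", "stream": false}'
```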
This is the config I have:
Possible Solution
No response
Steps to Reproduce
No response
Relevant log output