
[Bug]: Unable to use oco with ollama running mistral #310

Open
jagadish-k opened this issue Mar 14, 2024 · 5 comments
Labels
bug Something isn't working

Comments

@jagadish-k

Opencommit Version

3.0.11

Node Version

18.15.0

NPM Version

9.5.0

What OS are you seeing the problem on?

Mac

What happened?

I am unable to use opencommit to generate a commit message for my staged files using a locally running ollama.

I get the following error:

 ✖ local model issues. details: connect ECONNREFUSED ::1:11434

Expected Behavior

I expect opencommit to work with a locally running ollama.

Current Behavior

I am running ollama with mistral in one terminal:

ollama run mistral

In another terminal, where I have staged files, I was able to curl it:

curl http://127.0.0.1:11434
Ollama is running%

This is the config I have:

OCO_OPENAI_API_KEY=undefined
OCO_TOKENS_MAX_INPUT=undefined
OCO_TOKENS_MAX_OUTPUT=undefined
OCO_OPENAI_BASE_PATH=undefined
OCO_DESCRIPTION=false
OCO_EMOJI=false
OCO_MODEL=gpt-3.5-turbo-16k
OCO_LANGUAGE=en
OCO_MESSAGE_TEMPLATE_PLACEHOLDER=$msg
OCO_PROMPT_MODULE=conventional-commit
OCO_AI_PROVIDER=ollama

Possible Solution

No response

Steps to Reproduce

No response

Relevant log output

> OCO_AI_PROVIDER='ollama' opencommit
┌  open-commit
│
◇  30 staged files:
...
◇  📝 Commit message generated
│
└  ✖ local model issues. details: connect ECONNREFUSED ::1:11434
jagadish-k added the bug label Mar 14, 2024
@di-sukharev
Owner

@jaroslaw-weber hi man, could you take a look please?

@di-sukharev
Owner

@jagadish-k could you try with a different OCO_MODEL config? Right now it's set to gpt-3.5-turbo-16k, which is not mistral.
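
For example, a minimal sketch of that change (oco config set is opencommit's documented way to update settings; the model name assumes mistral is what you have pulled in ollama):

# either edit the global config file ~/.opencommit directly:
OCO_MODEL=mistral

# or set it via the CLI:
oco config set OCO_MODEL=mistral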

@SebastienElet

Same for me, even with OCO_MODEL set to mistral:

❯ OCO_AI_PROVIDER="ollama" OCO_MODEL=mistral opencommit
┌  open-commit
│
◇  2 staged files:
  .zshrc
  Makefile
│
◇  📝 Commit message generated
│
└  ✖ local model issues. details: connect ECONNREFUSED ::1:11434

@Abir-Tx

Abir-Tx commented Apr 9, 2024

Hi there, I am also getting this same error. I think the docs should add more detail on setting up an ollama model with open-commit.

I have this bare-minimum config, and ollama is running fine on port 11434 on localhost:

OCO_AI_PROVIDER='ollama'
OCO_DESCRIPTION=false
OCO_EMOJI=false

@Abir-Tx

Abir-Tx commented Apr 9, 2024

Guys, I have found a fix for this. oco is requesting the ollama API on the IPv6 loopback address (::1), and by default ollama does not listen on IPv6.

So to fix the issue, you can run ollama on IPv6 too, or, as in my case, configure ollama to listen on all interfaces by setting the OLLAMA_HOST='0.0.0.0' env variable.
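
A rough sketch of that workaround (the exact restart steps depend on how ollama was installed; OLLAMA_HOST is the environment variable the ollama server reads at startup):

# stop any running ollama server, then restart it bound to all interfaces
OLLAMA_HOST=0.0.0.0 ollama serve

# from another terminal, confirm the server answers
curl http://localhost:11434
# Ollama is running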

@di-sukharev I can enhance the documentation by adding a guide for setting this up; if you like, I will submit a PR.

I hope this helps, thanks.
