
Ollama not respecting model selection #460

Merged
merged 1 commit into from
Nov 29, 2024

Conversation

dustinwloring1988
Collaborator

Fixes bug with Ollama not respecting model selection

@coleam00
Collaborator

Thank you @dustinwloring1988!

@coleam00 coleam00 merged commit a0ba540 into stackblitz-labs:main Nov 29, 2024
1 check passed
@miyaniakshar1234

[Screenshot (13) attached]

I see that error each time I make a request. I have tried both pnpm and Docker, but it is not resolved.

@dustinwloring1988
Collaborator Author

@miyaniakshar1234 How is Ollama set up, in Docker? Also, did you set the base URL for Ollama in the settings or in .env.local?
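For reference, the base URL can be supplied through a `.env.local` file at the project root; a minimal sketch, assuming the `OLLAMA_API_BASE_URL` variable name used in the project's `.env.example`:

```shell
# .env.local — variable name assumed from the project's .env.example
# Points bolt at a locally running Ollama server (default port 11434).
OLLAMA_API_BASE_URL=http://localhost:11434
```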

@kalyanipujari204

Can you tell me where I can get that Ollama base URL?

@devictang

> Can you tell me where I can get that Ollama base URL?

I think it should be http://localhost:11434/ by default.

But I have the same issue: bolt.diy asks me for an Ollama API key even though I have set the Ollama URL.

I installed bolt from Pinokio.
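As a quick diagnostic, it can help to confirm the Ollama server is actually reachable at that URL before pointing bolt at it; a sketch, assuming a default Ollama install (the `/api/tags` endpoint lists locally pulled models). Note that when bolt runs inside Docker, `localhost` refers to the container itself, not the host:

```shell
# Verify the Ollama server responds (lists locally pulled models)
curl http://localhost:11434/api/tags

# From inside a Docker container, localhost is the container, not the host;
# on Docker Desktop the host's Ollama is usually reachable as:
curl http://host.docker.internal:11434/api/tags
```

If the first command fails on the host, Ollama itself is not running, which would explain the error regardless of bolt's configuration.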

jhillbht pushed a commit to jhillbht/bolt.new-any-llm that referenced this pull request Jan 27, 2025
…ot-respected

Ollama not respecting model selection