Jatin Chawla

Reputation: 11

TypeError python ollama

from langchain_community.llms import ollama
llm=ollama(base_url='http://localhost:11434',model="llama 2")

I'm encountering a TypeError: 'module' object is not callable error while attempting to use the Ollama LLM in VS Code on my Windows machine. The Ollama server is running successfully on localhost:11434, as verified through my terminal.

My Ollama LLM works properly in the terminal but raises this error when called from the code above. Please help with this query.

Upvotes: 1

Views: 791

Answers (2)

IZZYACADEMY.AI

Reputation: 1

@nitgeek's answer is correct: the name needs to be uppercase. But you may also need to install the dependency so that it is available to your code.

https://github.com/langchain-ai/langchain/blob/master/libs/community/langchain_community/llms/ollama.py#L398

https://pypi.org/project/langchain-community/

Also make sure that you have the langchain-community package installed via pip.
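One way to confirm the dependency is present before importing it is to probe for it with the standard library; this sketch (the helper name is my own) prints a hint if the package is missing:

```python
import importlib.util

def is_installed(package: str) -> bool:
    """Return True if `package` can be imported in this environment."""
    return importlib.util.find_spec(package) is not None

# Check before importing; print an install hint if the dependency is missing.
if not is_installed("langchain_community"):
    print("Missing dependency: pip install langchain-community")
```

Note that the pip package name uses a hyphen (langchain-community) while the import name uses an underscore (langchain_community).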

Upvotes: 0

nitgeek

Reputation: 1258

I faced the same issue once. The problem is that the "O" in Ollama must be capitalized in both the import and the usage. The following should work:

from langchain_community.llms import Ollama
llm = Ollama(model="llama2")
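The lowercase ollama refers to the module itself rather than the Ollama class inside it, and calling a module is exactly what raises this TypeError. A minimal illustration using a standard-library module (no langchain needed):

```python
# Calling a module instead of a class reproduces the same error.
import math  # `math` is a module object, not a callable class

try:
    math()  # analogous to calling the lowercase `ollama` module
except TypeError as exc:
    print(exc)  # the message says the module object is not callable
```

With the capitalized import, Ollama is the class, so Ollama(model="llama2") constructs an instance as expected.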

Upvotes: 0
