Factor Three

Reputation: 2284

How to get a list of models available in Ollama using Langchain

I am trying to run a Python script that gets and prints a list of the models that are available to a running instance of Ollama. My code, based on code provided at https://medium.com/@garysvenson09/how-to-list-all-models-in-ollama-for-langchain-tutorial-786cb14298a8, is provided below:

from langchain import Ollama

ollama_client = Ollama()
model_list = ollama_client.list_models()
for model in model_list:
    print(f"Model Name: {model.name}, Version: {model.version}, Description: {model.description}")

The problem is that when I run the script, I get the following error:

Traceback (most recent call last):
  File "C:\Research\Project 39\langtest1\Test1\firsttest.py", line 2, in <module>
    from langchain import Ollama
ImportError: cannot import name 'Ollama' from 'langchain' (C:\Research\Project 39\langtest1\Test1\venv\Lib\site-packages\langchain\__init__.py)

Process finished with exit code 1

Obviously, the code I am using is flawed.

How do I get a list of the available models from Ollama?

Upvotes: 1

Views: 1052

Answers (2)

Factor Three

Reputation: 2284

I solved the problem myself by writing a Python function that queries Ollama's REST API for its list of models.

The code I used is provided below, for reference by anyone else who runs into the same problem:

import requests

OLLAMA_URL = "<the URL for Ollama>"

def get_models() -> list:
    # Ollama's REST API lists the locally installed models at /api/tags
    response = requests.get(OLLAMA_URL + "/api/tags")
    jsondata = response.json()
    result = list()

    # Each entry in "models" carries the model identifier under "model"
    for model in jsondata["models"]:
        result.append(model["model"])

    return result

The function returns a list of the model names that your Ollama instance currently has installed.
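For anyone inspecting the response by hand, /api/tags returns JSON shaped roughly like the following trimmed sketch (the model names here are hypothetical placeholders, and real entries carry additional fields such as "modified_at" and "digest"):

```python
import json

# A trimmed, hypothetical example of the JSON that /api/tags returns.
sample_response = """
{
  "models": [
    {"name": "llama3:latest", "model": "llama3:latest", "size": 4661224676},
    {"name": "mistral:latest", "model": "mistral:latest", "size": 4113301824}
  ]
}
"""

# The same extraction step the function above performs on the live response.
jsondata = json.loads(sample_response)
names = [model["model"] for model in jsondata["models"]]
print(names)  # ['llama3:latest', 'mistral:latest']
```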

Upvotes: 0

rsheng

Reputation: 11

Same question: ImportError: cannot import name 'Ollama' from 'langchain.llms'

LangChain has moved all LLM integrations into their own packages. You need to install the Ollama integration with pip install -U langchain-ollama. Then you can import it from langchain_ollama without any issues.

The official page: reference

Upvotes: 1
