Reputation: 431
I ran into this problem while trying to import the libraries below; the import fails with "ImportError: cannot import name 'VectorStoreIndex' from 'llama_index' (unknown location)".
I ran this exact same code in the morning and it worked perfectly.
I did !pip install llama_index
from llama_index import VectorStoreIndex, SimpleDirectoryReader, ServiceContext
from llama_index.llms import HuggingFaceLLM
from llama_index.prompts.prompts import SimpleInputPrompt
I tried commenting out the first line and hit the same issue with the HuggingFaceLLM import. Likewise for SimpleInputPrompt, which raised "ModuleNotFoundError: No module named 'llama_index.prompts'".
I first hit the problem in a SageMaker notebook, so I assumed the issue was with that notebook and spun up a clean new one, but got the same error. I then tried the code in my local Jupyter notebook, a Google Colab notebook, and a SageMaker Studio Lab notebook, and got the same error everywhere.
Upvotes: 16
Views: 39532
Reputation: 1
Adding .legacy worked for me:
from llama_index.legacy import VectorStoreIndex
Upvotes: 0
Reputation: 1
I think this error is caused by the pip installation.
I encountered the same error, and I found it could be fixed simply by reinstalling.
Check whether the error still occurs after removing all llama-* related packages and reinstalling llama-index:
pip uninstall llama-index
pip cache purge
pip install llama-index
The pip cache purge step ensures pip does not reinstall a previously cached package.
If you still get the same import error, try an alternative installation method, such as installing from git.
This documentation may help:
https://docs.llamaindex.ai/en/stable/getting_started/installation.html
Upvotes: 0
Reputation: 170
As mentioned by other users, the library was recently updated. While the suggestion from lat still works, it is deprecated.
Instead, you should use the Settings import as described here. If your old code looks like this:
from llama_index import ServiceContext, set_global_service_context
service_context = ServiceContext.from_defaults(
    llm=llm, embed_model=embed_model, chunk_size=512
)
set_global_service_context(service_context)
You should update it to be the following:
from llama_index.core import Settings
Settings.llm = llm
Settings.embed_model = embed_model
Settings.chunk_size = 512
Upvotes: 1
Reputation: 19
First, you need to install:
!pip install llama_index
!pip install llama-index-llms-huggingface
Then, as others have mentioned, update the import statements:
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, ServiceContext
from llama_index.llms.huggingface import HuggingFaceLLM
from llama_index.core.prompts.prompts import SimpleInputPrompt
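As a quick reference, the old-to-new paths in these imports can be summarized in a plain mapping. This is just a summary of the lines above for anyone grepping their own code, not an official llama-index API:

```python
# Pre-0.10 import path -> post-0.10 import path, per the answer above.
IMPORT_MIGRATION = {
    "llama_index.VectorStoreIndex":
        "llama_index.core.VectorStoreIndex",
    "llama_index.llms.HuggingFaceLLM":
        "llama_index.llms.huggingface.HuggingFaceLLM",
    "llama_index.prompts.prompts.SimpleInputPrompt":
        "llama_index.core.prompts.prompts.SimpleInputPrompt",
}

for old, new in IMPORT_MIGRATION.items():
    print(f"{old} -> {new}")
```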
Upvotes: -1
Reputation: 13
from llama_index.core.indices.vector_store.base import VectorStoreIndex
worked for me
Source: https://docs.llamaindex.ai/en/stable/api_reference/indices/vector_store.html
Upvotes: 1
Reputation: 51
LlamaIndex frequently reorganizes its module directories.
The module you are searching for is:
from llama_index.core import VectorStoreIndex
And for a specific vector store, using Chroma as an example, you need to install:
pip install llama-index-vector-stores-chroma
and import it as follows:
from llama_index.vector_stores.chroma import ChromaVectorStore
Source: https://docs.llamaindex.ai/en/stable/examples/vector_stores/chroma_metadata_filter.html
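Because the Chroma integration ships as a separate pip package, a quick way to confirm it actually installed is to check importability before running anything. This is a small diagnostic sketch for this thread, not part of llama-index:

```python
import importlib.util

def has_chroma_integration() -> bool:
    """Return True if llama_index.vector_stores.chroma is importable."""
    try:
        # find_spec imports the parent packages, so guard against
        # llama-index itself being absent.
        return importlib.util.find_spec("llama_index.vector_stores.chroma") is not None
    except ModuleNotFoundError:
        return False

print(has_chroma_integration())
```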
Upvotes: 5
Reputation: 431
The llama-index library was recently updated, so I was able to solve the issue by updating the imports according to the documentation:
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, ServiceContext, PromptTemplate
from llama_index.llms.huggingface import HuggingFaceLLM
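If you are unsure which package layout your environment has, a small check can tell you which import spelling to use without crashing on ImportError. This is a hypothetical helper written for this thread; it only inspects what is importable and calls no llama-index API:

```python
import importlib.util

def suggest_import(name: str = "VectorStoreIndex") -> str:
    """Suggest the import statement matching the installed llama-index layout."""
    try:
        if importlib.util.find_spec("llama_index.core") is not None:
            return f"from llama_index.core import {name}"  # v0.10+ layout
    except ModuleNotFoundError:
        return "llama-index is not installed; run: pip install llama-index"
    # llama_index exists but has no .core submodule: pre-0.10 layout
    return f"from llama_index import {name}"

print(suggest_import())
```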
Upvotes: 26