Reputation: 720
I'm trying to build a simple RAG pipeline, and I'm stuck on this code:
from langchain.embeddings.huggingface import HuggingFaceEmbeddings
from llama_index import LangchainEmbedding, ServiceContext, VectorStoreIndex

embed_model = LangchainEmbedding(
    HuggingFaceEmbeddings(model_name="thenlper/gte-large")
)
service_context = ServiceContext.from_defaults(
    chunk_size=256,
    llm=llm,
    embed_model=embed_model
)
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
where I get ImportError: cannot import name 'LangchainEmbedding' from 'llama_index'. How can I solve it? Is it related to the fact that I'm working on Colab?
Upvotes: 3
Views: 18925
Reputation: 1
Resolution for the ImportError: 'LangchainEmbedding' from 'llama_index'
If you’re encountering the error:
ImportError: cannot import name 'LangchainEmbedding' from 'llama_index'
This issue occurs because the import path for LangchainEmbedding changed in recent versions of llama_index. To resolve it, update your import statement to the new module path.
Instead of:
from llama_index import LangchainEmbedding
Use:
from llama_index.embeddings.langchain import LangchainEmbedding
This new import path reflects the updated structure of the llama_index library.
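For completeness, here is how that import slots into the snippet from the question; this is just a sketch assuming a llama_index release that ships the llama_index.embeddings.langchain module (on 0.10+ you also need to install the llama-index-embeddings-langchain package):
from langchain.embeddings.huggingface import HuggingFaceEmbeddings
from llama_index.embeddings.langchain import LangchainEmbedding

# Wrap the LangChain embedding so llama_index can use it as an embed_model
embed_model = LangchainEmbedding(
    HuggingFaceEmbeddings(model_name="thenlper/gte-large")
)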
Upvotes: 0
Reputation: 142983
INFO 2024:
This answer worked in 2023, when the question was asked, but it seems they have since moved the code again, so now you have to use the other answers.
Not
from llama_index import LangchainEmbedding
but
from llama_index.embeddings import LangchainEmbedding
(See source code for llama_index/embeddings/__init__.py)
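For reference, a minimal sketch of the full flow with that import, assuming a 2023-era (pre-0.10) llama_index where ServiceContext still exists, and with llm and documents defined as in the question:
from langchain.embeddings.huggingface import HuggingFaceEmbeddings
from llama_index import ServiceContext, VectorStoreIndex
from llama_index.embeddings import LangchainEmbedding

embed_model = LangchainEmbedding(
    HuggingFaceEmbeddings(model_name="thenlper/gte-large")
)

# llm and documents are assumed to be defined elsewhere, as in the question
service_context = ServiceContext.from_defaults(
    chunk_size=256,
    llm=llm,
    embed_model=embed_model,
)
index = VectorStoreIndex.from_documents(documents, service_context=service_context)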
Upvotes: 10
Reputation: 1
There has been an update: install the LangChain embedding integration separately
!pip install llama-index-embeddings-langchain
Then
from llama_index.embeddings.langchain import LangchainEmbedding
This worked for me. Check this for more ...
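To tie this back to the question's code, here is a rough sketch under the 0.10+ package layout; it assumes the global Settings object (which replaced ServiceContext), that llm and documents are defined as in the question, and that the old langchain.embeddings.huggingface import still resolves on your LangChain version:
!pip install llama-index llama-index-embeddings-langchain

from langchain.embeddings.huggingface import HuggingFaceEmbeddings  # may live in langchain_community on newer LangChain
from llama_index.core import Settings, VectorStoreIndex
from llama_index.embeddings.langchain import LangchainEmbedding

# Configure the global Settings object, which replaces ServiceContext in 0.10+
Settings.embed_model = LangchainEmbedding(
    HuggingFaceEmbeddings(model_name="thenlper/gte-large")
)
Settings.llm = llm          # llm assumed defined as in the question
Settings.chunk_size = 256

index = VectorStoreIndex.from_documents(documents)  # documents as in the question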
Upvotes: 0
Reputation: 3035
The other answers didn't work for me while the one below did.
from llama_index.legacy.embeddings.langchain import LangchainEmbedding
Upvotes: 9
Reputation: 21
With reference to the above code, this is what worked for me:
from langchain.embeddings.huggingface import HuggingFaceEmbeddings
from llama_index import ServiceContext, set_global_service_context
embed_model = HuggingFaceEmbeddings(
    model_name="thenlper/gte-large"
)
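To round this out, a sketch of how the imported set_global_service_context would be used with that embed_model; it assumes a pre-0.10 llama_index that accepts a LangChain embedding directly (as this answer reports), and llm and documents defined as in the question:
from llama_index import VectorStoreIndex  # as in the question

# Wrap the embedding in a service context and register it globally,
# so indexes pick it up by default
service_context = ServiceContext.from_defaults(
    chunk_size=256,
    llm=llm,                 # llm assumed defined as in the question
    embed_model=embed_model,
)
set_global_service_context(service_context)

index = VectorStoreIndex.from_documents(documents)  # documents as in the question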
Upvotes: 2
Reputation: 21
For a problem with
from llama_index import LangchainEmbedding
or
from llama_index.embeddings import LangchainEmbedding
It looks like the syntax has been updated yet again, see:
https://github.com/run-llama/llama_index/blob/main/llama_index/embeddings/__init__.py
As of 2023 Dec 7th, it says:
from llama_index.embeddings.langchain import LangchainEmbedding
Today that works...tomorrow...a brave new world.
Upvotes: 2
Reputation: 31
This is the answer that works now: replace LangchainEmbedding with HuggingFaceEmbeddings. It is working for me.
from langchain.embeddings import HuggingFaceEmbeddings
from llama_index import ServiceContext, set_global_service_context
embed_model = HuggingFaceEmbeddings(
    model_name="sentence-transformers/all-mpnet-base-v2"
)
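As in the earlier HuggingFaceEmbeddings answer, the imported set_global_service_context is what connects this to the question's index; a rough sketch, assuming a pre-0.10 llama_index and llm and documents defined as in the question, with a placeholder query at the end:
from llama_index import VectorStoreIndex  # as in the question

service_context = ServiceContext.from_defaults(
    llm=llm,                 # llm assumed defined as in the question
    embed_model=embed_model,
)
set_global_service_context(service_context)

# Build the index and run a placeholder query against it
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
response = query_engine.query("What are these documents about?")
print(response)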
Upvotes: 3