Reputation: 7879
After LlamaIndex released v0.10 in February 2024, which introduced many breaking changes to imports, I am trying to update llama-index within a conda environment, but I receive the following error:
> pip install llama-index --upgrade
ERROR: Cannot install llama-index-cli because these package versions have conflicting dependencies.
The conflict is caused by:
llama-index-vector-stores-chroma 0.1.4 depends on onnxruntime<2.0.0 and >=1.17.0
llama-index-vector-stores-chroma 0.1.3 depends on onnxruntime<2.0.0 and >=1.17.0
llama-index-vector-stores-chroma 0.1.2 depends on onnxruntime<2.0.0 and >=1.17.0
To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip attempt to solve the dependency conflict
ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
I have tried pip install llama-index-vector-stores-chroma but get the same error. I have also tried installing onnxruntime but get this error:
pip install onnxruntime
ERROR: Could not find a version that satisfies the requirement onnxruntime (from versions: none)
ERROR: No matching distribution found for onnxruntime
How can I update llama-index?
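For reference, since onnxruntime only publishes wheels for specific Python versions and platforms, a quick standard-library check of what pip is resolving against can help diagnose "No matching distribution found" (a minimal diagnostic sketch):
import sys
import platform

print(sys.version)          # interpreter version pip matches wheels against
print(platform.machine())   # CPU architecture, e.g. x86_64 or arm64
print(platform.platform())  # OS and platform details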
Upvotes: 2
Views: 1831
Reputation: 1
Me too. I get:
ERROR: Could not find a version that satisfies the requirement llama-index-vector-stores (from versions: none)
ERROR: No matching distribution found for llama-index-vector-stores
Upvotes: -1
Reputation: 817
I had the same issue, and it took me four hours working with the LlamaIndex folks to get it working.
In the terminal, deactivate conda and create a fresh virtual environment:
conda deactivate
python -m venv .venv
source .venv/bin/activate
Then install the pinned dependencies, bypassing the pip cache:
SYSTEM_VERSION_COMPAT=0 pip install --no-cache-dir "onnxruntime>=1.17.1"
SYSTEM_VERSION_COMPAT=0 pip install --no-cache-dir "llama-index-core>=0.10.13"
SYSTEM_VERSION_COMPAT=0 pip install --no-cache-dir "llama-index-vector-stores-chroma>=0.1.4"
Finally, run pip install llama-index. It should automatically install the newest version, 0.10.13.post1 as of 02/27/2024.
As for code, with the new update, the old import
from llama_index.core.llms import ChatMessage, ChatResponse
has now moved to
from llama_index.core.base.llms.types import ChatMessage, ChatResponse
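For example, a minimal sketch with the new import path (assuming llama-index-core >= 0.10 is installed; the role and content values are just illustrative):
from llama_index.core.base.llms.types import ChatMessage

# Build a message with the relocated ChatMessage class; role accepts a
# plain string such as "user", which is coerced to the MessageRole enum.
msg = ChatMessage(role="user", content="What changed in v0.10?")
print(msg.role, msg.content)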
Upvotes: 0
Reputation: 391
pip install onnxruntime --upgrade
This installs onnxruntime 1.16.3 or greater and replaces the older version that causes this issue.
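To confirm the upgrade took effect, you can print the installed version (onnxruntime exposes __version__):
import onnxruntime

# Confirm the newer onnxruntime replaced the version causing the conflict.
print(onnxruntime.__version__)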
Upvotes: 0
Reputation: 7879
I was able to solve this with the following workaround:
conda install onnxruntime -c conda-forge
I got it from this thread: github.com/microsoft/onnxruntime/issues/11037
If there is a reason not to use conda-forge, please leave a comment.
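To verify that the conda-forge build imports and loads its native runtime, something like the following should work (get_available_providers is part of onnxruntime's public API):
import onnxruntime

# Listing execution providers forces the native library to load, so this
# fails loudly if the installed build is broken for your platform.
print(onnxruntime.get_available_providers())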
Upvotes: 1