Ywen

Reputation: 621

Conda cannot find some package (pyspark) for environment from YML

I have a .yml file that was saved by a colleague. I cannot recreate an environment from it with conda env create -f file.yml (tried with both Anaconda and Miniconda on Ubuntu, and with the official Docker images of both).

I tried adding - conda-forge to the channels, but that doesn't change anything; I still get:

Collecting pyspark==2.1.1
  Could not find a version that satisfies the requirement pyspark==2.1.1 (from versions: )
No matching distribution found for pyspark==2.1.1

CondaValueError: pip returned an error.

However, installing this package manually with pip or conda works. Here are the contents of the YAML file:

name: stuff
channels:
  - defaults
dependencies:
  - pip=9.0.1=py36_1
  - python=3.6.1=0
  - setuptools=27.2.0=py36_0
  - pip:
    - pyspark==2.1.1

Upvotes: 0

Views: 1142

Answers (1)

sbhle

Reputation: 21

I think Continuum no longer actively develops "conda env", so the recommendation is to use "conda create" directly instead. To share an environment with exact package versions, you can export the active environment with:

conda list --explicit > my_environment.txt

This redirects the explicit package list into a file ("my_environment.txt" in the example). Afterwards you can recreate the environment by giving it a name ("MyEnvironment" in the example below) and passing the exported file to the --file option:

conda create --name MyEnvironment --file my_environment.txt
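For reference, the file produced by conda list --explicit looks roughly like the sketch below (the URLs and build strings are illustrative, not taken from the asker's environment):

```text
# This file may be used to create an environment using:
# $ conda create --name <env> --file <this file>
# platform: linux-64
@EXPLICIT
https://repo.anaconda.com/pkgs/main/linux-64/python-3.6.1-0.tar.bz2
https://repo.anaconda.com/pkgs/main/linux-64/pip-9.0.1-py36_1.tar.bz2
```

One caveat relevant to this question: an explicit spec file lists only conda packages, so pip-installed packages such as pyspark are not captured and would still need to be installed separately with pip.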

Upvotes: 1
