We want to install a specific version of pyspark (==2.4.7). The issue is that this version requires pypandoc < 1.8 at build time. Moreover, pyspark must be built from source on installation. We pin pypandoc explicitly in the dependencies:

dependencies = ["pypandoc==1.5", "pyspark==2.4.7"]
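For context, this is roughly how the relevant part of our pyproject.toml looks (project name and metadata below are placeholders, not our real values):

```toml
# hypothetical minimal pyproject.toml reproducing the setup;
# "example-project" and the version are placeholders
[project]
name = "example-project"
version = "0.1.0"
requires-python = "==3.12.5"
dependencies = [
    "pypandoc==1.5",
    "pyspark==2.4.7",
]
```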
However, this pin does not carry over to the isolated environment in which pyspark is built during installation, as the build log shows:
Could not import pypandoc - required to package PySpark
C:\Users\foo\AppData\Local\Temp\pdm-build-env-avlow5go-shared\Lib\site-packages\setuptools\dist.py:452: SetuptoolsDeprecationWarning: Invalid dash-separated options
!!
********************************************************************************
Usage of dash-separated 'description-file' will not be supported in future
versions. Please use the underscore name 'description_file' instead.
This deprecation is overdue, please update your project and remove deprecated
calls to avoid build errors in the future.
See
https://setuptools.pypa.io/en/latest/userguide/declarative_config.html
for details.
********************************************************************************
!!
opt = self.warn_dash_deprecation(opt, section)
======== Start resolving requirements ========
Adding requirement python==3.12.5
Adding requirement pypandoc
======== Resolution Result ========
python None
pypandoc 1.13
Fetching hashes for pypandoc@1.13
Installing pypandoc@1.13...
The same error happens on an Ubuntu machine, and with poetry instead of pdm.
Is there a way to pin the version of this transitive build-time dependency in this case?