Reputation: 23512
When I do a pip freeze I see a large number of Python packages that I didn't explicitly install, e.g.
$ pip freeze
Cheetah==2.4.3
GnuPGInterface==0.3.2
Landscape-Client==11.01
M2Crypto==0.20.1
PAM==0.4.2
PIL==1.1.7
PyYAML==3.09
Twisted-Core==10.2.0
Twisted-Web==10.2.0
(etc.)
Is there a way for me to determine why pip installed these particular dependent packages? In other words, how do I determine the parent package that had these packages as dependencies?
For example, I might want to use Twisted, but I don't want to uninstall or upgrade a package until I know whether something else depends on it.
Upvotes: 309
Views: 194830
Reputation: 5834
You could try pipdeptree, which displays dependencies as a tree structure, e.g.:
$ pipdeptree
Lookupy==0.1
wsgiref==0.1.2
argparse==1.2.1
psycopg2==2.5.2
Flask-Script==0.6.6
- Flask [installed: 0.10.1]
- Werkzeug [required: >=0.7, installed: 0.9.4]
- Jinja2 [required: >=2.4, installed: 2.7.2]
- MarkupSafe [installed: 0.18]
- itsdangerous [required: >=0.21, installed: 0.23]
alembic==0.6.2
- SQLAlchemy [required: >=0.7.3, installed: 0.9.1]
- Mako [installed: 0.9.1]
- MarkupSafe [required: >=0.9.2, installed: 0.18]
ipython==2.0.0
slugify==0.0.1
redis==2.9.1
To install it, run:
pip install pipdeptree
As noted by @Esteban in the comments, you can also list the tree in reverse with -r, or for a single package with -p <package_name>. So to find which module(s) Werkzeug is a dependency for, you could run:
$ pipdeptree -r -p Werkzeug
Werkzeug==0.11.15
- Flask==0.12 [requires: Werkzeug>=0.7]
Upvotes: 408
Reputation: 4276
pipx install pip-tools
pip-compile --output-file=- requirements.txt
Starting with a requirements.txt whose contents are just
Django
this results in
#
# This file is autogenerated by pip-compile with Python 3.12
# by the following command:
#
# pip-compile --output-file=- requirements.txt
#
asgiref==3.8.1
# via django
django==5.1.2
# via -r requirements.txt
sqlparse==0.5.1
# via django
Unlike pipdeptree, this works inside a virtualenv out of the box. Supposedly pipdeptree can also be told to use the virtualenv's Python by calling it as pipdeptree --python auto, but I haven't always had consistent luck with that.
Upvotes: 0
Reputation: 5512
I wrote a quick script to solve this problem. The script below displays the parent (dependant) package(s) for any given package, so you can be sure it is safe to upgrade or uninstall a particular package. It can be used as follows: dependants.py PACKAGENAME
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""Find dependants of a Python package"""
import logging
import sys

import pkg_resources  # tested with Python 2.7, 3.6 and 3.10

__program__ = 'dependants.py'


def get_dependants(target_name):
    for package in pkg_resources.working_set:
        for requirement_package in package.requires():
            requirement_name = requirement_package.key
            # requirement keys are normalized to lowercase
            if requirement_name == target_name.lower():
                yield package.project_name


# configure logging
logging.basicConfig(format='%(levelname)s: %(message)s',
                    level=logging.INFO)

try:
    target_name = sys.argv[1]
except IndexError:
    logging.error('missing package name')
    sys.exit(1)

try:
    pkg_resources.get_distribution(target_name)
except pkg_resources.DistributionNotFound:
    logging.error("'%s' is not a valid package", target_name)
    sys.exit(1)

print(list(get_dependants(target_name)))
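Note that pkg_resources is deprecated in recent setuptools releases; on Python 3.8+ the same reverse lookup can be sketched with the standard library's importlib.metadata instead (dependants_of is my own name, and the requirement-string parsing here is a rough approximation rather than a full PEP 508 parser):

```python
import importlib.metadata
import re


def dependants_of(target_name):
    """Return names of installed distributions whose requirements include target_name."""
    target = target_name.lower().replace("_", "-")
    found = set()
    for dist in importlib.metadata.distributions():
        for req in (dist.requires or []):
            # Requires-Dist strings look like "werkzeug>=0.7" or "idna; extra == 'x'"
            name = re.split(r"[\s;<>=!~\[(]", req, maxsplit=1)[0]
            if name.lower().replace("_", "-") == target:
                dist_name = dist.metadata["Name"]
                if dist_name:
                    found.add(dist_name)
    return sorted(found)


print(dependants_of("idna"))
```

In an environment like the one shown in the pipdeptree answer above, dependants_of('MarkupSafe') would include Jinja2 and Mako.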
Upvotes: 2
Reputation: 1386
If you like graphs, you can use graphviz
(Documentation)
pip install graphviz
Then do something like this:
#! /usr/bin/env python3
import graphviz
import pkg_resources

GRAPH_NAME = "pipdeps"


def init_grph():
    grph = graphviz.Digraph(GRAPH_NAME,
                            node_attr={'color': 'lightblue2', 'style': 'filled'})
    # This does not seem to be interpreted on websites I tested
    grph.attr(engine='neato')
    return grph


def add_pip_dependencies_to_graph(grph):
    l_packages = [p for p in pkg_resources.working_set]
    for package in l_packages:
        name = package.key
        for dep in package.requires():
            # edge from each dependency to the package that requires it
            grph.edge(dep.key, name)


def main():
    grph = init_grph()
    add_pip_dependencies_to_graph(grph)
    print(grph.source)
    # grph.view()


main()
This prints the dependency graph in graphviz's DOT format.
You can view it, for example, in an online graphviz visualiser (e.g. https://dreampuf.github.io/GraphvizOnline)
Alternatively, if you have a graphical interface (😮), you can use grph.view()
Upvotes: 5
Reputation: 1093
You have two options here.
The first will output all top-level packages, excluding sub-packages. Note that this will also exclude, for example, requests whenever another installed package depends on it, even if you installed it explicitly, and it will include dependencies that were not explicitly installed via pip (e.g. pip itself, or setuptools, which pip needs). Side note: these implicit dependencies can also be listed with pip freeze when used with the option --all. If you want to omit these extra dependencies, you can exclude them with the --exclude option.
pip3 list --not-required --format freeze --exclude pip --exclude setuptools
The second option is to print the packages based on an existing requirements.txt file.
pip3 freeze -r requirements.txt
This will generate a file in the format:
existing-package==1.0.0
## The following requirements were added by pip freeze:
dependency-package==1.0.0
You can remove all the additionally added packages by using sed:
pip3 freeze -r requirements.txt \
| sed -n '/^## The following requirements were added by pip freeze:$/q;p'
Upvotes: 1
Reputation: 186
The following command will show requirements of all installed packages:
pip3 freeze | awk '{print $1}' | cut -d '=' -f1 | xargs pip3 show
Upvotes: 15
Reputation: 7742
The pip show
command will show what packages are required for the specified package (note that the specified package must already be installed):
$ pip show specloud
Package: specloud
Version: 0.4.4
Requires:
nose
figleaf
pinocchio
pip show was introduced in pip version 1.4rc5.
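Recent pip versions also print a Required-by: field in pip show, which answers the reverse question directly. Either field can be pulled out programmatically by parsing pip's output; a rough sketch (pip_show_field is a name I made up, not a pip API):

```python
import subprocess
import sys


def pip_show_field(package, field):
    """Parse one 'Field: value' line from `pip show` output (a sketch)."""
    out = subprocess.run(
        [sys.executable, "-m", "pip", "show", package],
        capture_output=True, text=True,
    ).stdout
    prefix = field + ":"
    for line in out.splitlines():
        if line.startswith(prefix):
            value = line[len(prefix):].strip()
            # comma-separated list, possibly empty
            return [v.strip() for v in value.split(",") if v.strip()]
    return []


print(pip_show_field("pip", "Requires"))      # pip vendors its dependencies, so this is empty
print(pip_show_field("pip", "Required-by"))
```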
Upvotes: 137
Reputation: 12164
(workaround, not true answer)
Had the same problem: lxml was failing to install and I wanted to know who needed lxml, not what lxml needed. I ended up bypassing the issue by noting where my site-packages were being put, then going there and recursively grepping for the import (the last grep's --invert-match serves to remove lxml's own files from consideration).
Yes, not an answer as to how to use pip to do it, but I didn't get any success out of the suggestions here, for whatever reason.
site-packages me$ egrep -i --include=*.py -r -n lxml . | grep import | grep --invert-match /lxml/
Upvotes: 3
Reputation: 1034
You may also use a one-line command which pipes the packages in requirements.txt to pip show.
cut -d'=' -f1 requirements.txt | xargs pip show
Upvotes: 8
Reputation: 9991
As I recently said in a HN thread, I recommend the following:
Have a commented requirements.txt file with your main dependencies:
## this is needed for whatever reason
package1
Install your dependencies: pip install -r requirements.txt.
Now you get the full list of your dependencies with pip freeze -r requirements.txt:
## this is needed for whatever reason
package1==1.2.3
## The following requirements were added by pip freeze:
package1-dependency1==1.2.3
package1-dependency2==1.2.3
This allows you to keep your file structure with comments, nicely separating your dependencies from the dependencies of your dependencies. This way you'll have a much nicer time the day you need to remove one of them :)
Note the following:
- You can keep a requirements.raw with version control and use it to rebuild your full requirements.txt.
- pip install --no-install <package_name> to list specific requirements.
Upvotes: 28
Reputation: 31653
First of all, pip freeze displays all currently installed Python packages, not only those installed using pip.
Secondly, Python packages do contain information about their dependent packages as well as the required versions. You can see the dependencies of a particular package using the methods described here. When you're upgrading a package, an installer script like pip will handle the upgrade of dependencies for you.
To manage updating of packages, I recommend using pip requirements files. You define which packages and versions you need, and install them all at once using pip install.
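For instance, a minimal requirements file might pin versions like this (the package names and pins here are just illustrative):

```
# requirements.txt
Twisted==10.2.0
PyYAML>=3.09,<4
```

Running pip install -r requirements.txt then installs exactly those versions, resolving their dependencies for you.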
Upvotes: 4