Thomas Thorogood

Reputation: 2357

How to get coverage reporting when testing a pytest plugin?

Context

I am updating an inherited repository which has poor test coverage. The repo itself is a pytest plugin. I've changed the repo to use tox along with pytest-cov, and converted the "raw" tests to use pytester, as the pytest documentation suggests for testing plugins.

The testing and tox build, etc. work great. However, coverage is reporting false misses on things like class definitions, imports, etc. This is because the plugin code is imported as part of pytest start-up, so it isn't getting "covered" until the testing actually starts.

I've read the pytest, pytest-cov, coverage, and tox docs and tried several configurations, but to no avail. I've exhausted my pool of Google keyword combinations that might lead me to a good solution.

Repository layout

pkg_root/
    .tox/
        py3/
            lib/
                python3.7/
                    site-packages/
                        plugin_module/
                            supporting_module.py
                            plugin.py
                            some_data.dat
    plugin_module/
        supporting_module.py
        plugin.py
        some_data.dat
    tests/
        conftest.py
        test_my_plugin.py
    tox.ini
    setup.py
    

Some relevant snippets with commentary:

tox.ini

[pytest]
addopts = --cov={envsitepackagesdir}/plugin_module --cov-report=html
testpaths = tests

This configuration gives me an error that no data was collected; no htmlcov is created in this case.

If I just use --cov, I get (as expected) very noisy coverage, which shows the functional hits and misses, but also the false misses described above for imports, class definitions, etc.
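A minimal [testenv] to go with that [pytest] section looks roughly like this (a sketch only; the deps and env name are approximate and not important to the question):

[tox]
envlist = py3

[testenv]
deps =
    pytest
    pytest-cov
commands =
    pytest {posargs}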

conftest.py

pytest_plugins = ['pytester']  # Entire contents of file!

test_my_plugin.py

def test_a_thing(testdir):
    testdir.makepyfile(
        """
            def test_that_fixture(my_fixture):
                assert my_fixture.foo == 'bar'
        """
    )
    result = testdir.runpytest()
    result.assert_outcomes(passed=1)

How can I get an accurate report? Is there a way to defer the plugin loading until it's demanded by the pytester tests?

Upvotes: 63

Views: 171233

Answers (2)

8c6b5df0d16ade6c

Reputation: 2544

You can achieve what you want without pytest-cov.


❯ coverage run --source=<package> --module pytest --verbose <test-files-dirs> && coverage report --show-missing

or, shorter:

❯ coverage run --source=<package> -m pytest -v <test-files-dirs> && coverage report -m

Example (for your directory structure):

❯ coverage run --source=plugin_module -m pytest -v tests && coverage report -m
======================= test session starts ========================
platform darwin -- Python 3.9.4, pytest-6.2.4, py-1.10.0, pluggy-0.13.1 -- /Users/johndoe/.local/share/virtualenvs/plugin_module--WYTJL20/bin/python
cachedir: .pytest_cache
rootdir: /Users/johndoe/projects/plugin_module, configfile: pytest.ini
collected 1 items

tests/test_my_plugin.py::test_my_plugin PASSED               [100%]

======================== 1 passed in 0.04s =========================
Name                                 Stmts   Miss  Cover   Missing
------------------------------------------------------------------
plugin_module/supporting_module.py       4      0   100%
plugin_module/plugin.py                  6      0   100%
------------------------------------------------------------------
TOTAL                                   21      0   100%

For even nicer output, you can use:

❯ coverage html && open htmlcov/index.html

(screenshot: coverage HTML report)
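If you don't want to keep repeating --source and -m on the command line, coverage can also read these settings from a config file. A minimal sketch (optional; not needed for the commands above):

.coveragerc

# With this in place, plain "coverage run -m pytest tests" and "coverage report" suffice.
[run]
source = plugin_module

[report]
show_missing = True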


Documentation

❯ coverage -h
❯ pytest -h

coverage

run -- Run a Python program and measure code execution.

-m, --module --- The target is an importable Python module, not a script path, to be run as "python -m" would run it.

--source=SRC1,SRC2,... --- A list of packages or directories of code to be measured.

report -- Report coverage stats on modules.

-m, --show-missing --- Show line numbers of statements in each module that weren't executed.

html -- Create an HTML report.

pytest

-v, --verbose --- Increase verbosity.

Upvotes: 50

Ned Batchelder

Reputation: 375574

Instead of using the pytest-cov plugin, use coverage to run pytest:

coverage run -m pytest ....

That way, coverage will be started before pytest.
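Applied to a tox setup like the one in the question, that could look something like this (a sketch, not part of the original answer; adjust deps and paths to your environment):

# tox.ini sketch: start coverage first, then let it run pytest
[testenv]
deps =
    pytest
    coverage
commands =
    coverage run --source=plugin_module -m pytest {posargs:tests}
    coverage report -m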

Upvotes: 94
