dWitty

Reputation: 524

Dynamically parametrizing class-level fixtures with pytest

It is possible to parametrize test functions from command-line arguments. It is possible to have a fixture scoped to a class. I want to combine those two things, so that each class receives parametrized arguments that are given to the fixture within the class.

(essentially, per command-line argument I need to run one very expensive operation and then do a variety of cheap, speedy tests against the results of that operation, and I'd prefer not to have to rerun the expensive operation for each cheap test, so I'd like a way to save it)

In other words, I'm looking for an equivalent to pytest_generate_tests(metafunc), that would work for dynamically parametrizing a fixture, not a test function.
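
(For reference, each piece works fine for me on its own. Here is a minimal sketch of the two in isolation; the --stringinput option and the expensive_setup helper are just placeholder names, not my real code.)

    # conftest.py: parametrizing a test function from a command-line option
    def pytest_addoption(parser):
        parser.addoption("--stringinput", action="append", default=[],
                         help="string inputs to pass to test functions")

    def pytest_generate_tests(metafunc):
        if "stringinput" in metafunc.fixturenames:
            metafunc.parametrize("stringinput",
                                 metafunc.config.getoption("stringinput"))

    # test_module.py: a class-scoped fixture runs once for the whole class
    import pytest

    def expensive_setup():
        return 42  # stand-in for the real expensive operation

    class TestShared:
        @pytest.fixture(scope="class")
        def shared(self):
            return expensive_setup()

        def test_uses_shared(self, shared):
            assert shared == 42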

One thing I have already tried, unsuccessfully, is reading the parameters from the fixture's request object after setting them via the pytest_generate_tests hook.

conftest.py:
    def pytest_generate_tests(metafunc):
        metafunc.parametrize("result", [(1, 0), (1, 2)])

test_thing.py:

    import pytest

    class TestThingy:
        @pytest.fixture(scope="class")
        def result(self, request):
            a, b = request.param
            return a + b

        # @pytest.mark.parametrize("result", [(0, 1), (1, 2)])
        def test_one(self, result):
            assert result != 2

Running this test raises the following error (note that the test ran fine when I tried it without the conftest hook and with the commented line uncommented):

@pytest.fixture(scope="class")
def result(self, request):
    a,b=request.param

AttributeError: 'SubRequest' object has no attribute 'param'

I'd also be interested in any alternative way to achieve the same result.

Upvotes: 6

Views: 3960

Answers (1)

Claire Nielsen

Reputation: 2161

I've done an ugly-hack-of-a-job at parametrizing a fixture based on command line args like this:

# contents of conftest.py
import pytest

def pytest_addoption(parser):
    parser.addoption("--test-local", action="store_true", default=False)

def pytest_generate_tests(metafunc):
    if "dummy" in metafunc.fixturenames:
        #driverParams sets dummy[0], dummy[1] parametrically (as seen below)
        if metafunc.config.getoption("--test-local"):
            driverParams = [(True, None)]
        else:
            driverParams = [(False, "seriousface setting 1"), 
                            (False, "seriousface setting 2")]
        metafunc.parametrize("dummy", driverParams)

#driver can then be used as a standard fixture
@pytest.fixture(scope="function")
def driver(dummy):
    _driver = makeDriverStuff(dummy[0], dummy[1])
    yield _driver
    _driver.cleanup()

@pytest.fixture(scope="function")
def dummy():
    pass

Essentially what's going on here is that when metafunc.parametrize("dummy", driverParams) runs, it tosses driverParams into anything that relies on the dummy fixture. More importantly, it means that the dummy fixture function never actually executes.

Initially I tried using metafunc to parametrize my driver fixture directly, and noticed that cleanup never happened: the (wrong) way I was doing it was to call metafunc.parametrize with the results of makeDriverStuff(param1, param2) directly, and then scratch my head wondering why driver() was never getting called. The point of the dummy is to say, "Hey, there's a legit fixture here, don't complain that you can't find it!", and then always preempt it with something that's actually useful.
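
To make the flow concrete: a test that requests driver (the test below is a made-up example, assuming makeDriverStuff exists as above) runs once per tuple in driverParams, with dummy carrying that tuple into the fixture.

# test_example.py -- hypothetical consumer of the driver fixture
def test_driver_smoke(driver):
    # with --test-local this runs once with (True, None); otherwise it runs
    # once for each "seriousface" tuple from pytest_generate_tests
    assert driver is not None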

Like I said, it's an ugly hack, but you should be able to adapt it.
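
For instance, adapting it to the question's class-scoped result fixture could look roughly like this. This is an untested sketch: the --pair option and the fallback values are placeholders, and the scope="class" argument to metafunc.parametrize is what should let the class-scoped fixture depend on the parametrized dummy without a scope mismatch.

# conftest.py
import pytest

def pytest_addoption(parser):
    parser.addoption("--pair", action="append", default=[],
                     help="a,b pairs to feed the expensive setup")

def pytest_generate_tests(metafunc):
    if "dummy" in metafunc.fixturenames:
        raw = metafunc.config.getoption("--pair") or ["1,0", "1,2"]
        pairs = [tuple(int(x) for x in item.split(",")) for item in raw]
        metafunc.parametrize("dummy", pairs, scope="class")

@pytest.fixture(scope="class")
def dummy():
    pass

# test_thing.py
import pytest

class TestThingy:
    @pytest.fixture(scope="class")
    def result(self, dummy):
        a, b = dummy  # the parametrized tuple arrives here, not via request.param
        return a + b  # stand-in for the expensive operation

    def test_one(self, result):
        assert result != 2

That way the expensive fixture body runs once per command-line pair for the whole class, and every cheap test in the class reuses its return value.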

Upvotes: 4
