user8615607

Reputation: 197

pytest.skip within pytest_generate_tests skips all test functions in module instead of specific tests

I'm parameterizing pytest tests with variables defined in an external YAML file, using the pytest_generate_tests hook. The name of the variable file is specified on the pytest command line (--params_file). Only some of the test functions within a module are parameterized and require the variables in this file, so the command-line option is optional. If the option is omitted, I want pytest to skip only the test functions that need the external parameters and still run the other, non-parameterized tests. The problem is that when the option is omitted, pytest skips ALL of the test functions, not just the ones that require the parameters.

Here is the test module file:

def test_network_validate_1(logger, device_connections):

  ### Test code omitted.....


def test_lsp_throttle_timers(params_file, logger, device_connections):

  ### Test code omitted.....


def test_network_validate_2(logger, device_connections):

  ### Test code omitted.....

pytest_generate_tests hook in conftest.py:

from pathlib import Path

import pytest
import yaml

import dnet_generic  # project helper module used below

# Note, I tried scope at function level as well but that did not help
@pytest.fixture(scope='session')
def params_file(request):
    pass

def pytest_generate_tests(metafunc):
  
    ### Get Pytest rootdir
    rootdir = metafunc.config.rootdir

    print(f"*********** Test Function: {metafunc.function.__name__}")

    if "params_file" in metafunc.fixturenames:
        print("*********** Hello Silver ****************")
        if metafunc.config.getoption("--params_file"):

            #################################################################
            # Params file now located as a separate command line argument for
            # greater flexibility
            #################################################################
            params_file = metafunc.config.getoption("--params_file")
            params_doc = dnet_generic.open_yaml_file(Path(rootdir, params_file),
                                                    loader_type=yaml.Loader)

            test_module = metafunc.module.__name__
            test_function = metafunc.function.__name__
            names,values = dnet_generic.get_test_parameters(test_module,
                                                            test_function,
                                                            params_doc,)

            metafunc.parametrize(names, values)
        else:
            pytest.skip("This test requires the params_file argument")

When the params_file option is present, everything works fine:

pytest isis/test_isis_lsp_throttle.py --testinfo topoA_r28.yml --ulog -s --params_file common/topoA_params.yml  --collect-only
===================================================================================== test session starts =====================================================================================
platform linux -- Python 3.7.4, pytest-3.7.0, py-1.8.0, pluggy-0.13.0
rootdir: /home/as2863/pythonProjects/p1-automation, inifile: pytest.ini
plugins: csv-2.0.1, check-0.3.5, pylama-7.6.6, dependency-0.4.0, instafail-0.4.0, ordering-0.6, repeat-0.7.0, reportportal-5.0.3
collecting 0 items
*********** Test Function: test_network_validate_1
*********** Test Function: test_lsp_throttle_timers
*********** Test Function: test_network_validate_2
collected 3 items
<Package '/home/as2863/pythonProjects/p1-automation/isis'>
  <Module 'test_isis_lsp_throttle.py'>
    <Function 'test_network_validate_1'>
    <Function 'test_lsp_throttle_timers'>
    <Function 'test_network_validate_2'>

================================================================================ no tests ran in 0.02 seconds =================================================================================                                                                                                                                                                            

When the params_file option is omitted, you can see that no tests are collected, and the print output shows that pytest_generate_tests is not even called for "test_network_validate_2":

pytest isis/test_isis_lsp_throttle.py --testinfo topoA_r28.yml --ulog -s  --collect-only
===================================================================================== test session starts =====================================================================================
platform linux -- Python 3.7.4, pytest-3.7.0, py-1.8.0, pluggy-0.13.0
rootdir: /home/as2863/pythonProjects/p1-automation, inifile: pytest.ini
plugins: csv-2.0.1, check-0.3.5, pylama-7.6.6, dependency-0.4.0, instafail-0.4.0, ordering-0.6, repeat-0.7.0, reportportal-5.0.3
collecting 0 items

*********** Test Function: test_network_validate_1
*********** Test Function: test_lsp_throttle_timers
*********** Hello Silver ****************
collected 0 items / 1 skipped

================================================================================== 1 skipped in 0.11 seconds ==================================================================================

Upvotes: 5

Views: 924

Answers (2)

ollien

Reputation: 4766

While MrBean Bremen's answer may work, according to the pytest authors, dynamically altering the fixture list is not something they really want to support. The approach below, however, is a bit better supported.

# This is "auto used", but doesn't always skip the test unless the test parameters require it
@pytest.fixture(autouse=True)
def skip_test(request):
    # For some reason this is only conditionally set if a param is passed
    # https://github.com/pytest-dev/pytest/blob/791b51d0faea365aa9474bb83f9cd964fe265c21/src/_pytest/fixtures.py#L762
    if not hasattr(request, 'param'):
        return

    pytest.skip(f"Test skipped: {request.param}")

And in your test module:

def _add_flag_parameter(metafunc: pytest.Metafunc, name: str):
    if name not in metafunc.fixturenames:
        return

    flag_value = metafunc.config.getoption(name)
    if flag_value:
        metafunc.parametrize(name, [flag_value])
    else:
        metafunc.parametrize("skip_test", ["Missing flag '{name}'"], indirect=True)

def pytest_generate_tests(metafunc: pytest.Metafunc):
    _add_flag_parameter(metafunc, "params_file")

Upvotes: 0

MrBean Bremen

Reputation: 16815

As has been found in the discussion in the comments, you cannot use pytest.skip in pytest_generate_tests, because it runs at collection time and will skip the whole module. To skip the concrete test instead, you can do something like this:

@pytest.fixture
def skip_test():
    pytest.skip('Some reason')

def pytest_generate_tests(metafunc):
    if "params_file" in metafunc.fixturenames:
        if metafunc.config.getoption("--params_file"):
            ...
            metafunc.parametrize(names, values)
        else:
            metafunc.fixturenames.insert(0, 'skip_test')

That is, you introduce a fixture that skips the concrete test, and add this fixture to the test. Make sure to insert it as the first fixture, so that no other fixtures are executed before the skip.
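
With this change, a run without --params_file still collects all three tests and reports only test_lsp_throttle_timers as skipped, because the skip now happens per test during fixture setup instead of once at collection time.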

Upvotes: 7
