alon-k

Reputation: 438

How to skip a test in pytest *before* fixtures are computed

I have a fairly large test suite written with pytest, meant to perform system tests on an application that involves communication between a server side and a client side. The tests share a huge fixture that is initialized with information about the server, which is then used to create a client object and run the tests. The server side may support different feature sets, and this is reflected in attributes that may or may not be present on the server object initialized by the fixture.
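
For concreteness, a stripped-down sketch of what that fixture looks like; the helper names here are placeholders, not our real code:

import pytest

@pytest.fixture(scope="session")
def server():
    # Placeholder for the expensive setup that connects to the server
    # and discovers which optional features it supports.
    srv = connect_to_server()
    yield srv
    srv.close()  # placeholder teardown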

Now, quite similarly to this question, I need to skip certain tests if the required attributes are not present in the server object. The way we have been doing this so far is by adding a decorator to the tests which checks for the attributes and uses pytest.skip if they aren't there.

Example:

import functools
import pytest

def skip_if_not_feature(feature):
    def _skip_if_not_feature(func):
        @functools.wraps(func)
        def wrapper(server, *args, **kwargs):
            if not server.supports(feature):
                pytest.skip("Server does not support {}".format(feature))
            return func(server, *args, **kwargs)
        return wrapper
    return _skip_if_not_feature

@skip_if_not_feature("feature_A")
def test_feature_A(server):  # plus any other fixtures
    ...

The problem arises when some of these tests require more fixtures, some of which have relatively time-consuming setup. Due to how pytest works, the decorator code that skips them runs only after the fixtures are set up, wasting precious time.

Example:

@skip_if_not_feature("sync_db")
def test_sync_db(server, really_slow_db_fixture, args...):
  ...

I'm looking to make the test suite run faster by getting these tests skipped sooner. I can only think of two ways to do it:

  1. Re-write the decorator not to use the server fixture, and make it run **before the fixtures**.
  2. Run code which decides to skip the tests **between the initialization of the server fixture and the initialization of the rest of the fixtures** (roughly what the sketch below illustrates).

I'm having trouble figuring out whether the parts in bold are possible, and how to do them if they are. I've already gone through the pytest documentation and Google / Stack Overflow results for similar questions and came up with nothing.
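
To illustrate what I mean by option 2, here is roughly the shape I'm after. This is a sketch only; expensive_db_setup stands in for whatever the slow fixture actually does:

import pytest

@pytest.fixture
def really_slow_db_fixture(server):
    # Depending on the server fixture means pytest initializes server
    # first, so a skip here would fire before the expensive setup below.
    if not server.supports("sync_db"):
        pytest.skip("Server does not support sync_db")
    db = expensive_db_setup(server)  # placeholder for the slow setup
    yield db
    db.teardown()                    # placeholder for cleanup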

Upvotes: 0

Views: 1909

Answers (2)

Lior Cohen

Reputation: 5755

The problem with the current decorator is that after the decorator does its thing, the module contains test functions where one or more of the arguments are fixtures. Those are identified as fixtures by the pytest mechanism, and evaluated.

It all lies in your *args

What you can do is make your decorator spit out a func(server) instead of a func(server, *args, **kwargs) when it recognizes that it is going to skip this test. This way the skipped function won't declare the other fixtures, so they will not be evaluated.

As a matter of fact, you can even return a simple empty lambda: None instead of func, as it is not going to be run anyway.
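
A minimal sketch of that idea. It assumes the feature check can be answered at decoration time, i.e. without the server fixture; feature_is_supported below is a placeholder for such a check:

import pytest

def skip_if_not_feature(feature):
    def _skip_if_not_feature(func):
        # Placeholder: a check that works at import time, without the fixture.
        if not feature_is_supported(feature):
            def skipper():
                pytest.skip("Server does not support {}".format(feature))
            # Deliberately no functools.wraps here: wraps would set
            # __wrapped__ and pytest would still resolve fixtures from the
            # original signature, defeating the purpose.
            skipper.__name__ = func.__name__
            return skipper
        return func
    return _skip_if_not_feature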

Upvotes: 1

MrBean Bremen

Reputation: 16855

You can add a custom marker with the feature name to your tests, and add a skip marker in pytest_collection_modifyitems if needed. In this case, the test is skipped without loading the fixtures first.

conftest.py

import pytest

def pytest_configure(config):
    config.addinivalue_line(
        "markers",
        "feature: mark test with the needed feature"
    )


def pytest_collection_modifyitems(config, items):
    # The server fixture is not available in this hook: `server` must be
    # an object whose feature support is already known at collection time
    # (how it is obtained is left to the suite).
    for item in items:
        feature_mark = item.get_closest_marker("feature")
        if feature_mark and feature_mark.args:
            feature = feature_mark.args[0]
            if not server.supports(feature):
                item.add_marker(pytest.mark.skip(
                    reason="Server does not support {}".format(feature)))

test_db.py

import pytest

@pytest.mark.feature("sync_db")
def test_sync_db(server, really_slow_db_fixture):  # plus any other fixtures
    ...

Upvotes: 2
