Reputation: 12213
Assume you have a large Python/Django project with a fair number of functions, models, and dependencies between models.
There are a lot of tests split across multiple files. How would one gain an overview of which functionality / exception cases have actually been tested?
Tests are lengthy -> it takes a lot of time to read through a complete unittest file to find the exact test one is looking for
Dependencies between modules can be tested in multiple test files -> one does not necessarily know where to look
Is there a tool / standard procedure for gaining an overview over tests, e.g. in a tabular form?
Update: The question is not about coverage,
    # code
    def division(x, y):
        return x / y

    # tests
    def test_division_integers():
        """Test if we can divide integers"""
        assert division(6, 3) == 2
would have 100% coverage, but I don't have a test for division by 0, feeding the function with strings, ... I'd rather have some overview like:
| Test Name              | Description | Params |
| ---------------------- | ----------- | ------ |
| test_division_integers | &lt;docstring&gt; |        |
Upvotes: 0
Views: 120
Reputation: 522499
> would have 100% coverage but I don't have a test for division by 0, feeding function with strings, ... I'd rather have some overview
Basically, your tests *are* that overview. No tool will be able to seriously tell you that you've tested a function with integers but not with 0 and not with strings. What you really want is to make your tests more readable. In your example above, `assert division(6, 3) == 2`
is exactly the information you want to know, and it's right there; but that will obviously get less and less comprehensible for longer and more complex tests.
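One way to keep that information visible as tests grow (a sketch, assuming pytest is in use, since the question's tests are already written in pytest's bare-`assert` style) is to parametrize cases and give each one an explicit id, so the missing cases from the question become named tests that show up in every report:

```python
# Sketch: parametrized cases with explicit ids make each tested scenario
# appear by name in pytest's output, which is itself a kind of overview.
import pytest

def division(x, y):
    return x / y

@pytest.mark.parametrize(
    "x, y, expected",
    [(6, 3, 2), (7, 2, 3.5), (-6, 3, -2)],
    ids=["integers", "non_integer_result", "negative"],
)
def test_division(x, y, expected):
    assert division(x, y) == expected

def test_division_by_zero():
    with pytest.raises(ZeroDivisionError):
        division(1, 0)

def test_division_with_strings():
    with pytest.raises(TypeError):
        division("a", "b")
```

Running `pytest -v` then lists lines like `test_division[integers]` and `test_division_by_zero`, one per case, pass or fail.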
A tool like behave might be the right thing here: it lets you write your tests as very readable Gherkin feature files, and tools like behave produce reports that highlight which of those scenarios succeeded and which didn't.
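For illustration, the division example from the question might look like this as a hypothetical Gherkin feature file (the step wording is made up; each `Given`/`When`/`Then` line would be backed by a Python step implementation):

```gherkin
# division.feature -- hypothetical sketch, not from the original question
Feature: Division

  Scenario: Dividing two integers
    Given the numbers 6 and 3
    When I divide them
    Then the result is 2

  Scenario: Dividing by zero
    Given the numbers 1 and 0
    When I divide them
    Then a ZeroDivisionError is raised
```

The scenario titles alone already read like the tabular overview the question asks for.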
Upvotes: 1