Reputation: 22592
My current project has a policy of 100% code coverage from its unit tests. Our continuous integration service will not allow developers to push code without 100% coverage.
As the project has grown, so has the time to run the full test suite. While developers typically run a subset of tests relevant to the code they are changing, they will usually do one final full run before submitting to CI, and the CI server itself also runs the full test suite.
Unit tests by their nature are highly parallelizable, as they are self-contained and stateless from test to test. They return only two pieces of information: pass/fail and the lines of code covered. A map/reduce solution seems like it would work very well.
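The "reduce" step here is simple to sketch. Assuming each machine reports a pass/fail flag plus a mapping of file names to covered line numbers (hypothetical structures, not any particular framework's API), combining the shards is just an all-pass check and a union of coverage sets:

```python
from dataclasses import dataclass


@dataclass
class ShardResult:
    passed: bool
    covered: dict  # filename -> set of covered line numbers


def reduce_results(shards):
    """Merge per-machine results: overall pass/fail plus union of coverage."""
    all_passed = all(s.passed for s in shards)
    merged = {}
    for s in shards:
        for fname, lines in s.covered.items():
            merged.setdefault(fname, set()).update(lines)
    return all_passed, merged
```

Computing the coverage percentage against the total line count of the codebase would then be a final post-processing step.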
Are there any Python testing frameworks that will run tests across a cluster of machines with code coverage and combine the results when finished?
Upvotes: 12
Views: 1822
Reputation: 8089
I don't think there is a framework that matches your needs exactly.
I do know that py.test has an xdist plugin that adds distributed test executors. You could build your CI infrastructure on top of it.
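As a rough sketch (host names and the package layout are placeholders, and the exact flags may vary by xdist version), distributing a run across two SSH-reachable machines looks something like:

```shell
# Split the test run across two remote Python interpreters over SSH,
# rsyncing the current project directory to each worker first.
pytest --dist load \
       --tx ssh=user@build1//python=python3 \
       --tx ssh=user@build2//python=python3 \
       --rsyncdir .
```

Coverage collection across the workers is the harder part; you would still need something like pytest-cov or coverage.py's data-combining support on top of this.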
Upvotes: 4
Reputation: 33495
Not exactly what you are looking for, but the closest thing I can recall is from the Hadoop groups: using JUnit for testing with Hadoop. Here is the mail. As mentioned in the mail, search for the GridUnit papers.
Unit testing with Hadoop in a distributed way is very interesting. Any frameworks around this would be very useful, but developing one shouldn't be very difficult. If you're interested, let me know.
Upvotes: 1
Reputation: 1375
I don't know of any testing frameworks that will run tests distributed across a group of machines, but nose has support for parallelizing tests on the same machine using multiprocessing.
At minimum, that might be a good starting point for building a distributed testing framework.
Upvotes: 4