Reputation: 4488
TFS (we're using 2012 at the moment) has a functional testing area where people set up test cases and go through them during regression testing or when a feature has been implemented. If something doesn't work, a bug can be created from a test case.
We're looking for an easy way to track how much time testers spend going through the test cases before each release, in addition to whether they passed or failed. Could a custom "Time Spent" field be added to a test run? Or is there a better way? I'd prefer not to use a separate tool for tracking time.
Upvotes: 1
Views: 1679
Reputation: 23434
This feature is built into TFS. When you execute one or more tests as a tester, Microsoft Test Manager (and Web Access) records both the start and end date and time and associates them with the Test Run.
You can see this easily in MTM, but it is not surfaced in the web access. This is the actual time between starting and ending testing, making it easy to calculate a duration. If you have lots of runs, you can report on total test effort within a Release, as well as potentially rank PBIs by test time.
You can do this reporting in TFS with the Data Warehouse and Cube, and in VSO using the REST API.
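If it helps, here is a minimal sketch of what that REST API query could look like, summing run durations from the `startedDate` and `completedDate` fields the "list test runs" endpoint returns. The account URL, project name, and personal access token are placeholders, and this targets the VSO-style endpoint rather than the on-premises warehouse.

```python
# Minimal sketch: sum test-run durations via the VSO/TFS REST API.
# ACCOUNT_URL, PROJECT and PAT are placeholders -- substitute your own values.
from datetime import datetime

import requests

ACCOUNT_URL = "https://youraccount.visualstudio.com/DefaultCollection"  # assumption
PROJECT = "YourProject"                                                  # assumption
PAT = "your-personal-access-token"                                       # assumption


def parse_utc(value):
    """Parse the ISO-8601 timestamps the API returns (e.g. 2015-06-01T09:30:00.000Z)."""
    return datetime.strptime(value.rstrip("Z").split(".")[0], "%Y-%m-%dT%H:%M:%S")


resp = requests.get(
    f"{ACCOUNT_URL}/{PROJECT}/_apis/test/runs",
    params={"api-version": "1.0"},
    auth=("", PAT),  # basic auth with an empty user name and a PAT
)
resp.raise_for_status()

total_hours = 0.0
for run in resp.json()["value"]:
    started = run.get("startedDate")
    completed = run.get("completedDate")
    if started and completed:
        hours = (parse_utc(completed) - parse_utc(started)).total_seconds() / 3600
        total_hours += hours
        print(f"Run {run['id']} ({run['name']}): {hours:.2f} h")

print(f"Total test effort: {total_hours:.2f} h")
```

Grouping the same run data by the requirement a test case traces to would give you the per-PBI ranking mentioned above.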
Upvotes: 2
Reputation: 4616
It is difficult to track the actual time spent on any task all the time. People would have to be really on top of watching the clock whenever they start and finish a task, and of course there are interruptions and distractions.
I flirted with the idea of using the Pomodoro technique, which worked well for me when the team wasn't too big.
There is a Visual Studio extension for a Pomodoro timer available, but I haven't used it personally so I can't vouch for it.
Upvotes: 0