The value of automated testing is largely determined by its reporting framework. Your team must provide clear, timely, and customizable reporting for stakeholders. Nowadays data presentation plays a vital role in assessing whether your product is useful and usable.
Your test reporting should be able to show various views of the accumulated data. Some people like to see graphs across all runs; some check daily runs carefully, drilling down into grids and log lines; others just want to see something like traffic lights (red, yellow, green).
Developing such a system is a big deal, practically a full-grown project in itself. I built one with my team two years ago: a Dashboard that was used across all our test automation projects. We built the Dashboard on XAMPP (Apache, MySQL), plus some extra JavaScript and jQuery code, and used the jGraph open-source library to plot graphs. I really like this app, as it is flexible and has nice presentation, navigation, and filtering. Moreover, we implemented test running and test stand dispatching through the web dashboard interface. But even though this infrastructure project ran in the background, we spent a good portion of our budget and technical skills on it.
This challenge may be frustrating if your limited resources are dedicated to developing tests only. But there is a recipe! Your project most likely already uses a defect-tracking system, so your task is simply to reuse that tool for your daily runs. Atlassian Jira is a perfect candidate to be reused for test automation, as it is a very flexible and customizable system supporting a lot of technologies (just check the third-party integrations: http://www.atlassian.com/software/jira/tour/plugins.jsp). Nevertheless, if you use Trac, Bugzilla, or ClearQuest, I believe (since I have worked with them) you can apply the same approach.
I see two options for deploying this integration:
1. Integrate into your existing project(s) with a new ticket type (to isolate test tickets from regular project tickets)
2. Create a new project with the required breakdown by components (test project or test level), releases, and builds (e.g. each new run is a new build), plus field customization and so on.
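To illustrate option 2, the run-per-build convention can be mapped onto Jira's REST API by creating a new version for every test run. Here is a minimal Python sketch; the project key, naming scheme, and helper name are my own assumptions, and the actual HTTP call (POST /rest/api/2/version) is left as a comment:

```python
from datetime import datetime

def build_run_version(project_key, run_started):
    """Build the JSON payload for Jira's 'create version' endpoint
    (POST /rest/api/2/version), treating each automated run as a build.
    Hypothetical helper; adjust the naming scheme to your conventions."""
    return {
        "project": project_key,
        "name": run_started.strftime("run-%Y%m%d-%H%M"),
        "description": "Automated test run",
        "released": False,
    }

payload = build_run_version("QA", datetime(2012, 5, 1, 9, 30))
# payload["name"] == "run-20120501-0930"
# The framework would then POST this payload to
# https://<your-jira>/rest/api/2/version with basic auth.
```

Each run then appears in Jira as its own build, so failures can be filed against it and filtered per run later.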
How to tune it up:
1. Customize ticket fields
2. Define rules for new build creation (e.g. each run) and ticket submission (each test case failure)
* Preferably your reporting should be granular, with proper rules for counting failed/not-run/passed steps. The best approach is to trace each automated test against its manual test case (a failure in one or more steps means the test case fails)
3. Create preliminary views in the tracking system: spreadsheets and graphs using built-in filters
4. Think through roles, assignments, priorities, and other useful fields
5. Teach your test automation framework to communicate with the defect tracker:
- come up with an interface: GUI, API, DB, web service…
- new run submission
- new ticket submission (form filling)
- supplementary data retrieval/update/insert
6. Create a single-view Dashboard page as a container of customized gadgets driven by filters (or queries)
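The counting rule from step 2 and the ticket submission from step 5 can be sketched together in Python. This is a hypothetical example, not a ready library: the rollup logic follows the rule above (a test case fails if any step fails), and the payload matches Jira's create-issue endpoint (POST /rest/api/2/issue); the "Test Failure" issue type and field values are assumptions you would replace with your own customization:

```python
def rollup_status(step_results):
    """Granular counting rule: a test case fails if any step failed,
    counts as 'not_run' only if no step ran, and passes otherwise."""
    if "failed" in step_results:
        return "failed"
    if all(s == "not_run" for s in step_results):
        return "not_run"
    return "passed"

def build_failure_ticket(project_key, test_case, failed_steps):
    """Payload for Jira's create-issue endpoint (POST /rest/api/2/issue).
    The 'Test Failure' issue type is the isolated type from option 1."""
    return {
        "fields": {
            "project": {"key": project_key},
            "issuetype": {"name": "Test Failure"},
            "summary": "[auto] %s failed" % test_case,
            "description": "Failed steps: " + ", ".join(failed_steps),
        }
    }

status = rollup_status(["passed", "failed", "passed"])
# status == "failed", so the framework submits a ticket for this run:
ticket = build_failure_ticket("QA", "TC-101 Login", ["step 2"])
```

One ticket per failed test case keeps the granularity aligned with the manual test cases, which makes the per-run counts in your views trustworthy.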
The final look and feel of your dashboard is limited by your defect tracker. Personally, I like Jira's filtering options and graphs.
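Dashboard gadgets in Jira are typically driven by saved filters written in JQL. As a small hypothetical helper (the project key, the run-as-fixVersion convention, and the function name are assumptions), your framework could assemble the filter for the latest run's open failures automatically:

```python
def failures_filter_jql(project_key, build_name):
    """JQL for a dashboard gadget listing this run's unresolved failures.
    Assumes each run was registered as a Jira version (fixVersion)."""
    return (
        "project = %s " % project_key
        + 'AND fixVersion = "%s" ' % build_name
        + "AND resolution = Unresolved "
        + "ORDER BY priority DESC"
    )

jql = failures_filter_jql("QA", "run-20120501-0930")
```

Saving this query as a filter and pointing a pie-chart or statistics gadget at it gives you the traffic-light view with almost no extra work.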
One last concern: what if your management does not like the idea of submitting issue reports automatically? Here is your option: deploy your own bug tracker instance, set up a project with the required breakdown (components, releases, builds…), and integrate it with your test automation suite. Enjoy your new reporting tool, and don't forget to show your solution to everyone potentially interested.