
How do I generate a report that shows a summary of test statuses (passed, failed, untested, etc.) per test suite?


I have a project that uses multiple test suites to manage test cases.
I have Suite A, Suite B, Suite C, etc.
Test Cases in each Suite will be run in multiple browsers/devices, over several months’ time.

So, say I did a Test Run with Test Cases from Suite A in IE 11 in August, on iOS in September, and am working on it in Safari in October.

Which report would tell me, for Suite A: x% of the test cases have not been tested yet, y% have passed, z% have failed, etc. (for all the possible statuses)? Is it possible to have a suite-level summary like that?

I would assume that it would take the most recent result - so if Test Case A1 passed in IE 11 in August but failed in Safari last week, the status used for this summary would be failed.



Thanks for your post! Yes, there's a report for this: the Results > Comparison for Cases report from the Reports tab would be a good fit. The report shows a side-by-side comparison of the runs for a suite, and it looks as follows:

The Latest column would also represent the latest / most recent result.
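To make the "latest result per case" behavior concrete, here's a minimal sketch of that aggregation logic. The result records and case IDs below are hypothetical (they mirror the Suite A scenario above, where case A1 passed in IE 11 and later failed in Safari); the real report pulls this data from TestRail itself.

```python
from collections import Counter

# Hypothetical result records: one dict per submitted result, with a
# case id, a status, and a created_on timestamp (submission time).
results = [
    {"case_id": "A1", "status": "passed", "created_on": 1},  # IE 11, August
    {"case_id": "A2", "status": "passed", "created_on": 2},  # iOS, September
    {"case_id": "A1", "status": "failed", "created_on": 3},  # Safari, last week
]
all_cases = ["A1", "A2", "A3"]  # A3 has no result yet -> untested

# Walk the results in submission order; later results overwrite earlier ones,
# so each case ends up with its most recent status.
latest = {}
for r in sorted(results, key=lambda r: r["created_on"]):
    latest[r["case_id"]] = r["status"]

# Suite-level summary: count each status, treating missing cases as untested.
summary = Counter(latest.get(c, "untested") for c in all_cases)
for status, n in summary.items():
    print(f"{status}: {n} ({n / len(all_cases):.0%})")
```

Run against the sample data, A1 counts as failed (its most recent result), so the suite summary comes out as one passed, one failed, and one untested case.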




This is helpful. Can I ask a few more questions?

  1. Exactly what data is being pulled for that Latest (Coverage) column? Is it per test case - so for each test case, the latest result entered in TestRail is included there? (So it could theoretically be pulling data from various test runs and wrapping them up in that column?)

  2. At the top - in the "Latest Results & per Test Run" section - I see in the first column of your sample report 106 (30%) (light grey), 24 (7%) (dark grey), and 202 (57%) (green). At the very top I see there are a few yellows as well - are they not listed as x (y%) because there are too few?

  3. Is it possible to have just the very top part in the report - my client really only wants the overview at the top (the very first section in your sample, with the 3 columns)? Or is there a way to paginate it so I could just print/save page 1 and send that on?

Thanks for your help with this!



Sure, I’m happy to help:

  1. Yes, this is per test case and the column displays the latest/most recent result across all test runs that are part of the report.

  2. This is not shown in the graph because the number is very small, but the full statistics are also shown below the chart.

  3. While this is not possible as an option out of the box, you can simply remove the unwanted part from the HTML file (which you can download via the arrow button when viewing the report, for example).
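If you'd rather not trim the HTML by hand each time, a small script can do it. This is only a sketch: the `marker` string below is an assumption, not TestRail's actual markup - open the downloaded file once to find the element where the unwanted section actually begins, and adjust the marker accordingly.

```python
def keep_overview(html: str, marker: str = '<div id="details"') -> str:
    """Return everything before the first occurrence of `marker`,
    re-closing the document so the trimmed file stays valid HTML.
    The default marker is hypothetical - check your report's HTML."""
    cut = html.find(marker)
    if cut == -1:
        return html  # marker not found: leave the file untouched
    return html[:cut] + "</body></html>"

# Tiny stand-in for a downloaded report file:
sample = ('<html><body><h1>Overview</h1>'
          '<div id="details">long per-run tables...</div>'
          '</body></html>')
print(keep_overview(sample))
```

You could wire this up to read the downloaded report file and write a trimmed copy to send on to the client.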

I hope this helps and I look forward to your reply!



Tobias (or others),
I am picking up on an old thread.

When you say in #1 above that the column displays the latest/most recent result: does the report logic go by the timestamp of the test result or by when the test run was created? Is there a built-in priority among the various results?

Below we have a situation where it is picking up “skipped” as the latest result even though there is a “passed” result after that.



Hi Satish,

Thanks for your reply! This would go by the timestamp of when the result was submitted to TestRail, and not when the test run was started. So the report would always display the latest result that's submitted for the case in TestRail, regardless of when the test run was started/created. Hope this helps!
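This selection rule can be sketched in a few lines. The records below are hypothetical, mirroring Satish's situation: the "skipped" result belongs to an older run but was submitted later than the "passed" result, so it wins. `run_created` is ignored entirely; only the result's submission timestamp (`created_on` here) matters.

```python
# Hypothetical result records: run_created is when the run was created,
# created_on is when the result was submitted to TestRail.
results = [
    {"run": "Run 1", "run_created": 100, "status": "skipped", "created_on": 250},
    {"run": "Run 2", "run_created": 200, "status": "passed",  "created_on": 210},
]

# The "Latest" column picks the result with the newest submission
# timestamp - not the result from the newest run.
latest = max(results, key=lambda r: r["created_on"])
print(latest["status"])  # "skipped": it was submitted last, despite the older run
```

So a re-test submitted into an old run can still surface as the latest result, which explains the screenshot in the question above.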