When a new report or dashboard is developed for consumption by other users, it is important to perform a few checks to validate the data and design of the included reports.
Report or Dashboard Design Check
Verify that the new report or dashboard conforms to the report requirement / design specifications. Some of the items to check are:
- Verify that the report or dashboard page title corresponds to the content of the reports.
- For reports with charts, the axis should be labelled appropriately.
- The aggregation level of the data in the reports should be as per the report requirements.
- Verify that the report or dashboard page design conforms to the design standards and best practices.
- Validate the presence and functionality of the report download and print options.
- Where applicable, verify that the report help text exists and is appropriate for the report content.
- Verify the existence of any required static display text in the report such as FOIA text.
Example: A new dashboard page was created with too many reports and prompts on one page, which made it difficult for users to gain insights quickly. This affected user adoption.
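The design checklist above can be partially automated. The sketch below assumes report metadata is available as a dictionary (for example, exported from the BI tool's catalog API); the field names used here are hypothetical.

```python
# Minimal design-check sketch: compare a report's metadata against the
# design specification. Field names ("title", "options", etc.) are
# hypothetical stand-ins for whatever the BI tool actually exposes.

def check_report_design(report: dict, spec: dict) -> list:
    """Return a list of design-check failures (empty list = pass)."""
    failures = []
    if report.get("title") != spec.get("title"):
        failures.append(f"title mismatch: {report.get('title')!r}")
    # For reports with charts, the axes should be labelled appropriately.
    for axis in ("x_axis_label", "y_axis_label"):
        if spec.get(axis) and not report.get(axis):
            failures.append(f"missing chart axis label: {axis}")
    # Validate the presence of required options such as download and print.
    for option in spec.get("required_options", []):
        if option not in report.get("options", []):
            failures.append(f"missing option: {option}")
    return failures

report = {"title": "Quarterly Sales", "x_axis_label": "Quarter",
          "options": ["download"]}
spec = {"title": "Quarterly Sales", "x_axis_label": "Quarter",
        "y_axis_label": "Revenue", "required_options": ["download", "print"]}
print(check_report_design(report, spec))
```

Running this against the sample data flags the missing y-axis label and the missing print option, which is exactly the kind of gap a manual design review looks for.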
Report Prompts Check
Prompts are used to filter the data in the reports as needed. They can be of different types, but the most common type of prompt is a select list or dropdown with a list of values. Some of the key tests for prompts are:
- Verify that all the prompts are available as per requirements. Also check if the type of the prompt matches the design specification.
- For each prompt verify the label and list of values displayed (where applicable).
- Apply each prompt and verify that the data in the report is getting filtered appropriately.
- Verify the default prompt selection satisfies the report or dashboard page design specification.
Example: The default selection for the 'Quarter' prompt was supposed to be the current quarter, but it was hardcoded by the report developer to a specific quarter.
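A defect like the one in the example is easy to catch with a small check that computes the expected default dynamically. This is a sketch, assuming the quarter label format `YYYY-Qn`; the actual format depends on the BI tool.

```python
# Compute the current quarter so a default prompt selection can be
# verified against it instead of a hardcoded value.

from datetime import date

def current_quarter(today=None):
    """Return the current quarter label, e.g. '2024-Q3'."""
    today = today or date.today()
    quarter = (today.month - 1) // 3 + 1
    return f"{today.year}-Q{quarter}"

# A hardcoded default such as '2023-Q1' fails this check as soon as the
# calendar quarter changes; a dynamic default always matches.
print(current_quarter(date(2024, 8, 15)))  # 2024-Q3
```

In a test harness, the assertion would be `assert default_prompt_value == current_quarter()`, re-run each quarter.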
Report Data Accuracy Check
Verify that the data shown in the report is accurate. As is evident, this check is a vital aspect of report functional testing.
- Cross check the report with data shown in a transactional system application that is trusted by the users as the source of truth for the data shown in the report.
- Come up with an equivalent database query on the target and source databases for the report. Compare the results from the queries with the data in the report.
- Review the database query generated by the report for any issues.
- Apply reports prompts and validate the database query generated by the report as well as the query output.
Example: A set of canned reports was developed for a new BI project. When data accuracy tests were done comparing the report data with the output of equivalent queries in the source system, it was found that more than 50% of them failed the testing. Upon further investigation, several ETL and BI tool modelling issues were discovered by the development team.
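The core of the data accuracy check is a set comparison between what the report displays and what an equivalent query returns. The sketch below uses an in-memory sqlite3 database to stand in for the real source system; the table, columns, and report rows are hypothetical.

```python
# Data-accuracy sketch: compare the rows a report displays with the
# result of an equivalent aggregation query on the source database.
# sqlite3 stands in for the real source system here.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (region TEXT, amount REAL);
    INSERT INTO orders VALUES ('East', 100), ('East', 50), ('West', 75);
""")

# Data as shown in the report (e.g. exported or scraped from the BI tool).
report_rows = {("East", 150.0), ("West", 75.0)}

# Equivalent aggregation query against the trusted source of truth.
source_rows = set(conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region"))

missing_in_report = source_rows - report_rows
extra_in_report = report_rows - source_rows
print("match" if not (missing_in_report or extra_in_report) else "mismatch")
```

Comparing as sets surfaces both missing and extra rows; for large reports, comparing row counts and column-level sums first is a cheap pre-check before a full row-by-row comparison.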
Drilldown Report Checks
It is common to have links to drilldown reports in a report so that the user can navigate to those reports for further details. Links to these reports can be at the column level or column heading level. For each link to the drilldown report, verify the following items:
- Verify that the counts are matching between the summary and detail report where appropriate.
- Verify that all the prompts from the summary reports are getting applied to the detail report.
- Check if the links to the detail report from the summary report are working from charts, tables, and table headings.
- Verify the database SQL query for the drill down report is as expected.
Example: One of the prompts for the report was not applied to the drilldown report when the user navigated to it from the summary report. As a result, the amounts did not match between the summary and drilldown reports.
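The summary-versus-detail check, including the prompt carry-over failure from the example, can be sketched as follows. The data, prompt names, and summary figure here are hypothetical.

```python
# Drilldown check sketch: the total of the detail rows, with the summary
# report's prompts applied, must equal the figure on the summary report.

detail_rows = [
    {"quarter": "2024-Q1", "region": "East", "amount": 100},
    {"quarter": "2024-Q1", "region": "West", "amount": 75},
    {"quarter": "2024-Q2", "region": "East", "amount": 60},
]

def drilldown_total(rows, prompts):
    """Apply the summary report's prompts to the detail rows, then total."""
    filtered = [r for r in rows
                if all(r[k] == v for k, v in prompts.items())]
    return sum(r["amount"] for r in filtered)

summary_amount = 175              # figure shown on the summary report
prompts = {"quarter": "2024-Q1"}  # must carry over to the drilldown
assert drilldown_total(detail_rows, prompts) == summary_amount

# Forgetting to carry a prompt over inflates the detail total, which is
# exactly the mismatch described in the example above:
print(drilldown_total(detail_rows, {}))  # 235, includes 2024-Q2 rows
```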
Report Performance Checks
Verify that the report and dashboard page rendering times are meeting SLA requirements. Test the performance for different prompt selections. Perform the same checks for the drilldown reports.
Example: The report did not have any default prompt selections, and its performance was extremely slow when no prompts (filters) were applied. This not only caused a bad user experience but also put unnecessary load on the database, because users often stopped execution of the report from the UI without getting any value from it.
Example: Although the BI tool supported both Firefox and IE, the reports were very slow in IE because of the difference in the image caching features of the corresponding browsers.
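A simple timing harness covers the SLA check across prompt selections. The `run_report` call below is a hypothetical stand-in, stubbed out so the harness itself can run; a real test would invoke the BI tool's API or UI automation instead.

```python
# SLA timing sketch: render the report for several prompt selections and
# flag any run that exceeds the SLA. run_report() is a hypothetical stub.

import time

SLA_SECONDS = 5.0

def run_report(prompts):
    """Stand-in for invoking the BI tool with the given prompt values."""
    time.sleep(0.01)  # simulated rendering work
    return {"rows": 42}

def check_sla(prompt_selections):
    """Time each prompt selection and record whether it met the SLA."""
    results = {}
    for prompts in prompt_selections:
        start = time.perf_counter()
        run_report(prompts)
        elapsed = time.perf_counter() - start
        results[str(prompts)] = (elapsed, elapsed <= SLA_SECONDS)
    return results

selections = [{"quarter": "2024-Q1"},
              {"quarter": "2024-Q1", "region": "East"}]
for prompts, (elapsed, ok) in check_sla(selections).items():
    print(f"{prompts}: {elapsed:.2f}s {'PASS' if ok else 'FAIL'}")
```

The same harness, pointed at the drilldown reports and run with the no-prompt selection `{}`, would have caught both failure modes described in the examples above.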
Automate Report Functional Testing with ETL and BI Validator
ETL Validator comes with a Component test case that can be used to compare the output of a report with the result of a database query. By automating this vital test, the quality of the data shown in the reports can be improved tremendously.
BI Validator comes with a Report test plan that can be used to measure the performance of the report for different report parameters.