BI Testing: What, How, Challenges and Techniques

What is BI testing? 

BI Testing (Business Intelligence Testing) is the key to data reliability. It ensures your BI systems deliver accurate insights for informed decision-making.
At Datagaps, we introduce BI Validator, an all-encompassing BI testing and test automation solution. BI Validator is designed to fortify the reliability of BI data across tools such as Tableau, Oracle Analytics, and Microsoft Power BI, enabling you to make well-informed decisions with confidence. Meet our dedicated BI testers to learn more, or start a 14-day free trial.

How to test BI reports and dashboards?


Reports and dashboards are frequently used to make critical business decisions, so one of the most common questions is “How do I test Power BI reports?” Testing Power BI reports is essential to ensure the accuracy and reliability of their insights.


If the reports are wrong, the business decisions based on them will be wrong too. Companies that publish inaccurate BI reports can also face compliance and legal issues, which can result in costly penalties, so BI testing deserves serious attention.

Challenges in BI Projects

Enterprises use BI tools such as Tableau, Microsoft Power BI, Business Objects, Oracle Analytics and Cognos for reporting on their data. One of the key factors for the success of BI projects is the level of trust in the data shown in BI reports and dashboards. Lack of trust in the data reduces user adoption and often results in BI project failures.

Report performance is another common issue in BI projects. The reason for performance issues could be bad design or system capacity limitations. Stress testing of the BI dashboards and reports is important to understand the performance issues ahead of go-live.

Try Datagaps BI Validator, a best-in-class BI testing tool, for all your BI projects – free for 14 days.

BI Testing Techniques

When a new report or dashboard is developed for consumption by other users, it is important to perform a few checks to validate the data and design of the included reports.

Report or Dashboard Design Check

Verify that the new report or dashboard conforms to the report requirements and design specifications. Some of the items to check are:

  1. Verify that the report or dashboard page title corresponds to the content of the reports.
  2. For reports with charts, the axis should be labelled appropriately.
  3. The aggregation level of the data in the reports should be as per the report requirements.
  4. Verify that the report or dashboard page design conforms to the design standards and best practices.
  5. Validate the presence and functionality of the report download and print options.
  6. Where applicable, verify that the report help text exists and is appropriate for the report content.
  7. Verify the existence of any required static display text in the report such as FOIA text.

Example: A new dashboard page was created with too many reports and prompts on one page, which made it difficult for users to gain insights quickly. This affected user adoption.

Prompts Check

Prompts are used to filter the data in the reports as needed. They can be of different types, but the most common type of prompt is a select list or dropdown with a list of values. Some of the key tests for prompts are:

  1. Verify that all the prompts are available as per requirements. Also check if the type of the prompt matches the design specification.
  2. For each prompt verify the label and list of values displayed (where applicable).
  3. Apply each prompt and verify that the data in the report is getting filtered appropriately.
  4. Verify the default prompt selection satisfies the report or dashboard page design specification.

Example: The default selection for the ‘Quarter’ prompt was supposed to be the current quarter but it was hardcoded by the report developer to a specific quarter.
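Prompt check #3 above can be automated with a simple comparison: after applying a prompt, the report's rows should match exactly the source rows that satisfy the filter. A minimal sketch, assuming illustrative row dictionaries rather than a real BI tool API:

```python
# Hypothetical data shapes: each report row is a dict; the prompt is a
# (field, value) pair. verify_prompt checks the prompted report output
# against an independently filtered copy of the source data.

def verify_prompt(report_rows, source_rows, field, value):
    """True if the prompted report shows exactly the source rows matching the prompt."""
    expected = [r for r in source_rows if r[field] == value]
    return report_rows == expected

source = [{"quarter": "Q1", "sales": 10}, {"quarter": "Q2", "sales": 20}]
report_after_prompt = [{"quarter": "Q1", "sales": 10}]  # what the BI tool showed
print(verify_prompt(report_after_prompt, source, "quarter", "Q1"))  # True
```

If the report ignored the prompt (showing all rows), the comparison would fail, surfacing exactly the kind of filtering bug described in the checklist.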

Report Data Accuracy Check

Verify that the data shown in the report is accurate. As is evident, this check is a vital aspect of report functional testing.

  1. Cross-check the report against the data shown in a transactional application that users trust as the source of truth.
  2. Come up with an equivalent database query on the target and source databases for the report. Compare the results from the queries with the data in the report.
  3. Review the database query generated by the report for any issues.
  4. Apply report prompts and validate the database query generated by the report as well as the query output.

Example: A set of canned reports was developed for a new BI project. When data accuracy tests compared the report data with the output of equivalent queries in the source system, more than 50% of the reports failed. Upon further investigation, the development team discovered several ETL and BI tool modelling issues.
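The data-accuracy comparison in step 2 above can be sketched as a small diff between the report output and the source query output. The row shapes, key column, and measure name below are illustrative; in practice the two inputs would come from the BI tool's export and a database cursor.

```python
# Compare a report's rows against the rows returned by an equivalent
# source-system SQL query, keyed by a dimension column ("region" here is
# an assumed example), and report any measure values that differ.

def compare_report_to_source(report_rows, source_rows, key="region"):
    """Return a dict of keys whose 'revenue' values differ between report and source."""
    report = {row[key]: row["revenue"] for row in report_rows}
    source = {row[key]: row["revenue"] for row in source_rows}
    mismatches = {}
    for k in report.keys() | source.keys():
        if report.get(k) != source.get(k):
            mismatches[k] = (report.get(k), source.get(k))
    return mismatches

report_rows = [{"region": "East", "revenue": 100}, {"region": "West", "revenue": 250}]
source_rows = [{"region": "East", "revenue": 100}, {"region": "West", "revenue": 240}]
print(compare_report_to_source(report_rows, source_rows))  # {'West': (250, 240)}
```

A mismatch like the one above is the signal to dig into the ETL or BI model, as in the example.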

Drilldown Report Checks

It is common for a report to include links to drill-down reports so that the user can navigate to them for further details. Links to these reports can be at the column level or the column-heading level. For each link to a drill-down report, verify the following items:

  1. Verify that the counts are matching between the summary and detail report where appropriate.
  2. Verify that all the prompts from the summary reports are getting applied to the detail report.
  3. Check that the links to the detail report from the summary report work from charts, tables, and table headings.
  4. Verify the database SQL query for the drill down report is as expected.

Example: One of the prompts for the report was not applied to the drill-down report when the user navigated to it from the summary report. As a result, the amounts did not match between the summary and drill-down reports.
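The summary-versus-detail check (step 1 above) can be sketched by summing the drill-down rows, with the summary's prompt applied, and comparing against the summary figure. The row and prompt shapes are made up for illustration:

```python
# Verify that a summary report total equals the sum of the drill-down
# (detail) rows, after applying the same prompt the summary used.

def drilldown_totals_match(summary_row, detail_rows, prompt):
    # Apply the summary's prompt (filter) to the detail rows first.
    filtered = [r for r in detail_rows if r["quarter"] == prompt["quarter"]]
    detail_total = sum(r["amount"] for r in filtered)
    return summary_row["amount"] == detail_total

summary = {"quarter": "Q1", "amount": 300}
details = [
    {"quarter": "Q1", "amount": 100},
    {"quarter": "Q1", "amount": 200},
    {"quarter": "Q2", "amount": 50},   # excluded by the Q1 prompt
]
print(drilldown_totals_match(summary, details, {"quarter": "Q1"}))  # True
```

If the prompt were not applied (the bug in the example), the Q2 row would leak into the detail total and the check would fail.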

Report Performance Checks

Verify that the report and dashboard page rendering times meet SLA requirements. Test the performance for different prompt selections. Perform the same checks for the drill-down reports.

Example: The report did not have any default prompt selections, and its performance was extremely slow when no prompts (filters) were applied. This not only caused a bad user experience but also put unnecessary load on the database, because users often stopped execution of the report from the UI without getting any value from it.
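An SLA check like the one described above reduces to timing the report request and comparing against a threshold. In this sketch, `run_report` is a stand-in for the actual call to the BI server, not a real API:

```python
import time

def check_sla(run_report, sla_seconds):
    """Time a report request; return (within_sla, elapsed_seconds)."""
    start = time.perf_counter()
    run_report()
    elapsed = time.perf_counter() - start
    return elapsed <= sla_seconds, elapsed

def fake_report():
    time.sleep(0.05)   # simulates a report that renders in ~50 ms

within_sla, elapsed = check_sla(fake_report, sla_seconds=1.0)
print(within_sla)  # True
```

Running the same check for several prompt selections (and for the drill-down reports) catches the slow, unfiltered case described in the example.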

Browser Checks

Browser compatibility of the reports is often dictated by which browsers the BI tool used for the project supports. Any custom JavaScript added to the report or dashboard page can also result in browser-specific report issues.

Example: Although the BI tool supported both Firefox and IE, the reports were very slow in IE because of the difference in the image caching features of the corresponding browsers.

Automate Report Functional Testing with ETL and BI Validator

ETL Validator comes with a Component test case that can be used to compare the output of a report with the result of a database query. By automating this vital test, the quality of the data shown in the reports can be improved tremendously. BI Validator comes with a Report test plan that can be used to measure the performance of a report for different report parameters.

BI tools such as OBIEE and Business Objects empower business users by providing the capability to create their own reports without the help of a developer. These tools generate the database queries for the reports automatically, based on the measure and dimension definitions defined on top of the physical data model. For OBIEE, this model is defined in the RPD, while Business Objects stores the model in the form of a universe. Business users can select any combination of dimension and measure attributes available in the subject area to create their own adhoc report. From a testing perspective, this presents a huge challenge since the number of different combinations of dimensions and measures can be very large and impossible to test manually.

Subject Area Design Check

Subject areas contain related dimensions and measures that can be used for creating new reports on the fly. Verify that the subject areas follow the requirements and design specifications. Some of the items to check are:

  1. Verify that the subject area, dimension folders, fact folders, dimension attributes and measures follow the standard naming conventions.
  2. Verify that the attributes and measures are placed in the appropriate dimension and fact folders.
  3. Validate the help text for the subject area, folders, attributes and measures.
  4. Check for any unrelated dimensions or measures in the subject area. Alternatively, check for any missing dimensions or measures.

Example: Multiple subject areas were created by different developers with different naming conventions, which was confusing to the end users.
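Naming-convention checks (step 1 above) are easy to automate with a regular expression. The convention enforced below, title-case words separated by spaces, is an assumed example standard, not one prescribed by any particular BI tool:

```python
import re

# Assumed convention: each word starts with a capital letter ("Sales Revenue").
NAME_PATTERN = re.compile(r"^[A-Z][a-z0-9]*( [A-Z][a-z0-9]*)*$")

def non_conforming(names):
    """Return the subject-area names that violate the naming convention."""
    return [n for n in names if not NAME_PATTERN.match(n)]

names = ["Sales Revenue", "Customer Region", "order_date", "QTY"]
print(non_conforming(names))  # ['order_date', 'QTY']
```

Running this across all folders, attributes and measures exported from each subject area would have flagged the inconsistent names in the example before they reached end users.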

Mapping Check

Verify that all the dimension attributes and measures are mapped properly to the available database tables and columns. This can be done by creating adhoc reports that select all the attributes/measures from each folder. Run these reports and validate that the data is as expected.

Example: One of the measures had a typo in its database column name, which caused errors when it was included in a report.

Joins Check

When a combination of dimension attributes and measures are added to a report, the database query generated includes joins between the tables involved. If the joins have issues, it usually results in errors.

  1. Create separate reports by picking all the attributes in each dimension folder and one measure at a time.
  2. Select attributes from multiple dimension folders and one measure at a time.
  3. Review the database query generated and verify that the join conditions make sense.

Example 1: When a new dimension table was added to the BI model, new joins were created with the fact tables. However, the join condition for an existing report was impacted because of this new addition. As a result, the report did not return any results.

Example 2: When OBIEE is unable to determine the join between two entities, it either throws an ODBC error or, in some cases, uses ‘CAST AS NULL’ in the database query. In the latter case, the report won’t throw any error, but it won’t return any data either.
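The silent failure in Example 2 can be caught automatically by scanning the generated SQL for the tell-tale pattern. The exact text the BI tool emits can vary, so the pattern below is an illustrative assumption:

```python
import re

# Assumed marker for an unjoinable column, e.g. "CAST(NULL AS DOUBLE)".
SUSPECT = re.compile(r"CAST\s*\(\s*NULL\s+AS", re.IGNORECASE)

def query_has_broken_join(sql):
    """Flag queries where the BI tool substituted NULL for an unjoinable column."""
    return bool(SUSPECT.search(sql))

good_sql = "SELECT t1.region, SUM(t2.amount) FROM t1 JOIN t2 ON t1.id = t2.id"
bad_sql  = "SELECT t1.region, CAST(NULL AS DOUBLE) FROM t1"
print(query_has_broken_join(good_sql), query_has_broken_join(bad_sql))  # False True
```

Applied to every generated query during a test run, this turns a report that "runs but returns nothing" into an explicit failure.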

Automate Adhoc Report Testing with BI Validator

BI Validator comes with a Subject Area test plan that can automatically generate hundreds or thousands of reports with different combinations of dimension attributes and measures. When these reports are executed, any errors in the subject area can be easily identified.
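The combinatorial nature of this problem can be sketched with `itertools`: enumerate candidate adhoc reports as combinations of dimension attributes paired with each measure. The subject-area contents below are made up for illustration:

```python
from itertools import combinations

dimensions = ["Year", "Quarter", "Region", "Product"]
measures = ["Revenue", "Units Sold"]

def generate_adhoc_reports(dimensions, measures, max_dims=2):
    """Enumerate candidate reports: each is a list of dimensions plus one measure."""
    reports = []
    for size in range(1, max_dims + 1):
        for dims in combinations(dimensions, size):
            for measure in measures:
                reports.append(list(dims) + [measure])
    return reports

reports = generate_adhoc_reports(dimensions, measures)
print(len(reports))  # (4 single-dim + 6 two-dim combos) * 2 measures = 20
```

Even this tiny model yields 20 candidate reports with at most two dimensions; real subject areas with dozens of attributes produce far more, which is why automated generation and execution is the practical approach.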

Like any other web application, BI applications also have authentication and authorization security requirements. BI applications are often integrated with single sign-on or embedded with other transactional applications. It is important to test the security aspects of the BI application just like other web applications.

Report Access Security

The objective of this test is to validate that BI users’ access to reports, subject areas and dashboards is limited according to their access levels. Access to the reports is generally controlled by role-based security in the BI tool.

  1. Understand how the access to reports is different for different roles.
  2. Identify users with those roles for testing.
  3. Log in as these users and validate access to the reports and subject areas.

Example: In an HR analytics application, access to salary and compensation reports is restricted to select users.
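The three steps above can be sketched as a check against an expected access matrix. Here `can_open` is a placeholder for a real login-and-request against the BI server, and the roles and report names are hypothetical:

```python
# Expected access matrix: which reports each role should be able to open.
EXPECTED_ACCESS = {
    "hr_manager": {"Compensation Report", "Headcount Report"},
    "analyst":    {"Headcount Report"},
}

def can_open(role, report):
    # Placeholder: a real test would authenticate as a user with this role
    # and attempt to open the report via the BI tool's UI or API.
    return report in EXPECTED_ACCESS.get(role, set())

def access_violations(all_reports):
    """Return (role, report) pairs where actual access differs from expected."""
    violations = []
    for role, allowed in EXPECTED_ACCESS.items():
        for report in all_reports:
            if can_open(role, report) != (report in allowed):
                violations.append((role, report))
    return violations

print(access_violations({"Compensation Report", "Headcount Report"}))  # []
```

In a real run, a violation such as `("analyst", "Compensation Report")` would flag exactly the kind of exposure described in the HR example.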

Data Security (or Position based Security)

In this form of security, different users have access to a report but the data shown in the report is different based on the person running the report.

Example: In a CRM analytics application, all sales managers have access to a pipeline report but the data shown in the report is limited to the opportunity pipeline of his or her direct reports.

Single Sign-on Security

Single sign-on is often used as the authentication mechanism for BI applications in large enterprises. The objective of this testing is to ensure that users are able to access BI applications using their single sign-on access (or windows authentication).

Integrated Security

BI Applications are sometimes embedded as part of other transaction system applications using a common authentication mechanism. This integration needs to be tested for different users.

BI tools make it easy to create new reports by automatically generating the database query dynamically based on a predefined BI Model. This presents a challenge from a regression testing standpoint because any change to the BI Model can potentially impact existing reports. Hence it is important to do a complete regression test of the existing reports and dashboards whenever there is an upgrade or change in the BI Model.

Regression Testing of Report Data

The primary focus of this test is to verify that the data shown on a report or dashboard page is the same before and after the change (or upgrade). In the case of a BI tool upgrade, it is possible that the look and feel of the report might change. By validating the data of the report, we can be sure that the report content is the same.

Example : When a new dimension table was added to the BI model, new joins were created with the fact tables. However, the join condition for an existing report got impacted because of this new addition. As a result, the report did not return any results.
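A baseline-versus-current comparison of report data can be sketched by snapshotting the rows before the change and diffing after it. Hashing each row keeps the comparison cheap for large reports; the row format is an assumption:

```python
import hashlib
import json

def snapshot(rows):
    """Return a set of stable hashes, one per report row."""
    return {hashlib.sha256(json.dumps(r, sort_keys=True).encode()).hexdigest()
            for r in rows}

baseline = snapshot([{"region": "East", "revenue": 100}])  # before the change
after    = snapshot([{"region": "East", "revenue": 100}])  # after the change
print(baseline == after)  # True -> no data regression detected
```

A mismatch between the two sets pinpoints a data regression even when the report's look and feel changed with the upgrade.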

Regression Testing of Report Format

Verify that there is no change in the look and feel of the report.

Example: After a BI tool upgrade, one of the charts in an existing report stopped showing in the UI.

Regression Testing of Prompts

Verify that all the prompts are available as expected and that the list of values shown in each prompt matches the expected values. Apply the prompts and verify that the report shows data as expected.

Example: One of the filter conditions was accidentally deleted from an existing report. As a result, the prompt values were no longer applied as filters to the report.

Regression Testing of Report Database Query

When data is changing in the underlying tables, it might be difficult to tell whether differences in the report data before and after an upgrade are caused by a regression issue or by the data change. It is helpful to compare the database SQL queries in such cases.

Example : A recent database upgrade adversely affected the query execution plans which resulted in performance issues for the reports.
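Comparing the generated SQL rather than the data separates regression issues from ordinary data change. Normalizing whitespace and case before comparing avoids false alarms from cosmetic differences; the queries below are illustrative:

```python
import re

def normalize_sql(sql):
    """Collapse whitespace and lowercase the query so cosmetic diffs are ignored."""
    return re.sub(r"\s+", " ", sql.strip()).lower()

before = "SELECT region, SUM(amount)\nFROM   sales\nGROUP BY region"
after  = "select region, sum(amount) from sales group by region"
print(normalize_sql(before) == normalize_sql(after))  # True -> same query
```

When the normalized queries differ, the change itself (a dropped filter, an altered join) points directly at the regression.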

Regression Testing of Report Performance

Any upgrade or change in the system can cause regression in performance of existing reports. It is important to verify that the reports are still performing as expected.

Example : A recent database upgrade adversely affected the query execution plans which resulted in performance issues for the reports.

Regression Testing of Security

The BI security testing section delves into various aspects of security testing. From a regression testing standpoint, the objective is to validate that security is not adversely impacted.

Example : While migrating new report content to production, the BI administrator accidentally modified the security setting for existing HR reports. As a result, employee compensation reports were visible to all BI users.

Automate BI Regression Testing with BI Validator

BI Validator can be used to automate the regression testing of UI, data and performance of the reports and dashboards. Any differences found during the testing are shown visually to the user.

  1. Report Test Plan: Report test plan can be used to baseline the report data, PDF/Excel output prior to a change and compare it with the latest report after the change.
  2. Upgrade Test Plan: Upgrade test plan can be used to compare the report data, PDF or Excel output between a pre-upgrade and post-upgrade environment.
  3. Physical Query Test Plan: Physical Query test plan can be used to baseline the report database query and the metadata prior to a change and compare it with the latest report query after the change.
  4. Dashboard Test Plan: Dashboard test plan can be used to baseline the dashboard page in the PDF format prior to a change and compare it with the latest dashboard PDF after the change.

BI stress testing is similar to stress testing of any other web-based application. The objective is to simulate concurrent users accessing reports with different prompts and to understand the bottlenecks in the system.

Simulating User Behaviour

A typical BI user logs in to the system, navigates to reports (or dashboards), applies prompts and drills down to other reports. After a report is rendered, the BI user reviews the data for a certain time called the think time. The BI user eventually logs out of the system. The first step in conducting a stress test is to identify a list of the most commonly used reports or dashboards for the testing.

Simulating Concurrent User Load

Conducting a stress test requires simulating the above BI user behaviour concurrently for different user loads, so it is important to have a list of different user logins for the stress test. When executing the reports, each user can pick a different set of prompt values for the same report. The stress test should be able to randomize the prompt selection for each user so as to generate more realistic behaviour.
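The concurrent-user simulation can be sketched with threads: each virtual user picks a random prompt, runs a report, "thinks", and records the result. The report request here is a `time.sleep` placeholder, not a real call to a BI server:

```python
import random
import threading
import time

PROMPT_VALUES = ["Q1", "Q2", "Q3", "Q4"]   # assumed prompt list for one report
results = []
lock = threading.Lock()

def virtual_user(user_id, think_time=0.01):
    prompt = random.choice(PROMPT_VALUES)   # randomized prompt per user
    start = time.perf_counter()
    time.sleep(0.005)                       # stand-in for the report request
    elapsed = time.perf_counter() - start
    time.sleep(think_time)                  # user reviews the report
    with lock:                              # threads share the results list
        results.append((user_id, prompt, elapsed))

threads = [threading.Thread(target=virtual_user, args=(i,)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(results))  # 5
```

Scaling the thread count and collecting the per-user response times gives the raw data for the bottleneck analysis described below.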

BI Cache

Most BI tools support a caching mechanism to improve the performance of the reports. Database queries and the data blocks used to generate the results are also cached in the database. A question that frequently comes up is whether caching should be turned off for stress testing. While turning off caching is a good way to understand system bottlenecks, our recommendation is to perform the stress test with caching turned on because it is closer to what a production system looks like. The following points should be kept in mind to reduce the impact of caching:

  1. Include a large set of reports and dashboard pages.
  2. Randomize prompt values for a given report or dashboard page so that the same report is requested with different filter conditions.
  3. Use a large pool of user logins so that different user-based security is applied during the execution.

Stress testing is an iterative process. When the first round of stress testing is conducted, a particular component of the system (e.g. the database cache) might max out and degrade the response times. Once this bottleneck has been addressed, another round of stress testing might find another bottleneck. This process needs to continue until the target concurrent user load and report performance SLAs are met.


There are several performance data points that need to be captured while running a stress test:

  1. Initial and final response times for each of the reports and dashboard pages.
  2. Time spent for each request in the database and in the BI tool.
  3. CPU and memory utilization on the server machines (BI tool and database).
  4. Load balancing across all the nodes in a cluster.
  5. Number of active sessions and number of report requests.
  6. Busy connection count in the database connection pool and currently queued requests.
  7. Network load.
  8. Database cache and disk reads.

BI Stress testing with BI Validator

Stress testing is one of the key features of BI Validator. Since BI Validator integrates with the BI tool's metadata (catalog), users can execute a stress test without writing a single line of code.

Download BI Validator

Get a competitive advantage with BI Validator. Download your 14-day free trial now.