Subject Areas in Oracle OBI are great for business users. They hide the complexity involved in warehousing projects and present an easy-to-use mechanism for creating ad hoc analyses based on the business user's needs.
From an IT perspective, testing subject areas is fairly complex. Every Subject Area is really a grouping of Dimension Folders (with attributes) and Measures, also known as Facts. The dimension folders may have hundreds of attributes and are connected to fact tables via foreign keys. Typically, this is referred to as the Star Schema in the industry. Engineers, Business Analysts, and Quality Assurance teams struggle to ensure that business users do not encounter unpleasant surprises in the form of SQL errors when they try to create and execute analyses from the subject areas.
To create a new analysis, business users are empowered to pick any subject area and select any combination of dimension attributes and facts as part of the analysis.
Now, here is the challenge: assume that there are 4 dimensions, each with 20 attributes, and a "Fact" table that has referential integrity with all four dimensions. To address a new Service Request from the business, IT has to tweak a few joins. How can subject areas be tested so that enhancements, bug fixes, and changes do not cause any regression?
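To get a rough sense of why exhaustive manual testing is infeasible here, consider the number of possible ad hoc column combinations a user could pick (the measure count below is a hypothetical figure for illustration):

```python
# 4 dimensions x 20 attributes each, plus an assumed 10 fact measures.
n_attributes = 4 * 20
n_measures = 10          # hypothetical measure count for illustration
total_columns = n_attributes + n_measures

# Every non-empty subset of columns is a potential ad hoc analysis.
possible_analyses = 2 ** total_columns - 1
print(f"~{possible_analyses:.2e} possible column combinations")
```

Even with modest counts like these, the number of combinations is on the order of 10^27, so no team can test them all by hand; a representative subset is the only practical option.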
From our experience, here are some of the common issues encountered while creating an analysis:
• A specific combination of attributes and facts may result in ODBC errors.
• Depending on how the RPD is modeled, the physical query may contain 'CAST AS NULL' expressions when the BI Server is not able to determine the right way to join tables.
To address the above issues/risks, we have introduced a new test plan “Subject Area Test Plan” in BI Validator. It drastically simplifies the testing of Subject Areas by automatically generating logical queries using various combinations of dimension attributes and facts as shown below:
• Dimension Attributes Only – Create one logical query per dimension folder by selecting all the attributes in that folder.
• Fact Measures Only – Create one logical query per fact folder by selecting all the measures in that folder.
• Single dimension to Single Fact Only – Pick all the dimension attributes for a dimension folder in combination with one fact or measure at a time. This category should produce the largest number of logical queries.
• Single dimension to Multiple Fact Table – Pick all the dimension attributes for a dimension folder in combination with all the facts in a fact folder.
• Multiple dimension to Single Fact Table – Pick one attribute from each of the dimension folders and one fact at a time.
• Multiple dimension to Multiple Fact Table – Pick one attribute from each of the dimension folders in combination with all the facts in a fact folder.
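The combination categories above can be sketched programmatically. The snippet below is a minimal illustration of two of them; the folder, attribute, and measure names are hypothetical, and BI Validator does this internally against the actual subject area metadata:

```python
# Hypothetical subject area metadata; in practice this comes from the RPD.
dimensions = {
    "Customer": ["Customer Name", "Region"],
    "Product":  ["Category", "Brand"],
}
facts = {"Sales Facts": ["Revenue", "Units Sold"]}
SUBJECT_AREA = "Sample Sales"

def dimension_only_queries():
    """'Dimension Attributes Only': one logical query per dimension folder,
    selecting all the attributes in that folder."""
    return [
        'SELECT {} FROM "{}"'.format(
            ", ".join(f'"{dim}"."{a}"' for a in attrs), SUBJECT_AREA)
        for dim, attrs in dimensions.items()
    ]

def single_dim_single_fact_queries():
    """'Single dimension to Single Fact Only': all attributes of one
    dimension folder combined with one measure at a time."""
    queries = []
    for dim, attrs in dimensions.items():
        for fact, measures in facts.items():
            for m in measures:
                cols = [f'"{dim}"."{a}"' for a in attrs] + [f'"{fact}"."{m}"']
                queries.append(f'SELECT {", ".join(cols)} FROM "{SUBJECT_AREA}"')
    return queries
```

Each generated logical query can then be executed against the BI Server; any ODBC error it returns flags a broken attribute/fact combination.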
While these combinations of dimension attributes and facts do not cover all possible combinations, they represent a basic set of tests that can be used to validate the subject area with minimal manual effort.
We developed this test plan based on decades of experience in the BI space and are really excited about it. Take it for a test drive today.
As most of us in the BI community are aware, In-Place upgrades are fairly common when clients want to go through minor upgrades (e.g., a patch-set upgrade within OBIEE 11g, or SAP BusinessObjects 4.1 SP5 to 4.1 SP7). However, there are a number of risks which need to be carefully addressed in such scenarios. Below are a few examples of what can potentially go wrong and how you can leverage BI Validator to validate "In-Place" upgrades.
Unlike an "Out-of-Place" upgrade, there is NO reference environment, so comparing dashboards, reports, performance, etc. becomes challenging. As an example, let's say you have 50 dashboards with extensive customization; what is your confidence level that the dashboards have not regressed after a minor OBIEE upgrade?
Often, defects or regressions in the Presentation Services module of the newer OBI version may accidentally change the layout of reports. For example, one minor OBIEE upgrade caused an extra column to be added in the table view. In our experience, such issues slip through the cracks and go into production completely unnoticed. The change may be minor, but it leaves a negative impression on the business user's mind.
In a few cases, a minor upgrade may have an unintended impact on logical queries, which can result in unwarranted data differences in the upgraded environment. For example, in one OBIEE release, if you exclude a measure column with "count distinct", the chart shows wrong data: it works fine in edit mode but shows wrong numbers when run.
While in-place upgrades are inherently risky, the time allocated for Business Analysts, Quality Analysts, and Testing Teams is relatively short, and they may not be able to go through complete testing cycles. As an example, security (access) issues could go undetected until a business user files a service request indicating that they do not have access to a specific report in production.
Since production will also be upgraded, a quick set of regression tests needs to run before releasing the environment to the users. However, the business cannot afford the long downtimes generally needed to validate the environment manually after an upgrade.
Now, how can BI Validator help in the above situations?
BI Validator makes testing the above scenarios amazingly easy for you. Using our patented ELV architecture, you can baseline your dashboards and reports prior to the upgrade using simple user interfaces and then, run the tests after the upgrade. BI Validator presents any differences in a very intuitive manner so that the user can understand the exact differences and act accordingly.
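The baseline-then-compare workflow can be illustrated with a small sketch. This is not BI Validator's actual ELV implementation, just a generic illustration of the idea: snapshot each report's result set before the upgrade, then re-run and compare afterwards. All function and file names here are assumptions:

```python
import hashlib
import json
import pathlib

def save_baseline(name, rows, baseline_dir="baselines"):
    """Snapshot a report's result set before the upgrade."""
    path = pathlib.Path(baseline_dir)
    path.mkdir(parents=True, exist_ok=True)
    payload = json.dumps(rows, sort_keys=True)
    (path / f"{name}.json").write_text(payload)
    # Return a checksum so baselines can be verified for tampering/corruption.
    return hashlib.sha256(payload.encode()).hexdigest()

def matches_baseline(name, rows, baseline_dir="baselines"):
    """Re-run the report after the upgrade and compare against the snapshot."""
    saved = (pathlib.Path(baseline_dir) / f"{name}.json").read_text()
    return saved == json.dumps(rows, sort_keys=True)
```

A tool doing this across hundreds of dashboards, and presenting the differences visually rather than as raw JSON, is what turns this idea into a practical regression test.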
Stay tuned to learn how BI Validator can help during an "Out-of-Place" upgrade.
Try BI Validator today. It hardly takes 5 minutes to get up and running.
We just came back last night from BIWA SIG (http://www.biwasummit.com), a BI related event that was held in Redwood City, CA. The organizers did a great job of putting together wonderful sessions spanning all areas of Analytics.
Balaji Yelamanchili shared the latest and greatest from Oracle. It was nice to learn that Oracle is investing in improving the user experience of the BI Platform & Applications and enhancing support for mobile platforms. I saw a demo of Endeca for the first time, and I should say that it is pretty cool! I am not sure about the underlying algorithms and how well it interprets unstructured text, but the UI is awesome!
On our solution portfolio, we got very positive feedback from a few bloggers, potential clients, and partners, and that is always encouraging.
BIWA SIG is definitely part of our yearly event circuit now. We sponsored the event last year and this year, and most likely will do so next year as well. The only thing that disappointed us was the attendance. Other than Oracle folks and partners, there were very few customers. Hopefully next year we will better collaborate with the organizers and help them spread the word so that more folks can benefit from this extremely focused and well organized event!
Next BIWA SIG:
When: January 27th – 29th of 2015. Visit http://www.biwasummit.com for further updates.
Where: Oracle Conference Center, Redwood City, CA
Our vision is empowering our customers with trustworthy Business Intelligence and Data.
Who We Are:
Trusted Globally by more than 70 Companies for Data Test Automation Solutions
Datagaps is passionate about data-driven testing automation. Our flagship solutions, ETL Validator, Data Flow, and BI Validator, are designed to help customers automate the testing of ETL, BI, Database, Data Lake, Flat File, and XML data sources. Our tools support data warehousing projects and BI platforms such as Snowflake, Tableau, Amazon Redshift, Oracle Analytics, Salesforce, Microsoft Power BI, Azure Synapse, SAP BusinessObjects, and IBM Cognos.
Datagaps was founded in 2010 with the mission of building trust in enterprise data and reports. We provide software for ETL Data Automation, Data Synchronization, Data Quality, Data Transformation, Test Data Generation, and BI Test Automation. Datagaps is an innovative company focused on providing the highest customer satisfaction.
Below is a simple and real example that illustrates how you can leverage BI Validator for regression testing of dashboards.
Assume that you have 250 dashboards which are widely used by your marketing, sales, finance, and other divisions within your enterprise. So far, things have been running well, but the current version of your ETL tool will not be supported after July 1st, 2013, which is 45 days away, and there is no option for you but to upgrade from 8.x to 9.x in that timeframe. Sounds familiar? 🙂
The IT team is now under tremendous pressure to do the upgrade in a very short period of time and yet, ensure that the process has no negative implications on the dashboards and reports used in the enterprise.
Given that there is no big budget and not much time left, what is the best way to solve this problem?
In this situation, IT teams can leverage BI Validator to benchmark all the dashboards prior to the ETL tool upgrade and very easily compare them to the dashboards after the upgrade. If there are any differences, BI Validator displays them in an intuitive way that makes it easy for the user to understand and act accordingly.
BI Validator thus eliminates the need for two common manual approaches to identifying differences between dashboards:
• Printing dashboards and comparing them manually.
• Switching between multiple environments in a browser and comparing them manually.
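The manual comparison steps above boil down to a cell-level diff over exported table data. A toy illustration of the kind of comparison the tool automates (row-count mismatch handling is omitted for brevity):

```python
def diff_snapshots(before, after):
    """Report cell-level differences between two dashboard table snapshots
    (lists of rows). Returns (row, col, old_value, new_value) tuples.
    Note: rows beyond the shorter snapshot are ignored in this sketch."""
    diffs = []
    for r, (old_row, new_row) in enumerate(zip(before, after)):
        for c, (old, new) in enumerate(zip(old_row, new_row)):
            if old != new:
                diffs.append((r, c, old, new))
    return diffs
```

For example, comparing a pre-upgrade snapshot against a post-upgrade one pinpoints exactly which cell changed, instead of leaving the user to eyeball two printouts.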
In addition to the above, you can run a number of other tests as well to quickly validate and minimize the risk involved with upgrades and other under-the-hood changes. We care about usability and made this tool incredibly easy to use!
This article walks you through the consistency check warnings found in the BI Apps RPD after running the 11g upgrade assistant on a 10.1.3.4.1 RPD.
Warning 39003 – Missing functional dependency association of a column. Any Logical Table Source that was unused in 10g is disabled during the upgrade. Cleaning up such tables before the upgrade might help avoid this warning. Solution: Enable all the Logical Table Sources that were disabled during the upgrade.
Warning – Column does not have a valid data type. I encountered this error because of the disabled Logical Table Sources (from Warning 39003). The data types for all the columns belonging to the disabled LTSs were set to null. Solution: Fixing Warning 39003 should fix this error; otherwise, check whether the column has a valid data type.
Warning 39057 – There are physical tables mapped in Logical Table Sources that are not used in any column mappings or expressions. Solution: The recommended workarounds for this issue:
• Delete the unused logical tables, and delete the aggregations set for that dimension in the Content tab of the fact.
• Create dummy mapping columns from that table.
Metalink note: Warning 39057 [ID 1386095.1]. Cleaning up any unused logical tables before the upgrade might help avoid running into this warning.
Warning – Logical fact table has an aggregate source that does not join to a dimension source at the proper level of detail. Solution: Make sure there is a physical join between the fact Logical Table Source and the dimension's Logical Table Source. Make sure the aggregation levels are set at the same grain for both the dimension and the fact on the Content tab.
Warning – Logical level does not have any key with the 'Use for Display' property on. The 'Use for Display' option is turned off during the upgrade. Solution: Turn on the 'Use for Display' option for all the affected keys.
Warning – Logical dimension table has a source that does not join to any fact source. Either the logical levels on the Content tab for the LTS and the fact table are not set properly, or the join between the LTS and the fact table is missing in the physical layer. Solution: Fix the logical levels on the Content tab for the Logical Table Sources and the fact table; both should be set to the same aggregation level. Also check for any missing foreign key joins between the LTS and the fact table in the physical layer.
Warning – Initialization block uses a Connection Pool that is also used for report queries. This may impact query performance. Solution: Update the initialization blocks to use a dedicated connection pool. If a connection pool for the init block does not exist, create a new one so that the init block does not share the connection pool used for queries.
Warning – The features in the database do not match the defaults. This may cause query problems. Solution: Make sure all the required database features are turned on.
Warning – Logical dimension table has a source with aggregation content set to level 'Detail', while the same dimension has aggregation content set to a higher level in a fact source. The aggregation levels are set at different grains for the fact and dimension tables. Solution: Update the Content tab for the dimension and the fact so they are at the same level.