Database Testing – Ensure acceptable variance in Data differences


At a recent event, one of the attendees came over to our booth and asked me to demonstrate a use case that I found quite interesting. I received a similar request from another prospect recently and thought it might be a good idea to blog about it and show how it can be done using ETL Validator.


Problem: There is a source table and a target table. The attendee wanted to find the difference in a numeric field between the two tables. In addition, he wanted to specify an acceptable variance and define a rule based on it. If the difference is within the limit, the test case should be marked as a success; if the difference exceeds the variance, the test case should be marked as a failure. In just a few minutes, we were able to demonstrate this use case using version 3.4 of ETL Validator.

Step 1: To simulate the source, we created a simple table and inserted a few records.

  • Created table table_a (books Varchar2(20), quantity Number);
  • Data Insertion
    • insert into table_a (books, quantity) values ('Tier 1', 10);
    • insert into table_a (books, quantity) values ('Tier 2', 15);
    • insert into table_a (books, quantity) values ('Tier 3', 12);

Step 2: To simulate the target:

  • Created a table table_b (books Varchar2(20), quantity Number);
  • Inserted a few sample records
    • insert into table_b (books, quantity) values ('Tier 1', 10);
    • insert into table_b (books, quantity) values ('Tier 2', 25);
    • insert into table_b (books, quantity) values ('Tier 3', 30);
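
As an aside, the underlying check can be sketched in plain SQL against these two tables. The query below is only an illustration of the idea, with an assumed acceptable variance of 10; it is not how ETL Validator evaluates the rule internally:

    select a.books,
           a.quantity as source_quantity,
           b.quantity as target_quantity,
           abs(a.quantity - b.quantity) as difference
    from   table_a a
    join   table_b b on a.books = b.books
    where  abs(a.quantity - b.quantity) > 10;  -- rows breaching the assumed variance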

Ok, now that the sample data set was ready, it was time to showcase the Component Test Case in ETL Validator:

Step 3: I launched ETL Validator

  • Opened up ETL Validator and started with the Component Test Case, since it is one of my favorites.

Step 4: I dragged and dropped DB components in the test case wizard.

  • Created a DB component to reflect the Source. It was a wrapper on a simple query (select * from table_a).
    • Noted the “Result Table Name” in the properties for that component. {TCOMP_40531_40508}
  • Created a DB component to reflect the Target. It was a wrapper on a simple query (select * from table_b).
    • Noted the “Result Table Name” in the properties. {TCOMP_40531_40509}
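
The output of each wrapped query is staged into its generated result table, which is why the names are worth noting: downstream components can query those tables like any other table. For example, a quick preview of the Source component's staged output might look like the line below (the generated name varies per test case, so treat it as illustrative):

    select * from TCOMP_40531_40508;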

Step 5: Dragged and dropped another database component to capture the differences.

  • select a.books, (a.quantity-b.quantity) as Quantity from T_COMP_40531_41058 a, T_COMP_40531_41059 b where a.books=b.books
  • Now, we were ready to apply the rules on top of this component.

Step 6: Dragged and dropped a “Data Rules Component” to define rules on the variance. This is something unique to ETL Validator.

  • Double-clicked on the entity and defined a “Data Rule” on the Quantity column. The rule was something like “Quantity is greater than” 10.
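
Conceptually, the Data Rule acts as a filter over the difference component's output. A rough plain-SQL approximation is sketched below; the result table names are the ones used in the Step 5 query, and how ETL Validator actually marks rule violations may differ from this sketch:

    -- rows from the computed differences that meet the rule condition (Quantity greater than 10)
    select diff.books, diff.Quantity
    from   (
             select a.books, (a.quantity - b.quantity) as Quantity
             from   T_COMP_40531_41058 a, T_COMP_40531_41059 b
             where  a.books = b.books
           ) diff
    where  diff.Quantity > 10;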

Step 7: Then, I just switched into the run mode and clicked “Run” at the bottom of the screen.

The results were displayed and clearly showed the records that did not match. With very few steps, we were able to demonstrate how to achieve a relatively complex scenario within a single test case in ETL Validator; pretty cool.


Established in 2010 with the mission of building trust in enterprise data and reports, Datagaps provides software for ETL Data Automation, Data Synchronization, Data Quality, Data Transformation, Test Data Generation, and BI Test Automation. We are an innovative company focused on providing the highest customer satisfaction, and we are passionate about data-driven test automation. Our flagship solutions, ETL Validator, DataFlow, and BI Validator, are designed to help customers automate the testing of ETL, BI, Database, Data Lake, Flat File, and XML data sources. Our tools support data warehousing projects and BI platforms such as Snowflake, Tableau, Amazon Redshift, Oracle Analytics, Salesforce, Microsoft Power BI, Azure Synapse, SAP BusinessObjects, and IBM Cognos. www.datagaps.com
