Data Observability in your Tableau Reports

Before observability and anomaly detection come into play, reports have to be validated on many
dimensions, which can be achieved using the various tools and test plans provided by Datagaps:

    • Validation of the reports against datasets and other reports  
    • Validation of reports against business rules and logical rules 
    • Validation of reports through upgrades and regression tests  
    • Metadata and Aesthetics Standardization  
    • Performance and Security Optimization  
Objectives of Tableau report validation

The full set of details is explained in a webinar conducted by Datagaps. In this blog, we focus on data observability in Tableau reports. 

Pulling Datasets from Tableau

The seamless integration offered by Datagaps allows users to pull the underlying datasets directly from their reports.

This is done via JavaScript, REST APIs, and Selenium as an add-on.

The user can parameterize the reports, filters, and workbooks to be picked up. The application allows users to navigate through their workspaces to pick up reports.
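To make this concrete, here is a hedged sketch of how a report's data can be pulled programmatically. It uses Tableau's REST API "Query View Data" endpoint, which returns a view's data as CSV; the function names, the `API_VERSION` value, and the filter example are illustrative assumptions, not Datagaps' actual implementation.

```python
import urllib.parse
import urllib.request

API_VERSION = "3.19"  # assumption: match your Tableau Server's REST API version

def view_data_url(server, site_id, view_id):
    """Endpoint for the REST API's 'Query View Data' call, which returns
    the data behind a view as CSV."""
    return f"{server}/api/{API_VERSION}/sites/{site_id}/views/{view_id}/data"

def pull_view_csv(server, token, site_id, view_id, filters=None):
    """Download a report's dataset as CSV text. `filters` parameterizes the
    view, e.g. {"vf_Region": "West"} applies a 'Region' view filter."""
    url = view_data_url(server, site_id, view_id)
    if filters:
        url += "?" + urllib.parse.urlencode(filters)
    # The auth token comes from a prior call to the REST API signin endpoint.
    req = urllib.request.Request(url, headers={"X-Tableau-Auth": token})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```

Parameterizing `filters`, `site_id`, and `view_id` is what lets the same extraction run against many reports, filters, and workbooks.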

Add view: Shipmode - Department Sale; Region - Shipmode

Note: this application works with Power BI as well. 

Profiling Reports 

Profiling a dataset in the DataOps Suite creates checkpoints on nulls, volume, distinct percentages, aggregates, patterns, lengths, extremes, and distributions of the datasets. 

The application allows a user to define which metrics and aggregates to track over the course of multiple runs. It also catches anomalies using IQR and other statistical methods over a moving sample of records.  

Once the application samples the dataset, upper and lower bounds are set for each tracked metric aggregate, and any anomaly causes the test case to fail. Users are notified immediately, and auto-generated reports accompany the notifications. 
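The IQR-based checkpoint described above can be sketched in a few lines. This is a minimal illustration of the technique, not the DataOps Suite's actual code: bounds come from metric values observed in prior runs, and a current value outside the Tukey fences fails the check.

```python
def iqr_bounds(samples, k=1.5):
    """Tukey fences [Q1 - k*IQR, Q3 + k*IQR] over prior metric values."""
    xs = sorted(samples)
    def quantile(q):
        pos = q * (len(xs) - 1)
        lo, hi = int(pos), min(int(pos) + 1, len(xs) - 1)
        return xs[lo] + (xs[hi] - xs[lo]) * (pos - lo)
    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    return q1 - k * iqr, q3 + k * iqr

def check_metric(history, current):
    """Return (passed, lower, upper) for the latest metric value."""
    lower, upper = iqr_bounds(history)
    return lower <= current <= upper, lower, upper

# Example: row counts from prior runs, then a sudden volume drop.
history = [1000, 1020, 980, 1010, 995, 1005]
ok, lo, hi = check_metric(history, 400)  # 400 falls below the lower fence
```

The same pattern applies to any profiled metric: null percentage, distinct count, an aggregate, or a value-length statistic.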

Overall Dataflow

Profiling Results

Metric Trends

ML-based Anomaly Detection 

While profiling helps keep track of datasets in terms of aggregates and patterns using statistical methods, proper anomaly detection requires a machine learning system that understands the seasonality of the metric in question and its dependent variables.  


The application can train itself on a dataset and automatically find the dependent variables, or define a time-series prediction system out of the box.  

An easy-to-define system helps the user assign the measures, categorizations, as-of dates for statistics, and filter conditions. Based on the categorization, the training weights work differently. 

The application trains itself on the datasets over the course of multiple runs and updates the upper and lower expected bounds of the metrics accordingly. Seasonality and the dependence on other variables also affect the training system over various categorizations and segregations.  
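The idea of seasonality-aware bounds can be illustrated with a deliberately simple stand-in: per-season mean and standard deviation from historical runs define a band per category, and a value outside its season's band is flagged. The Suite's ML model is more sophisticated than this sketch, and the function names and data here are hypothetical.

```python
from collections import defaultdict
from statistics import mean, stdev

def seasonal_bounds(history, z=3.0):
    """history: (season, value) pairs, e.g. season = month number.
    Returns {season: (lower, upper)} expected bands from a z-score band."""
    by_season = defaultdict(list)
    for season, value in history:
        by_season[season].append(value)
    return {s: (mean(v) - z * stdev(v), mean(v) + z * stdev(v))
            for s, v in by_season.items() if len(v) > 1}

def is_anomaly(bounds, season, value):
    """A value is anomalous if it leaves its own season's expected band."""
    lo, hi = bounds.get(season, (float("-inf"), float("inf")))
    return not (lo <= value <= hi)

# Toy data: monthly sales hover around 50, except December, which spikes.
# 120 is normal for December but would be anomalous in any other month.
history = [(m, 50 + (70 if m == 12 else 0) + d)
           for m in range(1, 13) for d in (-2, 0, 2)]
bounds = seasonal_bounds(history)
```

Training per category is what prevents an expected December spike from being reported as an anomaly while the same number in June still fails.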

If a metric falls outside the bounds, the test case fails, and a notification is sent to the user stating exactly which metrics in which category were flagged as anomalies. Reports are automatically generated and can be sent through the preferred channel, such as Slack, Teams, or email, via plugins. 

Detection Option

Anomaly Found

Anomaly over the course of multiple cadences

Even with validation rules, migration validations, comparison checks, metadata checks, security checks, and performance checks, answering the question “Are my reports correct?” requires a look at historical records and patterns. Observability plays a huge role in understanding changing datasets and constantly evolving definitions of data. This should not be limited to raw records or finalized views, but should also extend to the reports created from them.  

Using the above methodology, you can achieve an even greater sense of confidence in your reports and trust the dataset and trends based not just on rules and transfer checks, but also on past trends, patterns, and machine learning-based anomaly detection. 

For more info on the various ways you can validate your BI reports in the aforementioned dimensions, be it Tableau or Power BI, refer to the following: DataGaps Webinars – YouTube 

Datagaps

Datagaps was established in 2010 with the mission of building trust in enterprise data & reports. It provides software for ETL Data Automation, Data Synchronization, Data Quality, Data Transformation, Test Data Generation, & BI Test Automation. An innovative company focused on the highest customer satisfaction, we are passionate about data-driven test automation. Our flagship solutions, ETL Validator, Data Flow, and BI Validator, are designed to help customers automate the testing of ETL, BI, Database, Data Lake, Flat File, & XML Data Sources. Our tools support data warehousing projects and BI platforms including Snowflake, Tableau, Amazon Redshift, Oracle Analytics, Salesforce, Microsoft Power BI, Azure Synapse, SAP BusinessObjects, and IBM Cognos. www.datagaps.com 

Queries: contact@datagaps.com