“A recent survey by TDWI found that 66% of organizations are looking for ways to improve data quality and trust. Data Validation using Data Quality testing tools such as Datagaps DataOps suite is essential to ensure trust in your data and analytics.”
7 Data Quality Dimensions
Data Quality dimensions provide a way to categorize data validation rules and measure data quality. Seven dimensions are commonly used for this purpose.
Completeness refers to the existence of all required attributes in the population of data records. A data element is either:
- always required, or
- required based on the condition of another data element.
Example:
- Person record with a null First Name
- The Person record is missing a value for marital status: the Married (Y/N) field should contain a non-null value of 'Y' or 'N' but is null instead.
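To make the rule concrete, here is a minimal sketch of a completeness check in Python with pandas; the DataFrame and the first_name/married columns are hypothetical stand-ins for illustration, not the DataOps suite's API.

```python
import pandas as pd

# Hypothetical person records (illustrative columns, not a real schema)
df = pd.DataFrame({
    "first_name": ["Alice", None, "Carol"],
    "married":    ["Y", "N", None],
})

# Completeness: every required attribute must be populated.
missing_first_name = df["first_name"].isna()
invalid_marital = ~df["married"].isin(["Y", "N"])  # null or unexpected value

print(df[missing_first_name | invalid_marital])  # records failing the rule
```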
Conformity means the data follows standard data definitions such as data type, size, and format; all data values conform to the requirements of their respective fields.
Example:
- Date of Birth is listed as "26/05/1990" but should be in the format "mm/dd/yyyy"
- Zip Code contains letters but it should be numeric
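A minimal sketch of a conformity check, again using pandas with hypothetical column names; a production rule would be driven by the field's metadata rather than hard-coded patterns.

```python
import pandas as pd

df = pd.DataFrame({
    "date_of_birth": ["05/26/1990", "26/05/1990"],
    "zip_code":      ["30301", "3O3O1"],
})

# Conformity: values must match the declared format of their field.
dob_ok = pd.to_datetime(df["date_of_birth"], format="%m/%d/%Y",
                        errors="coerce").notna()               # mm/dd/yyyy
zip_ok = df["zip_code"].str.fullmatch(r"\d{5}").astype(bool)   # numeric only

print(df[~(dob_ok & zip_ok)])  # rows violating the format rules
```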
Validity is determined by how closely data values correspond to reference tables, lists of golden values documented in metadata, value ranges, and the like. All data values are valid in relation to their reference data.
Example:
- Country Code should be a valid value from the reference data for countries
- Age for a Person should be less than 100 years
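As an illustration, a validity rule can be expressed as membership in a reference set plus a range constraint. The sketch below uses a stand-in reference table; a real check would read from the documented golden values.

```python
import pandas as pd

df = pd.DataFrame({"country_code": ["US", "XX"], "age": [34, 140]})

# Validity: values must come from reference data or a defined range.
reference_countries = {"US", "CA", "GB", "IN"}  # stand-in reference table
country_ok = df["country_code"].isin(reference_countries)
age_ok = df["age"].between(0, 99)               # "< 100 years" rule

print(df[~(country_ok & age_ok)])  # invalid records
```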
Accuracy refers to the degree to which information accurately reflects what’s being described. It can be measured against either original documents or authoritative sources and validated against defined business rules.
Example:
- US Zip Codes should match a list of legal US postal codes
- Person name is spelled incorrectly
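Accuracy checks compare data to an authoritative source. The sketch below validates zip codes against a stand-in set; a real implementation would load the official USPS list.

```python
import pandas as pd

df = pd.DataFrame({"zip_code": ["30301", "99999"]})

# Accuracy: compare values against an authoritative source
# (a stand-in set here; a real check would load the USPS list).
legal_zip_codes = {"30301", "10001", "94105"}
accurate = df["zip_code"].isin(legal_zip_codes)

print(df[~accurate])  # records failing the accuracy check
```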
Uniqueness refers to the singularity of records and/or attributes. The objective is a single (unique) recording of each piece of data: the data element is unique, with no duplicate values.
Example:
- Each Person should have only one record, but there are two instances of the same Person with different identifiers or spellings.
- SSN should be unique, but there are two Person records that have the same social security number.
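A uniqueness rule reduces to duplicate detection on the identifying attribute. A minimal pandas sketch, assuming a hypothetical ssn column:

```python
import pandas as pd

df = pd.DataFrame({
    "person_id": [1, 2, 3],
    "ssn": ["111-22-3333", "111-22-3333", "444-55-6666"],
})

# Uniqueness: an SSN must identify exactly one Person record.
duplicates = df[df.duplicated(subset="ssn", keep=False)]
print(duplicates)  # every record that shares a duplicated SSN
```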
Consistency means data across all systems reflects the same information and is in sync across the enterprise: the absence of difference when comparing two or more representations of a thing against its definition.
Example:
- Employee status is terminated but pay status is active
- Employee start date cannot be later than the Employee end date
- N number for a Person record must be the same across systems
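Consistency rules compare related fields (or the same field across systems). Here is a minimal sketch of the first two examples, with hypothetical status and date columns:

```python
import pandas as pd

df = pd.DataFrame({
    "employee_status": ["TERMINATED", "ACTIVE"],
    "pay_status":      ["ACTIVE", "ACTIVE"],
    "start_date": pd.to_datetime(["2020-01-01", "2023-06-01"]),
    "end_date":   pd.to_datetime(["2019-12-31", None]),
})

# Consistency: related fields must not contradict each other.
status_conflict = (df["employee_status"] == "TERMINATED") & \
                  (df["pay_status"] == "ACTIVE")
date_conflict = df["start_date"] > df["end_date"]  # NaT compares as False

print(df[status_conflict | date_conflict])  # inconsistent records
```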
Timeliness refers to whether the information is available when it is expected and needed.
Example:
- For quarterly reporting, data must be up to date by the time it is extracted
- Last Review Date for the policy must be within the last three years
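The "last three years" rule above translates directly into a date comparison. A minimal sketch with a hypothetical last_review_date column:

```python
import pandas as pd

df = pd.DataFrame({
    "policy_id": [101, 102],
    "last_review_date": pd.to_datetime(["2024-03-01", "2019-07-15"]),
})

# Timeliness: the review must fall within the last three years.
cutoff = pd.Timestamp.today() - pd.DateOffset(years=3)
stale = df["last_review_date"] < cutoff

print(df[stale])  # policies overdue for review
```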
Get a Free Trial of DataOps DQM - Data Quality Monitor for 14 Days
How To Measure Data Quality Using Data Quality Dimensions?
The Datagaps DataOps suite automatically computes a Data Quality Score for each rule based on the ratio of good to bad records. These scores are rolled up to the Data Quality dimensions at the Table, Data Model, and System levels.
A sample dashboard showing the Data Quality trend is shown below:

Figure: Shows the Data Quality Dimensions and their scores at the system level

DataOps Data Quality comes with a set of built-in rule types to make it easy to define Data Validation rules.
How To Compute Data Quality Score?
The Data Quality Score lets us quickly understand the current state of our data and more easily compare quality over time. It is a percentage calculated as follows:
Data Quality Score = [1 − (# of bad records / # of total records)] × 100
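For clarity, here is the formula as a small Python function; this is an illustration of the calculation, not the suite's internal code, and the zero-records behavior is an assumption.

```python
def data_quality_score(bad_records: int, total_records: int) -> float:
    """Score = [1 - (bad / total)] x 100, per the formula above."""
    if total_records == 0:
        return 0.0  # assumption: a rule run over zero records scores zero
    return (1 - bad_records / total_records) * 100

print(data_quality_score(bad_records=25, total_records=1000))  # 97.5
```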

Figure: Shows the trend of Data Quality scores at the system level
Conclusion
Automated Data Quality testing can be done using Data Validation rules. The Data Quality Score provides a means to measure and track the quality of your enterprise data at rest and in motion, while Data Quality dimensions help categorize the data validation rules into meaningful buckets. DataOps Data Quality is a simple Data Validation testing tool that can be used to automate the Data Quality testing process.

Established in 2010 with the mission of building trust in enterprise data & reports, Datagaps provides software for ETL Data Automation, Data Synchronization, Data Quality, Data Transformation, Test Data Generation, & BI Test Automation. We are an innovative company focused on the highest customer satisfaction and passionate about data-driven test automation. Our flagship solutions, ETL Validator, Data Flow, and BI Validator, are designed to help customers automate the testing of ETL, BI, Database, Data Lake, Flat File, & XML data sources. Our tools support data warehousing projects and BI platforms including Snowflake, Tableau, Amazon Redshift, Oracle Analytics, Salesforce, Microsoft Power BI, Azure Synapse, SAP BusinessObjects, and IBM Cognos.