Oracle to Azure SQL Server Migration Validation


This article discusses how to validate a migration from Oracle to Azure SQL Server from an end-to-end perspective.

Why migrate from Oracle to Azure SQL Server?

Many organizations continue to use Oracle databases to store transactional and warehouse data in their on-premises data centers. However, with data growing exponentially, these environments struggle to handle the increasing load, which requires more processing power and more storage and adds to data costs. Migrating Oracle data to the cloud is therefore a viable option for optimizing costs and performance, and migrating the Oracle database to Azure SQL Server (or Azure Synapse) is a popular choice.

Migration from Oracle to Azure SQL Server is typically justified by the following:

  • Cost savings: Oracle Database is a capable platform, but it carries sizeable licensing costs. 

  • Scalability: Moving data and compute from an on-premises data center with limited resources to the cloud opens up the possibility of scaling on demand.

  • Flexible storage: Raw data can be stored in a data lake using ADLS, reducing the need to store every dataset in the Oracle database.

  • Lower management overhead: Reduce on-premises database and data center management costs by making use of cloud-based PaaS offerings.


Primary Challenges in Data Migration

While Oracle and SQL Server are both RDBMSs with SQL as the primary language, a few differences can cause issues during or after migration.

  • Data type conversion issues – Because the two systems interpret the ANSI SQL standard slightly differently, there are variations in data type conversions, especially for numeric and date data types and formats. 
  • Metadata migration issues – Oracle database objects such as indexes, sequences, packages, procedures, triggers, grants, and constraints need to be migrated and validated in the SQL Server database.
  • Performance issues – The performance of data-intensive workloads in the new database needs to be validated and tuned post-migration.
  • Data corruption – If the target system encounters a defect while the transfer is underway, the migration can fail, leading to data corruption or data loss. Data load failures may also result in duplicate or missing records.
  • Prolonged downtime – Extended downtime becomes likely when the data migration takes longer than anticipated; because the source system is not functional during the migration, this is a real risk for organizations. 
  • Incremental data syncs – Data migration projects usually involve syncing incremental data between the Oracle and SQL Server databases while the new environment is being tested and validated. This requires data comparison and validation to ensure that both systems hold the same data and that the new SQL Server database operates as expected.
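The incremental-sync comparison described above can be sketched as a simple row-count check run against both databases. This is a minimal illustration, not a full solution: in a real migration the two connections would point at Oracle (e.g. via an Oracle driver) and Azure SQL (e.g. via ODBC), whereas here `sqlite3` stands in for both so the sketch is self-contained, and the `orders` table is hypothetical.

```python
import sqlite3

# Stand-in connections: in a real migration these would be an Oracle
# connection and an Azure SQL connection; sqlite3 keeps the sketch runnable.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

for conn in (source, target):
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.5)])

# Simulate a missed incremental sync: one extra row exists in the source only.
source.execute("INSERT INTO orders VALUES (3, 30.0)")

def row_count(conn, table):
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

src_count, tgt_count = row_count(source, "orders"), row_count(target, "orders")
print(f"source={src_count} target={tgt_count} in_sync={src_count == tgt_count}")
```

A row-count check like this is only a first-pass signal; matching counts do not prove matching content, which is why the minus-query comparison described later is also needed.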

How to Overcome the Risks Associated with Data Migration?

While migrating data, here are some of the best practices to keep in mind: 

  • Data back-up: Back up your data before proceeding with implementation. If something goes wrong during execution, you can restore from the backup. 
  • Plan efficiently: Manage your resources well in advance to effectively carry out the migration process. Data migration can be complex, so stick to the plan and work accordingly. 
  • Automated Testing: To ensure favorable outcomes, testing is crucial. Therefore, throughout the planning and implementation phase, test the data migration process.

Migration Process

The end-to-end migration can be divided into three phases, each consisting of one or more activities. 

  • Pre-migration – Discovery, assessment, and conversion. These activities prepare the plan for a successful migration. The conversion activity is required in heterogeneous migration scenarios such as this one, where the source schema must be converted to work in the target database. 
  • Migration – Schema migration, data migration, and data sync. All three activities are important for a successful migration and for deciding the cutover from the source to the target environment (the new source after a successful migration).
  • Post-migration – Validation, application remediation, and performance validation and optimization. These tasks verify the migration status, remediate the application, and validate and optimize performance.

For any successful technology migration, a defined approach is required, one that helps the team recover from unexpected situations. The migration phases consist of multiple activities, and each activity has its own set of tasks to accomplish.

Perform Tests

To test the database migration, the following activities need to be completed. 

  • Develop validation tests: To test the database migration, we need to use SQL queries. Create the validation queries to run against both the source and target databases. Your validation queries should cover the scope that you’ve defined.
  • Set up a test environment: The test environment should contain a copy of the source database and target database.
  • Run validation tests: Run the validation tests against the source and target, and then view the results.
  • Run performance tests: Run performance tests against the source and target, analyze and then compare the results.

Pitfalls and Drawbacks

The scope of data migration projects is often underestimated, resulting in delays. Data validation is a key component of data migration projects, and a lack of effort to automate validation results in data issues being identified only after go-live. Delays in data migration projects may also impact other strategic initiatives within the enterprise, since migration to the cloud is one of the first steps toward digital transformation. The main impacts are:

  • Cost of migration
  • Time to market
  • Poor data quality

Validation Techniques

Migrating data is not only a process of moving data from the source system to the destination system; it is also the complex task of retaining the quality of that data, which requires a dedicated validation team with quality assurance skills and trusted validation tools. 

From a validation point of view, this means the data must be validated thoroughly, end to end, as it is migrated from the existing system to the new destination system. Validation of the migrated data can be achieved by following a few standard steps, described in the sections below. Following these steps methodically gives confidence that the migration is complete and accurate.

Validation of ETL Changes

Since the target warehouse contains no data initially, a full/initial load is performed first; when the ETL process runs for the second, third, or fourth time, data is loaded incrementally, with changes applied to the data warehouse to capture modified records.

A record is inserted only if it does not already exist; otherwise, an update is performed, and only for records that have been modified.
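The insert-or-update logic just described can be sketched as an upsert. SQL Server typically expresses this with a MERGE statement; the runnable sketch below uses SQLite's equivalent ON CONFLICT clause (available since SQLite 3.24) with a hypothetical `dw_orders` table, so it illustrates the pattern rather than the exact T-SQL syntax.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dw_orders (id INTEGER PRIMARY KEY, status TEXT)")
conn.execute("INSERT INTO dw_orders VALUES (1, 'open')")

# Incremental batch: id 1 was modified at the source, id 2 is new.
incremental = [(1, "shipped"), (2, "open")]

# Upsert: insert when the key is new; update only when the record changed.
# (SQL Server would typically use MERGE for the same insert-or-update intent.)
conn.executemany(
    """INSERT INTO dw_orders (id, status) VALUES (?, ?)
       ON CONFLICT(id) DO UPDATE SET status = excluded.status
       WHERE dw_orders.status <> excluded.status""",
    incremental,
)

final_rows = conn.execute("SELECT id, status FROM dw_orders ORDER BY id").fetchall()
print(final_rows)
```

The WHERE clause on the update mirrors the "only for modified records" rule above: unchanged rows are left untouched, which also keeps audit or last-modified columns meaningful.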

Validate data sources — Perform a record count check and verify that the table and column data types meet the specifications of the data model. Make sure keys are in place and remove duplicate data. If this is not done correctly, aggregate reports could be inaccurate or misleading.  

Validate constraints – This involves validating the constraints and making sure they are applied to the expected tables, including checking for misuse of integrity constraints such as foreign keys.  
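A common way to surface foreign-key problems after a bulk load is an orphan query: child rows whose foreign-key value has no matching parent. The sketch below is a hedged illustration using `sqlite3` and hypothetical `customers`/`orders` tables; the same LEFT JOIN pattern applies against Oracle or SQL Server.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER);
INSERT INTO customers VALUES (1), (2);
INSERT INTO orders VALUES (10, 1), (11, 2), (12, 99);  -- 99 has no parent
""")

# Orphan check: child rows whose foreign key matches no parent row.
# If the FK constraint was dropped or disabled during the bulk load,
# this query exposes rows that would violate it when re-enabled.
orphans = conn.execute("""
    SELECT o.id, o.customer_id
    FROM orders o
    LEFT JOIN customers c ON c.id = o.customer_id
    WHERE c.id IS NULL
""").fetchall()
print("orphaned order rows:", orphans)
```

Running such a query per parent/child pair before re-enabling constraints in the target avoids discovering violations only when the constraint creation itself fails.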

Validate nulls and mismatches – This involves comparing the full data set in the source and target tables using a minus query. Run both source-minus-target and target-minus-source; any rows returned by either query should be treated as mismatches.
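The two-directional minus-query comparison can be sketched as follows. Oracle spells the set-difference operator MINUS, while SQL Server (and SQLite, used here so the example stays runnable) spells it EXCEPT; the `src`/`tgt` tables and their rows are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src (id INTEGER, name TEXT);
CREATE TABLE tgt (id INTEGER, name TEXT);
INSERT INTO src VALUES (1, 'alice'), (2, 'bob'), (3, 'carol');
INSERT INTO tgt VALUES (1, 'alice'), (2, 'bobby');  -- one drifted, one missing
""")

# Oracle spells this MINUS; SQL Server and SQLite spell it EXCEPT.
src_minus_tgt = conn.execute(
    "SELECT id, name FROM src EXCEPT SELECT id, name FROM tgt ORDER BY id"
).fetchall()
tgt_minus_src = conn.execute(
    "SELECT id, name FROM tgt EXCEPT SELECT id, name FROM src ORDER BY id"
).fetchall()

print("in source but not target:", src_minus_tgt)
print("in target but not source:", tgt_minus_src)
```

Running the difference in both directions matters: source-minus-target finds missing or drifted rows, while target-minus-source finds rows the target should not contain.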

Conclusion

Data migration validation is essential to ensuring that data has been migrated into the cloud database without corruption or loss. The post-migration testing phase should include comparing the record counts in the source and target, and confirming that records were transformed according to the business requirements. Data reconciliation, duplicate checks, NULL checks, and referential integrity checks must then be performed to ensure that data quality is retained. Data is a key asset for any enterprise today, and it should be treated as such during a data migration.


Established in 2010 with the mission of building trust in enterprise data and reports, Datagaps provides software for ETL data automation, data synchronization, data quality, data transformation, test data generation, and BI test automation. We are an innovative company focused on customer satisfaction and passionate about data-driven test automation. Our flagship solutions, ETL Validator, DataFlow, and BI Validator, are designed to help customers automate the testing of ETL, BI, database, data lake, flat file, and XML data sources. Our tools support Snowflake, Tableau, Amazon Redshift, Oracle Analytics, Salesforce, Microsoft Power BI, Azure Synapse, SAP BusinessObjects, IBM Cognos, and other data warehousing projects and BI platforms. 
