A holistic component-based platform for automating Data Reconciliation tests in modern Data Lake and Cloud Data Migration projects using Apache Spark.
Built using Apache Spark
Visual Test Case Builder
All Data Sources
Get the Power of DataOps DataFlow
DataOps Dataflow is a modern, browser-based solution for automating the testing of ETL, Data Warehouse, and Data Migration projects. Use Dataflow to ingest data from a wide range of data sources, compare data, and load the differences to S3 or a database. Setup is fast and easy: create and run a dataflow in minutes. A best-in-class testing tool for Big Data.
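To illustrate what the compare-and-load-differences step involves, here is a minimal reconciliation sketch in plain Python. The function and field names are illustrative assumptions; the product itself performs this at scale on Apache Spark.

```python
# Minimal sketch of data reconciliation: compare a source and a target
# dataset keyed by a primary key, and collect the differences.
# (Illustrative only; DataOps Dataflow does this at scale on Spark.)

def reconcile(source_rows, target_rows, key="id"):
    """Return rows missing from target, extra in target, and mismatched."""
    src = {row[key]: row for row in source_rows}
    tgt = {row[key]: row for row in target_rows}
    missing = [src[k] for k in src.keys() - tgt.keys()]   # in source only
    extra = [tgt[k] for k in tgt.keys() - src.keys()]     # in target only
    mismatched = [
        {"key": k, "source": src[k], "target": tgt[k]}
        for k in src.keys() & tgt.keys()
        if src[k] != tgt[k]
    ]
    return missing, extra, mismatched

source = [{"id": 1, "amount": 100}, {"id": 2, "amount": 200}]
target = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250},
          {"id": 3, "amount": 300}]
missing, extra, mismatched = reconcile(source, target)
# The collected differences could then be written to S3 or a database.
```

In a real run, the difference sets (rather than in-memory lists) would be what Dataflow loads to S3 or a database for review.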
DataOps Dataflow can integrate with all modern and advanced data sources including RDBMS, NoSQL, Cloud, and File-Based.
Enables Continuous Integration
By automating the testing of Data Lake and Data Migration projects, DataOps DataFlow enables Continuous Integration.
- Integrates with Jenkins: DataOps Dataflow provides a command-line interface for kicking off Dataflows and Pipelines. Customers have used this interface to execute tests automatically from Jenkins.
- Email Notifications: Key stakeholders are automatically notified by email.
- Web Reporting: DataOps DataFlow comes with out-of-the-box web reporting. Queries can be executed on the Dataflow repository for additional reporting.
Connects to all Popular Data Sources
Whatever kind of data source you use, whether relational, NoSQL, cloud, or file-based, we support most of the popular options.
Data Transformation Testing
DataOps Dataflow supports testing data transformations through a visual test case builder that can extract test data from multiple sources within a single test case.
- SQL Component: Supports transforming data with Spark SQL queries.
- Code Component: Supports transformations written in Python, Scala, and R.
- Attribute Component: Rename columns and apply Spark SQL UDFs to transform data.
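As a rough illustration of what the SQL Component does, the sketch below applies a SQL transformation to source data and inspects the result. It uses Python's built-in sqlite3 in place of Spark SQL purely so the example is self-contained; the table, column names, and query are assumptions, not the product's actual API.

```python
import sqlite3

# Sketch of SQL-based transformation testing: load source rows, apply a
# SQL transformation, and inspect the transformed output.
# sqlite3 stands in for Spark SQL so the example runs anywhere.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 100.0, "open"), (2, 200.0, "closed"), (3, 50.0, "closed")],
)

# The transformation under test: total amount per status.
rows = conn.execute(
    "SELECT status, SUM(amount) AS total "
    "FROM orders GROUP BY status ORDER BY status"
).fetchall()

print(rows)
```

In a Dataflow test case, the same query would run as Spark SQL against data extracted from the connected sources, and the output would be compared against the expected target.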
See DataOps DataFlow in action
Add value to your Big Data Analytics projects and save money.