Software Engineering | 14 Jul 2022 |   12 min

Top 11 Essential Considerations for Performing ETL Testing


ETL testing is a crucial part of delivering a data warehouse system: it covers end-to-end testing of the data warehouse application. Read on to learn about each important phase of the ETL testing process.

1. Requirements Testing: The objective of requirements testing is to ensure that all defined business requirements match the business users' expectations. During this phase, the testing team should analyze the business requirements for testability and completeness. The following pointers should be considered during requirements testing:

  • Verification of the logical data model against design documents
  • Verification of many-to-many attribute relationships
  • Verification of the types of keys used
  • All transformation rules must be clearly specified
  • Target data types must be specified in the data model or design document
  • Purpose and overview of the reports must be clearly specified
  • Report designs should be available
  • All report details, such as grouping, parameters to be used, and filters, should be specified
  • Technical definitions, such as data definitions and details of the tables and fields used in reports, must be documented
  • All details for headers, footers, and column headings must be clearly specified
  • Data sources, parameter names, and values must be clearly specified
  • Technical mapping in terms of report name, table name, column name, and a description of each report must be documented
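
Part of this checklist can be automated by validating the source-to-target mapping document itself. Below is a minimal sketch, assuming the mapping has been exported as a list of records; the field and column names are hypothetical:

```python
# Validate a source-to-target mapping document for completeness.
# The mapping rows, table names, and column names are hypothetical examples.
mapping = [
    {"source": "cust.name", "target": "dim_customer.full_name",
     "target_type": "VARCHAR(100)", "rule": "TRIM + UPPER"},
    {"source": "cust.dob", "target": "dim_customer.birth_date",
     "target_type": "DATE", "rule": None},  # transformation rule not specified
]

def mapping_gaps(rows):
    """Return targets whose transformation rule or target data type is unspecified."""
    return [r["target"] for r in rows
            if not r.get("rule") or not r.get("target_type")]

print(mapping_gaps(mapping))  # flags dim_customer.birth_date
```

A check like this can run against every new drop of the mapping document, so incomplete specifications are caught before test design begins.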

2. Data Model Testing: The objective of this testing is to ensure that the physical model is in accordance with the logical data model. The following activities should be performed during this testing:

  • Verification of logical data model as per design documents
  • Verification of all the entity relationships as mentioned in design document
  • All the attributes, keys must be defined clearly
  • Ensure that the model captures all requirements
  • Ensure that the design and actual physical model are in sync
  • Ensure naming conventions are followed
  • Perform schema verification
  • Ensure that the table structure, keys and relationship are implemented in the physical model as per the logical model.
  • Validation of Indexes and Partitioning
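
Schema verification against the logical model can be scripted. Here is a minimal sketch using an in-memory SQLite database; the table and column names are illustrative stand-ins for your physical model:

```python
import sqlite3

# Compare the physical schema against the expected logical model.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (customer_sk INTEGER PRIMARY KEY, "
             "full_name TEXT, birth_date TEXT)")

# Columns the logical data model says must exist (illustrative).
expected_columns = {"customer_sk", "full_name", "birth_date"}

# PRAGMA table_info returns one row per column; index 1 is the column name.
actual_columns = {row[1] for row in
                  conn.execute("PRAGMA table_info(dim_customer)")}

assert actual_columns == expected_columns, actual_columns ^ expected_columns
```

On other databases the same idea applies with `information_schema.columns` in place of the SQLite `PRAGMA`.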

3. Unit Testing: The objective of unit testing is to validate whether the implemented component functions as per design specifications and business requirements. It involves testing business transformation rules, error conditions, and mapping of fields at the staging and core levels. The following pointers should be considered during unit testing:

  • All transformation logic should work as designed from source till target
  • Surrogate keys have been generated properly
  • NULL values have been populated where expected
  • Rejects have occurred where expected and log for rejects is created with sufficient details
  • Auditing is done properly
  • All source data that is expected to be loaded into the target actually is loaded: compare counts between source and target
  • All fields are loaded with full contents, i.e. no data field is truncated during transformation
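
The count and truncation checks above can be expressed directly in SQL. A minimal sketch against SQLite, with a hypothetical TRIM + UPPER transformation standing in for the real rule:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_customer (name TEXT);
CREATE TABLE tgt_customer (full_name TEXT);
INSERT INTO src_customer VALUES ('  alice  '), ('bob');
-- hypothetical transformation rule: TRIM + UPPER
INSERT INTO tgt_customer SELECT UPPER(TRIM(name)) FROM src_customer;
""")

# Row counts must match between source and target.
src = conn.execute("SELECT COUNT(*) FROM src_customer").fetchone()[0]
tgt = conn.execute("SELECT COUNT(*) FROM tgt_customer").fetchone()[0]
assert src == tgt, f"source has {src} rows, target has {tgt}"

# No field should be emptied or truncated by the transformation.
empty = conn.execute("SELECT COUNT(*) FROM tgt_customer "
                     "WHERE full_name IS NULL OR LENGTH(full_name) = 0").fetchone()[0]
assert empty == 0
```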

4. System Integration Testing: Once unit testing is done and all of its exit criteria are met, the next phase is integration testing. The objective of integration testing is to ensure that all integrated components work as expected. The data warehouse application must be compatible with upstream and downstream flows, and all ETL components should be executed with the correct schedule and dependencies. The following pointers should be considered during integration testing:

  • ETL packages with Initial Load
  • ETL packages with Incremental Load
  • Executing ETL packages in sequential manner
  • Handling of rejected records
  • Exception handling verification
  • Error logging
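
The incremental-load and reject-handling behaviors above can be sketched as a small test harness. The validation rule and record shape here are hypothetical placeholders for the real package logic:

```python
# Sketch of an incremental load with reject handling (names are illustrative).
def run_incremental_load(batch, target, reject_log):
    """Load only valid, new records; route failures to a reject log with details."""
    loaded = 0
    for record in batch:
        if record.get("id") is None:            # simulated validation rule
            reject_log.append({"record": record, "reason": "missing id"})
            continue
        if record["id"] not in target:          # incremental: skip already-loaded keys
            target[record["id"]] = record
            loaded += 1
    return loaded

target, rejects = {}, []
run_incremental_load([{"id": 1}, {"id": None}], target, rejects)   # initial load
run_incremental_load([{"id": 1}, {"id": 2}], target, rejects)      # re-delivery of id 1
assert len(target) == 2 and len(rejects) == 1
```

Running the same package twice with overlapping data, as above, is a quick way to prove that incremental logic does not double-load records.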

5. Data Validation Testing: The objective of this testing is to ensure that data flows through the ETL phases correctly and is cleansed as per the applied business rules. The following pointers should be considered during data validation testing:

  • Data comparison between source and target
  • Data flow as per business logic
  • Data type mismatch
  • Source to target row count validation
  • Data duplication
  • Data correctness
  • Data completeness
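
Duplication and completeness checks like those above reduce to simple SQL queries against the target. A minimal sketch with illustrative table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0), (2, 20.0), (3, NULL);
""")

# Data duplication: the natural key must be unique in the target.
dupes = conn.execute("SELECT order_id FROM tgt_orders "
                     "GROUP BY order_id HAVING COUNT(*) > 1").fetchall()

# Data completeness: mandatory fields must not be NULL.
nulls = conn.execute("SELECT COUNT(*) FROM tgt_orders "
                     "WHERE amount IS NULL").fetchone()[0]

print(dupes, nulls)  # order_id 2 is duplicated; one row has a NULL amount
```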

6. Security Testing: The objective of this testing is to ensure that only an authorized user can access the reports as per assigned privileges. While performing security testing, the following aspects should be considered:

  • Unauthorized user access
  • Role-based access to the reports
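
A role-based access check can be scripted against the report catalog. The roles and report names below are hypothetical; in practice they would come from the BI tool's security configuration:

```python
# Minimal role-based access check for reports (roles and names are hypothetical).
REPORT_ROLES = {
    "sales_summary": {"analyst", "manager"},
    "payroll_detail": {"hr_admin"},
}

def can_view(report, user_roles):
    """A user may view a report only if one of their roles is assigned to it."""
    return bool(REPORT_ROLES.get(report, set()) & set(user_roles))

assert can_view("sales_summary", ["analyst"])
assert not can_view("payroll_detail", ["analyst"])   # unauthorized access blocked
assert not can_view("unknown_report", ["manager"])   # unknown reports denied by default
```

Note the deny-by-default behavior for unlisted reports; security tests should confirm the real system behaves the same way.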

7. Report Testing: The objective of report testing is to ensure that BI reports meet all the functional requirements defined in the business requirement document. While performing functional testing, the following aspects should be considered:

  • Report drill down, drill up and drill through
  • Report navigation and embedded links
  • Filters
  • Sorting
  • Export functionality
  • Report dashboard
  • Dependent reports

Verify that the report runs with a broad variety of parameter values and through whichever channel users will receive it (e.g., a subscription runs and delivers the report as expected). In addition:

  • Verify that the expected data is returned
  • Verify that the performance of the report is within an acceptable range
  • Report data validation (Correctness, Completeness and integrity)
  • Verify required security implementation
  • Automating processes whenever possible will save tremendous amounts of time
  • Verify that the business rules have been met
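
A parameterized smoke test is one way to automate the checks above across many parameter values. The report function here is a stand-in for a call to the real reporting API:

```python
# Smoke-test a report across a variety of parameter values (report logic is a stand-in).
def sales_report(region, year):
    data = {("EU", 2022): [("Q1", 100), ("Q2", 120)], ("US", 2022): [("Q1", 90)]}
    return data.get((region, year), [])

for region, year, min_rows in [("EU", 2022, 2), ("US", 2022, 1), ("APAC", 2022, 0)]:
    rows = sales_report(region, year)
    assert len(rows) >= min_rows           # expected data is returned
    assert all(len(r) == 2 for r in rows)  # row shape matches the report layout
```

Driving the same assertions from a table of parameter combinations makes it cheap to cover edge cases such as empty regions.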

8. Regression Testing: The objective of regression testing is to keep existing functionality intact each time new code is developed for a feature, or existing code is changed while fixing application defects. Prior to regression testing, impact analysis must be carried out in coordination with the developers to determine the affected functional areas of the application. Ideally, 100% regression is recommended for each drop/build. If builds are too frequent and test execution time is limited, regression should be planned based on the priority of test cases.
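
Priority-based selection under a time budget can be sketched as follows; the test-case names and durations are hypothetical:

```python
# Pick a regression subset by priority when the full suite does not fit the window.
test_cases = [
    {"name": "tc_load_counts", "priority": 1, "minutes": 10},
    {"name": "tc_report_drill", "priority": 2, "minutes": 15},
    {"name": "tc_masking", "priority": 3, "minutes": 20},
]

def plan_regression(cases, budget_minutes):
    """Take the highest-priority cases first, until the execution budget is used up."""
    plan, used = [], 0
    for case in sorted(cases, key=lambda c: c["priority"]):
        if used + case["minutes"] <= budget_minutes:
            plan.append(case["name"])
            used += case["minutes"]
    return plan

print(plan_regression(test_cases, 30))  # ['tc_load_counts', 'tc_report_drill']
```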

9. Performance Testing: The objective of performance testing is to ensure that reports, and the data on them, load as per the defined non-functional requirements. Different types of tests may be conducted here, such as load, stress, and volume tests. While executing performance testing, the following aspects should be considered:

  • Compare the SQL query execution time on Report UI and backend data
  • Concurrent access of the reports with multiple users
  • Report rendering with multiple filters applied
  • Load a high volume of production-like data to check whether the ETL process completes within the expected timeframe
  • Validate the OLAP system performance by browsing the cube with multiple options
  • Analyze the maximum user load, at peak and off-peak times, that can access and process BI reports
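
Timing an ETL step against its non-functional requirement can be automated as a guardrail test. A minimal sketch; the SLA value and transformation are illustrative placeholders:

```python
import time

# Time an ETL step against a non-functional requirement (SLA value is illustrative).
SLA_SECONDS = 2.0

def etl_step(rows):
    return [r * 2 for r in rows]   # stand-in for the real transformation

start = time.perf_counter()
etl_step(range(100_000))
elapsed = time.perf_counter() - start

assert elapsed <= SLA_SECONDS, f"ETL step took {elapsed:.2f}s, SLA is {SLA_SECONDS}s"
```

Run against production-like volumes, a failing assertion here surfaces performance regressions before they reach the scheduled loads.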

10. Test Data Generation: Test data is critical in ETL testing, so appropriate test data needs to be generated. Depending on the volume required, test data can be generated using a test data generation tool or SQL scripts. As a best practice, generated test data should closely resemble production data.

Data masking for test data generation: Data masking is the process of protecting sensitive personal information. Data is scrambled in such a way that sensitive information is hidden yet remains usable for testing without being exposed. A few data masking techniques:

  • Randomization: Generate random data within a specified range.
  • Substitution: The data in columns is replaced, completely or partially, with artificial records.
  • Scrambling: The data type and size of the fields stay intact, but the characters within the records are shuffled.
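
The three techniques can be sketched in a few lines; the seed, field names, and sample values are illustrative:

```python
import random
import string

random.seed(42)  # deterministic masking for repeatable test data

def randomize_age(low=18, high=90):
    """Randomization: generate random data within a specified range."""
    return random.randint(low, high)

def substitute_name(_real_name):
    """Substitution: replace the real value with an artificial record."""
    return "USER_" + "".join(random.choices(string.ascii_uppercase, k=5))

def scramble(value):
    """Scrambling: keep the type and size intact, shuffle the characters."""
    chars = list(value)
    random.shuffle(chars)
    return "".join(chars)

masked = scramble("4111111111111111")
assert len(masked) == 16                              # size preserved
assert sorted(masked) == sorted("4111111111111111")   # same characters, reordered
```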

11. User Acceptance Testing: The objective of UAT is to ensure that all business requirements and rules are met from the business user's perspective, and that the system is acceptable to the customer.

Write to us with your thoughts about ETL testing and visit us at Nitor Infotech if you’d like to learn more about our services.

Nitor Infotech Blog

Nitor Infotech is a leading software product development firm serving ISVs and enterprise customers globally.

   

