The Evolution of Data Transformation

Data transformation has long been a critical aspect of data management, with ETL (Extract, Transform, Load) being the traditional approach. However, with the advent of modern data technologies and cloud data warehouses, ELT (Extract, Load, Transform) has emerged as a powerful alternative. Let’s delve into the nuances of these approaches and explore their strengths and weaknesses in today’s data landscape.

ETL versus ELT: a position switch for the “T”

ETL (Extract, Transform, Load) – the T before the L

ETL is a traditional approach to data integration that involves extracting data from various sources, transforming it outside the data warehouse, and then loading it into the target database. The transformation phase typically occurs on dedicated ETL servers or platforms before loading the data into the destination.

Data transformation is handled in the data pipeline.


Imagine a retail company that collects sales data from multiple stores and wants to consolidate this information into a centralized data warehouse for analysis. In an ETL scenario:

  • Extract: Data is extracted from point-of-sale systems, online transactions, and other sources, and stored in a staging area or environment.
  • Transform: The extracted data is cleaned, standardized, and transformed to fit the schema of the data warehouse. This might involve data cleansing, aggregation, or joining with other datasets.
  • Load: The transformed data is loaded into the data warehouse, where it can be queried and analyzed by business intelligence tools. The warehouse has a predefined schema to which the data is mapped.
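The three steps above can be sketched as a minimal pipeline. This is an illustrative sketch only: the function names, sample rows, and target schema are assumptions for the example, not part of any vendor tool. The key ETL trait is that the transform runs outside the warehouse, before loading.

```python
from datetime import date

# --- Extract: pull raw rows from the source systems ---
def extract_sales():
    # In practice these would come from POS systems and online-transaction APIs;
    # hard-coded rows stand in for that here.
    return [
        {"store": "Antwerp", "sku": "A-100", "qty": "2", "price": "19.99", "day": "2024-03-01"},
        {"store": "Ghent",   "sku": "A-100", "qty": "1", "price": "19.99", "day": "2024-03-01"},
    ]

# --- Transform: clean and aggregate OUTSIDE the warehouse (the defining ETL trait) ---
def transform(rows):
    daily = {}
    for r in rows:
        key = (r["sku"], date.fromisoformat(r["day"]))
        daily[key] = daily.get(key, 0.0) + int(r["qty"]) * float(r["price"])
    # Shape the result to match the warehouse's predefined schema.
    return [{"sku": sku, "day": d, "revenue": round(v, 2)}
            for (sku, d), v in sorted(daily.items())]

# --- Load: write the already-transformed rows into the warehouse ---
def load(rows, warehouse):
    warehouse.extend(rows)

warehouse = []
load(transform(extract_sales()), warehouse)
print(warehouse)  # one aggregated row per SKU per day
```

Note that only cleaned, schema-conformant rows ever reach the warehouse; the raw source records are discarded after the transform step.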

ETL Vendors:

  • Informatica: Informatica offers a comprehensive suite of ETL tools for enterprise data integration and management.
  • Talend: Talend provides open-source and commercial ETL solutions with support for cloud data integration and big data processing.
  • IBM DataStage: IBM DataStage is an ETL tool designed for scalable data integration and transformation across heterogeneous systems.

ELT (Extract, Load, Transform) – the T moves to the end

ELT flips the traditional ETL process by loading raw data into the data lake or warehouse first, then performing transformations within the warehouse using SQL, Python, or other programming languages. This approach leverages the processing power and scalability of modern cloud data warehouses for transformation tasks.


Consider a healthcare organization that collects patient data from multiple sources, including electronic health records and medical devices. In an ELT scenario:

  • Extract: Raw data is pulled from the various sources in its original format, with little or no upfront processing.
  • Load: The extracted data is loaded as-is into staging tables in a cloud data warehouse such as Snowflake or Google BigQuery.
  • Transform: Transformation tasks, such as data cleansing, enrichment, and aggregation, are performed directly within the data warehouse using SQL queries or scripts.
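A minimal sketch of this pattern, using Python's built-in sqlite3 module as a stand-in for a cloud warehouse (in production the same SQL would run in Snowflake or BigQuery). The table and column names are illustrative assumptions; the point is that raw data lands first, and cleansing and aggregation happen inside the database, in SQL.

```python
import sqlite3

db = sqlite3.connect(":memory:")  # stand-in for a cloud data warehouse

# --- Extract + Load: land raw records as-is in a staging table, no cleanup yet ---
db.execute("CREATE TABLE stg_readings (patient_id TEXT, reading TEXT, taken_at TEXT)")
db.executemany(
    "INSERT INTO stg_readings VALUES (?, ?, ?)",
    [("p1", " 98.6 ", "2024-03-01"),   # untrimmed raw value, preserved as received
     ("p1", "99.1",   "2024-03-01"),
     ("p2", "97.9",   "2024-03-01")],
)

# --- Transform: cleansing and aggregation run INSIDE the warehouse, in SQL ---
db.execute("""
    CREATE TABLE daily_avg AS
    SELECT patient_id,
           taken_at AS day,
           ROUND(AVG(CAST(TRIM(reading) AS REAL)), 2) AS avg_reading
    FROM stg_readings
    GROUP BY patient_id, taken_at
""")

for row in db.execute("SELECT * FROM daily_avg ORDER BY patient_id"):
    print(row)
```

Because the raw staging table is retained, the transformation can be re-run or revised later without re-extracting from the source systems — one of the practical advantages of ELT.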

ELT Vendors:

  • Snowflake: Snowflake is a cloud data platform that supports ELT workflows with native support for SQL-based transformations.
  • Google BigQuery: Google BigQuery is a serverless, highly scalable data warehouse that enables ELT processing with built-in machine learning capabilities.
  • Amazon Redshift: Amazon Redshift is a fully managed data warehouse service that offers ELT capabilities for processing and analyzing large datasets at scale.

By understanding the differences between ETL and ELT, along with examples and key vendors in each category, organizations can make informed decisions when designing their data integration and analytics workflows.

Pros and Cons of ETL

Pros:

  • Well-established process with mature tooling.
  • Suitable for complex transformations involving large volumes of data.

Cons:

  • Requires additional infrastructure for transformation tasks.
  • Longer processing times due to the separate transformation step.

Pros and Cons of ELT

Pros:

  • Utilizes the power of modern cloud data warehouses for transformation, enabling faster processing and scalability.
  • Simplifies architecture by eliminating the need for separate transformation servers.

Cons:

  • May lead to increased storage costs if raw data is retained in the data warehouse.
  • Transformation logic may become complex and harder to manage within SQL scripts.

Introducing Peliqan: A Blend of Both Principles

Peliqan adopts primarily ELT principles, leveraging the capabilities of data warehouses for transformation tasks. However, it introduces a unique twist by offering basic yet powerful transformations for SaaS data sources using Singer pipelines. Unlike typical ELT tools that land source data in raw format files in a data lake, Peliqan lands data directly into a relational data warehouse with specific column transformations and incremental patterns included.

Conclusion: Choose the Right Approach

In the dynamic landscape of data management, the choice between ETL and ELT depends on various factors such as data volume, complexity of transformations, and infrastructure considerations. While traditional ETL remains relevant for certain use cases, ELT offers compelling advantages, especially with the emergence of innovative tools like Peliqan that blend the best of both worlds.



Piet-Michiel Rappelet is a founder of Peliqan. Before Peliqan, Piet-Michiel co-founded a no-code iPaaS integration platform that was acquired by Qlik in 2020, after which he became Director of Product Management, Foundational Services at Qlik. Piet-Michiel’s primary interest is in scaling SaaS software and rolling out customer-oriented service teams. He holds a master’s degree in mathematics and lives with his wife and two kids in Belgium.