Informatica Tutorial

Informatica is a widely used data integration and management software tool that helps organizations extract, transform, and load (ETL) data from various sources into a unified format for analysis and reporting.

It provides a comprehensive platform for data integration, data quality, data governance, and master data management.

Informatica provides a wide range of connectors to various data sources and targets. These include databases such as Oracle, SQL Server, and Teradata, as well as cloud-based platforms like Amazon Web Services (AWS) and Microsoft Azure.

Before we talk more about Informatica, let us first understand why we need ETL. 


Extract, transform, and load (ETL) is the process of combining data from multiple sources into a large, central repository called a data warehouse.

ETL uses a set of business rules to clean and organize raw data and prepare it for storage, data analytics, and machine learning (ML).


Extract, transform, and load (ETL) tools extract or copy raw data from multiple sources and store it in a staging area, an intermediate storage location for temporarily holding extracted data.

Data staging areas are often transient, meaning their contents are erased after data extraction is complete. However, the staging area might also retain a data archive for troubleshooting purposes.
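
The extraction step can be sketched in Python. This is a minimal illustration, not Informatica's actual mechanism; the table names (`orders`, `stg_orders`) and the use of SQLite for both source and staging are assumptions made for a self-contained example.

```python
import sqlite3

# Hypothetical source database with some raw data.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
source.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])

# Hypothetical staging area, modeled here as a separate database.
staging = sqlite3.connect(":memory:")
staging.execute("CREATE TABLE stg_orders (id INTEGER, amount REAL)")

# Extract: copy the raw data from the source into staging as-is,
# without transforming it yet.
rows = source.execute("SELECT id, amount FROM orders").fetchall()
staging.executemany("INSERT INTO stg_orders VALUES (?, ?)", rows)

# Because staging is transient, its contents would typically be erased
# (or archived) once the downstream load completes.
print(staging.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0])  # prints 2
```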


In data transformation, the raw data in the staging area is consolidated and prepared for the target data warehouse.

Examples include data cleansing, data format revision, data deduplication, aggregation (Aggregator), ranking (Rank), joining (Joiner), and many other transformations.
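
A few of these transformations can be sketched in plain Python. The record layout below is an assumption chosen for illustration; in Informatica these steps would be configured as transformations in a mapping rather than written by hand.

```python
# Hypothetical staging-area records with messy formatting and a duplicate.
raw = [
    {"id": 1, "region": " east ", "amount": "10.0"},
    {"id": 1, "region": " east ", "amount": "10.0"},  # duplicate row
    {"id": 2, "region": "WEST", "amount": "25.5"},
]

# Data cleansing / format revision: trim and normalize text, cast types.
cleaned = [
    {"id": r["id"], "region": r["region"].strip().lower(), "amount": float(r["amount"])}
    for r in raw
]

# Deduplication: keep only the first occurrence of each id.
seen, deduped = set(), []
for r in cleaned:
    if r["id"] not in seen:
        seen.add(r["id"])
        deduped.append(r)

# Aggregation: total amount per region, similar in spirit to an
# Aggregator transformation grouped on region.
totals = {}
for r in deduped:
    totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]

print(totals)  # {'east': 10.0, 'west': 25.5}
```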


In data loading, extract, transform, and load (ETL) tools move the transformed data from the staging area into the target data warehouse.

Most organizations use an automated, well-defined, continual, and batch-driven ETL process. There are two common loading strategies:

  • Full Load or Bulk Load: The data load performed the very first time. The job extracts the entire volume of data from a source table and loads it into the target data warehouse after applying the required transformations. It is a one-time run; after it completes, only changes are captured as part of an incremental extract.
  • Incremental Load or Refresh Load: Updates the target with modified data after the initial full load. Changes are captured by comparing each record's created or modified date against the last run date of the job. Only the modified data is extracted from the source and applied to the target without impacting the existing data.
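
The two strategies above can be sketched as follows. This is an illustrative assumption-laden example: the record fields, the dictionary target, and the `last_run_date` bookkeeping are all stand-ins for what a real job would track.

```python
from datetime import date

# Hypothetical source rows, each carrying a last-modified date.
source = [
    {"id": 1, "modified": date(2024, 1, 5), "value": "a"},
    {"id": 2, "modified": date(2024, 2, 1), "value": "b"},
    {"id": 3, "modified": date(2024, 2, 10), "value": "c"},
]
target = {}  # stand-in for the target data warehouse table

def full_load(rows, tgt):
    """First run: load the entire source volume into the target."""
    for r in rows:
        tgt[r["id"]] = r["value"]

def incremental_load(rows, tgt, last_run_date):
    """Later runs: apply only rows modified after the last job run."""
    for r in rows:
        if r["modified"] > last_run_date:
            tgt[r["id"]] = r["value"]  # upsert; other rows are untouched

full_load(source, target)

# A change arrives in the source; the next run picks up only rows
# modified after the previous run date.
source[2]["value"] = "c2"
incremental_load(source, target, last_run_date=date(2024, 2, 5))
print(target)  # {1: 'a', 2: 'b', 3: 'c2'}
```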

Further, you can visit the Informatica website to get more insight into its product offerings.
