
Informatica PowerMart / PowerCenter 7 Basics

Education Services


PC6B-20030512

© Informatica Corporation, 2003. All rights reserved.

Course Objectives

At the end of this course you will:

  • Understand how to use all major PowerCenter 7 components
  • Be able to perform basic Repository administration tasks
  • Be able to build basic ETL Mappings and Mapplets
  • Be able to create, run and monitor Workflows
  • Understand available options for loading target data
  • Be able to troubleshoot most problems
Extract, Transform, and Load

[Diagram: data flows from Operational Systems through transformation steps - Cleanse Data, Apply Business Rules, Aggregate Data, Consolidate Data - into Decision Support]

  • Operational Systems: transaction-level data, current, optimized for transaction response time
  • Decision Support: aggregated data, historical, normalized or de-normalized data

PowerCenter 7 Architecture

[Diagram: PowerCenter Client tools (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Repository Server Administrative Console) connecting to the Repository Server and the PowerCenter Server]

Not Shown: Client ODBC Connections for Source and Target metadata

PowerCenter 7 Components
  • PowerCenter Repository
  • PowerCenter Repository Server
  • PowerCenter Client
    • Designer
    • Repository Manager
    • Repository Server Administration Console
    • Workflow Manager
    • Workflow Monitor
  • PowerCenter Server
  • External Components
    • Sources
    • Targets
Repository Topics

By the end of this section you will be familiar with:

  • The purpose of the Repository Server and Agent
  • The Repository Server Administration Console GUI interface
  • The Repository Manager GUI interface
  • Repository maintenance operations
  • Security and privileges
  • Object sharing, searching and locking
  • Metadata Extensions
Repository Server
  • Each Repository has an independent architecture for the management of the physical Repository tables
  • Components: one Repository Server, and a Repository Agent for each Repository









Repository Server Administration Console

Client overhead for Repository management is greatly reduced by the Repository Server

Repository Server Features
  • Manages connections to the Repository from client applications
  • Can manage multiple Repositories on different machines on a network
  • Uses one Repository Agent process to insert, update and fetch objects from the Repository database tables, for each Repository it manages
  • Maintains object consistency by controlling object locking

The Repository Server runs on the same system running the Repository Agent

Repository Server Administration Console

Use the Repository Server Administration Console to administer Repository Servers and Repositories. The following tasks can be performed:

  • Add, Edit and Remove Repository Configurations
  • Export and Import Repository Configurations
  • Create a Repository
  • Promote a local Repository to a Global Repository
  • Copy a Repository
  • Delete a Repository from the Database
  • Backup and Restore a Repository
  • Start, stop, enable, and disable a Repository
  • View Repository connections and locks
  • Close Repository connections.
  • Upgrade a Repository
Repository Server Administration Console

[Screenshot: Console Tree, Information Nodes, hypertext links to Repository maintenance tasks]

Repository Management

  • Perform all Repository maintenance tasks through the Repository Server from the Repository Server Administration Console
  • Create the Repository Configuration
  • Select a Repository Configuration and perform maintenance tasks:
    • Create
    • Delete
    • Backup
    • Copy from
    • Disable
    • Export Connection
    • Make Global
    • Notify Users
    • Propagate
    • Register
    • Restore
    • Un-Register
    • Upgrade
Repository Manager

Use the Repository Manager to navigate through multiple folders and Repositories, and to perform the following tasks:

  • Manage the Repository
    • Launch Repository Server Administration Console for this purpose
  • Implement Repository Security
    • Managing Users and Users Groups
  • Perform folder functions
    • Create, Edit, Copy and Delete folders
  • View Metadata
    • Analyze Source, Target, Mappings and Shortcut dependencies.
Repository Manager Interface

[Screenshot: Main Window, Dependency Window, Output Window]

Users, Groups and Repository Privileges

Steps:
  1. Create groups
  2. Create users
  3. Assign users to groups
  4. Assign privileges to groups
  5. Assign additional privileges to users (optional)
Managing Privileges

Check box assignment of privileges

Folder Permissions
  • Assign one user as the folder owner for first tier permissions
  • Select one of the owner’s groups for second tier permissions
  • All users and groups in the Repository will be assigned the third tier permissions
Object Locking
  • Object Locks preserve Repository integrity
  • Use the Edit menu for Viewing Locks and Unlocking Objects
Object Searching (Menu: Analyze - Search)
  • Keyword search
    • Limited to keywords previously defined in the Repository (via Warehouse Designer)
  • Search all
    • Filter and search objects
Object Sharing
  • Reuse existing objects
  • Enforces consistency
  • Decreases development time
  • Share objects by using copies and shortcuts
  • Required security settings for sharing objects:
    • Repository Privilege: Use Designer
    • Originating Folder Permission: Read
    • Destination Folder Permissions: Read/Write
Adding Metadata Extensions
  • Allows developers and partners to extend the metadata stored in the Repository
  • Accommodates the following metadata types:
    • Vendor-defined - Third-party application vendor-created metadata lists
      • For example, Applications such as Ariba or PowerConnect for Siebel can add information such as contacts, version, etc.
    • User-defined - PowerCenter/PowerMart users can define and create their own metadata
  • Must have Administrator Repository or Super User Repository privileges
Sample Metadata Extensions

Sample User Defined Metadata, e.g. - contact information, business user

Reusable Metadata Extensions can also be created in the Repository Manager

Design Process

  1. Create Source definition(s)
  2. Create Target definition(s)
  3. Create a Mapping
  4. Create a Session Task
  5. Create a Workflow from Task components
  6. Run the Workflow
  7. Monitor the Workflow and verify the results

Source Object Definitions

By the end of this section you will:

  • Be familiar with the Designer GUI interface
  • Be familiar with Source Types
  • Be able to create Source Definitions
  • Understand Source Definition properties
  • Be able to use the Data Preview option
Source Analyzer

[Screenshot: Designer Tools toolbar, Navigation Window, Analyzer Window]

Methods of Analyzing Sources

  • Import from Database
  • Import from File
  • Import from COBOL file
  • Import from XML file
  • Create manually

Analyzing Relational Sources

[Screenshot: importing a relational Source Definition in the Source Analyzer]

Analyzing Relational Sources

Editing Source Definition Properties


Analyzing Flat File Sources

Fixed-width or delimited flat files can be read into the Source Analyzer from a mapped drive, an NFS mount, or a local directory.


Flat File Wizard

  • Three-step wizard
  • Columns can be renamed within wizard
  • Text, Numeric and Datetime datatypes are supported
  • Wizard ‘guesses’ datatype
XML Source Analysis

A .DTD file can be read into the Source Analyzer from a mapped drive, an NFS mount, or a local directory. In addition to the DTD file, an XML Schema or XML file can be used as a Source Definition.




Analyzing VSAM Sources

A COBOL copybook (.CBL file) can be read into the Source Analyzer from a mapped drive, an NFS mount, or a local directory.

  • Supported Numeric Storage Options: COMP, COMP-3, COMP-6
Target Object Definitions

By the end of this section you will:

  • Be familiar with Target Definition types
  • Know the supported methods of creating Target Definitions
  • Understand individual Target Definition properties
Creating Target Definitions

Methods of creating Target Definitions

  • Import from Database
  • Import from an XML file
  • Manual Creation
  • Automatic Creation
Automatic Target Creation

Drag-and-drop a Source Definition into the Warehouse Designer Workspace

Import Definition from Database

In the Warehouse Designer, you can "reverse engineer" existing object definitions from a database system catalog or data dictionary.

Manual Target Creation

  1. Create empty definition
  2. Add desired columns
  3. Finished target definition

ALT-F can also be used to create a new column

Creating Physical Tables

Create tables that do not already exist in the target database:

  • Connect - connect to the target database
  • Generate SQL file - create DDL in a script file
  • Edit SQL file - modify DDL script as needed
  • Execute SQL file - create physical tables in target database

Use Preview Data to verify the results (right mouse click on object)
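The "Generate SQL file" step above can be pictured as rendering DDL from the Repository's target definition. A minimal sketch, assuming a simple (name, type, nullable) column model - not PowerCenter's actual generator:

```python
# Hypothetical sketch of DDL generation from a target definition.
# Table and column names are invented for illustration.

def generate_ddl(table, columns):
    """Render a CREATE TABLE statement from (name, type, nullable) tuples."""
    lines = []
    for name, dtype, nullable in columns:
        null_sql = "" if nullable else " NOT NULL"
        lines.append(f"    {name} {dtype}{null_sql}")
    return f"CREATE TABLE {table} (\n" + ",\n".join(lines) + "\n);"

ddl = generate_ddl("CUSTOMERS", [
    ("CUSTOMER_ID", "NUMBER(10)", False),
    ("NAME", "VARCHAR2(100)", True),
])
print(ddl)
```

The "Edit SQL file" step corresponds to hand-modifying this script before executing it against the target database.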

Transformation Concepts

By the end of this section you will be familiar with:

  • Transformation types and views
  • Transformation calculation error treatment
  • Null data treatment
  • Informatica datatypes
  • Expression transformation
  • Expression Editor
  • Informatica functions
  • Expression validation
Transformation Types

Informatica PowerCenter 7 provides 23 objects for data transformation

  • Aggregator: performs aggregate calculations
  • Application Source Qualifier: reads Application object sources such as ERP
  • Custom: Calls a procedure in shared library or DLL
  • Expression: performs row-level calculations
  • External Procedure (TX): calls compiled code for each row
  • Filter: drops rows conditionally
  • Joiner: joins heterogeneous sources
  • Lookup: looks up values and passes them to other objects
  • Normalizer: reorganizes records from VSAM, Relational and Flat File
  • Rank: limits records to the top or bottom of a range
  • Input: Defines mapplet input rows. Available in Mapplet designer
  • Output: Defines mapplet output rows. Available in Mapplet designer
Transformation Types
  • Router: splits rows conditionally
  • Sequence Generator: generates unique ID values
  • Sorter: sorts data
  • Source Qualifier: reads data from Flat File and Relational Sources
  • Stored Procedure: calls a database stored procedure
  • Transaction Control: Defines Commit and Rollback transactions
  • Union: Merges data from different databases
  • Update Strategy: tags rows for insert, update, delete, reject
  • XML Generator: Reads data from one or more Input ports and outputs XML through single output port
  • XML Parser: Reads XML from one or more Input ports and outputs data through single output port
  • XML Source Qualifier: reads XML data
Transformation Views

A transformation has three views:

  • Iconized - shows the transformation in relation to the rest of the mapping
  • Normal - shows the flow of data through the transformation
  • Edit - shows transformation ports and properties; allows editing
Edit Mode

Allows users with folder "write" permissions to change or create transformation ports and properties:

  • Define transformation level properties
  • Define port level handling
  • Enter comments
  • Make reusable
  • Switch between transformations

Expression Transformation

Performs calculations using non-aggregate functions (row level)

[Screenshot: click the expression field to invoke the Expression Editor]

  • Passive Transformation
  • Connected
  • Ports
    • Mixed
    • Variables allowed
  • Create expression in an output or variable port
  • Usage
    • Perform majority of data manipulation
Expression Editor
  • An expression formula is a calculation or conditional statement
  • Used in Expression, Aggregator, Rank, Filter, Router, Update Strategy
  • Performs calculation based on ports, functions, operators, variables, literals, constants and return values from other transformations
Informatica Functions - Samples
Character Functions

  • Used to manipulate character data
  • CHRCODE returns the numeric value (ASCII or Unicode) of the first character of the string passed to this function

CONCAT is provided for backwards compatibility only - use the || operator instead
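As a rough analogy (not Informatica's implementation), the character functions above behave like these Python helpers for ASCII input:

```python
# Illustrative Python analogues of CHRCODE and the || concatenation operator.
# Behavior for NULL/empty input is an assumption here.

def chrcode(s):
    """Like CHRCODE: numeric code of the first character, or None for empty input."""
    return ord(s[0]) if s else None

def concat(a, b):
    """Like the || operator (and the legacy CONCAT function): join two strings."""
    return (a or "") + (b or "")

print(chrcode("A"))               # 65 in ASCII
print(concat("Power", "Center"))  # PowerCenter
```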

Informatica Functions

Conversion Functions

  • Used to convert datatypes, e.g. TO_CHAR (numeric)

Date Functions

  • Used to round, truncate, or compare dates; extract one part of a date; or perform arithmetic on a date
  • To pass a string to a date function, first use the TO_DATE function to convert it to a date/time datatype
  • Examples: ROUND (date), TO_CHAR (date), TRUNC (date)
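The string-to-date-to-string round trip described above can be sketched with Python's datetime module. Format strings differ from Informatica's (e.g. %m/%d/%Y instead of MM/DD/YYYY); this is an analogy, not the product's behavior:

```python
# Illustrative equivalents of TO_DATE, TRUNC (date, 'MM'), and TO_CHAR (date).
from datetime import datetime

def to_date(s, fmt="%m/%d/%Y"):
    """Like TO_DATE: convert a string to a date/time value."""
    return datetime.strptime(s, fmt)

def trunc_to_month(d):
    """Like TRUNC(date, 'MM'): reset to the first of the month, midnight."""
    return d.replace(day=1, hour=0, minute=0, second=0, microsecond=0)

def to_char(d, fmt="%Y-%m-%d"):
    """Like TO_CHAR (date): format a date/time value as a string."""
    return d.strftime(fmt)

d = to_date("07/15/2003")
print(to_char(trunc_to_month(d)))  # 2003-07-01
```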

Informatica Functions

Numerical Functions

  • Used to perform mathematical operations on numeric data

Scientific Functions

  • Used to calculate geometric values of numeric data
Informatica Functions

Special Functions

  • Used to handle specific conditions within a session; search for certain values; test conditional statements

Test Functions

  • Used to test if a lookup result is null
  • Used to validate data

Encoding Functions

  • Used to encode string values
Expression Validation

The Validate or ‘OK’ button in the Expression Editor will:

  • Parse the current expression
    • Remote port searching (resolves references to ports in other transformations)
  • Parse transformation attributes
    • e.g. - filter condition, lookup condition, SQL Query
  • Parse default values
  • Check spelling, correct number of arguments in functions, other syntactical errors
Variable Ports
  • Use to simplify complex expressions
    • e.g. - create and store a depreciation formula to be referenced more than once

  • Use in another variable port or an output port expression
  • Local to the transformation (a variable port cannot also be an input or output port)
  • Available in the Expression, Aggregator and Rank transformations
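The depreciation example above can be sketched as follows; the port names (v_depreciation and the out_* outputs) are invented for illustration:

```python
# Sketch of what a variable port buys you: compute an intermediate value once
# and reference it from several output "ports" of the same transformation.

def expression_transform(cost, salvage, life_years):
    # v_depreciation acts as a variable port: local, reused below, never output itself
    v_depreciation = (cost - salvage) / life_years
    # output ports referencing the variable port
    out_annual_dep = v_depreciation
    out_book_value_yr1 = cost - v_depreciation
    return out_annual_dep, out_book_value_yr1

print(expression_transform(10000.0, 1000.0, 9))  # (1000.0, 9000.0)
```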




Informatica Data Types
  • Transformation datatypes allow mix and match of source and target database types
  • When connecting ports, native and transformation datatypes must be compatible (or must be explicitly converted)
Datatype Conversions
  • All numeric data can be converted to all other numeric datatypes, e.g. - integer, double, and decimal
  • All numeric data can be converted to string, and vice versa
  • Date can be converted only to date and string, and vice versa
  • Raw (binary) can only be linked to raw
  • Other conversions not listed above are not supported
  • These conversions are implicit; no function is necessary
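The rules above can be captured in a small compatibility check. The table below paraphrases the slide; it is a teaching sketch, not an exhaustive product matrix:

```python
# Toy implicit-conversion check reflecting the datatype rules listed above.

NUMERIC = {"integer", "double", "decimal"}

def convertible(src, dst):
    if src == dst:
        return True
    if src in NUMERIC and (dst in NUMERIC or dst == "string"):
        return True
    if src == "string" and (dst in NUMERIC or dst == "date"):
        return True
    if src == "date" and dst == "string":
        return True
    return False  # e.g. raw links only to raw; date never converts to numeric

print(convertible("integer", "string"))  # True
print(convertible("date", "decimal"))    # False
```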
By the end of this section you will be familiar with:

  • Mapping components
  • Source Qualifier transformation
  • Mapping validation
  • Data flow rules
  • System Variables
  • Mapping Parameters and Variables

Mapping Designer

Transformation Toolbar

Mapping List

Iconized Mapping

Pre-SQL and Post-SQL Rules
  • Can use any command that is valid for the database type; no nested comments
  • Can use Mapping Parameters and Variables in SQL executed against the source
  • Use a semi-colon (;) to separate multiple statements
  • Informatica Server ignores semi-colons within single quotes, double quotes or within /* ...*/
  • To use a semi-colon outside of quotes or comments, ‘escape’ it with a back slash (\)
  • Workflow Manager does not validate the SQL
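The semicolon rules above can be sketched as a small statement splitter. This is an approximation for illustration, not the Informatica Server's parser:

```python
# Hedged sketch: split SQL on semicolons, ignoring those inside quotes or
# /* */ comments, and treating a backslash-escaped semicolon as literal.

def split_statements(sql):
    stmts, buf = [], []
    in_quote = None        # holds the quote char while inside a literal
    in_comment = False
    i = 0
    while i < len(sql):
        c = sql[i]
        if in_comment:
            buf.append(c)
            if c == "*" and i + 1 < len(sql) and sql[i + 1] == "/":
                buf.append("/")
                i += 1
                in_comment = False
        elif in_quote:
            buf.append(c)
            if c == in_quote:
                in_quote = None
        elif c == "\\" and i + 1 < len(sql) and sql[i + 1] == ";":
            buf.append(";")    # escaped semicolon: keep it, do not split
            i += 1
        elif c == ";":
            stmts.append("".join(buf).strip())
            buf = []
        elif c in ("'", '"'):
            in_quote = c
            buf.append(c)
        elif c == "/" and i + 1 < len(sql) and sql[i + 1] == "*":
            in_comment = True
            buf.append(c)
        else:
            buf.append(c)
        i += 1
    if "".join(buf).strip():
        stmts.append("".join(buf).strip())
    return stmts

print(split_statements("DELETE FROM t WHERE x = 'a;b'; UPDATE t SET y = 1"))
```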
Data Flow Rules









  • Each Source Qualifier starts a single data stream (a dataflow)
  • Transformations can send rows to more than one transformation (split one data flow into multiple pipelines)
  • Two or more data flows can meet together -- if (and only if) they originate from a common active transformation
    • Cannot add an active transformation into the mix

The example holds true with a Normalizer in lieu of the Source Qualifier. Exceptions: the Mapplet Input and Joiner transformations

Connection Validation

Examples of invalid connections in a Mapping:

  • Connecting ports with incompatible datatypes
  • Connecting output ports to a Source
  • Connecting a Source to anything but a Source Qualifier or Normalizer transformation
  • Connecting an output port to an output port or an input port to another input port
  • Connecting more than one active transformation to another transformation (invalid dataflow)
Mapping Validation
  • Mappings must:
    • Be valid for a Session to run
    • Be end-to-end complete and contain valid expressions
    • Pass all data flow rules
  • Mappings are always validated when saved; can be validated without being saved
  • Output Window will always display reason for invalidity

By the end of this section, you will be familiar with:

  • The Workflow Manager GUI interface
  • Workflow Schedules
  • Setting up Server Connections
    • Relational, FTP and External Loader
  • Creating and configuring Workflows
  • Workflow properties
  • Workflow components
  • Workflow Tasks
Workflow Manager Interface

[Screenshot: Task Tool Bar, Workflow Designer workspace, Navigator Window, Output Window, Status Bar]
Workflow Manager Tools
  • Workflow Designer
    • Maps the execution order and dependencies of Sessions, Tasks and Worklets, for the Informatica Server
  • Task Developer
    • Create Session, Shell Command and Email tasks
    • Tasks created in the Task Developer are reusable
  • Worklet Designer
    • Creates objects that represent a set of tasks
    • Worklet objects are reusable
Workflow Structure

  • A Workflow is a set of instructions for the Informatica Server to perform data transformation and load
  • Combines the logic of Session Tasks, other types of Tasks and Worklets
  • The simplest Workflow is composed of a Start Task, a Link and one other Task

[Diagram: Start Task linked to a Session Task]
Workflow Scheduler Objects
  • Setup reusable schedules to associate with multiple Workflows
    • Used in Workflows and Session Tasks
Server Connections
  • Configure Server data access connections
    • Used in Session Tasks
  • Configure:
  • Relational
  • MQ Series
  • FTP
  • Custom
  • External Loader
Relational Connections (Native)
  • Create a relational (database) connection
    • Instructions to the Server to locate relational tables
    • Used in Session Tasks
Relational Connection Properties

Define a native relational (database) connection:

  • User Name/Password
  • Database connectivity information
  • Rollback Segment assignment (optional)
  • Optional Environment SQL (executed with each use of database connection)
FTP Connection
  • Create an FTP connection
    • Instructions to the Server to ftp flat files
    • Used in Session Tasks
External Loader Connection
  • Create an External Loader connection
    • Instructions to the Server to invoke database bulk loaders
    • Used in Session Tasks
Task Developer

  • Create basic reusable "building blocks" to use in any Workflow
  • Reusable Tasks
    • Session - set of instructions to execute Mapping logic
    • Command - specify OS shell / script command(s) to run during the Workflow
    • Email - send email at any point in the Workflow
Session Task
  • Server instructions to run the logic of ONE specific Mapping
    • e.g. - source and target data location specifications, memory allocation, optional Mapping overrides, scheduling, processing and load instructions
  • Becomes a component of a Workflow (or Worklet)
  • If configured in the Task Developer, the Session Task is reusable (optional)
Command Task
  • Specify one (or more) Unix shell or DOS (NT, Win2000) commands to run at a specific point in the Workflow
  • Becomes a component of a Workflow (or Worklet)
  • If configured in the Task Developer, the Command Task is reusable (optional)

Commands can also be referenced in a Session through the Session “Components” tab as Pre- or Post-Session commands

Additional Workflow Components
  • Two additional components are Worklets and Links
  • Worklets are objects that contain a series of Tasks
  • Links are required to connect objects in a Workflow
Developing Workflows

Create a new Workflow in the Workflow Designer

[Screenshot: enter the Workflow name and select a Server]

Workflow Properties

Customize Workflow Properties:

  • Select a Workflow Schedule (optional); may be reusable or non-reusable
  • Workflow log displays

Workflow Properties

  • Create a User-defined Event which can later be used with the Raise Event Task
  • Define Workflow Variables that can be used in later Task objects (example: Decision Task)

Building Workflow Components

  • Add Sessions and other Tasks to the Workflow
  • Connect all Workflow components with Links
  • Save the Workflow
  • Start the Workflow

Sessions in a Workflow can be independently executed

Workflow Designer - Links

[Diagram: three Links connecting Workflow Tasks]
  • Required to connect Workflow Tasks
  • Can be used to create branches in a Workflow
  • All links are executed -- unless a link condition is used which makes a link false
Session Tasks

After this section, you will be familiar with:

  • How to create and configure Session Tasks
  • Session Task properties
  • Transformation property overrides
  • Reusable vs. non-reusable Sessions
  • Session partitions
Session Task
  • Created to execute the logic of a mapping (one mapping only)
  • Session Tasks can be created in the Task Developer (reusable) or Workflow Designer (Workflow-specific)
  • Steps to create a Session Task
    • Select the Session button from the Task Toolbar or
    • Select menu Tasks | Create

Session Task Bar Icon

Session Task - Transformations

Allows overrides of some transformation properties

Does not change the properties in the Mapping

Monitor Workflows

By the end of this section you will be familiar with:

  • The Workflow Monitor GUI interface
  • Monitoring views
  • Server monitoring modes
  • Filtering displayed items
  • Actions initiated from the Workflow Monitor
  • Truncating Monitor Logs
Monitor Workflows

  • The Workflow Monitor is the tool for monitoring Workflows and Tasks
  • Review details about a Workflow or Task in two views
    • Gantt Chart view
    • Task view
Monitoring Workflows
  • Perform operations in the Workflow Monitor
    • Restart -- restart a Task, Workflow or Worklet
    • Stop -- stop a Task, Workflow, or Worklet
    • Abort -- abort a Task, Workflow, or Worklet
    • Resume -- resume a suspended Workflow after a failed Task is corrected
  • View Session and Workflow logs
  • Abort has a 60 second timeout
    • If the Server has not completed processing and committing data during the timeout period, the threads and processes associated with the Session are killed

Stopping a Session Task means the Server stops reading data

Monitoring Workflows

Task View

[Screenshot: Task View columns - Task, Workflow, Worklet, Start Time, Completion Time; Status Bar; controls to Start, Stop, Abort and Resume Tasks, Workflows and Worklets]

Monitor Window Filtering

  • Task View provides filtering
  • Monitoring filters can be set using drop down menus
  • Minimizes items displayed in Task View
  • Get Session Logs (right-click on a Session Task to retrieve the Session Log from the Server to the local PC Client)


By the end of this section you will be familiar with:

  • Creating a Debug Session
  • Debugger windows & indicators
  • Debugger functionality and options
  • Viewing data with the Debugger
  • Setting and using Breakpoints
  • Tips for using the Debugger
Debugger Features
  • Debugger is a Wizard driven tool
    • View source / target data
    • View transformation data
    • Set break points and evaluate expressions
    • Initialize variables
    • Manually change variable values
  • Debugger is
    • Session Driven
    • Data can be loaded or discarded
    • Debug environment can be saved for later use
Debugger Interface

Debugger windows & indicators:

  • Debugger Mode indicator
  • Solid yellow arrow: current transformation indicator
  • Data window
  • Target Data window
  • Debugger Log tab
  • Session Log tab


Filter Transformation

Drops rows conditionally

  • Active Transformation
  • Connected
  • Ports
    • All input / output
  • Specify a Filter condition
  • Usage
    • Filter rows from flat file sources
    • Single pass source(s) into multiple targets
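The "single pass source into multiple targets" usage can be sketched as one read of the source with two Filter-style conditions routing rows to different targets. Field names and thresholds are invented for the example:

```python
# Sketch: one pass over a source, two filter conditions, two targets.

rows = [
    {"region": "EAST", "amount": 500},
    {"region": "WEST", "amount": 1500},
    {"region": "EAST", "amount": 2500},
]

target_east, target_big_orders = [], []
for row in rows:                       # single pass over the source
    if row["region"] == "EAST":        # Filter 1 condition
        target_east.append(row)
    if row["amount"] > 1000:           # Filter 2 condition
        target_big_orders.append(row)

print(len(target_east), len(target_big_orders))  # 2 2
```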
Aggregator Transformation

Performs aggregate calculations

  • Active Transformation
  • Connected
  • Ports
    • Mixed
    • Variables allowed
    • Group By allowed
  • Create expressions in output or variable ports
  • Usage
    • Standard aggregations
Informatica Functions











Aggregate Functions

  • Return summary values for non-null data in selected ports
  • Use only in Aggregator transformations
  • Use in output ports only
  • Calculate a single value (and row) for all records in a group
  • Only one aggregate function can be nested within an aggregate function
  • Conditional statements can be used with these functions
Aggregate Expressions

Aggregate functions are supported only in the Aggregator Transformation

Conditional Aggregate expressions are supported

Conditional SUM format: SUM(value, condition)
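The SUM(value, condition) form can be pictured like this; the column names and condition are invented for illustration:

```python
# Sketch of a conditional aggregate: only rows satisfying the condition
# contribute to the sum, mirroring SUM(value, condition).

def conditional_sum(rows, value_key, condition):
    return sum(r[value_key] for r in rows if condition(r))

orders = [
    {"amount": 100, "status": "SHIPPED"},
    {"amount": 250, "status": "CANCELLED"},
    {"amount": 400, "status": "SHIPPED"},
]

total_shipped = conditional_sum(orders, "amount", lambda r: r["status"] == "SHIPPED")
print(total_shipped)  # 500
```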

Aggregator Properties

Sorted Input Property

Instructs the Aggregator to expect the data to be sorted

Set Aggregator cache sizes (on Informatica Server machine)

Sorted Data
  • The Aggregator can handle sorted or unsorted data
    • Sorted data can be aggregated more efficiently, decreasing total processing time
  • The Server will cache data from each group and release the cached data -- upon reaching the first record of the next group
  • Data must be sorted according to the order of the Aggregator “Group By” ports
  • Performance gain will depend upon varying factors
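Why sorted input helps can be seen with a streaming group aggregation: when data arrives sorted on the group-by key, each group is aggregated and released as soon as the key changes, so only one group is held in memory at a time. Python's itertools.groupby relies on sorted input the same way the Sorted Input property does:

```python
# Sketch of streaming aggregation over data pre-sorted on the group key.
from itertools import groupby

rows = [("A", 10), ("A", 5), ("B", 7), ("B", 3), ("C", 1)]  # sorted on key

totals = {key: sum(v for _, v in grp)
          for key, grp in groupby(rows, key=lambda r: r[0])}
print(totals)  # {'A': 15, 'B': 10, 'C': 1}
```

With unsorted input, the aggregator would instead have to cache every group until the last row arrives.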
Incremental Aggregation

MTD calculation example: trigger in Session Properties, Performance Tab

  • Cache is saved into $PMCacheDir: aggregatorname.DAT
  • Upon next run, files are overwritten with new cache information

Example: When triggered, the PowerCenter Server will save new MTD totals. Upon the next run (new totals), the Server will subtract the old totals; the difference will be passed forward

Best Practice is to copy these files in case a rerun of data is ever required. Reinitialize when no longer needed, e.g. - at the beginning of new-month processing

Joiner Transformation

By the end of this section you will be familiar with:

  • When to use a Joiner Transformation
  • Homogeneous Joins
  • Heterogeneous Joins
  • Joiner properties
  • Joiner Conditions
  • Nested joins
Homogeneous Joins

Joins that can be performed with a SQL SELECT statement:

  • Source Qualifier contains a SQL join
  • Tables on same database server (or are synonyms)
  • Database server does the join “work”
  • Multiple homogenous tables can be joined
Heterogeneous Joins

Joins that cannot be done with a SQL statement:

  • An Oracle table and a Sybase table
  • Two Informix tables on different database servers
  • Two flat files
  • A flat file and a database table
Joiner Transformation

Performs heterogeneous joins on records from different databases or flat file sources

  • Active Transformation
  • Connected
  • Ports
    • All input or input / output
    • “M” denotes port comes from master source
  • Specify the Join condition
  • Usage
    • Join two flat files
    • Join two tables from different databases
    • Join a flat file with a relational table
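A heterogeneous master/detail join can be sketched as caching the (smaller) master side and streaming the detail side, which is broadly how the Joiner behaves. The data, key names, and Normal (inner) join choice are illustrative assumptions:

```python
# Sketch of a Normal (inner) master/detail join across heterogeneous sources.

master = [  # e.g. rows read from a flat file (smaller source chosen as master)
    {"dept_id": 10, "dept_name": "SALES"},
    {"dept_id": 20, "dept_name": "HR"},
]
detail = [  # e.g. rows read from a relational table
    {"emp": "ann", "dept_id": 10},
    {"emp": "bob", "dept_id": 10},
    {"emp": "cal", "dept_id": 30},
]

# cache the master side, then stream the detail side against the cache
master_by_key = {m["dept_id"]: m for m in master}
joined = [
    {**d, "dept_name": master_by_key[d["dept_id"]]["dept_name"]}
    for d in detail
    if d["dept_id"] in master_by_key   # Normal join: drop unmatched detail rows
]
print(joined)
```

A Master Outer or Detail Outer join would keep the unmatched rows from one side instead of dropping them.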
Joiner Conditions

Multiple join conditions are supported

Joiner Properties

Set Joiner Cache

Join types:

  • “Normal” (inner)
  • Master outer
  • Detail outer
  • Full outer

Joiner can accept sorted data (configure the join condition to use the sort origin ports)

Mid-Mapping Join
  • The Joiner does not accept input in the following situations:
    • Both input pipelines begin with the same Source Qualifier
    • Both input pipelines begin with the same Normalizer
    • Both input pipelines begin with the same Joiner
    • Either input pipeline contains an Update Strategy
Sorter Transformation
  • Can sort data from relational tables or flat files
  • Sort takes place on the Informatica Server machine
  • Multiple sort keys are supported
  • The Sorter transformation is often more efficient than a sort performed on a database with an ORDER BY clause
Lookup Transformation

By the end of this section you will be familiar with:

  • Lookup principles
  • Lookup properties
  • Lookup conditions
  • Lookup techniques
  • Caching considerations
How a Lookup Transformation Works

Lookup value(s)

Lookup transformation

Return value(s)

  • For each Mapping row, one or more port values are looked up in a database table
  • If a match is found, one or more table values are returned to the Mapping. If no match is found, NULL is returned
Lookup Transformation

Looks up values in a database table and provides data to other components in a Mapping

  • Passive Transformation
  • Connected / Unconnected
  • Ports
  • Mixed
  • “L” denotes Lookup port
  • “R” denotes port used as a return value (unconnected Lookup only)
  • Specify the Lookup Condition
  • Usage
  • Get related values
  • Verify if a record exists or if data has changed
Lookup Properties

Screenshot callouts: the Lookup SQL option, the native database connection, and the Lookup object name
Additional Lookup Properties

Screenshot callouts: set cache options, make the cache persistent, and set the Lookup cache sizes
Lookup Conditions

Multiple conditions are supported

To Cache or not to Cache?

Caching can significantly impact performance

  • Cached
    • Lookup table data is cached locally on the Server
    • Mapping rows are looked up against the cache
    • Only one SQL SELECT is needed
  • Uncached
    • Each Mapping row needs one SQL SELECT
  • Rule Of Thumb: Cache if the number (and size) of records in the Lookup table is small relative to the number of mapping rows requiring lookup
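The trade-off can be made concrete by counting simulated SQL SELECTs in each mode. A sketch with hypothetical table and row counts:

```python
# Sketch of the caching trade-off: count the simulated SQL SELECTs each mode issues.
lookup_rows = {1: "a", 2: "b"}                 # hypothetical lookup table
mapping_rows = [1, 2, 1, 2, 1]                 # hypothetical mapping rows
sql_selects = {"cached": 0, "uncached": 0}

# Uncached: one SELECT per mapping row
for key in mapping_rows:
    sql_selects["uncached"] += 1

# Cached: a single SELECT builds a local cache, then every row probes the cache
sql_selects["cached"] += 1
cache = dict(lookup_rows)
for key in mapping_rows:
    _ = cache.get(key)
# sql_selects == {"cached": 1, "uncached": 5}
```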
Target Options

By the end of this section you will be familiar with:

  • Row type indicators
  • Row operations at load time
  • Constraint-based loading considerations
  • Rejected row handling options
Target Properties

Set in the Session Task, on the Properties tab: select the target instance, row loading operations, and error handling
Constraint-based Loading

(Diagram: target tables linked by primary / foreign key pairs fk1, pk2)

Maintains referential integrity in the Targets

Example 1

With only one Active source, rows for Targets 1-3 will be loaded properly and maintain referential integrity

Example 2

With two Active sources, it is not possible to control whether rows for Target 3 will be loaded before or after those for Target 2

The following transformations are ‘Active sources’: Advanced External Procedure, Source Qualifier, Normalizer, Aggregator, Sorter, Joiner, Rank, Mapplet (containing any of the previous transformations)

Update Strategy Transformation

By the end of this section you will be familiar with:

  • Update Strategy functionality
  • Update Strategy expressions
  • Refresh strategies
  • Smart aggregation
Update Strategy Transformation

Used to specify how each individual row will be used to update target tables (insert, update, delete, reject)

  • Active Transformation
  • Connected
  • Ports
  • All input / output
  • Specify the Update Strategy Expression
  • Usage
  • Updating Slowly Changing Dimensions
  • IIF or DECODE logic determines how to handle the record
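The IIF-style flagging can be sketched in Python. The DD_* values mirror Informatica's documented row-type constants; the lookup-based rule itself is a hypothetical example:

```python
# Sketch of Update Strategy flagging. DD_* values mirror Informatica's
# row-type constants; the flagging rule itself is a hypothetical example.
DD_INSERT, DD_UPDATE, DD_DELETE, DD_REJECT = 0, 1, 2, 3

def flag_row(existing_keys, row_key):
    # Equivalent of IIF(ISNULL(lookup_key), DD_INSERT, DD_UPDATE)
    return DD_INSERT if row_key not in existing_keys else DD_UPDATE

flags = [flag_row({10, 20}, key) for key in (10, 30, 20)]
# flags == [DD_UPDATE, DD_INSERT, DD_UPDATE]
```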
Target Refresh Strategies
  • Single snapshot: Target truncated, new records inserted
  • Sequential snapshot: new records inserted
  • Incremental: Only new records are inserted. Records already present in the target are ignored
  • Incremental with Update: Only new records are inserted. Records already present in the target are updated
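Three of these strategies can be sketched against a toy key-value target. Rows here are hypothetical; the sequential snapshot is omitted since it simply appends:

```python
# Sketch of three refresh strategies applied to a toy key-value target.
def refresh(target, incoming, strategy):
    target = dict(target)
    if strategy == "single snapshot":
        target = {}                              # truncate first
    for key, value in incoming.items():
        if key in target:
            if strategy == "incremental":
                continue                         # ignore rows already present
            target[key] = value                  # update / overwrite
        else:
            target[key] = value                  # new rows are always inserted
    return target
```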
Router Transformation

Rows are sent to multiple filter conditions

  • Active Transformation
  • Connected
  • Ports
    • All input / output
  • Specify filter conditions for each Group
  • Usage
    • Link source data in one pass to multiple filter conditions
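Router semantics can be sketched as one pass over the input with every group's filter condition evaluated per row. Group names and conditions here are hypothetical:

```python
# Sketch of Router semantics: each input row is tested against every group's
# filter condition in a single pass.
groups = {
    "high": lambda amount: amount >= 100,
    "low": lambda amount: amount < 100,
}

routed = {name: [] for name in groups}
for amount in [250, 40, 100]:
    for name, condition in groups.items():
        if condition(amount):
            routed[name].append(amount)
# routed == {"high": [250, 100], "low": [40]}
```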
Parameters and Variables

By the end of this section you will understand:

  • System Variables
  • Creating Parameters and Variables
  • Features and advantages
  • Establishing values for Parameters and Variables
System Variables

SYSDATE
  • Provides the current datetime on the Informatica Server machine
    • Not a static value

$$$SessStartTime
  • Returns the session start time as a string; uses the system clock on the machine hosting the Informatica Server
    • The format of the string is database type dependent
    • Used in SQL overrides
    • Has a constant value

SESSSTARTTIME
  • Returns the session start time on the Informatica Server
    • Used with any function that accepts transformation date/time datatypes
    • Not to be used in a SQL override
    • Has a constant value
Mapping Parameters and Variables
  • Apply to all transformations within one Mapping
  • Represent declared values
  • Variables can change in value during run-time
  • Parameters remain constant during run-time
  • Provide increased development flexibility
  • Defined in Mapping menu
  • Format is $$VariableName or $$ParameterName
Mapping Parameters and Variables

Sample declarations (screenshot): user-defined names, the appropriate aggregation type, and an optional Initial Value

Declare Variables and Parameters in the Designer Mappings menu
Functions to Set Mapping Variables
  • SetCountVariable -- Counts the number of evaluated rows and increments or decrements a mapping variable for each row
  • SetMaxVariable -- Evaluates the value of a mapping variable to the higher of two values
  • SetMinVariable -- Evaluates the value of a mapping variable to the lower of two values
  • SetVariable -- Sets the value of a mapping variable to a specified value
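The aggregation semantics of these functions can be sketched in Python. Row values here are hypothetical, and None models an unset variable:

```python
# Sketch of the mapping-variable functions' aggregation semantics.
def set_max_variable(current, value):
    return value if current is None or value > current else current

def set_min_variable(current, value):
    return value if current is None or value < current else current

max_v = min_v = None
count_v = 0
for row in [7, 3, 9]:
    max_v = set_max_variable(max_v, row)   # keeps the higher of two values
    min_v = set_min_variable(min_v, row)   # keeps the lower of two values
    count_v += 1                           # SetCountVariable increments per row
# max_v == 9, min_v == 3, count_v == 3
```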
Unconnected Lookup
  • Will be physically “unconnected” from other transformations
    • There can be NO data flow arrows leading to or from an unconnected Lookup

Lookup function can be set within any transformation that supports expressions

Lookup data is called from the point in the Mapping that needs it

Function in the Aggregator calls the unconnected Lookup

Conditional Lookup Technique

Two requirements:

  • Must be Unconnected (or “function mode”) Lookup
  • Lookup function used within a conditional statement

Row keys (passed to Lookup)


IIF ( ISNULL(customer_id),:lkp.MYLOOKUP(order_no))

Lookup function

  • Conditional statement is evaluated for each row
  • Lookup function is called only under the pre-defined condition
Conditional Lookup Advantage
  • Data lookup is performed only for those rows that require it, which can yield substantial performance gains

EXAMPLE: A Mapping will process 500,000 rows. For two percent of those rows (10,000) the item_id value is NULL. Item_ID can be derived from the SKU_NUMB.

IIF ( ISNULL(item_id), :lkp.MYLOOKUP (sku_numb))

Lookup (called only when condition is true)

Condition (true for 2 percent of all rows)

Net savings = 490,000 lookups
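The arithmetic behind the savings figure, worked through directly:

```python
# The Lookup fires only when item_id is NULL, i.e. for two percent of the rows.
total_rows = 500_000
null_fraction = 0.02
lookups_with_condition = int(total_rows * null_fraction)
lookups_saved = total_rows - lookups_with_condition
# lookups_with_condition == 10_000, lookups_saved == 490_000
```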

Heterogeneous Targets

By the end of this section you will be familiar with:

  • Heterogeneous target types
  • Heterogeneous target limitations
  • Target conversions
Definition: Heterogeneous Targets

Supported target definition types:

  • Relational database
  • Flat file
  • XML
  • ERP (SAP BW, PeopleSoft, etc.)

A heterogeneous target exists when the target types differ, or the target database connections differ, within a single Session Task

Step One: Identify Different Target Types

Oracle table

Oracle table

Tables are EITHER in two different databases, or require different (schema-specific) connect strings

One target is a flat file load

Flat file

Step Two: Different Database Connections

The two database connections WILL differ

The flat file requires separate location information

Target Type Override (Conversion)

Example: Mapping has SQL Server target definitions. Session Task can be set to load Oracle tables instead, using an Oracle database connection.

Only the following overrides are supported:

  • Relational target to flat file target
  • Relational target to any other relational database type
  • SAP BW target to a flat file target

CAUTION: If target definition datatypes are not compatible with datatypes in newly selected database type, modify the target definition

Mapplet Designer

Mapplet Designer Tool

Mapplet Output Transformation

Mapplet Transformation Icons

Mapplet Advantages
  • Useful for repetitive tasks / logic
  • Represents a set of transformations
  • Mapplets are reusable
  • Use an ‘instance’ of a Mapplet in a Mapping
  • Changes to a Mapplet are inherited by all instances
  • Server expands the Mapplet at runtime
Active and Passive Mapplets
  • Passive Mapplets contain only passive transformations
  • Active Mapplets contain one or more active transformations

CAUTION: changing a passive Mapplet into an active Mapplet may invalidate Mappings which use that Mapplet

    • Do an impact analysis in Repository Manager first
Using Active and Passive Mapplets

Multiple Passive Mapplets can populate the same target instance


Multiple Active Mapplets or Active and Passive Mapplets cannot populate the same target instance


Reusable Transformations

By the end of this section you will be familiar with:

  • Reusable transformation advantages
  • Reusable transformation rules
  • Promoting transformations to reusable
  • Copying reusable transformations
Reusable Transformations
  • Define once - reuse many times
  • Reusable Transformations
    • Can be a copy or a shortcut
    • Edit Ports only in Transformation Developer
    • Can edit Properties in the mapping
    • Instances dynamically inherit changes
    • Be careful: it is possible to invalidate mappings by changing reusable transformations
  • Transformations that cannot be made reusable
    • Source Qualifier
    • ERP Source Qualifier
    • Normalizer used to read a Cobol data source
Promoting a Transformation to Reusable

Place a check in the “Make reusable” box

This action is not reversible

Sequence Generator Transformation

Generates unique keys for any port on a row

  • Passive Transformation
  • Connected
  • Ports
    • Two predefined output ports, NEXTVAL and CURRVAL
    • No input ports allowed
  • Usage
    • Generate sequence numbers
    • Shareable across mappings
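A toy model of the NEXTVAL port's key generation, where the start value and increment are hypothetical configuration:

```python
# Toy model of the Sequence Generator's NEXTVAL port.
class SequenceGenerator:
    def __init__(self, start=1, increment=1):
        self._next = start
        self._increment = increment

    def nextval(self):
        value = self._next
        self._next += self._increment
        return value

seq = SequenceGenerator(start=1)
keys = [seq.nextval() for _ in range(3)]
# keys == [1, 2, 3]
```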
Sequence Generator Properties

Number of Cached Values

Dynamic Lookup

By the end of this section you will be familiar with:

  • Dynamic lookup theory
  • Dynamic lookup advantages
  • Dynamic lookup rules
Additional Lookup Cache Options

Make cache persistent

  • Cache File Name Prefix
    • Reuse the cache by name for another similar business purpose
  • Recache from Database
    • Overrides other settings; Lookup data is refreshed
  • Dynamic Lookup Cache
    • Allows a row to know about the handling of a previous row
Persistent Caches
  • By default, Lookup caches are not persistent
  • When Session completes, cache is erased
  • Cache can be made persistent with the Lookup properties
  • When Session completes, the persistent cache is stored on server hard disk files
  • The next time Session runs, cached data is loaded fully or partially into RAM and reused
  • Can improve performance, but “stale” data may pose a problem
Dynamic Lookup Cache Advantages
  • When the target table is also the Lookup table, cache is changed dynamically as the target load rows are processed in the mapping
  • New rows to be inserted into the target or for update to the target will affect the dynamic Lookup cache as they are processed
  • Subsequent rows will know the handling of previous rows
  • Dynamic Lookup cache and target load rows remain synchronized throughout the Session run
Update Dynamic Lookup Cache
  • NewLookupRow port values
    • 0 – static lookup, cache is not changed
    • 1 – insert row to Lookup cache
    • 2 – update row in Lookup cache
  • Does NOT change row type
  • Use the Update Strategy transformation before or after Lookup, to flag rows for insert or update to the target
  • Ignore NULL Property
    • Per port
    • Ignore NULL values from the input row and update the cache using only non-NULL input values
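Dynamic cache maintenance and the NewLookupRow values can be sketched as follows. Cache contents are hypothetical; the 0/1/2 return values match the port values listed above:

```python
# Sketch of dynamic Lookup cache maintenance and the NewLookupRow values.
cache = {10: "a"}

def process(key, value):
    if key not in cache:
        cache[key] = value
        return 1            # insert row to Lookup cache
    if cache[key] != value:
        cache[key] = value
        return 2            # update row in Lookup cache
    return 0                # cache is not changed

flags = [process(10, "a"), process(20, "b"), process(20, "c")]
# flags == [0, 1, 2]; subsequent rows "know" the handling of previous rows
```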
Example: Dynamic Lookup Configuration

Router Group Filter Condition should be:

NewLookupRow = 1

This allows isolation of insert rows from update rows

Concurrent and Sequential Workflows

By the end of this section you will be familiar with:

  • Concurrent Workflows
  • Sequential Workflows
  • Scheduling Workflows
  • Stopping, aborting, and suspending Tasks and Workflows
Multi-Task Workflows - Sequential

Tasks can be run sequentially:

The Tasks shown are all Sessions, but they can also be other Task types, such as Command, Timer, or Email Tasks

Multi-Task Workflows - Concurrent

Tasks can be run concurrently:

The Tasks shown are all Sessions, but they can also be other Task types, such as Command, Timer, or Email Tasks.

Multi-Task Workflows - Combined

Tasks can be run in a combined concurrent and sequential pattern within one Workflow:

The Tasks shown are all Sessions, but they can also be other Task types, such as Command, Timer, or Email Tasks

Additional Transformations

By the end of this section you will be familiar with:

  • The Rank transformation
  • The Normalizer transformation
  • The Stored Procedure transformation
  • The External Procedure transformation
  • The Advanced External Procedure transformation
Rank Transformation

Filters the top or bottom range of records

  • Active Transformation
  • Connected
  • Ports
    • Mixed
    • One pre-defined output port RANKINDEX
    • Variables allowed
    • Group By allowed
  • Usage
    • Select top/bottom
    • Number of records
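Top-N selection per group with a RANKINDEX output can be sketched in Python with hypothetical data:

```python
# Sketch of a top-N Rank with Group By and a RANKINDEX output.
from collections import defaultdict

rows = [("east", 90), ("east", 70), ("east", 80), ("west", 60), ("west", 95)]

def top_n(rows, n):
    by_group = defaultdict(list)
    for group, value in rows:
        by_group[group].append(value)
    ranked = []
    for group, values in by_group.items():
        for rankindex, value in enumerate(sorted(values, reverse=True)[:n], 1):
            ranked.append((group, value, rankindex))
    return ranked

result = top_n(rows, 2)
# e.g. ("east", 90, 1) and ("east", 80, 2) are kept; ("east", 70) is filtered out
```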
Normalizer Transformation

Normalizes records from relational or VSAM sources

  • Active Transformation
  • Connected
  • Ports
    • Input / output or output
  • Usage
    • Required for VSAM Source definitions
    • Normalize flat file or relational source definitions
    • Generate multiple records from one record
Normalizer Transformation

Turn one row into multiple rows
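The one-row-to-many expansion can be sketched as follows; the column names are hypothetical:

```python
# Sketch of normalization: one record with repeating columns becomes one row
# per occurrence.
record = {"store": "S1", "sales_q1": 100, "sales_q2": 120,
          "sales_q3": 90, "sales_q4": 110}

normalized = [
    {"store": record["store"], "quarter": q, "sales": record[f"sales_q{q}"]}
    for q in (1, 2, 3, 4)
]
# normalized[0] == {"store": "S1", "quarter": 1, "sales": 100}
```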

Stored Procedure Transformation

Calls a database stored procedure

  • Passive Transformation
  • Connected/Unconnected
  • Ports
    • Mixed
    • “R” denotes port will return a value from the stored function to the next transformation
  • Usage
    • Perform transformation logic outside PowerMart / PowerCenter
External Procedure Transformation (TX)

Calls a passive procedure defined in a dynamic linked library (DLL) or shared library

  • Passive Transformation
  • Connected/Unconnected
  • Ports
    • Mixed
    • “R” designates return value port of an unconnected transformation
  • Usage
    • Perform transformation logic outside PowerMart / PowerCenter

Option to allow partitioning

Advanced TX Transformation

Calls an active procedure defined in a dynamic linked library (DLL) or shared library

  • Active Transformation
  • Connected Mode only
  • Ports
    • Mixed
  • Usage
    • Perform transformation logic outside PowerMart / PowerCenter
    • Sorting, Aggregation

Option to allow partitioning

Transaction Control Transformation

Allows custom commit types (source- or target-based) and user-defined conditional commits

  • Passive Transformation
  • Connected Mode Only
  • Ports
    • Input and Output
  • Properties
    • Continue
    • Commit Before
    • Commit After
    • Rollback Before
    • Rollback After
Transaction Control Functionality

  • Commit Types
    • Target Based Commit - commits based on an “approximate” number of records written to the target
    • Source Based Commit - ensures that a source record is committed in all targets
    • User Defined Commit - uses the Transaction Control transformation to specify commits and rollbacks in the mapping, based on conditions
  • Set the Commit Type (and other specifications) in the Transaction Control Condition
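A user-defined commit can be sketched as a per-row condition that flushes pending rows; the even-number condition here is a hypothetical stand-in for a TC_COMMIT_AFTER expression:

```python
# Sketch of a user-defined commit: a per-row condition decides when pending
# rows are committed.
def run(rows, commit_after):
    committed, pending = [], []
    for row in rows:
        pending.append(row)
        if commit_after(row):            # e.g. IIF(cond, TC_COMMIT_AFTER, ...)
            committed.extend(pending)    # commit including the triggering row
            pending = []
    return committed, pending            # rows still pending at end of source

committed, pending = run([1, 2, 3, 4, 5], lambda row: row % 2 == 0)
# committed == [1, 2, 3, 4]; pending == [5]
```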
Versioning features:

  • View Object Version Properties
  • Track Changes to an Object
  • Check objects “in” and “out”
  • Delete or Purge Object version
  • Apply Labels and Run queries
  • Deployment Groups
Informatica Business Analytics Suite

  • Packaged Analytic Solutions (plug-&-play approach)
  • Custom Built Analytic Solutions
Informatica Warehouses / Marts

(Diagram: the Informatica Warehouse™ comprises subject-area warehouses and marts, such as Supply Chain, built on Common Dimensions)
Inside the Informatica Warehouse

(Diagram: Business Adapters™ extract from sources, the Analytic Bus™ transforms, the Warehouse Loader™ loads the Analytic Data Model, and Business Intelligence tools sit on top)
  • Business Adapters™ (Extract)
    • Data Source Connectivity with Minimal Load
    • Structural/Functional Knowledge of Sources
  • Analytic Bus™ (Transform)
    • Transaction consolidation and standardization
    • Source independent interface
  • Warehouse Loader (Load)
    • Type I, II slowly changing dimensions
    • History and changed record tracking
  • Analytic Data Model
    • Industry Best Practice Metrics
    • Process-centric model & conformed dimensions
  • Advanced Calculation Engine
    • Pre-aggregations for rapid query response
    • Complex calculation metrics (e.g. statistical)
PowerConnect Products

Family of enterprise software products that allow companies to directly source and integrate ERP, CRM, real-time message queue, mainframe, AS/400, remote data and metadata with other enterprise data

  • PowerConnect for MQSeries (real time)
  • PowerConnect for TIBCO (real time)
  • PowerConnect for PeopleSoft
  • PowerConnect for SAP R/3
  • PowerConnect for SAP BW
  • PowerConnect for Siebel
  • PowerConnect for Mainframe
  • PowerConnect for AS/400
  • PowerConnect for Remote Data
  • PowerConnect SDK