
Informatica Application ILM
Streamline & Secure Nonproduction Mainframe Environments

September 16, 2010

Scott Hagan, Data Integration Sr. Product Manager

Jay Hill, ILM Director of Product Management and Marketing

Informatica Confidential & Proprietary


Agenda

  • Business Drivers

  • Building Better Test Environments

  • Identifying and Masking Data Automatically

  • Enabling Seamless Database Connectivity

  • Products in Action



Market Driver: Data Proliferation
Customers Are Drowning in Their Own Data

  • Escalating storage, server, and database costs

  • Diminishing application and data warehouse performance

  • Inability to retire redundant or obsolete applications

  • Increasing effort spent on maintenance & compliance

  • More data in more places = greater risk of data breach




Test Data Management
Developers & QA Struggle With Data

Why is this such a big problem?

  • Creating data

    = time-consuming, laborious, costly

  • Gaining access

    = restricted by data protection legislation; more data in more places = more risk

  • Ensuring integrity

    = complex, especially if you’re federating across systems

  • Getting enough

    = load, stress and performance testing

  • Storage space

    = expensive to maintain lots of full production copies

  • Getting the right quality

    = you need maximum code coverage



Informatica Overview
Critical Infrastructure for Data-Driven Enterprises


The Informatica Approach
Comprehensive, Unified, Open and Economical Platform

[Diagram: one platform spanning use cases and data sources, with labels including Data Warehouse, Data Migration, Test Data Management & Archiving, Data Consolidation, Master Data Management, Data Synchronization, B2B Data Exchange, Database, Partner Data, SWIFT, NACHA, Unstructured, Cloud Computing, Application, and HIPAA]


Application ILM Products & Use Cases
Improving Operational Efficiency & Compliance

  • Reduce storage, RDBMS license, and personnel costs

  • Increase performance

  • Reduce effort spent on maintenance & compliance

  • Reduce data privacy risk

[Diagram: database size over time. Informatica Data Archive relocates inactive data out of production; Informatica Data Subset produces lean development/testing/training copies (Copy 1, Copy 2, Copy 3); Informatica Data Masking protects those copies, keeping active data small and performance high]



Informatica Application ILM

  • Application ILM Enables Customers To:

    • Data Archive – Relocate older/inactive data out of production for performance, compliance and application retirement

    • Data Subset – Create smaller copies of production databases for test and development purposes

    • Data Masking – Protect sensitive information in nonproduction

  • ILM Value Proposition:

    • Lower storage and server costs

    • Improve application and query performance

    • Reduce time and cost of backup & batch processes

    • Eliminate cost and complexity by retiring legacy applications

    • Reduce compliance and eDiscovery expense

    • Prevent data breaches in nonproduction environments



Building Better Test Environments

Informatica Data Subset



Informatica Data Subset
Product Objectives

  • Objective: Smaller nonproduction footprint

  • Method: Retaining only required data

  • Primary challenge: Enabling target application usability

  • Solution: Informatica Data Subset



Informatica Data Subset
Benefits of Subsetting



Informatica Data Subset
Lean Copies for Nonproduction Use

[Diagram: a 5 TB production database, cut by a time slice or a functional slice, yields 300 GB subsets for each nonproduction copy, with time savings and space savings at every step]
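
To make the two slicing strategies concrete, here is a minimal Python sketch of the filters a subset operation might generate. The orders table, its columns, and the SQL are invented for illustration; they are not Informatica Data Subset's actual metadata or engine.

    from datetime import date

    # Time slice: keep only recent transactions from production.
    def time_slice(cutoff: date) -> str:
        return f"SELECT * FROM orders WHERE order_date >= DATE '{cutoff.isoformat()}'"

    # Functional slice: keep only one business area.
    def functional_slice(region: str) -> str:
        return f"SELECT * FROM orders WHERE sales_region = '{region}'"

    print(time_slice(date(2010, 1, 1)))   # subset by age
    print(functional_slice("EMEA"))       # subset by function

Either filter shrinks a multi-terabyte source toward a few hundred gigabytes while keeping the rows a tester actually needs.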



Informatica Data Subset
Entity Concept

  • Entity definition:

    • Logical unit to subset

    • Database- and application-level relationships

    • Policy scoping criteria


Entities

  • Data Subset uses metadata-based entities

  • Entities typically represent the transactions your application specialists interact with, such as purchase orders, sales orders, or financial documents (see the sketch below)

  • Selection screens are also metadata-driven to allow for easy customization
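
The following is a hypothetical sketch of what an entity captures: a driving table, the related tables reached through parent/child relationships, and scoping criteria. It mirrors the concept only; the table names and structure are invented, not Informatica's metadata model.

    from dataclasses import dataclass, field

    @dataclass
    class Relationship:
        parent_table: str
        child_table: str
        join_keys: dict          # child column -> parent column

    @dataclass
    class Entity:
        name: str
        driving_table: str       # the logical unit to subset
        relationships: list = field(default_factory=list)
        scope_filter: str = ""   # policy scoping criteria

    purchase_order = Entity(
        name="Purchase Order",
        driving_table="PO_HEADERS",
        relationships=[
            Relationship("PO_HEADERS", "PO_LINES", {"PO_ID": "PO_ID"}),
            Relationship("PO_LINES", "PO_SHIPMENTS", {"LINE_ID": "LINE_ID"}),
        ],
        scope_filter="PO_HEADERS.CREATED_DATE >= DATE '2009-01-01'",
    )

Because the relationships travel with the entity, every related child row follows its parent into the subset, which is what keeps the copy referentially intact.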



Identifying and Masking Data Automatically

Informatica Data Masking



Informatica Data Masking
Product Objective

  • Objective: Protect sensitive information in nonproduction

  • Method: Data masking

  • Primary challenge: Creating meaningful yet de-identified data

  • Solution: Informatica Data Masking



Informatica Data Masking
Privacy Regulations Driving Masking Initiatives



Informatica Data Masking
Realistic, Masked Data to Prevent Data Breach

  • Masking techniques: Substitute, Key Masking, Credit Card Special Masking, SSN Special Masking, Blur, Nullify, Randomize, Expression

[Diagram: production databases (CRM 20 TB, FIN 15 TB, HR 12 TB) feed protected nonproduction copies (DEV 01 through DEV 05, QA 01 through QA 03) via Subset, Mask & Clone, Subset & Mask, and Subset & Clone operations]
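
To illustrate a few of the techniques named above, here is a simplified Python sketch of substitution, key masking, and SSN format-preserving masking. The logic and lookup data are invented stand-ins for the general ideas, not Informatica Data Masking's algorithms.

    import hashlib
    import random

    LAST_NAMES = ["Garcia", "Chen", "Okafor", "Novak"]   # hypothetical lookup set

    # Substitute: replace the value with a realistic one from a lookup set.
    def substitute(value: str, lookup: list) -> str:
        return random.choice(lookup)

    # Key masking: the same input and seed always produce the same output,
    # preserving referential integrity across tables and runs.
    def key_mask(value: str, seed: str, lookup: list) -> str:
        digest = hashlib.sha256((seed + value).encode()).hexdigest()
        return lookup[int(digest, 16) % len(lookup)]

    # SSN special masking: keep the format, randomize the digits.
    def mask_ssn(ssn: str) -> str:
        return "".join(str(random.randint(0, 9)) if c.isdigit() else c
                       for c in ssn)

    print(key_mask("Smith", seed="qa02", lookup=LAST_NAMES))   # deterministic
    print(mask_ssn("123-45-6789"))                             # e.g. 904-31-2278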



Informatica Data Masking
Contextually Correct, Referentially Intact Data Masking


Informatica ILM
Broad Application and Database Support

[Diagram: Informatica ILM solutions (Data Masking, Data Subset, Data Archive) sit on application-aware accelerators (Oracle E-Business, SAP, PeopleSoft, Siebel, Custom/3rd Party) and universal connectivity (Oracle, SQL Server, DB2 UDB, DB2 z/OS, VSAM, Teradata, Sybase, Other)]


Informatica ILM: An Enterprise Solution
Platform & Vendor Independent

[Diagram: a heterogeneous landscape across four sites. Acquired Division: reservation applications on Oracle 9i/HP-UX (10 years = 600 GB). Shared Service Center: a custom billing application holding invoices and contracts on IMS (7 years = 1.4 TB). Call Center: Siebel 7.8 service requests on DB2 for z/OS (5 years = 350 GB). Corporate HQ: logistics applications and benefits data in VSAM (600 KSDS files)]



Informatica PowerExchange
Fast and Easy Access to Mainframe Sources!

September 16, 2010

Scott Hagan, Data Integration Sr. Product Manager

Informatica Confidential & Proprietary



Informatica PowerExchange

What’s the Problem?

You need access to mainframe data, quickly! No time or expertise to code extracts? FTPs? Queries? What about security? Speed? Recoverability? Integration support?

Oh yes, and I need it yesterday!



Informatica PowerExchange

Informatica PowerExchange helps you to…

Unlock difficult-to-access data (mainframe, legacy, etc.) and make it available when you need it: batch, regular updates, or real time



Data Integration
Traditional Methods

[Diagram: source data reaches the target database through four hand-coded stages: Extract (a program pulls from one or more sources), Translate (filtering and ASCII/EBCDIC conversion), Move (transporting data across platforms), and Load (loading data into the target database)]



Data Integration
PowerExchange Approach

NO PROGRAMMING, NO INTERMEDIATE FILES

Data is extracted from the source using SQL, converted (EBCDIC/ASCII), filtered, and made available to the target database in memory, without any program code or FTP.
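
As a small illustration of the in-memory conversion step, the sketch below round-trips a record through an EBCDIC code page using Python's built-in cp037 codec. The record content is invented, and this demonstrates only the character-set translation, not PowerExchange itself.

    # Encode a record as it would sit on z/OS (EBCDIC, code page 037),
    # then convert it to an ASCII-compatible string in memory.
    ebcdic_record = "ACME CORP 001234".encode("cp037")
    ascii_record = ebcdic_record.decode("cp037")

    assert ascii_record == "ACME CORP 001234"
    print(ascii_record)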


PowerExchange - Batch
Highly Scalable Bulk Access to Data

Projects:

  • Data warehousing

  • Data migration

  • Data consolidation

  • Application implementation

  • Application migration

  • ILM

    • Test data sources

Sources/Targets:

  • Databases

  • Data warehouses

  • Packaged applications

  • Mainframe, midrange

  • Message-oriented middleware

  • Collaboration

  • Technology standards

[Diagram: sources and targets connect through PowerExchange on either side of PowerCenter, on the Informatica Data Integration Platform]


PowerExchange - Real-Time
Immediate Access to Data, Events, and Web Services

Projects:

  • Straight-through processing

  • Real-time analytics

  • Real-time warehousing

  • Application integration

Sources:

  • Message-oriented middleware

  • Web services

  • Packaged applications

  • Multiple modes

    • Batch

    • Continuous

[Diagram: real-time sources connect through PowerExchange to PowerCenter Real Time Edition on the Informatica Data Integration Platform]


PowerExchange - Change Capture
Creation and Detection of Business Events

Projects:

  • Create business events from database updates

  • Operational data integration (ODI)

  • Master data management (MDM)

  • Trickle-feed data warehousing

  • Data replication/synchronization

Sources:

  • Relational, mainframe, and midrange databases

  • Multiple modes

    • Batch (for initial materialization)

    • Net change

    • Continuous capture

[Diagram: database changes flow through the PowerExchange CDC Option to PowerCenter Real Time Edition on the Informatica Data Integration Platform]
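
The sketch below illustrates the general shape of consuming captured changes: a batch materialization seeds the target, then insert/update/delete records are applied, with "net change" keeping only the latest image. The record structure is invented for illustration and is not PowerExchange's actual API or log format.

    from dataclasses import dataclass

    @dataclass
    class ChangeRecord:
        table: str
        op: str       # 'I' insert, 'U' update, 'D' delete
        key: dict
        values: dict

    # Apply one captured change to a toy in-memory "target table".
    def apply_change(target: dict, rec: ChangeRecord) -> None:
        k = tuple(sorted(rec.key.items()))
        if rec.op == "D":
            target.pop(k, None)
        else:                 # insert or update: keep the latest image
            target[k] = rec.values

    target_table = {}         # batch materialization would seed this first
    for rec in [
        ChangeRecord("CUSTOMER", "I", {"ID": 1}, {"NAME": "ACME"}),
        ChangeRecord("CUSTOMER", "U", {"ID": 1}, {"NAME": "ACME CORP"}),
    ]:
        apply_change(target_table, rec)
    print(target_table)       # only the net result remains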


PowerExchange Run-Time
Batch Data Movement (Test Data Creation?)

[Diagram: a PowerExchange Listener runs in each source operating environment, reading mainframe and mid-range data, packaged applications, relational and flat files, standards and messaging, and remote data; PowerCenter tools and user applications issue SQL, data maps provide non-relational access, and data records flow to targets (ETL, EAI, BI)]



Test Data Management Concepts

  • Privacy policies are defined at the logical level: define once, use multiple times (e.g. substitute last names, skew salary, substitute credit cards, nullify SSNs)

  • Policy assignment happens at the physical level: reuse a policy across applications (files, ERP, mainframe, DBMS)

  • Plans define a data subset with entities, filter criteria, and privacy policies (see the sketch below)
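
Here is a hypothetical sketch of how these three concepts relate: a policy defined once, assignments binding it to physical columns, and a plan bundling entities, filter criteria, and assignments. The structure and names are invented to mirror the idea, not Informatica's object model.

    from dataclasses import dataclass, field

    @dataclass
    class PrivacyPolicy:            # logical level: define once
        name: str                   # e.g. "Nullify SSNs"
        masking_rule: str           # e.g. "nullify", "substitute", "skew"

    @dataclass
    class PolicyAssignment:         # physical level: reuse everywhere
        policy: PrivacyPolicy
        datastore: str              # files, ERP, mainframe, DBMS
        table: str
        column: str

    @dataclass
    class Plan:                     # subset + masking in one unit
        name: str
        entities: list = field(default_factory=list)
        filter_criteria: str = ""
        assignments: list = field(default_factory=list)

    ssn_policy = PrivacyPolicy("Nullify SSNs", "nullify")
    plan = Plan(
        name="Online Banking Subset and Mask",
        entities=["Customer"],
        filter_criteria="REGION = 'WEST'",
        assignments=[PolicyAssignment(ssn_policy, "DBMS", "CUSTOMERS", "SSN")],
    )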



Financial Services Co. and New Regulations

  • Financial holding company approval

    • In October 2008, Financial Services Co. was approved by the US Federal Reserve Board to operate as a financial holding company, allowing it to offer additional retail banking services to its customers

  • New regulations

    • Financial Services Co. is now subject to supervision by the Federal Reserve and regulated by the FDIC

  • Redundant work

    • To comply with the new regulations, many business units within Financial Services Co. are performing redundant work, such as separately complying with data privacy regulations

  • Self-service solution

    • Financial Services Co. wanted to pursue a holistic approach and build a self-service data masking solution



Self Service Data Masking Solution

  • Corporate IT Compliance Team

    • Reviewed regulations and determined what constitutes sensitive or private data

    • Built a finite list of sensitive fields that must be masked throughout the organization and the masking rule that should be used

    • Action item 1: update Business Glossary with the corporate IT privacy policies for each sensitive field

    • Action item 2: build company-wide data masking policies for each sensitive field with the associated masking rule

  • Online banking application owner

    • Apply corporate IT compliance team’s privacy policies to my online banking application



Business Glossary

Open the Business Glossary to define the masking policies in business terms



ILM Workbench – Policies

I need to create a new policy for credit cards

I’ll enter a clear name and description for credit cards

I’ll locate the available masking rules and choose the rule I want to assign

Rules can also be defined as reusable mapplets



ILM Workbench – Policy Assignment

I need to apply the corporate privacy policy to my online banking application

I just profiled my source database, so now I can look for data patterns that represent credit cards

I’ll start by assigning a policy to credit card columns

I’ll locate sensitive columns and assign the appropriate policy



ILM Workbench – Entities

I initially want to test my masking policies on a subset of data

Entities are a set of related tables with a filter criteria definition

I’ll create a subset of data based on the Customer entity



ILM Workbench – Plans

Now that I’ve reviewed my list of entities, I’m ready to create an integrated Data Subset and Data Masking plan

I’ll give the plan a clear name to show that this is a subset-and-masking plan

I’ll add all the policy assignments I created earlier to the plan

I’ll search for and add the Customer entity to the plan so that only a subset of the data is masked



ILM Workbench – Plans

The plan is now complete and being generated

Before I process the plan, I’ll launch Metadata Manager and look at the data lineage from my source to my target system to validate the end-to-end definition. If there were any additional sensitive fields, they would be highlighted here

Now I’m ready to process the plan. I’ll switch to the PowerCenter Workflow Monitor for detailed monitoring information



ILM Workbench – Masking Validation

Once the plan completes, I’d like to validate the results to ensure masking was performed as intended

I can use rules such as these to validate the results

  • SSN: all values have changed

  • SSN: all values came from the dataset

  • First Name: all values have changed

  • Credit cards: all values have the proper format
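
As a minimal sketch of how such rules could be checked programmatically, the Python below pairs source and target column values and tests the three rule types named above. The data and logic are invented for illustration; this is not the product's validation engine.

    import re

    def all_changed(source: list, target: list) -> bool:
        # Rule: every masked value differs from its source value.
        return all(s != t for s, t in zip(source, target))

    def all_from_dataset(target: list, dataset: set) -> bool:
        # Rule: every masked value comes from an approved lookup dataset.
        return all(t in dataset for t in target)

    def all_proper_format(target: list, pattern: str) -> bool:
        # Rule: every masked value matches the expected format.
        return all(re.fullmatch(pattern, t) for t in target)

    src_ssn = ["123-45-6789", "987-65-4321"]
    tgt_ssn = ["904-31-2278", "987-65-4321"]      # second value was not masked!

    print(all_changed(src_ssn, tgt_ssn))                      # False: rule fails
    print(all_proper_format(tgt_ssn, r"\d{3}-\d{2}-\d{4}"))   # True

A failed all_changed check is exactly the scorecard result shown on the next slide, where one SSN value is the same in source and target.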



ILM Workbench – Masking Validation

After running the validation, I can see a simple scorecard of the rules that passed or failed

Here are a few of the validation rules I created earlier; I set them up with simple operators like this one

I can see that one SSN value didn’t pass the validation rule: the value in the source is the same as in the target



Data Masking and Data Subset Check-List

  • Built reusable masking policies

  • Reduced redundant work

  • Complied with data privacy regulations

  • Integrated subset with privacy rules

  • Validated masking results

