1. BI1003 – Data Extraction – v2.0
2. Data Extraction
3. Data Extraction
4. PrepareMe In this course, we will cover three main sub-topics:
- Purpose
- Use
- Challenges
5. Purpose This course demonstrates the various data extraction methods.
Extraction Methods covered
Business Content Extraction from SAP R/3
LIS & LO Extraction from SAP R/3
CO-PA and FI-SL Extraction
Generic Extractors
Flat File Extraction
UD Connect, DB Connect, SOAP Based and Open Hub Extraction
DataSource Enhancement
6. Use In many organizations, data is fragmented and spread across many databases and applications.
To be useful, data must be integrated, standardized, synchronized, and enriched – typically through ETL (extraction, transformation, and loading) processes.
SAP BI provides a broad set of ETL capabilities that support data extraction.
With the open interfaces of SAP BI, data can be loaded from virtually any source, and the system can handle the huge volumes of transactional data typical of an enterprise landscape.
Extraction, then, is the process of bringing data into the BW system for analysis and reporting.
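As a rough illustration of what "extract, transform, load" means here, a minimal sketch in plain Python (the record layout and conversion rates are invented for the example; this is not SAP BI code):

```python
# Minimal ETL sketch -- illustrative only, not SAP BI code.

RATES = {"EUR": 1.0, "USD": 0.5}  # assumed conversion rates to EUR

def extract():
    """Pretend rows pulled from two fragmented source systems."""
    return [
        {"order": "4711", "amount": 100.0, "currency": "USD"},
        {"order": "4712", "amount": 250.0, "currency": "EUR"},
    ]

def transform(rows):
    """Standardize: convert every amount to a single currency."""
    return [{"order": r["order"],
             "amount_eur": r["amount"] * RATES[r["currency"]]}
            for r in rows]

def load(rows, target):
    """Append the cleaned rows to the (in-memory) target."""
    target.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse[0]["amount_eur"])  # 50.0
```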
7. Challenges Extracting large volumes of data from SAP R/3 by defining delta management
Modifying the already available Business Content to satisfy user needs
Integrating data from non-SAP systems into BW
8. Data Extraction
9. Data Extraction : Overview
10. Process Flow
11. Process Flow The data flows from the OLTP system to the PSA tables without any transformation.
If any transformation or filtering is required, it can be done in the transfer rules and update rules.
The data is then loaded into data targets such as ODS objects. An ODS object acts as a storage location for consolidated and cleaned-up transaction data.
The data from multiple ODS objects is then loaded into an InfoCube and is thus available for generating reports.
In this example, order and delivery data are extracted separately and populate separate PSA and ODS tables. The data is then merged into a new ODS object storing combined order and delivery data, which in turn updates the InfoCube.
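The merge step described above can be sketched in plain Python (the table and field names are invented; each dict stands in for an ODS table keyed by order number):

```python
# Sketch of the PSA -> ODS -> combined-ODS flow; not SAP code.

psa_orders = [{"order": "1", "item": "A", "qty": 10}]
psa_deliveries = [{"order": "1", "delivered_qty": 8}]

# Each stream first lands in its own "ODS" unchanged.
ods_orders = {r["order"]: r for r in psa_orders}
ods_deliveries = {r["order"]: r for r in psa_deliveries}

# A further ODS merges both streams on the order number.
ods_order_delivery = {}
for key, o in ods_orders.items():
    d = ods_deliveries.get(key, {})
    ods_order_delivery[key] = {**o, **d}

# The combined record would then update the InfoCube.
print(ods_order_delivery["1"])
# {'order': '1', 'item': 'A', 'qty': 10, 'delivered_qty': 8}
```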
12. Multiple Source Systems Data in BW can come from various systems, such as:
1. SAP systems
2. SAP Business Information Warehouse systems
3. Flat files, for which metadata is manually maintained and data is transferred into BW via a file interface
4. A database management system into which data is loaded from a database supported by SAP, using DB Connect rather than an external extraction program
5. External systems, in which data and metadata are transferred using staging BAPIs
13. Types of Extractors
14. Types of Extractors Extractors are mainly of two types:
Application-Specific
Cross-Application
Application-specific extractors consist of:
BI Content extractors (LO Cockpit)
Customer-generated extractors (CO-PA, FI-SL and LIS)
Cross-application extractors consist of generic extractors (based on a database table/view, an InfoSet, or a function module)
15. Business Content Extraction
16. Business Content Extractors SAP provides extractors for almost all applications. They are delivered as an add-on with the Business Content.
For each application, such as FI, CO or HR, there are specific tables in the OLTP system.
BI Content extractors consist of extract structures based on these tables, and can thus be used to fetch the data into BW.
Hence, when a company implements an application such as FI or CO, it can directly use the available BI Content extractors without having to create its own.
SAP delivers these objects in the Delivery version (D), and they must be converted to the Active version (A).
Business Content DataSources from a source system become available in BW for transferring data only after you have converted them into the active version in the source system and replicated them.
17. Uses of BI Content Extractors Built-in extractors
High coverage (Applications and Industries)
Available for both Transaction and Master data
Ready to use
Reduce implementation Costs and efforts
Delta capable
18. LO Data Extraction : Overview
19. LO Cockpit The Logistics Customizing Cockpit provides a simplified way to extract logistics data and transfer it to the SAP Business Information Warehouse.
20. LO Data Extraction : Data Flow As shown in the figure, the document data for the various applications, such as customer orders, deliveries and billing, is first transferred to the various communication structures.
These communication structures are used to form the extract structures in R/3, which underlie DataSources such as 2LIS_11_VAHDR and 2LIS_11_VAITM.
These DataSources can then be replicated to BW and assigned to InfoSources.
Then, by creating transfer rules and update rules and defining the transformation, the data is loaded into data targets such as InfoCubes and ODS objects, and is thus available for reporting.
21. V1 and V2 Updates V1 Update – Synchronous Update
If you create or change a purchase order (ME21N/ME22N), by the time you press 'Save' and see the success message (PO ... changed), the update to the underlying tables EKKO/EKPO has already happened. This update was executed in a V1 work process.
V2 Update – Asynchronous Update
If you create or change a purchase order (ME21N/ME22N), after you press 'Save' it may take a few seconds for the underlying tables EKKO/EKPO to be updated, depending on system load. This update was executed in a V2 work process.
For BW extraction, a third mode is used in addition: the V3 (collective) update, which collects update records and transfers them in a periodically scheduled batch run.
22. V3 Update Modes in LO Cockpit There are four Update Modes in LO Extraction
Serialized V3 Update
Direct Delta
Queued Delta
Un-serialized V3 Update
23. Serialized V3 Update
24. Serialized V3 Update Transaction data is collected in the R/3 update tables
Data in the update tables is transferred by a periodic update process to the BW delta queue
Delta loads from BW retrieve the data from this BW delta queue
Transaction postings lead to:
Records in the transaction tables and in the update tables
A periodically scheduled job transfers these postings into the BW delta queue
This BW delta queue is read when a delta load is executed.
25. Direct Delta
26. Direct Delta Each document posting is directly transferred into the BW delta queue
Each document posting with delta extraction leads to exactly one LUW in the respective BW delta queues
Transaction postings lead to:
Records in the transaction tables and, directly, in the BW delta queue
No collective run is required in between
This BW delta queue is read when a delta load is executed.
27. Queued Delta
28. Queued Delta Extraction data for the affected application is collected in an extraction queue
A collective run, as usual, transfers the data into the BW delta queue
Transaction postings lead to:
Records in the transaction tables and in the extraction queue
A periodically scheduled job transfers these postings into the BW delta queue
This BW delta queue is read when a delta load is executed.
29. Un-Serialized V3 Update
30. Un-Serialized V3 Update Extraction data is written, as before, into the update tables with a V3 update module
A V3 collective run transfers the data to the BW delta queue
In contrast to serialized V3, the collective run transfers the data from the update tables without regard to sequence
Transaction postings lead to:
Records in the transaction tables and in the update tables
A periodically scheduled job transfers these postings into the BW delta queue
This BW delta queue is read when a delta load is executed.
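The queue mechanics shared by these modes can be sketched in plain Python. This is a toy model of the queued-delta variant (postings collect in an extraction queue, a collective run moves them to the BW delta queue, a delta load drains it); all names are invented and nothing here is SAP code:

```python
# Toy model of the queued-delta flow; not SAP code.

extraction_queue = []
bw_delta_queue = []

def post_document(doc):
    # The V1/V2 document update writes the transaction tables
    # (omitted here) and appends a record to the extraction queue.
    extraction_queue.append(doc)

def collective_run():
    # Periodic job: move everything to the BW delta queue.
    bw_delta_queue.extend(extraction_queue)
    extraction_queue.clear()

def delta_load():
    # A BW delta request reads and empties the delta queue.
    batch = list(bw_delta_queue)
    bw_delta_queue.clear()
    return batch

post_document({"doc": 1})
post_document({"doc": 2})
collective_run()
print(len(delta_load()))  # 2
print(len(delta_load()))  # 0 -- nothing new since the last run
```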
31. New LO Data Extraction : Why ?
32. New LO Data Extraction : Why ? Performance and Data Volume
Detailed extraction: you can deactivate the extraction of, for example, scheduling data ('thin' extractors lead to small upload volumes)
Document changes: only BW-relevant data changes are updated (smaller upload volumes)
LIS tables are not updated: reduced data volumes due to the removal of redundant data storage
Update with batch processes: no overload of everyday work
33. New LO Data Extraction : Why ? Simple Handling
BW Customizing Cockpit: a central, single maintenance tool for the logistics applications
No LIS functionality: no knowledge of LIS Customizing, update settings, etc. required
Function enhancement: enhancements can be created simply and without modifications
Others
No delta tables (SnnnBIW1/-2): no duplicated update or duplicated data storage
34. Customer Generated Extractors
35. Customer Generated Extractors In response to customer demands, SAP has designed the reporting facilities in certain parts of the R/3 system so that they can easily be customized to meet customers' requirements.
The Business Information Warehouse must be able to process this data, so it must be possible to define generic extractors in these applications.
These applications are the:
Logistics Information System
Profitability Analysis
Special Purpose Ledger
36. LIS Extraction
38. LIS Data Extraction The LIS data flow can be described as follows:
Operative transaction data (sales orders, billing) is stored in application data tables (for example VBAK/VBAP for sales orders)
The information is transferred from the operative document to the Communication Structures (for example MCVBAK, MCVBAP) to update the data in LIS.
These R/3 Communication Structures perform the same function as the BW communication structures, but they fill LIS InfoStructures in R/3 with data, not cubes in BW.
The update rules write the information from the Communication Structure to the corresponding Information Structures.
39. LIS Data Extraction Although the term InfoStructure is used in R/3, an InfoStructure is a real transparent table, not a structure that contains data only at run time.
LIS has many reporting tools, such as ABC analysis, correlation and graphical displays, but the limitation of the schema, with only one table, makes reporting slower than it would be in BW.
Thus, with the LIS extractor, the data is transferred from the LIS InfoStructure into BW.
LIS extractors are now obsolete, having been replaced by the LO Cockpit.
40. FI-SL Extractors
41. FI-SL Extractors FI-SL is an application in which data (planned and actual) from different levels of OLTP applications is combined to measure business performance.
FI-SL includes planning functions and reporting tools. FI-SL reporting in SAP R/3 is, however, restricted by the following:
Cross-application reporting capabilities are limited
The OLTP system is optimized for transaction processing, and a high reporting workload would have a negative impact on the overall performance of the system
The solution to these limitations is FI-SL reporting in BW.
42. Uses of FI-SL Extractors In FI-SL one can use alternative charts of accounts (operative, group-specific and country-specific charts of accounts)
Adjustment postings (direct data entry) can be made in the FI-SL system
Various fiscal year variants enable one to create weekly or monthly reports
Validations and substitutions allow one to check or modify the data whenever it enters the FI-SL system
Up to three currencies and two quantities can be carried in parallel in FI-SL
Selective data retrieval: the assignment of transactions to particular company codes or ledgers determines which ledgers are updated.
43. FI-SL Data Flow In addition to data from FI, CO, MM and SD, external data and directly entered data can also be posted into FI-SL.
The update takes place either online or as subsequent processing. With subsequent processing, a predefined number of data records is transferred to the FI-SL database tables at a certain point in time, independently of when the documents were created.
You can use several methods to influence the update of data in the FI-SL special purpose ledger:
1. Validation checks the data that is going to be updated against freely definable rules.
2. Substitution replaces the data according to a set of rules before the update starts.
3. Ledger selection allows only selected ledgers to be updated for data from a transaction.
4. The fields that are transferred determine which characteristics/dimensions are used in the FI-SL system. You can use these dimensions when you work with the special purpose ledger system.
Operations available for FI-SL data:
- The currency translation function translates amounts that have been posted to a ledger in the FI-SL system.
- At the end of the fiscal year, the balance carry-forward function transfers actual and plan values from the previous to the current fiscal year.
- Allocation is the transfer of an amount that has already been posted from a sender object to one or more receiver objects.
- You can create rollup ledgers containing cumulative and summarized data from one or more of the other ledgers to speed up report processing times.
44. CO-PA Extractors
45. CO-PA Extractors Profitability Analysis (PA) is an integrated component in the SAP R/3 system.
All data related to profitability from the other SAP R/3 applications is mapped in CO-PA in accordance with the corresponding business transactions.
This allows you to transfer into CO-PA billing document data from SD, cost object costs from CO-PC, and overhead costs from Overhead Cost Controlling.
CO-PA collects all of the OLTP data needed to calculate contribution margins (sales, cost of sales, overhead costs)
CO-PA also has powerful reporting tools and planning functions
46. CO-PA Extractors
47. CO-PA Extractors During billing in SD, revenues and payments are transferred to profitability segments in Profitability Analysis. At the same time, sales quantities are valuated using the standard cost of goods manufactured, as specified in the cost component split from CO-PC.
A typical question that can be answered with the CO-PA module is: what are the top products and customers in our different divisions?
The wide variety of analysis and planning functions in CO-PA allows you to plan, monitor and control the success of your company along the product-oriented, customer-oriented and organizational dimensions of multidimensional profitability segments.
48. Generic Extractors
49. Generic Extractors Generic extractors are cross-application extractors used in scenarios where the other types of extractors are unavailable.
Generic extractors are of three types:
Based on Database view/Table
Based on InfoSet
Based on Function Module
50. Generic Extractors When should you use generic extractors?
Business Content does not contain a DataSource for your application.
The application does not feature its own generic delta extraction method.
You are using your own programs in SAP R/3 to populate the tables.
The tools for generic data extraction can be used to generate an extractor for any application.
51. Flat file Extraction BW provides the facility to load data from flat files (CSV or ASCII files).
It supports the following types of data:
Transaction data
Master data, loaded either directly or flexibly
Attributes
Text
Hierarchies
The flat file can be stored either on a local system or on the application server.
From a performance point of view, however, the file is best stored on the application server and then loaded into BW.
53. UD Connect Overview For the connection to data sources, UD Connect uses the J2EE Connector Architecture.
BI Java Connectors are available as resource adapters for various drivers, protocols and providers:
BI JDBC Connector
BI ODBO Connector
BI SAP Query Connector
BI XMLA Connector
54. DB Connect Overview
55. DB Connect Overview A purchasing application runs on a legacy system based on an Oracle database.
To analyze the data from the purchasing application, the data needs to be loaded into the BW system (which may be installed on a different database, e.g. MS SQL Server).
DB Connect can be used to connect to the DBMS of the purchasing application and extract data from tables or views.
57. Data Transfer with DB Connect By default, when a BW application server is started by the SAP kernel, the system opens up a connection to the database on which the SAP system runs.
All SQL commands (irrespective of whether they are Open or Native SQL) that are submitted by the SAP kernel or by ABAP programs, relate automatically to this default connection, meaning that they run in the context of the database transaction that is active in this connection.
58. Data Transfer with DB Connect Connection data, such as the database user name, password, and database name, is taken either from the profile parameters or from the corresponding environment variables (this is database-specific).
Thus DB Connect can be used to open further database connections in addition to the default connection, and to use these connections to transfer data from tables or views into a BW system.
60. SOAP-Based Transfer of Data As a rule, data transfer into BW takes place using a data request that is sent from BW to the source system (a pull by the scheduler).
You can also send data to SAP BW under external control. This is a data push into SAP BW.
A data push is possible in several scenarios:
Transferring data using the SOAP service of the SAP Web AS
Transferring data using a Web service
Transferring data using SAP XI
In all three scenarios, data transfer takes place via transfer mechanisms that comply with the Simple Object Access Protocol (SOAP) and are XML-based.
62. Open Hub Service The open hub service enables you to distribute data from an SAP BW system into external data marts, analytical applications, and other applications.
With it, you can ensure controlled data distribution across several systems.
The central object for the export of data is the InfoSpoke. With it, you define the object from which the data comes and the target into which it is transferred.
Through the open hub service, SAP BW becomes the hub of an enterprise data warehouse. Data distribution is kept transparent through central monitoring of the distribution status in the BW system.
63. Datasource Enhancement Need for DataSource enhancement:
When we require an additional field from a database table and it is not directly available from the DataSource, we can append the field to the extract structure of the DataSource.
The logic to populate that field is then written in the customer exit.
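The enhancement idea can be sketched in plain Python (the appended field ZZMATGRP, the lookup table and all values are invented for illustration; in R/3 this logic would live in an ABAP customer exit, not in Python):

```python
# Conceptual sketch of a DataSource enhancement: the extract
# structure gains an appended field, and an "exit" routine fills
# it from another table. All names invented; not SAP code.

# Lookup table standing in for the extra database table.
material_groups = {"M-01": "PUMPS"}

def customer_exit(rows):
    """Populate the appended field ZZMATGRP for every extracted row."""
    for r in rows:
        r["ZZMATGRP"] = material_groups.get(r["MATNR"], "")
    return rows

extracted = [{"MATNR": "M-01", "QTY": 5}]
print(customer_exit(extracted)[0]["ZZMATGRP"])  # PUMPS
```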
64. Data Extraction
65. BI Content Extractors
66. BI Content Extractors – Steps
67. Step 1 - Activate DataSources
68. Step 1 - Activate DataSources
69. Step 2 - Replicate datasources
Tcode RSA1 → Modelling → Source Systems → Context Menu → Replicate DataSources.
NOTE :
The Data Source must be replicated before it can be used in BW. An R/3 System connected to BW provides a variety of DataSources. These can be uploaded into BW by replicating them in the source system tree.
The DataSources are delivered in a so-called 'D version' and can be transferred to the 'A version' when they are activated.
70. Step 3- Assign the InfoSource
71. Step 4 - Maintain InfoCube and Update rules
Go to InfoProvider (RSA1 → Modelling)
Create the InfoCube or select the Business Content cube.
Activate the InfoCube.
Create update rules and activate them.
Create an InfoPackage to extract the data.
72. Step 5 - Create InfoPackage
73. Step 6 - Initialize the delta process
75. Monitoring the Upload Procedure
76. Delta Update (Scheduler)
77. Delta Update (Scheduler) For high volumes of transaction data, a full update is usually only justified the first time data is transferred, or if the statistics data has to be rebuilt following a system failure.
Delta update mechanisms that restrict the data volume to realistic limits are therefore required to implement a performance-oriented, periodic data transfer strategy.
For example, when sales figures are updated every week in the Business Information Warehouse, only the sales document information that has been added or changed in the last week should be extracted.
78. LO Cockpit
79. Logistics Customizing Cockpit The LO Cockpit provides the following functions, used in the sequence given below:
Maintaining extract structures
Maintaining DataSources
Activating the update
Controlling V3 update
80. LO Data Extraction (SBIW)
81. Logistics Extract Structure – Cockpit (LBWE)
82. Individual Steps in LO Extraction
83. Step 1 - Maintaining the Extract Structure
84. Maintaining the Extract Structure The extract structure is filled from the assigned communication structures. You can use only selected fields from the communication structures (SAP-internal control fields, for example, are not offered).
SAP already delivers extract structures, which you can enhance (by connecting to the communication structure). Every extract structure can be maintained both by you and by SAP.
After you set up the extract structure, the system generates it automatically, completing missing fields (their units and characteristics). The extract structure is built hierarchically from the communication structures: every communication structure leads to the generation of a substructure belonging to the actual extract structure.
85. Maintaining DataSources
86. Maintaining DataSources : Procedure There is a DataSource (e.g. 2LIS_11_VAITM) for each extract structure (e.g. MC11VA0ITM) that is made available in the OLTP system.
A maintenance screen is displayed in which you can assign further properties to the fields of the extract structure:
Selection fields
Hidden fields
Cancellation fields
(the field is inverted when the document is cancelled (multiplied by -1))
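The cancellation-field behaviour can be sketched in plain Python (the record layout is invented): on cancellation, the extractor sends the record again with its key figures multiplied by -1, so the delta nets out to zero.

```python
# Sketch of the "cancellation field" inversion; not SAP code.

def cancellation_record(record, key_figures=("qty", "amount")):
    """Return a reversal record with the key figures multiplied by -1."""
    out = dict(record)
    for kf in key_figures:
        out[kf] = out[kf] * -1
    return out

original = {"order": "4711", "qty": 10, "amount": 500.0}
reversal = cancellation_record(original)
print(original["qty"] + reversal["qty"])  # 0
```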
87. Step 2 – Replicating DataSources An R/3 System connected to BW provides a variety of DataSources. These can be uploaded into BW by replicating them in the source system tree.
The DataSources are delivered in a so-called 'D version' and can be transferred to the 'A version' when they are activated. They can also be changed and forwarded to the 'A version' (one that deviates from the default).
The DataSource must be replicated before it can be used in BW.
To do this, choose the 'Source Systems' tab in the BW Administrator Workbench. Select the relevant OLTP system and choose Replicate DataSources from the context menu.
88. Step 3 - Maintaining Communication Structure
89. Step 4 - Maintaining Transfer Rules
90. Step 5 - Maintaining InfoCubes and Update Rules
91. Step 6 - Activating Extract Structures
92. Step 7 - Delete set up tables
93. Step 8 - Initialization/Simulation (OLI*BW)
94. Step 8 - Creating Infopackages
95. Step 9 - Initializing the Delta Process
96. V3 Update- Settings When the extract structure is activated for updating, the data is written to the extract structures immediately, both online and when you fill the setup tables for restructuring.
The data is updated with the V3 update and is then ready to be transferred by the collective (batch) run to the central delta management.
Define job parameters such as the start time and printer settings, and schedule the batch jobs at this point.
97. V3 Update Settings
98. Delta Queue Maintenance
99. Step 10 -Delta Update (Scheduler)
100. FI-SL Extractors
101. FI-SL extraction Step-by-Step
102. Step 1 - Setting up an FI-SL DataSource
103. Step 1 - Setting up an FI-SL DataSource Go to SBIW → Settings for Application-Specific DataSources → Generate Transfer Structure for Totals Table. In step 1 you generate the transfer structure for your summary table. This structure is used to transfer transaction data from your ledgers to the Business Information Warehouse.
The structure contains fields from the summary table, the object table, and fields derived from Customizing.
The system names the extract structure according to the following convention:
If the name of the summary table ends with 't', the 't' is replaced with 'b'.
If the name of the summary table does not end with 't', a 'b' is appended to the name.
If this summary table is regenerated, the existing structure is overwritten.
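The naming rule above can be expressed as a small function (Python for illustration only; the sample table names ZZSPLT and ZZSUM1 are invented):

```python
# The FI-SL extract-structure naming rule as a function; the
# table names used below are invented examples, not SAP tables.

def extract_structure_name(summary_table):
    """Trailing 'T' becomes 'B'; otherwise a 'B' is appended."""
    if summary_table.upper().endswith("T"):
        return summary_table[:-1] + "B"
    return summary_table + "B"

print(extract_structure_name("ZZSPLT"))  # ZZSPLB
print(extract_structure_name("ZZSUM1"))  # ZZSUM1B
```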
104. Step 2 - Setting up an FI-SL DataSource
105. Step 2 - Setting up an FI-SL DataSource Go to SBIW → Settings for Application-Specific DataSources → Financial Accounting → Create InfoSource for Ledger. In step 2 you define a DataSource for a ledger and assign this DataSource to the ledger.
In this step, all non-standard ledgers whose summary tables have already been assigned an extract structure are displayed. In other words, a ledger might not be displayed in step 2 because:
the ledger is a standard ledger (its name starts with a number, e.g. 00 for the general ledger and 0F for sales costs), or
the summary table for the ledger does not have an extract structure.
Naming convention:
The name must start with 3. The system proposes a name with the prefix 3FI_SL_ followed by the name of the ledger.
106. Step 2 - Setting up an FI-SL DataSource The status column shows whether a DataSource is already assigned to the ledger (the traffic light is green).
Select the ledger to which you want to assign the DataSource.
107. Step 2 - Setting up an FI-SL DataSource This overview shows some of the fields in the extract structure of the DataSource 3FI_SL_LE_TT.
A global ledger has the company as its organizational unit. The company characteristic appears in the transfer structure only for DataSources of global ledgers. With global ledgers, the company code characteristic is optional.
Other information on these DataSources:
- The chart of accounts and the fiscal year variant cannot be used as selection criteria
- The version field shows the FI-SL ledger version
- The FI-SL record type characteristic is based on the Reporting value type InfoObject. The extractor derives two reporting value types, plan and actual, from the four different FI-SL record types.
108. FI-SL Extractors After generating the FI-SL DataSource, the rest of the steps are the same as explained earlier:
Replicate the DataSource and then assign it to the InfoSource.
Create transfer rules and update rules.
Create an InfoPackage and initialize the delta process.
109. CO-PA Extractors
110. CO-PA Extraction (Steps)
111. Step 1 - Generate CO-PA DataSource
112. Step 1 - Generate DataSource In the case of costing-based Profitability Analysis, you can include the following in the DataSource: characteristics from the segment level, characteristics from the segment table, fields for units of measure, characteristics from the line item, value fields, and calculated key figures from the key figure scheme.
In the case of account-based Profitability Analysis, on the other hand, you can only include the following in the DataSource: characteristics from the segment level, characteristics from the segment table, one unit of measure, the record currency from the line item, and the key figures.
113. Maintain CO-PA DataSource
114. CO-PA Extractors After generating the CO-PA DataSource, the rest of the steps are the same as explained earlier.
Replicate the DataSource and then assign it to the InfoSource.
Create transfer rules and update rules
Create InfoPackage and initialize the delta process
115. Generic Extractors
116. Generic Data Extraction: Steps for Transaction Data
117. Generic Data Extraction: Steps for Master Data
118. Step 1 - Create Generic DataSource
119. Create Generic DataSource
120. Generic Data Extraction
121. Generic Data Extraction
122. Generic Data Extraction
Transfer the DataSource via Tcode RSA6 (Postprocessing of DataSources) so that it is available to BW.
The steps in BW:
Replicate the DataSources.
Create the necessary InfoObjects.
Generate an InfoSource in BW, and maintain the communication structure and transfer rules.
Create an InfoPackage and schedule it.
123. Flat File Extraction
124. Flat File Extraction
125. Step 1 – Identify the Data
126. Step 2 - Create a Source System
In transaction RSA1, choose Modelling → Source Systems → Context Menu → Create, and select the source system type File System.
127. Flat File Extraction After creating the Source System, create the InfoObjects in the BW system.
Create an InfoSource, assign the flat file Source System to it, and maintain the transfer rules.
Create an ODS and maintain update rules.
Now create an InfoPackage and load the data.
128. Hints while loading flat file
129. Hints while loading flat file Make sure the delimiter in the InfoPackage is specified as ';'.
You might get an error on alpha conversion. Keep in mind the option in the transfer rules to convert the data from the external format into the internal format.
130. Hints while loading flat file The calendar day should be in the format yyyymmdd (SAP requires this format).
Sometimes you might get an error saying the system cannot open the file. Try keeping the file in My Documents or on the C: drive.
Use the preview option in the InfoPackage before loading the file.
Set the number of header rows to be ignored to 1. This ignores the heading row in the flat file.
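The flat-file layout these hints describe can be sketched in ABAP; the field values, their order, and the variable names below are purely illustrative:

```abap
* Sketch of one flat-file record as BW expects it (illustrative values):
* fields separated by the ';' delimiter set in the InfoPackage, and the
* calendar day in the internal YYYYMMDD format.
DATA: lv_line  TYPE string VALUE '90001234;4711;Pump;2500.00;EUR;20090315',
      lv_docnr TYPE string,
      lv_order TYPE string,
      lv_matl  TYPE string,
      lv_cost  TYPE string,
      lv_curr  TYPE string,
      lv_date  TYPE string.

* Splitting at ';' yields the individual field values
SPLIT lv_line AT ';' INTO lv_docnr lv_order lv_matl lv_cost lv_curr lv_date.

* lv_date now holds '20090315', i.e. yyyymmdd as SAP requires.
```

A file built this way loads cleanly once the InfoPackage delimiter is ';' and the header-rows-to-ignore setting matches the file.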
131. DB Connect, UD Connect and SOAP Based
132. Steps involved for DB Connect
133. Steps involved for DB Connect Create a DB Connect Source System
Create a DataSource
Create InfoSource and assign the Source System created to it.
Create InfoPackage and load the data.
137. DataSource Enhancement
138. Datasource Enhancement via Customer Exit The customer exits for enhancing transaction data, master data attribute, text, and hierarchy DataSources are:
EXIT_SAPLRSAP_001 : Transaction data DataSource
EXIT_SAPLRSAP_002 : Master data DataSource
EXIT_SAPLRSAP_003 : Text DataSource
EXIT_SAPLRSAP_004 : Hierarchy DataSource
139. Datasource enhancement Steps involved
140. Step 1 - Check for availability in Extract Structure Applicable for DataSources where the required field is available in the extract structure but not included by default.
Prerequisite: the delta queue and the setup tables must be empty.
141. Step 2 - Enhance Extract Structure Find the name of the extract structure for the DataSource.
Custom fields must be created in the 'YY' or 'ZZ' namespace.
Activate the append structure.
For currency/quantity fields, the reference field must also be present in the extract structure.
142. Step 3 - Unhide the added field
Note that 'Field Only' is checked.
In case the field is required for selection, check 'Selection'.
143. Step 4 - Write the logic Go to CMOD
Enter Project name and select component
Select appropriate Customer Exit
144. Write the Logic Write the ABAP code for populating the custom field
Test Enhancement in RSA3
Replicate DataSource in BW
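As a sketch, the exit logic for a transaction-data DataSource is typically written in include ZXRSAU01, which is called from EXIT_SAPLRSAP_001. The DataSource 2LIS_11_VAITM, its extract structure MC11VA0ITM, and the custom field ZZREGION below are illustrative assumptions, not part of this course's exercise:

```abap
* Hypothetical sketch: populate an appended custom field ZZREGION
* in the extract structure of 2LIS_11_VAITM (include ZXRSAU01,
* called from EXIT_SAPLRSAP_001 for transaction data).
DATA: l_s_item TYPE mc11va0itm.   " assumed extract structure

CASE i_datasource.
  WHEN '2LIS_11_VAITM'.
    LOOP AT c_t_data INTO l_s_item.
      " Example enrichment: look up the customer's region
      SELECT SINGLE regio FROM kna1 INTO l_s_item-zzregion
        WHERE kunnr = l_s_item-kunnr.
      MODIFY c_t_data FROM l_s_item INDEX sy-tabix.
    ENDLOOP.
ENDCASE.
```

After activating the code, the enhancement can be tested in RSA3 as noted above.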
145. Data Extraction
146. Flat File Extraction Create the InfoObjects document number, order number, material name, cost, currency, and order creation date in the BW system.
Create a flat file containing data for above InfoObjects
Save the file in the CSV format.
Now create a flat file source system and InfoSource and assign the Source System to the InfoSource.
Create an ODS with document number as key field and include the above InfoObjects.
Create the update rules from the InfoSource you have created.
Create an InfoPackage and load the file.
147. BI Content Extraction In the Source system go to transaction RSA5, activate the DataSource 2LIS_11_VAITM.
Go to BI Content, select the grouping 'Data Flow Before', and install the InfoSource 2LIS_11_VAITM.
Now replicate the DataSource in BW, assign it to the InfoSource, and maintain the transfer rules.
Create an ODS and update rules
Create the InfoPackage and load the ODS
148. LO Cockpit Go to transaction SBIW. Check out the various options to delete and fill the setup tables and to simulate the batch jobs.
Go to transaction LBWE and check the options to maintain the DataSources, activate them, and simulate the V3 update.
149. Transactions
150. Transactions
152. Tips & Tricks Before starting an SAP BW project, analyze the reporting requirements against the standard DataSources available.
Zero-in on the standard DataSources satisfying the requirement.
If the data cannot be supplied by the standard DataSources, we can:
- Create a generic DataSource.
- Enhance the standard DataSource.
153. Some Useful sites www.sdn.sap.com
www.help.sap.com
www.service.sap.com