Technique/SAP BW | 2012. 8. 3. 11:11

DSO Comparison

Posted by Rahul Nair in Rahul Rajagopalan Nair's Blog on Aug 2, 2012 10:26:20 PM

Usage


  • This tool can be used to compare DSO structures across different systems, e.g. compare a DSO's structure in Dev vs. QA/Prod.
  • It is an essential pre-check before starting any development.
  • It helps prevent possible go-live sync issues.
  • It supports production sync checks in parallel-development environments.

 

 

User Interface

 

1. Compare DSO between Dev & Prod

 

Specify the DSO names along with the RFC connection. F4 help is available for choosing the RFC connection.

 

01_01_Remote_System_IP.jpg

 

Login to the remote system:

 

01_02_Remote System_Login.jpg

 

  • The output displays the list of InfoObjects and indicates whether each one is present or absent in the DSOs.
  • The list also shows the position of each InfoObject in the DSO structure.
  • It also shows whether the InfoObject is a key field.

 

01_03_Remote_System_OP_InfoObj.jpg


2. Compare two different DSOs in the same system.

 

 

Specify the DSO names.

 

02_01_Same_System_IP.jpg

 

Output:

 

Info objects missing in DSO 1:

 

02_03_Same_System_OP_InfoObj.jpg

 

 

Code

 

Copy and paste this code into an executable program.

 

ABAP Code

*&---------------------------------------------------------------------*
*& Report  ZBW_DSO_CHECK_V2
*&---------------------------------------------------------------------*
*& Author : Rahul Rajagopalan Nair
*& Date   : 03 Aug 2012
*&---------------------------------------------------------------------*

REPORT  ZBW_DSO_CHECK_V2.

"SLIS for the ALV field catalog/sort structures, ICON for the status icons
TYPE-POOLS: SLIS, ICON.

 

 

"Comparison output line: one row per InfoObject, with source (S_*) and target (T_*) details
TYPES :
  BEGIN OF TY_DSO_COMP,
    STATUS     TYPE CHAR4,
    IOBJNM     TYPE RSDODSOIOBJ-IOBJNM,
    S_POSIT    TYPE RSDODSOIOBJ-POSIT,
    S_IOBJNM   TYPE RSDODSOIOBJ-IOBJNM,
    S_KEYFLAG  TYPE RSDODSOIOBJ-KEYFLAG,
    T_POSIT    TYPE RSDODSOIOBJ-POSIT,
    T_IOBJNM   TYPE RSDODSOIOBJ-IOBJNM,
    T_KEYFLAG  TYPE RSDODSOIOBJ-KEYFLAG,
  END OF TY_DSO_COMP.

DATA :
  IT_S_DSO TYPE STANDARD TABLE OF RSDODSOIOBJ,
  IT_T_DSO TYPE STANDARD TABLE OF RSDODSOIOBJ.

DATA :
  WA_S_DSO TYPE RSDODSOIOBJ,
  WA_T_DSO TYPE RSDODSOIOBJ.

DATA :
  LV_S_LINE TYPE N,
  LV_T_LINE TYPE N.

DATA :
  CV_METADATA_TAB TYPE DD02L-TABNAME VALUE 'RSDODSOIOBJ'.

DATA :
  IT_OPTIONS TYPE STANDARD TABLE OF RFC_DB_OPT. "72 CHAR per line

DATA :
  WA_OPTIONS TYPE RFC_DB_OPT. "72 CHAR per line

DATA :
  IT_DSO_COMP TYPE STANDARD TABLE OF TY_DSO_COMP.

DATA :
  WA_DSO_COMP TYPE TY_DSO_COMP.

FIELD-SYMBOLS :
  <FS_DSO_COMP> TYPE TY_DSO_COMP.

DATA :
  IT_FIELDCAT TYPE SLIS_T_FIELDCAT_ALV,
  WA_FIELDCAT TYPE SLIS_FIELDCAT_ALV.    "FIELD CATALOG

DATA :
  IT_SORT TYPE SLIS_T_SORTINFO_ALV,
  WA_SORT TYPE SLIS_SORTINFO_ALV.

DATA :
  LV_ERR_MSG  TYPE STRING,
  CV_SRC      TYPE STRING VALUE 'Source DSO :',
  CV_TGT      TYPE STRING VALUE 'Target DSO :'.

 

 

*$***********************************************************************
*$*$  SELECTION SCREEN FOR SOURCE DSO
*$*$*********************************************************************

SELECTION-SCREEN BEGIN OF BLOCK ssb1 WITH FRAME TITLE TEXT-001.

PARAMETERS :
  P_S_DSO TYPE RSDODSOIOBJ-ODSOBJECT DEFAULT 'DSONAME' OBLIGATORY,
  P_S_VER TYPE C LENGTH 1 DEFAULT 'A' OBLIGATORY,
  P_S_SYS(32) TYPE C OBLIGATORY
    LOWER CASE MATCHCODE OBJECT F4_RFCDESTYPEALL
    DEFAULT 'DEVRFC'.

SELECTION-SCREEN END OF BLOCK ssb1.

*$***********************************************************************
*$*$  SELECTION SCREEN FOR TARGET DSO
*$*$*********************************************************************

SELECTION-SCREEN BEGIN OF BLOCK ssb2 WITH FRAME TITLE TEXT-002.

PARAMETERS :
  P_T_DSO TYPE RSDODSOIOBJ-ODSOBJECT DEFAULT 'DSONAME' OBLIGATORY,
  P_T_VER TYPE C LENGTH 1 DEFAULT 'A' OBLIGATORY,
  P_T_SYS(32) TYPE C OBLIGATORY
    LOWER CASE MATCHCODE OBJECT F4_RFCDESTYPEALL
    DEFAULT 'PRODRFC'.

SELECTION-SCREEN END OF BLOCK ssb2.

 

"**********************************************************

"Source System

 

CLEAR IT_OPTIONS.

CLEAR IT_OPTIONS[].

 

CLEAR WA_OPTIONS.

CONCATENATE ' ODSOBJECT = '''

             P_S_DSO

             ''''

    INTO     WA_OPTIONS-TEXT.

    "SEPARATED BY SPACE.

 

APPEND WA_OPTIONS TO IT_OPTIONS.

 

 

CLEAR WA_OPTIONS.

CONCATENATE ' AND OBJVERS = '''

             P_S_VER

             ''''

    INTO     WA_OPTIONS-TEXT.

 

APPEND WA_OPTIONS TO IT_OPTIONS.

 

 

 

"RFC_READ_TABLE

CALL FUNCTION 'RFC_READ_TABLE'  DESTINATION P_S_SYS

   EXPORTING

     QUERY_TABLE                CV_METADATA_TAB

   TABLES

     OPTIONS                    IT_OPTIONS

     DATA                       IT_S_DSO

   EXCEPTIONS

     TABLE_NOT_AVAILABLE        1

     TABLE_WITHOUT_DATA         2

     OPTION_NOT_VALID           3

     FIELD_NOT_VALID            4

     NOT_AUTHORIZED             5

     DATA_BUFFER_EXCEEDED       6

     OTHERS                     7

           .

IF SY-SUBRC <> 0.

   CASE SY-SUBRC.

     WHEN 1.

       CONCATENATE 'Table Not Available :'

                   CV_METADATA_TAB

                   'System :'

                   P_S_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 2.

       CONCATENATE 'Table Without Data :'

                   CV_METADATA_TAB

                   'System :'

                   P_S_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 3.

       CONCATENATE 'Option Not Valid. Table :'

                   CV_METADATA_TAB

                   'System :'

                   P_S_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 4.

       CONCATENATE 'Field Not Valid. Table :'

                   CV_METADATA_TAB

                   'System :'

                   P_S_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 5.

       CONCATENATE 'Not Authorized. Table :'

                   CV_METADATA_TAB

                   'System :'

                   P_S_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 6.

       CONCATENATE 'Data Buffer Exceeded. Table :'

                   CV_METADATA_TAB

                   'System :'

                   P_S_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 7.

       CONCATENATE 'RFC Connection Not Maintained for System :'

                   P_S_SYS

                   '. Or Unknown Error.'

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

   ENDCASE.

 

   MESSAGE LV_ERR_MSG TYPE 'E'.

ENDIF.

 

"Source System

"**********************************************************

 

 

 

 

"**********************************************************

"Target System

 

CLEAR IT_OPTIONS.

CLEAR IT_OPTIONS[].

 

CLEAR WA_OPTIONS.

CONCATENATE ' ODSOBJECT = '''

             P_T_DSO

             ''''

    INTO     WA_OPTIONS-TEXT.

    "SEPARATED BY SPACE.

 

APPEND WA_OPTIONS TO IT_OPTIONS.

 

 

CLEAR WA_OPTIONS.

CONCATENATE ' AND OBJVERS = '''

             P_T_VER

             ''''

    INTO     WA_OPTIONS-TEXT.

 

APPEND WA_OPTIONS TO IT_OPTIONS.

 

 

 

"RFC_READ_TABLE

CALL FUNCTION 'RFC_READ_TABLE'  DESTINATION P_T_SYS

   EXPORTING

     QUERY_TABLE                CV_METADATA_TAB

   TABLES

     OPTIONS                    IT_OPTIONS

     DATA                       IT_T_DSO

   EXCEPTIONS

     TABLE_NOT_AVAILABLE        1

     TABLE_WITHOUT_DATA         2

     OPTION_NOT_VALID           3

     FIELD_NOT_VALID            4

     NOT_AUTHORIZED             5

     DATA_BUFFER_EXCEEDED       6

     OTHERS                     7

           .

IF SY-SUBRC <> 0.

   CASE SY-SUBRC.

     WHEN 1.

       CONCATENATE 'Table Not Available :'

                   CV_METADATA_TAB

                   'System :'

                   P_T_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 2.

       CONCATENATE 'Table Without Data :'

                   CV_METADATA_TAB

                   'System :'

                   P_T_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 3.

       CONCATENATE 'Option Not Valid. Table :'

                   CV_METADATA_TAB

                   'System :'

                   P_T_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 4.

       CONCATENATE 'Field Not Valid. Table :'

                   CV_METADATA_TAB

                   'System :'

                   P_T_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 5.

       CONCATENATE 'Not Authorized. Table :'

                   CV_METADATA_TAB

                   'System :'

                   P_T_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 6.

       CONCATENATE 'Data Buffer Exceeded. Table :'

                   CV_METADATA_TAB

                   'System :'

                   P_T_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 7.

       CONCATENATE 'RFC Connection Not Maintained for System :'

                   P_T_SYS

                   '. Or Unknown Error.'

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

   ENDCASE.

 

   MESSAGE LV_ERR_MSG TYPE 'E'.

 

ENDIF.

 

"Target System

"**********************************************************

 

 

 

 

"Get all IOs

CLEAR WA_DSO_COMP.

 

LOOP AT IT_S_DSO INTO WA_S_DSO.

 

   WA_DSO_COMP-IOBJNM WA_S_DSO-IOBJNM.

 

   APPEND WA_DSO_COMP TO IT_DSO_COMP.

 

ENDLOOP.

 

CLEAR WA_DSO_COMP.

 

LOOP AT IT_T_DSO INTO WA_T_DSO.

 

   WA_DSO_COMP-IOBJNM WA_T_DSO-IOBJNM.

 

   APPEND WA_DSO_COMP TO IT_DSO_COMP.

 

ENDLOOP.

 

SORT IT_DSO_COMP BY IOBJNM.

DELETE ADJACENT DUPLICATES

   FROM        IT_DSO_COMP

   COMPARING   IOBJNM.

 

SORT IT_S_DSO BY IOBJNM ASCENDING.

DELETE ADJACENT DUPLICATES

   FROM        IT_S_DSO

   COMPARING   IOBJNM.

 

 

SORT IT_T_DSO BY IOBJNM ASCENDING.

DELETE ADJACENT DUPLICATES

   FROM        IT_T_DSO

   COMPARING   IOBJNM.

 

LOOP AT IT_DSO_COMP ASSIGNING <FS_DSO_COMP>.

 

   CLEAR WA_S_DSO.

   CLEAR WA_T_DSO.

 

   "Read Source

   READ TABLE IT_S_DSO

     INTO  WA_S_DSO

     WITH KEY IOBJNM <FS_DSO_COMP>-IOBJNM.

 

   "Assign independent of Read Sucess or failure

   <FS_DSO_COMP>-S_IOBJNM    WA_S_DSO-IOBJNM.

   <FS_DSO_COMP>-S_POSIT     WA_S_DSO-POSIT.

   <FS_DSO_COMP>-S_KEYFLAG   WA_S_DSO-KEYFLAG.

 

   "Read Target

   READ TABLE IT_T_DSO

     INTO  WA_T_DSO

     WITH KEY IOBJNM <FS_DSO_COMP>-IOBJNM.

 

   "Assign independent of Read Sucess or failure

   <FS_DSO_COMP>-T_IOBJNM    WA_T_DSO-IOBJNM.

   <FS_DSO_COMP>-T_POSIT     WA_T_DSO-POSIT.

   <FS_DSO_COMP>-T_KEYFLAG   WA_T_DSO-KEYFLAG.

 

 

   IF <FS_DSO_COMP>-S_IOBJNM <FS_DSO_COMP>-T_IOBJNM.

 

     <FS_DSO_COMP>-STATUS ICON_GREEN_LIGHT."'S'.

 

   ELSE.

 

     <FS_DSO_COMP>-STATUS ICON_RED_LIGHT."'D'.

 

   ENDIF.

 

   IF <FS_DSO_COMP>-S_KEYFLAG NE <FS_DSO_COMP>-T_KEYFLAG.

 

     <FS_DSO_COMP>-STATUS ICON_RED_LIGHT."'D'.

 

   ENDIF.

 

 

 

ENDLOOP.

 

SORT IT_DSO_COMP BY IOBJNM.

 

 

 

 

 

 

"**********************************************************

"DISPLAY

 

   WA_FIELDCAT-COL_POS 1.

   WA_FIELDCAT-KEY 'X'.

   WA_FIELDCAT-FIELDNAME 'STATUS'.

   WA_FIELDCAT-REPTEXT_DDIC 'Status'.

   WA_FIELDCAT-ICON 'X'.

   WA_FIELDCAT-OUTPUTLEN 6.

   WA_FIELDCAT-TABNAME 'IT_DSO_COMP'.

   APPEND WA_FIELDCAT TO IT_FIELDCAT.

   CLEAR WA_FIELDCAT.

 

   WA_FIELDCAT-COL_POS 2.

   WA_FIELDCAT-KEY 'X'.

   WA_FIELDCAT-FIELDNAME 'IOBJNM'.

   WA_FIELDCAT-REPTEXT_DDIC 'Info Object'.

   WA_FIELDCAT-OUTPUTLEN 30.

   WA_FIELDCAT-TABNAME 'IT_DSO_COMP'.

   APPEND WA_FIELDCAT TO IT_FIELDCAT.

   CLEAR WA_FIELDCAT.

 

 

   WA_FIELDCAT-COL_POS 3.

   WA_FIELDCAT-FIELDNAME 'S_POSIT'.

   CONCATENATE P_S_DSO

               '- Position :'

               P_S_SYS

        INTO   WA_FIELDCAT-REPTEXT_DDIC

        SEPARATED BY SPACE.

   WA_FIELDCAT-OUTPUTLEN 8.

   WA_FIELDCAT-TABNAME 'IT_DSO_COMP'.

   APPEND WA_FIELDCAT TO IT_FIELDCAT.

   CLEAR WA_FIELDCAT.

 

 

   WA_FIELDCAT-COL_POS 4.

   WA_FIELDCAT-FIELDNAME 'S_IOBJNM'.

   CONCATENATE P_S_DSO

               '- Info Object :'

               P_S_SYS

        INTO   WA_FIELDCAT-REPTEXT_DDIC

        SEPARATED BY SPACE.

   WA_FIELDCAT-OUTPUTLEN 30.

   WA_FIELDCAT-TABNAME 'IT_DSO_COMP'.

   APPEND WA_FIELDCAT TO IT_FIELDCAT.

   CLEAR WA_FIELDCAT.

 

   WA_FIELDCAT-COL_POS 5.

   WA_FIELDCAT-FIELDNAME 'S_KEYFLAG'.

   CONCATENATE P_S_DSO

               '- Key :'

               P_S_SYS

        INTO   WA_FIELDCAT-REPTEXT_DDIC

        SEPARATED BY SPACE.

   WA_FIELDCAT-OUTPUTLEN 10.

   WA_FIELDCAT-TABNAME 'IT_DSO_COMP'.

   APPEND WA_FIELDCAT TO IT_FIELDCAT.

   CLEAR WA_FIELDCAT.

 

 

   WA_FIELDCAT-COL_POS 6.

   WA_FIELDCAT-FIELDNAME 'T_POSIT'.

   CONCATENATE P_T_DSO

               '- Position :'

               P_T_SYS

        INTO   WA_FIELDCAT-REPTEXT_DDIC

        SEPARATED BY SPACE.

   WA_FIELDCAT-OUTPUTLEN 8.

   WA_FIELDCAT-TABNAME 'IT_DSO_COMP'.

   APPEND WA_FIELDCAT TO IT_FIELDCAT.

   CLEAR WA_FIELDCAT.

 

 

   WA_FIELDCAT-COL_POS 7.

   WA_FIELDCAT-FIELDNAME 'T_IOBJNM'.

   CONCATENATE P_T_DSO

               '- Info Object :'

               P_T_SYS

        INTO   WA_FIELDCAT-REPTEXT_DDIC

        SEPARATED BY SPACE.

   WA_FIELDCAT-OUTPUTLEN 30.

   WA_FIELDCAT-TABNAME 'IT_DSO_COMP'.

   APPEND WA_FIELDCAT TO IT_FIELDCAT.

   CLEAR WA_FIELDCAT.

 

   WA_FIELDCAT-COL_POS 8.

   WA_FIELDCAT-FIELDNAME 'T_KEYFLAG'.

   CONCATENATE P_T_DSO

               '- Key :'

               P_T_SYS

        INTO   WA_FIELDCAT-REPTEXT_DDIC

        SEPARATED BY SPACE.

   WA_FIELDCAT-OUTPUTLEN 10.

   WA_FIELDCAT-TABNAME 'IT_DSO_COMP'.

   APPEND WA_FIELDCAT TO IT_FIELDCAT.

   CLEAR WA_FIELDCAT.

 

   WA_SORT-SPOS 1.

   WA_SORT-FIELDNAME 'S_POSIT'.

   WA_SORT-TABNAME 'IT_DSO_COMP'.

   WA_SORT-UP 'X'.

   APPEND WA_SORT TO IT_SORT.

   CLEAR WA_SORT.

 

 

 

CALL FUNCTION 'REUSE_ALV_GRID_DISPLAY'

   EXPORTING

     I_CALLBACK_PROGRAM                SY-REPID

     I_GRID_TITLE                      'DSO Info Object Assignment Comparision'

     IT_FIELDCAT                       IT_FIELDCAT

     IT_SORT                           IT_SORT

   TABLES

     T_OUTTAB                          IT_DSO_COMP

   EXCEPTIONS

     PROGRAM_ERROR                     1

     OTHERS                            2

           .

IF SY-SUBRC <> 0.

   CASE SY-SUBRC.

     WHEN 1.

       LV_ERR_MSG 'Program error during list generation.'.

     WHEN 2.

       LV_ERR_MSG 'Unknown error during list generation.'.

   ENDCASE.

   MESSAGE LV_ERR_MSG TYPE 'E'.

ENDIF.

 

"DISPLAY

"**********************************************************

 

 

Text Elements

Selection Texts

P_S_DSO   DSO
P_S_SYS   System
P_S_VER   Version
P_T_DSO   DSO
P_T_SYS   System
P_T_VER   Version

Text Symbols

001   Source DSO :
002   Target DSO :

 

 

Pre-Requisites

 

  • An RFC connection must be available between the systems.
  • Check with your Basis team for details regarding the RFC connection (a quick connectivity check is sketched below).
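
If you want to verify a destination before running the report, a minimal connectivity check along the following lines can help. This is only a sketch and not part of the original tool; the destination name 'PRODRFC' is simply the selection-screen default used as an example.

* Minimal sketch: verify that an RFC destination is reachable (not part of the tool above).
DATA lv_dest TYPE rfcdest VALUE 'PRODRFC'.   "assumption: destination maintained in SM59

CALL FUNCTION 'RFC_PING' DESTINATION lv_dest
  EXCEPTIONS
    communication_failure = 1
    system_failure        = 2
    OTHERS                = 3.

IF sy-subrc = 0.
  WRITE: / 'Destination', lv_dest, 'is reachable.'.
ELSE.
  WRITE: / 'Destination', lv_dest, 'is not reachable - check SM59.'.
ENDIF.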

 

 

Acknowledgments

 

I would like to thank all my teammates and managers at Infosys for their constant support and encouragement.






http://scn.sap.com/people/rahulrajagopalan.nair/blog/2012/08/02/dso-comparison?utm_source=twitterfeed&utm_medium=twitter

Technique/SAP BW | 2012. 5. 23. 17:36

SAP BW 7.3 Hybrid Provider

 

A Hybrid Provider consists of a DataStore object (DSO) and an InfoCube, with an automatically generated data flow in between.

• It combines historic data with the latest delta information when a query based on it is executed.

• The DSO can be connected to a real-time data acquisition (RDA) DataSource/DTP.

• If the DataSource can provide appropriate delta information in direct access mode, a Virtual Provider can be used instead of the DSO.

There are two types of Hybrid Providers:

  1. Hybrid Providers based on direct access
  2. Hybrid Providers based on a DataStore object

 

Hybrid Providers based on Direct Access

 

A Hybrid Provider based on direct access is a combination of a Virtual Provider and an InfoCube. The benefit of this InfoProvider is that it provides access to real-time data without actually doing any real-time data acquisition.

At query runtime, the historic data is read from the InfoCube, while the near real-time, latest up-to-date data from the source system is read using the Virtual Provider.

 

Hybrid Providers based on a DataStore object

 

The Hybrid Provider based on a DSO is a combination of a DSO and an InfoCube. Once this Hybrid Provider is created and activated, the objects used for the data flow from the DSO to the InfoCube (DTP, transformation and process chain) are created automatically.

 

Use a Hybrid Provider based on a DSO as the InfoProvider in scenarios where data needs to be loaded using real-time data acquisition. The DTP for real-time data acquisition from a real-time enabled DataSource to the DSO loads the data into the DSO in delta mode. The daemon used for real-time data acquisition immediately activates the data. When the daemon is stopped, the data is loaded from the change log of the DSO into the InfoCube. The InfoCube acts as storage for the historic data from the DSO.

 

To make data held within a DSO available for reporting in BI 7, a number of steps are needed: create the DSO, InfoCube, transformation/DTP and MultiProvider, store the data in a BWA, connect them all up, and then schedule and monitor the load jobs.

 

A Hybrid Provider takes a DSO and does all of this for you, removing substantial development and maintenance effort. Just load your data into a DSO, create a Hybrid Provider and start reporting. You can even build your Hybrid Provider on a real-time data acquisition (RDA) DataSource, which could potentially provide near real-time reporting from a BWA.

 

A typical usage scenario: you want to extract your purchase orders from R/3 and make them available for reporting. Using a Hybrid Provider, the data becomes available for reporting as soon as it is loaded into the DSO, with all the benefits of an InfoCube and BWA.

 

Real-time Data Acquisition

 

Real-time data acquisition enables you to update data in real time. As data is created in the source system, it is immediately updated in the PSA or the delta queue. Special real-time enabled InfoPackages and DTPs are used to load the data into InfoProviders.

In order to load real-time data from a source system into SAP BW, the DataSource must be real-time enabled. Most standard DataSources are real-time enabled, and a generic DataSource can also be created as real-time enabled (one way to check the flag is sketched below).
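
As a quick aside, here is a hedged sketch of how one might check in the source system whether a DataSource is flagged as real-time enabled. The table ROOSOURCE and its REALTIME field, as well as the DataSource name ZMY_DATASOURCE, are assumptions to verify in your release; this check is not part of the steps that follow.

* Sketch only: check the real-time flag of a DataSource in the source system.
* Assumptions: table ROOSOURCE with field REALTIME; 'ZMY_DATASOURCE' is a placeholder name.
DATA lv_realtime TYPE c LENGTH 1.

SELECT SINGLE realtime
  FROM roosource
  INTO lv_realtime
  WHERE oltpsource = 'ZMY_DATASOURCE'
    AND objvers    = 'A'.

IF sy-subrc = 0 AND lv_realtime = 'X'.
  WRITE: / 'DataSource is real-time enabled (RDA capable).'.
ELSE.
  WRITE: / 'DataSource is not flagged as real-time enabled.'.
ENDIF.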

 

Step-by-step process of creating a Hybrid Provider:

Step 1: First, create an init InfoPackage for the DataSource and schedule it, as shown in the screenshot below.

 

Untitled1.png

 

Step 2: After creating the init InfoPackage, create an RDA InfoPackage.

 

Untitled2.png

 

Step 3: Now the DataSource is ready. We have to create a Hybrid InfoProvider combining a DSO and an InfoCube; for that, first create an InfoArea.

 

Untitled3.png

 

Step 4: Go to the Data Flow screen, which is in the left-hand panel of RSA1.

 

Untitled4.png

 

Step 5: Navigate to the InfoArea, right-click, and choose "Create Data Flow".

 

Untitled5.png

 

Step 6: Drag and drop the DataSource icon from the sidebar of the Data Flow screen, then right-click the icon and choose "Use Existing Object" to select the DataSource.

 

Untitled6.png

 

Step 7: In the Data Flow panel, place the cursor on the DataSource, right-click and choose "Show Direct Data Flow Before". This automatically shows the relevant InfoPackages for the DataSource.

 

Untitled7.png

 

Step 8: Now remove the init InfoPackage from the data flow; the flow then looks as shown below.

 

Untitled8.png

 

Step 9: Now drag and drop a DSO from the side menu, right-click and choose "Create". Create a new DSO, assign the data and key fields, then save and activate it.

 

Step 10: Drag and drop the Hybrid Provider from the sidebar, right-click and choose "Create". Create a new Hybrid Provider based on a DSO; the technical name of the provider here is HYPD. Assign the previously created DSO to this Hybrid Provider.

 

Untitled9.png

 

While creating the Hybrid Provider, the system shows a warning indicating that the DSO can no longer be used as a standalone DSO; it will behave only as part of the Hybrid Provider. The data fields and key fields of the DSO are automatically included in the Hybrid Provider.

 

Step 11: Once created, a system-generated InfoCube appears under the Hybrid Provider. Note that the Hybrid Provider and the InfoCube have the same description as the DSO; however, we have the flexibility to give the Hybrid Provider a new name while creating it.

 

 

Untitled10.png

 

 

Step 12: Click the "complete data flow" icon, as shown below, so that the system creates the DTP and transformation automatically for the data flow, then activate the flow.

 

Untitled12.png

 

Step 13: Once the transformation and DTP are active, assign the RDA InfoPackage and the RDA DTP to an RDA daemon. Right-click the RDA InfoPackage and select "Assign RDA Daemon"; this navigates to the RDA monitor. Create a daemon using the create button in the top-left corner and then assign both of them to the daemon.

 

Untitled13.png

 

Step 14: Create the RDA daemon. In the daemon settings, specify the technical number and a short description; the period defines the interval after which the daemon repeats its execution.

 

Untitled14.png

 

We can see that both the Infopackage and DTP are listed under the RDA daemon.

 

 

Step 15: Now drill down to the InfoCube in the tree and click the DTP, then click "Process Chain Maintenance". This opens a system-generated process chain that contains the DTP from the DSO to the cube.

 

Untitled15.png

 

Step 16: Below is the process chain automatically created by the system.

 

Untitled16.png

 

Step 17: Go to transaction RSRDA (the RDA monitor). Run the daemon, and the data is updated in real time from the source system into the DSO.

 

Untitled17.png

 

 

Untitled18.png

 

 

The new data updated into the DSO is then updated into the InfoCube after this process chain has run.

Below is the successfully completed process chain.

 

Untitled19.png

 

 

In this way we can update real-time data from the source system to the BW system. Real-time data updating works similarly to delta functionality: whenever users create new data in the source system, it is automatically updated in the BW target system.





http://scn.sap.com/community/data-warehousing/netweaver-bw/blog/2012/05/23/sap-bw-73-hybrid-provider?utm_source=twitterfeed&utm_medium=twitter

Technique/SAP HANA | 2012. 5. 11. 15:38

BWA, Exalytics and SAP-HANA




INTRODUCTION

 

Ever since SAP-HANA was announced a couple of years back, I've been following the discussions and developments around the in-memory database space. In October 2011, Oracle CEO Larry Ellison introduced Oracle Exalytics to compete with SAP-HANA. After reading white papers on both SAP-HANA and Oracle Exalytics, it was obvious they were different: comparing SAP-HANA and Oracle Exalytics is like comparing apples to oranges.

 

On May 8, 2012 I tweeted:

 

SAP Mentor David Hull responded:

 

I was a bit surprised to learn that most don't have a clue about the difference between Exalytics and SAP-HANA. The difference looked obvious to me, so I realized either I was missing something or they were. That is why I decided to write this blog. And since this blog compares SAP products with Oracle products, I've decided to use the term Oracle DB instead of the generic term RDBMS.

 

First, I'll discuss the similarities between SAP-BW, Oracle Exalytics and SAP-HANA. At a very high level, they look similar, as shown in the picture below:

Similarity.png

 

As shown, the BW application sits on top of a database, Oracle or SAP-HANA, and the application helps the user find the right data. The similarity ends there.

 

Let us now review how Oracle Exalytics compares with SAP-BW with the Business Warehouse Accelerator (BWA). As you can see below, there appears to be a one-to-one match between the components of SAP-BW and Exalytics.

 

 

New_BWA_EXA.png

Steps    | SAP-BW                                            | Exalytics                                                  | Comments
1 and 1a | Data found in BWA and returned to the user        | Data found in Adaptive Data Mart and returned to the user  |
2 and 2a | Data found in OLAP cache and returned to the user | Data found in Intelligent Cache and returned to the user   | This means data was not found in BWA or the Adaptive Data Mart
3 and 3a | Data found in aggregates and returned to the user | Data found in Essbase cubes and returned to the user       | This means data was not found in 1) the Adaptive Data Mart or BWA and 2) the OLAP cache or Intelligent Cache
4 and 4a | Data found in cubes and returned to the user      |                                                            | Not sure if Essbase supports aggregates; however, Oracle supports materialized views, which I assume are similar to SAP-BW's aggregates.

 

 

 

The diagram below shows why the Exalytics vs. SAP-HANA comparison is an apples-to-oranges comparison. In Exalytics, the information users need gets pre-created at a certain level of granularity. One of the best practices in the BW/DW world is to create aggregates up front to get acceptable response times.

 

In SAP-HANA, however, aggregates are created on the fly: data in SAP-HANA resides in raw form, and depending on what users need, the application performs the aggregation at runtime and displays the information on the user's screen. This helps users perform analysis more quickly and in near real time.

 

New_EXA_HANA.png

Based on the diagrams shown above, Exalytics, it seems, is comparable to SAP's six-year-old BWA technology.

 

SUMMARY

 

Based on the discussion above, the diagram below compares all three products: SAP-BW with BWA, Exalytics and SAP-HANA.

 

New_BWA_EXA_HANA.png

 

Note: I didn't connect the disk to the HANA DB because it is used primarily for persistence.

 

I wanted to keep this blog simple, so I didn't include a lot of details. Depending on your questions and thoughts, I'm planning either to update this blog or to write a new one.







http://scn.sap.com/community/in-memory-business-data-management/blog/2012/05/11/bwa-exalytics-and-sap-hana?utm_source=twitterfeed&utm_medium=twitter



Technique/SAP BW | 2012. 5. 10. 21:21

Usage of BW7.3 Transformation Rule Type “Read from DataStore”



It is quite a common requirement to load data from point A to point B in BW while performing a lookup on a DSO to get a bunch of fields from there.
This is usually implemented as follows: a SELECT statement in the transformation start routine picks up data from the DSO and fills an internal table, and an end routine (or field-level routines) populates the target fields by reading that internal table; a sketch of this classic pattern follows.
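
For reference, here is a minimal sketch of that classic start-routine/end-routine pattern. The lookup DSO's active table /BIC/AZLOOKUP00, the field names and the generated types (SOURCE_PACKAGE, RESULT_PACKAGE, _ty_s_TG_1) are illustrative assumptions, not taken from the scenario described below.

* Global part of the transformation: buffer table shared by start and end routine.
TYPES: BEGIN OF ty_lookup,
         c_ctr_area TYPE /bi0/oic_ctr_area,
         debitor    TYPE /bi0/oidebitor,
         cred_limit TYPE /bi0/oicred_limit,
         currency   TYPE /bi0/oicurrency,
       END OF ty_lookup.
DATA gt_lookup TYPE SORTED TABLE OF ty_lookup
               WITH UNIQUE KEY c_ctr_area debitor.

* Start routine: read the lookup DSO once per data package.
IF source_package IS NOT INITIAL.
  SELECT c_ctr_area debitor cred_limit currency
    FROM /bic/azlookup00                      "active table of the hypothetical lookup DSO
    INTO TABLE gt_lookup
    FOR ALL ENTRIES IN source_package
    WHERE c_ctr_area = source_package-c_ctr_area
      AND debitor    = source_package-debitor.
ENDIF.

* End routine: fill the target fields from the buffered table.
DATA ls_lookup TYPE ty_lookup.
FIELD-SYMBOLS <result> TYPE _ty_s_tg_1.
LOOP AT result_package ASSIGNING <result>.
  READ TABLE gt_lookup INTO ls_lookup
       WITH TABLE KEY c_ctr_area = <result>-c_ctr_area
                      debitor    = <result>-debitor.
  IF sy-subrc = 0.
    <result>-cred_limit = ls_lookup-cred_limit.
    <result>-currency   = ls_lookup-currency.
  ENDIF.
ENDLOOP.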


In keeping with the general BW 7.3 theme of automating common scenarios, a new transformation rule type has been introduced to do this automatically.
To take this new feature out for a spin, I created a DSO loosely based on the 0FIAR_O03 DSO. My DSO had the key fields Company Code, Customer (0DEBITOR), Fiscal Period, Fiscal Variant, Accounting Document Number, Item Number and Due Date Item Number. It also had the data fields Credit Control Area, Debit/Credit Amount, Local Currency, Credit Limit and Currency.


I created a flat-file DataSource, which did not contain any fields for Credit Limit and Currency. The objective was to derive these two fields in the transformation from the Credit Management Control Area Data DSO (0FIAR_O09). To begin with, this is what the transformation from the DataSource to the custom DSO looked like.

Tr1.png

To perform the lookup, the key fields of the lookup DSO first have to be identified. The key fields of the 0FIAR_O09 DSO are Credit Control Area and Customer Number (0C_CTR_AREA and 0DEBITOR); the lookup logic will search the 0FIAR_O09 DSO based on these two fields. In order to do this, the Credit Control Area and Customer from the DataSource should be mapped to the Credit Limit key figure in the target.

The first step in the Rule Details is to specify the DSO from which the field values will be picked up, in this case 0FIAR_O09. Next, the "IOAssgnmnt" column must be filled manually with the names of the InfoObjects. It is important that ALL the key fields of the lookup DSO are specified.

Tr2.png


In a nutshell, the above screen tells the system to derive the value of 0CRED_LIMIT (the target field) from the 0FIAR_O09 DSO (the lookup DSO), based on the C_CTR_AREA and DEBITOR values coming in from the source, which correspond to the 0C_CTR_AREA and 0DEBITOR InfoObjects of the lookup DSO.


The 0CURRENCY target field also needs to be similarly mapped. 

 

This is how the transformation looks after we're done. Observe the "DSO" icon which appears next to the Credit Limit and Currency in the target of the transformation.

TR3.png


Once this is done, run the DTP. The transformation will perform the lookup and populate the values. Activate the data when the load completes.
Now to verify the data: the flat file contained the following values, which were loaded to the PSA. Observe that there is no Credit Limit data in this file.

 

TR4.png

In the 0FIAR_O09 DSO, the following values were present.

tr5.png

After the load, this is how the data in the DSO looks.

 

tr6.png
As the screenshot shows, the transformation rule has correctly picked up the Credit Limit from the 0FIAR_O09 DSO.


A few caveats are in order on this feature.

  • All the key fields of the lookup DSO should be specified. If a partial key is specified (for instance, if we had mapped only 0DEBITOR in the source fields of the transformation rule) the system will assign the value from the first record it finds in the lookup DSO
  • The InfoObject Assignment for the source fields should have exactly the same names as the corresponding InfoObjects in the lookup DSO. If the InfoObject in the lookup DSO was 0CRED_LIMIT and the target InfoObject of the transformation rule was 0VALUE_LC, this technique cannot be used as the InfoObjects differ
  • The target InfoObject will be filled from the value of the InfoObject having the same name in the lookup DSO. In other words, 0CRED_LIMIT is filled up based on the value of 0CRED_LIMIT in 0FIAR_O09. If 0CRED_LIMIT did not exist in the lookup DSO, the system will throw an error during transformation activation

 

Essentially, this feature is most useful if you have simple lookups, for instance getting field X from DSO Y based on lookup field Z and writing it out to field X of the target. However, it may not be the best solution if you have more complex requirements which involve:

  • Pulling multiple records from the lookup DSO and getting the first or the last found record in the set
  • A lookup DSO in which the field you want has a different name




Technique/SAP BW | 2011. 12. 8. 09:05

All You Need to Know about HybridProvider in BW 7.30

     Rakesh Kalyankar    Article     (PDF 1 MB)     14 November 2011

Overview

The paper provides a detailed description of the following aspects of HybridProviders: purpose, use cases, metadata, modeling, usage and technical details.


Technique/SAP BW | 2011. 11. 29. 13:34

Passing query parameters in background to a BEx report deployed in Enterprise portal
Sriramvijay.R (L&T Infotech)
Posted on Nov. 27, 2011 06:49 AM in Enterprise Data Warehousing/Business Warehouse, Enterprise Portal (EP)

 
 

This blog describes a workaround to pass query parameters in the background to a BEx report deployed in the Enterprise Portal.

To explain this workaround simply, the scenario below is taken as an example.

Consider a scenario with many customers (InfoObject 0SOLD_TO). A customer logs into the Enterprise Portal and runs their account statement report. The customer's login ID (which should be the same as the customer key) is passed automatically as a filter value to 0SOLD_TO in the background. This prevents a customer from accessing other customers' data.

The following are step-by-step instructions for this workaround.

Create the necessary report in BW with 0SOLD_TO in the free characteristics.

Create a Web template using the Web Application Designer. There are standard Web templates such as 0ANALYSIS_PATTERN. Remove unnecessary buttons such as Filter, Navigation Pane and New Analysis from that template and save it as a new template named 0ANALYSIS_PATTERN_1.

Next, create an iView of type "BEx Web Application iView" in the Enterprise Portal.

In the iView, enter the query technical name and the Web template technical name as shown in the screenshot below.

 

To restrict the report in the background, we need a string like the one below.

“BI_COMMAND=&BI_COMMAND-BI_COMMAND_TYPE=SET_SELECTION_STATE&BI_COMMAND-TARGET_DATA_PROVIDER_REF_LIST-TARGET_DATA_PROVIDER_REF=DP_1&BI_COMMAND-CHARACTERISTICS_SELECTIONS=&BI_COMMAND-CHARACTERISTICS_SELECTIONS-CHARACTERISTIC_SELECTIONS_1=&BI_COMMAND-CHARACTERISTICS_SELECTIONS-CHARACTERISTIC_SELECTIONS_1-SELECTIONS=&BI_COMMAND-CHARACTERISTICS_SELECTIONS-CHARACTERISTIC_SELECTIONS_1-SELECTIONS-ELECTION_1=SELECTION_INPUT_STRING&BI_COMMAND-CHARACTERISTICS_SELECTIONS-CHARACTERISTIC_SELECTIONS_1-SELECTIONS-SELECTION_1-SELECTION_INPUT_STRING=05A05325&BI_COMMAND-CHARACTERISTICS_SELECTIONS-CHARACTERISTIC_SELECTIONS_1CHARACTERISTIC=0SOLD_TO&PAGE_ID=1_elZDdTRmeUY4TUtTc21FQm1GUHhjQS0tMFlHQ01IaTZuczd1bWRTQVZsUks5QS0t&REQUEST_ID=7”

Whenever we restrict a query with a parameter, this kind of string is passed in the background. There is a workaround to get this string; for that, you need the following:

 (i)  A Web template with the filter option, so that a restriction can be applied on the characteristic and the string can be captured.

 (ii) A Web debugging proxy, which logs all HTTP(S) traffic between your computer and the Internet. This will capture the required string in text view.

Change the template name in the iView to 0ANALYSIS_PATTERN and execute. Apply a restriction to the query as shown below.

 

For this purpose only 0SOLD_TO was kept in the free characteristics. After getting the restricted output, go to the Web debugging proxy; there you get the string.

Once the string is captured, it can be modified to meet our requirement. The user's login ID can be obtained from the command <User.LogonUid>. Replace the restriction parameter value with the command as shown below and remove the PAGE_ID at the end of the string. The resulting string will look like this:


“BI_COMMAND=&BI_COMMAND-BI_COMMAND_TYPE=SET_SELECTION_STATE&BI_COMMAND-TARGET_DATA_PROVIDER_REF_LIST-TARGET_DATA_PROVIDER_REF=DP_1&BI_COMMAND-CHARACTERISTICS_SELECTIONS=&BI_COMMAND-CHARACTERISTICS_SELECTIONS-CHARACTERISTIC_SELECTIONS_1=&BI_COMMAND-CHARACTERISTICS_SELECTIONS-CHARACTERISTIC_SELECTIONS_1-SELECTIONS=&BI_COMMAND-CHARACTERISTICS_SELECTIONS-CHARACTERISTIC_SELECTIONS_1-SELECTIONS-ELECTION_1=SELECTION_INPUT_STRING&BI_COMMAND-CHARACTERISTICS_SELECTIONS-CHARACTERISTIC_SELECTIONS_1-SELECTIONS-SELECTION_1-SELECTION_INPUT_STRING=<User.LogonUid>&BI_COMMAND-CHARACTERISTICS_SELECTIONS-CHARACTERISTIC_SELECTIONS_1CHARACTERISTIC=0SOLD_TO"

This modified string should be added to the iView Application Parameters, as in the screenshot below.

The iView HTTP Request Method should be set to POST, as in the screenshot below.

Then change the Web template name in the iView back to 0ANALYSIS_PATTERN_1.

Now, when a customer opens the report, it is restricted to their own account only.

 

 

Sriramvijay.R   is a BI/BO Consultant working in L&T Infotech, India


http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/27503%3Futm_source%3Dtwitterfeed%26utm_medium%3Dtwitter%26utm_campaign%3DFeed%253A+SAPNetworkWeblogs+%2528SAP+Network+Weblogs%2529
Technique/SAP BW | 2011. 11. 29. 13:32

Time desync of data loading from data source - 0CSM_CASE_ATTR
Hema Lakshmi, SAP Employee (SAP Labs)
Posted on Nov. 28, 2011 03:35 AM in Enterprise Data Warehousing/Business Warehouse, ABAP, Public Sector, SAP NetWeaver Platform

ISSUE:

A time zone difference causes incorrect data loading from the DataSource 0CSM_CASE_ATTR.

SOLUTION:

The case extractor 0CSM_CASE_ATTR is used by many customers successfully, with no issues in data extraction.

In the extractor 0CSM_CASE_ATTR the default time zone is UTC. This is the architectural design: all time fields in the Basis Case Management application are in UTC format. The tables SCMG_T_CASE and SCMG_T_CASE_ATTR hold all time fields in UTC format during save, retrieval or any other operation.

In the extractor 0CSM_CASE_ATTR we use the ABAP CONVERT statement, which converts the timestamp from the local time zone to UTC format. No other format is supported, because it would cause inconsistencies with the database.
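
For illustration only, here is a minimal sketch of how ABAP converts a local date/time into a UTC-based timestamp with CONVERT; the extractor's actual code is not shown in this post, so treating CONVERT DATE ... INTO TIME STAMP as the statement referred to is an assumption.

* Sketch: interpret a local date/time in the user's time zone and produce a UTC-based timestamp.
DATA lv_ts TYPE timestamp.

CONVERT DATE sy-datum TIME sy-uzeit
        INTO TIME STAMP lv_ts TIME ZONE sy-zonlo.

* lv_ts now holds the point in time in the UTC-based TIME STAMP representation.
WRITE: / 'UTC timestamp:', lv_ts.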

Initially, in a similar issue reported by another customer, we first explored the idea of increasing the time range interval in transaction RSA2 directly in the customer system. But this range has been set from the beginning and has worked fine for all customers so far, so the issue would not be resolved by this action.

The notes below explain the time zone issues:
198411  - Current data and information about time zones
741734  - Incorrect times due to the time zone settings
481835  - Analyzing the time zone settings
101726  - Incorrect times and time zone setting
15913    - Timezone problems (local vs. global time)

Refer to the notes below, which address daylight saving issues:
126692  - Daylight saving time rules are incorrect
102088  - Reducing downtime when changing from summer to winter
7417      - Conversion between winter time and daylight saving

Go through each of the notes in detail (most have manual instructions to be performed, and nothing should be skipped as that could result in inconsistencies).

Additionally, if the issue still persists after applying all the above notes, the following notes may be useful:
<1> For issues on CRM cases refer to note ~
       1470994  - 0CRM_CASE_ATTR: additional fields
<2> For issues on Basis Case extraction refer to notes ~
       1281000  - retrieval of data from case extractor taking a lot of time
       1159696  - Not able to get the data using case type from case
       996360    - Case Extractor not able to pull more than 1 data package.

Hema Lakshmi is an expert in SAP NetWeaver Folders Management.


http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/27514%3Futm_source%3Dtwitterfeed%26utm_medium%3Dtwitter%26utm_campaign%3DFeed%253A+SAPNetworkWeblogs+%2528SAP+Network+Weblogs%2529

Technique/SAP BW | 2011. 8. 8. 14:01

Execute a BW query by excluding values from another BW query.
Bhushan Suryavanshi (Bombardier Aerospace)
Posted on Aug. 07, 2011 02:12 AM in ABAP, Analytics, Enterprise Data Warehousing/Business Warehouse

 
 

Motivation

In SAP BW, there is usually no easy way of performing set operations (union, intersection, outer joins, etc.) on two queries; this is usually a manual activity. Basically, post-processing using two query results is difficult. InfoSets can be used in some scenarios, but they need additional modelling steps and are not as flexible. This blog illustrates how we can fetch values from one query and then exclude these selected values from another query.

Business Example

Using the technical content, you want to find all InfoCubes that exist in the system but from which no data has been fetched (via queries, say) over the last two years.

Let X = {A1, A2, .. AN} be a query that delivers all the cubes in the system.

Let Y = {A1, A2, A3} be the query that delivers all the cubes in use over the last two years i.e. some data was fetched from them.

You want to find Z = X - Y = {A4, A5 .. AN}.

Steps

1) In query X, create a customer exit variable of type Selection option on characteristic A.

 

2) Write the customer exit code in function module EXIT_SAPLRRS0_001, include ZXRSRU01. We calculate the variable values at i_step = 1. (How customer exit variables are processed is out of scope of this blog.) A skeleton of the exit is sketched below; the code fragments from steps 3) to 7) go inside it.
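
A minimal sketch of the surrounding exit skeleton, assuming a customer exit variable named ZEXCL_CUBE (the variable name is hypothetical; the snippets from steps 3) to 7) below would replace the comment):

* Include ZXRSRU01 (EXIT_SAPLRRS0_001) - skeleton only.
CASE i_vnam.
  WHEN 'ZEXCL_CUBE'.                 "hypothetical variable created in step 1)
    IF i_step = 1.                   "processed before the variable screen
      " ... execute query Y (steps 3 to 7) and APPEND the excluding
      " ... ranges to e_t_range here.
    ENDIF.
ENDCASE.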

 

3) Code sample

Initialization of the request object (every query execution is represented in BW via a request object):

DATA:
  l_r_request TYPE REF TO cl_rsr_request,
  l_s_compkey TYPE rszcompkey.

* Initialization with query Y, to be executed from within the ABAP code
l_s_compkey-objvers  = rs_c_objvers-active.
l_s_compkey-infocube = 'INFOCUBE'.
l_s_compkey-compid   = 'QUERY_Y'.              "executing query Y
l_s_compkey-comptype = rzd1_c_comptp-report.

 

4) Get the internal query ID for QUERY_Y

* Get the compkey
CALL FUNCTION 'RRI_REPDIR_READ'
  CHANGING
    c_s_repkey      = l_s_compkey
  EXCEPTIONS
    entry_not_found = 1.
IF sy-subrc <> 0.
  MESSAGE s792(brain) WITH l_s_compkey-compid.
  EXIT.
ENDIF.

 

5) Create the request object

CREATE OBJECT l_r_request
  EXPORTING
    i_genuniid = l_s_compkey-genuniid
  EXCEPTIONS
    OTHERS     = 1.

 

6) Call query Y from within the ABAP code

* get the query definition
CALL METHOD l_r_request->get_initial_state
  IMPORTING
    e_t_dim    = l_t_dim
    e_t_mem    = l_t_mem
    e_t_cel    = l_t_cel
    e_t_atr    = l_t_atr
    e_t_con    = l_t_con
    e_t_fac    = l_t_fac
    e_t_prptys = l_t_prptys
  EXCEPTIONS
    x_message  = 8
    OTHERS     = 1.

* Set the request
CALL METHOD l_r_request->set_request
  EXPORTING
    i_t_dim       = l_t_dim
    i_t_mem       = l_t_mem
    i_t_cel       = l_t_cel
    i_t_atr       = l_t_atr
    i_t_con       = l_t_con
    i_t_fac       = l_t_fac
    i_t_prptys    = l_t_prptys
  EXCEPTIONS
    no_processing = 1
    x_message     = 8.

* read data
CALL METHOD l_r_request->read_data
  IMPORTING
    e_warnings    = l_warnings
  EXCEPTIONS
    no_processing = 1
    x_message     = 8.

* close the request
l_r_request->p_r_olap->free( ).

 

7) After fetching the results, assign the result set elements {A1, A2, A3} as excluded ('E') values to the exit variable.

* Get the text table from the output handle of the request
ASSIGN l_r_request->n_sx_output-text->* TO <l_th_text>.

LOOP AT <l_th_text> INTO l_s_txt_n.
  CLEAR l_s_range.
  l_s_range-low  = l_s_txt_n-chavl_ext.
  l_s_range-sign = 'E'.               "excluding
  l_s_range-opt  = 'EQ'.
  APPEND l_s_range TO e_t_range.
ENDLOOP.

......

e_t_range contains the variable values that will finally be submitted to the OLAP processor to fetch the results of query X excluding the values of Y, i.e. X - Y.

Thus, in this way you can perform other set operations simply in the customer exit code itself.

 

 P.S. For more details on executing BW queries from within ABAP, please refer: http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/43db5ee1-0701-0010-2d90-c3b991eb616c

Bhushan Suryavanshi   is a SAP BI Analyst at Bombardier Aerospace.


출처 : http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/25643%3Futm_source%3Dtwitterfeed%26utm_medium%3Dtwitter%26utm_campaign%3DFeed%253A+SAPNetworkWeblogs+%2528SAP+Network+Weblogs%2529

Technique/SAP BW | 2011. 4. 8. 23:59

BW 7.30: New MDX test environment
Roman Moehl, SAP Employee (SAP AG)
Posted on Apr. 08, 2011 04:00 AM in Analytics, Business Intelligence (BusinessObjects), Enterprise Data Warehousing/Business Warehouse, Standards

 
 

Abstract

SAP NetWeaver BW 7.30 introduces a new test transaction for creating, running and analyzing MDX statements. The following article explains how you can use the test environment to efficiently use the query language MDX.

Motivation

MDX is a query language for accessing multidimensional data sources. It is the central technology of SAP BW’s Open Analysis Interface.

MDX allows dimensional applications to access OLAP data in a generic and standard-based way. Besides external reporting clients from other vendors, MDX is also used by SAP’s own products, for example by BusinessObjects Web Intelligence or BPC.

As a language, MDX offers a variety of functions that can result in very complex statements. Customers or client applications that create their own statements often lack good editing and tool support. Therefore, SAP BW 7.30 offers a new test transaction for composing, executing and analyzing MDX statements.

The new test transaction MDXTEST is typically used by developers (working on MDX-based integration for SAP BW), administrators and consultants.

Hands on MDXTEST

The new test transaction MDXTEST consists of three parts:

  1. Pane Section
  2. Editor
  3. ResultSet Renderer

MDXTEST Overview

Pane section

The pane section on the left side of the transaction consists of three sub sections.

Pane section

Metadata browser

The metadata browser exposes the ODBO-related metadata of the selected cube. The selected objects (for example members or hierarchies) can be dragged onto the MDX editor. This improves and accelerates the construction of MDX statements: the user sees all the objects that are available for defining statements.

Function library

The function library provides a list of all available MDX functions and methods. For each function or method, a corresponding code snippet can be added to the editor by drag and drop. The functions in the browser are arranged by their return types, for example Member, Tuple or Set.

Statement navigator

The statement navigator provides a list of stored statements. By double-clicking a statement, it is read from the persistence layer and displayed in the MDX editor. This allows the user to easily find stored MDX statements.

Editor

The central part of the test transaction is the editor pane. The editor itself provides a set of new functionality known from the ABAP editor, such as a mono-spaced font for indentation and formatting, line numbering, and drag-and-drop of function templates.

Editor

Pretty Printer

Most MDX statements are generated by clients. These statements are often not in a readable format, and most of them need to be manually formatted to get a better understanding of the statement structure. In addition, the statements are typically quite complex and often consist of a composition of multiple functions, so formatting and restructuring a statement consumes a lot of time. A built-in pretty printer transforms the text into a standard format.

ResultSet Renderer

The result of an MDX query is displayed in a separate window so that the statement and its result can be analyzed in a decoupled way. Besides the data grid, additional information about the axes and details about MDX-specific statistics events are added to the query result.

ResultSet Rendering

Executing a MDX statement

Once you've constructed your MDX statement in the editor, there are two ways of executing it:

  1. Default: The status bar provides a default execution button. The statement is executed via the multidimensional interface with the default settings.
  2. Expert mode: If you need to run the MDX statement via a different interface, the expert mode is the right choice. It is available via a button right next to the default execution button.

The expert mode provides the following options:

  • Interface: It is possible to run an MDX statement via several APIs. The most common interface is the default multidimensional API. In addition, it is possible to run the statement via the flattening or XML/A interface.
  • Row restriction: The flattening API allows you to restrict the range of rows that are to be retrieved. Besides fixed from- and to-numbers, it is also possible to define a fixed package size. This setting is only available if the flattening API is chosen.
  • Display: The rendering of the result can be influenced by the display setting. In general, you can switch off the default HTML rendering. This can be handy if you run performance measurements and would like to exclude the rendering overhead.
  • Debug settings: There are a couple of internal MDX-specific debug breakpoints which are typically only used by SAP support consultants.

Summary

In this article, you learned about the new central UI for testing MDX statements. The various components of the test environment support you in creating, executing and testing MDX with minimal effort.

Roman Moehl is a Senior Developer in SAP NetWeaver BW.


http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/23519%3Futm_source%3Dtwitterfeed%26utm_medium%3Dtwitter%26utm_campaign%3DFeed%253A+SAPNetworkWeblogs+%2528SAP+Network+Weblogs%2529

Technique/SAP BW | 2011. 3. 27. 15:24

SAP BW - RFC Function Module Reconciliation SAP BW and ECC Sales Header Data

Suraj Tigga    Article     (PDF 664 KB)     21 March 2011 

Overview

The document provides a detailed explanation of reconciling SAP ECC sales order header data (2LIS_11_VAHDR) with the data loaded into SAP BI. Reconciliation of the data is done using an RFC function module and an ABAP report. The advantage of this method is that the ABAP report can be scheduled at any specific time to get a detailed list of mismatched sales orders.
