Technique/Others 2014. 3. 14. 10:56




GKP 진출 국가별 주요 경제 동향(2013년 10월 ~ 12월).pdf


Posted by AgnesKim
Technique/SAP BO2013. 1. 10. 13:35

dashboard_html5.png

Figure 1: Select Preview in Mobile to generate the temporary files

 

With SAP BusinessObjects Dashboards 4.0 SP5 (Xcelsius 2011), you have the ability to preview and publish an HTML5 version rather than Flash. However, I am less interested in publishing to the platform for consumption on mobile devices than in retrieving the raw HTML5 files.

 

While poking around, I discovered that you can grab the previewed version of your dashboard in HTML5 format by searching your temporary files. The location of the dashboard preview is available in a randomized folder under:

 

C:\Users\<username>\AppData\Local\Temp\

 

Search your Temp directory and sub-directories for a file called dashboard.html. Open the directory of the file and you'll find the associated files as well. Save them off and test to your heart's delight.

 

dashboard_html5_output.png

Figure 2: Search for the dashboard.html file in your temporary directory to get the files for your HTML5 Dashboard

 

Note: You must be in preview mode in Dashboards for the files to be present. As soon as you exit the preview, the files are cleaned up.

 

dashboard_html5_ie.png

Figure 3: Look Ma! A non-Flash Dashboard in Internet Explorer

 

Now you have all of the files, presumably, to test out an HTML5 version of your dashboard on the desktop and other devices without needing to publish to the platform for mobile devices.

 

Check out Timo's blog post on other topics regarding HTML5 Dashboards: Making A Mobile HTML5 Dashboard: Follow-Up Q&A

 

Also, check out Tammy's blog post for a good overview of the capabilities and requirements too: Mobilizing Your Dashboards: ASUG Webcast

 

You can follow me on Twitter: @jwarbington



http://scn.sap.com/community/bi-dashboards/blog/2012/11/24/how-to-get-the-html5-version-of-your-dashboard-without-publishing-to-mobile?campaigncode=CRM-XM12-SOC-PROGCAMP-Twitter

Posted by AgnesKim
Technique/SAP BW2012. 8. 3. 11:11

DSO Comparison

Posted by Rahul Nair in Rahul Rajagopalan Nair's Blog on Aug 2, 2012 10:26:20 PM

Usage


  • This tool can be used to compare DSO structures across different systems, e.g. compare a DSO's structure in Dev vs. Prod/QA.
  • It is an essential pre-check before starting any development.
  • It helps prevent possible go-live sync issues.
  • It supports Prod sync checks in parallel-dev environments.

 

 

User Interface

 

1. Compare DSO between Dev & Prod

 

Specify the DSO names along with RFC Connection. F4 help is available for choosing the RFC connection.

 

01_01_Remote_System_IP.jpg

 

Login to the remote system:

 

01_02_Remote System_Login.jpg

 

  • The output displays the list of InfoObjects and their presence/absence in each DSO.
  • The list also shows the position of each InfoObject in the DSO structure.
  • It also shows whether an InfoObject is a key field.
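The heart of the check is simple set logic over the two field lists. A hedged Python sketch of the rule the report applies (green when the InfoObject exists on both sides with the same key flag, red otherwise; the dict format here is illustrative, not the report's actual output):

```python
def compare_dso(source_fields, target_fields):
    """Each argument maps an InfoObject name to (position, key_flag),
    as read from metadata table RSDODSOIOBJ.  Returns a status per
    InfoObject over the union of both DSO structures."""
    report = {}
    for iobj in sorted(set(source_fields) | set(target_fields)):
        src = source_fields.get(iobj)
        tgt = target_fields.get(iobj)
        # red light: missing on one side, or key-flag mismatch
        ok = src is not None and tgt is not None and src[1] == tgt[1]
        report[iobj] = "GREEN" if ok else "RED"
    return report
```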

 

01_03_Remote_System_OP_InfoObj.jpg


2. Compare two different DSOs in the same system.

 

 

Give the DSO name

 

02_01_Same_System_IP.jpg

 

Output:

 

Info objects missing in DSO 1:

 

02_03_Same_System_OP_InfoObj.jpg

 

 

Code

 

Copy and paste this code into an executable program.

 

ABAP Code

*&---------------------------------------------------------------------*

*& Report  ZBW_DSO_CHECK_V2

*&

*&---------------------------------------------------------------------*

*& Author : Rahul Rajagopalan Nair

*& Date   : 03 Aug 2012

*&---------------------------------------------------------------------*

 

REPORT  ZBW_DSO_CHECK_V2.

 

 

TYPE-POOLS : SLIS, ICON.

 

 

TYPES :

 

   BEGIN OF TY_DSO_COMP,

     STATUS     TYPE CHAR4,

     IOBJNM     TYPE RSDODSOIOBJ-IOBJNM,

     S_POSIT    TYPE RSDODSOIOBJ-POSIT,

     S_IOBJNM   TYPE RSDODSOIOBJ-IOBJNM,

     S_KEYFLAG  TYPE RSDODSOIOBJ-KEYFLAG,

     T_POSIT    TYPE RSDODSOIOBJ-POSIT,

     T_IOBJNM   TYPE RSDODSOIOBJ-IOBJNM,

     T_KEYFLAG  TYPE RSDODSOIOBJ-KEYFLAG,

   END OF   TY_DSO_COMP.

 

 

DATA :

   IT_S_DSO TYPE STANDARD TABLE OF RSDODSOIOBJ,

   IT_T_DSO TYPE STANDARD TABLE OF RSDODSOIOBJ.

 

DATA :

   WA_S_DSO TYPE RSDODSOIOBJ,

   WA_T_DSO TYPE RSDODSOIOBJ.

 

DATA :

   LV_S_LINE TYPE N,

   LV_T_LINE TYPE N.

 

DATA :

   CV_METADATA_TAB TYPE DD02L-TABNAME VALUE 'RSDODSOIOBJ'.

 

DATA :

   IT_OPTIONS TYPE STANDARD TABLE OF RFC_DB_OPT. "72 CHAR per line

 

DATA :

   WA_OPTIONS TYPE RFC_DB_OPT. "72 CHAR per line

 

 

DATA :

   IT_DSO_COMP TYPE STANDARD TABLE OF TY_DSO_COMP.

 

DATA :

   WA_DSO_COMP TYPE TY_DSO_COMP.

 

FIELD-SYMBOLS :

   <FS_DSO_COMP> TYPE TY_DSO_COMP.

 

DATA :

   IT_FIELDCAT TYPE SLIS_T_FIELDCAT_ALV,

   WA_FIELDCAT TYPE SLIS_FIELDCAT_ALV.    "FIELD CATALOG

 

DATA :

   IT_SORT TYPE SLIS_T_SORTINFO_ALV ,

   WA_SORT TYPE SLIS_SORTINFO_ALV.

 

 

DATA :

   LV_ERR_MSG  TYPE STRING,

   CV_SRC      TYPE STRING VALUE 'Source DSO :',

   CV_TGT      TYPE STRING VALUE 'Target DSO :'.

 

 

*$***********************************************************************

*$*$  SELECTION SCREEN FOR SOURCE DSO

*$*$*********************************************************************

 

SELECTION-SCREEN BEGIN OF BLOCK ssb1 WITH FRAME TITLE TEXT-001.

 

PARAMETERS :
   P_S_DSO TYPE RSDODSOIOBJ-ODSOBJECT DEFAULT 'DSONAME' OBLIGATORY,
   P_S_VER TYPE C LENGTH 1 DEFAULT 'A' OBLIGATORY,
   P_S_SYS(32) TYPE C OBLIGATORY
     LOWER CASE MATCHCODE OBJECT F4_RFCDESTYPEALL
     DEFAULT 'DEVRFC'.

 

SELECTION-SCREEN END OF BLOCK ssb1 .

 

*$***********************************************************************

*$*$  SELECTION SCREEN FOR TARGET DSO

*$*$*********************************************************************

 

SELECTION-SCREEN BEGIN OF BLOCK ssb2 WITH FRAME TITLE TEXT-002.

 

PARAMETERS :
   P_T_DSO TYPE RSDODSOIOBJ-ODSOBJECT DEFAULT 'DSONAME' OBLIGATORY,
   P_T_VER TYPE C LENGTH 1 DEFAULT 'A' OBLIGATORY,
   P_T_SYS(32) TYPE C OBLIGATORY
     LOWER CASE MATCHCODE OBJECT F4_RFCDESTYPEALL
     DEFAULT 'PRODRFC'.

 

SELECTION-SCREEN END OF BLOCK ssb2 .

 

"**********************************************************

"Source System

 

CLEAR IT_OPTIONS.

CLEAR IT_OPTIONS[].

 

CLEAR WA_OPTIONS.

CONCATENATE ' ODSOBJECT = '''

             P_S_DSO

             ''''

    INTO     WA_OPTIONS-TEXT.

    "SEPARATED BY SPACE.

 

APPEND WA_OPTIONS TO IT_OPTIONS.

 

 

CLEAR WA_OPTIONS.

CONCATENATE ' AND OBJVERS = '''

             P_S_VER

             ''''

    INTO     WA_OPTIONS-TEXT.

 

APPEND WA_OPTIONS TO IT_OPTIONS.

 

 

 

"RFC_READ_TABLE

CALL FUNCTION 'RFC_READ_TABLE'  DESTINATION P_S_SYS
   EXPORTING
     QUERY_TABLE                = CV_METADATA_TAB
   TABLES
     OPTIONS                    = IT_OPTIONS
     DATA                       = IT_S_DSO
   EXCEPTIONS
     TABLE_NOT_AVAILABLE        = 1
     TABLE_WITHOUT_DATA         = 2
     OPTION_NOT_VALID           = 3
     FIELD_NOT_VALID            = 4
     NOT_AUTHORIZED             = 5
     DATA_BUFFER_EXCEEDED       = 6
     OTHERS                     = 7.

IF SY-SUBRC <> 0.

   CASE SY-SUBRC.

     WHEN 1.

       CONCATENATE 'Table Not Available :'

                   CV_METADATA_TAB

                   'System :'

                   P_S_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 2.

       CONCATENATE 'Table Without Data :'

                   CV_METADATA_TAB

                   'System :'

                   P_S_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 3.

       CONCATENATE 'Option Not Valid. Table :'

                   CV_METADATA_TAB

                   'System :'

                   P_S_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 4.

       CONCATENATE 'Field Not Valid. Table :'

                   CV_METADATA_TAB

                   'System :'

                   P_S_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 5.

       CONCATENATE 'Not Authorized. Table :'

                   CV_METADATA_TAB

                   'System :'

                   P_S_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 6.

       CONCATENATE 'Data Buffer Exceeded. Table :'

                   CV_METADATA_TAB

                   'System :'

                   P_S_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 7.

       CONCATENATE 'RFC Connection Not Maintained for System :'

                   P_S_SYS

                   '. Or Unknown Error.'

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

   ENDCASE.

 

   MESSAGE LV_ERR_MSG TYPE 'E'.

ENDIF.

 

"Source System

"**********************************************************

 

 

 

 

"**********************************************************

"Target System

 

CLEAR IT_OPTIONS.

CLEAR IT_OPTIONS[].

 

CLEAR WA_OPTIONS.

CONCATENATE ' ODSOBJECT = '''

             P_T_DSO

             ''''

    INTO     WA_OPTIONS-TEXT.

    "SEPARATED BY SPACE.

 

APPEND WA_OPTIONS TO IT_OPTIONS.

 

 

CLEAR WA_OPTIONS.

CONCATENATE ' AND OBJVERS = '''

             P_T_VER

             ''''

    INTO     WA_OPTIONS-TEXT.

 

APPEND WA_OPTIONS TO IT_OPTIONS.

 

 

 

"RFC_READ_TABLE

CALL FUNCTION 'RFC_READ_TABLE'  DESTINATION P_T_SYS
   EXPORTING
     QUERY_TABLE                = CV_METADATA_TAB
   TABLES
     OPTIONS                    = IT_OPTIONS
     DATA                       = IT_T_DSO
   EXCEPTIONS
     TABLE_NOT_AVAILABLE        = 1
     TABLE_WITHOUT_DATA         = 2
     OPTION_NOT_VALID           = 3
     FIELD_NOT_VALID            = 4
     NOT_AUTHORIZED             = 5
     DATA_BUFFER_EXCEEDED       = 6
     OTHERS                     = 7.

IF SY-SUBRC <> 0.

   CASE SY-SUBRC.

     WHEN 1.

       CONCATENATE 'Table Not Available :'

                   CV_METADATA_TAB

                   'System :'

                   P_T_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 2.

       CONCATENATE 'Table Without Data :'

                   CV_METADATA_TAB

                   'System :'

                   P_T_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 3.

       CONCATENATE 'Option Not Valid. Table :'

                   CV_METADATA_TAB

                   'System :'

                   P_T_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 4.

       CONCATENATE 'Field Not Valid. Table :'

                   CV_METADATA_TAB

                   'System :'

                   P_T_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 5.

       CONCATENATE 'Not Authorized. Table :'

                   CV_METADATA_TAB

                   'System :'

                   P_T_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 6.

       CONCATENATE 'Data Buffer Exceeded. Table :'

                   CV_METADATA_TAB

                   'System :'

                   P_T_SYS

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

     WHEN 7.

       CONCATENATE 'RFC Connection Not Maintained for System :'

                   P_T_SYS

                   '. Or Unknown Error.'

              INTO LV_ERR_MSG

              SEPARATED BY SPACE.

   ENDCASE.

 

   MESSAGE LV_ERR_MSG TYPE 'E'.

 

ENDIF.

 

"Target System

"**********************************************************

 

 

 

 

"Get all IOs

CLEAR WA_DSO_COMP.

 

LOOP AT IT_S_DSO INTO WA_S_DSO.

 

   WA_DSO_COMP-IOBJNM = WA_S_DSO-IOBJNM.

 

   APPEND WA_DSO_COMP TO IT_DSO_COMP.

 

ENDLOOP.

 

CLEAR WA_DSO_COMP.

 

LOOP AT IT_T_DSO INTO WA_T_DSO.

 

   WA_DSO_COMP-IOBJNM = WA_T_DSO-IOBJNM.

 

   APPEND WA_DSO_COMP TO IT_DSO_COMP.

 

ENDLOOP.

 

SORT IT_DSO_COMP BY IOBJNM.

DELETE ADJACENT DUPLICATES

   FROM        IT_DSO_COMP

   COMPARING   IOBJNM.

 

SORT IT_S_DSO BY IOBJNM ASCENDING.

DELETE ADJACENT DUPLICATES

   FROM        IT_S_DSO

   COMPARING   IOBJNM.

 

 

SORT IT_T_DSO BY IOBJNM ASCENDING.

DELETE ADJACENT DUPLICATES

   FROM        IT_T_DSO

   COMPARING   IOBJNM.

 

LOOP AT IT_DSO_COMP ASSIGNING <FS_DSO_COMP>.

 

   CLEAR WA_S_DSO.

   CLEAR WA_T_DSO.

 

   "Read Source
   READ TABLE IT_S_DSO
     INTO  WA_S_DSO
     WITH KEY IOBJNM = <FS_DSO_COMP>-IOBJNM.

   "Assign independent of read success or failure
   <FS_DSO_COMP>-S_IOBJNM   = WA_S_DSO-IOBJNM.
   <FS_DSO_COMP>-S_POSIT    = WA_S_DSO-POSIT.
   <FS_DSO_COMP>-S_KEYFLAG  = WA_S_DSO-KEYFLAG.

   "Read Target
   READ TABLE IT_T_DSO
     INTO  WA_T_DSO
     WITH KEY IOBJNM = <FS_DSO_COMP>-IOBJNM.

   "Assign independent of read success or failure
   <FS_DSO_COMP>-T_IOBJNM   = WA_T_DSO-IOBJNM.
   <FS_DSO_COMP>-T_POSIT    = WA_T_DSO-POSIT.
   <FS_DSO_COMP>-T_KEYFLAG  = WA_T_DSO-KEYFLAG.

   IF <FS_DSO_COMP>-S_IOBJNM = <FS_DSO_COMP>-T_IOBJNM.
     <FS_DSO_COMP>-STATUS = ICON_GREEN_LIGHT. "'S'
   ELSE.
     <FS_DSO_COMP>-STATUS = ICON_RED_LIGHT. "'D'
   ENDIF.

   IF <FS_DSO_COMP>-S_KEYFLAG NE <FS_DSO_COMP>-T_KEYFLAG.
     <FS_DSO_COMP>-STATUS = ICON_RED_LIGHT. "'D'
   ENDIF.

 

 

 

ENDLOOP.

 

SORT IT_DSO_COMP BY IOBJNM.

 

 

 

 

 

 

"**********************************************************

"DISPLAY

 

   WA_FIELDCAT-COL_POS = 1.
   WA_FIELDCAT-KEY = 'X'.
   WA_FIELDCAT-FIELDNAME = 'STATUS'.
   WA_FIELDCAT-REPTEXT_DDIC = 'Status'.
   WA_FIELDCAT-ICON = 'X'.
   WA_FIELDCAT-OUTPUTLEN = 6.
   WA_FIELDCAT-TABNAME = 'IT_DSO_COMP'.
   APPEND WA_FIELDCAT TO IT_FIELDCAT.
   CLEAR WA_FIELDCAT.

   WA_FIELDCAT-COL_POS = 2.
   WA_FIELDCAT-KEY = 'X'.
   WA_FIELDCAT-FIELDNAME = 'IOBJNM'.
   WA_FIELDCAT-REPTEXT_DDIC = 'Info Object'.
   WA_FIELDCAT-OUTPUTLEN = 30.
   WA_FIELDCAT-TABNAME = 'IT_DSO_COMP'.
   APPEND WA_FIELDCAT TO IT_FIELDCAT.
   CLEAR WA_FIELDCAT.

   WA_FIELDCAT-COL_POS = 3.
   WA_FIELDCAT-FIELDNAME = 'S_POSIT'.
   CONCATENATE P_S_DSO
               '- Position :'
               P_S_SYS
        INTO   WA_FIELDCAT-REPTEXT_DDIC
        SEPARATED BY SPACE.
   WA_FIELDCAT-OUTPUTLEN = 8.
   WA_FIELDCAT-TABNAME = 'IT_DSO_COMP'.
   APPEND WA_FIELDCAT TO IT_FIELDCAT.
   CLEAR WA_FIELDCAT.

   WA_FIELDCAT-COL_POS = 4.
   WA_FIELDCAT-FIELDNAME = 'S_IOBJNM'.
   CONCATENATE P_S_DSO
               '- Info Object :'
               P_S_SYS
        INTO   WA_FIELDCAT-REPTEXT_DDIC
        SEPARATED BY SPACE.
   WA_FIELDCAT-OUTPUTLEN = 30.
   WA_FIELDCAT-TABNAME = 'IT_DSO_COMP'.
   APPEND WA_FIELDCAT TO IT_FIELDCAT.
   CLEAR WA_FIELDCAT.

   WA_FIELDCAT-COL_POS = 5.
   WA_FIELDCAT-FIELDNAME = 'S_KEYFLAG'.
   CONCATENATE P_S_DSO
               '- Key :'
               P_S_SYS
        INTO   WA_FIELDCAT-REPTEXT_DDIC
        SEPARATED BY SPACE.
   WA_FIELDCAT-OUTPUTLEN = 10.
   WA_FIELDCAT-TABNAME = 'IT_DSO_COMP'.
   APPEND WA_FIELDCAT TO IT_FIELDCAT.
   CLEAR WA_FIELDCAT.

   WA_FIELDCAT-COL_POS = 6.
   WA_FIELDCAT-FIELDNAME = 'T_POSIT'.
   CONCATENATE P_T_DSO
               '- Position :'
               P_T_SYS
        INTO   WA_FIELDCAT-REPTEXT_DDIC
        SEPARATED BY SPACE.
   WA_FIELDCAT-OUTPUTLEN = 8.
   WA_FIELDCAT-TABNAME = 'IT_DSO_COMP'.
   APPEND WA_FIELDCAT TO IT_FIELDCAT.
   CLEAR WA_FIELDCAT.

   WA_FIELDCAT-COL_POS = 7.
   WA_FIELDCAT-FIELDNAME = 'T_IOBJNM'.
   CONCATENATE P_T_DSO
               '- Info Object :'
               P_T_SYS
        INTO   WA_FIELDCAT-REPTEXT_DDIC
        SEPARATED BY SPACE.
   WA_FIELDCAT-OUTPUTLEN = 30.
   WA_FIELDCAT-TABNAME = 'IT_DSO_COMP'.
   APPEND WA_FIELDCAT TO IT_FIELDCAT.
   CLEAR WA_FIELDCAT.

   WA_FIELDCAT-COL_POS = 8.
   WA_FIELDCAT-FIELDNAME = 'T_KEYFLAG'.
   CONCATENATE P_T_DSO
               '- Key :'
               P_T_SYS
        INTO   WA_FIELDCAT-REPTEXT_DDIC
        SEPARATED BY SPACE.
   WA_FIELDCAT-OUTPUTLEN = 10.
   WA_FIELDCAT-TABNAME = 'IT_DSO_COMP'.
   APPEND WA_FIELDCAT TO IT_FIELDCAT.
   CLEAR WA_FIELDCAT.

   WA_SORT-SPOS = 1.
   WA_SORT-FIELDNAME = 'S_POSIT'.
   WA_SORT-TABNAME = 'IT_DSO_COMP'.
   WA_SORT-UP = 'X'.
   APPEND WA_SORT TO IT_SORT.
   CLEAR WA_SORT.

 

 

 

CALL FUNCTION 'REUSE_ALV_GRID_DISPLAY'
   EXPORTING
     I_CALLBACK_PROGRAM                = SY-REPID
     I_GRID_TITLE                      = 'DSO Info Object Assignment Comparison'
     IT_FIELDCAT                       = IT_FIELDCAT
     IT_SORT                           = IT_SORT
   TABLES
     T_OUTTAB                          = IT_DSO_COMP
   EXCEPTIONS
     PROGRAM_ERROR                     = 1
     OTHERS                            = 2.

IF SY-SUBRC <> 0.
   CASE SY-SUBRC.
     WHEN 1.
       LV_ERR_MSG = 'Program error during list generation.'.
     WHEN 2.
       LV_ERR_MSG = 'Unknown error during list generation.'.
   ENDCASE.
   MESSAGE LV_ERR_MSG TYPE 'E'.
ENDIF.

 

"DISPLAY

"**********************************************************

 

 

Text Elements

 

Selection Text

 

P_S_DSO   DSO
P_S_SYS   System
P_S_VER   Version
P_T_DSO   DSO
P_T_SYS   System
P_T_VER   Version

 

 

Text Elements

 

001   Source DSO :
002   Target DSO :

 

 

Pre-Requisites

 

  • An RFC connection must be available between the systems.
  • Check with your Basis team for details regarding the RFC connection.
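For reference, the OPTIONS table passed to RFC_READ_TABLE is nothing more than a list of WHERE-clause fragments, each at most 72 characters wide. A small sketch of how the program above assembles its selection (Python used purely for illustration; with a connector such as pyrfc the same lines could be passed to RFC_READ_TABLE from outside ABAP, but that call is not shown here):

```python
def build_options(dso_name, version="A"):
    """WHERE-clause fragments selecting one DSO's rows from RSDODSOIOBJ;
    RFC_READ_TABLE concatenates these lines into a single WHERE clause."""
    lines = [
        " ODSOBJECT = '%s'" % dso_name,
        " AND OBJVERS = '%s'" % version,
    ]
    # RFC_DB_OPT-TEXT is 72 characters wide, so each fragment must fit
    assert all(len(line) <= 72 for line in lines)
    return lines
```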

 

 

Acknowledgments

 

I would like to thank all my teammates and managers at Infosys for their constant support and encouragement.






http://scn.sap.com/people/rahulrajagopalan.nair/blog/2012/08/02/dso-comparison?utm_source=twitterfeed&utm_medium=twitter

Other posts in the 'Technique > SAP BW' category

SAP BW 7.3 Hybrid Provider  (0) 2012.05.23
Explorer with BWA  (0) 2012.05.11
Usage of BW7.3 Transformation Rule Type “Read from DataStore”  (0) 2012.05.10
Version management in SAP BW 7.3  (0) 2012.04.20
Queries/Workbooks a user can access  (0) 2012.04.20
Posted by AgnesKim
Technique/DashBoard 2012. 7. 18. 10:54

The Key First Step to Successful BI Dashboards

Ask yourself this question, “What’s the first step in building a successful business intelligence (BI) dashboard?”

If you’re like 95 percent of our readers, you said, “Scoping, of course!” But you’re wrong!

The starting point of any successful dashboard is identifying the who, what, and when. This planning phase actually comes before scoping and, unfortunately, it's one of the most overlooked phases in the dashboard development process.

  • Who: Who is the dashboard for, and who will pay for it?
  • When: When is the dashboard due? Is there a major milestone attached to it?
  • What: What problem will it solve, and what high-level key performance indicators (KPIs) do you need to track to solve it?

In this article we will focus on “what.” As part of the planning process, “what” is the second most important part of the dashboard planning process after “who.” Why? Some companies spend tens of thousands of dollars, with good reason, to hire management consultants to help them identify and analyze KPIs and transform their business. The real problem? Most companies simply don’t put enough effort into determining their KPIs.

Often companies use generic KPIs, like revenue, costs, and margins, which are lagging indicators that don’t always result in business transformation. You might notice a drop in revenue, but unless you identified the right metric, you might not realize that you need to revolutionize the way your sales people handle their accounts. A dashboard with revenue by product would only indicate a pre-existing problem; whereas leading-indicator KPIs would likely track where sales people spent their time or how long accounts were spending in one stage of the sales cycle. This type of information would allow you to correct the issue before it had a major impact on revenue.

Depending on which KPIs are selected, a business can take an innovative path as individuals respond to what is being measured. Consider the company that wanted to grow sales and began tracking the number of net new customers: without realizing that many of those new customers actually consumed more resources than the revenue they generated, the company watched its margins drop.

Successful Dashboard Stories

A successful dashboard should tell a four-part story for each KPI:

1) Current State: Where you are today

2) Trend: How you got there

3) Forecast: Where you’ll end up over time at your current run rate

4) What-if: What you need to do to hit your targets

Sales Dashboard that “Tells a Story” – Credit: AnalysisFactory.com

Note the storyline at the top right (Scorecard, Performance, Trend, What If), which allows you to first recognize that you have a problem, then diagnose the issues and decide on the appropriate action.

Click images for larger view

Another thing to consider is that most KPIs are useless without alerts or statistical functions such as probability. Successful dashboards contain KPIs that provide direct answers to questions like "Will I hit my target?" Instead of showing this as a bar chart of actual vs. target, consider using a simple probability formula that displays a single percentage to target on a real-time basis throughout the month.

For instance, if it's the middle of the month and the probability of hitting your target isn't at or above 50 percent, then a yellow alert should display on the dashboard for that KPI, indicating that unless you implement some changes to your business now, you'll miss your targets at month end. This is far more efficient for running a business and measuring the appropriate targets than simply displaying data.

So before you bring in IT, business leaders need to spend significant time getting to the core business goals or challenges. This allows for a clear understanding of the purpose of each metric and how it will drive behavior in your organization once it’s published broadly.

Mico Yuk, founder of the Xcelsius Gurus and the popular weblog EverythingXcelsius.com, SAP Mentor, and BI influencer, will be teaching the above strategies and many more in her new online coaching series titled "The BI Dashboard Formula," which will kick off in September 2012. Here's a complimentary download of one of the 10 templates that she will offer in her training. Don't miss her early bird specials (including a free BlackBerry PlayBook for the first 50 who sign up).

Register now and watch for the next blog post in August, covering the second step of Mico’s seven-step formula.

Posted by AgnesKim
Technique/Others 2012. 7. 8. 09:55

RSAP, Rook and ERP

Posted by Alvaro Tejada Galindo  in Scripting Languages on Jul 6, 2012 4:39:03 AM

As I wrote in my blog Analytics with SAP and R (Windows version) we can use RSAP to connect to our ERP system and play with the data.

 

This time I wanted, of course, to keep exploring the capabilities of RSAP, but using something else. As everybody knows, I love micro-frameworks, and for R that's no exception... gladly, Rook came to the rescue...

 

Rook is a simple web server that runs locally and allows us to do some really nice things... enough talk... let's go to the source code...

 

RSAP_Rook.R

library("RSAP")

require("Rook")

library("reshape")  # provides colsplit(), used below

setwd("C:/Blag/R_Scripts")

 

conn = RSAPConnect("sap.yml")

parms <- list('DELIMITER' = ';',

              'FIELDS' = list(FIELDNAME = list('CARRID', 'CARRNAME')),

              'QUERY_TABLE' = 'SCARR')

res <- RSAPInvoke(conn, "RFC_READ_TABLE", parms)

#RSAPClose(conn)

scarr<-res$DATA

flds<-sub("\\s+$", "", res$FIELDS$FIELDNAME)

scarr<-data.frame(colsplit(scarr$WA,";", names=flds))

 

parms <- list('DELIMITER' = ';',

              'FIELDS' = list(FIELDNAME = list('CITYFROM')),

              'QUERY_TABLE' = 'SPFLI')

res <- RSAPInvoke(conn, "RFC_READ_TABLE", parms)

#RSAPClose(conn)

spfli<-res$DATA

flds<-sub("\\s+$", "", res$FIELDS$FIELDNAME)

spfli<-data.frame(colsplit(spfli$WA,";", names=flds))

spfli<-unique(spfli)

 

get_data<-function(p_carrid,p_cityfrom){

  parms<-list('DELIMITER' = ';',

              'FIELDS' = list(FIELDNAME = list('CITYTO','FLTIME')),

              'OPTIONS' = list(TEXT = list(p_carrid, p_cityfrom)),

              'QUERY_TABLE' = 'SPFLI')

  res<-RSAPInvoke(conn, "RFC_READ_TABLE", parms)

  RSAPClose(conn)

  spfli<-res$DATA

  flds<-sub("\\s+$", "", res$FIELDS$FIELDNAME)

  if(length(spfli$WA)>0){

  spfli<-data.frame(colsplit(spfli$WA,";", names=flds))

  return(spfli)

  }else{

   return(spfli)

  }

}

 

newapp<-function(env){

  req<-Rook::Request$new(env)

  res<-Rook::Response$new()

  res$write('<form method="POST">\n')

  res$write('<div align="center"><table><tr>') 

  res$write('<td>Select a carrier: <select name=CARRID>')

  for(i in 1:length(scarr$CARRID)) {

    res$write(sprintf('<OPTION VALUE=%s>%s</OPTION>',scarr$CARRID[i],scarr$CARRNAME[i]))

  }

  res$write('</select></td><td>')

  res$write('Select a city: <select name=CITYFROM>')

  for(i in 1:length(spfli$CITYFROM)) {

    res$write(sprintf('<OPTION VALUE=%s>%s</OPTION>',spfli$CITYFROM[i],spfli$CITYFROM[i]))

  }

  res$write('</select></td>')

  res$write('<td><input type="submit" name="Get Flights"></td>')

  res$write('</tr></table></div>')

  res$write('</form>')

 

  if (!is.null(req$POST())) {

    p_carrid = req$POST()[["CARRID"]]

    p_cityfrom = req$POST()[["CITYFROM"]]

    flights_from<-paste('Distance in Flights from ',p_cityfrom,sep='')

 

    p_carrid<-paste('CARRID = \'',p_carrid,'\'',sep='')

    p_cityfrom<-paste('AND CITYFROM =\'',p_cityfrom,'\'',sep='')

 

    spfli<-get_data(p_carrid,p_cityfrom)

 

    if(length(spfli$CITYTO) > 0){

    png("Flights.png",width=800,height=500)

    plot(spfli$FLTIME,type="n",axes=FALSE,ann=FALSE)

    lines(spfli$FLTIME,col="blue")

    points(spfli$FLTIME, pch=21, bg="lightcyan", cex=1.25)

    box()

    xy<-length(spfli$CITYTO)

    axis(2, col.axis="blue", las=1)

    axis(1, at=1:xy, lab=spfli$CITYTO, col.axis="purple")

    title(main=flights_from, col.main="red", font.main=4)

    dev.off()

    res$write("<div align='center'>")

    res$write(paste("<img src='", server$full_url("pic"), "/", "Flights.png'", "/>", sep = ""))

    res$write("</div>")

    }else{

      res$write("<p>No data to select...</p>")

    }

  }

  res$finish()

}

 

server = Rhttpd$new()

server$add(app = newapp, name = "Flights")

server$add(app = File$new("C:/Blag/R_Scripts"), name = "pic")

server$start()

server$browse("Flights")

 

This is the result...

 

RSAP_Rook_001.png

RSAP_Rook_002.png

RSAP_Rook_003.png

RSAP_Rook_004.png

 

As you can see, we're getting the data from SAP to fill both SELECTs and then calling the query. We generate a PNG graphic showing the distance of flights from the City From to each City To, and then call it from our web page to show it on the screen.

 

As you can see, RSAP gives us a lot of opportunities that we can take advantage of by simply putting in some effort and imagination. Hope this boosts your R interest.



http://scn.sap.com/community/scripting-languages/blog/2012/07/06/rsap-rook-and-erp?utm_source=twitterfeed&utm_medium=twitter

Posted by AgnesKim
Technique/SAP HANA2012. 7. 5. 21:45

As I mentioned in a recent blog about the In-memory Computing Conference 2012 (IMCC 2012) event, I found the presentation from Xiaoqun Clever (Corporate Officer, SAP AG Senior Vice President, TIP Design & New Applications, President of SAP Labs China) entitled "Extreme Applications – neuartige Lösungen mit der SAP In-memory Technologie" interesting but didn't go into details. In this blog, I want to describe what I found so interesting about the presentation.

 

The first slide that I found intriguing was the one that describes the various scenarios for HANA. You can visualize the evolution of HANA over time as moving from left to right: Accelerator -> Database -> Platform. The first thing that I noticed is that "Cloud" is not involved at all. Indeed, it is only mentioned in a single slide later in the presentation; it almost doesn't appear to play a major role in such HANA-related endeavors.

 

image001.jpg

 

The use case with HANA as a Database and the involvement of NetWeaver components would also cover the current intentions regarding the NetWeaver Cloud offering where HANA will be present as one possible data source. In this area, I will be curious to see what sort of functionality overlap (user management, etc) will exist between the HANA Platform and the NetWeaver Cloud as well as how the two environments will integrate with one another.

 

A later slide describing the scenario “HANA als Platform” is actually titled “Native HANA Applikationen”. The applications listed on the slide are either OnPremise or OnDemand (usually associated with the HANA AppCloud) applications. I started to consider what the portrayal of these applications as being “native” meant for the relationship between HANA and the Cloud. To be truthful, I’m starting to get the impression that we are slowly seeing a merging of the OnPremise and OnDemand worlds in terms of HANA runtime environments. Native applications might be able to run in both worlds since the underlying platform is fundamentally the same. Thus, other considerations (costs involved, customer presence, etc) might be important in making decisions regarding where such applications are hosted.

image002.jpg

 

If we take this assumption forward a few years, you might think that a HANA-Platform-based Business Suite running as a native HANA application (and thus, perhaps easily available in an OnDemand setting) might be an option but another slide in the same presentation shows that a HANA-based Business Suite would be distinct from native HANA Apps.

 

image003.jpg

 

What I liked about the presentation is that it reinforced my understanding of the differences between scenarios involving NetWeaver functionality and those based on native HANA functionality. What I don't fully understand is the potential use of native functionality in scenarios where NetWeaver functionality (for example, NetWeaver Cloud) is used. Is this possible? Planned? As things evolve, I'll be waiting to see if the scenarios with HANA as a Database can hold their own against the HANA as a Platform scenarios.



http://scn.sap.com/community/cloud-computing-and-on-demand-solutions/blog/2012/07/05/do-native-hana-applications-imply-a-future-without-a-cloud-onpremise-deployment-distinction?utm_source=twitterfeed&utm_medium=twitter

Posted by AgnesKim
Technique/SAP BO2012. 6. 15. 02:04

Hello Everybody

This is my first blog on SDN, and I would like to put down some of the limitations I faced when I tried to use Web Intelligence reporting in BI 4.0 with the new connectivity option of directly consuming a BEx query via BICS. I have used .UNV universes in the 3.x environment, and below are some of the limitations that immediately became visible with the BICS approach.

Very Important ones.

1>     BEx queries built using a sub-query technique: in this case, when "Allow External Access to this Query" is checked, the report can't even be executed in BEx (or BEx Web), and therefore can't be used in the BOBJ platform.

2>     Slicing and dicing dimensions in the report usually requires refreshing the report: this is due to the inherent nature of the database-delegated measure setting in the virtual universe generated at the back end at run time, and the setting can't be changed because of that. We probably have to live with this even if we don't want the key figure calculation to be done by BW for every slice and dice in WebI.

3>     The following filtering techniques, which work for .UNV universes, don't work for BICS:

a.       Object-based filters (filters comparing the same dimension)

b.      Sub-queries

c.       Feeding the output of one query to another

4>     Combined query techniques (Minus, Union, etc.): the option for this also doesn't seem to be enabled.

5>     Drilling: while I absolutely love the way hierarchies perform with the new BICS option, the good old drilling doesn't work with BICS, and I think this is huge.

 

Some others:

6>     Display attributes defined in BEx with changed field descriptions, ordering, etc. have no bearing on the WebI report. All display attributes defined at the back-end InfoObject level are available, which can be confusing for general users. This isn't necessarily a defect, but depending on the type of end user, it can be confusing.

7>     Due to the inherent nature of the BICS approach, we can't do the following; I see this more as a different way of doing things than as a limitation:

a.       Can't have predefined filters as in a .UNV universe

b.      Can't group dimensions into logical groups

Since I see WebI as the most important self-service and report-authoring tool for business users rather than IT, these issues can cause some problems and pain points for the use of WebI with BEx/BICS down the line.

There are other, smaller ones, like query stripping, etc. Let's see if you have faced similar issues, and let's have a healthy discussion about them.

Thanks

Arun




http://scn.sap.com/community/businessobjects-web-intelligence/blog/2012/06/14/some-limitation-in-web-intelligence-when-using-directly-with-bex-querybics-in-bi-40?utm_source=twitterfeed&utm_medium=twitter


Posted by AgnesKim
Technique/SAP HANA2012. 6. 14. 12:39

Analytics with SAP and R

Posted by Piers Harding in Scripting Languages on Jun 13, 2012 3:35:41 AM

Something that piqued my curiosity lately was the developments with SAP HANA and R (good overview here). This is definitely a new and exciting direction for SAP: creating a well-structured, organised 'Big Table' option for in-memory computing, and then going the extra mile to embed a specialised open-source statistical computing package (R) in it, putting the forefront of the world of statistical analysis within reach of those who dare.

 

This is utterly brilliant, but the problem is that I can't access it, as I don't have access to a SAP HANA instance (nor would most people). It is also heavily geared towards 'Big Data', when there is still an awful lot to be gained in the small and mid-range data analysis arenas (resisting the temptation about size and clichés).

 

This has definitely scratched my hacker's itch, and in response I've created one more scripting-language connector for R: RSAP.

 

The idea of this is to enable RFC calls (using the SAP NW RFC SDK) where any table contents are returned as data.frames (in R parlance).

 

Once you have this data in R, then the world is your oyster - it is up to your imagination as to what you do with it.  To give an overview of how it works, and what you can do, I'm going to step through the process of installing and using RSAP.

 

Obtaining and Installing

 

Firstly you need to install R.  I recommend using RStudio as it is a comfortable graphical user interface - you can get it from here.  

Under Debian (read: Ubuntu) flavoured Linux you can install R first, before downloading/installing RStudio, using:

 

sudo apt-get install r-base-core r-base-dev r-base-html r-recommended

 

 

SAP NW RFCSDK

 

The SDK is available from the SAP Service Marketplace SWDC - here is a forum discussion on getting it: http://scn.sap.com/thread/950318

If you have (like me) installed the NPL SAP Test Drive instance, then the SAP NW RFC libs exist in the /usr/sap/NPL/SYS/exe/run directory; the only problem is that it does not contain the C header files (really - SAP should make these available on SDN).

 

RSAP

 

Download or clone the RSAP project source from https://github.com/piersharding/RSAP

 

Building

 

Ensure that the R library prerequisites are installed.  To do this there is a helper script in the RSAP source code directory.  cd to the source directory (downloaded above) - in my case /home/piers/git/public/RSAP - and run the following:

 

R --no-save < install_dependencies.R

 

This will prompt to install the packages yaml, reshape, plotrix, and RUnit.

 

To build and install the RSAP package, cd to the source directory (downloaded above) - in my case /home/piers/git/public/RSAP - run the following:

 

R CMD INSTALL --build --preclean --clean --configure-args='--with-nwrfcsdk-include=/home/piers/code/sap/nwrfcsdk/include --with-nwrfcsdk-lib=/home/piers/code/sap/nwrfcsdk/lib' .

 

You must change the values for --with-nwrfcsdk-include and --with-nwrfcsdk-lib to point to the directory locations that you have downloaded the SAP NW RFC SDK to.

 

Under Linux, it is also likely that you need to add the lib directory to the LD cache or set the LD_LIBRARY_PATH variable.

 

Setting the LD Cache:

As root, edit /etc/ld.so.conf and add the lib path from above to it on its own line. Now regenerate the cache by executing 'sudo ldconfig'.

 

Setting LD_LIBRARY_PATH

You must ensure that the following environment variable is set in all your shells:

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/path/to/nwrfcsdk/lib

The easiest way to do this is to add the above line to your $HOME/.bashrc file so that it happens automatically for all future shells.
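Whichever route you take, it's worth confirming that the dynamic linker can actually resolve the SDK libraries before building RSAP. A minimal check, as a sketch: /path/to/nwrfcsdk/lib is a placeholder for your actual SDK location, and libsapnwrfc is the library name shipped with the NW RFC SDK.

```shell
# Placeholder path - substitute the actual location of your NW RFC SDK.
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/path/to/nwrfcsdk/lib
# ldconfig -p lists every library the loader currently knows about.
ldconfig -p 2>/dev/null | grep -i sapnwrfc \
    || echo "libsapnwrfc not found - check /etc/ld.so.conf or LD_LIBRARY_PATH"
```

If the grep comes back empty, revisit the ld cache or LD_LIBRARY_PATH steps above before running the R build.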

 

Does it work?

 

Once the build and install of the RSAP package is complete, you should test to make sure it's all working.

 

Change to the package source code directory (you are probably still there from the above activities), and launch either R or RStudio.

From the R command line try the following:

 

> library(RSAP)

Loading required package: yaml

>

 

You should get the above confirmation message that the dependent yaml package has been loaded.  Now we are ready to try some R wizardry.

 

How to work with RSAP

 

Let's work through the general process steps for interacting with SAP.

 

Connecting to SAP

 

Using RSAP, we need to establish a connection to SAP. For this you need an account that has the appropriate access for RFC calls and for the functionality being called. Connections can be built in two ways - by directly passing connection parameters:

>     conn <- RSAPConnect(ashost="nplhost", sysnr="42",

                          client="001", user="developer",

                          passwd="developer", lang="EN")

>

 

Or using a YAML encoded file that contains the connection details:

> conn <- RSAPConnect("sap.yml")

>

 

The sap.yml file is structured like:

ashost: nplhost

sysnr: "42"

client: "001"

user: developer

passwd: developer

lang: EN

trace: 1

 

The above activates the trace functionality in the NW RFC SDK. This will create trace files in the current working directory, which are invaluable for debugging connectivity problems.

 

 

Calling SAP


Now we have the connection object, we can get connection info with it:

info <- RSAPGetInfo(conn)

Query the system with:

res <- RSAPInvoke(conn, "<RFC Function Name>", parms)

Or close the connection:

RSAPClose(conn)

 

RSAPInvoke() is what we are most interested in, and we need to pass the parameters as a series of nested named lists.  The classic example is RFC_READ_TABLE:

parms <- list('DELIMITER' = '|',

              'FIELDS' = list(FIELDNAME = list('CARRID', 'CONNID', 'PRICE',

                                               'SEATSMAX', 'SEATSOCC')),

              'OPTIONS' = list(TEXT = list("CARRID = 'AA' ", " AND CONNID = 0017 ")),

              'QUERY_TABLE' = 'SFLIGHTS2')

res <- RSAPInvoke(conn, "RFC_READ_TABLE", parms)

 

The names must correspond directly to the parameter and structure (for tables) names, and use numeric and character types as appropriate.

The other thing that is really important to get your head around is that R data structures are column-oriented, which means we have to think differently about the tables we get from SAP. Tables in SAP translate to lists of vectors, where the outer list holds the column names (a slightly loose analogy, but it will do) and a vector hangs off each column name, containing that column's values down the rows.

 

 

Working through the examples in get_flights.R

 

In the source code package there is an example script, get_flights.R. It uses the standard demonstration data for the Flight Data system contained in the table SFLIGHTS2. Let's look at what this does.

 

Load libraries:

> library(RSAP)

Loading required package: yaml

> library(reshape)

Loading required package: plyr

  Attaching package: ‘reshape’

  The following object(s) are masked from ‘package:plyr’:

      rename, round_any

> library(plotrix)

>

We now have all the necessary libraries for the rest of the examples.

 

conn <- RSAPConnect("sap.yml")

parms <- list('DELIMITER' = ';',

              'QUERY_TABLE' = 'SFLIGHTS2')

res <- RSAPInvoke(conn, "RFC_READ_TABLE", parms)

RSAPClose(conn)

sflight = res$DATA

flds <- sub("\\s+$", "", res$FIELDS$FIELDNAME)

sflight <- data.frame(sflight, colsplit(sflight$WA, split = ";", names = flds))


 

This connects to SAP, calls RFC_READ_TABLE to get the contents of SFLIGHTS2, and sets the column delimiter for that table to ';'. We close the connection and copy the table data from the return parameter res$DATA (see RFC_READ_TABLE in transaction SE37) into sflight. We also grab the field names returned in the FIELDS table and remove the trailing whitespace. Next, and this is where the importance of the ';' delimiter comes in: using the colsplit() function from the reshape package, we split the returned DATA into columns named by the FIELDS that RFC_READ_TABLE gave us.

 

Now we have a data.frame that looks a lot like the table SFLIGHTS2 when viewed in transaction SE16.

 

sflight <- cbind(sflight, FLIGHTNO = paste(sub("\\s+$", "",

                                           sflight$CARRID),sflight$CONNID, sep=""))

sflight$SEGMENT <- paste(sflight$AIRPFROM, sflight$AIRPTO, sep=" - ")

sflight$CARRNAME <- sub("\\s+$", "", sflight$CARRNAME)

sflight$DISTANCE <- as.numeric(lapply(sflight$DISTANCE,

                                      FUN=function (x) {sub("\\*","", x)}))

sflight$DISTANCE <- as.numeric(lapply(sflight$DISTANCE,

                                      FUN=function (x) {if (x == 0) NA else x}))

sflight[sflight$CARRNAME == 'Qantas Airways','DISTANCE'] <- 10258

 

This next chunk creates new vectors (columns): FLIGHTNO, combined from CARRID and CONNID, and SEGMENT, from AIRPFROM and AIRPTO; it also cleans the vectors CARRNAME and DISTANCE.

 

Now create some aggregated views, to generate visualisations from:

 

airline_avgocc <- aggregate(data.frame(SEATSMAX=sflight$SEATSMAX,

                                       SEATSOCC=sflight$SEATSOCC,

                                       OCCUPANCY=sflight$SEATSOCC/sflight$SEATSMAX),

                            by=list(carrname=sflight$CARRNAME), FUN=mean, na.rm=TRUE)

airline_sumocc <- aggregate(data.frame(SEATSOCC=sflight$SEATSOCC),

                            by=list(carrname=sflight$CARRNAME), FUN=sum, na.rm=TRUE)

 

Show a pie chart - the sum of airline occupancy as a share of the market:

 

x11()

lbls <- paste(airline_sumocc$carrname, "\n", sprintf("%.2f%%",         

        (airline_sumocc$SEATSOCC/sum(airline_sumocc$SEATSOCC))*100), sep="")

pie3D(airline_sumocc$SEATSOCC, labels=lbls,

      col=rainbow(length(airline_sumocc$carrname)),

      main="Occupancy sum share for Airlines", explode=0.1)

 

pie.png

 

 

Create a stacked bar plot with colours and a legend, showing a summary of occupancy by segment and carrier. To do this we need to generate a summary (aggregate), fill in the missing combinations of the grid, and then swap the orientation of rows and columns to present to the plotting functions:

 

d <- aggregate(SEATSOCC ~ CARRNAME:SEGMENT, data=sflight, FUN=sum, na.rm=FALSE)

d2 <- with(d, expand.grid(CARRNAME = unique(d$CARRNAME), SEGMENT = unique(d$SEGMENT)))

airline_sumsegocc <- merge(d, d2, all.y = TRUE)

airline_sumsegocc$SEATSOCC[is.na(airline_sumsegocc$SEATSOCC)] <- 0

# switch orientation to segment * carrier

counts <- data.frame(unique(airline_sumsegocc$CARRNAME))

for (a in unique(airline_sumsegocc$SEGMENT)) 

    {counts <- cbind(counts,

     airline_sumsegocc$SEATSOCC[which(airline_sumsegocc$SEGMENT == a)]);}

counts[,1] <- NULL

colnames(counts) <- unique(airline_sumsegocc$SEGMENT);

rownames(counts) <- unique(airline_sumsegocc$CARRNAME);

x11()

barplot(as.matrix(counts), main="Total Occupancy by Segment and Carrier",

        ylab="Number of Seats",

        col=rainbow(dim(counts)[1]),

        ylim=c(0, 15000), legend = rownames(counts))

 

barchart.png

 

 

Lastly - we create a simple performance indicator using a time series comparison of different airlines:

 

# performance by airline over time - dollars per customer KM

sflight$FLDATEYYMM <- substr(sflight$FLDATE, start=1, stop=6)

d <- aggregate(data.frame(PAYMENTSUM=sflight$PAYMENTSUM,

                          SEATSOCC=sflight$SEATSOCC,

                          DISTANCE=sflight$DISTANCE,

                          PERFORMANCE=(sflight$PAYMENTSUM/(sflight$SEATSOCC *

                             sflight$DISTANCE))),

               by=list(carrname=sflight$CARRNAME,

                       fldateyymm=sflight$FLDATEYYMM),

               FUN=sum, na.rm=TRUE)

d2 <- with(d, expand.grid(carrname = unique(d$carrname),

                          fldateyymm = unique(d$fldateyymm)))

agg_perf <- merge(d, d2, all.y = TRUE)

agg_perf <- agg_perf[order(agg_perf$carrname, agg_perf$fldateyymm),]

agg_perf$PERFORMANCE[is.na(agg_perf$PERFORMANCE)] <- 0

 

# create time series and plot comparison

perf_series <- data.frame(1:length(unique(agg_perf$fldateyymm)))

for (a in unique(agg_perf$carrname))

    {perf_series <- cbind(perf_series,

       agg_perf$PERFORMANCE[which(agg_perf$carrname == a)]);}

perf_series[,1] <- NULL

colnames(perf_series) <- unique(agg_perf$carrname);

# convert all to time series

for (a in length(unique(agg_perf$carrname)))

    {perf_series[[a]] <- ts(perf_series[,a], start=c(2011,5), frequency=12)}

# plot the first and line the rest

x11()

ts.plot(ts(perf_series, start=c(2011,5), frequency=12),

           gpars=list(main="Performance: dollar per customer KM",

                      xlab="Months",

                      ylab="Dollars",

                      col=rainbow(dim(perf_series)[2]), xy.labels=TRUE))

legend(2012.05, 3.2, legend=colnames(perf_series),

                     col=rainbow(dim(perf_series)[2]), lty=1, seg.len=1)

 

timeseries.png

 

 

Hopefully, I've shown that there is a lot that can be done with R, especially in the area of ad-hoc advanced business intelligence and data analysis. I have not really even scratched the surface in terms of what R can offer for advanced statistical analysis and modelling; that is where the true wizards live.

 

I would love to hear back from anyone who tries RSAP out - issues and user experiences alike.

 

Edit:

I should note that Alvaro has been here before me with OData/JSON and R - http://scn.sap.com/community/netweaver-gateway/blog/2012/04/06/when-r-met-sap-gateway

 

References:

Basic R Tutorials



Posted by AgnesKim
Technique/SAP HANA2012. 5. 31. 09:42

Secure Sockets Layer (SSL) with HANA and BI4 Feature Pack 3 requires configuration on both the HANA server and the BI4 server. The following steps show how to configure SSL using OpenSSL and a certificate obtained from a Certificate Authority (CA).

 

OpenSSL Configuration

 

This blog covers the OpenSSL Crypto Library; however, HANA can also be configured using the SAP Crypto Library.

 

Confirm that OpenSSL is installed

 

shell> rpm -qa | grep -i openssl

openssl-0.9.8h-30.34.1

libopenssl0_9_8-32bit-0.9.8h-30.34.1

openssl-certs-0.9.8h-27.1.30

libopenssl0_9_8-0.9.8h-30.34.1

 

Confirm that OpenSSL is 64-bit

 

shell> file /usr/bin/openssl

openssl: ELF 64-bit LSB executable, x86-64, version 1 (SYSV), for GNU/Linux 2.6.4, dynamically linked (uses shared libs), stripped

 

Confirm there is a symlink to the libssl.so file

 

ssl_5.png

 

If not, create one as the root user

 

shell> ln -s /usr/lib64/libssl.so.0.9.8 /usr/lib64/libssl.so

 

SSL Certificates

 

This blog won't go into the details of how SSL works, but in generic terms you'll need to create a Certificate Signing Request (CSR) on the HANA server and send it to a CA. In return, the CA will give you a signed certificate and a copy of their Root CA Certificate. These then need to be set up with HANA and with the BI4 JDBC and ODBC drivers.

 

Creating the Certificate Signing Request

 

shell> openssl req -new -nodes -newkey rsa:2048 -keyout Server_Key.key -out Server_Req.csr -days 365

 

Fill out the requested information according to your company:

 

-----
You are about to be asked to enter information that will be incorporated
into your certificate request.
What you are about to enter is what is called a Distinguished Name or a DN.
There are quite a few fields but you can leave some blank
For some fields there will be a default value,
If you enter '.', the field will be left blank.
-----
Country Name (2 letter code) [AU]:
State or Province Name (full name) [Some-State]:
Locality Name (eg, city) []:
Organization Name (eg, company) [Internet Widgits Pty Ltd]:
Organizational Unit Name (eg, section) []:
Common Name (eg, YOUR name) []:
Email Address []:

Please enter the following 'extra' attributes
to be sent with your certificate request
A challenge password []:
An optional company name []:

 

This will create two files

 

  • Key: Server_Key.key
  • CSR: Server_Req.csr

 

The CSR needs to be sent to the CA, which in turn will give you a signed certificate and their Root CA Certificate.

 

Convert the Root CA Certificate to PEM

 

The Root CA Certificate may come in DER format (.cer extension), but HANA requires the certificate in PEM format. Therefore we need to convert it using the command:

 

shell> openssl x509 -inform der -in CA_Cert.cer -out CA_Cert.pem
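If you want to sanity-check this conversion without waiting on a real CA, you can round-trip a throwaway self-signed certificate through DER and back. This is only a sketch: the file names are placeholders, and your real CA_Cert.cer comes from the CA.

```shell
# Create a throwaway self-signed cert to play with (stand-in for a real CA cert).
openssl req -x509 -newkey rsa:2048 -nodes -keyout demo.key -out demo.pem \
        -days 1 -subj "/CN=demo"
# PEM -> DER (the format a CA often delivers, with a .cer extension)...
openssl x509 -in demo.pem -outform der -out demo.cer
# ...and DER -> PEM, exactly as in the command above.
openssl x509 -inform der -in demo.cer -out roundtrip.pem
# The round-tripped PEM should still carry the same subject.
openssl x509 -in roundtrip.pem -noout -subject
```

The same `openssl x509 -noout -subject` check works on the real CA_Cert.pem to confirm the conversion produced a readable certificate.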

 

HANA SSL Configuration

 

Copy both the signed certificate and the Root CA Certificate to the HANA server. For HANA SSL to work, we need to create two files:

 

  • key.pem
  • trust.pem

 

The key.pem key store file contains the certificate chain, which includes your server's key (Server_Key.key), the CA-signed certificate, and the Root CA Certificate. The trust.pem trust store file contains only the Root CA Certificate.

 

Create the key.pem key store and the trust.pem trust store

 

key.pem

 

shell> cat Server_Cert.pem Server_Key.key CA_Cert.pem > key.pem

 

trust.pem

 

shell> cp CA_Cert.pem trust.pem
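Before wiring these files into HANA, it's worth checking that the signed certificate and the private key concatenated into key.pem actually belong together: their RSA moduli must be identical. A sketch, using a freshly generated stand-in pair (in real life, Server_Key.key is the key you created with the CSR and Server_Cert.pem is the PEM form of the certificate the CA returned):

```shell
# Stand-in pair for the demo - replace with your real key and signed cert.
openssl req -x509 -newkey rsa:2048 -nodes \
        -keyout Server_Key.key -out Server_Cert.pem -days 1 -subj "/CN=demo"
# Digest of the certificate's public-key modulus...
openssl x509 -in Server_Cert.pem -noout -modulus | openssl md5
# ...must equal the digest of the private key's modulus.
openssl rsa -in Server_Key.key -noout -modulus | openssl md5
```

If the two digests differ, the chain in key.pem is broken and HANA's SSL handshake will fail.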

 

Copy the files to the user's home directory

 

In the user's home directory, create a .ssl directory and place both the key.pem and trust.pem files there.

 

ssl_6.png

 

Configure the certificates in HANA

 

Once the key.pem and trust.pem files have been created they need to be configured in HANA.

 

In HANA Studio go to

 

  • Administration
  • Configuration tab
  • Expand indexserver.ini
  • Expand communication
  • Configure the entries related to SSL

 

ssl_!.png

 

Start and Stop HANA to pick up the SSL configuration

 

  • HDB stop
  • HDB start

 

HANA Studio Configuration

When setting up the connection to HANA, check the option 'Connect using SSL', as seen below.

ssl_7.png

 

To confirm the connection has SSL, look for the lock icon on the server icon, as seen below.

 

ssl_8.png

 

BI4 Feature Pack 3 SSL Configuration

 

SSL in BI4 needs to be configured for the HANA connectivity you plan to use.

 

JDBC Configuration

 

For JDBC SSL configuration, we’ll need to add the trust.pem trust store to the Java Key Store (JKS) using the keytool utility provided by the JDK/JRE.  This is done via the command line.  Change the paths for your own configuration:

 

Add trust.pem to the JKS

 

C:\Documents and Settings\Administrator>"C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\win32_x86\jre\bin\keytool.exe" -importcert -keystore "C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\win32_x86\jre\lib\security\cacerts" -alias HANA -file trust.pem

 

You will be prompted for the keystore password.  The default password is: changeit

 

When prompted to 'Trust this certificate', enter yes. The alias can be any value; however, it must be unique within the keystore.

 

Confirm that your certificate has been added to the keystore

 

C:\Documents and Settings\Administrator>"C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\win32_x86\jre\bin\keytool.exe" -list -keystore "C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\win32_x86\jre\lib\security\cacerts" -alias HANA

 

If successful, you will see trustedCertEntry in the output, as below

 

ssl_11.png

 

Information Design Tool  (IDT) Configuration

 

In IDT, the connection needs to be set up with the JDBC driver property encrypt=true to make the connection use SSL when connecting to HANA.

 

idt.png

 

ODBC Configuration

 

Once the HANA client driver has been installed, you can set up an ODBC connection for HANA. To connect via SSL, check the box 'Connect using SSL', as below:

ssl_2.png

 

If you added any 'Special property settings', they won't be displayed in the driver configuration. To view them, launch the Windows Registry Editor and go to the key:

 

  • HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\ODBC\ODBC.INI\<Your Data Source Name>

 

ssl_4.png

 

Installing the CA Root Certificate

 

Depending on which CA signed your certificate, you may run into SSL errors. For example, in Crystal you may see this error:

 

cr1.png

 

To resolve this, install the CA Root Certificate allowing it to be trusted by the server.

 

  • Copy the CA Root Certificate to the machine where the error is coming from

 

  • Double click on the certificate and click 'Install Certificate'

cr5.png

  • Click next

 

cr4.png

  • Select the first option and click next

 

cr3.png

  • Click finish

cr6.png

 

Confirming if SSL is being used

 

Using a tool like Wireshark, the communication between the server and client can be traced to verify that SSL is being used, as seen below.

 

ssl_10.png




http://scn.sap.com/community/in-memory-business-data-management/blog/2012/05/30/ssl-with-hana-and-bi4-feature-pack-3?utm_source=twitterfeed&utm_medium=twitter

Posted by AgnesKim
Technique/SAP HANA2012. 5. 29. 21:32

Two major improvements related to Enterprise Data Warehousing can be achieved by migrating SAP NetWeaver BW to the SAP HANA database platform: (1) a dramatic improvement in data load times for DataStore Objects, the data containers within BW, and (2) significantly boosted performance when reporting from them. With the latest BI Content releases, SAP NetWeaver 7.30 BI Content 7.37 SP 01 or SAP NetWeaver 7.31 BI Content 7.47 SP 01, a large fraction (about two thirds) of the DataStore Objects delivered so far with BI Content have now been prepared for HANA-optimization.

 

When you copy these HANA-prepared DataStore Objects (DSOs) from the delivered version to the active version, they are created automatically as SAP HANA-optimized DataStore Objects, side-stepping the manual conversion step from standard to HANA-optimized DSO. The delivery of DataStore Objects prepared for HANA-optimization marks the next logical step towards reducing the total cost of administration, following the migration of all relevant data flows to SAP BW 7.x technology; that migration is an important prerequisite for HANA-optimization of the involved DataStore Objects (migrated data flows were delivered with SAP NetWeaver 7.30 BI Content 7.36 SP 02 or SAP NetWeaver 7.31 BI Content 7.46 SP 02). If a DataStore Object already contains data, you cannot avoid the manual conversion step to HANA-optimization. You therefore benefit most from the HANA-prepared DataStore Objects when you copy new data flows from the delivered version to the active version in your BW system powered by SAP HANA.

 

You can benefit from the automatic creation of HANA-optimized DataStore Objects during BI Content activation if you are using (1) the SAP HANA database as of release SAP HANA 1.0 Support Package 03, and (2) SAP NetWeaver BW 7.3 Support Package Stack 07, or SAP NetWeaver BW 7.3 including Enhancement Package 1, Support Package Stack 04.

 

Find the complete list of all 1025 HANA-optimized DataStore Objects attached to note 1708668.




http://scn.sap.com/community/data-warehousing/business-content-and-extractors/blog/2012/05/29/shipment-of-bi-content-for-sap-netweaver-bw-powered-by-sap-hana-datastore-objects-are-now-prepared-for-sap-hana-optimization?utm_source=twitterfeed&utm_medium=twitter

Posted by AgnesKim