Technique/Other 2014. 3. 14. 10:56




GKP 진출 국가별 주요 경제 동향(2013년 10월 ~ 12월).pdf (Key economic trends by GKP country of operation, October–December 2013)


Posted by AgnesKim
Technique/Other 2012. 7. 8. 09:55

RSAP, Rook and ERP

Posted by Alvaro Tejada Galindo  in Scripting Languages on Jul 6, 2012 4:39:03 AM

As I wrote in my blog Analytics with SAP and R (Windows version), we can use RSAP to connect to our ERP system and play with the data.
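For orientation, the connection details live in the sap.yml file that the script below reads; as far as I remember the package, RSAPConnect() can also take the same parameters directly as named arguments. A minimal sketch (host, system number, client and credentials are placeholders for your own system):

library("RSAP")

# Sketch only - replace the connection parameters with your own system's values.
conn <- RSAPConnect(ashost = "sapserver.example.com",   # application server host
                    sysnr  = "00",                      # system number
                    client = "001",
                    user   = "developer",
                    passwd = "secret",
                    lang   = "EN")

res <- RSAPInvoke(conn, "RFC_READ_TABLE",
                  list('QUERY_TABLE' = 'SCARR', 'DELIMITER' = ';'))
head(res$DATA)       # raw rows come back in the WA column, delimited by ';'
RSAPClose(conn)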

 

This time I wanted, of course, to keep exploring the capabilities of RSAP, but using something else. As everybody knows, I love micro-frameworks, and R is not an exception... gladly, Rook came to the rescue...

 

Rook is a simple web server that will run locally and will allow us to do some really nice things...enough talk...let's go to the source code...

 

RSAP_Rook.R

library("RSAP")
require("Rook")
library("reshape")     # provides colsplit(), used to split the ';'-delimited RFC_READ_TABLE rows
setwd("C:/Blag/R_Scripts")

conn = RSAPConnect("sap.yml")

# Read the carrier list (SCARR) once at startup to fill the carrier <select>
parms <- list('DELIMITER' = ';',
              'FIELDS' = list(FIELDNAME = list('CARRID', 'CARRNAME')),
              'QUERY_TABLE' = 'SCARR')
res <- RSAPInvoke(conn, "RFC_READ_TABLE", parms)
#RSAPClose(conn)
scarr <- res$DATA
flds <- sub("\\s+$", "", res$FIELDS$FIELDNAME)   # trim trailing blanks from the returned field names
scarr <- data.frame(colsplit(scarr$WA, ";", names = flds))

# Read the departure cities (SPFLI) for the second <select>
parms <- list('DELIMITER' = ';',
              'FIELDS' = list(FIELDNAME = list('CITYFROM')),
              'QUERY_TABLE' = 'SPFLI')
res <- RSAPInvoke(conn, "RFC_READ_TABLE", parms)
#RSAPClose(conn)
spfli <- res$DATA
flds <- sub("\\s+$", "", res$FIELDS$FIELDNAME)
spfli <- data.frame(colsplit(spfli$WA, ";", names = flds))
spfli <- unique(spfli)

# Query destination city and flight time for the chosen carrier/departure city
get_data <- function(p_carrid, p_cityfrom){
  parms <- list('DELIMITER' = ';',
                'FIELDS' = list(FIELDNAME = list('CITYTO', 'FLTIME')),
                'OPTIONS' = list(TEXT = list(p_carrid, p_cityfrom)),
                'QUERY_TABLE' = 'SPFLI')
  res <- RSAPInvoke(conn, "RFC_READ_TABLE", parms)
  RSAPClose(conn)                                 # close the RFC connection once the detail query has run
  spfli <- res$DATA
  flds <- sub("\\s+$", "", res$FIELDS$FIELDNAME)
  if(length(spfli$WA) > 0){
    spfli <- data.frame(colsplit(spfli$WA, ";", names = flds))
  }
  return(spfli)
}

# Rook application: renders the two selects and, on POST, plots the flight times
newapp <- function(env){
  req <- Rook::Request$new(env)
  res <- Rook::Response$new()
  res$write('<form method="POST">\n')
  res$write('<div align="center"><table><tr>')
  res$write('<td>Select a carrier: <select name=CARRID>')
  for(i in 1:length(scarr$CARRID)) {
    res$write(sprintf('<OPTION VALUE=%s>%s</OPTION>', scarr$CARRID[i], scarr$CARRNAME[i]))
  }
  res$write('</select></td><td>')
  res$write('Select a city: <select name=CITYFROM>')
  for(i in 1:length(spfli$CITYFROM)) {
    res$write(sprintf('<OPTION VALUE=%s>%s</OPTION>', spfli$CITYFROM[i], spfli$CITYFROM[i]))
  }
  res$write('</select></td>')
  res$write('<td><input type="submit" name="Get Flights"></td>')
  res$write('</tr></table></div>')
  res$write('</form>')

  if (!is.null(req$POST())) {
    p_carrid <- req$POST()[["CARRID"]]
    p_cityfrom <- req$POST()[["CITYFROM"]]
    flights_from <- paste('Distance in Flights from ', p_cityfrom, sep = '')

    # Build the RFC_READ_TABLE OPTIONS (WHERE clause) from the form values
    p_carrid <- paste('CARRID = \'', p_carrid, '\'', sep = '')
    p_cityfrom <- paste('AND CITYFROM =\'', p_cityfrom, '\'', sep = '')

    spfli <- get_data(p_carrid, p_cityfrom)

    if(length(spfli$CITYTO) > 0){
      # Plot the flight times and serve the PNG through the "pic" static handler below
      png("Flights.png", width = 800, height = 500)
      plot(spfli$FLTIME, type = "n", axes = FALSE, ann = FALSE)
      lines(spfli$FLTIME, col = "blue")
      points(spfli$FLTIME, pch = 21, bg = "lightcyan", cex = 1.25)
      box()
      xy <- length(spfli$CITYTO)
      axis(2, col.axis = "blue", las = 1)
      axis(1, at = 1:xy, lab = spfli$CITYTO, col.axis = "purple")
      title(main = flights_from, col.main = "red", font.main = 4)
      dev.off()
      res$write("<div align='center'>")
      res$write(paste("<img src='", server$full_url("pic"), "/", "Flights.png'", "/>", sep = ""))
      res$write("</div>")
    }else{
      res$write("<p>No data to select...</p>")
    }
  }
  res$finish()
}

# Register the Rook app plus a static file handler that serves the generated PNG
server = Rhttpd$new()
server$add(app = newapp, name = "Flights")
server$add(app = File$new("C:/Blag/R_Scripts"), name = "pic")
server$start()
server$browse("Flights")

 

This is the result...

 

RSAP_Rook_001.png

RSAP_Rook_002.png

RSAP_Rook_003.png

RSAP_Rook_004.png

 

As you can see, we're getting the data from SAP to fill both SELECTs and then run the query. We generate a PNG graphic plotting the flight time from the selected departure city to each destination city, and then reference it from our web page to show it on the screen.

 

As you can see, RSAP gives us a lot of opportunities that we can take advantage of with a bit of effort and imagination. Hope this boosts your R interest.



http://scn.sap.com/community/scripting-languages/blog/2012/07/06/rsap-rook-and-erp?utm_source=twitterfeed&utm_medium=twitter

Posted by AgnesKim
Technique/Other 2012. 5. 10. 20:49

Customizing Logon Page on Portal 7.3

Posted by purav mehta in SAP NetWeaver Portal on May 10, 2012 12:59:15 PM

Please find below the detailed steps for customizing the logon page on Portal 7.3.

 

1. Locate the WAR file

 

The first step is to get the WAR file delivered by SAP for the logon page so that we can customize it.

 

 

Copy the WAR file tc~sec~ume~logon~ui.war to your local machine from:

<Installation drive>:\usr\sap\<SID>\J00\j2ee\cluster\apps\sap.com\com.sap.security.core.logon\servlet_jsp\logon_ui_resources\tc~sec~ume~logon~ui.war

 

 

2. Import the WAR file

 

Next we have to import the WAR file into NWDS by going to: File --> Import --> Web --> WAR File

 

1.jpg

 

     Select the WAR file from the local system.

 

2.jpg

 

 

    As the EAR format is what gets deployed on the Java EE server, a corresponding EAR project has to be created. For this, check the “Add project to an EAR” checkbox as above and specify a suitable name in “EAR project name” based on the WAR project name.

 

   Click Finish to create both WAR and EAR projects.

 

 

 

3.jpg

 

    Expand the WAR project.

 

 

4.jpg

 

At this point you will notice errors in the project. To remove these errors follow the next step.

 

 

3. Adding the required JAR file to remove the errors

 

 

     a. Next you need to locate the JAR file “tc~sec~ume~logon~logic_api.jar”, on which the WAR file depends, in the following location:
     <drive>\usr\sap\<SID>\J00\j2ee\cluster\apps\sap.com\com.sap.security.core.logon\servlet_jsp\logon_app\root\WEB-INF\lib

 

    

     Copy the “tc~sec~ume~logon~logic_api.jar” file to the WebContent\WEB-INF\lib folder of the WAR project in NWDS.

 

5.jpg

 

    b. This JAR file also has to be added to the build path of the WAR project.

         Right Click the WAR project and select Build Path --> Configure Build Path.

 

6.jpg

 

 

     c. Click on the Libraries tab, click on “Add External JARs”, select the JAR file “tc~sec~ume~logon~logic_api.jar” from the local system, and click “Add” to get the following screen:

 

7.jpg

 

Once done, you will notice that all the errors are gone!

 

4. Make Changes to Layout

 

     a. Now it's time to start making the desired changes to the layout. In our example we are changing the branding image on the logon screen. We have copied the image “hearts.jpg” to the folder WebContent\layout.

 

 

 

8.jpg

 

The SAP-delivered image branding-image-portals.jpg has dimensions of 290x360 px. If you select a bigger image, it will get truncated to these dimensions. To change the dimensions, you need to edit the urBrandImage element in the CSS file:

 

 

urBrandImage{overflow:hidden;width:290px;height:360px}

 

 

 

b. After the changes have been made, we need to make sure that the WAR project is updated in the EAR project and the latest changes are picked up. For this, right-click on the WAR project and select Java EE Tools --> Update EAR Libraries.

 

 

9.jpg

 

 

 

5. Configuring deployment descriptors

         

          Next we need to configure two deployment descriptors of the EAR application, as below:

         

          a. application-j2ee-engine.xml

          b. application.xml

 

 

10.jpg

 

     a.  Configuring application-j2ee-engine.xml

 

 

 

        In the EAR, view the General tab of the file <project_name>/EARContent/META-INF/application-j2ee-engine.xml.

 

          i. Enter a provider name for your application. This is usually the domain name of the client. The provider name defines the namespace where your applications reside on the AS Java. If you enter “example.com”, the application deploys to the following path: <ASJava_Installation>/j2ee/cluster/apps/example.com/<project_name>

 

        ii. Next we need to add a reference to the standard application com.sap.security.core.logon. Choose References and choose + with the quick info text Add element.

 

         iii.  Choose Create new and enter the required data.

 

   

Reference Data for the Logon Application

  • Reference target: com.sap.security.core.logon
  • Reference type: hard
  • Reference target type: application
  • Provider name: sap.com

 

11.jpg

 

This will generate the XML in the background, which can be displayed in the Source tab:

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<application-j2ee-engine
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:noNamespaceSchemaLocation="application-j2ee-engine.xsd">
      <reference reference-type="hard">
            <reference-target provider-name="sap.com" target-type="application">com.sap.security.core.logon</reference-target>
      </reference>
      <provider-name>newLogon.com</provider-name>
</application-j2ee-engine>

 

  b. Configuring application.xml

 

In the EAR, edit the file <project_name>/EARContent/META-INF/application.xml and define the URL alias for your custom logon UI.

Double-click on application.xml and go to the Modules tab. Select the WAR file and enter the “Context Root” field, for example: new_logon.

 

12.jpg

 

We have to provide this alias name later in NWA so please make a note of it.
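For reference, the resulting Modules entry in application.xml looks roughly like this (a sketch only; the web-uri value is assumed to match the name of the imported WAR project and may differ in your setup):

<module>
      <web>
            <web-uri>tc~sec~ume~logon~ui.war</web-uri>
            <context-root>new_logon</context-root>
      </web>
</module>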

 

 

 

6. Creating the deployable EAR file

 

     Next we need to create a deployable EAR file. For this, right-click on the EAR project and select Export --> SAP EAR File.

 

13.jpg

 

7. Deploying the EAR file

 

     Right-click on the EAR project and select Run As --> Run on Server.

     Enter the credentials of the server, and the file will be deployed with a success message.

     You might get an error screen in NWDS after deployment, as shown below, but you can ignore it.

 

 

 

14.jpg

 

8. Configuring UME properties in NWA

 

     Navigate to the following URL to modify the UME properties through NetWeaver Administrator:

     http://<host>:<port>/nwa/auth

 

     a. Change the property Alias of the application for customizing login pages (ume.logon.application.ui_resources_alias) to the custom alias “new_logon”, which we entered earlier as the Context Root in application.xml.

 

 

     b.  Change the property Path or URL to the branding image (ume.logon.branding_image) to “layout/hearts.jpg”

 

 

15.jpg

16.jpg

 

 

Hurray!!!  We have successfully customized the Logon Screen …

 

 

9. The next aim is to have a custom text or notice on the logon page.

 

Please add the following code after line 44 in the logon.jsp.

 

<!-- ********************************************* -->
<!-- disclaimer notice                              -->
    <tr>
      <td class="urLblStdNew">
        <span><b>Notice for All Users</b>
          <br><br>Paste your content here.
        </span>
      </td>
    </tr>
<!-- ********************************************* -->

 

Save the new values and restart the portal server.

 

 

17.jpg

 

 

18.jpg

 

This finishes (or rather begins) our journey with the customization of the logon page… !!!




http://scn.sap.com/community/netweaver-portal/blog/2012/05/10/customizing-logon-page-on-portal-73?utm_source=twitterfeed&utm_medium=twitter

Posted by AgnesKim
Technique/Other 2011. 12. 1. 20:16

The Benefits of an Integrated Analytics Framework
Rohan Nallanickan
Company: SAP
Posted on Nov. 30, 2011 10:17 AM in Analytics, Business Intelligence (BusinessObjects)

 
 

Introduction

The terms business analytics and business intelligence are often used interchangeably. Though they may at first seem similar, each has unique objectives and approaches. Business intelligence (BI) uses transactional and historical data to determine trends, analyze what has happened in a business, and report on outcomes. Business analytics, in contrast, creates a performance management framework and then uses data to drive predictive assessment, often via data mining, statistical modelling, and scenario analysis. And while the need for business intelligence is usually recognized and addressed, few organizations have yet recognized the need to integrate analytics into the corporate culture.

The result, if analytics are considered at all, is too often a fragmented and disjointed process like the one illustrated in Figure 1. Here analytical applications and business processes fail to interact – leading to both data-management and cultural challenges throughout the BI environment.


Figure 1: Typical BI and advanced analytics architecture

Disadvantages of current analytic frameworks

Siloed applications and processes

Analytical applications are designed to answer specific questions, but often have scarce or tenuous links to other business areas – so output from an analytic application may not be reflected in financial forecasts or operations. For example, a campaign management application generates optimized campaigns for a particular product. The resulting increased demand, however, may not be linked to inventory management and forecasting applications – resulting in an out-of-stock situation on a particular promotional product.

And analytic insights may not link to a business’s strategic plan. For example, an analytic application might identify a market opportunity for a new product, yet that opportunity never gets incorporated into the strategic plan. Conversely, analysis might identify a promising new market opportunity for an existing product – at the same time the strategic plan is phasing out that very product.

Meanwhile, a lack of write-back mechanisms may mean that insights from analytic applications are not automatically captured in transactional systems, making it difficult to close the loop from insight to action.

Finally, multiple, disjointed systems come with significant additional maintenance costs, plus the significant time and effort required to reconcile data.

Inconsistent data

A lack of consistent data definitions and business rules across BI and analytic applications – say from supply chain analytics to financial analytics – makes it difficult to compare and act on results. For example, a company’s sales and commercial departments may define “sales revenue” differently, compromising data aggregation and analysis.

In addition, analytical models and their results may not be consistently persisted in corporate data repositories, making it difficult to provide a historical record for auditing and corporate learning. When analytics are performed locally within a specific tool, models and results are not retained in the data warehouse for future reference.

Sceptical business users

Operating independently, siloed analytic teams can push insights to end users with little to no involvement from actual decision makers in the business. Without collaboration, such teams miss valuable input that could drive insight and generate buy-in. Meanwhile, business users are often sceptical of analytic results that they have not helped generate.

Figure 2 illustrates the impact of disjointed processes on the business. BI provides answers to “what happened?” while advanced analytics suggests improvements and answers “what if” questions. Ideally, the two processes complement each other – but if disconnected, they can actually work against each other. To ensure smooth operational execution, a business needs an advanced analytics system built on, or in conjunction with, its BI framework.


 

Figure 2: Current business processes for using BI and advanced analytics

Benefits of an integrated framework

As a best practice, SAP recommends an integrated framework that combines analytical applications with traditional BI solutions, underpinned by a holistic performance analytics framework. As the example in Figure 3 shows, such an integrated architecture has several benefits – better strategic alignment, consistent data definitions and use, and business users who embrace analytics to accelerate performance.


Figure 3: Future state best-of-breed architecture for integrated BI and advanced analytics

Strategically aligned application architecture and business processes  

If you start with a strategic top-down blueprint, all your BI and analytics applications can be architected according to a coherent set of solution requirements, scope, and vision, factoring in all inter-dependencies. Such a blueprint could potentially enable drill down from a strategic KPI at the top level of a balanced scorecard to the lowest degree of operational detail – describing, for example, how KPIs relate to operational metrics at the user level. A top-down analytics environment permits close alignment of a company’s analytics architecture with its enterprise performance management and strategic planning processes. Adding advanced analytics to the mix could also potentially enable you to forecast your performance under a variety of business scenarios. 

The top-down approach also identifies and builds in interdependencies among business areas, helping to balance worthwhile, but often conflicting, objectives of different business functions. For example, a bank’s sales team may want to increase the number of new loan customers while its credit department works to increase loan portfolio quality – objectives that could easily conflict. An integrated suite of analytic applications can eliminate such conflicts by optimizing decision making across conflicting constraints, ensuring smooth operational execution.  

Meanwhile, an integrated feedback loop can ensure that insights derived from an analytic system are picked up by the transactional system, so those insights can be acted on. For example, customers identified as profitable with a promising future lifetime value can be tagged in customer service applications so they receive higher service levels, increasing satisfaction and leading to greater purchase and retention rates.

Consistent and more accurate data

In this model, all insights are acted upon – and all actions are analyzed. Unlike current analytical applications in which outputs and insights are not persisted, in an integrated approach, new analytic insights are written back to the data warehouse each time. This provides a historical base for past decisions, improving both traceability and auditability. New records also become assets to corporate memory, enabling future learning.

In an integrated environment, too, all BI and analytics applications use data from the same corporate data warehouse and operational data stores, ensuring consistent data definitions throughout the company. Data sourced from data warehouses also goes through standard cleansing and quality processes – improving data quality while drastically reducing both data inconsistency and lack of confidence in analytic outputs.

Collaborative culture 

The integrated approach enables a single umbrella team – such as an analytics competency center – to own all analytics and BI applications and initiatives within the organization, with sub-teams dedicated to specific business areas. This approach facilitates a culture that more closely integrates analytics into standard operational processes, while fostering collaboration between line of business users and the analytics team – to create analytics that truly improve performance.

In a best practice environment, business users interact with the analytics team to define the questions that need answers and validate analytic insights. Business users then feed the real-world results back to the analytics team, further fine-tuning the process. This culture change benefits both teams as well as the organization as a whole.

Figure 4 shows how insights from embedded analytics can be disseminated to transform a company into an effective learning organization.


Figure 4: An integrated framework for BI and advanced analytics supports best-of-breed business processes

Conclusion

Performance and insight optimization services from SAP can help you create an integrated analytics framework. Backed by deep industry expertise, our team of experts in data mining, mathematical modelling, business intelligence, and performance analytics can help design an analytics strategy and integrated analytics roadmap, blueprint, and target architecture. SAP experts can then help identify, design, and deploy the specific analytical applications that best support your business objectives. 

Rohan Nallanickan is a Performance Analytics Principal within the Performance & Insight Optimization hub of SAP.


http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/27583%3Futm_source%3Dtwitterfeed%26utm_medium%3Dtwitter%26utm_campaign%3DFeed%253A+SAPNetworkWeblogs+%2528SAP+Network+Weblogs%2529

Posted by AgnesKim
Technique/Other 2011. 11. 29. 13:36

Using reCaptcha at SAP Portal 7.3 Logon Page
Erhan Keseli
Posted on Nov. 28, 2011 05:19 AM in Enterprise Portal (EP)

 
 

If you have a portal which is exposed to the internet, you may want to use a captcha to keep bots out. So in this blog I will implement reCaptcha. Why did I choose reCaptcha? Because it is easy to implement and you don't need to do much to get it working. Let's do it!

First you have to modify the logon page. You can find the solution here: http://nwturk.com/blog/2011/06/06/changing-logon-page-on-netweaver-7-3/

You have to import the reCaptcha JAR files (link). After importing the files, modify the logonPage.jsp file for reCaptcha.

-Import reCaptcha:

<%@ page import="net.tanesha.recaptcha.ReCaptcha" %>
<%@ page import="net.tanesha.recaptcha.ReCaptchaFactory" %>

-Add the reCaptcha code for displaying the captcha. You can customize it for your needs:

<%
	ReCaptcha c = ReCaptchaFactory.newReCaptcha("your public key", "your private key", false);
	out.print(c.createRecaptchaHtml(null, null));
%>

Be careful to add this code block between <sap:form type="logon"> and </sap:form>, roughly as sketched below.
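The placement inside logonPage.jsp looks roughly like this (the rest of the logon form markup is trimmed; only the position of the scriptlet matters):

<sap:form type="logon">
    <%-- ... existing logon form fields ... --%>
    <%
        ReCaptcha c = ReCaptchaFactory.newReCaptcha("your public key", "your private key", false);
        out.print(c.createRecaptchaHtml(null, null));
    %>
</sap:form>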

We are done with this part. Now it is time to implement the login module. You can get more information about login modules from this link. Once you have read up on them, implement the class and add a new method to read the request parameters:

	private String getRequestValue(String parameterName)
		throws LoginException {

		HttpGetterCallback httpGetterCallback = new HttpGetterCallback();
		httpGetterCallback.setType(HttpCallback.REQUEST_PARAMETER);
		httpGetterCallback.setName((String) parameterName);

		String value = null;

		try {
			_callbackHandler.handle(new Callback[] { httpGetterCallback });

			String[] arrayRequestparam =
				(String[]) httpGetterCallback.getValue();

			if (_decodeRequestParameter) {
				value = URLDecoder.decode(arrayRequestparam[0], "UTF-8");
			} else {
				value = arrayRequestparam[0];
			}

		} catch (UnsupportedCallbackException e) {

			return null;

		} catch (IOException e) {
			throwUserLoginException(e, LoginExceptionDetails.IO_EXCEPTION);
		}

		return value;
	}

You can call the method with the reCaptcha parameter names, for example: String challengefield = getRequestValue("recaptcha_challenge_field");

You also need the client IP address. Here is the method to get it:

	private String getIPAddress(){
		String clientIp = "";
		try{
			HttpGetterCallback hgc = new HttpGetterCallback();
			// set the callback type before handing it to the callback handler
			hgc.setType(HttpCallback.CLIENT_IP);
			_callbackHandler.handle(new Callback[] { hgc });
			clientIp = (String) hgc.getValue();
		}catch(Exception ex){
			// fall through and return an empty string if the IP cannot be determined
		}
		return clientIp;
	}

 

If you have a reverse proxy in front of the portal, you will get the proxy's IP address here, so you have to configure it to pass the client's real IP address on; a sketch of evaluating the usual forwarding header follows.
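A minimal sketch of that evaluation, assuming your release lets you read request headers through HttpGetterCallback (the HttpCallback.HEADER constant and the returned value type are assumptions; check the security API documentation of your version):

	// Hypothetical helper: prefer the first X-Forwarded-For entry when a reverse proxy is in front of the portal.
	private String getClientIPBehindProxy() {
		String forwardedFor = null;
		try {
			HttpGetterCallback hgc = new HttpGetterCallback();
			hgc.setType(HttpCallback.HEADER);      // assumption: callback type for request headers
			hgc.setName("X-Forwarded-For");
			_callbackHandler.handle(new Callback[] { hgc });
			Object raw = hgc.getValue();
			forwardedFor = (raw instanceof String[]) ? ((String[]) raw)[0] : (String) raw;
		} catch (Exception ex) {
			// ignore and fall back to the directly visible peer address
		}
		if (forwardedFor != null && forwardedFor.length() > 0) {
			// the header may contain a chain "client, proxy1, proxy2" - take the first entry
			return forwardedFor.split(",")[0].trim();
		}
		return getIPAddress();   // fall back to the peer address (the reverse proxy itself)
	}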
After you get the parameters for reCaptcha, check them:

import net.tanesha.recaptcha.ReCaptchaImpl;
import net.tanesha.recaptcha.ReCaptchaResponse;
ReCaptchaImpl reCaptcha = new ReCaptchaImpl();
reCaptcha.setPrivateKey("your_private_key");
String ipAdress = getIPAddress();
String challenge = getRequestValue("recaptcha_challenge_field");
String uresponse = getRequestValue("recaptcha_response_field");
ReCaptchaResponse reCaptchaResponse = reCaptcha.checkAnswer(ipAdress, challenge, uresponse);
if (reCaptchaResponse.isValid()) {
	// do your valid login work
}else{
    // do your invalid login work
}
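To put the pieces together, here is a rough skeleton of how such a login module could be organized. It is only a sketch against the plain JAAS LoginModule interface; in a real portal module you would typically extend SAP's login module base class and leave user/password authentication to the other modules in the stack (the class name CaptchaCheckLoginModule and the wiring shown are assumptions, not SAP code):

import java.util.Map;
import javax.security.auth.Subject;
import javax.security.auth.callback.CallbackHandler;
import javax.security.auth.login.LoginException;
import javax.security.auth.spi.LoginModule;
import net.tanesha.recaptcha.ReCaptchaImpl;
import net.tanesha.recaptcha.ReCaptchaResponse;

public class CaptchaCheckLoginModule implements LoginModule {

	private CallbackHandler _callbackHandler;

	public void initialize(Subject subject, CallbackHandler callbackHandler,
			Map<String, ?> sharedState, Map<String, ?> options) {
		// keep the callback handler; getRequestValue() and getIPAddress() from above use it
		_callbackHandler = callbackHandler;
	}

	public boolean login() throws LoginException {
		ReCaptchaImpl reCaptcha = new ReCaptchaImpl();
		reCaptcha.setPrivateKey("your_private_key");
		String challenge = getRequestValue("recaptcha_challenge_field");
		String uresponse = getRequestValue("recaptcha_response_field");
		ReCaptchaResponse answer = reCaptcha.checkAnswer(getIPAddress(), challenge, uresponse);
		if (!answer.isValid()) {
			// captcha failed - abort the logon before credentials are even checked
			throw new LoginException("Captcha verification failed");
		}
		return true;
	}

	public boolean commit() throws LoginException { return true; }
	public boolean abort()  throws LoginException { return true; }
	public boolean logout() throws LoginException { return true; }

	// getRequestValue(...) and getIPAddress() exactly as shown earlier in this post
	private String getRequestValue(String parameterName) throws LoginException { /* see above */ return null; }
	private String getIPAddress() { /* see above */ return ""; }
}

In a real setup the module would then be added to the portal's logon authentication stack in NWA, so a failed captcha stops the logon before the password modules run.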

Erhan Keseli is a Senior SAP Technical Consultant specialized in NetWeaver technology.


http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/27537%3Futm_source%3Dtwitterfeed%26utm_medium%3Dtwitter%26utm_campaign%3DFeed%253A+SAPNetworkWeblogs+%2528SAP+Network+Weblogs%2529

Posted by AgnesKim
Technique/Other 2011. 3. 27. 15:21

New Memory Analysis Tools for ABAP Web Dynpro
Stephen Pfeiffer
Company: SAP AG
Posted on Mar. 24, 2011 03:41 PM in ABAP

Before NetWeaver 7.0 EHP2, it used to be difficult to analyze memory problems that were really specific to an ABAP Web Dynpro application and not just related to some mammoth internal table deep down in the application logic.

The problem was that it was hard to see the forest for the trees in the Memory Objects ranking lists:

  • What objects belong to the ABAP Web Dynpro runtime?
  • What objects belong to the Web Dynpro application?  
  • What belongs to the backend infrastructure? 

The old memory analysis functions could not tell the difference between one type of object and another.   

Here’s an example:

 What memory objects belong to ABAP Web Dynpro?

In the New ABAP Debugger, the application-specific memory analysis tool for ABAP Web Dynpro brings some order to the situation and lets you analyze the memory consumption of your ABAP Web Dynpro application much more easily and efficiently.

 Memory Analysis Views for ABAP Web Dynpro

There is also an extra filter option that lets you hide memory objects that do not pertain to the Web Dynpro application and runtime. In this case, that includes objects that belong to the ESI framework (web service infrastructure).  Here, you can see the effect of the filter – it lets you concentrate exclusively on memory objects that belong to ABAP Web Dynpro.

 Memory Analysis Objects filtered for ABAP Web Dynpro

 

 

 

Stephen Pfeiffer   is a senior developer in the ABAP Infrastructure development group.



http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/23873%3Futm_source%3Dtwitterfeed%26utm_medium%3Dtwitter%26utm_campaign%3DFeed%253A+SAPNetworkWeblogs+%2528SAP+Network+Weblogs%2529

Posted by AgnesKim
Technique/Other 2011. 3. 27. 15:19

New Memory Analysis Features in NetWeaver 7.0 EHP2
Stephen Pfeiffer
Company: SAP AG
Posted on Mar. 24, 2011 03:42 PM in ABAP

 
 

In the old R/3 days, ABAP programmers rarely needed to analyze memory consumption in their programs.  The simple and robust ABAP memory model made it very hard for anyone to program a serious ABAP memory leak.

With the addition of more types of dynamic memory objects – not just internal tables, but also strings, anonymous data objects, boxed components – and a trend toward transactions that live longer, it’s now possible to write an ABAP program that can run into memory problems.

Since ABAP programmers can no longer ignore memory consumption, the ABAP Workbench has greatly enhanced the memory analysis tools in the New ABAP Debugger and in the Memory Inspector (transaction SMI or S_MEMORY_INSPECTOR).

This weblog showcases the enhancements that come with EHP2:

  • The Dominator Tree (or keep-alive tree) for containment hierarchies in memory objects – which runtime entity is keeping which memory object alive. This new analysis tool is offered in both the New ABAP Debugger and in the Memory Inspector. 
  • The Memory Object Explorer in the New ABAP Debugger, for navigating up and down through the memory objects in the keep-alive hierarchy of objects. Starting from any memory object, you can navigate up to its parents or down to its children, and you can take a look at the contents of each memory object.
  • A separate weblog shows you the new Application-specific memory analysis tools in the New ABAP Debugger – for looking at the memory consumption of ABAP Web Dynpro applications. There is also a similar tool for analyzing Web Service / HTTP applications.

You may also notice that doing memory analysis in the debugger is much more comfortable, and that we have made improvements in the user interface in the Memory Inspector as well. But we do not go into these UI changes in detail, nor will we look at the Memory Inspector transaction in this blog.

The New Features in Practice in the Debugger

You do memory analysis usually for one of two purposes:

  • You want to see how big your program – or objects in it - are in terms of memory consumption. Is it bigger than you expect?  Usually you’re in the New ABAP Debugger when you do this.
  • You want to see whether you have a memory leak. Does the memory consumption of your program change over time?  Or – how did it get so big that it dumped because of lack of memory – this question also sometimes comes up.

Usually, you’re comparing memory snapshots in the Memory Inspector (transaction SMI) when you’re doing this type of analysis.

Let’s see how the new memory analysis features work if you are in the debugger, just checking how much memory your program uses. Maybe you’re just being cautious in doing this checking – or maybe you’ve learned that a long-running transaction or service with lots of dynamic memory objects means you had better check the memory consumption before your customers find out for you. What you see here about the new memory analysis features applies just as well in the separate Memory Inspector transaction.

A Comfortable Quick Look at Memory Consumption – The Memory Analysis Tab

You don’t have to switch to a memory analysis tool anymore to see how much memory objects in your program are using.  There’s a new Memory Analysis tab in the new ABAP Debugger, on the Variable Fast Display. Just switch to the tab, click on a variable in your program, and you can see how much memory the object uses.

The Memory Analysis Tab in the ABAP Debugger 

Here we have two tables that have the same number of rows, and we happen to know that they contain the same data - except that IT_CUSTOMERS contains extra data.  Even so, IT_CUSTOMERS is smaller in memory.  How is the difference explained?   Does IT_CUSTOMERS offer a more efficient way to organize the table?  A closer look shows that we don’t understand how IT_CUSTOMERS is organized. The table statistics don’t seem to match the memory size.

  • The size of table IT_SCUSTOM is okay – 4637 rows of 16 fields with a row-length of 464 bytes is close to the Bound Used Memory (bound memory is the memory that would be freed if the table were cleared) plus some management overhead.   The table simply is that big.
  • The size of IT_CUSTOMERS is puzzling. It has the same number of rows as IT_SCUSTOM but a row length of 32 bytes. No way that the table body itself comes close to 1.7 million bytes of bound memory.

Analyzing Table IT_CUSTOMERS with the Memory Analysis Tool

To study table IT_CUSTOMERS, you activate the Memory Analysis Tool by choosing the New Tool or Replace Tool button, opening the Memory Management folder, and clicking on Memory Analysis.


The tool that was available before EHP2 was the Memory Objects view, shown below in its form in NetWeaver 7.0 EHP2. This tool shows you the memory objects of the program (the representations of dynamic variables in the memory management system) ranked by size. 

Memory Objects also shows you which variables reference each memory object.  But the view does not help us to understand the differences between the IT_CUSTOMERS and IT_SCUSTOM tables. Essentially we just have an unstructured list of memory objects ranked by size.


Here’s where the new Dominator Tree view (below) shows how useful it is. In this view, it’s immediately clear how IT_CUSTOMERS is structured. The rows are so short because they contain only an object reference. The CUSTOMER objects in the table contain, by the way, a further object, ACCOUNT. The CUSTOMER objects are also keeping several strings – variables NAME, ADDRESS, and CITY – alive. The bound storage of the table is so high because the table is keeping all of the customer object hierarchies alive. (Without the references from IT_CUSTOMERS, the ABAP garbage collector would clear the objects away and free the memory.)


The organization of the table is now clear. Where does the advantage in memory use of the IT_CUSTOMERS table come from, compared to table IT_SCUSTOM? 

To find the answer, you could copy the name of a CUSTOMER object into the clipboard, and start the Memory Object Explorer (available only in the debugger, not in the Memory Inspector). In the Memory Object Explorer (below), we can follow edges from the CUSTOMER object to its children, all of the objects that it references.

The Dominator Tree shows only strings and instances of classes that belong to the bound storage of each CUSTOMER class object. These are the objects that CUSTOMER keeps alive. The Memory Object Explorer, by contrast, also shows objects that are referenced by more than one CUSTOMER instance, memory objects that are in effect shared. Can it be that sharing of objects accounts for the reduced storage use?

We follow a CUSTOMER object to its referenced objects and then check some of the referenced objects themselves in the Explorer. COUNTRY looks like a good variable to check for sharing among multiple CUSTOMER objects. So we go from string {S:S9} ADDRESS-COUNTRY up the memory hierarchy – Higher-Level Memory Objects - to see the parents of this string, objects that reference this string.


And in fact, as you can see below, the more efficient storage use in table IT_CUSTOMERS than in IT_SCUSTOM could be due to massive sharing of string objects in memory.  There are many CUSTOMER objects from our table that reference the same COUNTRY string.


The IT_CUSTOMERS table design takes advantage of the fact that string variables that have the same content share a single memory object. (See ‘value semantics’ in the ABAP Online Help, transaction ABAPHELP or at help.sap.com.) ABAP gives a string or other value-semantic object its own memory object only on an as-needed basis – that is, when the string is changed.

In many applications, a table like this list of customers is rarely changed, but is used instead to guide processing. In this case, it may make sense to make use of memory object sharing to reduce storage consumption.  In this application, CUSTOMER attributes like the country-of-residence reference the same string, as long as the same country is being referenced.  You can check out whether the strategy is really worthwhile – when filling a table, when reading a table, in the event that object attributes are changed – by doing some testing of the alternatives in the ABAP Runtime Analysis, transaction SAT. (See the weblogs by Olga Dolinskaja on the new SAT.)

An advantage of doing memory analysis in the debugger is that you can see what the value of the variables is. This is something that you cannot do when you analyze memory snapshots in the Memory Inspector (transaction SMI).  A double click on string {S:S9} above shows us what country all of those customers inhabit:

 Displaying the Value of a Memory Object in the ABAP Debugger

But the Memory Inspector, in turn, has the advantage of being able to compare memory snapshots and show changes in memory use over time. That’s an invaluable capability for checking for leaks in a long-running ABAP application.

 

 

 

 

 

 

 

 

 

Stephen Pfeiffer   is a senior developer in the ABAP Infrastructure development group.


http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/23875%3Futm_source%3Dtwitterfeed%26utm_medium%3Dtwitter%26utm_campaign%3DFeed%253A+SAPNetworkWeblogs+%2528SAP+Network+Weblogs%2529

Posted by AgnesKim
Technique/Other 2011. 3. 27. 15:17

Difference between transactions RS12 / SM12 ?
Martin Maruskin
Company: self-employed
Posted on Mar. 25, 2011 01:35 PM in Enterprise Data Warehousing/Business Warehouse

URL: http://sapport.blogspot.com/2011/03/difference-between-transactions-rs12.html

It seems there is no difference, at least as of BW 3.0 and above. According to SAP Note 316329 (Master data table locked), locks on master data were handled differently in BW versions 2.0 – 2.1C: instead of the standard SAP lock mechanism, a BW-specific locking mechanism was used. This was removed in version 3.0 and later releases, so transaction RS12 remains in BW only for historical reasons. As you can see, the same ABAP report (RSENQRR2) is called by both transactions.

SM12

 

RS12

Martin Maruskin is an SAP NetWeaver BW certified consultant.


http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/23880%3Futm_source%3Dtwitterfeed%26utm_medium%3Dtwitter%26utm_campaign%3DFeed%253A+SAPNetworkWeblogs+%2528SAP+Network+Weblogs%2529

Posted by AgnesKim
Technique/Other 2011. 1. 27. 10:20

Step by Step Guide for Language Translation Tool

Sai Ram Reddy Neelapu    Article     (PDF 403 KB)     06 January 2011

Overview

This document helps people understand the steps involved in translating standard SAP screens and also shows how to change the descriptions of standard fields.

Posted by AgnesKim
Technique/Other 2011. 1. 27. 10:19

SAP Glossary available within the App Store
Florian Mueller
Company: Resource AG
Posted on Jan. 19, 2011 03:45 AM in Mobile

URL: http://itunes.apple.com/WebObjects/MZStore.woa/wa/viewSoftware?mt=8&ign-lr=Lockup_r2c1&id=402205056

SAP buzzword glossary released as iPhone App

During one of our technology evaluations we created an SAP Glossary iPhone application containing several "SAP buzzwords" and corresponding explanations - nothing big, but maybe useful for sales guys who pick up some buzzwords during lunch and want to look them up directly instead of talking about unknown technology...
If you want to provide additional content, feel free to contact us; we are willing to extend the contents, and feedback is taken seriously!


You can download the application from the App Store; it's called "Resource Sales Glossar".

(iTunes Link)

Currently the glossary is based on German; we are planning to provide an English version as soon as possible...


Cheers, Florian!

Florian Müller works as a solution architect for Resource AG (http://www.resource.ch) and is the founder of richability (http://www.richability.com).


http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/23032%3Futm_source%3Dtwitterfeed%26utm_medium%3Dtwitter%26utm_campaign%3DFeed%253A+SAPNetworkWeblogs+%2528SAP+Network+Weblogs%2529

Posted by AgnesKim