Technique/Others 2011. 1. 27. 10:00

An iPad/iPhone menu for SAP NetWeaver portal
Michael Nicholls (SAP Australia)
Posted on Jan. 26, 2011 02:11 PM in Enterprise Portal (EP)

 
 

Most users launch the SAP NetWeaver Portal in a desktop browser and get a menu that might look like this:

 

 

[screenshot: the standard portal menu in a desktop browser]

My aim was to build a portal component that could be used to create a menu system a bit like this:

[screenshot: the iPhone-style menu to be built]

When the user launches the portal component from Safari, they get a screen similar to this:

[screenshot: the component launched in Safari]

The user stores their username and password to access the SAP NetWeaver Portal using the Settings option:

[screenshot: the Settings screen]

Using the Save option saves the values and relaunches the URL with the user's username and password appended. The user must already have logged on to the SAP NetWeaver Portal successfully, so that they do not have an initial or expired password. Selecting an option leads through the different navigation levels until you get a screen that lets you actually start an iView:

[screenshot: the iView launch screen]

Creating the portal component in NWDS involved creating a new portal project; I called mine Mobile. There is a set of JavaScript code that I needed to use, and to make things easy I put it under the dist/images folder and called it Includeme.js. Here is the project structure:

[screenshot: the project structure in NWDS]

The main component is called SiteMap. It is of type AbstractPortalComponent, and we need to make it anonymous, so that it can still work even if the user has not yet been authenticated. The source for Includeme.js is here:

The code for the component is here:

A copy of the PAR file can be found here.

Michael Nicholls is an SAP NetWeaver trainer for SAP Australia.



http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/23182

Posted by AgnesKim
Technique/Others 2011. 1. 7. 12:43

A first (BW) issue...
Lars Breddemann (SAP Österreich)
Posted on Jan. 06, 2011 06:27 PM in Business Intelligence (BI), Run SAP, SAP NetWeaver Platform, Software Support and Maintenance

In my second week working in BW support I noticed a problem that was reported by customers again and again.
The customers complained that "database statistics update fails in process chains" or "brconnect run returns errors".
When checking the relevant log file entries for these actions, there was always something like:

ERROR at line 1:
ORA-20000: index "SAPR3"."/BIC/F100069~900"  or partition of such index is in unusable state
ORA-06512: at "SYS.DBMS_STATS", line 15181
ORA-06512: at "SYS.DBMS_STATS", line 15203
ORA-06512: at line 1

The more database-savvy customers or DBAs sometimes had a quick resolution at hand: rebuild the index.

And indeed, after rebuilding the affected index (as I realized, it was nearly always a ~900 index) the statistics could be gathered correctly.
Problem solved, or not?

Well, only until the next compression run for the infocube happened.

Infocube compression, wait, like infocube.zip or what?

The term 'compression' is a bit misleading here; 'to condense' seems more appropriate.
To understand this action, we have to take a look at how fact table data is managed in SAP BW.
If you're not yet familiar with the BW star schema (extended star schema), it might be a good idea to check the documentation or the BW-related blogs and wikis here on SDN.

The fact tables are used to store the actual business data that should be reported on. In the simplest case, this data is loaded from the source system (e.g. the ERP system) on a regular basis, say daily.
So we get new data in more or less large chunks or packets.
The BW term for this is 'request'.
A data request is a specific chunk of data whose records have only one thing in common: they have been loaded into the fact table together. [more on this]

Requests and what they are good for

This request-wise processing provides several options that are otherwise quite hard to achieve in data warehouse systems:

  • we can load just the data that has changed since the last load (delta loading)
  • we can check the data after each load for technical and quality issues and decide whether or not it should appear in the reporting
  • in case some of the data was not imported correctly, we can easily delete it again without impairing the reports that use the already available data.

If you think about it, this means that information about the same business objects (let's say: direct sales to high-level customers in southern Germany) can be loaded several times.

If you load the sales transaction data every day and later report on it on a weekly basis, you'll get the sum of all sales aggregated over a week - and use seven requests' worth of data (one request per day, seven days a week).

But as we see, for our reporting requirement (sales on a weekly basis) it's actually not necessary to keep all the data load packets (requests).
Once we are sure that the data is technically and quality-wise OK, we might just as well sum up the data to the weekly level, store this and throw away the seven requests.
This is pretty much what the compression of InfoCubes is about.
In SAP BW it is implemented based on two tables:
the F-facttable, to which all data load requests go, and the E-facttable, which contains the pre-aggregated (condensed) data. [more on this]
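
To make the idea concrete, here is a minimal sketch with purely hypothetical table and column names (the real F- and E-facttables are generated by BW and look quite different):

-- Hypothetical, simplified facttables (illustration only, not the BW layout):
CREATE TABLE f_facttable (
    key_request   NUMBER,      -- request (data package) dimension key
    key_customer  NUMBER,
    calweek       NUMBER,
    amount        NUMBER
);

CREATE TABLE e_facttable (
    key_customer  NUMBER,
    calweek       NUMBER,
    amount        NUMBER
);

-- Condensing: aggregate the verified requests up to the reporting level
-- (here: per customer and week), dropping the request key ...
INSERT INTO e_facttable (key_customer, calweek, amount)
SELECT key_customer, calweek, SUM(amount)
FROM   f_facttable
GROUP  BY key_customer, calweek;

-- ... after which the condensed requests have to be deleted from the
-- F-facttable (which is where partitioning comes into play below).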

Getting closer...

At reporting time, the SAP BW OLAP engine knows that our data is stored in two tables.
So for every BW query against an InfoCube, we usually see TWO nearly identical SQL statements that differ only in the facttable that is actually used.
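
Schematically, with the hypothetical tables from the sketch above (the real statements are generated by the OLAP processor and are far more complex):

-- One logically identical selection per facttable:
SELECT key_customer, SUM(amount) AS amount
FROM   e_facttable
GROUP  BY key_customer;

SELECT key_customer, SUM(amount) AS amount
FROM   f_facttable
GROUP  BY key_customer;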

Now we have:

  • two tables,
  • data that needs to be read and aggregated request-wise from the F-facttable,
  • an aggregation result that has to be saved in the E-facttable,
  • data that afterwards needs to be deleted from the F-facttable - otherwise some numbers would be counted twice in our report!

Looking at these database requirements, there's an Oracle feature available that really is made for this (it really is ;-)): PARTITIONING!

Without partitioning, the final deletion of already condensed requests would require us to

  • scan at least one full index in order to find all rows matching the requests to be deleted,
  • remove them
  • and update the indexes afterwards.

With partitioning, all we have to do is drop the partitions that contain our requests.
That's the reason why the F-facttables are always partitioned by the request dimension (on DBMSs that support partitioning, of course).

So, we can easily get rid of data, when the table is partitioned the right way.

But what about the indexes?

There's a treat available in Oracle for that as well: local partitioning of indexes.
Simply put, this means that for every table partition a corresponding partition of the index is created.

With this, we don't even have to rebuild the indexes after dropping a table partition.
All we have to do is drop the corresponding index partition together with the table partition.
The remaining index will still be completely correct and will still cover all the data left in the table.
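
Continuing the hypothetical sketch from above, this is roughly what request-wise partitioning with a local index looks like on the Oracle side:

-- Recreate the sketch's F-facttable, this time range-partitioned by the
-- request key:
DROP TABLE f_facttable;
CREATE TABLE f_facttable (
    key_request   NUMBER,
    key_customer  NUMBER,
    calweek       NUMBER,
    amount        NUMBER
)
PARTITION BY RANGE (key_request) (
    PARTITION req_0001 VALUES LESS THAN (2),
    PARTITION req_0002 VALUES LESS THAN (3),
    PARTITION req_max  VALUES LESS THAN (MAXVALUE)
);

-- A LOCAL index is partitioned exactly like the table ...
CREATE INDEX f_facttable_900 ON f_facttable (key_request) LOCAL;

-- ... so a condensed request can be removed by dropping the table
-- partition and its index partition together, without any index rebuild:
ALTER TABLE f_facttable DROP PARTITION req_0001;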

Ok, now we arrived at the start of the problem ;-)

All this sounds quite good.
In fact, it's great!
And (of course) here begins the problem.

This great combination of clever data design, implementation and database feature exploitation only works properly if the indexes really are partitioned exactly as the table is.
BW has to take care of this, since Oracle also allows you to create indexes with a different partitioning scheme or without any partitioning.
If that is the case and we drop a table partition, Oracle would have to visit every row of the dropped partition and use this information to maintain the index data.
Obviously this would render the partitioning advantage null and void.
So Oracle simply flags all indexes on which the same partition drop cannot be performed as UNUSABLE.
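
Again as a sketch on the hypothetical table from above, this is exactly the situation behind the error message:

-- A non-partitioned (global) index on the partitioned table ...
CREATE INDEX f_facttable_990 ON f_facttable (key_customer);

-- ... is flagged UNUSABLE as soon as a table partition is dropped:
ALTER TABLE f_facttable DROP PARTITION req_0002;

SELECT index_name, status
FROM   user_indexes
WHERE  table_name = 'F_FACTTABLE';   -- f_facttable_990 now shows UNUSABLE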

Such UNUSABLE indexes can be repaired simply by rebuilding them.
The Oracle cost-based optimizer is clever enough to ignore those indexes (see also the Oracle parameter "skip_unusable_indexes"), so queries will not fail because of this.

Except... except we force Oracle to use the broken index by using a hint.

"Where the heck do we do such stuff?" - is that the question you've got in mind right now?

Well, we do it every time we run the update statistics job.
And collecting CBO statistics after deleting lots of data from a central fact table in BW is usually done as part of the data loading process (chain).

In Oracle, updating statistics means calling a PL/SQL stored procedure from the DBMS_STATS package. In there, Oracle will run SQL statements like this:

select /*+ no_parallel_index(t,"/BIC/F100069~900") dbms_stats cursor_sharing_exact use_weak_name_resl dynamic_sampling(0) no_monitoring no_expand index(t,"/BIC/F100069~900") */ count(distinct sys_op_lbid(15948,'R',t.rowid)) as nrw ,count(distinct sys_op_lbid(15948,'L',t.rowid)) as nlb ,count(distinct "KEY_IC_0025P") as ndk ,null as clf from "SAPR3"."/BIC/F100069" t where TBL$OR$IDX$PART$NUM("SAPR3"."/BIC/F100069",0,4,0,"ROWID") = :objn

No reason to understand the whole statement for now; the important part here is the INDEX hint.
With this hint Oracle is forced to use the index (for which statistics should be collected).
If our index is UNUSABLE at this point in time, Oracle has no other choice than to report:

ERROR at line 1:
ORA-20000: index "SAPR3"."/BIC/F100069~900"  or partition of such index is in unusable state
ORA-06512: at "SYS.DBMS_STATS", line 15181
ORA-06512: at "SYS.DBMS_STATS", line 15203
ORA-06512: at line 1

LOOOONG story short.

Mashing all this together:

WHEN
        we have a partitioned table with a non-partitioned index
AND
        we drop a partition of the table, so that the non-partitioned index is flagged UNUSABLE
AND
        we finally run an update statistics on this table,
THEN
        we end up with our ORA-20000 error message.

Why do I tell you this, when BW does take care of the database objects so well?

Because sometimes it fails to do so.
One reason for such a failure is a bug that was introduced with an SAP Basis support package.
This bug is already fixed with SAP Note 1479683, but there are many customers around who haven't yet installed the note and who are facing mysterious errors in their InfoCube compression process chains.

As you can see, the connection between the ORA-20000 error message about the unusable index and the real cause is a rather long-distance one, although straightforward once you understand it.

The solution (ahem... TATAAA!)

To finally get rid of this problem, I wrote note #1513510.

In there you'll find that you have to
a) import the fix for the SAP Basis bug (note #1479683)
b) recreate (NOT just rebuild) the indexes

For step b) the easiest and quickest way is to use the same function modules that SAP BW itself uses for this task.

Excursus... Aggregates are just little InfoCubes...

In note #1513510 I included an SQL statement to find all partitioned tables for which there are non-partitioned indexes, so that the DBA or BW power user can look up the problematic ones without having to wait for process chains to fail.
The statement looks like this:

select 
    /*+ 
    no_parallel_index(t,"/BIC/F100069~900")
    dbms_stats 
    cursor_sharing_exact
    use_weak_name_resl 
    dynamic_sampling(0) 
    no_monitoring 
    no_expand 
    index(t,"/BIC/F100069~900") 
    */ 
 count(distinct sys_op_lbid(15948,'R',t.rowid)) as nrw
,count(distinct sys_op_lbid(15948,'L',t.rowid)) as nlb
,count(distinct "KEY_IC_0025P") as ndk
,null as clf 
from 
    "SAPR3"."/BIC/F100069" t 
where 
    TBL$OR$IDX$PART$NUM("SAPR3"."/BIC/F100069",0,4,0,"ROWID") = :objn
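
For a quick check directly in the database, a query against the Oracle dictionary views along the following lines can be used (an illustrative sketch assuming the SAPR3 schema owner, not necessarily the exact statement shipped with the note):

-- Partitioned tables in the SAP schema that still carry
-- non-partitioned (global) indexes:
SELECT i.table_owner,
       i.table_name,
       i.index_name
FROM   dba_indexes     i
JOIN   dba_part_tables pt
       ON  pt.owner      = i.table_owner
       AND pt.table_name = i.table_name
WHERE  i.partitioned = 'NO'
AND    i.table_owner = 'SAPR3'       -- adjust to your SAP schema owner
ORDER  BY i.table_name, i.index_name;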

If you run such a check you may come across tables like /BIC/F100234.
But you don't have any InfoCube named "100234" - so what are those tables about?
They belong to aggregate InfoCubes. [more on this here]
These are (small) subsets of data that the OLAP processor can choose in order to deliver the reporting result much more quickly. In this respect aggregates are very much like database indexes.
Since the aggregates really are subsets of the actual large InfoCube, they have an F- and an E-facttable as well, and the same problem can occur with them too.

If you now want to know to which InfoCube a specific aggregate belongs, you can easily look it up in the RSDDAGGRDIR table.
For every aggregate you'll find an entry in that table mapping the aggregate to its InfoCube.

Checking this table in SE16 delivers an output similar to this:

Table:          RSDDAGGRDIR

   AGGRUID                   OBJVERS INFOCUBE   AGGRCUBE ...
                                                         ...
   03T89W5IUUEFPARPLPZ29YZF3 A       0BWVC_AGV  100108   ...
   200H0IWIR23WASLOESNT8BKJZ A       0BWVC_AGV  100099   ...
   3VILFVABC6MYNF9R10M0WYVHR A       ICD05      100772   ...
   3VIVFYBTHUCH8HF58HQCRGJXS A       ICD03      100778   ...
   3VIVG4AW8R8BQ0JPRVJWKZJZK A       ICD03      100779   ...
   3VIVG8ZVTWHX3SFLC8ZEQ6RQO A       ICD03      100780   ...
   3VP09ETI53LHVKWQHLL79RK5X A       RSDRICUBE  100032   ...
   40VXFTXRAJ6NNT88CWOEA3LYN A       0BWVC09CP  100071   ...
   40VXFU60ZFEEYGCAFPLSI952N A       0BWVC09CP  100072   ...
   40VXK7M8IUTH0IH052QGOF94F A       0BWVC09CP  100073   ...
   [...]
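
The same lookup can also be done directly on the database; a minimal SQL sketch, assuming the SAPR3 schema owner and the aggregate fact table /BIC/F100234 from the example above:

-- Which InfoCube does aggregate 100234 (fact table /BIC/F100234) belong to?
-- AGGRCUBE holds the numeric part of the aggregate's fact table name.
SELECT infocube,
       aggrcube
FROM   "SAPR3"."RSDDAGGRDIR"
WHERE  objvers  = 'A'
AND    aggrcube = '100234';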
   

OK, I hope some of you really made it to the end of this rather lengthy first BW blog post. I'll try to keep them shorter in the future ;-)

regards,

Lars

Lars Breddemann is a senior support consultant at SAP Global Active Support.



http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/22941

Posted by AgnesKim
Technique/Others 2010. 12. 23. 14:45

The Future for SAP Business Intelligence
Nic Smith (SAP)
Posted on Dec. 22, 2010 08:27 PM in Business Intelligence (BI), Analytics, Business Objects, Business Solutions, Crystal Reports

 
 

Watch this video on the future of Business Intelligence at SAP:

[embedded video]

Stay connected with SAP BusinessObjects 4.0

www.sap.com/analytics 

www.facebook.com/sapanalytics 

www.twitter.com/businessobjects 

more on the SAP Business Analytics blog: http://blogs.sap.com/analytics 

 

Nic Smith   Insight on Business Intelligence at SAP



http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/22713
Posted by AgnesKim
Technique/Others 2010. 12. 23. 01:37

SOA Design Principles - Service Abstraction
Farooq Farooqui (Cognizant UK)
Posted on Dec. 22, 2010 06:10 AM in SAP Process Integration (PI), Service-Oriented Architecture

 
 

Service Abstraction is one of the important principles of service design. Using service abstraction, the provider hides technical and implementation logic from the consumers. Applying this principle turns services into black boxes: the consumer gets only the required functional and technical information, which goes into the service contract.

 

    

 

The key question is: why should we understand and apply service abstraction during service design? To answer this question, let's take the example of a service called ValidatePassport.

 

The service ValidatePassport contains different service operations. GetPassportDetails, for instance, is one of the operations; it takes passport details such as the passport number, country of issue, etc. and returns the passport details. The moment a consumer triggers the service, he wants a response back. To fulfill this contract, the provider implemented the service using different system resources such as middleware, an SAP system, and an Oracle database. At runtime, ValidatePassport checks which system is up and running, reads the details from that system accordingly, and sends the response back to the caller.

In this instance the provider uses different systems, applications, and programming logic. If the provider does not abstract this information from the consumer, the consumer can assume many things and make wrong judgments when implementing the service on his end. Moreover, the provider would have to notify the consumer every time there is a change in a system, programming language, or implementation logic. Hence, service abstraction gives owners the freedom to evolve their service and change the implementation logic as required.

What is Service Abstraction?

  • Hiding design details and implementation logic from the outside world.
  • Turning the service into a black box by hiding system information, processing logic, and programmatic (Java, ABAP, etc.) approaches.

Why is Service Abstraction required?

  • It encourages the provider to share less information with the outside world.
  • It gives consumers the freedom to implement the service efficiently - without assumptions and wrong judgments.
  • It encourages the service provider to advance the service using different IT technologies and IS systems.

What shouldn't go into the service contract? Service abstraction identifies:

  • Technical and implementation logic of the service.
  • Programmatic logic such as technical systems, programming languages, and technical frameworks.
  • System processing details - which systems are involved when the service is executed.
  • Business rules that process the service request for different input messages.
  • System resources, internal validation, error handling, authentication, certificates, business processes, etc.
  • Uncertain conditions, such as the availability of the service or situations in which the service can become less responsive.
  • Composite service description and details - if the main service is composed of many other services.

Farooq Farooqui is an SAP certified consultant. He holds an honors degree in electronics engineering and has nearly five years of SAP experience.




http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/22771
Posted by AgnesKim
Technique/Others 2010. 12. 10. 18:26

Web Version of ABAP Keyword Documentation
Horst Keller (SAP)
Posted on Dec. 10, 2010 12:58 AM in ABAP

 
 

Now that AS ABAP 7.0, EhP2 is released, there is no reason any more to restrict the web version of the ABAP Keyword Documentation to Release 7.0.

And here it is, the 7.02-Version (thanks to the colleagues who made it happen):

http://help.sap.com/abapdocu_702/en/index.htm
http://help.sap.com/abapdocu_702/de/index.htm

Last but not least, the 7.02 version has lots of content improvements, even for 7.0 subjects. Besides an improved structure, there is e.g. a new chapter on date/time processing (http://help.sap.com/abapdocu_702/en/abendate_time_processing.htm), and many chapters were reworked, e.g. the one on built-in data types (http://help.sap.com/abapdocu_702/en/abenbuilt_in_types_complete.htm) and their implications (e.g. http://help.sap.com/abapdocu_702/en/abenconversion_elementary.htm or http://help.sap.com/abapdocu_702/en/abenlogexp_rules_operands.htm).

Have Fun!

Horst

PS: The 7.0-version is still available under ".../abapdocu_70/...".

 

 

 

Horst Keller is a Knowledge Architect in the department TD Core AS&DM ABAP at SAP.


http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/22553

Posted by AgnesKim
Technique/Others 2010. 12. 9. 17:55

SAP NetWeaver Portal 7.3 – Unified Access to Applications and Processes
Aviad Rivlin (SAP Labs Israel)
Posted on Dec. 09, 2010 12:30 AM in Enterprise Portal (EP)

 

Dear portal experts,

 

As you probably already know, the next major release of the Portal (also known as SAP NetWeaver Portal 7.3) was released to ramp-up after a very successful beta program.

 

In this blog, I would like to share with you some of the new functionalities that are part of the new portal release, but first let's align on the release numbers. The current and most widely adopted SAP portal version is SAP NetWeaver Portal 7.0 with enhancement package 1 and enhancement package 2. When talking about the NetWeaver 7.3 release, please note that this is neither a support package nor an enhancement package; it is the next major release of the SAP NetWeaver Portal, with plenty of new capabilities and functionalities.

 

Those of you who attended the Unified Access to Applications and Processes session at SAP TechEd 2010 should already be familiar with these topics. But for those of you who did not have the chance to join us at TechEd, let's start…

 

The next major release of the portal provides enhanced functionality in various areas: access to SAP and non-SAP applications, simplification of content creation, Web Page Composer enhancements, Knowledge Management enhancements, wikis and more. In this blog, I will concentrate on the key enhancements in the area of content creation and access to SAP and non-SAP applications.

 

  • Simplified implementation process for SAP Business Packages – a simplified process for business package implementation, with guided wizards for system alias assignment and role customization. Refer to the diagram below explaining the new, simplified process:

 

 

 

  • Simplified content creation via automatic iView (application) upload from the back-end system to the portal.

 

  • Top-down approach for role creation - the ability to start building the navigation hierarchy from the role itself and to create/add folders, pages and iViews from within the role (this will simplify the work, especially for those of you who are new to the Portal).

 

  • Enhanced transport capabilities achieved by integrating with the enhanced Change and Transport System (CTS+) and the new concept of the synchronized folder. A synchronized folder is a folder within the transport package that contains references to the PCD objects you would like to transport. Only when the transport package is actually exported are the PCD objects pulled into the transport package (for those of you who are looking for the traditional behavior of the transport package, no worries, it is supported as well, side by side with the new functionality).

 

  • Enriched portal role types – with the new portal version we have four different role types: freestyle roles (the roles you know from the current portal version), workcenter roles (roles that implement the workcenter user experience guidelines), roles from packages (roles from SAP Business Packages) and roles from PFCG (roles uploaded from the PFCG ABAP repository). We will take a deep dive into the details of each role type in one of the coming blogs.

 

  • Interoperability with 3rd-party portals – many customers run a dual-vendor portal approach (i.e. more than one portal vendor in the organization), and the typical question is: how can I expose one harmonized portal to the end user, even though from an IT perspective I have more than one portal vendor? The solution covers branding alignment, single sign-on and session management.

 

  • AJAX framework – many documents and sessions have already been given about the new AJAX framework, and from now on it is available for you to use! It is actually the new default framework page when installing the portal. You can find plenty of information about the AJAX framework already today, and plenty more to come…

 

  • QuickLaunch – a new concept for navigating within your portal through a search capability. For example, you want to create a new purchase order but do not know where to find this iView. You simply type/search for the word “purchase” and all the iViews/pages/roles containing the substring “purchase” are available to you.

 

       

 

 

That’s it for this time. You can expect more details in the coming weeks. If you are interested in specific topics in the area of SAP NetWeaver Portal 7.3, please list them below and I will do my best to cover them in the coming blogs.

 

 

Happy New Year!

Aviad Rivlin 

 

 

Aviad Rivlin is a member of the SAP NetWeaver Portal Solution Management group.


http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/22431

Posted by AgnesKim
Technique/Others 2010. 11. 2. 17:38

BI annotator

BI annotator - July 30, 2007

BI annotator allows BusinessObjects (BI) users to relate and combine text data with structured data from a database/data warehouse. Both text and relational data are exposed to business users via a single SAP BusinessObjects semantic layer (or universe) and directly consumed by BusinessObjects XI R2 reports.

Relating external text data and corporate data adds tremendous value to business intelligence applications by increasing the contextual information available to users for making their business decisions. Examples include combining farm lot crop yields with local temperature data feeds, combining company revenues by geography with US government demographic data feeds, or integrating ERP manufacturing information with a hand-created sales forecast from an Excel spreadsheet. Managing mixed data access through the BusinessObjects XI R2 BI platform increases the reliability, security, and auditability of the resulting BI solution. The seamless integration of relational and text data allows users to create reports, analytics and dashboards that provide a 360-degree view of their business.

 


How can BI annotator help with Business Intelligence solutions?

BI annotator integrates text data feeds with relational data sources by indexing business dimension values and parsing the text to generate a relationship star schema that maps the individual text items to the dimensional values in a database/data warehouse.

The text-to-dimension relationships are maintained in a 'Coordinates' table. This table, created and populated by BI annotator at index time, maintains the relationships between the dimensions of interest in a given universe and the text items. It contains join data that allows easy end-user integration of the relational and text data, flexibly relating the text to the various dimensions of the universe.
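
As a purely illustrative sketch (the prototype's actual table and column names are not documented here), the idea can be pictured like this:

-- Hypothetical bridge ("coordinates") table relating indexed text items
-- to dimension values of the universe; all names are illustrative only.
CREATE TABLE text_items (
    text_id  INTEGER PRIMARY KEY,
    source   VARCHAR(200),
    content  CLOB
);

CREATE TABLE coordinates (
    text_id    INTEGER REFERENCES text_items (text_id),
    dimension  VARCHAR(30),     -- e.g. 'GEOGRAPHY', 'PRODUCT'
    dim_value  VARCHAR(100)     -- the dimension member the text refers to
);

-- A report can then join text items to the relational data via the shared
-- dimension values (sales_fact is likewise hypothetical):
SELECT f.region,
       SUM(f.revenue)            AS revenue,
       COUNT(DISTINCT c.text_id) AS related_texts
FROM   sales_fact f
LEFT JOIN coordinates c
       ON  c.dimension = 'GEOGRAPHY'
       AND c.dim_value = f.region
GROUP  BY f.region;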

The physical view of BI annotator's architecture is shown here:

[architecture diagram]


Enjoy!

We hope you use this prototype to discover how integrating text data via a universe can change the way people consume information. Please give us feedback to make sure we know how you're using this and what else you would like to see it do in the future. And remember, this is a prototype only and NOT for use in production environments.



Download BI Annotator Prototype


http://www.sdn.sap.com/irj/boc/innovation-center?rid=%2Fwebcontent%2Fuuid%2F608383af-44f1-2b10-02a9-a472d39ef364
Posted by AgnesKim
Technique/Others 2010. 10. 26. 19:16

New SAP iPhone App - SAP Business One Mobile Application
Karin Schattka (SAP)
Posted on Oct. 26, 2010 02:30 AM in Mobile

URL: http://itunes.apple.com/app/sap-business-one-mobile-application/id392606876

 
 

With the SAP Business One mobile application for iPhone, you can view reports and content, process approval requests, manage customer and partner data, and much more.

Key features:
• Alerts and Approvals - Get alerts on specific events - such as deviations from approved discounts, prices, credit limits, or targeted gross profits - and view approval requests waiting for your immediate action. Trigger remote actions, and drill into the relevant content or metric before making your decision.

• Reports - Refer to built-in SAP Crystal Reports that present key information about your business. Add your own customized reports to the application, and easily share them via e-mail.

• Business Partners – Access and manage your customer and partner information including addresses, phone numbers and contact details, view historical activities and special prices; create new business partners and log new activities; contact or locate partners directly. All changes automatically get synchronized with SAP Business One on the backend.

• Stock Info - Monitor inventory levels, and access detailed information about your products including purchasing and sales price, available quantity, product specifications and pictures.

Please download the SAP Business One mobile application directly from the iTunes Store and watch the YouTube video:

Karin Schattka   is part of the SAP Community Network team.


http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/21777

Posted by AgnesKim
Technique/Others 2010. 10. 1. 11:32

An error I ran into a lot while creating deliverables on a project the year before last, so I'm scrapping it here.

http://myjay.byus.net/tt/trackback/689

Thanks to @myjay_ for putting together such a kind manual :)

Posted by AgnesKim
Technique/Others 2010. 8. 12. 22:12

Twibap: the ABAP Twitter API
Uwe Fetzer (SE|38 IT-Engineering & Consulting)
Posted on Aug. 12, 2010 04:34 AM in ABAP, Beyond SAP, Code Exchange, JavaScript

URL: http://dev.twitter.com

 
 

Prolog

You probably know my SCN blog post from last year, "A story about Twitter, XML and WD4A" ;-)
In April this year I received a short DM from Mark F.: "getting #SAP #ABAP on this list would be wholesome" (he referred to the site http://dev.twitter.com/pages/libraries).
Nice idea, I thought, give me time until the summer break.

At the end of July I wanted to refresh my Twitter API knowledge by reading the docs, and I saw this message on the dev site: "The @twitterapi team will be shutting off basic authentication on the Twitter API. All applications, by this date, need to switch to using OAuth."
No problem; I had already battled with OAuth while developing Wave and StreamWork apps. (haha, more on that later)

Chapter 1 - The Twitter API

The Twitter REST(?) API is described pretty well at http://dev.twitter.com/doc, I think; no further explanation is needed here.


Chapter 2 -  The JSON Parser

In my Twitter WDA client I used the XML response. Since I fell in love with JSON while working with Python, I decided to use JSON this time.
First problem: how to parse JSON in ABAP?
My search on SCN turned up the nice JSON function group from Quentin Dubois (Wiki page).
Because the Twitter response contains not only flat data but also embedded objects (a status object always contains a user object), and some responses are arrays, the mentioned function group is not really the solution I needed, so I decided to write my own parser (with parts of the code from that module; I hope Quentin doesn't kill me now).

The result of a parsed JSON object is now a hashed key/value table, from which we can read each element with a simple call, e.g. "text = simplejson->get_value( 'text' ).".
If the result of the read element is again an object, you just have to parse it again:
user = simplejson->get_value( 'user' ).       "returns another object
user_data = simplejson->parse_object( user ). "parse object
simplejson->set_data( user_data ).            "set new data in parser
screen_name = simplejson->get_value( 'screen_name' ).  "get element

The result of a parsed JSON array is a standard table of such hashed key/value tables.

With this tiny simplejson helper class I wrote my first Twitter API classes, and since basic authentication had not been cut off yet, all tests went well up to this point.

Chapter 3 - OAuth

The next step of the journey was the implementation of OAuth. With a look at my Python sources (StreamWork OAuth implementation) and the first chapters of the Twitter OAuth docs, it all seemed very familiar, and I started on the implementation.


An OAuth request contains, among other things, two fields called "oauth_nonce", a string of random characters, and "oauth_timestamp", the number of seconds since Jan. 1st, 1970.
Because there are no standard functions for these (I think), I developed two small helper methods:


A hint among friends: if the timestamp is not correct, Twitter will refuse your request, believe me ;) -> set your system time correctly!

-> read more: OAuth at Twitter

Chapter 4 - HMAC-SHA1

But the first test results brought me back down to earth: I had overlooked the tiny remark "Twitter requires that all OAuth requests be signed using the HMAC-SHA1 algorithm." WTF?
StreamWork uses PLAINTEXT authentication, but what is HMAC-SHA1? Googlegooglegoogle

The search brought me two results (OK, many more than two, but these two were the most relevant ones):
- an SHA1 JavaScript library
- a simple note in an SCN forum post saying that there is a function module called "CALCULATE_HASH_FOR_CHAR"

An SHA1 function module? Great. I looked into the source of FM "CALCULATE_HASH_FOR_CHAR" and the question marks in my brain appeared again (there is only a system call in it). What does the FM documentation say? Nothing, no documentation available. Using this FM was definitely too "hot" for me. What if I overwrote some needed cryptographic stuff in the system? Not fatal on my own systems, but what about client systems? No, thanks.
Fortunately I remembered that I had read something somewhere about using JavaScript within ABAP. SE24, "CL_*JAVA*", <F4> -> et voilà: found the class "CL_JAVA_SCRIPT".
Google again -> it pointed me to this SAP help site


Again WTF....

But thank god (and SAP!), we still have the old docs available: here you can find the relevant part from NW 7.0

Chapter 5 - Javascript

Although I don't like JavaScript very much, while playing around with the CL_JAVA_SCRIPT class I was surprised by its functionality. Even whole ABAP OO classes can be bound to the JavaScript source.
A CL_PYTHON would definitely be better, but the class works great at the moment and is probably the only way to use open-source libraries for functions not delivered by SAP!

Back to the topic:
I did my first experiments with the class as described in the docs: with inline code. But for sure, this is not the solution I want to build into the API. Where to store the JavaScript sources? Where they belong: in the MIME repository.
Now we have the SHA1 library and an additional one-liner called twibap.js stored in the MIME repository, and with this code snippet we can load the source back into an ABAP string:

twibap.js contains:

and with this code we can finally sign the message:

In addition, I only had to develop my own encoding method called “percent_encode”, because the "cl_http_utility=>escape_url()" method doesn't fit the OAuth dictate, where the only characters you may leave unencoded are "- _ . ~" (and some other abnormalities).

The whole Twitter workflow works nicely now, but I was not very satisfied with this JS solution.
Therefore, back to SAP Notes and Google for a deeper search for more information about the function module "CALCULATE_HASH_FOR_CHAR".


Chapter 6 - The SecureStore

In Note 1416202 I finally found the answer. The function modules are NOT "secret", but "The raw documentation was not activated."
With NW 7.01 SP7 the documentation will be delivered (I was so close to installing SP7 on my system..., but luckily I found the documentation in the infinite vastness of the internet).

In the documentation of the function group "SECH" and its function modules we can read that we may use these function modules for our own purposes. So I did:

Hey, it works! Party! I trashed the JavaScript part.

Boom, dump, failed again. What happened? In the first steps of the OAuth authorization process (request_token etc.) the oauth_secret contains only the consumer_secret (42 characters + "&").
The function module 'SET_HMAC_KEY' works brilliantly until the point where I want to sign a user action (e.g. sending a tweet). In this case the secret combines the consumer secret and the token secret (of the user), and the function module responds with a "parameter_length" exception.
Through some experiments I found out that the FM accepts a maximum of only 81 characters. Hey, why? I only want to SET the key, nothing is processed at this moment. And in addition: nowhere in the HMAC-SHA1 OAuth key definition is a maximum length mentioned.

In my despair I opened an SCN forum thread.
And what a surprise (or not): less than 24 hours later I had the solution :) SCN members rock!

The solution: if the key is longer than 81 characters, we have to store the hash of the key, not the key itself (I still don't know why).
The code snippet:

Now the Twitter API is finally finished...

Chapter 7 - A simple client

... and we can build our first simple (very simple!) Twitter client (output of our own home timeline):

Time to build a SAPlink nugget for the beta test. To test the nugget I imported it into another system, activated all sources (ignoring errors caused by recursive definitions) and started the test client.
Oh noooo, again -> dump: FM 'SET_HMAC_KEY' does not exist.

Chapter 8 - Back to Chapter 5

Why? And why me? The answer to the first question is easy: I developed the API on 7.01 and imported the nugget into a 7.00 system (but I cannot answer the second question).
Thanks to my decades of experience ;) I had only starred out (commented out) the JavaScript part. Now I just had to activate that part again, create a nugget especially for 7.00 systems, and include the MIME objects in the nugget again.

Epilog

You: "And what is it good for?"
Me: "No idea."
You: "But why did you make it?"
Me: "You could also ask: why are you running Android 2.2 (Froyo) on a Windows Mobile phone? The answer would be the same: because it works, and it's so much fun ;)"


(probably the tiniest Froyo phone in the world)

http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/20474

Posted by AgnesKim