Channel: SCN : Blog List - SAP HANA Developer Center

Upgrade your HANA to SAP HANA SPS 07 Database Revision 70


Prerequisites:

 

  • The SAP HANA lifecycle manager (HLM) is installed and configured on the SAP HANA system.
  • Read the SAP HANA Update and Configuration Guide.
  • For an offline update, make sure that all the necessary installation files for the individual components are available in the repository.
  • Make sure that a valid, permanent SAP license is applied to the SAP HANA database (see the check after this list).
  • Take the necessary backups for the SAP HANA database and the other components (including config files).
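
A quick way to verify the license prerequisite, assuming a database user with the required monitoring privileges, is to query the M_LICENSE monitoring view from any SQL console (a minimal check, not taken from the official guide):

-- the PERMANENT and VALID flags in the result should both be TRUE
SELECT * FROM M_LICENSE;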

 

 

Update Process:

 

  • Stop all processes.
  • Perform an automated Update with the SAP HANA lifecycle manager (HLM).
  • Update the dependent components.
  • Perform the post-update steps.
  • Restart all processes.

 

Please note the following:

 

  • Do NOT under any circumstances upgrade from maintenance revision 69.01 to SP revision 70.00 (SPS 07), because these two versions are incompatible.
  • In SAP HANA database revisions 64-66, a programming error can cause an inconsistency between the main and delta storage of column store tables. Therefore, you need to follow these instructions.

 

    1. Prevent the inconsistencies from happening again by running the Python script (SAP Note 1919033).

 

    • The Python script keepTransaction.py is attached to SAP Note 1919033.

 

    • Download the Python script and start it on your system. To do so, log in as the <sid>adm user and execute the following command:

nohup python -u <path_to_python_script>keepTransaction.py <master indexserver host> 3<instanceid>15 <username> <password> &

 

    • The database user provided as an input parameter for the script needs at least the roles "PUBLIC" and "MONITORING".
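
If no suitable database user exists yet, a minimal sketch for creating one could look like the following; the user name and password are placeholders of my own, not taken from the SAP note:

-- hypothetical technical user for running keepTransaction.py; adjust name and password to your policy
CREATE USER KEEPTX_USER PASSWORD Initial1234;
-- the PUBLIC role is assigned automatically to every new user; MONITORING must be granted explicitly
GRANT MONITORING TO KEEPTX_USER;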

 

           2.  Before upgrading the system, merge all tables to repair potential inconsistencies as described in SAP Note 1919034.

 

    • There are 4 files attached to SAP Note 1919034, including several SQL statements and procedures. Additionally, the files contain explanations for each statement and procedure.

 

    • Make sure that the script keepTransaction.py is running.

 

    • In attribute.ini, section idattribute, set the parameter check_duplicates_on_merge = on (see the SQL sketch after this list).

 

    • Execute all statements in preparation-1.sql as SYSTEM user. This creates a temporary schema, helper procedures, tables and views.

 

    • Using the statements provided in preparation-2.sql, check which users own schemas and therefore need to grant privileges to the SYSTEM user. Additionally, there is the option to adjust the set of tables that will be merged. This adjustment is not recommended, because all tables should be merged before upgrading the system.

 

    • Each user found in the above step needs to adjust and run the SQL statements in preparation-3.sql to allow the SYSTEM user to merge their tables (details can be found in the file).

 

    • As the SYSTEM user, check with the statements provided in execution.sql whether all rights have been granted, and start the merge process by calling the procedure MERGE_ALL_TABLES (see the SQL sketch after this list).

 

    • When the procedure has finished successfully, upgrade the system to SPS 07 or any newer revision.

 

    • Set the parameter check_duplicates_on_merge back to its default value.

 

    • If the system cannot be upgraded immediately after performing the above steps, the script keepTransaction.py must keep running until the upgrade.
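
The parameter change and the merge call referenced in the list above can be issued from any SQL console. The following is only a rough sketch: it assumes that the file and section names quoted in this post (attribute.ini, idattribute) apply to your revision and that the procedure created by preparation-1.sql can be called without arguments from a placeholder schema. Verify both against SAP Note 1919034 before running anything.

-- enable the duplicate check before the merge run
ALTER SYSTEM ALTER CONFIGURATION ('attribute.ini', 'SYSTEM')
  SET ('idattribute', 'check_duplicates_on_merge') = 'on' WITH RECONFIGURE;

-- start the merge of all tables; <TEMP_SCHEMA> is a placeholder for the schema created by preparation-1.sql
CALL "<TEMP_SCHEMA>"."MERGE_ALL_TABLES"();

-- after the merge has finished and the system has been upgraded, revert the parameter to its default
ALTER SYSTEM ALTER CONFIGURATION ('attribute.ini', 'SYSTEM')
  UNSET ('idattribute', 'check_duplicates_on_merge') WITH RECONFIGURE;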

 

Important SAP Notes:

 

1944771 - SAP HANA SPS 07 Database Revision 70

 

1919033 - Related to 1918267: How to run script keepTransaction.py

 

1919034 - Related to 1918267: How to run procedure MERGE_ALL_TABLES

 

1948334 - SAP HANA Database Update Paths for Maintenance Revisions

 

1918267 - SAP HANA DB: corrupt deltalog of column store table


SAP HANA SP07 brings New Improved Decision Table to Accelerate your Modeling Experience


In this blog I introduce the new Business Rules features in the SAP HANA SP07 release. One of the major highlights of this release is the Team Provider integration, which lets you model decision tables in a more robust and accessible way and, more importantly, without a system connection: you can model rules in offline mode and later synchronize them with your HANA server. Besides this, there is a new, improved, richer and more engaging OneView user experience for working with decision tables - the same content with more capability.

 

The concept of a Team provider is well known in the Eclipse world. It is the component responsible for providing repository tooling integration in Eclipse. For example, EGit is the Eclipse Team provider for Git, one form of version control system. A version control system is a way in which multiple versions of files are managed and maintained. Version control becomes important when developers work in a distributed environment from various locations - in a connected or disconnected manner - and it comes with the additional benefits of collaboration, change management and continuous integration. Here are certain terms and operations that you should know before venturing into the Team provider: Repository, History, Latest, Revision, Share, Unshare, Checkout, Commit, Synchronize, Merge, and so on. If you are interested in more, refer to the official Eclipse guide or any other material shared on the Internet.

 

Now that you have become familiar with the Eclipse Team provider, let me also broadly give you an overview of SAP's adaptation of the Eclipse 'Team Provider'. Since SP06, a new perspective named the SAP Development perspective has been available. This perspective is now the recommended approach for modeling information models such as Attribute views and Calculation views. Users can model their information models in offline mode and later connect them to a HANA system. They may then choose to remodel or modify them from either the Modeling perspective or the SAP Development perspective in HANA Studio, depending on how they prefer to work. While the former works only in online mode, the latter is well equipped to support development in offline mode. These modes and their benefits matter only while modeling in HANA Studio; they are not related to activated objects and do not contribute in any way to performance when the models are used in an application. This blog, however, does not focus on these working modes and their benefits or shortcomings; you may read the SAP HANA Developer Guide to learn more about them. As these terms and features should be known before working with decision tables in the SAP Development perspective, I chose to brief you on them before starting on the real content of the blog. Write to me if you need more information on them.

 

Let us explore the benefits of the decision table venturing into the SAP HANA Development space. The very first and most important benefit is the capability to work in offline mode and connect to the system later. The second benefit is that modeling at this intermediate layer is much faster, because local files do not communicate with the system, which removes the latency that network problems can sometimes cause. The third major benefit is better reliability, tracking and collaboration in a distributed development environment, and the final benefit is a version control mechanism that is familiar to most and can be explained with ease. Users can now create, modify and delete decision tables in offline mode, and commit and activate them in online mode.

 

Here I guide you through the important steps that will help you understand the various operations pertaining to the decision table with the Team provider. You may choose to use these operations in the same or a different sequence.

 

  1. Open SAP Development perspective in HANA Studio
  2. Select Repositories view and Create repository workspace

    Image1.jpg
  3. Select Project Explorer view and Create a project if not already done
  4. Select the project and right click menu option New > Others…
  5. In the dialog, choose Decision Table under SAP HANA > Database Development
    Note: As the project is not yet shared, decision table will open in Read-Only mode.

    Image2.jpg
  6. Share the project

       Note: After sharing the project, you will see system information beside the project name

 

       Image4.jpg

 

7.    Model the decision table with new enhanced and improved editor - OneView Editor
       [See the section below on OneView editor]

 

8.    After modeling the decision table, the last step is to Commit and Activate the decision table.

      Note: Observe the change in the marker of the decision table before and after activation.

 

       Image8.jpg

 

Note: As soon as you share the project with the HANA server, you can also see your decision table in the Modeler perspective, under the package with the same name as the project. If you specified a package while creating the decision table in the SAP HANA Development perspective, then you will see it in that package. See below:

 

Image9.jpg

 

 

Decision Table OneView Editor
With SP07, the decision table has a new, improved look - similar to other information models - called the OneView editor. OneView means providing the same user experience for all information models in the HANA Modeler. As a result, you will see the decision table editor with four panels: Scenario, Details, Output and Properties. Each of these panels has its own characteristics.

 

Image51.jpg

 

 

(a) Scenario Panel
This is one of the most important panels; it has two nodes: ( i ) Data Foundation and ( ii ) Decision Table.

 

( i ) Data Foundation
When you select this node, you can add physical tables, table types, or information models to the Data Foundation node in the Scenario panel.


Image6.jpg

 


Note: All the physical tables and table types in the system can be found at the Content node under SAP HANA System Library in your project. You can drag and drop them directly from there onto the Data Foundation.

Note: All the information models in the system can be found at the Repository Content node under SAP HANA System Library. You need to check them out into your project before using them in the Data Foundation.

 

Image10.jpg

 

Once the data foundation is ready - with the decision table conditions and actions - select the Decision Table node in the Scenario panel to further modify the content of the decision table.

 

 

( ii ) Decision Table
When you select this, you see the decision table structure in the Details panel. Use the various options to complete the modeling of the decision table, for example by filling in the required cell values or changing the layout.

 

Image7.jpg

 

(b) Details Panel
This panel changes based on the selection in Scenario panel nodes.

 

(c) Output Panel
Unlike the Details panel, the Output panel is independent of the selection of Scenario nodes. It contains the vocabulary, conditions and actions of the decision table. It is the same as the decision table output panel in the old editor, with the same set of operations such as Change Layout, Add Condition, Add Action, Create Parameter, Create Calculated Attribute, Edit, Remove, and so on.

 

(d) Properties Panel
This panel is based on the selected node in the Output panel. It is mainly used to view the metadata of all the nodes in the Output panel, including the decision table node. It is only through this panel that an attribute can be aliased.

 

With the OneView editor and Team Provider integration, we have given a major overhaul to what you saw when we first launched the decision table in SP05. While the underlying content and execution model has not changed, the look and feel of how you author and manage the decision table has been significantly altered. This new interface, together with the technical benefits of the Team provider, is intended to end the hassle for those who have had to juggle different user experiences in the same modeling environment.

 

You are now well equipped with the first-hand information required to work with the decision table in the OneView editor and the SAP Development perspective. For more information, refer to the Developer Guide. Write to me or drop a comment if you have any suggestions.

HANA Studio for Mac now allows native XS development


Hi Developers and fellow Mac OS-X aficionados,

 

This might have gone unnoticed as I learned on Hacker News, so let me tell you that the HANA Studio version that is available for download on the Developer Center does work with HANA XS development now. No need for regi, you can just commit from your local into your remote workspace, activate content, etc.

 

Keep in mind that HANA Studio for Mac is not supported by SAP. Don't use it for anything other than the HANA Developer Edition. Having said that, it works just fine.

 

have fun with HANA on Mac

--juergen

 

PS - for those Mac users who don't know:

  • the JDBC drivers distributed in the HANA Client bundle are pure Java drivers. So you can download and extract any (64-bit) OS version of the HANA Client, grab the ngdbc.jar, and it'll work just fine with your Mac
  • the node.js drivers on SAP/node-hdb · GitHub (Open Source under Apache License 2) work nicely on a Mac. And are lightning fast. And Open Source. And easy to install (if you already have Node installed, it's npm install hdb and you're done)

My HANA Certification (C_HANAIMP_1) Experience


Hi All,

I cleared the SAP Certified Application Associate - SAP HANA 1.0 examination last week.

Through this blog, I would like to take you all through the experience I had and would like to share some information pertaining to the certification.

 

If you have limited knowledge of this certification, please go through the following link:

(http://training.sap.com/v2/certification/c_hanaimp_1-sap-certified-application-associate---sap-hana-10-g/)

1.jpg

  • It’s for Application consultants.
  • Exam Code: C_HANAIMP_1 (80 questions, 180 minutes)
  • Certificate awarded: SAP Certified Application Associate - SAP HANA 1.0
  • Topics: Business content, Data Modelling, Data Provisioning, Optimization, Reporting, Security and Authorization

 

I took this certification out of personal interest; I did not undergo any sort of training but prepared completely by myself. To appear for this certification without attending any SAP course/training, you need to fulfill the following criteria. I took the exam in India and I'm not sure whether the following differs in other countries.

  • You need a minimum of 1 year of HANA implementation experience or 2 years of support experience, with a minimum of 6 months of experience in the module and version in which you wish to take up certification.
  • The last 1 year should be with your present employer.
  • Your present employer should be a CUSTOMER or PARTNER of SAP.

 

If you fulfill the above criteria, you can contact SAP and they will ask you to send your updated project profile in the format given by them for technical evaluation. On establishing your eligibility, they will guide you through the process of registration.

Please send your profile only once you are ready to appear for the certification, as the certification approval mail is valid for only 2 months. After getting the approval mail, you will have to take the certification within 2 months, failing which the payment will be returned. You are requested to complete your certification within the given time frame.

Evaluation of the profile will take a minimum of 3-7 working days.

Please visit the webpage https://training.sap.com/ for available certification details.

Please note no assistance with regard to study material/questionnaire is provided for direct certification. Study material is offered only with training and not as a standalone component.

The certification fee for the Associate level is subject to change as per SAP policy from time to time.


Important Note:

  • Clarifications requested by SAP on the profile must be provided, without which the evaluation will not be done.
  • Please get your profile right the first time. If SAP does NOT approve the profile on the basis of incorrect or incomplete information, you will not be allowed to update your profile or re-apply for direct certification for the next 6 months.
  • Once a profile is approved, it will not be re-validated. Re-validation of the profile will take place once every six months.
  • Certification dates are not pre-scheduled & will be shared only on approval on receipt of documents (Hard and Soft Copies) for registration. If your profile is approved for direct certification, documents required for registration will be shared to you on the same approval mail.
  • You can also send a mail to SAP India regarding Certification details: education.india@sap.com


NOTE:

1) An email confirmation from the HR of your organization with details (like years of experience in the company and the tools that you have used) has to be sent to education.india@sap.com. SAP will provide you with a template for the mail. This is a new prerequisite and they are very strict about it; without it they will not allow you to appear for the exam.

2) There may be some delays from SAP in replying to your mails, but don't panic. They will guide you at the appropriate times instead of replying to each and every mail of yours.

3) They will share with you the available dates and venues. You need to pick a convenient date and reply to them. Do this carefully, as the following policies exist:

          

Please note the cancellation / rescheduling policy:

(a) No charge shall be made where notice of cancellation / rescheduling / change of location is received by SAP at least 10 working days prior to the date of the Certification;

(b) 50% of the certification fee will be charged where notice of the cancellation / rescheduling / change of location, is received by SAP between 10 and 5 working days prior to the date of Certification;

(c) 100% Certification fee will be the cancellation fee charged where either no notice of cancellation / rescheduling / change of location, is received or the same is received less than 5 working days prior to the date of Certification

 

The Actual Test day story:

Mandatory Documents required to enter the test centre  are:

  1. Government photo identification card (Passport / Pan Card / Driving License / Voter ID)
  2. ID card issued by the training center/Company Identification card (Direct certification participants)

 

I attended the certification on 5th December at an SAP Authorised Prometric Centre in Chennai, India. The test was supposed to start at 8 AM, but due to some technical issues it was delayed by 15 minutes. To be frank, I was very tense; some thoughts were haunting me:

  • Will I pass? Yes, No, Maybe.
  • What will happen if I fail?
  • Have I taken a risk by putting the money on certification?

 

Anyway, somehow I got through those 15 minutes. We were around 14 people who had come for different certifications. All were given individual systems, and the instructions were dictated by the proctor (an invigilator from SAP).

Some of the instructions were terrifying: if you fail, you need to pay the full amount (without any discounts) to attend the re-certification exam. Shouldn't proctors avoid giving these kinds of instructions before the exam?

After entering all the credentials given by him, yes, the long-awaited screen came up, where the FIRST question was visible.

 

Now something about that exam:

The exam contains 80 questions and the total time is 180 minutes. I would say the full 3 hours is very much needed for this certification.

Questions can be divided into 3 parts:

1) Multiple choice questions, where there is only one answer (radio button type)

2) Multiple choice questions, where there is more than one answer (checkbox type). You will find the number of correct answers beside the question, like 2, 3, etc.

3) A new type of question like the one below (there was only one question like this). It contains 4 lines:

A -- You will have a drop down against it with 5 options. You need to pick the correct one that suits A

B -- You will have a drop down against it with 5 options. You need to pick the correct one that suits B

C -- You will have a drop down against it with 5 options. You need to pick the correct one that suits C

D -- You will have a drop down against it with 5 options. You need to pick the correct one that suits D

 

  • Each correct answer carries 1 point and a wrong answer carries 0 points. For questions that have more than 1 correct answer, you will have to include all the correct answers. Remember, there are no partial points.
  • From the first question onwards, you will have a NEXT question button with which you can start sailing to the next questions.
  • From the second question onwards, you will have a PREVIOUS question button, where you can go back to the previous questions and correct if necessary.
  • One more button, called ASSIST, is also available. When you click the Assist button, it shows a block on the right side with 80 squares, one for each question number. The question you are currently answering has a white background, and the ones already answered have a green background.
  • Also, there is a flagging option available for each question. If you are not sure whether your answer is right, you can flag that question as a reminder to revisit it. But the proctor mentioned a drawback with the flagging option: if any technical problem happens, the flags you have set prior to that will be lost. So beware of this button.
  • If any technical problem happens in the middle of your test, don't panic. A network person will help you restore the session, and whatever answers you have marked until then will be safe. You can continue from where you stopped.
  • You will have a time notification in the top right corner. Always keep an eye on it.

 

Some personal tips from my side:

  • Instead of using the Assist button and the flag option, use a piece of paper (ask the proctor for one) and write the numbers 1-80 on it. Start answering from the first question and strike off the numbers of the questions you have answered with 100% confidence.
  • Try not to spend too much time on individual questions; instead, quickly keep answering the questions you know from the available 80.
  • Once you are done with the first round, check the numbers on the paper that you have not struck off, go back to those questions, read them more carefully, and continue the same process 2-3 times. Then make a judicious decision by picking the answers you are least doubtful about.
  • If you continue with the above steps, after 2.5 hours you will find very few questions still unanswered; dedicate the last 30 minutes exclusively to answering those questions.
  • Also, as said before, some questions will have more than one correct answer. Ensure that you have marked all the correct answers. Don't fully depend on the green background in the Assist option, as the background colour becomes green even if you mark only 1 answer for a question that is supposed to have 3 right answers.
  • Read each question twice or thrice; there will be a few tricky questions where you will be tempted to choose the wrong answer. Beware of such questions.
  • Keep yourself calm and concentrate on the questions.

     3.jpg

After 2 hours and 55 minutes, a pop-up will appear indicating that you have just 5 minutes left.

    4.jpg

It’s time now and you will have to submit your answers. You can use the SUBMIT button for this.

NOTE: This SUBMIT button is there from the first question itself. Ensure that you don't click it by mistake during the test.

I have not seen such a costly SUBMIT button in my life.

       5.jpg

If you don't submit even after 3 hours, the session will automatically disconnect.

You will now see your result: Passed or Failed.

With God’s Grace, I passed my exam.

Please be informed that 66% or above is considered a pass, and the SAP certificate will reach your address in 4-6 weeks.

6.jpg

You will also see the individual percentage of marks in each section and the overall total.

Basically SAP divides your marks into the following sections:

Business Content

Reporting

Provisioning

Modelling

Security

Optimization.

 

Now how to prepare:

HANA 100 (411 pages) and HANA 300 (628 pages) are the most important SAP training materials. Like everybody else, I would say go through each and every line in those docs twice or thrice. Questions can come up from anywhere.


Note:

a) TZHANA is an old version which has now been replaced with HANA 100.

b) Distributing the SAP learning materials for SAP courses is illegal and constitutes a copyright law violation.

c) It's good to go through SDN HANA space as well. That will help you to understand the practical issues which consultants are facing in the commercial projects. Also if possible please go through the HANA developer guide.

d) BW knowledge is always good.

e) Apart from the core HANA topics, there will be some questions related to BODS, BO and SQL as well. The BODS questions would be straightforward; the BO and SQL ones would be slightly tweaked.

 

These are the top 50 links which I would recommend to you:

1) SP04 Features - SAP HANA Modeler - https://www.experiencesaphana.com/docs/DOC-1798

2) DXC - https://websmp206.sap-ag.de/~sapidb/011000358700000421852012E

3) HANA Certification -- http://scn.sap.com/community/hana-in-memory/blog/2012/08/27/my-experience-on-hana-certification

4) Use HANA DSO in modeller -- http://www.saphana.com/community/blogs/blog/2012/09/27/modeler-unplugged-episode-11--importing-bw-models-as-native-hana-models

5) Upgrade and migration BW on HANA -- http://www.saphana.com/community/blogs/blog/2013/05/03/upgrade-and-migration--bw-on-hana

6) HANA Cook Book -- https://cookbook.experiencesaphana.com/bw/operating-bw-on-hana/hana-database-administration/user-access-administration/user-provisioning/deactivatingactivating-users/

7) HANA Academy very good link -- http://saphanaacademy.blogspot.in/

8) Concepts in HANA -- http://scn.sap.com/community/hana-in-memory/blog?start=45

9) Concepts in HANA -- http://scn.sap.com/community/developer-center/hana/blog?start=30

10) Data Types in HANA -- http://help.sap.com/hana/html/_csql_data_types.html

11) Changes after Migration -- http://www.saphana.com/docs/DOC-3723

12) Open SAP -- https://open.sap.com/course/hana1

13) BW on HANA and very Large Tables -- http://www.saphana.com/community/blogs/blog/2013/04/15/bw-on-hana-and-very-large-tables

14) HANA Help.SAP.Com -- http://help.sap.com/saphelp_hanaplatform/helpdata/en/20/cbb10c75191014b47ba845bfe499fe/content.htm?frameset=/en/2e/1ef8b4f4554739959886e55d4c127b/frameset.htm

15) HANA Discussions -- http://www.saphana.com/community/implement/content?filterID=contentstatus%5Bpublished%5D~category%5Bdata-modeling%5D

16) HANA SQL Reference -- http://help.sap.com/hana/html/sql_import_from.html

17) HANA CE Functions -- http://inmemoryhana.blogspot.in/

18) HANA Interview Questions -- http://rajkumarsap.weebly.com/1/post/2012/10/sap-hana-interview-questions.html(Good One)

19) BW Powered by HANA -- http://www.saphana.com/docs/DOC-2129

20) HANA Developer Edition download -- https://hanadeveditionsapicl.hana.ondemand.com/hanadevedition/

21) Indexing Video -- http://www.saphana.com/docs/DOC-3169

22) HANA SQL -- http://help.sap.com/hana/html/sql_grant.html

23) HANA Modelling YouTube Video -- http://www.youtube.com/watch?v=ZZvx2BemlvM&feature=youtu.be&goback=.gde_3683024_member_210180594

24) HANA Videos -- http://www.saphana.com/community/hana-academy

25) Common errors while using HANA Studio -- http://scn.sap.com/docs/DOC-47603

26) HANA Real World Example -- http://www.saphana.com/community/blogs/blog/2013/08/28/sap-hana-live-real-world-example

27) Help.Sap.Com for HANA -- http://help.sap.com/hana_platform#section7

28) HANA for Beginners --- http://www.saphana.com/community/blogs/blog/2012/09/07/sap-hana-for-beginners

29) HANA materials and interview questions -- http://sapbobwbi.blogspot.in/2013/09/sap-hana-interview-questions-2.html  (Good One)

http://sapbobwbi.blogspot.in/2013/09/sap-hana-interview-questions-1.html  (Good One)

30) Free Access to HANA sandbox -- http://sapignite.com/free-access-to-sap-hana-sandbox-for-your-first-hana-experience/

31) Certification King -- http://certificationking.com/download/SAP.htm

32) HANA Material Links -- http://dailylearnbw.wordpress.com/2012/11/15/if-you-plan-to-write-hana-certification-exam-check-this-out/

33) HANA Developer centre -- http://scn.sap.com/community/developer-center/hana?rid=/webcontent/uuid/c022341c-5ed1-2e10-0b98-9b6a3314dd25

34) HANA Training Materials -- http://www.gobookeee.com/sap-hana-training-material/

35) HANA Certification Pathways -- http://www.hanageeks.com/2013/05/sap-hana-certification-pathways.html

36) Free HANA Tutorials -- http://www.freehanatutorials.com/

37) SCN Discussions -- http://scn.sap.com/thread/3194590

38) Succeed HANA Certification -- http://www.thesapjobs.com/sap-hana-certification/

39) SCN Discussions -- http://scn.sap.com/community/developer-center/hana/blog/2013/06/25/want-to-learn-sap-hanawhere-to-startcertification

40) HANA Questions -- http://sapdairy.blogspot.in/2013/05/sap-hana-certification-questions.html   (Good One)

41) HANA Links -- http://scn.sap.com/community/hana-in-memory/blog/2012/08/27/my-experience-on-hana-certification

42) Certifications sample Questions -- http://www.sap-school.com/certificationquestionsHANA.html   (Good One)

43) SCN Discussions -- http://scn.sap.com/thread/2131589

44) HANA services Library -- http://scn.sap.com/docs/DOC-46745

45) BW on HANA FAQ -- http://www.saphana.com/community/learn/solutions/net-weaver-bw/bwonhanafaq

46) Business Suite on HANA FAQ -- http://www.saphana.com/community/learn/solutions/sap-business-suite-on-hana/business-suite-on-hana-faq/content

47) BW 7.4 on HANA -- http://scn.sap.com/docs/DOC-35002

48) HANA Blogs -- http://www.saphana.com/community/blogs/blog/2012/06/20/does-sap-hana-replace-bw-hint-no--part-2

49) HANA Courses -- http://scn.sap.com/docs/DOC-39531

50) HANA Reference for Developers -- http://scn.sap.com/community/hana-in-memory/blog/2013/08/17/hana-reference-for-developers--links-and-sap-notespart-2#comment-381371

 

Note: Taking the HANA certification is always good but don't presume that it will take you directly to a HANA job.

Yes, with the 'CLOUD' also coming into picture, job opportunities will increase, for sure!

 

On a lighter note: As I mentioned at the beginning of this blog, there was around a 15-minute delay before the exam. I was very tense and decided to draw a picture to relax myself. I finally completed a beautiful picture and am dedicating it to all the members here on SDN who are preparing for certification.

7.png

 

Seeing this, I hope you can all make out how good an artist I am.

Anyway, jokes apart, prepare seriously for your exam. Do your part well and God will take care of the rest.

Just think about one point: if many hundreds can pass the HANA exam, WHY CAN'T I?

               11.png

All the best to those who are going to prepare for the exam!

 

Enjoy HANA!

 

Regards,

Prabhith

What's new in SAP HANA SPS 7 - Installation and Update


Introduction

 

In the upcoming weeks we will be posting new videos to the SAP HANA Academy to show new features and functionality introduced with SAP HANA Support Package Stack (SPS) 7, released December 3, 2013. To get the best overview of what’s new in SAP HANA SPS 7, see Ingo Brenckmann's blog on the SAP HANA community site.

 

The topic of this post is installation and update, and it complements a number of tutorial videos posted to the SAP HANA Academy site and on YouTube:

 

What's New with SPS 7?

SAP HANA components (server, studio, client) are installed with the installer hdbsetup in graphical mode, or hdbinst on the command line with an optional configuration file.

As of SPS 4, the installation process was complemented with a tool called Unified Installer in the documentation (and install.sh on DVD). This Unified Installer called, with some prerequisite checking and error handling, the installers for SAP host agent and for SAP HANA server, studio, and client, and script hanaconfig.sh for AFL, etc., all with the required parameters. Obviously, when preparing a SAP HANA system, it was a great timesaver to be able to run only one installer instead of running all those different installers and scripts. Unfortunately, the Unified Installer tool did not have the sophistication of the component installer hdbinst, as it would not accept parameters on command line or file, for example.

Until SPS 7, this concerned SAP employees and SAP certified hardware partners, as apart from the different Cloud offerings, the SAP High-performance Analytical Appliance was only available as ... appliance, that is, as a closed computer system with both hardware and software pre-configured and optimised, similar to SAP Netweaver Business Warehouse Accelerator, for example.

As of SPS 7, an additional approach besides appliance delivery is now available. This approach is called SAP HANA tailored data center integration, or DCI, for those that prefer to use acronyms. With tailored data center integration, any SAP HANA certified engineer can now install SAP HANA on certified hardware. This put a spotlight on the installation process and this process has now been greatly enhanced with SAP HANA lifecycle management tools.


For more information about data center integration, see the blog posted by Adolf Brosig, Take Your Choice: How to Integrate SAP HANA as Smoothly as Possible Into Your Data Center or the overview presentation by Jens Rolke, Overview - SAP HANA tailored data center integration.

For more information about certified SAP hardware partners, see Rely on hardware from global technology leaders.


Does this also mean we can now run the SAP HANA database on Windows, one might ask? No, it does not. The only supported operating system for the SAP HANA server is still SUSE Linux Enterprise Server (SLES) 11. For more information about supported operating systems for server and clients, see the Product Availability Matrix (PAM) for SAP HANA.

 

Unified Installer is now deprecated and will no longer be shipped with future releases.

 

SAP Certified Technology Specialist - SAP HANA Installation

 

Interested in performing SAP HANA server installations? With the tailored data center integration offering, installing SAP HANA is no longer restricted to certified hardware partners and a new certification has been introduced to guarantee required skills and knowledge: E_HANAINS131 - SAP Certified Technology Specialist  - SAP HANA Installation. Prerequisite is the SAP Certified Technology Associate certification. The training HA200 presents the required knowledge.

 

For more information about this certification, see the SAP Training and Certification Shop.

 

SAP HANA lifecycle management

SAP HANA lifecycle management tools come, like the SAP HANA single-component installers, in two flavours: command line or graphical, hdblcm and hdblcmgui. The GUI is convenient for a guided installation requiring minimal input and uses default values for parameters where possible. The command line offers the same interactive installation, in case X-Windows is not configured on the Linux host.

 

The introduction video below discusses important concepts, what documentation to read, and where, what and how to download SAP HANA software components, etc.

 

 

The next video shows the actual installation using the new Lifecycle Manager GUI in interactive mode performed on a Windows client using Xming X Server for Windows together with PuTTY and SSH.

 

 

As mentioned, not every parameter can be specified using the guided interactive installation, and one such parameter is Autostart. This parameter controls whether the SAP HANA system is configured to start automatically when the host operating system starts (using sapinit). This defaults to no.

Should you wish to change this behaviour, see the video below on how to configure SAP HANA database to start at system boot.

 

 

Besides interactive installation, lifecycle management allows for a scripted installation as well, and this clearly is its forte. Imagine performing a full SAP HANA installation including server, studio, client, AFL on a distributed system of 10 nodes. This can now be achieved with a single parameter file and a single command.

 

Another example listed in the installation guide is about a hardware partner that wants to automate the installation of nine SAP HANA systems (1 Extra Large, 5 Large, and 3 Small). By specifying one parameter, a template configuration file is generated, which can then be edited using any text editor.

 

./hdblcm --action=install --dump_configfile_template=/home/root/HANA_install.cfg

 

As such, the partner can create configuration files for each of the three system types and call hdblcm with the configuration file parameter in batch mode. Whereas interactive mode will prompt for password, batch mode runs the installer without asking for any input. Passwords can be stored in an XML file and passed to the installer as a stream by standard input, or they can be specified in the configuration file.

 

cat ~/Passwords.xml | ./hdblcm --configfile=/home/root/HANA_install_S.cfg --read_password_from_stdin=xml -b

 

When the installation script is run, SAP HANA is installed on both the single-host and multi-host systems, without any additional input. By reusing the same configuration files, the installations are reliable, flexible, and efficient.

 

SAP HANA studio and client installations

One of the new features introduced with SAP HANA SPS 7 is that local Administrator rights on Windows computers (or root privileges on Linux and UNIX) are no longer required to install SAP HANA studio or the SAP HANA client. If you wish to install these components but do not have administration rights on a particular system, the product will simply install in your home environment and will be available for your use only.

 

SAP HANA studio is an Eclipse-based IDE that can be used for SAP HANA development, analytical modelling and database administration.

The video below shows how to install SAP HANA studio on Linux and Windows.

 

 

SAP HANA client comprises ODBC, JDBC, and MaxDB runtime. On Windows, ODBO for Excel OLAP analysis is included as well.

 

The video below shows how to install the SAP HANA client on Linux and Windows.

 

 

SAP HANA studio update site

Another great feature of SAP HANA studio is the update site. You can configure a SAP HANA system to host the update package for SAP HANA studio. This way, when you update the SAP HANA server using the lifecycle management tools, an update is immediately available for SAP HANA studio. Anyone using studio in your environment can then update their version to the latest available at their convenience. Users can even configure studio to check automatically on a certain schedule or at startup, for example, to verify whether an update is available. The update site mechanism uses HTTP(S), the same as the other components of the Eclipse IDE or SAP HANA tools [http://tools.hana.ondemand.com], and can be made available - if desired - to anyone with internet access.

 

As of SPS 7, the update site is hosted by SAP HANA XS. Previously, Software Update Manager (SUM) was used for this purpose.

 

The video below shows how to update SAP HANA studio automatically using an update site (and how to configure SAP HANA XS to host this).

 

 

To be continued...

You can view more free online videos and hands-on use cases to help you answer the What, How and Why questions about SAP HANA and Analytics on the SAP HANA Academy at academy.saphana.com or follow us on Twitter @saphanaacademy.


SAP HANA - How to create an Analytic View


Today we will create an Analytic View, which is one of the artifacts of HANA. Please go through my previous post, which describes the Attribute view: SAP HANA - How to create an Attribute View.

 

What is an Analytic View ?

1. An Analytic view is an information view.

2. It joins together one central fact table, which contains the measures, with any number of other tables or attribute views.

3. This is the basic view type that is directly used as a source of data for reporting.

4. An Analytic view is comparable to an InfoCube in SAP BW.

 

Analytic View = 1 fact table (with key figures) + (1 or more tables/attribute views).

 

In this example we will use one fact table from the EPM demo data model and an attribute view which was created in my earlier post: SAP HANA - How to create an Attribute View.

 

You can avail of the 30-day free trial here: Get 30 days of free access to SAP HANA, developer edition.

 

1. Using Quick Launch, choose ANALYTIC VIEW and click Create. Alternatively, right-click the desired package in the navigator pane and choose NEW and then ANALYTIC VIEW.

2. In the dialog box that comes up, fill in the necessary information and click Finish.

3. Now you see the screen below. There are 3 parts of the Analytic View to look at: (a) Semantics, (b) Logical Join and (c) Data Foundation.

4. The Data Foundation is where we build the fact table by using one or more tables from the schema. Here I am using only 2 tables, the sales order header and item tables from the EPM data model. These two tables are joined and the output columns are determined as shown below. With this, our fact table is ready with the key figures Gross Amount, Net Amount and Quantity.

Note: Only tables are used in the Data Foundation
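
For orientation, the join built in the Data Foundation corresponds roughly to the following SQL. This is illustrative only: it assumes the EPM demo tables SNWD_SO (sales order header) and SNWD_SO_I (sales order item); check the table and column names against your own EPM schema before relying on them.

-- header/item join producing the amount measures used in the fact table
SELECT h.SO_ID,
       i.GROSS_AMOUNT,
       i.NET_AMOUNT
  FROM SNWD_SO   AS h
  JOIN SNWD_SO_I AS i
    ON i.PARENT_KEY = h.NODE_KEY;
-- the Quantity key figure is omitted here; depending on the EPM model version it may come from a different table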

5. The next step is the Logical Join, where we join the fact table created in the above step with other attribute views (dimensions): Customer and Material.

Note: Only Attribute Views(dimensions) are used in the Logical Join.

The Logical Join will look like this.

 

6. Save and activate the Analytic View. The data preview will look like below.

 

 

All the activated attribute and analytic views can be found in the _SYS_BIC schema (runtime objects).
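
As a quick check after activation, you can query the generated column view in _SYS_BIC directly; the package and view names below are placeholders for whatever you chose when creating the analytic view:

-- hypothetical package/view name; replace with your own
SELECT "CUSTOMER_NAME", SUM("GROSS_AMOUNT") AS "GROSS_AMOUNT"
  FROM "_SYS_BIC"."mypackage/AN_SALES"
 GROUP BY "CUSTOMER_NAME";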

We can use this analytic view in any HANA-native development application, such as an SAPUI5 application.

 

Happy Learning..:)

 

Please don't forget to post comments and correct me if I am wrong.

How to read from and write to HANA via XS OData Services using jQuery and datajs


Overview

We recently adapted a jQuery application using a PHP/MySQL backend to work with XS OData services exposing HANA tables instead. All our files are checked into the HANA server's repository to avoid any cross-domain policy issues. The data read from the service is saved into a client-side database via JavaScript, or posted back to the service from the client.

 

Some of the pitfalls we stumbled upon were the exact formats of the JSON payloads in the requests and responses, request and response headers and details of the service definition. We used the native jQuery AJAX function as well as datajs' OData for both reading and writing. Here's a basic overview of how to get it running.

 

Thanks to Thomas Jung and Martin Strenge for quickly answering all the questions we had.

 

Prerequisites

To understand this blog post, you should have some understanding of JavaScript and, for example, the Chrome dev tools, as well as access to a HANA server.

 

Service definition

This is as simple as they come, exposing a single table to read from/write to.

 

service namespace "App.services" {

  "App.tables::tables.myTable" as "myTable";

}

 

Important: Expose the designtime object, not the runtime object. This was one of the mistakes that took us some time to find. Changes made to the designtime table definition were not reflected in the runtime table, and thus the service, and caused errors.

 

This will of course only work for simple inserts; for anything non-trivial, use the create statement to bind e.g. a SQLScript procedure to the service.

 

Reading from the database via OData

Reading via datajs

OData.read("../services/service.xsodata/myTable", function (data) {

  App.saveResultsToLocalDB(data.results);

},

function (err) {

  console.log(err);

});

 

What the Request should look like (in excerpts)

Request URL:http://server:8000/App/services/service.xsodata/myTable

Request Method:GET

Accept:application/json

 

What the Response should look like (in excerpts)

content-type:application/json

 

The returned JSON object will look something like this:

 

{"d":{"results":[{"__metadata": {"uri":"http://server:8000/App/services/service.xsodata/myTable('1369226e-8f57-3877-c380-134a64d97b10')","type":"App.services.myTableType"},"key":"value", ...}]}}

 

So your application needs to drill down into the data object to data.d.results.

 

Reading via jQuery ajax

$.ajax({

  type: "GET",

  url: "../services/service.xsodata/myTable",

  cache: false,

  dataType: "json",

  error : function(msg, textStatus) {

    console.log(textStatus);

  },

  success : function(data) {

    App.saveResultsToLocalDB(data.d.results);

  }

});

 

What the Request should look like (in excerpts)

Request URL:http://server:8000/App/services/service.xsodata/myTable?_=1387451046241

Request Method:GET

Accept:application/json

 

What the Response should look like (in excerpts)

content-type:application/json

 

The returned JSON object will look the same as above:

 

{"d":{"results":[{"__metadata": {"uri":"http://server:8000/App/services/service.xsodata/myTable('1369226e-8f57-3877-c380-134a64d97b10')","type":"App.services.myTableType"},"key":"value", ...}]}}


Writing to the database via OData

Writing via datajs

Do NOT use JSON.stringify(); rather, send the JSON object directly when using datajs. It's important that the JSON object sent is just a single object {...}, not e.g. an array [{...},{...}], for XS to be able to use it.

 

OData.request( {

  headers: {"accept": "application/json"},

  requestUri: "../services/service.xsodata/myTable",

  method: "POST",

  data: data

  },

  function (data, response) {

    console.log(data);

  },

  function (err) {

    console.log(err);

  }

);

 

What the Request should look like (in excerpts)

Request URL:http://server:8000/App/services/service.xsodata/myTable

Request Method:POST

Accept:application/json

Content-Type:application/json

Request Payload:

 

{"key":"value",...}

 

What the Response should look like (in excerpts)

The returned JSON object will look the same as with the GET request above:

 

{"d":{"results":[{"__metadata": {"uri":"http://server:8000/App/services/service.xsodata/myTable('1369226e-8f57-3877-c380-134a64d97b10')","type":"App.services.myTableType"},"key":"value", ...}]}}

 

Writing via jQuery ajax

Data (the payload sent to the server) needs to be a string, so use JSON.stringify(). It's important that the JSON object sent is just a single object {...}, not e.g. an array [{...},{...}] for XS to be able to use it.

 

data = JSON.stringify(data);

 

$.ajax({

  type: "POST",

  url: "../services/service.xsodata/myTable",

  data: data,

  cache: false,

  dataType: "json",

  contentType: "application/json",

  error : function(msg, textStatus) {

    console.log(textStatus);

  },

  success : function(data) {

    console.log(data);

  }

});

 

What the Request should look like (in excerpts)

Request URL:http://server:8000/App/services/service.xsodata/myTable

Request Method:POST

Accept:application/json

Content-Type:application/json

Request Payload:

 

{"key":"value",...}

 

What the Response should look like (in excerpts)

The returned JSON object will look the same as with the GET request above:

 

{"d":{"results":[{"__metadata": {"uri":"http://server:8000/App/services/service.xsodata/myTable('1369226e-8f57-3877-c380-134a64d97b10')","type":"App.services.myTableType"},"key":"value", ...}]}}

Architecture Bluebook: SAP HANA Redefining Customer Analytics


SAP Customer Engagement Intelligence, powered by HANA is a suite of high performance applications (HPAs) for marketing and sales lines of business that only became possible with our key innovation SAP HANA, enabling a fundamental change in customer analytics. The latest architecture bluebook describes the concepts of SAP Customer Engagement Intelligence 1.1 SP01. The target group for this bluebook is technical readers who need to know the concepts and want to learn about customer analytics. It may also help consultants to understand the capabilities of SAP Customer Engagement Intelligence and what benefits they can expect, and to some degree how to implement this product.

HPAs are built as add-ons to the SAP NetWeaver Application Server (AS) ABAP 7.40 with SAP HANA as the primary database. As shown in the following high-level architecture diagram, the Business Suite Foundation, gateway components, and SAP NetWeaver UI Extension Add-On are deployed on top of the AS ABAP.

Figure01-High-Level Architecture of HPAs.jpg

SAP Customer Engagement Intelligence is natively built using SAP HANA for transactional and analytical features. It provides high value to customers and comes with the following key capabilities and benefits:

  • Big Data – Leverage SAP HANA natively for high-performing analysis on huge volumes of granular data and built-in predictive calculations – all based on data replicated from SAP and non-SAP systems
  • Insight-to-action – Combine analytics with transactions to convert insight immediately into action
  • Superior user experience leveraging SAPUI5/HTML5 including social collaboration
  • Adoption in a risk-free, non-disruptive, side-by-side mode, with on-premise deployment or in the SAP HANA Enterprise Cloud
  • Continuous innovation with quarterly deliveries

Analyzing, measuring, and proving the top-line and bottom-line contribution of marketing activities are critical success factors in a business environment that challenges marketing budgets. With SAP Customer Engagement Intelligence, SAP enables marketing departments to drive demand, qualify leads, and win customers more efficiently based on real-time consumer insights and targeted action.

 

Enjoy reading the full architecture bluebook about how SAP HANA redefines customer analytics, available in the solution launch JAM!


SAP HANA Replication issue


Hi,

After the patch upgrade for SAP HANA, we are not able to replicate newly created tables from the ECC system to SAP HANA.

 

Below are the current Version details for SAP HANA


SAP HANA Studio

------------------------------------------------------------------

 

Version: 1.0.41

Build id: 201211131031 (370506)

------------------------------------------------------------------

SLT Server

------------------------------------------------------------------

 

Component version - SAP EHP 2 for SAP NetWeaver 7.0

Installation number  - 0020563958

 

Software Component | Release | Level | Highest Support Package | Short Description of Support Package
SAP_BASIS | 702 | 10 | SAPKB70210 | SAP Basis Component
SAP_ABA | 702 | 10 | SAPKA70210 | Cross-Application Component
PI_BASIS | 702 | 10 | SAPK-70210INPIBASIS | Basis Plug-In
SAP_BW | 702 | 10 | SAPKW70210 | SAP Business Warehouse
DMIS | 2010_1_700 | 5 | SAPK-91705INDMIS | DMIS 2010_1_700

 

 

------------------------------------------------------------------
SLT Server Database Data

------------------------------------------------------------------

Database system  -        MSSQL

Release                 -       10.50.2769

Name                     -       NS2

-------------------------------------------------------------------

Thanks.

What's new with SAP HANA SPS07


What's new with SAP HANA SPS07? Check out everything it has to offer, with updated features from HANA tailored data center integration to disaster recovery. To learn more about each of these features and how they can help your business, check out this SlideShare playlist with detailed information about each important update:

http://spr.ly/6181drTT

 

 

What's New? SAP HANA SPS 07




HANA SPS07 Extended Application Service


HANA SPS07 Text Analysis

 

HANA SPS07 Architecture & Landscape


HANA SPS07 Tailored Datacenter


HANA SPS07 SQL Script


HANA SPS07 Replication

 

HANA SPS07 Disaster Recovery

 

HANA SPS07 Shine

 

HANA SPS07 Security

 

HANA SPS07 Smart Data Access

 

HANA SPS07 River

 

HANA SPS07 Business Intelligence

 

HANA SPS07 Modeling Enhancements

 

HANA SPS07 LCM

 

HANA SPS07 Geospatial Processing

 

HANA SPS07 Fuzzy Search

 

HANA SPS07 Studio Development Perspective

 

HANA SPS07 Backup & Recovery

 

HANA SPS07 Web-Based Development Workbench

 

HANA SPS07 App Function Library

 

 

Regards

Srinivas

SAP HANA - EXPECT MORE FROM YOUR DATABASE


The announcement of the SAP HANA platform has created a lot of buzz in the IT and business world. In the last few months, SAP HANA has seen a great leap forward in its development tools. It is now much simpler to use. Today we have a central platform that remains as solid and stable as before, while the development platform has been significantly improved.

 

Why HANA?

Nowadays, enormous data volumes are being generated, doubling in size every two years, primarily on account of increasing electronic data sources. Conventional data processing and graphical representation technologies simply can't handle these volumes. We need new ways to solve this problem - ideas for how to cope with the collection, storage, distribution, search techniques, analysis and graphical representation of such large volumes, in acceptable runtimes. Add to that the fact that part of this data is unstructured.

Until now, BI in companies has been affected by the fact that data is no longer current by the time it’s used for analysis. Analysis takes too long and can’t aid decision-making. In the cocktail of big data issues, real-time technologies no doubt make the most direct contribution to improving corporate BI. Another major drawback with the current BI in companies is the fact that almost the only data being referred to is structured data.

Today's business users need to react much more quickly to changing customer and market environments. They demand dynamic access to raw data in real time. SAP HANA empowers users with flexible, on-the-fly data modeling functionality by providing non-materialized views directly on detailed information. SAP HANA liberates users from the wait time for data model changes and database administration tasks, as well as from the latency required to load the redundant data storage required by traditional databases.

Some use the term “in-memory” in the context of optimizing the I/O access with database management, centering on accessing data from the hard disk by pre-storing frequently accessed data in main memory. The term is also used for a traditional relational database running on in-memory technology. Some solutions offer columnar storage on traditional hard-disk technology, while other platforms offer the option of storing data on solid state disks (SSD). Although these disks have no moving parts and access data much more rapidly than hard disks, they are still slower than in-memory access.

Only SAP HANA takes full advantage of all-new hardware technologies by combining columnar data storage, massively parallel processing (MPP), and in-memory computing by using optimized software design.

 

Why SAP HANA is a game changer?

Imagine you own a retail store that wants to treat its loyal customers by offering them discounts on their next purchase. A customer has just bought something and passed her credit card to the retail executive. Just as the retail executive swipes her credit card, the internal system immediately tells him that last time she bought black shoes at your store. You instantly offer her a discount on a black dress she might be interested in - one that will go well with her black shoes!

 

This is the power of real-time data analysis and decision making to improve business performance manifold! SAP HANA has several incredible features that set it apart from traditional databases. Let's delve into these and find out why SAP HANA is getting so popular:

1. Columnar data storage (see the sketch after this list)

2. In-memory database system

3. Parallel processing

4. SAP HANA provides real-time analytics

5. Innovations are possible with SAP HANA
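
As a small illustration of the first feature, SAP HANA lets you choose column (or row) storage per table at creation time. The table below is a made-up example, not taken from this blog:

-- minimal sketch: a column-store table declared explicitly
CREATE COLUMN TABLE SALES_ORDERS (
   ORDER_ID  INTEGER PRIMARY KEY,
   CUSTOMER  NVARCHAR(40),
   AMOUNT    DECIMAL(15,2)
);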

 

What are the technical components that make up HANA?

 

SAP HANA is a combination of three different products – TREX: a search engine, P*Time:  an in-memory, light-weight online transaction relational database management system (OLTP RDBMS) technology and MaxDB: a database technology with persistence, conventional RDBMS features and columnar storage capabilities. SAP built the HANA appliance in collaboration with Stanford University and Hasso Plattner institute in Germany.

The earlier in-memory products of SAP were based on TREX and liveCache technologies, e.g. BI Accelerator, BI Explorer, and Enterprise Search. SAP SCM Advanced Planning and Optimization is based on liveCache.

At the time of launch, HANA started with 1TB of RAM and supported up to 5TB of uncompressed data. By 2011, RAM capacity of 8TB supported up to 40TB of uncompressed data. By 2012, HANA was able to run on servers with 100TB of RAM powered by IBM.

 

What are the landscape considerations for SAP HANA?

 

SAP HANA supports ABAP (Advanced Business Application Programming) and doesn’t support Java stack based applications. It works only on SUSE Linux OS and supports distributed installations. A customer’s HANA appliance has to be installed only on servers provided by SAP certified hardware partners.

 

What if SAP HANA and Hadoop come together?

 

Hadoop is a free, Java-based programming framework that supports the processing of large data sets in a distributed computing environment. It is part of the Apache project sponsored by the Apache Software Foundation.

Hadoop makes it possible to run applications on systems with thousands of nodes involving thousands of terabytes. Its distributed file system facilitates rapid data transfer rates among nodes and allows the system to continue operating uninterrupted in case of a node failure. This approach lowers the risk of catastrophic system failure, even if a significant number of nodes become inoperative. The Hadoop framework is used by major players including Google, Yahoo and IBM, largely for applications involving search engines and advertising. The preferred operating systems are Windows and Linux, but Hadoop can also work with BSD and OS X.

SAP HANA is particularly efficient at making real-time decisions and provides support systems for decision-making. It is also very good at managing large amounts of data, although not yet at the same level as Hadoop.

On the other hand, Hadoop enables large amounts of data to be stored arbitrarily in an efficient way. One of its primary strong points is that it allows us to find a needle in a (huge and unstructured) haystack. In short, it carries out real and complex data mining, always running in batch-processing mode.

After testing both technologies in a pilot development, GFT strongly supports the combination of SAP HANA and Hadoop as complementary technologies. With this combination we can face the challenge of managing large volumes of data, both in terms of speed and on a variety of scales. The following chart compares volume and velocity against volume and variety for SAP HANA and Hadoop, as well as the combination of both. Evidently, a combination of the two covers the total spectrum of possibilities, and therefore produces the optimum results.

In conclusion, combining both technologies leverages their individual strengths, allowing a comprehensive Big Data solution to be built. SAP is currently working on integrating SAP HANA and Hadoop.

 

Is it a platform?

 

While the SAP High-Performance Analytic Appliance (HANA) initially started out as a specialty engine designed to run analytic applications, HANA has rapidly morphed into a full-blown database platform. In fact, if you include all the database engines that SAP now owns, such as Sybase and HANA, the company says it will be the number two supplier of database engines by 2015.

The interesting fact here is that SAP plans to move its current data warehouse offering, which runs on top of its NetWeaver middleware, to the HANA platform. The company also plans to deliver master data management and data governance services via HANA, while also moving a range of business intelligence, predictive analytics, OLAP and enterprise performance management applications to the platform.

 

River Definition Language (RDL):

 

River Definition Language (RDL) is a new integral language for developing native business applications running on top of SAP HANA. River focuses on intent, using an object-oriented, highly declarative, and modular syntax. It covers all aspects of the application, including error handling and access control. River cross-compiles into efficient code that runs either in XS (JavaScript) or in the SAP HANA database engine (SQLScript).

Key Design Principles:

  1. No Runtime
  2. Simplicity
  3. Coherency End to End
  4. Openness

 

Is it time to switch to SAP HANA?

 


Now SAP Business Suite customers are wondering whether they should replace their underlying relational databases with this new technology. In-memory databases (IMDBs) have the ability to provide real-time information in nanoseconds instead of milliseconds, a capability that is important for many emerging applications.

So, whether an insurance company wants to calculate the premium amount on various policies, a hospital requires data on all the heart surgeries that took place in the last quarter, or a real-estate firm wants quick access to its database to follow up with a client, SAP HANA can do all of this with the features it is blessed with! Characteristics like no aggregate tables, strong computation power with built-in multitenancy, flexible modelling, a large memory footprint, no data duplication and fast data loads lead to immediate access to huge volumes of data, in turn leading to quick and efficient decision making.

 

So, enter the new era of SAP HANA and get started with unbelievable adventures!

 

 

Thanks for your patience

Research paper on SAP HANA by NEW FRONTIER, ORACLE and PAC


The three research papers I have gone through are written by three big companies, namely:

 

 

 

 


New Frontier Software Inc.

 

New Frontier is a software consultancy which enables other firms to maximize the value of their SAP implementations by rapidly delivering information to the business.

The major software and database technology they use mainly comes from software companies like

  • IBM & Netezza, Kognitio, Microsoft, Oracle, Sybase, Teradata

 

Business Intelligence tools used by New frontier are from software companies like

  • Cognos, Microsoft, MicroStrategy, Oracle, SAP BusinessObjects

 

1st Research paper name: SAP HANA Implications and Alternatives, 2012

 

Key Points

 

  • How databases underpin SAP
  • How SAP Positions HANA and what it means to SAP
  • Explanation of the HANA Technology
  • The various options for SAP customers

 

Summary

 

Today HANA is at an early stage. SAP now has its own database, SAP HANA, a high-performance in-memory database platform.

 

Earlier, many SAP customers used Oracle, Microsoft and IBM databases to store their data, but now they use SAP because of its cloud service.

 

The platform database provided to the user by SAP HANA is faster than previous database engines, typically Oracle, DB2 and SQL Server.

 

Now SAP NetWeaver is independent of Oracle, IBM or Microsoft.

 

Gartner even described SAP as throwing the gauntlet down to Oracle and other database vendors.

 

Dr. Vishal Sikka, SAP Executive Board member, mentioned: "HANA is at the heart of SAP's renewal and is core to everything we are working on."

 

HANA Technology

 

  • In-memory database
  • Data management
  • HANA modeling studio
  • Real-time replication
  • Data services
  • Calculation and planning engine

 

Options for SAP Customers

 

How much Does HANA Cost?

 

Firms will get higher performance by shifting to the HANA database.

The cost is £15,000 per 64 GB blade.

 

Maturity

 

Fit for purpose: New Frontier is completely neutral about databases, recommending whichever database fits a particular purpose.

 

PAC

 

Pierre Audoin Consultants is a privately held research and consultancy firm for the software and ICT services market.

 

2nd Research paper by PAC

 

SAP HANA - Where does it take us and how to benefit, Nov 2012.

 


 

Key Points

 

  • SAP HANA- Where does it take us and how to benefit?
  • Aspects to be considered when Implementing SAP HANA
  • Applications running on SAP HANA

 

Summary

 

SAP users are already expressing a lot of interest in the topic of SAP HANA.

 

Three different usage scenarios for SAP HANA can be identified :

  1. Side by side
  2. Primary persistence
  3. Platform

 

SAP HANA projects are primarily driven by three aspects:

 

  1. The aim of improving general performance to produce results faster
  2. The need to fulfill the rising demand of lines of business for analytics more quickly and effectively
  3. Real-time data management

 

 

Aspects to be considered when Implementing SAP HANA :

  1. Preparation tasks
  2. "SAP HANA readiness" of the SAP environment
  3. SAP HANA strategy
  4. SAP HANA roadmap
  5. Costs of SAP HANA
  6. Return on investment

 

The following are some of the applications that run on SAP HANA:

ChriTra, SAP Forecasting, SAP Collections Insight and many more.

 

 

ORACLE 

 

Oracle Corporation is an American multinational computer technology corporation headquartered in Redwood City, California, United States. The company specializes in developing and marketing computer hardware systems and enterprise software products – particularly its own brands of database management systems. Oracle is the second-largest software maker by revenue, after Microsoft.

Every new SAP customer needed to buy database software, and most of that revenue went to Oracle. So SAP is probably Oracle's biggest-ever customer.

When Oracle started to acquire application vendors like JD Edwards, Siebel, and PeopleSoft, the friendship cooled down. Larry Ellison, Oracle's founder and CEO, started to attack SAP publicly instead of Microsoft. Oracle's Exadata (a database appliance on Sun hardware) is the most obvious alternative to the HANA database, and later this year Exalytics, an in-memory appliance based on TimesTen.

 

3rd research paper by ORACLE

Analysis of SAP HANA High Availability Capabilities in June 2013

 

On the opening page Oracle mentions that the aim is not to compare the SAP HANA database with the Oracle Exadata database, but ironically, throughout the whole research paper Oracle only explains its own technology and tries to convey that Oracle is better than SAP.

 

Oracle mainly argues in this research paper that Oracle is superior to SAP:

  • SAP HANA is immature.
  • SAP HANA is not enterprise-ready.

                                                                 

This white paper also covered:

 

  1. SAP HANA database
  2. SAP HANA appliance
  3. Analysis of high availability
  4. Scalability features
  5. Disaster recovery features
  6. Summary analysis of all HANA features


I hope you all liked my summaries of the three research papers.

DB000: Initial post for XS/SAPUI5 blog


Over the upcoming year, I hope to post regularly regarding my experiences developing web applications using XS for the back-end and SAPUI5 for the front-end.

 

As a short background, during the past year I have developed a few different applications using XS and SAPUI5.  Overall, I have found the two technologies to be quite useful and generally easy to use.  However, as with all technologies, especially those under active development, I have encountered various difficulties along the way and on occasion been extremely frustrated trying to find workarounds to things which seem like they should be simple to achieve. My hope in writing these posts is that others will be able to see the benefits of these SAP technologies and be able to more quickly and easily incorporate them into their development arsenal.

 

Some of the topics I plan to cover in the short term are:

  • Libraries in XS
  • HTTP request handling in XS
  • HTTP responses in XS
  • Sending and receiving binary data using XS
    • Using SAPUI5 FileUploader
    • Handling multi-part form HTTP requests in XS

 

As there already exists good material for creating XS projects, I will assume you can already do this.  For those who have not already created an XS project, if you wish to do so using HANA Cloud, I refer you to a post by Stoyan Manchev.  For On-Premise HANA development, I refer you to the "Getting Started" series from the SAP HANA Academy.


For most of the topics I intend to cover, it should not make a difference whether you are developing on top of HANA Cloud or against an On-Premise HANA installation.

DB001: Using libraries in XS


Fundamental to any serious development is the ability to have code which is organized and easy to maintain.  Therefore I am choosing the topic of using libraries in XS as the first post for my XS/SAPUI5 blog series.

 

The actual mechanism for creating and using a library within XS is quite straightforward.  First, the library function is placed in an .xsjslib file.  For this example, I am calling the library SimpleLib.xsjslib.

 

////////////////////////////////////////////////////////////////////////////////////////////////////
// David Brookler's XS/SAPUI5 Blog
// 001 -- Using libraries in XS
// SimpleLib.xsjslib
//////////////////////////////////////////////////////////////////////////////////////////////////// 
function simpleLibFunction(s) {
  return "This came from simpleLibFunction() -- " + s;
}


Next we need to import the library.  The $.import command has two parameters.  The first parameter is a string representing the path to the library, with the directory separator being a period.  The second parameter is the name of the library without the .xsjslib extension.  See the $.import call in the listing below.

 

///////////////////////////////////////////////////////////////////////////////////////////////////
// David Brookler's XS/SAPUI5 Blog
// 001 -- Using libraries in XS
// simpleLibCall.xsjs
//////////////////////////////////////////////////////////////////////////////////////////////////// 
// import the library
$.import("blog.b001", "SimpleLib"); 
// create a variable for simpler access to the library
var SimpleLib = $.blog.b001.SimpleLib; 
// create a response object
var oResponse = {};
oResponse.directCall = $.blog.b001.SimpleLib.simpleLibFunction('direct call');
oResponse.callThroughLibVariable = SimpleLib.simpleLibFunction('call through variable'); 
// send the response object as a JSON string
$.response.setBody(JSON.stringify(oResponse));
$.response.status = $.net.http.OK;

 

Once the library has been imported, a call to a library function simply involves prefacing the call with the full path to the library, including the library's name.  The root is represented by a $.  See the direct call through $.blog.b001.SimpleLib in the listing above.

 

It is also possible to create a variable to reference the library (the var SimpleLib assignment above).  This variable can then be used to simplify access to the library, as in the second call above.

 

The result of calling the XS web service will be:

 

{
  "directCall": "This came from simpleLibFunction() -- direct call",
  "callThroughLibVariable": "This came from simpleLibFunction() -- call through variable"
}

 

In the next post, I plan to look at how to create libraries which control what is exposed as its interface.

SAP HANA - How to create an Analytic View Part1


This is an extension to my previous post on Analytic View SAP HANA - How to Create Analytic Views

In this post we will try to look at the Calculated columns and Restricted Columns in Analytic Views.

H1.png

CALCULATED COLUMN :

Calculated columns in an analytic view are of two types: 1. Measure, 2. Attribute.

In this example we are going to create a new calculated column measure called TOTAL_AMOUNT, which is calculated by multiplying Order Quantity (from the Data Foundation) and Material Unit Price (from the Attribute View).

 

1. Right click on the CALCULATED COLUMNS tree structure under output ( screen shot above)

2. Provide the necessary information in the pop-up window: Name, Data Type, Column Type (here Measure).

H1.png

   In the expression editor, drag and drop the fields from the ELEMENTS on which we need to calculate. In this example we are selecting QUANTITY from the output columns list of the Data Foundation and UNIT PRICE from the output column list of the Attribute View AV_MATERIAL_INFO, as shown below.
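The resulting expression is just a multiplication of the two dragged columns. As a rough sketch (the exact identifiers depend on the output column names in your own model; QUANTITY and UNIT_PRICE here are assumptions):

"QUANTITY" * "UNIT_PRICE"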

You can also make use of the different operations and functions available in the expression editor.

H1.png

3. Click OK and CHECK and ACTIVATE the analytic view.

4. Now the newly created calculated column is available for reporting.

H1.png

RESTRICTED COLUMN :

Restricted columns are used to restrict a measure based on a condition on master data. This feature helps us push the processing of the result set into HANA rather than to the client side.

In this example we are going to create a restricted measure of Net Amount only for the product category "Flat Screens".

 

1. Right Click on Restricted Column in output panel of the analytic view

2. Provide the necessary information as shown below: select the measure to restrict (here Net Amount), add the restriction column (here Product Category) and select the value "Flat Screens".

H1.png

3. Click OK, then CHECK and ACTIVATE. Go to Data Preview: you can see that a new restricted column was added in which values are displayed only where Product Category = Flat Screens; for other rows it is null (?).

H1.png

In Analysis Tab

H1.png


What's new in SAP HANA SPS 7 - Installation and Update


Introduction

 

In the upcoming weeks we will be posting new videos to the SAP HANA Academy to show new features and functionality introduced with SAP HANA Support Package Stack (SPS) 7, released December 3, 2013. To get the best overview of what’s new in SAP HANA SPS 7, see Ingo Brenckmann's blog on the SAP HANA community site.

 

The topic of this post is installation and update, and complements a number of tutorial videos posted to the SAP HANA Academy site and on YouTube:

 

What's New with SPS 7?

SAP HANA components (server, studio, client) are installed with installer hdbsetup in graphical mode or hdbinst for command line with optional configuration file. 

As of SPS 4, the installation process was complemented with a tool called Unified Installer in the documentation (and install.sh on the DVD). This Unified Installer called, with some prerequisite checking and error handling, the installers for the SAP host agent and for SAP HANA server, studio, and client, plus the script hanaconfig.sh for AFL, etc., all with the required parameters. Obviously, when preparing a SAP HANA system, it was a great timesaver to be able to run only one installer instead of all those different installers and scripts. Unfortunately, the Unified Installer did not have the sophistication of the component installer hdbinst, as it would not accept parameters on the command line or in a file, for example.

Until SPS 7, this concerned SAP employees and SAP certified hardware partners, as apart from the different cloud offerings, the SAP High-Performance Analytic Appliance was only available as ... appliance, that is, as a closed computer system with both hardware and software pre-configured and optimised, similar to SAP NetWeaver Business Warehouse Accelerator, for example.

As of SPS 7, an additional approach besides appliance delivery is now available. This approach is called SAP HANA tailored data center integration, or DCI, for those that prefer to use acronyms. With tailored data center integration, any SAP HANA certified engineer can now install SAP HANA on certified hardware. This put a spotlight on the installation process and this process has now been greatly enhanced with SAP HANA lifecycle management tools.


For more information about data center integration, see the blog posted by Adolf Brosig, Take Your Choice: How to Integrate SAP HANA as Smoothly as Possible Into Your Data Center or the overview presentation by Jens Rolke, Overview - SAP HANA tailored data center integration.

For more information about certified SAP hardware partners, see Rely on hardware from global technology leaders.


Does this also mean we can now run the SAP HANA database on Windows, one might ask? No, it does not. The only supported operating system for the SAP HANA server is still SUSE Linux Enterprise Server (SLES) 11. For more information about supported operating systems for server and clients, see the Product Availability Matrix (PAM) for SAP HANA.

 

Unified Installer is now deprecated and will no longer be shipped with future releases.

 

SAP Certified Technology Specialist - SAP HANA Installation

 

Interested in performing SAP HANA server installations? With the tailored data center integration offering, installing SAP HANA is no longer restricted to certified hardware partners and a new certification has been introduced to guarantee required skills and knowledge: E_HANAINS131 - SAP Certified Technology Specialist  - SAP HANA Installation. Prerequisite is the SAP Certified Technology Associate certification. The training HA200 presents the required knowledge.

 

For more information about this certification, see the SAP Training and Certification Shop.

 

SAP HANA lifecycle management

SAP HANA lifecycle management tools come, like the SAP HANA single component installers, in two flavours: command line or graphical: hdblcm and hdblcmgui. The GUI is convenient for a guided installation requiring minimal input and uses default values for parameters where possible. The command line offers the same interactive installation, in case X Windows is not configured on the Linux host.

 

The introduction video below discusses important concepts, what documentation to read, and where, what and how to download SAP HANA software components, etc.

 

 

The next video shows the actual installation using the new Lifecycle Manager GUI in interactive mode performed on a Windows client using Xming X Server for Windows together with PuTTY and SSH.

 

 

As mentioned, not every parameter can be specified using the guided interactive installation, and one such parameter is Autostart. This parameter controls whether the SAP HANA system is configured to start automatically when the host operating system starts (using sapinit). This defaults to no.

Should you wish to change this behaviour, see the video below on how to configure SAP HANA database to start at system boot.

 

 

Besides interactive installation, lifecycle management allows for a scripted installation as well, and this clearly is its forte. Imagine performing a full SAP HANA installation including server, studio, client, AFL on a distributed system of 10 nodes. This can now be achieved with a single parameter file and a single command.

 

Another example listed in the installation guide is about a hardware partner that wants to automate the installation of nine SAP HANA systems (1 extra large, 5 large, and 3 small). By specifying one parameter, a template configuration file is generated, which can then be edited using any text editor.

 

./hdblcm --action=install --dump_configfile_template=/home/root/HANA_install.cfg

 

As such, the partner can create configuration files for each of the three system types and call hdblcm with the configuration file parameter in batch mode. Whereas interactive mode will prompt for passwords, batch mode runs the installer without asking for any input. Passwords can be stored in an XML file and passed to the installer as a stream via standard input, or they can be specified in the configuration file.
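For reference, a minimal sketch of such a password XML stream is shown below. The element names are assumptions based on a typical hdblcm setup; check the installation guide for the exact tags supported by your revision, and include only the passwords your scenario actually needs.

<?xml version="1.0" encoding="UTF-8"?>
<Passwords>
    <root_password>NotMyRealPassword1</root_password>
    <sapadm_password>NotMyRealPassword2</sapadm_password>
    <password>NotMyRealPassword3</password>
    <system_user_password>NotMyRealPassword4</system_user_password>
</Passwords>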

 

cat ~/Passwords.xml | ./hdblcm --configfile=/home/root/HANA_install_S.cfg --read_password_from_stdin=xml -b

 

When the installation script is run, SAP HANA is installed on both the single-host and multi-host systems, without any additional input. By reusing the same configuration files, the installations are reliable, flexible, and efficient.

 

SAP HANA studio and client installations

One of the new features introduced with SAP HANA SPS 7 is that local Administrator rights on Windows computers (or root privileges on Linux and UNIX) are no longer required to install SAP HANA studio or the SAP HANA client. If you wish to install these components but do not have administration rights on a particular system, the product will simply install in your home environment and will be available for your use only.

 

SAP HANA studio is an Eclipse-based IDE that can be used for SAP HANA development, analytical modelling and database administration.

The video below shows how to install SAP HANA studio on Linux and Windows.

 

 

SAP HANA client comprises ODBC, JDBC, and SQLDBC (runtime for SAP Netweaver to communicate with SAP HANA). On Windows, ODBO for Excel OLAP analysis is included as well.

 

The video below shows how to install the SAP HANA client on Linux and Windows.

 

 

SAP HANA studio update site

Another great feature of SAP HANA studio is the update site. You can configure a SAP HANA system to host the update package for SAP HANA studio. This way, when you update the SAP HANA server using the lifecycle management tools, an update will be immediately available for SAP HANA studio. Anyone using studio in your environment can then update their version to the latest available at their convenience. Users can even configure studio to check automatically on a certain schedule or at startup, for example, to verify whether an update is available. The update site mechanism uses HTTP(S), the same as for the other components of the Eclipse IDE or SAP HANA tools [http://tools.hana.ondemand.com], and can be made available - if desired - to anyone with internet access.

 

As of SPS 7, the update site is hosted by SAP HANA XS. Previously, Software Update Manager (SUM) was used for this purpose.

 

The video below shows how to update SAP HANA studio automatically using an update site (and how to configure SAP HANA XS to host this).

 

 

To be continued...

You can view more free online videos and hands-on use cases to help you answer the What, How and Why questions about SAP HANA and Analytics on the SAP HANA Academy at academy.saphana.com or follow us on Twitter @saphanaacademy.


HANA Text Analysis with Custom Dictionaries




Prerequisites:

  • How to create a developer workspace in HANA Studio.
  • How to create & share a project  in HANA Studio
  • Run HANA Text Analysis on a table

 

With release of HANA SPS07, a lot of new features are available. One of the main features is the support for custom dictionaries in Text Analysis. By default HANA comes with three configurations for text analysis:

  • Core Extraction
  • Linguistic Analysis
  • Voice of Customer

 

One of the main issues you can come across while working on HANA Text Analysis is defining your own custom configurations for the Text Analysis engine to work with.  In the following lines, you will find out how to create your own custom dictionary, so you can benefit more from HANA's text analysis capabilities.

 

Scenario:

 

Assume that your company manufactures laptops and has recently launched a new laptop series. You want to know whether the consumers who have bought the machines are facing any problems. The consumers will definitely be tweeting, posting and blogging about the product on social media.

 

You are now harvesting massive amounts of unstructured data through social media, blogs, forums, e-mails and other mediums. The main motivation behind this is to gauge customer perception of the products (laptops). You may want to receive early warning of product defects and shortfalls, and listen to channel- and market-specific customer concerns and delights.

 

With HANA SPS07 we can create custom dictionaries which can be used to detect word/term/phrase occurrences that may not be detected when running Text Analysis without a custom dictionary.

 

Follow these steps to get started with custom dictionaries:

 

1. Create the source XML file

 

I have created some dummy data in a table with “ID” and “TEXT” columns.

 

User_tweets table structure

 

ID | TEXT
---|-----
1 | The #lenovo T540 laptop's latch are very loose.
2 | my laptop's mic is too bad. It can't record any voice. will not be buying #lenovo in near future
3 | LCD display is gone for my T520. Customer care too is pathetic.
4 | T530 performance is awesome. Only problem I am facing is with microphone. :-(
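If you want to rebuild this scenario on your own system, a minimal sketch for creating and loading such a table could look like the following (the schema name MYSCHEMA and the column sizes are assumptions):

-- Hypothetical schema and sizes; adjust to your system
CREATE COLUMN TABLE "MYSCHEMA"."USER_TWEETS" (
   "ID"   INTEGER PRIMARY KEY,
   "TEXT" NVARCHAR(500)
);
INSERT INTO "MYSCHEMA"."USER_TWEETS" VALUES (1, 'The #lenovo T540 laptop''s latch are very loose.');
INSERT INTO "MYSCHEMA"."USER_TWEETS" VALUES (2, 'my laptop''s mic is too bad. It can''t record any voice. will not be buying #lenovo in near future');
INSERT INTO "MYSCHEMA"."USER_TWEETS" VALUES (3, 'LCD display is gone for my T520. Customer care too is pathetic.');
INSERT INTO "MYSCHEMA"."USER_TWEETS" VALUES (4, 'T530 performance is awesome. Only problem I am facing is with microphone. :-(');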

 

The mycustomdict.xml file has the following structure:

 

<?xml version="1.0" encoding="UTF-8"?>

<dictionary name="LAPTOP_COMPONENTS">

   <entity_category name="Internal Parts">

      <entity_name standard_form="Inverter Board">

            <variant name ="InverterBoard"/>

            <variant name ="InvertrBoard"/>

      </entity_name>

      <entity_name standard_form="LCD Cable">

            <variant name ="lcdcable"/>

            <variant name ="cable lcd"/>

      </entity_name>

   </entity_category>

</dictionary>

 

Please refer to the following guide http://help.sap.com/hana/SAP_HANA_Text_Analysis_Extraction_Customization_Guide_en.pdf  to know more about the creation of the source xml file to build custom dictionaries.

 

Using the above custom dictionary, the HANA text analysis engine will detect "Inverter Board" and "LCD Cable" as entities of the category Internal Parts of a laptop.

 

2. Compiling the mycustomdict.xml file to a .nc file

 

First copy the XML file to your HANA machine using some FTP client.

I have copied the mycustomdict.xml to  /home/root/customDict folder

 

You can find the dictionary compiler "tf-ncc" in your HANA installation at:

/<INSTALLATION_DIR>/<SID>/HDB<INSTANCE_NO>/exe/dat_bin_dir

 

Text analysis configuration files can be found at the following path:

/<INSTALLATION_DIR>/<SID>/SYS/global/hdb/custom/config/lexicon/lang

 

Run the compiler on the source mycustomdict.xml file:

export LD_LIBRARY_PATH=/<INSTALLATION_DIR>/<SID>/SYS/exe/hdb:/<INSTALLATION_DIR>/<SID>/SYS/exe/hdb/dat_bin_dir

 

/<INSTALLATION_DIR>/<SID>/HDB<INSTANCE_NO>/exe/hdb/dat_bin_dir/tf-ncc -d /<INSTALLATION_DIR>/<SID>/SYS/global/hdb/custom/config/lexicon/lang -o /<INSTALLATION_DIR>/<SID>/SYS/global/hdb/custom/config/lexicon/lang/mycustomdict.nc /home/root/customDict/mycustomdict.xml

 

After executing the above command, a file named mycustomdict.nc will be generated in the /<INSTALLATION_DIR>/<SID>/SYS/global/hdb/custom/config/lexicon/lang folder, which will later be used by the text analysis engine.

 


3. Create custom HANA Text Analysis configuration file

 

After compiling the xml file, we need to create a custom text analysis configuration that refers to the compiled .nc file created in the previous step. The configuration file specifies the text analysis processing steps to be performed, and the options to use for each step.

 

In HANA studio create a workspace and then create and share a project.  Under this project create a new file with extension “hdbtextconfig”. Copy all the contents of one of the predefined configurations delivered by SAP as mentioned above. They are located in the HANA repository package: “sap.hana.ta.config”. For this scenario, I have copied the contents of the configuration file “EXTRACTION_CORE_VOICEOFCUSTOMER”.

 

Creating a Text Analysis Configuration: Section 10.1.3.2.1 of the HANA developer guide SPS07: http://help.sap.com/hana/SAP_HANA_Developer_Guide_en.pdf

 

After copying, modify the "Dictionaries" node under the configuration node named "SAP.TextAnalysis.DocumentAnalysis.Extraction.ExtractionAnalyzer.TF" and add a child <string-list-value> node:

 

<string-list-value>mycustomdict.nc</string-list-value>
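In context, the modified node could look roughly like the sketch below. The surrounding property structure is an assumption taken from the copied configuration; keep whatever structure your file already has, leave the dictionaries that are already listed in place, and only add the extra entry for mycustomdict.nc.

<configuration name="SAP.TextAnalysis.DocumentAnalysis.Extraction.ExtractionAnalyzer.TF">
   <property name="Dictionaries" type="string-list">
      ...
      <string-list-value>mycustomdict.nc</string-list-value>
   </property>
</configuration>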


config.png

 

Now save, commit and activate the .hdbtextconfig file. After activation, we can run the Text Analysis engine using the custom configuration. To run text analysis, execute the following SQL command:

 

CREATE FULLTEXT INDEX <indexname> ON <tablename>(<columnname>) CONFIGURATION '<custom_configuration_file>'
TEXT ANALYSIS ON;
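For the User_tweets scenario, assuming the table lives in a schema called MYSCHEMA and the configuration file was created in a repository package called myproject, the call might look like this (index, schema and package names are placeholders):

-- Placeholder names; replace with your own schema, table, and package
CREATE FULLTEXT INDEX IDX_USER_TWEETS ON "MYSCHEMA"."USER_TWEETS"("TEXT")
CONFIGURATION 'myproject::mycustomconfig'
TEXT ANALYSIS ON;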

The text analysis results are written to a table named $TA_<indexname>.  For our scenario table, the output of this table is:

 

fulltext_index.png

 

As you can see, the Text Analysis engine has identified LCD, latch and mic as internal parts. The above results can be used for data mining or analytical purposes.
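If you prefer SQL over the Data Preview, you can also query the result table directly. A sketch, assuming the index was named IDX_USER_TWEETS and relying on the standard $TA_ columns:

-- Hypothetical index and schema names; column names follow the usual $TA_ layout
SELECT "ID", TA_TOKEN, TA_NORMALIZED, TA_TYPE
FROM "MYSCHEMA"."$TA_IDX_USER_TWEETS"
WHERE TA_TYPE = 'Internal Parts';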

SAP TechEd and me


Greetings!!!

 

I got a chance to attend SAP TechEd Bangalore 2013. It was three days of knowledge overflow. We had a stall at TechEd and I was most of the time involved in the stall activities.

 

I will share the day by day experiences that I had during this amazing TechFest.

 

Day 1:

 

First Things First - Never trust the traffic conditions in Bangalore, planned to reach at 9 but reached the TechEd premises at 12 and missed the Keynote from Vishal Sikka. But thanks to TechEd Live I was able to attend it virtually later .

I got a chance to visit all the other stalls and SCN club house. Got the SCN badges and got my photos clicked from the professional photographer.
The most important thing on this day was DemoJam. The wait for this event was killing me for the whole day. Got a sneak peek into the future from the DemoJam. Six amazing demos all back to back!!!

Couldn't wait for the Keynote from Saurav Ganguly, as I could never have reached home on the same day if I had attended it.

 

Day 2:

 

This day I planned my travel and reached there before 9 .

I had planned to visit a couple of sessions today but could attend just one hands-on session due to the crowded stall.

 

Hands-On Session:  Predictive Analytics and SAP HANA.
This session was about using the advanced AFL functionality in SAP HANA. I learnt how to use the AFL wrapper using SQL as well as the graphical 'AFL model'. SAP provides some (I think 20) basic procedures with the SAP HANA AFL library. One can either use these procedures or opt for a separate R server approach, where an R server running on a separate Linux machine executes the functions as required.

This was a really informative session. James Michael Amulu along with his great team conducted the session really well.

 

Apart from this, I also got a chance to visit demo pods and know more about current affairs in SAP. To name a few I got to visit the SAP Lumira, SAP HANA 101, SAP HANA Marketplace, SAP Certification, SAPUI5, Fiori and UX.

My deduction from the overall experience was that SAP HANA and new UI paradigms dominated most of TechEd. SAP is now reinventing its solutions with SAP HANA and SAP Fiori; the user experience in business should be as good as the personal experience.

 

Day 3:

 

I again had a plan to visit a couple of sessions on this but I was unable to attend even one of them. However, I got to interact with many people on this day. Ravindra Channe, an expert in HANA, also visited our stall and I could interact with him.

I also paid a quick visit to the other stalls and gathered knowledge as much as I could.

 

I had opted for certification on SAP HANA and this was the date I had been assigned for the certification exam. The exam was scheduled after the TechEd main event. It was delayed a bit but I was able to clear the certification which made my day .

 

Overall it was an overwhelming experience at TechEd and it left me exhausted over the weekend till Monday.

Waiting for SAP TechEd 2014 already.

 

A very Happy New Year to all and a big thanks for reading this blog.

So you want to fire off voice commands to HANA on your Mac?


Sometimes you just wake up on a Saturday morning with an idea. “I wonder if I can connect to HANA via speech from my Mac”. It would mean I could fire off a request to HANA, get a response back and have the results returned to me via speech.

 

Now I have shown you on a number of occasions that this can be done on the iPhone by using Siri, but I was wondering how difficult it would be to do it from a Mac. It turns out, it’s a piece of cake.

 

The first thing I did was look for an ODBC driver for OS X. Now I pretty much expected that we do not have one, but I tried anyway:

 

Untitled.png

 

 

And lo and behold, my buddy and fellow mentor Carsten responded. No ODBC driver… Now normally that would have ended my adventure, but as HANA is, amongst many things, a developer's platform, it’s pretty easy to use RESTful services (XS, OData) to connect to HANA. Why would anyone still need ODBC, I asked myself:

 

 

Untitled.png

 

So with that said, let’s script!

 

A quick Google shows the Mac has something called “speakable items”. You can use voice commands to have your Mac do whatever you want. Well, that means it can also speak to HANA.

 

 

Untitled2.png

 

So whenever I say “Mac + a command”, the command is executed and this little fellow will pop up executing my every wish:

 

Untitled.png

 

How cool is that!

 

 

But how do I attach a command to my speakable items? It is simple: AppleScript.

 

AppleScript is Apple’s own scripting language which can automate a ton of stuff on your Mac. The cool part is that once you save your AppleScript in a specific folder called “speakable items”  you can actually trigger actions by voice. Just name the script to a voice command and we should be good to go. The fun starts ;-)

 

First off, I need a JSON parser for AppleScript. A simple Google search gives me one:

 

JSON Helper

 

It turns out to be a fantastic piece of software which runs in the background waiting for AppleScript to ask it to do something, just like Apple's own System Events does. A brief look at the samples gives me an idea on how I need to create my script. In the end, 5 lines of code is all I need:

 

tell application "JSON Helper"
    set myRecord to fetch JSON from "http://smartwatchplusapp.appspot.com/temperature" name "admin" password "welcome"
    set myResult to |TEMPERATURE| of item 1 of results of myRecord as string
    say ("The temperature of the HANA Alerter is " & myResult & " °C")
end tell

 

 

Basically I call JSON helper from my script, I feed it my XS service (the mentioned URL is the XS URL I rerouted to a public domain, the original would look like this: http://54.246.85.50:8000/alerter/services/temperature.xsodata/TEMPERATURE/?$orderby=TIMESTAMP%20desc&$top=1&$format=json), username and password for basic authentication and finally I format the result into something readable. The last step in my script (say) speaks the output of my service. Executing services via speech by voice just became a reality!

 

I just used one of my services which was still active (from my TechEd HANA Alerter demo). Of course, you can do whatever you want with this technology. You can make any HANA table or view accessible via voice on your Mac!

 

Below a clip on how it will all look in the end:

 

 

Have fun and let me know in the comments which use cases you build! I'm sure you can be tremendously creative.

 

Thank you for reading and take care,

 

Ronald.

Filtering Rules using SAP HANA Decision Table


Filters are often applied by business users or rule designers to control the output based on multiple parameters specific to an industry: for example, filtering customers that have a specific plan in the telecom industry, filtering wheels that are the right fit for a vehicle based on make, manufacturer, model, etc., filtering customers that hold a specific policy in the life insurance sector, or filtering messages in your inbox.

This blog will walk you through the process of defining rules and using them to filter the content based on specified action and/or input.

 

Facts

  • The decision table does not directly support filtering of rules. It is used in concert with a calculation view to achieve the filtering.
  • There could be several approaches depending on requirements such as performance, or whether the filtering should be done first and the rules executed afterwards, or vice versa. In all approaches a calculation view has to be used to filter, no matter at which stage you choose to filter.
  • This solution can be applied as of HANA SPS 06.

 

 

Usecase
Consider a major wheel retailer that wants to filter wheels based on vehicle parameters like model, make and manufacturer. Besides, they also have a set of rules, which have to be applied when the filtering of wheels is done.

 

 

 

Solution
Here is a step-by-step guide to filtering rules using a decision table, based on the use case described above.  The solution is divided into three sections: (a) Data model, (b) Decision table model, (c) Consumption model.

 

(a) Data Model

I have created two database tables named VEHICLE and WHEEL. VEHICLE table contains all the metadata about the vehicle and WHEEL contains all the metadata about the wheels that would later be suggested as fitting to a particular vehicle.

 

Image1.jpg

 

(b)  Decision Table Data Foundation
You can use the tables to create the data foundation of the decision table

 

Image2.jpg

 

Decision Table

  • Add the attributes from the data foundation to create the conditions and actions of the decision table, as shown below

 

Image3.jpg

 

Note: the action is a parameter, ISWHEELPROPER, which is set to 1 if the wheel is a proper fit for the vehicle and 0 otherwise.  Through this, the decision table performs the first round of filtering by feeding each database record into an extra column, ISWHEELPROPER, with the value 0 or 1.

 

  • Fill the decision table with Condition Values and Action Values

 

Image5.jpg

  • Finally, save, validate and generate the decision table.

       This generates the result view that will be used in the calculation view.

 

 

For more details on modeling decision tables, refer to my blog series.

 

Calculation View

 

  • Use the result view of the decision table in a projection, shown as Projection_1.
    Note: You can find the result view under "_SYS_BIC"."<your-package>/<your-decision-table-name>_RV".
  • In this Projection_1, create input parameters and a filter.
  • Input parameters are the values you want the user to provide, like vehicle make, year and model, based on which the suggestions of right-fit wheels are made.

    Image6.jpg

 

 

  • The filter is also based on ISWHEELPROPER = 1 (the attribute from the decision table), i.e. the right fit is determined by two factors: (a) the input parameters of the calculation view and (b) the ISWHEELPROPER attribute from the decision table.

    Image7.jpg
  • Finally, Save , Validate and Generate Calculation View

 

 

 

Test

 

Use the Data Preview of the calculation view to test its result.

 

Image8.jpg

 

 

Image9.jpg

 

 

(c)  Consumption Model

The calculation view can further be consumed using an OData service.

 

 

You can thus use a decision table to control the items that are consumed in your application, bringing the ability of controlled consumption into the database. Follow this blog to successfully create a custom application on the HANA server, especially where filtering rules are needed. Do write in with your suggestions and feedback. If you have any queries on filtering rules, drop me a comment; I would be happy to help you!
