
Ah, so comforting to hear a familiar voice, whose dulcet tones are, if not soothing, at least reassuring.  In this video, Tahir Hussain Babar, aka Bob, of the SAP HANA Academy shows us how to create a remote source to Hadoop in SPS09.

Note that in SPS09 there is no need to install UNIX ODBC drivers on the SAP HANA Linux server.  All tasks can be carried out in SAP HANA Studio.

When teaching, your voice is extremely important: from the first sound that comes out of your mouth an impression is created.  Children are often the most unforgiving and conventional of critics.  If the pitch is right, and the right chord of authority, confidence and friendliness is struck, then your audience may pay attention to what you have to say for the first thirty seconds.  If at this point you state clearly and succinctly what you want your students to do, you have bought yourself another two minutes.  Once an understanding has been reached, and the necessary pause for questions and clarification has been completed, you should have a settled start.  If the tasks are well prepared, match students’ capability while providing challenge, and signpost clear measures of progress, if not enjoyment, you should be home and dry.  In the long term, students find tried and tested habits reassuring.  This is one of the many things I like about Bob’s videos: they follow a pattern which is clearly discernible and easily followed.

As always, Bob introduces himself and places his video in context; this is also reflected in the title on the chalkboard at the start of the video. He briefly discusses what each video in this mini-series is going to cover before, unusually for Bob, he explicitly places himself in the seat of a modeller: someone who is going to use SAP HANA Studio to access their data. At this point the Java developer has already played their part by creating a MapReduce archive file, and it is available in the SAP HANA repository.  Bob completes this task as the DEV01 user and demonstrates it by running the SQL code below, which gives him a list of the MapReduce jobs.
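A minimal sketch of that check, assuming the SPS09 system view for virtual function packages (the exact view and column names may vary between revisions, so treat this as illustrative rather than a transcript of Bob's console):

```sql
-- List the Java packages containing MapReduce jobs that are
-- available in the SAP HANA repository (run as the DEV01 user)
SELECT * FROM "SYS"."VIRTUAL_FUNCTION_PACKAGES";
```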


Running the SQL finds the Java package which contains the MapReduce jobs.   Bob then explains why this view is important.  The schema name indicates where the package is contained.  The package name matters too, including its case and syntax, as it will be referred to later.  When creating a remote data source you need to name an adapter; the name shown above was not chosen by Bob: it was selected by the system by default.

Bob then quickly demonstrates how to create a remote source using the GUI, as below, before stating his preference for using code. My experience of building systems concurs with his viewpoint: I have found that using code, irrespective of the task, is more reliable and less tiresome to document.

After opening a SQL console Bob goes through the code a line at a time.
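The statement follows the SPS09 syntax for the standard "hadoop" adapter; the source name, host name and credentials below are placeholders, not necessarily the values used in the video:

```sql
-- Create a remote source to Hadoop using the SPS09 "hadoop" adapter.
-- webhdfs_url points at the Hadoop file system, webhcat_url at the
-- web HCatalog service; 50070 and 50111 are the usual default ports.
CREATE REMOTE SOURCE "HADOOP_SOURCE" ADAPTER "hadoop"
CONFIGURATION 'webhdfs_url=http://hadoop-host:50070;webhcat_url=http://hadoop-host:50111'
WITH CREDENTIAL TYPE 'PASSWORD'
USING 'user=hive;password=hive';
```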


This line seems self-explanatory, but in previous releases, to connect to Hadoop you would first need to connect to Hive by creating a Hive odbc.ini file and installing drivers on your Linux server.

ADAPTER "hadoop"

This is an improvement in SPS09.  You can now use this standard adapter for Hadoop, whereas before you might have reused an adapter intended for other sources, such as Oracle.

Next you choose your configuration, which is new for SPS09.  There are two web services you connect to in this part of the command.  WebHDFS connects you to your Hadoop Distributed File System via a URL, and WebHCat connects you to your web HCatalog, i.e. your MapReduce jobs.  The default ports have been used above, and it is normally straightforward to set up.
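As a fragment, the configuration clause looks something like this (the host name is a placeholder; 50070 and 50111 are the usual WebHDFS and WebHCat default ports):

```sql
CONFIGURATION 'webhdfs_url=http://hadoop-host:50070;webhcat_url=http://hadoop-host:50111'
```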


USING 'user=hive;password=hive';

Lastly you need to choose how you are logging on to the system; as CREDENTIAL TYPE 'PASSWORD' has been chosen, you need to supply your user and password.

Once you have executed the code above you have your remote source, and you can now build virtual functions and UDFs to access that data.
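A hedged sketch of that next step, mapping one of the repository's MapReduce jobs to a virtual function: the function name, table signature, package path, HDFS input path and Java class names below are all invented for illustration, and the exact CONFIGURATION keys may differ by revision.

```sql
-- Hypothetical example: expose a word-count MapReduce job,
-- stored in the DEV01 package shown earlier, as a virtual function
CREATE VIRTUAL FUNCTION "DEV01"."WORD_COUNT" ()
RETURNS TABLE ("WORD" NVARCHAR(60), "COUNT" INTEGER)
PACKAGE "DEV01"."wordcount"
CONFIGURATION 'mapred_jobchain=[{"mapred_input":"/data/input",
  "mapred_mapper":"com.example.WordCountMapper",
  "mapred_reducer":"com.example.WordCountReducer"}]'
AT "HADOOP_SOURCE";
```

Once created, the function can be queried like any table function, e.g. `SELECT * FROM "DEV01"."WORD_COUNT"();`.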

So we reach the end of another of Bob’s videos.  He has followed his tried and tested formula: explaining as well as demonstrating, helping us understand as well as follow.  All he needs to do now is spend his holidays marking his students’ work.
