Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
Developer Advocate
This is a searchable description of the content of a live stream recording, specifically "Episode 6 - Extending the CAPM bookshop tutorial – hacking & learning" in the "Hands-on SAP dev with qmacro" series. There are links directly to specific highlights in the video recording. For links to annotations of other episodes, please see the "Catch the replays" section of the series blog post.

This episode, titled "Extending the CAPM bookshop tutorial – hacking & learning", was streamed live on Fri 22 Feb 2019 and is approximately one hour in length. The stream recording is available on YouTube.

Brief synopsis

We’re pretty much at the end of the tutorial as it stands. But there’s always more to learn by hacking on it, to tweak and extend it. We do that in this episode, uncovering more features of CAPM and CDS.

00:05:07: A quick look at the CDS Language Support extension for VS Code, which is based on the Language Server Protocol, created by Microsoft.

00:07:15: Downloading the CDS Language Support extension from the SAP Development Tools (cloud) site so we can have a look inside it.

00:08:15: The .vsix file is actually a gzipped tarball, which means that we can change the extension to .tgz and unpack it to have a look inside, where we see, amongst other things, the cds-compiler package and the cds-lsp package within the @sap namespace.

00:10:00: We notice that the CDS package is now at 3.5.2, following a couple of point updates since the move from 3.0.0 to 3.5.0 a couple of weeks ago.

00:11:09: We note that the cds-lsp package is not part of the main @sap/cds install, i.e. it's not a dependency that we can see with npm info @sap/cds. It's the language server implementation for the VS Code language server client.

00:11:59: Using the cds-lsp package "standalone" as a server for another language server client ... in Vim. The config in my .vimrc for this looks like this:
set runtimepath+=~/.vim/bundle/languageclient
set hidden
autocmd BufRead,BufNewFile *.cds setfiletype cds
set signcolumn=yes
let g:LanguageClient_serverCommands = {
\ 'cds': ['/Users/i347491/.vim/bundle/languageclient/startcdslsp']
\ }

and I'm using the LanguageClient-neovim plugin.

00:13:22: Seeing the language client in Vim in action, connecting to the language server provided by the @sap/cds-lsp package and offering autocompletion (in Vim via Ctrl-X Ctrl-O) as well as CDS syntax checks.

00:14:58: Talking briefly about a very useful macOS app, in response to a question from Ronnie about keyboard-shortcut based anchoring of windows. I use Spectacle which is the single most useful little app I have on this machine. Recommended!

00:16:23: Looking quickly at a keyboard shortcut I have in my config that can quickly take me to inside the globally installed @sap/cds package directory so I can hunt around and discover things. I manage this with features in my scripts repo (look at the .bmdirs, .bmfiles and shortcuts files in particular).

00:17:36: Starting the new project with cds init. This project will be to create a cut-down version of the Northwind dataset and services, and so we call it "northbreeze" (naming things is hard).

00:18:49: Security alert - some strange person in the field.

00:19:50: Creating a new file db/model.cds, using the command palette to refresh the Explorer to see the newly created directory db/. Pierre points out that you can actually create the db and srv folders automatically when you initialise the project. Have a look at cds help init to find out more on this.

00:21:39: Looking at the Products data in Northwind, via $format=json.

00:22:26: Talking about a nice triple of data types that we can use: Products, Suppliers and Categories - there are relationships between them.

00:23:19: Noting that for Products, we don't get all the entities in the entityset (there are 77 in total) in one go - we can see that at the end of the output there's a link to the next 'batch', via the use of a $skiptoken. We have to bear this in mind when thinking about grabbing the data from the Northwind servers programmatically.
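What "bearing this in mind" implies can be sketched as follows. This is a hypothetical illustration, not the real Northwind payload: fakeFetch and the page shape are stand-ins, showing the sort of loop needed to follow $skiptoken links until the entityset is exhausted.

```javascript
// Sketch of following server-driven paging. fakeFetch stands in for a real
// HTTP call; each page carries its entities plus (optionally) a next token.
const pages = {
  0:  { value: ['Chai', 'Chang'], next: 20 },
  20: { value: ['Aniseed Syrup'], next: 40 },
  40: { value: ['Pavlova'] }, // last page: no next token
};
const fakeFetch = token => Promise.resolve(pages[token]);

// Accumulate entities, recursing while the page advertises a next token.
async function grabAll(token = 0, acc = []) {
  const page = await fakeFetch(token);
  const all = acc.concat(page.value);
  return page.next === undefined ? all : grabAll(page.next, all);
}

grabAll().then(products => console.log(products.length)); // 4
```

With a real service, fakeFetch would be an HTTP GET and the next token would come from the $skiptoken in the response's next link.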

00:26:45: Starting to define our model, something like the definition in the tutorial we followed previously - Create a Business Service with Node.js using Visual Studio Code.

00:27:40: Delighted by the autocompletion providing us with an entity definition skeleton that includes the cuid aspect which we saw in a previous live stream episode ... it comes from common.cds in the @sap/cds package, which we explored in episode 1. The definition looks like this:
/** Aspect for entities with canonical universal IDs. */
abstract entity cuid {
  key ID : UUID; //> automatically filled in
}

This aspect will cause the entity to have a key property ID defined as a UUID type, automatically.

00:28:50: Talking about how the CAPM best practices influence how we name our things. So for example our first entity is capitalised and plural (Products) and the properties are simple (e.g. "name" rather than "product_name"). Perhaps I should have used more underscores in the property names, or camel case, but there you go.

00:32:40: Rather than me typing everything in, I bring in the model definition from another file, to save time.

00:33:45: Looking at the relationships between Products, Suppliers and Categories as described by the Association keyword, noting that the definitions are bi-directional. These definitions result in the appropriate navigation properties in the OData service that we get.

00:35:25: Adding a simple service that just reflects the entities we have so far, in a one-to-one mapping.

00:37:06: Trying the service out on localhost, even before we've set up the database as a persistence layer. Inspecting the metadata document. Confirming the service is working.

00:38:25: Looking at a blog post 5 Ways to Make HTTP Requests in Node.js which takes us through different ways to make HTTP requests when in the Node.js context, from the simple (and builtin) http library which is standard, through to axios which supports the Promise API.

00:40:08: Creating a new project ('grab') where we'll write the JavaScript to grab the data from the Northwind service, using npm init -y to set up a basic package.json file and then immediately installing (locally to this project) the axios package.

00:41:55: Looking at the blog post example to see how we use axios, especially taking advantage of axios.all([...]).

00:44:07: Starting to create grab.js, bringing in the axios module and trying out a first HTTP request, using $skiptoken=0.

00:46:45: We see that the result of the call to axios.get is a Promise. And that holds a lot of, err, promise.

00:46:55: Extending the basic call, passing console.log to a chained .then(), noticing that we're supplying a reference to the function (console.log) and not actually invoking it. In other words, then takes a function as its parameter here.
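The point about passing a function rather than invoking it can be seen in a tiny sketch:

```javascript
// These two lines are equivalent: then() receives a function value and
// invokes it itself, later, with the resolved value.
// Note console.log, not console.log(...).
Promise.resolve('hello').then(console.log);
Promise.resolve('hello').then(x => console.log(x));
```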

00:48:34: We get the axios object as output.

00:49:06: Now we change the call to axios.get to axios.all which will enable us to specify multiple HTTP requests, which we do, for the first and the second batch of 20 products.
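axios.all behaves like Promise.all, so the two-batch idea can be sketched as below, with a network-free stand-in for axios.get (fakeGet and the URL strings are illustrative assumptions, not the real calls):

```javascript
// fakeGet stands in for axios.get, resolving to an axios-style response
// object with the parsed payload in .data.
const fakeGet = url => Promise.resolve({ data: { url } });

const urls = [
  'Products?$format=json&$skiptoken=0',
  'Products?$format=json&$skiptoken=20',
];

// Like axios.all(...): resolves once every request in the array has
// resolved, passing along an array of responses in the same order.
Promise.all(urls.map(fakeGet)).then(responses => console.log(responses.length)); // 2
```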

00:50:25: We now run this in VS Code, so we can use the debug feature to inspect what we get after the axios.all finishes. We use F9 to set a breakpoint on the console.log(x) statement so we can stop and have a look at what we get.

00:51:44: We see that what we get (in x) is an array of two objects. Each comes from axios and represents the HTTP response, including a nicely parsed version of the payload (converted from JSON into a JavaScript object) in x[0].data (for the first one, of course).

00:53:33: Sir Rodney's Scones! Clearly the correct pronunciation of "scones" matches "stones".

00:53:55: Finally in this episode we'll coalesce the data from the two objects into a single array (i.e. to get all 40 products into a single structure).

00:55:15: Noting that we can use concat to bring together values from different arrays, noting that the function is pure, i.e. it doesn't modify either of the two source arrays, rather, it produces a new array.
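A quick demonstration of that purity:

```javascript
const first = [1, 2];
const second = [3, 4];
const combined = first.concat(second);

console.log(combined); // [ 1, 2, 3, 4 ]
console.log(first);    // [ 1, 2 ] (unchanged: concat built a new array)
```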

00:55:57: Now we know we can use concat, we'll employ it in a single expression function definition that we'll pass to then. This expression is a call to reduce on the array of objects, and looks like this:
xs.reduce((a, x) => a.concat(x.data.value), [])

I'll leave that there for us to stare at for a bit. Single expression, no moving parts, no mutation and pure, in a promise context (which therefore makes the result of this expression available to the next then in the chain). Nice. Well, I like it anyway!
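To see the expression at work, here's a worked sketch with two mock axios-style responses; the value array inside .data is an assumption about the payload shape, matching the OData JSON we saw earlier:

```javascript
// Two mock responses, each shaped like an axios response whose parsed
// payload (in .data) carries its entities in a `value` array.
const xs = [
  { data: { value: [{ ProductID: 1 }, { ProductID: 2 }] } },
  { data: { value: [{ ProductID: 3 }] } },
];

// Fold the per-response arrays into one, starting from an empty array;
// each step concatenates onto the accumulator without mutating anything.
const all = xs.reduce((a, x) => a.concat(x.data.value), []);
console.log(all.length); // 3
```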

00:58:17: As a result of this, we get a nice single array of 40 product objects, which is exactly what we wanted. And we can tidy the code up now to make it really clean.