Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
With my background in experimental physics and the many years spent coding in NI's LabVIEW, I fell in love with the SAP Data Intelligence Modeler application at first sight.  We must admit it: the idea of creating very complex data pipelines with simple drag-and-drop gestures is irresistible.  Nevertheless, as I grew more and more expert with the Modeler UI and its endless possibilities, an idée fixe made its way into my mind: how do I automate the creation, modification, and execution of a pipeline?

The answer was immediately clear: I needed a way to interact with the Modeler programmatically.  Fortunately for me, the SAP developers had already foreseen such a necessity and equipped the application with a REST API documented via Swagger: in this post I will show you how to find and use it.



  • The steps of this post have been performed with Data Hub 2.7.1 and Data Intelligence 1911.0.22, but should be valid for any Data Hub ≥ 2.5 and any Data Intelligence 19** version.

  • Recommended browser: Chrome.

  • A basic knowledge of the SAP Data Intelligence Modeler application is required.

  • To complete Step 6, a Unix terminal with curl is required.


Step 1: the Swagger extension for Chrome

Chrome is my preferred browser when it comes to working with SAP Data Intelligence.  Among its many qualities, there is also the possibility to install a Swagger UI Console, which will come in handy to complete the examples described in this post.  After completing the installation, a Swagger icon will appear in your browser toolbar: we will use it in the following step.


Step 2: find the swagger.yaml file of the Modeler API

Log into the SAP Data Intelligence Modeler and edit the URL by adding swagger.yaml at the end, as in the following example:
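With a placeholder host name, the edited URL looks roughly like this (the exact path depends on your installation, so take it as an illustration rather than a literal value):

```
https://<host>/app/pipeline-modeler/              ← Modeler URL
https://<host>/app/pipeline-modeler/swagger.yaml  ← URL of the API definition
```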


Step 3: visualize the API commands with Chrome's Swagger UI Console

The content of the yaml file will appear in the browser window.  As-is, the text does not look very appealing, but it actually contains the definition of the whole REST API.  To visualize the real power of this file, we just need to click on the icon of Chrome's Swagger UI Console.


Chrome opens the Swagger Console in a new tab that we can use to browse and execute all the available commands of the Modeler API.


Step 4: query the Modeler REST API from the Swagger Console

The Chrome Swagger Console can be used to execute the REST API queries directly: just click on one of the categories (repo, rt, mon, sys) to open the list of available commands.  Clicking on a command shows a brief description, provides a way to prepare and fire a test query, and displays the result of the query.
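As a concrete example, the repo category contains a query that lists every graph stored in the repository; its endpoint, as it also appears in the curl calls of Step 6, is:

```
GET /app/pipeline-modeler/service/v1/repository/graphs
```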


Step 5: start and stop a graph using the Modeler REST API

In the previous step, we learned how to retrieve the list of all graphs from the Modeler repository (aka vFlow artifacts) using a GET call.  We can now be a bit more ambitious and see how to execute POST and DELETE calls to start and stop one instance of one of the demo graphs: the Data Generator.

The corresponding REST queries belong to the runtime (rt) category:
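For reference, matching the endpoints used in the curl calls of Step 6, the two rt queries we will use are:

```
POST   /app/pipeline-modeler/service/v1/runtime/graphs                     → start a new graph instance
DELETE /app/pipeline-modeler/service/v1/runtime/graphs/{handle}?stop=true  → stop a running graph instance
```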


To start a graph, the corresponding graph src must be provided as a key-value pair in a json structure that complements the POST query.  For the Data Generator demo graph, the source can be easily read from the Modeler GUI:


The corresponding json structure is the following:
{"src": ""}


After copying the above text into the corresponding field of the Swagger Console, you can try the command out to start an instance of the Data Generator graph:


As you can see from the image above, the response body of a successful execution contains the handle of the new graph.  As you probably know, this is the unique identifier of the running instance of the graph, and we can use it to stop the graph: just copy and paste it into the corresponding field of the "Delete or stop a graph" command, set the stop flag to true, and try it out!


You can of course cross-check the correct execution of the above commands directly from the Modeler UI.


Step 6: use the Modeler REST API from a Unix terminal

The Chrome Swagger Console is definitely a cool tool, but my promise at the beginning of this post was to show you how to manage the Modeler in a programmatic way.  In other words, we need to be able to repeat the above steps from a terminal.  Fortunately, with the full list of available commands and the corresponding endpoints at our fingertips, the task is plain sailing.  All you need to do is copy the Request URL of the commands you want to execute from the Swagger Console and use curl to fire them:
export tenant="<tenant>"                    # replace with your tenant name
export user="<user>"                        # replace with your user name
export pwd="<pwd>"                          # replace with your password
export PipelineURL="https://<url>:<port>"   # replace with the Modeler endpoint

curl -k -X GET -H 'X-Requested-With: XMLHttpRequest' --user "$tenant\\$user:$pwd" "$PipelineURL/app/pipeline-modeler/service/v1/repository/graphs"   # list all graphs from the repository
curl -k -X POST -H 'X-Requested-With: XMLHttpRequest' --user "$tenant\\$user:$pwd" -d '{"src": ""}' "$PipelineURL/app/pipeline-modeler/service/v1/runtime/graphs"   # start an instance of the Data Generator graph
curl -k -X DELETE -H 'X-Requested-With: XMLHttpRequest' --user "$tenant\\$user:$pwd" "$PipelineURL/app/pipeline-modeler/service/v1/runtime/graphs/<handle>?stop=true"   # stop a running graph (replace <handle> with the handle returned by the POST call)
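Putting the pieces together, here is a minimal sketch of a start-then-stop script.  It assumes the exported variables above and that the POST response body is a JSON object containing a "handle" field, as seen in the Swagger Console in Step 5; extract_handle is a hypothetical helper of my own, not part of the Modeler API:

```shell
#!/bin/sh
# Hypothetical helper: pull the value of the "handle" field out of a JSON
# response body read from stdin.
extract_handle() {
  sed -n 's/.*"handle"[[:space:]]*:[[:space:]]*"\([^"]*\)".*/\1/p'
}

# Start an instance of the Data Generator graph and capture the response
# (uncomment once $tenant, $user, $pwd, and $PipelineURL are set):
# response=$(curl -k -s -X POST -H 'X-Requested-With: XMLHttpRequest' \
#   --user "$tenant\\$user:$pwd" -d '{"src": ""}' \
#   "$PipelineURL/app/pipeline-modeler/service/v1/runtime/graphs")
#
# handle=$(printf '%s' "$response" | extract_handle)
#
# Stop the running instance using its handle:
# curl -k -s -X DELETE -H 'X-Requested-With: XMLHttpRequest' \
#   --user "$tenant\\$user:$pwd" \
#   "$PipelineURL/app/pipeline-modeler/service/v1/runtime/graphs/$handle?stop=true"
```

The same pattern generalizes to any of the rt commands: fire the query, parse the handle from the response, and feed it into the next call.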




In this first episode of "Zen and the art of SAP Data Intelligence" we learned how to use the SAP Data Intelligence Modeler REST API.  It is a powerful tool that unlocks very interesting scenarios.  Pipeline execution can be monitored and managed from a remote system.  Parametrized pipelines can be configured and executed according to external factors.  Complex data flows can be envisaged, with pipelines starting and stopping other pipelines: load balancer frameworks, on-demand executions, integration with external monitoring tools, and so on.

If you would like me to deep-dive into some of the above scenarios or into any other SAP Data Intelligence topic, just add your request as a comment to this post.  Any other feedback, positive or negative, is also welcome and encouraged: you all know the new experience economy mantra, right?

Thank you for reading!


For the philomaths

Further information about the topics treated in this blog post can be found in the following references: