Application Development Blog Posts
Learn and share on deeper, cross technology development topics such as integration and connectivity, automation, cloud extensibility, developing at scale, and security.

In my (admittedly, slightly dated) last post, I outlined the approach we took in our project to tackle the issue of continuous delivery together with SAP HANA. But that was only half of the story.

Today I'd like to share with you how we solved this problem for the frontend tier: SAP BusinessObjects reports (which consume the HANA views).

As with SAP HANA, transporting content from one BusinessObjects BI system to another is primarily a manual process: in "Promotion Management" within the CMC, you

  • manage your systems (source, targets)
  • create a promotion job
  • select the objects which should be transported (e.g. Universes, folders containing Information spaces etc.)
  • transport this job from A to B


The issue we had in our project with this? Well, for one thing, we frequently ran into situations where an object couldn't be transported because the target system already contained an object with the same name but with a different CUID.

Now, granted that you cannot blame the transportation mechanism when people go and create objects manually on target systems when in fact they should be transported, this was annoying nonetheless.

But the main motivation for what follows was, that the process is a manual one. As with the backend we needed to find a way to script this process, so that we could integrate it into our build automation.

Process flow

The build begins with exporting the selected objects from our development system to LCMBIAR files and pushing these to a github repository.

The trickiest part here was to figure out the correct export queries to use in the properties file, which the Promotion Management command-line tool takes as a parameter. In the meantime, more documentation has become available (e.g. this article as well as SAP Note 1969259), but at the time, there was little to be found. Experimenting with the installation's Query Builder (http(s)://<bobj_instance>/AdminTools/querybuilder/ie.jsp) proved helpful.
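To give a feel for it, an export properties file might look roughly like the sketch below. The key names here are my assumptions for illustration only; SAP Note 1969259 documents the exact parameter names for your version:

```
# export.properties - illustrative sketch, key names may differ per version
action=export
exportLocation=/tmp/lcmbiar
exportQueriesTotal=1
exportQuery1=select * from ci_infoobjects where si_kind='Folder' and si_name='WAP'
CMS=<bobj_instance>:6400
username=<cms_user>
password=<cms_password>
authentication=secEnterprise
```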

In the following build stage, the respective target system (e.g. QA, PROD) is first cleared by deleting any existing objects. To this end, I used the Java API for SAP BusinessObjects BI:




import java.io.FileInputStream;
import java.io.IOException;
import java.util.Iterator;
import java.util.Properties;

import com.crystaldecisions.sdk.exception.SDKException;
import com.crystaldecisions.sdk.framework.CrystalEnterprise;
import com.crystaldecisions.sdk.framework.IEnterpriseSession;
import com.crystaldecisions.sdk.occa.infostore.IInfoObject;
import com.crystaldecisions.sdk.occa.infostore.IInfoObjects;
import com.crystaldecisions.sdk.occa.infostore.IInfoStore;

public class DeleteObjects {

  // Example CMS queries - adjust the SI_KIND values to the object types
  // used in your deployment
  private static final String QUERY_UNIVERSES =
      "SELECT * FROM CI_APPOBJECTS WHERE SI_KIND = 'Universe'";

  private static final String QUERY_CONNECTIONS =
      "SELECT * FROM CI_APPOBJECTS WHERE SI_KIND = 'CommonConnection'";

  private static final String QUERY_LCMJOBS =
      "SELECT * FROM CI_INFOOBJECTS WHERE SI_KIND = 'LCMJob'";

  public static void main(String[] args) {
    Properties props = null;
    try {
      props = new Properties();
      props.load(new FileInputStream("config.props"));
    } catch (IOException e) {
      System.out.println("An error occurred while reading the properties file.");
      return;
    }
    delete(args, props);
  }

  private static void delete(String[] args, Properties props) {
    IEnterpriseSession enSession = null;
    try {
      // Log on using the credentials from config.props
      enSession = CrystalEnterprise.getSessionMgr().logon(
        props.getProperty("username"),
        props.getProperty("password"),
        props.getProperty("cmshost") + ":" + props.getProperty("cmsport"),
        "secEnterprise");

      // Retrieve the InfoStore
      IInfoStore infoStore = (IInfoStore) enSession.getService("InfoStore");

      if (args[0].equals("universes"))
        searchAndDestroy(infoStore, QUERY_UNIVERSES);
      if (args[0].equals("connections"))
        searchAndDestroy(infoStore, QUERY_CONNECTIONS);
      if (args[0].equals("lcmjobs"))
        searchAndDestroy(infoStore, QUERY_LCMJOBS);
      if (args[0].equals("folder")) {
        String QUERY_FOLDER =
          "SELECT * FROM CI_INFOOBJECTS WHERE SI_KIND = 'Folder' AND SI_NAME = '" + args[1] + "'";
        searchAndDestroy(infoStore, QUERY_FOLDER);
      }
    } catch (SDKException e) {
      System.out.println("Exception while working with the CMS");
    } finally {
      if (enSession != null)
        enSession.logoff();
    }
  }

  /**
   * Indiscriminately delete everything returned by the passed query.
   *
   * @param infoStore
   * @param query
   * @throws SDKException
   */
  private static void searchAndDestroy(IInfoStore infoStore, String query)
      throws SDKException {
    IInfoObjects infoObjects = infoStore.query(query);
    // Mark every returned object for deletion, then persist the change
    for (Iterator<IInfoObject> iter = infoObjects.iterator(); iter.hasNext();) {
      infoObjects.delete(iter.next());
    }
    infoStore.commit(infoObjects);
  }
}

This utility is packaged into a JAR archive and placed on the file system (in our case in /usr/sap/bobj/sap_bobj/util).

It is then called sequentially for all relevant objects, e.g.

java -cp <classpath_containing_all_jars> DeleteObjects universes

java -cp <classpath_containing_all_jars> DeleteObjects folder WAP

The latter example deletes the folder "WAP" and everything in it (subfolders, Information Spaces etc.). A file containing credentials for connecting to the relevant instance needs to be provided (in this example in the file config.props):
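Based on the property keys the utility reads, a minimal config.props might look like the following; cmshost and cmsport appear in the code above, while the credential key names are assumptions for illustration:

```
# config.props - connection settings for the delete utility
# credential key names are illustrative assumptions
cmshost=<bobj_instance>
cmsport=6400
username=<cms_user>
password=<cms_password>
```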





As a last step, the LCMBIAR files created previously are pulled from github and imported, again using the Promotion Management command-line tool.
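For the import step, the tool again takes a properties file. A rough sketch follows; as with the export example, the key names are my assumptions, and SAP Note 1969259 documents the exact parameters:

```
# import.properties - illustrative sketch, key names may differ per version
action=import
importLocation=/tmp/lcmbiar/WAP.lcmbiar
CMS=<bobj_instance>:6400
username=<cms_user>
password=<cms_password>
authentication=secEnterprise
```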

There is one important post-delivery step in our pipeline regarding connection objects, but I'll shed light on this in a separate article.
