Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.

Edit: Since Lumira 1.27, the Universe driver has been revamped - the DSL library is invoked directly by Lumira, instead of delegating to Web Intelligence. This blog is retained while older builds of Lumira run their course, and will then be taken down - TU

There's an anecdote I heard where someone asked Miles Davis his definition of Cool: "Give the essence. Withhold everything else."

It might seem quixotic to start off a blog about troubleshooting techniques for data discovery and reporting tools with a quote from one of the greatest jazz musicians ever, but, hey, I work with a couple of technologies that are pretty cool. I support SAP Business Intelligence and Analysis tools, primarily Web Intelligence, and now SAP Lumira.

Web Intelligence is cool since it uses a Semantic Layer - the Universe - to represent the data source to the user.  The Universe shields users from the complexity of the underlying implementation and presents data objects in an intuitive business language of dimensions, details and measures. The Semantic Layer gives the essence, allowing the WebI designer to hit the perfect note - the chart/graph/table that clarifies and illuminates the meaning of the data.

If WebI hits the perfect note, then Lumira is a melody - a series of notes that carry a narrative, providing context and flow for the presented information.  Lumira tells a story.

Now you don't have to use Universes with Lumira.  Lumira supports a variety of data sources. But there are compelling reasons to continue using Universes if you have a large investment in SAP BusinessObjects.  Combine Universes and Lumira, and you have something that's powerful - a story that tells the essence.  Cool jazz.

How Lumira Desktop consumes a Universe

Lumira Desktop consumes Universes that are managed by SAP BusinessObjects BI Platform 4.1.  At the time of this blog, it's not able to consume stand-alone Universes outside of Platform. This is because Lumira uses the Web Intelligence Processing Server to consume data from the Universe - the Web Intelligence Processing Server that's part of BI Platform.

Here's a highly idealized logical architectural view below.  Lumira uses the BI Platform SDK to communicate with the Platform Central Management Server (CMS), and the WebI SDK to communicate with the WebI Processing Server:

Lumira first requests from the Platform Central Management Server (CMS) a list of Universes housed on the system.  Once the User selects one, Lumira then requests the WebI Processing Server to create a virtual Web Intelligence document for the Universe, then presents a Query Panel to the User.  The Query Panel is analogous to the Query Panel available with the WebI design tool. The User can then select dimensions, details and measures as result objects, and specify filtering along dimension values. 

Proceeding to the next step, Lumira will request the WebI Processing Server to process the query and return the data.  The WebI Processing Server sends the query to the DSL Bridge Service, housed in an Adaptive Processing Server, to transform the query into a data-source-specific SQL statement. The WebI Processing Server then executes the SQL via a database driver, retrieves the data from the RDBMS, marshals the data in XML format, and returns it to Lumira.

If the WebI Processing Server encounters any errors, it returns the error message back to Lumira.

Unfortunately, in many situations, the error message returned to Lumira, and which Lumira presents to the User, can be quite cryptic.  For example, here's an error message I encountered the other day:

The error message in its entirety says "Cannot load selected universe: csEX".

That's not really actionable - it doesn't tell the User (me, in this case) what happened to cause the error, and doesn't guide the User on how the error can be resolved.

There are other messages that can be just as opaque.  Another such error is "ERR_WIS_30270 (WIS 30270)" - it's basically the WebI Processing Server stating that an error was returned when it requested something from another service.  It doesn't state what the triggering error was, just that it wasn't the WebI Processing Server's fault.

There is, however, a way to find out the root cause error message. When you open a Support case with such an error, one of the first things the Support person (who can be me) will likely ask you to do is enable and collect traces for Lumira, WebI Processing Server, and the APS.

So you might as well collect the traces before you even open a Support ticket, since having the traces ready will lead to faster resolution.  In fact, if you take a bit of time and go through the traces yourself, you might be able to resolve the issue without the help of SAP Support! Not that I don't enjoy working with customers, but whatever gets your Lumira Desktop up and running is all good.

Tracing Lumira Desktop and BI Platform

So let's walk through a trace exercise. So that you'll be able to follow along and perform the steps on your own deployment of Lumira Desktop (here I'm using 1.20) and BI Platform (my version is BI 4.1 SP04 Patch 2), let's use a UNX Universe created from the eFashion UNV Universe.  Start the Information Design Tool, log onto BI Platform, and convert the eFashion Universe you find in the "webi universes" folder into an eFashion.unx Universe.

To ensure the eFashion.unx is working (the backend datasource is MS Access), go to the WebI Java Report Panel, open the Universe and create a query:

Here we select the dimensions "Year", "State", "City" and "Store name", and the measure "Sales revenue".  A filter is defined where "State" is restricted to "California" and "New York".  Remember this, since you're going to be selecting the same objects and filter in Lumira next.  Run this query to ensure you're getting data:

The key here is: if you encounter any issues consuming a Universe in Lumira, first test the same Universe with the same query in Web Intelligence on the same server, using the Web Intelligence Applet interface in BI launch pad. If it doesn't work in WebI, it won't work in Lumira.

Now that we have a working Universe, let's first set up and enable tracing.  On the machine where you're running Lumira Desktop, follow SAP KBase 1782007 to modify the file BO_Trace.ini by default found at C:\Program Files\SAP Lumira\Desktop.  My file contains:

active = true;
importance = xs;
alert = true;
severity = assert;
size = 10000;
keep = true;
log_dir = "C:/LumiraLogs/logs";
log_ext = "log";

where I created the folder C:/LumiraLogs/logs to hold my Lumira trace files.  When you start up Lumira Desktop, you'll see this folder fill with trace files TraceLog_<PID>_<Timestamp>_trace.log:

Now set up tracing for the WebI Processing Server and APS by logging onto the CMC:

Go to the Servers page, then open the "Service Categories" node and select "Web Intelligence Services".  Control-click to select the WebI Processing Servers and APS, right-click to "Edit Common Services", then set the TraceLog Service Log level to "High".  You don't need to restart the services to have the setting take effect.

Now consume eFashion.unx in Lumira Desktop.  Create a New Dataset, select Universes:

choose the eFashion.unx Universe:

and define the query as we did for Web Intelligence.  Select the same Result objects:

where "Preview and select data" is enabled, then define the same filter:

then check out the Preview, comparing to what you saw in Web Intelligence:

and finally go forward to check the Prepare stage:

Things look good here - there are no errors, so you'll not be looking for errors in the traces.  It's good to start off by reading traces from a successful run, since that will give you an idea of the complete workflow that Lumira uses to consume Universe data sources.

Collect log files - take copies and move the copies to a folder where they won't get overwritten or deleted.  Collect the TraceLog_*.log files from the Lumira log folder.  Also, log onto the server machine housing your BI Platform deployment, and collect the WebI Processing Server and APS trace files, found on Windows deployments by default in the folder C:/Program Files (x86)/SAP BusinessObjects/SAP BusinessObjects Enterprise XI 4.0/logging.  Collect all files that have been updated since the time you started running Lumira.
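If you want to automate that collection step, a few lines of Python can pick out just the trace files touched since your run started.  This is a hypothetical helper (the function name and extension defaults are mine, not part of any SAP tooling):

```python
import os

def collect_recent_logs(log_dir, since_epoch, exts=(".glf", ".log")):
    """Return trace files in log_dir modified at or after since_epoch."""
    recent = []
    for name in os.listdir(log_dir):
        if not name.endswith(exts):
            continue
        path = os.path.join(log_dir, name)
        if os.path.getmtime(path) >= since_epoch:
            recent.append(path)
    return sorted(recent)
```

Record the epoch time just before you start Lumira, then pass it as `since_epoch` to list only the files worth copying.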

Analyzing Lumira Traces

Now comes the fun part - analyzing what's happening inside Lumira when it consumes Universe data.  Open the Lumira TraceLog_*.log file using a text editor.  Personally, I use Notepad++ - it's quite popular and feature-rich, and we'll be using some of its features to analyze the log.

You'll see that the log is fine-grained and very detailed.  What is of interest here, and our focus, is specific to how Lumira is interacting with BI Platform, and nothing else.  We can identify this interaction in the log by searching for specific keywords.  Whenever Lumira uses the BIPSDK or WebI SDK to communicate with Platform, it records a log entry with text starting "START OUTGOING CALL", and when the communication is done, a log entry with text starting "END OUTGOING CALL".

So in Notepad++, I go to Search -> Find... and "Find All in Current Document" the text "OUTGOING CALL" to observe:

you can see that Lumira makes a lot of calls to get the data.  In Notepad++, if you click a line in the search result, it will focus the file view centered on that line.  Try clicking on a "START OUTGOING CALL", then an "END OUTGOING CALL".  You'll note that only a few lines separate "START" from "END" in each call - in this interval Lumira is merely waiting for a response from Platform to the request it has sent.  If you look before (above) a "START OUTGOING" line, you'll see Lumira preparing the 'message' it's going to send to Platform.  If you look after (below) an "END OUTGOING" line, you'll see Lumira retrieving, parsing and working on the information returned from Platform.
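The "few lines separate START from END" observation can itself be scripted: pair each START with the next END and count the gap.  A minimal sketch (the function name is my own; it only assumes the "START/END OUTGOING CALL" markers shown above):

```python
def call_spans(lines):
    """Pair each START OUTGOING CALL with the following END OUTGOING CALL,
    reporting (start_index, end_index, line_gap) for each call."""
    spans, start_idx = [], None
    for i, line in enumerate(lines):
        if "START OUTGOING CALL" in line:
            start_idx = i
        elif "END OUTGOING CALL" in line and start_idx is not None:
            spans.append((start_idx, i, i - start_idx))
            start_idx = None
    return spans
```

Feed it the lines of a TraceLog file and the gap column confirms how little happens between a request going out and its response arriving.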

To get a cleaner view of all the communication that's happening, search on "END OUTGOING CALL" to see:

Each "END OUTGOING CALL" line is very informative.  It describes whether the BIPSDK or WebI SDK channel was used, which BI Platform server the communication went to, and even the service request call made.

I won't describe each and every service call, but here's a highlight:

CentralManagementServer.newService - request for a Service from the CMS

CentralManagementServer.queryEx3 - request information from the CMS repository database

WebIntelligenceProcessingServer.createSession - create a new Web Intelligence Session

WebIntelligenceProcessingServer.createDocumentEx - create a new Web Intelligence document

WebIntelligenceProcessingServer.processDPCommandsEx - send a data provider request - i.e., create/modify/refresh a Universe query

WebIntelligenceProcessingServer.getDPResultsEx - retrieve the location from which the data of a refreshed query can be fetched

WebIntelligenceProcessingServer.getBlob - get a large binary object that contains the data

WebIntelligenceProcessingServer.closeInstanceEx - close the instance of a WebI session

For today's exercise, we'll focus on understanding the processDPCommandsEx, getDPResultsEx, and getBlob request calls - they're the ones doing the heavy lifting of creating the query, sending the query to the WebI Processing Server, and then retrieving the data as a Blob (binary large object) that contains the data in XML format. The call order is typically a whole bunch of processDPCommandsEx calls, followed by a couple of getDPResultsEx calls, then finally a few getBlob calls to get the data.
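As a quick sanity check on that order, the call names can be pulled straight out of the "END OUTGOING CALL" lines.  Here's a small Python sketch; the regex assumes the "TO [server.Service.method#host:port:...]" shape shown in the excerpts, and the function name is illustrative:

```python
import re

# Capture everything between "TO [" and the "#" that starts the host portion.
TO_RE = re.compile(r"END OUTGOING CALL.*TO \[([^#\]]+)#")

def call_sequence(lines):
    """Extract the service call name from each END OUTGOING CALL entry, in order."""
    seq = []
    for line in lines:
        m = TO_RE.search(line)
        if m:
            # e.g. 'webiserver_X.WebIntelligenceProcessingServer.processDPCommandsEx'
            seq.append(m.group(1).rsplit(".", 1)[-1])
    return seq
```

Running this over a successful Lumira trace should show the processDPCommandsEx / getDPResultsEx / getBlob ordering described above.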


Each time you create/modify/update/refresh a query, a processDPCommandsEx call is made by Lumira to the WebI Processing Server.  The particulars of a DPCOMMANDS request are structured as XML sent to the WebI Processing Server.  You can find this XML by searching for the line "processDPCommandsEx command" preceding the "START OUTGOING CALL" for the call itself.  After the "END OUTGOING CALL", you can read the response from the WebI Processing Server by searching for the line "processDPCommandsEx returns".

Here's an example of a command and response that requests an addition of a new query:

|processDPCommandsEx command



|TraceLog.Context: context var [ActionID]='Clntkt3NC0wtk2YQsWNlaTI46'


|START OUTGOING CALL Outgoing: FROM [Webi SDK.CorbaServerImpl.doProcess()#BIPW08R2:11432:37.75:1]


|END OUTGOING CALL Outgoing: SPENT [00.855] FROM [Webi SDK.CorbaServerImpl.doProcess()#BIPW08R2:11432:37.75:1] TO [webiserver_BIPW08R2.WebIntelligenceProcessingServer.processDPCommandsEx#localhost:7004:8804.1499160:1]


|processDPCommandsEx returns

|<?xml version="1.0" encoding="UTF-8"?>

<METHODRETURN SERVER_VERSION="RELEASE[psbuild@VCVMWIN214] on Aug 13 2014 15:51:02" NAME="processDPCommandsEx">


   <DOCTOKEN VALUE="P0pi7004KT186{BIPW08R2}{BIPW08R2.WebIntelligenceProcessingServer}{AQnzqXNWe6pLvL8_GuAC7nE}" TYPE="" WIDID="0"/>






     <OUTPUTS CID="0" CNAME="adddp">

       <OUTPUT OID="0" OTYPE="dplist">


           <DP DPID="DP0" DPNAME="Query 1" DPORDER="0" DPTYPE="DSL"/>



       <OUTPUT OID="1" OTYPE="datasourceid" DPID="DP0">

         <DP DPID="DP0" DPNAME="Query 1" DPTYPE="DSL">

           <DS DSID="&lt;datasource:DataSourceIdentifier xmi:version=&quot;2.0&quot; xmlns:xmi=&quot;" xmlns:xsi=&quot;" xmlns:datasource=&quot;" xmlns:datasourceinfo=&quot;" xmlns:info=&quot;"><dataSourceInfo xsi:type=&quot;datasourceinfo:DslUniverseDataSourceInfo&quot; dataSourceType=&quot;DSL&quot; caption=&quot;[BIPW08R2:6400]&quot; universeMetaVersion=&quot;dsl&quot;&gt;&lt;resourceInfo xsi:type=&quot;info:CmsItemInfo&quot; cuid=&quot;Aauu.TeuZUxPro7hBzVFKEs&quot; path=&quot;BIPW08R2:6400&quot;/&gt;&lt;/dataSourceInfo&gt;&lt;/datasource:DataSourceIdentifier&gt;" DSKEY="DS0" NAME="eFashion" LONGNAME="eFashion" DOMAINID="0" TYPE="DSL" CATALOG="DSL" OLAP="false" MAXOPERANDSFORINLIST="-1" ISMULTICONTEXTSELECTION="false" ALLOWCOMBINEDQUERIES="false" ALLOWSUBQUERY="false" ALLOWRANK="false" ALLOWPERCENTRANK="false" ALLOWSAMPLINGMODE="nosampling" ALLOWCALCULATION="false" ALLOWBOTHEXCEPT="false" ALLOWOPTIMIZATION="true" />






I've edited out much of the log file contents for this call, to focus on the outgoing DPCOMMANDS requesting the command "adddp" to create a new Data Provider for the query, the outgoing call itself, and then the return that specifies parameters for the newly-created Data Provider.  You'll see that every processDPCommandsEx call has the same pattern.
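If you want to poke at these returned payloads programmatically, the METHODRETURN XML parses with any standard XML library.  Here's a minimal Python sketch against a trimmed-down, well-formed sample modeled on the return above (the sample and function are illustrative, not part of any SAP API):

```python
import xml.etree.ElementTree as ET

# Trimmed, well-formed sample modeled on the "adddp" METHODRETURN excerpt.
SAMPLE_RETURN = """<METHODRETURN NAME="processDPCommandsEx">
  <OUTPUTS CID="0" CNAME="adddp">
    <OUTPUT OID="0" OTYPE="dplist">
      <DP DPID="DP0" DPNAME="Query 1" DPORDER="0" DPTYPE="DSL"/>
    </OUTPUT>
  </OUTPUTS>
</METHODRETURN>"""

def list_data_providers(xml_text):
    """Return (DPID, DPNAME) for every Data Provider in a METHODRETURN payload."""
    root = ET.fromstring(xml_text)
    return [(dp.get("DPID"), dp.get("DPNAME")) for dp in root.iter("DP")]
```

For the sample above this yields the single Data Provider "DP0" / "Query 1" created by the "adddp" command.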

The most interesting DPCOMMANDS is the "refreshbatch" command, which requests the WebI Processing Server to generate a SQL statement from the query, and use that to retrieve data from the data source.  Here are highlights from the trace file for a single refreshbatch request:

|TraceLog.Context: context var [ActionID]='Clntkt3NC0wtk2YQsWNlaTIa9'


|processDPCommandsEx command

|<DPCOMMANDS><DPCOMMAND CID="0" CNAME="refreshbatch"><DP DPID="DP0" DPTYPE="DSL" /><OUTPUTMODE><OUTPUTITEM OID="0" OTYPE="documentdict" /><OUTPUTITEM OID="1" OTYPE="compatibleobjects" /><OUTPUTITEM OID="2" OTYPE="properties" /><OUTPUTITEM OID="3" OTYPE="dpparams" /><OUTPUTITEM OID="4" OTYPE="alteredhierarchies" /></OUTPUTMODE></DPCOMMAND></DPCOMMANDS>


|START OUTGOING CALL Outgoing: FROM [Webi SDK.CorbaServerImpl.doProcess()#BIPW08R2:11432:38.272:1]...


|END OUTGOING CALL Outgoing: SPENT [01.285] FROM [Webi SDK.CorbaServerImpl.doProcess()#BIPW08R2:11432:38.272:1] TO [webiserver_BIPW08R2.WebIntelligenceProcessingServer.processDPCommandsEx#localhost:7004:5484.1499462:1]


|processDPCommandsEx returns

|<?xml version="1.0" encoding="UTF-8"?>

<METHODRETURN SERVER_VERSION="RELEASE[psbuild@VCVMWIN214] on Aug 13 2014 15:51:02" NAME="processDPCommandsEx">


   <DOCTOKEN VALUE="P0pi7004KT214{BIPW08R2}{BIPW08R2.WebIntelligenceProcessingServer}{AQnzqXNWe6pLvL8_GuAC7nE}" TYPE="" WIDID="0"/>






     <OUTPUTS CID="0" CNAME="refreshbatch">

       <OUTPUT OID="0" OTYPE="documentdict">



             <FOLDER CAPTION="Variables" SNAME=""/>

             <FOLDER CAPTION="Formulas" SNAME=""/>

             <FOLDER CAPTION="GroupingVariables" SNAME=""/>


           <DATASOURCE CAPTION="Query 1 - eFashion" SNAME="[Query 1]" KEY="DP0">

             <BODIMENSION CAPTION="City" SNAME="[City]" KEY="DP0.DOa" UUID="OBJ_166" SRCKEY="DS0.DOa" TYPE="string" FORMAT="" HELP="City located."/>

             <BODIMENSION CAPTION="State" SNAME="[State]" KEY="DP0.DO9" UUID="OBJ_218" SRCKEY="DS0.DO9" TYPE="string" FORMAT="" HELP="State located."/>

             <BODIMENSION CAPTION="Store name" SNAME="[Store name]" KEY="DP0.DOb" UUID="OBJ_376" SRCKEY="DS0.DOb" TYPE="string" FORMAT="" HELP="Name of store."/>

             <BODIMENSION CAPTION="Year" SNAME="[Year]" KEY="DP0.DO1" UUID="OBJ_188" SRCKEY="DS0.DO1" TYPE="string" FORMAT="" HELP="Year 2003 - 2006."/>

             <MEASURE CAPTION="Sales revenue" SNAME="[Sales revenue]" KEY="DP0.DO26" UUID="OBJ_147" SRCKEY="DS0.DO26" TYPE="number" FORMAT="T[Err]T[FS]T[Err]T[FS]T[FS]T[Err]" FORMAT_SAMPLE="#FORMAT; #FORMAT" HELP="Sales revenue $ - $ revenue of SKU sold" IS_AGGREGATED_OBJECT="Y" AGG_FUNC="Sum"/>




     <OUTPUT OID="1" OTYPE="compatibleobjects">

       <DP DPID="DP0" DPNAME="Query 1" DPTYPE="DSL">



             <OBJECT KEY="DP0.DO1" />

             <OBJECT KEY="DP0.DO9" />

             <OBJECT KEY="DP0.DOa" />

             <OBJECT KEY="DP0.DOb" />

             <OBJECT KEY="DP0.DO26" />






     <OUTPUT OID="2" OTYPE="properties">




             <DOCUMENTDPINFO NAME="name" VALUE="Query 1"/>

             <DOCUMENTDPINFO NAME="source" VALUE="eFashion"/>

             <DOCUMENTDPINFO NAME="iseditable" VALUE="true"/>

             <DOCUMENTDPINFO NAME="isrefreshable" VALUE="true"/>

             <DOCUMENTDPINFO NAME="lastrefreshdate" VALUE="December 10, 2014 2:02:05 PM GMT-08:00"/>

             <DOCUMENTDPINFO NAME="lastrefreshtime" VALUE="1418248925"/>

             <DOCUMENTDPINFO NAME="lastrowcount" VALUE="12"/>

             <DOCUMENTDPINFO NAME="lastrefreshduration" VALUE="1"/>

             <DOCUMENTDPINFO NAME="ispartiallyrefreshed" VALUE="false"/>

             <DOCUMENTDPINFO NAME="samplingresultmode" VALUE="nosampling"/>





     <OUTPUT OID="3" OTYPE="dpparams">

       <DP DPID="DP0" DPNAME="Query 1" DPTYPE="DSL">




     <OUTPUT OID="4" OTYPE="alteredhierarchies">






You can see that the returned XML from a "refreshbatch" command gives a very detailed view of the objects returned by the query, and the configuration parameters associated with the running of the query.
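The result objects can be scraped out of the "documentdict" output the same way.  A small Python sketch, again using a trimmed, well-formed sample based on the excerpt above (names are illustrative):

```python
import xml.etree.ElementTree as ET

# Trimmed, well-formed sample based on the documentdict excerpt above.
SAMPLE_DICT = """<DATASOURCE CAPTION="Query 1 - eFashion" SNAME="[Query 1]" KEY="DP0">
  <BODIMENSION CAPTION="City" SNAME="[City]" KEY="DP0.DOa" TYPE="string"/>
  <BODIMENSION CAPTION="Year" SNAME="[Year]" KEY="DP0.DO1" TYPE="string"/>
  <MEASURE CAPTION="Sales revenue" SNAME="[Sales revenue]" KEY="DP0.DO26" TYPE="number"/>
</DATASOURCE>"""

def result_objects(xml_text):
    """List (caption, kind) for each dimension and measure in a documentdict output."""
    root = ET.fromstring(xml_text)
    return [(el.get("CAPTION"), el.tag) for el in root.iter()
            if el.tag in ("BODIMENSION", "MEASURE")]
```

On the sample, this lists the dimensions "City" and "Year" and the measure "Sales revenue" - a quick way to verify which objects the query actually carried.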


The getDPResultsEx call is made after a processDPCommandsEx "refreshbatch" request, to ask the WebI Processing Server which Blob object to request in order to retrieve the data gathered by the "refreshbatch".  Here's a highlight of a single getDPResultsEx call in the logs:

|TraceLog.Context: context var [ActionID]='Clntkt3NC0wtk2YQsWNlaTIa2'


|START OUTGOING CALL Outgoing: FROM [Webi SDK.CorbaServerImpl.doProcess()#BIPW08R2:11432:38.257:1]


|END OUTGOING CALL Outgoing: SPENT [00.037] FROM [Webi SDK.CorbaServerImpl.doProcess()#BIPW08R2:11432:38.257:1] TO [webiserver_BIPW08R2.WebIntelligenceProcessingServer.getDPResultsEx#localhost:7004:12768.1499455:1]


|execute returns
<?xml version="1.0" encoding="UTF-8"?>

<METHODRETURN SERVER_VERSION="RELEASE[psbuild@VCVMWIN214] on Aug 13 2014 15:51:02" NAME="getDPResultsEx">


   <DOCTOKEN VALUE="P0pi7004KT212{BIPW08R2}{BIPW08R2.WebIntelligenceProcessingServer}{AQnzqXNWe6pLvL8_GuAC7nE}" TYPE="persistent.current" WIDID="0"/>






     <BLOB KEY="TMP*23*15" FILE="Blob15.xml" />

     <BLOBMANAGER ROOT="C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\Data\BIPW08R2_6400\BIPW08R2.WebIntelligenceProcessingServer\sessions\_AQnzqXNWe6pLvL8_GuAC7nE\BRep\23/" />



The returned XML references the file in the cache where the WebI Processing Server stored the data retrieved from the database by running the query.  If you were to log onto the server machine where the WebI Processing Server is running, and open the Blob15.xml file in a text editor, this is what you'll see:



    <FLOWDATA  ID="0">


        <COLUMN DSID="DS0.DO1" DPID="DP0.DO1" FORMAT="" FIELDNAME="F1" NAME="Year" TYPE="character" />

        <COLUMN DSID="DS0.DO9" DPID="DP0.DO9" FORMAT="" FIELDNAME="F2" NAME="State" TYPE="character" />

        <COLUMN DSID="DS0.DOa" DPID="DP0.DOa" FORMAT="" FIELDNAME="F3" NAME="City" TYPE="character" />

        <COLUMN DSID="DS0.DOb" DPID="DP0.DOb" FORMAT="" FIELDNAME="F4" NAME="Store name" TYPE="character" />

        <COLUMN DSID="DS0.DO26" DPID="DP0.DO26" FORMAT="T[Err]T[FS]T[Err]T[FS]T[FS]T[Err]" FIELDNAME="F5" NAME="Sales revenue" TYPE="number" />



        <ROW F1="2004" F2="California" F3="Los Angeles" F4="e-Fashion Los Angeles" F5="982637.1" />

        <ROW F1="2004" F2="California" F3="San Francisco" F4="e-Fashion San Francisco" F5="721573.7" />

        <ROW F1="2004" F2="New York" F3="New York" F4="e-Fashion New York 5th" F5="644635.1" />

        <ROW F1="2004" F2="New York" F3="New York" F4="e-Fashion New York Magnolia" F5="1023060.7" />

        <ROW F1="2005" F2="California" F3="Los Angeles" F4="e-Fashion Los Angeles" F5="1581616" />

        <ROW F1="2005" F2="California" F3="San Francisco" F4="e-Fashion San Francisco" F5="1201063.5" />

        <ROW F1="2005" F2="New York" F3="New York" F4="e-Fashion New York 5th" F5="1076144" />

        <ROW F1="2005" F2="New York" F3="New York" F4="e-Fashion New York Magnolia" F5="1687359.1" />

        <ROW F1="2006" F2="California" F3="Los Angeles" F4="e-Fashion Los Angeles" F5="1656675.7" />

        <ROW F1="2006" F2="California" F3="San Francisco" F4="e-Fashion San Francisco" F5="1336003.3" />

        <ROW F1="2006" F2="New York" F3="New York" F4="e-Fashion New York 5th" F5="1239587.4" />

        <ROW F1="2006" F2="New York" F3="New York" F4="e-Fashion New York Magnolia" F5="1911434.3" />




exactly the data as you saw in the Preview window.


The getDPResultsEx gets the reference to the Blob XML file that contains the data, and it's up to the getBlobInfos and getBlob calls to actually retrieve this file from the WebI Processing Server, so that Lumira can parse the contents to create a flat file.  Here are highlights from a getBlob call:

|TraceLog.Context: context var [ActionID]='Clntkt3NC0wtk2YQsWNlaTIb8'


|START OUTGOING CALL Outgoing: FROM [Webi SDK.CorbaServerImpl.doProcess()#BIPW08R2:11432:63.303:1]


|END OUTGOING CALL Outgoing: SPENT [00.035] FROM [Webi SDK.CorbaServerImpl.doProcess()#BIPW08R2:11432:63.303:1] TO [webiserver_BIPW08R2.WebIntelligenceProcessingServer.getBlob#localhost:7004:10372.1499477:1]


|[com.businessobjects.sdk.core.server.internal.blob.BlobInterpreter]Chunk request fullfilled from initial (getBlob) buffer

|[com.businessobjects.sdk.core.server.internal.blob.BlobStream]Read 1794 bytes from current read-ahead buffer


|[]End CSV file generation :1757 ms

The requested Blob XML file is streamed back to Lumira, which reads and parses the contents and generates the CSV passed to the Prepare engine.
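That final step can be sketched in a few lines of Python: read the COLUMN metadata, then emit the ROW attributes as CSV.  This is an illustrative reconstruction based on the Blob excerpt above, not Lumira's actual code:

```python
import csv
import io
import xml.etree.ElementTree as ET

def flowdata_to_csv(xml_text):
    """Convert a FLOWDATA blob (COLUMN metadata plus ROW elements) to CSV text."""
    root = ET.fromstring(xml_text)
    # COLUMN maps a field name (F1, F2, ...) to a business object name.
    cols = [(c.get("FIELDNAME"), c.get("NAME")) for c in root.iter("COLUMN")]
    out = io.StringIO()
    writer = csv.writer(out, lineterminator="\n")
    writer.writerow([name for _, name in cols])
    for row in root.iter("ROW"):
        writer.writerow([row.get(field, "") for field, _ in cols])
    return out.getvalue()
```

Run against the Blob15.xml contents shown earlier, this would produce the same rows you saw in the Lumira Preview.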

Analyzing BI Platform Traces

By reading the Lumira traces and identifying the lines where it communicates with Platform, you can see how Lumira interacts with Platform to consume Universe data sources.  But that's a pretty one-sided view.  We can see the request going out from the Lumira side, but don't see how the Platform side is handling the request.

We should be able to see how Platform handles the request by reading the WebI Processing Server and APS traces, but usually there's a problem.  If you look at the logs you've gathered from Platform - all the *.glf files - you'll see that the total size is quite large.  That's expected, since for typical development or even production Platform deployments, you can't guarantee that only you are utilizing the system.  There may be other Users, in fact many other Users, also using the system, with all their actions captured in the trace files you've collected.

The main difficulty in analyzing Platform trace files is to eliminate all irrelevant information from them, and focus only on the trace entries relevant to your workflow.  That's actually not very difficult to do.

If you look at all the highlights from the Lumira traces I've given above, you'll see that I've always included lines of the form:

|TraceLog.Context: context var [ActionID]='Clntkt3NC0wtk2YQsWNlaTIb8'

The "ActionID" line always appears before a "START OUTGOING CALL" is made, and you'll see that each call has a unique ActionID value associated with it.  In the BI Platform TraceLog framework, each and every client request for a service from Platform generates a unique GUID value, which is sent to the services along with the request.  Every time a Platform service services such a request, it records this ActionID value in the trace file and Auditing DB entries.

In this way, you can associate all Platform trace file entries that were triggered by a client request, and follow the workflow from when the request first came in until the final response is sent back to the client.  Here, the client is Lumira Desktop.
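If you'd rather grep than click, the same ActionID filtering can be done with a short Python sketch.  The helper below is my own (the GLFViewer described next is the proper tool, since it also decodes the glf columns); it simply treats the files as text and keeps lines stamped with the ActionID:

```python
import glob
import os

def filter_glf_by_action_id(log_dir, action_id):
    """Return (filename, line_number, line) for trace lines stamped with action_id."""
    hits = []
    for path in sorted(glob.glob(os.path.join(log_dir, "*.glf"))):
        with open(path, errors="replace") as fh:
            for lineno, line in enumerate(fh, 1):
                if action_id in line:
                    hits.append((os.path.basename(path), lineno, line.rstrip()))
    return hits
```

This gives a rough text-only view of everything Platform did for one Lumira request, across all collected trace files.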

The easiest way to read BI Platform trace files is to use the GLFViewer, available via SAP KBase 1909467.  If you don't have the GLFViewer already, open that KBase and download the viewer to your client machine.  Run the GLFViewer, then use File -> Open to read all the glf files you've gathered from Platform. Ensure all the information in the glf files is presented by going to View -> Choose Columns... and ensuring all columns are enabled.  Find the "ActionID" column entry near the bottom, select it and click the "Move Up" button till it's near the top of the order:

Let's filter the trace entries to only show the ones associated with the processDPCommandsEx "refreshbatch" request call highlighted above.  That call has:

|TraceLog.Context: context var [ActionID]='Clntkt3NC0wtk2YQsWNlaTIa2'

and in the GLFViewer go to View -> Filter... and select "ActionID" for the Column and enter Clntkt3NC0wtk2YQsWNlaTIa2 into the text entry:

and click "OK" to filter.

Select View -> Indent Text According to Scope and you get quite a pretty display:

The very first line for the servicing of this ActionID is the WebI Processing Server entry:

START INCOMING CALL Incoming:processDPCommandsEx FROM [Webi SDK.CorbaServerImpl.doProcess()#BIPW08R2:11432:38.272:1] TO [webiserver_BIPW08R2.WebIntelligenceProcessingServer.processDPCommandsEx#localhost:7004:5484.1499462:1]

and the very last line is the WebI Processing Server entry:

END INCOMING CALL Incoming:processDPCommandsEx SPENT [1.282] FROM [Webi SDK.CorbaServerImpl.doProcess()#BIPW08R2:11432:38.272:1] TO [webiserver_BIPW08R2.WebIntelligenceProcessingServer.processDPCommandsEx#localhost:7004:5484.1499462:1]

and you can see that the "START OUTGOING CALL" in the Lumira traces led to a "START INCOMING CALL" in the WebI traces.  When WebI is done servicing this request, its "END INCOMING CALL" entry corresponds to the "END OUTGOING CALL" in the Lumira traces.

If you search for "START OUTGOING CALL" in these filtered Platform traces, you'll see the entry made by the WebI Processing Server:

START OUTGOING CALL Outgoing:doIt FROM [webiserver_BIPW08R2.WebIntelligenceProcessingServer.processDPCommandsEx#localhost:7004:5484.1499462:1]

that's immediately followed by an entry by the APS:

START INCOMING CALL Incoming: FROM [webiserver_BIPW08R2.WebIntelligenceProcessingServer.processDPCommandsEx#localhost:7004:5484.1499462:1] TO [.doIt#BIPW08R2:4244:10794286.19727081:1]

This represents the WebI Processing Server calling the DSL Bridge Service in the APS to request generation of a SQL statement, as described in this trace entry by the APS:

NativeQueryNode::NativeQueryString:
SELECT Calendar_year_lookup.Yr, Outlet_Lookup.State, Outlet_Lookup.City, Outlet_Lookup.Shop_name, sum(Shop_facts.Amount_sold)
FROM
  Calendar_year_lookup, Outlet_Lookup, Shop_facts
WHERE
  ( Outlet_Lookup.Shop_id=Shop_facts.Shop_id )
  AND ( Shop_facts.Week_id=Calendar_year_lookup.Week_id )
  AND Outlet_Lookup.State IN ( 'California','New York' )






Keep following the traces to see the APS trace entry "END INCOMING CALL" after it passes the SQL back to the WebI Processing Server, which acknowledges it with an "END OUTGOING CALL".

This shows how you can trace every request sent by Lumira to Platform, and follow it along till completion.  If you've been generating your own traces, try following the other calls made by Lumira by identifying the associated ActionID, then filtering the Platform traces using the GLFViewer.

Root Cause Error Identification using Traces

As you saw above, the trace framework that both Lumira and Platform use can give tremendous insight into the internal workflow Lumira follows when consuming data from a Universe.  But how does this help with root cause error identification, for example for the "csEX" error shown above?

This is the workflow that's going to be used:

  1. Enable tracing on Lumira, WebI Processing Server and the APS.
  2. In Lumira, start consumption of Universes till the error appears.
  3. Collect all the traces.
  4. In the Lumira TraceLog traces, find the error message you saw. Search upwards for the "START OUTGOING CALL" that returned the error from WebI Processing Server.
  5. Find the ActionID associated with the "START OUTGOING CALL".
  6. Open the WebI Processing Server and APS traces in the GLFViewer.
  7. Filter with the ActionID value.
  8. Search the filtered traces for the first error encountered.
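Steps 4 and 5 of that workflow can be sketched in Python: locate the last occurrence of the error text, then scan upward for the nearest preceding ActionID entry.  The helper below is hypothetical, assuming the [ActionID]='...' line format shown in the trace excerpts:

```python
import re

ACTION_RE = re.compile(r"\[ActionID\]='([^']+)'")

def action_id_for_error(lines, error_text):
    """Find the last line containing error_text, then search upward
    for the nearest preceding ActionID context entry."""
    for i in range(len(lines) - 1, -1, -1):
        if error_text in lines[i]:
            for j in range(i, -1, -1):
                m = ACTION_RE.search(lines[j])
                if m:
                    return m.group(1)
            return None  # error found but no ActionID above it
    return None  # error text not present
```

The returned value is what you'd then type into the GLFViewer's ActionID filter in steps 6 and 7.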

I'll follow this with the traces I had gathered for the "csEX" error.  I open the Lumira TraceLog file in Notepad++, and starting from the bottom of the file, I search upwards for the "csEX" error message to find the following entries:

|TraceLog.Context: context var [ActionID]='Cv3UgdIoskY5vV3HXoDoa9E46'


|processDPCommandsEx command



|processDPCommandsEx returns

|<?xml version="1.0" encoding="UTF-8"?>

   <METHODRETURN SERVER_VERSION="RELEASE[psbuild@VCVMWIN214] on Aug 13 2014 15:51:02" NAME="processDPCommandsEx">



















The error return is preceded by a START OUTGOING CALL with ActionID=Cv3UgdIoskY5vV3HXoDoa9E46.  I open the WebI Processing Server and APS trace files in the GLFViewer, and filter on this ActionID value.

I then search from the top of the trace entries down for Trace column having value "Error" and find:

and the error originated in the APS DSL Bridge Service:

Caused by: java.lang.ClassNotFoundException:


  at Method)


  at java.lang.ClassLoader.loadClass(

  at java.lang.ClassLoader.loadClass(

  at java.lang.Class.forName0(Native Method)

  at java.lang.Class.forName(

  at com.businessobjects.connectionserver.ConnectionServer.getImplementation(

The DSL tried to load a Java class file that defines the csEX - the extended ConnectionServer library - but could not find it.

Searching the SAP KBase for that error entry turns up a single hit - KBase 2030389, which suggests the issue is with loading the cs_ex.jar file.  I find and check the integrity of that file, and indeed discover that it was somehow corrupted.  After replacing the file with a good copy and restarting the APS hosting the DSL Bridge Service, the error goes away.


I hope you find the above information useful - Universes have tremendous upsides when it comes to making a complex data schema accessible to users.  The challenge comes when errors are thrown by the Web Intelligence Processing Server servicing a request from Lumira for data from the Universe.  The error messages at times may not truly identify the root issue that caused the error.  In that case, you can use the TraceLog framework that both Lumira and BI Platform use to trace the error seen in Lumira back to the originating cause.

Ted Ueda has been supporting SAP BusinessObjects and Analytics for close to ten years, and still finds the job fun every single day.  When not working, he's somewhere outdoors trying not to be eaten by bears.