I got a chance to work on SAP Demand Signal Management (DSiM) for one of our customers, where we loaded IRI market research data into DSiM. If you come from a BW background, working on DSiM can be a bit overwhelming: although it looks like plain BW, there are many DSiM-specific concepts that one needs to understand. Resolving errors can be a daunting task, and you might feel stuck at times. I have consolidated here a list of the issues I faced and how I went about resolving them. The version of DSiM I was working on was DSiM 3.0.
Data load to /DDF/DS15 fails
While loading data to /DDF/DS15 (the time propagation layer DSO), the data load fails on the InfoObject 0DATE_MMTT. The issue is that 0DATE_MMTT is now obsolete, so we need to reinstall the DSO /DDF/DS15 along with the required InfoSources and transformations.
The InfoSource still contains the obsolete InfoObject 0DATE_MMTT. So reinstall the InfoSource /DDF/MD_TIME_15_I and the transformation from standard BI Content. Once reinstalled, the transformation routine looks like this:
TYPES: BEGIN OF ty_str,
         str(40) TYPE c,
       END OF ty_str.

DATA: it_str         TYPE TABLE OF ty_str,
      wa_str         LIKE LINE OF it_str,
      lv_cnt         TYPE i,
      ls_source_data LIKE LINE OF it_source_data. " work area for the source package

FIELD-SYMBOLS: <product_ext> TYPE any.

IF iv_datasource = '/DDF/PROD_ATTR_COL_DX' OR iv_datasource = '/DDF/MRTPN_EXAMPLE'.
  " Read the PRODUCT field from the source record
  READ TABLE it_source_data INTO ls_source_data WITH KEY alias_fname = 'PRODUCT'.
  IF sy-subrc = 0.
    ASSIGN COMPONENT 'DATA' OF STRUCTURE ls_source_data TO <product_ext>.
    " Split the external product value at spaces and keep the last token
    SPLIT <product_ext> AT ' ' INTO TABLE it_str.
    DESCRIBE TABLE it_str LINES lv_cnt.
    READ TABLE it_str INTO wa_str INDEX lv_cnt.
    cv_extract_value = wa_str-str.
  ENDIF.
ENDIF.
Products missing in /DDF/PRODUCT
Sometimes while loading master data you may realize that products are getting loaded into the acquisition and propagation layer DSOs (/DDF/DS41 and /DDF/DS51), but you cannot find the same products in /DDF/PRODUCT.
In our case we were using Direct update to update the data in the Infoobject /DDF/PRODUCT.
So the first thing to check is the log of the direct update program (/DDF/BW_MD_UPD) in the Job Monitor, to see whether it is running successfully or has any errors.
In our case the job was running fine and did not have any errors.
So we ran the program /DDF/FDH_SELECT_FOR_CHGLOG to add these product entries to the change log table of the harmonization table. After this, rerun the program /DDF/BW_MD_UPD to check whether the entries are updated.
The source products should be available in /DDF/PRODUCT now.
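The two steps above can also be scripted in a small utility report. This is only a hedged sketch: `Z_DDF_MD_RESYNC` is a hypothetical report name, and both standard programs have their own selection screens, which are left at their defaults here.

```abap
REPORT z_ddf_md_resync.
" Hypothetical helper report: re-trigger the change-log selection and the
" direct update, as described in the steps above.
" Assumption: the default selection-screen values of both programs suffice.

" Step 1: add the missing products to the change log of the harmonization table
SUBMIT /ddf/fdh_select_for_chglog AND RETURN.

" Step 2: rerun the direct update so the entries reach /DDF/PRODUCT
SUBMIT /ddf/bw_md_upd AND RETURN.
```

In practice you would schedule the two programs in sequence as background jobs rather than submitting them interactively.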
File Naming Pattern Issue
Sometimes while setting the File Naming Pattern, the system throws errors, and it is difficult to understand exactly what is allowed in the naming pattern.
Wildcard characters are represented using +, and any number of wildcards is allowed as long as they are contiguous. The file name can contain letters, numbers, and underscores, but hyphens are not allowed.
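The character rule above can be illustrated with a simple regex check. Note this is only an illustration of the stated rule, not the actual logic of the standard check (for example, it does not verify that the + wildcards are contiguous):

```abap
DATA: lv_pattern TYPE string VALUE 'SALES_++++_FILE'. " example pattern
" Letters, digits and underscores are allowed; '+' is the wildcard.
" Hyphens and other special characters are rejected.
FIND REGEX '^[A-Za-z0-9_+]+$' IN lv_pattern.
IF sy-subrc = 0.
  WRITE: / 'Pattern uses only allowed characters'.
ELSE.
  WRITE: / 'Pattern contains forbidden characters (e.g. a hyphen)'.
ENDIF.
```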
The class /DDF/CL_ADU_FILE_TOOLS, method CHECK_FILENAME_PATTERN, contains the checks that the system performs on the file name pattern. You can set a breakpoint there if you are still getting errors with the filename pattern.
Cancel Process in Process Monitor
A lot of times you may want to cancel a running process, but the Cancel button is disabled in the Process Monitor.
In that case, go to SE38 and run the program /DDF/ADU_CANCEL_INSTANCE.
You can find the delivery ID in the Manage tab of the DSO.
Data load to /DDF/GTIMEREF InfoObject fails
This is a system issue. Although we are not using the consolidation layer for time master data, the system still checks the consolidation filter, which is not correct.
The correction should be applied as described in the following OSS Note:
After resolving the errors above, we were able to load the IRI market research data successfully into SAP DSiM. I would like to thank my colleague guilherme.costa4 for his help and guidance throughout the project. We also worked on the integration of SAP DSiM with CBP, and I will be publishing another blog about that integration.
I hope this blog is helpful for everyone working on DSiM.