
SAP Datasphere multiplies number of entries


Hello experts,

we are currently building a report in SAP Datasphere that is consumed by SAP Analytics Cloud. While checking the numbers in the frontend (SAC), I noticed that they do not match those in SAP Datasphere. So far I have identified two sources of error:

  1. graphical views
    The number of entries is multiplied by five. This seemed so strange that I couldn't believe it at first, so I checked the data (number of entries and amount) from the source system up to the point where the mistake appears. In a graphical view with left joins, the data is multiplied by five. Recreating this faulty view as an SQL view produces the correct data. Creating a new graphical view, deleting all joins, deploying, re-adding the joins, and deploying again also results in the correct data.

    The correct data is on the left side of the image. The SQL statement refers to the newly created graphical view after deleting and re-adding the joins. On the right side is the original view where I noticed the error.

    Is this a known issue or am I doing something wrong here?

  2. Analytic Model
    When creating an analytic model from a view (fact), the values are also distorted.
    The fact view shows the correct data, but when I create an Analytic Model that uses the fact view, I get double the amount and double the entries. I haven't found a workaround for this problem yet.

Has anyone had similar experiences and found a way to avoid these problems?

Thanks in advance.



Hello, I'm encountering the same problem. Have you managed to solve it?

Accepted Solutions (0)

Answers (1)



Hi Philipp,

this is a common occurrence in data modeling in general. There is no way to help you directly without analyzing the view definitions in detail, but as a starting point, take a look at the following:

- the "distinct" flag in some nodes of the graphical view can suppress duplicates, which is helpful to know when you are analyzing why duplicates appear in one area but not another

- for every join, check the join condition (mapping) => e.g. when you have a left join and not every primary key of the right side of the join is mapped in the join condition, this will create duplicates

- similarly, check every association and make sure that all primary keys of the right side are mapped. I think it would throw an error before deployment in that case anyway, but I'm not sure (that could explain the Analytic Model behavior)
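The second point above is ordinary SQL join "fan-out" and can be reproduced outside Datasphere. The sketch below (plain SQLite; table and column names are made up for illustration) shows how a left join whose condition does not cover the full primary key of the right-hand table multiplies both row counts and sums:

```python
import sqlite3

# Minimal illustration of join fan-out. The right-hand table has a
# composite key (order_id, item_no); joining on order_id alone means
# each order row matches several item rows.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders (order_id INTEGER, amount INTEGER);
    INSERT INTO orders VALUES (1, 100);

    -- Five detail rows for the same order on the right side of the join.
    CREATE TABLE items (order_id INTEGER, item_no INTEGER);
    INSERT INTO items VALUES (1, 1), (1, 2), (1, 3), (1, 4), (1, 5);
""")

# item_no is NOT part of the join condition, so the single order row
# fans out to 5 result rows and SUM(amount) comes back 5x too high.
rows, total = con.execute("""
    SELECT COUNT(*), SUM(o.amount)
    FROM orders o LEFT JOIN items i ON o.order_id = i.order_id
""").fetchone()
print(rows, total)  # 5 500

# A "distinct" setting on the order columns would hide the duplicate
# rows, but it masks the root cause rather than fixing the mapping.
```

This is the same mechanism regardless of tool: if the measures multiply by exactly five, look for a joined table that holds five rows per key value of the join condition.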

Good luck.


Hi Irvin,

thanks for your answer.
We already checked the mapping of the joins within the Graphical View. At the moment we have two identical views showing different numbers. The Analytic Model also doubles the numbers when I delete all associations. This problem occurs whether I create the Analytic Model manually or through the link in the previous view (Fact or Analytical Dataset).

Out of curiosity, I tried deleting one primary key of the association and then created the Analytic Model. As long as at least one primary key of the right side of the association is mapped, no error is thrown.