I have a performance problem with reports. Some reports take a very long time to run, up to 5-10 minutes. I think it depends on the number of records in the InfoProvider. I know there are ways to improve performance, such as splitting a large cube into smaller cubes and creating aggregates. But how many records should a cube have before it is worth splitting? And what about ODS objects?
Version: BW 3.50.
This depends on the kind of system you have and the volume of data.
An ODS object can have performance issues even with a small number of records if secondary indexes have not been created on the characteristics you use in the report drilldown.
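As a rough sketch, such a secondary index is just a database index on the active table of the ODS object. The table name /BIC/AZSALES_O00 and the field names below are hypothetical placeholders; substitute your own ODS active table (typically /BIC/A&lt;ODS name&gt;00) and the characteristics used in the drilldown. In BW itself you would normally define the index in the Indexes folder of ODS maintenance rather than directly on the database, so it survives activation and transports:

```sql
-- Hypothetical example: secondary index on the active table of an ODS object.
-- Table and field names are placeholders; adapt them to your own ODS object
-- and the characteristics that appear in the report drilldown.
CREATE INDEX "/BIC/AZSALES_O00~Z01"
    ON "/BIC/AZSALES_O00" ("/BIC/ZCUSTOMER", "/BIC/ZMATERIAL");
```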
For the cubes, try compressing with zero elimination and keep the database statistics up to date.
Also try to keep the number of requests in a cube as low as possible. I generally aim for around 15-20 million records per cube, but with very large data volumes this limit can be higher.