on 2019 Sep 30 11:00 AM
Is there a way to identify how much memory a cube would consume if no filter is applied and it is fully loaded into memory? This is basically to get an estimate of the size of the data file (csv) if I extract all of the data from a cube.
Hello,
please note that estimating the csv sizes of table exports based on the "memory used" figures can be heavily misleading, because tables in memory are in most cases compressed. Some tables / columns compress more efficiently than others.
So if you have, for example, a ~70% compression ratio, a table using 300 MB of memory might end up as a 1 GB csv file.
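If you want a rough feel for the compression on your own tables, you can compare the compressed and uncompressed column sizes in the column-store statistics. A minimal sketch (assuming the UNCOMPRESSED_SIZE column is available in your HANA release, and '%cube_name%' stands for your actual table name pattern):

select table_name, sum(uncompressed_size)/1024/1024/1024 as "Uncompressed_GB", sum(memory_size_in_total)/1024/1024/1024 as "Compressed_GB" from m_cs_columns where table_name like '%cube_name%' group by table_name

Keep in mind that even the uncompressed in-memory size is not the same as the csv size, since a csv stores everything as text and adds delimiters, but it is usually a closer indicator than the compressed figure alone.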
Regards, Bojan
Hi,
If you are on HANA, the SQL script below should give you the memory occupied by the cube.
select table_name, sum(memory_size_in_total)/1024/1024/1024 as "Memory_GB" from m_cs_tables where table_name like '%cube_name%' group by table_name
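Note that memory_size_in_total only reflects the columns that are currently loaded. If you want an estimate of the fully loaded size, the following sketch may help (assuming your HANA release exposes the estimated_max_memory_size_in_total column in m_cs_tables):

select table_name, sum(estimated_max_memory_size_in_total)/1024/1024/1024 as "Estimated_Full_Load_GB" from m_cs_tables where table_name like '%cube_name%' group by table_name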
Regards
Gajesh