2015 Oct 26 3:57 PM
I have a custom object to load via EMIGALL into a Z-table. It has 11 billion rows. I wonder whether EMIGALL can handle that, even with distributed imports, or whether I should look into BODS instead?
Anyone with experience loading large volumes of data into custom EMIGALL load objects, please comment... thanks!
2015 Nov 02 4:15 AM
Hi David,
After you have created your custom EMIGALL object, to load a huge file you need to split it into sub-files. Navigate to the data import screen and choose Utilities >> Break Down Migration File.
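If you prefer to pre-split the flat file outside the system instead, a rough sketch along the lines below would do it. The chunk size, file naming, and the split_file helper are illustrative assumptions, and it splits naively by line count, so the built-in Break Down Migration File utility remains the safer option since it understands the migration file structure.

# Sketch: split a large migration flat file into numbered sub-files
# before handing them to the import. Chunk size and naming are assumptions.
from pathlib import Path

def split_file(src, rows_per_chunk=1_000_000):
    """Write src out as sub-files of at most rows_per_chunk lines each."""
    src_path = Path(src)
    chunks, buf, part = [], [], 0
    with src_path.open("r", encoding="utf-8") as infile:
        for line in infile:
            buf.append(line)
            if len(buf) >= rows_per_chunk:
                out = src_path.with_name(f"{src_path.name}.part{part:04d}")
                out.write_text("".join(buf), encoding="utf-8")
                chunks.append(out)
                buf, part = [], part + 1
        if buf:  # flush the remainder
            out = src_path.with_name(f"{src_path.name}.part{part:04d}")
            out.write_text("".join(buf), encoding="utf-8")
            chunks.append(out)
    return chunks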
Regards,
Chandandeep.
2015 Nov 02 12:38 PM
Hi David,
As suggested by Chandandeep, there are functionalities available within EMIGALL to deal with large data volumes.
However, having said that, migrating 11 billion entries into a single table is too much for EMIGALL to handle, even if you are doing it through distributed import.
I guess it would take days to do the full migration.
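To put a rough number on that (the throughput figure below is a pure assumption, not a measured EMIGALL benchmark): even at a sustained 10,000 records per second across all distributed import jobs, 11 billion rows works out to roughly 13 days of continuous loading.

# Back-of-envelope estimate; the rate is an assumed figure, not a benchmark.
rows = 11_000_000_000
rows_per_second = 10_000              # assumed combined rate of all import jobs
days = rows / rows_per_second / 86_400
print(f"~{days:.1f} days")            # ~12.7 days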
Moreover, if you have 11 billion entries in a single table in the source system, the database size would grow exceptionally, and that can lead to serious performance issues.
I think you need to move to a different strategy, like BODS, for migrating such a huge volume of data.
Thanks,
Amlan