Enterprise Resource Planning Blogs by Members

As part of a large SAP Business One project, we ran into trouble using SAP's Data Transfer Workbench (DTW) because of the sheer volume of data involved. DTW's memory management is not optimal: after several hours of running it consumes almost 100% of available RAM, slows down dramatically, and eventually stops responding altogether.

Meet the solution: since the project was based on SAP Business One version for SAP HANA, we were able to leverage the Service Layer to import data in bulk, with a minimal memory footprint on the machine performing the import.
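To give an idea of what a Service Layer import call looks like at the HTTP level, here is a minimal sketch. The host name and credentials are placeholders, and the `/b1s/v1/Login` and `/b1s/v1/BusinessPartners` endpoints are the standard Service Layer paths; our actual tool is written in .NET, so Python here is purely illustrative:

```python
import json
import urllib.request

BASE_URL = "https://b1server:50000/b1s/v1"  # placeholder host

def build_login_body(company_db, user, password):
    """Build the JSON body the Service Layer Login endpoint expects."""
    return json.dumps({"CompanyDB": company_db,
                       "UserName": user,
                       "Password": password})

def post_record(session_id, path, record):
    """POST one record; the B1SESSION cookie authenticates the call."""
    req = urllib.request.Request(
        BASE_URL + path,
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Cookie": "B1SESSION=" + session_id},
        method="POST")
    with urllib.request.urlopen(req) as resp:  # network call
        return resp.status
```

Because each call is a plain, stateless HTTPS request, the client never accumulates COM object state the way the DI API does, which is where the memory savings come from.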

To illustrate the difference, here is a side-by-side comparison of the two approaches:

| | DTW | Our tool |
|---|---|---|
| Technology used | COM API (DI API) | B1 Service Layer (HTTPS) |
| Libraries required | .NET Framework, DI API, MS Access DB Engine | .NET Framework only |
| Memory consumption on client | Starts at 200 MB, grows to several GB after a few hours | 110 MB steady, running 4 threads |
| Import speed | Starts at approx. 3 records per second, eventually slows down to 0 | 12 records per second, running 4 threads |
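The "running 4 threads" figures above come from parallelizing the POST calls. A sketch of that pattern, using Python's `ThreadPoolExecutor` for illustration (the `post_record` callable and worker count are stand-ins, not our actual .NET implementation):

```python
from concurrent.futures import ThreadPoolExecutor

def import_records(records, post_record, n_workers=4):
    """Send records concurrently; post_record posts one record to the
    Service Layer and returns its HTTP status.

    map() preserves input order while up to n_workers requests are
    in flight at any time, keeping memory usage flat regardless of
    how many records are imported.
    """
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return list(pool.map(post_record, records))
```

Because the requests are independent, throughput scales with the worker count until the server side becomes the bottleneck.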

Here is a screenshot of our first version, running on an uncertified test server with much lower CPU power than a certified server:

Note that we also decided not to use any of the Microsoft OData libraries suggested by the B1 solution architects. Instead, we built our own communication routines using plain .NET functions.
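As an example of the kind of plumbing an OData client library would otherwise handle, here is a sketch of extracting the session id from a raw `Set-Cookie` header with plain string handling (the header value shown is illustrative; again, Python stands in for our .NET code):

```python
def extract_session(set_cookie_header):
    """Pull the B1SESSION value out of a raw Set-Cookie header using
    plain string handling instead of a cookie/OData helper library."""
    for part in set_cookie_header.split(";"):
        part = part.strip()
        if part.startswith("B1SESSION="):
            return part[len("B1SESSION="):]
    return None  # no session cookie present
```

Writing these few helpers by hand keeps the dependency list short, which is how the tool ends up needing the .NET Framework only.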

Anybody interested in the tool can drop me a note, either right here on SCN or using our Facebook page: https://www.facebook.com/CEO.Consultoria

Looking forward to your comments!