
Calculation jobs in SLT Replication

Active Participant

Dear All,

Could you please explain the importance of calculation jobs when defining a configuration in transaction LTR?

If I define a total of 10 jobs, how many should go to calculation and how many to data transfer?

Is it based on the size of the tables (for example, tables with millions of records), or on how many tables I set for replication versus initial load?

A detailed explanation of the importance of calculation jobs in LTR would be highly appreciated.

Thanks & Regards,


Accepted Solutions (0)

Answers (3)


Former Member


Calculation jobs play a major role, especially when there are large tables in the source system. The number of jobs (data load, data transfer, and calculation jobs) in a configuration has to be defined carefully, keeping in mind the resources available in both the SLT and source systems.

For each job you define in the configuration, a corresponding background work process is occupied in SLT, and the same number of jobs are scheduled to run in the source system during data transfers and loads. A calculation job counts the number of records in a table and prepares it for load/replication.

For example, if you schedule a table for replication in LTRC, a calculation job first kicks off in SLT, which in turn starts another job in ECC. You can check this in SM37 in both systems. If you look at the log of that job in ECC, you will see the calculated number of records. The more records there are, the longer it runs. Once the calculation job has finished, the load jobs kick off and perform the actual transfer of data from source to target. The more load jobs, the faster the data is transferred.

You may ask how to determine the required number of calculation jobs.

It varies from one scenario to another, and you need to experiment with different options.

We need more calculation jobs when we parallelize the initial load, because multiple calculation jobs then kick off and count the records much faster. We configured 5 calculation jobs for a table with 400M records using parallel processing; this drastically reduced the read time in the source system and sped up the loads.
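To make the point above concrete, here is a back-of-the-envelope sketch of why splitting the record calculation of one large table across several parallel jobs shortens the preparation phase. The record count comes from the reply (400M), but the per-job throughput is a made-up assumption for illustration only, not a measured SLT figure:

```python
# Hypothetical illustration: time to "calculate" a large table's records
# when the work is split across N parallel calculation jobs.
# RECORDS_PER_SECOND_PER_JOB is an assumed figure, not an SLT benchmark.

TABLE_RECORDS = 400_000_000           # e.g. the 400M-record table mentioned above
RECORDS_PER_SECOND_PER_JOB = 500_000  # assumed read throughput of one job

def calc_time_seconds(records: int, jobs: int) -> float:
    """Time to count `records` when each job handles one key-range slice."""
    per_job = records / jobs
    return per_job / RECORDS_PER_SECOND_PER_JOB

serial = calc_time_seconds(TABLE_RECORDS, 1)
parallel = calc_time_seconds(TABLE_RECORDS, 5)
print(f"1 calc job : {serial / 60:.1f} min")
print(f"5 calc jobs: {parallel / 60:.1f} min")
```

With these assumed numbers, 5 parallel jobs cut the calculation phase to one fifth of the serial time, which matches the "drastically reduced read time" the reply describes.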

Mahesh Shetty


Hi Vijay,

Just my five cents on this (which I figured out by first reading the sizing guide for SAP LTRS, and then by doing it...).

Since you have 10 jobs available, keep 1 job out of scope, as this one will be the master job that triggers and monitors all the others.

So you have 9 jobs left.

Now, ask yourself:

- How many tables are going to be replicated at once?

- How fast do you need those tables to be loaded initially?

- How often are the tables changed in the source system?

Let's assume you'll replicate 200 tables initially, and you want the initial load to be fast.

Then I'd start with 7 initial load jobs, leaving 2 jobs for calculation (replication).

When you have completed the initial load, change the setting using transaction LTR (you'll jump off to a Web Dynpro).

Then change it to... and now it depends on your system and your needs... for example, 3 initial load jobs and 6 calculation jobs.

Go ahead and give it a try... and let us know what your experience is.
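The phased split described in this reply can be sketched as simple arithmetic. The 1-master-job rule and the 7/2 and 3/6 splits come from the reply itself; they are starting points to experiment with, not an official SLT formula:

```python
# Sketch of the job allocation from this thread: 1 master job is always
# reserved, and the remaining jobs are split differently per phase.
# The split values are the thread's suggestion, not an SAP recommendation.

TOTAL_JOBS = 10
MASTER_JOBS = 1                       # master job triggers/monitors the rest
available = TOTAL_JOBS - MASTER_JOBS  # 9 jobs left to distribute

# Phase 1: initial load of ~200 tables -> favour load jobs.
initial_load = {"load_jobs": 7, "calc_jobs": available - 7}

# Phase 2: ongoing replication -> shift the weight to calculation jobs.
replication = {"load_jobs": 3, "calc_jobs": available - 3}

for phase, split in [("initial load", initial_load), ("replication", replication)]:
    # Sanity check: every non-master job is assigned to exactly one role.
    assert split["load_jobs"] + split["calc_jobs"] == available
    print(f"{phase}: {split}")
```

The point of the two phases is that the same 9 jobs are simply re-weighted once the bulk load is done, rather than the total being changed.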



Active Participant

Hi Vijay,

In the source system, the number of available work processes reserved for SLT replication should equal the number of data transfer jobs configured on the SLT side.

Some free work processes are also required for creating new logging tables and database triggers, as well as for calculating access plans and creating runtime objects.

The Sizing Guide for SAP LT Replication Server can help you with this.