Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

use of JOB_SUBMIT

Former Member
0 Kudos

Hi,

I have to upload a file of about 24,000 records and then split that data into parts. After execution, the whole output (containing all the parts) should appear in an internal table.

Function module: JOB_SUBMIT.

Please advise.

Regards,

Vinit.

6 REPLIES

Former Member
0 Kudos

Hi Vinit.

PARAMETERS:
  JOBNAME  LIKE TBTCJOB-JOBNAME DEFAULT 'TEST',
  JOBGROUP LIKE TBTCJOB-JOBGROUP.

DATA:
  JOBCOUNT LIKE TBTCJOB-JOBCOUNT.

* Open the job
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    JOBNAME  = JOBNAME
  IMPORTING
    JOBCOUNT = JOBCOUNT
  EXCEPTIONS
    CANT_CREATE_JOB  = 1
    INVALID_JOB_DATA = 2
    JOBNAME_MISSING  = 3
    OTHERS           = 4.

IF SY-SUBRC NE 0.
  WRITE: /1 'JOB_OPEN failed, reason code:', SY-SUBRC.
  RETURN.   "stop further processing
ENDIF.

* Create the actual payment step
SUBMIT ZZAPXXX AND RETURN          "add step ZZAPXXX to the job
  USER SY-UNAME
  VIA JOB JOBNAME NUMBER JOBCOUNT
  USING SELECTION-SET 'PAYMENT'    "variant
  WITH P_LAUFD = TVARV-LOW         "parameters
  WITH P_ID    = P_ID.

Instead of the SUBMIT statement you can use the JOB_SUBMIT function module, but even the SUBMIT statement will do.

* Close the job
CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    JOBNAME   = JOBNAME
    JOBCOUNT  = JOBCOUNT
    STRTIMMED = 'X'.   "start immediately

You can add multiple steps to the job by doing more than one SUBMIT for the same job before closing it.
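As a minimal sketch of that multi-step pattern, assuming two placeholder reports ZREPORT_A and ZREPORT_B exist:

```abap
* Sketch: one background job with two steps, then release it.
* ZREPORT_A and ZREPORT_B are hypothetical report names.
DATA: LV_JOBNAME  LIKE TBTCJOB-JOBNAME VALUE 'MULTISTEP',
      LV_JOBCOUNT LIKE TBTCJOB-JOBCOUNT.

CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    JOBNAME  = LV_JOBNAME
  IMPORTING
    JOBCOUNT = LV_JOBCOUNT.

* Each SUBMIT before JOB_CLOSE becomes one step of the same job
SUBMIT ZREPORT_A AND RETURN VIA JOB LV_JOBNAME NUMBER LV_JOBCOUNT.
SUBMIT ZREPORT_B AND RETURN VIA JOB LV_JOBNAME NUMBER LV_JOBCOUNT.

CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    JOBNAME   = LV_JOBNAME
    JOBCOUNT  = LV_JOBCOUNT
    STRTIMMED = 'X'.   "start immediately after release
```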

0 Kudos

Hi Mohammed,

Thanks for the reply, but I have to split it into parts, i.e. 6 parts of 4,000 each.

Uploading through a file, then splitting it internally into parts, then bringing it together into an internal table, and then moving it to the database.

Will you please advise?

Regards,

Vinit.

0 Kudos

Well Vinit,

Is it like you have 24,000 records?

You have 6 files of 4,000 records each.

You want to upload the 4,000 records with the same program being run 6 times and process the records.

Up to here it is fine.

You can open the job.

Write 6 SUBMITs with different file names 01, 02, 03, 04, 05, 06.

Put out the output with WRITE statements or an ALV list display.

Why do you want to put it together in an internal table again?

Regards

0 Kudos

Hi Mohammed,

I want to upload 24,000 records from an Excel sheet and then split that data into batches of 4,000 records each (6 batches), i.e. splitting in the program itself after uploading.

After that I want to move the data sequentially into the database table. Would it be possible to move the 6 batches one after another, i.e. inserting the first batch (4,000 records) and then appending the next batch after the first, and so on up to 6, into the database table?

(The data should be sent to the database table sequentially, in batches.)

Please suggest a suitable method or technique for it.
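The batching described above can be sketched as follows, assuming the 24,000 uploaded rows already sit in an internal table and the target is a hypothetical database table ZTABLE:

```abap
* Sketch, not a definitive implementation: split LT_DATA into
* batches of 4,000 and insert them sequentially. ZTABLE is a
* placeholder name for the target database table.
CONSTANTS LC_BATCH TYPE I VALUE 4000.

DATA: LT_DATA  TYPE STANDARD TABLE OF ZTABLE,  "24,000 uploaded rows
      LT_BATCH TYPE STANDARD TABLE OF ZTABLE,
      LV_FROM  TYPE I VALUE 1.

WHILE LV_FROM <= LINES( LT_DATA ).
  CLEAR LT_BATCH.
* Copy the next slice of up to 4,000 rows
  APPEND LINES OF LT_DATA FROM LV_FROM TO LV_FROM + LC_BATCH - 1
         TO LT_BATCH.
  INSERT ZTABLE FROM TABLE LT_BATCH.   "write one batch
  COMMIT WORK.                         "commit after each batch
  LV_FROM = LV_FROM + LC_BATCH.
ENDWHILE.
```

Committing after each batch keeps the database LUWs small, so a failure part-way through loses at most one batch.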

Regards,

Vinit.

0 Kudos

If the final goal is to put that data into the database, use a legacy data migration tool. You are wasting your time with a custom program, or your company doesn't have the quality assurance some of them have.

In any case, creating a program just to populate a table (how many times do you think you will need to execute it? once or twice in its life) is not worth the effort.

Look into transaction LSMW.

0 Kudos

Hi Vinit,

I don't know the reason for the splitting, but you can do all 24,000 records in one go.

Now for your answer: yes, you can do it.

Create two programs, one for the upload and a second for the processing.

Pass each batch of 4,000 records in an internal table to the other program via EXPORT and IMPORT.

But I would suggest doing it in one go.
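The EXPORT/IMPORT handoff mentioned above can be sketched like this, assuming a hypothetical processing report ZPROCESS_BATCH and memory ID 'ZVINIT_BATCH':

```abap
* Sketch: pass an internal table from the upload program to the
* processing program via ABAP memory. This works because
* SUBMIT ... AND RETURN keeps both programs in the same session.

* --- In the upload program, for each batch of 4,000 records:
EXPORT LT_BATCH TO MEMORY ID 'ZVINIT_BATCH'.
SUBMIT ZPROCESS_BATCH AND RETURN.

* --- In report ZPROCESS_BATCH:
IMPORT LT_BATCH FROM MEMORY ID 'ZVINIT_BATCH'.
IF SY-SUBRC = 0.
* ... process the 4,000 records here ...
  FREE MEMORY ID 'ZVINIT_BATCH'.   "clean up after reading
ENDIF.
```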

Regards