Application Development and Automation Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

ztable backup

Arif1
Active Participant

Hi,

a Z table used for the SD module has grown to more than 50 GB. I want to back it up on a scheduled basis as follows:

1. Every day the backup program will copy the previous 2 days' data to another Z table.

2. After the backup completes successfully, the backup program will delete that data from the original Z table.

My concern is that, because it is a very large table and I delete the data from the current table after copying it to the other table, a very large log file will be generated.

What is the best programming solution to run this job with good performance, and how can I prevent the log file from growing so large while the job is executing?

Thanks,

Arif

5 REPLIES

ThomasZloch
Active Contributor

Did you look into using the archiving functionality that comes with SAP (transaction SARA)? I have not done it myself yet, but I think you can set up your own custom archiving objects as well, including programs for selecting, archiving, deleting and restoring. This is proven to work with very large datasets.


Thomas


Arif1
Active Participant

Hi Thomas,

Thanks for your answer. Archiving is not possible for us right now, so I want to develop an ABAP program that simply inserts the data into another table and then deletes it from the original.


ThomasZloch
Active Contributor

OK, so what is that large log file you are referring to? Rollback area?

You probably want to use "block processing", so that the rollback area does not grow too large.

See the example here:

http://www.kerum.pl/infodepot/00016

However, a package size of 20 is far too small; rather try 10,000 (a balance between the memory used for the rollback area and the commit frequency).
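
A minimal sketch of such block processing, with made-up names (zsd_data as the source table, zsd_data_bak as a backup table with the same structure, erdat as the date field used for selection):

* Process in blocks of 10,000 rows and COMMIT after each block,
* so that the rollback area stays small.
DATA: lt_block  TYPE STANDARD TABLE OF zsd_data,
      lv_cutoff TYPE sy-datum.

lv_cutoff = sy-datum - 2.

DO.
* Each pass picks up the next block, because the previous block
* has already been deleted from the source table.
  SELECT * FROM zsd_data
           INTO TABLE lt_block
           UP TO 10000 ROWS
           WHERE erdat <= lv_cutoff.
  IF lt_block IS INITIAL.
    EXIT.
  ENDIF.

  INSERT zsd_data_bak FROM TABLE lt_block ACCEPTING DUPLICATE KEYS.
  DELETE zsd_data     FROM TABLE lt_block.
  COMMIT WORK.
ENDDO.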

Thomas


Arif1
Active Participant

Thanks for the reply.

Do you have an example of deleting the data from the first table in the same way? My main concern is copying to the backup table and then deleting from the first table. The ABAP program I ran generated a 20 GB log file for just the last 10 days of data.

Currently I am using this in my program:

* copy the selected records to the backup table (dynamic table name)
  INSERT (w_target_tabname)
    FROM TABLE it_serial
    ACCEPTING DUPLICATE KEYS.

* remove the copied records from the source table (dynamic table name)
  DELETE (w_source_tabname)
    FROM TABLE t_srlno_del.


Hi,

this sounds to me like a need for partitioning.

If you are not afraid of native SQL, Oracle has some nice features for this.

If your partitioning key is the date, you can create a new partition every n days.

Maybe this is already enough for your performance needs; in that case, you do not need to archive at all. If you still need to archive, you can move the partition to a separate interim table and then attach the interim table to your archive table as a new partition. This costs nearly no redo logs or undo, because these are plain DDL operations: internally it is just some minor updates of system tables, and the entire partition is attached to the other table.

You need to dig into the documentation of your DB flavor.

For Oracle, you will want to google:

alter table ... exchange partition

alter table ... add partition
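
A rough sketch of the sequence, using native SQL from ABAP and made-up names (zsd_data as the partitioned source table, zsd_data_arch as the partitioned archive table, zsd_data_interim as a non-partitioned table with the same structure, p_old as the partition being moved):

* Step 1: swap the old partition's data out of the source table.
* Afterwards the now-empty partition can be dropped from zsd_data.
EXEC SQL.
  ALTER TABLE zsd_data
    EXCHANGE PARTITION p_old WITH TABLE zsd_data_interim
ENDEXEC.

* Step 2: create a matching empty partition in the archive table
* (the upper bound depends on your partitioning key).
EXEC SQL.
  ALTER TABLE zsd_data_arch
    ADD PARTITION p_old VALUES LESS THAN ('20110101')
ENDEXEC.

* Step 3: swap the interim table's content into that new partition.
EXEC SQL.
  ALTER TABLE zsd_data_arch
    EXCHANGE PARTITION p_old WITH TABLE zsd_data_interim
ENDEXEC.

Since both exchanges only update the data dictionary, hardly any redo or undo is generated, no matter how large the partition is.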

Sapnet has some information about how partitioning can be used inside SAP.

Meanwhile quite a few databases can work with partitions, although there are functional differences between the flavors.

Volker