on 2020 Aug 25 8:06 AM
Hi All,
I have more than 120 million records in an Oracle database and I am using BODS to extract the data, but it is taking too long and the job fails with the error: unable to extend table MASTER.DT__5066_2767_1_1 by 8192 in tablespace INT_DATA.
I am using BODS 4.2.
Can anyone suggest the fastest or a more optimized way to extract all the data? I have already used the Data_Transfer transform in automatic mode, but it did not help much.
Thank you in advance.
Pankaj
Hi,
I don't think your issue is caused by poor performance - it is rather a database out-of-space issue: either the tablespace cannot grow any further or the underlying disk has insufficient space.
Your DBA needs to extend the datafile - please check the following:
To add a new datafile to the existing tablespace:
ALTER TABLESPACE INT_DATA ADD DATAFILE '/u01/oradata/userdata03.dbf' SIZE 200M;
To find out the location and size of your data files:
SELECT FILE_NAME, BYTES FROM DBA_DATA_FILES WHERE TABLESPACE_NAME = 'INT_DATA';
Additionally, it is possible to set the tablespace to autoextend as well. Here are the docs.
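If you want to go the autoextend route, here is a minimal sketch. I'm reusing the example datafile path from above, so the path, increment and limit are assumptions your DBA should adjust to your environment; the free-space query is a standard check that can be run first:

-- Check how much free space is left in the tablespace (requires access to the DBA views)
SELECT TABLESPACE_NAME, ROUND(SUM(BYTES) / 1024 / 1024) AS FREE_MB
FROM DBA_FREE_SPACE
WHERE TABLESPACE_NAME = 'INT_DATA'
GROUP BY TABLESPACE_NAME;

-- Let the existing datafile grow automatically in 100M steps up to 10G
-- (the datafile path is taken from the example above and may differ on your system)
ALTER DATABASE DATAFILE '/u01/oradata/userdata03.dbf' AUTOEXTEND ON NEXT 100M MAXSIZE 10G;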
Regarding performance improvements:
1.) Check whether you actually need all columns in the target and reduce the extraction to only the columns that are really needed (see the sketch after this list).
2.) Increase the degree of parallelism (DOP) of the dataflow.
3.) Use the bulk loader option in the settings of the target table within DS.
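To illustrate point 1, here is a hedged sketch of a trimmed extraction. The table and column names (MASTER.CUSTOMER and its columns) are purely hypothetical, not from your job; in DS you would map only the needed columns in your Query transform so that the SQL pushed down to Oracle looks roughly like this:

-- Hypothetical example: extract only the columns the target actually needs
-- instead of SELECT * over all 120 million rows
SELECT CUSTOMER_ID,
       CUSTOMER_NAME,
       LAST_UPDATE_DATE
FROM   MASTER.CUSTOMER;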
Best regards,
Julian