Data Services Support Issues on SAP extraction

Former Member

I've been logging a number of issues we have with Data Services, and the responses rarely come back with anything helpful from SAP.

Q:  I get an error reading table abc

A:  use ABAP dataflows

Q:  I have corrupt data in field x in table abc.  How can I work around this?

A:  Use an ABAP dataflow

Q:  Extractor xyz gives me a storage space error.  How can I determine how much temporary space is needed?

A:  Use an ABAP data flow

While I'm sure ABAP dataflows are wonderful, they don't suit our environment very well.  We chose Data Services so we could use the built-in extractors and avoid having to make changes on the SAP side.  Our change control around ECC means that we would rather not use ABAP dataflows if possible.

Firstly, is this a typical support experience?

Secondly, I know that not all extractors are officially supported, but they are the main new feature that was sold to us in DS4, so surely more effort should be going into helping customers.  Most of the issues seem to be ones that are not specific to Data Services, but would affect any use of the extractors.

Finally, why is table support so flaky?  In every case where we have had trouble reading a table, we have been able to put together a custom extractor to read it in a couple of hours.  This implies that the SAP RFC_READ_TABLE function itself is the main issue here.  Is there a 'fixed' version of this function for Data Services without the 512-character row limit and the storage issues?
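
For what it's worth, the limit is easy to reproduce outside Data Services.  Below is a minimal sketch (assuming the open-source pyrfc connector and placeholder connection details, so not our actual setup) of calling RFC_READ_TABLE directly; every selected row comes back as a single flat DATA string of at most 512 characters, which is exactly where wide tables fall over.

```python
# Minimal sketch, not our Data Services job: call RFC_READ_TABLE directly
# via pyrfc to show where the 512 limit comes from.  Connection details
# below are placeholders.
from pyrfc import Connection

conn = Connection(ashost="sap-host", sysnr="00", client="100",
                  user="RFC_USER", passwd="secret")

result = conn.call(
    "RFC_READ_TABLE",
    QUERY_TABLE="MARA",
    DELIMITER="|",
    # FIELDS restricts the columns; each selected row is still returned in
    # the DATA table as one flat string of at most 512 characters, so wide
    # tables overflow and the call fails.
    FIELDS=[{"FIELDNAME": "MATNR"}, {"FIELDNAME": "MTART"}],
    OPTIONS=[{"TEXT": "MTART EQ 'FERT'"}],  # WHERE clause, passed as text lines
    ROWCOUNT=10,
)

for row in result["DATA"]:
    print(row["WA"].split("|"))
```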

Thanks.

Accepted Solutions (1)

Former Member

Hello

Sorry to hear that you are having problems.

The RFC mechanism, like all mechanisms, has limitations.  I only use it for quick-and-dirty extractions from ECC.  It is very useful for that, but because of its known limitations (the 512-character row limit and the lack of pushdown) I would always convert the extract into an ABAP dataflow for production-quality code.

Strict change control procedures are good for governance, but they shouldn't stop us doing our job.  ABAP dataflows are the way to go, and hopefully you are not releasing code to QA and production very often.

"Corrupt data" sounds like a UNICODE issue, this is common when trying to load the ECC data into a non-unicode column type, for example varchar rather than nvarchar.  Template tables often cause this, but that is very easily fixed by specifying "use nvarchar..." on the options.

Michael

Former Member

In this case the corrupt data really is corrupt.  It's a date field which somehow has the value '2000.??.??' in it.  From the SAP front end, it appears blank.  Other fields in the same column that are blank on the front end come through as '1900.01.01'.  No one is sure how it got there, but we assume it feeds through from another system.
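
One way we could stop values like that reaching the target (just a sketch in plain Python to show the idea, not our actual job logic) is to null out anything in the field that does not parse as a real date:

```python
from datetime import datetime

def clean_sap_date(value):
    """Return a date for well-formed 'YYYY.MM.DD' strings, else None."""
    if not value:
        return None
    try:
        return datetime.strptime(value.strip(), "%Y.%m.%d").date()
    except ValueError:
        return None  # covers values like '2000.??.??'

print(clean_sap_date("1900.01.01"))  # 1900-01-01
print(clean_sap_date("2000.??.??"))  # None
```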

Answers (0)