on 2024 May 27 4:58 PM
Hello,
I'm working on rebuilding a complex BW transformation of ours in Datasphere. The transformation has an expert routine, which I'm trying to recreate with a Python script inside a Data Flow. In the ABAP expert routine I have access to the target structure via `RESULT_PACKAGE`, and I can use `ASSIGN COMPONENT x OF STRUCTURE RESULT_FIELDS` to access a component of the structure if it exists. This is convenient, as it lets me ignore source fields I'm not interested in.
The question is: is there a similar mechanism in Datasphere? Can I somehow access the target structure within the script? My assumption is that it's not possible, since there is only a single DataFrame called `data` to begin with. If there is no other way, is my only option to hardcode the expected columns of the result DataFrame and check for each column before assigning any values? Is there a more elegant way?
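To illustrate the fallback I'm describing, here is a rough sketch of what the hardcoded approach would look like inside the script operator. This assumes pandas and uses hypothetical target column names; it is not an actual Datasphere API, just the manual check I'd like to avoid:

```python
import pandas as pd

# Hypothetical list of expected target columns (the analogue of RESULT_FIELDS).
TARGET_COLUMNS = ["MATERIAL", "PLANT", "AMOUNT"]

def transform(data: pd.DataFrame) -> pd.DataFrame:
    # Keep only the target columns that actually exist in the source,
    # mirroring the ASSIGN COMPONENT ... existence check in ABAP.
    present = [c for c in TARGET_COLUMNS if c in data.columns]
    result = data[present].copy()
    # Add any expected columns missing from the source with a default value.
    for col in TARGET_COLUMNS:
        if col not in result.columns:
            result[col] = None
    return result
```

The same effect can be had in one call with `data.reindex(columns=TARGET_COLUMNS)`, but either way the target column list has to be maintained by hand in the script.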
Thanks in advance.