on 2015 Feb 05 8:42 AM
How do I check whether my operation is fully pushed down or only partially?
Is "View Optimized SQL" the only way to check whether the operation I am doing is fully pushed down or only partially done? I found the following in an SAP article:
Operations within the SELECT statement that the software can push to the
database include:
• Aggregations — Aggregate functions, typically used with a Group by statement, always produce a data set smaller than or the same size as
the original data set.
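To make the quoted point concrete: when an aggregation is pushed down, the GROUP BY runs inside the database and only the reduced result set comes back to the engine. A minimal sketch of that idea using Python's sqlite3 (the table and column names here are invented for illustration, not from Data Services):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EMEA", 100.0), ("EMEA", 50.0), ("APAC", 75.0)])

# Pushed down: the database computes the aggregate and returns
# 2 summary rows instead of the full 3-row source table.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('APAC', 75.0), ('EMEA', 150.0)]
```

The same principle is why a pushed-down aggregate data set can never be larger than its source: each group collapses to one row.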
My doubt here: if I split the operations into separate query transforms (one that just maps source to target, another that sorts, and a third that does the group by), will that perform better than combining everything in one query transform?
Thanks,
AJ.
Yes I do, and it was written by you. But my doubt is just to be sure: if I decouple the aggregation functions from my SELECT, will it cause a performance drain, or will it be the same?
At runtime I am not sure how DS handles the GROUP BY and ORDER BY processing if it is distributed across queries.
Hi Dirk. What about batch size handling when implementing full push-down operations?
Must I ensure the entire operation will fit in the database transaction log?
Is there a way to implement a full push-down operation in multiple steps (rows per commit)?
I'm getting transaction-log-full errors, and I cannot make the tlog match the size of my biggest SELECT-INSERT push down.
Thanks a lot!
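For the rows-per-commit idea above, one general pattern (not a Data Services feature, just plain SQL) is to split the big INSERT...SELECT into key-range batches so each transaction stays small and the log can truncate between commits. A hypothetical sketch with sqlite3, where the table names and batch size are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER PRIMARY KEY, val TEXT)")
conn.execute("CREATE TABLE tgt (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO src VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(1, 11)])
conn.commit()

BATCH = 3      # rows per commit; in real life, sized to fit the tlog
last_id = 0
while True:
    # Each INSERT..SELECT covers only one key range, so each
    # transaction is small and is committed before the next starts.
    cur = conn.execute(
        "INSERT INTO tgt SELECT id, val FROM src "
        "WHERE id > ? ORDER BY id LIMIT ?", (last_id, BATCH))
    conn.commit()
    if cur.rowcount == 0:
        break
    last_id = conn.execute("SELECT MAX(id) FROM tgt").fetchone()[0]

total = conn.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]
print(total)  # 10
```

The trade-off is that the whole copy is no longer one atomic statement, so a restart strategy (here, resuming from MAX(id) in the target) is needed.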
Thanks, Dirk!
I ended up looking into full push down while trying to optimize a data flow that takes about an hour to complete. With full push down and some data flow splitting, it takes only 20 minutes (simple recovery model). Another thing I tried is the collect/use statistics optimization strategy. That also works (20 minutes), but since I'm working on just a small piece of a bigger job, my colleagues don't recommend it. I'm now reading up to understand how to manually optimize caches the same way collect/use statistics does. Is it just a matter of playing with cache activation/deactivation, sizes, and types? Thanks again.
You are correct - use "View Optimized SQL" to check how much is being pushed down.
The Data Services optimizer attempts to push down as much of the logic from a data flow as possible, so if logic is distributed across a number of query transforms it can still be combined and pushed down. You can see what is being pushed down by using "View Optimized SQL".
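To illustrate what that combining looks like: three separate transforms (a plain mapping, a group by, and a sort) are logically just nested SELECTs, and the optimizer can collapse them into one statement for the database. A sketch with sqlite3, using made-up table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (cust TEXT, qty INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("a", 2), ("b", 1), ("a", 3)])

# Three "query transforms" written literally as nested subqueries:
# an identity mapping, then a GROUP BY, then an ORDER BY.
staged = conn.execute("""
    SELECT cust, total FROM (
        SELECT cust, SUM(qty) AS total FROM (
            SELECT cust, qty FROM orders
        ) GROUP BY cust
    ) ORDER BY cust
""").fetchall()

# The equivalent single statement an optimizer could push down.
combined = conn.execute(
    "SELECT cust, SUM(qty) AS total FROM orders "
    "GROUP BY cust ORDER BY cust").fetchall()

assert staged == combined
print(combined)  # [('a', 5), ('b', 1)]
```

Because the two forms are equivalent, splitting logic across transforms does not by itself prevent full push down; "View Optimized SQL" shows which form the database actually receives.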