on 2007 May 04 5:19 PM
I have read the previous threads regarding this topic:
https://forums.sdn.sap.com/click.jspa?searchID=2437280&messageID=3271696
and
https://forums.sdn.sap.com/click.jspa?searchID=2437280&messageID=3364924
I think a user can easily handle more than 120,000 records; it all depends on the format of the data:
I have a BLS transaction that takes as input a set of manufacturing batch data from a Query Template. The BLS transaction performs a transform on the data, taking the results of the Query Template and reordering them by batch number:
<u>Batch, Step, Value</u>
10, StepA, Value1
10, StepB, Value2
10, StepC, Value3
11, StepA, Value4
11, StepB, Value5
11, StepC, Value6
...
is transformed to
<u>Batch, StepA, StepB, StepC...</u>
10, Value1, Value2, Value3...
11, Value4, Value5, Value6...
and so on (of course there are other complexities to the transform, but I won't go into those).
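The reshape described above (long batch/step/value rows pivoted into one wide row per batch) can be sketched outside BLS like this. This is a minimal illustration, not the poster's actual transaction; the data values are taken from the example rows in the post.

```python
from collections import OrderedDict

def pivot_batches(rows):
    """Pivot (batch, step, value) rows into one wide row per batch.

    rows: iterable of (batch, step, value) tuples, as returned by a query.
    Returns (step_names, wide_rows), where each wide row is
    [batch, value_for_step1, value_for_step2, ...].
    """
    steps = []                # step column order, in first-seen order
    by_batch = OrderedDict()  # batch -> {step: value}, preserving batch order
    for batch, step, value in rows:
        if step not in steps:
            steps.append(step)
        by_batch.setdefault(batch, {})[step] = value
    # Missing steps for a batch come back as None rather than raising.
    wide = [[b] + [vals.get(s) for s in steps] for b, vals in by_batch.items()]
    return steps, wide

# Example matching the rows in the post:
rows = [(10, "StepA", "Value1"), (10, "StepB", "Value2"), (10, "StepC", "Value3"),
        (11, "StepA", "Value4"), (11, "StepB", "Value5"), (11, "StepC", "Value6")]
steps, wide = pivot_batches(rows)
# steps -> ["StepA", "StepB", "StepC"]
# wide  -> [[10, "Value1", "Value2", "Value3"], [11, "Value4", "Value5", "Value6"]]
```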
I already have more than 250,000 records and my users want to be able to see all data in this report. So yes, I need more than 120,000 records from my Query Template. How do I do this?
Thanks,
David
I completely disagree that a user can deal with that volume of data, but since you seem committed to your path to destruction...
Combining the results of multiple queries MIGHT work, although when sending it back through xMII, it might still get limited to 120K records. You'll need to try it.
In any case, I would suggest doing a drilldown model, where you provide users with a list of batches completed during the period, and let them drill down into the detail(s) for that batch or to do batch-to-batch or batch-to-"golden batch" comparisons.
- Rick
Break up your actions:
1. Query to get a list of unique Batches (10,11, etc.)
2. Using a Repeater, for each Batch, query to get the Batch, Step, and Value.
3. Transform the data
4. Append to an IllumDocument.
So instead of querying once and transforming once, you'll need to query multiple times.
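The four-step Repeater pattern above can be sketched as a plain loop. This is only an illustration of the control flow; `query_batch_list`, `query_batch_detail`, and the other callables are hypothetical stand-ins for the xMII Query Templates and IllumDocument append, not real xMII APIs.

```python
def build_report(query_batch_list, query_batch_detail, transform, append):
    """Repeater pattern: one small query per batch instead of one huge query.

    query_batch_list:   () -> list of unique batch ids          (step 1)
    query_batch_detail: batch_id -> detail rows for that batch  (step 2)
    transform:          detail rows -> transformed rows         (step 3)
    append:             transformed rows -> None                (step 4, e.g. IllumDocument)
    """
    for batch_id in query_batch_list():          # Repeater over the batch list
        detail = query_batch_detail(batch_id)    # per-batch query stays small
        append(transform(detail))                # accumulate into the output document
```

Each per-batch query stays far below the 120,000-row limit, so the limit is never hit even though the combined output covers every batch.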
And if possible, instead of bringing back all 250,000+ rows to the user, why not filter the result down to the manageable few rows that actually need to be looked at (those that meet a certain condition)?
And overall, think about the number of records a user can "really" analyze at one time. I know this has been mentioned in the links you provided. An average novel has between 60,000 and 100,000 words, and think how long one of those takes to read. How long would it take someone to read every piece of data in a 250,000-row result set? And if they don't need to see every piece of data, why bring it back? Use the visualization tools to draw attention to the Batches that need attention, and let the user drill down from there.