
Question

Former Member
0 Kudos

I have a crystal report that uses a stored procedure. The data in the stored procedure is returned very fast. When I run the report for 6 months the report works just fine. When I run the report for a year it chokes on me.

Is the report just too big? What are some good methods for getting Crystal Reports to display a year's worth of data?

This report performs a lot of time calculations on the 88,000 records returned from the stored procedure for the year.

Accepted Solutions (1)

Former Member
0 Kudos

If I ever find anything, I'll post it.

The way the SP works is one major select with many inner joins, followed by two union selects.

In Crystal Reports it just crunches data based on the rows fired back from the SP.

Former Member
0 Kudos

I have a report that takes hours to run 'cause CR is crunching arrays for each of hundreds of thousands of records. If you let the report run (perhaps even overnight, if you know it's not causing DB issues), does it eventually display?

Here's the trick I use to debug Crystal Reports:

When there is a formula error, Crystal pops up the Formula Editor with the formula in error displayed. It also shows current values of all of the fields referenced by the formula. So, I force an error by adding (basic syntax):


' Force a Divide by Zero error; the Formula Editor pops up like a breakpoint
dim debug as number
debug = debug / debug

This causes a Divide by Zero error, causing a breakpoint, so I can look at the values in my formula. You can use the following logic to cause the error on a specific record/iteration:


global cnt as number
dim debug as number

cnt = cnt + 1
' Trip the Divide by Zero once the 10th record/iteration has passed
if cnt > 10 then
  debug = debug / debug
end if

Obviously, the 10 in the IF statement is an example. You can also base this on fields in your record, or any other report field.

BTW, you have to use the debug variable. If you do something like 1 / 0, the Editor will complain when you try to save the formula.

HTH,

Carl

Former Member
0 Kudos

Thanks Carl.

I'll be adding that to my wee grab bag of knowledge.

Jason

Answers (4)

Former Member
0 Kudos

The script is already written with NOLOCK; it's a standard must in this organization because there is too much moving data.

I agree with you on the formulas and functions, and I believe that because of all the number crunching it is memory intensive.

My personal opinion is that the data should be crunched on the SQL end before it is displayed in Crystal Reports.
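As a rough sketch of what crunching on the SQL end could look like (the table and column names here are assumptions, not the actual SP), the per-record time math and grouping can be moved into the procedure so Crystal only receives the dozen or so summary rows it actually prints:

```sql
-- Hypothetical schema: dbo.EventLog(StartTime, EndTime).
-- Compute elapsed time and the grouping on the server; Crystal then
-- only has to format ~12-14 pre-aggregated rows instead of 88,000.
SELECT
    DATEPART(year, StartTime)                 AS EventYear,
    DATEPART(month, StartTime)                AS EventMonth,
    COUNT(*)                                  AS EventCount,
    AVG(DATEDIFF(second, StartTime, EndTime)) AS AvgSecondsToEvent
FROM dbo.EventLog WITH (NOLOCK)   -- dirty reads accepted per site standard
GROUP BY DATEPART(year, StartTime), DATEPART(month, StartTime)
ORDER BY EventYear, EventMonth;
```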

Do you know of any debugger tool for Crystal Reports?

Former Member
0 Kudos

I agree with you there. I try to push as much processing back to the server as I can.

As far as a debugger goes... nothing that I know of. That's not to say there isn't one; I just don't know about it. I've always just used the Performance tab on the Task Manager to see how big of a hit I was taking on the client side. If you come across anything, please post it to this thread. I'd be interested as well.

This is probably a WAY off question but it bears asking... Are you combining the SP data set with any other data (other tables or Commands)? CR seems to hate it when you combine Commands with anything else.

Jason

Former Member
0 Kudos

There is no error message. It just freezes up.

The stored procedure is in good shape. The execution plan does not show any performance issues: it uses seeks instead of scans, and no usage is over 25%.

The stored procedure does two select unions.

I ruled out the stored procedure because it returns the necessary data.

Crystal Reports is doing all of the calculations. The purpose of the report is to tell how fast an event occurred. Those 88,000 records get grouped into 12-14 pages in Crystal Reports.

When the data is broken down to 6 months, or narrowed down in Crystal Reports, it works just fine.

It is bizarre because I would think 88,000 records would be a piece of cake.

Former Member
0 Kudos

If you can exec the SP outside of CR without any performance issues, then I would say that you are 100% correct: there is no need to modify the SP. From my own experience, I've run into "Deadlock Victim" issues, and NOLOCK has worked to resolve those...

The only other thing I can think of would be formulas or functions within the report itself. If the report is doing a lot of client-side number crunching, you may be running into memory issues.

Jason

Former Member
0 Kudos

Janderz,

I'm with Carl here: 88,000 records shouldn't be an issue. Just taking a guess on the error... it sounds like it could be a record locking problem. If you have the ability to modify the SP and you can live with "dirty reads", I'd try adding NOLOCK table hints to the SP (assuming that your DB is SQL Server).
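As a sketch of the hint (table and column names below are hypothetical placeholders, not the real SP), NOLOCK is applied per table in the FROM/JOIN clauses:

```sql
-- Hypothetical table names; adjust to match the actual SP.
-- WITH (NOLOCK) reads uncommitted rows ("dirty reads"), so the query
-- never waits on row locks held by writers.
SELECT e.EventId, e.StartTime, e.EndTime
FROM dbo.Events AS e WITH (NOLOCK)
INNER JOIN dbo.EventTypes AS t WITH (NOLOCK)
    ON t.EventTypeId = e.EventTypeId
WHERE e.StartTime >= @StartDate
  AND e.StartTime <  @EndDate;
```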

Just an idea...

Jason

Former Member
0 Kudos

What is the error message that you're getting? 88,000 records isn't really a whole lot...

Carl