
Need suggestions for hardware on an integrity check platform

Former Member
2,657

One of my primary job functions is to validate customer databases and do the occasional recovery after assertion failures. I obtain a copy of the database, put it on my server, and run dbvalid against it. I'm trying to optimize this process while not allocating resources that won't be used, so I'm hoping someone can shed some light on the validation process for me.

My validation platform is virtualized. Most of our customers are on a V11 database, while some are still on V9. It seems that dbvalid only uses one processor core at a time, regardless of the machine specs, though with both V9 and V11 databases it will use whatever RAM you throw at it. Is there a suggested amount of RAM, or a point beyond which more RAM no longer speeds up validation? And will it always use only one core, no matter how many are available?
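For context, a typical invocation looks something like the following sketch. The server name, file path, credentials, and cache size are placeholders, not values from this thread; `-n` (server name) and `-c` (initial cache size) are standard SQL Anywhere server options, and `dbvalid -c` takes an ordinary connection string.

```shell
# Start a dedicated validation server with an explicit cache size (-c),
# then point dbvalid at it. All names and sizes are illustrative.
dbsrv11 -n validate_srv -c 12G /data/customer.db
dbvalid -c "ENG=validate_srv;UID=dba;PWD=sql"
```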

Accepted Solutions (1)

MarkCulp
Participant

SA versions prior to v12 will only use one processor core to process the validate database request. In version 12 the validation procedure was rewritten to take advantage of SA's parallel query execution capabilities and thus may use more than one core when validating a database.

The amount of memory required to do the validation will depend on the database size, the individual table sizes, and the number and sizes of the indexes on those tables. Generally you want to be able to hold each table and all of its associated indexes in memory while that table is being validated. If you size the server's cache to at least the size of the database, you will likely be close to the optimal amount of memory allocated to the server.

The amount of RAM for the computer will need to take into account the database cache size (which must fit fully into memory), as well as the OS requirements for the other programs that you need to have running.
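As a back-of-the-envelope sketch of the sizing rule above (the database size and OS overhead below are illustrative numbers, not figures from this thread):

```shell
# Rule of thumb: cache >= database size, total RAM = cache + OS headroom.
db_size_gb=300        # size of the database file being validated (example)
os_overhead_gb=4      # assumed headroom for the OS and other processes

cache_gb=$db_size_gb              # cache at least the size of the database
ram_gb=$((cache_gb + os_overhead_gb))

echo "Suggested cache: ${cache_gb} GB, total RAM: ${ram_gb} GB"
```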

Former Member
0 Kudos

Thank you for the explanation. Many of the databases that I receive are over 100 GB and some are upwards of 300 GB, so that explains a lot.

VolkerBarth
Contributor
0 Kudos

@Mark: Would it make sense to use a v12 dbvalid to validate v11 databases in order to use parallel execution - or is it generally recommended to let the dbvalid version fit the particular database version?

(And yes, for v9 databases, v12 won't work, that's understood.)

MarkCulp
Participant
0 Kudos

I can't think of any reason not to use a v12 server to validate a v10 or v11 database.

Answers (1)

Former Member

I wanted to follow up on this question with some new information I have found that is worth sharing if anyone else has a similar job function.

My old database integrity platform was Windows based. It had 16 GB RAM, 4 CPUs, and validated databases on a dedicated 300 GB 15K SAS drive. This worked fine for my needs, though around this time last year one of our customers had a 250+ GB database.

I have since switched to a Linux virtual machine (RHEL 6) for validations. The results have been amazing. It used to take up to two weeks to validate the larger v11 databases on Windows, even when using the v12 or v16 dbvalid instead of the v11 dbvalid. With the same VM specs on Linux, it takes 6 hours. All I did was switch to a different OS.