Time to extract 300GB Database


I have a team I'm working with that is running 10 SQL Anywhere databases.

They are running this on Ubuntu 18.04

SQL Anywhere is version 17.0.11.7312

They want to introduce replication so they can use it to move data to their Azure environment. Long story short, one of the databases is 300GB in size.

Currently they have been extracting for almost 24 hours and it looks like they are about 80% done.

Their machine has 8 cores, and for the most part it has just been using 1 core, with the occasional blip up to 2 cores. It never goes beyond that.

Is there any way to speed this process up? They are more or less throwing their hands up, worried that an extraction taking this long will cause issues, since the database runs at isolation level 3.

This is not a disk space issue from what I can tell. They have about 100TB available on the database partition.

Any thoughts or suggestions on what might speed this process up would be greatly appreciated.

Jeff Gibson Nashville, TN

Additional info - We are watching the core usage, and SQL Anywhere just jumped up to 8 cores, but it's only using about 12% of each core, and this only just started.

regdomaratzki
Product and Topic Expert

When you say "they are wanting to introduce replication" am I correct in assuming that you mean SQL Remote?
When you say "extracting" do you mean running dbxtract against the database?

I'm going to assume yes and answer the question, and will delete the answer if my assumptions are wrong.
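For context, extracting a remote user with dbxtract is normally a single command along these lines (the connection string, output directory, and remote user name below are placeholders, not details from the original post):

```shell
# Hypothetical dbxtract invocation: extract remote user "remote1" from the
# consolidated database into the directory "extract_out" (-v = verbose).
dbxtract -v -c "UID=dba;PWD=sql;SERVER=cons_srv;DBN=consol" extract_out remote1
```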

VolkerBarth
Contributor

Does the publication contain complicated article definitions, say with several joins based on SUBSCRIBE BY values? That can slow down extraction heavily.
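As an illustration of the kind of article definition meant here, consider a SUBSCRIBE BY expression that is a subquery joining to another table (the table and column names below are made up); dbxtract has to evaluate that subquery for the rows it unloads:

```sql
-- Hypothetical article with a join-based SUBSCRIBE BY value: each order
-- row is routed by the region of its customer, which the extraction must
-- resolve via the subquery while unloading.
CREATE PUBLICATION pub_orders (
    TABLE orders SUBSCRIBE BY (
        SELECT region FROM customers
        WHERE customers.cust_id = orders.cust_id
    )
);
```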


My apologies, Reg. Yes, they are using SQL Remote, and this is one-way replication only: production does sends only, and the two other environments do receives only.


It does not. The publication is basically a select-all that says send everything that happens.
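In other words, the publication is of the simple whole-table form, something like this sketch (the publication and table names are placeholders):

```sql
-- Hypothetical "send everything" publication: whole tables, no WHERE or
-- SUBSCRIBE BY clauses, so extraction is a straight unload of each table.
CREATE PUBLICATION pub_all (
    TABLE customers,
    TABLE orders,
    TABLE order_items
);
```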


Answers (1)


regdomaratzki
Product and Topic Expert

I would not suggest running dbxtract against an in-use production consolidated database; instead, run the extract against a copy of the consolidated database.

  1. Stop the consolidated database.
  2. Make a copy of the database file to another location, preferably another computer just to be paranoid.
  3. BE CAREFUL NEVER TO RUN DBREMOTE AGAINST THE COPY OF THE CONSOLIDATED.
  4. While the consolidated is still not running, start the DB using dbeng17, and execute the SQL that creates the new remote user, subscribes them to the publications, and then starts the subscriptions. Make sure that the message directory for this new user exists.
  5. BE CAREFUL NEVER TO RUN DBREMOTE AGAINST THE COPY OF THE CONSOLIDATED.
  6. Start the consolidated database again. As the database is being used and data is being added, when you run dbremote against the production version of the consolidated (NOT THE BACKUP!!), it will create messages for the new remote user to eventually apply.
  7. BE CAREFUL NEVER TO RUN DBREMOTE AGAINST THE COPY OF THE CONSOLIDATED.
  8. Extract the remote user from the backup copy of the consolidated database. Nobody else is using this database, so the resources it consumes and the time it takes are irrelevant.
  9. BE CAREFUL NEVER TO RUN DBREMOTE AGAINST THE COPY OF THE CONSOLIDATED.
  10. You can now delete the copy of the consolidated database, which will ensure that dbremote cannot be run against this database, because, you know, it no longer exists.
  11. Deploy the newly extracted remote database and run dbremote against the remote database. When dbremote runs, it will pick up messages from the consolidated database that include all the changes that have occurred on the consolidated since the copy of the consolidated database was taken in step 2.
  12. You now have two databases that are perfectly in sync, and the only downtime was the time to physically copy the database file of the consolidated database.
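The steps above can be sketched as a sequence of commands. Everything here (paths, server names, the remote user, publication, password, and message-type details) is a placeholder, and the exact SQL Remote setup depends on the message system and startup options in use at the site:

```shell
# 1-2. Stop the consolidated and copy the database file elsewhere.
dbstop -c "UID=dba;PWD=sql;SERVER=cons_srv"
cp /db/consol.db /backup/consol_copy.db

# 4. Start the COPY with the personal engine (dbeng17) and create the new
#    remote user there. (Assumes a FILE message type is already defined.)
dbeng17 /backup/consol_copy.db &
dbisql -c "UID=dba;PWD=sql;SERVER=consol_copy" "
    CREATE USER remote1 IDENTIFIED BY 'a_password';
    GRANT REMOTE TO remote1 TYPE FILE ADDRESS 'remote1';
    CREATE SUBSCRIPTION TO pub_all FOR remote1;
    START SUBSCRIPTION TO pub_all FOR remote1;"

# 6. Restart the production consolidated the way it is normally started.

# 8. Extract the remote user from the COPY; nobody else is using it.
dbxtract -v -c "UID=dba;PWD=sql;SERVER=consol_copy" /extract/remote1 remote1
```

Note that dbremote is never pointed at consol_copy.db in this sketch; only dbxtract runs against the copy.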

Reg