Automate SQL Database Migration Wizard

Jun 23 at 2:17 AM
Is there a way to have the wizard run unattended and on a regular schedule?

Thank you.
Coordinator
Jun 25 at 2:57 PM
If you are looking at something like a test/dev process, where you delete the old database after destructive testing and then want to upload a baseline master database, then yes. That is what SQLAzureMWBatchBackup and SQLAzureMWBatchUpload are for. They are both command-line driven, so you can put them in a batch process and run them whenever you want.
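
For example (a minimal sketch only: the argument lists are placeholders and the task name and path are made up, so check each tool's documentation and .exe.config for the real command-line options), you could wrap both executables in a .cmd file and schedule it with the Windows Task Scheduler:

REM refresh.cmd -- placeholder arguments; see each tool's
REM documentation and .exe.config for the actual options
SQLAzureMWBatchBackup.exe <your backup arguments>
SQLAzureMWBatchUpload.exe <your upload arguments>

REM schedule it to run nightly at 2:00 AM ("DBRefresh" and the
REM C:\jobs path are hypothetical names)
schtasks /create /tn "DBRefresh" /tr "C:\jobs\refresh.cmd" /sc daily /st 02:00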

HTH,
George
Jun 28 at 10:17 PM
Thank you, this was very helpful.

My next question: when using SQLAzureMWBatchBackup, can I select just two tables from a database containing several tables?
I know I can do that with the SQL Azure Migration Wizard, but I want to do this using SQLAzureMWBatchBackup.
I have looked closely at the SQLAzureMWBatchBackup config file and can't find anything that lets me pick specific tables from a database.

Thank you for your help.
Coordinator
Jun 29 at 2:50 AM
Hi,

Check out ObjectSelector.xml. Look through the instructions at the top of the file, then change this section:
<Tables script="true">
    <SQLObjects>
        <SQLObject script="true" regex="true" schema=".*" name=".*" />
    </SQLObjects>
</Tables>
You can have one or many SQLObject nodes.
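
For example, to script just two specific tables, something like this should work (dbo.MyTable1 and dbo.MyTable2 are placeholder names, and I am assuming regex="false" makes the schema and name match literally; the instructions at the top of the file spell out the exact matching rules):

<Tables script="true">
    <SQLObjects>
        <!-- placeholder table names; regex="false" assumed to mean a literal match -->
        <SQLObject script="true" regex="false" schema="dbo" name="MyTable1" />
        <SQLObject script="true" regex="false" schema="dbo" name="MyTable2" />
    </SQLObjects>
</Tables>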

Hope this helps,
George
Jun 29 at 4:06 PM
Hi,

Thanks so much for this information. It looks like exactly what I need to use.
As soon as I give it a try, I will let you know how I make out.

And thanks so much for the quick reply.

Regards,
Larry
Jun 29 at 8:47 PM
Hi,

I was able to modify the ObjectSelector.xml file so that SQLAzureMWBatchBackup uses only the objects of interest.
Thank you for that information.

Here is my next question. When I use SQLAzureMWBatchUpload, I notice that I get a message that a database is being created.
It is not my intention to create a database or even to create any tables.
I only want two specific tables in the target database updated with new data from the source database; the other tables in the target database are to be left untouched.
However, when I use SQLAzureMWBatchUpload, I see that other tables within the target database are actually deleted, which is not what I want.

Many thanks for all of your help.

Regards,
Larry
Coordinator
Jun 29 at 9:39 PM
Hi Larry,

There is a -d flag that should be set to false (this tells the tool not to drop the database if it already exists). Alternatively, if you don't want to use the -d flag, edit the SQLAzureMWBatchUpload.exe.config file and look for:
    <add key="DropOldDatabaseIfExists" value="true"/>       <!-- -d option -->
You will see that the default value is set to "true". Change this to "false" and it will no longer drop your database.
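In other words, after the edit the line should read:
    <add key="DropOldDatabaseIfExists" value="false"/>      <!-- -d option -->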

Hope this helps,
George
Jun 30 at 3:15 AM
Hi George,

Thanks so much for this timely reply.

Regards,
Larry
Jun 30 at 4:20 PM
Hi George,

I think I am at my last step in my process.
When I use SQLAzureMWBatchUpload against a table that already has data in it, I get an error that says "Duplicate Keys Not Allowed".
If the tables don't already exist, then everything works just fine.

I would really like SQLAzureMWBatchUpload to clear out all of the rows in the selected target tables before using BCP to upload fresh data into them.

Is there an easy way to do this?

Many thanks,

Larry
Coordinator
Jul 1 at 2:28 AM
Hi Larry,

Here is what I would do. I would set up a SQL script that had something like this:

TRUNCATE TABLE dbo.MyTable1
TRUNCATE TABLE dbo.MyTable2
GO

or

DELETE FROM dbo.MyTable1
DELETE FROM dbo.MyTable2
GO

and run this script against your target database first. Then I would run a second command that migrated your data.

The other way would be to write some kind of program that edits your SQL script, inserting the table-purging statements (followed by a GO) before the BCPArgs, and then runs SQLAzureMWBatchUpload. The first option sounds easier to me :-).
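
If you go with the first option, a minimal sketch of the batch sequence might look like this (the server, database, credentials, and file names are all placeholders; sqlcmd ships with SQL Server and can run a script against your target database):

REM purge.sql holds the TRUNCATE/DELETE statements shown above
sqlcmd -S myserver.database.windows.net -d TargetDB -U myuser -P mypassword -i purge.sql
REM then run the upload as usual (placeholder arguments)
SQLAzureMWBatchUpload.exe <your upload arguments>

One thing to keep in mind: TRUNCATE TABLE is faster, but it will fail if the table is referenced by a foreign key; DELETE FROM is the fallback in that case.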

Hope this helps,
George
Jul 1 at 3:46 PM
Hi George,

Thanks for your reply.
Actually, I was thinking of trying the same thing you suggested.

Your help has been most beneficial for our work.

Regards,
Larry