Incremental data changes?

Oct 31, 2013 at 6:41 AM
Good day,

Would it be possible to do incremental updates between a local SQL Server database and SQL Azure? Imagine I do an initial synchronization that works fine. After that, I would like to send only the records added to one of the tables since the last update to the Azure db. Is it possible to specify a condition to limit the data to be scripted (BCP)?

I'm not expecting the tool to keep track of what has changed or not; I have that information. So if I can specify the WHERE condition for the data to be extracted, that will do it.

From other posts in the forums, I know that the tool will happily process the data script even if the table already exists.

Finally, if all of the above is possible, can I do this as a batch operation (non-interactive)? Even better, would this functionality be supported in the SQLAzureMWUtils class library mentioned on the home page?

Thanks a lot in advance and thank you for a great tool.

Esto
Coordinator
Oct 31, 2013 at 6:07 PM
Hi Esto,

SQLAzureMW is not set up to pull only new records and upload them. It basically grabs everything in a table and sends it to the target server. One thing you might look at is SQL Data Sync (note that it is still in preview mode). Another thing you can do is set up an SSIS package to sync your data. A third thing you can do is create a batch file that uses BCP with a SELECT statement to pull just the new records and then upload them to the target server (see the sketch below). Note that this last option really only works for new records, not deleted or updated records.
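
As a rough illustration of that third option, here is a minimal batch file sketch. The server names, database, table, column, and cutoff date are all placeholders you would replace with your own values; the switches shown (queryout, in, -S, -T, -U, -P, -c) are standard bcp options:

    @echo off
    rem Export only the new rows from the local table (placeholder names throughout).
    rem -T uses Windows authentication against the local server; -c writes character format.
    bcp "SELECT * FROM MyDb.dbo.Orders WHERE CreatedDate > '2013-10-30'" queryout NewOrders.dat -S localhost -T -c

    rem Import the extracted rows into the table on SQL Azure.
    rem Azure requires SQL authentication, with the login in user@server form.
    bcp MyDb.dbo.Orders in NewOrders.dat -S myserver.database.windows.net -U myuser@myserver -P mypassword -c

The WHERE clause in the queryout SELECT is exactly the condition you mentioned having on hand; bcp will extract only the rows that match it.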

Now, you do have another option if you wish. That is to have a temporary local database into which you copy only the new records that you want to upload to SQL Database. Then run SQLAzureMWBatchBackup to export that database (note that your data and script will be saved to your specified directory). Then run SQLAzureMWBatchUpload (pointing to the script generated by SQLAzureMWBatchBackup) to upload your data to the target server. Both SQLAzureMWBatchBackup and SQLAzureMWBatchUpload are command-line applications, so you can easily put them in a batch file.

As another side note, in the SQLAzureMWBatchBackup directory you will find a file called ObjectSelector.xml. With this file, you can tell SQLAzureMWBatchBackup which objects (views, stored procs, tables ...) you want to back up. In your case, you could set everything to false except tables and then specify just the specific tables you want to work with. So the only real work you would have to do is a local SQL script that deletes the records from your temporary sync table and then copies the new records into it (see the sketch below); after that, run SQLAzureMWBatchBackup and SQLAzureMWBatchUpload.
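
For that staging step, a minimal T-SQL sketch might look like the following. The database, table, and column names (SyncStage, MyDb, Orders, CreatedDate) and the cutoff variable are all hypothetical placeholders, and in practice @LastSync would come from wherever you track the previous sync point:

    -- Clear out the staging copy left over from the previous sync run.
    TRUNCATE TABLE SyncStage.dbo.Orders;

    -- Copy only the rows created since the last sync into the staging database.
    DECLARE @LastSync datetime = '2013-10-30';
    INSERT INTO SyncStage.dbo.Orders (OrderID, CustomerID, CreatedDate)
    SELECT OrderID, CustomerID, CreatedDate
    FROM MyDb.dbo.Orders
    WHERE CreatedDate > @LastSync;

SQLAzureMWBatchBackup would then export the SyncStage database, so only the newly copied rows ever travel to the target server.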

I hope this helps,
George
Oct 31, 2013 at 6:55 PM
Hi George,

Thank you for your thorough response. I have to think about all the alternatives you are giving me. I will post back here once I experiment a bit more with this great tool.

Thanks again and I really appreciate the time you took to reply to my question.

Esto