Error 40552 large table upload to Azure

May 19, 2014 at 9:49 PM

I am migrating a DB with a very large table, and I keep getting error 40552 because the transaction log exceeds Azure's limit. I tried modifying the script to rebuild the index with ONLINE=ON, but then I get error 2725 because the index contains a Data column of type image.

Any suggestions would be appreciated - Thanks
May 20, 2014 at 1:56 AM
Ouch. How many rows are you talking about? Do you have triggers on your table?

I tell you what, go to the directory where you put SQLAzureMW, edit the SQLAzureMW.exe.config file, and find "ChunkSize". You will see that it is set to 100000000. Change that to 1000000 (or 100000 if you wish). About the largest table that I have played with is 250,000,000 records. I think I had it broken down into 100,000-record chunks. Note that it took DAYS to upload that table. What I saw is that BCP starts out around 2,500 records per second, but after 50 million records or so it was processing 1,000 records per second, and eventually it ended up processing 20 records per second. It is probably because BCP had to find the start record each time, and it just spins through the records. Anyway, change the chunk size and see if that helps.
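For anyone else hunting for the setting: the ChunkSize key lives in the appSettings section of SQLAzureMW.exe.config. The fragment below is illustrative (other keys in the real file are omitted), but the key name and the values come from the advice above:

```xml
<!-- SQLAzureMW.exe.config (fragment; other settings omitted) -->
<configuration>
  <appSettings>
    <!-- Rows BCP uploads per chunk; default is 100000000.
         Lower it to 1000000 (or 100000) for very large tables. -->
    <add key="ChunkSize" value="1000000" />
  </appSettings>
</configuration>
```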

May 20, 2014 at 9:29 PM
Hello George,

It is not really a huge DB: 118,352 records and about 16 GB. The size is mostly due to image-type columns. Reducing the chunk size resulted in a TCP connection reset on Azure; perhaps the throttling did not like the pause between chunks? I am now trying to reduce the bcp batch size from -b 10000 to -b 5000 to see if smaller batches prevent the transaction log from blowing up. I will post back if it is successful.
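For anyone trying the same workaround: the -b switch tells bcp how many rows to commit per transaction, which is what keeps the Azure transaction log under the 40552 limit. The command below is only a sketch; the database, table, file, and server names are placeholders:

```
REM Placeholder names; only the -b batch-size switch is the point here.
bcp MyDb.dbo.MyLargeTable in MyLargeTable.dat -n -b 5000 ^
    -S myserver.database.windows.net -U myuser@myserver -P mypassword
```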

Thanks for the reply and help!
May 25, 2014 at 10:35 PM
Hi Joey,

How did it go?

May 27, 2014 at 1:13 PM
Hello George,

Adjusting the batch size to 5000 did the trick, and it did not seem to slow anything down either. A helpful feature for the application would be to retry the last batch with a lower batch size if error 40552 is encountered during the upload.
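The retry-on-40552 feature suggested above could work roughly like this. This is a hypothetical sketch, not SQLAzureMW code: upload_batch is a stand-in for a BCP batch upload (the fail_above parameter only simulates the transaction-log limit), and the loop halves the batch size and retries the same batch whenever error 40552 comes back.

```python
# Sketch of the suggested feature: on error 40552 (transaction log
# limit), retry the failed batch with a smaller batch size instead of
# aborting the whole upload. All names here are hypothetical.

TRANSACTION_LOG_ERROR = 40552  # Azure SQL error for exceeding the log limit

class UploadError(Exception):
    def __init__(self, code):
        super().__init__(f"upload failed with error {code}")
        self.code = code

def upload_batch(rows, batch_size, fail_above=None):
    """Stand-in for one BCP batch. Simulates error 40552 whenever the
    batch size exceeds the pretend transaction-log limit."""
    if fail_above is not None and batch_size > fail_above:
        raise UploadError(TRANSACTION_LOG_ERROR)
    return len(rows)

def upload_with_backoff(rows, batch_size=10000, min_batch=500, fail_above=None):
    """Upload rows in batches, halving the batch size on error 40552."""
    uploaded = 0
    i = 0
    while i < len(rows):
        batch = rows[i:i + batch_size]
        try:
            uploaded += upload_batch(batch, batch_size, fail_above)
            i += len(batch)  # advance only after the batch succeeds
        except UploadError as e:
            if e.code != TRANSACTION_LOG_ERROR or batch_size <= min_batch:
                raise  # unrelated error, or we can't shrink any further
            batch_size //= 2  # retry the same batch at half the size
    return uploaded
```

With fail_above=5000 the first 10000-row attempt fails, the size drops to 5000, and the rest of the upload proceeds at the smaller size, mirroring the -b 10000 to -b 5000 fix above.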