SQLAzureMWBatchUpload Config - use RAW input?

Jun 8, 2015 at 8:03 PM
So our current process uses BatchBackup to generate the files and then SQLAzureMWBatchUpload to upload them. We're running into an issue with different code pages. From what I can see, the backup portion is successfully generating the files using the RAW format, but it seems that the BatchUpload portion is not using RAW. This is causing problems when we try to copy over ASCII characters outside the standard A-Z/0-9 range.

Is there a config setting to force BatchUpload to use RAW format when running the BCP IN commands?
Jun 9, 2015 at 3:10 PM
Edited Jun 9, 2015 at 3:12 PM
Yes. The BCP IN part of the BCP command is generated by SQLAzureMWBatchBackup. So, if you edit the SQLAzureMWBatchBackup.exe.config file, you should find BCPArgsIn and BCPArgsOut. You can set them to look something like this:
<add key="BCPArgsIn" value="{0} in {1} -E -n -C RAW -b 1000 -a 4096"/>    <!-- BCP in command parameters -->
<add key="BCPArgsOut" value="&quot;{0}&quot; out {1} -E -n -C RAW"/>      <!-- BCP out command parameters -->
Or change the parameters to meet your needs.
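
For reference, the {0} placeholder is filled in with the table name and {1} with the data file path, so with the settings above a generated BCP IN argument string would come out looking roughly like this (the table name and path here are only illustrative):

[dbo].[MyTable] in "c:\SQLAzureMW\BCPData\dbo.MyTable.dat" -E -n -C RAW -b 1000 -a 4096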

I hope this helps. Let me know if it doesn't.

Jun 9, 2015 at 3:24 PM
Edited Jun 9, 2015 at 3:26 PM
Does this work even when the Backup config file isn't in the same folder as the Upload exe? We are using an older version and this makes sense, but I'm not sure if I should copy the backup config file into the upload folder or do something else to get them to work together.

Did you mean to add BCPArgsIn to the SQLAzureMWBatchUpload.exe.config file? Would that even work?
Jun 9, 2015 at 4:10 PM
Yes. There is really nothing you need to do with the Upload exe file. During the export process, SQLAzureMWBatchBackup creates a SQL file that has your schema and BCP commands in it (look at the bottom of the file). The BCP commands are generated during this export process; SQLAzureMWBatchUpload has nothing to do with the actual generation of the commands. It just uses what SQLAzureMWBatchBackup gives it.

So, no, don't look for BCPArgsIn in SQLAzureMWBatchUpload.exe.config. It really should not be in that config file anyway. In my next release, it will be gone.

Jun 9, 2015 at 4:18 PM
OK - I made a copy of the backup config and put it in the Upload folder. Hopefully that will do the trick. I'll know tomorrow. :) (And BCPArgsIn wasn't an option in the Upload config - I was just wondering if it should have been there.)

I'm assuming that if the upload exe doesn't see the backup config, it doesn't use those options. Is that correct?
Jun 9, 2015 at 8:07 PM

Sorry, I left out part of what I was trying to convey. You need to edit the SQLAzureMWBatchBackup.exe.config file as noted above and rerun SQLAzureMWBatchBackup so that BCP exports the data in the correct format. By rerunning SQLAzureMWBatchBackup, not only will BCP export the data in the format you want, it will also write out the BCP commands that the next process uses for uploading in that same format. So there is really nothing you have to do in regards to SQLAzureMWBatchUpload. You are right, copying the backup config file to the upload folder will do you no good. And BCPArgsIn is not needed (and thus ignored) in SQLAzureMWBatchUpload.exe.config.

I hope this helps,
Jun 9, 2015 at 8:14 PM
In that case, we may have something else going on. Was this behavior fixed/updated along the way? Those parameters are already in my current settings, so the data is exported via the BatchBackup process (exes in one folder) and then uploaded using the BatchUpload process (again, in their own folder), but we're still seeing issues with the data importing into the target DB due to code page mismatches.

Is there a way to preview the BCP IN commands that are being sent? That would help us troubleshoot this a little better.

The version we're currently running in our Prod environment is, I realize, quite old, but I'm hesitant to blanket-replace these bits without a backup and without knowing that nothing else will be affected when we do that. I assume I'd just need to replace the DLLs and EXEs, but I'm not sure what the proper upgrade process should be. I also don't want to lose the export configuration we have, since I didn't set that up.
Jun 9, 2015 at 10:06 PM

Upgrading should not be an issue. You don't have to replace anything because there is really nothing to install. Just download the latest version of the SQLAzureMW tools to a new folder and unzip it. You might have to open Properties on the exes and unblock them, but that is all. Then you just run like you always do and point to your files. As far as the BCP commands go, when you run through SQLAzureMWBatchBackup, you specify a TSQL output file. Edit that file and look for "-- BCPArgs:". You will see something like this:

-- BCPArgs:830:[dbo].[Orders] in "c:\SQLAzureMW\BCPData\09-June-2015 0942\dbo.Orders.dat" -E -n -C RAW -b 1000 -a 4096
-- BCPArgs:2155:[dbo].[Order Details] in "c:\SQLAzureMW\BCPData\09-June-2015 0942\dbo.Order Details.dat" -E -n -C RAW -b 1000 -a 4096

That is your basic BCP command. The first part, "-- BCPArgs:", is the key that I look for. The number is the number of records found in that file; I use it to make sure the number of records I upload is the same as the number of records I downloaded. [dbo].[Orders] is the table I am uploading to, the next part is the file that has the data, and the last part is the BCP arguments.

I pretty much take that string as is and add target server, username and password.
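
For example, taking the first BCPArgs line above and adding the connection pieces, the actual upload command ends up looking roughly like this (the server name and credentials are placeholders, and depending on your setup you may also need to point BCP at the target database, e.g. by fully qualifying the table name):

bcp [dbo].[Orders] in "c:\SQLAzureMW\BCPData\09-June-2015 0942\dbo.Orders.dat" -E -n -C RAW -b 1000 -a 4096 -S <targetserver> -U <username> -P <password>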

Jun 9, 2015 at 10:17 PM
Well, that helped quite a bit. Apparently, I've been passing in the "-C RAW" option to export/import in RAW format, but we're still seeing ASCII characters not matching between our source and target servers.

'ZÄI?œè)€64_¦s' --Source
'ZÄI_?oè_)_64_¦s' --Target after copy

As you can see, the 4th character was remapped, the œ was mapped to o, and so on. Is there another option I should be considering instead of RAW for this case?

I appreciate the quick responses. We've been troubleshooting this for a while and have been having issues with this particular set of values that we encrypted on our side, but aren't translating when we try to copy to another environment for UAT.
Jun 10, 2015 at 2:47 PM
I don't know the answer to this, but I have posted a query within Microsoft hoping that somebody will have an idea. Basically, at this point, I would take SQLAzureMW out of the equation and start using BCP directly from a command window. If one command does not work, truncate the target table and try a different parameter. I would also post this question on the MSDN SQL Forum: https://social.msdn.microsoft.com/Forums/sqlserver/en-US/home. Something else I would try: can you do a backup and restore to a dummy database and drop everything but one problem table? That way, you could back up the dummy database and send it to me and let me play with it as well.
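
For example, a manual round trip on one table from a command window might look something like the lines below (the server names, database names, credentials, and paths are all placeholders, not values from your setup; rerun the truncate before each retry with a different parameter):

bcp SourceDb.dbo.Orders out "c:\temp\dbo.Orders.dat" -S <sourceserver> -U <user> -P <password> -n -C RAW
sqlcmd -S <targetserver> -d TargetDb -U <user> -P <password> -Q "TRUNCATE TABLE dbo.Orders"
bcp TargetDb.dbo.Orders in "c:\temp\dbo.Orders.dat" -S <targetserver> -U <user> -P <password> -E -n -C RAW -b 1000 -a 4096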

In regard to the BCP commands, I actually write the full BCP command I use to a txt file. Look in the folder that was created by SQLAzureMWBatchBackup and look for a txt file. You can just find the BCP command (go to the bottom of the file and you should see a summary of all BCP commands) and use that to experiment with.

Jun 10, 2015 at 5:58 PM
Thanks for your patience. I think we actually did have this corrected by manually setting the RAW codepage, but something was interfering with our upload to AWS. I tried to upgrade the bits last night (without success - kept getting a "file not found" error) but in the process found that we were timing out when connecting to AWS during the upload phase.
After some work with the hosting company, I think everything was properly opened up for that server to push changes to our AWS instance and after the last round with the RAW option enabled, the passwords were in sync.

I'll keep trying to get the latest version working on that server, but it's not nearly as high a priority at the moment.

Thank you for your time and for pointing me at that script with all of the details. That was helpful to troubleshoot what was supposed to happen.