Incorrectly identifying "...not supported in current version of SQL Azure"?

Aug 28, 2010 at 12:17 AM

 

Throw my hat in for fixing the "Could not find number of records copied. File will not be uploaded." issue as well.

Along with that, there seem to be scripts that the MW had issues with which aren't actually a problem:

StoredProcedure [dbo].[sp_delete_schedule] -- sp_delete_schedule is not supported in current version of SQL Azure **
StoredProcedure [dbo].[sp_update_schedule] -- sp_update_schedule is not supported in current version of SQL Azure
StoredProcedure [dbo].[sp_update_attachment] -- File_Name is not supported in current version of SQL Azure

I was able to create them all manually.

  o sp_delete_schedule is the name of a stored procedure.
  o sp_update_schedule is the name of a stored procedure.
  o File_Name was a column name in a table.


** I re-ran the process and sp_delete_schedule went through.  Not sure what changed.  It seems to give different results depending on whether I go Back and Next or exit and restart.

Coordinator
Aug 29, 2010 at 1:55 AM

Hi,

In regards to “Could not find number of records copied”, the problem is that I parse the output from BCP and look for the rows copied. If you look in the SQLAzureMW.exe.config file you will see the Regex for this (look at the key BCPRowsCopied). The problem usually is that my system is set up in English, and if your system is set up for a different language, then the output from BCP will reflect that language. This means that you would have to modify the Regex search string to reflect your language. The best way to do this is to run BCP from a command line against one of your tables and, at the end, look at the "rows copied" line and make BCPRowsCopied match it. If you want, copy the last part of your BCP output and send it to me and I will figure out your regex string.
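For example, if your machine were set up for German, the last line of the BCP output might read something like "82 Zeilen kopiert." instead of "82 rows copied." (that German text is just my guess for illustration, so check your actual BCP output), and the config entry would then need to look roughly like this:

        <add key="BCPRowsCopied" value="[0-9]+\sZeilen kopiert" />               <!-- Regex Search -->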

In regards to sp_delete_schedule and sp_update_schedule, these are stored procedures in MSDB (which does not exist in SQL Azure, thus the check for them). If you have your own stored procedures with the same names, then just ignore the message from SQLAzureMW and move on.

As far as sp_update_attachment goes, I don’t see where I am even looking for it. Are these errors coming from the evaluation part of SQLAzureMW or from when SQLAzureMW is actually executing your script against SQL Azure?

 Regards,
George

Aug 29, 2010 at 2:48 AM

1. I'm not all that familiar with BCP, but I think this might get you what you want.

bcp "SELECT Title FROM table" queryout "table.txt" -S xxxxx.com -U xxxx -P xxxxx

Enter the file storage type of field Title [nvarchar]:
Enter prefix-length of field Title [2]:
Enter field terminator [none]:

Do you want to save this format information in a file? [Y/n]
Host filename [bcp.fmt]:

Starting copy...

82 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total     : 62     Average : (1322.58 rows per sec.)

bcp.fmt:

9.0
1
1       SQLNCHAR      2       128     ""   1     Title          SQL_Latin1_General_CP1_CI_AS

2. Understood, I will ignore the "sp_update_schedule" messages.  I see it created the procedure scripts anyway.

3. It looks to be related to a field name, File_Name.  Perhaps it's a reserved word.  I can get around it, so no problems there.

Thank you for your assistance.  This is certainly a valuable tool when migrating to SQL Azure.

Coordinator
Aug 29, 2010 at 3:07 AM

Very good.  In the BCP output, you can see "82 rows copied".  The Regex search string is "[0-9]+\srows copied", which should find that line; I then pull the number 82 out of it to record the number of records exported.  Then when I upload, I make sure that 82 records were uploaded.  That is the reason for looking for the number of records copied.  Now, all that said, your output looks just like mine, so I don't see why it can't be found.  Do you get the same error message for every single table, or is it just one specific table that is causing you problems?

 

Aug 29, 2010 at 3:22 AM

I checked and BCPRowsCopied looks good:

        <add key="BCPRowsCopied" value="[0-9]+\srows copied" />               <!-- Regex Search -->

I tried Analyze and Migrate / SQL Database on a single, simple table with no foreign keys.

Process started at 8/28/2010 8:20:56 PM -- UTC -> 8/29/2010 3:20:56 AM ...
Using BCP to get data from table dbo.tblNotes
*
Could not find number of records copied. File will not be uploaded.

bcp.exe:  unknown option d
usage: bcp.exe {dbtable | query} {in | out | queryout | format} datafile
  [-m maxerrors]            [-f formatfile]          [-e errfile]
  [-F firstrow]             [-L lastrow]             [-b batchsize]
  [-n native type]          [-c character type]      [-w wide character type]
  [-N keep non-text native] [-V file format version] [-q quoted identifier]
  [-C code page specifier]  [-t field terminator]    [-r row terminator]
  [-i inputfile]            [-o outfile]             [-a packetsize]
  [-S server name]          [-U username]            [-P password]
  [-T trusted connection]   [-v version]             [-R regional enable]
  [-k keep null values]     [-E keep identity values]
  [-h "load hints"]         [-x generate xml format file]

BCP output file: "c:\SQLAzureMW\BCPData\dbo.tblNotes.dat"

Analysis completed at 8/28/2010 8:21:02 PM -- UTC -> 8/29/2010 3:21:02 AM
Any issues discovered will be reported above.
Total processing time: 0 hours, 0 minutes and 6 seconds

###

Looks like "unknown option d" is the issue.  Perhaps it's calling an older version of bcp.  I have SQL 2005, 2008 and 2008 R2 installed on my machine.

If it's calling the old version of bcp, is there a way in the .config file to force it to use the 2008 R2 version?  Or perhaps I can copy bcp.exe into the same directory.

Thanks George!

Coordinator
Aug 29, 2010 at 1:25 PM

I do believe that you are right!!!  When I bring up a command window, here is the information I get from BCP:

C:\Users\ghuey>bcp /?
usage: bcp {dbtable | query} {in | out | queryout | format} datafile
  [-m maxerrors]            [-f formatfile]          [-e errfile]
  [-F firstrow]             [-L lastrow]             [-b batchsize]
  [-n native type]          [-c character type]      [-w wide character type]
  [-N keep non-text native] [-V file format version] [-q quoted identifier]
  [-C code page specifier]  [-t field terminator]    [-r row terminator]
  [-i inputfile]            [-o outfile]             [-a packetsize]
  [-S server name]          [-U username]            [-P password]
  [-T trusted connection]   [-v version]             [-R regional enable]
  [-k keep null values]     [-E keep identity values]
  [-h "load hints"]         [-x generate xml format file]
  [-d database name]

C:\Users\ghuey>bcp /v
BCP - Bulk Copy Program for Microsoft SQL Server.
Copyright (C) Microsoft Corporation. All Rights Reserved.
Version: 10.50.1600.1

C:\Users\ghuey>

You will notice that mine has the [-d database name] option.  You can also see where I did a bcp /v to get the version.  Just out of curiosity, when you type in bcp /v, what version do you get?

Thanks very much for your help! George

Aug 29, 2010 at 8:56 PM

Hi George!

Yes, that was the ticket!  I updated my PATH setting in Environment Variables, rebooted, and that took care of the problem.
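For anyone else who hits this, something like the following from a command prompt shows which copy of bcp.exe gets picked up first on the PATH (the install path below is just the default 2008 R2 location on my machine, so yours may differ):

where bcp
rem the 2008 R2 copy lives here by default:
rem C:\Program Files\Microsoft SQL Server\100\Tools\Binn\bcp.exe
bcp /v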

While poking around, I did see that Tasks > Generate SQL Scripts in 2008 R2 offers the following:

  Script for the database engine type: SQL Azure Database
  Types of data to script: Schema and data

Could you fill me in on the advantages (or disadvantages) of using this method over the SQL Azure Migration Wizard?  The database I am transferring is 1.5 GB, with the possibility of moving another database which is 15 GB.

Thank you for your time and insights George.  They are much appreciated.

Aug 29, 2010 at 8:59 PM

As a side note, I suspect the backup/restore function (even if limited) in SQL Azure is pretty close to being completed, which would make life easier all around.

Would you kindly let me know if you have heard any updates on your end regarding that?

Thanks again!

Coordinator
Aug 29, 2010 at 10:18 PM

Great!!!  This is good to know.  In regards to "Generate SQL Scripts" in SSMS, it works pretty well except for moving data.  You will not want to script data if you have a lot of it (which it looks like you do) because it generates an insert statement for each row of data.  I tried that once on a very large table and it killed my machine (ran out of memory).  The only real problem that I have with it is that if you have something that is not compatible with SQL Azure, then the “Generate SQL Scripts” wizard flags an error and stops. You have to save the error off to a report, look at the error, and resolve it before you can move on.

Another method that I like is, instead of using “Generate SQL Scripts” in SSMS, to use “Export Data…”. It will export both schema and data, and you can target SQL Azure. Note that when targeting SQL Azure, you must select “.Net Framework Data Provider for SqlServer” as the destination data source. Your User ID must be in the format of username@server, the Data Source will be your SQL Azure server (i.e. server111.data.int.mscds.com), and the Initial Catalog will be your target SQL Azure database. Note that you will have to create the database in SQL Azure before you run the “Export Data…” wizard. You can do this via the SQL Azure portal or SSMS (i.e. CREATE DATABASE BusinessProcess (maxsize=1 gb, edition='web')). You can select which tables to copy. Note that if this wizard runs into errors, it will flag them as errors and move on (unlike “Generate SQL Scripts”).

Now, the one thing that will get you with both of these wizards is that they do not check to see if you have a clustered index on the tables. So, before you can move your data via either one of these, you need to make sure that all tables have a clustered index. Since SQLAzureMW looks at your tables, you can use SQLAzureMW to identify which tables do not have a clustered index, add clustered indexes to the tables, then run the other wizards.
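If it helps, here is a rough T-SQL sketch you could run against your source database to find the tables that do not have a clustered index (the index and column names in the second statement are just placeholders; pick whatever key makes sense for each table):

-- list user tables that have no clustered index (heaps)
SELECT s.name AS schema_name, t.name AS table_name
FROM sys.tables t
JOIN sys.schemas s ON s.schema_id = t.schema_id
JOIN sys.indexes i ON i.object_id = t.object_id
WHERE i.index_id = 0;

-- then add a clustered index to each one, for example:
CREATE CLUSTERED INDEX IX_tblNotes_NoteID ON dbo.tblNotes (NoteID);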

In regards to backup / restore, I don’t know when these will be released. But, that said, you can do a CREATE DATABASE copy_db AS COPY OF source_db (see http://blogs.msdn.com/b/sqlazure/archive/2010/08/24/10053883.aspx for more information).
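For example (the database names here are just placeholders), and if I remember right you can watch the progress of the copy while connected to master:

CREATE DATABASE BusinessProcess_copy AS COPY OF BusinessProcess;

-- while connected to master, this should show the state of the copy
SELECT * FROM sys.dm_database_copies;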

I hope this helps.
Regards,
George

Aug 30, 2010 at 7:51 PM

Hi George,

Thanks for the tips and insights.  I see what you mean with Generate SQL Scripts.  I created an unwieldy file on just one table.

I got the deprecated features out of my code and all is working well now with the Migration Wizard.

A couple minor notes on Migration Wizard:

1. If I click Back, then Next, and re-process the Script Wizard Summary, it executes but does not overwrite the original script that is executed at the destination server.  You have to exit and restart the program for that.  This may cause some confusion when making changes to fix stored procedures and such.

2. The Delete Database option does not have any warning prompts. :)

3. Resizing does anchor the Connect to Server, Back, Next, and Exit buttons to the bottom right.

4. Tab order in the Connect to Server window is out of sequence.


CREATE DATABASE ... AS COPY OF makes a world of difference! Cheers!