SQL Azure Migration failed

Jul 15 at 3:37 PM
I ran the SQL Server Migration Wizard to upload a 250GB database to Azure, but it failed. I'm getting this message for a couple of tables.

Starting copy...
SQLState = 23000, NativeError = 2627
Error = [Microsoft][SQL Server Native Client 11.0][SQL Server]Violation of PRIMARY KEY constraint 'PK_Orders'. Cannot insert duplicate key in object 'dbo.Orders'. The duplicate key value is (37).
SQLState = 01000, NativeError = 3621
Warning = [Microsoft][SQL Server Native Client 11.0][SQL Server]The statement has been terminated.
BCP copy in failed

Any ideas?
Coordinator
Jul 16 at 4:31 PM
Hi,

I have seen this when I have a computed column in the table. Check out the column types and see if you are doing anything special. Also make sure that you have no triggers turned on.
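If it helps, both can be checked from the catalog views. A sketch, assuming the table in question is dbo.Orders:

```sql
-- Any computed columns on the table?
SELECT name FROM sys.computed_columns
WHERE object_id = OBJECT_ID('dbo.Orders');

-- Any enabled triggers on the table?
SELECT name FROM sys.triggers
WHERE parent_id = OBJECT_ID('dbo.Orders') AND is_disabled = 0;
```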

Regards,
George
Jul 16 at 8:15 PM
Hi George,
I have no triggers nor do I have a computed column on the Orders table. I have scripted out the Orders table schema.


CREATE TABLE [dbo].[Orders](
[ID] [bigint] IDENTITY(1,1) NOT NULL,
[ParticipantID] [bigint] NOT NULL,
[ItemID] [int] NULL,
[InvoiceID] [int] NULL,
[OrdersTransactionLogID] [int] NULL,
[VendorOrderID] [varchar](50) NULL,
[CheckedReceipt] [bit] NOT NULL,
[StatusID] [int] NOT NULL,
CONSTRAINT [PK_Orders] PRIMARY KEY CLUSTERED
(
[ID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

GO

SET ANSI_PADDING ON
GO

ALTER TABLE [dbo].[Orders] ADD CONSTRAINT [DF_Orders_CheckedReceipt] DEFAULT ((0)) FOR [CheckedReceipt]
GO

ALTER TABLE [dbo].[Orders] WITH CHECK ADD CONSTRAINT [FK_Orders_Invoices] FOREIGN KEY([InvoiceID])
REFERENCES [dbo].[Invoices] ([ID])
GO

ALTER TABLE [dbo].[Orders] CHECK CONSTRAINT [FK_Orders_Invoices]
GO

ALTER TABLE [dbo].[Orders] WITH CHECK ADD CONSTRAINT [FK_Orders_OrdersTransactionLog] FOREIGN KEY([OrdersTransactionLogID])
REFERENCES [dbo].[OrdersTransactionLog] ([ID])
GO

ALTER TABLE [dbo].[Orders] CHECK CONSTRAINT [FK_Orders_OrdersTransactionLog]
GO

ALTER TABLE [dbo].[Orders] WITH CHECK ADD CONSTRAINT [FK_Orders_Participants] FOREIGN KEY([ParticipantID])
REFERENCES [dbo].[Participants] ([ID])
GO

ALTER TABLE [dbo].[Orders] CHECK CONSTRAINT [FK_Orders_Participants]
GO

ALTER TABLE [dbo].[Orders] WITH CHECK ADD CONSTRAINT [FK_Orders_StatusTypes] FOREIGN KEY([StatusID])
REFERENCES [dbo].[StatusTypes] ([ID])
GO

ALTER TABLE [dbo].[Orders] CHECK CONSTRAINT [FK_Orders_StatusTypes]
GO

I can share my screen with you, if you want to see the live errors.

My Skype ID is "toyins".


Thanks,
Toyin.
Coordinator
Jul 17 at 1:45 AM
Hi Toyin,

I would remove all constraints and turn off the identity column (just to make it as simple as possible). If the data imports, then you can turn everything back on. I hope you don't have to go through the export process every time you try this. You know that you can save the SQL script after the export process and start over from there. As a matter of fact, once you have your tables created, you can just run a BCP command from a command window (look at the bottom of the T-SQL script for the BCP arguments). You can watch how many records are uploaded and then restart BCP where the last batch succeeded. That is basically what SQLAzureMW does.
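If a run dies partway through, BCP's -F (first row) switch lets you restart from where the last committed batch ended instead of starting over. A sketch, with placeholder server, credentials, and file names:

```bat
REM Suppose batches of 1000 rows committed up to row 1,000,000 before the failure.
REM -F 1000001 restarts the upload at the first row that was not committed.
bcp MyDB.dbo.Orders in C:\SQLAzureMW\BCPData\dbo.Orders.dat -n -q -b 1000 -F 1000001 -S myserver.database.windows.net -U myadmin -P mypassword
```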

Good luck,
George
Jul 18 at 4:16 PM
Hi George,
I'm still getting the duplicate error message on some of the tables.
Here is what I did.
  1. Created the database in Azure (using the Schema only option)
  2. Turned everything off using the statements below.
    EXEC sp_MSforeachtable @command1="ALTER TABLE ? NOCHECK CONSTRAINT ALL"
    GO
    EXEC sp_MSforeachtable @command1="ALTER TABLE ? DISABLE TRIGGER ALL"
    GO
    EXEC sp_MSforeachtable @command1="SET IDENTITY_INSERT ? ON"
    GO
  3. Re-ran the migration tool (using the data only option).
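(For reference, the reverse statements to turn everything back on after the import would look something like this; WITH CHECK makes SQL Server revalidate the existing rows against the constraints:)

```sql
EXEC sp_MSforeachtable @command1="ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL"
GO
EXEC sp_MSforeachtable @command1="ALTER TABLE ? ENABLE TRIGGER ALL"
GO
```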
Please let me know if I need to do anything extra or differently.
Jul 20 at 7:19 AM

Hi George,

I also tried running the bcp directly but it hangs after a while.

Here is the statement.

bcp MyDB.dbo.Survey_Users in G:\SSMW\BCPData\Survey_Users.dat -c -UmyUserName -PmyPassword -S server.database.windows.net -q

I also tried using the batch size switch, but to no avail.

bcp MyDB.dbo.Survey_Users in G:\SSMW\BCPData\Survey_Users.dat -c -UmyUserName -PmyPassword -S server.database.windows.net -q -b1000000

Coordinator
Jul 20 at 2:01 PM
Hi, does it get very far into the BCP process? Or does it die right off the bat? What version of SQL Server are you running on your local machine? How big is the table causing you issues? Is this something that I can copy to my machine and try (as long as it is not top secret)? What is your target server in Azure? When you type in BCP /v in a command window, what is the version that you get?

Sorry that this is not going very smoothly for you.
George
Jul 20 at 7:10 PM

Hi George,

Thanks for taking the time to address my issue. Let me try to answer all your questions.

1. does it get very far into the BCP process? Or does it die right off the bat?

Yes, it does, but I do not get any response that it has finished; no commit transaction appears to fire internally.

2. What version of SQL Server are you running on your local machine?

2016

3. How big is the table causing you issues?

70M records.

4. Is this something that I can copy to my machine

Yes, here is the URL to the raw data. https://we.tl/qzreIakci8

5. What is your target server in Azure?

Do you mean the size or the name? It's a Standard S0 with a maximum of 250GB.

6. When you type in BCP /v in a command window, what is the version that you get?

Version 11.0

The table is called SurveysResults. Here is the DDL.

CREATE TABLE [dbo].[SurveysResults](
[ID] [bigint] IDENTITY(1,1) NOT NULL,
[Surveys_Users_ID] [bigint] NOT NULL,
[Survey_Question_Culture_ID] [bigint] NOT NULL,
[Answer] [varchar](1000) NULL,
CONSTRAINT [PK_SurveyResults] PRIMARY KEY CLUSTERED
(
[ID] ASC
)
)
GO

I’ve omitted the constraints and other indexes as they are tied to other tables.

Please let me know if you were able to upload it to Azure and how long it took.

Here are the bcp statements that I used.

bcp MyDB.dbo.SurveysResults in G:\SSMW\BCPData\SurveysResults.dat -c -UmyUserName -PmyPassword -S server.database.windows.net -q

I also tried using the batch size switch, but to no avail.

bcp MyDB.dbo.SurveysResults in G:\SSMW\BCPData\SurveyResults.dat -c -UmyUserName -PmyPassword -S server.database.windows.net -q -b1000000

Thanks so much for your help.

Regards,

Toyin.

Coordinator
Jul 20 at 8:31 PM
Hi Toyin,

I will take a look. But as a side note, I would temporarily bump my database server up from an S0 to S1 or S2 (or even higher if you want) for the database migration. Once everything has been migrated, then I would bump it back down to S0 (or whatever size you need).
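The scale-up can be done with a plain ALTER DATABASE. A sketch, assuming the database is named MyDB:

```sql
-- Bump the database up for the migration...
ALTER DATABASE MyDB MODIFY (SERVICE_OBJECTIVE = 'S2');

-- ...and back down once the data is in.
ALTER DATABASE MyDB MODIFY (SERVICE_OBJECTIVE = 'S0');
```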

Regards,
George
Coordinator
Jul 21 at 4:32 AM
Hi Toyin,

I pulled your data and put it in a local database. Then I ran SQLAzureMW and went through the full process. Everything went well. Here is my summary:

BCP output file: c:\SQLAzureMW\BCPData\dbo.SurveysResults.dat.txt
7/20/2016 10:23:36 PM --> Copied 70010500 of 70010500 (100%)
Clock Time (ms.) Total : 12209141 Average : (5734.27 rows per sec.)
Processing finished at 7/20/2016 10:23:36 PM -- UTC -> 7/21/2016 3:23:36 AM
Total processing time: 3 hours, 23 minutes and 30 seconds

You can see that it took 3 hours and 23 minutes to upload. I created an S2 database; you might want to try that. I will also use SQL Azure to back up my database to blob storage. We might want to copy that to your Azure blob storage so you can load from that. But it is just that one table.

Regards,
George
Jul 21 at 4:39 AM

Thanks George,

Could you please send me the bcp statement, including all the switches? Please block out your credentials. :)

I’ll try it out with an S3 database.

Coordinator
Jul 21 at 4:49 AM
Here was the BCP command:

bcp.exe "Surveys.dbo.SurveysResults" in "c:\SQLAzureMW\BCPData\dbo.SurveysResults.dat" -E -n -C RAW -b 1000 -a 4096 -q -S myazuresvr.database.windows.net -U "myadmin@myazuresvr" -P "mypassword"

As a side note, on this specific machine I was running SQL Server 2014 with BCP version 11.0.2100.60.

Let me know how it goes.

Good luck,
George
Jul 21 at 4:53 AM

Thanks George,

I’ll keep you posted.

Toyin.

Jul 21 at 9:31 PM

Thanks George,

I was able to upload the table to Azure using the bcp statement you sent; it took much longer, but without errors. Please keep in touch; my Skype ID is "toyins".

Regards,

Toyin.

Coordinator
Jul 21 at 10:00 PM
Woo Hoo! I am glad you are up and running. I am going to delete my test db and all of your data.

Thanks for the update.

Regards,
George
Jul 23 at 3:37 AM

Hi George,

Sorry to bother you again. Is it possible to run multiple instances of the bcp operation from different command windows?

Instead of waiting for one operation to complete, I tried running bcp from another command prompt to copy data to a different table, but to no avail.

It just hangs; it seems there's a block somewhere.

Toyin.

Coordinator
Jul 23 at 3:55 AM
Hi Toyin,

Yes, you should be able to if they are on different tables. SQLAzureMW kicks off 8 parallel BCP processes when migrating a database. You can set the number of parallel BCP processes you want in SQLAzureMW.exe.config. Something else that I do is use many machines for large (many-table) databases. Once the database schema is created on your target server, you can use BCP (or SQLAzureMW) to upload data to different tables from different machines.
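Launching the parallel uploads by hand from a Windows command prompt could look something like this (table, file, and credential names are placeholders); each START opens its own window and runs an independent BCP process against its own table:

```bat
start "Orders"       bcp MyDB.dbo.Orders in C:\BCPData\Orders.dat -c -q -b 1000 -S server.database.windows.net -U myUserName -P myPassword
start "Survey_Users" bcp MyDB.dbo.Survey_Users in C:\BCPData\Survey_Users.dat -c -q -b 1000 -S server.database.windows.net -U myUserName -P myPassword
```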

Hope this helps,
George
Jul 23 at 4:15 AM

Thanks George, I’ll try again.

Jul 24 at 5:05 AM

Thanks George, I finally got it working on 3 machines. It was actually quicker to copy the data files over to one of my VMs in Azure and then do the BCP from there; the speed was impressive. I have Comcast at home and you know what that means. :)

Toyin.

Jul 27 at 9:11 PM

Hi George,

Quick question; thought you might be able to help here. I can't seem to execute an insert statement against a table. I don't know what the issue is, so I tried to drop or rename it so that I could recreate it.

I executed sp_MSForEachTable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL' to disable the constraints in the database, but I'm still getting this error message.

Could not drop object 'Campaigns' because it is referenced by a FOREIGN KEY constraint.

Is there anything else one can do to temporarily disable the constraints in a database, so that one can drop or rename and then recreate a table?

Coordinator
Jul 29 at 4:55 PM
Disabling a foreign key constraint with NOCHECK only stops it from being checked; the constraint itself still exists, so you can't drop the referenced table. You have to drop the constraint, make your table modifications, and then create a new foreign key constraint.
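One way to see what has to be dropped first: the catalog views list every foreign key that references a given table and can generate the DROP statements for you. A sketch, assuming the table is dbo.Campaigns:

```sql
-- Generate a DROP CONSTRAINT statement for each foreign key
-- that references dbo.Campaigns.
SELECT 'ALTER TABLE ' + QUOTENAME(OBJECT_SCHEMA_NAME(fk.parent_object_id))
     + '.' + QUOTENAME(OBJECT_NAME(fk.parent_object_id))
     + ' DROP CONSTRAINT ' + QUOTENAME(fk.name) + ';'
FROM sys.foreign_keys AS fk
WHERE fk.referenced_object_id = OBJECT_ID('dbo.Campaigns');
```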

What kind of error are you getting on the insert?

Thanks,
George
Aug 9 at 8:27 PM

Thanks George,

It has been resolved. I just had to kill the blocking process and it released the lock on the table.
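For anyone hitting the same thing, the blocking session can usually be identified (and then killed) with something like this sketch:

```sql
-- Find requests that are blocked and which session is blocking them.
SELECT session_id, blocking_session_id, wait_type, wait_time
FROM sys.dm_exec_requests
WHERE blocking_session_id <> 0;

-- Then terminate the blocker (53 is a hypothetical session id):
-- KILL 53;
```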