Moving data from SQL Server to MySQL

To move data from SQL Server to MySQL, it is certainly possible to use tools that can connect to both data stores, such as Access, Excel, or SSIS, and manipulate the data that way. Here I will introduce a process that does not need any special tools or data drivers. Instead, we can use the utilities and methods that come with a standard SQL Server and MySQL install to accomplish the task.

With this approach, it is assumed that matching tables already exist in MySQL. If not, they need to be created first.
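For instance, a simple SQL Server table might be recreated in MySQL with roughly equivalent types. The table and column names below are hypothetical; adjust the type mapping to fit your own schema:

```sql
-- Hypothetical SQL Server source table:
--   CREATE TABLE dbo.t1 (id int IDENTITY(1,1) PRIMARY KEY,
--                        name varchar(50), created datetime);
-- A roughly equivalent MySQL table:
CREATE TABLE t1 (
    id      INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
    name    VARCHAR(50),
    created DATETIME
);
```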

This process comprises these steps: first, bcp commands are generated from SQL Server database metadata (sysobjects; think information_schema in MySQL); then the generated bcp commands are executed; the resulting text files can then be transferred to the MySQL server, optionally compressed if they are large; and finally the files are imported into MySQL with LOAD DATA LOCAL INFILE.
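The middle steps can be sketched in shell as below. The host name and paths are placeholders, and the `printf` line merely stands in for a real bcp dump file so the sketch is self-contained:

```shell
# Step 2 (on the SQL Server box) runs the generated bcp commands, e.g.:
#   bcp "MyDb.dbo.t1" out "c:\MyDataDumpFolder\t1.txt" -c -q -SMyServer -T
printf 'row1\trow1val\r\n' > t1.txt    # stand-in for a bcp dump file

# Step 3: optionally compress, then transfer; scp is one possibility:
gzip -f t1.txt                         # produces t1.txt.gz
# scp t1.txt.gz user@mysqlhost:~/      # copy to the MySQL server
gunzip -f t1.txt.gz                    # decompress on the MySQL side

# Step 4 would then import the file with LOAD DATA LOCAL INFILE, e.g.:
#   mysql> LOAD DATA LOCAL INFILE 't1.txt' INTO TABLE t1 LINES TERMINATED BY '\r\n';
```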

1. Run the code below on SQL Server to generate the bcp commands; customize the data dump folder name and database name as needed. For the database name, replace MyDb with the proper database name; it appears in two places:

set nocount on

/* Set BackupFolder name here */
declare @BackupFolder varchar(100)
set @BackupFolder = 'c:\MyDataDumpFolder\'

/* Initialize variables here */
declare @TableNames table (TableTempID  
smallint IDENTITY(1,1) PRIMARY KEY CLUSTERED, TableName varchar(50))
declare @TableName varchar(50)
declare @BackupFileName char(50)
declare @BackupFolderFile varchar(150)
declare @Counter smallint
declare @MaxTableCount smallint
declare @BCPOutCommand varchar(500)

/* Get the list of tables that we want to dump out as csv */
insert into @TableNames (TableName) select name from 
MyDb.dbo.sysobjects where xtype = 'U' 
and name not in ('dtproperties')

select @MaxTableCount = max(TableTempID) from @TableNames
set @Counter = 1

/* Loop through each table individually and generate the bcp 
commands used to export data */
while @Counter <= @MaxTableCount

Begin

/* Create backup file name */
select @TableName = ltrim(rtrim(TableName)) from 
@TableNames where TableTempID = @Counter

select @BackupFileName = ltrim(rtrim(@TableName)) + '.txt'

/* Combine backup folder name and file name */
select @BackupFolderFile = @BackupFolder + @BackupFileName

/* Create BCP command */
select @BCPOutCommand = 'bcp ' + '"MyDb.dbo.' + @TableName + 
'" out "' + ltrim(rtrim(@BackupFolderFile)) + 
'" -c -q -S' + @@Servername + ' -T'

print @BCPOutCommand
set @Counter = @Counter + 1
end

2. Run the bcp commands generated above at the Windows command prompt. Supposing we have only one table, called t1, the bcp command would look like this:

bcp "test.dbo.t1" out "c:\junk\t1.txt" -c -q -SSqlServerInstanceName -T

3. Gather the text files and move them to the MySQL server;
4. Run the command below in mysql to import the data. Adjust the directory name as needed:

mysql> LOAD DATA LOCAL INFILE '/home/haidong/Desktop/t1.txt' INTO TABLE t1 LINES TERMINATED BY '\r\n';

I've tested this successfully for tables that have number, character, datetime, and sequence data. Your mileage may vary. Hope this helps.

13 Comments

  1. Dr. Warren Porter Said,

    March 20, 2009

    thank you very much for your code.
    do you have any advice/code for transferring DBF files into tables in MySQL? I am using some very large databases for doing GIS operations. Some of these tables are more than 12 million lines long. I have found that it is impossible to use the normal command
    mysql -u usrnam -pusrpaswrd databasenam < filename.sql
    to place large amounts of data into a table in a database. do you have any idea how large a file this command can handle? It won’t do it for my 12 million lines. However, if we break it into small sections it goes in with no problem.
    Thank you for considering this request.
    Best wishes,
    Warren Porter

  2. Dr. Warren Porter Said,

    March 20, 2009

    I forgot to mention that we use a DBF to MySQL converter, DBF Viewer 2000, to create the SQL data file.

  3. Haidong Ji Said,

    March 20, 2009

    Hi Dr. Warren Porter,

    I can see it is difficult to do 12 million rows of insert in one fell swoop.

    I am not familiar with DBF Viewer 2000, but I wonder if there is a way to ask the tool to suck data out to sql file one sub-set at a time, say 500000 rows. Perhaps importing 500000 rows is more manageable. We will just need to do that a few times.

    Another thing that might be helpful is some kind of archiving tool. In fact, there is a Perl utility, part of MaatKit, called mk-archiver, that nibbles away data in a MySQL source and moves that data to a different MySQL instance. Unfortunately, this tool is for MySQL only. Fortunately, it is open source, so it is possible to tweak it, along with perhaps Perl’s XBase module, to get the data out of the DBF.

    http://www.maatkit.org/doc/mk-archiver.html
    http://search.cpan.org/~pratp/Xbase-1.07/Xbase.pm

  4. Wiley Thomas Said,

    March 24, 2009

    Hi Haidong Ji – Thank you for posting this code.

    I wonder if you have any ideas for another situation. I’m trying to move data from a MS SQL Server database to MySQL where the MS SQL connection is through Citrix. I only have access to Query Analyzer. I’m able to export the data successfully, but when I try to load it into MySQL I encounter errors caused by the character encoding in the downloaded file (UTF-16LE). If I read the file into MS Access and back out, I can load it into MySQL.

    Thanks again,
    Wiley

  5. Haidong Ji Said,

    March 24, 2009

    Hi Wiley,

    Not too sure on this one. A quick thought is that if you have direct access to BCP, it is possible to define the collation in a BCP format file so the data is piped out in the right encoding. Not sure if that helps you, but I will post a sample BCP format file tomorrow when I have access to it.

    Looking at the settings Access takes to output a file can also give you a clue. In Query Analyzer (SQL Server 2000 edition), when I save results to a file, assuming that’s how you get your data, I see three options: ANSI, Unicode, and OEM. Have you tried any one of them?

    I may do some digging myself and let you know. In the meantime, if you figure it out, I’d be curious to hear about it.

  6. Wiley Thomas Said,

    March 25, 2009

    Haidong – You are the best! How many times have I clicked that save button without ever noticing the dropdown that gives you the option to save with encoding?

    I chose to save as ANSI and loaded it right into MySQL with no problems.

    Thank you,
    Wiley

  7. Haidong Ji Said,

    March 26, 2009

    No problem Wiley. Glad you got it resolved.

  8. Vishnu Kumar Said,

    July 10, 2009

    Hi Haidong Ji,
    That was a beautiful script.
    It worked perfectly with me on SQL Express 2005

    Also, I would like to add that one can copy all the bcp commands and paste them into Notepad.
    Then save that as a .bat file.

    Then go to CMD and execute the batch file.

    Magic !!!
    ;)

  9. Vishnu Kumar Said,

    July 10, 2009

    Hi
    But how do I transfer the table data from MSSQL to MySQL, if not manually?
    thanks
    Vishnu

  10. Anjali Sharma Said,

    October 6, 2009

    Hi Haidong Ji,

    I have read your post.
    I have a requirement to transfer the data from SQL table xxx to MySQL table yyyy with different table structures (i.e., the number of columns may vary, column names may vary, etc.).
    For example, the SQL table consists of 4 columns named 1, 2, 3, 4, and the MySQL table yyyy consists of 3 columns: 5, 6, 7.
    Here I should map col 1 of SQL table xxx to col 5 of MySQL table yyy. In the same way, we should map col 2 to col 6 and col 4 to col 7.
    How can I proceed in this regard, considering autoincrement and primary key functionality?
    Please reply asap.

    Regards,
    Anjali

  11. ipotpal Said,

    October 28, 2009

    I tried to move my database; this was easy.


  13. ипотпал Said,

    November 15, 2009

    Thanks for the great piece of code. You saved me so much time
