To move data from SQL Server to MySQL, it is certainly possible to use tools that can connect to both data stores and manipulate data that way, such as Access, Excel, or SSIS. Here I will introduce a process that does not need any special tools or data drivers. Instead, we can use the utilities and methods that come with a standard SQL Server and MySQL install to accomplish the task.
With this approach, it is assumed that matching tables already exist in MySQL. If not, they need to be created first.
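If a matching table does not exist yet, it can be created ahead of time. Below is a minimal sketch; the table name t1, its columns, and the connection details are all hypothetical placeholders, not from the original post. Derive the real column types from the SQL Server table definition.

```shell
# Write the DDL for a hypothetical matching table to a file.
# Column names and types here are assumptions; map the real ones
# from the SQL Server table (e.g., varchar -> VARCHAR, datetime -> DATETIME).
cat > create_t1.sql <<'EOF'
CREATE TABLE IF NOT EXISTS t1 (
    id INT NOT NULL PRIMARY KEY,
    name VARCHAR(50),
    created_at DATETIME
);
EOF

# Then run it against the MySQL server (commented out here because it
# needs a live server; user and database name are placeholders):
# mysql -u myuser -p MyDb < create_t1.sql
```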
This process consists of these steps: first, bcp commands are generated based on SQL Server database metadata (sysobjects; think information_schema in MySQL); then the generated bcp commands are executed; the resulting text files are transferred to the MySQL server, optionally compressed first if they are big; and finally the files are imported into MySQL with LOAD DATA LOCAL INFILE.
1. Run the code below on SQL Server to generate bcp commands. Customize the data dump folder name and database name as needed: replace MyDb with the proper database name; it appears in two places.
set nocount on
/* Set BackupFolder name here */
declare @BackupFolder varchar(100)
set @BackupFolder = 'c:\MyDataDumpFolder\'
/* Initialize variables here */
declare @TableNames table (TableTempID smallint IDENTITY(1,1) PRIMARY KEY CLUSTERED, TableName varchar(50))
declare @TableName varchar(50)
declare @BackupFileName char(50)
declare @BackupFolderFile varchar(150)
declare @Counter smallint
declare @MaxTableCount smallint
declare @BCPOutCommand varchar(500)
/* Get the list of tables that we want to dump out as csv */
insert into @TableNames (TableName) select name from
MyDb.dbo.sysobjects where xtype = 'U'
and name not in ('dtproperties')
select @MaxTableCount = max(TableTempID) from @TableNames
set @Counter = 1
/* Loop through each table individually and generate the bcp
commands that will export its data; they are run in step 2 */
while @Counter <= @MaxTableCount
begin
	/* Create backup file name */
	select @TableName = ltrim(rtrim(TableName)) from @TableNames where TableTempID = @Counter
	select @BackupFileName = ltrim(rtrim(@TableName)) + '.txt'
	/* Combine backup folder name and file name */
	select @BackupFolderFile = @BackupFolder + @BackupFileName
	/* Create BCP command */
	select @BCPOutCommand = 'bcp ' + '"MyDb.dbo.' + @TableName + '" out "' + ltrim(rtrim(@BackupFolderFile)) + '" -c -q -S' + @@Servername + ' -T'
	print @BCPOutCommand
	set @Counter = @Counter + 1
end

2. Run the bcp commands generated above at the Windows command prompt. Suppose we only have one table called t1; this is what the bcp command would look like:

bcp "test.dbo.t1" out "c:\junk\t1.txt" -c -q -SSqlServerInstanceName -T

3. Gather the text files and move them to the MySQL server.

4. Run the command below in mysql to import the data. Adjust the directory and file name as needed:

mysql> LOAD DATA LOCAL INFILE '/home/haidong/Desktop/t1.txt' INTO TABLE t1 LINES TERMINATED BY '\r\n';
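Steps 3 and 4 above can be sketched as a small shell session. Everything below is illustrative: the host name, paths, and database name are placeholders, and the file written by printf merely stands in for a real bcp export (with -c, bcp writes tab-separated columns with \r\n row terminators, which is what the LOAD DATA statement above expects).

```shell
# Stand-in for a bcp export: two rows, tab-separated, \r\n line endings.
DUMP_DIR=./MyDataDumpFolder          # placeholder for c:\MyDataDumpFolder
mkdir -p "$DUMP_DIR"
printf '1\talpha\r\n2\tbeta\r\n' > "$DUMP_DIR/t1.txt"

# Optional: compress before transfer if the files are big.
gzip -c "$DUMP_DIR/t1.txt" > "$DUMP_DIR/t1.txt.gz"

# Transfer to the MySQL server (commented out; host and path are placeholders):
# scp "$DUMP_DIR/t1.txt.gz" user@mysqlhost:/home/haidong/Desktop/

# On the MySQL server, decompress and import (commented out; requires a
# server with local_infile enabled; the database name is a placeholder):
# gunzip /home/haidong/Desktop/t1.txt.gz
# mysql --local-infile=1 MyDb -e \
#   "LOAD DATA LOCAL INFILE '/home/haidong/Desktop/t1.txt' INTO TABLE t1 LINES TERMINATED BY '\r\n'"
```

Compression is worthwhile because these character-mode dumps are plain text and typically shrink considerably.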
I've successfully tested this for tables that have number, character, datetime, and sequence data. Your mileage may vary. Hope this helps.