Moving data from SQL Server to Snowflake

When moving data from SQL Server or Azure SQL to Snowflake, consider the following:

1. Which tool to use

- SSIS and Azure Data Factory are appropriate when two conditions hold: a) you will manually create the tables and columns in the target, and b) you are comfortable with the batch / offline nature of the process, that is, you submit a job which may take days to complete depending on the size of the database.

- If your databases are under 4 or 5 TB, tools like Windocks automatically create the target tables, map the column types for you, and move the data on demand; that is, the data is moved while you wait, in a few hours, using the full capacity of the system (or as much capacity as you configure). This type of tool is suitable when you need the data sooner and don't want to deal with the column type mapping yourself.

2. Differences in column types

Column types differ between SQL Server and Snowflake. Certain SQL Server types, such as uniqueidentifier, have no equivalent in Snowflake, and you would need to map such a type to another, for example varchar.
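
To make this concrete, here is a minimal sketch in Python of a type-mapping table that a migration script could consult when generating Snowflake DDL. The entries shown are common choices rather than an exhaustive or authoritative list; verify each mapping against your own data.

```python
# Sketch of a SQL Server -> Snowflake type mapping (common choices only).
SQLSERVER_TO_SNOWFLAKE = {
    "uniqueidentifier": "VARCHAR(36)",   # Snowflake has no native GUID type
    "datetime":         "TIMESTAMP_NTZ",
    "datetimeoffset":   "TIMESTAMP_TZ",
    "money":            "NUMBER(19,4)",
    "tinyint":          "NUMBER(3,0)",
    "bit":              "BOOLEAN",
    "varchar":          "VARCHAR",
    "varbinary":        "BINARY",
    "xml":              "VARCHAR",       # no XML type; VARIANT is another option
}

def map_type(sqlserver_type: str) -> str:
    """Return a Snowflake type name for a SQL Server type name."""
    return SQLSERVER_TO_SNOWFLAKE.get(sqlserver_type.lower(), sqlserver_type.upper())
```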

3. Differences in column maximums

Size limitations apply to varchar and binary columns in Snowflake: a VARCHAR value can hold at most 16 MB, and a BINARY value at most 8 MB. In SQL Server you may use varchar(max) or varbinary(max), which allow values of up to 2 GB. If your data contains character or binary values larger than the Snowflake maximums, that data needs to be truncated, and your resulting database will therefore differ from the source.
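
The sketch below shows one way to pre-check values against those limits in Python, assuming the standard Snowflake maximums of 16 MB for VARCHAR and 8 MB for BINARY. Since truncation loses data, a real migration should log every occurrence for review.

```python
# Pre-flight truncation sketch for Snowflake size limits.
SNOWFLAKE_VARCHAR_MAX_BYTES = 16_777_216   # 16 MB
SNOWFLAKE_BINARY_MAX_BYTES = 8_388_608     # 8 MB

def fit_for_snowflake(value):
    """Return (value, truncated) with value cut down to Snowflake's limits."""
    if isinstance(value, str):
        raw = value.encode("utf-8")
        if len(raw) > SNOWFLAKE_VARCHAR_MAX_BYTES:
            # errors="ignore" drops any multi-byte character split by the cut
            return raw[:SNOWFLAKE_VARCHAR_MAX_BYTES].decode("utf-8", "ignore"), True
    elif isinstance(value, (bytes, bytearray)):
        if len(value) > SNOWFLAKE_BINARY_MAX_BYTES:
            return bytes(value[:SNOWFLAKE_BINARY_MAX_BYTES]), True
    return value, False
```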

4. Manually exporting CSV files and ingesting them into Snowflake

You may export CSV files from SQL Server (for example with bcp), upload them to a Snowflake stage, and then ingest them into the Snowflake database. This involves the PUT and COPY INTO commands.
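
As a sketch, the following Python snippet runs PUT and COPY INTO through the snowflake-connector-python package. The account credentials, the /tmp/orders.csv file, and the ORDERS table name are placeholders; substitute your own, and adjust the file format options to match how you exported the CSV.

```python
import snowflake.connector

# Placeholder connection details -- substitute your own.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    database="MY_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# Upload the local CSV file to the ORDERS table's internal stage (@%ORDERS).
cur.execute("PUT file:///tmp/orders.csv @%ORDERS AUTO_COMPRESS=TRUE")

# Load the staged file into the table.
cur.execute("""
    COPY INTO ORDERS
    FROM @%ORDERS
    FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1)
""")
conn.close()
```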

Download a free data mover for Azure SQL / SQL Server to Snowflake with a performance of 20 million rows per minute.
