Snowflake Database Integration


This article describes how to set up a connection to a Snowflake database using the standard Database connector and how to insert data using the file staging method.


Overview

To connect to Snowflake from Boomi, we use the standard Database connector. Here is the link to the Snowflake documentation: 

Welcome to the Snowflake Documentation — Snowflake Documentation 

 

Connection

We need the Snowflake JDBC driver in order to use the Custom driver type in the Database connector. 

Download the JDBC Driver here: Downloading/Integrating the JDBC Driver — Snowflake Documentation 

 

You can create a Custom Library component to deploy the JAR file on the Atom. Follow the instructions here: Working with custom libraries.

Once you have deployed the JDBC JAR file on the Atom, you can create a database connection. 

JDBC connection setup from snowflake: Configuring and Using the JDBC Driver — Snowflake Documentation 


Choose the Custom driver type and specify the credentials for your Snowflake instance. 
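
For reference, a typical Custom connection setup looks like the following sketch. The account, warehouse, database, and schema values are placeholders you will need to fill in; the driver class and URL format come from the Snowflake JDBC documentation linked above.

Driver Class Name: net.snowflake.client.jdbc.SnowflakeDriver

Connection URL: jdbc:snowflake://<account_name>.snowflakecomputing.com/?warehouse=<warehouse_name>&db=<db_name>&schema=<schema_name>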

 

Operation

We are going to write data into the Snowflake database using a file import method, which is a little different from writing to other databases. You can still write standard SQL INSERT statements, but inserting row by row into Snowflake is very slow at high volume, so staging the data in an internal/external stage is highly recommended.

 

Writing to Snowflake is a two-step process.

  1. PUT the data, in CSV format, to an internal or external stage. Choose internal vs. external based on your organization's requirements.
    • Internal stage: place the files on a directory path where the Atom is installed or that the Atom service has access to.
    • External stage: an Amazon S3 bucket of your choice where you place the files. 
  2. COPY the data from the internal/external stage into the database table.

 

How to create a stage

Log into Snowflake -> go to the database -> click on the desired database and click on the STAGE tab. Choose Snowflake stage (internal) vs. Amazon S3 (external) and create your stage. This means you will put the file on an internal/external stage and then copy it into a table in your database schema.
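
If you prefer SQL to the web UI, an internal stage can also be created with a statement like this minimal sketch (using the stage name from the examples below):

CREATE STAGE BOOMI_STAGE.BOOMIDATAINTEGRATION;

You can optionally attach a default file format to the stage with the FILE_FORMAT parameter, but in this article the format is referenced explicitly in the COPY command instead.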


The next step is to create an operation in Boomi. Choose the GET action in the database operation and create a profile. In the SQL script section, we will specify our script. 

 

This is an example of a PUT to an internal stage:

PUT file://<file_path> @schema_name.stage_name

 

For example: PUT file://C:\temp\Boomi_stage\test.csv @BOOMI_STAGE.BOOMIDATAINTEGRATION/test

 

Linux file path: file:///tmp/load

Windows file path: file://C:\tmp\load

Make sure it's a CSV file. You can specify the file format in Snowflake; make sure the staged file matches that format exactly. 
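
One thing to be aware of: by default, PUT gzip-compresses files as it uploads them, so test.csv lands in the stage as test.csv.gz; COPY decompresses it automatically. If you need to control this behavior, set the AUTO_COMPRESS option explicitly, for example:

PUT file:///tmp/load/test.csv @BOOMI_STAGE.BOOMIDATAINTEGRATION/test AUTO_COMPRESS=FALSE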

 

How to create a file format

You can choose any file format you want for your CSV file. For our example, it's a comma-delimited file. You can explore more options in Snowflake. 
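
As with the stage, the file format can be created in SQL instead of the UI. Here is a minimal sketch matching the comma-delimited example; the format name BOOMI_FORMAT is the one reused in the COPY command below, and SKIP_HEADER=1 assumes your CSV has a header row, so drop it if yours does not:

CREATE FILE FORMAT BOOMI_STAGE.BOOMI_FORMAT TYPE='CSV' FIELD_DELIMITER=',' SKIP_HEADER=1;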


Copy the file into the database

The next step is to COPY this data into the Snowflake table. Here is the command: 

COPY INTO <schema_name>.<table_name> FROM @schema_name.stage_name FILE_FORMAT=<schema_name>.<format_name> ON_ERROR='CONTINUE'

 

For example: COPY INTO BOOMI_STAGE.Testtable FROM @BOOMI_STAGE.BOOMIDATAINTEGRATION/test/ FILE_FORMAT=BOOMI_STAGE.BOOMI_FORMAT ON_ERROR='CONTINUE'
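
Note that ON_ERROR='CONTINUE' skips rows that fail to load rather than aborting the whole COPY, so check the command output for skipped rows. To sanity-check the load, you can list the staged files before the COPY, count the rows afterwards, and then clean up the stage (a sketch using the example names above):

LIST @BOOMI_STAGE.BOOMIDATAINTEGRATION/test/;

SELECT COUNT(*) FROM BOOMI_STAGE.Testtable;

REMOVE @BOOMI_STAGE.BOOMIDATAINTEGRATION/test/;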

 

Example Process

 

Here is a simple process that outlines the steps discussed above. The process gets data from Salesforce, creates a CSV file, and saves it to disk. It then PUTs the CSV file onto the Snowflake stage, from where it is copied into the database table. Alternatively, you can stage the file on Amazon S3: put the file in the bucket using the Amazon S3 connector and then COPY the files from the S3 bucket. 

 
