Export Lists from Snowflake

Export Snowflake audience data for personalization and segmentation in Airship.

Sending Snowflake data to Airship capitalizes on existing, readily available tools to achieve the highest degree of flexibility and control, all while maintaining cost-effectiveness. Your data team uses Snowflake to generate valuable insights and audience segments by employing a tool they are already proficient in: SQL. You can expand on these existing processes to send data to Airship. This collaboration unlocks a multitude of potential use cases and drives a substantial improvement in personalization capabilities.

Follow these steps to create Attributes from Snowflake data. Attributes are metadata used for audience segmentation and personalization. They extend the concept of Tags by adding comparison operators and values to determine whether or not to target a user, helping you better evaluate your audience. You can also leverage any of the supported features listed in SFTP upload for CSV files, including Uploaded (Static) Lists: reusable audience lists that you create. Uploaded lists are static and updatable. In the API, they are referred to as Static Lists.

To send Airship data to Snowflake, see the Real-Time Data Streaming integration.

Generating SFTP keys

Airship’s SFTP implementation uses SSH key pairs for authentication. You must create a pair of keys: a private key for your client and a public key for Airship. Then you can add the public key to Airship and use the private key from Snowflake.

In Airship:

  1. Generate your key pair. Follow the steps in Generate keys in SFTP upload for CSV files.

  2. Add your public key, making sure to set the Purpose to Attributes. Follow the steps in Add your public key to Airship in SFTP upload for CSV files. Note the host, port, and username for use in the following Snowflake steps.
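If you prefer to script key generation, a pair can also be produced in Python with the third-party cryptography package. This is a sketch only; ssh-keygen works equally well, and the key size used here is an assumption, so check the requirements in SFTP upload for CSV files.

```python
# Sketch: generate an RSA key pair for Airship SFTP authentication using the
# third-party `cryptography` package. The 2048-bit key size is an assumption;
# see Airship's SFTP documentation for exact requirements.
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# PEM-encoded private key: this is the value stored in the Snowflake secret.
private_pem = key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.TraditionalOpenSSL,  # "BEGIN RSA PRIVATE KEY"
    encryption_algorithm=serialization.NoEncryption(),
).decode()

# OpenSSH-format public key: add this to Airship with Purpose set to Attributes.
public_ssh = key.public_key().public_bytes(
    encoding=serialization.Encoding.OpenSSH,
    format=serialization.PublicFormat.OpenSSH,
).decode()

print(private_pem.splitlines()[0])  # -----BEGIN RSA PRIVATE KEY-----
print(public_ssh.split()[0])        # ssh-rsa
```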

Creating a network rule and integration

In Snowflake, your setup may require the following network rule and external access integration. Note that the integration references the credential secret created in Storing the SFTP credentials below, so create that secret first.

Create a network rule and external access integration
-- This gets used in creating the external access integration
CREATE OR REPLACE NETWORK RULE airship_sftp_network_rule
  TYPE = HOST_PORT
  VALUE_LIST = ('sftp.airship.com:5222')
  MODE = EGRESS
;

-- This gets used later when creating the function and stored proc
CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION sftp_airship_ext_int
  ALLOWED_NETWORK_RULES = (airship_sftp_network_rule)
  ALLOWED_AUTHENTICATION_SECRETS = (sftp_airship_attribute_cred)
  ENABLED = true
;

Storing the SFTP credentials

In Snowflake, store the private key in a SECRET object of type password. This is preferred because secrets are stored encrypted.

Store the credentials
-- create secret credential object (removed the actual key in this example)
-- this creates an object with a username and password (it's a key in this case)
CREATE SECRET sftp_airship_attribute_cred
    TYPE = password
    USERNAME = 'OZzRx6y4Rm690T6KkReIuQ'
    PASSWORD = '-----BEGIN RSA PRIVATE KEY-----
    ...
-----END RSA PRIVATE KEY-----
';

Creating a function to retrieve the SSH authentication

In Snowflake, you can create a function to retrieve the credentials.

Create a function
CREATE OR REPLACE FUNCTION get_secret_username_password()
RETURNS VARIANT
LANGUAGE PYTHON
RUNTIME_VERSION = '3.9'
HANDLER = 'get_secret_username_password'
EXTERNAL_ACCESS_INTEGRATIONS = (sftp_airship_ext_int)
SECRETS = ('cred' = sftp_airship_attribute_cred)
AS
$$
import _snowflake

def get_secret_username_password():
  # Fetch the secret and return its fields as a dictionary,
  # which Snowflake serializes as the VARIANT return value
  username_password_object = _snowflake.get_username_password('cred')
  username_password_dictionary = {}
  username_password_dictionary["Username"] = username_password_object.username
  username_password_dictionary["Password"] = username_password_object.password
  return username_password_dictionary
$$;
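Outside Snowflake, the _snowflake module isn’t available. As a local illustration of the dictionary shape the handler returns, here is the same logic with a hypothetical stand-in credential object; the sample values are placeholders.

```python
# Local stand-in for Snowflake's _snowflake.get_username_password, purely to
# illustrate the dictionary shape the handler returns. Values are hypothetical.
class FakeUsernamePassword:
    username = "OZzRx6y4Rm690T6KkReIuQ"
    password = "-----BEGIN RSA PRIVATE KEY-----\n...\n-----END RSA PRIVATE KEY-----"

def get_secret_username_password(cred=FakeUsernamePassword()):
    return {"Username": cred.username, "Password": cred.password}

print(get_secret_username_password()["Username"])  # OZzRx6y4Rm690T6KkReIuQ
```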

Creating a sample attributes table

Next, create a sample SQL table for your Snowflake attributes. It’s unlikely that attributes will be pulled from a single table formatted precisely like this. Nevertheless, it’s a good place to start as it illustrates the functionality.

Create a table
-- create a table for attributes and insert a row or two
CREATE table sample_attributes (
    named_user VARCHAR(100),
    first_name VARCHAR(100),
    last_name VARCHAR(100),
    loyalty_id INT,
    favourite_color VARCHAR(100),
    tacos_desired INT
);
INSERT INTO sample_attributes VALUES ('8732eda2-c13c-4a2a-9123-fb5bf0bccffb', 'John', 'Smith', 60001, 'Green', 77);
INSERT INTO sample_attributes VALUES ('ba06daf2-66f8-43ce-a152-b0605b9b834e', 'Alice', 'Jones', 87301, 'Blue', 1);
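The stored procedure in the next section exports this table with pandas’ to_csv before uploading it. As a local, stdlib-only sketch, the resulting file is a header row followed by one row per named user:

```python
# Sketch: the CSV shape produced by exporting sample_attributes with
# df.to_csv(index=False). Rows mirror the sample INSERT statements above.
import csv
import io

fieldnames = ["named_user", "first_name", "last_name",
              "loyalty_id", "favourite_color", "tacos_desired"]
rows = [
    ["8732eda2-c13c-4a2a-9123-fb5bf0bccffb", "John", "Smith", 60001, "Green", 77],
    ["ba06daf2-66f8-43ce-a152-b0605b9b834e", "Alice", "Jones", 87301, "Blue", 1],
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(fieldnames)  # header row, matching the table's column names
writer.writerows(rows)
print(buf.getvalue())
```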

Creating a stored procedure

This example creates a basic stored procedure that uploads the sample_attributes table above to Airship via SFTP. It takes no arguments, everything is hard coded, and it fetches the credentials from the same secret used by the function above.

Create a stored procedure
CREATE OR REPLACE PROCEDURE upload_to_sftp()
RETURNS STRING
LANGUAGE PYTHON
RUNTIME_VERSION = '3.8'
HANDLER = 'main'
EXTERNAL_ACCESS_INTEGRATIONS = (sftp_airship_ext_int)
PACKAGES = ('snowflake-snowpark-python','paramiko')
SECRETS = ('cred' = sftp_airship_attribute_cred)
AS
$$
import _snowflake
import paramiko
from io import StringIO


def main(session):
    # Grab the credentials from the secret
    sftp_cred = _snowflake.get_username_password('cred')

    # Convert the private key string to a file-like object
    private_key_file = StringIO(sftp_cred.password)
    private_key = paramiko.RSAKey.from_private_key(private_key_file)

    # Grab the data from the sample attributes table and write it as CSV
    df = session.sql("select * from sample_attributes").to_pandas()
    df.to_csv('/tmp/out.csv', index=False)

    # Connect and upload via SFTP using Paramiko
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    try:
        ssh.connect("sftp.airship.com", port=5222, username=sftp_cred.username, pkey=private_key)
        sftp = ssh.open_sftp()
        # confirm=False is important because Airship SFTP doesn't support the
        # listing operations Paramiko uses to confirm the upload
        sftp.put('/tmp/out.csv', '/out.csv', confirm=False)
        return "Upload complete"
    except Exception as e:
        return f"Error with SFTP: {e}"
    finally:
        ssh.close()
$$;

Verifying transfer

Run CALL upload_to_sftp();, then verify that your data was successfully transferred to Airship. Select the Audience menu, then Attributes, then Upload History. For more information, see Viewing Attributes upload history in Managing Attributes.

Targeting users

You can now use the transferred data for targeting. See: