Customer-facing dashboards using Snowflake
Get started
To build your customer-facing dashboard with Vizzly and Snowflake, you can use the Vizzly Command Line Interface (CLI) and execute the following commands, or alternatively you can follow the guide in the Vizzly app after signing in to your account.
npm i -g @vizzly/cli
The key pair generated by the vizzly create-key-pair command below will be used in your identity setup at a later stage. You can find more information about these secrets on our security page, and more about the identity setup on our identity prop page.
vizzly create-key-pair
vizzly init-config -i snowflake
You should now have the Vizzly CLI installed, along with public and private .pem files and a vizzly.config.json file that looks like the following:
{
  "dataSets": [],
  "publicKeyCertificates": [
    "<< Your public key >>"
  ],
  "connection": {
    "client": "snowflake",
    "connection": {
      "username": "{{VIZZLY_SNOWFLAKE_USERNAME}}",
      "password": "{{VIZZLY_SNOWFLAKE_PASSWORD}}",
      "account": "{{VIZZLY_SNOWFLAKE_ACCOUNT}}",
      "schema": "{{VIZZLY_SNOWFLAKE_SCHEMA}}",
      "database": "{{VIZZLY_SNOWFLAKE_DATABASE}}",
      "warehouse": "{{VIZZLY_SNOWFLAKE_WAREHOUSE}}",
      "role": "{{VIZZLY_SNOWFLAKE_ROLE}}",
      "insecureConnect": false,
      "ocspFailOpen": false
    }
  }
}
See the connection type definition:
{
  client: "snowflake";
  connection: {
    username: string;
    password: string;
    account: string;
    schema: string;
    database: string;
    warehouse: string;
    role: string;
    insecureConnect?: boolean;
    ocspFailOpen?: boolean;
  };
}
Environment variables
You will notice that the Vizzly config we have just generated contains mustache values such as {{VIZZLY_SNOWFLAKE_PASSWORD}} and {{VIZZLY_SNOWFLAKE_DATABASE}}. You can either replace these with literal values, or set them as environment variables when you deploy your query engine.
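For example, if you choose literal values, the inner connection block might look something like the following. The values shown here are purely illustrative; use the account identifier, database, warehouse, role and schema from your own Snowflake setup. Note that the password is left as a mustache template so it can still be supplied as an environment variable, as recommended below.
{
  "client": "snowflake",
  "connection": {
    "username": "DASHBOARDS_SERVICE_USER",
    "password": "{{VIZZLY_SNOWFLAKE_PASSWORD}}",
    "account": "xy12345.eu-west-1",
    "schema": "PUBLIC",
    "database": "ANALYTICS",
    "warehouse": "REPORTING_WH",
    "role": "REPORTING_ROLE",
    "insecureConnect": false,
    "ocspFailOpen": false
  }
}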
Snowflake connection environment variables
VIZZLY_SNOWFLAKE_PASSWORD
The password used to access your Snowflake database.
VIZZLY_SNOWFLAKE_DATABASE
The name of the Snowflake database to connect to.
VIZZLY_SNOWFLAKE_USERNAME
The username used to connect to your Snowflake database.
VIZZLY_SNOWFLAKE_ACCOUNT
Your Snowflake account identifier.
VIZZLY_SNOWFLAKE_WAREHOUSE
The name of the Snowflake warehouse to use for queries.
VIZZLY_SNOWFLAKE_ROLE
The name of the role to use when connecting to your Snowflake database.
VIZZLY_SNOWFLAKE_SCHEMA
The name of the Snowflake schema to connect to.
For secret values such as passwords and API keys, we strongly recommend keeping the mustache templates and providing their values by setting environment variables.
Data sets
You will notice that the dataSets key in your Vizzly config is empty. The next stage of the setup process is to define the schema of the data sets that you want to make available to your users.
The Snowflake data set structure is:
{
  /*
    ID of the data set that remains constant.
  */
  id: string;
  /*
    A public name that describes this data set to your users.
  */
  name: string;
  /*
    Fields that define the flat schema of the data set.
  */
  fields: Array<{
    /*
      A unique reference to this field, which should remain constant.
    */
    id: string;
    /*
      The name of the column in the table.
    */
    name: string;
    /*
      What type of data does this field contain?
    */
    dataType: "number" | "boolean" | "string" | "date_time";
    /*
      The name of this field that your users will see.
    */
    publicName: string;
    /*
      The name of the table in the database.
    */
    table: string;
  }>;
  joins: Array<{
    type:
      | "rightJoin"
      | "leftJoin"
      | "innerJoin"
      | "outerJoin"
      | "leftOuterJoin"
      | "rightOuterJoin";
    left: {
      field: string;
      table: string;
    };
    right: {
      field: string;
      table: string;
    };
  }>;
}
Data types
These are the valid data types supported for the Snowflake integration:
- date_time
- number
- string
- boolean
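For example, a minimal single-table data set using these data types, and no joins, might look like the following. The orders table and its order_id and amount columns are purely illustrative; substitute the tables and columns from your own Snowflake schema:
{
  // ...
  "dataSets": [
    {
      "id": "orders",
      "name": "Orders",
      "fields": [
        {
          "id": "field_1",
          "publicName": "Order ID",
          "name": "order_id",
          "table": "orders",
          "dataType": "string"
        },
        {
          "id": "field_2",
          "publicName": "Amount",
          "name": "amount",
          "table": "orders",
          "dataType": "number"
        }
      ],
      "joins": []
    }
  ]
}
The id values only need to be unique and stable; as the structure above notes, they act as constant references and should not change later.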
Joins
If you need to join tables together to add fields to your data set, you can do so by using the joins key of your data set.
Below is an example of how you would join two tables on a foreign key, city_id.
{
  "connection": {
    // ...
  },
  "dataSets": [
    {
      "id": "weather-by-city",
      "name": "Weather by city",
      "fields": [
        {
          "id": "field_1",
          "publicName": "City ID",
          "name": "id",
          "table": "cities",
          "dataType": "number"
        },
        {
          "id": "field_2",
          "publicName": "City",
          "name": "name",
          "table": "cities",
          "dataType": "string"
        },
        {
          "id": "field_3",
          "publicName": "Weather record ID",
          "name": "id",
          "table": "weather",
          "dataType": "number"
        },
        {
          "id": "field_4",
          "publicName": "City ID",
          "name": "city_id",
          "table": "weather",
          "dataType": "number"
        },
        {
          "id": "field_5",
          "publicName": "High temperature",
          "name": "high_temperature",
          "table": "weather",
          "dataType": "number"
        }
      ],
      "joins": [
        {
          "type": "leftJoin",
          "left": {
            "field": "id",
            "table": "cities"
          },
          "right": {
            "field": "city_id",
            "table": "weather"
          }
        }
      ],
      "secureFilterGuards": []
    }
    // ...
  ],
  "publicKeyCertificates": [
    // ...
  ]
}
Secure filter guards (optional)
Secure filters are used to provide Vizzly's multi-tenancy feature, and they are defined in the identity config.
As a safeguarding measure to ensure these filters are always set when queries run against the data set, you can set up secureFilterGuards.
For example, if you have a data set which stores bank transactions, with an account_id field which you're using to determine access for your multi-tenancy platform, you will be setting a secure filter in your identity config such as account_id = account-abc. To ensure that this filter is always set, you can define a secure filter guard on the data set in your Vizzly config that ensures an = filter is always applied on the account_id field.
You would achieve this by defining the following secure filter guard:
{
  "fieldId": "account_id",
  "op": "="
}
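In the context of your Vizzly config, the guard sits alongside the fields of the data set it protects. The sketch below assumes a hypothetical transactions table where the field with id account_id maps to an account_id column; the data set name, table and columns are illustrative only:
{
  "id": "bank-transactions",
  "name": "Bank transactions",
  "fields": [
    {
      "id": "account_id",
      "publicName": "Account ID",
      "name": "account_id",
      "table": "transactions",
      "dataType": "string"
    },
    {
      "id": "amount",
      "publicName": "Amount",
      "name": "amount",
      "table": "transactions",
      "dataType": "number"
    }
  ],
  "joins": [],
  "secureFilterGuards": [
    {
      "fieldId": "account_id",
      "op": "="
    }
  ]
}
With this guard in place, the intention is that queries against this data set only run when the identity config has applied an = filter on the account_id field.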
The TypeScript definition of a secure filter guard is as follows:
type Operator =
  | ">"
  | "<"
  | "="
  | "!="
  | ">="
  | "<="
  | "is_one_of"
  | "is_not_one_of"
  | "starts_with"
  | "ends_with"
  | "contains_substring"
  | "does_not_contain_substring";

{
  fieldId: string; // The `id` of the field
  op: Operator;
}
Deployment
You can deploy your Vizzly query engine anywhere that can run a Docker image, provided it can connect securely to your Snowflake instance.
We have documentation for deploying to Amazon Web Services (AWS), Google Cloud Platform (GCP), and Heroku, as well as running locally for development and testing.