Snowflake Data Source Template


Overview

The Snowflake data source template is a built-in SQL template designed for Snowflake cloud data warehouse connections. It ships with Validatar and provides metadata ingestion scripts, data type mappings, profile definitions, and macros optimized for Snowflake's SQL dialect and system views.

Platform: Snowflake Cloud Data Warehouse
Connection Category: Database
Template Category: Built-in

What's Included

Connection Configuration

  • Supports Snowflake native connections
  • Identifier delimiters: double quotes (") for both the opening and closing delimiter
  • Compatible with Snowflake's native pushdown execution engine

Data Type Mappings

Includes mappings for all standard Snowflake data types:

  • String types: VARCHAR, STRING, TEXT, CHAR
  • Numeric types: NUMBER, INTEGER, BIGINT, FLOAT, DOUBLE
  • Date/time types: DATE, TIMESTAMP_NTZ, TIMESTAMP_LTZ, TIMESTAMP_TZ, TIME
  • Boolean: BOOLEAN
  • Semi-structured: VARIANT, OBJECT, ARRAY
  • Binary: BINARY, VARBINARY
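
As a quick illustration of the types these mappings cover, a Snowflake table might declare columns like the following (the table and column names are placeholders, not part of the template):

```sql
-- Placeholder table illustrating types covered by the mappings
CREATE TABLE DEMO_TYPES (
    ID        NUMBER(38,0),      -- numeric
    NAME      VARCHAR(200),      -- string
    CREATED   TIMESTAMP_NTZ,     -- date/time
    IS_ACTIVE BOOLEAN,           -- boolean
    PAYLOAD   VARIANT            -- semi-structured
);
```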

Default Parameters

Execution Scripts

Recommended pre-execution scripts for Snowflake sessions:

ALTER SESSION SET TIMEZONE = 'UTC';
ALTER SESSION SET QUERY_TAG = 'validatar';
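
If your environment requires a specific role or warehouse, the session setup can be extended along these lines; MY_ROLE and MY_WH below are placeholders, not names defined by the template:

```sql
-- Placeholder names: replace MY_ROLE and MY_WH with your environment's values
USE ROLE MY_ROLE;
USE WAREHOUSE MY_WH;
ALTER SESSION SET TIMEZONE = 'UTC';
ALTER SESSION SET QUERY_TAG = 'validatar';
```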

Metadata Ingestion

Schema Level

Queries INFORMATION_SCHEMA.SCHEMATA to discover all schemas, excluding INFORMATION_SCHEMA.
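
A minimal query of this shape discovers schemas while excluding INFORMATION_SCHEMA; the template's built-in script may differ in detail:

```sql
-- Illustrative sketch; the built-in ingestion script may differ
SELECT SCHEMA_NAME
FROM INFORMATION_SCHEMA.SCHEMATA
WHERE SCHEMA_NAME <> 'INFORMATION_SCHEMA'
ORDER BY SCHEMA_NAME;
```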

Table Level

Queries INFORMATION_SCHEMA.TABLES to discover tables and views with schema, name, and type.
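
A sketch of such a query, again illustrative rather than the template's exact script:

```sql
-- TABLE_TYPE distinguishes BASE TABLE from VIEW
SELECT TABLE_SCHEMA, TABLE_NAME, TABLE_TYPE
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_SCHEMA <> 'INFORMATION_SCHEMA'
ORDER BY TABLE_SCHEMA, TABLE_NAME;
```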

Column Level

Queries INFORMATION_SCHEMA.COLUMNS to discover columns with data types, ordinal position, and nullability.
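
Sketched as a query (illustrative; the built-in script may select additional columns):

```sql
-- ORDINAL_POSITION preserves column order; IS_NULLABLE is 'YES'/'NO'
SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME,
       DATA_TYPE, ORDINAL_POSITION, IS_NULLABLE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA <> 'INFORMATION_SCHEMA'
ORDER BY TABLE_SCHEMA, TABLE_NAME, ORDINAL_POSITION;
```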

Note: Snowflake's INFORMATION_SCHEMA queries operate within the context of the current database. Ensure the connection's database is set correctly.

Profiling

Includes the standard set of ~40 profile definitions. Snowflake-specific considerations:

  • Profiling functions use Snowflake SQL syntax
  • APPROX_PERCENTILE can be used for approximate median calculations on large tables
  • Semi-structured data types (VARIANT, OBJECT, ARRAY) have limited profiling support — basic profiles like null_count apply, but statistical profiles do not
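
For example, an approximate median on a large table can be computed with APPROX_PERCENTILE; the table and column names here are placeholders:

```sql
-- SALES.ORDERS and AMOUNT are placeholder identifiers
SELECT APPROX_PERCENTILE(AMOUNT, 0.5) AS approx_median
FROM SALES.ORDERS;
```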

Macros

Built-in macros include standard patterns for:

  • Row count checks
  • Null percentage analysis
  • Duplicate detection
  • Value distribution queries
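
As a sketch of the duplicate-detection pattern (the table and column names are placeholders, and the built-in macro's exact SQL may differ):

```sql
-- CUSTOMERS and EMAIL are placeholder identifiers
SELECT EMAIL, COUNT(*) AS dup_count
FROM CUSTOMERS
GROUP BY EMAIL
HAVING COUNT(*) > 1;
```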

Installation

The Snowflake template ships with Validatar — no import required. To verify it's available:

  1. Navigate to Settings > Data Source Templates
  2. Look for "Snowflake" in the template list
  3. Ensure it is in Active status

Configuration

After confirming the template is available:

  1. Create a new data source or edit an existing one
  2. Configure the connection with Snowflake connection type
  3. Select the Snowflake template
  4. Set the server name, database, warehouse, and credentials on the connection
  5. Optionally customize pre-execution scripts for your specific role and warehouse

Customization

Common customizations:

  • Schema filters — Modify the schema ingestion script to include or exclude specific schemas
  • Custom macros — Add macros for Snowflake-specific patterns (e.g., querying ACCOUNT_USAGE views, working with VARIANT data)
  • Profile definitions — Add profiles that leverage Snowflake-specific functions
  • Execution scripts — Set the appropriate role, warehouse, and session parameters for your environment
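
As a sketch of a Snowflake-specific custom macro target, a query against ACCOUNT_USAGE might look like the following. Access to the shared SNOWFLAKE database requires appropriate privileges, and the QUERY_TAG filter value is a placeholder:

```sql
-- Requires access to the shared SNOWFLAKE database; 'validatar' tag is a placeholder
SELECT QUERY_ID, EXECUTION_STATUS, TOTAL_ELAPSED_TIME
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
WHERE QUERY_TAG = 'validatar'
ORDER BY START_TIME DESC
LIMIT 100;
```

Note that ACCOUNT_USAGE views can lag behind real-time activity, so they suit auditing rather than immediate validation.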
