> **Building with AI coding agents?** If you're using an AI coding agent, install the official Scalekit plugin. It gives your agent full awareness of the Scalekit API — reducing hallucinations and enabling faster, more accurate code generation.
>
> - **Claude Code**: `/plugin marketplace add scalekit-inc/claude-code-authstack` then `/plugin install <auth-type>@scalekit-auth-stack`
> - **GitHub Copilot CLI**: `copilot plugin marketplace add scalekit-inc/github-copilot-authstack` then `copilot plugin install <auth-type>@scalekit-auth-stack`
> - **Codex**: run the bash installer, restart, then open Plugin Directory and enable `<auth-type>`
> - **Skills CLI** (Windsurf, Cline, 40+ agents): `npx skills add scalekit-inc/skills --list` then `--skill <skill-name>`
>
> `<auth-type>` / `<skill-name>`: `agentkit`, `full-stack-auth`, `mcp-auth`, `modular-sso`, `modular-scim` — [Full setup guide](https://docs.scalekit.com/dev-kit/build-with-ai/)

---

# Snowflake

**Authentication:** OAuth 2.0
**Categories:** Data, Analytics

## What you can do

Connect this agent connector to let your agent:

- **Show grants** — Run SHOW GRANTS in common modes (to role, to user, of role, on object)
- **Show warehouses** — Run SHOW WAREHOUSES
- **Show databases and schemas** — Run SHOW DATABASES or SHOW SCHEMAS
- **Show imported, exported, and primary keys** — Run SHOW IMPORTED KEYS, SHOW EXPORTED KEYS, or SHOW PRIMARY KEYS for a table
- **Get referential constraints** — Query INFORMATION_SCHEMA.REFERENTIAL_CONSTRAINTS
- **Cancel a query** — Cancel a running Snowflake SQL API statement by statement handle

## Authentication

This connector uses **OAuth 2.0**. Scalekit acts as the OAuth client: it redirects your user to Snowflake, obtains an access token, and automatically refreshes it before it expires. Your agent code never handles tokens directly — you only pass a `connectionName` and a user `identifier`.

You supply your Snowflake **Connected App** credentials (Client ID + Secret) once per environment in the Scalekit dashboard.

Before calling this connector from your code, create the Snowflake connection in **AgentKit** > **Connections** and copy the exact **Connection name** from that connection into your code. The value in code must match the dashboard exactly.

## Set up the connector

Register your Scalekit environment with the Snowflake connector so Scalekit handles the authentication flow and token lifecycle for you. The connection name you create will be used to identify and invoke the connection programmatically. You'll need to create an OAuth Security Integration in your Snowflake account.

1. ### Set up auth redirects

    - In [Scalekit dashboard](https://app.scalekit.com), go to **AgentKit** > **Connections** > **Create Connection**.

    - Find **Snowflake** from the list of providers and click **Create**. Copy the redirect URI. It follows the pattern `https://<env-url>/sso/v1/oauth/<connection-id>/callback`; use the exact value shown in the dashboard.

      > Image: Copy redirect URI from Scalekit dashboard

    - Log into your Snowflake account (Snowsight) and run the following SQL to create an OAuth Security Integration, replacing `<redirect_uri>` with the URI you copied:

      ```sql
      CREATE OR REPLACE SECURITY INTEGRATION scalekit_oauth
        TYPE = OAUTH
        OAUTH_CLIENT = CUSTOM
        OAUTH_CLIENT_TYPE = 'CONFIDENTIAL'
        OAUTH_REDIRECT_URI = '<redirect_uri>'
        ENABLED = TRUE;
      ```

2. ### Get client credentials

    - After creating the integration, run the following SQL to retrieve the client credentials:

      ```sql
      SELECT SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('SCALEKIT_OAUTH');
      ```

    - This returns a JSON object containing:
      - **Client ID** — value of `OAUTH_CLIENT_ID`
      - **Client Secret** — value of `OAUTH_CLIENT_SECRET_2` (or `OAUTH_CLIENT_SECRET_1`)
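    If you are scripting this step, the returned JSON string can be parsed directly. A minimal sketch with made-up values, using the key names described above:

      ```python
      import json

      # Hypothetical payload from SYSTEM$SHOW_OAUTH_CLIENT_SECRETS — the values
      # here are invented; the real call returns one row whose single column is
      # a JSON string with these keys.
      raw = (
          '{"OAUTH_CLIENT_ID": "hypothetical-client-id", '
          '"OAUTH_CLIENT_SECRET_1": "hypothetical-secret-1", '
          '"OAUTH_CLIENT_SECRET_2": "hypothetical-secret-2"}'
      )

      creds = json.loads(raw)
      client_id = creds["OAUTH_CLIENT_ID"]
      # Either secret works; the Scalekit dashboard only needs one of them.
      client_secret = creds.get("OAUTH_CLIENT_SECRET_2") or creds.get("OAUTH_CLIENT_SECRET_1")

      print(client_id, client_secret)
      ```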

3. ### Add credentials in Scalekit

    - In [Scalekit dashboard](https://app.scalekit.com), go to **AgentKit** > **Connections** and open the connection you created.

    - Enter your credentials:
      - Client ID (from the SQL output)
      - Client Secret (from the SQL output)

      > Image: Add credentials in Scalekit dashboard
    - Click **Save**.

## Code examples

Connect a user's Snowflake account and make API calls on their behalf — Scalekit handles OAuth and token management automatically.

**Don't worry about your Snowflake account domain in the path.** Scalekit automatically resolves `{{domain}}` from the connected account's configuration. For example, a request with `path="/api/v2/statements"` will be sent to `https://myorg-myaccount.snowflakecomputing.com/api/v2/statements` automatically.

## Proxy API Calls

### Node.js

```typescript
// Assumes the Scalekit Node SDK is installed; the class name below follows
// this guide's usage — check your SDK version's exported name.
import { ScalekitClient } from '@scalekit-sdk/node';

const connectionName = 'snowflake'; // get your connection name from connection configurations
const identifier = 'user_123';  // your unique user identifier

// Get your credentials from app.scalekit.com → Developers → Settings → API Credentials
const scalekit = new ScalekitClient(
  process.env.SCALEKIT_ENV_URL,
  process.env.SCALEKIT_CLIENT_ID,
  process.env.SCALEKIT_CLIENT_SECRET
);
const actions = scalekit.actions;

// Authenticate the user
const { link } = await actions.getAuthorizationLink({
  connectionName,
  identifier,
});
console.log('🔗 Authorize Snowflake:', link);
process.stdout.write('Press Enter after authorizing...');
await new Promise(r => process.stdin.once('data', r));

// Make a request via Scalekit proxy.
// Note: Snowflake's SQL API expects a JSON body with a `statement` field;
// include one via your SDK's request options when executing real SQL.
const result = await actions.request({
  connectionName,
  identifier,
  path: '/api/v2/statements',
  method: 'POST',
});
console.log(result);
```

### Python

```python
import os

import scalekit.client  # Scalekit Python SDK (module path as used below)
from dotenv import load_dotenv

load_dotenv()

connection_name = "snowflake"  # get your connection name from connection configurations
identifier = "user_123"     # your unique user identifier

# Get your credentials from app.scalekit.com → Developers → Settings → API Credentials
scalekit_client = scalekit.client.ScalekitClient(
    client_id=os.getenv("SCALEKIT_CLIENT_ID"),
    client_secret=os.getenv("SCALEKIT_CLIENT_SECRET"),
    env_url=os.getenv("SCALEKIT_ENV_URL"),
)
actions = scalekit_client.actions

# Authenticate the user
link_response = actions.get_authorization_link(
    connection_name=connection_name,
    identifier=identifier
)
# present this link to your user for authorization, or click it yourself for testing
print("🔗 Authorize Snowflake:", link_response.link)
input("Press Enter after authorizing...")

# Make a request via Scalekit proxy.
# Note: Snowflake's SQL API expects a JSON body with a `statement` field;
# include one via your SDK's request options when executing real SQL.
result = actions.request(
    connection_name=connection_name,
    identifier=identifier,
    path="/api/v2/statements",
    method="POST"
)
print(result)
```

## Tool list

Use the exact tool names below when you call `execute_tool`. If you're not sure which name to use, list the tools available for the current user first.

### `snowflake_cancel_query`

Cancel a running Snowflake SQL API statement by statement handle.

Parameters:

- `statement_handle` (`string`, required): Snowflake statement handle to cancel
- `request_id` (`string`, optional): Optional request ID used when the statement was submitted

### `snowflake_execute_query`

Execute one or more SQL statements against Snowflake using the SQL API. Requires a valid Snowflake OAuth2 connection. Use semicolons to submit multiple statements.

Parameters:

- `statement` (`string`, required): SQL statement to execute. Use semicolons to send multiple statements in one request.
- `async` (`boolean`, optional): Execute statement asynchronously and return a statement handle
- `bindings` (`object`, optional): Bind variables object for '?' placeholders in the SQL statement
- `database` (`string`, optional): Database to use when executing the statement
- `nullable` (`boolean`, optional): When false, SQL NULL values are returned as the string "null"
- `parameters` (`object`, optional): Statement-level Snowflake parameters as a JSON object
- `request_id` (`string`, optional): Unique request identifier (UUID) used for idempotent retries
- `retry` (`boolean`, optional): Set true when resubmitting a previously sent request with the same request_id
- `role` (`string`, optional): Role to use when executing the statement
- `schema` (`string`, optional): Schema to use when executing the statement
- `timeout` (`integer`, optional): Maximum number of seconds to wait for statement execution
- `warehouse` (`string`, optional): Warehouse to use when executing the statement
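As an illustration of how these parameters fit together, here is a hypothetical tool input (the warehouse, database, and SQL are invented; the positional `bindings` shape follows Snowflake's SQL API convention for `?` placeholders):

```python
# Hypothetical input for the snowflake_execute_query tool.
tool_input = {
    "statement": "SELECT * FROM orders WHERE region = ? AND amount > ?",
    # Bindings are keyed by placeholder position ("1", "2", ...), each with a
    # Snowflake type and a string value.
    "bindings": {
        "1": {"type": "TEXT", "value": "EMEA"},
        "2": {"type": "FIXED", "value": "1000"},
    },
    "warehouse": "COMPUTE_WH",  # hypothetical warehouse name
    "database": "ANALYTICS",    # hypothetical database name
    "timeout": 60,              # wait at most 60 seconds
    "async": False,             # block until the statement completes
}

# The number of '?' placeholders should match the bindings provided.
assert tool_input["statement"].count("?") == len(tool_input["bindings"])
```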

### `snowflake_get_columns`

Query INFORMATION_SCHEMA.COLUMNS for column metadata.

Parameters:

- `database` (`string`, required): Database name
- `column_name_like` (`string`, optional): Optional column name pattern
- `limit` (`integer`, optional): Maximum rows
- `role` (`string`, optional): Optional role
- `schema` (`string`, optional): Optional schema filter
- `table` (`string`, optional): Optional table filter
- `warehouse` (`string`, optional): Optional warehouse

### `snowflake_get_query_partition`

Get a specific result partition for a Snowflake SQL API statement.

Parameters:

- `partition` (`integer`, required): Partition index to fetch (0-based)
- `statement_handle` (`string`, required): Snowflake statement handle returned by Execute Query
- `request_id` (`string`, optional): Optional request ID used when the statement was submitted

### `snowflake_get_query_status`

Get Snowflake SQL API statement status and first partition result metadata by statement handle.

Parameters:

- `statement_handle` (`string`, required): Snowflake statement handle returned by Execute Query
- `request_id` (`string`, optional): Optional request ID used when the statement was submitted

### `snowflake_get_referential_constraints`

Query INFORMATION_SCHEMA.REFERENTIAL_CONSTRAINTS.

Parameters:

- `database` (`string`, required): Database name
- `limit` (`integer`, optional): Maximum rows
- `role` (`string`, optional): Optional role
- `schema` (`string`, optional): Optional schema filter
- `table` (`string`, optional): Optional table filter
- `warehouse` (`string`, optional): Optional warehouse

### `snowflake_get_schemata`

Query INFORMATION_SCHEMA.SCHEMATA for schema metadata.

Parameters:

- `database` (`string`, required): Database name
- `limit` (`integer`, optional): Maximum rows
- `role` (`string`, optional): Optional role
- `schema_like` (`string`, optional): Optional schema pattern
- `warehouse` (`string`, optional): Optional warehouse

### `snowflake_get_table_constraints`

Query INFORMATION_SCHEMA.TABLE_CONSTRAINTS.

Parameters:

- `database` (`string`, required): Database name
- `constraint_type` (`string`, optional): Optional constraint type filter
- `limit` (`integer`, optional): Maximum rows
- `role` (`string`, optional): Optional role
- `schema` (`string`, optional): Optional schema filter
- `table` (`string`, optional): Optional table filter
- `warehouse` (`string`, optional): Optional warehouse

### `snowflake_get_tables`

Query INFORMATION_SCHEMA.TABLES for table metadata in a Snowflake database.

Parameters:

- `database` (`string`, required): Database name
- `limit` (`integer`, optional): Maximum number of rows
- `role` (`string`, optional): Optional role
- `schema` (`string`, optional): Optional schema filter
- `table_name_like` (`string`, optional): Optional table name pattern
- `warehouse` (`string`, optional): Optional warehouse

### `snowflake_show_databases_schemas`

Run SHOW DATABASES or SHOW SCHEMAS.

Parameters:

- `object_type` (`string`, required): Object type to show
- `database_name` (`string`, optional): Optional database scope for SHOW SCHEMAS
- `like_pattern` (`string`, optional): Optional LIKE pattern
- `role` (`string`, optional): Optional role
- `warehouse` (`string`, optional): Optional warehouse

### `snowflake_show_grants`

Run SHOW GRANTS in common modes (to role, to user, of role, on object).

Parameters:

- `grant_view` (`string`, required): SHOW GRANTS variant
- `object_name` (`string`, optional): Object name for on_object
- `object_type` (`string`, optional): Object type for on_object
- `role` (`string`, optional): Optional execution role
- `role_name` (`string`, optional): Role name (for to_role/of_role)
- `user_name` (`string`, optional): User name (for to_user)
- `warehouse` (`string`, optional): Optional warehouse
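The four `grant_view` modes map onto SHOW GRANTS variants roughly as follows — an illustrative sketch for understanding the parameters, not the connector's actual implementation:

```python
def build_show_grants(grant_view, role_name=None, user_name=None,
                      object_type=None, object_name=None):
    """Illustrative mapping of grant_view modes to SHOW GRANTS SQL."""
    if grant_view == "to_role":
        return f"SHOW GRANTS TO ROLE {role_name}"
    if grant_view == "to_user":
        return f"SHOW GRANTS TO USER {user_name}"
    if grant_view == "of_role":
        return f"SHOW GRANTS OF ROLE {role_name}"
    if grant_view == "on_object":
        return f"SHOW GRANTS ON {object_type} {object_name}"
    raise ValueError(f"unknown grant_view: {grant_view}")

print(build_show_grants("to_role", role_name="ANALYST"))
# → SHOW GRANTS TO ROLE ANALYST
```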

### `snowflake_show_imported_exported_keys`

Run SHOW IMPORTED KEYS or SHOW EXPORTED KEYS for a table. For reliable execution in this environment, use fully-qualified scope (database_name + schema_name + table_name).

Parameters:

- `key_direction` (`string`, required): Which command to run
- `table_name` (`string`, required): Table name (use with schema_name and database_name for fully-qualified scope)
- `database_name` (`string`, optional): Optional database name (recommended with schema_name)
- `role` (`string`, optional): Optional role
- `schema_name` (`string`, optional): Optional schema name (recommended with database_name)
- `warehouse` (`string`, optional): Optional warehouse

### `snowflake_show_primary_keys`

Run SHOW PRIMARY KEYS with optional scope. When using schema_name (or schema_name + table_name), database_name is required for fully-qualified scope.

Parameters:

- `database_name` (`string`, optional): Optional database name for scope (required when schema_name is set)
- `role` (`string`, optional): Optional role
- `schema_name` (`string`, optional): Optional schema name for scope
- `table_name` (`string`, optional): Optional table name for scope
- `warehouse` (`string`, optional): Optional warehouse
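The scoping rule above can be sketched as follows — an illustration of how the optional parameters combine, not the connector's actual code:

```python
def build_show_primary_keys(database_name=None, schema_name=None, table_name=None):
    """Illustrative sketch of SHOW PRIMARY KEYS scoping."""
    # Per the rule above: schema-level scope requires database_name too.
    if schema_name and not database_name:
        raise ValueError("database_name is required when schema_name is set")
    if table_name:
        prefix = f"{database_name}.{schema_name}." if schema_name else ""
        return f"SHOW PRIMARY KEYS IN TABLE {prefix}{table_name}"
    if schema_name:
        return f"SHOW PRIMARY KEYS IN SCHEMA {database_name}.{schema_name}"
    if database_name:
        return f"SHOW PRIMARY KEYS IN DATABASE {database_name}"
    return "SHOW PRIMARY KEYS"  # no scope: current session context
```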

### `snowflake_show_warehouses`

Run SHOW WAREHOUSES.

Parameters:

- `like_pattern` (`string`, optional): Optional LIKE pattern
- `role` (`string`, optional): Optional role
- `warehouse` (`string`, optional): Optional warehouse


---

## More Scalekit documentation

| Resource | What it contains | When to use it |
|----------|-----------------|----------------|
| [/llms.txt](/llms.txt) | Structured index with routing hints per product area | Start here — find which documentation set covers your topic before loading full content |
| [/llms-full.txt](/llms-full.txt) | Complete documentation for all Scalekit products in one file | Use when you need exhaustive context across multiple products or when the topic spans several areas |
| [sitemap-0.xml](https://docs.scalekit.com/sitemap-0.xml) | Full URL list of every documentation page | Use to discover specific page URLs you can fetch for targeted, page-level answers |
