Data Warehouses
Luden supports the following data warehouses (DWH) as destinations: PostgreSQL, ClickHouse, Redshift, Snowflake, BigQuery, and MySQL. This article explains which features are available for each DWH and gives implementation details.
PostgreSQL
Feature | Status | Details |
---|---|---|
Push events (API keys) stream mode | Supported | If destination table has primary key: … |
Push events (API keys) batch mode | Supported | If destination table has primary key, in SQL transaction: 1. create … |
Pull events (Sources) | Supported* | In SQL transaction: 1. Delete previous data for current sync interval: … 2. Insert the freshly pulled data |
User Recognition stream mode | Supported | Primary key on … |
User Recognition batch mode | Supported | Primary key on … |

*JavaScript Transformation currently not applied to pulled data.
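When the destination table has a primary key, the stream-mode write is an upsert keyed on it: replaying the same event updates the stored row instead of creating a duplicate. The original SQL snippet is not reproduced above, so here is a minimal sketch of the pattern using Python's built-in sqlite3, whose `ON CONFLICT … DO UPDATE` clause mirrors PostgreSQL's; the table and column names are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (event_id TEXT PRIMARY KEY, payload TEXT)")

def upsert_event(con, event_id, payload):
    # With a primary key, a re-sent event updates the row in place instead
    # of inserting a duplicate (PostgreSQL uses the same ON CONFLICT syntax).
    con.execute(
        "INSERT INTO events (event_id, payload) VALUES (?, ?) "
        "ON CONFLICT (event_id) DO UPDATE SET payload = excluded.payload",
        (event_id, payload),
    )

upsert_event(con, "e1", "first")
upsert_event(con, "e1", "second")  # same key: row is updated, not duplicated
rows = con.execute("SELECT event_id, payload FROM events").fetchall()
print(rows)  # [('e1', 'second')]
```

Without a primary key there is nothing to conflict on, so the write degrades to a plain `INSERT`.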
ClickHouse
Feature | Status | Details |
---|---|---|
Push events (API keys) stream mode | Supported | |
Push events (API keys) batch mode | Supported | |
Pull events (Sources) | Supported* | 1. Delete previous data for current sync interval: … |
User Recognition stream mode | Supported* | |
User Recognition batch mode | Supported* | |

*JavaScript Transformation currently not applied to pulled data.
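A pull-event sync replaces the data for the interval being synced rather than appending to it: delete what was stored for the interval, then insert the freshly pulled rows. A rough sketch of that pattern with sqlite3 and hypothetical table/column names (ClickHouse itself would delete via mutations such as `ALTER TABLE … DELETE` rather than plain `DELETE`):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE metrics (day TEXT, value INTEGER)")
con.execute("INSERT INTO metrics VALUES ('2024-01-01', 1), ('2024-01-02', 2)")

def resync_interval(con, day, fresh_rows):
    # Replace everything previously stored for the sync interval with the
    # freshly pulled data; re-running a sync therefore never duplicates rows.
    with con:  # wraps both statements in one transaction
        con.execute("DELETE FROM metrics WHERE day = ?", (day,))
        con.executemany("INSERT INTO metrics VALUES (?, ?)", fresh_rows)

resync_interval(con, "2024-01-02", [("2024-01-02", 5), ("2024-01-02", 7)])
rows = con.execute("SELECT * FROM metrics ORDER BY value").fetchall()
print(rows)  # [('2024-01-01', 1), ('2024-01-02', 5), ('2024-01-02', 7)]
```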
Redshift
Feature | Status | Details |
---|---|---|
Push events (API keys) stream mode | Supported | |
Push events (API keys) batch mode | Supported | 1. Upload file with data to S3 2. Copy data from S3 to destination table: … |
Pull events (Sources) | Supported* | If destination table has primary key, in SQL transaction: 1. Delete previous data for current sync interval: … |
User Recognition stream mode | Supported | Primary key on … |
User Recognition batch mode | Supported | Primary key on … |

*JavaScript Transformation currently not applied to pulled data.
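Batch mode stages a file on S3 and then loads it server-side with Redshift's `COPY` command. A sketch that only assembles such a statement; the bucket, table, and IAM role names are hypothetical, and actually running it requires a Redshift connection:

```python
def build_copy_statement(table: str, s3_uri: str, iam_role: str) -> str:
    # Redshift COPY pulls the staged file from S3 on the server side;
    # the format options depend on how the batch file was written.
    return (
        f"COPY {table} "
        f"FROM '{s3_uri}' "
        f"IAM_ROLE '{iam_role}' "
        "FORMAT AS CSV"
    )

stmt = build_copy_statement(
    "events",
    "s3://example-bucket/batch-0001.csv",
    "arn:aws:iam::123456789012:role/redshift-load",
)
print(stmt)
```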
Snowflake
Feature | Status | Details |
---|---|---|
Push events (API keys) stream mode | Supported | |
Push events (API keys) batch mode | Supported | S3: 1. Upload file with data to S3 2. Copy data from S3 to destination table: … |
Pull events (Sources) | Supported* | If destination table has primary key, in SQL transaction: 1. Delete previous data for current sync interval: … |
User Recognition stream mode | Supported | Primary key on … |
User Recognition batch mode | Supported | Primary key on … |

*JavaScript Transformation currently not applied to pulled data.
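User Recognition depends on that primary key: events captured before a visitor signed in can later be rewritten with the identified user. As a rough illustration of the idea only (the actual mechanism and column names are destination-specific and not shown in the table above), here is a sqlite3 sketch that backfills a user id onto earlier anonymous events:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE events (event_id TEXT PRIMARY KEY, anon_id TEXT, user_id TEXT)"
)
con.execute("INSERT INTO events VALUES ('e1', 'a42', NULL), ('e2', 'a42', NULL)")

def recognize(con, anon_id, user_id):
    # Backfill the identified user id onto events captured before sign-in.
    # A primary key is what lets the warehouse rewrite stored rows in place.
    con.execute(
        "UPDATE events SET user_id = ? WHERE anon_id = ? AND user_id IS NULL",
        (user_id, anon_id),
    )

recognize(con, "a42", "user-7")
rows = con.execute(
    "SELECT event_id, user_id FROM events ORDER BY event_id"
).fetchall()
print(rows)  # [('e1', 'user-7'), ('e2', 'user-7')]
```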
BigQuery
Feature | Status | Details |
---|---|---|
Push events (API keys) stream mode | Supported | Insert using the BigQuery API |
Push events (API keys) batch mode | Supported | 1. Upload file with data to GCS 2. Copy data from GCS to destination table using the BigQuery API 3. Delete data from GCS |
Pull events (Sources) | Not supported | |
User Recognition stream mode | Not supported | |
User Recognition batch mode | Not supported | |
MySQL
Feature | Status | Details |
---|---|---|
Push events (API keys) stream mode | Supported | If destination table has primary key: … |
Push events (API keys) batch mode | Supported | If destination table has primary key, in SQL transaction: 1. create … |
Pull events (Sources) | Supported* | In SQL transaction: 1. Delete previous data for current sync interval: … 2. Insert the freshly pulled data |
User Recognition stream mode | Supported | Primary key on … |
User Recognition batch mode | Supported | Primary key on … |

*JavaScript Transformation currently not applied to pulled data.
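MySQL's upsert syntax differs from PostgreSQL's: instead of `ON CONFLICT` it uses `INSERT … ON DUPLICATE KEY UPDATE`. A sketch that only builds the statement (table and column names are hypothetical; executing it needs a MySQL driver such as PyMySQL):

```python
def build_mysql_upsert(table: str, columns: list[str]) -> str:
    # With a primary key present, a re-sent event overwrites the stored row.
    # VALUES(col) refers to the value the INSERT attempted to write.
    cols = ", ".join(columns)
    placeholders = ", ".join(["%s"] * len(columns))
    updates = ", ".join(f"{c} = VALUES({c})" for c in columns if c != "event_id")
    return (
        f"INSERT INTO {table} ({cols}) VALUES ({placeholders}) "
        f"ON DUPLICATE KEY UPDATE {updates}"
    )

stmt = build_mysql_upsert("events", ["event_id", "payload"])
print(stmt)
# INSERT INTO events (event_id, payload) VALUES (%s, %s)
#   ON DUPLICATE KEY UPDATE payload = VALUES(payload)
```

Without a primary key there is no duplicate-key condition to trigger, so batch and stream writes fall back to plain inserts.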