Data Warehouses

Luden supports the following Data Warehouses (DWH) as destinations: PostgreSQL, ClickHouse, Redshift, Snowflake, BigQuery, and MySQL.

This article explains which features are available for each DWH and how each feature is implemented.

PostgreSQL

Push events (API keys) stream mode: Supported

If the destination table has a primary key:
INSERT INTO destination_table(...) VALUES ... ON CONFLICT ON CONSTRAINT primary_key DO UPDATE SET ...

If the destination table has no primary key:
INSERT INTO destination_table(...) VALUES ...
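
A minimal sketch of the upsert path, assuming a hypothetical destination table events with columns eventn_ctx_event_id (primary key, constraint events_pk), _timestamp, and payload; real schemas are derived from the incoming events:

```sql
-- Stream mode: upsert a single event by its primary key.
INSERT INTO events (eventn_ctx_event_id, _timestamp, payload)
VALUES ('evt_1', '2023-01-01 00:00:00', '{"page":"/"}')
ON CONFLICT ON CONSTRAINT events_pk
DO UPDATE SET _timestamp = EXCLUDED._timestamp,
              payload    = EXCLUDED.payload;
```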

Push events (API keys) batch mode: Supported

If the destination table has a primary key, in a SQL transaction:
1. Create tmp_table.
2. Insert multiple rows into tmp_table.
3. Insert from tmp_table: INSERT INTO destination_table(...) SELECT ... FROM tmp_table ON CONFLICT ON CONSTRAINT primary_key DO UPDATE SET ...
4. Delete tmp_table.

If the destination table has no primary key, in a SQL transaction:
1. Insert multiple rows into the destination table.
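
A minimal sketch of that transaction, reusing the hypothetical events table from above:

```sql
BEGIN;

-- 1. Stage the batch in a temporary table with the same shape as the destination.
CREATE TEMP TABLE tmp_events (LIKE events INCLUDING DEFAULTS);

-- 2. Insert the whole batch into the temporary table.
INSERT INTO tmp_events (eventn_ctx_event_id, _timestamp, payload) VALUES
  ('evt_1', '2023-01-01 00:00:00', '{"page":"/"}'),
  ('evt_2', '2023-01-01 00:00:01', '{"page":"/pricing"}');

-- 3. Upsert from the temporary table into the destination.
INSERT INTO events (eventn_ctx_event_id, _timestamp, payload)
SELECT eventn_ctx_event_id, _timestamp, payload FROM tmp_events
ON CONFLICT ON CONSTRAINT events_pk
DO UPDATE SET _timestamp = EXCLUDED._timestamp,
              payload    = EXCLUDED.payload;

-- 4. Drop the temporary table.
DROP TABLE tmp_events;

COMMIT;
```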

Pull events (Sources): Supported*

*JavaScript Transformation is currently not applied to pulled data.

In a SQL transaction:
1. Delete the previous data for the current sync interval: DELETE FROM destination_table WHERE _time_interval=?
2. See Push events (API keys) batch mode.
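
A minimal sketch of the re-sync, reusing the hypothetical events table from above; the interval value is an illustrative placeholder:

```sql
BEGIN;

-- 1. Wipe the interval that is being re-synced.
DELETE FROM events WHERE _time_interval = '2023-01';

-- 2. Reload the interval using the same temporary-table upsert as batch mode
--    (see the batch-mode sketch above).

COMMIT;
```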

User Recognition stream mode: Supported

A primary key on the eventn_ctx_event_id field is required on the destination table.
UPDATE destination_table SET .. WHERE eventn_ctx_event_id=?
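
A minimal sketch of such an update; the user_id and email columns are hypothetical examples of identity fields that get back-filled onto a previously anonymous event:

```sql
-- Back-fill identity fields onto an already stored event.
UPDATE events
SET user_id = 'user_42',
    email   = 'jane@example.com'
WHERE eventn_ctx_event_id = 'evt_1';
```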

User Recognition batch mode: Supported

A primary key on the eventn_ctx_event_id field is required on the destination table.
Same as Push events (API keys) batch mode.

ClickHouse

Push events (API keys) stream mode: Supported

INSERT INTO destination_table (...) VALUES ...

Push events (API keys) batch mode: Supported

INSERT INTO destination_table (...) VALUES ...

Pull events (Sources): Supported*

*JavaScript Transformation is currently not applied to pulled data.

1. Delete the previous data for the current sync interval: ALTER TABLE destination_table DELETE WHERE _time_interval=?
2. See Push events (API keys) batch mode.
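
The delete here is a ClickHouse mutation (ALTER TABLE ... DELETE), which is applied asynchronously. A minimal sketch, reusing the hypothetical events table and columns from the PostgreSQL examples:

```sql
-- 1. Remove the interval being re-synced (asynchronous mutation).
ALTER TABLE events DELETE WHERE _time_interval = '2023-01';

-- 2. Append the fresh rows for the interval with a plain batch insert.
INSERT INTO events (eventn_ctx_event_id, _timestamp, payload) VALUES
  ('evt_1', '2023-01-01 00:00:00', '{"page":"/"}'),
  ('evt_2', '2023-01-01 00:00:01', '{"page":"/pricing"}');
```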

User Recognition stream mode: Supported*

INSERT INTO destination_table (...) VALUES ...

User Recognition batch mode: Supported*

INSERT INTO destination_table (...) VALUES ...

Redshift

Push events (API keys) stream mode: Supported

INSERT INTO destination_table(...) VALUES ...

Push events (API keys) batch mode: Supported

1. Upload a file with the data to S3.
2. Copy the data from S3 into the destination table: copy destination_table from 's3://...'
3. Delete the data from S3.
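
A minimal sketch of the COPY step; the bucket, object key, IAM role, and file format are illustrative placeholders and depend on the destination configuration:

```sql
-- Load a staged batch file from S3 into the destination table.
COPY events
FROM 's3://example-bucket/staging/events-batch-0001.json'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
FORMAT AS JSON 'auto';
```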

Pull events (Sources): Supported*

*JavaScript Transformation is currently not applied to pulled data.

If the destination table has a primary key, in a SQL transaction:
1. Delete the previous data for the current sync interval: DELETE FROM destination_table WHERE _time_interval=?
2. Create tmp_table.
3. Insert multiple rows into tmp_table.
4. Delete duplicated rows from the destination table: DELETE FROM destination_table USING tmp_table WHERE primary_key_columns=?
5. Insert from tmp_table: INSERT INTO destination_table (...) SELECT ... FROM tmp_table
6. Delete tmp_table.

If the destination table has no primary key, in a SQL transaction:
1. Delete the previous data for the current sync interval: DELETE FROM destination_table WHERE _time_interval=?
2. Insert multiple rows into the destination table.
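
A minimal sketch of the primary-key path, reusing the hypothetical events table and columns from the PostgreSQL examples:

```sql
BEGIN;

-- 1. Wipe the interval being re-synced.
DELETE FROM events WHERE _time_interval = '2023-01';

-- 2-3. Stage the pulled rows in a temporary table.
CREATE TEMP TABLE tmp_events (LIKE events);
INSERT INTO tmp_events (eventn_ctx_event_id, _timestamp, payload) VALUES
  ('evt_1', '2023-01-01 00:00:00', '{"page":"/"}');

-- 4. Remove destination rows that collide with the staged batch.
DELETE FROM events USING tmp_events
WHERE events.eventn_ctx_event_id = tmp_events.eventn_ctx_event_id;

-- 5. Append the staged batch.
INSERT INTO events (eventn_ctx_event_id, _timestamp, payload)
SELECT eventn_ctx_event_id, _timestamp, payload FROM tmp_events;

-- 6. Drop the temporary table.
DROP TABLE tmp_events;

COMMIT;
```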

User Recognition stream mode: Supported

A primary key on the eventn_ctx_event_id field is required on the destination table.
UPDATE destination_table SET .. WHERE eventn_ctx_event_id=?

User Recognition batch mode: Supported

A primary key on the eventn_ctx_event_id field is required on the destination table.

In a SQL transaction:
1. Create tmp_table.
2. Insert multiple rows into tmp_table.
3. Delete duplicated rows from the destination table: DELETE FROM destination_table USING tmp_table WHERE eventn_ctx_event_id=?
4. Insert from tmp_table: INSERT INTO destination_table (...) SELECT ... FROM tmp_table
5. Delete tmp_table.

Snowflake

Push events (API keys) stream mode: Supported

INSERT INTO destination_table(...) VALUES ...

Push events (API keys) batch mode: Supported

S3:
1. Upload a file with the data to S3.
2. Copy the data from S3 into the destination table: copy into destination_table (...) from 's3://...'
3. Delete the data from S3.

Google Cloud Storage:
1. Upload a file with the data to GCS.
2. Copy the data from GCS into the destination table: copy into destination_table (...) from @...
3. Delete the data from GCS.
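
A minimal sketch of the two COPY INTO variants; the bucket, stage name, file name, and file format are illustrative placeholders, and credentials or storage-integration options are omitted:

```sql
-- S3 variant: copy from an external location.
COPY INTO events
FROM 's3://example-bucket/staging/events-batch-0001.json'
FILE_FORMAT = (TYPE = 'JSON');

-- GCS variant: copy from a named stage.
COPY INTO events
FROM @example_gcs_stage/events-batch-0001.json
FILE_FORMAT = (TYPE = 'JSON');
```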

Pull events (Sources): Supported*

*JavaScript Transformation is currently not applied to pulled data.

If the destination table has a primary key, in a SQL transaction:
1. Delete the previous data for the current sync interval: DELETE FROM destination_table WHERE _time_interval=?
2. Create tmp_table.
3. Insert multiple rows into tmp_table.
4. Merge into the destination table: MERGE INTO destination_table USING (SELECT ... FROM tmp_table) ON primary_key_columns=? WHEN MATCHED THEN UPDATE SET ... WHEN NOT MATCHED THEN INSERT (...) VALUES (...)
5. Delete tmp_table.

If the destination table has no primary key, in a SQL transaction:
1. Delete the previous data for the current sync interval: DELETE FROM destination_table WHERE _time_interval=?
2. Insert multiple rows into the destination table.
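
A minimal sketch of the MERGE step on the primary-key path, reusing the hypothetical events table and columns from the PostgreSQL examples:

```sql
-- Wipe the interval being re-synced.
DELETE FROM events WHERE _time_interval = '2023-01';

-- Stage the pulled rows in a temporary table.
CREATE TEMPORARY TABLE tmp_events LIKE events;
INSERT INTO tmp_events (eventn_ctx_event_id, _timestamp, payload)
VALUES ('evt_1', '2023-01-01 00:00:00', '{"page":"/"}');

-- Merge the staged rows into the destination on the primary key.
MERGE INTO events USING (SELECT * FROM tmp_events) src
  ON events.eventn_ctx_event_id = src.eventn_ctx_event_id
WHEN MATCHED THEN UPDATE SET _timestamp = src._timestamp, payload = src.payload
WHEN NOT MATCHED THEN INSERT (eventn_ctx_event_id, _timestamp, payload)
  VALUES (src.eventn_ctx_event_id, src._timestamp, src.payload);

-- Drop the temporary table.
DROP TABLE tmp_events;
```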

User Recognition stream mode: Supported

A primary key on the eventn_ctx_event_id field is required on the destination table.
UPDATE destination_table SET .. WHERE eventn_ctx_event_id=?

User Recognition batch mode: Supported

A primary key on the eventn_ctx_event_id field is required on the destination table.

In a SQL transaction:
1. Delete the previous data for the current sync interval: DELETE FROM destination_table WHERE _time_interval=?
2. Create tmp_table.
3. Insert multiple rows into tmp_table.
4. Merge into the destination table: MERGE INTO destination_table USING (SELECT ... FROM tmp_table) ON eventn_ctx_event_id=? WHEN MATCHED THEN UPDATE SET ... WHEN NOT MATCHED THEN INSERT (...) VALUES (...)
5. Delete tmp_table.

BigQuery

Push events (API keys) stream mode: Supported

Insert using the BigQuery API.

Push events (API keys) batch mode: Supported

1. Upload a file with the data to GCS.
2. Copy the data from GCS into the destination table using the BigQuery API.
3. Delete the data from GCS.

Pull events (Sources): Not supported

User Recognition stream mode: Not supported

User Recognition batch mode: Not supported

MySQL

Push events (API keys) stream mode: Supported

If the destination table has a primary key:
INSERT INTO destination_table(...) VALUES ... ON DUPLICATE KEY UPDATE ...

If the destination table has no primary key:
INSERT INTO destination_table(...) VALUES ...
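
A minimal sketch of the upsert, reusing the hypothetical events table and columns from the PostgreSQL examples:

```sql
-- Stream mode: upsert a single event by its primary key.
INSERT INTO events (eventn_ctx_event_id, _timestamp, payload)
VALUES ('evt_1', '2023-01-01 00:00:00', '{"page":"/"}')
ON DUPLICATE KEY UPDATE
  _timestamp = VALUES(_timestamp),
  payload    = VALUES(payload);
```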

Push events (API keys) batch mode: Supported

If the destination table has a primary key, in a SQL transaction:
1. Create tmp_table.
2. Insert multiple rows into tmp_table.
3. Insert from tmp_table: INSERT INTO destination_table(...) SELECT * FROM (SELECT ... FROM tmp_table) AS tmp ON DUPLICATE KEY UPDATE ...
4. Delete tmp_table.

If the destination table has no primary key, in a SQL transaction:
1. Insert multiple rows into the destination table.
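
A minimal sketch of that transaction with the hypothetical events table; the derived table ("AS tmp") lets the ON DUPLICATE KEY UPDATE clause reference the staged values by alias (MySQL 8.0.19+ syntax):

```sql
START TRANSACTION;

-- 1. Stage the batch in a temporary table with the same shape as the destination.
CREATE TEMPORARY TABLE tmp_events LIKE events;

-- 2. Insert the whole batch into the temporary table.
INSERT INTO tmp_events (eventn_ctx_event_id, _timestamp, payload) VALUES
  ('evt_1', '2023-01-01 00:00:00', '{"page":"/"}'),
  ('evt_2', '2023-01-01 00:00:01', '{"page":"/pricing"}');

-- 3. Upsert from the temporary table into the destination.
INSERT INTO events (eventn_ctx_event_id, _timestamp, payload)
SELECT * FROM (SELECT eventn_ctx_event_id, _timestamp, payload FROM tmp_events) AS tmp
ON DUPLICATE KEY UPDATE
  _timestamp = tmp._timestamp,
  payload    = tmp.payload;

-- 4. Drop the temporary table.
DROP TEMPORARY TABLE tmp_events;

COMMIT;
```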

Pull events (Sources): Supported*

*JavaScript Transformation is currently not applied to pulled data.

In a SQL transaction:
1. Delete the previous data for the current sync interval: DELETE FROM destination_table WHERE _time_interval=?
2. See Push events (API keys) batch mode.

User Recognition stream mode: Supported

A primary key on the eventn_ctx_event_id field is required on the destination table.
UPDATE destination_table SET .. WHERE eventn_ctx_event_id=?

User Recognition batch mode: Supported

A primary key on the eventn_ctx_event_id field is required on the destination table.
Same as Push events (API keys) batch mode.
