Redshift destination for batch exports

Batch exports can be used to export data to Redshift, Amazon's data warehouse product.

Creating the batch export

  1. Subscribe to the data pipelines add-on in your billing settings if you haven't already.
  2. In your PostHog instance, click Data pipeline in the sidebar and go to the exports tab.
  3. Click "Create export workflow".
  4. Select Redshift as the batch export type.
  5. Fill in the necessary configuration details.
  6. Finalize the creation by clicking "Create".
  7. Done! The batch export will schedule its first run at the start of the next period.
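
If you prefer to script this rather than click through the UI, batch exports can also be created through PostHog's REST API. The following is a minimal sketch using Python's requests library; the endpoint path follows the batch exports API, but the exact config field names, project ID, and credentials below are illustrative assumptions — confirm them against the API reference before relying on them:

```python
import requests

# Assumptions: a personal API key with batch export scope, your project ID,
# and your PostHog host (e.g. https://us.posthog.com).
POSTHOG_HOST = "https://us.posthog.com"
PROJECT_ID = "12345"   # hypothetical project ID
API_KEY = "phx_..."    # personal API key (don't hard-code this in real use)

payload = {
    "name": "Redshift hourly export",
    "interval": "hour",
    "destination": {
        "type": "Redshift",
        # The keys below mirror the UI configuration fields described later on
        # this page; treat the exact names as assumptions.
        "config": {
            "user": "posthog_exporter",
            "password": "<redshift-password>",
            "host": "my-cluster.abc123.us-east-1.redshift.amazonaws.com",
            "port": 5439,
            "database": "analytics_db",
            "schema": "analytics",
            "table_name": "posthog_events",
        },
    },
}

response = requests.post(
    f"{POSTHOG_HOST}/api/projects/{PROJECT_ID}/batch_exports/",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
)
response.raise_for_status()
print(response.json())
```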

Redshift configuration

Configuring a batch export targeting Redshift requires the following Redshift-specific configuration values:

  • User: A Redshift user name with permission to insert data into the specified table and, if the table does not exist, permission to create it (see the permissions sketch after this list).

  • Password: The password for the Redshift user specified.

  • Host: The endpoint of your Redshift cluster, excluding the port number and database name.

  • Port: The port number on which the Redshift cluster is listening (default is 5439).

  • Database: The name of the Redshift database to which the data is to be exported.

  • Schema: The name of the schema within the database. This determines where the table for exporting data will be located.

  • Table name: The name of the table where the data will be inserted.

  • Properties data type: The data type for the properties, set, and set_once columns: either VARCHAR(65535) or SUPER.

  • Events to exclude: A list of events to omit from the exported data.

  • Events to include: A list of events to include in the exported data. If added, only these events will be exported.
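
As an example of the permissions the export user needs, here is a minimal sketch that creates a dedicated user and grants it INSERT on the target table, plus CREATE on the schema so it can create the table if it doesn't exist yet. It uses psycopg2, which works with Redshift since Redshift speaks the Postgres wire protocol; the host, user, schema, and table names are all placeholders:

```python
import psycopg2

# Connect as an admin user; every name below is a hypothetical placeholder.
conn = psycopg2.connect(
    host="my-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics_db",
    user="admin_user",
    password="<admin-password>",
)
conn.autocommit = True

with conn.cursor() as cur:
    # Create a dedicated export user.
    cur.execute("CREATE USER posthog_exporter PASSWORD %s", ("<a-strong-password>",))
    # Let it reach the schema and create the table if it doesn't exist yet.
    cur.execute("GRANT USAGE, CREATE ON SCHEMA analytics TO posthog_exporter")
    # If the table already exists, INSERT on it is sufficient.
    cur.execute("GRANT INSERT ON TABLE analytics.posthog_events TO posthog_exporter")

conn.close()
```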

Event schema

This is the schema of all the fields that are exported to Redshift.

| Field | Type | Description |
|---|---|---|
| uuid | VARCHAR(200) | The unique ID of the event within PostHog |
| event | VARCHAR(200) | The name of the event that was sent |
| properties | SUPER or VARCHAR(65535) | A JSON object with all the properties sent along with an event |
| elements | VARCHAR(65535) | This field is present for backwards compatibility but has been deprecated |
| set | SUPER or VARCHAR(65535) | A JSON object with any person properties sent with the $set field |
| set_once | SUPER or VARCHAR(65535) | A JSON object with any person properties sent with the $set_once field |
| distinct_id | VARCHAR(200) | The distinct_id of the user who sent the event |
| team_id | INTEGER | The team_id for the event |
| ip | VARCHAR(200) | The IP address that was sent with the event |
| site_url | VARCHAR(200) | This field is present for backwards compatibility but has been deprecated |
| timestamp | TIMESTAMP WITH TIME ZONE | The timestamp associated with an event |
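
As a usage example, here is how you might query the exported events, again via psycopg2. This sketch assumes the hypothetical analytics.posthog_events table from above and a SUPER properties column, which lets Redshift navigate the JSON directly with PartiQL; with VARCHAR(65535) you would extract values with JSON_EXTRACT_PATH_TEXT instead:

```python
import psycopg2

conn = psycopg2.connect(
    host="my-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics_db",
    user="analyst",  # hypothetical read-only user
    password="<password>",
)

# Count pageviews per browser over the last 7 days.
# properties."$browser" is SUPER (PartiQL) navigation, cast to VARCHAR for
# grouping; for a VARCHAR column, use JSON_EXTRACT_PATH_TEXT(properties,
# '$browser') instead.
query = """
    SELECT properties."$browser"::VARCHAR AS browser, COUNT(*) AS pageviews
    FROM analytics.posthog_events
    WHERE event = '$pageview'
      AND "timestamp" > DATEADD(day, -7, GETDATE())
    GROUP BY 1
    ORDER BY 2 DESC
"""

with conn.cursor() as cur:
    cur.execute(query)
    for browser, pageviews in cur.fetchall():
        print(browser, pageviews)

conn.close()
```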
