
Hello! We recently signed up for Customer.io, and I'm working on extracting our journey information into our data warehouse. I've successfully exported the data to S3 and am now working on the next step of the process.

With limited data engineering resources, I was wondering if anyone has set up ingesting this data into Redshift and could share any tips from their setup. I believe I want to use a COPY command, but I'm struggling with where to run it. I have an EC2 instance running other parts of my ETL, so I could add to that, but I'm curious what other setups are out there.
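For context, this is roughly what I had in mind: a small script on the existing EC2 instance that issues the COPY against the cluster. The cluster endpoint, table name, bucket path, IAM role, and file format below are just placeholders, not anything specific to the Customer.io export.

```python
import os
import psycopg2

# Placeholder connection details -- swap in your own cluster endpoint and database.
conn = psycopg2.connect(
    host="my-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="etl_user",
    password=os.environ["REDSHIFT_PASSWORD"],
)

# COPY the exported files from S3 into a staging table.
# Table name, bucket path, IAM role, and format are placeholders --
# adjust the FORMAT clause to match whatever the export actually produces.
copy_sql = """
    COPY staging.customerio_deliveries
    FROM 's3://my-export-bucket/customerio/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
    FORMAT AS PARQUET;
"""

with conn, conn.cursor() as cur:
    cur.execute(copy_sql)

conn.close()
```

The idea would be to drop something like that into the same cron or orchestration that already runs the rest of the ETL on the EC2 instance, but I'd love to hear how others have approached it.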

Hello drosenberg,

Happy to help with this. You could look into using Google Cloud Storage, Amazon S3, or Yandex as your storage bucket. You'll then need to set up your data warehouse to ingest the data from that bucket. We've written a guide on how to set this up: https://customer.io/docs/journeys/amazon-redshift-data-out/.

I hope this helps! 

Cheers,
Jon
