Loading data from S3 to Snowflake

We are building an AI-driven analytics application that targets data scientists and analysts. The application uses Snowflake for data storage, and we give our end users the ability to upload their own data to our warehouse. We want to build a workflow that loads data from S3 into Snowflake, where each file represents one table. The files have no fixed structure (a file can contain any number of columns):

  1. A user uploads a file to an S3 bucket
  2. A simple service loads the file from S3 into Snowflake (a rough sketch of where this service starts is below)
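
For reference, here is a minimal sketch of the front half of that service, assuming boto3 and using hypothetical bucket and key names: it fetches the uploaded CSV and derives the table name from the file name.

```python
# Sketch of step 2's starting point: read an uploaded CSV from S3.
# The bucket and key names below are hypothetical.
import csv
import io
import os

import boto3

s3 = boto3.client("s3")

def read_uploaded_csv(bucket: str, key: str):
    """Fetch a CSV from S3; the file name becomes the table name."""
    obj = s3.get_object(Bucket=bucket, Key=key)
    body = obj["Body"].read().decode("utf-8")
    rows = list(csv.DictReader(io.StringIO(body)))
    # e.g. "uploads/sales_2024.csv" -> "sales_2024"
    table_name = os.path.splitext(os.path.basename(key))[0]
    return table_name, rows

table, rows = read_uploaded_csv("analytics-uploads", "uploads/sales_2024.csv")
print(table, list(rows[0].keys()) if rows else "empty file")
```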

I explored the S3 CSV integration option, but it seems the table name has to be predefined when the integration is created.

Please let me know if I can use the Import API to load a given file into the Snowflake database.

Hey Antony,

If building and maintaining a script to format and POST data from these CSVs is feasible for you or your team, Stitch's Import API integration can be used as a receiving point for JSON or Transit POSTs, which would then be loaded into the destination warehouse connected to your Stitch account.

Your script would likely need to dynamically create an individual table for each file and detect/define each file's structure, as the Import API also needs the target table and its structure defined as part of the POST. In case you haven't already reviewed it, those requirements are outlined in this section of the Import API documentation.
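
To make that concrete, here is a minimal sketch of what such a script might do: infer a JSON Schema from the parsed rows and POST them to the Import API's batch endpoint. The endpoint and payload shape follow the batch endpoint described in the documentation, but please verify the field names and limits there; the API token placeholder, key column, and the crude type-inference rules here are illustrative assumptions.

```python
# Sketch: push one parsed CSV (a list of dicts) to the Import API as
# its own table. Assumes the token is generated in your Stitch account;
# check the Import API docs for the authoritative payload format.
import time

import requests

IMPORT_URL = "https://api.stitchdata.com/v2/import/batch"
API_TOKEN = "<your Import API token>"

def infer_type(value: str) -> str:
    """Crude per-column type detection: integer, number, else string."""
    try:
        int(value)
        return "integer"
    except ValueError:
        pass
    try:
        float(value)
        return "number"
    except ValueError:
        return "string"

def coerce(value: str, json_type: str):
    """Cast the raw CSV string to the inferred JSON Schema type."""
    if json_type == "integer":
        return int(value)
    if json_type == "number":
        return float(value)
    return value

def push_table(table_name: str, rows: list, key_name: str) -> None:
    # Infer the table structure from the first row; each file gets its own table.
    col_types = {col: infer_type(val) for col, val in rows[0].items()}
    schema = {
        "type": "object",
        "properties": {col: {"type": t} for col, t in col_types.items()},
    }
    # Each record becomes an "upsert" message with an increasing sequence number.
    base = int(time.time() * 1000)
    messages = [
        {
            "action": "upsert",
            "sequence": base + i,
            "data": {col: coerce(val, col_types[col]) for col, val in row.items()},
        }
        for i, row in enumerate(rows)
    ]
    resp = requests.post(
        IMPORT_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={
            "table_name": table_name,
            "schema": schema,
            "key_names": [key_name],  # primary key column(s) used for upserts
            "messages": messages,
        },
    )
    resp.raise_for_status()

# e.g. push_table("sales_2024", rows, key_name="id")
```

One thing to keep in mind: the batch endpoint caps the size of a single request, so larger files would need to be split across multiple POSTs; the documentation covers the exact limits.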

If you have more questions about how Stitch's Import API functions, or about whether one of Stitch's existing integrations would be a good fit for this use case, I'd recommend reaching out to Stitch support directly via in-app chat or email.