We are building an AI-driven analytics application targeting data scientists and analysts. The application uses Snowflake for data storage, and we let end users upload their own data into our warehouse. We want to build a workflow that loads data from S3 into Snowflake. Each file will map to one table in Snowflake, and files have no fixed structure (they can contain any number of columns).
- The user uploads a file to an S3 bucket
- A simple service loads it from S3 into Snowflake
I explored the S3 CSV option, but it seems the table name has to be predefined when the integration is created.
Please let me know if I can use the Import API to load a given file into a Snowflake database.
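For context, here is a minimal sketch of the dynamic-table approach I am considering as an alternative: generate the `CREATE TABLE` and `COPY INTO` statements from the CSV header at load time, so no table name needs to be predefined. The stage name `@MY_S3_STAGE` and the all-`VARCHAR` column typing are assumptions, not a working setup; the statements would be executed through `snowflake-connector-python` against a stage created separately.

```python
import csv
import io


def infer_ddl(table_name: str, header_line: str) -> str:
    """Build a CREATE TABLE statement from a CSV header row.

    Every column is typed VARCHAR because the file structure is unknown
    up front; casting to real types can happen downstream.
    """
    columns = next(csv.reader(io.StringIO(header_line)))
    col_defs = ", ".join(f'"{c.strip().upper()}" VARCHAR' for c in columns)
    return f'CREATE TABLE IF NOT EXISTS "{table_name.upper()}" ({col_defs})'


def copy_statement(table_name: str, staged_file: str) -> str:
    """Build a COPY INTO statement for one staged file.

    `staged_file` is a path on a hypothetical external stage
    (e.g. "@MY_S3_STAGE/orders.csv") pointing at the S3 bucket.
    """
    return (
        f'COPY INTO "{table_name.upper()}" FROM {staged_file} '
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 "
        "FIELD_OPTIONALLY_ENCLOSED_BY = '\"')"
    )


# Example: one uploaded file becomes one table.
print(infer_ddl("orders", "id,amount,created_at"))
print(copy_statement("orders", "@MY_S3_STAGE/orders.csv"))
```

The service would read only the first line of the S3 object to build the DDL, run both statements with a Snowflake cursor, and derive the table name from the file name.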