Configure s3 in the containerized environment for file upload e2e tests #457
Conversation
@GermanSaracca @ErykKul @ekraffmiller Thanks to Eryk's help, I've managed to set S3 as the default and only storage option, so the intermediate step of creating a dataverse with S3 storage is no longer needed. 🎉
What this PR does / why we need it:
Adds a remote S3 storage option using a bucket hosted on AWS.
Initially, I evaluated using LocalStack, as in js-dataverse, to avoid AWS and keep everything containerized. After running into issues and limitations with that approach, I decided to use remote S3 storage instead.
Which issue(s) this PR closes:
Special notes for your reviewer:
We now need to create a .env file under the dev-env folder by copying the .env.example file and filling in the AWS credentials. I'll be happy to share them; just ping me on Slack. @ekraffmiller @ErykKul @GermanSaracca
The AWS secrets are already added to the repository for the GitHub actions to operate.
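For anyone setting this up locally, the steps above might look like the following. The dev-env folder and .env.example file are from this PR; the variable names in the comments are illustrative assumptions, so check .env.example for the real keys:

```shell
# From the repository root: copy the example env file
cp dev-env/.env.example dev-env/.env

# Then edit dev-env/.env and fill in the AWS credentials,
# e.g. (names are assumptions, see .env.example):
# S3_ACCESS_KEY=<your access key>
# S3_SECRET_KEY=<your secret key>
```

The .env file should stay untracked so the credentials never land in the repository.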
Suggestions on how to test this:
Run the Docker containers locally and create a dataset. Next, verify that direct upload works in the JSF UI.
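A minimal local test run might look like this sketch, assuming the containerized environment is started with Docker Compose from the dev-env folder (the exact compose setup and service names are assumptions):

```shell
cd dev-env
docker compose up -d     # start the containerized environment

# In the browser: create a dataset, then upload a file via the
# JSF UI and confirm the direct-upload request hits the S3 bucket.

docker compose logs -f   # watch container logs while testing
```

If the upload succeeds and the object appears in the AWS bucket, the S3 configuration is working end to end.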