[CRITICAL] [lib-storage] Uploading in Safari is in a non-working state #2365
Comments
Just had a thought: The v2 AWS SDK didn't have this issue. So I'm not sure what they're doing differently.
UPDATE (related issue): Uploading using this library also breaks in Safari 14.1 (when using the Upload class). Here's a reproduction of this upload issue: https://github.com/ffxsam/repro-safari-upload. Clone it, run yarn, then start the dev server.
@AllanZhengYP I think this could be considered critical. Anyone using the v3 SDK with Safari is going to run into this issue, I would imagine.
This is becoming more of an issue with our customers. I'd definitely appreciate some help on this! @trivikr @AllanZhengYP @jeskew
@ffxsam apologies for the late reply, I was able to reproduce this previously. Need to investigate more to find an appropriate solution. You are right that the missing Blob.prototype.stream() in Safari is the original cause of the issue.
@ajredniwja Thanks for the response! It looks like Safari 14.1 implements Blob.prototype.stream(). For now, we've added the old aws-sdk v2 back in to use its upload implementation instead.
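The exact v2 fallback isn't shown in the thread; a minimal sketch of that kind of workaround, assuming the aws-sdk v2 package and already-configured credentials, might look like this:

```js
// Sketch only: the v2 SDK's managed uploader handles multipart uploads
// without relying on Blob.prototype.stream().
import AWS from "aws-sdk";

const s3 = new AWS.S3({ region: "us-west-2" }); // placeholder region/credentials

export async function uploadWithV2(file, Bucket, Key) {
  // s3.upload() accepts a Blob/File directly and splits it into parts internally.
  const result = await s3
    .upload({ Bucket, Key, Body: file, ContentType: file.type })
    .promise();
  return result.Location;
}
```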
It's unfortunate that you have to downgrade as a workaround; I will discuss it with the team and see what a potential solution could be for this.
Thanks, Ajwinder!
Hey @ffxsam, I was able to run some tests. #2183 (comment) used this code, which didn't work in the previous version but works now.
@ajredniwja I don't think that solution will work for me: const blob = await response.blob(); This looks like it loads the entire blob into memory. We have customers uploading multiple files that could range from 20MB to 100MB. I was already reading the whole file into memory as a workaround, and it bombed some users' browsers. We really do need streaming to work like it did with the v2 SDK.
I understand, let me try and work with the example you provided.
@ffxsam, getting a CORS error for some reason trying to work with the code you provided. I was going to look into a polyfill for previous versions.
@ajredniwja Very odd, not sure why you'd get a CORS error. Since the code isn't doing anything that would require any sort of CORS setup, the only thing I can think of is that your destination S3 bucket doesn't have the proper CORS permissions.
I was suspecting the same, @ffxsam, but I use the same bucket for my React testing. I didn't dig deep, but will definitely try to come up with a similar Vue example, since the React code worked in a later Safari version.
@ajredniwja Hey Ajwinder, just wanted to check in on this and see if there's been any progress. I'm OK with using the older v2 SDK for now.
Hey @ffxsam, apologies for the late reply, I wasn't working for a while, but I tried instantiating a new app and was able to successfully upload to S3 without any problems. I used the Vue CLI to instantiate the app. My package.json was copied from the React project.
Code with function to upload:

import { fromCognitoIdentityPool } from "@aws-sdk/credential-provider-cognito-identity";
import { CognitoIdentityClient } from "@aws-sdk/client-cognito-identity";
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";
const idPool = "us-west-2khjsdakj";
const Bucket = 'aj-teest';
const region = 'us-west-2';
export const s3Test = async () => {
// debugger;
const creds = fromCognitoIdentityPool({
client: new CognitoIdentityClient({ region }),
identityPoolId: idPool,
});
const client = new S3Client({
region,
credentials: creds
});
const Key = `${Date.now()}-new-key`;
let upload = new Upload({
client,
tags: [{
Key: "my-tag-key",
Value: "my-tag-value"
}],
params: { Key, Bucket,
Body: "hello world!"
}
});
upload.on("httpUploadProgress", (progress) => {
console.log(progress);
});
const uploadResult = await upload.done();
// debugger;
console.log("done uploading", uploadResult);
}
@ajredniwja As I mentioned before, this is an issue with larger files. Create a 50MB file and use that as your upload Body. I tested just now in Safari, and the page still bombs with a message at the top: "This webpage was reloaded because a problem occurred." You can use the repro I linked to a while back to test: https://github.com/ffxsam/repro-safari-upload/tree/main/src
@ffxsam, I used your repo; Safari had some caching issue where it wouldn't pull the latest packages, but that's another issue. I saw a series of issues, but the weird part is that it was only with the Vue.js framework. I can look into what polyfills they use, as they mention they apply them automatically using Babel.
@ajredniwja Wow, good find! Thank you for taking so much time to help with this. I'm going to file an issue with Vue and link to this issue; hopefully we can discover a solution.
@ffxsam I have the same issue with Safari crashing and reloading the web page. We are, however, using a React app.
@ludwighen Ah! Do you think you could put together a small reproduction repo?
Also weird: Files smaller than 500KB work fine. Larger files fail.
@ajredniwja Seems like this issue could be due to a conflict between the AWS SDK and some other package, maybe? @ludwighen Could you paste your package.json here so we can see your dependencies?
The level of attention this issue is getting is kind of disappointing. It's been months now, and this SDK still causes Safari to bomb when large files are uploaded.
@ffxsam, the team is actively looking into it; I will have an update for you soon.
@ajredniwja Any updates? This has been an issue since May. How large is the team that maintains this SDK? This feature is in an unusable state, and it's frustrating to see this issue not getting the priority I think it deserves. @ludwighen Just pinging you again. If you share your package.json file, it would help us nail down this issue.
Any updates on this one? We are facing the same issue.
Pinging @trivikr to take a look.
I'm running into issues in Safari as well, though my main problem is with a different part of the upload flow. For anyone interested, I basically reverted to using the v2 SDK for uploads.
This issue has a "workaround-available" tag. Is the workaround to downgrade to the v2 SDK, or something else?
I have the same problem on Safari when uploading larger files. I am using multipart upload on the front end. Is there a solution for aws-sdk v3?
Following this ...
I have contacted AWS support over this issue. There were some promises that they would fix this, but there has been radio silence for two months now, and the support request was closed without resolution. So make of that what you will. My solution was to take the lib-storage source code and modify the problematic portion to behave the same as in v2. patch-package would probably also work, but since it is a relatively simple library, I decided against that to keep the setup simple.
FYI, this shouldn't be an issue at this point. I'd recommend people use the Amplify JS client for uploads.
Describe the bug
When uploading from Safari directly to S3 (credentials supplied via Cognito), I get this error:
Body Data is unsupported format, expected data to be one of: string | Uint8Array | Buffer | Readable | ReadableStream | Blob
I'm passing a File object like so:
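The original snippet isn't reproduced above; a minimal sketch of the call, assuming the File comes from an <input type="file"> element and an S3Client already configured with Cognito credentials, would be:

```js
// Minimal sketch (not the original snippet): pass the File straight to
// lib-storage's Upload as the Body.
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

const client = new S3Client({ region: "us-west-2" }); // placeholder region/credentials

async function uploadFile(file) { // file: a File from an <input type="file">
  const upload = new Upload({
    client,
    params: {
      Bucket: "my-bucket", // placeholder bucket
      Key: file.name,
      Body: file, // a File (i.e. a Blob); fails on Safari <= 14.0
    },
  });
  return upload.done();
}
```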
Your environment

SDK version number
@aws-sdk/[email protected]
Is the issue in the browser/Node.js/ReactNative?
Browser
Details of the browser/Node.js/ReactNative version
Safari 14.0.2 and under
Steps to reproduce
(see code snippet above)
Observed behavior
Uploading a file using the Upload class fails with the error:

Body Data is unsupported format, expected data to be one of: string | Uint8Array | Buffer | Readable | ReadableStream | Blob
Expected behavior
I would expect Webkit/Safari to follow W3C standards better. 😉 Also, uploads worked totally fine in SDK v2.
Additional context
Pour yourself a coffee and pull up a chair...
So, lib-storage is checking for the existence of data.stream as a function, which is totally legit: https://github.com/aws/aws-sdk-js-v3/blob/main/lib/lib-storage/src/chunker.ts#L19-L21

File, inheriting from Blob, should implement that. Unfortunately, in Safari, Blob.prototype.stream() does not exist, even though this is a W3C standard drafted way back in May 2019. This wasn't supported in Safari until 14.1.

I'm working around this with the following code, but it's not ideal, as it creates a slight lag for the user while very large files (potentially 70-150MB) are converted to a Uint8Array before being passed to the Upload constructor:
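The exact workaround code isn't preserved above; a sketch of the approach it describes, reading the File fully into memory before handing it to Upload, with a FileReader fallback for browsers that also lack Blob.prototype.arrayBuffer(), might be:

```js
// Sketch of the "read the whole file into memory" workaround described above.
// Not ideal: a 70-150MB file is buffered entirely before the upload starts.
async function fileToUint8Array(file) {
  if (typeof file.arrayBuffer === "function") {
    return new Uint8Array(await file.arrayBuffer());
  }
  // Older Safari: fall back to FileReader.
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onload = () => resolve(new Uint8Array(reader.result));
    reader.onerror = () => reject(reader.error);
    reader.readAsArrayBuffer(file);
  });
}

// Usage:
//   const body = await fileToUint8Array(file);
//   new Upload({ client, params: { Bucket, Key, Body: body } });
```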
I don't think there's any ideal workaround at this point. But maybe if this library could handle this scenario instead of putting that on the developer (so they don't wind up wasting time like I did), that would be helpful. Maybe the getChunk function could check for data being an instance of Blob but with data.stream returning undefined, and if so, call data.arrayBuffer() to parse the data and return it.

If there's a better way I can handle this, I'm open to hearing about it! But if Safari doesn't support Blob streaming, it seems like reading the entire file into memory is the only alternative at this point.