Bad x-amz-content-sha256 on large upload #12
I'm not sure off the top of my head; I'll try to take a look when I get some time.
I will set up a reproducible example if I succeed, so we can work on it.
Here is a reproducible example on an R2 bucket. It does not work with the special filename, and it still fails if I rename the file to a basic filename, but it does work if I resize the file to something much smaller.

```typescript
import "https://deno.land/[email protected]/dotenv/load.ts";
import { S3Client } from "https://raw.githubusercontent.com/nestarz/deno-s3-lite-client/patch-1/mod.ts";

const s3 = new S3Client({
  accessKey: Deno.env.get("AWS_ACCESS_KEY_ID")!,
  secretKey: Deno.env.get("AWS_SECRET_ACCESS_KEY")!,
  endPoint: Deno.env.get("S3_ENDPOINT_URL")!,
  region: Deno.env.get("AWS_BUCKET_REGION")!,
  bucket: Deno.env.get("DEFAULT_BUCKET")!,
  useSSL: true,
  pathStyle: true,
});

const key = "remixicon & ?_- (1).woff2"; // encodeURI not working either
const size = 1 * 64 * 1024 * 1024 + 100;
const value = new Blob([
  new Uint8Array(size).map(() => Math.floor(255 * Math.random())),
  new Uint8Array(size).map(() => Math.floor(255 * Math.random())),
]).stream();
console.log("Stream exists");
await s3.putObject(key, value, {
  size: size * 2,
  partSize: 64 * 1024 * 1024,
  metadata: { "Content-Type": "application/octet-stream" },
});
console.log("OK");
```

`Stream exists` is logged and then I get:
I made two PRs that should resolve the issue, but they need review before approval. The payload sha256 fix is a hack, and you may be able to find the root cause based on it?
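For context, one common cause of a bad `x-amz-content-sha256` on multipart uploads is hashing a byte range that does not match the bytes actually sent: `ReadableStream` chunks rarely align with `partSize`, so each part must be re-buffered to exactly `partSize` bytes before its SHA-256 is computed and signed. The helper below is a hypothetical sketch of such re-chunking, not this library's actual code, and the mismatch theory is an assumption about the bug:

```typescript
// Hypothetical sketch: re-chunk an arbitrary ReadableStream<Uint8Array> into
// parts of exactly `partSize` bytes (the final part may be shorter). Each
// yielded part is then a stable byte range whose SHA-256 can be signed as
// x-amz-content-sha256 without depending on the source stream's chunk sizes.
async function* rechunk(
  stream: ReadableStream<Uint8Array>,
  partSize: number,
): AsyncGenerator<Uint8Array> {
  const reader = stream.getReader();
  const buffer = new Uint8Array(partSize);
  let filled = 0; // bytes currently buffered toward the next part
  while (true) {
    const { done, value } = await reader.read();
    if (done || value === undefined) break;
    let offset = 0;
    while (offset < value.length) {
      // Copy as much of this chunk as fits into the current part.
      const take = Math.min(partSize - filled, value.length - offset);
      buffer.set(value.subarray(offset, offset + take), filled);
      filled += take;
      offset += take;
      if (filled === partSize) {
        yield buffer.slice(); // emit a full part of exactly partSize bytes
        filled = 0;
      }
    }
  }
  if (filled > 0) yield buffer.slice(0, filled); // trailing partial part
}
```

If the client instead hashes whatever chunk the stream happened to deliver, the hash covers different bytes than the part body, and the server rejects the signature.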
Hello! Thanks for your work!!
Do you have any idea why I get this issue on large files (> 100 MB)? I tried multiple `partSize` values, and I have set the `size` field to the exact content length of the uploaded file. I am also streaming the file using a `ReadableStream` (which works fine on small files).
On a side note, I also have issues with filenames that contain spaces or characters like `()`; if I slugify them, it works well.
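On the filename point: AWS Signature Version 4 requires the object key in the canonical URI to be percent-encoded per RFC 3986, where a space becomes `%20` (not `+`) and characters such as `!`, `'`, `(`, `)`, and `*` must also be escaped, even though JavaScript's `encodeURIComponent` leaves them bare. A sketch of such an encoder (illustrative; `encodeS3Key` is a made-up name, not necessarily how this library handles keys):

```typescript
// Percent-encode an S3 object key for the SigV4 canonical URI.
// Each path segment is encoded separately so "/" separators survive;
// the sub-delims that encodeURIComponent leaves bare are escaped too,
// with uppercase hex as SigV4 expects.
function encodeS3Key(key: string): string {
  return key
    .split("/")
    .map((segment) =>
      encodeURIComponent(segment).replace(
        /[!'()*]/g,
        (c) => "%" + c.charCodeAt(0).toString(16).toUpperCase(),
      )
    )
    .join("/");
}
```

With this, the key from the repro above would be signed as `remixicon%20%26%20%3F_-%20%281%29.woff2`, which would explain why slugified names (no spaces or parentheses) upload fine.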