[BUG] [azure-storage-blob] Timeout when uploading large files to a block blob #5221
Comments
@cdraeger Thank you for reporting this. Can you provide a bit more information to help us get started debugging? Are you able to see network traffic at all with something like Fiddler, or is it stalling before it hits the network? (I'm seeing a proxy buried in the middle of your stack trace that could be interfering.) Also, are you able to upload small files at all, and this only manifests past a certain size, or does it always trigger this exception? Is there any way you can test this outside of Spring as well, to make sure nothing there is interfering? I'm able to upload a 1 GB file in about 18s over here, which if nothing else is better than timing out after 60s.
@rickle-msft Thanks for responding quickly. I indeed see network activity:
I just tried again with a
A handshake warning in between that I sometimes see, though it didn't fail the upload yet:
I tried this once more: the upload also failed after ~3 minutes, but this time the SSL handshake timeout was the actual cause:
So I'm really not sure what I can do; it looks like something is wrong in the underlying network stack of the SDK, but I don't know how to tackle or configure it properly. I also tried uploading an … Also, as said, I don't see much network activity when writing to the output stream of a blob directly via BlockBlobClient#getBlobOutputStream(). Thanks!
@rickle-msft I have now tried isolating my code from Spring and set up a bare Gradle test project (see the sample code below). With this setup, uploading a 1 MB file works fine. Uploading a 250 MB file (or anything larger) reproducibly freezes and literally crashes my Mac. My terminal cached some of the stack trace, though; I had enabled debug logging. What I see is possibly similar: some errors and handshake timeouts in between. I never get to a final exception because the Mac freezes first (with or without uploading in a separate thread).

Stacktrace
Code snippet
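A minimal sketch of a standalone test along these lines, assuming the GA 12.x builder API; the connection string, container name, and file paths are placeholders:

```java
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobServiceClient;
import com.azure.storage.blob.BlobServiceClientBuilder;

public class UploadTest {
    public static void main(String[] args) {
        // Placeholder connection string; a real Storage account key is required.
        String connectionString = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...";

        BlobServiceClient serviceClient = new BlobServiceClientBuilder()
                .connectionString(connectionString)
                .buildClient();

        BlobClient blobClient = serviceClient
                .getBlobContainerClient("test-container")   // placeholder container
                .getBlobClient("large-file.bin");           // placeholder blob name

        // Upload a large local file; this is the call that reportedly
        // times out or hangs for files in the hundreds of megabytes.
        blobClient.uploadFromFile("/tmp/large-file.bin", true);
    }
}
```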
@rickle-msft Basically using the async client directly and setting the block size to the max of 100 MB (the regular client does not offer this option). Same error, but this results in a much cleaner stack trace without SSL handshake errors/warnings. Also, the exception comes a bit later, after 4m15s, as you can see in the stack trace below. Overall, unfortunately, the SDK seems barely usable to me currently. :(
Thank you for the follow-up and for your thorough investigation. I'm sure this information will prove helpful in our efforts to diagnose the problem. @jianghaolu can you take a look at what might be causing these timeouts? As a side note, the sync client should let you configure the block size. I have created #5234 to track this.
@cdraeger While we continue to investigate this over here, can you please try setting the retryOptions on your client (through the builder) with an arbitrarily large tryTimeout value, to see what happens in the network stack if we get the …
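For illustration, a sketch of what that builder configuration might look like, assuming the GA 12.x API; the one-day tryTimeout is an arbitrary placeholder:

```java
import com.azure.storage.blob.BlobServiceClient;
import com.azure.storage.blob.BlobServiceClientBuilder;
import com.azure.storage.common.policy.RequestRetryOptions;
import com.azure.storage.common.policy.RetryPolicyType;

public class LargeUploadClientFactory {
    // Builds a client whose per-try timeout is effectively unlimited.
    static BlobServiceClient buildClient(String connectionString) {
        // tryTimeout (3rd argument) is the per-try timeout in seconds;
        // an arbitrarily large value sidesteps the 60s default discussed here.
        RequestRetryOptions retryOptions = new RequestRetryOptions(
                RetryPolicyType.EXPONENTIAL, // retry policy
                4,                           // maxTries
                24 * 60 * 60,                // tryTimeout: one day, in seconds
                null, null,                  // default retry delays
                null);                       // no secondary host

        return new BlobServiceClientBuilder()
                .connectionString(connectionString)
                .retryOptions(retryOptions)
                .buildClient();
    }
}
```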
Hi @rickle-msft, thanks, this advice seems to have actually solved it entirely. I was now able to upload 2 GB and 4 GB without errors, and I assume that with the right value I will also be able to upload 16 GB or more (not yet tried). Reading the JavaDoc of the retry options class, it becomes clear; unfortunately I was not aware of this at all. So in the end maybe this is just a documentation issue? Since I assume many people will upload large files, I think this needs to be prominent in the first basic examples.

I also tried again with the default block size (I believe 4 MB) when uploading a 1 GB file. With the …

Just one more general question: our upload sizes will vary between just a couple of megabytes and ~16 GB. Is it fine to set the block size to the max for all uploads, or should I use, say, the max block size for large files only, and the default block size for all files smaller than the max block size?

I am still having issues uploading via streams, but I will open a separate ticket for this. Thanks again!
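On the block-size question, one consideration: a block blob allows at most 50,000 committed blocks, so the block size only needs to be large enough that the file fits within that limit; beyond that, larger blocks mainly trade memory per request for fewer round trips. A sketch of that reasoning (the 100 MB cap is the per-block maximum discussed in this thread):

```java
class BlockSizes {
    // Smallest block size that keeps the block count under the
    // 50,000-committed-block limit, clamped to the service bounds.
    static long chooseBlockSize(long fileSizeBytes) {
        final long MAX_BLOCKS = 50_000L;                // committed-block limit per blob
        final long MIN_BLOCK  = 4L * 1024 * 1024;       // 4 MB default block size
        final long MAX_BLOCK  = 100L * 1024 * 1024;     // 100 MB per-block cap (at the time)

        long required = (fileSizeBytes + MAX_BLOCKS - 1) / MAX_BLOCKS; // ceiling division
        return Math.min(MAX_BLOCK, Math.max(MIN_BLOCK, required));
    }
}
```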
@cdraeger Happy to hear it! I think our path forward on this will actually be to remove the default timeout. We didn't have it in the past, and we were experimenting with adding it as a safeguard against hanging, but it seems that environments and workflows are too variable to effectively provide a default, not to mention it's not easy for customers to figure out what went wrong when they hit this error and don't realize there's a default timeout. I'll leave this issue open as the tracking item to remove the default (though it will definitely still be possible to set a timeout if desired). Thank you for opening a different issue for the streams; I'll respond over there now.
@alzimmermsft, you were thinking this would require the removal of a default timeout parameter that wasn't there in v11.
@kurtzeborn @alzimmermsft Confirming that, based on the customer's experience, it looks like removing that default should fix this.
We are also facing the same timeout issue for large file uploads (>500 MB) with blob storage version 12.10.0. @cdraeger, as per your comment below:
I also tried the tryTimeout default value (which is Integer.MAX_VALUE) as well as a customized value, as per the code below, but neither worked; it still throws a channel response timed out error and the large file upload fails. Correct me if anything is missing in the code snippet. Please help us sort out this issue.
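For concreteness, a sketch of the two variants described above; either would be passed to the builder's retryOptions as in the earlier example (the class name and the two-hour value are illustrative):

```java
import com.azure.storage.common.policy.RequestRetryOptions;
import com.azure.storage.common.policy.RetryPolicyType;

class RetryOptionsVariants {
    // Variant 1: defaults; per the comment above, tryTimeout defaults to
    // Integer.MAX_VALUE seconds in this version.
    static final RequestRetryOptions DEFAULT_OPTIONS = new RequestRetryOptions();

    // Variant 2: explicit per-try timeout, here two hours (in seconds).
    static final RequestRetryOptions CUSTOM_OPTIONS = new RequestRetryOptions(
            RetryPolicyType.EXPONENTIAL, 4, 2 * 60 * 60, null, null, null);
}
```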
@harisingh-highq Are you using https? If so, can you try with http just as an experiment to see what you get?
Hi @harisingh-highq
Describe the bug
I am currently trying to use the Storage SDK 12 (azure-storage-blob) to upload very large files to Blob Storage with a BlockBlobClient instance, but I am running into timeouts quickly.

File size when testing: ~1 GB, but file sizes can easily be ~10-16 GB, which may take quite some time to upload. Downloading files of the same size should of course also work, but I didn't get to try that yet because the upload already fails.

I uploaded using BlockBlobClient#uploadFromFile(filePath). I also tried setting a high timeout via this method. I also tried writing to the output stream of a blob directly via BlockBlobClient#getBlobOutputStream(), but for some reason this was very slow, and the resulting blob size was totally off when I tried with much smaller files of just a few megabytes.

Exception or Stack Trace
I'm seeing first a couple of these:

And in the end this:

I also saw this when I played around:
To Reproduce
BlockBlobClient#uploadFromFile(filePath)
or BlockBlobClient#getBlobOutputStream()
Code Snippet
Also, this is not working; it results in very slow upload activity or a completely wrong resulting file size:
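A sketch of the output-stream path being described, assuming a BlockBlobClient obtained from the client builder; the buffer size and file path are placeholders:

```java
import com.azure.storage.blob.specialized.BlobOutputStream;
import com.azure.storage.blob.specialized.BlockBlobClient;

import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

class OutputStreamUpload {
    // Streams a local file into the blob's output stream; closing the
    // stream (here via try-with-resources) commits the uploaded blocks.
    static void upload(BlockBlobClient blockBlobClient, String path) throws IOException {
        try (InputStream in = new FileInputStream(path);
             BlobOutputStream out = blockBlobClient.getBlobOutputStream()) {
            byte[] buffer = new byte[8192]; // placeholder buffer size
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
    }
}
```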
Expected behavior
I expect the upload of large files to complete successfully, even if this means upload times of one hour or more.
I would also expect writing to the OutputStream of a block blob to work as fast as regular file upload, resulting in the correct blob size depending on the final length of the input stream.
Screenshots
If applicable, add screenshots to help explain your problem.
Setup (please complete the following information):
I'd appreciate any help or comments, thanks!