OutOfMemoryError #12
Comments
Could you provide code that reproduces this error, along with information on the length of the file you're attempting to upload? Also, I see you responded with this trace on another issue, but from this post it looks like this is a separate issue. Would you mind if I delete the other post?
The length of the file I am attempting to upload is 45 MB.
I guess using uploadBlock(final String blockId, final InputStream sourceStream, final long length) may resolve this issue, but I do not know how to use this method. Can you give me an example? Thank you!
I'd love to try to figure out with you what's going wrong with uploadFile, but I need the code you're using as well as information on the device to be able to debug further. Could you provide this? My initial guess is that your device may have relatively low memory, or other apps are using a lot of memory. This might cause BufferedInputStream to not be able to allocate the size of the buffer we are requesting. You can reduce the buffer size by setting setStreamWriteSizeInBytes on the CloudBlockBlob. Note that reducing this will cause more, but smaller, calls to the storage service. Again, this is just a guess on my part, but it might be worth a try.

uploadBlock requires somewhat more knowledge of the storage service to use than the other upload methods. It is what we use under the covers in uploadFromFile. The basic idea with uploadBlock is that you upload your data in chunks (blocks) and then commit the list of blocks using commitBlockList. As uploadBlock and commitBlockList are simple wrappers around basic REST calls, you can see the REST documentation for Put Block and Put Block List for more information.
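For illustration, here is a minimal sketch of that block-by-block pattern. The container name, blob name, helper class, and 4 MB block size are placeholders chosen for the example, not anything prescribed by the library:

```java
import android.util.Base64;

import com.microsoft.azure.storage.CloudStorageAccount;
import com.microsoft.azure.storage.blob.BlockEntry;
import com.microsoft.azure.storage.blob.CloudBlobContainer;
import com.microsoft.azure.storage.blob.CloudBlockBlob;

import java.io.ByteArrayInputStream;
import java.io.FileInputStream;
import java.util.ArrayList;
import java.util.List;

public class BlockUploadSketch {
    // Uploads a local file in chunks with uploadBlock, then commits the block list.
    public static void uploadInBlocks(String connectionString, String filePath) throws Exception {
        CloudStorageAccount account = CloudStorageAccount.parse(connectionString);
        CloudBlobContainer container =
                account.createCloudBlobClient().getContainerReference("mycontainer");
        CloudBlockBlob blob = container.getBlockBlobReference("myblob");

        final int blockSize = 4 * 1024 * 1024; // 4 MB per block; lower this on low-memory devices
        byte[] buffer = new byte[blockSize];
        List<BlockEntry> blockList = new ArrayList<BlockEntry>();

        FileInputStream in = new FileInputStream(filePath);
        try {
            int blockNum = 0;
            int read;
            while ((read = in.read(buffer)) != -1) {
                // Block IDs must be Base64-encoded and all the same length within a blob.
                String blockId = Base64.encodeToString(
                        String.format("block-%08d", blockNum).getBytes("UTF-8"), Base64.NO_WRAP);
                blob.uploadBlock(blockId, new ByteArrayInputStream(buffer, 0, read), read);
                blockList.add(new BlockEntry(blockId));
                blockNum++;
            }
        } finally {
            in.close();
        }
        // The uploaded blocks only become part of the blob once the list is committed.
        blob.commitBlockList(blockList);
    }
}
```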
I can see you call AzureStorage.uploadFile, but this is not actually our API - you must have wrapped our code. Could you please provide a minimal code example that reproduces the problem? Did the suggestion I gave above (setStreamWriteSizeInBytes) work for you?
I ran into a similar issue with OutOfMemoryError, and the solution of setting setStreamWriteSizeInBytes on the CloudBlockBlob worked for me. In my code I'm using CloudBlockBlob.openOutputStream to retrieve the output stream and then copying data to it using the simple write(byte[] buffer, int offset, int count) method. Here is the stack trace of my exception before setting setStreamWriteSizeInBytes:
Glad to get confirmation. We'd love to fix this on the library side, but I haven't found a good way yet. It's simply that some phones do not have the ability to allocate a large enough buffer, period, or they normally would but have other processes running which consume memory, etc. I could reduce the default for setStreamWriteSize, but then phones with more memory (or less memory usage) would lose some upload perf. Out of curiosity
I've checked memory consumption in DDMS and didn't find any noticeable memory leaks. I may be wrong, but I guess the issue in my case is caused by the ByteArrayOutputStream.toByteArray call, which makes a copy of the stream's internal buffer, and it happens too often.
We did something similar and also have not seen leaks. ByteArrayOutputStream.toByteArray is one of the potential issues, and I'd guess it is the issue in your case with the Galaxy S2. More profiling is needed on our end to find other potential culprits, though. Some smaller phones running more processes might also be having heap size issues. In addition to the ByteArrayOutputStream issues, I'm also looking at android:largeHeap (which seems potentially abusive, so I'm leaning toward no) and http://developer.android.com/intl/ja/reference/android/app/ActivityManager.html#getMemoryClass() to help determine how much memory the phone has at the moment and optimize the stream write size. If you have additional ideas to explore, I'd love to know.

More advanced info, just if you're interested: we do actually need multiple byte arrays (so potentially copies). The reason for this is that we're spinning up an async thread to write to the service, which has to access the buffer, but you should still be able to continue writing to the output stream without blocking. This fundamentally requires two buffers, even with parallelism at 1. If you increase parallelism, the number of buffers needed increases. Copying into a buffer from a buffer pool could help, but this is a bit less trivial than simply extending the class. These are the kinds of optimizations we are also exploring.
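As a rough illustration of that last idea, here is a sketch of how an app (not the library itself) could pick a write size from ActivityManager.getMemoryClass(). The 1/16 fraction and the 16 KB to 4 MB clamp are assumptions for the example; check the range your SDK version accepts for setStreamWriteSizeInBytes:

```java
import android.app.ActivityManager;
import android.content.Context;

import com.microsoft.azure.storage.blob.CloudBlockBlob;

public class WriteSizeHelper {
    // Picks a conservative stream-write size based on the device's per-app heap limit.
    public static void tuneWriteSize(Context context, CloudBlockBlob blob) {
        ActivityManager am =
                (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        int heapLimitBytes = am.getMemoryClass() * 1024 * 1024; // approximate per-app heap

        // Heuristic: spend at most ~1/16 of the heap on the upload buffer,
        // clamped to an assumed allowed range of 16 KB .. 4 MB.
        int writeSize = Math.max(16 * 1024, Math.min(4 * 1024 * 1024, heapLimitBytes / 16));
        blob.setStreamWriteSizeInBytes(writeSize);
    }
}
```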
Any news on this? I still have issues with 1.0.0:
This happens very often to my users, any idea?
Hi @victorleduc, Have you tried the suggested approach of reducing the buffer size by setting setStreamWriteSizeInBytes on the CloudBlockBlob as a workaround? We will use this issue to track this item internally for a possible solution in the library as well. cc: @jofriedm-msft
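For anyone landing here, a minimal sketch of that workaround might look like the following; the 512 KB value, the blob name, and the wrapper class are placeholders, and the right buffer size depends on the device:

```java
import com.microsoft.azure.storage.blob.CloudBlobContainer;
import com.microsoft.azure.storage.blob.CloudBlockBlob;

public class UploadWorkaroundSketch {
    // Reduces the internal write buffer before uploading, trading fewer large
    // allocations for more (smaller) calls to the storage service.
    public static void upload(CloudBlobContainer container, String filePath) throws Exception {
        CloudBlockBlob blob = container.getBlockBlobReference("myblob");
        blob.setStreamWriteSizeInBytes(512 * 1024); // 512 KB instead of the larger default
        blob.uploadFromFile(filePath);
    }
}
```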
I tried this without success. I finally fixed it by forking your library and changing these 3 constants:
The issue is that in the writeToOutputStream method, the sourceStream mark size is at its maximum (64 MB), so the read fails on low-memory devices (I tested on a Galaxy S4 mini). It is set to the maximum here, in the CloudBlockBlob.java file at line 649:
Hi @victorleduc, Please try updating the SingleUploadThresholdInBytes value on the BlobRequestOptions, either per API call or on your BlobClient. We will update if there are any decisions to modify some of the constants. Thanks,
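A sketch of that suggestion, assuming the setter in this SDK is named setSingleBlobPutThresholdInBytes on BlobRequestOptions and that uploadFromFile has an overload accepting options (the 1 MB value is just an example):

```java
import com.microsoft.azure.storage.blob.BlobRequestOptions;
import com.microsoft.azure.storage.blob.CloudBlobClient;
import com.microsoft.azure.storage.blob.CloudBlockBlob;

public class ThresholdSketch {
    // Lowers the size above which uploads switch from a single Put Blob
    // call to the chunked block-upload path.
    public static void upload(CloudBlobClient blobClient, CloudBlockBlob blob, String filePath)
            throws Exception {
        // Per call: pass the options to the upload overload that accepts them.
        BlobRequestOptions options = new BlobRequestOptions();
        options.setSingleBlobPutThresholdInBytes(1024 * 1024); // 1 MB
        blob.uploadFromFile(filePath, null /* accessCondition */, options, null /* opContext */);

        // Or as a client-wide default:
        blobClient.getDefaultRequestOptions().setSingleBlobPutThresholdInBytes(1024 * 1024);
    }
}
```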
Hi, I don't remember precisely because I did this a month ago, but I remember trying to change the SingleUploadThresholdInBytes in the options, and it didn't fix it. I had to do what I described above to make it work. Victor
Stack trace from the original report:
java.lang.OutOfMemoryError
at java.io.BufferedInputStream.fillbuf(BufferedInputStream.java:156)
at java.io.BufferedInputStream.read(BufferedInputStream.java:288)
at com.microsoft.azure.storage.core.Utility.writeToOutputStream(Utility.java:1218)
at com.microsoft.azure.storage.core.Utility.writeToOutputStream(Utility.java:1130)
at com.microsoft.azure.storage.blob.BlobOutputStream.write(BlobOutputStream.java:525)
at com.microsoft.azure.storage.blob.CloudBlockBlob.upload(CloudBlockBlob.java:583)
at com.microsoft.azure.storage.blob.CloudBlob.uploadFromFile(CloudBlob.java:1837)
at com.microsoft.azure.storage.blob.CloudBlob.uploadFromFile(CloudBlob.java:1809)
at com.datatang.client.framework.storage.AzureStorage.uploadFile(AzureStorage.java:135)
at com.datatang.client.framework.upload.service.UploadBinder$2.run(UploadBinder.java:110)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
at java.lang.Thread.run(Thread.java:841)