IOError while writing to file. You might have run out of disk space, or file might be too large for your OS #155
Comments
I have the same issue. |
I have the same issue: OS Win10 Pro trying to download onto a network drive (NAS). |
@LoonyGryphon @Monsterbacke007 assuming that disk has enough space for the file (and destination system supports large files), do you mind trying the process again? Since it was skipped during the error, next run will try to download it again and we'll see if the problem is specific to the file or intermittent. |
I used this solution #150 (comment) because I also had the issue mentioned in that thread. After doing that, I didn't have this problem again (it "magically" solved two issues at once). |
@AndreyNikiforov the issue persists (different computer, different location, same OS) with different files now. @nihelmasell trying this on Win10 Pro, I am getting a syntax error ("'continue' not properly in loop"). As I am new to Python, can someone please help? |
I'm also new to Python. I'm running the script on an Ubuntu 20 Linux install and have no issues right now (after modifying the lines as the mentioned post describes). |
Is there a way to retry aborted or skipped files by filename? I haven't found it in the docs. |
Guys, are you sure you're running the latest version of pyicloud-ipd? Please check (one way to do so is sketched below).
If you are running an older version, please install the latest version of icloud_photos_downloader and its requirements.
It should no longer be necessary to edit files, as the fix for the other bug is included now. Is the issue still occurring? |
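A minimal way to check and upgrade, assuming a pip-based install (the package names are the ones on PyPI; your setup may use pip3 or a virtualenv instead):

```
# Show the currently installed versions
pip show icloudpd pyicloud-ipd

# Upgrade to the latest release
pip install --upgrade icloudpd
```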
What do you mean? A selective download of a specific file? If you want to retry for whatever reason, you can just rerun the script; existing files are skipped quite quickly. |
All good. It seems the files were all downloaded despite the error. The only explanation I can imagine is that the same file was downloaded twice in a short amount of time: while file abc was being written, the system tried to write another instance of the exact same file and hence skipped it. Thanks for all your feedback. |
I had the same issue on Debian 10. My solution was to create a new directory and chown it to the user that runs icloudpd; see the sketch below. |
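A minimal sketch of that fix; the directory path and user name are placeholders to substitute with your own:

```
# Create a fresh download directory (path is an example)
sudo mkdir -p /srv/icloud-photos

# Hand ownership to the account that runs icloudpd (user name is an example)
sudo chown -R icloudpd-user: /srv/icloud-photos
```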
I've seen this issue tonight as well. A snippet from one of my logs:
I think it may be related to multi-threaded downloads, as I haven't seen the error when using the |
The latest version of icloudpd downloads files with one thread by default. There is no functionality to retry a specific file; icloudpd will retry all aborted files on the next run. Version 1.6.2 encourages using only one download thread (see the sketch below), and later versions will remove multithreaded downloading, so you should have at most one aborted file. |
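For older releases where multiple threads were still the default, a sketch of forcing a single download thread; the --threads-num flag is assumed from icloudpd releases of that era, and the directory and username are placeholders:

```
# Force a single download thread (flag assumed; directory and username are examples)
icloudpd --directory ~/Pictures/iCloud \
         --username you@example.com \
         --threads-num 1
```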
@HarrisonOates @nihelmasell @Monsterbacke007 Do you still experience the issue with icloudpd 1.6.2 and |
I haven't tried it yet. What's the problem with the solution I used? Is it actually NOT downloading all iCloud files? I don't want to fix something that seems to work OK so far. |
@nihelmasell The solution you mentioned seems to be related to encoding. I saw such an encoding error once, and it was a different message from the one in this issue. The IOError has been happening due to multi-threaded downloading, and the default thread count was changed to 1 in 1.6.2. If you do not see the IOError any more, I'll close the issue. |
I chmod -R'd my NAS iCloud folder to 755 (see the sketch below), and the error went away. |
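A minimal sketch of that permissions change; the mount path is a placeholder for your own NAS folder:

```
# Owner gets full access, everyone else read/execute (path is an example)
chmod -R 755 /mnt/nas/icloud
```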
I've run the script through once, and it worked all right; however, it did not download all of the photos. I am getting the error:
IOError while writing file to #### You might have run out of disk space, or the file might be too large for your OS. Skipping this file...
I'm on an Arch Linux machine.
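Before digging further, it may be worth ruling out the two causes the message names; a quick check of free space and the shell's file-size limit (the path is a placeholder for your download directory):

```
# Free space on the filesystem holding the downloads (path is an example)
df -h /path/to/download/dir

# Maximum file size the shell permits; normally "unlimited"
ulimit -f
```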