Replies: 3 comments 1 reply
-
@rtrost Although we do not have the capability to filter by date, I don't see how it would help you. You may see fewer throttling errors because you'll naturally have pauses between chunks, but overall, from start to 80K photos, you may be faster with an error-then-retry approach.
-
@rtrost Although we don't know Apple's throttling logic: what command are you running? If you can use the progress bar, you may be able to get some other insights (e.g. errors after 2 GB downloaded)... Just thinking out loud...
-
OK, I have to admit that the I/O error actually IS a hardware issue :-(. I have 53,000 photos downloaded now, and by re-running the command I always get a few more. That's fine, it takes time but it works :-). My command is pretty basic, I just use --directory, --username and --password. Thanks a million for that great tool!
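Since the fix that worked here was simply re-running the same basic command until everything came down, that loop can be automated. A minimal sketch, assuming a POSIX shell and that `icloudpd` is on the `PATH`; the `--directory`/`--username`/`--password` flags are the ones named above, while `DEST`, `APPLE_ID`, `APPLE_PW`, the 10-attempt cap, and the 60-second pause are all placeholder assumptions, not anything the tool requires:

```shell
#!/bin/sh
# Sketch: keep re-running icloudpd until it exits cleanly, on the theory
# (from this thread) that transient connection/throttling errors go away
# on retry and already-downloaded photos are skipped.

retry() {
  # Run "$@" until it succeeds, up to 10 attempts, pausing between tries.
  # RETRY_DELAY (seconds) is a placeholder knob; defaults to 60.
  attempts=0
  until "$@"; do
    attempts=$((attempts + 1))
    [ "$attempts" -ge 10 ] && return 1
    sleep "${RETRY_DELAY:-60}"
  done
}

# Only attempt the download if icloudpd is actually installed.
command -v icloudpd >/dev/null 2>&1 &&
  retry icloudpd --directory "$DEST" --username "$APPLE_ID" --password "$APPLE_PW"
```

The pause between attempts also gives you the "natural breaks between chunks" effect mentioned earlier, without needing a date filter.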
-
Hi,
I am trying to download my photos from iCloud, and so far I haven't been successful in getting them all. I have roughly 80,000 photos in the cloud, but at around the 50% mark the script fails (connection refused by peer), and I wasn't able to find a reason for it.
So my idea was to just re-run the script, as it will check which photos have already been downloaded (which is pretty slow?) and fetch just the rest, but that failed as well, with I/O errors. The bug is listed in the issues as well, but I didn't manage to install the fix; it broke my installation completely.
So given that, for some reason, the connection breaks after a few hours, how would you handle it? It would be cool if I could somehow split the downloads, like "download everything from 2021" or so. Any ideas?