I've got three folders, each containing some 20-30k photos. Searching for duplicates (with or without the cache enabled) is a matter of seconds, during which it says
"Analyzed full hash of ... / 965 files."
It finds no duplicates (which is fine, if true). If I manually "create" a duplicate, it does find it.
What irks / confuses me is that it says "965 files" rather than ~100k files. How can I be sure that all files were checked and compared?
In the past, a full-hash duplicate check on this many files took hours, not seconds or minutes. (With the cache disabled in settings, it only takes a few seconds longer.) But runtime aside: if the three folders contain >100k files, why does it claim to check fewer than 1k of them?
PS / corollary: Is $all_files (below) the total number of files checked and compared, or only a (random?) subset for which a full hash was analyzed?
progress_analyzed_full_hash = Analyzed full hash of {$file_checked}/{$all_files} files
from https://fossies.org/linux/czkawka/czkawka_gui/i18n/en/czkawka_gui.ftl
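My own guess (purely an assumption about the internals, not something I've verified in czkawka's source) is that $all_files counts only the hashing candidates, not every scanned file: duplicate finders typically group files by size first, and only files sharing a size with at least one other file ever get full-hashed. Here's a minimal Rust sketch of that assumed pipeline; find_duplicates, the DefaultHasher stand-in, and the reproduced progress line are all mine, not czkawka's actual code:

```rust
use std::collections::hash_map::DefaultHasher;
use std::collections::HashMap;
use std::fs;
use std::hash::{Hash, Hasher};
use std::io;
use std::path::PathBuf;

/// Assumed two-stage pipeline: a cheap size pass over everything,
/// then full hashes for the same-size candidates only.
fn find_duplicates(paths: Vec<PathBuf>) -> io::Result<Vec<Vec<PathBuf>>> {
    // Stage 1: bucket ALL files (~100k here) by size; metadata only, no reads.
    let mut by_size: HashMap<u64, Vec<PathBuf>> = HashMap::new();
    for path in paths {
        let size = fs::metadata(&path)?.len();
        by_size.entry(size).or_default().push(path);
    }

    // Stage 2: only same-size files can be byte-identical, so only they
    // reach the hashing stage. This smaller candidate count would be the
    // {$all_files} shown in the progress message (the "965").
    let candidates: Vec<PathBuf> = by_size
        .into_values()
        .filter(|group| group.len() > 1)
        .flatten()
        .collect();
    let all_files = candidates.len();

    let mut by_hash: HashMap<u64, Vec<PathBuf>> = HashMap::new();
    for (i, path) in candidates.into_iter().enumerate() {
        println!("Analyzed full hash of {}/{} files", i + 1, all_files);
        let contents = fs::read(&path)?;
        // DefaultHasher is just a stand-in for whatever fast hash the real tool uses.
        let mut hasher = DefaultHasher::new();
        contents.hash(&mut hasher);
        by_hash.entry(hasher.finish()).or_default().push(path);
    }

    // Any hash bucket holding more than one path is a duplicate group.
    Ok(by_hash.into_values().filter(|g| g.len() > 1).collect())
}

fn main() -> io::Result<()> {
    // Pass file paths (not directories) as arguments for this sketch.
    let paths: Vec<PathBuf> = std::env::args().skip(1).map(PathBuf::from).collect();
    for group in find_duplicates(paths)? {
        println!("duplicates: {:?}", group);
    }
    Ok(())
}
```

If that assumption holds, "965" would simply mean that only 965 of the ~100k files share their size with at least one other file, and the rest were ruled out by the cheap metadata pass. Can someone confirm that this is what $all_files means?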
BTW, fdupes (on the command line) says:
Progress [ ... / 74918] 25%
and finds no duplicates either. Still, I'm happier seeing what looks like the correct total, 74918, rather than 965...