Find better pad size #29
Comments
It seems that when many files are added to an archive at once, the archive automatically becomes "solid". That may mean the padding attack is only usable on the very first file (if even that), or perhaps it still works but only if we pick the first file/stream. If the latter is true and there is usable padding, maybe we shouldn't search for a smaller file at all. The other scenario is adding files to the archive one at a time: the result may end up non-solid, and then we can compare the padding sizes of all streams and take them into account when comparing data sizes, as in the OP.
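To make the two scenarios concrete, here is a minimal sketch of the selection logic being proposed. This is not 7z2hashcat's actual code (the tool is written in Perl); the `Stream`, `candidate_streams` and `pick_stream` names, and the preference order (more usable padding first, then less data), are assumptions chosen purely for illustration.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Stream:
    index: int     # position of the packed stream inside the archive
    data_len: int  # bytes of encrypted data that would end up in the hash
    pad_len: int   # usable padding bytes in the last AES block (0..15)


def candidate_streams(streams: List[Stream], is_solid: bool) -> List[Stream]:
    """Solid archive: assume only the very first stream is attackable.
    Non-solid archive: every stream is a candidate."""
    return streams[:1] if is_solid else list(streams)


def pick_stream(streams: List[Stream], is_solid: bool) -> Optional[Stream]:
    """Prefer streams with more usable padding, then with less data to embed."""
    candidates = candidate_streams(streams, is_solid)
    if not candidates:
        return None
    return min(candidates, key=lambda s: (-s.pad_len, s.data_len))


# Example: a large stream with 15 padding bytes beats a smaller one with none.
streams = [Stream(0, 48 * 1024, 0), Stream(3, 12 * 1024**2, 15)]
print(pick_stream(streams, is_solid=False).index)  # -> 3
print(pick_stream(streams, is_solid=True).index)   # -> 0 (only the first stream counts)
```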
I used 7z2hashcat.exe against a 7z archive that is about 9 years old and roughly 12 GB in size, and got a 3 MB hash file (txt.txt). The file is 6 MB, but becomes 3 MB when converted to UTF-8. I also tried 7z2john and got the same large hash file.
Yes, the same thing happened to me.
Same here. Were you able to find a solution?
Nope.
Unfortunately, this size is normal.
Cracking it is very slow unless you can narrow down the password search space.
7z2hashcat tries to find the "shortest file" for attacking. I think we should include the "padding size" in the equation when applicable: if we can find a larger file with better padding, it might well be worth choosing it instead!
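As a rough illustration of why padding size could outweigh file size (an assumption drawn from this thread, not from 7z2hashcat internals): if each usable padding byte of the last AES block has a predictable value, every such byte lets a wrong password be rejected with probability 255/256, so a large stream with many padding bytes can be a better target than a small one with almost none. The sizes and padding counts below are made up.

```python
def false_positive_rate(pad_len: int) -> float:
    """Chance that a wrong password still passes a check against
    pad_len predictable padding bytes (each matches by chance with p = 1/256)."""
    return 256.0 ** -pad_len


# Hypothetical streams: numbers invented for illustration only.
for name, data_len, pad_len in [("small, 1 padding byte", 48 * 1024, 1),
                                ("large, 15 padding bytes", 12 * 1024**2, 15)]:
    print(f"{name}: {data_len} bytes of data, "
          f"false-positive rate ~ {false_positive_rate(pad_len):.3g}")
```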