
hashcat Forum

How to optimize attacking very large hashes
Hi,

I'm trying to attack a very large hash list, a few GB in size, which I've now managed to get down to a few hundred MB. The problem is that it takes a long time to:
  • remove recovered hashes from the hash file (I'm using --remove)


Example:
a -a 3 ?a?a?a?a?a?a attack completes fast on my 2 x GTX 1080, but getting back to the command prompt takes a while
  • when loading, it takes a long time to compare hashes with the potfile
Code:
Comparing hashes with potfile entries...

How can I optimize this and make it a lot faster?

Thank you.

Best regards,
Azren
(06-14-2016, 08:08 AM)azren Wrote:
From what I understand, you are using --remove, so each hash whose password is recovered is removed from the list, and what takes time is reading from the potfile.
Why not just rename the potfile so hashcat won't read from it?
You can also try removing duplicate recovered hashes.
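Instead of renaming the file by hand, hashcat's built-in potfile options give the same effect; a minimal sketch (hashes.txt and -m 0 are placeholders, substitute your own hash list and hash mode):
Code:
# Skip the potfile entirely for this run (no reading, no writing)
hashcat -m 0 -a 3 --potfile-disable hashes.txt ?a?a?a?a?a?a

# Or point this session at its own, smaller potfile
hashcat -m 0 -a 3 --potfile-path session.pot hashes.txt ?a?a?a?a?a?a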
Don't use --remove, use --show and --left with -o /dev/null instead.
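A minimal sketch of that workflow (hashes.txt, remaining.txt and -m 0 are placeholders; adjust to your own hash list and hash mode):
Code:
# Crack without --remove; discard the outfile so the big hash file is never rewritten
hashcat -m 0 -a 3 -o /dev/null hashes.txt ?a?a?a?a?a?a

# Afterwards, write the still-uncracked hashes to a new file
hashcat -m 0 --left -o remaining.txt hashes.txt

# And dump the cracked hash:plain pairs from the potfile when you need them
hashcat -m 0 --show hashes.txt
The idea is that the large hash file itself is never rewritten during the run; cracked results only accumulate in the potfile, and the remaining list is rebuilt once afterwards with --left.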
That helped. It saved me about 5 to 10 minutes per iteration. Thanks.

Best regards,
Azren