Please note, this is a STATIC archive of website hashcat.net from 08 Oct 2020, cach3.com does not collect or store any user information, there is no "phishing" involved.

hashcat Forum

Full Version: Divide the workload to multiple computers
How about a way to divide the workload across multiple computers, so that if something takes 10 days, the time is cut by, say, 3-fold? Something for the brute-force implementation. For example, suppose a hash attack has 100 steps; you would have an option to divide it into multiple stages, e.g. 3, 4, or 5 stages. Then on one computer you run stage 1 while, at the same time, computer 2 runs stage 2, and so on.
(05-17-2010, 04:44 AM)richardsguy Wrote: [ -> ]How about a way to divide the workload across multiple computers, so that if something takes 10 days, the time is cut by, say, 3-fold? Something for the brute-force implementation. For example, suppose a hash attack has 100 steps; you would have an option to divide it into multiple stages, e.g. 3, 4, or 5 stages. Then on one computer you run stage 1 while, at the same time, computer 2 runs stage 2, and so on.

This is already supported by hashcat and oclHashcat by using the -s (skip) and -l (limit) parameters in combination. Example:

The wordlist contains 10000 words and you have 4 PCs (all of the same speed):

-s 0 -l 2500
-s 2500 -l 2500
-s 5000 -l 2500
-s 7500 -l 2500
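The -s/-l values above can be generated automatically. Below is a minimal sketch of that arithmetic; the helper name split_keyspace and the speed-weighting idea are illustrative, not part of hashcat. Each machine skips past the chunks assigned to the machines before it and limits itself to its own chunk size, so the chunks cover the keyspace exactly once with no overlap.

```python
# Sketch: compute -s (skip) / -l (limit) pairs to divide a keyspace
# across several machines. split_keyspace is a hypothetical helper,
# not a hashcat feature; hashcat only consumes the resulting numbers.

def split_keyspace(total, weights):
    """Return one (skip, limit) pair per machine, sized by relative speed."""
    total_weight = sum(weights)
    chunks = []
    skip = 0
    for i, w in enumerate(weights):
        if i == len(weights) - 1:
            limit = total - skip  # last machine takes the remainder
        else:
            limit = total * w // total_weight
        chunks.append((skip, limit))
        skip += limit  # next machine starts where this one ends
    return chunks

# Four equally fast PCs over a 10000-word list, as in the example above:
for skip, limit in split_keyspace(10000, [1, 1, 1, 1]):
    print(f"-s {skip} -l {limit}")
```

With unequal weights (e.g. [2, 1, 1] when one PC is twice as fast) the faster machine simply gets a proportionally larger limit, which keeps all machines finishing at roughly the same time.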

--
atom