As the title says, I want to run two instances of hashcat at once, on the same computer, from the same folder. What should I look out for when I'm doing this, or should I not do this at all?
Currently on Windows 10 Education, using Hashcat v5.1.0-1602-gda7a13af+ (since the v5.1.0 release doesn't recognize my GPU properly) on a Radeon RX 580.
All advice is appreciated!
What's the benefit of that? Just put multiple commands in a bat file.
hashcat.exe -m xxx -w 3 -a 3 etc...
hashcat.exe -m xxx -w 3 -a 0 etc...
pause
If you run two instances, you're just splitting the workload and straining your system, so why not finish one attack at a time and get the card's full potential out of each?
(01-18-2020, 02:35 AM)slyexe Wrote: What's the benefit of that? Just put multiple commands in a bat file.
hashcat.exe -m xxx -w 3 -a 3 etc...
hashcat.exe -m xxx -w 3 -a 0 etc...
pause
If you run two instances, you're just splitting the workload and straining your system, so why not finish one attack at a time and get the card's full potential out of each?
I was just wondering if running two hashcat instances would mess up any files, since they share and write to the same files at the same time (i.e. hashcat.restore, hashcat.potfile). I know I can change the restore and potfile names, but I was asking here in case there's anything else I need to keep in mind.
I often run more than one simultaneously, either distributing load among GPUs, or else pausing one long-running job to run a short-running job. This is often easier than quitting and restoring.
The best way to manage this is to assign each running hashcat its own session ID with --session. You can then refer to that session by name when restoring or doing other things.
(Edit: I also use --potfile-path to give each one a separate potfile named to match its session)
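For example, something like this (session names, hash mode, and file names below are just placeholders):
rem Launch two independent jobs, each with its own session name and a matching potfile
start "jobA" hashcat.exe --session jobA --potfile-path jobA.potfile -m xxx -w 3 -a 0 hashes.txt wordlist.txt
start "jobB" hashcat.exe --session jobB --potfile-path jobB.potfile -m xxx -w 3 -a 3 hashes.txt ?a?a?a?a?a?a
rem Later, resume a specific job by its session name
hashcat.exe --session jobA --restore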
(01-18-2020, 02:42 AM)royce Wrote: I often run more than one simultaneously, either distributing load among GPUs, or else pausing one long-running job to run a short-running job. This is often easier than quitting and restoring.
The best way to manage this is to assign each running hashcat its own session ID with --session. You can then refer to that session by name when restoring or doing other things.
Got it; do they play nicely together with a shared potfile, or does --session give them separate potfiles? Based on the wiki, I don't believe it does, so would I need to use --potfile-path for them as well (assuming the answer to my first question is no)?
(01-18-2020, 02:42 AM)royce Wrote: I often run more than one simultaneously, either distributing load among GPUs
I suppose if you're managing a good cracking rig this would make sense. I never thought of that, you got me there. Only having a single card, I wouldn't want to be running multiple instances at once.
(01-18-2020, 02:48 AM)Coloradohusky Wrote: Got it; do they play nicely together with a shared potfile, or does --session give them separate potfiles? Based on the wiki, I don't believe it does, so would I need to use --potfile-path for them as well (assuming the answer to my first question is no)?
Writing to the same potfile from two different processes seems undefined to me (but I haven't looked at the code). I always explicitly specify a potfile matching the session, and then use a separate script to combine them into a potfile that isn't used by any session, to minimize the chance of weird behavior. But do whatever seems best for you.
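As a rough sketch of that idea (file names below are placeholders, not my actual script):
rem Merge the per-session potfiles into one combined potfile that no running session reads or writes
type jobA.potfile jobB.potfile > combined.potfile
De-duplicating the merged file afterwards is a nice-to-have; the main point is keeping it separate from anything a live session touches.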
(01-18-2020, 02:55 AM)slyexe Wrote: I suppose if you're managing a good cracking rig this would make sense. I never thought of that, you got me there. Only having a single card, I wouldn't want to be running multiple instances at once.
Generally true; the only corner case is when you have two very inefficient attacks (that you still have to run for some reason). The GPUs will happily handle multiple such jobs.
(01-18-2020, 02:42 AM)royce Wrote: I often run more than one simultaneously, either distributing load among GPUs, or else pausing one long-running job to run a short-running job. This is often easier than quitting and restoring.
The best way to manage this is to assign each running hashcat its own session ID with --session. You can then refer to that session by name when restoring or doing other things.
(Edit: I also use --potfile-path to give each one a separate potfile named to match its session)
I'm doing this right now, for not especially good reasons: I've given them separate potfiles and separate session names, and taken --remove off both, since they're hitting the same hash list. It seems to be working OK.
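Concretely, the two commands look something like this (hash mode, files, and mask are just placeholders):
rem Same hash list for both, separate sessions and potfiles, no --remove on either
start "dict" hashcat.exe --session dict --potfile-path dict.potfile -m xxx -a 0 same_hashes.txt wordlist.txt
start "mask" hashcat.exe --session mask --potfile-path mask.potfile -m xxx -a 3 same_hashes.txt ?d?d?d?d?d?d?d?d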