All,
I'm wondering if you can offer some assistance. I'm trying to test out hashcat and learn how to use the program. When I run the following command, it runs for about 3 minutes and then ends with "Killed". What am I doing wrong? The database is very large (2GB), so I'm assuming that could be the issue?
root@kali:/usr/share/hashcat# ./hashcat.bin -m3200 -a0 /root/users.txt /usr/share/wordlists/rockyou.txt
Initializing hashcat v0.49 with 1 threads and 32mb segment-size...
Killed
(08-28-2015, 05:21 PM)DyOS Wrote: Initializing hashcat v0.49 with 1 threads and 32mb segment-size...
Well, you should at least use the latest hashcat v0.50 instead. And "1 thread"? Is that a VM or some ancient CPU? How much system RAM is there?
"Killed" usually happens when your system runs out of RAM.
To test this, just try the same command line on a small number of hashes (5-10).
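If you want to confirm it really is the kernel killing the process for memory, a quick check along these lines should show it (assuming a typical Linux box where OOM kills land in the kernel log):

# check how much RAM is actually free before the run
free -h

# after hashcat dies with "Killed", look for the kernel's OOM report
dmesg | grep -i -E 'out of memory|killed process'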
Yeah, you're running up against the kernel OOM killer. I know what you're working on, and you're trying to load 36,150,089 salts. You likely do not have anywhere near enough RAM for that (you need around 48GB free.)
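For reference, you can count how many hashes hashcat would have to hold in memory straight from the file (assuming one hash per line, as in this dump); wc just streams the file, so it won't eat your RAM:

wc -l /root/users.txt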
Errr...trying 36M hashes...bcrypt ones, on a single-threaded CPU? Good luck.
Yea, honestly I only care about cracking one hash to prove the concept. So if it's a memory problem, how can I tell hashcat to only load X hashes? Also, I'm sure some of you have already had a run at this file. Using the standard rockyou.txt, how many hashes do I need to load before it cracks just one? Thanks for all the feedback.
(08-31-2015, 10:45 PM)DyOS Wrote: how can I tell hashcat to only load X hashes?
Er, only put X hashes in the hash file?
(08-31-2015, 11:23 PM)rico Wrote: (08-31-2015, 10:45 PM)DyOS Wrote: how can I tell hashcat to only load X hashes?
Er, only put X hashes in the hash file?
Yea, I'm just not sure if I have enough memory to even open or manipulate the file. I will give that a shot.
(08-31-2015, 11:26 PM)DyOS Wrote: Yea, I'm just not sure if I have enough memory to even open or manipulate the file. I will give that a shot.
head -n5 /root/users.txt > /root/newuser5.txt
That will copy the first 5 lines (-n5) from users.txt to a new file called newuser5.txt.
It shouldn't use much memory, since head only reads the start of the file. Or, in your case, maybe use -n1 (I assume the hashes are one per line?) and use the new file in your command.
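Putting it together, something like this should work end to end; it reuses the paths and mode from your original command, and one_hash.txt is just an example output name:

# grab a single hash from the big dump (head only reads the start of the file)
head -n1 /root/users.txt > /root/one_hash.txt

# run the same attack against just that one hash
./hashcat.bin -m3200 -a0 /root/one_hash.txt /usr/share/wordlists/rockyou.txt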