
hashcat Forum

--show takes forever to finish
I'm using the --show option to get a list of recovered passwords, and it hasn't finished after almost 5 hours now.
I don't understand why; if I do the matching myself outside of hashcat by creating dictionaries, it takes less time.

The command I run is:
hashcat64 -m hashtype --show --username --outfile-format=2 "hashes.txt" -o outfile.txt

My hash file contains 1M rows.
Can you run a line count on hashes.txt to see how many have been written? I'm guessing it is either slow for some reason and hasn't written all 1 million lines yet, or it's stuck in a loop and has written millions of lines.
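For example, with the standard wc tool:
Code:
wc -l outfile.txt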
You mean on outfile.txt? There is no file created yet.
It depends mainly on the size of the hashcat.potfile and, of course, to some extent also on the hashes.txt file (note that a single crack can be assigned to several users, especially if unsalted hashes are used).
Therefore, --username of course makes it a bit slower.
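
To illustrate the several-users point, here is a tiny made-up example (the hash really is the MD5 of "a"; the user names are invented):
Code:
# hashes.txt -- two users sharing one unsalted MD5
alice:0cc175b9c0f1b6a831c399e269772661
bob:0cc175b9c0f1b6a831c399e269772661

# hashcat.potfile -- a single cracked entry
0cc175b9c0f1b6a831c399e269772661:a

# --show --username must map this one potfile line back to BOTH alice
# and bob, so the output can have more lines than the potfile itself.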

Which version of hashcat do you use? The speed of --show was already increased a lot in recent versions of hashcat.
(08-21-2018, 10:33 PM)philsmd Wrote: It depends mainly on the size of the hashcat.potfile and, of course, to some extent also on the hashes.txt file (note that a single crack can be assigned to several users, especially if unsalted hashes are used).
Therefore, --username of course makes it a bit slower.

Which version of hashcat do you use? The speed of --show was already increased a lot in recent versions of hashcat.

I'm using the latest version, 4.2.1.
My .potfile size is ~20 MB.
I actually find it faster to do it in a script outside of hashcat... how that can be, I don't know...
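Roughly, my script does something like this (a simplified sketch, assuming unsalted hashes and simple hash:plain potfile lines; the file names are just the ones from this thread):
Code:
#!/usr/bin/env perl

use strict;
use warnings;

# load the potfile into a lookup table: hash -> plain
my %pot;

open (my $pot_fh, '<', 'hashcat.potfile') or die "open potfile: $!";

while (my $line = <$pot_fh>)
{
  chomp $line;

  my ($hash, $plain) = split (/:/, $line, 2);

  $pot{lc $hash} = $plain;
}

close ($pot_fh);

# join the username:hash list against the lookup table
open (my $in_fh, '<', 'hashes.txt') or die "open hashes.txt: $!";

while (my $line = <$in_fh>)
{
  chomp $line;

  my ($user, $hash) = split (/:/, $line, 2);

  print "$user:$pot{lc $hash}\n" if exists $pot{lc $hash};
}

close ($in_fh);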
What's your memory usage and CPU usage during the --show --username run? High memory, high CPU?

We might need to troubleshoot this in detail, but I guess we need to have the files (or similar files) and an open issue on https://github.com/hashcat/hashcat/issues . Do you think you can open an issue and provide some more details about when exactly this happens, plus some files the devs can use to troubleshoot?
Thx
I just tested with a hash file and potfile with 1000000 entries and it's very fast on my system.
Maybe you can do the same and report back.

This is how I generated the lists:
Code:
#!/usr/bin/env perl

use strict;
use warnings;

my $NUM_HASHES = 1000000;

for (my $i = 0; $i < $NUM_HASHES; $i++)
{
  # build 16 random bytes = 32 hex chars, i.e. an MD5-sized hash
  my $hash = "";

  for (my $j = 0; $j < 16; $j++)
  {
    $hash .= chr (int (rand (256)));
  }

  print STDOUT "user_$i:" . unpack ("H*", $hash)             . "\n"; # hash file
  print STDERR              unpack ("H*", $hash) . ":a${i}b" . "\n"; # pot  file
}

and run this script like this:
Code:
perl generate_rand_hash_pot_file.pl > tmp_hashes.txt 2> tmp_hashes.potfile

(where generate_rand_hash_pot_file.pl is the perl script above, tmp_hashes.txt will be the hash file, and tmp_hashes.potfile will be our pot file. Note: the cracks are of course not correct, i.e. the password in the potfile is not the one that actually corresponds to the MD5 hash, but hashcat doesn't verify this anyway)

after that run hashcat like this:
Code:
hashcat -m 0 --show --username --outfile-format 2 --potfile-path tmp_hashes.potfile -o outfile.txt tmp_hashes.txt

(btw, you could also shuffle the lines, e.g. with the Linux shuf command, before you run hashcat, but it shouldn't change the speed by a lot, because hashcat uses its own sorting internally, which is different from the output of generate_rand_hash_pot_file.pl)
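For example (shuf is part of GNU coreutils; the _shuffled file names are just an example, point hashcat at them afterwards):
Code:
shuf tmp_hashes.txt     -o tmp_hashes_shuffled.txt
shuf tmp_hashes.potfile -o tmp_hashes_shuffled.potfile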


I can't really reproduce a speed problem here. Maybe your problem is a different, more specific one.