
hashcat Forum

Full Version: fgets_sse2 versions of hashcat-utils
TL;DR: Use the regular ones unless you really need speed.

Summary: most of the hashcat-utils use fgetl (a wrapper around fgets), so for speed I ported them to use atom's fgets_sse2.
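
For context, fgetl is just a thin line reader on top of fgets. Here's a minimal sketch of what such a wrapper does (my own illustration, not the actual hashcat-utils source; the signature is an assumption):

Code:
#include <stdio.h>
#include <string.h>

/* Hypothetical fgetl-style wrapper: read one line with fgets and
   strip the trailing line ending. Not the real hashcat-utils code. */
static char *fgetl (FILE *fp, char *buf, size_t buf_size)
{
  if (fgets (buf, (int) buf_size, fp) == NULL) return NULL;

  size_t len = strlen (buf);

  /* strip \n (and \r from CRLF input) */
  while (len > 0 && (buf[len - 1] == '\n' || buf[len - 1] == '\r'))
  {
    buf[--len] = '\0';
  }

  return buf;
}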

Benefits: speed, mostly. Some benchmarks are available at the link at the bottom of this post. Not all tools become faster, since the bottleneck is not always fgets.
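
The speedup comes from how line endings are found. A minimal sketch of the usual SSE2 trick, scanning 16 bytes per iteration for '\n' instead of byte-by-byte (my illustration of the general technique, not atom's actual fgets_sse2 code):

Code:
#include <emmintrin.h>  /* SSE2 intrinsics */
#include <stddef.h>

/* Find the first '\n' in buf[0..len), 16 bytes at a time. */
static const char *find_newline_sse2 (const char *buf, size_t len)
{
  const __m128i nl = _mm_set1_epi8 ('\n');

  size_t i = 0;

  /* process full 16-byte chunks */
  for (; i + 16 <= len; i += 16)
  {
    __m128i chunk = _mm_loadu_si128 ((const __m128i *) (buf + i));

    int mask = _mm_movemask_epi8 (_mm_cmpeq_epi8 (chunk, nl));

    if (mask != 0)
    {
      /* index of the first matching byte in this chunk */
      return buf + i + __builtin_ctz (mask);
    }
  }

  /* scalar tail for the last few bytes */
  for (; i < len; i++)
  {
    if (buf[i] == '\n') return buf + i;
  }

  return NULL;
}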

Drawbacks: they don't handle \r correctly at all. For example, "mp64 -i ?d?d?d | len 1 2" shows only 0-9 with the Windows binary of mp64 (the trailing \r inflates each line's length by one, so the two-digit candidates get filtered out), but it probably works fine under Linux.
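
If someone wants to fix this, a plausible patch is to strip a trailing \r after each line is read. A hedged sketch, assuming the reader hands back the line with its \n already removed (the function name and signature are mine, not from the repo):

Code:
/* Hypothetical post-read fix: strip a trailing \r so CRLF input
   behaves like LF input. Returns the adjusted line length. */
static size_t strip_cr (char *line, size_t line_len)
{
  if (line_len > 0 && line[line_len - 1] == '\r')
  {
    line[--line_len] = '\0';
  }

  return line_len;
}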

Some tools are also totally broken; rli and rli2 are two such tools. Unfortunately, I don't understand C well enough to fix them.

Some data is also handled very oddly; enwik8 is one example:

Code:
$ cat enwik8 | md5sum
a1fa5ffddb56f4953e226637dabbb36a *-

mangix@Mangix-PC ~/devstuff/hashcat-utils/fgets_sse2
$ ./catf enwik8 | md5sum
37b68f34a9a1709c73fb446ad70049d9 *-

It seems to work just fine with wordlists, though, so no worries there:

Code:
$ cat rockyou.txt | md5sum
9076652d8ae75ce713e23ab09e10d9ee *-

mangix@Mangix-PC ~/devstuff/hashcat-utils/fgets_sse2
$ ./catf rockyou.txt | md5sum
9076652d8ae75ce713e23ab09e10d9ee *-

combinator is also missing; as far as I remember, it does not use fgets.

Extras: I wrote a line-count tool equivalent to "wc -l"; as far as I remember, it's faster than Cygwin's "wc -l" on large files. I also wrote a catf tool that should work just as well as cat.
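
For the curious, the usual way to make a line counter fast is to read large blocks and count '\n' bytes with memchr. A self-contained sketch along those lines (my assumption about the approach, not the linked tool's actual source):

Code:
#include <stdio.h>
#include <string.h>

int main (int argc, char *argv[])
{
  if (argc != 2)
  {
    fprintf (stderr, "usage: %s <file>\n", argv[0]);
    return 1;
  }

  FILE *fp = fopen (argv[1], "rb");

  if (fp == NULL) { perror (argv[1]); return 1; }

  static char buf[1 << 20]; /* 1 MiB read buffer */

  unsigned long long lines = 0;

  size_t n;

  while ((n = fread (buf, 1, sizeof (buf), fp)) > 0)
  {
    const char *p   = buf;
    const char *end = buf + n;

    /* count every '\n' in this block */
    while ((p = memchr (p, '\n', (size_t) (end - p))) != NULL)
    {
      lines++;
      p++;
    }
  }

  fclose (fp);

  printf ("%llu\n", lines);

  return 0;
}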

I also increased LEN_MAX in splitlen to 55.

Not sure if this will really help anyone. Feel free to edit the source; my command of C is not that great.

https://github.com/neheb/hashcat-utils
Looks nice, but the fact that some tools don't work is worrisome.