
hashcat Forum

Full Version: oclhashcat vs hashcat
You all might already know this, but here it is:

Pick a reasonably large MD5 hash file and run a simple dictionary
attack, no rules, no remove option, against your own dictionaries, all of them in one
folder if possible, just to have a comparison.

In the test example, oclhashcat gave 14883 hashes cracked.
Hashcat, with the same hash file and same dictionaries, gave 17110 hashes cracked.

In this run, hashcat also returned some hashes (not plaintexts) for double MD5, provided
those hashes are in the dictionary file, of course.
However, some plaintexts were cracked by hashcat that oclhashcat didn't crack.

Can anyone else reproduce and confirm this with their own files?
A difference of 2227 hashes looks too high.

Attached are screenshots taken before starting and after finishing a run on both
programs (1.00b53 for oclhashcat and 0.47b9 for hashcat).

[Image: ado7hvs4.jpg] [Image: adijraLB.jpg] [Image: acbfgWtU.jpg] [Image: adbF3Xrv.jpg]
Did you make sure to have no words >= length 32 inside your dictionaries? That's the difference between both programs.

There's also a huge difference between b53 and b57, please try again with b57.

In my local tests I was unable to reproduce this. I first tried with my unmodified dictionaries, which produced the expected result: hashcat cracked more than oclHashcat. Then I removed all words >= 32 bytes from my dictionaries and ran hashcat again. It then cracked the same number as oclHashcat.
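The filtering step described above can be sketched as follows. This is a minimal illustration (not from the thread, and not the actual len.exe tool): it drops dictionary words of 32 bytes or more, counting bytes rather than characters, which is the limit oclHashcat applies to MD5 candidates.

```python
def filter_wordlist(lines, max_bytes=31):
    """Keep only words whose UTF-8 byte length is below 32 (newline excluded)."""
    return [w for w in lines if len(w.encode("utf-8")) <= max_bytes]

# "ä" * 20 is 20 characters but 40 bytes in UTF-8, so it is dropped,
# as is the 40-byte ASCII word.
words = ["password", "ä" * 20, "x" * 40]
print(filter_wordlist(words))  # → ['password']
```

After filtering a wordlist this way, both programs should crack the same number of hashes from it.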

oclhashcat:
Quote: Recovered......: 5708/1633919 (0.35%) Digests, 0/1 (0.00%) Salts

hashcat before:
Quote: Recovered.: 6665/1633919 hashes, 0/1 salts

hashcat after:
Quote: Recovered.: 5669/1633919 hashes, 0/1 salts

When removing oversized words, please use the len.exe tool from hashcat-utils, because only that program is guaranteed to count correctly bytewise. For example, other length utilities count an "ä" as 1 character, but bytewise it takes 2 bytes.
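The distinction atom describes can be shown in two lines. This is just an illustration of character count versus UTF-8 byte count, the quantity len.exe measures:

```python
# "ä" is a single character but encodes to two bytes in UTF-8.
# Length tools that count characters will undercount such words.
word = "ä"
print(len(word))                   # character count: 1
print(len(word.encode("utf-8")))   # byte count: 2
```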
Thanks atom !
In fact, many dictionaries have words >= 32, so I'll run a test following your suggestion, with the latest beta
and using the len.exe tool, and will report back.
Using the same simple command line as before, except adding the -o option, b58 gives an error: hashfile and outfile are not allowed to point to the same file.
This error message doesn't happen with exactly the same command line on b53.
(11-26-2013, 07:57 PM)proinside Wrote: [ -> ]Using the same simple command line as before except adding -o option, b58 is giving an error: hashfile and outfile are not allowed to point to the same file.
This error message doesn't happen using exactly same command line with b53.
Maybe it's a new check that has been implemented. What was your command line?
Yeah, please post the command line. As mastercracker correctly said, it's a new check.
(11-26-2013, 09:50 PM)mastercracker Wrote: [ -> ]Maybe it's a new check that has been implemented. What was your command line?

oclHashcat64.exe -n 80 -m 0 --status --markov-disable --disable-potfile --gpu-temp-disable e:\tools\oclHashcat\hashfiles\hashes.txt -o pass.found e:\tools\oclHashcat\dic
why do you have -o between the two arguments?
(11-26-2013, 10:10 PM)epixoip Wrote: [ -> ]why do you have -o between the two arguments?

-o pass.found is the file where the cracked hashes should be stored.
Weird file name, but I never had any problem with that before b58.
getopt allows the usage of -o between two arguments, but the function that tries to determine whether the input hashfile and the outfile are the same file seems not to work on Windows. Please wait for a fix.
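The check atom describes can be sketched as follows. This is a hypothetical illustration, not oclHashcat's actual C implementation: it tries a stat-based same-file test and, when one of the paths does not exist yet (as with a fresh outfile), falls back to comparing normalized absolute paths, which also behaves sensibly on Windows.

```python
import os

def same_file(a, b):
    """Return True if paths a and b refer to the same file."""
    try:
        # Resolves both paths via stat(); works when both files exist.
        return os.path.samefile(a, b)
    except OSError:
        # One of the files may not exist yet (e.g. the outfile), so fall
        # back to comparing normalized absolute paths, case-insensitively
        # on platforms where that matters.
        return os.path.normcase(os.path.abspath(a)) == os.path.normcase(os.path.abspath(b))
```

A guard like this would let the hashfile/outfile check reject only genuine collisions instead of erroring out on valid command lines.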