
hashcat Forum

LM hash support for OCLHC
I'd like to see it. :)
It may not be bitsliced.
Of course, it may not be possible to implement with the current oclHashcat architecture (because of the 7+7 password length limit).
Not a top priority.
Rolf, the more I talk to people in the security community, the more I tend to agree. Having a high-quality GPU-accelerated LM brute-forcer/dictionary application would be nice. Companies need to realize that LM should not be used anymore. Rainbow tables give nearly 100% coverage, but they are slow, and if you have more than a handful of hashes you can forget it. I've always heard that hardware is easier to use with DES-based algorithms. I may just have to get some FPGAs.
(10-21-2010, 12:47 PM)Rolf Wrote:
I'd like to see it. :)
It may not be bitsliced.
Of course, it may not be possible to implement with the current oclHashcat architecture (because of the 7+7 password length limit).
Not a top priority.
I am not sure that a hybrid attack would really help that much, but since the keyspace is finite and not that big, pure GPU brute-forcing is very efficient. With EGB, it takes less than 24 hours on 2 GPUs to brute-force the whole length 0-7 upper-alpha/number/symbol keyspace against 1 million hashes. oclHashcat could easily do that as well if the algorithm can be implemented.
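For a rough sense of why that keyspace is tractable, here is a back-of-the-envelope calculation (a sketch only: the 33-symbol count is an assumption, since the exact charset definition varies):

Code:
# Length 0-7 keyspace over uppercase letters, digits and symbols.
# 26 + 10 + 33 = 69 characters is an assumed charset size.
charset = 26 + 10 + 33
keyspace = sum(charset ** n for n in range(0, 8))
print(f'{keyspace:,} candidates')                       # ~7.6e12
# Total speed needed to cover it all in 24 hours:
print(f'{keyspace / 86400 / 1e6:.0f} M hashes/second')  # ~87 M/s

A combined rate on that order is plausible for GPU attacks on a fast DES-based hash, which is consistent with the under-24-hours figure quoted above.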
Implementing LM is a good idea because:

The maximal length is 7 characters and only the uppercase alphabet is used (any combination can be calculated in real time)
Calculating RT (rainbow tables) on a GPU is possible, but with the lm-frt-cp437-850 charset (which adds some national characters) the RT becomes very big. With a full byte charset the RT becomes very, very big (on average I find only 2% of passwords from a national charset)
RT can only be used for a limited number of hashes, because the cryptanalysis time is fixed per hash (1 min for one hash, depending on the chain length; 2000 min for 2000 hashes)

But LM uses DES... and DES is not implemented :(
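For reference, the LM construction behind the points above is simple: the password is uppercased, padded with zero bytes to 14 characters, split into two 7-byte halves, and each half is used as a DES key to encrypt the constant string "KGS!@#$%". A minimal sketch in Python (assuming the pycryptodome package; the cp437 encoding is an assumption matching the charset mentioned above):

Code:
from Crypto.Cipher import DES  # pip install pycryptodome

def expand_key(half):
    # Spread the 7 key bytes (56 bits) across 8 DES key bytes;
    # the low bit of each byte is a parity bit that DES ignores.
    bits = ''.join(f'{b:08b}' for b in half)
    return bytes(int(bits[i:i + 7] + '0', 2) for i in range(0, 56, 7))

def lm_hash(password):
    pw = password.upper().encode('cp437')[:14].ljust(14, b'\x00')
    return b''.join(
        DES.new(expand_key(pw[i:i + 7]), DES.MODE_ECB).encrypt(b'KGS!@#$%')
        for i in (0, 7)
    ).hex().upper()

print(lm_hash(''))  # AAD3B435B51404EEAAD3B435B51404EE

Because each 7-byte half is encrypted independently, the two 16-hex halves of an LM hash can also be cracked independently, which is what the later posts in this thread rely on.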
I'd also like to see the LM implementation. :)
I would also like to see it so I can remove all the LM hashes from my not-found lists on https://www.md5decrypter.co.uk. I'd keep the input hex the same, 32 hex chars. The reason I mention this is that I have seen some sites that want LM hashes as 16 hex chars, which is awkward, but I can see where they are coming from, with LM split into 2 x 7 chars.
In the meantime, you could create a small private database containing all the 16-hex-char halves (both halves of the LM hash) of every full LM hash that has been cracked. You can then run a script that pulls out any 32-hex hash containing one of the halves from that 16-hex list. From the pulled-out list, you crack any 16-hex half that is not cracked yet and add it to the small database so that it is recognized next time. This should help reduce the amount in the not-found list. A sketch of such a script follows below.

P.S. I am currently doing a full cracking run with EGB, so I should post some results in the next day or two.
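Hypothetically, the matching script could look like this (a sketch; cracked_halves.txt and notfound.txt are placeholder file names, with cracked halves assumed to be stored one per line as 16HEXHALF:plaintext):

Code:
# Load known cracked 16-hex halves mapped to their <=7-char plains.
halves = {}
with open('cracked_halves.txt') as f:
    for line in f:
        h, _, plain = line.strip().partition(':')
        halves[h.upper()] = plain

# Scan the not-found list for 32-hex hashes with a known half.
with open('notfound.txt') as f:
    for line in f:
        h = line.strip().upper()
        if len(h) != 32:
            continue
        left, right = h[:16], h[16:]
        if left in halves and right in halves:
            print(f'{h}:{halves[left]}{halves[right]}')  # fully cracked
        elif left in halves or right in halves:
            print(f'{h}  # one half known, crack the other')

Fully matched hashes come out in the [32_hex_hash]:[pass] format mentioned below; half-matched ones are flagged for another cracking pass before being added back to the database.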
Re-visiting this old topic: one problem, mastercracker, is that ALL uncracked 32-char hex hashes in my database could be LM hashes!! So I need to search them all as they are in 32-hex format. Ophcrack supports 32-hex input BUT cannot handle the sheer amount. SAMInside is the same, and JtR is no good either. I need [32_hex_hash]:[pass] format, simple! If hashcat could support LM, it would piss it, but I know it would differ from other algos due to the hash being in 2 x 16-hex parts.
To reduce your list a bit, you can look for all hashes that end with
Code:
*AAD3B435B51404EE

Any such hash is obviously an LM hash, and the plaintext is at most 7 characters (the second half is the hash of the empty password).
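A quick way to pull those out of a list (a sketch; notfound.txt is a placeholder file name):

Code:
EMPTY_HALF = 'AAD3B435B51404EE'  # LM hash of an empty 7-byte half
with open('notfound.txt') as f:
    for line in f:
        h = line.strip().upper()
        if len(h) == 32 and h.endswith(EMPTY_HALF):
            print(h)  # only the first 16-hex half needs cracking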

If you have CUDA, you could use the Cryptohaze Multiforcer - it has LM support and handles large lists nicely (including taking 32-character hashes & splitting them - I find LOTS of LM in the hashkiller files & other MD5 lists).

As for the size of the wordlist... my tool laughs at your puny 2.6M 8-byte chunks. :)

I'll finish running the standard US charset through it & get them to you somehow.