01-20-2017, 11:42 PM
I'm trying to understand why I see such a large increase in time to complete as the list of hashes gets bigger, and hopefully get some guidance on improving performance. Since I've seen posts referencing very large hash lists (into the millions), I was surprised to see performance drop off as my list grew into the thousands. The noob in me thinks that comparing each generated hash against the list of target hashes shouldn't be this big of a bottleneck (see the sketch of my mental model below). I get that checking against 2000 hashes takes more compute than checking against one - I'm just looking for a sanity check on whether I'm missing something here, or whether there are performance tweaks (or other approaches, like splitting the file up) I should look at.
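To make that concrete, here's a rough sketch of the mental model I have for the comparison step. It uses plain unsalted MD5 and a made-up target list purely for illustration (md5crypt itself is obviously more involved) - the point is just that checking a computed hash against the target list should be a near-constant-time set lookup no matter how many targets are loaded:

```python
import hashlib

# Hypothetical target list for illustration - with a set, membership
# checks stay roughly O(1) regardless of how many hashes are loaded.
targets = {"5f4dcc3b5aa765d61d8327deb882cf99"}  # md5("password")

def is_cracked(candidate: str) -> bool:
    # Hash the candidate once, then do a single set lookup against
    # the entire target list.
    digest = hashlib.md5(candidate.encode()).hexdigest()
    return digest in targets

print(is_cracked("password"))  # True
```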
My gear:
1U Supermicro box
64GB RAM
4x Tesla P100-PCIE-16GB
2x Xeon CPU E5-2640 v4 @ 2.40GHz
Sample run times:
basic wordlist attack - rockyou with the best64 rule
50 md5crypt hashes - estimated time to complete = 27 mins
2063 md5crypt hashes - estimated time to complete = 18 hours 37 mins
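For what it's worth, a quick back-of-the-envelope check on those two estimates suggests the slowdown is roughly linear in the number of hashes:

```python
# Quick sanity check on the scaling between the two runs above.
# Times are the estimates hashcat reported; counts are the hash list sizes.

small_hashes, small_minutes = 50, 27
large_hashes, large_minutes = 2063, 18 * 60 + 37  # 18 hrs 37 mins

print(f"hash count ratio: {large_hashes / small_hashes:.1f}x")   # ~41.3x
print(f"time ratio:       {large_minutes / small_minutes:.1f}x")  # ~41.4x
```

Both ratios come out around 41x, so the total time appears to scale almost exactly with the size of the hash list.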
Thanks