09-14-2015, 05:30 PM
I was curious, so I looked into hashing algorithms and ran some tests, but now I'm confused.
First, I wrote a program in C# (okay, not the best tool for this). It searches for a SHA1 hash using a simple word dictionary: it has to hash every word in real time and compare it with the target hash. It averages about 590k passwords/sec.
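The approach described above (hash each dictionary word, compare with the target) can be sketched roughly like this. This is a minimal Python illustration, not my original C# program, and the wordlist and target are made-up example data:

```python
# Sketch of the dictionary attack described above: SHA1 each candidate
# word on the fly and compare against the target hash.
import hashlib

def crack(target_hex, words):
    """Return the word whose SHA1 digest matches target_hex, else None."""
    for word in words:
        if hashlib.sha1(word.encode()).hexdigest() == target_hex:
            return word
    return None

# Toy example data (not real password material).
wordlist = ["password", "letmein", "dragon"]
target = hashlib.sha1(b"letmein").hexdigest()
print(crack(target, wordlist))
```

The real program just does this in a loop over a much larger wordlist and times the throughput.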
Then I wanted to test the same thing on the GPU. CudaHashcat averages about 5675k passwords/sec with a dictionary attack. With a different method (brute force) it reaches 3850M passwords/sec.
My question is: are these numbers realistic, or did I do something wrong? What would be realistic? A roughly 10x CPU-GPU gap for the dictionary attack seems plausible, but nearly 1000x for brute force seems a bit too much.
All tests use the SHA1 hashing algorithm.
Specs: i7-4790K @ 4.40 GHz, GTX 970, 16 GB RAM
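For reference, here is a quick check of the speedup ratios implied by the numbers I posted above (just arithmetic on the reported rates, nothing measured here):

```python
# Reported rates, in passwords/sec.
cpu_dict  = 590e3     # C# program, dictionary attack
gpu_dict  = 5675e3    # CudaHashcat, dictionary attack
gpu_brute = 3850e6    # CudaHashcat, brute force

print(f"GPU vs CPU, dictionary: {gpu_dict / cpu_dict:.1f}x")
print(f"GPU brute force vs GPU dictionary: {gpu_brute / gpu_dict:.0f}x")
print(f"GPU brute force vs CPU dictionary: {gpu_brute / cpu_dict:.0f}x")
```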