06-18-2014, 10:33 PM
Heya,
For my particular use case, brute-force doesn't work as well as a wordlist. My list is already dozens of GB, and every now and then I add new lists of a few GB to the old one and do a simple "sort -u oldlist.txt > newlist.txt" to remove the duplicates.
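To be concrete, the whole update step looks roughly like this (file names and the temp path are placeholders; LC_ALL=C and the -S/-T flags are just GNU sort tuning to keep the multi-GB sort from thrashing):

    export LC_ALL=C    # byte-wise comparison: faster and locale-independent

    cat newstuff.txt >> oldlist.txt                            # append the new entries
    sort -u -S 4G -T /mnt/big_tmp oldlist.txt > newlist.txt    # re-sort everything, drop dups
    mv newlist.txt oldlist.txt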
Hashcat works great with such big lists, but managing the list (adding new entries without storing all the duplicates) is a pain and takes a lot of time.
Are there any best practices for managing wordlists of this size? Maybe a key-value store like LevelDB?
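For context, even the obvious incremental variant (keeping the master list sorted and only sorting the new additions, then merging with sort -m) still has to rewrite the whole multi-GB file on every update. A rough sketch, again with placeholder names:

    sort -u newstuff.txt > newstuff.sorted                  # sort only the small addition
    sort -m -u oldlist.txt newstuff.sorted > newlist.txt    # merge two pre-sorted files, drop dups
    mv newlist.txt oldlist.txt
    rm newstuff.sorted

The merge itself is a single linear pass, so it's much faster than re-sorting everything, but the I/O cost of rewriting the full output file remains, which is why I'm wondering whether flat text files are the wrong tool here.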