Search Results (sorted by Post, ascending)
Columns: Post · Author · Forum · Replies · Views · Posted

Thread: Copy and reuse dictionary cache
Post: RE: Copy and reuse dictionary cache
My having explicitly said "This isn't a direct answer to your question" isn't exactly "completely ignoring" your question, yes?
The canonical solution to this problem is to not do what you're doing...
royce · hashcat · 23 replies · 2,533 views · 07-10-2020, 03:18 AM

Thread: Copy and reuse dictionary cache
Post: RE: Copy and reuse dictionary cache
rli is for deduplication *across files* - see this example: https://hashcat.net/wiki/doku.php?id=hashcat_utils#rli
If you use 'split', you don't have to re-sort. Just use 'split' to take your exist...
royce · hashcat · 23 replies · 2,533 views · 07-10-2020, 05:50 PM

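The rli utility mentioned above (from hashcat-utils) removes from one wordlist every line that also appears in other wordlists. For readers without hashcat-utils installed, coreutils can sketch the same cross-file deduplication on sorted files; the file names below are placeholders, not from the thread:

```shell
# comm -23 keeps only lines unique to the first file - i.e. lines of
# new.dict that are absent from old.dict. Both inputs must be sorted,
# which is also what rli2 requires of its inputs.
printf 'alpha\nbravo\ncharlie\n' | sort -u > new.dict
printf 'bravo\n' | sort -u > old.dict
comm -23 new.dict old.dict > unique.dict
cat unique.dict   # alpha, charlie
```

Unlike rli, comm handles exactly two files at a time, so multiple "remove" files would need to be merged (e.g. with `sort -u -m`) first.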
Thread: Copy and reuse dictionary cache
Post: RE: Copy and reuse dictionary cache
rli2 is definitely faster - once you've paid the initial cost of sorting the input files first. But it only takes one file to be removed as input.
There's also a new project 'rling' in progre...
royce · hashcat · 23 replies · 2,533 views · 08-01-2020, 08:44 PM

Thread: Copy and reuse dictionary cache
Post: RE: Copy and reuse dictionary cache
The markov flag is unrelated to the dictionary.
I've used split -n l/3 in the past and it split properly. It's OK if the resulting files are not the same size, though they are usually close in my e...
royce · hashcat · 23 replies · 2,533 views · 08-01-2020, 09:53 PM

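The `split -n l/3` invocation from the post above divides a file into three chunks on line boundaries, so no word is cut in half and, as the post notes, chunk sizes may differ slightly. A minimal sketch (the wordlist here is a stand-in; `-n l/N` requires GNU coreutils):

```shell
# Divide a stand-in wordlist into 3 line-aligned chunks with numeric
# suffixes (part.00, part.01, part.02), then verify nothing was lost.
seq 1 100 > words.txt
split -n l/3 -d words.txt part.
cat part.00 part.01 part.02 > rejoined.txt
cmp words.txt rejoined.txt   # no output: content is identical
```

Because each chunk preserves the original line order, a pre-sorted wordlist stays sorted within each chunk, which is why no re-sort is needed.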
Thread: Copy and reuse dictionary cache
Post: RE: Copy and reuse dictionary cache
I don't know what you mean by the first sentence. As for hashcat loading dictionary files one by one, what should it be doing instead?
royce · hashcat · 23 replies · 2,533 views · 08-06-2020, 05:49 AM

Thread: Copy and reuse dictionary cache
Post: RE: Copy and reuse dictionary cache
Splitting the wordlist into smaller chunks doesn't change the *total* load or attack time. It just distributes the dictionary load time into smaller chunks as well.
If this isn't helpful, please re...
royce · hashcat · 23 replies · 2,533 views · 08-11-2020, 05:59 PM

Thread: Copy and reuse dictionary cache
Post: RE: Copy and reuse dictionary cache
royce · hashcat · 23 replies · 2,533 views · 08-16-2020, 01:26 AM

Thread: Cooling
Post: RE: Cooling
It sounds like you need to improve airflow in the case, or move the system to a cooler room (or both). Running cards hot is a gamble, and I'd concentrate on getting the cooling resolved rather than gu...
royce · hashcat · 10 replies · 7,556 views · 05-09-2018, 04:36 PM

Thread: Contact & dump list manager
Post: RE: Contact & dump list manager
For future searchers, this thread is relevant:
https://hashcat.net/forum/thread-6796.html
royce · General Talk · 1 reply · 5,146 views · 11-19-2017, 03:46 AM

Thread: complex md5 salted hash
Post: RE: complex md5 salted hash
Yes, if I understand you correctly - if you're only attacking a single salt, then appending a hyphen to the end of the salt would have the same effect.
royce · hashcat · 1 reply · 2,518 views · 11-30-2017, 04:57 AM

Thread: compiling makefile github
Post: RE: compiling makefile github
https://github.com/hashcat/hashcat/blob/master/BUILD.md
royce · Beta Tester · 8 replies · 13,135 views · 05-08-2017, 10:25 PM

Thread: compiling makefile github
Post: RE: compiling makefile github
Briefly:
* Don't use The-Distribution-Which-Does-Not-Handle-OpenCL-Well (Kali). Use a direct OS install.
* Don't use aircrack-ng. Get the caps with https://github.com/ZerBea/hcxtools (or better ...
royce · Beta Tester · 8 replies · 13,135 views · 05-08-2017, 11:55 PM

Thread: combining sustems
Post: RE: combining sustems
You can manually divide up work using -s/--skip and -l/--limit.
There are also some frameworks that help to automate this.
https://hashcat.net/wiki/doku.php?id=frequently_asked_questions#how_can_i_d...
royce · General Talk · 1 reply · 2,901 views · 03-10-2017, 04:02 AM

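hashcat's `-s`/`--skip N` and `-l`/`--limit M` mean: skip the first N candidates of the keyspace, then process the next M. The slicing arithmetic can be sketched with coreutils on a plain candidate file; the file names and numbers below are illustrative, not from the thread:

```shell
# Simulate a keyspace of 1000 candidates, then take "node 2's" slice:
# skip the first 500, limit to the next 250 (candidates 501-750).
seq 1 1000 > keyspace.txt
tail -n +501 keyspace.txt | head -n 250 > node2.txt
head -n 1 node2.txt   # 501
tail -n 1 node2.txt   # 750
```

Dividing a real attack this way just requires that the nodes' skip/limit windows tile the keyspace without overlap.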
Thread: Combining a custom hex charset with the standard charsets
Post: RE: Combining a custom hex charset with the standa...
You have to use a list of masks that represent all possible character positions in the target. This is of course not ideal; if hashcat had support for multibyte characters, all of these workarounds wo...
royce · hashcat · 4 replies · 2,417 views · 01-31-2019, 06:27 PM

Thread: Combining a custom hex charset with the standard charsets
Post: RE: Combining a custom hex charset with the standa...
royce · hashcat · 4 replies · 2,417 views · 02-05-2019, 05:56 PM

Thread: [Solved] Combined attack with four words
Post: RE: Combined attack with four words
Unless it is an extremely slow hash, it's easier to just try all possible combinations of the four words, using something like
https://hashcat.net/wiki/doku.php?id=princeprocessor
royce · hashcat · 2 replies · 2,634 views · 11-25-2017, 06:19 PM

Thread: combine rules without duplicates?
Post: RE: combine rules without duplicates?
You could use mp64 to generate them, maybe?
https://hashcat.net/wiki/doku.php?id=rules_with_maskprocessor
Might still have to dedupe it a little after, depending.
royce · hashcat · 10 replies · 965 views · 08-31-2020, 11:53 PM

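mp64 (maskprocessor) can expand a digit mask into rule lines, as the linked wiki page describes. For readers without it installed, a plain shell loop sketches the same expansion for "append two digits" rules; the output file name is illustrative:

```shell
# Each rule line '$X$Y' tells hashcat to append digits X and Y to a
# candidate. Two nested loops emit all 100 such rules, the same set
# mp64 would generate from a two-digit mask.
for i in 0 1 2 3 4 5 6 7 8 9; do
  for j in 0 1 2 3 4 5 6 7 8 9; do
    printf '$%s$%s\n' "$i" "$j"
  done
done > append2.rule
wc -l < append2.rule   # 100
```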
Thread: combine rules without duplicates?
Post: RE: combine rules without duplicates?
Dedupe of text on the command line is a largely solved problem. Depends on your platform. 'sort -u' on Unix-likes covers most use cases. On Windows, 'sort.exe /unique' seems roughly equivalent.
royce · hashcat · 10 replies · 965 views · 09-01-2020, 04:10 AM

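The `sort -u` approach from the post above sorts and drops duplicate lines in one pass. A minimal sketch on a combined rule list (file names and rule contents are illustrative):

```shell
# Five rule lines, two of them duplicates; sort -u collapses the list
# to its three unique rules.
printf 'c $1\nl\nc $1\nu\nl\n' > combined.rule
sort -u combined.rule > combined.dedup.rule
wc -l < combined.dedup.rule   # 3
```

Note that `sort -u` reorders the rules; if the original order matters, `awk '!seen[$0]++'` is a common order-preserving alternative.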
Thread: combine rules without duplicates?
Post: RE: combine rules without duplicates?
There's also this project, which tries to detect rules with redundant results:
https://github.com/0xbsec/duprule/
royce · hashcat · 10 replies · 965 views · 09-01-2020, 05:04 PM

Thread: Combinator Attack issue with rules
Post: RE: Combinator Attack issue with rules
Just a guess, but you could try combinator3 and then tack on four digits with rules ( [wordlist] [4digitwordlist] [wordlist] ?d?d?d?d)
royce · hashcat · 10 replies · 6,614 views · 07-08-2018, 06:51 AM

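combinator and combinator3 (hashcat-utils) emit every concatenation of one line from each input list, in order. Nested shell loops sketch the same cross-product for tiny lists; the file names and words are illustrative:

```shell
# Every left+right concatenation, in the same order combinator would
# produce them: red1, red2, blue1, blue2. combinator3 extends the
# same idea to three lists.
printf 'red\nblue\n' > left.txt
printf '1\n2\n' > right.txt
while read -r a; do
  while read -r b; do
    printf '%s%s\n' "$a" "$b"
  done < right.txt
done < left.txt > combined.txt
cat combined.txt
```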