
hashcat Forum

Full Version: Limiting the consecutive occurrence
So we are limiting it to this so far...

No more than 2 instances of upper alpha consecutively.
No more than 2 of any character within a single line

with this command

Code:
sed "/\(.\)\1\1/d;/\(.\).*\1.*\1/d"

Which on the letter "A" produces this

Code:
AABBCCDD
AABBCCDE
AABBCCDF
AABBCCDG
AABBCCDH
AABBCCDI
AABBCCDJ
AABBCCDK
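
For readers following along, here is a commented version of that filter. This is just a sketch; the expressions are identical to the command above, and "candidates.txt" / "filtered.txt" are placeholder file names (the same expressions can equally sit at the end of a pipe).

Code:
# /\(.\)\1\1/d      : delete lines containing the same character 3+ times in a row
# /\(.\).*\1.*\1/d  : delete lines where any single character occurs 3 or more times in total
sed "/\(.\)\1\1/d;/\(.\).*\1.*\1/d" candidates.txt > filtered.txt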


How would you limit the total number of times a consecutive instance can happen to, say, one?

Code:
AABBCCDD

So in the line above, "A" is the first character to appear twice; when the character "B" appears twice, the line would be deleted. Even if the line was

Code:
ABCADEFB

because "A" is still the first character to appear twice the line would still be deleted when the "B" appear twice as the second "A" came before the second "B"

Hope this makes sense :)

Can this be done with sed?
@Pixel:: I believe this is what you want:
Code:
sed "A BUG found!" xD

I don't what you people trying to do...
I'm just the scripting guy!! xD

EDiT:
"A BUG found!"
Try this one:
Code:
sed "/\(.\)\1\1/d;/\(.\).*\1.*\1/d;/\(.\).*\1.*\(.\).*\2/d;/\(.\).*\(.\).*\1.*\2/d"
(05-28-2012, 06:43 AM)M@LIK Wrote: [ -> ]@Pixel:: I believe this is what you want:
Code:
A BUG found!

I don't know what you people are trying to do...
I'm just the scripting guy!! xD

EDiT:
"A BUG found!"
Try this one:
Code:
sed "/\(.\)\1\1/d;/\(.\).*\1.*\1/d;/\(.\).*\1.*\(.\).*\2/d;/\(.\).*\(.\).*\1.*\2/d"

That was quick, you are a sed scripting god M@LIK :D Thanks

M@LIK, it's not working as I'd hoped

Code:
AABACDEF
AABACDEG
AABACDEH
AABACDEI
AABACDEJ
AABACDEK

breaks this rule

No more than 2 of any character within a single line


Edit: I got it working, but I can't be sure as I don't understand the command at all. I noticed you added
Code:
/\(.\).*\1.*\(.\).*\2/d

and
Code:
/\(.\).*\(.\).*\1.*\2/d

so I removed one of them. The command I'm using that seems to work is

Code:
sed "/\(.\)\1\1/d;/\(.\).*\1.*\1/d;/\(.\).*\1.*\(.\).*\2/d"

and output is

Code:
AABCDEFG
AABCDEFH
AABCDEFI
AABCDEFJ
AABCDEFK
AABCDEFL

these extra commands have slowed it down even more lol, shame there is no GPU version of sed or even a multi-core version :)
@Pixel:: Yup, try the new one.

In the "EDiT".
If it doesn't work too, I'll try fixing it when I'm back.
It's 8am and I'm leaving.
Just had another idea for sed: can it also delete lines that have any 6-character alphabet sequence within them?

This is the output I'm now getting...

Code:
AABCDEFG
AABCDEFH
AABCDEFI
AABCDEFJ
AABCDEFK
AABCDEFL

so the 6 characters in an alphabet sequence within these lines would be

Code:
ABCDEF

so all the lines above would get deleted, but only if they have 6?

Do you get me?
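
One way to express that as a filter (a sketch only, not something from the thread; it assumes bash and grep, and "candidates.txt" / "filtered.txt" are placeholder names) is to enumerate the 21 possible ascending six-letter runs and drop any line containing one of them:

Code:
#!/bin/bash
# build the 21 six-letter ascending runs: ABCDEF, BCDEFG, ... UVWXYZ
ALPHA=ABCDEFGHIJKLMNOPQRSTUVWXYZ
PATTERNS=()
for i in $(seq 0 20); do
    PATTERNS+=(-e "${ALPHA:$i:6}")
done
# delete every line that contains any of those runs
grep -v "${PATTERNS[@]}" candidates.txt > filtered.txt

The same 21 runs could just as well be turned into extra /RUN/d commands appended to the sed filter so everything stays in one pipe.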
How large is one file generated by such a command? After we have the correct command tested, I can help you with either N, T or K, depending on the file size. It's about time I got a larger HD too; I only have 250 GB available.

Is there a way to feed the result directly to a compressor like GZIP or BZIP?

How will you upload or share? Uploading allows only 100MB or a little more each time, or maybe I am wrong? Do you already have a place planned to store them?
Pixel Wrote: [ -> ]Edit: I got it working, but I can't be sure as I don't understand the command at all. I noticed you added
Code:
/\(.\).*\1.*\(.\).*\2/d

and
Code:
/\(.\).*\(.\).*\1.*\2/d

so I removed one of them. The command I'm using that seems to work is

Code:
sed "/\(.\)\1\1/d;/\(.\).*\1.*\1/d;/\(.\).*\1.*\(.\).*\2/d"

and output is

Code:
AABCDEFG
AABCDEFH
AABCDEFI
AABCDEFJ
AABCDEFK
AABCDEFL

these extra commands have slowed it down even more lol, shame there is no GPU version of sed or even a multi-core version :)

I added those for the rule you wanted; I don't know how you got it working without them.
And yes, the more commands, the slower.
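
A quick way to see what that fourth expression was doing (a sketch using a few hand-picked test lines rather than real mp32 output):

Code:
# AABCDEFG should survive (only one doubled letter)
# AAABCDEF is removed by /\(.\)\1\1/d (three in a row)
# AABBCDEF is removed by /\(.\).*\1.*\(.\).*\2/d (pattern X..X..Y..Y)
# ABCABCDE has its pairs interleaved (A..B..A..B), which only the dropped
# /\(.\).*\(.\).*\1.*\2/d expression catches, so it slips through here
printf "AABCDEFG\nAAABCDEF\nAABBCDEF\nABCABCDE\n" | sed "/\(.\)\1\1/d;/\(.\).*\1.*\1/d;/\(.\).*\1.*\(.\).*\2/d"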


Pixel Wrote: [ -> ]Just had another idea for sed: can it also delete lines that have any 6-character alphabet sequence within them?

Yes, of course!
But, there's no need to do that, it will be slower.
Just use "-s AABCDEGA" to skip all those candidates as they won't appear again.


ntk Wrote: [ -> ]how large is one file generated by such a command?

Huge! Hash-IT generated all the possibilities beginning with "A" without any filtering; it was 75GB, so 75x26 = 1950GB. Let's say we can filter it down to 25% = 487.5GB.
I barely have 10GB free on my 2TB xD
Anyways, we're still working on the filter, so nothing to start with.
Wow, you guys are making some progress! :D

Well, I am sorry to report I am getting nowhere with this. I used the code I set out in my last post and left my computer running last night, only to return to it in the morning and it hadn't written anything!!

I don't understand; I can use the exact same code to make length 1,2,3,4,5,6 passwords, but if I try 7 or 8 nothing happens at all, even after waiting over 10 hours!

I am using XP Pro and I am starting to wonder if you two are doing this in BackTrack? I can only guess this is a Windows problem.

Can you two confirm what OS you are using to do this? Also, have you tried it on a Windows machine?

I have been learning a bit more about regular expressions and I have found code very similar to M@LIK's, but for regular expressions. I am going to try to make a full set of A and then filter it afterwards, as I believe that may be quicker. I will let you know if I get it working.

As you both already know, having to store this amount of data is going to be a problem; this really does need to be done on the fly using GPU. We need atom!!!

Good luck guys and keep up the experimenting; even though I know this needs to be done on the fly, I am getting quite interested in our project! :D
(05-28-2012, 09:57 AM)ntk Wrote: [ -> ]DO you already plan a place to store them?

I figured out some time ago that if you compress a text file that has a repetitive pattern, like I have here, with GZIP and then 7ZIP it, for some reason it saves loads more than GZIP or 7ZIP alone. With this method of double compressing I can get all 1950GB into 667MB, which is 25.6MB a letter, so storing and downloading them shouldn't be a problem. As for the place, well, torrents. But these are unfiltered.
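
For what it's worth, a sketch of that double-compression step (file names are placeholders; it assumes gzip and the 7-Zip command-line tool 7z are installed):

Code:
# first pass: gzip the raw wordlist (the repetitive candidates compress very well)
gzip -9 A_unfiltered.txt                       # leaves A_unfiltered.txt.gz
# second pass: wrap the .gz in a 7z archive at maximum compression
7z a -mx=9 A_unfiltered.7z A_unfiltered.txt.gz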

(05-28-2012, 10:21 AM)M@LIK Wrote: [ -> ]Yes, of course!
But, there's no need to do that, it will be slower.
Just use "-s AABCDEGA" to skip all those candidates as they won't appear again.


M@LIK, I don't think you get what I mean. I mean any 6 in alphabetical order in between the first and last characters, like any of these:

ABCDEFGA
ACDEFGHA
ADEFGHIA
AEFGHIJA
...
BRSTUVWB
CSTUVWXC
DTUVWXYD
EUVWXYZE

So all these would be deleted as well as all the others.

(05-28-2012, 12:43 PM)Hash-IT Wrote: [ -> ]Can you two confirm what OS you are using to do this ? Also have you tried it on a windows machine ?

I'm on Windows 7 64-bit and it works fine :)

Try this, it only takes about 20 mins and makes 1GB worth of passwords, just to test:

Code:
mp32.exe --start-at=HSJSKLMK --stop-at=HSSSTMET H?u?u?u?u?u?u?u | sed "/\(.\)\1\1/d;/\(.\).*\1.*\1/d">out.txt
Hash-IT Wrote: [ -> ]Well, I am sorry to report I am getting nowhere with this...

:(
I'm using Windows too, maybe your CPU is not fast enough.
sed is probably the fastest regex (regular expressions) editor, so no need to look for any other tool, unless it supports multi-thread or GPU (like hashcat's rule engine).
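
sed itself is single-threaded, but the work splits naturally across processes. A rough sketch (assuming a Linux box with bash and the mp64.bin build of maskprocessor; adjust the binary name for Windows) is to run one mp | sed pipeline per leading letter in parallel:

Code:
# one pipeline per leading letter, each running on its own core
for L in A B C D; do
    ./mp64.bin "${L}?u?u?u?u?u?u?u" | sed "/\(.\)\1\1/d;/\(.\).*\1.*\1/d" > "out_${L}.txt" &
done
wait    # block until all background pipelines have finished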

Pixel Wrote: [ -> ]M@LIK, I don't think you get what I mean. I mean any 6 in alphabetical order in between the first and last characters, like any of these...

Hmm, right now I can't think of anything for that; as far as I know there's nothing near to that in sed.


With all these rules, we need to write a program or a script!
It would be something like:
Code:
mp64 ?u?u?u?u?u?u?u?u | filtermeplease -c2 -x2
-cN = No more than N instances of any character consecutively.
-xN = No more than N of any character within a single line.
Awesome! xD
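
Purely as an illustration of that idea, here is what a minimal "filtermeplease" could look like as a shell wrapper that builds the sed rules from -cN / -xN (entirely hypothetical; the name and flags simply mirror the example above):

Code:
#!/bin/sh
# filtermeplease (sketch): reads candidates on stdin, writes survivors to stdout
C=2    # -cN: no more than N instances of any character consecutively
X=2    # -xN: no more than N instances of any character per line
while getopts c:x: opt; do
    case $opt in
        c) C=$OPTARG ;;
        x) X=$OPTARG ;;
    esac
done
# build "\(.\)\1\1..." (N+1 in a row) and "\(.\).*\1.*\1..." (N+1 in total)
consec="\\(.\\)"; total="\\(.\\)"; i=0
while [ "$i" -lt "$C" ]; do consec="${consec}\\1"; i=$((i+1)); done
i=0
while [ "$i" -lt "$X" ]; do total="${total}.*\\1"; i=$((i+1)); done
exec sed "/$consec/d;/$total/d"

Used exactly as in the example: mp64 ?u?u?u?u?u?u?u?u | sh filtermeplease -c2 -x2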