somenxavier

Reputation: 1557

How to get unique IP (and number of banned times) from fail2ban logs

I have a lot of banned IPs in my fail2ban log. It has this format:

[...]
2021-02-28 00:03:33,818 fail2ban.filter         [687]: INFO    [sshd] Found 193.142.146.33 - 2021-02-28 00:03:33
2021-02-28 00:07:17,068 fail2ban.filter         [687]: INFO    [sshd] Found 193.142.146.33 - 2021-02-28 00:07:16
2021-02-28 00:08:49,568 fail2ban.filter         [687]: INFO    [sshd] Found 142.93.234.120 - 2021-02-28 00:08:49
[...]

I want to transform that into a list of unique IPs with the number of times each was banned (for the previous example):

2 193.142.146.33
1 142.93.234.120

Comment: uniq (zcat /var/log/fail2ban.log.4.gz | grep ssh | uniq -c - | less) does not work, because the timestamps differ between lines. So I need some preprocessing before calling uniq.

Upvotes: 1

Views: 582

Answers (1)

Shawn

Reputation: 52354

Assuming all the lines of the log follow the same template as those three, with no extra spaces anywhere:

zcat /var/log/fail2ban.log.4.gz | awk '{ print $8 }' | sort | uniq -c | sort -k1,1rn

Note that uniq expects its input to be sorted. The final sort in the pipeline will show the most frequently occurring addresses first.
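As a quick sanity check, here is the same pipeline run on the three sample lines from the question, with a heredoc standing in for the gzipped log file (an assumption for demonstration only; the real command reads from zcat):

```shell
#!/bin/sh
# The 8th whitespace-separated field of each log line is the IP address.
# sort groups identical IPs so uniq -c can count them; the final sort
# orders by count, highest first.
cat <<'EOF' | awk '{ print $8 }' | sort | uniq -c | sort -k1,1rn
2021-02-28 00:03:33,818 fail2ban.filter         [687]: INFO    [sshd] Found 193.142.146.33 - 2021-02-28 00:03:33
2021-02-28 00:07:17,068 fail2ban.filter         [687]: INFO    [sshd] Found 193.142.146.33 - 2021-02-28 00:07:16
2021-02-28 00:08:49,568 fail2ban.filter         [687]: INFO    [sshd] Found 142.93.234.120 - 2021-02-28 00:08:49
EOF
```

This prints "2 193.142.146.33" before "1 142.93.234.120" (uniq -c left-pads the counts with spaces).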


Or using perl's Regexp::Common module to get a robust regular expression to extract all IPv4 addresses from each line:

zcat /var/log/fail2ban.log.4.gz | perl -MRegexp::Common=net -nE 'say for m/\b$RE{net}{IPv4}\b/g' | sort | uniq -c | sort -k1,1rn
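A further variant, not from the answer but a common awk idiom worth noting: counting in an awk associative array avoids the sort | uniq -c step entirely, since the input no longer needs to be pre-sorted:

```shell
#!/bin/sh
# Sketch of an awk-only count (assumes the same fixed log layout, where
# the IP is the 8th field). count[] accumulates per-IP totals; the END
# block emits "count ip" pairs, sorted by count descending.
zcat /var/log/fail2ban.log.4.gz \
  | awk '{ count[$8]++ } END { for (ip in count) print count[ip], ip }' \
  | sort -k1,1rn
```

Unlike uniq -c, this prints the count and IP separated by a single space, with no leading padding.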

Upvotes: 2
