agrawal1084

Reputation: 139

What is the right way to extract REST API paths in Logstash?

I have this kind of log: 21.4.1.2 - - [28/Dec/2016:12:18:40 +0000] "GET a/b/c/d/e/f HTTP/1.1" 200 984072 "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36" 0.104 0.103. How should I extract this using a grok pattern? I also don't know the number of path segments in advance, i.e. the REST API path can be a/b/c or a/b/c/d/e/f/g. How should I handle it so that I can group by a, b, or c in Kibana?

Upvotes: 1

Views: 715

Answers (2)

Derrick

Reputation: 1606

There is a %{GREEDYDATA:value} grok pattern that you can use to extract the API path; from there you could split on "/". This tool can be useful when debugging grok patterns: http://grokdebug.herokuapp.com/.

So start with:

%{IP:clientip} \- \- \[%{NOTSPACE:date} \+%{INT}\] \"%{WORD:action} %{GREEDYDATA:api} %{WORD:protocol}/%{NUMBER:protocolNum}\" %{NUMBER:status} %{NUMBER} %{QUOTEDSTRING} %{NUMBER} %{NUMBER}

That will give you the API path in the api field.
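Putting the pattern and the split step together, a minimal filter sketch might look like the following (the api field name comes from the pattern above; everything else is standard grok/mutate usage):

```
filter {
  grok {
    match => {
      "message" => "%{IP:clientip} \- \- \[%{NOTSPACE:date} \+%{INT}\] \"%{WORD:action} %{GREEDYDATA:api} %{WORD:protocol}/%{NUMBER:protocolNum}\" %{NUMBER:status} %{NUMBER} %{QUOTEDSTRING} %{NUMBER} %{NUMBER}"
    }
  }
  mutate {
    # Turns api = "a/b/c/d/e/f" into an array ["a", "b", "c", "d", "e", "f"]
    split => { "api" => "/" }
  }
}
```
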

Alternatively, we are working on Moesif, an API debugging and analytics tool (https://www.moesif.com/features), which may be helpful depending on your requirements. (Full disclosure: I am the CEO.)

Upvotes: 0

Alain Collins

Reputation: 16362

If there's a known depth, you could re-grok the URL field into that many named fields.
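For a fixed depth of three, the re-grok could look something like this sketch (api is assumed to hold the extracted path; the api_part* field names are illustrative):

```
filter {
  grok {
    match => { "api" => "^%{WORD:api_part1}/%{WORD:api_part2}/%{WORD:api_part3}" }
  }
}
```
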

If there's an arbitrary depth, mutate-split could turn it into an array of segments, but those wouldn't be very useful on their own.

How about the csv{} filter, which could take "/" as the separator and would produce a set of fields named "column1", "column2", etc.?
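A sketch of that approach (assuming the path was already grokked into an api field; by default the csv filter reads message, so source points it at api instead):

```
filter {
  csv {
    source    => "api"
    separator => "/"
    # "a/b/c/d" becomes column1 = "a", column2 = "b", column3 = "c", column4 = "d",
    # regardless of depth, which makes grouping by column1/column2 in Kibana straightforward.
  }
}
```
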

Upvotes: 1
