natdev

Reputation: 577

How to aggregate logs to a persistent table in Splunk?

My web application logs every user action.
Every log entry contains the user id, the action (click, double-click, etc.), a timestamp, and a short description.
The logs for a specific user are only kept for a few days, so I need to aggregate them into a processed report before they expire.
I want to collect (and eventually display) a specific action (let's say double-click) for each user, along with its description.

For example, I want a table that is updated for every log entry (or every few entries, with some delay),
aggregating per userId the timestamps of all of that user's double-clicks, the count of double-clicks,
and the description of each double-click.

How can I solve this?
What tools does Splunk offer for aggregating log streams that get removed at the source?

Upvotes: 1

Views: 467

Answers (2)

Simon Duff

Reputation: 2651

As RichG said, you can configure Splunk to retain that data for as long as you require. However, if you want to retain only certain elements, you may want to look at summary indexing in Splunk: https://docs.splunk.com/Documentation/Splunk/8.0.0/Knowledge/Usesummaryindexing
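A minimal sketch of that approach: schedule a search like the one below to run periodically, and it will write the aggregated double-click data into a separate summary index that outlives the raw events. The index names (`web_logs`, `web_actions_summary`) and field names (`action`, `userId`, `description`) are assumptions about your setup, so adjust them to match your data:

```spl
index=web_logs action="doubleClick" earliest=-1h@h latest=@h
| stats count AS double_click_count,
        values(_time) AS click_times,
        values(description) AS descriptions
  BY userId
| collect index=web_actions_summary
```

Reporting then runs against `index=web_actions_summary`, which keeps only the processed rows and can have a much longer retention period than the raw web logs.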

Upvotes: 0

RichG

Reputation: 9926

Splunk is a tool that aggregates log streams. Forward your web application logs to Splunk and they will stay there until you run out of disk space or they age out (the default retention is about 6 years), even if the original source disappears. Once you have the data in Splunk, you can report on it as you've described.
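For instance, a report like the one you described can be produced with a single `stats` search. The index name (`web_logs`) and field names (`action`, `userId`, `description`) below are assumptions about how your events are indexed and extracted:

```spl
index=web_logs action="doubleClick"
| stats count AS double_clicks,
        list(_time) AS click_times,
        list(description) AS descriptions
  BY userId
```

This yields one row per userId with the double-click count, every timestamp, and every description, which you can save as a report or add to a dashboard.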

Upvotes: 1
