Amine Ch 99

Reputation: 143

Archiving log files from Elasticsearch and bringing them back later to minimize storage cost

I need some answers from experienced people, since this is my first time using the Elastic Stack (internship). Assume I ingest logs (coming from multiple servers: Apache, Nginx, ...) into Elasticsearch. After a month, or maybe less, Elasticsearch will fill up with logs, which will become very expensive in terms of storage and performance. So I need to set a limit (let's say when the amount of logs reaches 100 GB) and remove old logs from Elasticsearch to free space for new incoming logs.

However, I need to preserve the old logs, not just delete them (the solutions I found by googling were all about deleting old logs to free space, which doesn't help in my case), and be able to bring those old logs back into Elasticsearch if needed. Is there an optimal way, in terms of cost and performance (for example, compressing the old logs), to achieve this with minimal cost?

Upvotes: 0

Views: 1877

Answers (2)

Vraj Bhatt

Reputation: 442

You can also use an Index Lifecycle Management (ILM) policy, which handles log retention by automatically moving indices through phases such as hot, warm, cold, and delete as they age.
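As a minimal sketch (the policy name, size, and age thresholds below are illustrative assumptions, not recommendations), an ILM policy can be created through the Elasticsearch REST API, for example from the Kibana Dev Tools console:

```
PUT _ilm/policy/logs_policy
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": { "max_size": "50gb", "max_age": "7d" }
        }
      },
      "cold": {
        "min_age": "30d",
        "actions": {
          "set_priority": { "priority": 0 }
        }
      },
      "delete": {
        "min_age": "60d",
        "actions": { "delete": {} }
      }
    }
  }
}
```

Indices in the cold phase stay searchable but are expected to be queried rarely; note that the delete phase removes data permanently, so for the asker's requirement it should be combined with snapshots (see the other answer) rather than used alone.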

Upvotes: 1

Sarmad

Reputation: 26

You can use the snapshot and restore feature with a custom repository to offload old data and retrieve it when needed. Try the following guide: https://www.elastic.co/guide/en/kibana/7.5/snapshot-restore-tutorial.html
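A minimal sketch of the workflow (the repository name `my_backup`, the filesystem path, and the `logs-*` index pattern are assumptions for illustration; the `fs` repository type requires the path to be registered in `path.repo` in `elasticsearch.yml`):

```
# Register a shared-filesystem snapshot repository
PUT _snapshot/my_backup
{
  "type": "fs",
  "settings": { "location": "/mnt/backups/elasticsearch" }
}

# Snapshot the old log indices
PUT _snapshot/my_backup/snapshot_1?wait_for_completion=true
{
  "indices": "logs-*"
}

# After verifying the snapshot, the live indices can be deleted to free space,
# and restored later when needed:
POST _snapshot/my_backup/snapshot_1/_restore
{
  "indices": "logs-*"
}
```

Snapshots are incremental (unchanged segments are not copied again), and the repository can also point at cheaper object storage such as S3 via a repository plugin, which fits the cost requirement in the question.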

Upvotes: 1
