Reputation: 2025
I have Grafana Loki running in my cluster. I can see my logs, but the cluster is no longer in use and I would like to delete it. Before that, I still have some logs I would like to extract from Loki and store locally on my system or in an Azure bucket.
Is there a way to extract these logs and save them locally or to an Azure bucket? I set up Loki and Prometheus with the Loki Helm chart. Any help is appreciated.
Upvotes: 9
Views: 19474
Reputation: 11085
You can "download" complete logs with this command:
logcli query \
    --from="2024-01-01T00:00:00Z" \
    --to="2024-07-31T00:00:00Z" \
    --forward \
    --part-path-prefix . \
    --parallel-duration=15m \
    --parallel-max-workers=2 \
    --merge-parts \
    '{job="foo"}' > foo.log
This ignores the --limit option and downloads all the logs between the given --from and --to timestamps. The logs are ordered --forward (oldest first). It downloads with 2 workers in parallel, stores temporary part files in the given --part-path-prefix directory, and --merge-parts merges them once all parts are done. The parts are split into 15-minute queries. You may have to experiment with --parallel-duration and --parallel-max-workers to download the logs faster.
Have a look at logcli help query or https://grafana.com/docs/loki/latest/query/logcli/#query-command-reference.
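If you want the export in an Azure bucket rather than only on disk, one option is to push the resulting file to Blob Storage with the Azure CLI afterwards. A minimal sketch, assuming the az CLI is installed and logged in; the storage account and container names below are placeholders, not something from this answer:

# upload the exported log file to Azure Blob Storage
# (account and container names are placeholders; replace with your own)
az storage blob upload \
    --account-name mystorageaccount \
    --container-name loki-exports \
    --name foo.log \
    --file foo.log \
    --auth-mode login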
Upvotes: 2
Reputation: 17804
Grafana Loki limits the number of log lines it can return in a single query response. By default this limit is set to 5000. It applies both to the HTTP query API and to logcli, so it may prevent you from exporting all the logs from Grafana Loki if it contains billions of log lines :(
You can use an alternative log database that I work on, which allows querying all the logs in a streaming manner with a simple * query according to these docs.
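If you want to stay on Loki, note that the 5000 cap is a server-side setting (max_entries_limit_per_query in limits_config) that can be raised. A hedged sketch for a Helm-managed install, assuming your chart version exposes limits_config under the loki values key (verify against your chart's values before relying on this):

# raise Loki's per-query entry limit via the Helm chart
# (the loki.limits_config values path is an assumption; check your chart)
helm upgrade loki grafana/loki \
    --reuse-values \
    --set loki.limits_config.max_entries_limit_per_query=100000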
Upvotes: -2
Reputation: 8437
You could use logcli to connect to Loki. See this documentation.
Example command:
kubectl port-forward <my-loki-pod> 3100
logcli query '{job="foo"}' --limit=5000 --since=48h -o raw > mylog.txt
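By default logcli talks to http://localhost:3100, which matches the port-forward above. If you forward to a different local port, point logcli at it via the LOKI_ADDR environment variable (the port below is an example):

# tell logcli where Loki is reachable (example port)
export LOKI_ADDR=http://localhost:3100

# quick connectivity check: list the label names Loki knows about
logcli labels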
Upvotes: 15