Reputation: 8178
I'm currently using aws logs filter-log-events --log-group-name /aws/lambda/lambda-name --region us-east-1
to get logs from a Lambda, but the logs that come back are quite... extensive.
For example:
{
    "ingestionTime": *,
    "timestamp": *,
    "message": "START RequestId: * Version: $LATEST\n",
    "eventId": "*",
    "logStreamName": "2018/10/26/[$LATEST]*"
}
...
Can I extract just the messages with a single bash command that fits in an npm script? Maybe with grep or find.
Upvotes: 2
Views: 1749
Reputation: 2375
To extract specific attributes from the logs returned by the filter-log-events command, you can use jq. Here is an example I ran in Windows PowerShell:
aws logs filter-log-events --log-group-name <yourLogGroup> --region <yourRegion> | jq '.events[].message'
There is also a --filter-pattern parameter; there are some examples of the pattern syntax here.
If the command only needs to fetch the last few days of logs, it can use the --start-time and --end-time parameters of filter-log-events (both take epoch timestamps in milliseconds).
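Putting those pieces together, a minimal sketch of computing the millisecond timestamps and combining them with --filter-pattern and jq might look like this (the log group name and "ERROR" pattern are placeholders; GNU date is assumed, macOS would need date -v-3d instead):

```shell
# Epoch timestamps in milliseconds, as filter-log-events expects.
START_TIME=$(date -d '3 days ago' +%s000)
END_TIME=$(date +%s000)
echo "querying from $START_TIME to $END_TIME"

# The actual query (not run here; <yourLogGroup> is a placeholder):
# aws logs filter-log-events --log-group-name <yourLogGroup> \
#     --start-time "$START_TIME" --end-time "$END_TIME" \
#     --filter-pattern "ERROR" | jq -r '.events[].message'
```

The -r flag on jq strips the surrounding quotes, which is usually what you want when piping raw log messages onward.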
For a real-time subscription to the CloudWatch logs, the project can use the put-subscription-filter command to stream the logs to another Lambda function that processes them. Here is an example function in Node.js:
var zlib = require('zlib');

exports.handler = function (input, context) {
    // CloudWatch Logs delivers the payload as base64-encoded, gzipped JSON.
    var payload = Buffer.from(input.awslogs.data, 'base64'); // new Buffer() is deprecated
    zlib.gunzip(payload, function (e, result) {
        if (e) {
            context.fail(e);
        } else {
            result = JSON.parse(result.toString('ascii'));
            console.log("Event Data:", JSON.stringify(result, null, 2));
            context.succeed();
        }
    });
};
Upvotes: 3