Matthieu Napoli

Reputation: 49553

CloudWatch Insights: get logs of errored lambdas

A lambda can have a result that is either a success or an error.

I want to see the logs of the Lambdas that errored. I am trying to do that via a CloudWatch Logs Insights query.

How can I do this?

Upvotes: 25

Views: 36307

Answers (6)

Jatin Mehrotra

Reputation: 11523

[2023 July update]

There is a new CloudWatch Logs Insights command, pattern, which uses ML behind the scenes to automatically cluster your log data into patterns.

The pattern command uses AWS Machine Learning algorithms to automatically recognize patterns in log data, aggregate related logs and summarize thousands of log lines into a few easy to visualize groupings. Pattern helps customers quickly surface emerging trends, monitor known errors, increase cost visibility by identifying frequently occurring log lines and more.

filter @message like /ERROR/
| pattern @message

For specific error messages based on your logging statements:

filter @message like /ERROR/
| parse @message 'Failed to do: *' as cause
| pattern cause
| sort @sampleCount asc

Upvotes: 1

Akif

Reputation: 6786

If anyone is looking for how to search for an error or a log in AWS Logs Insights, they can use this query:

fields @timestamp, @message
| filter @message like /text to search/
| sort @timestamp desc
| limit 20

Actually, just selecting the log group(s) and adding the line | filter @message like /text to search/ to the query editor is enough. The rest comes by default.

Also, keep in mind to configure the time span for the search if you cannot find the relevant results. By default, it only searches the last 1 hour.

[Screenshot: AWS Logs Insights]
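
If the console's default one-hour window is too narrow, the same query can also be run programmatically with an explicit time range. Here is a minimal sketch using boto3's start_query and get_query_results; the log group name is a placeholder, so adjust it to your Lambda's log group.

import time

import boto3

logs = boto3.client("logs")

# Search the last 24 hours instead of the console's default 1 hour.
end = int(time.time())
start = end - 24 * 60 * 60

query_id = logs.start_query(
    logGroupName="/aws/lambda/my-function",  # placeholder log group name
    startTime=start,
    endTime=end,
    queryString=(
        "fields @timestamp, @message"
        " | filter @message like /text to search/"
        " | sort @timestamp desc"
        " | limit 20"
    ),
)["queryId"]

# Insights queries run asynchronously, so poll until the query finishes.
response = logs.get_query_results(queryId=query_id)
while response["status"] in ("Scheduled", "Running"):
    time.sleep(1)
    response = logs.get_query_results(queryId=query_id)

for row in response["results"]:
    print({field["field"]: field["value"] for field in row})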

Upvotes: 2

Varunaditya J

Reputation: 529

If someone comes here looking for a solution, here's what I use:

filter @message like /(?i)(Exception|error|fail)/
| fields @timestamp, @message
| sort @timestamp desc
| limit 20

Upvotes: 52

Lior Goldemberg

Reputation: 876

You can run the following query in CloudWatch Logs Insights. It compares the memory provisioned for your function against the memory actually used, based on the REPORT lines.

filter @type = "REPORT"
    | stats max(@memorySize / 1000 / 1000) as provisionedMemoryMB,
        min(@maxMemoryUsed / 1000 / 1000) as smallestMemoryRequestMB,
        avg(@maxMemoryUsed / 1000 / 1000) as avgMemoryUsedMB,
        max(@maxMemoryUsed / 1000 / 1000) as maxMemoryUsedMB,
        provisionedMemoryMB - maxMemoryUsedMB as overProvisionedMB

Upvotes: -3

Adiii

Reputation: 59946

I use the query below to get the errors that are not covered by the query mentioned in the other answer; for those, I could only see a failure in the monitoring dashboard.

fields @timestamp, @message
| sort @timestamp desc
| filter @message not like 'INFO' 
| filter @message not like 'REPORT'
| filter @message not like 'END'
| filter @message not like 'START'
| limit 20

Here are some examples of log entries that this query catches.

Timeout:

@ingestionTime  1600997135683
@log            060558051165:/aws/lambda/prod-
@logStream      2020/09/25/[$LATEST]abc
@message        2020-09-25T01:25:35.623Z d0801056-abc-595a-b67d-47b14d3e9a20 Task timed out after 30.03 seconds
@requestId      d0801056-abc-595a-b67d-47b14d3e9a20
@timestamp      1600997135623

Invocation error:

@ingestionTime  1600996797947
@log            060558051165:/aws/lambda/prod-****
@logStream      2020/09/25/[$LATEST]123
@message        2020-09-25T01:19:48.940Z 7af13cdc-74fb-5986-ad6b-6b3b33266425 ERROR Invoke Error {"errorType":"Error","errorMessage":"QueueProcessor 4 messages failed processing","stack":["Error:QueueProcessor 4 messages failed processing"," at Runtime.handler (/var/task/lambda/abc.js:25986:11)"," at process._tickCallback (internal/process/next_tick.js:68:7)"]}
@requestId      7af13cdc-74fb-5986-ad6b-6b3b33266425
@timestamp      1600996788940
errorMessage    QueueProcessor 4 messages failed processing
errorType       Error
stack.0         Error: QueueProcessor 4 messages failed processing
stack.1         at Runtime.handler (/var/task/lambda/abcBroadcast.js:25986:11)
stack.2         at process._tickCallback (internal/process/next_tick.js:68:7)

Another example, from the Node.js runtime:

@ingestionTime  1600996891752
@log            060558051165:/aws/lambda/prod-
@logStream      2020/09/24/[$LATEST]abc
@message        2020-09-25T01:21:31.213Z 32879c8c-abcd-5223-98f9-cb6b3a192f7c ERROR (node:6) [DEP0005] DeprecationWarning: Buffer() is deprecated due to security and usability issues. Please use the Buffer.alloc(), Buffer.allocUnsafe(), or Buffer.from() methods instead.
@requestId      32879c8c-7242-5223-abcd-cb6b3a192f7c
@timestamp      1600996891214

Upvotes: 29

ketcham

Reputation: 932

In your console, navigate to your lambda's configuration page. In the top left, click Monitoring, then View logs in CloudWatch on the right.
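
If you prefer to stay out of the console, a rough programmatic equivalent is to filter the function's log group directly. This is a minimal sketch using boto3's filter_log_events; it assumes the default /aws/lambda/<function-name> log group naming, and the function name is a placeholder.

import boto3

logs = boto3.client("logs")

# Page through all log events in the function's log group that contain "ERROR".
paginator = logs.get_paginator("filter_log_events")
for page in paginator.paginate(
    logGroupName="/aws/lambda/my-function",  # placeholder function name
    filterPattern="ERROR",                   # CloudWatch Logs filter pattern
):
    for event in page["events"]:
        print(event["timestamp"], event["message"])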

Upvotes: -2
