dstandish

Reputation: 2408

how to get logging working for EMR on EKS (i.e. emr-containers)

I am trying to get logging working with EMR on EKS (a.k.a. emr-containers).

I am using this configuration:

    "monitoringConfiguration": {
      "cloudWatchMonitoringConfiguration": {
        "logGroupName": "/emr-containers/jobs", 
        "logStreamNamePrefix": "demo"
      }, 
      "s3MonitoringConfiguration": {
        "logUri": "s3://my-bucket/my-prefix"
      }

And my execution role's policy looks correct:

    policy = jsonencode({
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": [
            "s3:PutObject",
            "s3:GetObject",
            "s3:ListBucket"
          ],
          "Resource": "*"
        },
        {
          "Effect": "Allow",
          "Action": [
            "logs:PutLogEvents",
            "logs:CreateLogStream",
            "logs:DescribeLogGroups",
            "logs:DescribeLogStreams"
          ],
          "Resource": [
            "arn:aws:logs:*:*:*"
          ]
        }
      ]
    })

However, when I look at CloudWatch, there is nothing in this log group, and similarly there is nothing in S3.

It doesn't matter whether the job succeeds or fails -- in all cases, no logs are persisted to either CloudWatch or S3.

Any suggestions?

Upvotes: 2

Views: 1364

Answers (1)

Can you check the logs of the Fluentd sidecar container that runs in the job-runner, driver, and executor pods? That would be a good starting point.
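
For example, something like this will surface shipping errors from the sidecar (the namespace, pod names, and container names below are placeholders; substitute your own):

    # Find the pods created for the job run
    kubectl get pods -n my-emr-namespace

    # List the container names in the driver pod to locate the Fluentd sidecar
    kubectl get pod <spark-driver-pod> -n my-emr-namespace \
        -o jsonpath='{.spec.containers[*].name}'

    # Tail the sidecar's logs; permission errors here usually point at the
    # execution role policy or a broken IRSA setup
    kubectl logs <spark-driver-pod> -n my-emr-namespace -c <fluentd-container> --follow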

Your job execution role has the proper policy for S3 and CloudWatch, so next make sure you update the trust policy of the execution role so that the EKS cluster's service accounts can assume it.
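
The AWS CLI has a helper for exactly this; a sketch, assuming a cluster named `my-eks-cluster`, a registered namespace `my-emr-namespace`, and an execution role named `my-emr-execution-role` (all placeholders):

    # Adds the EMR on EKS service account for this cluster/namespace pair
    # to the execution role's trust policy so it can assume the role
    aws emr-containers update-role-trust-policy \
        --cluster-name my-eks-cluster \
        --namespace my-emr-namespace \
        --role-name my-emr-execution-role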

Lastly, confirm that you completed the step to enable IRSA (IAM Roles for Service Accounts) for the EKS cluster.
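
You can check that with the commands below (the cluster name is a placeholder); IRSA requires an IAM OIDC provider matching the cluster's issuer:

    # Get the cluster's OIDC issuer URL
    aws eks describe-cluster --name my-eks-cluster \
        --query "cluster.identity.oidc.issuer" --output text

    # Check whether an IAM OIDC provider exists for that issuer; if the
    # issuer's ID does not appear here, IRSA is not yet enabled
    aws iam list-open-id-connect-providers

    # Associate an IAM OIDC provider with the cluster to enable IRSA
    eksctl utils associate-iam-oidc-provider --cluster my-eks-cluster --approve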

Upvotes: 4
