Kavya shree

Reputation: 446

Pyspark - spark-submit logging for both driver and executor

I'm new to PySpark. I am using spark-submit to run the program, with the logging.config package configured to write executor logs to a file and to email exceptions via SMTP. But logging is not working: nothing is written to the log file, and the exception email is never sent. Is there a way to do this in PySpark with spark-submit?

Config file:

{
    "version": 1,
    "disable_existing_loggers": false,
    "formatters": {
        "standard": {
            "format": "%(asctime)s [%(levelname)s] %(name)s: %(message)s"
        }
    },
    "handlers": {
        "smtp": {
            "class": "logging.handlers.SMTPHandler",
            "level": "ERROR",
            "formatter": "standard",
            "mailhost": ["apm", 25],
            "fromaddr": "[email protected]",
            "toaddrs": ["[email protected]"],
            "subject": "Script Exception"
        },
        "console": {
            "class": "logging.StreamHandler",
            "level": "ERROR",
            "formatter": "standard",
            "stream": "ext://sys.stdout"
        },
        "file": {
            "class": "logging.FileHandler",
            "level": "DEBUG",
            "formatter": "standard",
            "filename": "/var/log/executor.log"
        }
    },
    "loggers": {
        "": {
            "level": "DEBUG",
            "propagate": true,
            "handlers": [
                "file",
                "smtp"
            ]
        }
    }
}
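For context, a config like the one above is loaded on the driver with `logging.config.dictConfig`. A minimal sketch of that step, with the same config inlined as a dict so it is self-contained, and with the log path changed from `/var/log/executor.log` to the working directory and the SMTP handler dropped, purely for illustration:

```python
import logging
import logging.config

# Same shape as the JSON config above, trimmed to the file handler
# and writing to the working directory for this sketch.
CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "standard": {
            "format": "%(asctime)s [%(levelname)s] %(name)s: %(message)s"
        }
    },
    "handlers": {
        "file": {
            "class": "logging.FileHandler",
            "level": "DEBUG",
            "formatter": "standard",
            "filename": "executor.log",
        }
    },
    "loggers": {
        "": {"level": "DEBUG", "handlers": ["file"]}
    },
}

logging.config.dictConfig(CONFIG)
log = logging.getLogger(__name__)
log.debug("driver logging configured")
```

Note that this configures logging only in the driver's Python process; it says nothing about what the executors do.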

The file is executed in the background.


Upvotes: -1

Views: 28

Answers (0)
