Anand Shaw

Reputation: 229

How to specify a different log4j.properties file for each Flink job running on the same standalone cluster

I have multiple Flink jobs running on a standalone cluster, and I want a different log file for each job. How can I pass a different log4j.properties file when submitting a Flink job?

Upvotes: 3

Views: 4294

Answers (2)

alghimo

Reputation: 2899

According to the Flink documentation (latest version), you can simply pass the log4j / logback file to use when you submit the job; here's the link: https://ci.apache.org/projects/flink/flink-docs-master/monitoring/logging.html

In short, you can provide "-Dlog4j.configuration=/path/to/log4j.properties" or "-Dlogback.configurationFile=/path/to/logback.xml" to the job's JVM. You could also just configure different loggers for every job, so you can keep a single "logback"/"log4j" file; both options are sketched below.
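For the first option, one possible way to pass the JVM property (an assumption on my side, since the exact mechanism depends on your Flink version and deployment mode) is env.java.opts in flink-conf.yaml; the path is a placeholder:

# flink-conf.yaml: minimal sketch, the log4j path below is a placeholder.
# env.java.opts is added to the JVMs Flink starts, so with a dedicated
# cluster per job this effectively selects that job's log4j file.
env.java.opts: "-Dlog4j.configuration=file:/path/to/job1/log4j.properties"

For the second option, here's an example with logback: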

<!-- logback.xml -->
<configuration>
    <property name="LOG_HOME" value="/path/to/logs" />

    <appender name="JOB1"
              class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>${LOG_HOME}/job1/job.out</file>
        <encoder class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
            <Pattern>
                %d{yyyy-MM-dd HH:mm:ss} - %msg%n
            </Pattern>
        </encoder>

        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <!-- rollover hourly -->
            <fileNamePattern>${LOG_HOME}/job1/%d{yyyyMMdd_HH}.%i.log</fileNamePattern>
            <timeBasedFileNamingAndTriggeringPolicy
                    class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
                <maxFileSize>1MB</maxFileSize>
            </timeBasedFileNamingAndTriggeringPolicy>
        </rollingPolicy>
    </appender>

    <logger name="com.yourcompany.job1" level="INFO" additivity="false">
        <appender-ref ref="JOB1" />
    </logger>

    <!-- rest of configs -->
</configuration>

And in your code (example in Scala; it's pretty much the same in Java):

import org.slf4j.LoggerFactory

object Job1 {
  // the name must match the <logger> entry in logback.xml
  private val logger = LoggerFactory.getLogger("com.yourcompany.job1")
  def main(args: Array[String]): Unit =
    logger.info("some message")
}

Cheers

Upvotes: 0

Derlin

Reputation: 9881

As of now, there is no simple way to do it, since Flink always loads the files under flink/conf.

If you use the one-yarn-cluster-per-job mode of Flink (i.e. you launch your jobs with flink run -m yarn-cluster ...), here is a workaround:

  1. copy the flink/conf directory to a custom location used only for your job
  2. modify the log4j.properties file (or any other configuration file) in the copy
  3. before launching your job, run export FLINK_CONF_DIR=/path/to/my/conf (the three steps are put together in the sketch below)
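Putting the three steps together (the paths and jar name below are hypothetical):

# 1. per-job copy of the configuration directory
cp -r /opt/flink/conf /opt/flink/conf-job1
# 2. edit the copied log4j.properties so it writes to job1's own log files
vi /opt/flink/conf-job1/log4j.properties
# 3. point Flink at the copy, then submit
export FLINK_CONF_DIR=/opt/flink/conf-job1
flink run -m yarn-cluster /path/to/job1.jar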

Depending on your version of Flink, check the file flink/bin/config.sh. If you run across this line:

FLINK_CONF_DIR=$FLINK_ROOT_DIR_MANGLED/conf

change it to:

if [ -z "$FLINK_CONF_DIR" ]; then
    FLINK_CONF_DIR=$FLINK_ROOT_DIR_MANGLED/conf
fi
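With that guard in place, config.sh only falls back to flink/conf when FLINK_CONF_DIR is unset, so the directory you export in step 3 is respected instead of being overwritten.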

If you find another way, please share it with us.

Upvotes: 1
