How do I change the log level in Hadoop 2?

I am trying to change the log level of the user logs, that is, the files that show up under /var/log/hadoop-yarn/userlogs/application_<id>/container_<id> on CDH 5.2.1. However, no matter what I try, only INFO-level messages appear. I want to enable TRACE-level logging for debugging.

Things I've tried so far:

  • Setting all loggers to TRACE level in /etc/hadoop/conf/log4j.properties.
  • Setting mapreduce.map.log.level and mapreduce.reduce.log.level in mapred-site.xml.
  • Setting mapreduce.map.log.level and mapreduce.reduce.log.level in the job configuration before submitting.
  • Including a log4j.properties in my job's jar file that sets the Log4j root logger to TRACE.
  • Changing yarn-env.sh to specify YARN_ROOT_LOGGER=TRACE,console.

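For reference, the mapred-site.xml attempt from the list above looks like the fragment below (property names are from the question; the TRACE value is illustrative, and as described, this alone did not take effect):

```xml
<!-- Sketch of the mapred-site.xml attempt; the answers below explain
     why a cluster-side setting like this does not take effect. -->
<property>
  <name>mapreduce.map.log.level</name>
  <value>TRACE</value>
</property>
<property>
  <name>mapreduce.reduce.log.level</name>
  <value>TRACE</value>
</property>
```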
None of these worked: they didn't break anything, but they also didn't affect the log output in the userlogs directory. Changing yarn-env.sh did cause the ResourceManager and NodeManager logs to switch to TRACE level, but unfortunately that is not useful for my purpose.

I am getting the following error in /var/log/hadoop-yarn/userlogs/application_<id>/container_<id>/stderr, which might be relevant.

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/tmp/hadoop-yarn/nm-local-dir/usercache/tomcat/appcache/application_1419961570089_0001/filecache/10/job.jar/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:WARN No appenders could be found for logger (org.apache.hadoop.ipc.Server).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.


I don't understand why the log4j "no configuration" warning appears, given that there is a log4j.properties file at the root of the jar that configures the root logger:

log4j.rootLogger=TRACE, stdout

log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%5p [%t] %m%n


As far as I know, my code does not use SLF4J for logging; it only uses Log4j directly.

+3




3 answers


The actual answer was to set yarn.app.mapreduce.am.log.level to the level you want, but, crucially, it needs to be set in the Hadoop job configuration at submission time. It cannot be set globally on the cluster; the cluster-wide default will always be INFO, as it is hardcoded.

Using container-log4j.properties alone won't work, as YARN overrides the log level value on the command line. See the method addLog4jSystemProperties in org.apache.hadoop.mapreduce.v2.util.MRApps and cross-reference it with org.apache.hadoop.mapreduce.MRJobConfig.



container-log4j.properties is indeed used, but it cannot override the level set by this property.
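Setting the property in the job configuration at submission time can be sketched as the driver below. This is a hypothetical example, not code from the answer: it assumes the Hadoop MapReduce client libraries are on the classpath, and the class and job names are placeholders — only the conf.set calls matter.

```java
// Hypothetical job driver; requires hadoop-mapreduce-client on the classpath.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class TraceLevelDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // These must be set before submission; the cluster-wide default
        // is hardcoded to INFO, so cluster-side config is not enough.
        conf.set("yarn.app.mapreduce.am.log.level", "TRACE");
        conf.set("mapreduce.map.log.level", "TRACE");
        conf.set("mapreduce.reduce.log.level", "TRACE");

        Job job = Job.getInstance(conf, "trace-level-example");
        // ... set mapper, reducer, input and output paths here ...
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Equivalently, if the driver uses ToolRunner/GenericOptionsParser, the same properties can be passed on the command line with -D at submission time.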

+5




On CDH 5.8, I override the application log level at submission time. To change the log level of the mappers and reducers, I had to specify them explicitly with the following command:

hadoop jar my-mapreduce-job.jar -libjars ${LIBJARS} -Dyarn.app.mapreduce.am.log.level=DEBUG,console -Dmapreduce.map.log.level=DEBUG,console -Dmapreduce.reduce.log.level=DEBUG,console



Logs are written to the application syslog, which can be viewed using the Yarn ResourceManager web interface.

+1




You should try editing (or creating, if it doesn't exist): /etc/hadoop/conf/container-log4j.properties.

You can confirm it is being picked up with a simple ps aux: the container's command line includes -Dlog4j.configuration=container-log4j.properties.
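The ps aux check can be sketched as below. The sample command line is a shortened, hypothetical stand-in for a real container JVM; the grep pattern is what you would apply to the actual ps aux output:

```shell
# Shortened, hypothetical container command line of the kind ps aux shows;
# pipe real "ps aux" output through the same grep to find the flag.
cmdline='/usr/java/bin/java -Dlog4j.configuration=container-log4j.properties org.apache.hadoop.mapred.YarnChild'
echo "$cmdline" | grep -o 'log4j.configuration=[^ ]*'
```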

0








