I am trying to configure my log output using Log4j and a log4j.properties file, but I am having trouble setting up colored output based on log level.
It appears that the logging utilities simply do not parse the %highlight{...} converter.
I can tell that I am on the right track with the configuration, as the other parts of the pattern are working correctly.
I am basing my current code on this Stack Overflow answer: log4j 2 adding multiple colors to console appender
I am assuming it is something simple that I am not seeing, as I am rather new to this. Unfortunately, the documentation is hard to piece together, or is meant for XML-based configurations.
Thanks for any help in advance.
Some resources I have been looking through already:
http://logging.apache.org/log4j/1.2/manual.html
https://logging.apache.org/log4j/1.2/apidocs/org/apache/log4j/PatternLayout.html
http://saltnlight5.blogspot.com/2013/08/how-to-configure-slf4j-with-different.html
http://logging.apache.org/log4j/2.x/manual/layouts.html
Here are the relevant files/code:
log4j.properties:
log4j.rootLogger=TRACE, STDOUT
log4j.logger.deng=INFO
log4j.appender.STDOUT=org.apache.log4j.ConsoleAppender
log4j.appender.STDOUT.layout=org.apache.log4j.PatternLayout
log4j.appender.STDOUT.layout.ConversionPattern=%highlight{%d %-5p [%t] (%F:%L) - %m%n}{FATAL=red, ERROR=red, WARN=yellow bold, INFO=black, DEBUG=green bold, TRACE=blue}
Gradle dependencies:
dependencies {
// The production code uses the SLF4J logging API at compile time
compile 'org.slf4j:slf4j-api:1.7.+'
compile 'org.slf4j:slf4j-log4j12:1.7.+'
testCompile 'junit:junit:4.12'
}
Test to exercise the logging:
@Test
public void testLogger(){
LOGGER.info("Info level log");
LOGGER.debug("Debug level log");
LOGGER.warn("Warn level log");
LOGGER.error("Error level log");
LOGGER.trace("Trace level log");
}
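For context, LOGGER is obtained through the SLF4J API (only slf4j appears in the dependency list above); a minimal sketch of such a test class, with a hypothetical class name, would be:
import org.junit.Test;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class LoggerSmokeTest {
    // SLF4J logger; with slf4j-log4j12 on the classpath this is backed by log4j 1.x
    private static final Logger LOGGER = LoggerFactory.getLogger(LoggerSmokeTest.class);

    @Test
    public void testLogger() {
        // log statements at each level, as shown above
        LOGGER.info("Info level log");
        LOGGER.trace("Trace level log");
    }
}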
Output:
log4j:ERROR Unexpected char [h] at position 2 in conversion patterrn.
%highlight{2017-07-08 13:18:21,243 INFO [main] (LongLinkedListTest.java:40) - Info level log
}{FATAL=red, ERROR=red, WARN=yellow bold, INFO=black, DEBUG=green bold, TRACE=blue}%highlight{2017-07-08 13:18:21,246 DEBUG [main] (LongLinkedListTest.java:41) - Debug level log
}{FATAL=red, ERROR=red, WARN=yellow bold, INFO=black, DEBUG=green bold, TRACE=blue}%highlight{2017-07-08 13:18:21,246 WARN [main] (LongLinkedListTest.java:42) - Warn level log
}{FATAL=red, ERROR=red, WARN=yellow bold, INFO=black, DEBUG=green bold, TRACE=blue}%highlight{2017-07-08 13:18:21,246 ERROR [main] (LongLinkedListTest.java:43) - Error level log
}{FATAL=red, ERROR=red, WARN=yellow bold, INFO=black, DEBUG=green bold, TRACE=blue}%highlight{2017-07-08 13:18:21,246 TRACE [main] (LongLinkedListTest.java:44) - Trace level log
}{FATAL=red, ERROR=red, WARN=yellow bold, INFO=black, DEBUG=green bold, TRACE=blue}
Process finished with exit code 0
Final results:
For posterity, I ended up with the following:
Gradle Dependencies:
dependencies {
//logging dependencies
compile 'org.apache.logging.log4j:log4j-slf4j-impl:2.+'
compile 'org.apache.logging.log4j:log4j-api:2.+'
compile 'org.apache.logging.log4j:log4j-core:2.+'
compile 'com.fasterxml.jackson.dataformat:jackson-dataformat-yaml:2+'
compile 'com.fasterxml.jackson.core:jackson-databind:2+'
...
}
Once I updated the dependencies, I was able to move to a YAML-based configuration, and using the %highlight notation worked like a charm!
If anyone wishes to see the final configuration, it can be found here: https://github.com/Epic-Breakfast-Productions/OWAT/blob/master/implementations/java/src/main/resources/log4j2.yaml
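Since that link may change over time, a log4j2.yaml along these lines reproduces the setup (a sketch based on the pattern above, not a copy of the linked file):
Configuration:
  status: warn
  Appenders:
    Console:
      name: STDOUT
      target: SYSTEM_OUT
      PatternLayout:
        pattern: "%highlight{%d %-5p [%t] (%F:%L) - %m%n}{FATAL=red, ERROR=red, WARN=yellow bold, INFO=black, DEBUG=green bold, TRACE=blue}"
  Loggers:
    Root:
      level: trace
      AppenderRef:
        ref: STDOUT
Note that Log4j2 only picks up YAML configuration when jackson-dataformat-yaml and jackson-databind are on the classpath, which is why they appear in the dependency list above.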
Is it possible that you're mixing up Log4j2 with log4j 1.x? The log4j:ERROR prefix is a legacy log4j 1.2 error message.
Please simplify matters by using only Log4j2 dependencies and a Log4j2 configuration. The configuration file is called log4j2.xml by default.
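For example, a minimal log4j2.xml that reproduces the pattern above would look roughly like this (a sketch, not the asker's actual file):
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
  <Appenders>
    <Console name="STDOUT" target="SYSTEM_OUT">
      <PatternLayout pattern="%highlight{%d %-5p [%t] (%F:%L) - %m%n}{FATAL=red, ERROR=red, WARN=yellow bold, INFO=black, DEBUG=green bold, TRACE=blue}"/>
    </Console>
  </Appenders>
  <Loggers>
    <Root level="trace">
      <AppenderRef ref="STDOUT"/>
    </Root>
  </Loggers>
</Configuration>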
Please remove the org.slf4j:slf4j-log4j12:1.7.+ dependency and instead use log4j-api-2.x, log4j-core-2.x and log4j-slf4j-impl-2.x.
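In Gradle terms that swap looks roughly like this (a sketch; the versions are illustrative, use the current 2.x releases):
dependencies {
    compile 'org.slf4j:slf4j-api:1.7.+'
    // route SLF4J to Log4j2 instead of log4j 1.x
    compile 'org.apache.logging.log4j:log4j-slf4j-impl:2.+'
    compile 'org.apache.logging.log4j:log4j-api:2.+'
    compile 'org.apache.logging.log4j:log4j-core:2.+'
    testCompile 'junit:junit:4.12'
}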
Useful links:
Which jars
Console appender
PatternLayout with colors and highlighting
Configuration
Log4j 1.2 reached End of Life in summer 2015. Let's upgrade! :-)
I use log4j 2.7 in my product. I configured log4j to use a RollingFile appender with the following pattern:
%d - %-5p - [%t] - (%C::%M[%L]) - %notEmpty{%marker: }%m%n
When I compile it inside the IDE, the line number is set. When I use an Ant task, the line number is not set (because the build artifact should not include debug information) and log4j prints -1 as the line number.
Is it possible to hide the -1 in brackets when the line number is not set?
Example:
LogManager.getLogger().debug("This is a message");
the result is
2016-11-02 14:22:53,481 - DEBUG - [JavaFX Application Thread] - (Main::start[-1]) - This is a message
but it should be
2016-11-02 14:22:53,481 - DEBUG - [JavaFX Application Thread] - (Main::start) - This is a message
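For reference, whether %L can resolve a line number at all depends on whether the classes were compiled with debug information (javac -g). In an Ant build this is controlled by the debug/debuglevel attributes of the javac task; a sketch with placeholder paths:
<javac srcdir="src" destdir="build/classes" includeantruntime="false"
       debug="true" debuglevel="lines,source"/>
With debug="false" (or a debuglevel that omits "lines") the class files carry no line-number tables, which is consistent with the -1 you see.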
Running my Selenium suite with IntelliJ IDEA, I'm facing this issue:
I configured log4j correctly using this properties file:
# Root logger option
log4j.rootLogger=INFO, stdout
# Redirect log messages to console
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c{1} : %L] - %m%n
The logs follow the pattern, but they are written ALL IN ONE LINE, like this:
2016-05-03 11:34:05,612 INFO [CreateTicketPage : 68] - clicked on save button2016-05-03 11:34:05,627 INFO [TicketDetailsViewPage : 27] - Ticket Details View.page.successfully.loaded2016-05-03 11:34:07,452 WARN [AddTransactionPage : 44] - Add Transaction.page.successfully.loaded2016-05-03 11:33:52,032 INFO [TestRunner : 25] - executing TestRunner.setup
%n seems to be ignored! But when I run the tests with just mvn clean install on the command line, everything looks perfect. Any suggestions? Is this an IntelliJ-related issue?
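One quick sanity check (a hypothetical helper class, not part of the suite): %n in PatternLayout emits the platform-dependent line separator, so it can help to print what the JVM running the tests actually reports:
public class LineSeparatorCheck {
    public static void main(String[] args) {
        // Show the platform line separator in escaped form, e.g. \n or \r\n
        String sep = System.lineSeparator();
        System.out.println(sep.replace("\r", "\\r").replace("\n", "\\n"));
    }
}
If this prints the expected separator, the problem is more likely in how the IntelliJ console renders the output than in the log4j pattern itself.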
I have a Scala app that is running in a Docker container. I use the image 'develar/java', which is based on Alpine Linux. My app is working, but I don't see Cyrillic logs. Here is what I have:
docker logs -f myApp
22:22:08.152 [main] INFO application - Creating Pool for datasource 'default'
22:22:09.213 [main] INFO play.api.db.DefaultDBApi - Database [default] connected at jdbc:postgresql://localhost/db
22:22:09.627 [main] INFO p.a.l.concurrent.ActorSystemProvider - Starting application default Akka system: application
22:22:09.698 [main] INFO application - ????????????? ??????? ???????
22:22:09.722 [main] INFO application - ????????????? ??????? 'direct
22:22:09.734 [main] INFO application - ????????????? ??????? 'adwords
22:22:09.761 [main] INFO play.api.Play$ - Application started (Prod)
22:22:09.866 [main] INFO play.core.server.NettyServer$ - Listening for HTTP on /0:0:0:0:0:0:0:0:9000
But the logs that are delivered to the Elasticsearch server are fine. How can I force Alpine Linux to work with UTF-8?
develar/java has an old bug from an outdated glibc 2.21 package. Andy Shinn (the creator and maintainer of the glibc package for Alpine) and I resolved this a long time ago in the glibc 2.23 packaging, which I have integrated into frolvlad/alpine-glibc, the base image for frolvlad/alpine-oraclejre8. Just replace develar/java with frolvlad/alpine-oraclejre8:slim and you should be fine.
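If swapping the base image alone is not enough, it can also help to make the locale and the JVM default to UTF-8 explicitly; a hedged example (the image name here is a placeholder for your app image built on top of frolvlad/alpine-oraclejre8:slim):
# force a UTF-8 locale and default file encoding for the JVM
docker run -e LANG=C.UTF-8 -e JAVA_TOOL_OPTIONS="-Dfile.encoding=UTF-8" my-scala-app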
I am trying to get centralised logging working with log4j and rsyslog.
What I have so far
Solr running inside tomcat6 on RHEL6, using the following log4j and slf4j libs:
# lsof -u tomcat | grep log4j
java 14503 tomcat mem REG 253,0 9711 10208 /usr/share/java/tomcat6/slf4j-log4j12-1.6.6.jar
java 14503 tomcat mem REG 253,0 481535 10209 /usr/share/java/tomcat6/log4j-1.2.16.jar
java 14503 tomcat mem REG 253,0 378088 1065276 /usr/share/java/log4j-1.2.14.jar
java 14503 tomcat 20r REG 253,0 378088 1065276 /usr/share/java/log4j-1.2.14.jar
java 14503 tomcat 21r REG 253,0 481535 10209 /usr/share/java/tomcat6/log4j-1.2.16.jar
java 14503 tomcat 35r REG 253,0 9711 10208 /usr/share/java/tomcat6/slf4j-log4j12-1.6.6.jar
#
Solr is using the following log4j.properties file (via -Dlog4j.configuration=file:///opt/solr/lib/log4j.properties)
# Logging level
log4j.rootLogger=INFO, file, CONSOLE, SYSLOG
log4j.appender.CONSOLE=org.apache.log4j.ConsoleAppender
log4j.appender.CONSOLE.layout=org.apache.log4j.PatternLayout
log4j.appender.CONSOLE.layout.ConversionPattern=%-4r [%t] %-5p %c %x \u2013 %m%n
#- size rotation with log cleanup.
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.MaxFileSize=4MB
log4j.appender.file.MaxBackupIndex=9
#- File to log to and log format
log4j.appender.file.File=/var/log/tomcat6/solr.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%-5p - %d{yyyy-MM-dd HH:mm:ss.SSS}; %C; %m\n
log4j.logger.org.apache.zookeeper=WARN
log4j.logger.org.apache.hadoop=WARN
# set to INFO to enable infostream log messages
log4j.logger.org.apache.solr.update.LoggingInfoStream=OFF
#- Local syslog server
log4j.appender.SYSLOG=org.apache.log4j.net.SyslogAppender
log4j.appender.SYSLOG.syslogHost=localhost
log4j.appender.SYSLOG.facility=LOCAL1
log4j.appender.SYSLOG.layout=org.apache.log4j.PatternLayout
log4j.appender.SYSLOG.layout.ConversionPattern=${sysloghostname} %-4r [%t] java %-5p %c %x %m%n
log4j.appender.SYSLOG.Header=true
On the same server I have rsyslog running and accepting log messages from log4j.
# rpmquery -a | grep syslog
rsyslog-5.8.10-7.el6_4.x86_64
#
rsyslog config
# #### MODULES ####
$MaxMessageSize 32k
$ModLoad imuxsock # provides support for local system logging (e.g. via logger command)
$ModLoad imklog # provides kernel logging support (previously done by rklogd)
$ModLoad imfile # provides file monitoring support
#
$ModLoad imudp.so
$UDPServerRun 514
$WorkDirectory /var/lib/rsyslog # where to place spool files
# #### GLOBAL DIRECTIVES ####
# # Use default timestamp format
$ActionFileDefaultTemplate RSYSLOG_TraditionalFileFormat
$IncludeConfig /etc/rsyslog.d/*.conf
$ActionQueueType LinkedList # run asynchronously
$ActionQueueFileName fwdRule1 # unique name prefix for spool files
$ActionQueueMaxDiskSpace 1g # 1gb space limit (use as much as possible)
$ActionQueueSaveOnShutdown on # save messages to disk on shutdown
$ActionResumeRetryCount -1 # infinite retries if host is down
$ActionSendStreamDriverMode 0 # require TLS for the connection
$ActionSendStreamDriverAuthMode anon # chain and server are verified
#local1.*;*.* ##(o)XXXXXXXX:5544
local1.* /var/log/remote.log
# # The authpriv file has restricted access.
authpriv.* /var/log/secure
# # Log all the mail messages in one place.
mail.* -/var/log/maillog
# # Log cron stuff
cron.* /var/log/cron
# # Everybody gets emergency messages
*.emerg *
# # Save news errors of level crit and higher in a special file.
uucp,news.crit /var/log/spooler
# # Save boot messages also to boot.log
local7.* /var/log/boot.log
I am catching local1 messages from Solr's log4j and redirecting them to /var/log/remote.log.
Everything works as expected. Sample INFO message
Oct 31 13:57:08 hostname.here 3431839 [http-8080-10] java INFO org.apache.solr.core.SolrCore [collection1] webapp=/solr path=/select params={indent=true&q=*:*&wt=json&rows=1} hits=42917 status=0 QTime=1
And stack traces are on the same line as the ERROR message
Oct 31 12:27:17 hostname.here 157666248 [http-8080-7] java ERROR org.apache.solr.core.SolrCore org.apache.solr.common.SolrException: undefined field *#012#011at org.apache.solr.schema.IndexSchema.getDynamicFieldType(IndexSchema.java:1223)#012... Cut for brevity....#011at java.lang.Thread.run(Thread.java:724)#012
Note #012 as the line ending and #011 as the tab.
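For what it's worth, the #012/#011 sequences come from rsyslog escaping control characters in received messages; that behaviour is governed by a global directive which defaults to on:
# escape LF as #012 and TAB as #011 in received messages (rsyslog default)
$EscapeControlCharactersOnReceive on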
Using this setup I can ship the logs to a remote rsyslog server over TCP and pipe them into fluentd/Elasticsearch/Kibana, etc. Everything works as expected.
The problem
I am now trying to get another webapp running inside the same Tomcat container to log as above. Everything works as expected apart from stack traces: each line of a stack trace ends up on a separate line (a separate syslog message):
Oct 31 12:54:47 hostname.here 4909 [main] java ERROR org.hibernate.tool.hbm2ddl.SchemaUpdate could not get database metadata
Oct 31 12:54:47 hostname.here org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (Communications link failure
Oct 31 12:54:47 hostname.here
Oct 31 12:54:47 hostname.here The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.)
The webapp ships with its own log4j libs and log4j.xml config. The libs are the same version as those used by Solr.
The log4j.xml file for this app:
<appender name="SYSLOG" class="org.apache.log4j.net.SyslogAppender">
<param name="SyslogHost" value="localhost" />
<param name="Facility" value="LOCAL1" />
<param name="Header" value="false" />
<param name="FacilityPrinting" value="false" />
<param name="Threshold" value="DEBUG" />
<layout class="org.apache.log4j.PatternLayout">
<param name="ConversionPattern"
value="%-4r [%t] java %-5p %c %x %m%n"/>
</layout>
</appender>
I would like stack traces from the new application to appear on the same line as the error message, just like with Solr.
Does anyone know if this is a log4j config issue?
Many thanks.
I've been working on something similar lately (in fact, I currently have this question open that you may be able to help with).
This is probably not a very good answer, but it's more information than I can fit in a comment, so here you go (I hope some of it is new to you). The rsyslog imfile documentation has this section:
ReadMode [mode]
This mode should be defined when having multiline messages. The value can range from 0-2 and determines the multiline detection method.
0 (default) - line based (Each line is a new message)
1 - paragraph (There is a blank line between log messages)
2 - indented (New log messages start at the beginning of a line. If a line starts with a space it is part of the log message before it)
The imudp rsyslog docs have no such configuration option. My guess is that the UDP input module doesn't support multiline logging. Thus, each line of the stack trace is sent out as a separate log entry.
Do you have any configuration files in /etc/rsyslog.d? There may be more information in there.
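If you do end up tailing the webapp's log file with imfile instead of sending over UDP, the multiline handling would be configured roughly like this (legacy directive names as I read the imfile docs, so double-check them against your rsyslog 5.8 install; the file path and tag are placeholders):
$ModLoad imfile
$InputFileName /var/log/tomcat6/webapp.log
$InputFileTag webapp:
$InputFileStateFile stat-webapp
$InputFileFacility local1
# ReadMode 2: indented lines (like Java stack trace frames) belong to the previous message
$InputFileReadMode 2
$InputRunFileMonitor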