I am using slf4j and logback for logging in my Java web application. I need the INFO logs from a specific class (MyClass in the example below) to be sent in an email, so I configured an email appender in logback. Everything else can go wherever the root logger is set to. But the email doesn't go out with my current setup. See below...
Set Up:
Here's the relevant information about jar versions and other setup for this:
jars in the classpath:
activation-1.1.jar
jcl-over-slf4j-1.7.25.jar
logback-classic-1.2.3.jar
logback-core-1.2.3.jar
logback-ext-spring-0.1.4.jar
logstash-logback-encoder-4.11.jar
mail-1.4.jar
slf4j-api-1.7.25.jar
logback.xml from classpath:
<appender name="console" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d [%thread] %-5level %logger{5}:%L - %msg%n</pattern>
</encoder>
</appender>
<appender name="email" class="ch.qos.logback.classic.net.SMTPAppender">
<filter class="ch.qos.logback.classic.filter.ThresholdFilter">
<level>info</level>
</filter>
<smtpHost>smtp.server</smtpHost>
<to>code4kix@email.com</to>
<from>do-not-reply@email.com</from>
<subject>code4kix - ${HOSTNAME}: %logger{20} - %m</subject>
<layout class="ch.qos.logback.classic.PatternLayout">
<pattern>%d [%thread] %-5level %logger{5}:%L - %msg%n</pattern>
</layout>
<!-- <STARTTLS>true</STARTTLS> -->
<!-- <cyclicBufferTracker class="ch.qos.logback.core.spi.CyclicBufferTracker"> -->
<!-- <bufferSize>1</bufferSize> -->
<!-- </cyclicBufferTracker> -->
<!-- <asynchronousSending>false</asynchronousSending> -->
</appender>
<root level="error">
<appender-ref ref="console" />
</root>
<logger name="mypackage.MyClass" level="info" additivity="true">
<appender-ref ref="email"/>
</logger>
The Issue:
The email seems to go out fine if I have logger.error statements in MyClass.java, but if they have just logger.info, the email doesn't go out... despite configuring the threshold to info!
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class MyClass
{
    private static Logger logger = LoggerFactory.getLogger(MyClass.class);

    public void myMethod()
    {
        logger.error("using this sends the email out");
        logger.info("using this doesn't");
    }
}
I do get this in the console log, but the email never goes out. What could possibly be wrong?
SystemOut O 13:18:00,012 |-INFO in ch.qos.logback.classic.net.SMTPAppender[email] - SMTPAppender [email] is tracking [1] buffers
From my recent reading of the logback documentation, it sounds as though an email is only sent out on an ERROR event. See Chapter 4: Appenders.
Thanks to Elijah for pointing me in the right direction. As mentioned in the documentation in SMTP Appender:
by default, the email transmission is triggered by a logging event of
level ERROR
There are multiple ways to solve this. The simplest way for me was to implement my own EventEvaluator like so, and use it in the smtp appender config (shown further below). Alternatively, you can explore using Markers to solve this issue.
import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.boolex.EvaluationException;
import ch.qos.logback.core.boolex.EventEvaluatorBase;

public class OnInfoEvaluator extends EventEvaluatorBase<ILoggingEvent> {
    @Override
    public boolean evaluate(ILoggingEvent loggingEvent) throws NullPointerException, EvaluationException {
        // trigger the email for INFO and above
        return loggingEvent.getLevel().toInt() >= Level.INFO_INT;
    }
}
In the config:
<appender name="email" class="ch.qos.logback.classic.net.SMTPAppender">
<evaluator class="mypackage.OnInfoEvaluator" />
...
</appender>
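For the Marker route mentioned above, logback ships a built-in OnMarkerEvaluator, so no custom evaluator class is needed; a sketch (the marker name NOTIFY_ADMIN is just an illustrative choice):

```xml
<appender name="email" class="ch.qos.logback.classic.net.SMTPAppender">
<evaluator class="ch.qos.logback.classic.boolex.OnMarkerEvaluator">
<marker>NOTIFY_ADMIN</marker>
</evaluator>
...
</appender>
```

With this in place, only log statements carrying that marker trigger the email, e.g. `logger.info(MarkerFactory.getMarker("NOTIFY_ADMIN"), "send this one");` — regardless of level.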
Related
Problem Statement: Logback is printing in console properly but not in log.txt file. There are many solutions given in other pages but apparently none worked. Could someone help me in this regard?
Java:-
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class Log {
    public static void main(String[] args) {
        Logger logger = LoggerFactory.getLogger("Example App");
        logger.info("'sup? I'm your info logger");
        logger.debug("hey HEY hey! I'm your debug logger");
    }
}
Config:- logback-fileAppender.xml
<configuration>
<appender name="FILE" class="ch.qos.logback.core.FileAppender">
<file>${project.basedir}/Log.txt</file>
<append>true</append>
<!-- set immediateFlush to false for much higher logging throughput -->
<immediateFlush>true</immediateFlush>
<!-- encoders are assigned the type
ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->
<encoder>
<pattern>%-4relative [%thread] %-5level %logger{35} - %msg%n</pattern>
</encoder>
</appender>
<root level="DEBUG">
<appender-ref ref="FILE" />
</root>
</configuration>
Expected:- The logs should be printed in log.txt
Actual:- The logs are not printed in log.txt
Note: I am using a customized directory structure in Maven, not the default "src/main/resources".
I have configured logback xml for a spring boot project.
I want to configure an appender based on a configured property: either a JSON-log appender or a text-log appender, decided by a property file or by an environment variable.
So I am thinking about the best approach to do this.
Using filters to route logs to one of the files (either JSON or text) would work, but it creates both appenders, and I want to create only one.
Using "if else" blocks in the logback XML file, around appenders and loggers, seems untidy and error-prone, so I will avoid that as much as possible.
So now I am exploring options for adding an appender at runtime.
I want to know whether it is possible to add an appender at runtime, and whether it has to happen before Spring boots up or can be done at any time in the project.
What would be the best approach for this scenario?
As you're already using Spring, I suggest using Spring Profiles; it's a lot cleaner than trying to do the same programmatically. This approach is also outlined in the Spring Boot docs.
You can set the active profile from a property file:
spring.profiles.active=jsonlogs
or from an environment variable:
spring_profiles_active=jsonlogs
or from a startup parameter:
-Dspring.profiles.active=jsonlogs
Then have separate configurations per profile:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<appender name="stdout-classic" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d{dd-MM-yyyy HH:mm:ss.SSS} %magenta([%thread]) %highlight(%-5level) %logger{36}.%M - %msg%n</pattern>
</encoder>
</appender>
<appender name="stdout-json" class="ch.qos.logback.core.ConsoleAppender">
<encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
<layout class="ch.qos.logback.contrib.json.classic.JsonLayout">
<timestampFormat>yyyy-MM-dd'T'HH:mm:ss.SSSX</timestampFormat>
<timestampFormatTimezoneId>Etc/UTC</timestampFormatTimezoneId>
<jsonFormatter class="ch.qos.logback.contrib.jackson.JacksonJsonFormatter">
<prettyPrint>true</prettyPrint>
</jsonFormatter>
</layout>
</encoder>
</appender>
<!-- begin profile-specific stuff -->
<springProfile name="jsonlogs">
<root level="info">
<appender-ref ref="stdout-json" />
</root>
</springProfile>
<springProfile name="classiclogs">
<root level="info">
<appender-ref ref="stdout-classic" />
</root>
</springProfile>
</configuration>
As the previous answer states, you can set different appenders based on Spring Profiles.
However, if you do not want to rely on that feature, you can use environment variables as described in the Logback manual. For example:
<appender name="json" class="ch.qos.logback.core.ConsoleAppender">
<encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
<layout class="ch.qos.logback.contrib.json.classic.JsonLayout">
<jsonFormatter class="ch.qos.logback.contrib.jackson.JacksonJsonFormatter">
<prettyPrint>true</prettyPrint>
</jsonFormatter>
<appendLineSeparator>true</appendLineSeparator>
</layout>
</encoder>
</appender>
<appender name="console" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>
%cyan(%d{HH:mm:ss.SSS}) %gray([%thread]) %highlight(%-5level) %magenta(%logger{36}) - %msg%n
</pattern>
</encoder>
</appender>
<root level="info">
<!--
! Use the content of the LOGBACK_APPENDER environment variable falling back
! to 'json' if it is not defined
-->
<appender-ref ref="${LOGBACK_APPENDER:-json}"/>
</root>
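If attaching an appender at runtime is still wanted (as the original question asked) rather than config-based selection, logback's programmatic API can do it; a minimal sketch run at startup (the appender name "runtime-console" and the pattern are illustrative):

```java
import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.encoder.PatternLayoutEncoder;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.ConsoleAppender;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class RuntimeAppender {
    public static void main(String[] args) {
        // when logback is bound, the slf4j factory IS the LoggerContext
        LoggerContext ctx = (LoggerContext) LoggerFactory.getILoggerFactory();

        PatternLayoutEncoder encoder = new PatternLayoutEncoder();
        encoder.setContext(ctx);
        encoder.setPattern("%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n");
        encoder.start();

        ConsoleAppender<ILoggingEvent> appender = new ConsoleAppender<>();
        appender.setContext(ctx);
        appender.setName("runtime-console");
        appender.setEncoder(encoder);
        appender.start();

        // attach to the root logger; any named logger works the same way
        ctx.getLogger(Logger.ROOT_LOGGER_NAME).addAppender(appender);

        LoggerFactory.getLogger(RuntimeAppender.class).info("routed through the runtime appender");
    }
}
```

This can run at any point, but appenders attached after Spring Boot's own logging initialization may be removed if Boot re-initializes logging, which is one more reason the profile-based approach above is cleaner.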
I wrote a simple program in which I am using Logback. My intention was to use an ASYNC appender which internally uses STDOUT.
Here is the Java code listing:
package com.example;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class LogBackMainApp {
    private static final Logger LOGGER = LoggerFactory.getLogger(LogBackMainApp.class);

    public static void main(String[] args) throws InterruptedException {
        LOGGER.info("Hello world");
        LOGGER.info("Hello world again");
        Thread.sleep(5000);
    }
}
The below is the configuration file:
<?xml version="1.0" encoding="UTF-8"?>
<configuration scan="true" scanPeriod="60 seconds" >
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<!-- %d{yyyy-MM-dd HH:mm:ss.SSS} %thread %-5level %logger{0}:%L
If you required class name ,enable %logger{0}:%L -->
<pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %thread %-5level - %msg%n</pattern>
</encoder>
</appender>
<appender name="ASYNC-STDOUT" class="ch.qos.logback.classic.AsyncAppender">
<queueSize>1</queueSize>
<discardingThreshold>20</discardingThreshold>
<neverBlock>true</neverBlock>
<appender-ref ref="STDOUT" />
</appender>
<root level="INFO">
<appender-ref ref="ASYNC-STDOUT" />
</root>
</configuration>
I am defining a root logger which would cater to my com.example package, and it refers to ASYNC-STDOUT, which internally uses ch.qos.logback.core.ConsoleAppender.
As per my current understanding, it should be able to log to the console. However, nothing is coming out. Is there something wrong in my code or configuration, or am I misunderstanding the concept altogether?
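One detail worth checking in the configuration above (an observation, not confirmed as the root cause here): per the Logback manual, AsyncAppender discards events of level INFO and below whenever the queue's remaining capacity falls under discardingThreshold, and in this config the threshold (20) is larger than the entire queue (1), so INFO events can be dropped immediately. A less lossy sketch:

```xml
<appender name="ASYNC-STDOUT" class="ch.qos.logback.classic.AsyncAppender">
<!-- a larger queue, with discarding disabled so INFO events are never dropped -->
<queueSize>256</queueSize>
<discardingThreshold>0</discardingThreshold>
<neverBlock>true</neverBlock>
<appender-ref ref="STDOUT" />
</appender>
```

Note that neverBlock=true can still drop events once the queue is completely full; that is the trade-off it makes to avoid blocking the logging thread.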
If you use Maven, have a look at: Dependency management for SLF4J and Logback. Maybe you're missing a required dependency. SLF4J is only an abstraction for your real logger implementation, which has to be added as a dependency.
My team is developing a realtime telecom application with a high calls-per-second rate. We are using logback to filter logs based on a key-value match against live traffic values (Calling Party, and so on). The filtered log file is correctly created once the match between live values and DB values is verified, but we would like to get rid of the default file, which fills up with logs when there is no match. A traffic node may need to be monitored for a while before a key-value match takes place, so in the meantime the default file could grow indefinitely and hurt the performance and stability of the node itself. What should I do in my logback.xml to avoid generation of the default log file? Is it possible? Any other option to achieve the same result?
logback.xml
<?xml version="1.0" encoding="UTF-8"?>
<property scope="context" name="LOG_LEVEL" value="INFO" />
<appender name="SIFT_LOGGER" class="ch.qos.logback.classic.sift.SiftingAppender">
<discriminator class="com.ericsson.jee.ngin.services.log.ServiceKeyDiscriminator">
</discriminator>
<sift>
<appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
<prudent>true</prudent>
<rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
<fileNamePattern>/var/log/tit/logback_${serviceKey}_%d{yyyy-MM-dd}_%i.log</fileNamePattern>
<maxFileSize>1MB</maxFileSize>
<maxHistory>10</maxHistory>
<totalSizeCap>2GB</totalSizeCap>
</rollingPolicy>
<filter class="ch.qos.logback.classic.filter.LevelFilter">
<level>${LOG_LEVEL}</level>
<onMatch>ACCEPT</onMatch>
<onMismatch>DENY</onMismatch>
</filter>
<!-- encoders are by default assigned the type ch.qos.logback.classic.encoder.PatternLayoutEncoder -->
<encoder>
<pattern> %d{yyyy-MM-dd HH:mm:ss.SSSZ} [%thread] %-5level %logger{36} %msg%n</pattern>
</encoder>
</appender>
</sift>
</appender>
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<!-- encoders are by default assigned the type ch.qos.logback.classic.encoder.PatternLayoutEncoder -->
<encoder>
<pattern> %d{yyyy-MM-dd HH:mm:ss.SSSZ} [%thread] %-5level %logger{36} %msg%n</pattern>
</encoder>
</appender>
<turboFilter class="ch.qos.logback.classic.turbo.DynamicThresholdFilter">
<key>serviceKey</key>
<defaultThreshold>DEBUG</defaultThreshold>
<onHigherOrEqual>ACCEPT</onHigherOrEqual>
<onLower>ACCEPT</onLower>
</turboFilter>
<root level="DEBUG">
<appender-ref ref="SIFT_LOGGER" />
<appender-ref ref="STDOUT" />
</root>
ATTACHMENTS: filtered logback class
The provided FL class only works for a SK (serviceKey) which has a Java Discriminator in the FL module.
You must move the filter to the general sift appender.
<appender name="SIFT-TRACE"
class="ch.qos.logback.classic.sift.SiftingAppender">
<discriminator
class="ch.qos.logback.classic.sift.MDCBasedDiscriminator">
<Key>loggerFileName</Key>
<DefaultValue>unknown</DefaultValue>
</discriminator>
<filter class="ch.qos.logback.core.filter.EvaluatorFilter">
<evaluator
class="ch.qos.logback.classic.boolex.JaninoEventEvaluator">
<expression>
mdc.get("loggerFileName")!=null
</expression>
</evaluator>
<OnMismatch>DENY</OnMismatch>
<OnMatch>NEUTRAL</OnMatch>
</filter>
<sift>
<appender name="TRACE-${loggerFileName}"
class="ch.qos.logback.core.FileAppender">
<File>D:/U/${loggerFileName}.log</File>
<Append>true</Append>
<layout class="ch.qos.logback.classic.PatternLayout">
<Pattern>%d [%thread] %level %mdc %logger - %msg%n</Pattern>
</layout>
</appender>
</sift>
</appender>
<logger name="org.springframework" level="DEBUG" />
<root level="DEBUG">
<appender-ref ref="SIFT-TRACE" />
</root>
Also, to make it work correctly, after each thread/file/marker/etc. you MUST put these statements:
public void handle()
{
    MDC.put("loggerFileName", "some value");
    ...
    MDC.remove("loggerFileName");
}
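Since MDC values live on the current thread, it is safer to wrap the put/remove pair in try/finally so the remove is never skipped when handling throws (important with pooled servlet threads, where a stale value would leak into unrelated requests); a minimal sketch, with illustrative class and value names:

```java
import org.slf4j.MDC;

public class MdcCleanupExample {
    public static void handle() {
        MDC.put("loggerFileName", "building-A"); // illustrative value
        try {
            // ... work that logs under this file name ...
        } finally {
            // always runs, so the pooled thread never keeps a stale key
            MDC.remove("loggerFileName");
        }
    }
}
```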
You have defined this root logger:
<root level="DEBUG">
<appender-ref ref="SIFT_LOGGER" />
<appender-ref ref="STDOUT" />
</root>
This means that all log events with Level >= DEBUG will be directed to two appenders:
SIFT_LOGGER
STDOUT
If I understand your question correctly, you do want logs to be written via your SIFT_LOGGER appender but you don't want any other log output. If so, then just remove this entry:
<appender-ref ref="STDOUT" />
The STDOUT appender is a console appender, so it doesn't actually write to a log file; instead it writes to System.out. I suspect the reason you are seeing these log events in some file is that whatever is running your application is redirecting System.out to a file. As long as you only have your SIFT_LOGGER appender in the root logger definition, you can be confident that it will be the only appender in play. Note: once you remove the STDOUT appender from the root logger, you can probably remove it from logback.xml entirely, since it is unused.
Update 1: Based on your last comment I now understand that you want to discard the logs which arrive at the SiftingAppender but do not match a given condition. I suspect what's happening here is that some log events arrive at the sifting appender with an 'unknown' value for serviceKey, and these events are then written to /var/log/tit/logback_[unknownValue]_%d{yyyy-MM-dd}_%i.log. Is this the crux of the issue? If so, then you can add a filter into the nested appender. Here are some examples:
Using Groovy to express the 'contains unknown serviceKey condition':
<filter class="ch.qos.logback.core.filter.EvaluatorFilter">
<!-- GEventEvaluator requires Groovy -->
<evaluator
class="ch.qos.logback.classic.boolex.GEventEvaluator">
<expression>
serviceKey == null
</expression>
</evaluator>
<OnMismatch>NEUTRAL</OnMismatch>
<OnMatch>DENY</OnMatch>
</filter>
Using Janino to express the 'contains unknown serviceKey condition':
<filter class="ch.qos.logback.core.filter.EvaluatorFilter">
<!-- JaninoEventEvaluator requires Janino -->
<evaluator
class="ch.qos.logback.classic.boolex.JaninoEventEvaluator">
<expression>
serviceKey == null
</expression>
</evaluator>
<OnMismatch>NEUTRAL</OnMismatch>
<OnMatch>DENY</OnMatch>
</filter>
With either of these filters in place, any log events which arrive at the sifting appender with the 'unknown' serviceKey will be ignored. Note: I have written the 'contains unknown serviceKey' condition as serviceKey == null; your logic might differ, but the above examples should show you how to tell Logback to apply this filter for you.
Just to notify @glitch (and all others interested) of the happy conclusion of this issue: the expression that finally made it work was:
<expression>mdc.get("servicekey") == null</expression>
Thanks to this expression, I got the wanted behavior: the default "_IS_UNDEFINED" file is not generated when the key doesn't match the runtime traffic values.
The reason is that the event type seen by JaninoEventEvaluator is LoggingEvent, which exposes a reserved "mdc" object (of type Map).
Regards,
Pierluigi
My problem is: my application maintains three buildings, and each building has a different process.
So, using logback, I want to create logs with this specification:
each building will have a specific folder, and inside each building's folder there will be many log files, each log file corresponding to one process.
My logback.xml right now is :
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<appender name="stdout" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n
</pattern>
</encoder>
</appender>
<appender name="logAppender" class="ch.qos.logback.classic.sift.SiftingAppender">
<discriminator>
<key>processName</key>
<defaultValue>unknown</defaultValue>
</discriminator>
<sift>
<appender name="FILE-${processName}"
class="ch.qos.logback.core.rolling.RollingFileAppender">
<file>logs/${distributor}/${processName}.log</file>
<layout class="ch.qos.logback.classic.PatternLayout">
<pattern>%d [%thread] %level %mdc %logger{35} - %msg%n</pattern>
</layout>
<rollingPolicy
class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
<fileNamePattern>logs/${distributor}/${processName}.%i.log</fileNamePattern>
<minIndex>1</minIndex>
<maxIndex>10</maxIndex>
<!-- <timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
<maxFileSize>5KB</maxFileSize> </timeBasedFileNamingAndTriggeringPolicy> -->
</rollingPolicy>
<triggeringPolicy
class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
<maxFileSize>10MB</maxFileSize>
</triggeringPolicy>
</appender>
</sift>
</appender>
<logger name="processLog" level="debug" additivity="false">
<appender-ref ref="logAppender" />
<appender-ref ref="stdout" />
</logger>
<root level="debug">
<appender-ref ref="stdout" />
<appender-ref ref="logAppender" />
</root>
</configuration>
And my java servlet code is :
import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

public class DistributorServlet extends HttpServlet {
    private static final long serialVersionUID = 1L;
    private static Logger processLog = LoggerFactory.getLogger("processLog");

    protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
        String office = request.getParameter("office");
        MDC.put("distributor", office);
        String process = request.getParameter("process");
        MDC.put("process", process);
        processLog.debug("Processing");
    }
}
However, a log file is not generated.
Can anyone help me?
Thank you very much
1. Make the below change
private static Logger processLog = LoggerFactory.getLogger("processLog");
to
private static Logger processLog = LoggerFactory.getLogger(DistributorServlet.class);
2. Add additional discriminator for distributor
From the logback.xml it appears that only one discriminator has been added. Did you try adding another one for distributor?
3. Do not forget
To add MDC.remove("keyName"); after its usage.
4. In case you observe an issue with multiple MDC keys
I faced an issue with MDC.put when trying to add multiple keys (I wondered why no putAll has been defined).
Although the underlying implementation is a HashMap<Key, Value> that should allow adding multiple keys, I was only able to see the last key put into MDC applied in logback.xml,
while for the other keys an _IS_UNDEFINED value got appended instead.
Of course you can refer to the various other links, and although this may not be the best idea, here is what I have done in my Java code:
System.setProperty("distributor", req.getParameter("office"));
System.setProperty("process", req.getParameter("process"));
Then modify the logback.xml by removing the discriminator. Alternatively, you can remove one of the above properties and keep the MDC.put for that property.
However, please do refer to: Is System.setProperty safe in Java?
Alternatively I suggest https://stackoverflow.com/a/32615172/3838328