How to log multiple threads in different log files? - java

I have a Java class that starts various threads, each of which has a unique ID.
Each thread should log to its own log file, named <ID>.log.
Because I only get the unique ID at runtime, I have to configure Log4j programmatically:
// Get the jobID
myJobID = aJobID;

// Initialize the logger
myLogger = Logger.getLogger(myJobID);
FileAppender myFileAppender;
try
{
    myFileAppender = new FileAppender(new SimpleLayout(), myJobID + ".log", false);
    BasicConfigurator.resetConfiguration();
    BasicConfigurator.configure(myFileAppender);
} catch (IOException e1) {
    // TODO Auto-generated catch block
    e1.printStackTrace();
}
This works fine if I start jobs sequentially - but when I start two threads (of the same class) simultaneously, both log files are created but the output gets mixed up: the second thread logs into the first file as well as the second one.
How can I make sure that each instance logs only to its own file?
I already tried giving each logger instance a unique name, but it did not change anything.

Logback has a special appender called SiftingAppender which provides a very nice solution to the type of problems you describe. A SiftingAppender can be used to separate (or sift) logging according to any runtime attribute, including thread id.
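For illustration, here is a minimal logback.xml sketch of that idea, assuming the job ID is placed in the MDC under a key named jobId (the key name, file names and pattern are only examples):
<configuration>
  <appender name="SIFT" class="ch.qos.logback.classic.sift.SiftingAppender">
    <!-- Sift on the MDC key "jobId"; one FileAppender is created per distinct value. -->
    <discriminator>
      <key>jobId</key>
      <defaultValue>unknown</defaultValue>
    </discriminator>
    <sift>
      <appender name="FILE-${jobId}" class="ch.qos.logback.core.FileAppender">
        <file>${jobId}.log</file>
        <encoder>
          <pattern>%d{HH:mm:ss} %-5level - %msg%n</pattern>
        </encoder>
      </appender>
    </sift>
  </appender>
  <root level="debug">
    <appender-ref ref="SIFT"/>
  </root>
</configuration>
Each worker thread would then call MDC.put("jobId", myJobID) before it starts logging.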

For Log4j 2 you can use the RoutingAppender to dynamically route messages. Put a value for the key 'threadId' into the ThreadContext map and then use that ID as part of the file name. The FAQ has an example which I easily adapted for the same purpose as yours. See http://logging.apache.org/log4j/2.x/faq.html#separate_log_files
Be aware when putting values into the ThreadContext map: "A child thread automatically inherits a copy of the mapped diagnostic context of its parent." So if you put a value for the key 'threadId' in the parent thread and later create multiple threads from it, all child threads will inherit that 'threadId' value. I was not able to simply override this value by calling put() again - you need to call ThreadContext.clear() or explicitly remove() the value from the thread context map.
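As a minimal sketch of the worker side (the key name matches the config below; the try/finally clean-up is just a precaution for reused threads):
import org.apache.logging.log4j.ThreadContext;

Runnable job = () -> {
    // Set the routing key inside the child thread itself, so it does not rely on
    // a value inherited from the parent thread.
    ThreadContext.put("threadId", String.valueOf(Thread.currentThread().getId()));
    try {
        // ... do the actual work and log as usual ...
    } finally {
        ThreadContext.remove("threadId"); // avoid leaking the key if the thread is reused
    }
};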
Here is my working log4j.xml:
<?xml version="1.0" encoding="UTF-8"?>
<configuration status="WARN">
<properties>
<property name="logMsgPattern">%d{HH:mm:ss} %-5level - %msg%n</property>
<property name="logDir">test logs</property><!-- ${sys:testLogDir} -->
</properties>
<appenders>
<Console name="Console" target="SYSTEM_OUT">
<PatternLayout pattern="${logMsgPattern}"/>
</Console>
<Routing name="Routing">
<Routes pattern="$${ctx:threadId}">
<Route>
<RollingFile name="RollingFile-${ctx:threadId}" fileName="${logDir}/last-${ctx:threadId}.log" filePattern="${logDir}/%d{yyyy-MM-dd}/archived_%d{HH-mm}-${ctx:threadId}.log">
<PatternLayout pattern="${logMsgPattern}"/>
<Policies>
<OnStartupTriggeringPolicy />
</Policies>
</RollingFile>
</Route>
</Routes>
</Routing>
</appenders>
<loggers>
<root level="debug">
<appender-ref ref="Console" level="debug" />
<appender-ref ref="Routing" level="debug"/>
</root>
</loggers>
</configuration>

@havexz's approach is quite good: write everything to the same log file and use nested diagnostic contexts.
If your concern is about several JVMs writing to the same FileAppender, then I'd suggest two things:
using SLF4J as a logging facade
using Logback as the logging implementation, in prudent mode
In prudent mode, FileAppender will safely write to the specified file,
even in the presence of other FileAppender instances running in
different JVMs, potentially running on different hosts.
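As a sketch, prudent mode is a single flag on the appender in logback.xml (the file name and pattern here are illustrative):
<appender name="FILE" class="ch.qos.logback.core.FileAppender">
  <file>shared.log</file>
  <!-- Serializes writes so several JVMs can append to the same file safely. -->
  <prudent>true</prudent>
  <encoder>
    <pattern>%d{HH:mm:ss} %-5level [%t] - %msg%n</pattern>
  </encoder>
</appender>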

Here is the snippet of the Routing code from a working log4j.xml file.
<Appenders>
    <Console name="ConsoleAppender" target="SYSTEM_OUT">
        <PatternLayout>
            <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %-5level %-50logger{4}: %msg%n</pattern>
        </PatternLayout>
    </Console>
    <Routing name="RoutingAppender">
        <Routes pattern="${ctx:logFileName}">
            <!-- This route is chosen if ThreadContext has a value for logFileName.
                 The value dynamically determines the name of the log file. -->
            <Route>
                <RollingFile name="Rolling-${ctx:logFileName}"
                        fileName="${sys:log.path}/${ctx:logFileName}.javalog"
                        filePattern="./logs/${date:yyyy-MM}/${ctx:logFileName}_%d{yyyy-MM-dd}-%i.log.gz">
                    <PatternLayout>
                        <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %-5level %-50logger{4}: %msg%n</pattern>
                    </PatternLayout>
                    <Policies>
                        <TimeBasedTriggeringPolicy interval="6"
                                modulate="true" />
                        <SizeBasedTriggeringPolicy size="10 MB" />
                    </Policies>
                </RollingFile>
            </Route>
            <!-- This route is chosen if ThreadContext has no value for key logFileName. -->
            <Route key="${ctx:logFileName}" ref="ConsoleAppender" />
        </Routes>
    </Routing>
</Appenders>
<loggers>
    <root level="debug">
        <appender-ref ref="RoutingAppender" level="debug" />
    </root>
</loggers>
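Note that the fileName above uses the ${sys:log.path} lookup, so that system property must be defined before the route's appender is created - for example with -Dlog.path=/var/log/myapp on the command line, or early at startup (the path is illustrative):
// Define the property that the config's ${sys:log.path} lookup resolves against.
System.setProperty("log.path", "/var/log/myapp");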
The key 'logFileName' can be added to the thread context map in the run() method of the Runnable class as follows,
public class SomeClass implements Runnable {
    private int threadID;

    public SomeClass(int threadID) {
        this.threadID = threadID;
    }

    @Override
    public void run() {
        String logFileName = "thread_log_" + threadID;
        ThreadContext.put("logFileName", logFileName);
        // Some code
        ThreadContext.remove("logFileName");
    }
}
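For completeness, a hypothetical launcher that starts a few such workers; with the config above, each one ends up writing to its own thread_log_<n>.javalog file:
// Illustrative only: each worker puts its own logFileName into the ThreadContext,
// so the RoutingAppender sends each thread's output to a separate file.
for (int i = 1; i <= 3; i++) {
    new Thread(new SomeClass(i)).start();
}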
In addition, the correct log4j packages must be imported, as shown below.
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.ThreadContext;
Please note that the following imports will not work. LogManager and Logger must also come from org.apache.logging.log4j.
import org.apache.log4j.LogManager;
import org.apache.log4j.Logger;
import org.apache.logging.log4j.ThreadContext;

As far as I can tell, the ThreadLocal API was designed to do what you describe.
Code like the following would establish per-thread loggers, each using its own (per-thread) FileAppender:
/**
 * usage: threadLocalLogger.get().info("hello thread local logger")
 */
static ThreadLocal<Logger> threadLocalLogger = newThreadLocalLogger("myJobId");

private static ThreadLocal<Logger> newThreadLocalLogger(final String myJobID) {
    return new ThreadLocal<Logger>() {
        @Override
        protected Logger initialValue() {
            return logger(myJobID, Thread.currentThread().getId());
        }
    };
}

private static Logger logger(String myJobID, long threadId) {
    // Initialize the logger
    String loggerId = myJobID + "-" + threadId;
    Logger myLogger = Logger.getLogger(loggerId);
    FileAppender myFileAppender;
    try
    {
        myFileAppender = new FileAppender(new SimpleLayout(),
                loggerId + ".log", false);
        // Attach the appender to this logger only, instead of resetting the global
        // configuration, which would redirect every other thread's output as well.
        myLogger.addAppender(myFileAppender);
        myLogger.setAdditivity(false);
    } catch (IOException e1) {
        // TODO Auto-generated catch block
        e1.printStackTrace();
    }
    return myLogger;
}
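Usage from any thread is then a one-liner; the first call on a given thread lazily creates that thread's logger and log file:
// Each thread gets its own Logger and therefore its own <jobId>-<threadId>.log file.
threadLocalLogger.get().info("hello thread local logger");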

What about adding a static instance counter variable to your class? You would then need a synchronized method that increments the counter for each object created and builds the log file name from that value. Something like this:
class YourClass {
    private static int cnt = 0;

    public YourClass() {
        ...
        initLogger();
    }

    private void initLogger() {
        synchronized (YourClass.class) { // guard the static counter across instances
            YourClass.cnt++;
            myJobID = YourClass.cnt;
        }
        // include your logging code here
    }
}
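The same idea can be written without explicit synchronization by using an AtomicInteger (a sketch; the names are illustrative):
import java.util.concurrent.atomic.AtomicInteger;

class YourClass {
    private static final AtomicInteger COUNTER = new AtomicInteger();
    // Each instance atomically takes the next ID and can use it to build "<id>.log".
    private final int myJobID = COUNTER.incrementAndGet();
}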

Related

Log4j2 Incorrect Logging: Passing package name into getLogger

I am working on some older code that uses log4j in a way that I am not familiar with.
The class seems to wrap Log4j. It sets its own log levels and is used statically across the package that it is named after:
public class MyClass {
    private static final Logger logger = LogManager.getLogger("com.example");
    private static MyClass instance = null;
    private int logLevel;

    public MyClass() {
        setLevel(DefaultLevel);
    }
    // Code that sets log levels, and implements logDebug/logInfo/etc.
}
MyClass operates such that it is statically used to log across the entire package:
public class AnotherClassInComExample {
    MyClass.log("message");
}
I'm curious about passing a package name into getLogger as a parameter. Why does this work?
However, my main issue is that the class logs to both its RollingFile AND the console, and I have no idea why. The app is run through Tomcat 9.
The log4j2.xml looks like this:
<Appenders>
    <RollingFile name="SingletonMyClass" fileName="C:/logs/myclass.log"
            filePattern="C:/logs/myclass.log.%i" append="true" immediateFlush="true"
            bufferedIO="false" ignoreExceptions="false" filePermissions="rwxr-xr-x">
        <PatternLayout>
            <pattern>%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n</pattern>
        </PatternLayout>
        <SizeBasedTriggeringPolicy size="20 MB" />
        <DefaultRolloverStrategy max="30" />
    </RollingFile>
</Appenders>
<Loggers>
    <Logger name="com.example" level="trace" additivity="false">
        <AppenderRef ref="SingletonMyclass" />
    </Logger>
    <Root level="trace">
    </Root>
</Loggers>
There is no console appender listed and I removed everything from the root, yet it still logs to console.

Preventing log4j from Rolling over all Loggers in a config file at Startup

I have a single Java class (a device controller) that is being used to create 5 separate processes. Each of the processes is assigned an identifier. I would like each of the processes to write to its own log file based on its assigned identifier. I have all of the appenders and loggers defined in a shared log4j2.xml config file.
Issue: When I start the first device controller, it successfully writes to the correct log file. However, when I start the second device controller, log4j will roll-over all of the loggers in the log4j2.xml config file and will only write to the log file assigned to the new process. All of the log messages for the first process will go to the rolled-over log file, but new messages are no longer written to its newly rolled-over log file.
(OS: Linux, log4j version: 2.8.2)
Below is an abbreviated version of the log4j2.xml config file that I used.
...
<Appenders>
    ...
    <RollingFile name="RollingFile-1" fileName="/logs/EPDU/Device-1.log" filePattern="/logs/EPDU/Device-1_%d{dd-MMM-yyyy::HH:mm:ss}.log">
        <PatternLayout>
            ...
        </PatternLayout>
        <Policies>
            <OnStartUpTriggeringPolicy minSize="1"/>
            <SizeBasedTriggeringPolicy size="20 MB"/>
        </Policies>
        <DefaultRolloverStrategy fileIndex="nomax"/>
    </RollingFile>
    ...
    <RollingFile name="RollingFile-5" fileName="/logs/EPDU/Device-5.log" filePattern="/logs/EPDU/Device-5_%d{dd-MMM-yyyy::HH:mm:ss}.log">
        <PatternLayout>
            ...
        </PatternLayout>
        <Policies>
            <OnStartUpTriggeringPolicy minSize="1"/>
            <SizeBasedTriggeringPolicy size="20 MB"/>
        </Policies>
        <DefaultRolloverStrategy fileIndex="nomax"/>
    </RollingFile>
</Appenders>
<Loggers>
    <Logger name="device-1" level="trace" additivity="false">
        <AppenderRef ref="RollingFile-1" level="debug"/>
    </Logger>
    ...
    <Logger name="device-5" level="trace" additivity="false">
        <AppenderRef ref="RollingFile-5" level="debug"/>
    </Logger>
</Loggers>
The Logger variable is initialized and assigned in the main method after the device identifier is determined similar to the code below:
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.LogManager;

public class DeviceController {
    private static Logger deviceLogger;

    public DeviceController(Param param1, Param param2) {
        ...
    }
    ...
    public static void main(String[] args) {
        /**
         * Fancy code to find device identifier...
         * String loggerName = (results of fancy code is "device-[1..5]");
         */
        deviceLogger = LogManager.getLogger(loggerName);
        deviceLogger.info("Start logging stuff in device log.");
        new DeviceController(param1, param2);
    }
    ...
}
How can I prevent all of the loggers from rolling over, but instead leave the currently running processes/logs alone as the next process and log is started?
Note: I tried to provide a "Goldilocks" amount of detail to explain the problem. Sorry if I provided too much or not enough information.
Could you show a little bit more of your code? I think your issue comes from the fact that you have a static logger. From your snippet above, I believe you overwrite deviceLogger with a new Logger for each new DeviceController, using the next identifier you fetch. I would guess that in the end all your logs are being appended to your device-5 log file, aren't they?
Side note: I think it's good practice to declare your logger against the SLF4J interface and then let the Log4j implementation be bound to it at runtime.
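A sketch of what that declaration would look like (assuming the log4j-slf4j-impl binding is on the classpath, so Log4j 2 still does the actual logging and configuration):
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Declared against the SLF4J facade; the Log4j 2 binding supplies the implementation.
private static Logger deviceLogger;
...
deviceLogger = LoggerFactory.getLogger(loggerName);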

Choosing log file dynamically

I have an application with a number of components that create data in a database. Each component logs what it is doing when creating data. There are many such components and the application is flexible so that it does not always need to execute the same set of these data-creation components each time it runs.
Currently, everything logs to a single file, which is producing files that are starting to get unmanageable. I'd like it if each component could log to a file whose name describes which component wrote it - ComponentA should log to ComponentA-dataCreationPhase.log.
Most solutions I've seen seem to assume that the different loggers will be static so that it's OK to look them up by name, such as LogManager.getLogger("ComponentA"); - assuming a logger with that name is already configured in my log4j2.xml. Other solutions that I've seen have used Routing and ThreadContexts - but I'm not sure this will work since these components will probably all execute in the same thread.
How can I get each component (many are distinct classes, but some are just different instances of the same class, just configured differently) to log to its own log file? Ideally, this would be done based on the existing log4j2.xml file, as the log4j2.xml might have some user-specified configurations that I'd want to propagate to the component-specific loggers, such as logging path and logging level.
I found this answer:
https://stackoverflow.com/a/38096181/192801
And I used it as the basis of a new function that I added to all Components, via a default method in the top-level interface.
Added to the interface for all components:
public interface Component {
    default Logger createLogger(String logFileName, String oldAppenderName, String newAppenderName, boolean append, Level level)
    {
        LoggerContext context = (LoggerContext) LogManager.getContext(false);
        Configuration configuration = context.getConfiguration();
        Appender oldAppender = configuration.getAppender(oldAppenderName);
        Layout<? extends Serializable> oldLayout = oldAppender.getLayout();
        // create new appender/logger
        LoggerConfig loggerConfig = new LoggerConfig(logFileName, level, false);
        Appender appender;
        // In my case, it is possible that the old appender could
        // either be a simple FileAppender, or a RollingRandomAccessFileAppender,
        // so I'd like the new one to be of the same type as the old one.
        // I have yet to find a more elegant way to create a new Appender
        // of *any* type and then copy all relevant config.
        if (oldAppender instanceof RollingRandomAccessFileAppender)
        {
            int bufferSize = ((RollingRandomAccessFileAppender) oldAppender).getBufferSize();
            RollingRandomAccessFileManager oldManager = (RollingRandomAccessFileManager) ((RollingRandomAccessFileAppender) oldAppender).getManager();
            TriggeringPolicy triggerPolicy = oldManager.getTriggeringPolicy();
            RolloverStrategy rollStrategy = oldManager.getRolloverStrategy();
            Filter filter = ((RollingRandomAccessFileAppender) oldAppender).getFilter();
            // Inject new log file name into filePattern so that file rolling will work properly
            String pattern = ((RollingRandomAccessFileAppender) oldAppender).getFilePattern().replaceAll("/[^/]*-\\%d\\{yyyy-MM-dd\\}\\.\\%i\\.log\\.gz", "/" + logFileName + "-%d{yyyy-MM-dd}.%i.log.gz");
            appender = RollingRandomAccessFileAppender.newBuilder()
                    .withFileName("logs/" + logFileName + ".log")
                    .withFilePattern(pattern)
                    .withAppend(append)
                    .withName(newAppenderName)
                    .withBufferSize(bufferSize)
                    .withPolicy(triggerPolicy)
                    .withStrategy(rollStrategy)
                    .withLayout(oldLayout)
                    .withImmediateFlush(true)
                    .withFilter(filter)
                    .build();
        }
        else
        {
            appender = FileAppender.newBuilder()
                    .withFileName("logs/" + logFileName + ".log")
                    .withAppend(append)
                    .withName(newAppenderName)
                    .withLayout(oldLayout)
                    .setConfiguration(configuration)
                    .withLocking(false)
                    .withImmediateFlush(true)
                    .withIgnoreExceptions(true)
                    .withBufferSize(8192)
                    .withFilter(null)
                    .withAdvertise(false)
                    .withAdvertiseUri("")
                    .build();
        }
        appender.start();
        loggerConfig.addAppender(appender, level, null);
        configuration.addLogger(logFileName, loggerConfig);
        context.updateLoggers();
        return context.getLogger(logFileName);
    }
}
Use of this function happens inside the constructor of the components:
public class ComponentA implements Component
{
    private Logger logger = LogManager.getLogger();

    public ComponentA(String componentName)
    {
        this.logger = this.createLogger(componentName, "MyOriginalFileAppender", componentName + "Appender", true, Level.DEBUG);
    }
}
and elsewhere:
ComponentA fooSubtypeAComponent = new ComponentA("FooA");
ComponentA barSubtypeAComponent = new ComponentA("BarA");
ComponentB fooVariantTypeBComponent = new ComponentB("FooB");
The original Appenders + Loggers snippet from the Log4j config:
<Appenders>
    <!-- STDOUT appender. -->
    <Console name="Console" target="SYSTEM_OUT">
        <PatternLayout
                pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36}#%M - %msg%n" />
    </Console>
    <RollingRandomAccessFile name="MyOriginalFileAppender" fileName="${baseDir}/${defaultLogName}.log"
            filePattern="${baseDir}/$${date:yyyy-MM}/${defaultLogName}-%d{MM-dd-yyyy}.%i.log.gz">
        <PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %-5level %logger{36}#%M - %msg%n" />
        <Policies>
            <!-- <TimeBasedTriggeringPolicy /> -->
            <!-- Let's try cron triggering policy configured to trigger every day at midnight -->
            <CronTriggeringPolicy schedule="0 0 0 * * ?"/>
            <SizeBasedTriggeringPolicy size="25 MB" />
        </Policies>
    </RollingRandomAccessFile>
</Appenders>
<Loggers>
    <Root level="debug">
        <!-- only write INFO level to the file. -->
        <AppenderRef ref="MyOriginalFileAppender" level="debug"/>
        <!-- Console shows everything at DEBUG level-->
        <AppenderRef ref="Console" level="info" />
    </Root>
</Loggers>
The result should be three log files: logs/FooA.log, logs/BarA.log, logs/FooB.log - one log file for each instance shown above. There are still some kinks to iron out, but I think this will work fine.

Different log files for multiple threads using log4j2

I am running a Java application in which I am invoking multiple threads, each with a unique name. Now I want to create a separate log file for each of them, and the name of each log file should be the thread name. Is this possible using log4j2? Please help me write the log4j2 configuration files.
Thank you in advance.
I agree a RoutingAppender is the way to go. I initially used the RoutingAppender in conjunction with the ${ctx:threadName} lookup, where 'ctx' uses the ThreadContext. I found that I would have to sprinkle a line like this into the code:
ThreadContext.put("threadName", Thread.currentThread().getName());
While that code works, it is not extensible in the design of the code. If I were to add a new java.lang.Runnable to the code base, I would also have to include that line.
Rather, the solution seems to be to implement org.apache.logging.log4j.core.lookup.StrLookup and register the @Plugin with the PluginManager, like this:
Class: ThreadLookup
package my.logging.package;

import org.apache.logging.log4j.core.LogEvent;
import org.apache.logging.log4j.core.config.plugins.Plugin;
import org.apache.logging.log4j.core.lookup.StrLookup;

@Plugin(name = "thread", category = StrLookup.CATEGORY)
public class ThreadLookup implements StrLookup {

    @Override
    public String lookup(String key) {
        return Thread.currentThread().getName();
    }

    @Override
    public String lookup(LogEvent event, String key) {
        return event.getThreadName() == null ? Thread.currentThread().getName()
                : event.getThreadName();
    }
}
Configuration: log4j2.xml (the packages attribute of the Configuration registers the @Plugin with the PluginManager)
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="warn" packages="my.logging.package">
    <Appenders>
        <Routing name="Routing">
            <Routes pattern="$${thread:threadName}">
                <Route>
                    <RollingFile name="logFile-${thread:threadName}"
                            fileName="logs/concurrent-${thread:threadName}.log" filePattern="logs/concurrent-${thread:threadName}-%d{MM-dd-yyyy}-%i.log">
                        <PatternLayout pattern="%d %-5p [%t] %C{2} - %m%n" />
                        <Policies>
                            <SizeBasedTriggeringPolicy size="50 MB" />
                        </Policies>
                        <DefaultRolloverStrategy max="100" />
                    </RollingFile>
                </Route>
            </Routes>
        </Routing>
        <Async name="async" bufferSize="1000" includeLocation="true">
            <AppenderRef ref="Routing" />
        </Async>
    </Appenders>
    <Loggers>
        <Root level="info">
            <AppenderRef ref="async" />
        </Root>
    </Loggers>
</Configuration>
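With this in place the worker code needs no ThreadContext calls at all; a hypothetical usage (thread name, logger name and message are illustrative):
// Log statements issued from the thread named "worker-42" should be routed by the
// custom lookup to logs/concurrent-worker-42.log, per the RollingFile fileName above.
Thread worker = new Thread(
        () -> LogManager.getLogger("some.worker").info("routed by thread name"),
        "worker-42");
worker.start();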
This can be done with the RoutingAppender. The FAQ page has a good example config.
This question and its answers were a good starting point for me; however, if you are using slf4j inside your implementation and log4j only as the logging backend, things can be a bit trickier, which I wanted to share here.
First of all, the previously mentioned log4j FAQ with the example can be found here: https://logging.apache.org/log4j/2.x/faq.html#separate_log_files
At first this did not seem useful to me, as we are using MDC from slf4j instead of ThreadContext. It was not clear at first sight, but you can use ThreadContext and MDC in the log4j XML in exactly the same way. For example:
<Routes pattern="$${ctx:macska}">
could refer to an
MDC.put("macska", "cica" + Thread.currentThread().getId());
in the code, exactly the same as with ThreadContext.
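For context, a minimal sketch of the slf4j side (the key name 'macska' simply mirrors the example above; the try/finally clean-up is an addition, not part of the original answer):
import org.slf4j.MDC;

MDC.put("macska", "cica" + Thread.currentThread().getId());
try {
    // ... log through the slf4j Logger as usual; with the log4j-slf4j binding,
    // the ${ctx:macska} lookup in the config sees this same value ...
} finally {
    MDC.remove("macska");
}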
But in the end my solution did not use the MDC, because I found out log4j has an EventLookup: https://logging.apache.org/log4j/2.x/manual/lookups.html#EventLookup
so I only had to modify my log4j xml with this:
<Routes pattern="$${event:ThreadName}">

Log4j 2.0 - No Logs appearing in the Log Files - Trying Multiple Loggers in the same Class using log4j2.xml

Below is the log4j2.xml file that I have created. I have configured async_file.log for asynchronous logging and regular_file.log for regular, synchronous logging. The problem is that the log files get created, but their size is zero and they contain no logs. All logs go to the server.log file (JBoss) and not to the two files I configured (async_file.log and regular_file.log).
Please let me know why the logs are NOT going to the log files that I have configured. Please help me with this or give me some direction or hint.
I am obtaining the two different loggers in the same class (named DCLASS) as shown below:
private static final transient Logger LOG = Logger.getLogger(DCLASS.class);
private static final transient Logger ASYNC_LOG = Logger.getLogger("ASYNC");
I have included the following jars in the Class Path:
1. log4j-api-2.0-beta8.jar
2. log4j-core-2.0-beta8.jar
3. disruptor-3.0.0.beta1.jar
My log4j2.xml is as below:
<?xml version="1.0" encoding="UTF-8"?>
<configuration status="INFO">
<appenders>
<!-- Async Loggers will auto-flush in batches, so switch off immediateFlush. -->
<FastFile name="AsyncFastFile" fileName="../standalone/log/async_file.log"
immediateFlush="false" append="true">
<PatternLayout>
<pattern>%d %p %class{1.} [%t] %location %m %ex%n</pattern>
</PatternLayout>
</FastFile>
<FastFile name="FastFile" fileName="../standalone/log/regular_file.log"
immediateFlush="true" append="true">
<PatternLayout>
<pattern>%d %p %class{1.} [%t] %location %m %ex%n</pattern>
</PatternLayout>
</FastFile>
</appenders>
<loggers>
<!-- pattern layout actually uses location, so we need to include it -->
<asyncLogger name="ASYNC" level="trace" includeLocation="true">
<appender-ref ref="AsyncFastFile"/>
</asyncLogger>
<root level="info" includeLocation="true">
<appender-ref ref="FastFile"/>
</root>
</loggers>
</configuration>
The reason the logs were not going to the log files is that I was using 'Logger' instead of 'LogManager'.
In the code, I had
private static final transient Logger ASYNC_LOG = Logger.getLogger("ASYNC");
The code should have been
private static final transient Logger ASYNC_LOG = LogManager.getLogger("ASYNC");
With 'Logger.getLogger', the call resolves against the Log4j 1.x API; with 'LogManager.getLogger', it resolves against the Log4j 2 API. Since everything is configured for Log4j 2, changing Logger to LogManager made the logs go to the configured log files as expected.
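Putting it together, the corrected declarations would look roughly like this:
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

// Both loggers are now obtained through the Log4j 2 LogManager.
private static final transient Logger LOG = LogManager.getLogger(DCLASS.class);
private static final transient Logger ASYNC_LOG = LogManager.getLogger("ASYNC");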
