Logback file per logger dynamically - Java

I have "special" requirement on loggin - I need every logger in separate file.
Java
Logger log1 = LoggerFactory.getLogger("dynamic.log1");
Logger log2 = LoggerFactory.getLogger("dynamic.log2");
//...
Then I want Logback to write any output from log1 to the file log1.log, and so on. Is it possible to "dynamically" create new appenders like that with Logback?
Can some other logging framework be used to solve this use case?
I could configure the appenders manually, but that is exactly what I want to avoid: whenever I add a dynamic logger, a new appender/file should be created accordingly.
EDIT:
I implemented a custom discriminator:
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.sift.AbstractDiscriminator;

public class LoggerBasedDiscriminator extends AbstractDiscriminator<ILoggingEvent> {

    private static final String LOGGER_NAME = "loggerName";

    @Override
    public String getDiscriminatingValue(ILoggingEvent e) {
        return e.getLoggerName();
    }

    @Override
    public String getKey() {
        return LOGGER_NAME;
    }
}
And then my appender config looks like this:
<appender name="SIFT" class="ch.qos.logback.classic.sift.SiftingAppender">
<discriminator class="cz.svobol.logging.LoggerBasedDiscriminator">
<key>loggerName</key>
<defaultValue>root</defaultValue>
</discriminator>
<sift>
<appender name="FILE-${loggerName}" class="ch.qos.logback.core.FileAppender">
<file>${loggerName}.log</file>
<encoder>
<pattern>%msg%n</pattern>
</encoder>
</appender>
</sift>
</appender>

You can use a SiftingAppender.
This way you have a single appender that splits the log output into different files dynamically.
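For illustration, a minimal usage sketch (assuming the SIFT appender above is attached to the root logger in logback.xml) then yields one file per logger name:
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class DynamicLoggerDemo {

    public static void main(String[] args) {
        // Each distinct logger name becomes the discriminating value,
        // so the SiftingAppender creates one nested FileAppender/file per logger.
        Logger log1 = LoggerFactory.getLogger("dynamic.log1"); // -> dynamic.log1.log
        Logger log2 = LoggerFactory.getLogger("dynamic.log2"); // -> dynamic.log2.log

        log1.info("written to dynamic.log1.log");
        log2.info("written to dynamic.log2.log");
    }
}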

Related

Create a custom Log4j2 rolling file appender

I want to create a custom Log4j2 rolling file appender. I need this custom appender because I want to derive the file name from the current thread name. We are trying to migrate from log4j 1.x to a recent Log4j2 version, and previously we used a DailyRollingFileAppender to log all activities of our application.
Please find the code below.
Here we append the log to a file on a daily basis with the help of DailyRollingFileAppender, based on the thread name.
Since DailyRollingFileAppender is deprecated in the recent version, how can we create a custom rolling file appender that incorporates our thread-based logic?
Find the log4j.properties file below:
log4j.logger.***=INFO, FileLogger
# log4j.appender.FileLogger=org.apache.log4j.DailyRollingFileAppender
# Custom appender which will redirect the logs based on thread names configured using the
# log4j.appender.FileLogger.threadNameMapping property below
log4j.appender.FileLogger=********.framework.log4j.appender.ThreadNamePatternAppender
log4j.appender.FileLogger.DatePattern='.'yyyy-MM-dd
log4j.appender.FileLogger.file=/logs/fileName.log
log4j.appender.FileLogger.layout=org.apache.log4j.PatternLayout
log4j.appender.FileLogger.layout.ConversionPattern=%d [%-5p] [%t] [%c{1}] [%M] - %m%n
# Custom property to hold the mapping between thread names and log files for the plug-in
# Beware - the ThreadNamePatternAppender class inherits from DailyRollingFileAppender, hence it will not work for any other type of appender
# The mapping is specified as - ThreadName1>ThreadName1.log|ThreadName2>ThreadName2.log|.....|ThreadNameN>ThreadNameN.log
# Note - if there is no mapping for a particular thread, then logs will be written to the default log file
log4j.appender.FileLogger.threadNameMapping=********/logs/fileName-fileName.log
Thanks!
import java.util.HashMap;
import java.util.Map;

import org.apache.log4j.DailyRollingFileAppender;
import org.apache.log4j.helpers.LogLog;
import org.apache.log4j.spi.LoggingEvent;

public class ThreadNamePatternAppender extends DailyRollingFileAppender {

    private Map<String, DailyRollingFileAppender> threadBasedSubAppenders = new HashMap<String, DailyRollingFileAppender>();
    private String threadNameMapping;

    public String getThreadNameMapping() {
        return threadNameMapping;
    }

    public void setThreadNameMapping(String threadNameMapping) {
        this.threadNameMapping = threadNameMapping;
    }

    @Override
    public void activateOptions() {
        super.activateOptions();
        if (threadNameMapping != null && threadNameMapping.trim().length() > 0) {
            DailyRollingFileAppender tempAppender;
            String[] threadNames = threadNameMapping.split("\\|");
            for (String threadName : threadNames) {
                if (threadName != null && threadName.length() > 0) {
                    try {
                        LogLog.debug(String.format("Creating new appender for thread %s", threadName));
                        // Mapping entries look like "ThreadName>file.log".
                        tempAppender = new DailyRollingFileAppender(getLayout(), threadName.split(">")[1],
                                getDatePattern());
                        threadBasedSubAppenders.put(threadName.split(">")[0], tempAppender);
                    } catch (Exception ex) {
                        LogLog.error("Failed to create appender", ex);
                    }
                }
            }
        }
    }

    @Override
    public void append(LoggingEvent event) {
        String threadName = event.getThreadName().split(" ")[0];
        if (threadBasedSubAppenders.containsKey(threadName)) {
            threadBasedSubAppenders.get(threadName).append(event);
        } else {
            super.append(event);
        }
    }

    @Override
    public synchronized void close() {
        LogLog.debug("Calling close on ThreadNamePatternAppender " + getName());
        for (DailyRollingFileAppender appender : threadBasedSubAppenders.values()) {
            appender.close();
        }
        this.closed = true;
    }
}
The RollingFileAppender in Log4j 2.x is final, so you cannot extend it. However, you can obtain the functionality of your custom Log4j 1.x appender using:
a RoutingAppender, which can create appenders on demand,
multiple RollingFileAppenders, which will write and rotate your files,
the EventLookup to retrieve the current thread name.
For a simple logfile-per-thread appender you can use:
<Routing name="Routing">
<Routes pattern="$${event:ThreadName}">
<Route>
<RollingFile name="Rolling-${event:ThreadName}"
fileName="logs/thread-${event:ThreadName}.log"
filePattern="logs/thread-${event:ThreadName}.log.%d{yyyy-MM-dd}">
<PatternLayout pattern="%d [%-5p] [%t] [%c{1}] [%M] - %m%n" />
<TimeBasedTriggeringPolicy />
</RollingFile>
</Route>
</Routes>
</Routing>
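As a quick sanity check, a minimal driver like the following (a made-up example that assumes the Routing appender above is referenced by the root logger) should produce one logs/thread-<name>.log file per thread name:
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class RoutingDemo {

    private static final Logger LOGGER = LogManager.getLogger(RoutingDemo.class);

    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> LOGGER.info("hello from {}", Thread.currentThread().getName());

        Thread importer = new Thread(task, "importer"); // -> logs/thread-importer.log
        Thread exporter = new Thread(task, "exporter"); // -> logs/thread-exporter.log
        importer.start();
        exporter.start();
        importer.join();
        exporter.join();
    }
}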
For a more complex configuration both the <Routing> appender and the <Routes> can contain a <Script> (cf. documentation):
the script in the <Routing> appender can initialize the staticVariables map and return a default route,
the script in the <Routes> component chooses the appropriate route based on staticVariables and the logging event.

Mask passwords in log messages using MDC or any filter in Spring Boot, without using a logback.xml file

I have to mask passwords in log messages without using a logback.xml file in Spring Boot.
Sample log:
LOGGER.info("user password : {}", pwd);
Expected output:
2019-11-26 18:27:15,951 [http-nio-8080-exec-2] INFO com.test.controller.TestController - user password: ***********
I am able to achieve this using a logback.xml file, as shown below,
but I need to do it without the logback file, using only the application.properties configuration file in Spring Boot.
Note: don't use a log4j XML file. We should use SLF4J or MDC or any filter, plus application.properties.
<configuration>
    <property name="DEV_HOME" value="c:/logs" />
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
            <layout class="com.test.config.MaskingPatternLayout">
                <patternsProperty>(SSN)</patternsProperty>
                <pattern>%d [%thread] %-5level %logger{35} - %msg%n</pattern>
            </layout>
        </encoder>
    </appender>
    <logger name="com.test" level="debug" additivity="false">
        <appender-ref ref="CONSOLE" />
    </logger>
    <root level="error">
        <appender-ref ref="CONSOLE" />
    </root>
</configuration>
Is it possible to achieve this without using a logback.xml file or a log4j.xml file?
Can we mention the pattern layout Java class in the application.properties file instead of mentioning it in logback.xml? (In the example above, I have referenced the Java class in the logback configuration.)
I have added MaskingPatternLayout for reference:
package com.test.config;

import java.util.Optional;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

import org.springframework.stereotype.Component;

import ch.qos.logback.classic.PatternLayout;
import ch.qos.logback.classic.spi.ILoggingEvent;

@Component
public class MaskingPatternLayout extends PatternLayout {

    private String patternsProperty;
    private Optional<Pattern> pattern = Optional.empty();

    public String getPatternsProperty() {
        return patternsProperty;
    }

    public void setPatternsProperty(String patternsProperty) {
        this.patternsProperty = patternsProperty;
        if (this.patternsProperty != null) {
            this.pattern = Optional.of(Pattern.compile(patternsProperty, Pattern.MULTILINE));
        } else {
            this.pattern = Optional.empty();
        }
    }

    @Override
    public String doLayout(ILoggingEvent event) {
        final StringBuilder message = new StringBuilder(super.doLayout(event));
        if (pattern.isPresent()) {
            Matcher matcher = pattern.get().matcher(message);
            while (matcher.find()) {
                int group = 1;
                while (group <= matcher.groupCount()) {
                    if (matcher.group(group) != null) {
                        // Overwrite every character of the matched group with '*'.
                        for (int i = matcher.start(group); i < matcher.end(group); i++) {
                            message.setCharAt(i, '*');
                        }
                    }
                    group++;
                }
            }
        }
        return message.toString();
    }
}
Kindly help on this.
As soon as you want to use advanced logging features (anything beyond setting log levels), you have to use logging-library-specific configuration, such as logback.xml for Logback, log4j.xml for Log4j and so on.
However, Logback does have an API you can call. For example, you can set up your ConsoleAppender with beans:
@Bean
public LoggerContext loggerContext() {
    return (LoggerContext) LoggerFactory.getILoggerFactory();
}

@Bean
public MaskingPatternLayout maskPatternLayout(LoggerContext context) {
    MaskingPatternLayout layout = new MaskingPatternLayout();
    layout.setPatternsProperty("(SSN)");
    layout.setPattern("%d [%thread] %-5level %logger{35} - %msg%n");
    layout.setContext(context);
    layout.start();
    return layout;
}

@Bean
public LayoutWrappingEncoder<ILoggingEvent> maskEncoder(MaskingPatternLayout layout) {
    LayoutWrappingEncoder<ILoggingEvent> encoder = new LayoutWrappingEncoder<>();
    encoder.setLayout(layout);
    return encoder;
}

@Bean
public ConsoleAppender<ILoggingEvent> maskConsoleAppender(LoggerContext context, LayoutWrappingEncoder<ILoggingEvent> maskEncoder) {
    ConsoleAppender<ILoggingEvent> appender = new ConsoleAppender<>();
    appender.setContext(context);
    appender.setEncoder(maskEncoder);
    appender.start();
    return appender;
}
Now you could create your own LoggerFactory:
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Component;

import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.Appender;

@Component
public class MaskLoggerFactory {

    private final Appender<ILoggingEvent> appender;

    public MaskLoggerFactory(Appender<ILoggingEvent> appender) {
        this.appender = appender;
    }

    public org.slf4j.Logger getLogger(String name) {
        Logger logger = (Logger) LoggerFactory.getLogger(name);
        logger.addAppender(appender);
        logger.setLevel(Level.ALL);
        logger.setAdditive(false);
        return logger;
    }

    public org.slf4j.Logger getLogger(Class<?> cls) {
        return getLogger(cls.getName());
    }
}
After that you can autowire MaskLoggerFactory to get a properly configured Logger. However, this doesn't make things easier to use, and if your only motivation is to avoid creating a separate XML file, I would encourage you to keep using that XML file.
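For illustration, a hypothetical consumer of the factory (the LoginService class and its method names are made up for this example) could look like this:
import org.springframework.stereotype.Service;

@Service
public class LoginService {

    private final org.slf4j.Logger log;

    public LoginService(MaskLoggerFactory loggerFactory) {
        // This logger writes through the masking ConsoleAppender configured above.
        this.log = loggerFactory.getLogger(LoginService.class);
    }

    public void authenticate(String user, String password) {
        // Whatever the layout's patternsProperty groups match in the rendered message
        // is overwritten with '*' characters before it reaches the console.
        log.info("authentication attempt for user {}", user);
    }
}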

Does anybody know how to configure or extend Log4j2 to recreate logs after deletion?

Log4j2 does not recreate log files if they are deleted at runtime, for example when a careless admin removes the log files the app is currently writing to.
Actual result: logs are no longer written to the file.
Wanted result: Log4j2 recreates the file on the first attempt to write to it and continues to work with this file.
Recreating the file manually via cron or otherwise does not work, because Log4j2 "remembers" the file descriptor and keeps writing to it even after the old file was deleted and a new one was created.
On StackOverflow I found only one workaround (https://stackoverflow.com/a/51593404/5747662) and it looks like this:
package org.apache.log4j;

import java.io.File;

import org.apache.log4j.spi.LoggingEvent;

public class ModifiedRollingFileAppender extends RollingFileAppender {

    @Override
    public void append(LoggingEvent event) {
        checkLogFileExist();
        super.append(event);
    }

    private void checkLogFileExist() {
        File logFile = new File(super.fileName);
        if (!logFile.exists()) {
            this.activateOptions();
        }
    }
}
I don't like it because:
1) It is a "little bit" slow: every time we write an event we also execute checkLogFileExist() and hit the filesystem.
2) It doesn't work for Log4j2: there is no activateOptions() method in the Log4j2 infrastructure.
So has anybody faced the same problem? How did you solve it?
UPDATE
I have tried to initialize the triggering policy to manually "rollover" the deleted file, but it is not working for me.
My code:
final LoggerContext ctx = (LoggerContext) LogManager.getContext(false);
// loggerName is name of logger which should work with the file has been deleted.
LoggerConfig loggerConfig = ctx.getConfiguration().getLoggerConfig(loggerName);
// I also know what appender (appenderName) should work with this file.
RollingFileAppender appender = (RollingFileAppender) loggerConfig.getAppenders().get(appenderName);
appender.getTriggeringPolicy().initialize(appender.getManager());
Also my config:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="ERROR">
<Appenders>
<RollingFile name="FILE_LOG">
<FileName>../log/temp/server.log</FileName>
<FilePattern>../log/server/SERVER_%d{yyyy-MM-dd-hh-mm}.log</FilePattern>
<PatternLayout pattern="%d{dd.MM.yyyy HH:mm:ss} [%t] %-5level %msg%n"/>
<Policies>
<SizeBasedTriggeringPolicy size="100 MB" />
</Policies>
</RollingFile>
<RollingFile name="OUTPUT_LOG">
<FileName>../log/temp/output.log</FileName>
<FilePattern>../log/output/OUTPUT_%d{yyyy-MM-dd-hh-mm}.log</FilePattern>
<PatternLayout>
<Pattern>%d{dd.MM.yyyy HH:mm:ss} %msg</Pattern>
</PatternLayout>
<Policies>
<CronTriggeringPolicy schedule="0 0 * * * ?"/>
<OnStartupTriggeringPolicy />
<SizeBasedTriggeringPolicy size="50 MB" />
</Policies>
</RollingFile>
</Appenders>
<Loggers>
<Logger name="OUTPUT" level="debug" additivity="false">
<AppenderRef ref="OUTPUT_LOG" />
</Logger>
<Root level="debug">
<AppenderRef ref="FILE_LOG" />
</Root>
</Loggers>
</Configuration>
I finally found the solution. Thanks to @Alexander in the comments for the tip.
Short version: we can manually initiate the rollover process when we detect that the file was deleted.
Longer version:
I implemented it this way:
1) Create a FileWatchService which will (1) subscribe to log-file deletion events in your log folder and (2) notify you when these events occur. This can be done with java.nio.file.WatchService (https://docs.oracle.com/javase/tutorial/essential/io/notification.html). I will provide my code below.
2) Create another class which initiates the rollover when FileWatchService notifies it about a file deletion. I will also provide my full code below, but the main magic happens this way:
final LoggerContext ctx = (LoggerContext) LogManager.getContext(false);
// You should know only appender name.
RollingFileAppender appender = (RollingFileAppender) ctx.getConfiguration().getAppenders().get(appenderName);
if (appender != null) {
// Manually start rollover logic.
appender.getManager().rollover();
}
My code looks like this (not ideal, but it works for me):
FileWatchService:
import static java.nio.file.StandardWatchEventKinds.ENTRY_DELETE;

import java.io.IOException;
import java.nio.file.FileSystems;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.WatchEvent;
import java.nio.file.WatchKey;
import java.nio.file.WatchService;
import java.util.HashMap;
import java.util.Map;

import org.apache.logging.log4j.LogManager;

public class FileWatchService implements Runnable {

    private final org.apache.logging.log4j.Logger logger = LogManager.getLogger(FileWatchService.class);

    private WatchService watchService = null;
    private Map<WatchKey, Path> keys = null;
    private String tempPath;

    public FileWatchService(String tempPath) {
        try {
            this.watchService = FileSystems.getDefault().newWatchService();
            this.keys = new HashMap<WatchKey, Path>();
            this.tempPath = tempPath;
            Path path = Paths.get(tempPath);
            register(path);
            logger.info("Watch service has been initiated.");
        } catch (Exception e) {
            logger.error("An error occurred while registering the watch service", e);
        }
    }

    // Registers the log folder with the watch service.
    private void register(Path tempPath) throws IOException {
        logger.debug("Registering folder {} for watching.", tempPath.getFileName());
        // Registering only for delete events.
        WatchKey key = tempPath.register(watchService, ENTRY_DELETE);
        keys.put(key, tempPath);
    }

    @Override
    public void run() {
        try {
            Thread.currentThread().setName("FileWatchService");
            this.processEvents();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }

    private void processEvents() throws InterruptedException {
        WatchKey key;
        // Wait until an event occurs.
        while ((key = watchService.take()) != null) {
            // Poll all pending events.
            for (WatchEvent<?> event : key.pollEvents()) {
                // Get the type of event - delete, modify or create.
                WatchEvent.Kind<?> kind = event.kind();
                // We are only interested in delete events.
                if (kind == ENTRY_DELETE) {
                    // Send a "notification" to the appender watcher.
                    logger.debug("Received event about file deletion. File: {}", event.context());
                    AppenderWatcher.handleLogFileDeletionEvent(this.tempPath + event.context());
                }
            }
            key.reset();
        }
    }
}
Another class to initiate the rollover (I have called it AppenderWatcher):
import java.io.File;

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.core.Appender;
import org.apache.logging.log4j.core.LoggerContext;
import org.apache.logging.log4j.core.appender.RollingFileAppender;

public class AppenderWatcher {

    private static final org.apache.logging.log4j.Logger logger = LogManager.getLogger(AppenderWatcher.class);

    public static void handleLogFileDeletionEvent(String logFile) {
        File file = new File(logFile);
        if (!checkFileExist(file)) {
            logger.info("File {} does not exist. Starting manual rollover...", file.toString());
            // Getting the probable appender name from the log-file name.
            String appenderName = getAppenderNameByFileName(logFile);
            // Getting the appender from the list of all appenders.
            RollingFileAppender appender = (RollingFileAppender) getAppender(appenderName);
            if (appender != null) {
                // Manually start the rollover logic.
                appender.getManager().rollover();
                logger.info("Rollover finished");
            } else {
                logger.error("Can't get appender {}. Please check the log4j2 config.", appenderName);
            }
        } else {
            logger.warn("Received a notification that file {} was deleted, but it exists.", file.getAbsolutePath());
        }
    }

    // Checks whether the file exists. This is needed to prevent conflicts with the Log4j rolling-file logic:
    // when Log4j rotates a file, it deletes it first and creates it afterwards.
    private static boolean checkFileExist(File logFile) {
        return logFile.exists();
    }

    // Gets an appender by name from the list of all configured appenders.
    private static Appender getAppender(String appenderName) {
        final LoggerContext ctx = (LoggerContext) LogManager.getContext(false);
        return ctx.getConfiguration().getAppenders().get(appenderName);
    }

    // Returns the appender name derived from the log-file name.
    // ===Here I'm relying on some customer-specific conventions of my log4j2 config.
    private static String getAppenderNameByFileName(String fileName) {
        return getLoggerNameByFileName(fileName) + "_LOG";
    }

    // This method is fully customer-specific.
    private static String getLoggerNameByFileName(String fileName) {
        // The file name looks like "../log/temp/uber.log" (example).
        String[] parts = fileName.split("/");
        // The last part should look like "uber.log".
        String lastPart = parts[parts.length - 1];
        // We need only the "uber" part.
        String componentName = lastPart.substring(0, lastPart.indexOf("."));
        return componentName.toUpperCase();
    }
}
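To tie it together, a minimal (hypothetical) bootstrap could start the watcher as a daemon thread on the folder the appenders write to; the path below mirrors the ../log/temp/ folder from the configuration above and should be adjusted to your setup:
public class LogWatcherBootstrap {

    public static void main(String[] args) {
        // Watch the folder that FILE_LOG and OUTPUT_LOG write into.
        Thread watcher = new Thread(new FileWatchService("../log/temp/"));
        watcher.setDaemon(true); // do not keep the JVM alive only for the watcher
        watcher.start();

        // ... start the rest of the application here ...
    }
}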

In Apache Log4j, is there a way to simply create multiple log files on the fly, rather than appending to one log file?

I have a program where I would like to separate each single log message into its own log file.
So if a class generates 10 ERROR logs and 10 DEBUG logs within a single program execution, that should create 20 log files, and their names can ideally be something like:
logoutput1
logoutput2
logoutput3
...etc
Each log file would contain just a single line.
I'm working on a project where I'd like to implement some autonomic ability - the idea is that we can have a third, externally running program which can read those log files (and then react based on them).
Is this possible with Log4j? How can this be done?
Thanks!
Yes, you can use the RoutingAppender. See this question for details: Log4j2: Dynamic creation of log files for multiple logs
Write your own file appender that creates a new file every time it attempts to write a log event. The following piece of code might help you.
import org.apache.log4j.FileAppender;
import org.apache.log4j.spi.LoggingEvent;

public class SingleLogMsgFileAppender extends FileAppender {

    private String file = null;
    private static long fileNo;

    public SingleLogMsgFileAppender() {
        super();
        fileNo = 1;
    }

    @Override
    protected void subAppend(LoggingEvent event) {
        createNewFile(true);
        synchronized (this) {
            super.subAppend(event);
        }
    }

    @Override
    public void setFile(String file) {
        this.file = file;
        createNewFile(false);
    }

    public void createNewFile(boolean incrementFileNo) {
        try {
            String fileName = file + "testlogfile." + fileNo + ".log";
            super.setFile(fileName);
            super.activateOptions();
        } catch (Exception e) {
            e.printStackTrace();
        }
        if (incrementFileNo) {
            fileNo++;
        }
    }
}
And here is the log4j configuration file:
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">
    <appender name="CustomAppender" class="loga.randd.threads.log4j.SingleLogMsgFileAppender">
        <param name="File" value="log/" />
        <param name="Append" value="true" />
        <layout class="org.apache.log4j.PatternLayout">
            <param name="ConversionPattern" value="%d{MM_dd_yyyy HH_mm_ss}%m%n" />
        </layout>
    </appender>
    <root>
        <priority value="debug" />
        <appender-ref ref="CustomAppender" />
    </root>
</log4j:configuration>
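For illustration, a usage sketch (assuming the configuration above is picked up as log4j.xml on the classpath; SomeClass is a made-up name) ends up with one numbered file per log statement:
import org.apache.log4j.Logger;

public class SomeClass {

    private static final Logger LOGGER = Logger.getLogger(SomeClass.class);

    public static void main(String[] args) {
        LOGGER.debug("first message");  // -> log/testlogfile.1.log
        LOGGER.error("second message"); // -> log/testlogfile.2.log
    }
}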

How to write a separate log file for each instance of a class?

I am using java.util.logging for logging in my program. The problem is that I need to create a separate log file for each instance of the TestCase class. For example, I have three test case objects, and in the end I get three log files, but:
test case #3 contains the log for test case #3, test case #2 contains the logs for test cases #2 and #3, and test case #1 contains the logs of all test cases.
Here is my code:
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.logging.FileHandler;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;

public class TestCase {

    Logger log = Logger.getLogger("com.sigmaukraine.trn.autotest.testcase");
    String tcName;
    String scenarioReportDir;
    List<Keyword> kwList = new ArrayList<Keyword>();

    TestCase(String tcName) {
        this.tcName = tcName;
    }

    public void executeTestCase() {
        // Saving the log for the current test case.
        try {
            FileHandler fh;
            String fileName = new StringBuilder(tcName).append(".log").toString();
            // This block configures the logger with a handler and a formatter.
            fh = new FileHandler(scenarioReportDir + fileName);
            log.addHandler(fh);
            SimpleFormatter formatter = new SimpleFormatter();
            fh.setFormatter(formatter);
            log.info("Executing test case: " + tcName);
        } catch (SecurityException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
        for (Keyword k : kwList) {
            k.executeKeyword();
        }
    }
}
The problem is in
log.addHandler(fh);
It keeps adding handlers, so the behaviour is what you are seeing. You should call
fh.close();
log.removeHandler(fh);
after executing the test case.
<appender name="RootSiftAppender" class="ch.qos.logback.classic.sift.SiftingAppender">
<discriminator>
<Key><strong>testname</strong></Key>
<DefaultValue><strong>testrun</strong></DefaultValue>
</discriminator>
<sift>
<appender name="FILE-${testname}" class="ch.qos.logback.core.rolling.RollingFileAppender">
<File>${testname}.log</File>
<rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
<FileNamePattern><strong>${testname}.%i.log</strong></FileNamePattern>
<MinIndex>1</MinIndex>
<MaxIndex>100</MaxIndex>
</rollingPolicy>
<triggeringPolicy class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
<MaxFileSize>10MB</MaxFileSize>
</triggeringPolicy>
<layout class="ch.qos.logback.classic.PatternLayout">
<Pattern>%d{ISO8601} %-5level %C{1} [%M:%L] [%thread] - %msg%n</Pattern>
</layout>
</appender>
</sift>
</appender>
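Since no discriminator class is given, Logback falls back to the MDC-based discriminator, so the calling code has to put the testname key into the MDC before logging. A small sketch of how the test case could do that (class, field and method names are illustrative):
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

public class TestCaseLogging {

    private static final Logger LOG = LoggerFactory.getLogger(TestCaseLogging.class);

    public void executeTestCase(String tcName) {
        // The "testname" MDC value selects the target file, e.g. "TestCase1" -> TestCase1.log.
        MDC.put("testname", tcName);
        try {
            LOG.info("Executing test case: {}", tcName);
        } finally {
            MDC.remove("testname"); // later log output falls back to testrun.log
        }
    }
}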
