How to properly configure log4j2 programmatically? - java

I am trying to setup log4j2 to write logs using the RollingFileAppender. I want to configure the logging system programmatically instead of using XML files.
Here is what I tried (mostly the same as the docs at https://logging.apache.org/log4j/2.x/manual/customconfig.html#Configurator):
public static void configure(String rootLevel, String packageLevel) {
    ConfigurationBuilder<BuiltConfiguration> builder =
            ConfigurationBuilderFactory.newConfigurationBuilder();
    builder.setConfigurationName("RollingBuilder");
    builder.setStatusLevel(Level.TRACE);

    // Create a rolling file appender
    LayoutComponentBuilder layoutBuilder = builder.newLayout("PatternLayout")
            .addAttribute("pattern", "%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %c{1}:%L - %m%n");
    ComponentBuilder triggeringPolicy = builder
            .newComponent("Policies")
            .addComponent(builder
                    .newComponent("SizeBasedTriggeringPolicy")
                    .addAttribute("size", "200M"));
    AppenderComponentBuilder appenderBuilder = builder
            .newAppender("rolling", "RollingFile")
            .addAttribute("fileName", "log")
            .addAttribute("filePattern", "log.%d.gz")
            .add(layoutBuilder)
            .addComponent(triggeringPolicy);
    builder.add(appenderBuilder);

    // Create new logger
    LoggerComponentBuilder myPackageLoggerBuilder =
            builder.newLogger("com.mypackage", packageLevel)
                    .add(builder.newAppenderRef("rolling"))
                    .addAttribute("additivity", false);
    builder.add(myPackageLoggerBuilder);

    RootLoggerComponentBuilder rootLoggerBuilder = builder
            .newRootLogger(rootLevel)
            .add(builder.newAppenderRef("rolling"));
    builder.add(rootLoggerBuilder);

    // Initialize logging
    Configurator.initialize(builder.build());
}
I call the configure() method at the start of the main method. A file named log is created when I run my program, but all log output goes to standard out and the log file remains empty.
Can someone help figure out what is wrong with my config?
I am not using any log4j configuration file, if that makes a difference. I am also using the SLF4J API in my code. Dependencies:
org.apache.logging.log4j:log4j-api:2.11.1
org.apache.logging.log4j:log4j-core:2.11.1
org.apache.logging.log4j:log4j-slf4j-impl:2.11.1
org.slf4j:slf4j-api:1.7.25

First, this answer is in response to the additional information provided in your comment.
My requirement is that I want to control log levels for different
packages via command line flags when I launch my program
Since you want to use program arguments to control your log level I suggest you take a look at the Main Arguments Lookup and the Routing Appender. Using these two features together you can set up your logging configuration to send log events to the appropriate appender based on program arguments.
I will provide a simple example to help guide you.
First here is an example Java class that generates some log events:
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.core.lookup.MainMapLookup;

public class SomeClass {
    private static final Logger log = LogManager.getLogger();

    public static void main(String[] args) {
        MainMapLookup.setMainArguments(args);
        if (log.isDebugEnabled())
            log.debug("This is some debug!");
        log.info("Here's some info!");
        log.error("Some error happened!");
    }
}
Next the configuration file for log4j2 (see comments in the code for details):
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="warn">
    <Appenders>
        <Routing name="myRoutingAppender">
            <!-- Log events are routed to appenders based on the logLevel program argument -->
            <Routes pattern="$${main:logLevel}">
                <!-- If the logLevel argument is followed by DEBUG this route is used -->
                <Route ref="DebugFile" key="DEBUG" />
                <!-- If the logLevel argument is omitted or followed by any other value this route is used -->
                <Route ref="InfoFile" />
            </Routes>
        </Routing>
        <!-- This appender is not necessary; it was used to test the config -->
        <Console name="Console" target="SYSTEM_OUT">
            <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n" />
        </Console>
        <!-- Below are the 2 appenders used by the Routing Appender from earlier -->
        <File name="DebugFile" fileName="logs/Debug.log" immediateFlush="true" append="false">
            <PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n" />
        </File>
        <File name="InfoFile" fileName="logs/Info.log" immediateFlush="true" append="false">
            <PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n" />
            <LevelRangeFilter minLevel="FATAL" maxLevel="INFO" onMatch="ACCEPT" onMismatch="DENY"/>
        </File>
    </Appenders>
    <Loggers>
        <!-- Root logger is set to DEBUG intentionally so that debug events are generated.
             However, events may be ignored by the LevelRangeFilter depending on where they
             are routed by the Routing Appender. -->
        <Root level="DEBUG">
            <AppenderRef ref="Console" />
            <AppenderRef ref="myRoutingAppender" />
        </Root>
    </Loggers>
</Configuration>
Using this configuration if you do not provide a "logLevel" argument then by default log events are routed to the "InfoFile" appender and any events that are more specific than INFO are ignored via the LevelRangeFilter.
If a "logLevel" argument is provided and is followed by "DEBUG" then log events are routed to the "DebugFile" appender and none of the events are ignored.
Note that I did try to use lookups to set the log level but it appears that log level parameters can't be configured via lookups. That is why I had to use this alternative approach. My one concern with this approach is that as I noted in the comments within the configuration file the log level must be kept at DEBUG which means you always generate the DEBUG events even if you don't use them. This might impact performance. A workaround would be to use your program argument to determine whether you need to generate the debug events. For example:
Normally you would use:
if (log.isDebugEnabled())
    log.debug("This is some debug!");
but when using the configuration above you would use something like:
if("DEBUG".equals(args[1]))
log.debug("This is some debug!");
and you could make that more efficient using an enum (maybe even use the Level class provided by log4j2) if you needed to.
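If you want to avoid hard-coding the argument position, a small helper can scan the arguments once (purely illustrative; the flag name logLevel is an assumption matching the config above):

```java
public class DebugGate {
    // Returns true if the arguments contain "logLevel" immediately followed
    // by "DEBUG", mirroring the Main Arguments Lookup pattern used above.
    static boolean debugRequested(String[] args) {
        for (int i = 0; i < args.length - 1; i++) {
            if ("logLevel".equals(args[i]) && "DEBUG".equalsIgnoreCase(args[i + 1])) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        if (debugRequested(args)) {
            // Only build/emit expensive debug output here
            System.out.println("debug enabled");
        }
    }
}
```

Cache the result in a static field at startup so each log site only pays for a boolean check.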
I hope this helps get you started.

Here is a sample of building the root logger and a named logger; at the end you need to call the Configurator.reconfigure() function to apply the built configuration:
RootLoggerComponentBuilder rootLogger = builder
        .newRootLogger(Level.ALL)
        .add(builder.newAppenderRef("LogToRollingFile"));
builder.add(rootLogger);

LoggerComponentBuilder logger = builder
        .newLogger("MyClass", Level.ALL)
        .add(builder.newAppenderRef("LogToRollingFile"))
        .addAttribute("additivity", false);
builder.add(logger);

Configurator.reconfigure(builder.build());
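For context, a fuller sketch along the same lines (the appender name, file paths, and pattern are illustrative assumptions; requires log4j-core on the classpath):

```java
import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.core.config.Configurator;
import org.apache.logging.log4j.core.config.builder.api.AppenderComponentBuilder;
import org.apache.logging.log4j.core.config.builder.api.ConfigurationBuilder;
import org.apache.logging.log4j.core.config.builder.api.ConfigurationBuilderFactory;
import org.apache.logging.log4j.core.config.builder.impl.BuiltConfiguration;

public class LogConfig {
    public static void configure() {
        ConfigurationBuilder<BuiltConfiguration> builder =
                ConfigurationBuilderFactory.newConfigurationBuilder();

        // Rolling file appender with a size-based trigger
        AppenderComponentBuilder appender = builder
                .newAppender("LogToRollingFile", "RollingFile")
                .addAttribute("fileName", "logs/app.log")
                .addAttribute("filePattern", "logs/app-%d{yyyy-MM-dd}.log.gz")
                .add(builder.newLayout("PatternLayout")
                        .addAttribute("pattern", "%d %-5p %c{1} - %m%n"))
                .addComponent(builder.newComponent("Policies")
                        .addComponent(builder.newComponent("SizeBasedTriggeringPolicy")
                                .addAttribute("size", "10MB")));
        builder.add(appender);

        builder.add(builder.newRootLogger(Level.INFO)
                .add(builder.newAppenderRef("LogToRollingFile")));

        // reconfigure() replaces any configuration that is already active,
        // so it works even after loggers have been obtained.
        Configurator.reconfigure(builder.build());
    }
}
```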

Related

How do I programmatically disable BurstFilter and enable it with some parameters in log4j2?

I am planning to use log4j2's burstFilter in my application for burst logging management. The idea is to have it disabled until the administrator really wants to use it (I am planning to give an option in the application GUI to take parameters from the user and activate burstFilter accordingly).
I studied its documentation and realized that it's just a config change inside the log4j2.xml file. This xml config will be bundled along with the application anyway and I will include the filter like this..
<Console name="console">
<PatternLayout pattern="%-5p %d{dd-MMM-yyyy HH:mm:ss} %x %t %m%n"/>
<filters>
<Burst level="INFO" rate="16" maxBurst="100"/>
</filters>
</Console>
Here the rate and maxBurst fields are set to values that I do not want applied by default. One solution I thought of is to omit the <filters> tag by default, and explicitly write it into log4j2.xml once the user sets these parameters in the GUI, like below.
<Burst level="INFO" rate="16" maxBurst="100"/>
This feels like the rookiest solution, so I was wondering if there is any default attribute which I can toggle to switch the filter ON or OFF.
Expectation:
Default log4j2.xml:
<filters>
    <Burst Activated="False" rate="16" maxBurst="100"/>
</filters>
If user wants to activate it:
<filters>
    <Burst Activated="True" rate="16" maxBurst="100"/>
</filters>
Any help would be appreciated. Thank you
Unless the filter provides a parameter to enable/disable it I don't know of any way to directly toggle it. However, you can implement a toggle if you use the RoutingAppender to switch from an appender with the filter to one without the filter. The disadvantage is, obviously, that you need to have 2 appenders configured which could lead to some duplication of the configuration data. (You could extract some things into common properties which are shared between the two)
Here is a simple example using the ThresholdFilter which demonstrates the above strategy:
First, the log4j2.xml configuration file:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
    <!--
        See how I defined a common property to hold the pattern since it is used
        in both appenders below.
    -->
    <Properties>
        <Property name="pattern">%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n</Property>
    </Properties>
    <Appenders>
        <Console name="ConsoleWithFilter" target="SYSTEM_OUT">
            <PatternLayout pattern="${pattern}" />
            <ThresholdFilter level="ERROR" onMatch="ACCEPT" onMismatch="DENY"/>
        </Console>
        <Console name="Console" target="SYSTEM_OUT">
            <PatternLayout pattern="${pattern}" />
        </Console>
        <Routing name="Routing">
            <Routes pattern="$${ctx:filterToggle}">
                <!-- This route is chosen if the ThreadContext has the value "ENABLE" for key filterToggle. -->
                <Route key="ENABLE" ref="ConsoleWithFilter" />
                <!-- This is the default route when no others match -->
                <Route ref="Console"/>
            </Routes>
        </Routing>
    </Appenders>
    <Loggers>
        <Root level="ALL">
            <AppenderRef ref="Routing" />
        </Root>
    </Loggers>
</Configuration>
Next, some Java code to generate logs:
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.ThreadContext;

public class SomeClass {
    private static final Logger log = LogManager.getLogger();

    public static void main(String[] args) {
        // We start with no value in the ThreadContext for filterToggle;
        // this should cause all logs to appear in the console.
        if (log.isDebugEnabled())
            log.debug("This is some debug! (This should appear in console)");
        log.info("Here's some info! (This should appear in console)");
        log.error("Some error happened! (We will also see this in the console)");

        // Now we enable the filter by switching appenders
        ThreadContext.put("filterToggle", "ENABLE");
        log.info("This should not appear in the console");
        log.debug("This also should --not-- appear");
        log.fatal("This will appear since it's more specific than ERROR level.");
    }
}
Finally, here is the output of the above:
12:22:35.922 [main] DEBUG example.SomeClass - This is some debug! (This should appear in console)
12:22:35.923 [main] INFO example.SomeClass - Here's some info! (This should appear in console)
12:22:35.923 [main] ERROR example.SomeClass - Some error happened! (We will also see this in the console)
12:22:35.924 [main] FATAL example.SomeClass - This will appear since it's more specific than ERROR level.
I hope this helps you!

Log4J API to log long messages in temporary files

I am trying to understand if there are any best practices/utilities/out-of-the-box functionality for logging messages that are quite long (typically JSON responses/XMLs/CSVs etc.).
While logging as part of my application I do something like
log.info("Incompatible order found {}", order.asJson())
The problem being that the asJson() representation could be pretty long. In the log files the actual json is only relevant say 1% of the time. So it is important enough to retain but big enough to make me lose focus when I am trawling through logs.
Is there anyway I can could do something like
log.info("Incompatible order found, file dumped at {}", SomeUtility.dumpString(order.asJson()));
where the utility dumps the file into a location consistent with other log files and then in my log file I can see the following
Incompatible order found, file dumped at /abc/your/log/location/tmpfiles/xy23nmg
Key things to note:
It would be preferable to simply use the existing logging API to configure this so that these temp files land in the same location as the log itself; that way they go through the same cleanup cycle after N days etc., just like the actual log files.
I can obviously write something myself, but I am keen on existing utilities or features if already available within log4j.
I am aware that when logs such as these are imported into analysis systems like Splunk, only the filenames will be present without the actual files, and this is okay.
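To frame the question, this is the sort of helper I could write myself with plain JDK file I/O (all names here are mine, not an existing log4j utility); what I'm really after is something that ties into log4j's own rollover and cleanup:

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class SomeUtility {
    // Writes the payload to a uniquely named file under the given directory
    // and returns the path, so the caller can log just the location.
    // Note: this does NOT participate in log4j2's rollover/cleanup cycle,
    // which is exactly the gap I am asking about.
    public static String dumpString(Path dir, String payload) {
        try {
            Files.createDirectories(dir);
            Path file = Files.createTempFile(dir, "dump-", ".json");
            Files.writeString(file, payload);
            return file.toString();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```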
(This suggestion is based on Logback instead of log4j2; however, I believe similar facilities exist in log4j2.)
In Logback, there is a facility called SiftingAppender, which allows a separate appender (e.g. a file appender) to be created according to some discriminator (e.g. a value in the MDC).
So, by configuring an appender (e.g. orderDetailAppender) which separates files based on a discriminator (e.g. by putting the order ID in the MDC), and using a separate logger connected to that appender, this should give you the result you want:
pseudo code:
logback config:
<appender name="ORDER_DETAIL_APPENDER" class="ch.qos.logback.classic.sift.SiftingAppender">
<!-- use MDC disciminator by default, you may choose/develop other appropriate discrimator -->
<sift>
<appender name="ORDER-DETAIL-${ORDER_ID}" class="ch.qos.logback.core.FileAppender">
....
</appender>
</sift>
</appender>
<logger name="ORDER_DETAIL_LOGGER">
<appender-ref name="ORDER_DETAIL_APPENDER"/>
</logger>
and your code looks like:
Logger logger = LoggerFactory.getLogger(YourClass.class); // logger you use normally
Logger orderDetailLogger = LoggerFactory.getLogger("ORDER_DETAIL_LOGGER");
.....
MDC.put("ORDER_ID", order.getId());
logger.warn("Something wrong in order {}. Find the details in a separate file", order.getId());
orderDetailLogger.warn("Order detail:\n{}", order.getJson());
// Consider making order.getJson() a lambda, or wrapping the line in a logger
// level check, to avoid constructing the JSON when it is not going to be logged.
MDC.remove("ORDER_ID");
The simplest approach is to use a separate logger for the JSON dump, and configure that logger to put its output in a different file.
As an example:
private static final Logger log = LogManager.getLogger();
private static final Logger jsonLog = LogManager.getLogger("jsondump");
...
log.info("Incompatible order found, id = {}", order.getId());
jsonLog.info("Incompatible order found, id = {}, JSON = {}",
        order.getId(), order.asJson());
Then, configure the logger named jsondump to go to a separate file, possibly with a separate rotation schedule. Make sure to set additivity to false for the JSON logger, so that messages sent to that logger won't be sent to the root logger as well. See Additivity in the Log4J docs for details.
Example (XML configuration, Log4J 2):
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="INFO">
    <Appenders>
        <!-- By setting only filePattern, not fileName, all log files
             will use the pattern for naming. Files will be rotated once
             an hour, since that's the most specific unit in the pattern. -->
        <RollingFile name="LogFile" filePattern="app-%d{yyyy-MM-dd HH}00.log">
            <PatternLayout pattern="%d %p %c [%t] %m %ex%n"/>
            <Policies>
                <TimeBasedTriggeringPolicy/>
            </Policies>
        </RollingFile>
        <!-- See comment for appender named "LogFile" above. -->
        <RollingFile name="JsonFile" filePattern="json-%d{yyyy-MM-dd HH}00.log">
            <!-- Omitting logger name for this appender, as it's only used by one logger. -->
            <PatternLayout pattern="%d %p [%t] %m %ex%n"/>
            <Policies>
                <TimeBasedTriggeringPolicy/>
            </Policies>
        </RollingFile>
    </Appenders>
    <Loggers>
        <!-- Note that additivity="false" to prevent JSON dumps from being
             sent to the root logger as well. -->
        <Logger name="jsondump" level="debug" additivity="false">
            <appender-ref ref="JsonFile"/>
        </Logger>
        <Root level="debug">
            <appender-ref ref="LogFile"/>
        </Root>
    </Loggers>
</Configuration>

Choosing log file dynamically

I have an application with a number of components that create data in a database. Each component logs what it is doing when creating data. There are many such components and the application is flexible so that it does not always need to execute the same set of these data-creation components each time it runs.
Currently, everything logs to a single file, which is producing files that are starting to get unmanageable. I'd like it if each component could log to a file whose name describes which component wrote it - ComponentA should log to ComponentA-dataCreationPhase.log.
Most solutions I've seen seem to assume that the different loggers will be static so that it's OK to look them up by name, such as LogManager.getLogger("ComponentA"); - assuming a logger with that name is already configured in my log4j2.xml. Other solutions that I've seen have used Routing and ThreadContexts - but I'm not sure this will work since these components will probably all execute in the same thread.
How can I get each component (many are distinct classes, but some are just different instances of the same class, just configured differently) to log to its own log file? Ideally, this would be done based on the existing log4j2.xml file, as the log4j2.xml might have some user-specified configurations that I'd want to propagate to the component-specific loggers, such as logging path and logging level.
I found this answer: https://stackoverflow.com/a/38096181/192801 and used it as the basis of a new function that I added to all Components, via a default method in the top-level interface.
Added to the interface for all components:
public interface Component {
    default Logger createLogger(String logFileName, String oldAppenderName, String newAppenderName, boolean append, Level level)
    {
        LoggerContext context = (LoggerContext) LogManager.getContext(false);
        Configuration configuration = context.getConfiguration();
        Appender oldAppender = configuration.getAppender(oldAppenderName);
        Layout<? extends Serializable> oldLayout = oldAppender.getLayout();

        // create new appender/logger
        LoggerConfig loggerConfig = new LoggerConfig(logFileName, level, false);
        Appender appender;
        // In my case, it is possible that the old appender could
        // either be a simple FileAppender, or a RollingRandomAccessFileAppender,
        // so I'd like the new one to be of the same type as the old one.
        // I have yet to find a more elegant way to create a new Appender
        // of *any* type and then copy all relevant config.
        if (oldAppender instanceof RollingRandomAccessFileAppender)
        {
            int bufferSize = ((RollingRandomAccessFileAppender) oldAppender).getBufferSize();
            RollingRandomAccessFileManager oldManager = (RollingRandomAccessFileManager) ((RollingRandomAccessFileAppender) oldAppender).getManager();
            TriggeringPolicy triggerPolicy = oldManager.getTriggeringPolicy();
            RolloverStrategy rollStrategy = oldManager.getRolloverStrategy();
            Filter filter = ((RollingRandomAccessFileAppender) oldAppender).getFilter();
            // Inject new log file name into filePattern so that file rolling will work properly
            String pattern = ((RollingRandomAccessFileAppender) oldAppender).getFilePattern().replaceAll("/[^/]*-\\%d\\{yyyy-MM-dd\\}\\.\\%i\\.log\\.gz", "/" + logFileName + "-%d{yyyy-MM-dd}.%i.log.gz");
            appender = RollingRandomAccessFileAppender.newBuilder()
                    .withFileName("logs/" + logFileName + ".log")
                    .withFilePattern(pattern)
                    .withAppend(append)
                    .withName(newAppenderName)
                    .withBufferSize(bufferSize)
                    .withPolicy(triggerPolicy)
                    .withStrategy(rollStrategy)
                    .withLayout(oldLayout)
                    .withImmediateFlush(true)
                    .withFilter(filter)
                    .build();
        }
        else
        {
            appender = FileAppender.newBuilder()
                    .withFileName("logs/" + logFileName + ".log")
                    .withAppend(append)
                    .withName(newAppenderName)
                    .withLayout(oldLayout)
                    .setConfiguration(configuration)
                    .withLocking(false)
                    .withImmediateFlush(true)
                    .withIgnoreExceptions(true)
                    .withBufferSize(8192)
                    .withFilter(null)
                    .withAdvertise(false)
                    .withAdvertiseUri("")
                    .build();
        }
        appender.start();
        loggerConfig.addAppender(appender, level, null);
        configuration.addLogger(logFileName, loggerConfig);
        context.updateLoggers();
        return context.getLogger(logFileName);
    }
}
Use of this function happens inside the constructor of the components:
public class ComponentA implements Component
{
    private Logger logger = LogManager.getLogger();

    public ComponentA(String componentName)
    {
        this.logger = this.createLogger(componentName, "MyOriginalFileAppender", componentName + "Appender", true, Level.DEBUG);
    }
}
and elsewhere:
ComponentA fooSubtypeAComponent = new ComponentA("FooA");
ComponentA barSubtypeAComponent = new ComponentA("BarA");
ComponentB fooVariantTypeBComponent = new ComponentB("FooB");
The original Appenders + Loggers snippet from the Log4j config:
<Appenders>
    <!-- STDOUT appender. -->
    <Console name="Console" target="SYSTEM_OUT">
        <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36}#%M - %msg%n" />
    </Console>
    <RollingRandomAccessFile name="MyOriginalFileAppender" fileName="${baseDir}/${defaultLogName}.log" filePattern="${baseDir}/$${date:yyyy-MM}/${defaultLogName}-%d{MM-dd-yyyy}.%i.log.gz">
        <PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %-5level %logger{36}#%M - %msg%n" />
        <Policies>
            <!-- <TimeBasedTriggeringPolicy /> -->
            <!-- Let's try a cron triggering policy configured to trigger every day at midnight -->
            <CronTriggeringPolicy schedule="0 0 0 * * ?"/>
            <SizeBasedTriggeringPolicy size="25 MB" />
        </Policies>
    </RollingRandomAccessFile>
</Appenders>
<Loggers>
    <Root level="debug">
        <!-- The file gets everything at DEBUG level and above. -->
        <AppenderRef ref="MyOriginalFileAppender" level="debug"/>
        <!-- The console shows INFO level and above. -->
        <AppenderRef ref="Console" level="info" />
    </Root>
</Loggers>
The result should be three logs files: logs/FooA.log, logs/BarA.log, logs/FooB.log - one log file for each instance shown above. Still has some kinks to iron out, but I think this will work fine.
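As an aside, the filePattern injection inside createLogger is plain string substitution, so that step can be tested in isolation (a stripped-down version; the pattern literals here are illustrative):

```java
public class PatternRewrite {
    // Replaces the trailing "<name>-%d{yyyy-MM-dd}.%i.log.gz" segment of a
    // RollingFile filePattern with one derived from the new log file name.
    // Assumes logFileName contains no regex replacement metacharacters ($ or \).
    static String injectLogFileName(String filePattern, String logFileName) {
        return filePattern.replaceAll(
                "/[^/]*-%d\\{yyyy-MM-dd\\}\\.%i\\.log\\.gz",
                "/" + logFileName + "-%d{yyyy-MM-dd}.%i.log.gz");
    }

    public static void main(String[] args) {
        System.out.println(injectLogFileName(
                "logs/default-%d{yyyy-MM-dd}.%i.log.gz", "FooA"));
    }
}
```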

Log4j2 does not change logging level at runtime

Yes, I've read all the related questions. I am using log4j2 (tried both version 2.4 and updating to latest, version 2.6.2).
I have a small utility program for customers. I'm keen to keep exposed configuration to a minimum. But for problematic cases, I'd also like to add a -debug flag that enables debug logs at runtime.
Here is my code to enable debug logging
private static void enableDebugLogs() {
    LoggerContext ctx = (LoggerContext) LogManager.getContext();
    LoggerConfig log = ctx.getConfiguration().getRootLogger();
    System.out.println(log.getLevel()); // INFO
    log.setLevel(Level.DEBUG);
    System.out.println(log.getLevel()); // DEBUG
    ctx.updateLoggers();
    System.out.println(ctx.getRootLogger().getLevel()); // DEBUG, hey it works, right?
}
But it does not actually work with any of these cases:
enableDebugLogs();
logger.debug("Debug mode on"); // static, already made logger. Level did not change
LogManager.getLogger(Main.class).debug("Debug"); // Nope, not printing
Logger root = LogManager.getRootLogger();
root.info("Level: " + root.getLevel()); // Level: INFO, should be DEBUG
The utility program is finished usually in less than 30 seconds, so the change should be instant. Here is the log4j2.xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
    <Appenders>
        <Console name="Console" target="SYSTEM_OUT">
            <PatternLayout pattern="%d{HH:mm:ss.SSS} %-5level - %msg%n"/>
        </Console>
        <RollingFile name="File" fileName="program_name.log" filePattern="program_name-archived.log">
            <PatternLayout>
                <Pattern>%d{HH:mm:ss.SSS} %-5level - %msg%n</Pattern>
            </PatternLayout>
            <Policies>
                <SizeBasedTriggeringPolicy size="10 KB" />
            </Policies>
            <DefaultRolloverStrategy min="1" max="1"/>
        </RollingFile>
    </Appenders>
    <Loggers>
        <Root level="info">
            <AppenderRef ref="Console"/>
            <AppenderRef ref="File"/>
        </Root>
    </Loggers>
</Configuration>
Is the problem with using AppenderRefs? Can I somehow tell the Appenders to update the logging level from the Root logger?
Found the real issue. Had to use:
LoggerContext ctx = (LoggerContext) LogManager.getContext(false);
instead of
LoggerContext ctx = (LoggerContext) LogManager.getContext();
The API docs describe the difference as "returns the LoggerContext" versus "returns the current LoggerContext", and I clearly missed this bit of information for the version without the boolean parameter:
"WARNING - The LoggerContext returned by this method may not be the LoggerContext used to create a Logger for the calling class."
You can also switch the logging configuration between two or more log4j2.xml files.
For example, create two files with different configurations, log4j2.xml and log4j2-debug.xml, and pass the chosen file name to the code below.
ConfigurationFactory configFactory = XmlConfigurationFactory.getInstance();
ConfigurationFactory.setConfigurationFactory(configFactory);
LoggerContext ctx = (LoggerContext) LogManager.getContext(false);
ClassLoader classloader = Thread.currentThread().getContextClassLoader();
InputStream inputStream = classloader.getResourceAsStream(logFileName);
ConfigurationSource configurationSource = new ConfigurationSource(inputStream);
ctx.start(configFactory.getConfiguration(ctx, configurationSource));
To reconfigure log4j2 after its initialization, the documentation gives you two ways:
Using the config file, you can enable automatic reconfiguration (the preferred way) so that modifying the file after initialization is reflected at runtime: http://logging.apache.org/log4j/2.x/manual/configuration.html#AutomaticReconfiguration
Using ConfigurationBuilder, you can reconfigure log4j programmatically (I think this is what you are looking for); see the "Reconfigure Log4j Using ConfigurationBuilder with the Configurator" paragraph of this page: http://logging.apache.org/log4j/2.x/manual/customconfig.html
In addition to the above, to debug log4j initialization, set <Configuration status="trace"> at the beginning of the configuration file. Log4j2's internal status logs will print to the console, which may help troubleshooting.
Here is a little-known fact. If I set the root logger level in log4j2.xml (or equivalent properties) to, say, DEBUG, then setting the level dynamically to something lower had no effect for me. Only when I changed the setting in log4j2.xml to TRACE did changing the log level with the code below work:
Level level = Level.valueOf(this.consoleLogLevel.toUpperCase());
//Dynamically set the log level for ALL loggers
LoggerContext ctx = (LoggerContext) LogManager.getContext(false);
Configuration config = ctx.getConfiguration();
LoggerConfig loggerConfig = config.getLoggerConfig(LogManager.ROOT_LOGGER_NAME);
loggerConfig.setLevel(level);
ctx.updateLoggers();
And here is my simple configuration file:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
    <Appenders>
        <Console name="Console" target="SYSTEM_OUT">
            <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n"/>
        </Console>
    </Appenders>
    <Loggers>
        <Root level="TRACE">
            <AppenderRef level="TRACE" ref="Console" />
        </Root>
    </Loggers>
</Configuration>
And here is a TestNG test you can refactor to verify the above:
@Test
public void givenCut_whenLoggingWithERRORLevel_LoggingOutputIsAtERRORLevelAndHigherOnly() {
    try {
        lcaProperties.setLogLevel(Level.ERROR.name());
        GlobalLogger globalLogger = GlobalLogger.builder().lcaServiceProperties(lcaProperties).build();
        assertNotNull(globalLogger.getConsoleLogger());
        assertNotNull(globalLogger.getFunctionalDbLogger());
        assertEquals(globalLogger.getConsoleLogger().getLevel(), Level.ERROR);
        assertEquals(log.getLevel(), Level.ERROR);
        System.out.println("log.getLogLevel() = " + log.getLevel());
        System.out.println("globalLogger.getConsoleLogger().getLevel() = " + globalLogger.getConsoleLogger().getLevel());
        log.fatal("logLevel::" + log.getLevel() + "::log.fatal");
        log.error("logLevel::" + log.getLevel() + "::log.error");
        log.warn("logLevel::" + log.getLevel() + "::log.warn");
        log.info("logLevel::" + log.getLevel() + "::log.info");
        log.debug("logLevel::" + log.getLevel() + "::log.debug");
        log.trace("logLevel::" + log.getLevel() + "::log.trace");
        globalLogger.getConsoleLogger().fatal("logLevel::" + globalLogger.getConsoleLogger().getLevel() + "::globalLogger.getConsoleLogger().fatal");
        globalLogger.getConsoleLogger().error("logLevel::" + globalLogger.getConsoleLogger().getLevel() + "::globalLogger.getConsoleLogger().error");
        globalLogger.getConsoleLogger().warn("logLevel::" + globalLogger.getConsoleLogger().getLevel() + "::globalLogger.getConsoleLogger().warn");
        globalLogger.getConsoleLogger().info("logLevel::" + globalLogger.getConsoleLogger().getLevel() + "::globalLogger.getConsoleLogger().info");
        globalLogger.getConsoleLogger().debug("logLevel::" + globalLogger.getConsoleLogger().getLevel() + "::globalLogger.getConsoleLogger().debug");
        globalLogger.getConsoleLogger().trace("logLevel::" + globalLogger.getConsoleLogger().getLevel() + "::globalLogger.getConsoleLogger().trace");
    } catch (Exception e) {
        log.error(e); // log before failing so the exception is actually recorded
        fail();
    }
}

Log4j 2.0 - No Logs appearing in the Log Files - Trying Multiple Loggers in the same Class using log4j2.xml

Below is the log4j2.xml file that I have created. I have configured async_file.log for asynchronous logging and regular_file.log for regular, synchronous logging. The problem is that the log files get created, but they remain empty. All logs go to server.log (JBoss) instead of the two files I configured (async_file.log and regular_file.log).
Please let me know why the logs are NOT going to the log files that I have configured. Please help me with this or give me some direction or hint.
I am calling the two different loggers in the same class file by name DCLASS as shown below:
private static final transient Logger LOG = Logger.getLogger(DCLASS.class);
private static final transient Logger ASYNC_LOG = Logger.getLogger("ASYNC");
I have included the following jars in the Class Path:
1. log4j-api-2.0-beta8.jar
2. log4j-core-2.0-beta8.jar
3. disruptor-3.0.0.beta1.jar
My log4j2.xml is as below:
<?xml version="1.0" encoding="UTF-8"?>
<configuration status="INFO">
    <appenders>
        <!-- Async Loggers will auto-flush in batches, so switch off immediateFlush. -->
        <FastFile name="AsyncFastFile" fileName="../standalone/log/async_file.log"
                  immediateFlush="false" append="true">
            <PatternLayout>
                <pattern>%d %p %class{1.} [%t] %location %m %ex%n</pattern>
            </PatternLayout>
        </FastFile>
        <FastFile name="FastFile" fileName="../standalone/log/regular_file.log"
                  immediateFlush="true" append="true">
            <PatternLayout>
                <pattern>%d %p %class{1.} [%t] %location %m %ex%n</pattern>
            </PatternLayout>
        </FastFile>
    </appenders>
    <loggers>
        <!-- pattern layout actually uses location, so we need to include it -->
        <asyncLogger name="ASYNC" level="trace" includeLocation="true">
            <appender-ref ref="AsyncFastFile"/>
        </asyncLogger>
        <root level="info" includeLocation="true">
            <appender-ref ref="FastFile"/>
        </root>
    </loggers>
</configuration>
The reason the logs were not going to the log files is that I was using 'Logger' instead of 'LogManager'.
In the code, I had:
private static final transient Logger ASYNC_LOG = Logger.getLogger("ASYNC");
The code should have been:
private static final transient Logger ASYNC_LOG = LogManager.getLogger("ASYNC");
With 'Logger.getLogger' the call resolves against the Log4j 1.x API, while 'LogManager.getLogger' resolves against the Log4j 2 API. Since everything was configured for Log4j 2, changing 'Logger' to 'LogManager' made the logs go to the log files as expected.
