Conditional logging level in log4j.properties - Java

Is it possible to have conditions in log4j.properties? I have a situation where I want the logging level set to INFO in the production environment and DEBUG locally. Is it possible to read environment variables in log4j.properties?

No, you have to have two different log4j.properties files.

Configuring logging is something that should happen as part of the deployment, not as part of the build, i.e. you should NOT create multiple builds for different log configurations; the risk of also introducing other differences into the artifacts is too big.
Create ONE build containing a default configuration, possibly the one you want to use in production.
Implement a way to find and use an alternative configuration without changing your artifact. Most of the time this is achieved by adding an additional directory to the classpath of your application and storing a log4j configuration there. You can use the default initialization of log4j by using a configuration format that has higher precedence than the one contained in the artifact (log4j looks for log4j.xml before it falls back to log4j.properties). This also allows you to reconfigure logging without a new deployment, which can be very helpful when troubleshooting.
Alternatively, you can provide the location of the configuration file to use via a system property at startup: -Dlog4j.configuration=log4j-prod.xml (borrowed from Keerthi Ramanathan's answer).
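As a minimal sketch of the classpath approach (the directory, jar and main-class names here are illustrative, not from the answer), an external conf directory holding a log4j.xml can simply be put in front of the artifact on the classpath:
java -cp /etc/myapp/conf:myapp.jar com.example.Main   # /etc/myapp/conf holds the environment-specific log4j.xml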

You can prepare different builds and decide which log4j.properties you want to include at build time, for example using Maven parameters, profiles or any other way.
There is no way to declare a condition in log4j.properties.

No.
But just to outline some other options:
a) I would encourage you to have a look at logback, log4j's successor, which is used through the SLF4J facade and can reload its configuration at runtime. The relevant documentation can be found here.
b) If you have a build process in place (Ant/Maven) you can do the replacement as part of the build. If you use Maven you can set up a profile and apply resource filtering in the build cycle.
c) Load the log4j files from a per-environment conf directory. The idea is that, once set for an environment, the files change minimally over time. You maintain both in your repository and, as part of your deployment process, ensure that added/deleted files and properties get applied/removed.

What I would suggest, as said in the comment, is to have a separate version of the log4j properties file for every environment and to follow a naming convention for easy maintenance. Say, for the dev environment it would be log4j-dev.xml and for production log4j-prod.xml. Now, you can configure the appropriate file to pick up at runtime using
-Dlog4j.configuration=log4j-prod.xml
during server startup, so that the appropriate configuration file will be picked up by log4j.

You can use programmatic configuration with log4j, which gives you more control over what options to use in which environment. You can have your own configuration files and use your own logic to convert them into a log4j configuration. The downside is that you need to call your init() somewhere in your application. This answer provides a good reference.
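As a minimal sketch of that idea (the app.env property name, pattern and chosen levels are assumptions, not from the answer), the environment can be inspected once at startup and the root logger configured accordingly:

import org.apache.log4j.ConsoleAppender;
import org.apache.log4j.Level;
import org.apache.log4j.Logger;
import org.apache.log4j.PatternLayout;

public class LogInit {
    // Call once, early in startup, before other classes obtain loggers.
    public static void init() {
        // "app.env" is a hypothetical property; any environment detection would do here.
        String env = System.getProperty("app.env", "dev");
        Logger root = Logger.getRootLogger();
        root.removeAllAppenders();
        root.addAppender(new ConsoleAppender(new PatternLayout("%d %-5p %c - %m%n")));
        root.setLevel("prod".equals(env) ? Level.INFO : Level.DEBUG);
    }
}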

I used this approach when I had a similar question: a default log level if nothing is explicitly specified, and an option to override.
So, I added a log4j.properties file in application resources.
log4j.rootLogger=ALL, stdout
...
log4j.appender.stdout.Threshold=INFO
...
And then added more log config properties (log4j-n.properties, for n in {d, i, w, e}) defining log levels at debug, info, warning and error. Now, during startup I would supply the config file explicitly if I wanted to override the default.
java ... -Dlog4j.configuration=file:///<path>/log4j-n.properties ...
This would override any config I had in the default log4j.properties.
Later I switched to a different approach and removed all the extra config files. In the log4j.properties file in resources, I used a JVM argument placeholder:
log4j.appender.stdout.Threshold=${app.log.level}
And supplied it as a JVM argument:
java ... -Dapp.log.level=<LOG-LEVEL> ...
Voila!
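One caveat with the placeholder approach: if -Dapp.log.level is omitted, there is nothing for log4j to substitute. A small sketch of one way to guard against that (the class name and default level are assumptions) is to set the property programmatically before the first logger is requested:

public class Main {
    public static void main(String[] args) {
        // Assumed default: fall back to INFO when -Dapp.log.level is not passed.
        if (System.getProperty("app.log.level") == null) {
            System.setProperty("app.log.level", "INFO");
        }
        // log4j reads log4j.properties (and resolves ${app.log.level}) the first
        // time a logger is requested, so the property must be set before this call.
        org.apache.log4j.Logger log = org.apache.log4j.Logger.getLogger(Main.class);
        log.info("starting up");
    }
}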

Related

log4j2 log file is not generated

The log file is generated when I run the code within the IDE (IntelliJ IDEA).
As soon as I create a runnable jar of the code and try to run the jar, the logs are not generated.
I have made sure the log4j2.xml file is part of the classpath.
Is there anything extra I have to do while creating the jar in IntelliJ IDEA?
Taken from the FAQ: How do I debug my configuration?
First, make sure you have the right jar files on your classpath. You need at least log4j-api and log4j-core.
Next, check the name of your configuration file. By default, log4j2 will look for a configuration file named log4j2.xml on the classpath. Note the “2” in the file name! (See the configuration manual page for more details.)
From log4j-2.9 onward
From log4j-2.9 onward, log4j2 will print all internal logging to the console if the system property log4j2.debug is either defined with no value or its value is true (ignoring case).
Prior to log4j-2.9
Prior to log4j-2.9, there are two places where internal logging can be controlled:
If the configuration file is found correctly, log4j2 internal status logging can be controlled by setting the status attribute on the Configuration element of that file (for example status="trace"). This will display detailed log4j2-internal log statements on the console about what happens during the configuration process. This may be useful to troubleshoot configuration issues. By default the status logger level is WARN, so you only see notifications when there is a problem.
If the configuration file is not found correctly, you can still enable log4j2 internal status logging by setting system property -Dorg.apache.logging.log4j.simplelog.StatusLogger.level=TRACE.
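For example (the jar name is only a placeholder), on 2.9+ the internal status logging can be switched on at launch with:
java -Dlog4j2.debug -jar your-app.jar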

log4j logger writing into jxl.log file

I am using log4j for logging. At the same time I am also using JXL to read/write Excel files.
But instead of writing logs into the log4j log file, it is writing into jxl.log.
What could the issue be?
Looks like you have been using jxl-2.6.3.jar or a similar version.
Log4j picks up the first configuration file with a default file name (i.e. log4j.xml or log4j.properties) on your classpath if you haven't specified a specific name via JVM parameters. As jxl-2.6.3.jar contains a log4j.xml, you ended up printing everything to jxl.log as defined in that log4j.xml.
The best way to deal with this kind of problem is to run your application with the -Dlog4j.debug JVM parameter. This prints a short snippet when log4j is initialized, for example:
log4j: Using URL [jar:file:/C:/YourApp/WEB-INF/lib/jxl-2.6.3.jar!/log4j.xml] for automatic log4j configuration.
log4j: Preferred configurator class: org.apache.log4j.xml.DOMConfigurator
...{Blah Blah Blah}
There are many ways in which you can solve this problem.
Use a newer version of jxl which doesn't contain log4j.xml.
Make sure your log4j.properties file is at the front of the classpath.
Remove the log4j.xml from jxl-2.6.3.jar (dirty solution).
Pass the configuration file name as a VM parameter: -Dlog4j.configuration=log4j.properties. This at least makes sure the log4j.xml inside jxl-2.6.3.jar will not be used. (But what if another jar contains a file with the same log4j.properties name?)
Rename your log4j.properties file to log4j-yourApp.properties and add the VM parameter -Dlog4j.configuration=log4j-yourApp.properties. This would definitely help, and this is how it should be done to avoid these kinds of situations.
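As an illustrative startup command (the jar name is a placeholder), the renamed configuration can be combined with the debug flag from above:
java -Dlog4j.debug -Dlog4j.configuration=log4j-yourApp.properties -jar your-app.jar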
More details on Log4j here

Issue overriding application properties in Spring-boot (profile-specific) application launched with PropertiesLauncher

I'm having difficulty trying to override a property declared in a profile-specific application properties file on the classpath with another value declared in an overrides file on the file system.
I have an auto-configured Spring-boot application (that is, using @EnableAutoConfiguration) that has multiple profiles, which I launch using PropertiesLauncher rather than JarLauncher (the reason having to do with deployment constraints - I need to deploy an exploded directory rather than an archive into a read-only filesystem.)
Within the root of my application, I have some profile-specific application properties, for example:
application-dev.properties
application-qa.properties
application-prd.properties
And let's say, for the sake of argument that application-dev.properties contains:
foo.bar=baz
foo.baz=other
For any environment, it may be necessary to override an existing property, as well as supply an absent one (like a production password, for example), and the issue I'm seeing is with overriding properties already declared in an application-${profile}.properties file on the classpath. (Supplying properties not present in the classpath file works fine, this is not the issue.)
Say I have an overrides properties file in a file system location such as:
/local/appname/dev/overrides/application.properties
and I want to override the property, foo.bar, as well as declare a new property, foo.password.
Therefore the contents of the overrides file are:
foo.bar=overridden-value
foo.password=something
When I launch the application, I use a command line something like this:
java -Dspring.config.location=file:/local/appname/dev/overrides/ \
     -Dspring.profiles.active=dev \
     org.springframework.boot.loader.PropertiesLauncher \
     --debug &
The issue I am seeing is that although foo.password, the property not declared in the application-dev.properties file is picked up, the override of foo.bar is ignored - I still see the value, baz from application-dev.properties rather than the value, overridden-value from /local/appname/dev/overrides/application.properties.
With the --debug option enabled, I can see the ConfigFileApplicationListener logging that it has loaded both the overrides file (from the filesystem) and the profile-specific file (from the classpath), in that order.
I'm tempted into the perhaps naïve conclusion that because the overrides file is listed first, it is being loaded first and then overridden by the 'default' profile-specific file from the classpath, which is listed later. I do appreciate, however, that the order of listing in the log doesn't necessarily correlate with behaviour. I have also tried varying the order of paths declared in the spring.config.location property, so that classpath: is listed before file:..., but this hasn't helped, and I'm not convinced it would anyway, given that the Spring-boot documentation clearly states that the default properties locations are always searched even if you supply a value for spring.config.location.
The Spring-boot documentation is very specific about the order that properties are resolved for a Spring-boot executable JAR, in descending order of precedence:
1. Command line arguments.
2. Java System properties (System.getProperties()).
3. OS environment variables.
4. JNDI attributes from java:comp/env.
5. A RandomValuePropertySource that only has properties in random.*.
6. Application properties outside of your packaged jar (application.properties including YAML and profile variants).
7. Application properties packaged inside your jar (application.properties including YAML and profile variants).
8. @PropertySource annotations on your @Configuration classes.
9. Default properties (specified using SpringApplication.setDefaultProperties).
Take note of items 6 and 7: properties outside your packaged jar take precedence over properties inside it.
What's not stated, as far as I can see, and which may be the source of my confusion/issue, is what happens when you're not using a JAR but an exploded directory (and therefore PropertiesLauncher.)
If the behaviour of an exploded directory were consistent with what's stated for a JAR, I'd expect that the values of properties declared in /local/appname/dev/overrides/application.properties would override any of the same name declared in classpath:application-dev.properties, but this doesn't seem to be the case.
Also noted from the Spring-boot documentation (appendix C.4 on PropertiesLauncher) is mention of the loader.home property, which is described as '... [the] Location of additional properties file, e.g. /opt/app (defaults to ${user.dir})'.
So I tried using loader.home instead of spring.config.location, but to no avail.
(Update: I also tried using loader.config.location and I have two notes: it seems to want a file rather than a directory (so its behaviour is not analogous with spring.config.location), and when I did supply a file path rather than the parent directory, it still didn't help.)
Can anyone spot what I'm doing wrong, or what incorrect assumption(s) I'm making?
Thanks, Dave, your suggestion was 100% correct.
If I rename the properties file in /local/appname/dev/overrides to application-dev.properties then the property values from that file do override the ones in classpath:application-dev.properties.
I was sure I had tried this combination yesterday, but I think what must have stopped it working was that I was playing around with specifying spring.config.location and got it wrong, so it wasn't looking for the override file in the right place.
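To make the fix concrete, with the paths from the question and the rename being the only change, the override file and launch command end up looking roughly like this:
/local/appname/dev/overrides/application-dev.properties   (renamed from application.properties)
java -Dspring.config.location=file:/local/appname/dev/overrides/ \
     -Dspring.profiles.active=dev \
     org.springframework.boot.loader.PropertiesLauncher --debug &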

hadoop flume log4j configuration

If you run a Hadoop Flume node, by default it generates logs under /var/log/flume using log4j. The files will look like:
/var/log/flume/flume-$FLUME_IDENT_STRING-$command-$HOSTNAME.log
According to the Flume user guide here, the only way to change the Flume log configuration is via flume-daemon.sh, which runs the Flume node using Flume environment variables like:
export FLUME_LOGFILE=flume-$FLUME_IDENT_STRING-$command-$HOSTNAME.log
export FLUME_ROOT_LOGGER="INFO,DRFA"
export ZOOKEEPER_ROOT_LOGGER="INFO,zookeeper"
export WATCHDOG_ROOT_LOGGER="INFO,watchdog"
The questions are:
If I want to change the log level from INFO to DEBUG, is this the only place to do it?
Is there a configuration file somewhere I can do this in?
What if I want to set some packages' log level to DEBUG while others stay at INFO?
Check whether a log4j.properties or other log*-related file exists in which to set these variables; that will also let you have some components log verbosely at DEBUG while others stay at INFO.
I noticed that under /etc/flume/conf.empty there is a log4j.properties. After copying it to /etc/flume/conf and restarting the Flume node service, the log4j.properties file takes effect.
The overriding order is flume-env.sh -> flume-daemon.sh -> log4j.properties.
E.g. if you set FLUME_ROOT_LOGGER to DEBUG in flume-daemon.sh, it will override whatever setting you have for the root logger in your log4j.properties.
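A hedged sketch of the per-package part of that log4j.properties (the Flume package name is illustrative; check the logger names your Flume version actually uses):
# root stays at INFO, one package is turned up to DEBUG
log4j.rootLogger=INFO,DRFA
log4j.logger.com.cloudera.flume.agent=DEBUG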

Limit logging to only one jar log4j

I have a project which is in turn used by several other projects. I want log4j to log only my logs to a file that I have specified in the properties file. The other projects use their own logging mechanisms and I have no control over them. My log4j files should not affect the other projects' logging. How should I configure my log4j properties file?
So far what I'm doing is setting log4j.rootLogger = ERROR and, for my module, log4j.logger.com.xyz.myproject = INFO, FILE. Will this work without affecting the other projects' loggers? Or possibly limit logging to only my jar?
Thanks
It depends on the package structure of the other projects. Supposing that:
loggers from other projects are created with Logger.getLogger(ClassA.class), AND
some of them rely on the root logger configuration (i.e. have no specific log4j.category.loggerName settings), AND
these projects contain subpackages of the package used by your project (i.e. your project's package is com.abc.def and other projects have packages deeper in the hierarchy, such as com.abc.def.ghi),
THEN changing the com.abc.def logging level would affect the other projects: they'll start logging at the level defined for com.abc.def.
Verify that this is not the case and you should be safe.
I suppose your jar is entirely contained in your own package (e.g. com.foo.mypackage). In this case, it is enough to add to your log4j configuration something like:
# Print only messages of priority WARN or above in the package com.foo
log4j.category.com.foo=WARN
# Print only messages of priority DEBUG or above in the package com.foo.mypackage
log4j.category.com.foo.mypackage=DEBUG
Regards,
M.
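A hedged footnote to both answers, reusing the package name from the question: if your events should go only to your FILE appender and not also bubble up to whatever appenders the root logger defines, log4j's additivity flag is the standard way to express that.
log4j.logger.com.xyz.myproject=INFO, FILE
# assumption: you do not want these events duplicated into the root logger's appenders
log4j.additivity.com.xyz.myproject=false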

Categories

Resources