Log4j's default initialization goes through a procedure to find a URL to configure itself from. Afterward, how can you find out which URL was ultimately used, without having to code the same procedure yourself? (If you code it yourself, you might not get it exactly the same as log4j does, and it might also change in a future release.)
If you are willing to use AspectJ LTW (load-time weaving), you can take a look at the static initialisation of LogManager mentioned by Ian Roberts. In log4j 1.2.14 it looks like this:
static {
    // (...)
    // if there is no default init override, then get the resource
    // specified by the user or the default config file.
    if (override == null || "false".equalsIgnoreCase(override)) {
        // (...)
        URL url = null;
        // (...)
        // If we have a non-null url, then delegate the rest of the
        // configuration to the OptionConverter.selectAndConfigure method.
        if (url != null) {
            LogLog.debug("Using URL [" + url + "] for automatic log4j configuration.");
            OptionConverter.selectAndConfigure(
                url, configuratorClassName, LogManager.getLoggerRepository()
            );
        } else {
            LogLog.debug("Could not find resource: [" + configurationOptionStr + "].");
        }
    }
}
Obviously, if a default URL can be determined, OptionConverter.selectAndConfigure(URL, ..) will be called at some point within the static block in order to initialise log4j with that URL.
By means of AspectJ it is pretty simple to catch that method invocation:
import java.net.URL;

import org.apache.log4j.LogManager;
import org.apache.log4j.helpers.OptionConverter;

public aspect Log4jAspect {
    before(URL defaultURL) :
        within(LogManager) &&
        cflow(staticinitialization(LogManager)) &&
        call(* OptionConverter.selectAndConfigure(URL, ..)) &&
        args(defaultURL, ..)
    {
        System.out.println("log4j default URL = " + defaultURL);
    }
}
In prose this code means:
If we are within class LogManager and
within the control flow of the static class initialisation and
OptionConverter.selectAndConfigure is called,
then capture the first argument (the URL) and
print it to the console (you could just as well do something else).
If there is no default URL, nothing will be printed. Instead of printing the URL you could assign it to a static member of any class or whatever you like.
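For completeness, a minimal load-time-weaving setup might look like this (file locations and the weaver path are illustrative): declare the aspect in a META-INF/aop.xml on the classpath,

<!-- META-INF/aop.xml (illustrative) -->
<aspectj>
    <aspects>
        <aspect name="Log4jAspect"/>
    </aspects>
</aspectj>

and start the JVM with -javaagent:/path/to/aspectjweaver.jar so the aspect is woven into LogManager at class-load time.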
This is a solution to your problem; I tested it and it works. I would be happy to receive the bounty for answering your question, even though the solution may use a technology you did not have in mind. But it solves the problem. :-)
Edit: It is also possible to explicitly intercept the log call in the case that no default URL is found, even though I do not think it is necessary. I just wanted to mention it.
The procedure used is hard-coded in a static initializer block in LogManager, so there doesn't appear to be a way to hook into it. The only place where it tells you what's going on is
LogLog.debug("Using URL ["+url+"] for automatic log4j configuration.");
but LogLog itself is hard-coded to use System.out.println for these messages, so the only possibility I can see is to switch on debugging (-Dlog4j.debug=true), replace System.out via System.setOut before log4j is initialized, and then parse the debug log message. But that is probably even more fragile than coding the config procedure yourself.
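For what it is worth, a rough sketch of that interception could look like the following (assuming -Dlog4j.debug=true is set, the sniffer is installed before the first Logger is obtained, and the message text stays exactly as quoted above; the class name Log4jUrlSniffer is made up):

import java.io.PrintStream;

public class Log4jUrlSniffer {

    public static volatile String configUrl;

    public static void install() {
        final PrintStream original = System.out;
        System.setOut(new PrintStream(original, true) {
            @Override
            public void println(String line) {
                // LogLog prints its debug messages via System.out.println,
                // so we can watch for the "Using URL [...]" message here.
                if (line != null && line.contains("Using URL [")) {
                    int start = line.indexOf('[') + 1;
                    int end = line.indexOf(']', start);
                    if (end > start) {
                        configUrl = line.substring(start, end);
                    }
                }
                super.println(line);
            }
        });
    }
}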
Even then, there may have been other programmatic configuration applied after the default configuration procedure (e.g. a Spring Log4jConfigListener) - there isn't necessarily a single configuration URL as such.
It might be worth putting in a log4j feature request to factor out the config file search code into a static method that you can call from elsewhere, but that wouldn't help when you might have to cope with earlier versions of log4j.
If you can set up your own Configurator you can do something like this:
Set the Java system property -Dlog4j.configuratorClass=MyConfigurator
Then have your configurator instance intercept the doConfigure call:
import java.net.URL;

import org.apache.log4j.PropertyConfigurator;
import org.apache.log4j.spi.Configurator;
import org.apache.log4j.spi.LoggerRepository;

public class MyConfigurator implements Configurator {

    public static URL url;

    @Override
    public void doConfigure(URL url, LoggerRepository repository) {
        // remember which URL log4j was configured from
        MyConfigurator.url = url;
        // delegate the actual configuration to the default configurator
        new PropertyConfigurator().doConfigure(url, repository);
    }
}
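A small usage sketch, assuming the system property above is set (the class name ShowConfigUrl is made up): obtaining any logger triggers log4j's default initialisation, after which the captured URL can be read back.

import java.net.URL;

import org.apache.log4j.Logger;

public class ShowConfigUrl {

    public static void main(String[] args) {
        // Obtaining any logger triggers log4j's default initialization,
        // which routes through MyConfigurator.doConfigure(..).
        Logger log = Logger.getLogger(ShowConfigUrl.class);
        URL used = MyConfigurator.url;
        System.out.println("log4j was configured from: " + used);
    }
}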
Insert this into your Java call:
-Dlog4j.configDebug=true
That's all.
Although there are a few other similar questions, they all seem to misunderstand logging configuration with regards to wildcards / root loggers. I have a different issue.
I have a code structure as follows:
Service 1
com.some.package.service1
|-> subpackage1
|-> subpackage2
Service 2
com.some.package.service2
|-> subpackage1
|-> subpackage2
I would like to set the log levels of, say, all loggers of classes in subpackage1 to DEBUG, and the log levels of all loggers of classes in subpackage2 to WARN, while leaving the rest at, say, INFO.
I had hoped that I would be able to simply configure something like the following:
logging:
level:
com.some.package: INFO
com.some.package.*.subpackage1: DEBUG
com.some.package.*.subpackage2: WARN
Unfortunately, this does not work at all - the configuration with a wildcard in it is silently ignored. Since I have numerous services, I don't want to clog up my configuration file with numerous package logging definitions which also have to be updated whenever I add a new service. Changing the code structure is unfortunately not an option for me.
Is it possible to do this using slf4j, either through plain configuration or programmatically (ideally using only the slf4j API, but I could live with an implementation-specific solution)?
If not, is there an alternative solution?
I managed to do this, though unfortunately only by using the specific Logback Classic Logger implementation.
The end result is something like:
@Component
public class PackageScanningLoggingConfiguration {

    // Fields
    // ...

    @PostConstruct
    public void postConstruct() {
        initLoggingLevels();
    }

    private void initLoggingLevels() {
        String basePackage = getBasePackage();
        List<String> candidatePackageNames = Arrays.stream(Package.getPackages())
                .filter(pkg -> pkg.getName().startsWith(basePackage))
                .filter(pkg -> {
                    String packageName = pkg.getName();
                    return packageName.contains(subpackage1Package)
                            || packageName.contains(subpackage2Package);
                })
                .map(Package::getName)
                .collect(Collectors.toList());

        candidatePackageNames.forEach(this::setConfiguredLoggingLevel);
    }

    private void setConfiguredLoggingLevel(final String packageName) {
        // ch.qos.logback.classic.Logger;
        // org.slf4j.LoggerFactory;
        val logger = (Logger) LoggerFactory.getLogger(packageName);
        if (logger != null) {
            if (packageName.endsWith(subpackage1Package)) {
                logger.setLevel(Level.toLevel(subpackage1Level));
            } else if (packageName.endsWith(subpackage2Package)) {
                logger.setLevel(Level.toLevel(subpackage2Level));
            }
        }
    }

    // resolve com.some.package.* to specific instance
    private String getBasePackage() {
        // In my case, I have only two "types" of services, so this
        // returns either com.some.package.service1 or com.some.package.service2
    }
}
Unfortunately, due to Logback internals, it seems that for this solution to work, my root package (com.some.package) has to be set to TRACE and all others have to be lower than that - the other way around does not seem to work.
Due to this, I don't like this solution very much (it works, but it's kind of weird). I'll leave this question open for a bit in case anyone has an idea of a better way to do this.
logEvent.getContextData().size() == 0 and logEvent.getContextStack().size() == 0 but otherwise the attributes of the LogEvent are fine in:
public class MyAppender extends AbstractAppender {
    .........

    @Override
    public void append(LogEvent ev) {
        ev.getContextData().size();  // <=== how can this equal 0?
        ev.getContextStack().size(); // <=== how can this equal 0?
        ....
    }
}
I cannot figure out why this is the case. Do I need to create an AbstractConverter? An AbstractFilter? Is my log4j2.xml or maybe the plugin config wrong?
Based on our discussion in the comments, it looks like what you're actually after is the location information. In a custom appender, this can be obtained from the StackTraceElement returned by LogEvent.getSource(). You should be aware that obtaining this information is expensive, though (see the documentation).
Edit
As you've stated, location information can be very useful, so it's a shame that it's expensive to obtain. Unfortunately, there's nothing Log4j can do about that - it's down to Java's architecture.
One cheaper method that's commonly used to obtain at least the class name is to ensure that the Logger being logged to is named after the class in which it's used (see the documentation here). Then you can obtain the class name in an appender by calling LogEvent.getLoggerName(). Note, however, that if you're writing a general Appender implementation that may be reused across several projects, it would be bad practice to assume that this will always be the calling class's name. Instead, it should be interpreted as "the functional context that the logging call came from, as determined by the application".
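For illustration, a custom Log4j 2 appender that reads both pieces of information might look roughly like this (the class name LocationAwareAppender is made up, and the plugin/builder boilerplate is omitted); note that getSource() may return null if location information is not available, for example with async loggers that do not include location:

import java.io.Serializable;

import org.apache.logging.log4j.core.Filter;
import org.apache.logging.log4j.core.Layout;
import org.apache.logging.log4j.core.LogEvent;
import org.apache.logging.log4j.core.appender.AbstractAppender;

public class LocationAwareAppender extends AbstractAppender {

    protected LocationAwareAppender(String name, Filter filter,
                                    Layout<? extends Serializable> layout) {
        super(name, filter, layout);
    }

    @Override
    public void append(LogEvent event) {
        // Cheap: the logger name, which by convention is the calling class.
        String loggerName = event.getLoggerName();

        // Expensive: the source location (class, method, file, line).
        StackTraceElement source = event.getSource();
        if (source != null) {
            System.out.println(loggerName + " logged from "
                    + source.getClassName() + "." + source.getMethodName()
                    + " (line " + source.getLineNumber() + ")");
        }
    }
}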
I currently have to work on an existing Java application in which I could identify some .java files as complete dead code and hence delete them.
The problems were that there were no comments or other hints that these classes were not for production purposes, the call hierarchy was confusing and the guy who wrote this was - of course - already gone.
I only found out by placing System.out.println debugging messages by hand in the code that indicated whether a constructor or method was entered. If it was a class that was instantiated at runtime, that was rather fine, because then I could just place a
public class TestObjectClass {

    public TestObjectClass(String whatever) { // this is the constructor
        System.out.println("Constructor of class " + getClass() + " entered");
        /* some other code followed */
    } // end constructor

} // end class
and if the log did not show this message I knew that the whole class was never used, so placing debugging statements in all the methods was not necessary (there were no static methods in those classes).
But there were classes of which no object was created with a new statement and thus no explicit constructor was there. Hence I had to place the following in every method in order to find out whether it was used or not:
public class ClassThatIsNotInstantiated {

    public void someMethodXYZ() {
        System.out.println("someMethodXYZ in " + getClass() + " is entered.");
        /* some other code followed */
    } // end method

} // end class
Could that have been done easier and more convenient?
I was thinking of the Logger class of course, which at least indicates in which class execution currently is, however...
It does not indicate the class name if it is a runnable .jar that is e.g. executed by Jenkins.
I could not find a way yet to indicate the method with Logger.
Also, with Logger I would have to place a this.logger message in every method; at least I do not know of other possibilities.
Log4j can do this. Use PatternLayout with the "C" and "M" conversion patterns. But be careful - they hurt performance (especially method names: to get the method name, log4j has to capture a stack trace). More info:
https://logging.apache.org/log4j/2.x/manual/layouts.html
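For illustration, a Log4j 2 configuration that includes the class, method and line number in every console message might use a pattern along these lines (the appender name and the exact pattern are just an example):

<Configuration>
    <Appenders>
        <Console name="Console" target="SYSTEM_OUT">
            <!-- %C = fully qualified class, %M = method name, %L = line number;
                 all three require the (expensive) location lookup -->
            <PatternLayout pattern="%d{ISO8601} %-5p %C.%M:%L - %m%n"/>
        </Console>
    </Appenders>
    <Loggers>
        <Root level="info">
            <AppenderRef ref="Console"/>
        </Root>
    </Loggers>
</Configuration>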
But in your case it may be a better idea to use aspects. Check AspectJ. It allows you to print logs for each executed method.
As already indicated, you could have automated injection of debugging statements by using AOP and AspectJ in particular.
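For illustration, such an aspect might look roughly like this (the base package com.yourapp is a placeholder for your application's packages):

public aspect MethodTraceAspect {

    // Matches every method or constructor execution in the application's
    // packages; adjust "com.yourapp" to the real base package.
    pointcut traced() :
        (execution(* com.yourapp..*(..)) || execution(com.yourapp..*.new(..)))
        && !within(MethodTraceAspect);

    before() : traced() {
        System.out.println("Entered " + thisJoinPointStaticPart.getSignature());
    }
}

A single before advice prints the signature of every executed method and constructor, so no hand-written System.out.println calls are needed.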
Otherwise, you could have used a tool like UCDetector.
I have essentially the same question as here but am hoping to get a less vague, more informative answer.
I'm looking for a way to configure DropWizard programmatically, or at the very least, to be able to tweak configs at runtime. Specifically I have a use case where I'd like to configure metrics in the YAML file to be published with a frequency of, say, 2 minutes. This would be the "normal" default. However, under certain circumstances, I may want to speed that up to, say, every 10 seconds, and then throttle it back to the normal/default.
How can I do this, and not just for the metrics.frequency property, but for any config that might be present inside the YAML config file?
Dropwizard reads the YAML config file and configures all the components only once, on startup. Neither the YAML file nor the Configuration object is ever used again. That means there is no direct way to reconfigure at run-time.
It also doesn't provide special interfaces/delegates where you can manipulate the components. However, you can access the component objects (usually; if not, you can always send a pull request) and configure them manually as you see fit. You may need to read the source code a bit, but it's usually easy to navigate.
In the case of metrics.frequency you can see that the MetricsFactory class creates ScheduledReporterManager objects per metric type using the frequency setting, and it doesn't look like you can change them at runtime. But you can probably work around it somehow, or even better, modify the code and send a pull request to the dropwizard community.
Although this feature isn't supported out of the box by dropwizard, you're able to accomplish this fairly easily with the tools they give you. Note that the below solution definitely works on config values you've provided, but it may not work for built-in configuration values.
Also note that this doesn't persist the updated config values to the config.yml. However, this would be easy enough to implement yourself simply by writing to the config file from the application. If anyone would like to write this implementation feel free to open a PR on the example project I've linked below.
Code
Start off with a minimal config:
config.yml
myConfigValue: "hello"
And its corresponding configuration class:
ExampleConfiguration.java
public class ExampleConfiguration extends Configuration {

    private String myConfigValue;

    public String getMyConfigValue() {
        return myConfigValue;
    }

    public void setMyConfigValue(String value) {
        myConfigValue = value;
    }
}
Then create a task which updates the config:
UpdateConfigTask.java
public class UpdateConfigTask extends Task {

    ExampleConfiguration config;

    public UpdateConfigTask(ExampleConfiguration config) {
        super("updateconfig");
        this.config = config;
    }

    @Override
    public void execute(Map<String, List<String>> parameters, PrintWriter output) {
        config.setMyConfigValue("goodbye");
    }
}
Also for demonstration purposes, create a resource which allows you to get the config value:
ConfigResource.java
#Path("/config")
public class ConfigResource {
private final ExampleConfiguration config;
public ConfigResource(ExampleConfiguration config) {
this.config = config;
}
#GET
public Response handleGet() {
return Response.ok().entity(config.getMyConfigValue()).build();
}
}
Finally wire everything up in your application:
ExampleApplication.java (excerpt)
environment.jersey().register(new ConfigResource(configuration));
environment.admin().addTask(new UpdateConfigTask(configuration));
Usage
Start up the application then run:
$ curl 'http://localhost:8080/config'
hello
$ curl -X POST 'http://localhost:8081/tasks/updateconfig'
$ curl 'http://localhost:8080/config'
goodbye
How it works
This works simply by passing the same configuration reference to the constructors of ConfigResource.java and UpdateConfigTask.java. If you aren't familiar with the concept, see here:
Is Java "pass-by-reference" or "pass-by-value"?
The classes linked above are from a project I've created which demonstrates this as a complete solution. Here's a link to the project:
scottg489/dropwizard-runtime-config-example
Footnote: I haven't verified this works with the built-in configuration. However, the dropwizard Configuration class which you need to extend for your own configuration does have various "setters" for internal configuration, but it may not be safe to update those outside of run().
Disclaimer: The project I've linked here was created by me.
I solved this with bytecode manipulation via Javassist
In my case, I wanted to change the "influx" reporter
and modifyInfluxDbReporterFactory should be run BEFORE dropwizard starts:
private static void modifyInfluxDbReporterFactory() throws Exception {
    ClassPool cp = ClassPool.getDefault();
    // do NOT use InfluxDbReporterFactory.class.getName() as this will force the class into the classloader
    CtClass cc = cp.get("com.izettle.metrics.dw.InfluxDbReporterFactory");
    CtMethod m = cc.getDeclaredMethod("setTags");
    m.insertAfter(
        "if (tags.get(\"cloud\") != null) tags.put(\"cloud_host\", tags.get(\"cloud\") + \"_\" + host);tags.put(\"app\", \"sam\");");
    cc.toClass();
}
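A hedged sketch of where that call could go, assuming a standard Dropwizard main method (MyApplication is a placeholder for your Application subclass):

public static void main(String[] args) throws Exception {
    // Must run before Dropwizard (or anything else) loads InfluxDbReporterFactory,
    // otherwise the unmodified class is already in the classloader.
    modifyInfluxDbReporterFactory();
    new MyApplication().run(args);
}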
I am replacing the property factory which is used to load the configuration files with Spring. So technically now, my configuration files should be loaded using Spring DI via Apache Commons Configuration.
So far I have created an Action class and a POJO view class which has getters and setters for load time, file-modified time, file name, etc. The Action has a list of configuration classes injected into the constructor, and in the execute method I loop over the configuration classes, creating a simple view object for each one. Then I set the values on my view object from the config, something like this:
public final String execute() {
    configViewList = new ArrayList<ConfigurationViewObject>();
    if ((this.configurationList != null) && (this.configurationList.size() != 0)) {
        for (PropertiesConfiguration config : configurationList) {
            ConfigurationViewObject view = new ConfigurationViewObject();
            view.setFileName(config.getFileName());
            view.setFileModificationTime(new Date(config.getFile().lastModified()));
            configViewList.add(view);
        }
        return SUCCESS;
    } else {
        addActionError("List is null.");
        return Action.ERROR;
    }
}
Well, now I want to find out the load time. Any idea how to find the time when Spring loads the file? I have searched the PropertiesConfiguration class and the File class for any method that would give it to me, but couldn't find one. I would appreciate any help.
It's very tricky to find this because Spring will never expose it to the outside world, and that's correct - why would anyone want to know when a config file gets loaded?
But still, here is what I would do. The Spring config will be loaded when Spring loads a class which needs some properties to be set [this is not documented officially anywhere, but that's how it should be logically]. The next part is to find out which class instance needing properties is created first. Again, there is no straightforward way to do this, but an approximate way would be to put a System.currentTimeMillis() call in that class's constructor. This will give you an approximate time of when the property file is loaded.
For a straightforward way to do this, you can try calling long before = System.currentTimeMillis() before the call that retrieves the properties, and another long after = System.currentTimeMillis() right after the properties are retrieved, then look at the difference, after - before.
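A minimal sketch of that approach, assuming a plain Apache Commons Configuration load (the file name app.properties and the class name ConfigLoadTimer are placeholders):

import java.util.Date;

import org.apache.commons.configuration.PropertiesConfiguration;

public class ConfigLoadTimer {

    public static void main(String[] args) throws Exception {
        long before = System.currentTimeMillis();
        // Stand-in for wherever the properties are actually loaded/injected.
        PropertiesConfiguration config = new PropertiesConfiguration("app.properties");
        long after = System.currentTimeMillis();
        System.out.println("Loaded " + config.getFileName() + " at " + new Date(after)
                + " (took " + (after - before) + " ms)");
    }
}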