Unable to get Log4j2 SocketAppender working - Java

My Java project uses Log4j2. Currently it successfully writes logs to a file. Now I'm trying to push the logs to Logstash via a Socket appender, but so far I'm not having any success. At this point I'm looking at two pieces: my log4j2.xml file and my Logstash config file. I've provided both here in the hope that someone can help me identify the problem.
log4j2.xml
<Configuration status="WARN" monitorInterval="30">
<Appenders>
<Socket name="A1" host="0.0.0.0" port="4560">
<SerializedLayout/>
</Socket>
<RollingRandomAccessFile name="RollingFile" fileName="/logs/server.log"
filePattern="/logs/$${date:yyyy-MM}/server-%d{yyyy-MM-dd-HH}-%i.log.gz">
<JSONLayout complete="false"></JSONLayout>
<Policies>
<TimeBasedTriggeringPolicy interval="6" modulate="true"/>
<SizeBasedTriggeringPolicy size="100 MB"/>
</Policies>
</RollingRandomAccessFile>
<Async name="AsyncFile">
<AppenderRef ref="RollingFile"/>
</Async>
</Appenders>
<Loggers>
<Logger name="com.company" level="trace" additivity="false">
<AppenderRef ref="AsyncFile"/>
</Logger>
<Root level="trace">
<AppenderRef ref="AsyncFile"/>
</Root>
</Loggers>
</Configuration>
That was my log4j2 configuration. Here is my logstash configuration:
logstash.conf
input {
  log4j {
    mode => "server"
    type => "log4j"
    host => "0.0.0.0"
    port => "4560"
  }
}
output {
  stdout {
    codec => "json"
  }
}
I start Logstash from the command line with logstash-1.4.0/bin/logstash --config /etc/logstash/logstash.conf --debug. I do not see any errors, but I also do not see any logs written to the console window. I know that log events should be arriving, because they do appear in the rolling server.log file configured as the second appender in my log4j2.xml.
What am I doing wrong? I've been tinkering with this for 3 days.

The log4j input in Logstash is not compatible with Log4j2, which uses a new format to serialize its log data.
There is, however, a plugin that enables parsing of Log4j2 sockets:
https://github.com/jurmous/logstash-log4j2
In logstash 1.5+ it can be installed from:
https://rubygems.org/gems/logstash-input-log4j2

I think you should replace 0.0.0.0 with the address of your Logstash host on the following line:
<Socket name="A1" host="0.0.0.0" port="4560">
And add the following lines to the <Root ...> and <Logger ...> elements (as you did for the AsyncFile appender):
<AppenderRef ref="A1" />
<AppenderRef ref="RollingFile" />

Related

Logging time incorrect in my custom Log4j2 XML configuration file (Java)

I am new to Log4j2; previously I used Log4j 1. The reason I am migrating to version 2 is asynchronous logging. After searching the Internet I was able to write a configuration file that creates two log files, "Errors.log" and "Messages.log". Now the problem: I communicate with servers that are far away from me. I wrote a client that sends a request to the server and receives a response. It takes at least 10 milliseconds for the request to reach the server and for the response to come back. But my log files show the request being sent and the response being received at the same time (the same millisecond). I am using asynchronous logging. Is this causing the wrong timestamps, or are the policies I have used creating this issue?
Below is my Log4j2 XML CONFIG file:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="warn">
<Appenders>
<File name="my_file_appender" fileName="LOG4j_LOGS/Errors.log" immediateFlush="false" append="false">
<PatternLayout>
<Pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} - %msg%n</Pattern>
</PatternLayout>
</File>
<Async name="async_appender">
<AppenderRef ref="my_file_appender" />
</Async>
<!-- file appender -->
<RollingFile name="Error-log" fileName="LOG4j_LOGS/Messages.log"
filePattern="LOG4j_LOGS/Messages-%d{yyyy-MM-dd}.log">
<!-- log pattern -->
<PatternLayout>
<Pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} - %msg%n</Pattern>
</PatternLayout>
<!-- set file size policy -->
<Policies>
<TimeBasedTriggeringPolicy />
<SizeBasedTriggeringPolicy size="100 MB" />
</Policies>
<DefaultRolloverStrategy max="25"/>
</RollingFile>
</Appenders>
<Loggers>
<Logger name="Error-log" level="info" additivity="false">
<appender-ref ref="Error-log" level="debug"/>
</Logger>
<Root level="info" includeLocation="false">
<AppenderRef ref="async_appender"/>
</Root>
</Loggers>
</Configuration>
Can anyone please check my config file? All I want is to create two separate log files, one for info messages and the other for errors. A new file should be created every time I run my application, without deleting the previous logs. The log size can be anything; if the size limit is exceeded, it should roll over to a new file and keep writing. No matter how many days I run the application, the daily logs need to be kept, and the whole process has to be done asynchronously.
I am also using the following VM option to enable asynchronous logging:
-DLog4jContextSelector=org.apache.logging.log4j.core.async.AsyncLoggerContextSelector
When logging is done asynchronously, your log messages go onto a separate queue. They are timestamped at the moment they are processed by the background thread that writes them to disk (since the timestamp is part of your appender pattern). So only the order is preserved; the timestamps may be slightly different.
See https://logging.apache.org/log4j/2.x/manual/async.html for more info.
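One way to see where the timestamp comes from is to put the wall-clock time into the message itself and compare it with the %d column in the file. A minimal sketch (the class name is invented; the sleep stands in for the server round trip):
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class AsyncTimestampCheck {
    private static final Logger LOG = LogManager.getLogger(AsyncTimestampCheck.class);

    public static void main(String[] args) throws InterruptedException {
        // Embed the capture-time clock in the message so it can be compared
        // with the %d timestamp rendered by the pattern layout.
        LOG.info("before request, wall clock = {}", System.currentTimeMillis());
        Thread.sleep(10); // stand-in for the ~10 ms server round trip
        LOG.info("after response, wall clock = {}", System.currentTimeMillis());
    }
}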

Failover Appender backing up JMS Appender - no parameter that matches element Failovers

I'm having some trouble getting Log4j2 to work with ActiveMQ.
Here is my log4j2.xml:
<Configuration>
  <ThresholdFilter level="all"/>
  <Appenders>
    <Console name="STDOUT" target="SYSTEM_OUT">
      <PatternLayout pattern="%m%n"/>
      <Filters>
        <ThresholdFilter level="info" />
      </Filters>
    </Console>
    <File name="baseLog" filename="\\\\p02630\\c$\\tmp\\logs\\logws-gendb.log">
      <PatternLayout>
        <Pattern>%d %p %c{1.} [%t] %m%n</Pattern>
      </PatternLayout>
      <Filters>
        <ThresholdFilter level="error"/>
      </Filters>
    </File>
    <JMS name="AMQError" providerurl="tcp://169.3.200.150:61616" password="admin" userName="admin">
      <factoryName>org.apache.activemq.jndi.ActiveMQInitialContextFactory</factoryName>
      <factoryBindingName>ConnectionFactory</factoryBindingName>
      <TopicBindingName>logError</TopicBindingName>
    </JMS>
    <Failover name="FailOverAMQ" primary="AMQError">
      <Failovers>
        <appender-ref ref="baseLog"/>
      </Failovers>
    </Failover>
  </Appenders>
  <Loggers>
    <root>
      <AppenderRef ref="STDOUT" />
      <AppenderRef ref="baseLog" />
      <AppenderRef ref="FailOverAMQ" />
    </root>
  </Loggers>
</Configuration>
The goal is to be able to log errors to ActiveMQ. But if/when the AMQ server is down, I want the logger to reconnect automatically and, in the meantime, still log errors through a file appender.
While the server is running, everything works fine. But when I shut the server down, neither the ActiveMQ appender (expected) nor the file appender works, and when the server comes back up, Log4j does not reconnect after the default 60-second reconnect interval. More troubling, my file appender no longer works at all after the server shutdown.
I first had a problem with the "no parameter that matches element Failovers" error detailed here, and tried to add the fallback class registering the "Failovers" element. That removed the error message, but the Failover appender works no better; I have the impression all that class did was hide the error.
Have you had any luck working with Failover appenders?
Thanks for the help.
Just a quick follow-up.
I could not find any solution to this problem. It seems to be a bug in Log4j2.
In the end, I developed a short class using the javax.jms package to manage my connection to ActiveMQ.
Not perfect, but at least it works.
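For reference, a rough sketch of what such a helper can look like, using only the standard javax.jms API plus the ActiveMQ client on the classpath (class and method names are my own, and the error handling is deliberately minimal):
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.DeliveryMode;
import javax.jms.JMSException;
import javax.jms.MessageProducer;
import javax.jms.Session;
import org.apache.activemq.ActiveMQConnectionFactory;

public class ErrorTopicPublisher implements AutoCloseable {
    private final Connection connection;
    private final Session session;
    private final MessageProducer producer;

    public ErrorTopicPublisher(String brokerUrl, String user, String password, String topicName) throws JMSException {
        ConnectionFactory factory = new ActiveMQConnectionFactory(user, password, brokerUrl);
        connection = factory.createConnection();
        connection.start();
        session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        producer = session.createProducer(session.createTopic(topicName));
        producer.setDeliveryMode(DeliveryMode.NON_PERSISTENT);
    }

    public void publish(String text) {
        try {
            producer.send(session.createTextMessage(text));
        } catch (JMSException e) {
            // Broker is down: fall back to a file/console logger here instead of failing the caller.
        }
    }

    @Override
    public void close() throws JMSException {
        connection.close();
    }
}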

How do I conditionally add log4j2 appender depending on java system property?

I'm trying to figure out how I can add an appender to a logger depending on whether a Java system property is given/set.
So let's say I have a basic configuration like this:
<Logger name="myLogger" level="info" additivity="false">
<AppenderRef ref="myAppender1" />
<AppenderRef ref="myAppender2" />
</Logger>
So now I'd like to find a way to conditionally add the second appender only if I provide a parameter -PaddAppender2. Something like this:
<Logger name="myLogger" level="info" additivity="false">
<AppenderRef ref="myAppender1" />
<?if (${sys:enableAppender2:-false) == "true"}>
<AppenderRef ref="myAppender2" />
</?if>
</Logger>
How do I do that?
I know I can, for example, make the level dynamic based on a given property ("logLevel") like this (where "info" is the default if the property is not given):
<Logger name="test" level="${sys:logLevel:-info}" additivity="false">
I looked at the documentation for filters, and I can't figure it out. That is of course if filters are even the right way to go here.
Solution without any scripting:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="error" strict="true">
<Properties>
<Property name="appenderToUse">stdout_${sys:LOG4J_LAYOUT:-plain}</Property>
</Properties>
<Appenders>
<Appender type="Console" name="stdout_plain">
<Layout type="PatternLayout" pattern="%d [%t] %-5p %c - %m%n"/>
</Appender>
<Appender type="Console" name="stdout_json">
<Layout type="JSONLayout" compact="true" eventEol="true" stacktraceAsString="true" properties="true"/>
</Appender>
</Appenders>
<Loggers>
<Root level="info">
<AppenderRef ref="${appenderToUse}"/>
</Root>
</Loggers>
</Configuration>
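With this configuration the appender is chosen once, when the configuration is built: running with -DLOG4J_LAYOUT=json selects stdout_json, and leaving the property unset falls back to stdout_plain. If the property is set programmatically instead of on the command line, it has to happen before Log4j initializes, roughly like this (class name invented):
public class LayoutSelectionDemo {
    public static void main(String[] args) {
        // Must run before the first LogManager call; otherwise the configuration
        // has already resolved ${sys:LOG4J_LAYOUT:-plain}.
        System.setProperty("LOG4J_LAYOUT", "json");
        org.apache.logging.log4j.LogManager.getLogger(LayoutSelectionDemo.class)
                .info("this event is written by the stdout_json appender");
    }
}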
The solution provided by Robert works, but it is not efficient as the script will be evaluated once per log record.
A more efficient solution that evaluates the script only once is to use ScriptAppenderSelector together with the NullAppender:
According to the docs:
ScriptAppenderSelector
When the configuration is built, the ScriptAppenderSelector appender calls a Script to compute an appender name. Log4j then creates one of the appenders listed under AppenderSet, using the name computed by the ScriptAppenderSelector. After configuration, Log4j ignores the ScriptAppenderSelector.
NullAppender
An Appender that ignores log events. Use for compatibility with version 1.2 and handy for composing a ScriptAppenderSelector.
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="warn" name="ScriptAppenderSelectorExample">
<Appenders>
<ScriptAppenderSelector name="SelectConsole">
<Script language="groovy"><![CDATA[
if (System.getProperty("CONSOLE_APPENDER_ENABLED", 'true').equalsIgnoreCase('true')) {
return "Console"
} else {
return "Null"
}
]]></Script>
<AppenderSet>
<Console name="Console" target="SYSTEM_OUT">
<PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n"/>
</Console>
<Null name="Null" />
</AppenderSet>
</ScriptAppenderSelector>
<ScriptAppenderSelector name="SelectFile">
<Script language="groovy"><![CDATA[
if (System.getProperty("FILE_APPENDER_ENABLED", 'true').equalsIgnoreCase('true')) {
return "File"
} else {
return "Null"
}
]]></Script>
<AppenderSet>
<File name="File" fileName="application.log">
<PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n" />
</File>
<Null name="Null" />
</AppenderSet>
</ScriptAppenderSelector>
<ScriptAppenderSelector name="SelectSMTP">
<Script language="groovy"><![CDATA[
if (System.getProperty("SMTP_APPENDER_ENABLED", 'true').equalsIgnoreCase('true')) {
return "SMTP"
} else {
return "Null"
}
]]></Script>
<AppenderSet>
<SMTP name="SMTP"
subject="App: Error"
from="log4j#example.com"
to="support#example.com"
smtpHost="smtp.example.com"
smtpPort="25"
bufferSize="5">
</SMTP>
<Null name="Null" />
</AppenderSet>
</ScriptAppenderSelector>
</Appenders>
<Loggers>
<Root level="info">
<AppenderRef ref="SelectConsole"/>
<AppenderRef ref="SelectFile"/>
<AppenderRef ref="SelectSMTP"/>
</Root>
</Loggers>
</Configuration>
References
Is there a low overhead way to enable/disable a given appender based on the value of an environment variable or system property?
Similar to rgoers' solution, but using Nashorn instead of Groovy. This solution benefits from the fact that the Nashorn engine is part of Java 8, so no additional dependencies are needed.
<Scripts>
  <Script name="isAppender2Enabled" language="nashorn"><![CDATA[
    var System = Java.type('java.lang.System'),
        Boolean = Java.type('java.lang.Boolean');
    Boolean.parseBoolean(System.getProperty('enableAppender2', 'false'));
  ]]></Script>
</Scripts>
<Loggers>
  <Logger name="myLogger" level="info" additivity="false">
    <AppenderRef ref="myAppender1" />
    <AppenderRef ref="myAppender2">
      <ScriptFilter onMatch="ACCEPT" onMisMatch="DENY">
        <ScriptRef ref="isAppender2Enabled" />
      </ScriptFilter>
    </AppenderRef>
  </Logger>
</Loggers>
Note that the ScriptFilter evaluates the script every time a log event occurs. It is therefore possible to enable or disable the appender at run time (by changing the value of the system property) with immediate effect. On the other hand, evaluating the script for every event can have a negative impact on logging performance.
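As a small illustration of that runtime behavior, assuming the myLogger configuration above (with the ScriptFilter on myAppender2) is active and the script engine is on the classpath, toggling the property between log calls flips the second appender on and off (the class name is made up):
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class ToggleAppenderDemo {
    private static final Logger LOG = LogManager.getLogger("myLogger");

    public static void main(String[] args) {
        System.setProperty("enableAppender2", "false");
        LOG.info("filtered out of myAppender2, still written by myAppender1");

        System.setProperty("enableAppender2", "true");
        LOG.info("written by both myAppender1 and myAppender2");
    }
}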
I wasn't able to figure out a solution via the config file alone, but I found one that solves the problem programmatically.
Note that in our specific case, we always log to a "local log" ("splunk local"), but in certain cases (controlled by the property) we also want to log the same information to another location (one that is not relative) that is periodically read and forwarded to a Splunk server ("splunk forwarder").
That is why we can copy most of the properties from one appender to the other.
private static final Logger SPLUNK_LOG = getLogger();

private static Logger getLogger() {
    if (!BooleanUtils.toBoolean(SystemUtils.getJavaPropertyValue(ENABLE_PROPERTY_NAME, "false"))) {
        return LoggerFactory.getLogger(SPLUNK_LOG_NAME);
    } else {
        LOG.info("Dynamically adding splunk forwarder appender");
        try {
            final LoggerContext loggerContext = (LoggerContext) LogManager.getContext();
            final Configuration configuration = loggerContext.getConfiguration();
            // configure appender based on local splunk appender
            final RollingFileAppender splunkLocal = (RollingFileAppender) configuration.getAppender(LOCAL_LOG_NAME);
            final RollingFileAppender splunkForwarder = RollingFileAppender.createAppender(FORWARDER_FILE_NAME,
                    FORWARDER_FILE_PATTERN, FORWARDER_APPEND, FORWARDER_NAME, null, null, null,
                    splunkLocal.getManager().getTriggeringPolicy(), splunkLocal.getManager().getRolloverStrategy(),
                    splunkLocal.getLayout(), splunkLocal.getFilter(), null, FORWARDER_ADVERTISE, null, null);
            splunkForwarder.start();
            // add splunk forwarder appender to splunk logger
            final LoggerConfig loggerConfig = configuration.getLoggerConfig(SPLUNK_LOG_NAME);
            loggerConfig.addAppender(splunkForwarder, Level.INFO, null);
            LOG.info("Successfully added splunk forwarder appender");
            return loggerContext.getLogger(SPLUNK_LOG_NAME);
        } catch (Exception ex) {
            throw new IllegalStateException("Failed to dynamically add splunk forwarder appender", ex);
        }
    }
}
If anyone knows how to do this via config file alone, that would be great.
The way this is intended to be handled is by using a filter. In this case you can use a Script filter.
<Logger name="myLogger" level="info" additivity="false">
<AppenderRef ref="myAppender1" />
<AppenderRef ref="myAppender2">
<ScriptFilter onMatch="ACCEPT" onMisMatch="DENY">
<Script language="groovy"><![CDATA[
return System.getProperty("enableAppender2", "false").equalsIgnoreCase("true");
]]></Script>
</ScriptFilter>
</AppenderRef>
</Logger>
Building on some of the ideas in this thread, here's what I did to conditionally log to console.
Example use-case
Always log to a file appender.
Log to Console only in some environments.
Solution
For Console logging, set the system property additional.log.appender=console (e.g. pass -Dadditional.log.appender=console on the java command line).
Or, disable Console logging by omitting this property.
In the Logger AppenderRef, use ${sys:additional.log.appender:-null}.
Sends logs to the console appender if the system property was set, or defaults to null appender if not set. (null appender ignores logs)
System Property
# set for console logging
additional.log.appender=console
log4j2.xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration>
<Appenders>
<RollingFile name="file"
fileName="my-file.log"
filePattern="my-file%i.log">
<PatternLayout pattern="%d %5p [%t] %c - %m%n" />
<Policies>
<SizeBasedTriggeringPolicy size="10 MB" />
</Policies>
<DefaultRolloverStrategy max="10" />
</RollingFile>
<Console name="console" target="SYSTEM_OUT">
<PatternLayout pattern="%d %5p [%t] %c - %m%n" />
</Console>
<Null name="null" />
</Appenders>
<Loggers>
<Logger name="com.acme" level="DEBUG">
</Logger>
<Root level="INFO">
<AppenderRef ref="file" />
<AppenderRef ref="${sys:additional.log.appender:-null}" />
</Root>
</Loggers>
</Configuration>

Log4j2 - Lookup Plugin (StrLookup) to resolve ThreadName for routing RollingLogFile via Thread

I am trying to write a Log4j2 configuration that routes messages to different log files for a multithreaded program, based on the thread name.
Here is what I have so far (the parts relevant to the Log4j2 config):
|-/src/main/java/log4j2/plugins
|-- ThreadLookup.java
|-/src/main/resources
|-- log4j2.xml
ThreadLookup.java:
package log4j2.plugins;

import org.apache.logging.log4j.core.LogEvent;
import org.apache.logging.log4j.core.config.plugins.Plugin;
import org.apache.logging.log4j.core.lookup.StrLookup;

@Plugin(name="threadLookup", category=StrLookup.CATEGORY)
public class ThreadLookup implements StrLookup {

    @Override
    public String lookup(String key) {
        return Thread.currentThread().getName();
    }

    @Override
    public String lookup(LogEvent event, String key) {
        // Check event first:
        if (event.getThreadName() != null) {
            return event.getThreadName();
        }
        // Fallback to key if event doesn't define a threadName:
        return this.lookup(key);
    }
}
Log4j2.xml (it is my understanding that the packages attribute of Configuration should pick up ThreadLookup.java and, based on the annotation, register a new threadLookup prefix that lets me call lookup(String key) with whatever value I want; in this case I am not passing a specific value, because this class only does a thread-name lookup):
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="error" strict="true" schema="Log4J-V2.0.xsd"
packages="log4j2.plugins">
<Properties>
<Property name="logMsgPattern">%date{yyyy/MM/dd HH:mm:ss.SSS} %-5level ${sys:pid}[%thread] %class %method:%line - %message%n</Property>
</Properties>
<Appenders>
<Console name="console" target="SYSTEM_OUT" >
<PatternLayout pattern="${logMsgPattern}" />
</Console>
<Routing name="routing">
<Routes pattern="$${threadLookup:threadName}">
<Route>
<RollingFile name="RollingFile-${threadLookup:threadName}"
fileName="${sys:log4j.dir}/thread-${threadLookup:threadName}.log"
filePattern="${sys:log4j.dir}/thread-${threadLookup:threadName}-%i.log.gz">
<PatternLayout pattern="${logMsgPattern}"/>
<Policies>
<SizeBasedTriggeringPolicy size="100 MB" />
</Policies>
</RollingFile>
</Route>
</Routes>
</Routing>
<!-- Config for other appenders snipped -->
</Appenders>
<Loggers>
<!-- Config for other loggers snipped -->
<Root level="${sys:log4j.console.threshold}">
<AppenderRef ref="rootOut" level="trace" />
<AppenderRef ref="rootErr" level="error" />
<AppenderRef ref="console" level="${sys:log4j.console.threshold}" />
<AppenderRef ref="routing" level="trace" />
</Root>
</Loggers>
</Configuration>
However, when I launch my app, it just creates an additional file called thread-${threadLookup (no extension) in my log directory. It also never hits any breakpoints in ThreadLookup.java.
How can I register the plugin with Log4j2 (I was using version 2.2, and I also tried 2.3)? Note that I am using a Spring Framework 4.1.7 project, if that helps at all. I use Maven as well, but only to resolve dependencies; I build the project via an Ant script.
UPDATE
When I build via the Ant script, I do actually get a Log4j2Plugins.dat that shows up on my classpath (-cp resources:bin), but it doesn't seem to affect the outcome of the logs that are generated on the server:
$ find bin/META-INF/ -type f
bin/META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat
$ cat bin/META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat
lookup
threadlookupog4j2.plugins.ThreadLookup
threadLookup
$ vi bin/META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat
^A^Flookup^A^Lthreadlookup^[log4j2.plugins.ThreadLookup^LthreadLookup
$ find logs -type f -name "thread-*"
logs/thread-${threadLookup:threadName}.log
logs/thread-${threadLookup:threadName}-1.log.gz
Thanks
I ended up finding out that the issue was that my Plugin name cannot be camelCased.
I was debugging through PluginManager.java (Log4j2 2.3), at line 169 in private static void mergeByName(final Map<String, PluginType<?>> newPlugins, final List<PluginType<?>> plugins), and I saw the following values go into newPlugins.put(key, pluginType):
key: threadlookup,
pluginType: PluginType [pluginClass=class log4j2.plugins.ThreadLookup, key=threadlookup, elementName=threadLookup, isObjectPrintable=false, isDeferChildren==false, category=Lookup]
After seeing that, I modified my Routing appender in my log4j2.xml config to the following (without needing to change the annotation in my Plugin class that implemented StrLookup) and it worked:
<Routing name="routing">
<Routes pattern="$${threadlookup:threadName}">
<Route>
<RollingFile name="RollingFile-${threadlookup:threadName}"
fileName="${sys:log4j.dir}/thread-${threadlookup:threadName}.log"
filePattern="${sys:log4j.dir}/thread-${threadlookup:threadName}-%i.log.gz">
<PatternLayout pattern="${logMsgPattern}"/>
<Policies>
<SizeBasedTriggeringPolicy size="100 MB" />
</Policies>
</RollingFile>
</Route>
</Routes>
</Routing>
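To exercise the routing, a small driver like this can be used (the class name is invented; it assumes the configuration above is active and -Dlog4j.dir points at a writable directory), which should produce one thread-worker-N.log file per thread:
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class PerThreadLoggingDemo {
    private static final Logger LOG = LogManager.getLogger(PerThreadLoggingDemo.class);

    public static void main(String[] args) throws InterruptedException {
        for (int i = 0; i < 3; i++) {
            // Each named thread should be routed to its own thread-worker-N.log file.
            Thread worker = new Thread(
                    () -> LOG.info("hello from {}", Thread.currentThread().getName()),
                    "worker-" + i);
            worker.start();
            worker.join();
        }
    }
}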
Hopefully this can help others out, as I had to spend a few days to figure this out and I didn't find this in any of the documentation or questions I was reviewing for Log4j2.
Thanks!

Issue with http://apache.org/xml/features/xinclude when testing Log4j 2

I'm testing Log4j2 and I don't know what I'm doing wrong. I downloaded the libraries from Apache and put them on the classpath. I added xercesImpl, xalan, xml-apis, serializer and xsltc too, and the exception still persists. Here are the config file and the stack trace:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="DEBUG">
<Properties>
<Property name="log-path">C:/Logs/</Property>
</Properties>
<Appenders>
<RollingFile name="RollingFile" fileName="${log-path}/myexample.log"
filePattern="${log-path}/myexample-%d{yyyy-MM-dd}-%i.log">
<PatternLayout>
<pattern>%d{dd/MMM/yyyy HH:mm:ss,SSS}- %c{1}: %m%n</pattern>
</PatternLayout>
<Policies>
<SizeBasedTriggeringPolicy size="1 KB" />
</Policies>
<DefaultRolloverStrategy max="4" />
</RollingFile>
</Appenders>
<Loggers>
<Logger name="root" level="debug" additivity="false">
<appender-ref ref="RollingFile" level="debug" />
</Logger>
<Root level="debug" additivity="false">
<AppenderRef ref="RollingFile" />
</Root>
</Loggers>
</Configuration>
ERROR StatusLogger Error parsing C:\W7des\cliente\Test\bin\log4j2.xml javax.xml.parsers.ParserConfigurationException: Feature 'http://apache.org/xml/features/xinclude' is not recognized.
at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source)
at org.apache.logging.log4j.core.config.xml.XmlConfiguration.newDocumentBuilder(XmlConfiguration.java:85)
at org.apache.logging.log4j.core.config.xml.XmlConfiguration.<init>(XmlConfiguration.java:137)
at org.apache.logging.log4j.core.config.xml.XmlConfigurationFactory.getConfiguration(XmlConfigurationFactory.java:44)
at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:472)
at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:442)
at org.apache.logging.log4j.core.config.ConfigurationFactory.getConfiguration(ConfigurationFactory.java:254)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:419)
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:138)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:207)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:160)
at org.apache.logging.log4j.LogManager.getLogger(LogManager.java:492)
at pac.Main.<clinit>(Main.java:14)
at java.lang.J9VMInternals.initializeImpl(Native Method)
at java.lang.J9VMInternals.initialize(J9VMInternals.java:200)
Thanks in advance.
In my case, removing an outdated xercesImpl-2.6.2.jar from the classpath helped.
The Oracle JVM comes with an XML parser that supports XInclude.
According to this doc http://www-03.ibm.com/systems/resources/sdkguide.zos.pdf, the IBM J9 VM also bundles an XML parser that supports XInclude (see page 21, XML4J 4.5).
I am not sure if it is necessary to use a separate XML parser (you mentioned you are using Xerces instead of the XML parser bundled with the JVM).
What version of Xerces/Xalan are you using? What happens if you remove the custom XML parsers from the classpath?
By the way, Log4j will output WARN-level StatusLogger messages if it cannot enable XInclude.
Are you getting any WARN-level StatusLogger messages that start with "The DocumentBuilderFactory..."? Please include these messages in your question.
Unfortunately there is currently no switch in Log4j to prevent it from trying to enable the XInclude function. I suspect the problem above is a configuration issue but if it cannot be resolved you can request that such a switch be added as a new Log4j feature. The place for this is the Log4j Jira issue tracker.
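To see which JAXP implementation is actually being picked up, and whether it accepts the XInclude feature that Log4j asks for, a small diagnostic class like this can help (the class name is made up):
import javax.xml.parsers.DocumentBuilderFactory;

public class WhichParser {
    public static void main(String[] args) throws Exception {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        System.out.println("DocumentBuilderFactory implementation: " + dbf.getClass().getName());
        // An old standalone Xerces on the classpath may reject either of the calls below,
        // which matches the ParserConfigurationException in the stack trace above.
        dbf.setXIncludeAware(true);
        dbf.setFeature("http://apache.org/xml/features/xinclude", true);
        System.out.println("XInclude feature accepted");
    }
}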
It worked for me after removing the xerces directory from the Gradle caches:
cd ~/.gradle/caches/modules-2/files-2.1
rm -rf xerces
