log4j2 RollingFile Appender not changing filename on Tomcat? - java

I have a web application using log4j2. The logs should be created on a daily basis.
Problem: the content of the old file never gets deleted; each new day's logs are just appended to that file, so it grows continuously. Is the following configuration correct in general when running on Tomcat 8?
log4j2.xml:
<Configuration>
<Appenders>
<RollingFile name="TEST" fileName="d:\test-application.txt" filePattern="d:\test-application-%d{yyyy-MM-dd}.log">
<Policies>
<TimeBasedTriggeringPolicy modulate="true"/>
</Policies>
<!-- ... -->
</RollingFile>
<!-- ... -->
</Appenders>
<!-- ... -->
</Configuration>
Maven:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
<exclusions>
<exclusion>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-logging</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-log4j2</artifactId>
</dependency>

You are not the first person to report this behavior, and the common factor seems to be Windows. Log4j tries to rename the file; if that fails, it falls back to copying the file and then deleting the original. If there is a lock on the file, the rename will fail, and then most likely the copy succeeds but the delete fails. The code still uses the delete method of the File object, which is not good at reporting errors, and Log4j isn't checking the return value, so it fails silently.
This is a bug and should be fixed, but fixing it won't really solve your problem - it will just tell you about it. To fix the problem you need to find out what is preventing the rename from succeeding.
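To see why this failure mode is so quiet, compare the two deletion APIs the answer is referring to. This is only an illustrative sketch (the path is taken from the question); it is not Log4j's actual code.

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class DeleteDemo {
    public static void main(String[] args) {
        // Old-style delete, the call the answer says Log4j uses: if the file is
        // locked by another process (common on Windows) it simply returns false,
        // with no explanation of why.
        boolean deleted = new File("d:\\test-application.txt").delete();
        System.out.println("File.delete() returned: " + deleted);

        // The NIO equivalent throws an IOException that names the cause, which
        // makes this kind of silent failure far easier to diagnose.
        try {
            Files.delete(Paths.get("d:\\test-application.txt"));
        } catch (IOException e) {
            System.out.println("Files.delete() failed: " + e);
        }
    }
}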

What we often see is multiple processes or multiple web applications using the same log4j2.xml configuration and logging to the same file. One process will then prevent the other from deleting the file.
I've had this problem when I kept the log file open in an editor; the midnight rollover would then fail for the same reason.

Related

How to use log4j.xml for spring boot + log4j2 dependency

I have a log4j.xml with a custom appender like:
<appender name="console" class="com.example.MyAppender">
<layout class="org.apache.log4j.PatternLayout">
<param name="ConversionPattern" value="%m (%c{1}:%L)"/>
</layout>
</appender>
Recently I upgraded the log4j dependency to log4j2 but kept using this log4j.xml, and it still works.
Now I am adding a Spring Boot module to my project. Following the Spring documentation, I set up my pom.xml as:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
<exclusions>
<exclusion>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-logging</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-log4j2</artifactId>
<version>2.6.4</version>
</dependency>
I also add the arguments -Dlogging.config=log4j.xml -Dlog4j.configuration=log4j.xml -Dlog4j1.compatibility=true.
But my Spring application shows the following error and produces no log output:
ERROR StatusLogger Unknown object "logger" of type org.apache.logging.log4j.core.config.LoggerConfig is ignored: try nesting it inside one of: ["Appenders", "Loggers", "Properties", "Scripts", "CustomLevels"].
It seems the log4j2 library cannot recognize log4j.xml, which suggests that -Dlog4j1.compatibility=true does not work with Spring Boot. Is there any related configuration I can use, or any workaround? Thanks.
TL;DR: The problem is that Log4j2 has two XML configuration factories (for the Log4j 1.x and Log4j 2.x formats), with the 2.x format having higher priority. You need to explicitly set the ConfigurationFactory to use:
-Dlog4j2.configurationFactory=org.apache.log4j.xml.XmlConfigurationFactory
When a Spring Boot application starts, Log4j2 is configured twice:
at the very beginning, using Log4j2's automatic configuration. For this round you just need to set -Dlog4j1.compatibility=true and name the config file log4j.xml (or give it another name and point -Dlog4j.configuration at it);
once Spring's environment is ready, Spring reconfigures Log4j2 programmatically, using only a subset of Log4j2's automatic configuration. That is why this phase requires several manual settings:
-Dlogging.config=log4j.xml, because Spring does not look for a file named log4j.xml on its own;
-Dlog4j1.compatibility=true, to activate the Log4j 1.x configuration factories;
-Dlog4j2.configurationFactory=org.apache.log4j.xml.XmlConfigurationFactory, to raise the priority of the Log4j 1.x XML configuration factory. A programmatic alternative is sketched just below.
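If you prefer not to pass the flags on the command line, the same three settings can be applied programmatically before Spring Boot starts. A rough sketch, assuming an executable main class (the class name here is made up):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class MyApplication { // hypothetical main class

    public static void main(String[] args) {
        // Same settings as the -D flags listed above, set as early as possible
        // (before any logging happens) so that both configuration rounds see them.
        System.setProperty("logging.config", "log4j.xml");
        System.setProperty("log4j1.compatibility", "true");
        System.setProperty("log4j2.configurationFactory",
                "org.apache.log4j.xml.XmlConfigurationFactory");

        SpringApplication.run(MyApplication.class, args);
    }
}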
Remark: using a native Log4j 1.x custom appender exposes you to all the problems (synchronization and performance) of the original Log4j 1.x. For example, Log4j 1.x loses events during reconfiguration (such as the one performed by Spring Boot), whereas Log4j 2.x does not.
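For completeness, here is a rough sketch of what porting a custom appender like the one in the question to the Log4j 2.x plugin API could look like. The class name and the output it produces are purely illustrative, not a drop-in replacement for com.example.MyAppender:

import java.io.Serializable;

import org.apache.logging.log4j.core.Appender;
import org.apache.logging.log4j.core.Core;
import org.apache.logging.log4j.core.Filter;
import org.apache.logging.log4j.core.Layout;
import org.apache.logging.log4j.core.LogEvent;
import org.apache.logging.log4j.core.appender.AbstractAppender;
import org.apache.logging.log4j.core.config.Property;
import org.apache.logging.log4j.core.config.plugins.Plugin;
import org.apache.logging.log4j.core.config.plugins.PluginAttribute;
import org.apache.logging.log4j.core.config.plugins.PluginElement;
import org.apache.logging.log4j.core.config.plugins.PluginFactory;
import org.apache.logging.log4j.core.layout.PatternLayout;

@Plugin(name = "MyAppender", category = Core.CATEGORY_NAME, elementType = Appender.ELEMENT_TYPE)
public final class MyLog4j2Appender extends AbstractAppender {

    private MyLog4j2Appender(String name, Filter filter, Layout<? extends Serializable> layout) {
        super(name, filter, layout, true, Property.EMPTY_ARRAY);
    }

    @PluginFactory
    public static MyLog4j2Appender createAppender(
            @PluginAttribute("name") String name,
            @PluginElement("Layout") Layout<? extends Serializable> layout,
            @PluginElement("Filter") Filter filter) {
        // Fall back to a default pattern layout if none is configured.
        Layout<? extends Serializable> effectiveLayout =
                layout == null ? PatternLayout.createDefaultLayout() : layout;
        return new MyLog4j2Appender(name, filter, effectiveLayout);
    }

    @Override
    public void append(LogEvent event) {
        // Illustration only: write the formatted event to stdout.
        System.out.print(getLayout().toSerializable(event));
    }
}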

Using Logback to catalina.out

I am using Logback in an application running in Tomcat. My application works and, in the debugger, I can see my logging statements being reached, yet these statements never reach /opt/tomcat/logs/catalina.out. (I do see the statements in the IntelliJ IDEA debugger console, but once deployed they don't reach catalina.out.) Where do I begin?
In my WAR, WEB-INF/classes/logback.xml looks like this:
<configuration>
<appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
<file>${catalina.base}/logs/catalina.out</file>
<rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
<maxFileSize>10MB</maxFileSize>
<maxHistory>10</maxHistory>
</rollingPolicy>
<immediateFlush>true</immediateFlush>
<encoder class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
<pattern>%date %level [%thread] %logger{40} %msg%n</pattern>
</encoder>
</appender>
<root level="INFO">
<appender-ref ref="FILE" />
</root>
</configuration>
In code, I do this for example:
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class Validator
{
private static final Logger logger = LoggerFactory.getLogger( Validator.class );
...
public void foo()
{
logger.info( "Called foo()" );
Correspondingly, in pom.xml dependencies, I have this:
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
<version>${logback-version}</version>
</dependency>
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-access</artifactId>
<version>${logback-version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<version>${slf4j.version}</version>
</dependency>
I also tried the following in logback.xml; it didn't create the file. All log files are owned by tomcat:tomcat, and the Tomcat process is the one writing the logs.
<file>${catalina.base}/logs/test.log</file>
The problem turned out to be the slf4j JARs and multiple bindings. See
https://www.slf4j.org/codes.html#multiple_bindings.
I eliminated slf4j-simple from the dependencies shown above. The conflict between the two SLF4J bindings essentially prevented logback.xml from having any effect on logging behavior. After fixing this, the JUnit tests still worked, the IDE-embedded Tomcat began to show TRACE-level statements when the level was set to TRACE, and the same happened when deployed to a real Tomcat installation, where the statements now appear (in catalina.out in my case).
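If you want to verify which binding SLF4J actually picked, a quick check (plain SLF4J API, assuming nothing beyond the dependencies already shown) is to print the logger factory class:

import org.slf4j.LoggerFactory;

public class BindingCheck {
    public static void main(String[] args) {
        // With only logback-classic bound this prints
        // ch.qos.logback.classic.LoggerContext; if slf4j-simple is still on the
        // classpath, SLF4J may have bound to its SimpleLoggerFactory instead,
        // in which case logback.xml is ignored.
        System.out.println(LoggerFactory.getILoggerFactory().getClass().getName());
    }
}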
One additional observation: if a log-level configuration change is made by hand directly in logback.xml at /opt/tomcat/webapps/application/WEB-INF/classes/logback.xml, I found that Tomcat had to be restarted for it to take effect (even though I experimented with <configuration scan="true">, etc.).

How to efficiently index logs to elastic search

I'm developing a web application in which I'll upload a log file; the file will be read and classified based on logger levels (INFO, ERROR, WARNING, etc.). I need to index those logs into Elasticsearch using the Java High Level REST Client API.
Currently I'm creating one index per class name (the logs contain the class name) and storing that class's logs in that index. I feel this approach won't work well in some cases: if a log file contains logs from 100 different classes, I'll end up creating 100 indices to store them.
Is there a more efficient way of indexing logs into Elasticsearch? How should I determine the indices in my case?
Sample log:
02-Jul-2021|10:03:10.040|INFO|[main]|org.apache.catalina.startup.VersionLoggerListener.log|Server
built: Jun 11 2021 13:32:01 UTC
Here is the practice we use in some of our Java-based stacks; it has many advantages, using Apache Kafka as an intermediate data pipeline and Logstash as the ingestion pipeline.
First, remove the default logging providers from your Spring Boot application's pom.xml (Logback, i.e. logback-classic), then add Log4j2 as the new logging provider and add the Kafka appender. After adding the dependencies you need an XML configuration file for the Kafka appender settings. By default the configuration file must be on the resource path of your project and named "log4j2.xml".
There are many other Log4j2 appenders, such as the Cassandra or Failover appenders, which you can add alongside the Kafka appender in your configuration file. A working example follows.
<!--excluding logback -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
<exclusions>
<exclusion>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-logging</artifactId>
</exclusion>
<exclusion>
<groupId>org.springframework.boot</groupId>
<artifactId>logback-classic</artifactId>
</exclusion>
</exclusions>
</dependency>
<!--adding log4j2 and kafka appender-->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-log4j2</artifactId>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-log4j-appender</artifactId>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
</exclusions>
</dependency>
Kafka appender configuration
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="info" name="kafka-appender" packages="Citydi.ElasticDemo">
<Appenders>
<Kafka name="kafkaLogAppender" topic="Second-Topic">
<JSONLayout />
<Property name="bootstrap.servers">localhost:9092</Property>
<MarkerFilter marker="Recorder" onMatch="DENY" onMismatch="ACCEPT"/>
</Kafka>
<Console name="stdout" target="SYSTEM_OUT">
<PatternLayout pattern="%d{HH:mm:ss.SSS} stdout %highlight{%-5p} [%-7t] %F:%L - %m%n"/>
<MarkerFilter marker="Recorder" onMatch="DENY" onMismatch="ACCEPT"/>
</Console>
</Appenders>
<Loggers>
<Root level="INFO">
<AppenderRef ref="kafkaLogAppender"/>
<AppenderRef ref="stdout"/>
</Root>
<Logger name="org.apache.kafka" level="warn" />
</Loggers>
</Configuration>
Start the ZooKeeper server:
./zookeeper-server-start.sh ../config/zookeeper.properties
Start the Kafka broker:
./kafka-server-start.sh ../config/server.properties
Create the topic:
./kafka-topics.sh --create --topic test-topic --zookeeper localhost:2181 --replication-factor 1 --partitions 4
Start a console producer for the created topic (to verify it):
./kafka-console-producer.sh --broker-list localhost:9092 --topic test-topic
Then point the log appender at the created topic (the topic name is up to you), and after that create a Logstash pipeline such as the configuration below to ingest your logs into the desired Elasticsearch index.
input {
  kafka {
    group_id => "35834"
    topics => ["yourtopicname"]
    bootstrap_servers => "localhost:9092"
    codec => json
  }
}
filter {
}
output {
  file {
    path => "C:\somedirectory"
  }
  elasticsearch {
    hosts => ["localhost:9200"]
    document_type => "_doc"
    index => "yourindexname"
  }
  stdout {
    codec => rubydebug
  }
}

DailyRollingFileAppender not creating daily log file

DailyRollingFileAppender is not creating the daily backup log file.
I am using the config below, which works on my local machine but does not work on the machine where my project is deployed.
log4j.rootLogger=DEBUG, Appender2
log4j.appender.Appender2=org.apache.log4j.DailyRollingFileAppender
log4j.appender.Appender2.File=C:/Logs/AppLog.log
log4j.appender.Appender2.DatePattern='.'dd-MM-yyyy
log4j.appender.Appender2.layout=org.apache.log4j.PatternLayout
log4j.appender.Appender2.layout.ConversionPattern=%-7p %d [%t] %c %x - %m%n
log4j.appender.Appender2.rootLogger = DEBUG
Framework: Spring MVC
I am not able to understand which part of the config is blocking DailyRollingFileAppender from creating date-wise logs on my server machine.
Edit:
I updated my file as per the suggestion, but it still does not create a new backup file at 12 AM the next day. It keeps writing to AppLog.log until midnight; then no backup file appears, all of the previous day's logs are gone, and it starts writing from the beginning.
This is the log4j properties file now:
log4j.rootLogger=DEBUG, Appender2
log4j.appender.Appender2=org.apache.log4j.DailyRollingFileAppender
log4j.appender.Appender2.File=${catalina.home}/Logs/AppLog.log
log4j.appender.Appender2.DatePattern='.'yyyy-MM-dd
log4j.appender.Appender2.layout=org.apache.log4j.PatternLayout
log4j.appender.Appender2.Append=false
log4j.appender.Appender2.layout.ConversionPattern=%-7p %d [%t] %c %x - %m%n
The issue is with the file path here:
log4j.appender.Appender2.File=C:/Logs/AppLog.log
Please make sure that this path exists on the server where you have deployed your project.
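If you cannot guarantee that the directory exists on every machine, a small startup check can create it before log4j opens the file. This is just a sketch using the path from the question; adjust it for the server:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class LogDirCheck {
    public static void main(String[] args) throws IOException {
        // Path from the question's log4j.properties; change it for your deployment.
        Path logDir = Paths.get("C:/Logs");
        if (Files.notExists(logDir)) {
            // Creates the directory (and any missing parents) so that
            // DailyRollingFileAppender can create C:/Logs/AppLog.log.
            Files.createDirectories(logDir);
        }
        System.out.println("Log directory ready: " + logDir.toAbsolutePath());
    }
}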
I faced this problem before; the cause turned out to be that I was using the wrong log4j dependency in pom.xml. The previous dependency was:
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
</dependency>
I use Spring Boot in my project, so I changed it to the following and it worked.
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
<exclusions>
<exclusion>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-logging</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-log4j</artifactId>
<version>1.3.8.RELEASE</version>
</dependency>
Your DatePattern should be '.'yyyy-MM-dd.
Refer to https://logging.apache.org/log4j/1.2/apidocs/org/apache/log4j/DailyRollingFileAppender.html
You can use the following to get a daily rolling log file:
########## Appender Daily Rolling
log4j.rootLogger=INFO, Daily
log4j.appender.Daily=org.apache.log4j.DailyRollingFileAppender
log4j.appender.Daily.Threshold=INFO
log4j.appender.Daily.File=D:/backup/RFLI1010.log
log4j.appender.Daily.DatePattern='.'yyyy-MM-dd
# Append to the end of the file or overwrites the file at start.
log4j.appender.Daily.Append=true
log4j.appender.Daily.MaxBackupIndex=20
log4j.appender.Daily.layout=org.apache.log4j.PatternLayout
log4j.appender.Daily.layout.ConversionPattern= [%5p] %d %r %t (%F:%M:%L)%m%n%n

Logback can't find logback.xml even though it exists (on the classpath)

I have an issue with Logback. I set it up (using Maven) and everything seems fine, except that Logback reports that it can't find the configuration file (I am still able to log to the console using the default configuration).
[#|2013-07-03T07:55:30.843+0200|INFO|glassfish3.1.2|javax.enterprise.system.std.com.sun.enterprise.server.logging|_ThreadID=124;_ThreadName=Thread-2;|07:54:39,844 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback.groovy]
07:54:39,844 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback-test.xml]
07:54:39,844 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback.xml]
07:54:39,847 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Setting up default configuration.
|#]
I put the configuration file (called logback.xml) into the src/main/resources folder of my Maven artifact (which is a WAR).
Interestingly, if I attempt to load the config from the classpath, I succeed:
Reader r = new InputStreamReader(getClass().getClassLoader().getResourceAsStream("logback.xml"));
StringWriter sw = new StringWriter();
char[] buffer = new char[1024];
for (int n; (n = r.read(buffer)) != -1; )
sw.write(buffer, 0, n);
String str = sw.toString();
System.out.println(str);
Which prints my sample configuration file:
[#|2013-07-03T07:55:30.844+0200|INFO|glassfish3.1.2|javax.enterprise.system.std.com.sun.enterprise.server.logging|_ThreadID=124;_ThreadName=Thread-2;|<configuration>
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<!-- encoders are assigned the type
ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->
<encoder>
<pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
</encoder>
</appender>
<root level="debug">
<appender-ref ref="STDOUT" />
</root> </configuration>|#]
My pom.xml has the following entries:
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
<version>1.0.13</version>
</dependency>
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-core</artifactId>
<version>1.0.13</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.5</version>
</dependency>
The artifact is packaged as a WAR file (inside an EAR file). The location of logback.xml inside the WAR is: WEB-INF/classes/logback.xml
Does anybody have an idea what's wrong with my setup?
Many thanks for your help
stupidSheep
The location within the WAR file is correct, WEB-INF/classes.
The Logback configuration documentation talks about where the logback.xml file can be located within a WAR, but it doesn't mention anything about an EAR.
Could you please try the information at this link? I am wondering if it needs to be packed into the EAR in a specific way.
Glassfish 3 + ear + logback.xml
(edit: second link removed, didn't work)
Logback invokes very similar code to the code in your example, i.e.
getClassLoader().getResourceAsStream("logback.xml");
If Logback cannot find logback.xml, then the resource must not be visible to the class loader that loaded the Logback classes. That class loader is most probably different from the class loader that loaded your test code, which can find logback.xml.
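One way to confirm this from inside the deployed application is to compare the two class loaders and ask each of them for the resource. A hedged sketch; run it from a class that is actually deployed in the container (the class used here is just a placeholder):

import ch.qos.logback.classic.LoggerContext;

public class LogbackVisibilityCheck {
    public static void main(String[] args) {
        // The class loader that loaded Logback itself; this is the one that
        // must be able to see logback.xml.
        ClassLoader logbackLoader = LoggerContext.class.getClassLoader();
        // The class loader of your own code (replace this class with one of
        // your application classes when running inside the container).
        ClassLoader appLoader = LogbackVisibilityCheck.class.getClassLoader();

        System.out.println("logback loaded by : " + logbackLoader);
        System.out.println("app code loaded by: " + appLoader);
        System.out.println("logback's loader sees logback.xml: "
                + (logbackLoader.getResource("logback.xml") != null));
        System.out.println("app's loader sees logback.xml: "
                + (appLoader.getResource("logback.xml") != null));
    }
}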
When you deliver the logging framework's configuration file within a WAR, everything works as expected and without any problems. But if you try this with an EAR, something odd happens: the logging framework can't find the configuration file, and it falls back to its default behavior.
I solved it doing the following:
Create a new folder directly under the EAR folder. For example, create
a new folder named "classes" --> MyEar/classes
Place your logback.xml file in this new folder:
MyEar/classes/logback.xml
In your WAR file's MANIFEST.MF file, add this new folder to the
classpath:
Manifest-Version: 1.0
Class-Path: classes
I had a similar problem where logback.xml was on the classpath but wasn't being included in the build. I had recently switched over to Gradle and was initially having issues with my resource files not being included in the build, even though I had specifically added src/main/resources to the sourceSets in build.gradle.
My solution at the time was to list the file types in the includes:
includes = ["**/*.css", "**/*.wav", "**/*.mp3", "**/*.mp4", "**/*.png"]
Some time passed and I noticed my logging configuration wasn't being applied. I spent a great deal of time tweaking the logging setup and looking into the problem before realizing that the file wasn't being included:
String URL = "logback.xml";
System.out.println(ClassLoader.getSystemResource(URL));
I remembered that I had to list the file types in the includes. I added the XML type and it worked:
sourceSets {
main {
resources {
srcDirs = ["src/main/java", "src/main/resources"]
includes = ["**/*.css", "**/*.wav", "**/*.mp3", "**/*.mp4", "**/*.png", "**/*.xml"]
}
}
}
