I want to create two types of logs: a debug log that captures everything, and an activity log that records how much time each method took to execute, or any other specific info.
I am using the log4j properties file below.
Please correct me; it is logging all messages to only one file. In the Java code I have instantiated both logger objects. Kindly don't just send pointers or tell me to Google it, because I have spent the last two days trying all the options described on Google.
Thanks in advance for your kind support.
log4j.rootLogger=debugLog,reportsLog
log4j.appender.debugLog=org.apache.log4j.FileAppender
log4j.appender.debugLog.File=logs/debug.log
log4j.appender.debugLog.layout=org.apache.log4j.PatternLayout
log4j.appender.debugLog.layout.ConversionPattern=%d [%24F:%t:%L] - %m%n
log4j.appender.reportsLog=org.apache.log4j.FileAppender
log4j.appender.reportsLog.File=logs/reports.log
log4j.appender.reportsLog.layout=org.apache.log4j.PatternLayout
log4j.appender.reportsLog.layout.ConversionPattern=%d [%24F:%t:%L] - %m%n
log4j.category.debugLogger=INFO, debugLog
log4j.additivity.debugLogger=false
log4j.category.reportsLogger=DEBUG, reportsLog
log4j.additivity.reportsLogger=false
The log4j.properties file below configures the logger so that messages with level DEBUG go to the logs/debug.log file. Messages with levels INFO through FATAL are logged to logs/reports.log.
log4j.rootLogger=DEBUG, debugLog, reportsLog
log4j.appender.debugLog=org.apache.log4j.FileAppender
log4j.appender.debugLog.File=logs/debug.log
log4j.appender.debugLog.layout=org.apache.log4j.PatternLayout
log4j.appender.debugLog.layout.ConversionPattern=%d [%24F:%t:%L] - %m%n
log4j.appender.debugLog.filter.f1=org.apache.log4j.varia.LevelRangeFilter
log4j.appender.debugLog.filter.f1.LevelMax=DEBUG
log4j.appender.debugLog.filter.f1.LevelMin=DEBUG
log4j.appender.reportsLog=org.apache.log4j.FileAppender
log4j.appender.reportsLog.File=logs/reports.log
log4j.appender.reportsLog.layout=org.apache.log4j.PatternLayout
log4j.appender.reportsLog.layout.ConversionPattern=%d [%24F:%t:%L] - %m%n
log4j.appender.reportsLog.filter.f1=org.apache.log4j.varia.LevelRangeFilter
log4j.appender.reportsLog.filter.f1.LevelMax=FATAL
log4j.appender.reportsLog.filter.f1.LevelMin=INFO
log4j.category.debugLogger=DEBUG, debugLog
log4j.additivity.debugLogger=false
log4j.category.reportsLogger=INFO, reportsLog
log4j.additivity.reportsLogger=false
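In the Java code, the two categories are then looked up by name; a minimal sketch (the class, method and messages are just placeholders for illustration):
import org.apache.log4j.Logger;

public class TimedService {
    private static final Logger debugLog = Logger.getLogger("debugLogger");
    private static final Logger reportsLog = Logger.getLogger("reportsLogger");

    public void doWork() {
        long start = System.currentTimeMillis();
        debugLog.debug("doWork() started");  // ends up in logs/debug.log
        // ... the actual work ...
        reportsLog.info("doWork() took " + (System.currentTimeMillis() - start) + " ms");  // ends up in logs/reports.log
    }
}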
I'm trying to transfer a file from one server to another using the SFTP protocol, so I'm using the sshj library for this. Before transferring the file I perform several activities such as zipping, and after transferring, unzipping. I'm logging these details to successLog, and any errors to failureLog.
Here are my log4j properties:
# Define the root logger
log4j.rootLogger = DEBUG, toConsole
# Define the console appender
log4j.appender.toConsole=org.apache.log4j.ConsoleAppender
log4j.appender.toConsole.layout=org.apache.log4j.PatternLayout
log4j.appender.toConsole.layout.ConversionPattern=%d{HH:mm:ss} %5p [%t] - %c.%M - %m%n
# Define the file appender
log4j.appender.success=org.apache.log4j.FileAppender
log4j.appender.success.File=logs/success.log
log4j.appender.success.Append=false
log4j.appender.success.layout=org.apache.log4j.PatternLayout
log4j.appender.success.layout.ConversionPattern=%d{HH:mm:ss} %5p [%t] - %c.%M - %m%n
# Define the file appender
log4j.appender.failure=org.apache.log4j.FileAppender
log4j.appender.failure.File=logs/failure.log
log4j.appender.failure.Append=false
log4j.appender.failure.layout=org.apache.log4j.PatternLayout
log4j.appender.failure.layout.ConversionPattern=%d{HH:mm:ss} %5p [%t] - %c.%M - %m%n
log4j.category.successLogger=DEBUG, success
log4j.additivity.successLogger=true
log4j.category.failureLogger=WARN, failure
log4j.additivity.failureLogger=false
And in the Java code I'm using them as:
static final Logger successLog = Logger.getLogger("successLogger");
static final Logger failureLog = Logger.getLogger("failureLogger");
and, for example: failureLog.error("exception occurred due to so-and-so reason", e);
Now I would like to store the sshj library's log in a file called sftp.log. How can I do that? So far I have been manually logging to the success and failure logs, but the sshj log prints to the console by default, and I want it written to a file.
Here's my example of writing the SFTP-session FINE-level log to file:
// From a class that keeps fileHandler as a field: private java.util.logging.FileHandler fileHandler;
// Requires: import java.util.logging.FileHandler, Level, Logger, SimpleFormatter
private void enableFineLogging() {
    try {
        // Rotating file handler: ~10 MB per file, up to 1000 files, append mode.
        fileHandler = new FileHandler("./logs/fine_sshj.log", 10000000, 1000, true);
        fileHandler.setLevel(Level.FINER);
        fileHandler.setFormatter(new SimpleFormatter());

        // sshj logs under the net.schmizz package hierarchy.
        final Logger app = Logger.getLogger("net.schmizz");
        app.setLevel(Level.FINER);
        app.addHandler(fileHandler);
        app.setUseParentHandlers(false);

        app.info("######################################################################## "
                + "START LOGGING NEW SSHJ SESSION "
                + "########################################################################");
    } catch (Exception e) {
        // Do something...
    }
}
Though I'm not sure if you are able to separate the log levels so that successLogger would contain only the success cases... I think there will also be WARNING and SEVERE level messages.
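Alternatively, if sshj is routed to log4j through SLF4J (for example via the slf4j-log4j12 binding, which is an assumption here), a dedicated logger for the net.schmizz packages can be declared directly in log4j.properties; a sketch:
log4j.logger.net.schmizz=DEBUG, sftpLog
log4j.additivity.net.schmizz=false
log4j.appender.sftpLog=org.apache.log4j.FileAppender
log4j.appender.sftpLog.File=logs/sftp.log
log4j.appender.sftpLog.layout=org.apache.log4j.PatternLayout
log4j.appender.sftpLog.layout.ConversionPattern=%d{HH:mm:ss} %5p [%t] - %c - %m%n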
I am quite new to Apache Kafka and log4j. I am trying to send my log messages to Kafka. Here is my log4j properties file:
log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L %% %m%n
log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
log4j.appender.KAFKA.BrokerList=localhost:9092
log4j.appender.KAFKA.Topic=kfkLogs
log4j.appender.KAFKA.SerializerClass=kafka.producer.DefaultStringEncoder
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L %% - %m%n
log4j.logger.logGen=DEBUG, KAFKA
However, I am not able to receive any messages in my consumer. I have tested the consumer with some other producer code and it works fine.
Also, I get this warning:
log4j:WARN No such property [serializerClass] in kafka.producer.KafkaLog4jAppender.
Edit
And here is the code that generates my log messages:
package logGen;

import org.apache.log4j.Logger;

public class TestLog4j {
    static Logger log = Logger.getLogger(TestLog4j.class.getName());

    public static void main(String[] args) {
        log.debug("Debug message");
        log.info("Info message");
        log.error("Error Message");
        log.fatal("Fatal Message");
        log.warn("Warn Message");
        log.trace("Trace Message");
    }
}
Also, if I write the log messages to a file by using something like
log4j.appender.KAFKA=org.apache.log4j.DailyRollingFileAppender
log4j.appender.KAFKA.DatePattern='.'yyyy-MM-dd-HH
log4j.appender.KAFKA.File=logs/server.log
I can see the log messages in the server.log file.
Thanks for the suggestions, everyone. I think the weird behavior I am seeing might be related to my Kafka setup. Here are the contents of the server.properties file I use to start my Kafka server. Can you see anything odd about it?
broker.id=0
port=9092
num.network.threads=3
num.io.threads=8
socket.send.buffer.bytes=102400
socket.receive.buffer.bytes=102400
socket.request.max.bytes=104857600
log.dirs=/Users/xyz/kafka/kafka-logs
num.partitions=1
num.recovery.threads.per.data.dir=1
log.retention.hours=168
log.segment.bytes=1073741824
log.retention.check.interval.ms=300000
log.cleaner.enable=false
zookeeper.connect=localhost:2181
zookeeper.connection.timeout.ms=6000
delete.topic.enable=true
I have taken a look at the source code of KafkaLog4jAppender.scala, and here are the valid and exhaustive properties for the Kafka log4j appender as of version 0.8.2.1: topic, brokerList, compressionType, requiredNumAcks, syncSend.
The log4j.properties that worked for me is below:
log4j.rootLogger=ERROR, stdout
log4j.logger.logGen=DEBUG, KAFKA
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L %% %m%n
log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
log4j.appender.KAFKA.topic=LOG
log4j.appender.KAFKA.brokerList=localhost:9092
log4j.appender.KAFKA.compressionType=none
log4j.appender.KAFKA.requiredNumAcks=0
log4j.appender.KAFKA.syncSend=true
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L %% - %m%n
You need to add KAFKA to your log4j.rootLogger like this:
log4j.rootLogger=INFO, stdout, KAFKA
This will add the KAFKA appender to your rootLogger.
I had to specify
log4j.appender.KAFKA.producer.type=async
log4j.logger.logGen.TestLog4j=TRACE, KAFKA
and that seems to work. However, I experience a delay of 10-30 seconds. Specifically, if I publish right now and can see the messages in the consumer, then the next time I publish has to be about 30 seconds later, otherwise I don't see anything in my consumer. Any ideas on why this might be happening? Maybe it's an Eclipse issue?
log4j.appender.KAFKA.SerializerClass=kafka.producer.DefaultStringEncoder
Could you try
log4j.appender.KAFKA.Serializer=kafka.producer.DefaultStringEncoder
instead?
I believe the async mode sends messages by batching, hence the delay. Have you tried sending in sync mode?
I am getting duplicate entries in my log file.
Have attached my log4j.properties below.
log4j.properties:
###############################################################################
# log4j Configuration file: Defines following loggers
# SL - Standard root Logger
# EL - Error Logger with the threshold level explicitly set to ERROR
# DL - Data base logger - to log db queries separately
# BL - Batch logger
###############################################################################
log4j.rootLogger=TRACE,SL,EL
log4j.rootLogger.additivity=false
#Standard Log
log4j.appender.SL=org.apache.log4j.DailyRollingFileAppender
log4j.appender.SL.File=${log.file}/log.log
log4j.appender.SL.layout=org.apache.log4j.PatternLayout
log4j.appender.SL.layout.ConversionPattern=[%5p] [%t %d{HH:mm:ss:SSS}] [%X{sessionId}:%X{hostId}:%X{userId}] (%F:%M:%L) %m%n
#Error Log
log4j.appender.EL=org.apache.log4j.DailyRollingFileAppender
log4j.appender.EL.File=${log.file}/error.log
log4j.appender.EL.layout=org.apache.log4j.PatternLayout
log4j.appender.EL.Threshold=ERROR
log4j.appender.EL.layout.ConversionPattern=[%5p] [%t %d{HH:mm:ss:SSS}] [%X{sessionId}:%X{hostId}:%X{userId}] (%F:%M:%L) %m%n
# Database Log
log4j.logger.org.springframework.jdbc=DEBUG,DL
log4j.appender.DL=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DL.File=${log.file}/db.log
log4j.appender.DL.layout=org.apache.log4j.PatternLayout
log4j.appender.DL.layout.ConversionPattern=[%5p] [%t %d{HH:mm:ss:SSS}] [%X{sessionId}:%X{hostId}:%X{userId}] (%F:%M:%L) %m%n
#Forecast Log
log4j.appender.MAPS_FC=org.apache.log4j.DailyRollingFileAppender
log4j.appender.MAPS_FC.File=${log.file}/forecast.log
log4j.appender.MAPS_FC.layout=org.apache.log4j.PatternLayout
log4j.appender.MAPS_FC.layout.ConversionPattern=[%5p] [%t %d{HH:mm:ss:SSS}] [%X{sessionId}:%X{hostId}:%X{userId}] (%F:%M:%L) %m%n
#Logger configuration
log4j.logger.com.singaporeair.maps=TRACE,SL,EL
log4j.logger.com.singaporeair.maps.app.service.impl.gantt=DEBUG,MAPS_FC
log4j.logger.com.singaporeair.maps.app.dao.impl.gantt=DEBUG,MAPS_FC
Getting duplicate entries in the log.log file configured above.
Log extract:
[ INFO] [SimpleAsyncTaskExecutor-9 19:04:00:800] [::] (AppProfiler.java:doProfile:69) Processing Time(ms): BaseDAOImpl: getBatchDetails: 63
[ INFO] [SimpleAsyncTaskExecutor-9 19:04:00:800] [::] (AppProfiler.java:doProfile:69) Processing Time(ms): BaseDAOImpl: getBatchDetails: 63
[ INFO] [SimpleAsyncTaskExecutor-9 19:04:00:800] [::] (AppProfiler.java:doProfile:71) BaseDAOImpl: getBatchDetails: OUT
[ INFO] [SimpleAsyncTaskExecutor-9 19:04:00:800] [::] (AppProfiler.java:doProfile:71) BaseDAOImpl: getBatchDetails: OUT
Please help.
If you turn off additivity, the loggers that are children of the parents won't cause double logging. For instance:
#Logger configuration
log4j.logger.com.singaporeair.maps=TRACE,SL,EL
log4j.additivity.com.singaporeair.maps=false
log4j.logger.com.singaporeair.maps.app.service.impl.gantt=DEBUG,MAPS_FC
log4j.additivity.com.singaporeair.maps.app.service.impl.gantt=false
log4j.logger.com.singaporeair.maps.app.dao.impl.gantt=DEBUG,MAPS_FC
log4j.additivity.com.singaporeair.maps.app.dao.impl.gantt=false
This will probably be helpful for those who experience the duplicate problem in a multithreaded application (I couldn't find the answer on Google):
It happens when one thread is done and another thread opens a logger to the same log file that the first thread used to write to.
Calling .removeAllAppenders() before adding a new appender resolved the issue for me.
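A minimal sketch of that programmatic fix (the logger name and file path are placeholders; assumes the log4j 1.x API):
import java.io.IOException;

import org.apache.log4j.FileAppender;
import org.apache.log4j.Logger;
import org.apache.log4j.PatternLayout;

public class WorkerLogging {
    static Logger configureWorkerLogger() throws IOException {
        Logger logger = Logger.getLogger("workerLogger");
        // Drop any appenders attached by an earlier thread before adding a new one.
        logger.removeAllAppenders();
        logger.addAppender(new FileAppender(
                new PatternLayout("%d %-5p [%t] %m%n"), "logs/worker.log", true));
        return logger;
    }
}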
com.singaporeair.maps is a parent of com.singaporeair.maps.app.service.impl.gantt and com.singaporeair.maps.app.dao.impl.gantt.
Everything that matches com.singaporeair.maps.app.dao.impl.gantt will also match com.singaporeair.maps, which results in 2 log entries.
Guess 1: You need to turn off appender inheritance. It appears that this is wrong.
Guess 2: The root logger and the com.singaporeair.maps are both logging to the SL and EL appenders. This is just a guess, but try changing this:
log4j.logger.com.singaporeair.maps=TRACE,SL,EL
to this:
log4j.logger.com.singaporeair.maps=TRACE
Does logging decrease application performance?
And how do I restrict display-tag logs from being printed in the log files?
E.g. my log file has the logs below:
[2012-06-20 15:52:06,290] org.displaytag.tags.TableTag isFirstIteration 684 - [data] first iteration=true (row number=1)
[2012-06-20 15:52:06,290] org.displaytag.tags.TableTag isFirstIteration 684 - [data] first iteration=true (row number=1)
[2012-06-20 15:52:06,290] org.displaytag.tags.TableTag isFirstIteration 684 - [data] first iteration=true (row number=1)
[2012-06-20 15:52:06,290] org.displaytag.tags.TableTag isFirstIteration 684 - [data] first iteration=true (row number=1)
Why is the above in the log file?
log.properties file
# Log4j configuration file.
log4j.rootCategory=DEBUG, A1
# Available levels are DEBUG, INFO, WARN, ERROR, FATAL
#
# A1 is a ConsoleAppender
#
log4j.appender.A1 = org.apache.log4j.RollingFileAppender
log4j.appender.A1.File = C:/LogInfo/logfile.log
log4j.appender.A1.MaxFileSize = 100MB
log4j.appender.A1.MaxBackupIndex=50
log4j.appender.A1.layout = org.apache.log4j.PatternLayout
log4j.appender.A1.append = true
log4j.appender.A1.layout.ConversionPattern = [%d] %C %M %L - %m%n
log4j.appender.A1.Threshold = DEBUG
How do I stop these kinds of logs (org.displaytag.tags.TableTag) from being printed in the log files?
Does logging decrease application performance?
Yes. How much it does depends on a number of factors; see below.
And how do I restrict display-tag logs from being printed in log files?
By changing the ConversionPattern in the logging properties
Why is the above in the log file?
Because:
somewhere in the code is a call to a Logger method (probably debug(String)) with that message, and
your logging properties set the logging Threshold to DEBUG for the appender.
To improve performance:
change the ConversionPattern to use less expensive date/time formatting, and (more importantly) avoid 'C', 'F', 'L' and 'M' because they are particularly expensive.
change the logging Threshold to INFO or WARNING or ERROR to reduce the amount of logging,
put the Logger.debug(...) call inside an if statement that checks that debug logging is enabled (see the sketch after this list). This saves the cost of assembling the log message in cases where it won't be needed; see In log4j, does checking isDebugEnabled before logging improve performance?.
with log4j version 2 (log4j2), there are overloads of the logging methods that take a format and parameters. These reduce the overhead when logging at a level that is disabled.
look also at logback and log4j 2.0.
You can also throttle logging at the Logger level ... as described in the log4j documentation. In fact, the documentation answers most of the questions that you asked, and has a lot of detail on the topics of logging performance and logging configuration.
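Here is the sketch referred to above for the isDebugEnabled() guard (the class and message are hypothetical):
import org.apache.log4j.Logger;

public class GuardedLogging {
    private static final Logger log = Logger.getLogger(GuardedLogging.class);

    void process(Object payload) {
        // Only build the (potentially expensive) message when DEBUG is actually enabled.
        if (log.isDebugEnabled()) {
            log.debug("Processing payload: " + payload);
        }
    }
}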
Short answer: yes, it decreases application performance as it uses some CPU cycles and other resources (memory, etc).
See also this question : log4j performance
Logging can be 30% of your CPU time or more. In terms of jitter, it is as large as (and occurs more often than) your GC delays.
A simple way to reduce overhead is to use the pattern to turn off reporting where each message is logged from. In your case that is %C, %M and %L, as the logger has to take a stack trace (of the entire stack) to get this information.
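For instance, a sketch against the A1 appender from the question, dropping the location conversions and keeping only the (cheap) logger name:
log4j.appender.A1.layout.ConversionPattern = [%d] %-5p %c - %m%n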
Yes, they do. That's why you should only log errors or things that absolutely must be logged. You can also log information helpful for debugging at the DEBUG level so it won't affect production performance.
How about:
log4j.category.org.displaytag.tags.TableTag=ERROR, A1
You can restrict junk logs like this: set the root logger to INFO so that unnecessary DEBUG logs won't fill up your log file.
log4j.rootCategory=INFO, A1
If you want specific class or package to give out DEBUG logs you can do it like this.
log4j.logger.org.hibernate.event.def.DefaultLoadEventListener=DEBUG,A1
The above will print DEBUG level logs from the class DefaultLoadEventListener in your log file along with other INFO level logs.
I'm trying to log certain INFO messages to a file, but as soon as I run the application both WARN and INFO messages are logged. Now, from what I've read on this site, you cannot log one without logging the other. Has anyone tried this before? If so, what did your properties file look like?
My properties file looks like this:
# ***** Set root logger level to INFO and its two appenders to stdout and R.
log4j.rootLogger=INFO, stdout, R
# ***** stdout is set to be a ConsoleAppender.
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
# ***** stdout uses PatternLayout.
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
# ***** Pattern to output the caller's file name and line number.
log4j.appender.stdout.layout.ConversionPattern=%5p [%M has started] (%F:%L) - %m%n
# ***** R is set to be a RollingFileAppender.
log4j.appender.R=org.apache.log4j.DailyRollingFileAppender
log4j.appender.R.DatePattern='.'yyyy-MM-dd-HH
log4j.appender.R.File="folder where log will be saved"
log4j.appender.R.layout.ConversionPattern=%5p [%m has started] %c{2}.[%x] (%F:%L) %d{yyyy-MM-dd HH:mm:ss} - %m%n
# ***** R uses PatternLayout.
log4j.appender.R.layout=org.apache.log4j.PatternLayout
log4j.appender.R.layout.ConversionPattern=%5p [%m has started] %c{2}.[%x] (%F:%L) %d{yyyy-MM-dd HH:mm:ss} - %m%n
AFAIK there's no standard way to suppress higher log levels than those you are interested in.
However, you might be able to use a custom appender to do that.
It might look similar to this:
import org.apache.log4j.AppenderSkeleton;
import org.apache.log4j.Level;
import org.apache.log4j.spi.LoggingEvent;

public class MyAppender extends AppenderSkeleton {
    protected void append(LoggingEvent event) {
        if (event.getLevel() == Level.INFO) {
            // append here, maybe call a nested appender
        }
    }
    public void close() { }
    public boolean requiresLayout() { return false; }
}
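To try it out, the appender can be attached programmatically; a minimal sketch (assumes the MyAppender class above with its append() body filled in):
import org.apache.log4j.Logger;

public class MyAppenderDemo {
    public static void main(String[] args) {
        Logger root = Logger.getRootLogger();
        root.addAppender(new MyAppender());
        root.info("passes the Level.INFO check inside MyAppender.append()");
        root.warn("reaches append() but is skipped by the Level.INFO check");
    }
}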
The log level WARN is higher than INFO, and the logging configuration defines the minimum threshold level to be logged by the appender. So, all messages higher than that level will also be logged.
Hence, the WARN messages are expected. And I don't think you can configure it the way you want.
If the WARN messages that should not be printed come from a different package than the INFO messages, then you can define different log levels for these packages: the first can specify level ERROR and the second, INFO.
It should look something like this:
log4j.logger.com.test.something=ERROR
log4j.logger.com.other.package=INFO
cheers
Why do you want to filter the WARN level? As peshkira said, you could use two different loggers for splitting/filtering your log output, or you could use tools like grep for filtering (offline), or you could simply remove the WARN logs from your code if you don't need them anyway.
As far as I understand, advanced filters like LevelMatchFilter and LevelRangeFilter can do the trick for you.
One thing worth keeping in mind, though, is that using these may require an XML config instead of properties: Can't set LevelRangeFilter for log4j