I am trying to use the Log4j API in my Java project.
To do that, I have created a properties file as shown in the image below (highlighted in yellow):
Project Structure Image
These are the properties I set in the file:
# TRACE < DEBUG < INFO < WARN < ERROR < FATAL
log4j.rootLogger=TRACE, DEBUG, INFO, file
# Console
# log4j.appender.toConsole=org.apache.log4j.ConsoleAppender
# log4j.appender.toConsole.layout=org.apache.log4j.PatternLayout
# log4j.appender.toConsole.layout.ConversionPatter=%d{HH:mm:ss} %5p [%t] - $c.%M - %m%n
# Redirecting log messages to console
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
I declared the LOGGER object in my class as follows:
import org.apache.log4j.Level;
import org.apache.log4j.Logger;
public class Credits {
    Logger LOGGER = null;

    public Credits(String sources) {
        LOGGER = Logger.getLogger(ChunkRecon.class.getName());
        LOGGER.setLevel(Level.DEBUG);
        String creditNames = "select tablename, creditNumbers, credit_type from schema.table where credit_type in ('" + sources + "')";
        LOGGER.debug("This is a debug message");
        System.out.println("This message is from println");
    }
}
In the output, I see the message from println ("This message is from println") but not the debug message.
Could anyone tell me what mistake I am making here?
Try renaming your properties file to log4j.properties and make sure it is on the classpath. There is also a problem with your rootLogger declaration; see the explanation here.
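As a sketch of the classpath check, you can ask the class loader directly whether it can see the file; Log4j 1.x resolves log4j.properties as a classpath resource in the same way (the class name below is mine, for illustration only):

```java
// Sketch: verify that a config file is visible as a classpath resource.
// Log4j 1.x looks up "log4j.properties" this way, so a null result means
// the file is misnamed or not on the classpath.
public class Log4jConfigCheck {
    static java.net.URL findConfig(String name) {
        return Log4jConfigCheck.class.getClassLoader().getResource(name);
    }

    public static void main(String[] args) {
        java.net.URL config = findConfig("log4j.properties");
        System.out.println(config == null
                ? "log4j.properties NOT found on classpath"
                : "Found config at: " + config);
    }
}
```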
Also, the Logger is usually declared as a static variable. For example:
public class Credits {
    private static final Logger logger = Logger.getLogger(Credits.class);

    public Credits(String sources) {
        logger.setLevel(Level.DEBUG);
        logger.debug("This is a debug message");
        System.out.println("This message is from println");
    }
}
log4j.properties
log4j.rootLogger=debug, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
Your log4j.rootLogger declaration seems incorrect. Try:
log4j.rootLogger=TRACE,stdout,file
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.file=org.apache.log4j.RollingFileAppender
If you only want the logs on the console, remove the file logging entirely.
log4j.rootLogger=TRACE,stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
Refer to the Configuring Loggers section of the Javadoc.
The syntax for configuring the root logger is:
log4j.rootLogger=[level], appenderName, appenderName, ...
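As a sketch of that syntax, the value splits into a level followed by appender names. This is illustrative parsing only; log4j's real parsing lives in PropertyConfigurator, and the class name here is mine:

```java
import java.util.Arrays;
import java.util.List;

// Sketch: the structure of a "log4j.rootLogger" value -- the first
// comma-separated token is the level, the rest are appender names.
public class RootLoggerValue {
    static String level(String value) {
        return value.split("\\s*,\\s*")[0].trim();
    }

    static List<String> appenders(String value) {
        String[] parts = value.split("\\s*,\\s*");
        return Arrays.asList(parts).subList(1, parts.length);
    }

    public static void main(String[] args) {
        String v = "TRACE, stdout, file";
        System.out.println(level(v));      // TRACE
        System.out.println(appenders(v));  // [stdout, file]
    }
}
```

This is also why `log4j.rootLogger=TRACE, DEBUG, INFO, file` fails: only the first token is treated as a level, so DEBUG and INFO are looked up as appender names that do not exist.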
Related
I'm trying to transfer a file from one server to another using the SFTP protocol, and I'm using the sshj library for this. Before transferring the file I perform several activities, such as zipping, and after transferring, unzipping. I log these details to a successLog and, on any error, to a failureLog.
Here is my log4j.properties:
# Define the root logger
log4j.rootLogger = DEBUG, toConsole
# Define the console appender
log4j.appender.toConsole=org.apache.log4j.ConsoleAppender
log4j.appender.toConsole.layout=org.apache.log4j.PatternLayout
log4j.appender.toConsole.layout.ConversionPattern=%d{HH:mm:ss} %5p [%t] - %c.%M - %m%n
# Define the file appender
log4j.appender.success=org.apache.log4j.FileAppender
log4j.appender.success.File=logs/success.log
log4j.appender.success.Append=false
log4j.appender.success.layout=org.apache.log4j.PatternLayout
log4j.appender.success.layout.conversionPattern=%d{HH:mm:ss} %5p [%t] - %c.%M - %m%n
# Define the file appender
log4j.appender.failure=org.apache.log4j.FileAppender
log4j.appender.failure.File=logs/failure.log
log4j.appender.failure.Append=false
log4j.appender.failure.layout=org.apache.log4j.PatternLayout
log4j.appender.failure.layout.conversionPattern=%d{HH:mm:ss} %5p [%t] - %c.%M - %m%n
log4j.category.successLogger=DEBUG, success
log4j.additivity.successLogger=true
log4j.category.failureLogger=WARN, failure
log4j.additivity.failureLogger=false
In my Java code I use them like this:
static final Logger successLog = Logger.getLogger("successLogger");
static final Logger failureLog = Logger.getLogger("failureLogger");
For example: failureLog.error("exception occurred due to such and such reason", e);
Now I would like to store the sshj library's log in a file called sftp.log. How can I do that? So far I have been logging manually to the success and failure logs, but the sshj log prints to the console by default, and I want to write it to a file.
Here's my example of writing the SFTP session's FINE-level log to a file:
private void enableFineLogging() {
    try {
        // java.util.logging FileHandler(pattern, limit, count, append)
        fileHandler = new FileHandler("./logs/fine_sshj.log", 10000000, 1000, true);
        fileHandler.setLevel(Level.FINER);
        fileHandler.setFormatter(new SimpleFormatter());
        // "net.schmizz" is the package sshj logs under
        final Logger app = Logger.getLogger("net.schmizz");
        app.setLevel(Level.FINER);
        app.addHandler(fileHandler);
        app.setUseParentHandlers(false);
        app.info("######################################################################## "
                + "START LOGGING NEW SSHJ SESSION "
                + "########################################################################");
    } catch (Exception e) {
        // Do something...
    }
}
Though I'm not sure whether you can separate the log levels so that successLogger gets only the success cases... I think there will also be WARNING- and SEVERE-level messages.
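If you do want to keep WARNING/SEVERE records out of one handler, java.util.logging supports attaching a Filter to a handler. A minimal sketch, with a class name of my own choosing:

```java
import java.util.logging.Filter;
import java.util.logging.Level;
import java.util.logging.LogRecord;

// Sketch: a JUL Filter that lets a handler accept only records inside a
// level range, e.g. to keep WARNING/SEVERE out of a "success" log file.
public class RangeFilter implements Filter {
    private final Level min, max;

    public RangeFilter(Level min, Level max) {
        this.min = min;
        this.max = max;
    }

    @Override
    public boolean isLoggable(LogRecord record) {
        int v = record.getLevel().intValue();
        return v >= min.intValue() && v <= max.intValue();
    }

    public static void main(String[] args) {
        Filter f = new RangeFilter(Level.FINE, Level.INFO);
        System.out.println(f.isLoggable(new LogRecord(Level.INFO, "ok")));      // true
        System.out.println(f.isLoggable(new LogRecord(Level.WARNING, "warn"))); // false
    }
}
```

You would attach it with something like `fileHandler.setFilter(new RangeFilter(Level.FINE, Level.INFO))` before adding the handler to the logger.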
I am quite new to Apache Kafka and log4j. I am trying to send my log messages to Kafka. Here is my log4j properties file:
log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L %% %m%n
log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
log4j.appender.KAFKA.BrokerList=localhost:9092
log4j.appender.KAFKA.Topic=kfkLogs
log4j.appender.KAFKA.SerializerClass=kafka.producer.DefaultStringEncoder
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L %% - %m%n
log4j.logger.logGen=DEBUG, KAFKA
However, I am not able to receive any messages in my consumer. I have tested the consumer with some other producer code, and it works fine.
Also, I get this warning:
log4j:WARN No such property [serializerClass] in kafka.producer.KafkaLog4jAppender.
Edit
And here is the code that generates my log messages:
package logGen;

import org.apache.log4j.Logger;

public class TestLog4j {
    static Logger log = Logger.getLogger(TestLog4j.class.getName());

    public static void main(String[] args) {
        log.debug("Debug message");
        log.info("Info message");
        log.error("Error Message");
        log.fatal("Fatal Message");
        log.warn("Warn Message");
        log.trace("Trace Message");
    }
}
Also, if I write the log messages to a file by using something like
log4j.appender.KAFKA=org.apache.log4j.DailyRollingFileAppender
log4j.appender.KAFKA.DatePattern='.'yyyy-MM-dd-HH
log4j.appender.KAFKA.File=logs/server.log
I can see the log messages in the server.log file.
Thanks for the suggestions, everyone. I think the weird behavior I'm seeing might be related to my Kafka setup. Here are the contents of the server.properties file I use to start my Kafka server. Can you see anything odd about it?
broker.id=0
port=9092
num.network.threads=3
num.io.threads=8
socket.send.buffer.bytes=102400
socket.receive.buffer.bytes=102400
socket.request.max.bytes=104857600
log.dirs=/Users/xyz/kafka/kafka-logs
num.partitions=1
num.recovery.threads.per.data.dir=1
log.retention.hours=168
log.segment.bytes=1073741824
log.retention.check.interval.ms=300000
log.cleaner.enable=false
zookeeper.connect=localhost:2181
zookeeper.connection.timeout.ms=6000
delete.topic.enable=true
I took a look at the source code of KafkaLog4jAppender.scala; here are the valid and exhaustive properties for the Kafka log4j appender as of version 0.8.2.1: topic, brokerList, compressionType, requiredNumAcks, and syncSend.
The log4j.properties that worked for me is below:
log4j.rootLogger=ERROR, stdout
log4j.logger.logGen=DEBUG, KAFKA
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L %% %m%n
log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
log4j.appender.KAFKA.topic=LOG
log4j.appender.KAFKA.brokerList=localhost:9092
log4j.appender.KAFKA.compressionType=none
log4j.appender.KAFKA.requiredNumAcks=0
log4j.appender.KAFKA.syncSend=true
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L %% - %m%n
You need to add KAFKA to your log4j.rootLogger like this:
log4j.rootLogger=INFO, stdout, KAFKA
This will add the KAFKA appender to your rootLogger.
I had to specify
log4j.appender.KAFKA.producer.type=async
log4j.logger.logGen.TestLog4j=TRACE, KAFKA
and that seems to work. However, I experience a delay of 10-30 seconds: if I publish right now and can see the messages in the consumer, then the next publish has to be about 30 seconds later, or I don't see anything in my consumer. Any ideas why this might be happening? Maybe it's an Eclipse issue?
log4j.appender.KAFKA.SerializerClass=kafka.producer.DefaultStringEncoder
Could you try
log4j.appender.KAFKA.Serializer=kafka.producer.DefaultStringEncoder
instead?
I believe the async mode sends messages by batching, hence the delay. Have you tried sending in sync mode?
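The batching delay can be sketched in plain Java. This toy model (all names are mine, not Kafka's API) shows why an async, batch-based producer holds messages back until a batch fills or a flush happens:

```java
import java.util.ArrayList;
import java.util.List;

// Toy model of async batching: messages sit in a buffer and are only
// "delivered" once the batch fills (the real Kafka producer also flushes
// on a timer), which the consumer perceives as a delay.
public class BatchingSketch {
    private final int batchSize;
    private final List<String> buffer = new ArrayList<>();
    private final List<String> delivered = new ArrayList<>();

    BatchingSketch(int batchSize) {
        this.batchSize = batchSize;
    }

    void send(String msg) {
        buffer.add(msg);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    void flush() {
        delivered.addAll(buffer);
        buffer.clear();
    }

    List<String> delivered() {
        return delivered;
    }

    public static void main(String[] args) {
        BatchingSketch p = new BatchingSketch(3);
        p.send("a");
        p.send("b");
        // Nothing reaches the consumer yet -- this is the perceived delay.
        System.out.println(p.delivered());  // []
        p.send("c");
        System.out.println(p.delivered());  // [a, b, c]
    }
}
```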
The API I am working on cannot be connected to a database, but it needs to log the events happening in the API. To do this, I was thinking of using log4j to create log files with the API event information.
The problem is that all log entries end up in both logs instead of being separated.
Requirements I need to fulfill:
Multiple log files with certain information inside
Backup log files live indefinitely
Log4j properties file
log4j.rootLogger=QuietAppender, LoudAppender, FirstLog, SecondLog, TRACE
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - <%m>%n
# setup A1
log4j.appender.QuietAppender=org.apache.log4j.RollingFileAppender
log4j.appender.QuietAppender.Threshold=INFO
log4j.appender.QuietAppender.File=${wls.logs-path}/test-api/test-api-info.log
log4j.appender.QuietAppender.MaxFileSize=512KB
log4j.appender.QuietAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.QuietAppender.layout.ConversionPattern=%d %p [%c] - %m%n
# Keep three backup files.
log4j.appender.QuietAppender.MaxBackupIndex=100
# Pattern to output: date priority [category] - message
log4j.appender.QuietAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.QuietAppender.layout.ConversionPattern=%d %p [%c] - %m%n
# setup A2
log4j.appender.LoudAppender=org.apache.log4j.RollingFileAppender
log4j.appender.LoudAppender.Threshold=DEBUG
log4j.appender.LoudAppender.File=${wls.logs-path}/test-api/test-api-debug.log
log4j.appender.LoudAppender.MaxFileSize=512KB
# Keep three backup files.
log4j.appender.LoudAppender.MaxBackupIndex=3
# Pattern to output: date priority [category] - message
log4j.appender.LoudAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.LoudAppender.layout.ConversionPattern=%d %p [%c] - %m%n
# setup FirstLog
log4j.appender.FirstLog=org.apache.log4j.RollingFileAppender
log4j.appender.FirstLog.Threshold=INFO
log4j.appender.FirstLog.File=${wls.logs-path}/test-api/first-info.log
log4j.appender.FirstLog.MaxFileSize=10240kB
log4j.appender.FirstLog.MaxBackupIndex=99999
log4j.appender.FirstLog.layout=org.apache.log4j.PatternLayout
log4j.appender.FirstLog.layout.ConversionPattern=%d %p [%c] - %m%n
# setup SecondLog
log4j.appender.SecondLog=org.apache.log4j.RollingFileAppender
log4j.appender.SecondLog.Threshold=INFO
log4j.appender.SecondLog.File=${wls.logs-path}/test-api/second-info.log
log4j.appender.SecondLog.MaxFileSize=10240kB
log4j.appender.SecondLog.MaxBackupIndex=99999
log4j.appender.SecondLog.layout=org.apache.log4j.PatternLayout
log4j.appender.SecondLog.layout.ConversionPattern=%d %p [%c] - %m%n
Java Class
private static final Logger logkp = Logger.getLogger("FirstLog");
private static final Logger logda = Logger.getLogger("SecondLog");
logkp.info(sb.toString());
logda.info(sb.toString());
Current Results
2015-05-27 10:27:46,175 INFO [SecondLog] - 12645,APIServer1,0,000bdc5000000100011055042d0114a6
2015-05-27 10:27:46,583 INFO [FirstLog] - APIServer1,Caller,test-Version-1.0,certValue,1
2015-05-27 10:28:22,458 INFO [SecondLog] - 12645,APIServer1,0,000bdc5000000100011055042d0114a6
2015-05-27 10:28:22,793 INFO [FirstLog] - APIServer1,Caller,test-Version-1.0,certValue,1
2015-05-27 10:28:25,203 INFO [SecondLog] - 12645,APIServer1,0,000bdc5000000100011055042d0114a6
2015-05-27 10:28:25,528 INFO [FirstLog] - APIServer1,Caller,test-Version-1.0,certValue,1
2015-05-27 10:28:26,686 INFO [SecondLog] - 12645,APIServer1,0,000bdc5000000100011055042d0114a6
I'm not 100% sure about this, because it's been a while since I used log4j and we used to write the configuration in XML, but I think you have to create the loggers like this:
log4j.rootLogger=TRACE, QuietAppender, LoudAppender
log4j.logger.FirstLogger=INFO, FirstLog
log4j.additivity.FirstLogger=false
log4j.logger.SecondLogger=INFO, SecondLog
log4j.additivity.SecondLogger=false
# ... then configure the appenders as you did
Note that in a logger declaration the level comes before the appender names. This should give the outputs you want. Setting additivity to false "cuts" the connection to the rootLogger: additivity is true by default, which causes every log message to go both to the logger's own appenders and to those of all its ancestors; setting it to false stops that.
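The additivity behavior can be sketched as a toy model in plain Java (all names are illustrative; this is not log4j's real implementation):

```java
import java.util.ArrayList;
import java.util.List;

// Toy model of log4j additivity: a logger records a message itself and,
// when additivity is true, also forwards it up to its parent (ultimately
// the root logger and its appenders).
public class AdditivitySketch {
    AdditivitySketch parent;
    boolean additivity = true;
    final List<String> received = new ArrayList<>();

    void log(String msg) {
        received.add(msg);
        if (additivity && parent != null) {
            parent.log(msg);  // message also reaches the ancestors' appenders
        }
    }

    public static void main(String[] args) {
        AdditivitySketch root = new AdditivitySketch();
        AdditivitySketch first = new AdditivitySketch();
        first.parent = root;
        first.additivity = false;  // like log4j.additivity.FirstLogger=false
        first.log("only in first-info.log");
        System.out.println(root.received);   // []
        System.out.println(first.received);  // [only in first-info.log]
    }
}
```

With additivity left at its default of true, the same message would appear in both lists, which matches the duplicated entries in the question.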
If your API has its own namespace - let's say "my.own.API" then you could also create an API-Logger like this:
log4j.logger.my.own.API=INFO, MyAPIAppender
log4j.additivity.my.own.API=false
And create loggers like this:
package my.own.API;

public class MyAPIClass {
    private static Logger apiLog = Logger.getLogger(MyAPIClass.class);
    // ...
}
Can you explain how to configure log4j.properties so that only the levels TRACE, DEBUG, and INFO are saved to the file? The configuration below saves levels from INFO up to FATAL to the file:
log4j.rootLogger=INFO, file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=log4j-HigherWarnSaveInFile.log
log4j.appender.file.MaxFileSize=1MB
log4j.appender.file.MaxBackupIndex=1
log4j.appender.file.Append=true
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{2}:%L - %m%n
Java file
LOGGER.trace("Trace Message!");
LOGGER.debug("Debug Message!");
LOGGER.info("Info Message!");
LOGGER.warn("Warn Message!");
LOGGER.error("Error Message!");
LOGGER.fatal("Fatal Message!");
Log4j works with a threshold mechanism: you define a threshold, and all messages logged at or above that level are logged.
So if you set the threshold to INFO, you'll log INFO, WARN, ERROR, and FATAL, while TRACE and DEBUG are filtered out. This is not what you need.
But you can set the root logger's level to TRACE or ALL and then use a LevelRangeFilter on the appender to filter out the higher-level messages:
log4j.rootLogger=ALL, file
[...]
log4j.appender.file.filter=org.apache.log4j.varia.LevelRangeFilter
log4j.appender.file.filter.LevelMin=TRACE
log4j.appender.file.filter.LevelMax=INFO
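The difference between a threshold and a level range can be sketched in plain Java. The numeric values mirror log4j 1.x's Level constants, but the class and method names are mine:

```java
// Sketch contrasting a plain threshold with a level range: a threshold
// passes everything at or above a level, while log4j's LevelRangeFilter
// keeps a message only when LevelMin <= level <= LevelMax.
public class LevelRangeSketch {
    // Numeric values as in log4j 1.x's Level class.
    static final int TRACE = 5000, DEBUG = 10000, INFO = 20000,
                     WARN = 30000, ERROR = 40000, FATAL = 50000;

    // Threshold semantics: everything at or above the level passes.
    static boolean passesThreshold(int level, int threshold) {
        return level >= threshold;
    }

    // LevelRangeFilter semantics: only levels inside [min, max] pass.
    static boolean passesRange(int level, int min, int max) {
        return level >= min && level <= max;
    }

    public static void main(String[] args) {
        // A threshold of INFO lets WARN through...
        System.out.println(passesThreshold(WARN, INFO));     // true
        // ...but a TRACE..INFO range filters it out,
        // while still keeping DEBUG.
        System.out.println(passesRange(WARN, TRACE, INFO));  // false
        System.out.println(passesRange(DEBUG, TRACE, INFO)); // true
    }
}
```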
I am trying to use KafkaLog4jAppender to write to Kafka and also a file appender to write to a file. The conversion pattern works for the file appender, but not for the Kafka appender.
The output of the file appender is: 2014-09-19T22:30:14.781Z INFO com.test.poc.StartProgram Message1
But the Kafka appender's output is just: Message1
Find below my log4j.properties file:
log4j.rootCategory=INFO
log4j.appender.file=org.apache.log4j.DailyRollingFileAppender
log4j.appender.file.DatePattern='.'yyyy-MM-dd-HH
log4j.appender.file.File=logs/test.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd'T'HH:mm:ss.SSS'Z'}{UTC} %p %C %m%n
log4j.logger.com=INFO,file,KAFKA
#Kafka Appender
log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%d{yyyy-MM-dd'T'HH:mm:ss.SSS'Z'}{UTC} %p %C %m%n
log4j.appender.KAFKA.ProducerType=async
log4j.appender.KAFKA.BrokerList=localhost:9092
log4j.appender.KAFKA.Topic=test
log4j.appender.KAFKA.Serializer=kafka.test.AppenderStringSerializer
This is a known Kafka bug; the fix is in Kafka 0.8.2:
https://issues.apache.org/jira/browse/KAFKA-847