java.util.logging assign logger to a specific package - java

I implemented two custom handlers (DBHandler and MyFileHandler) to log information to a database and to an additional flat file. These new log handlers will be used by a single class in a specific package.
I attached the two new handlers to that specific package only.
The idea is to switch between these two handlers (file and database) for the classes contained in that package, but with the current configuration I cannot do that: either I am logging with both handlers, or there is no log at all.
I tried to set the log level for the DB handler to OFF, but it still logs normally to the DB.
Below is the logging.properties configuration file I use:
############################################################
##### Global properties
############################################################
handlers= java.util.logging.FileHandler, java.util.logging.ConsoleHandler, com.test.logging.DBHandler, com.test.logging.MyFileHandler
.level = INFO
############################################################
# Handler specific properties.
# Describes specific configuration info for Handlers.
############################################################
java.util.logging.FileHandler.level = ALL
java.util.logging.FileHandler.pattern = %t/CLog%g.log
java.util.logging.FileHandler.limit = 50000
java.util.logging.FileHandler.count = 1
java.util.logging.FileHandler.formatter = java.util.logging.SimpleFormatter
java.util.logging.ConsoleHandler.level = ALL
java.util.logging.ConsoleHandler.formatter = java.util.logging.SimpleFormatter
com.test.logging.MyFileHandler.level = ALL
com.test.logging.MyFileHandler.pattern = %t/custLog%g.log
com.test.logging.MyFileHandler.limit = 50000
com.test.logging.MyFileHandler.count = 1
com.test.logging.MyFileHandler.formatter = java.util.logging.SimpleFormatter
com.test.logging.DBHandler.level=OFF
com.test.ccb.mon.handlers=com.test.logging.DBHandler, com.test.logging.MyFileHandler
The class using the logger to track the information is below:
package com.test.ccb.mon;

import java.util.logging.Logger;

public class Utils {
    public static final Logger logger = Logger.getLogger(Utils.class.getCanonicalName());

    public void logging() {
        //processing
        logger.info("message");
    }
}
DBHandler class:
import java.util.logging.Handler;
import java.util.logging.LogRecord;

public class DBHandler extends Handler {
    @Override
    public void close() throws SecurityException {
    }

    @Override
    public void flush() {
    }

    @Override
    public void publish(LogRecord logRecord) {
        if (isLoggable(logRecord)) {
            try {
                //SQL call to insert on DB
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
}
MyFileHandler class:
import java.io.IOException;
import java.util.logging.FileHandler;
import java.util.logging.LogRecord;

public class MyFileHandler extends FileHandler {
    public MyFileHandler() throws IOException, SecurityException {
        super();
    }

    @Override
    public void close() throws SecurityException {
        super.close();
    }

    @Override
    public void flush() {
        super.flush();
    }

    @Override
    public void publish(LogRecord record) {
        super.publish(record);
    }
}

The Handler class doesn't read any properties from the LogManager by default. You have to code that logic in all of your subclasses.
import java.util.logging.*;

public class DBHandler extends Handler {
    public DBHandler() {
        LogManager m = LogManager.getLogManager();
        String p = getClass().getName();
        String v = m.getProperty(p + ".level");
        try {
            if (v != null) {
                super.setLevel(Level.parse(v));
            }
        } catch (RuntimeException re) {
            reportError(v, re, ErrorManager.OPEN_FAILURE);
        }
        //TODO: create code to parse filter, formatter, encoding, etc.
    }

    @Override
    public void close() throws SecurityException {
    }

    @Override
    public void flush() {
    }

    @Override
    public void publish(LogRecord logRecord) {
        if (isLoggable(logRecord)) {
            try {
                //SQL call to insert on DB
            } catch (Exception e) {
                reportError("", e, ErrorManager.WRITE_FAILURE);
            }
        }
    }
}

Reproducing your problem is not so easy for me. With handler classes similar to yours, changes to the configuration file have the expected effect. With the DBHandler.level=OFF setting, the database handler output is missing for me:
Aug 11, 2015 1:47:26 PM com.test.ccb.mon.Utils logging
DBHandler.publish - handler level: OFF; log record level: INFO
INFO: message
MyFileHandler - message
Logging handlers:
###java.util.logging.FileHandler-ALL
###java.util.logging.ConsoleHandler-ALL
###com.test.logging.DBHandler-OFF
###com.test.logging.MyFileHandler-ALL
Your debug code to print the logging handlers is now also included in the following main method for your Utils class. You could run this method yourself to see whether this way of reading the configuration file works better for you:
public static void main(final String[] arguments) throws IOException {
    final String fileName = "logging.properties";
    final InputStream propertiesStream = Utils.class.getResourceAsStream(fileName);
    //final InputStream propertiesStream = new FileInputStream("path to file");
    LogManager.getLogManager().readConfiguration(propertiesStream);

    new Utils().logging();
    System.out.println();

    // No handlers for this logger directly, but four for its parent.
    System.out.println("Logging handlers:");
    for (final Handler handler : logger.getParent().getHandlers()) {
        System.out.println("###" + handler.getClass().getName()
                + "-" + handler.getLevel());
    }
}
A very simple version of your DBHandler class could look like this (please note the if (isLoggable(record)) check in the publish method):
package com.test.logging;
import java.util.logging.*;
/**
* Logging handler that stores logging in the database.
*/
public class DBHandler extends Handler {
    @Override
    public void publish(final LogRecord record) {
        System.out.println("DBHandler.publish - handler level: " + getLevel()
                + "; log record level: " + record.getLevel());
        if (isLoggable(record)) {
            System.out.println(getClass().getSimpleName() + " - " + record.getMessage());
        }
    }

    @Override
    public void flush() {
        // Empty.
    }

    @Override
    public void close() throws SecurityException {
        // Empty.
    }
}
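To switch between the two handlers at runtime rather than by editing the configuration file, one option is to flip the handler levels on the package logger. This is a sketch; HandlerSwitch and enableOnly are hypothetical names, not part of the original code:

```java
import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.Logger;

public class HandlerSwitch {
    // Enable only handlers of the given class on this logger; silence the rest.
    static void enableOnly(Logger pkgLogger, Class<? extends Handler> active) {
        for (Handler h : pkgLogger.getHandlers()) {
            h.setLevel(active.isInstance(h) ? Level.ALL : Level.OFF);
        }
    }
}
```

For example, HandlerSwitch.enableOnly(Logger.getLogger("com.test.ccb.mon"), MyFileHandler.class) would route records only to the custom file handler, leaving the DB handler silenced.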

Related

Logback custom appender to push logs to ElasticSearch via Open Telemetry

I am using Apache NiFi, which uses the logback framework for logging.
I have written a custom log appender to push logs to MongoDB.
Now I have a requirement to push logs in a specific format (JSON) to ElasticSearch.
As a few other apps already use OpenTelemetry to push data to ES, I have been asked to do the same.
So now I am looking for the right way to push data from my custom log appender (a Java class) to OpenTelemetry.
Could someone please point me to a useful example I can refer to?
The Mongo appender looks like below; I want to push the same message to OpenTelemetry as well.
@Override
public void start() {
    super.start();
    logger.info("Initialising MongoDBLogAppender ~~~~~~~~~~~~~~~~");
    try {
        createConnection();
    } catch (Exception e) {
        logger.error("Failed to obtain a database instance from the MongoClient at server [{}] and port [{}].", this.server, this.port);
    }
}

private MongoClient createConnection() throws Exception {
    if (client == null) {
        client = createMongoClient(this.server, this.port, this.databaseName, this.userName, this.password);
    }
    if (this.databaseName != null && !"".equals(this.databaseName)) {
        database = client.getDatabase(this.databaseName);
    } else {
        logger.error("Mongo database name is required.");
    }
    return client;
}

@Override
protected void append(ILoggingEvent iLoggingEvent) {
    if (database == null) {
        return;
    }
    String logMessage = iLoggingEvent.getMessage() == null ? "" : iLoggingEvent.getMessage();
    logger.info("LogMessage (inside append method) : " + logMessage); // Change the level to debug after testing
    String[] msgParts = parseLogMessage(logMessage);
    Document doc = new Document()
            .append("Timestamp", new Date(iLoggingEvent.getTimeStamp()))
            .append("Ip", this.hostIp)
            .append("Server", "NiFi")
            .append("Instance", "")
            .append("Url", "")
            .append("TTId", msgParts[1])
            .append("LTId", msgParts[0])
            .append("LUId", msgParts[2])
            .append("SId", msgParts[4])
            .append("RId", msgParts[3])
            .append("Level", iLoggingEvent.getLevel().levelStr)
            .append("Logger", iLoggingEvent.getLoggerName())
            .append("Thread", iLoggingEvent.getThreadName())
            .append("Message", msgParts[5])
            .append("Exception", iLoggingEvent.getThrowableProxy() != null ? ThrowableProxyUtil.asString(iLoggingEvent.getThrowableProxy()) : null);
    try {
        Publisher<InsertOneResult> publisher = database.getCollection(collectionName).insertOne(doc);
        publisher.subscribe(new Subscriber<InsertOneResult>() {
            @Override
            public void onSubscribe(final Subscription s) {
                s.request(1); // <--- Data requested and the insertion will now occur
            }

            @Override
            public void onNext(final InsertOneResult result) {}

            @Override
            public void onError(final Throwable t) {
                logger.error("Failed to insert Nifi log to mongodb : " + this.toString(), t);
            }

            @Override
            public void onComplete() {}
        });
    } catch (Exception e) {
        logger.error("Encountered exception while logging Nifi log to MongoDB : " + this.toString(), e);
    }
}
/* Log message format expected : "~(<LoggedinTenantId>, <TargetTenantId>, <Userid>) ~ <RequestId> ~ <SessionId> ~ <DetailedMessage>" */
public static String[] parseLogMessage(String logMessage){
String[] msgParts = new String[6];
..................................
return msgParts;
}
private synchronized static MongoClient createMongoClient(String server, int port, String databaseName, String userName, String password)
{
...................................
return MongoClients.create(settings);
}
}
Thanks
Mahendra

How to dynamically configure log directory (every request) in java util logging with helidon

I want to configure a separate log directory for every request. Is this possible with Helidon?
The Oracle Helidon JUL examples are located on GitHub. Building off of those examples, you would have to create a custom Handler that reads the request id from the HelidonMdc:
import io.helidon.logging.common.HelidonMdc;
import io.helidon.logging.jul.HelidonFormatter;

import java.io.IOException;
import java.util.Optional;
import java.util.logging.*;

public class RequestFileHandler extends Handler {
    public RequestFileHandler() {
        super.setFormatter(new HelidonFormatter());
    }

    @Override
    public synchronized void publish(LogRecord r) {
        if (isLoggable(r)) {
            try {
                FileHandler h = new FileHandler(fileName(r), Integer.MAX_VALUE, 1, true);
                try {
                    h.setLevel(getLevel());
                    h.setEncoding(getEncoding());
                    h.setFilter(null);
                    h.setFormatter(getFormatter());
                    h.setErrorManager(getErrorManager());
                    h.publish(r);
                } finally {
                    h.close();
                }
            } catch (IOException | SecurityException jm) {
                this.reportError(null, jm, ErrorManager.WRITE_FAILURE);
            }
        }
    }

    @Override
    public void flush() {
    }

    @Override
    public void close() {
        super.setLevel(Level.OFF);
    }

    private String fileName(LogRecord r) {
        Optional<String> o = HelidonMdc.get("name");
        return o.isPresent() ? o.get() + ".log" : "unknown.log";
    }
}
Like the example code, this code assumes that you have set the value of 'name' to the request id. You would then have to install this handler on your application logger.
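One way to install it, as a sketch (com.example is a hypothetical package; adjust to wherever you place the class), is through the logging.properties file read by the LogManager:

```properties
# Sketch: register the custom per-request handler globally.
handlers = com.example.RequestFileHandler
```

Alternatively, you can attach it programmatically with Logger.getLogger("").addHandler(new RequestFileHandler()) if you prefer to keep the configuration in code.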

How to keep initial log file in rotating log handler

I am using Java logging to log memory statistics to a file, and I use java.util.logging.FileHandler to implement a rotating log. Now I have a situation where my manager wants to keep the initial log file and rotate only the rest of the files. Is there any way I can keep the initial log file yet rotate the rest?
public class TopProcessor extends Handler {
    Handler handler;

    public TopProcessor() throws IOException {
        File dir = new File(System.getProperty("user.home"), "logs");
        dir.mkdirs();
        File fileDir = new File(dir, "metrics");
        fileDir.mkdirs();
        String pattern = "metrics-log-%g.json";
        int count = 5;
        int limit = 500000;
        handler = new TopProcessorHandler(fileDir.getAbsolutePath() + File.separator + pattern, limit, count);
    }

    class TopProcessorHandler extends FileHandler {
        public TopProcessorHandler(String pattern, int limit, int count)
                throws IOException {
            super(pattern, limit, count);
        }
    }

    private void writeInformationToFile(String information) {
        handler.publish(new LogRecord(Level.ALL, information));
    }

    @Override
    public void close() {
        handler.close();
    }

    @Override
    public void flush() {
        handler.flush();
    }

    @Override
    public void publish(LogRecord record) {
        handler.publish(record);
    }
}
Create two files: one initial log file and another rotating log file. You can merge the two files when you want to read the logs.
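That two-file idea can be sketched as a handler that delegates to two FileHandlers: a non-rotating one that freezes once an initial byte budget is used up, and a standard rotating one for everything after it. This is only a sketch under assumptions (the class name, file names, and the message-length budget heuristic are invented for illustration), not the only way to meet the requirement:

```java
import java.io.IOException;
import java.nio.file.Path;
import java.util.logging.FileHandler;
import java.util.logging.Handler;
import java.util.logging.LogRecord;
import java.util.logging.SimpleFormatter;

public class InitialPlusRotatingHandler extends Handler {
    private final FileHandler initial;   // never rotates, frozen after the budget
    private final FileHandler rotating;  // standard java.util.logging rotation
    private final int initialLimit;      // rough byte budget for the initial file
    private boolean initialFull = false;
    private int initialBytes = 0;

    public InitialPlusRotatingHandler(Path dir, int initialLimit,
                                      int rotateLimit, int rotateCount) throws IOException {
        this.initialLimit = initialLimit;
        this.initial = new FileHandler(dir.resolve("metrics-initial.json").toString());
        this.rotating = new FileHandler(dir.resolve("metrics-log-%g.json").toString(),
                                        rotateLimit, rotateCount, true);
        initial.setFormatter(new SimpleFormatter());
        rotating.setFormatter(new SimpleFormatter());
    }

    @Override
    public synchronized void publish(LogRecord record) {
        String msg = record.getMessage() == null ? "" : record.getMessage();
        if (!initialFull && initialBytes + msg.length() <= initialLimit) {
            initialBytes += msg.length();
            initial.publish(record);     // still within the initial budget
        } else {
            initialFull = true;          // initial file is frozen from now on
            rotating.publish(record);
        }
    }

    @Override public void flush() { initial.flush(); rotating.flush(); }
    @Override public void close() { initial.close(); rotating.close(); }
}
```

The message-length counting is approximate (it ignores formatter overhead), but it keeps the first records in a file that rotation never touches, while later records cycle through metrics-log-0.json, metrics-log-1.json, and so on.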

having issues calling a method in java

A simple question, but I literally cannot remember the answer. Basically, I want to run Java methods in a certain order. I did have it working perfectly, but I have had to add something to the start, and now it will not run in order.
Before, the code was this:
@PostConstruct
public void init() {
    //System.out.println(destinationPDF);
    //System.out.println(destination);
    // Get the username from the login page; this is used to create a folder for each user
    System.out.println("called get username");
    username = FacesContext.getCurrentInstance().getExternalContext().getRemoteUser();
}

public void File() {
    File theFile = new File(destination + username); // will create a sub folder for each user (currently does not work; below hopefully is a solution)
    theFile.mkdirs();
    System.out.println("Completed File");
}
It would run and automatically call the next required method, in this order:
INFO: buttonToUploadText invoked
INFO: called get username
INFO: called handle file
INFO: Completed Creation of folder
INFO: Now in copying of file proccess
INFO: Completed Creation of folder for copy of PDF
INFO: End of copying file creation
INFO: Called CopyFile
INFO: New file created!
INFO: Copying is now happening
But I have added a new method that loads variables from a file:
@PostConstruct
public void loadProp() {
    System.out.println("Loading properties");
    InputStream in = this.getClass().getClassLoader().getResourceAsStream("config.properties"); // points to a properties file; this loads the destinations instead of having to declare them here
    try {
        configProp.load(in);
        System.out.println(configProp.getProperty("destinationPDF"));
        System.out.println(configProp.getProperty("destination"));
        System.out.println(configProp.getProperty("fileList"));
    } catch (IOException e) {
        e.printStackTrace();
    }
}
This now must run first when triggered, in order to set up the variables. However, it now skips public void init() once complete and instead runs public void handleFileUpload.
So what is the best way of calling public void init() from public void loadProp()?
Edit 2:
private Properties configProp = new Properties();

public void loadProp() {
    System.out.println("Loading properties");
    InputStream in = this.getClass().getClassLoader().getResourceAsStream("config.properties"); // points to a properties file; this loads the destinations instead of having to declare them here
    try {
        configProp.load(in);
        System.out.println(configProp.getProperty("destinationPDF"));
        System.out.println(configProp.getProperty("destination"));
        System.out.println(configProp.getProperty("fileList"));
    } catch (IOException e) {
        e.printStackTrace();
    }
}

private String destinationPDF = configProp.getProperty("destinationPDF");
public String destination = configProp.getProperty("destination");
private String username;
//public static String destination = "D:/Documents/NetBeansProjects/printing~subversion/fileupload/uploaded/"; // main location for uploads//TORNADO ONLY //"D:/My Documents/NetBeansProjects/printing~subversion/fileupload/uploaded/"; // USE ON PREDATOR ONLY
public static String NewDestination;
public static String UploadedfileName;
public static String CompletefileName;
//
//Strings for file copy
//
//private String destinationPDF = "D:/Documents/NetBeansProjects/printing~subversion/fileupload/web/resources/pdf/"; //USE ON TORNADO//"D:/My Documents/NetBeansProjects/printing~subversion/fileupload/web/resources/pdf/";//USE ON PREDATOR
private String NewdestinationPDF;
public static String PdfLocationViewable;
//

@PostConstruct
public void init() {
    FileUploadController.loadProp();
    //System.out.println(destinationPDF);
    //System.out.println(destination);
    // Get the username from the login page; this is used to create a folder for each user
    System.out.println("called get username");
    username = FacesContext.getCurrentInstance().getExternalContext().getRemoteUser();
}
You can and should have only one @PostConstruct method.
Replace

@PostConstruct
public void loadProp() {
    // ...
}

@PostConstruct
public void init() {
    // ...
}

by

@PostConstruct
public void postConstruct() {
    loadProp();
    init();
}

private void loadProp() {
    // ...
}

private void init() {
    // ...
}
(I'd only consider renaming postConstruct() to init() and renaming the original init() to something else matching its actual job.)
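Note also that, even with a single @PostConstruct method, the field initializers in Edit 2 run at construction time, before any @PostConstruct method is invoked, so fields like destinationPDF are read from a still-empty Properties object. A minimal sketch (InitOrderDemo is a hypothetical class standing in for the bean above) of the ordering problem:

```java
import java.util.Properties;

public class InitOrderDemo {
    final Properties configProp = new Properties();

    // Field initializer: runs during construction, while configProp is still empty.
    final String destination = configProp.getProperty("destination");

    // Stands in for the @PostConstruct loadProp() in the question.
    void loadProp() {
        configProp.setProperty("destination", "/tmp/uploads");
    }
}
```

Even after loadProp() runs, destination stays null because it captured the value too early; the fix is to assign such fields inside (or after) loadProp() rather than in field initializers.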

How to create my own Appender in log4j?

I am new to log4j. Can anyone explain how to create my own Appender? That is, how do I implement the classes and interfaces, and how do I override them?
Update: the provided solution is valid for Log4j 1.x. If you're looking for the 2.x versions, take a look at this article: How to create a custom appender in log4j2.
You should extend the AppenderSkeleton class, which (quoting the javadoc) "provides the code for common functionality, such as support for threshold filtering and support for general filters."
If you read the code of AppenderSkeleton, you'll see that it handles almost all, leaving to you just:
protected void append(LoggingEvent event)
public void close()
public boolean requiresLayout()
The core method is append. Remember that you don't need to implement the filtering logic in it because it is already implemented in doAppend that in turn calls append.
Here I made a (quite useless) class that stores the log entries in an ArrayList, just as a demo.
public /*static*/ class MyAppender extends AppenderSkeleton {
    ArrayList<LoggingEvent> eventsList = new ArrayList<>();

    @Override
    protected void append(LoggingEvent event) {
        eventsList.add(event);
    }

    public void close() {
    }

    public boolean requiresLayout() {
        return false;
    }
}
Ok, let's test it:
public static void main(String[] args) {
    Logger l = Logger.getLogger("test");
    MyAppender app = new MyAppender();
    l.addAppender(app);
    l.warn("first");
    l.warn("second");
    l.warn("third");
    l.trace("fourth shouldn't be printed");
    for (LoggingEvent le : app.eventsList) {
        System.out.println("***" + le.getMessage());
    }
}
You should have "first", "second", "third" printed; the fourth message shouldn't be printed, since the log level of the root logger is DEBUG while the event level is TRACE. This proves that AppenderSkeleton implements "level management" correctly for us. So that definitely seems the way to go... now the question: why do you need a custom appender when there are many built-in ones that log to almost any destination? (By the way, a good place to start with log4j: http://logging.apache.org/log4j/1.2/manual.html)
If you would like to do some manipulation or make decisions based on the event, you can do it like this:
@Override
protected void append(LoggingEvent event) {
    String message = null;
    if (event.locationInformationExists()) {
        StringBuilder formattedMessage = new StringBuilder();
        formattedMessage.append(event.getLocationInformation().getClassName());
        formattedMessage.append(".");
        formattedMessage.append(event.getLocationInformation().getMethodName());
        formattedMessage.append(":");
        formattedMessage.append(event.getLocationInformation().getLineNumber());
        formattedMessage.append(" - ");
        formattedMessage.append(event.getMessage().toString());
        message = formattedMessage.toString();
    } else {
        message = event.getMessage().toString();
    }
    switch (event.getLevel().toInt()) {
        case Level.INFO_INT:
            //your decision
            break;
        case Level.DEBUG_INT:
            //your decision
            break;
        case Level.ERROR_INT:
            //your decision
            break;
        case Level.WARN_INT:
            //your decision
            break;
        case Level.TRACE_INT:
            //your decision
            break;
        default:
            //your decision
            break;
    }
}
I would like to extend @AgostinoX's answer to support properties-file configuration and the ability to start and stop the logging capture:
public class StringBufferAppender extends org.apache.log4j.AppenderSkeleton {
    StringBuffer logs = new StringBuffer();
    AtomicBoolean captureMode = new AtomicBoolean(false);

    public void close() {
    }

    public boolean requiresLayout() {
        return false;
    }

    @Override
    protected void append(LoggingEvent event) {
        if (captureMode.get()) {
            logs.append(event.getMessage());
        }
    }

    public void start() {
        //System.out.println("[StringBufferAppender|start] - Start capturing logs");
        logs = new StringBuffer(); // reset the field (declaring a new local here, as originally written, never clears the buffer)
        captureMode.set(true);
    }

    public StringBuffer stop() {
        //System.out.println("[StringBufferAppender|stop] - Stop capturing logs");
        captureMode.set(false);
        StringBuffer data = new StringBuffer(logs);
        logs = null;
        return data;
    }
}
Now all you have to do is define it in the log4j.properties file:
log4j.rootLogger=...., myAppender # here you adding your appendr name
log4j.appender.myAppender=com.roi.log.StringBufferAppender # pointing it to the implementation
Then, whenever you want to enable it at runtime:
Logger logger = Logger.getRootLogger();
StringBufferAppender appender = (StringBufferAppender)logger.getAppender("myAppender");
appender.start();
and when you want to stop it:
StringBuffer sb = appender.stop();
To create your own Appender, you just implement the Appender interface and override its methods.
And also study this link: start log
