A good database log appender for Java?

At my workplace, we wrote a custom log4j appender that writes log messages to the database (uses a dedicated thread asynchronously, so no performance hit).
I prefer it a lot over writing to log files - a database-backed log is much easier to query and analyze.
Is there an open source solution that does this (for log4j specifically, or any other java loggers)?
Some things that our appender has, and I would like to see in an alternative:
Logs exceptions (duh!)
Database writes are from a separate thread/pool
Our appender supports the following columns, and I would like to see all of them in whatever solution we find.
LogId
Time
Message
StackTrace
ProcessId
ThreadId
MachineName
Component
Level (debug/info/warn/...)
ThreadName
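Under the hood, the core of such an appender is just a bounded queue drained by a dedicated writer thread, so application threads never block on the database. A minimal stdlib sketch of the pattern (the in-memory `database` list and the batch size of 100 are placeholders standing in for the real JDBC insert, which is not shown):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class AsyncDbLogSketch {
    // A log event carrying a subset of the columns listed above.
    record LogEvent(long time, String level, String threadName, String message) {}

    private final BlockingQueue<LogEvent> queue = new LinkedBlockingQueue<>(10_000);
    private final List<LogEvent> database = new ArrayList<>(); // stands in for the real table

    // Called from application threads: cheap, never blocks on the database.
    public boolean append(LogEvent e) {
        return queue.offer(e); // drop (or count drops) if the queue is full
    }

    // Runs on the dedicated writer thread: batch events, then "insert" them.
    public void drainOnce() throws InterruptedException {
        List<LogEvent> batch = new ArrayList<>();
        batch.add(queue.take());  // block until at least one event arrives
        queue.drainTo(batch, 99); // then grab up to 99 more for the batch
        database.addAll(batch);   // real code: one batched JDBC insert
    }

    public List<LogEvent> stored() { return database; }

    public static void main(String[] args) throws InterruptedException {
        AsyncDbLogSketch log = new AsyncDbLogSketch();
        log.append(new LogEvent(System.currentTimeMillis(), "INFO", "main", "started"));
        log.append(new LogEvent(System.currentTimeMillis(), "WARN", "main", "low disk"));
        log.drainOnce();
        System.out.println(log.stored().size()); // prints 2
    }
}
```

A bounded queue matters here: with an unbounded one, a slow database would let log events pile up in memory indefinitely.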

There is also a DBAppender class provided by log4j (log4j requires a specific set of tables to log using this appender).
http://logging.apache.org/log4j/companions/receivers/apidocs/org/apache/log4j/db/DBAppender.html
There is an updated non-Apache jdbc logger available here you may also want to try:
http://www.dankomannhaupt.de/projects/index.html

Just curious, wouldn't it severely affect the performance of an application hosting such an appender? Logging directly into a relational database is quite costly, even when done asynchronously.

You don't need a custom appender for log4j to write to databases. You can use the JDBCAppender bundled with Apache's distribution.
According to Apache's documentation, this API could be replaced in the future, but for now we use it and it works very well.
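For reference, a minimal log4j 1.x properties configuration for the bundled JDBCAppender might look like the following; the URL, driver, credentials, and table/column names are placeholders. (One known caveat: the stock JDBCAppender builds the SQL from a pattern layout, so quotes inside log messages can break the statement.)

```properties
log4j.rootLogger=DEBUG, DB
log4j.appender.DB=org.apache.log4j.jdbc.JDBCAppender
log4j.appender.DB.URL=jdbc:mysql://localhost/logdb
log4j.appender.DB.driver=com.mysql.jdbc.Driver
log4j.appender.DB.user=loguser
log4j.appender.DB.password=secret
log4j.appender.DB.sql=INSERT INTO LOGS (LOG_TIME, LOG_LEVEL, THREAD, MESSAGE) VALUES ('%d{yyyy-MM-dd HH:mm:ss}','%p','%t','%m')
log4j.appender.DB.layout=org.apache.log4j.PatternLayout
```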

Related

Logging output into Database

I am developing a middleware application using a Java, Spring, Hibernate, slf4j, log4j, Oracle DB stack. Currently I log output to a text file. I want to store logs in the database for troubleshooting purposes. I tried using the log4j DB appender to log directly into the database, but I found the performance to be too slow. So instead I now let log4j append to a file, and in a separate thread I read the log file line by line and insert the lines into the database. This method is not too slow, and it does not affect the performance of the main application.
My question is, does anyone else have a better idea, or is there a better way to do it? I don't want to use tools like Loggly or Splunk, because for my purposes those tools are overkill. I want to know of any homegrown techniques I can use.
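A minimal stdlib sketch of the tail-and-insert idea described above; the in-memory `database` list stands in for the real JDBC insert, and a real shipper would also remember its file offset between passes instead of re-reading from the start:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class LogFileShipper {
    private final List<String> database = new ArrayList<>(); // stands in for the real table

    // One pass over the log file: read each line and "insert" it.
    public void shipOnce(Path logFile) throws IOException {
        try (BufferedReader in = Files.newBufferedReader(logFile)) {
            String line;
            while ((line = in.readLine()) != null) {
                insert(line);
            }
        }
    }

    private void insert(String line) {
        database.add(line); // real code: a batched PreparedStatement insert
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("app", ".log");
        Files.write(tmp, List.of("2013-01-01 INFO started", "2013-01-01 WARN low disk"));
        LogFileShipper shipper = new LogFileShipper();
        shipper.shipOnce(tmp);
        System.out.println(shipper.database.size()); // prints 2
        Files.delete(tmp);
    }
}
```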
I know you have said you don't want to use external tools, but I think that is a mistake. The effort you are putting in to create a bespoke solution for your logging has already been made by others, and they provide a better, more efficient, and more robust solution than you can build yourself.
For starters, loading the log files into the database simply to make them more searchable is a bad idea. You then have the performance overhead of running the database and loading all the files in, plus having to write and test all your code to do this.
I would recommend looking at the logstash tool.
If you are determined to go with your solution of loading the logs into a database, then you need to provide some information on what type of database you intend to use.

In log4j how to write different query for different table for logging to database

From a log4j properties file, I need to insert data into multiple tables.
So I need to write multiple queries for a JDBC appender logging to the database.
It seems that the stock JDBCAppender in the log4j package, and the other DB appenders I found, do not allow multiple inserts per log event.
Attaching 2 JDBCAppenders to log4j could solve your problem. Having 2 appenders would, on the other hand, cause some transaction overhead, and the rows inserted into the database won't have any relation to each other.
Another solution I can think of is writing your own appender, as in the accepted answer of the question How to create your own Appender in log4j?
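If you go with the two-appender route, it can be expressed entirely in a log4j 1.x properties file; everything below (URL, table names, columns) is a placeholder:

```properties
log4j.rootLogger=INFO, DB1, DB2
log4j.appender.DB1=org.apache.log4j.jdbc.JDBCAppender
log4j.appender.DB1.URL=jdbc:mysql://localhost/logdb
log4j.appender.DB1.sql=INSERT INTO EVENTS (LOG_TIME, MESSAGE) VALUES ('%d','%m')
log4j.appender.DB1.layout=org.apache.log4j.PatternLayout
log4j.appender.DB2=org.apache.log4j.jdbc.JDBCAppender
log4j.appender.DB2.URL=jdbc:mysql://localhost/logdb
log4j.appender.DB2.sql=INSERT INTO EVENT_DETAILS (THREAD, LOG_LEVEL) VALUES ('%t','%p')
log4j.appender.DB2.layout=org.apache.log4j.PatternLayout
```

As noted, each event then produces two independent inserts with no shared key linking them.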
Maybe using the FlumeAppender would be a good solution for the scenario you described.
Apache Flume is a distributed, reliable, and available system for efficiently collecting, aggregating, and moving large amounts of log data from many different sources to a centralized data store. The FlumeAppender takes LogEvents and sends them to a Flume agent as serialized Avro events for consumption.
More details in:
https://logging.apache.org/log4j/2.x/manual/appenders.html#FlumeAppender

Logging SQL statements in application

I am adding SLF4J and Logback to my application and I am unsure if I should log the SQL statements that I generate in the repository layer (using Spring JDBC). The level of these statements would of course be set to DEBUG since it could generate a lot of log statements.
Is it common to log SQL statements generated by the application?
Yes, it is common.
All ORMs, including OpenJPA and Hibernate, do it. Mappers like MyBatis also have a logging mechanism that hooks into any of several logging implementations.
Even in ancient times, drivers used to do it when java.sql.DriverManager#setLogStream was invoked ;)
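The modern replacement for the deprecated setLogStream is DriverManager.setLogWriter; a tiny stdlib demonstration of routing the JDBC trace stream somewhere inspectable:

```java
import java.io.PrintWriter;
import java.io.StringWriter;
import java.sql.DriverManager;

public class JdbcDriverLogDemo {
    public static void main(String[] args) {
        StringWriter buffer = new StringWriter();
        // Route the JDBC log stream into an in-memory buffer.
        DriverManager.setLogWriter(new PrintWriter(buffer, true));
        // Drivers (and DriverManager itself) write trace lines here:
        DriverManager.println("registering driver: demo");
        System.out.println(buffer.toString().trim());
    }
}
```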
I'd write this kind of information to the database, keeping statistics so I can query and summarize them.
It's certainly possible to do it with the log, but it's not as easy to summarize as it is from the database - there's no SQL for log files without extraordinary effort.
Actually, if you have a huge application that is considered a financial asset from your point of view, then of course you should log your application, since the log serves as a security measure you can refer to whenever you want. Logging is also useful for debugging.
But you have to consider which level of logging you want to choose, since logging all your SQL statements will put a huge load on your database.
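If you go the SLF4J/Logback route, scoping the SQL statements to DEBUG on the repository package alone keeps the volume down; a minimal logback.xml sketch (the package name com.example.repository is a placeholder):

```xml
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <!-- SQL statements from the repository layer only -->
  <logger name="com.example.repository" level="DEBUG"/>
  <root level="INFO">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>
```

Raising the root level to INFO while leaving one logger at DEBUG means the SQL can be switched off in production with a one-line change.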

Azure Java Tomcat logging

I am planning to migrate a previously created Java web application to Azure. The application previously used log4j for application-level logs, which were saved in a locally created file. The problem is that with the Azure Role having multiple instances, I must collect and aggregate these logs and also make sure that they are stored in persistent storage instead of the virtual machine's hard drive.
Logging is a critical component of the application but it must not slow down the actual work. I have considered multiple options and I am curious about the best practice, the best solution considering security, log consistency and performance in both storage-time and by later processing. Here is a list of the options:
Using log4j with a custom Appender to store information in Azure SQL.
Using log4j with a custom Appender to store information in Azure Tables storage.
Writing an additional tool that transfers data from local hard drive to either of the above persistent storages.
Is there any other method or are there any complete solutions for this problem for Java?
Which of the above would be best considering the above mentioned criteria?
There's no out-of-the-box solution right now, but... a custom appender for Table Storage makes sense, as you can then query your logs in a similar fashion to diagnostics (perf counters, etc.).
The only consideration is whether you're writing log statements in massive quantity (like hundreds of times per second). At that rate, you'll start to notice transaction costs showing up on the monthly bill. At a penny per 10,000 transactions and 100 per second, you're looking at about $250 per instance per month. If you have multiple instances, the cost goes up from there. With SQL Azure, you'd have no transaction cost, but you'd have a higher storage cost.
If you want to go with a storage transfer approach, you can set up Windows Azure diagnostics to watch a directory and upload files periodically to blob storage. The only snag is that Java doesn't have direct support for configuring diagnostics. If you're building your project from Eclipse, you only have a script file that launches everything, so you'd need to write a small .net app, or use something like AzureRunMe. If you're building a Visual Studio project to launch your Java app, then you have the ability to set up diagnostics without a separate app.
There's a blog post from Persistent Systems that just got published, regarding Java and diagnostics setup. I'll update this answer with a link once it's live. Also, have a look at Cloud Ninja for Java, which implements Tomcat logging (and related parsing) by using an external .net exe that sets up diagnostics, as described in the upcoming post.
Please visit my blog and download the document; look for the chapter "Tomcat Solution Diagnostics" for the error-logging solution. The document was written a while back, but you can certainly use this method to capture any kind of Java-based logging (log4j included) in Tomcat and view it directly.
Chapter 6: Tomcat Solution Diagnostics
Error Logging
Viewing Log Files
http://blogs.msdn.com/b/avkashchauhan/archive/2010/10/29/windows-azure-tomcat-solution-accelerator-full-solution-document.aspx
In any scenario where there is a custom application (java.exe, php.exe, python, etc.), I suggest creating the log file directly in the "Local Storage" folder and then initializing Azure Diagnostics in the Worker Role (WorkerRole.cs) to export these custom log files directly from the Azure VM to your Azure Blob storage.
How to create custom logs in local storage is described here.
Using Azure Diagnostics and sending logs to Azure Blob storage would be cheaper and more robust than any other method you have described.
Finally I decided to write a Log4J appender. I didn't need to gather diagnostics information; my main goal was only to gather the log files in an easily exchangeable way. My first fear was that it would slow down the application, but by writing only to memory and only periodically writing out the log data to Azure Tables, it works perfectly without making too many API calls.
Here are the main steps for my implementation:
First I created an entity class to be stored in Azure Tables, called LogEntity that extends com.microsoft.windowsazure.services.table.client.TableServiceEntity.
Next I wrote the appender that extends org.apache.log4j.AppenderSkeleton containing a java.util.List<LogEntity>.
In the overridden method protected void append(LoggingEvent event) I only add to this collection; a separate thread periodically empties the list and writes the data to Azure Tables.
Finally I added the newly created Appender to my log4j configuration file.
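The buffer-and-flush pattern from the steps above can be sketched with the stdlib alone; the in-memory `table` list stands in for the Azure Tables batch call, and the 50 ms flush interval is just an illustrative value:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class BufferedTableAppenderSketch {
    private final List<String> buffer = new ArrayList<>();
    private final List<String> table = new ArrayList<>(); // stands in for Azure Tables

    // Called from append(LoggingEvent): memory-only, so it is cheap.
    public synchronized void append(String entity) {
        buffer.add(entity);
    }

    // Called periodically from a background thread: one batched API call.
    public synchronized void flushToTable() {
        table.addAll(buffer);
        buffer.clear();
    }

    public static void main(String[] args) throws InterruptedException {
        BufferedTableAppenderSketch appender = new BufferedTableAppenderSketch();
        ScheduledExecutorService flusher = Executors.newSingleThreadScheduledExecutor();
        flusher.scheduleAtFixedRate(appender::flushToTable, 50, 50, TimeUnit.MILLISECONDS);
        appender.append("INFO started");
        appender.append("WARN low disk");
        Thread.sleep(200); // let at least one periodic flush run
        flusher.shutdown();
        appender.flushToTable(); // final flush on shutdown so nothing is lost
        System.out.println(appender.table.size()); // prints 2
    }
}
```

The final flush on shutdown matters: without it, anything buffered since the last periodic flush would be lost when the role recycles.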
Another alternative;
Can we not continue using log4j the standard way (e.g., DailyRollingFileAppender), except that the file is created on a UNC path on a VM (IaaS)?
This VM will only need a bit of disk space, but need not have any great processing power, so one could share an available VM or create a VM with the most minimal configuration, preferably in the same region and cloud service.
The accumulated log files can be accessed via RDP/FTP, etc.
That way one will not incur transaction costs or the cost of developing a special Log4j appender; it could turn out to be a cheaper alternative.
thanks
Jeevan
PS: I am referring more to one's application logging, not the app-server logs (catalina/manager .log or .out files, or WebLogic's).

Java Message logging Web Application

My Apache Tomcat server is getting periodic updates from a Java-based client application. At the moment the scenario is just one client talking to the server.
I want to log the messages from the client on the server with a timestamp. What kind of framework will help me achieve this?
EDIT: The OP's goal was actually pretty unclear, and I'm modifying my answer after some clarifications.
Well, I'm not sure, but maybe a logging framework will suit your needs. If so, have a look at:
Log4J: The most famous logging framework, widely used.
Java Logging aka java.util.logging: didn't succeed in replacing Log4J.
Logback: "Logback is intended as a successor to the popular log4j project. It was designed, in addition to many individual contributors, by Ceki Gülcü, the founder of log4j".
SLF4J: A "Simple Logging Facade for Java serves as a simple facade or abstraction for various logging frameworks, e.g. java.util.logging, log4j and logback, allowing the end user to plug in the desired logging framework at deployment time".
And pick one of them (I'd use Log4J or Logback).
To save your messages for later processing from the webapp (e.g. generating a web page with some graphs/charts), the best approach is to use a database. Just read/write them from/to a simple table with a timestamp column.
If you are not really familiar with Java, JDBC, persistence, connection pooling, datasource, etc, I'd suggest to use the Spring framework as it will hide most of the complexity. For the database part, have a look at the Chapter 11. Data access using JDBC from the Spring documentation. Pay a special attention to the JdbcTemplate or the SimpleJdbcTemplate, they should allow you to get the job done.
Create a special JSP page for accepting log entries, and invoke it with
http://..... foo.jsp?l=the%20stuff%20to%20log (i.e. URL-encoded)
You then just need to pick out the "l" parameter and do with it what you need to do. An initial implementation could be invoking the log(String s) method in the servlet context.
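Decoding such a URL-encoded parameter is one line with the stdlib (the sample value below is just an illustrative placeholder):

```java
import java.io.UnsupportedEncodingException;
import java.net.URLDecoder;

public class LogParamDecoder {
    public static void main(String[] args) throws UnsupportedEncodingException {
        String l = "the%20stuff%20to%20log";   // value of the "l" request parameter
        String message = URLDecoder.decode(l, "UTF-8");
        System.out.println(message);           // prints: the stuff to log
    }
}
```

In a real servlet or JSP, request.getParameter("l") already returns the decoded value, so explicit decoding is only needed if you read the raw query string yourself.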
