I need to monitor a log file for a pattern. The log file continually gets written by an application.
The application can add new log statements while my program is reading it.
The log gets rolled over when it's >200 MB or at the end of the day, so my program should handle the filename changing dynamically.
If my program crashes for any reason, it has to resume from where it left off.
I do not want to re-invent the wheel, so I am looking for a Java API. I wrote a program that reads the file in a loop with a 30-second sleep, but that does not meet all the criteria.
You might consider looking at the Apache Commons IO classes, in particular the Tailer/TailerListener classes. See http://www.devdaily.com/java/jwarehouse/commons-io-2.0/src/main/java/org/apache/commons/io/input/Tailer.java.shtml.
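As a sketch of what that looks like (assuming commons-io is on the classpath; the file name is illustrative), Tailer polls the file and notifies a listener, including when the file is rolled over. Note that Tailer alone does not persist its position across restarts, so the crash-resume requirement still needs your own bookkeeping:

```java
import java.io.File;
import org.apache.commons.io.input.Tailer;
import org.apache.commons.io.input.TailerListenerAdapter;

public class LogTail {
    public static void main(String[] args) {
        TailerListenerAdapter listener = new TailerListenerAdapter() {
            @Override
            public void handle(String line) {
                System.out.println("new line: " + line);
            }
            @Override
            public void fileRotated() {
                System.out.println("log file was rolled over");
            }
        };
        // poll every second; 'false' = start reading from the beginning of the file
        Tailer tailer = Tailer.create(new File("application.log"), listener, 1000, false);
        Runtime.getRuntime().addShutdownHook(new Thread(tailer::stop));
    }
}
```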
These two APIs can be helpful:
1. JxFileWatcher (Official Site): read here what it is capable of.
2. JNotify
JNotify is a Java library that allows Java applications to listen to file system events, such as:
File created
File modified
File renamed
File deleted
If you are using Log4j, or can integrate it, it is possible to append log output to a convenient object, such as a StringBuffer, as discussed in this related question: Custom logging to gather messages at runtime
This looks similar: Implementation of Java Tail
Essentially you use a BufferedReader. Tracking where you left off is something you'll have to add yourself, perhaps by capturing the last line read or the byte offset.
That same question references JLogTailer which looks interesting and may do most of what you want already.
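To make the crash-resume requirement concrete, here is a minimal JDK-only sketch (Java 11+; file names are up to you) that persists the byte offset between runs, so a restarted process picks up exactly where it left off and restarts from the top if the file was rolled over:

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class ResumableTail {
    // Reads any lines appended to 'log' since the offset stored in 'offsetFile',
    // then persists the new offset so a crashed process can resume where it left off.
    static List<String> readNewLines(Path log, Path offsetFile) throws IOException {
        long offset = Files.exists(offsetFile)
                ? Long.parseLong(Files.readString(offsetFile).trim())
                : 0L;
        List<String> lines = new ArrayList<>();
        try (RandomAccessFile raf = new RandomAccessFile(log.toFile(), "r")) {
            if (offset > raf.length()) {
                offset = 0; // file was truncated or rolled over: start from the top
            }
            raf.seek(offset);
            String line;
            while ((line = raf.readLine()) != null) {
                lines.add(line);
            }
            Files.writeString(offsetFile, Long.toString(raf.getFilePointer()));
        }
        return lines;
    }
}
```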
I am running a Java application on Azure Cloud Services.
I have seen this article, which shows how to configure a Java project to send logs to Azure Application Insights using Log4j: https://azure.microsoft.com/en-us/documentation/articles/app-insights-java-trace-logs/
However, for various reasons I do not want to do this. My Java application already writes multiple log files to a log directory (application.log, error.log, etc.). I want to point Application Insights at this directory so that it can aggregate these log files over multiple instances of my application running on Cloud Services and then present them to me (in a similar way to how AWS CloudWatch presents logs). How can I accomplish this?
I think this is a deep question and would require a bit of custom coding to accomplish it.
The problem as I read it is, you have multiple log files writing to a location and you just want to parse those log files and send the log lines. Moreover, you don't want to add the log appenders to your Java code for various reasons.
The short answer is no: there isn't a way to have Application Insights monitor a directory of log files and send them for you.
The next short answer is also no: Application Insights can't do it out of the box, but Log Analytics can. This is a bit more heavy-handed and I haven't read enough about it to say it would fit this scenario. However, since you're using a cloud service, you could most likely install the agent and start collecting logs.
The next answer is the long answer, kinda. You can do this but it would require a lot of custom coding on your part. What I envision is something akin to how the Azure Diagnostics Sink works.
You would need to create an application that reads the log files and enumerates them line by line; it would then parse them based on some format and call the trackTrace() method to log each line.
This option requires some considerable thought since you would be reading the file and then determining what to do with it.
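As a sketch of that custom approach (the directory path is hypothetical; assumes the Application Insights Java SDK is on the classpath and an instrumentation key is configured in ApplicationInsights.xml), the core loop could look something like this. A real version would also need to track offsets so restarts don't resend everything:

```java
import com.microsoft.applicationinsights.TelemetryClient;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Stream;

public class LogShipper {
    public static void main(String[] args) throws IOException {
        TelemetryClient client = new TelemetryClient(); // picks up the configured instrumentation key
        Path logDir = Paths.get("/var/myapp/logs");     // hypothetical log directory
        try (Stream<Path> files = Files.list(logDir)) {
            files.filter(p -> p.toString().endsWith(".log"))
                 .forEach(p -> {
                     try (Stream<String> lines = Files.lines(p)) {
                         lines.forEach(client::trackTrace); // one trace event per log line
                     } catch (IOException e) {
                         e.printStackTrace();
                     }
                 });
        }
        client.flush(); // push any buffered telemetry before exiting
    }
}
```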
My Java program needs to know if a text file's content changes. Currently I'm using the File.lastModified() method to do it, but ideally I don't want to poll. Instead I want an event to fire every time the file is modified. Are there any third-party libraries available for this kind of thing? I've heard it can be accomplished using Apache file descriptors, but couldn't find any information regarding it.
You can accomplish this with the Java 7 WatchService. This enables you to watch a directory and be notified of create, modify, and delete events.
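A minimal sketch of that (the watched directory is illustrative; note that WatchService watches directories, not individual files, so you filter events for the file you care about):

```java
import java.io.IOException;
import java.nio.file.*;

public class DirWatcher {
    public static void main(String[] args) throws IOException, InterruptedException {
        Path dir = Paths.get("."); // directory containing the file to watch
        WatchService watcher = FileSystems.getDefault().newWatchService();
        dir.register(watcher,
                StandardWatchEventKinds.ENTRY_CREATE,
                StandardWatchEventKinds.ENTRY_MODIFY,
                StandardWatchEventKinds.ENTRY_DELETE);
        while (true) {
            WatchKey key = watcher.take(); // blocks until an event arrives
            for (WatchEvent<?> event : key.pollEvents()) {
                System.out.println(event.kind() + ": " + event.context());
            }
            if (!key.reset()) { // directory no longer accessible
                break;
            }
        }
    }
}
```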
There are many factors that might determine your solution: how often the file updates, what type of information it holds, etc.
My advice would be either the Java 7 standard, or the Apache de facto standard (if the requirements won't allow a Java 7 solution).
Apache configuration
If it is a file that holds property information, a configuration, then you might want to look at Apache Commons Configuration, which fires a refresh event each time the file is updated so you can reload your configuration. http://commons.apache.org/proper/commons-configuration/userguide/howto_events.html#An_example
Java 7, WatchService
If you use Java 7, look at the WatchService: http://docs.oracle.com/javase/7/docs/api/java/nio/file/WatchService.html
Apache IO
If you don't want to use Java 7, and the file is not a configuration file (for which Commons Configuration is the better fit), then you might want to look at FileAlterationObserver from Apache Commons IO. http://commons.apache.org/proper/commons-io/
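A sketch of the Commons IO variant (assumes commons-io is on the classpath; the directory name and poll interval are illustrative). The observer polls the directory and calls your listener when something changes:

```java
import java.io.File;
import org.apache.commons.io.monitor.FileAlterationListenerAdaptor;
import org.apache.commons.io.monitor.FileAlterationMonitor;
import org.apache.commons.io.monitor.FileAlterationObserver;

public class ChangeWatcher {
    public static void main(String[] args) throws Exception {
        File dir = new File("watched-dir"); // hypothetical directory
        FileAlterationObserver observer = new FileAlterationObserver(dir);
        observer.addListener(new FileAlterationListenerAdaptor() {
            @Override
            public void onFileChange(File file) {
                System.out.println("changed: " + file);
            }
        });
        FileAlterationMonitor monitor = new FileAlterationMonitor(1000); // poll every second
        monitor.addObserver(observer);
        monitor.start();
    }
}
```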
Try Apache Commons VFS's DefaultFileMonitor.
All you need to do is add the file in question to the list of files that need to be monitored and start the monitoring thread.
If the file has been changed or deleted, it will fire an OnChange event.
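A brief sketch of that (assumes commons-vfs2 is on the classpath; the file URI is illustrative):

```java
import org.apache.commons.vfs2.FileChangeEvent;
import org.apache.commons.vfs2.FileListener;
import org.apache.commons.vfs2.FileObject;
import org.apache.commons.vfs2.FileSystemManager;
import org.apache.commons.vfs2.VFS;
import org.apache.commons.vfs2.impl.DefaultFileMonitor;

public class VfsMonitorDemo {
    public static void main(String[] args) throws Exception {
        FileSystemManager manager = VFS.getManager();
        FileObject file = manager.resolveFile("file:///tmp/watched.txt"); // hypothetical path
        DefaultFileMonitor monitor = new DefaultFileMonitor(new FileListener() {
            public void fileCreated(FileChangeEvent event) { System.out.println("created"); }
            public void fileChanged(FileChangeEvent event) { System.out.println("changed"); }
            public void fileDeleted(FileChangeEvent event) { System.out.println("deleted"); }
        });
        monitor.addFile(file);
        monitor.start(); // starts the monitoring thread
    }
}
```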
A quick search suggests you can use libraries like JNotify and inotify-java.
I have two java processes which I want completely decoupled from each other.
I figure that the best way to do this is for one to write its data out to a file and the other to read it from that file (the second might also have to write to the file to mark a line as processed).
The problems I envisage are to do with simultaneous access to the file. Is there a good simple pattern I can use to get around this? Is there a library that handles this sort of functionality?
Best way to describe it is as a simple direct message passing mechanism I could implement using files. (Simpler than JMS).
Thanks Dan
If you want a simple solution and you can assume that renaming a file is an atomic operation (this is not completely true), each of the processes can rename the file when reading or writing it and rename it back when it finishes. The other one will not find the file and will wait until it reappears.
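A sketch of that rename handshake using Files.move with ATOMIC_MOVE (the file names are illustrative, and atomicity still depends on the filesystem):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class RenameHandshake {
    // Claims exclusive use of 'shared' by renaming it to a private name.
    // Returns the claimed path, or null if another process currently holds it.
    static Path claim(Path shared, Path mine) throws IOException {
        try {
            return Files.move(shared, mine, StandardCopyOption.ATOMIC_MOVE);
        } catch (IOException e) { // file absent: the other process holds it right now
            return null;
        }
    }

    // Renames the file back so the other process can find it again.
    static void release(Path mine, Path shared) throws IOException {
        Files.move(mine, shared, StandardCopyOption.ATOMIC_MOVE);
    }
}
```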
You mean like a named pipe? It's possible, but Java can't create pipes itself; you'd have to spawn non-portable, platform-specific processes to create one.
You are asking for functionality that is exactly what JMS does. JMS is an API with many implementations. Can you not just use a lightweight implementation? I don't see why you think this is "complicated". By the time you've managed to reliably implement your solution, you'll have found that it's not trivial to deal with all the edge cases.
Correct me if I don't understand your problem...
Why don't you look at file locks? When a program acquires the lock, the other waits until the lock is released.
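A minimal sketch of that with java.nio's FileLock (note the lock is advisory on most Unix filesystems, so both processes must cooperate by locking):

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.channels.FileLock;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class LockedWriter {
    // Appends a line to the file while holding an exclusive OS-level lock,
    // so another process doing the same cannot interleave its write.
    static void appendLocked(Path file, String line) throws IOException {
        try (FileChannel channel = FileChannel.open(file,
                StandardOpenOption.CREATE, StandardOpenOption.WRITE, StandardOpenOption.APPEND);
             FileLock lock = channel.lock()) {
            channel.write(ByteBuffer.wrap((line + "\n").getBytes()));
        } // closing the channel releases the lock
    }
}
```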
If you are not locked on a file-based solution, a database can solve your problem.
Each record will be a line written by the writing process. A single column in the record will be left untouched by the writer, and the reading process will use it to indicate that it has read the record.
Naturally you will have to deal with cleanup of the table before it becomes too large, or with partitioning it so that it is easy for the reading process to find information in it.
If you must use a file, you can have a second file that just holds the ID of the last record the reader process read; that way you don't have two processes writing concurrently to the same file.
I have a Java program which runs as 3 separate processes on the same server. I would like all of the processes to share a single log file, is there a way to specify that in a logging.properties file? I am using java.util.logging to handle logging.
Currently, this is how I define my FileHandler in my logging.properties file:
java.util.logging.FileHandler.pattern=%h/log/logfile.log
This works fine for 1 instance of the program, however, if I attempt to start 3 separate instances of the program the result is:
logfile.log
logfile.log.1
logfile.log.2
Any advice on this?
Thank you
Logback is another logging framework, and it supports your case.
from the docs: http://logback.qos.ch/manual/appenders.html
check out prudent mode for FileAppender
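As an illustration (file name and pattern are placeholders), a prudent-mode FileAppender in logback.xml might look like:

```xml
<appender name="FILE" class="ch.qos.logback.core.FileAppender">
  <file>logfile.log</file>
  <!-- prudent mode serializes writes with a file lock, so several JVMs
       can safely append to the same log file -->
  <prudent>true</prudent>
  <encoder>
    <pattern>%date %level [%thread] %logger - %msg%n</pattern>
  </encoder>
</appender>
```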
Writing to the same file from different processes (the different JVMs) is not recommended.
The only safe way to do it is to somehow lock the file, open it, write to it, and then close it. This considerably slows down each write, which is generally deemed unacceptable for a logger. If you really want to go this way, you can always write your own handler.
I would write a 2nd Java program - a logger. Have the other processes send log messages to the logging program, which would then write to the single log file. You can communicate between the programs using sockets. Here's an example of how to do that.
Paul
Elaborating on Paul's answer, you can use a SocketHandler to direct the log events from all processes to a single process, which actually writes to a file.
Most logging packages provide a simple implementation of this functionality. Another widely supported option is integration with the system's logging facility (the Windows Event Log or syslogd).
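For java.util.logging specifically, each worker process could be pointed at a SocketHandler in its logging.properties (host and port here are illustrative), with the single collector process listening on that port and writing the shared file:

```properties
handlers=java.util.logging.SocketHandler
java.util.logging.SocketHandler.host=localhost
java.util.logging.SocketHandler.port=9999
# default formatter for SocketHandler is XMLFormatter; override for plain text
java.util.logging.SocketHandler.formatter=java.util.logging.SimpleFormatter
```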
I have made a Java application and want to generate log files, so that whenever my client encounters a problem, he can send me those log files and I can correct my code accordingly.
Kindly provide a small sample program that writes a statement to a log file. Kindly mention the classes you are using with their full import statements.
The application is multi-threaded, so is it better to generate separate log files for each thread or not?
Is it better to clear all previous log files before starting the program?
macleojw is correct: You should try writing the code yourself.
Here is an overview of the Java logging framework that ships with the JDK. You may wish to check out Commons Logging and Log4J.
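Since the question asks for a small sample, here is a minimal example using the JDK's own java.util.logging (the log file name is illustrative):

```java
import java.io.IOException;
import java.util.logging.FileHandler;
import java.util.logging.Level;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;

public class LogDemo {
    private static final Logger LOGGER = Logger.getLogger(LogDemo.class.getName());

    public static void main(String[] args) throws IOException {
        FileHandler handler = new FileHandler("app.log", true); // true = append, don't clear old logs
        handler.setFormatter(new SimpleFormatter());            // human-readable instead of XML
        LOGGER.addHandler(handler);
        LOGGER.setLevel(Level.INFO);
        LOGGER.info("Application started");
        LOGGER.warning("Something unexpected happened");
        handler.close();
    }
}
```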
Regarding the second part of your question (which was edited out for some reason): I would recommend having all threads log to the same file, but logging the thread name along with the log message, allowing you to grep the file for a specific thread if required. Also, with most logging frameworks you can configure a rolling window of the last N log files rather than explicitly deleting old files when the application starts.
Apache Log4j does everything you require. I hope that you can figure out how to use it on your own.
Take a look at Log4j, and specifically this set of step-by-step examples. It's pretty trivial.