I'm trying to figure out how to continuously read a file and, once a new line is added, output that line. I'm doing this using a sleeping thread; however, it just seems to blow through the whole file and then exit the program.
Any suggestions what I'm doing wrong?
Here is my code:
import java.io.*;
import java.lang.*;
import java.util.*;

class jtail {
    public static void main(String args[])
            throws InterruptedException, IOException {
        BufferedReader br = new BufferedReader(
                new FileReader("\\\\server01\\data\\CommissionPlanLog.txt"));
        String line = null;
        while (br.nextLine) {
            line = br.readLine();
            if (line == null) {
                // wait until there is more of the file for us to read
                Thread.sleep(1000);
            } else {
                System.out.println(line);
            }
        }
    } // end main
} // end class jtail
thanks in advance
UPDATE: I've since changed the line "while (br.nextLine) {" to just "while (true) {"
This is somewhat old, but I have used this mechanism and it works pretty well.
edit: link no longer works, but I found it in the internet archive
https://web.archive.org/web/20160510001134/http://www.informit.com/guides/content.aspx?g=java&seqNum=226
The trick is to use a java.io.RandomAccessFile, and periodically check whether the file length is greater than your current file position. If it is, then you read the data. When you hit the length, you wait. Wash, rinse, repeat.
I copied the code, just in case that new link stops working
package com.javasrc.tuning.agent.logfile;
import java.io.*;
import java.util.*;
/**
* A log file tailer is designed to monitor a log file and send notifications
* when new lines are added to the log file. This class has a notification
* strategy similar to a SAX parser: implement the LogFileTailerListener interface,
* create a LogFileTailer to tail your log file, add yourself as a listener, and
* start the LogFileTailer. It is your job to interpret the results, build meaningful
* sets of data, etc. This tailer simply fires notifications containing new log file lines,
* one at a time.
*/
public class LogFileTailer extends Thread
{
/**
* How frequently to check for file changes; defaults to 5 seconds
*/
private long sampleInterval = 5000;
/**
* The log file to tail
*/
private File logfile;
/**
* Defines whether the log file tailer should include the entire contents
* of the existing log file or tail from the end of the file when the tailer starts
*/
private boolean startAtBeginning = false;
/**
* Is the tailer currently tailing?
*/
private boolean tailing = false;
/**
* Set of listeners
*/
private Set listeners = new HashSet();
/**
* Creates a new log file tailer that tails an existing file and checks the file for
* updates every 5000ms
*/
public LogFileTailer( File file )
{
this.logfile = file;
}
/**
* Creates a new log file tailer
*
* @param file The file to tail
* @param sampleInterval How often to check for updates to the log file (default = 5000ms)
* @param startAtBeginning Should the tailer process the entire file and continue
*                         tailing (true), or simply start tailing from the end of
*                         the file (false)
*/
public LogFileTailer( File file, long sampleInterval, boolean startAtBeginning )
{
this.logfile = file;
this.sampleInterval = sampleInterval;
this.startAtBeginning = startAtBeginning;
}
public void addLogFileTailerListener( LogFileTailerListener l )
{
this.listeners.add( l );
}
public void removeLogFileTailerListener( LogFileTailerListener l )
{
this.listeners.remove( l );
}
protected void fireNewLogFileLine( String line )
{
for( Iterator i=this.listeners.iterator(); i.hasNext(); )
{
LogFileTailerListener l = ( LogFileTailerListener )i.next();
l.newLogFileLine( line );
}
}
public void stopTailing()
{
this.tailing = false;
}
public void run()
{
// The file pointer keeps track of where we are in the file
long filePointer = 0;
// Determine start point
if( this.startAtBeginning )
{
filePointer = 0;
}
else
{
filePointer = this.logfile.length();
}
try
{
// Start tailing
this.tailing = true;
RandomAccessFile file = new RandomAccessFile( logfile, "r" );
while( this.tailing )
{
try
{
// Compare the length of the file to the file pointer
long fileLength = this.logfile.length();
if( fileLength < filePointer )
{
// Log file must have been rotated or deleted;
// reopen the file and reset the file pointer
file = new RandomAccessFile( logfile, "r" );
filePointer = 0;
}
if( fileLength > filePointer )
{
// There is data to read
file.seek( filePointer );
String line = file.readLine();
while( line != null )
{
this.fireNewLogFileLine( line );
line = file.readLine();
}
filePointer = file.getFilePointer();
}
// Sleep for the specified interval
sleep( this.sampleInterval );
}
catch( Exception e )
{
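// ignore the error and keep tailing; a transient read failure should not stop the thread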
}
}
// Close the file that we are tailing
file.close();
}
catch( Exception e )
{
e.printStackTrace();
}
}
}
If you're planning to implement this in a reasonably sized application where multiple objects might be interested in processing the new lines coming into the file, you might want to consider the Observer pattern.
The object reading from the file will notify each object subscribed to it as soon as a line has been processed.
This will allow you to keep logic well separated on the class where it's needed.
Also consider org.apache.commons.io.input.Tailer if you do not have the requirement of writing this from scratch.
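For reference, a minimal sketch of what that looks like (this assumes commons-io is on the classpath; the path is the one from the question, and the class name is mine):

import java.io.File;
import org.apache.commons.io.input.Tailer;
import org.apache.commons.io.input.TailerListenerAdapter;

public class CommissionLogTailer {
    public static void main(String[] args) throws InterruptedException {
        // Fires handle(line) for every new line appended, polling once per second.
        Tailer.create(new File("\\\\server01\\data\\CommissionPlanLog.txt"),
                new TailerListenerAdapter() {
                    @Override
                    public void handle(String line) {
                        System.out.println(line);
                    }
                }, 1000);
        Thread.sleep(Long.MAX_VALUE); // the tailer runs on its own daemon thread
    }
}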
The way your code is written now, you will drop out of your while loop as soon as 'line == null', because you are checking whether there is a next line before you even get into the loop.
Instead, try doing a while(true){ } loop. That way, you will always be looping through it, catching your pause cases, until you hit a condition that would cause the program to end.
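As a sketch, the original program rewritten along those lines (same path as the question; the one-second sleep is arbitrary, and the class name is mine):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

class jtail2 {
    public static void main(String[] args)
            throws InterruptedException, IOException {
        BufferedReader br = new BufferedReader(
                new FileReader("\\\\server01\\data\\CommissionPlanLog.txt"));
        while (true) {
            String line = br.readLine();
            if (line == null) {
                Thread.sleep(1000); // at EOF: wait for the writer to append more
            } else {
                System.out.println(line);
            }
        }
    }
}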
Related
I'm trying to produce a file listing of a given directory and its subdirectories on an FTP server.
The server works fine, and I have been able to produce the file listing of the current directory successfully. Things get complicated when I try to list the subdirectories and their files.
I was asked not to use a recursive algorithm, so I did some research of my own. I have tried using threads (for every directory found, start a new thread), but I wasn't able to keep my connection stable and open. Any ideas on how to do this correctly with threads, or other alternatives?
EDIT: below is my code; when using the recursive call (last line of code), it works.
class TEST {
public static synchronized void main(String[] args) {
String server = args[0]; //server,path will be given as an arguments
String pass = "SOMEPASS";
String user = "SOMEUSER";
int port = 21;
FTPClient ftpClient = new FTPClient();
try {
ftpClient.connect(server, port);
showServerReply(ftpClient);
int replyCode = ftpClient.getReplyCode();
if (!FTPReply.isPositiveCompletion(replyCode)) {
System.out.println("Connect failed");
return;
}
boolean success = ftpClient.login(user, pass);
showServerReply(ftpClient);
if (!success) {
System.out.println("Could not login to the server");
return;
}
/*START THE FILE LISTING HERE*/
} catch (IOException ex) {
System.out.println("Oops! Something wrong happened");
ex.printStackTrace();
} finally {
// logs out and disconnects from server
try {
if (ftpClient.isConnected()) {
ftpClient.logout();
ftpClient.disconnect();
}
} catch (IOException ex) {
ex.printStackTrace();
}
}
}
private static void showServerReply(FTPClient ftpClient) {
String[] replies = ftpClient.getReplyStrings();
if (replies != null && replies.length > 0) {
for (String aReply : replies) {
System.out.println("SERVER: " + aReply);
}
}
}
private static void scanDir(FTPClient client, String path) throws IOException {
FTPFile[] files = client.listFiles(path); // Search all the files in the current directory
for (int j = 0; j < files.length; j++) {
System.out.println(files[j].getName()); // Print the name of each files
}
FTPFile[] directories = client.listDirectories(path); // Search all the directories in the current directory
for (int i = 0; i < directories.length; i++) {
String dirPath = directories[i].getName();
System.out.println(dirPath); // Print the path of a sub-directory
scanDir(client,dirPath); // Call recursively the method to display the files in the sub-directory DONT WANT TO DO THAT...
}
}
}
Okay, here is an example of how to handle it non-recursively, but with lists.
Mind that this example is based on accessing the local filesystem, but it can easily be rewritten/extended for any kind of hierarchical/recursive structure.
package stackoverflow.nonrecursivefilesearch;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.stream.Stream;
public class NonRecursiveFileSearch {
public static void main(final String[] args) throws IOException {
final File searchDir = new File("D:\\test\\maven-test"); // set one
System.out.println("\nOld Java");
printDirs(listFiles_old(searchDir, true, true), "OLD: Depth first, include dirs");
printDirs(listFiles_old(searchDir, true, false), "OLD: Breadth first, include dirs");
printDirs(listFiles_old(searchDir, false, true), "OLD: Depth first, exclude dirs");
printDirs(listFiles_old(searchDir, false, false), "OLD: Breadth first, exclude dirs");
System.out.println("\nNew java.io with streams");
printDirs(listFiles_newIO(searchDir, true), "Java NIO, include dirs");
printDirs(listFiles_newIO(searchDir, false), "Java NIO, exclude dirs");
}
/**
* this is the way to 'manually' find files in hierarchical/recursive structures
*
* reminder: "Depth First" is not a real depth-first implementation
* real depth-first would iterate subdirs immediately.
* this implementation iterates breadth first, but descends into subdirs before it handles same-level directories
* advantage of this implementation is its speed, no need for additional lists etc.
*
* in case you want to exclude recursion traps made possible by symbolic or hard links, you could introduce a hashset/treeset with
* visited files (use filename strings retrieved with canonicalpath).
* in the loop, check if the current canonical filename string is contained in the hash/treeset
*/
static public ArrayList<File> listFiles_old(final File pDir, final boolean pIncludeDirectories, final boolean pDepthFirst) {
final ArrayList<File> found = new ArrayList<>();
final ArrayList<File> todo = new ArrayList<>();
todo.add(pDir);
while (todo.size() > 0) {
final int removeIndex = pDepthFirst ? todo.size() - 1 : 0;
final File currentDir = todo.remove(removeIndex);
if (currentDir == null || !currentDir.isDirectory()) continue;
final File[] files = currentDir.listFiles();
for (final File file : files) {
if (file.isDirectory()) {
if (pIncludeDirectories) found.add(file);
// additional directory filters go here
todo.add(file);
} else {
// additional file filters go here
found.add(file);
}
}
}
return found;
}
static private void printDirs(final ArrayList<File> pFiles, final String pTitle) {
System.out.println("====================== " + pTitle + " ======================");
for (int i = 0; i < pFiles.size(); i++) {
final File file = pFiles.get(i);
System.out.println(i + "\t" + file.getAbsolutePath());
}
System.out.println("============================================================");
}
/**
* this is the java.nio approach. this is NOT a good solution for cases where you have to retrieve/handle files in your own code.
* it is only useful if an NIO class provides support. in this case, the NIO class java.nio.file.Files helps with handling local files.
* if NIO or your target system does not offer such helper methods, this approach is harder to implement, as you have to write the helper method yourself.
*/
static public Stream<Path> listFiles_newIO(final File pDir, final boolean pIncludeDirectories) throws IOException {
final Stream<Path> stream = Files.find(pDir.toPath(), 100,
(path, basicFileAttributes) -> {
final File file = path.toFile(); // conversion to File for easier access (f.e. isDirectory()), could also use NIO methods
return (pIncludeDirectories || !file.isDirectory() /* additional filters go here */ );
});
return stream;
}
static private void printDirs(final Stream<Path> pStream, final String pTitle) {
System.out.println("====================== " + pTitle + " ======================");
pStream.forEach(System.out::println);
System.out.println("============================================================");
}
}
AND, one must add, java.nio.file.Files.find() might itself be implemented recursively. But as it's just one call, it could arguably count as 'non-recursive' too.
ALSO, as the OP stated in comments, one might use a Stack or other LIFO/FIFO collections: LIFO for a depth-first, FIFO for a breadth-first approach.
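Applied back to the FTP question, the same worklist idea might look like the sketch below (Apache Commons Net assumed; the "." / ".." guard matters because some servers include them in listings and they would otherwise loop forever):

import java.io.IOException;
import java.util.ArrayDeque;
import java.util.Deque;
import org.apache.commons.net.ftp.FTPClient;
import org.apache.commons.net.ftp.FTPFile;

public class FtpWalk {
    static void listAll(FTPClient client, String rootPath) throws IOException {
        Deque<String> todo = new ArrayDeque<>();
        todo.add(rootPath);
        while (!todo.isEmpty()) {
            String path = todo.remove(); // FIFO -> breadth-first
            for (FTPFile f : client.listFiles(path)) {
                if (".".equals(f.getName()) || "..".equals(f.getName())) {
                    continue;
                }
                String child = path.endsWith("/") ? path + f.getName()
                                                  : path + "/" + f.getName();
                System.out.println(child);
                if (f.isDirectory()) {
                    todo.add(child); // descend later, without recursion
                }
            }
        }
    }
}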
I would like to know how to properly put in a text file and then convert it to an HTML file.
When I try putting a text file I get the errors
java.security.InvalidParameterException: Configuration file unreadable.
at com.outsideinsdk.ExportTest.parseConfig(ExportTest.java:51)
at com.outsideinsdk.ExportTest.<init>(ExportTest.java:35)
at com.outsideinsdk.ExportTest.main(ExportTest.java:197)
Sample code below
package com.outsideinsdk;
import com.outsideinsdk.*;
import java.io.*;
import java.security.*;
import java.util.*;
/**
* The <code>ExportTest</code> class tests the {@link Export Export} technology
* according to the properties provided in a given configuration file. The
* configuration file is assumed to be correctly formatted.
*
* @author Kevin Glannon
* @version 1.00
* @see Export Export
*/
public class ExportTest
{
private static final String INPUTPATHKEY = "inputpath";
private static final String OUTPUTPATHKEY = "outputpath";
private static final String OUTPUTIDKEY = "outputid";
Properties configProps = new Properties();
/**
* Since <code>ExportTest</code> objects are always associated with a
* configuration file, the constructor requires a configuration file path.
*
* @param cfp The configuration file path.
*/
public ExportTest(String cfp)
throws FileNotFoundException, IOException
{
parseConfig(cfp);
}
/**
* Parse the configuration file specified by the given path.
*
* @param cfp The configuration file path.
*/
public void parseConfig(String cfp)
throws FileNotFoundException, IOException
{
// Assure the configuration file exists and is readable.
File cff = new File(cfp);
if (!cff.exists() || !cff.isFile() || !cff.canRead())
{
throw(new InvalidParameterException("Configuration file unreadable."));
}
BufferedReader cfr = new BufferedReader(new FileReader(cff));
String line;
// Loop over all lines from the file.
while ((line = cfr.readLine()) != null)
{
processLine(line);
}
}
/**
* Support the parsing of the configuration file by processing a given
* line.
*
* @param l A line from a configuration file.
*/
private void processLine(String l)
{
// Look for comments.
int indPound = l.indexOf('#');
// Remove comments and whitespace.
String line = (indPound == -1) ? l.trim() :
l.substring(0, indPound).trim();
if (line.length() != 0)
{
StringTokenizer stl = new StringTokenizer(line);
String key = stl.nextToken();
String value = stl.nextToken();
while(stl.hasMoreTokens())
{
value +=" " + stl.nextToken();
}
// Fill in the appropriate property.
configProps.setProperty(key, value);
}
}
/**
* Run the conversion using the given input path and output path.
*
* @param ifp Input path.
* @param ofp Output path.
* @param timeout Export process timeout in milliseconds.
*/
public void convert(String ifp, String ofp, long timeout)
{
String oid = configProps.getProperty(OUTPUTIDKEY);
// Display the parameters.
System.out.println("Input Path: "+ifp+" Output Path: "+ofp+
" Output ID: "+oid);
// Remove extra control properties.
configProps.remove(INPUTPATHKEY);
configProps.remove(OUTPUTPATHKEY);
// Create list of input files.
File iff = new File(ifp);
File [] iffa;
if (iff.isDirectory())
iffa = iff.listFiles();
else
{
iffa = new File[1];
iffa[0] = iff;
}
// Create output directory if needed. Assuming that if the input path
// is a directory, the output path should also be a directory.
File off = new File(ofp);
if (iff.isDirectory() && !off.exists()) off.mkdir();
// Process the conversion.
Export e = new Export(configProps);
if (off.isDirectory())
{
// The outputid is in the form fi_XXX where XXX is a reasonable
// extension so we take the extension for the oid.
// oid.substring(3) means to get the string following the fi_
String ext = "." + oid.substring(3);
for (int i=0; i<iffa.length; i++)
{
String ifn = iffa[i].toString();
String ofn = ofp + File.separator + iffa[i].getName() + ext;
System.out.println("Converting "+ifn+" to "+ofn);
ExportStatusCode result = e.convert(ifn, ofn, oid, timeout);
if (result.getCode() == ExportStatusCode.SCCERR_OK.getCode())
{
System.out.println("Conversion Successful!");
}
else {
System.out.println("Conversion Error: " + result);
}
}
}
else
{
for (int i=0; i<iffa.length; i++)
{
ExportStatusCode result = e.convert(iffa[i].toString(), ofp, oid, timeout);
if (result.getCode() == ExportStatusCode.SCCERR_OK.getCode())
{
System.out.println("Conversion Successful!");
}
else {
System.out.println("Conversion Error: " + result);
}
}
}
}
/**
* Run the test according to the given arguments. These arguments must adhere to the following usage.<br><br>
* Usage:<br>
* ExportTest InputPath OutputPath ConfigurationFile [Timeout]<br><br>
*
* InputPath and OutputPath may be single files or directories. If InputPath is a directory, then all files in
* that directory will be converted, but without recursion. If OutputPath is a directory, then all converted
* files from InputPath are placed in the OutputPath directory by appending an extension which represents the
* output file type. Timeout is in milliseconds.
*
* @param args Command line arguments.
*/
public static void main(String[] args)
{
int count = args.length;
// Check for specification of configuration file.
if (count != 3 && count != 4)
{
System.out.println("Input path, output path and configuration file are required.");
System.out.println("Usage: ExportTest InputPath OutputPath "+
"ConfigurationFile [Timeout(in milliseconds)]");
System.out.println();
}
else
{
ExportTest ct = null;
try
{
ct = new ExportTest(args[2]);
}
catch (Exception ex)
{
ex.printStackTrace();
return;
}
long timeout = 0;
if( count == 4 )
{
timeout = Integer.decode( args[3] ).longValue();
}
ct.convert(args[0], args[1], timeout);
}
}
}
This is what my program arguments look like: (screenshot).
This is where the yes.txt files are located in the project called explorer: (screenshot).
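Given the stack trace, the exists()/isFile()/canRead() check in parseConfig is what fails, so the path in args[2] does not resolve to a readable file from the program's working directory. A quick debugging sketch (not part of the SDK) to see what the program actually receives:

File cff = new File(args[2]);
System.out.println("Config path: " + cff.getAbsolutePath()
        + " exists=" + cff.exists()
        + " isFile=" + cff.isFile()
        + " canRead=" + cff.canRead());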
Say the user runs SomeProgram.java to calculate a bunch of stuff. One of the things they want to keep track of is how many times the program has been run, outputting the current run number. This is how far I got, but it resets each time.
public class SomeProgram
{
public volatile int counter = 1;
public int getNextRun()
{
return counter++;
}
//calculates a bunch of variable that get output to user
public static void main(String args[])
{
SomeProgram variable = new SomeProgram();
int runNumber = variable.getNextRun();
System.out.println(runNumber + "Some bunch of calculations");
}
}
Whenever the user stops running your program, you're going to lose any variables stored in memory, so you're going to have to store that value somewhere else. The easiest solution would be to store it in a local file.
If your business needs to know this number, you can have the program call home to a webserver every time it starts up - this prevents the user from modifying the file on their computer - but is far more complicated to set up, and some users might not appreciate this unexpected behavior.
A complete implementation which stores the updated counter in a file; invoke it whenever you want the counter to increment (i.e. when the program starts). If the file doesn't exist, it is created. This method returns the updated counter, or 0 if there was an IOException.
public static int updateCounter() {
String counterFileName = "counter.txt";
int counter = 0;
File counterFile = new File(counterFileName);
if (counterFile.isFile()) {
try (BufferedReader reader = new BufferedReader(new FileReader(counterFileName))) {
counter = Integer.parseInt(reader.readLine());
} catch (IOException e) {
e.printStackTrace();
return 0;
}
}
try (FileWriter writer = new FileWriter(counterFileName)) {
writer.write(String.valueOf(++counter));
} catch (IOException e) {
e.printStackTrace();
return 0;
}
return counter;
}
Writing to a local file is not a good idea. You'd have to implement a locking mechanism on your local file, otherwise you'll suffer from race conditions when several program instances start simultaneously.
An alternative idea is to log each run into persistent storage. If you write each run's date and time to the db, you'll be able to calculate the number of runs for an arbitrary time interval.
The actual implementation depends on your requirements.
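For the file-based variant, a sketch of the locking idea using java.nio.channels.FileLock (the file name is arbitrary, and the file is assumed to contain nothing but the counter):

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.channels.FileLock;
import java.nio.charset.StandardCharsets;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class RunCounter {
    public static int nextRun() throws IOException {
        try (FileChannel ch = FileChannel.open(Paths.get("counter.txt"),
                StandardOpenOption.CREATE, StandardOpenOption.READ,
                StandardOpenOption.WRITE);
             FileLock lock = ch.lock()) { // blocks until no other process holds the lock
            ByteBuffer buf = ByteBuffer.allocate(32);
            int n = ch.read(buf);
            int counter = (n > 0)
                    ? Integer.parseInt(new String(buf.array(), 0, n,
                            StandardCharsets.US_ASCII).trim())
                    : 0;
            counter++;
            ch.truncate(0);
            ch.position(0);
            ch.write(ByteBuffer.wrap(String.valueOf(counter)
                    .getBytes(StandardCharsets.US_ASCII)));
            return counter;
        }
    }
}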
You can use a Properties file:
public void loadProperties(String fileName)
{
Properties props = new Properties();
InputStream is = null;
// First try loading from the current directory
try {
File f = new File(fileName);
is = new FileInputStream( f );
}catch ( Exception e ) {
is = null;
e.printStackTrace();
}
try {
if ( is == null ) {
// Try loading from classpath
is = getClass().getResourceAsStream("example.properties");
}
// Try loading properties from the file (if found)
props.load( is );
String counter1 = props.getProperty("COUNTER_RUN");
String counter2 = props.getProperty("COUNTER_OUTPUT");
counterRun = Integer.parseInt(counter1);
counterOutput = Integer.parseInt(counter2);
}catch ( Exception e ) {
e.printStackTrace();
}
}
public void saveProperties(String fileName) {
try {
Properties props = new Properties();
props.setProperty("COUNTER_RUN", ""+counterRun );
props.setProperty("COUNTER_OUTPUT", ""+counterOutput );
File f = new File(fileName);
OutputStream out = new FileOutputStream( f );
props.store(out, "Config params");
} catch (Exception e ) { e.printStackTrace(); }
}
counterRun and counterOutput are fields declared elsewhere in the class
File example.properties
#Config params
#Tue May 03 14:17:35 COT 2016
COUNTER_RUN=241
COUNTER_OUTPUT=123
I have written a program which downloads files in parallel using java.nio, by creating a thread per file download.
package com.java.tftp.nio;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.SocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.DatagramChannel;
import java.nio.channels.SelectionKey;
import java.nio.channels.Selector;
import java.util.Iterator;
import java.util.List;
import java.util.Set;
/**
* This class is used to download files concurrently from tftp server by
* configuring the filenames, no of files.
*
* @author SHRIRAM
*
*/
public class TFTP_NIO_Client {
/**
* destination folder
* */
private String destinationFolder;
/**
* list of files names to download
* */
private List<String> fileNames;
/**
* integer indicates the number of files to download concurrently
* */
private int noOfFilesToDownload;
public TFTP_NIO_Client(List<String> fileNames, String destinationFolder,
int noOfFilesToDownload) {
this.destinationFolder = destinationFolder;
this.fileNames = fileNames;
this.noOfFilesToDownload = noOfFilesToDownload;
initializeHandlers();
}
/**
* This method creates threads to register the channel to process download
* files concurrently.
*
* #param noOfFilesToDownload
* - no of files to download
*/
private void initializeHandlers() {
for (int i = 0; i < noOfFilesToDownload; i++) {
try {
Selector aSelector = Selector.open();
SelectorHandler theSelectionHandler = new SelectorHandler(
aSelector, fileNames.get(i));
theSelectionHandler.start();
} catch (IOException e) {
e.printStackTrace();
}
}
}
/**
* Sets up an RRQ/WRQ packet.
* Packet : | Opcode | FileName | 0 | Mode | 0 |
* Filename -> filename as an array of bytes; 0 -> terminating zero byte;
* Mode -> 'netascii' or 'octet' as a byte array
*
* @param aOpcode
* @param aMode
* @param aFileName
* @throws IOException
*/
private void sendRequest(int aOpcode, int aMode, String aFileName,
DatagramChannel aChannel, InetSocketAddress aAddress)
throws IOException {
// Read request packet
TFTPReadRequestPacket theRequestPacket = new TFTPReadRequestPacket();
aChannel.send(
theRequestPacket.constructReadRequestPacket(aFileName, aMode),
aAddress);
}
/**
* Sends a TFTP ACK packet.
* Packet : | Opcode | Block# |
* Opcode -> 4 -> 2 bytes; Block -> block number -> 2 bytes
*
* @param aBlockNumber
*/
private ByteBuffer sendAckPacket(int aBlockNumber) {
// acknowledge packet
TFTPAckPacket theAckPacket = new TFTPAckPacket();
return theAckPacket.getTFTPAckPacket(aBlockNumber);
}
/**
* This class is used to handle concurrent downloads from the server.
*
* */
public class SelectorHandler extends Thread {
private Selector selector;
private String fileName;
/**
* flag to indicate the file completion.
* */
private boolean isFileReadFinished = false;
public SelectorHandler(Selector aSelector, String aFileName)
throws IOException {
this.selector = aSelector;
this.fileName = aFileName;
registerChannel();
}
private void registerChannel() throws IOException {
DatagramChannel theChannel = DatagramChannel.open();
theChannel.configureBlocking(false);
selector.wakeup();
theChannel.register(selector, SelectionKey.OP_READ);
sendRequest(Constants.OP_READ, Constants.ASCII_MODE, fileName,
theChannel, new InetSocketAddress(Constants.HOST,
Constants.TFTP_PORT));
}
@Override
public void run() {
process();
}
private void process() {
System.out.println("Download started for " + fileName + " ");
File theFile = new File(destinationFolder
+ fileName.substring(fileName.lastIndexOf("/")));
FileOutputStream theFout = null;
try {
theFout = new FileOutputStream(theFile);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
while (!isFileReadFinished) {
try {
if (selector.select() == 0) {
try {
// sleep 2sec was introduced because selector is
// thread safe but keys are not thread safe
Thread.sleep(2000);
} catch (InterruptedException e) {
continue;
}
continue;
}
Set<SelectionKey> theSet = selector.selectedKeys();
Iterator<SelectionKey> theSelectedKeys = theSet.iterator();
synchronized (theSelectedKeys) {
while (theSelectedKeys.hasNext()) {
SelectionKey theKey = theSelectedKeys.next();
theSelectedKeys.remove();
if (theKey.isReadable()) {
isFileReadFinished = read(theKey, theFout,
fileName);
if (!isFileReadFinished) {
theKey.interestOps(SelectionKey.OP_READ);
}
} else if (theKey.isWritable()) {
// there is no implementation for file write to
// server.
theKey.interestOps(SelectionKey.OP_READ);
}
}
}
} catch (IOException ie) {
ie.printStackTrace();
}
}
System.out.println("Download finished for " + fileName);
try {
if (selector.isOpen()) {
selector.close();
}
if (theFout != null) {
theFout.close();
}
} catch (IOException ie) {
}
}
}
/**
* @param aKey
*            registered key for the selector
* @param aOutStream
*            file output stream to write the file contents
* @return boolean
* @throws IOException
*/
private boolean read(SelectionKey aKey, OutputStream aOutStream,
String aFileName) throws IOException {
DatagramChannel theChannel = (DatagramChannel) aKey.channel();
// data packet
TFTPDataPacket theDataPacket = new TFTPDataPacket();
ByteBuffer theReceivedBuffer = theDataPacket.constructTFTPDataPacket();
SocketAddress theSocketAddress = theChannel.receive(theReceivedBuffer);
theReceivedBuffer.flip();
byte[] theBuffer = theReceivedBuffer.array();
byte[] theDataBuffer = theDataPacket.getDataBlock();
if (theDataPacket.getOpCode() == Constants.OP_DATA) {
int theLimit = theDataPacket.getLimit();
// checks the limit of the buffer because a packet with data less
// than 512 bytes of content signals that it is the last packet in
// transmission for this particular file
if (theLimit != Constants.MAX_BUFFER_SIZE
&& theLimit < Constants.MAX_BUFFER_SIZE) {
byte[] theLastBlock = new byte[theLimit];
System.arraycopy(theBuffer, 0, theLastBlock, 0, theLimit);
// writes the lastblock
aOutStream.write(theLastBlock);
// sends an acknowledgment to the server using TFTP packet
// block number
theChannel
.send(sendAckPacket((((theBuffer[2] & 0xff) << 8) | (theBuffer[3] & 0xff))),
theSocketAddress);
if (theChannel.isOpen()) {
theChannel.close();
}
return true;
} else {
aOutStream.write(theDataBuffer);
// sends an acknowledgment to the server using TFTP packet
// block number
theChannel
.send(sendAckPacket((((theBuffer[2] & 0xff) << 8) | (theBuffer[3] & 0xff))),
theSocketAddress);
return false;
}
} else if (Integer.valueOf(theBuffer[1]) == Constants.OP_ERROR) {
System.out.println("File : " + aFileName + " not found ");
handleError(theReceivedBuffer);
}
return false;
}
/**
* This method handles the error packet received from Server.
*
* @param aBuffer
*/
private void handleError(ByteBuffer aBuffer) {
// Error packet
new TFTPErrorPacket(aBuffer);
}
}
Is it possible to download multiple files in parallel using java.nio without creating a thread per file download? If yes, can anybody suggest a solution to proceed further?
I would suggest an approach to achieve what you are aiming for:
Let L be the list of files to be downloaded.
Create a Map M which maps each file name to be downloaded to its corresponding Selector instance.
For each file F in L:
Get the Selector SK from M corresponding to F.
Process the state of that Selector by checking whether any of its events are ready.
If processing is complete, set the Selector corresponding to F to null. This will help in identifying files whose processing is completed. Alternatively, you can remove F from L, so that on the next pass of the loop you only process files that are not yet completely downloaded.
The above being said, I am curious to understand why you would want to attempt such a feat. If the thought process behind this requirement is to reduce the number of threads to one, then it is not correct. Remember, you would end up really taxing that single thread, and your throughput would not necessarily be optimal, since the one thread would be dealing with both network and disk I/O. Also, consider the case of encountering an exception while writing one of the several files to disk - you would end up aborting the transfer for all the files; something I am sure you do not want.
A better and more scalable approach would be to poll selectors on a single thread, but hand off any I/O activity to a worker thread. A better approach still would be to read the techniques presented in Doug Lea's paper and implement them. In fact, the Netty library already implements this pattern and is widely used in production.
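A rough skeleton of that single-selector-thread / worker-pool pattern (a sketch: channel setup and the TFTP packet handling from the question are elided, and the pool size is arbitrary):

import java.io.IOException;
import java.nio.channels.SelectionKey;
import java.nio.channels.Selector;
import java.util.Iterator;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class SelectorLoop implements Runnable {
    private final Selector selector;
    private final ExecutorService workers = Executors.newFixedThreadPool(4);

    public SelectorLoop(Selector selector) {
        this.selector = selector;
    }

    @Override
    public void run() {
        try {
            while (selector.isOpen()) {
                if (selector.select() == 0) {
                    continue;
                }
                Iterator<SelectionKey> it = selector.selectedKeys().iterator();
                while (it.hasNext()) {
                    SelectionKey key = it.next();
                    it.remove();
                    if (key.isReadable()) {
                        key.interestOps(0); // stop watching until the worker finishes
                        workers.submit(() -> handleRead(key));
                    }
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private void handleRead(SelectionKey key) {
        // receive the datagram, write it to disk, send the ACK
        // (see the read(...) method in the question), then re-arm the key
        key.interestOps(SelectionKey.OP_READ);
        key.selector().wakeup();
    }
}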
I have an application that writes information to file. This information is used post-execution to determine pass/failure/correctness of the application. I'd like to be able to read the file as it is being written so that I can do these pass/failure/correctness checks in real time.
I assume it is possible to do this, but what are the gotchas involved when using Java? If the reading catches up to the writing, will it just wait for more writes until the file is closed, or will the read throw an exception at that point? If the latter, what do I do then?
My intuition is currently pushing me towards BufferedStreams. Is this the way to go?
I could not get the example to work using FileChannel.read(ByteBuffer) because it isn't a blocking read. I did, however, get the code below to work:
boolean running = true;
BufferedInputStream reader = new BufferedInputStream(new FileInputStream( "out.txt" ) );
public void run() {
while( running ) {
if( reader.available() > 0 ) {
System.out.print( (char)reader.read() );
}
else {
try {
sleep( 500 );
}
catch( InterruptedException ex ) {
running = false;
}
}
}
}
Of course the same thing would work as a timer instead of a thread, but I leave that up to the programmer. I'm still looking for a better way, but this works for me for now.
Oh, and I'll caveat this with: I'm using 1.4.2. Yes I know I'm in the stone ages still.
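For what it's worth, the timer variant of the same polling idea could look like this (sketched against a current JDK rather than 1.4; same 500 ms interval and file name as above):

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Timer;
import java.util.TimerTask;

public class PollingReader {
    public static void main(String[] args) throws IOException {
        BufferedInputStream reader =
                new BufferedInputStream(new FileInputStream("out.txt"));
        new Timer().scheduleAtFixedRate(new TimerTask() {
            @Override
            public void run() {
                try {
                    // drain whatever has been appended since the last tick
                    while (reader.available() > 0) {
                        System.out.print((char) reader.read());
                    }
                } catch (IOException e) {
                    cancel(); // stop polling on error
                }
            }
        }, 0, 500);
    }
}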
If you want to read a file while it is being written, and only read the new content, the following will help you achieve that.
To run this program, launch it from a command prompt/terminal window and pass the file name to read. It will keep reading the file until you kill the program.
java FileReader c:\myfile.txt
As you type a line of text and save the file from Notepad, you will see the text printed in the console.
import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;

public class FileReader {
public static void main(String args[]) throws Exception {
if(args.length>0){
File file = new File(args[0]);
System.out.println(file.getAbsolutePath());
if(file.exists() && file.canRead()){
long fileLength = file.length();
readFile(file,0L);
while(true){
if(fileLength<file.length()){
readFile(file,fileLength);
fileLength=file.length();
}
// brief pause so the polling loop doesn't spin at 100% CPU
Thread.sleep(1000);
}
}
}else{
System.out.println("no file to read");
}
}
public static void readFile(File file,Long fileLength) throws IOException {
String line = null;
BufferedReader in = new BufferedReader(new java.io.FileReader(file));
in.skip(fileLength);
while((line = in.readLine()) != null)
{
System.out.println(line);
}
in.close();
}
}
You might also take a look at the Java FileChannel API for locking part of a file.
http://java.sun.com/javase/6/docs/api/java/nio/channels/FileChannel.html
This function of the FileChannel might be a start
lock(long position, long size, boolean shared)
An invocation of this method will block until the region can be locked
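There is also a non-blocking variant, tryLock, which returns null instead of waiting when another process holds a conflicting lock. A sketch (the 1024-byte region size is arbitrary):

import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;
import java.nio.channels.FileLock;

public class RegionLockCheck {
    // Try to take a shared lock on the first 1024 bytes without blocking.
    static boolean regionFree(String path) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(path, "rw");
             FileChannel fc = raf.getChannel()) {
            FileLock lock = fc.tryLock(0, 1024, true); // shared = true
            if (lock == null) {
                return false; // another program holds a conflicting lock
            }
            lock.release();
            return true;
        }
    }
}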
I totally agree with Joshua's response; Tailer is fit for the job in this situation. Here is an example:
It writes a line to a file every 150 ms, while reading that very same file every 2500 ms.
import java.io.File;
import java.io.FileOutputStream;

import org.apache.commons.io.input.Tailer;
import org.apache.commons.io.input.TailerListenerAdapter;

public class TailerTest
{
public static void main(String[] args)
{
File f = new File("/tmp/test.txt");
MyListener listener = new MyListener();
Tailer.create(f, listener, 2500);
try
{
FileOutputStream fos = new FileOutputStream(f);
int i = 0;
while (i < 200)
{
fos.write(("test" + ++i + "\n").getBytes());
Thread.sleep(150);
}
fos.close();
}
catch (Exception e)
{
e.printStackTrace();
}
}
private static class MyListener extends TailerListenerAdapter
{
@Override
public void handle(String line)
{
System.out.println(line);
}
}
}
The answer seems to be "no" ... and "yes". There seems to be no real way to know if a file is open for writing by another application. So, reading from such a file will just progress until content is exhausted. I took Mike's advice and wrote some test code:
Writer.java writes a string to file and then waits for the user to hit enter before writing another line to file. The idea being that it could be started up, then a reader can be started to see how it copes with the "partial" file. The reader I wrote is in Reader.java.
Writer.java
public class Writer extends Object
{
Writer () {
}
public static String[] strings =
{
"Hello World",
"Goodbye World"
};
public static void main(String[] args)
throws java.io.IOException {
java.io.PrintWriter pw =
new java.io.PrintWriter(new java.io.FileOutputStream("out.txt"), true);
for(String s : strings) {
pw.println(s);
System.in.read();
}
pw.close();
}
}
Reader.java
public class Reader extends Object
{
Reader () {
}
public static void main(String[] args)
throws Exception {
java.io.FileInputStream in = new java.io.FileInputStream("out.txt");
java.nio.channels.FileChannel fc = in.getChannel();
java.nio.ByteBuffer bb = java.nio.ByteBuffer.allocate(10);
while(fc.read(bb) >= 0) {
bb.flip();
while(bb.hasRemaining()) {
System.out.println((char)bb.get());
}
bb.clear();
}
System.exit(0);
}
}
No guarantees that this code is best practice.
This leaves the option suggested by Mike of periodically checking whether there is new data to be read from the file. This then requires user intervention to close the file reader when it is determined that the reading is complete. Or, the reader needs to be made aware of the content of the file and be able to determine an end-of-write condition. If the content were XML, the end of the document could be used to signal this.
There is an open-source Java graphical tail that does this.
https://stackoverflow.com/a/559146/1255493
public void run() {
try {
while (_running) {
Thread.sleep(_updateInterval);
long len = _file.length();
if (len < _filePointer) {
// Log must have been jibbled or deleted.
this.appendMessage("Log file was reset. Restarting logging from start of file.");
_filePointer = len;
}
else if (len > _filePointer) {
// File must have had something added to it!
RandomAccessFile raf = new RandomAccessFile(_file, "r");
raf.seek(_filePointer);
String line = null;
while ((line = raf.readLine()) != null) {
this.appendLine(line);
}
_filePointer = raf.getFilePointer();
raf.close();
}
}
}
catch (Exception e) {
this.appendMessage("Fatal error reading log file, log tailing has stopped.");
}
// dispose();
}
On some platforms you can't read a file that another process has open for writing using FileInputStream, FileReader or RandomAccessFile.
But using FileChannel directly will work:
private static byte[] readSharedFile(File file) throws IOException {
byte buffer[] = new byte[(int) file.length()];
final FileChannel fc = FileChannel.open(file.toPath(), EnumSet.of(StandardOpenOption.READ));
final ByteBuffer dst = ByteBuffer.wrap(buffer);
fc.read(dst);
fc.close();
return buffer;
}
Not Java per se, but you may run into issues where you have written something to a file, but it hasn't actually been written yet - it might be in a cache somewhere, and reading from the same file may not give you the new information.
Short version - use flush() or whatever the relevant system call is to ensure that your data is actually written to the file.
Note I am not talking about the OS level disk cache - if your data gets into here, it should appear in a read() after this point. It may be that the language itself caches writes, waiting until a buffer fills up or file is flushed/closed.
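In Java, that language-level buffering is the stream/writer buffer, so the remedy looks like this (a minimal sketch; the file name is arbitrary):

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;

public class FlushedWriter {
    public static void main(String[] args) throws IOException {
        try (BufferedWriter out = new BufferedWriter(new FileWriter("out.txt", true))) {
            out.write("progress: step 1 done");
            out.newLine();
            out.flush(); // push buffered bytes to the OS so a concurrent reader sees them
        }
    }
}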
I've never tried it, but you should write a test case to see whether reading from a stream after you have hit the end will work, regardless of whether more data is later written to the file.
Is there a reason you can't use a piped input/output stream? Is the data being written and read from the same application (if so, you have the data, why do you need to read from the file)?
Otherwise, maybe read till end of file, then monitor for changes and seek to where you left off and continue... though watch out for race conditions.
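If both ends really are in the same JVM, the piped-stream version is only a few lines (a sketch; the writer thread stands in for the part of the application that currently writes the file):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.io.PrintWriter;

public class PipedExample {
    public static void main(String[] args) throws IOException {
        PipedOutputStream out = new PipedOutputStream();
        PipedInputStream in = new PipedInputStream(out);

        new Thread(() -> {
            try (PrintWriter pw = new PrintWriter(out, true)) {
                pw.println("result: PASS"); // stands in for the app's writes
            }
        }).start();

        try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println("checked: " + line); // real-time check goes here
            }
        }
    }
}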