Holding a static StringBuilder - java

We are looking to have a StringBuilder that holds references to some events on the device.
We considered writing to and reading from a file, but the cost of opening and closing a file every time we write to it seems too high.
The issue is that sometimes we get a StackOverflow exception, even though we try to cap the StringBuilder at a defined size:
public class DiagnosticUtil {

    private static final int DIAGNOSTIC_SIZE = 5000;
    public static StringBuilder DIAGNOSTICS_HOLDER = new StringBuilder(DIAGNOSTIC_SIZE);

    public static void addDiagnosticLine(String message) {
        try {
            // Limit the size of the diagnostics collection, keeping only the last 2000 characters
            if (DiagnosticUtil.DIAGNOSTICS_HOLDER.length() > DIAGNOSTIC_SIZE - 300) {
                DiagnosticUtil.DIAGNOSTICS_HOLDER.delete(0, DiagnosticUtil.DIAGNOSTICS_HOLDER.length() - 2000);
            }
            DIAGNOSTICS_HOLDER.append(TimeUtils.getCurrentDate()).append(message).append("\n");
        } catch (Exception e) {
            Timber.d("Error saving additional data");
        }
    }
}
The question is: is this a good approach, or should we save these logs to an external file?
Thanks!

When you create a StringBuilder with a preferred size you consume that memory up front, because StringBuilder internally allocates a char[] of that size before you pass any String to it, so you would need to use the default constructor there. Also, why did you decide to use a StringBuilder instead of a List? I don't see the whole picture, but I think you would prefer to combine the two approaches (an in-memory log and a file store): when you have collected a certain number of messages, simply write them to the file in one batch. This way you don't need to touch the filesystem for every message, and you don't fill memory with that amount of log data. You need code something like this:
public class DiagnosticUtil {

    private static final int THRESHOLD = 1000;
    private static List<String> messages = new ArrayList<>();
    private static final File log = new File("path to your file");

    public static void addDiagnosticLine(String message) {
        messages.add(TimeUtils.getCurrentDate() + message + "\n");
        if (messages.size() > THRESHOLD) {
            // Append the whole batch in one pass; the second FileWriter
            // argument keeps earlier batches instead of overwriting them.
            try (BufferedWriter file = new BufferedWriter(new FileWriter(log, true))) {
                for (String msg : messages) {
                    file.write(msg);
                }
            } catch (IOException e) {
                Timber.d("Error saving additional data " + e);
            }
            messages = new ArrayList<>();
        }
    }
}
Pay attention: this is procedural code, not OOP; util classes are bad practice.

Related

(Java) How would you write a program that counts the number of times it runs and stores that number?

Say the user runs SomeProgram.java to calculate a bunch of stuff. One of the things they want to keep track of is how many times the program has been run, and to output the current run number. This is how far I got, but it resets each time.
public class SomeProgram
{
    public volatile int counter = 1;

    public int getNextRun()
    {
        return counter++;
    }

    // calculates a bunch of variables that get output to the user
    public static void main(String args[])
    {
        SomeProgram variable = new SomeProgram();
        int runNumber = variable.getNextRun();
        System.out.println(runNumber + "Some bunch of calculations");
    }
}
Whenever the user stops running your program, you're going to lose any variables stored in memory, so you're going to have to store that value somewhere else. The easiest solution would be to store it in a local file.
If your business needs to know this number, you can have the program call home to a webserver every time it starts up. This prevents the user from modifying the file on their computer, but it is far more complicated to set up, and some users might not appreciate the unexpected behavior.
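A hypothetical sketch of that call-home ping (the URL is a placeholder, and the server-side counting is not shown):
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

public class CallHome {
    public static void reportRun() {
        try {
            // Placeholder endpoint; a real server would count one run per request.
            URL url = new URL("https://example.com/run-counter");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.getResponseCode(); // forces the request to actually be sent
            conn.disconnect();
        } catch (IOException e) {
            // Fail quietly so the program still works offline.
        }
    }
}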
A complete implementation that stores the updated counter in a file; invoke it whenever you want the counter to increment (i.e. when the program starts). When the file doesn't exist, it is created. The method returns the updated counter, or 0 if there was an IOException.
public static int updateCounter() {
    String counterFileName = "counter.txt";
    int counter = 0;
    File counterFile = new File(counterFileName);
    if (counterFile.isFile()) {
        try (BufferedReader reader = new BufferedReader(new FileReader(counterFileName))) {
            counter = Integer.parseInt(reader.readLine());
        } catch (IOException e) {
            e.printStackTrace();
            return 0;
        }
    }
    try (FileWriter writer = new FileWriter(counterFileName)) {
        writer.write(String.valueOf(++counter));
    } catch (IOException e) {
        e.printStackTrace();
        return 0;
    }
    return counter;
}
Writing to a local file is not a good idea. You'd have to implement a locking mechanism on the file, otherwise you'll suffer race conditions when several program instances start simultaneously.
An alternative is to log each run to persistent storage: if you write each run's date and time to a database, you'll be able to calculate the number of runs for an arbitrary time interval.
The actual implementation depends on your requirements.
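As a rough sketch of that idea, assuming SQLite over JDBC (the driver, the runs.db file name, and the table layout are arbitrary; the sqlite-jdbc driver must be on the classpath):
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class RunLog {
    // Logs one timestamped row per run and returns the total run count.
    // Counting runs for a time interval is then a WHERE clause on started_at.
    public static int logRunAndCount() throws SQLException {
        try (Connection c = DriverManager.getConnection("jdbc:sqlite:runs.db");
             Statement st = c.createStatement()) {
            st.execute("CREATE TABLE IF NOT EXISTS runs (started_at TEXT)");
            st.execute("INSERT INTO runs VALUES (datetime('now'))");
            try (ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM runs")) {
                rs.next();
                return rs.getInt(1);
            }
        }
    }
}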
You can use a Properties file:
public void loadProperties(String fileName)
{
    Properties props = new Properties();
    InputStream is = null;
    // First try loading from the current directory
    try {
        File f = new File(fileName);
        is = new FileInputStream(f);
    } catch (Exception e) {
        is = null;
        e.printStackTrace();
    }
    try {
        if (is == null) {
            // Try loading from the classpath
            is = getClass().getResourceAsStream("example.properties");
        }
        // Try loading properties from the file (if found)
        props.load(is);
        String counter1 = props.getProperty("COUNTER_RUN");
        String counter2 = props.getProperty("COUNTER_OUTPUT");
        counterRun = Integer.parseInt(counter1);
        counterOutput = Integer.parseInt(counter2);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
public void saveProperties(String fileName) {
    try {
        Properties props = new Properties();
        props.setProperty("COUNTER_RUN", "" + counterRun);
        props.setProperty("COUNTER_OUTPUT", "" + counterOutput);
        File f = new File(fileName);
        OutputStream out = new FileOutputStream(f);
        props.store(out, "Config params");
        out.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
counterRun and counterOutput are global variables.
File example.properties:
#Config params
#Tue May 03 14:17:35 COT 2016
COUNTER_RUN=241
COUNTER_OUTPUT=123

Limit Android Filesize

Background
I'm keeping a relatively large text file in Android storage and appending to it periodically, while limiting the file's size to some arbitrary value (say 2MB).
Hopefully I'm missing a function somewhere, or hopefully there is a better way to do this.
Currently, when file A goes over that size, I create a temporary file B, copy the relevant portion of file A (more or less the substring of A starting at byte xxx, where xxx is the number of bytes by which A would exceed the limit if I wrote the next bit of data to the log) plus the current data, then overwrite file A with file B.
This is obviously terribly inefficient...
Another solution that I'm not terribly fond of is to keep two files and toggle between them: clear the next file when the current one is full, and switch to it for output (a rough sketch of this follows).
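Something like this, where the file names and the 2MB limit are arbitrary placeholders:
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

public class ToggleLog {
    private static final long MAX_BYTES = 2L * 1024 * 1024; // say, 2MB
    private final File[] files = { new File("log_a.txt"), new File("log_b.txt") };
    private int current = 0;

    public void append(byte[] data) throws IOException {
        if (files[current].length() + data.length > MAX_BYTES) {
            current = 1 - current;                        // switch to the other file
            new FileOutputStream(files[current]).close(); // truncate it
        }
        try (FileOutputStream out = new FileOutputStream(files[current], true)) {
            out.write(data); // append to the current file
        }
    }
}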
However, it would be super handy if I could just do something like this:
File A = new File("output");
A.chip(500);
or maybe
A.subfile(500,A.length()-500);
TL;DR
Is there a function, or perhaps a library, available for Android that can remove a portion of a file?
Did you already take a look at RandomAccessFile? Though you cannot remove portions of a file, you can seek to any position within the file and even set its length. So if you detect your file has grown too large, just grab the relevant portion, jump to the beginning, set the length to 0, and write the new data.
EDIT:
I wrote a small demo. It shows what happens if the file size is limited to 10 bytes: if you write the values 10 to 15 as strings, separated by commas, then after 10,11,12 the file is written from the beginning again, so after 15 it reads 13,14,15.
public class MainActivity extends Activity {

    private static final String TAG = MainActivity.class.getSimpleName();
    private static final long MAX = 10;
    private static final String FILE_TXT = "file.txt";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        for (int i = 10; i <= 15; i++) {
            if (i > 10) {
                writeToFile(",");
            }
            writeToFile(Integer.toString(i));
        }
    }

    private void writeToFile(String text) {
        try {
            File f = new File(getFilesDir(), FILE_TXT);
            RandomAccessFile file = new RandomAccessFile(f, "rw");
            long currentLength = file.length();
            if (currentLength + text.length() > MAX) {
                file.setLength(0);
            }
            file.seek(file.length());
            file.write(text.getBytes());
            file.close();
        } catch (IOException e) {
            Log.e(TAG, "writeToFile()", e);
        }
        printFileContents();
    }

    private void printFileContents() {
        StringBuilder sb = new StringBuilder();
        try {
            FileInputStream fin = openFileInput(FILE_TXT);
            int ch;
            while ((ch = fin.read()) != -1) {
                sb.append((char) ch);
            }
            fin.close();
        } catch (IOException e) {
            Log.e(TAG, "printFileContents()", e);
        }
        Log.d(TAG, "current content: " + sb.toString());
    }
}

Java FTP Download progress

I have looked at many examples and tried to understand what I'm doing wrong, but with no success; maybe you can help me. It always stops at the second file, while the first one is just created on C:\ with 0 KB size.
files_server is an ArrayList with all the files that I wish to download.
My code:
public FTPConnection() {
    StartD std = new StartD();
    std.start();
}

class StartD extends Thread {
    @Override
    public void run()
    {
        for (int i = 0; i < files_server.size(); i++) {
            err = ftpDownload(files_server.get(i), "C:/" + files_server.get(i));
            if (!err)
            {
                System.out.println("Error in download, breaking");
                break;
            }
        }
    }

    public boolean ftpDownload(String srcFilePath, String desFilePath)
    {
        try {
            FileOutputStream desFileStream = new FileOutputStream(desFilePath);
            InputStream input = mFTPClient.retrieveFileStream(srcFilePath);
            byte[] data = new byte[1024];
            int count;
            while ((count = input.read(data)) != -1)
            {
                desFileStream.write(data, 0, count);
            }
            desFileStream.close();
        } catch (Exception e) {
            return false;
        }
        return true;
    }
}
If I use the function:
public boolean ftpDownload(String srcFilePath, String desFilePath) {
    boolean status = false;
    try {
        FileOutputStream desFileStream = new FileOutputStream(desFilePath);
        status = mFTPClient.retrieveFile(srcFilePath, desFileStream);
        desFileStream.close();
        return status;
    } catch (Exception e) {
    }
    return status;
}
instead, everything works just fine, but I can't monitor the file download progress.
I've only used it for file unzipping and not FTP, but in that case InputStream reads can return zero, so I'd say it's worth trying to change your while loop to something like:
while ((count = input.read(data)) >= 0)
From the Javadoc of public int read(byte[] b) throws IOException:
"Reads some number of bytes from the input stream and stores them into the buffer array b. The number of bytes actually read is returned as an integer. This method blocks until input data is available, end of file is detected, or an exception is thrown. If the length of b is zero, then no bytes are read and 0 is returned."
It could also be that you're assigning count twice, which would silently discard the first buffer-full of data:
int count = input.read(data);
while ((count = input.read(data)) != -1)
So don't assign anything to count when you declare it.
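One more thing worth checking, assuming the FTP client is commons-net's FTPClient: retrieveFileStream() requires a call to completePendingCommand() after the stream has been fully read and closed, otherwise the next transfer on the same connection fails, which matches the "stops at the second file" symptom. A sketch of the download method with that call, the loop fix, and a simple progress printout (the progress reporting is just an illustration):
public boolean ftpDownload(String srcFilePath, String desFilePath) {
    try (FileOutputStream desFileStream = new FileOutputStream(desFilePath);
         InputStream input = mFTPClient.retrieveFileStream(srcFilePath)) {
        if (input == null) {
            return false; // the server refused the transfer
        }
        byte[] data = new byte[1024];
        long total = 0;
        int count;
        while ((count = input.read(data)) >= 0) {
            desFileStream.write(data, 0, count);
            total += count;
            System.out.println("Downloaded " + total + " bytes of " + srcFilePath);
        }
    } catch (IOException e) {
        return false;
    }
    try {
        // Finalize the transfer; without this, the next retrieve fails.
        return mFTPClient.completePendingCommand();
    } catch (IOException e) {
        return false;
    }
}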
Let's assume your library is the FTP client from the commons-net package. It's not easy to figure out what's wrong with your code, because we can't run it and because your description ("it stops at the second file") is not sufficient: does it throw an exception, hang forever, or complete without any side effect? Anyway, I have a couple of pieces of advice:
Use a CountingOutputStream (from Apache commons-io) to monitor progress
Use a ProtocolCommandListener to log what's going on
Also, note that with the double assignment above, the first 1024 bytes are always lost. Finally, I don't know how safe it is to put a file in C:\ with the same name it has on the server: at best it could lead to permission trouble, at worst it may open a security flaw. This doesn't hold if you have some degree of control over the filenames, but consider the advice anyway.
This is a sample client
public class FTP {
public static void main(String[] args) throws SocketException, IOException {
FTPClient client = new FTPClient();
client.addProtocolCommandListener(new ProtocolCommandListener(){
#Override
public void protocolCommandSent(ProtocolCommandEvent evt) {
logger.debug(evt.getMessage());
}
#Override
public void protocolReplyReceived(ProtocolCommandEvent evt) {
logger.debug(evt.getMessage());
}
});
client.connect("ftp.mozilla.org");
client.login("anonymous", "");
client.enterLocalPassiveMode();
OutputStream out = new CountingOutputStream(new NullOutputStream()) {
#Override
public void beforeWrite(int count) {
super.beforeWrite(count);
logger.info("Downloaded " + getCount() + " bytes");
}
};
for (String filename: new String[] {"MD5SUMS", "SHA1SUMS"})
client.retrieveFile("pub/firefox/releases/15.0b4/" + filename, out);
out.close();
client.disconnect();
}
private static Logger logger;
static {
logger = Logger.getLogger(FTP.class.getCanonicalName());
}
}
Once configured, the logger will output the raw socket conversation, which may help you better understand the problem, provided it's on the FTP side and not in the application IO.

design for a wrapper around command-line utilities

I'm trying to come up with a design for a wrapper for invoking command-line utilities from Java. The trouble with Runtime.exec() is that you need to keep reading from the process' out and err streams, or it hangs when it fills its buffers. This has led me to the following design:
public class CommandLineInterface {

    private final Thread stdOutThread;
    private final Thread stdErrThread;
    private final OutputStreamWriter stdin;
    private final History history;

    public CommandLineInterface(String command) throws IOException {
        this.history = new History();
        this.history.addEntry(new HistoryEntry(EntryTypeEnum.INPUT, command));
        Process process = Runtime.getRuntime().exec(command);
        stdin = new OutputStreamWriter(process.getOutputStream());
        stdOutThread = new Thread(new Leech(process.getInputStream(), history, EntryTypeEnum.OUTPUT));
        stdOutThread.setDaemon(true);
        stdOutThread.start();
        stdErrThread = new Thread(new Leech(process.getErrorStream(), history, EntryTypeEnum.ERROR));
        stdErrThread.setDaemon(true);
        stdErrThread.start();
    }

    public void write(String input) throws IOException {
        this.history.addEntry(new HistoryEntry(EntryTypeEnum.INPUT, input));
        stdin.write(input);
        stdin.write("\n");
        stdin.flush();
    }
}
And
public class Leech implements Runnable {

    private final InputStream stream;
    private final History history;
    private final EntryTypeEnum type;
    private volatile boolean alive = true;

    public Leech(InputStream stream, History history, EntryTypeEnum type) {
        this.stream = stream;
        this.history = history;
        this.type = type;
    }

    public void run() {
        BufferedReader reader = new BufferedReader(new InputStreamReader(stream));
        String line;
        try {
            while (alive) {
                line = reader.readLine();
                if (line == null) break;
                history.addEntry(new HistoryEntry(type, line));
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
My issue is with the Leech class (used to "leech" the process' out and err streams and feed them into History, which acts like a log file). On the one hand, reading whole lines is nice and easy (and what I'm currently doing), but it means I miss the last line (usually the prompt line); I only see the prompt line when executing the next command, because there's no line break until that point.
On the other hand, if I read characters myself, how can I tell when the process is "done" (either complete or waiting for input)?
Has anyone tried something like waiting 100 millis since the last output from the process and declaring it "done"?
Any better ideas on how I can implement a nice wrapper around things like Runtime.exec("cmd.exe")?
Use Plexus Utils; it is used by Apache Maven 2 to execute all external processes.
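I haven't verified this against a particular plexus-utils version, so treat it as a rough sketch (class and method names from org.codehaus.plexus.util.cli may differ slightly between releases):
import org.codehaus.plexus.util.cli.CommandLineException;
import org.codehaus.plexus.util.cli.CommandLineUtils;
import org.codehaus.plexus.util.cli.Commandline;
import org.codehaus.plexus.util.cli.StreamConsumer;

public class PlexusExample {
    public static void main(String[] args) throws CommandLineException {
        Commandline cl = new Commandline();
        cl.setExecutable("hostname"); // any short-lived command works for the demo
        // StreamConsumer is invoked once per line; plexus drains stdout and
        // stderr on background threads, so the process cannot block on full buffers.
        StreamConsumer out = new StreamConsumer() {
            public void consumeLine(String line) {
                System.out.println("OUT: " + line);
            }
        };
        StreamConsumer err = new StreamConsumer() {
            public void consumeLine(String line) {
                System.err.println("ERR: " + line);
            }
        };
        int exitCode = CommandLineUtils.executeCommandLine(cl, out, err);
        System.out.println("exit code: " + exitCode);
    }
}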
I was looking for the same thing myself, and I found a Java port of Expect, called ExpectJ. I haven't tried it yet, but it looks promising.
I would read the input in with the stream and then write it into a ByteArrayOutputStream. The byte array will continue to grow until there are no longer any available bytes to read. At this point you flush the data to history by converting the byte array into a String and splitting it on the platform line.separator. You can then iterate over the lines to add history entries. The ByteArrayOutputStream is then reset, and the while loop blocks until there is more data or the end of the stream is reached (probably because the process is done).
public void run() {
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    int bite;
    try {
        while ((bite = stream.read()) != -1) {
            out.write(bite);
            if (stream.available() == 0) {
                String string = new String(out.toByteArray());
                for (String line : string.split(
                        System.getProperty("line.separator"))) {
                    history.addEntry(new HistoryEntry(type, line));
                }
                out.reset();
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
This will make sure you pick up that last line of input, and it solves your problem of knowing when the stream has ended.

How do I use Java to read from a file that is actively being written to?

I have an application that writes information to a file. This information is used post-execution to determine pass/failure/correctness of the application. I'd like to be able to read the file as it is being written, so that I can do these pass/failure/correctness checks in real time.
I assume it is possible to do this, but what are the gotchas involved when using Java? If the reading catches up to the writing, will it just wait for more writes until the file is closed, or will the read throw an exception at that point? If the latter, what do I do then?
My intuition is currently pushing me towards BufferedStreams. Is this the way to go?
I could not get the example to work using FileChannel.read(ByteBuffer), because it isn't a blocking read. I did, however, get the code below to work:
// Fragment of a Thread subclass: poll the file, sleep when no data is available.
boolean running = true;
BufferedInputStream reader = new BufferedInputStream(new FileInputStream("out.txt"));

public void run() {
    while (running) {
        try {
            if (reader.available() > 0) {
                System.out.print((char) reader.read());
            } else {
                sleep(500);
            }
        } catch (IOException ex) {
            running = false;
        } catch (InterruptedException ex) {
            running = false;
        }
    }
}
Of course the same thing would work as a timer instead of a thread, but I leave that up to the programmer. I'm still looking for a better way, but this works for me for now.
Oh, and I'll caveat this with: I'm using 1.4.2. Yes I know I'm in the stone ages still.
If you want to read a file while it is being written and only read the new content, the following will help you achieve that.
To run this program, launch it from a command prompt/terminal window and pass the name of the file to read. It will keep reading the file until you kill the program.
java FileReader c:\myfile.txt
As you type a line of text and save the file from Notepad, you will see the text printed in the console.
public class FileReader {

    public static void main(String args[]) throws Exception {
        if (args.length > 0) {
            File file = new File(args[0]);
            System.out.println(file.getAbsolutePath());
            if (file.exists() && file.canRead()) {
                long fileLength = file.length();
                readFile(file, 0L);
                while (true) {
                    if (fileLength < file.length()) {
                        readFile(file, fileLength);
                        fileLength = file.length();
                    }
                    Thread.sleep(1000); // avoid busy-waiting on the file size
                }
            }
        } else {
            System.out.println("no file to read");
        }
    }

    public static void readFile(File file, Long fileLength) throws IOException {
        String line = null;
        BufferedReader in = new BufferedReader(new java.io.FileReader(file));
        in.skip(fileLength);
        while ((line = in.readLine()) != null) {
            System.out.println(line);
        }
        in.close();
    }
}
You might also take a look at the java.nio channel API for locking part of a file:
http://java.sun.com/javase/6/docs/api/java/nio/channels/FileChannel.html
This method of FileChannel might be a start:
lock(long position, long size, boolean shared)
"An invocation of this method will block until the region can be locked."
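A minimal sketch of region locking with FileChannel (the file name and region are placeholders):
import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;
import java.nio.channels.FileLock;

public class LockExample {
    public static void main(String[] args) throws Exception {
        try (RandomAccessFile raf = new RandomAccessFile("out.txt", "rw");
             FileChannel fc = raf.getChannel()) {
            // Lock the first 100 bytes exclusively (shared = false);
            // this call blocks until the region becomes available.
            FileLock lock = fc.lock(0, 100, false);
            try {
                // ... read or write the locked region here ...
            } finally {
                lock.release();
            }
        }
    }
}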
I totally agree with Joshua's response; Tailer is fit for the job in this situation. Here is an example:
it writes a line to a file every 150 ms, while reading that very same file every 2500 ms.
public class TailerTest
{
    public static void main(String[] args)
    {
        File f = new File("/tmp/test.txt");
        MyListener listener = new MyListener();
        Tailer.create(f, listener, 2500);
        try
        {
            FileOutputStream fos = new FileOutputStream(f);
            int i = 0;
            while (i < 200)
            {
                fos.write(("test" + ++i + "\n").getBytes());
                Thread.sleep(150);
            }
            fos.close();
        }
        catch (Exception e)
        {
            e.printStackTrace();
        }
    }

    private static class MyListener extends TailerListenerAdapter
    {
        @Override
        public void handle(String line)
        {
            System.out.println(line);
        }
    }
}
The answer seems to be "no" ... and "yes". There seems to be no real way to know if a file is open for writing by another application, so reading from such a file will just progress until the content is exhausted. I took Mike's advice and wrote some test code:
Writer.java writes a string to a file and then waits for the user to hit Enter before writing another line. The idea is that it can be started up, then a reader can be started to see how it copes with the "partial" file. The reader I wrote is in Reader.java.
Writer.java
public class Writer extends Object
{
    Writer() {
    }

    public static String[] strings =
    {
        "Hello World",
        "Goodbye World"
    };

    public static void main(String[] args)
            throws java.io.IOException {
        java.io.PrintWriter pw =
            new java.io.PrintWriter(new java.io.FileOutputStream("out.txt"), true);
        for (String s : strings) {
            pw.println(s);
            System.in.read();
        }
        pw.close();
    }
}
Reader.java
public class Reader extends Object
{
    Reader() {
    }

    public static void main(String[] args)
            throws Exception {
        java.io.FileInputStream in = new java.io.FileInputStream("out.txt");
        java.nio.channels.FileChannel fc = in.getChannel();
        java.nio.ByteBuffer bb = java.nio.ByteBuffer.allocate(10);
        while (fc.read(bb) >= 0) {
            bb.flip();
            while (bb.hasRemaining()) {
                System.out.println((char) bb.get());
            }
            bb.clear();
        }
        System.exit(0);
    }
}
No guarantees that this code is best practice.
This leaves the option suggested by Mike of periodically checking whether there is new data to be read from the file. This then requires user intervention to close the file reader once it is determined that reading is complete. Or, the reader needs to be made aware of the content of the file and be able to detect an end-of-write condition. If the content were XML, the end of the document could be used to signal this.
There is an open-source Java graphical tail that does this.
https://stackoverflow.com/a/559146/1255493
public void run() {
    try {
        while (_running) {
            Thread.sleep(_updateInterval);
            long len = _file.length();
            if (len < _filePointer) {
                // Log must have been jibbled or deleted.
                this.appendMessage("Log file was reset. Restarting logging from start of file.");
                _filePointer = len;
            } else if (len > _filePointer) {
                // File must have had something added to it!
                RandomAccessFile raf = new RandomAccessFile(_file, "r");
                raf.seek(_filePointer);
                String line = null;
                while ((line = raf.readLine()) != null) {
                    this.appendLine(line);
                }
                _filePointer = raf.getFilePointer();
                raf.close();
            }
        }
    } catch (Exception e) {
        this.appendMessage("Fatal error reading log file, log tailing has stopped.");
    }
    // dispose();
}
You can't read a file which is opened by another process using FileInputStream, FileReader or RandomAccessFile.
But using FileChannel directly will work:
private static byte[] readSharedFile(File file) throws IOException {
    byte buffer[] = new byte[(int) file.length()];
    final FileChannel fc = FileChannel.open(file.toPath(), EnumSet.of(StandardOpenOption.READ));
    final ByteBuffer dst = ByteBuffer.wrap(buffer);
    fc.read(dst);
    fc.close();
    return buffer;
}
Not Java per se, but you may run into issues where you have written something to a file but it hasn't actually been written yet: it might be in a cache somewhere, and reading from the same file may not give you the new information.
Short version: use flush() or whatever the relevant system call is to ensure that your data is actually written to the file.
Note I am not talking about the OS-level disk cache; if your data gets into that, it should appear in a read() after that point. It may be that the language itself caches writes, waiting until a buffer fills up or the file is flushed/closed.
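For instance, with a buffered writer the data can sit in the in-process buffer until you flush, the buffer fills, or the stream is closed (the file name is arbitrary):
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;

public class FlushExample {
    public static void main(String[] args) throws IOException {
        try (BufferedWriter out = new BufferedWriter(new FileWriter("out.txt"))) {
            out.write("partial result\n");
            out.flush(); // push the buffered bytes to the OS now
            // ... keep working; a reader polling the file can already see the line
        }
    }
}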
I've never tried it, but you should write a test case to see whether reading from a stream after you have hit the end will work, regardless of whether more data is later written to the file.
Is there a reason you can't use a piped input/output stream? Is the data being written and read from the same application (if so, you already have the data, so why do you need to read from the file)?
Otherwise, maybe read till the end of the file, then monitor for changes, seek to where you left off, and continue... though watch out for race conditions.
