I have a file:
RandomAccessFile file = new RandomAccessFile("files/bank.txt", "r");
And I am reading and writing data to/from this file using:
for (int pos = 0; pos < 1000; pos++) {
    file.seek(40 * pos + 30);
    double money = file.readDouble();
    if (pos == num) {
        money += i;
        System.out.println(money + " " + i);
        file.seek(40 * pos + 30);
        file.writeDouble(money);
    }
}
In this way it reads the double, which works correctly, and then it should overwrite that double with the value it held previously plus i. However, this is not working: the value doesn't change. What have I done wrong?
You have opened the file only for reading:
RandomAccessFile("files/bank.txt", "r");
you should open it with:
new RandomAccessFile("files/bank.txt", "rws");
which opens for reading and writing, as with "rw", and also requires that every update to the file's content or metadata be written synchronously to the underlying storage device.
or with:
new RandomAccessFile("files/bank.txt", "rwd");
which opens for reading and writing, as with "rw", and also requires that every update to the file's content be written synchronously to the underlying storage device.
You have the file open for reading only:
RandomAccessFile file = new RandomAccessFile("files/bank.txt", "r");
Change that line to:
RandomAccessFile file = new RandomAccessFile("files/bank.txt", "rw");
And you will be able to update with the code you have above. :)
This is a classic debugging error: you should have set a breakpoint at money += i to see the value change in memory. The file not changing is a separate issue, and the first possibility to address was that the file was not writable, i.e. you should have opened it for writing.
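For completeness, a minimal sketch of the corrected update, assuming the record layout from the question (the double stored at offset 40 * pos + 30) and that num and i come from the surrounding code; the class and method names are just placeholders:

import java.io.IOException;
import java.io.RandomAccessFile;

public class BankUpdate {

    // Adds 'amount' to the double stored for record index 'num'.
    // Record layout assumed from the question: the double sits at offset 40 * num + 30.
    static void addToBalance(String path, int num, double amount) throws IOException {
        try (RandomAccessFile file = new RandomAccessFile(path, "rw")) {
            long offset = 40L * num + 30;
            file.seek(offset);
            double money = file.readDouble();
            money += amount;
            file.seek(offset); // seek back before overwriting the old value
            file.writeDouble(money);
        }
    }

    public static void main(String[] args) throws IOException {
        addToBalance("files/bank.txt", 3, 100.0); // hypothetical record index and amount
    }
}

Since the offset can be computed directly from num, there is also no need to scan all 1000 records just to update one.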
This is my directory structure:
Inside the server I have the following code for saving a file that gets sent from the client
fileName = reader.readLine();
DataInputStream dis = null;
try {
    dis = new DataInputStream(csocket.getInputStream());
    FileOutputStream fos = new FileOutputStream(fileName);
    buffer = new byte[4096];
    int fileSize = 15123;
    int read = 0;
    int totalRead = 0;
    int remaining = fileSize;
    while ((read = dis.read(buffer, 0, Math.min(buffer.length, remaining))) > 0) {
        totalRead += read;
        remaining -= read;
        fos.write(buffer, 0, read);
    }
    fos.close();
    dis.close();
} catch (IOException e) {
}
break;
I'm wondering how I would go about saving the file within the xml folder? I've tried using getClass().getResource and such but nothing seems to work.
fileName is just a simple string containing the name of the file, not a path or anything.
I get the correct path using this code:
File targetDir = new File(getClass().getResource("xml").getPath());
File targetFile = new File(targetDir, fileName);
targetFile.createNewFile();
System.out.println(targetFile.getAbsolutePath());
dis = new DataInputStream(csocket.getInputStream());
FileOutputStream fos = new FileOutputStream(targetFile.getAbsolutePath(), false);
But it still won't save it there...
The best way is to receive the target path for storing files explicitly, either through a .properties file or a command-line argument. This keeps your program flexible enough to be installed and adapted to different environments.
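For instance, a minimal sketch of the .properties approach (the file name server.properties and the key upload.dir are only examples, not something your project already has):

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

public class ServerConfig {

    // Loads the target directory for uploaded files from a properties file,
    // falling back to "xml" when the key is absent.
    static File loadTargetDir() throws IOException {
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream("server.properties")) {
            props.load(in);
        }
        return new File(props.getProperty("upload.dir", "xml"));
    }
}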
But if you wish your program to assume the target directory automatically, the best option is to set a relative path before creating the FileOutputStream, as long as you always start your program from the same working directory:
File targetDir = new File("xml");
File targetFile = new File(targetDir, fileName);
FileOutputStream fos = new FileOutputStream(targetFile);
This will work assuming the program is always started with the server directory as its current working directory.
Update
Other minor suggestions about your program:
Never base the loop's exit condition on a hard-coded file size, because it is impossible to know it a priori. Instead, check explicitly whether the value returned by read is less than 0, which means the end of the file has been reached.
Consequently, do not bother calculating the exact amount of data to request in each call to read. Just pass the buffer size; it only sets an upper bound on how much is read.
Never leave caught exceptions without proper treatment: if you know how to make your program recover, put the appropriate code in the catch block. Otherwise, you'd better not catch them: declare them in the throws clause and let them propagate to the caller.
Always create stream resources with the try-with-resources statement, to ensure they get closed at the end:
try (FileOutputStream fos = new FileOutputStream(...))
{
// ... use fos...
}
Avoid unnecessary instructions: if you don't care whether the file already exists on the filesystem, don't call createNewFile. But if you do care, check the returned value and branch accordingly. A sketch applying these points follows below.
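Putting these points together, a sketch of the receive loop, assuming fileName, targetDir and csocket are set up as in the question (the class and method names are only placeholders):

import java.io.DataInputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.net.Socket;

public class FileReceiver {

    // Reads the rest of the socket stream into targetDir/fileName.
    static void receiveFile(Socket csocket, File targetDir, String fileName) throws IOException {
        File targetFile = new File(targetDir, fileName);
        try (DataInputStream dis = new DataInputStream(csocket.getInputStream());
             FileOutputStream fos = new FileOutputStream(targetFile)) {
            byte[] buffer = new byte[4096];
            int read;
            // No hard-coded file size: read until the stream reports end of file (-1).
            while ((read = dis.read(buffer)) != -1) {
                fos.write(buffer, 0, read);
            }
        }
        // IOException is declared in the throws clause and propagates to the caller.
    }
}

Note that closing the DataInputStream also closes the underlying socket, just as the original code effectively did.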
I tried to create the file and it does get created, but not at ProjectName\src\com\company\xml; instead it ends up in ProjectName\out\production\ProjectName\com\company\xml.
My code:
File targetDir = new File(this.getClass().getResource("xml").getPath());
// get the parent of the file
String parentPath = targetDir.getParent();
String fileName = "xml/name.txt";
// do something
File targetFile = new File(parentPath, fileName);
targetFile.createNewFile();
Just pay attention that after compilation you would be trying to save it into a jar file, and that is a complicated thing to do.
Usually you need to save such files somewhere outside your jar, in a separate directory at the root next to it.
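One possible sketch of that second situation, locating an xml directory next to the jar rather than inside it (this assumes the class is loaded from a plain jar or classes directory on the local filesystem, where getCodeSource() is not null):

import java.io.File;
import java.net.URISyntaxException;

public class StorageDir {

    // Returns an "xml" directory located next to the jar (or next to the classes
    // directory when running from the IDE), creating it if necessary.
    static File xmlDirNextToJar() throws URISyntaxException {
        File codeLocation = new File(StorageDir.class.getProtectionDomain()
                .getCodeSource().getLocation().toURI());
        File baseDir = codeLocation.isFile() ? codeLocation.getParentFile() : codeLocation;
        File xmlDir = new File(baseDir, "xml");
        xmlDir.mkdirs(); // no-op if it already exists
        return xmlDir;
    }
}

The returned directory can then be combined with the file name exactly as in the snippets above, e.g. new File(xmlDirNextToJar(), fileName).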
I have to read a URLConnection response containing 2 MB of pretty-printed JSON in Java.
2 MB is not "small" but it is by no means large. However, it is pretty-printed JSON with around 60k lines. A loop such as
while ((line = bufferedReader.readLine()) != null) {
    lineAllOfIt += line;
}
takes around 10 minutes to read this response. There must be something wrong with my approach, but I cannot picture a better approach.
For this particular case, I would cache the file locally. Using Java you can transfer the file to your computer with low memory overhead, then go through it line by line without loading the whole file into memory at once, pulling out only the data you need (or loading it all in one go if you prefer).
EDIT: I made changes to the variable names; I pulled this from my code and forgot to neutralize the variables. Also, FileChannel transferTo/transferFrom can be much more efficient, as there are potentially fewer copies and, depending on the operation, data can go straight from the socket buffer to disk. See the FileChannel API.
String fileUrlString = "http://update.domain.com/file.json"; // file URL path
Path diskSaveLocation = Paths.get("file.json"); // this will just place it in your working directory

final URL url = new URL(fileUrlString);
final URLConnection conn = url.openConnection();
final long fileLength = conn.getContentLength();
System.out.println(String.format("Downloading file... %s, Size: %d bytes.", fileUrlString, fileLength));
try (
    FileOutputStream stream = new FileOutputStream(diskSaveLocation.toFile(), false);
    FileChannel fileChannel = stream.getChannel();
    ReadableByteChannel inChannel = Channels.newChannel(conn.getInputStream());
) {
    long read = 0;
    long readerPosition = 0;
    // > 0: stop if the channel reports no more data (e.g. a truncated download)
    while ((read = fileChannel.transferFrom(inChannel, readerPosition, fileLength)) > 0
            && readerPosition < fileLength) {
        readerPosition += read;
    }
    if (fileLength != Files.size(diskSaveLocation)) {
        Files.delete(diskSaveLocation);
        System.out.println(String.format("File... %s did not download correctly, deleting file artifact!", fileUrlString));
    }
}
System.out.println(String.format("File Download... %s completed!", fileUrlString));
((HttpURLConnection) conn).disconnect();
You can now read this same cached file using a NIO.2 method that lets you read it line by line without loading it into memory. Using Scanner or RandomAccessFile you can avoid pulling every line onto the heap at once. If you do want to read the whole file in, you can also do that locally from the cached copy using many of the methods in Java's Files utility class.
Java Read Large Text File With 70million line of text
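As a rough sketch of that second step, reusing the file.json cached by the code above (the "id" field is only a placeholder for whatever you actually need to extract):

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class CachedJsonScan {

    public static void main(String[] args) throws IOException {
        Path cached = Paths.get("file.json"); // the file downloaded by the snippet above
        try (BufferedReader reader = Files.newBufferedReader(cached, StandardCharsets.UTF_8)) {
            String line;
            while ((line = reader.readLine()) != null) {
                // Pull out only what you need instead of accumulating every line in one String.
                if (line.contains("\"id\"")) { // hypothetical field of interest
                    System.out.println(line.trim());
                }
            }
        }
    }
}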
I want to know when a file has been rotated, because I'm watching a file and I have to produce a new file with the content of this file and the new one. I have to use Java 6, so I don't have the new features of JDK 7.
The problem is that when I rotate the file (while keeping a reference to it), the old reference gets updated; I would like it to keep pointing to the old file.
I was doing something like this:
if (reader == null) {
    reader = new RandomAccessFile(toWatch, "r");
}
System.out.println("LastModified: " + new Date(toWatch.lastModified()));
System.out.println("When toWatch was created, length: " + reader.length()); // length is okay

// Here, I use logrotate to simulate what could happen.
// (I execute the code in debug mode to be able to do it.)
long len = 0L;
File f = new File("/home/gortiz/logRotate/test.flume");
System.out.println("LastModified NewReader: " + new Date(f.lastModified()));
RandomAccessFile newReader = new RandomAccessFile(toWatch, "r");
len = reader.length();               // len == 0
long newLength = newReader.length(); // newLength == 0
I could check whether there's a file named name.1; log systems usually rename rotated logs like that. But it's not a good solution, because nobody guarantees that's how the log system will rename its files, and they could even be moved to another directory.
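For what it's worth, one heuristic along those lines that only needs java.io (so it works on Java 6) is to remember the length you have already seen and treat a sudden shrink as a likely rotation. This is a sketch of an assumption, not a reliable detection mechanism:

import java.io.File;

public class RotationCheck {

    private long lastKnownLength = 0L;

    // Returns true if the file appears to have been rotated since the last check,
    // i.e. it is suddenly shorter than the data we already read from it.
    boolean seemsRotated(File toWatch) {
        long currentLength = toWatch.length(); // 0 if the file does not exist (yet)
        boolean rotated = currentLength < lastKnownLength;
        lastKnownLength = currentLength;
        return rotated;
    }
}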
So I'm working my way through a task in my Java course at school. For a better understanding of what the code is supposed to do, I'll quote it:
"(Split files) Suppose you want to back up a huge file(e.g., a 10-GB AVI file) to a CD-R. You can achieve it by splitting the file into smaller pieces and backing up these pieces separately. Write a utility program that splits a large file into smaller ones using the following command: java ClassName SourceFile numberOfPieces
The command creates the files SourceFile.1, SourceFile.2, ... etc."
Now, to be clear: this post is in no way an attempt to get a "solution" for the problem. I have solved it (with what I know), and I merely want to get more enlightened on some matters that crossed my mind while writing the code.
1. Is it necessary to create a new output stream for every single file I'm copying to? Doesn't this demand unnecessary system resources?
2. The first file that gets copied (SourceFile is in this case a .png file) is possible to view, and it shows a fraction of the original picture (if I split into two, I can view half the picture). But the latter ones I can't view. Why is that?
3. Is it possible to reassemble the split files in any way? If my picture was split into two files, can I put them back together and view the whole picture?
The code, if you want to look at it.
All feedback is welcome,
Have a good day! :)
package oblig2;

import java.io.*;
import java.util.*;

public class Test {

    /**
     * Main method
     *
     * @param args[0] for source file
     * @param args[1] for number of pieces
     * @throws IOException
     */
    public static void main(String[] args) throws IOException {

        // The program needs to be executed with two parameters in order to
        // work. This statement checks for it.
        if (args.length != 2) {
            System.out.println("Usage: java Copy sourceFile numberOfPieces");
            System.exit(1);
        }

        // Check whether or not the source file exists
        File sourceFile = new File(args[0]);
        if (!sourceFile.exists()) {
            System.out.println("Source file " + args[0] + " does not exist");
            System.exit(2);
        }

        // Need an array to store all the new files that are supposed to contain
        // parts of the original file
        ArrayList<File> fileArray = new ArrayList<File>();

        // All the new files need their own output (or do they?)
        ArrayList<BufferedOutputStream> outputArray = new ArrayList<BufferedOutputStream>();

        // Using RandomAccessFile on the source file to more easily read parts of it
        RandomAccessFile inOutSourceFile = new RandomAccessFile(sourceFile, "rw");

        // This loop changes the name of the new files, so they match the
        // source file with an appended digit
        for (int i = 0; i < Integer.parseInt(args[1]); i++) {
            String nameAppender = String.valueOf(i);
            String nameBuilder;
            int suffix = args[0].indexOf(".");
            nameBuilder = args[0].substring(0, suffix);
            fileArray.add(new File(nameBuilder + nameAppender + ".dat"));
        }

        // Here I create the output needed for all the new files
        for (int i = 0; i < Integer.parseInt(args[1]); i++) {
            outputArray.add(new BufferedOutputStream(new FileOutputStream(
                    new File(fileArray.get(i).getAbsolutePath()))));
        }

        // Now I determine how many parts the source file needs to be split
        // into, and the size of each.
        float size = inOutSourceFile.length();
        double parts = Integer.parseInt(args[1]);
        double partSize = size / parts;
        int r, numberOfBytesCopied = 0;

        // This loop actually does the job of copying the parts into the new
        // files
        for (int i = 1; i <= parts; i++) {
            while (inOutSourceFile.getFilePointer() < partSize * i) {
                r = inOutSourceFile.readByte();
                outputArray.get(i - 1).write((byte) r);
                numberOfBytesCopied++;
            }
        }

        // Here I close the input and outputs
        inOutSourceFile.close();
        for (int i = 0; i < parts; i++) {
            outputArray.get(i).close();
        }

        // Display the operations
        System.out.println(args[0] + " Has been split into " + args[1]
                + " pieces. " + "\n" + "Each file containing " + partSize
                + " Bytes each.");
    }
}
Of course it is necessary to open all output files. But you don't have to open them at all times. You can open the first file, write to it, close it, open the second file, write to it, close it, etc.
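A sketch of that open-one-at-a-time approach (simplified compared to the code in the question; the piece names follow the SourceFile.1, SourceFile.2 pattern from the assignment):

import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.RandomAccessFile;

public class SplitOneAtATime {

    // Splits sourceFile into 'pieces' parts, keeping only one output stream open at a time.
    static void split(File sourceFile, int pieces) throws IOException {
        try (RandomAccessFile in = new RandomAccessFile(sourceFile, "r")) {
            long size = in.length();
            long partSize = (size + pieces - 1) / pieces; // round up so nothing is lost
            byte[] buffer = new byte[4096];
            for (int i = 1; i <= pieces; i++) {
                try (BufferedOutputStream out = new BufferedOutputStream(
                        new FileOutputStream(sourceFile.getName() + "." + i))) {
                    long remaining = Math.min(partSize, size - in.getFilePointer());
                    while (remaining > 0) {
                        int read = in.read(buffer, 0, (int) Math.min(buffer.length, remaining));
                        if (read < 0) {
                            break;
                        }
                        out.write(buffer, 0, read);
                        remaining -= read;
                    }
                } // the stream for piece i is closed here, before piece i + 1 is opened
            }
        }
    }
}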
File formats, .png for example, have a structure that has to be followed. A format may have a special header and a special footer. That's why, when such a file is split into two or more pieces, the first piece loses its footer, the middle pieces lose both their header and footer, and the last piece loses its header. This makes them unusable as individual files.
Of course it is possible. By combining all the parts back together, the original file will be reconstructed.
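And a minimal sketch of that reassembly, assuming the pieces are named sourceName.1 ... sourceName.n (as produced by a splitter like the one sketched above): simply append the bytes of each piece, in order, to one output file.

import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class Join {

    // Concatenates sourceName.1 .. sourceName.n back into a single target file.
    static void join(String sourceName, int pieces, File target) throws IOException {
        try (BufferedOutputStream out = new BufferedOutputStream(new FileOutputStream(target))) {
            byte[] buffer = new byte[4096];
            for (int i = 1; i <= pieces; i++) {
                try (BufferedInputStream in = new BufferedInputStream(
                        new FileInputStream(sourceName + "." + i))) {
                    int read;
                    while ((read = in.read(buffer)) != -1) {
                        out.write(buffer, 0, read);
                    }
                }
            }
        }
    }
}

Since splitting just cuts the byte stream, concatenating the pieces in their original order restores a byte-for-byte copy, and the reassembled .png opens normally again.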
Is there any way to remove the header information from an mp3 file such that the mp3 file can't be played?
regards,
hitendrasinh gohil
You have to do more than remove the header to make an mp3 unplayable. If you take my other answer to your question and apply it to the whole file there should be no way it can be played:
RandomAccessFile raf = new RandomAccessFile("input.mp3", "rw");
byte[] buf = new byte[65536];
long pos = 0;
int len;
Random random = new Random(34);
while ((len = raf.read(buf)) != -1) {
    for (int i = 0; i < len; i++) {
        buf[i] ^= random.nextInt();
    }
    raf.seek(pos);
    raf.write(buf, 0, len); // only write back the bytes that were actually read
    pos = raf.getFilePointer();
}
raf.close();
This will XOR every byte in the file. The only reason I suggested in the other answer to only do the first 64k was performance, since you're on an Android device. For me, that made it unplayable on my desktop. If doing the whole file doesn't work for you, then I suspect you're doing something else wrong; there's no way it'll play the original music if every byte is changed like this. You can run it again to undo the change and make the mp3 playable again.
MP3s are organized in frames, so you can often still play them even if you only have part of the file.