I want to write a number to a FileOutputStream at a certain offset. What I have written is the following:
String t = Integer.toString(num);
output.write(t.getBytes(), OFFSET, t.length());
Is this a good way to do this, or is there a better way?
This is a classic XY problem. What you actually want to do is write to a file at a specific offset. FileOutputStream isn't part of the problem; it is part of your incorrect solution. You should be using RandomAccessFile for this task.
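A minimal sketch of that approach (the file name, value, and offset below are just placeholders):
import java.io.IOException;
import java.io.RandomAccessFile;

public class WriteAtOffset {
    public static void main(String[] args) throws IOException {
        int num = 42;       // placeholder value to write
        long offset = 128;  // placeholder offset into the file
        try (RandomAccessFile raf = new RandomAccessFile("data.bin", "rw")) {
            raf.seek(offset);                             // move the file pointer to the offset
            raf.write(Integer.toString(num).getBytes());  // overwrite bytes starting there
        }
    }
}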
You're not using the FileOutputStream class correctly. The first parameter is the byte array you want to write to the file. The second is the offset within that array at which to start, and the third is the number of bytes to write. So, in your case, any offset other than 0 will give you an IndexOutOfBoundsException.
If you want to write that number into a file using FileOutputStream, you can read that file (using, for example, a FileInputStream), copy as many bytes as you need into the output stream, then write the number using something akin to:
byte[] bytes = t.getBytes();
output.write(bytes, 0, bytes.length);
Finally, read the rest of the input stream and write it into the output stream.
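A rough sketch of that copy-around approach, assuming the usual java.io imports; the file names, OFFSET, and num below are placeholders:
int num = 12345;   // placeholder number to write
int OFFSET = 16;   // placeholder offset
try (FileInputStream in = new FileInputStream("in.dat");
     FileOutputStream output = new FileOutputStream("out.dat")) {
    byte[] head = new byte[OFFSET];
    int copied = in.read(head);        // note: read() may return fewer bytes than requested
    output.write(head, 0, copied);     // copy the first OFFSET bytes unchanged
    byte[] t = Integer.toString(num).getBytes();
    output.write(t, 0, t.length);      // write the number at the offset
    byte[] rest = new byte[8192];
    int n;
    while ((n = in.read(rest)) != -1) {
        output.write(rest, 0, n);      // copy the remainder of the input
    }
}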
Best of luck.
I found this solution, which corresponds to @sirs' comment:
final RandomAccessFile raf = new RandomAccessFile(file, "rw");
raf.setLength(offset);
raf.seek(offset);
FileOutputStream fos = new FileOutputStream(raf.getFD()) {
    @Override
    public void close() throws IOException {
        super.close();
        raf.close();
    }
};
fos.write(myInt);
Goal: Decrypt data from one source and write the decrypted data to a file.
try (FileInputStream fis = new FileInputStream(targetPath.toFile());
     ReadableByteChannel channel = newDecryptedByteChannel(path, associatedData))
{
    FileChannel fc = fis.getChannel();
    long position = 0;
    while (position < ???)
    {
        position += fc.transferFrom(channel, position, CHUNK_SIZE);
    }
}
The implementation of newDecryptedByteChannel(Path,byte[]) should not be of interest, it just returns a ReadableByteChannel.
Problem: What is the condition to end the while loop? When is the "end of the byte channel" reached? Is transferFrom the right choice here?
This question might be related (the answer is to just set the count to Long.MAX_VALUE). Unfortunately this doesn't help me, because the docs say that up to count bytes may be transferred, depending upon the natures and states of the channels.
Another thought was to just check whether the amount of bytes actually transferred is 0 (returned from transferFrom), but this condition may be true if the source channel is non-blocking and has fewer than count bytes immediately available in its input buffer.
It is one of the bizarre features of FileChannel.transferFrom() that it never tells you about end of stream. You have to know the input length independently.
I would just use streams for this: specifically, a CipherInputStream around a BufferedInputStream around a FileInputStream, and a FileOutputStream.
But the code you posted doesn't make any sense anyway. It can't work. You are transferring into the input file, and via a channel that was derived from a FileInputStream, so it is read-only, so transferFrom() will throw an exception.
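For illustration, a minimal sketch of the stream pipeline suggested above, assuming the usual java.io imports plus javax.crypto.CipherInputStream, and that cipher is a Cipher already initialized in decrypt mode; the file names are placeholders:
try (InputStream in = new CipherInputStream(
             new BufferedInputStream(new FileInputStream("input.enc")), cipher);
     OutputStream out = new FileOutputStream("output.dat")) {
    byte[] buf = new byte[8192];
    int n;
    while ((n = in.read(buf)) != -1) {   // read() returns -1 at end of stream
        out.write(buf, 0, n);            // the loop ends when the plaintext is exhausted
    }
}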
As commented by @user207421, since you are reading from a ReadableByteChannel, the target channel needs to be derived from a FileOutputStream rather than a FileInputStream. And the condition for ending the loop in your code would have to be the size of the data behind the ReadableByteChannel, which you cannot get from it unless you can obtain the underlying FileChannel and query its size() method.
The way I found to do the transfer is through a ByteBuffer, as below.
ByteBuffer buf = ByteBuffer.allocate(1024 * 8);
while (readableByteChannel.read(buf) != -1)
{
    buf.flip();
    fc.write(buf);   // fc is the FileChannel derived from the FileOutputStream
    buf.compact();
}
// Drain whatever is still left in the buffer after end of stream.
buf.flip();
while (buf.hasRemaining())
{
    fc.write(buf);
}
I'm dealing with some Java code in which there's an InputStream that I read once and then need to read again in the same method.
The problem is that I need to reset its position to the start in order to read it twice.
I've found a hack-ish solution to the problem:
is.mark(Integer.MAX_VALUE);
//Read the InputStream is fully
// { ... }
try
{
    is.reset();
}
catch (IOException e)
{
    e.printStackTrace();
}
Does this solution lead to some unexpected behaviour? Or will it work despite its dumbness?
As written, you have no guarantees, because mark() is not required to report whether it was successful. To get a guarantee, you must first call markSupported(), and it must return true.
Also as written, the specified read limit is very dangerous. If you happen to be using a stream that buffers in-memory, it will potentially allocate a 2GB buffer. On the other hand, if you happen to be using a FileInputStream, you're fine.
A better approach is to use a BufferedInputStream with an explicit buffer.
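A sketch of that, assuming you control how the stream is created and can bound how much you read before resetting; originalStream is whatever InputStream you were handed:
// BufferedInputStream guarantees markSupported() == true.
InputStream is = new BufferedInputStream(originalStream);
int readLimit = 64 * 1024;   // assumed upper bound on bytes read before reset()
is.mark(readLimit);          // buffer at most readLimit bytes
// ... first pass: read no more than readLimit bytes ...
is.reset();                  // rewind to the marked position
// ... second pass ...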
It depends on the InputStream implementation. You might also consider whether it would be better to read the data into a byte[]. The easiest way is to use Apache commons-io:
byte[] bytes = IOUtils.toByteArray(inputStream);
You can't do this reliably; some InputStreams (such as ones connected to terminals or sockets) don't support mark and reset (see markSupported). If you really have to traverse the data twice, you need to read it into your own buffer.
Instead of trying to reset the InputStream, load it into a buffer such as a StringBuilder or, if it's a binary data stream, a ByteArrayOutputStream. You can then process the buffer within the method as many times as you want.
ByteArrayOutputStream bos = new ByteArrayOutputStream();
int read = 0;
byte[] buff = new byte[1024];
while ((read = inStream.read(buff)) != -1) {
bos.write(buff, 0, read);
}
byte[] streamData = bos.toByteArray();
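Once you have the byte array you can re-read it as often as you like by wrapping it in a ByteArrayInputStream:
InputStream firstPass  = new ByteArrayInputStream(streamData);
InputStream secondPass = new ByteArrayInputStream(streamData);  // a fresh stream over the same data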
For me, the easiest solution was to pass the object from which the InputStream could be obtained, and just obtain it again. In my case, it was from a ContentResolver.
This is a newbie question, I know. Can you guys help?
I'm talking about big files, of course, above 100MB. I'm imagining some kind of loop, but I don't know what to use. Chunked stream?
One thing is for certain: I don't want something like this (pseudocode):
File file = new File(existing_file_path);
byte[] theWholeFile = new byte[file.length()]; //this allocates the whole thing into memory
File out = new File(new_file_path);
out.write(theWholeFile);
To be more specific, I have to rewrite an applet that downloads a base64-encoded file and decodes it into the "normal" file. Because it's done with byte arrays, it holds twice the file size in memory: one copy base64-encoded and the other decoded. My question is not about base64. It's about saving memory.
Can you point me in the right direction?
Thanks!
From the question, it appears that you are reading the base64 encoded contents of a file into an array, decoding it into another array before finally saving it.
This is quite a memory overhead, especially given that Base64 encoding is in use. It can be made more efficient by:
Reading the contents of the file using a FileInputStream, preferably decorated with a BufferedInputStream.
Decoding on the fly. Base64-encoded characters can be read in groups of 4 and decoded as you go.
Writing the output to the file, using a FileOutputStream, again preferably decorated with a BufferedOutputStream. This write operation can also be done after every single decode operation.
The buffering of read and write operations is done to prevent frequent IO access. You could use a buffer size that is appropriate to your application's load; usually the buffer size is chosen to be some power of two, because such a number does not have an "impedance mismatch" with the physical disk buffer.
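If a modern JDK (Java 8 or later) is available, here is a minimal sketch of those three steps using the built-in java.util.Base64 decoder; the file names are placeholders, and the MIME decoder is used so that line breaks in the encoded file are tolerated:
import java.io.*;
import java.util.Base64;

public class DecodeBase64File {
    public static void main(String[] args) throws IOException {
        try (InputStream in = Base64.getMimeDecoder().wrap(
                     new BufferedInputStream(new FileInputStream("encoded.b64")));
             OutputStream out = new BufferedOutputStream(new FileOutputStream("decoded.bin"))) {
            byte[] buf = new byte[8 * 1024];   // power-of-two buffer, as suggested above
            int n;
            while ((n = in.read(buf)) != -1) { // decode and write chunk by chunk
                out.write(buf, 0, n);
            }
        }
    }
}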
Perhaps a FileInputStream on the file, reading off fixed length chunks, doing your transformation and writing them to a FileOutputStream?
Perhaps a BufferedReader? Javadoc: http://download-llnw.oracle.com/javase/1.4.2/docs/api/java/io/BufferedReader.html
Use this base64 encoder/decoder, which will wrap your file input stream and handle the decoding on the fly:
InputStream input = new Base64.InputStream(new FileInputStream("in.txt"));
OutputStream output = new FileOutputStream("out.txt");
try {
    byte[] buffer = new byte[1024];
    int bytesRead;
    // read() returns -1 at end of stream (available() is not a reliable end-of-stream test).
    while ((bytesRead = input.read(buffer)) != -1) {
        output.write(buffer, 0, bytesRead);
    }
} finally {
    input.close();
    output.close();
}
You can use org.apache.commons.io.FileUtils. This util class provides other options too beside what you are looking for. For example:
FileUtils.copyFile(final File srcFile, final File destFile)
FileUtils.copyFile(final File input, final OutputStream output)
FileUtils.copyFileToDirectory(final File srcFile, final File destDir)
And so on. You can also follow this tutorial.
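A quick usage example with hypothetical paths, assuming commons-io is on the classpath; copyFile does the buffered, chunked copy for you:
// Copies the file in chunks without loading it fully into memory.
FileUtils.copyFile(new File("/tmp/source.dat"), new File("/tmp/dest.dat"));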
I am trying to capture audio from the line-in on my PC; to do this I am using the AudioSystem class. The static AudioSystem.write method gives one of two choices: write to a file, or write to a stream. I can get it to write to a file just fine, but whenever I try to write to a stream I get a java.io.IOException (stream length not specified). For my buffer I am using a ByteArrayOutputStream. Is there another kind of stream I am supposed to be using, or am I messing up somewhere else?
Also, on a related subject, one can sample the audio line-in (TargetDataLine) directly by calling read. Is this the preferred way of doing audio capture, or should I be using AudioSystem?
Update
Source code that was requested:
final private TargetDataLine line;
final private AudioFormat format;
final private AudioFileFormat.Type fileType;
final private AudioInputStream audioInputStream;
final private ByteArrayOutputStream bos;

// Constructor, etc.

public void run()
{
    System.out.println("AudioWorker Started");
    try
    {
        line.open(format);
        line.start();
        // This commented part is regarding the second part
        // of my question
        // byte[] buff = new byte[512];
        // int bytes = line.read(buff, 0, buff.length);
        AudioSystem.write(audioInputStream, fileType, bos);
    }
    catch (Exception e)
    {
        e.printStackTrace();
    }
    System.out.println("AudioWorker Finished");
}
// Stack trace in console
AudioWorker Started
java.io.IOException: stream length not specified
at com.sun.media.sound.WaveFileWriter.write(Unknown Source)
at javax.sound.sampled.AudioSystem.write(Unknown Source)
at AudioWorker.run(AudioWorker.java:41)
AudioWorker Finished
From AudioSystem.write JavaDoc:
Writes a stream of bytes representing an audio file of the specified file type to the output stream provided. Some file types require that the length be written into the file header; such files cannot be written from start to finish unless the length is known in advance. An attempt to write a file of such a type will fail with an IOException if the length in the audio file type is AudioSystem.NOT_SPECIFIED.
Since the WAVE format requires the length to be written at the beginning of the file, the writer queries the getFrameLength method of your AudioInputStream. When this returns NOT_SPECIFIED (because you're recording "live" data of as-yet-unknown length), the writer throws the exception.
The File-oriented variant works around this by writing dummy data to the length field, then re-opening the file when the write is complete and overwriting that area of the file.
Your options are to use an output format that doesn't need the length in advance (such as AU), use an AudioInputStream that returns a valid frame length, or use the File version of the API.
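For what it's worth, a sketch of the second option (an AudioInputStream with a known frame length), assuming the raw bytes have already been captured from the TargetDataLine with line.read(...):
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.OutputStream;
import javax.sound.sampled.*;

class WaveHelper {
    // raw: bytes already captured with line.read(...); format: the AudioFormat used for capture.
    static void writeWave(byte[] raw, AudioFormat format, OutputStream out) throws IOException {
        long frames = raw.length / format.getFrameSize();        // frame length is now known
        AudioInputStream ais = new AudioInputStream(new ByteArrayInputStream(raw), format, frames);
        AudioSystem.write(ais, AudioFileFormat.Type.WAVE, out);  // writer can fill in the header length
    }
}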
You should check out Richard Baldwin's tutorial on Java sound. There's a complete source listing at the bottom of the article where he uses TargetDataLine's read to capture audio.
You could also try looking into JMF, which is a bit hairy but works a bit better than the javax.sound.sampled stuff. There are quite a few tutorials on the JMF page which describe how to record from line-in or mic channels.
This problem seems to happen inconsistently. We are using a java applet to download a file from our site, which we store temporarily on the client's machine.
Here is the code that we are using to save the file:
URL targetUrl = new URL(urlForFile);
InputStream content = (InputStream)targetUrl.getContent();
BufferedInputStream buffered = new BufferedInputStream(content);
File savedFile = File.createTempFile("temp",".dat");
FileOutputStream fos = new FileOutputStream(savedFile);
int letter;
while((letter = buffered.read()) != -1)
fos.write(letter);
fos.close();
Later, I try to access that file by using:
ObjectInputStream keyInStream = new ObjectInputStream(new FileInputStream(savedFile));
Most of the time it works without a problem, but every once in a while we get the error:
java.io.StreamCorruptedException: invalid stream header: 0D0A0D0A
which makes me believe that it isn't saving the file correctly.
I'm guessing that the operations you've done with getContent and BufferedInputStream have treated the file like an ASCII file, converting newlines or carriage returns into carriage return + newline (0x0D0A). That has confused ObjectInputStream, which expects serialized data objects.
If you are using an FTP URL, the transfer may be occurring in ASCII mode.
Try appending ";type=I" to the end of your URL.
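For example, with a hypothetical host and path:
URL targetUrl = new URL("ftp://example.com/files/data.bin;type=I");  // forces binary (image) transfer mode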
Why are you using ObjectInputStream to read it?
As per the javadoc:
An ObjectInputStream deserializes primitive data and objects previously written using an ObjectOutputStream.
Probably the error comes from the fact you didn't write it with ObjectOutputStream.
Try reading it with a FileInputStream only.
Here's a sample for binary (although not the most efficient way).
Here's another for text files.
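As a bare-bones illustration of reading the saved file back as raw bytes with a FileInputStream (savedFile as in the question's code):
try (FileInputStream fis = new FileInputStream(savedFile)) {
    byte[] buf = new byte[8192];
    int n;
    while ((n = fis.read(buf)) != -1) {
        // process buf[0..n) as raw bytes
    }
}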
There are 3 big problems in your sample code:
You're not just treating the input as bytes
You're needlessly pulling the entire object into memory at once
You're doing multiple method calls for every single byte read and written -- use the array based read/write!
Here's a redo:
URL targetUrl = new URL(urlForFile);
InputStream is = targetUrl.openStream();  // openStream() gives the raw bytes
File savedFile = File.createTempFile("temp", ".dat");
FileOutputStream fos = new FileOutputStream(savedFile);
int count;
byte[] buff = new byte[16 * 1024];
while ((count = is.read(buff)) != -1) {
    fos.write(buff, 0, count);
}
fos.close();
is.close();
You could also step back from the code and check whether the file on your client is the same as the file on the server. If you get both files onto an XP machine, you should be able to use the FC utility to do a compare (check FC's help if you need to run this as a binary compare, as there is a switch for that). If you're on Unix, cmp will do a byte-by-byte compare.
If the files are identical, then you're looking at a problem with the code that reads the file.
If the files are not identical, focus on the code that writes your file.
Good luck!