I am writing a file compressor in Java that produces a binary file. The problem is this: how can I write a byte to a new file so that the total file size is only 1 byte? I am doing the following:
FileOutputStream saveFile=new FileOutputStream("SaveObj3.sav");
// Create an ObjectOutputStream to put objects into save file.
ObjectOutputStream save = new ObjectOutputStream(saveFile);
save.writeByte(0);
save.close();
saveFile.close();
That should write only a single byte to the file, but when I check the size, it occupies 7 bytes. Does anyone know how I can write just one byte? Is there a better way?
Don't use ObjectOutputStream. Use the FileOutputStream directly:
FileOutputStream out=new FileOutputStream("SaveObj3.sav");
out.write(0);
out.close();
As JB Nizet noticed, the documentation of the ObjectOutputStream constructor states that this object also
writes the serialization stream header to the underlying stream
which explains the additional bytes.
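For illustration, a minimal sketch (assuming the file was written exactly as in the question) that dumps the resulting file: with the default serialization protocol, the first four bytes are the stream magic number and version (AC ED 00 05), followed by a block-data marker, a length byte, and only then the byte that was actually written:
byte[] content = java.nio.file.Files.readAllBytes(java.nio.file.Paths.get("SaveObj3.sav"));
System.out.println("size: " + content.length); // prints 7, not 1
for (byte b : content) {
    System.out.printf("%02X ", b & 0xFF);      // prints AC ED 00 05 77 01 00
}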
To prevent this behaviour you can just use other streams, such as FileOutputStream or DataOutputStream:
FileOutputStream saveFile = new FileOutputStream("c:/SaveObj3.sav");
DataOutputStream save = new DataOutputStream(saveFile);
save.writeByte(0);
save.close();
You can use the Files class provided by Java 7. It's easier than you might expect.
It can be performed in one line:
byte[] bytes = "message output to be written in file".getBytes();
Files.write(Paths.get("outputpath.txt"), bytes);
If you have a File object, you can just replace:
Paths.get("outputpath.txt")
with:
yourOutputFile.toPath()
To write only one byte, as you want, you can do the following:
Files.write(Paths.get("outputpath.txt"), new byte[1]);
In the file properties:
size: 1 byte
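If you need a value other than zero, the same one-liner works with an explicit array; the value 0x2A here is only an example:
Files.write(Paths.get("outputpath.txt"), new byte[] { (byte) 0x2A });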
I have a sample method which copies one file to another using an InputStream and an OutputStream. In this case, the source file is encoded in UTF-8. Even if I don't specify the encoding while writing to disk, the destination file has the correct encoding. But if I have to write a java.lang.String to a file, I need to specify the encoding. Why is that?
public static void copyFile() {
String sourceFilePath = "C://my_encoded.txt";
InputStream inStream = null;
OutputStream outStream = null;
try{
String targetFilePath = "C://my_target.txt";
File sourcefile =new File(sourceFilePath);
outStream = new FileOutputStream(targetFilePath);
inStream = new FileInputStream(sourcefile);
byte[] buffer = new byte[1024];
int length;
//copy the file content in bytes
while ((length = inStream.read(buffer)) > 0){
outStream.write(buffer, 0, length);
}
inStream.close();
outStream.close();
System.out.println("File "+targetFilePath+" is copied successful!");
}catch(IOException e){
e.printStackTrace();
}
}
My guess is that since the source file has the correct encoding and since we read and write one byte at a time, it works fine. And java.lang.String is 'UTF-16' by default, so if we write it to the file, it reads one byte at a time instead of 2 bytes and hence garbage values. Is that correct or am I completely wrong in my understanding?
You are copying the file byte per byte, so you don't need to care about character encoding.
As a rule of thumb:
Use the various InputStream and OutputStream implementations for byte-wise processing (like file copy).
There are some convenience methods to handle text directly like PrintStream.println(). Be careful because most of them use the default platform specific encoding.
Use the various Reader and Writer implementations for reading and writing text.
If you need to convert between byte-wise and text processing use InputStreamReader and OutputStreamWriter with explicit file encoding.
Do not rely on the default encoding. The default character encoding is platform specific (e.g. Windows-ANSI aka Cp1252 for Windows, usually UTF-8 on Linux).
Example: If you need to read a UTF-8 text file:
BufferedReader reader =
new BufferedReader(new InputStreamReader(new FileInputStream(inFile), "UTF-8"));
Avoid using a FileReader, because a FileReader always uses the default encoding.
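The writing side is symmetric. A minimal sketch for writing UTF-8 text with an explicit encoding (outFile is just a placeholder here); avoid the plain FileWriter for the same reason:
BufferedWriter writer =
    new BufferedWriter(new OutputStreamWriter(new FileOutputStream(outFile), "UTF-8"));
writer.write("some text");
writer.close();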
A special case: If you need random access to a file you should use RandomAccessFile. With it you can read and write data blocks at arbitrary positions. You can read and write raw byte blocks or you can use convenience methods to read and write text. But you should read the documentation carefully. E.g. the methods readUTF() and writeUTF() use a modified UTF-8 encoding.
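A minimal sketch of random access, with the file name and offset chosen only for illustration:
RandomAccessFile raf = new RandomAccessFile("data.bin", "rw");
raf.seek(1024);            // jump to an arbitrary byte offset
raf.writeInt(42);          // writes 4 bytes at that position
raf.seek(1024);
int value = raf.readInt(); // reads the same 4 bytes back as 42
raf.close();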
InputStream, OutputStream, Reader, Writer and RandomAccessFile form the basic IO functionality, enough for most use cases. For advanced IO (e.g. memory mapped files, ...) have a look at package java.nio.
Just read your code! (For the copy part at least ;-) )
When you copy between the two files, you copy the data byte by byte, so there is no conversion to a String.
When you write a String into a file, you need to convert it (sometimes indirectly) into an array of bytes (byte[]). That is where you need to specify your encoding.
When you read a file to get a String, you need to know its encoding in order to do it properly. Java doesn't 'skip' any bytes, but you need to make the conversion once again: from a byte[] to a String.
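To make that conversion explicit, a small sketch (the string and the charset are only examples):
String text = "héllo";
byte[] utf8 = text.getBytes("UTF-8");       // String -> byte[]: the encoding is chosen here
String decoded = new String(utf8, "UTF-8"); // byte[] -> String: must use the same encoding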
I'm working on an FTP client tool for connecting to FTP servers.
At this point, I need to upload files to the FTP server via this tool. According to this post and its first answer, it's possible to use a FileInputStream for saving the files. But I want to store the file as a byte array, not a FileInputStream.
Is there any way to do that?
Just use a ByteArrayInputStream to wrap the byte array which containes the data you want to upload, and then use this stream in place of the FileInputStream. E.g. something like:
byte[] mydata = <get your data>;
InputStream stream = new ByteArrayInputStream(mydata);
ftpClient.storeFile("remoteName", stream);
stream.close(); // Not strictly needed for ByteArrayInputStream
You can use a ByteArrayOutputStream to store the bytes and then get the byte array from it:
ByteArrayOutputStream baos = new ByteArrayOutputStream();
// write bytes here
baos.write(bytes);
// to get the bytes back
byte[] bArr = baos.toByteArray();
I am writing a Java program in which I have to write integers to a file. To make it more efficient I just want to write each int as only 4 bytes (which I think makes it a binary file of sorts, but I am not sure), and when reading back from the file I want to read the integers directly (I do not want to read bytes and then convert them to integers).
Is there a way to do that?
I want to write millions of integers to the file, so I want the method to be fast and efficient.
I am new to this, so please bear with me.
Use the DataOutputStream class or a RandomAccessFile. Both provide methods for writing structured binary data, for example the "int as 4 bytes" you want.
FileOutputStream fos = new FileOutputStream("numbers.dat");
DataOutputStream dos = new DataOutputStream(fos);
dos.writeInt(my_int);
dos.flush();
dos.close();
If you want the data buffered, wrap the file stream in a buffered stream as below:
FileOutputStream fos = new FileOutputStream("numbers.dat");
BufferedOutputStream bos = new BufferedOutputStream(fos);
DataOutputStream dos = new DataOutputStream(bos);
If you ever need the 4 bytes of a single int as a byte array, a ByteBuffer can also do it:
byte[] bytes = ByteBuffer.allocate(4).putInt(myIntValue).array();
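Reading the integers back works the same way with a DataInputStream; a minimal sketch, assuming the file was written with writeInt as above (the EOFException is used here simply to detect the end of the file):
DataInputStream dis = new DataInputStream(
        new BufferedInputStream(new FileInputStream("numbers.dat")));
try {
    while (true) {
        int value = dis.readInt(); // reads the next 4 bytes as one int
        // ... use value ...
    }
} catch (EOFException e) {
    // reached the end of the file
} finally {
    dis.close();
}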
I have a blob column in my database table, which I map to a byte[] in my Java program, and to use this data I have to convert it to an InputStream or OutputStream. But I don't know what happens internally when I do so. Can anyone briefly explain what's happening when I do this conversion?
You create and use byte array I/O streams as follows:
byte[] source = ...;
ByteArrayInputStream bis = new ByteArrayInputStream(source);
// read bytes from bis ...
ByteArrayOutputStream bos = new ByteArrayOutputStream();
// write bytes to bos ...
byte[] sink = bos.toByteArray();
Assuming that you are using a JDBC driver that implements the standard JDBC Blob interface (not all do), you can also connect an InputStream or OutputStream to a blob using the getBinaryStream and setBinaryStream methods1, and you can also get and set the bytes directly.
(In general, you should take appropriate steps to handle any exceptions, and close streams. However, closing bis and bos in the example above is unnecessary, since they aren't associated with any external resources; e.g. file descriptors, sockets, database connections.)
1 - The setBinaryStream method is really a getter. Go figure.
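For completeness, a rough sketch of those Blob methods; rs and the column name "data" are assumptions here, and not every driver supports writing back through setBinaryStream:
Blob blob = rs.getBlob("data");
byte[] allBytes = blob.getBytes(1, (int) blob.length()); // blob positions are 1-based
InputStream in = blob.getBinaryStream();                 // read the blob contents as a stream
OutputStream out = blob.setBinaryStream(1);              // write into the blob starting at position 1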
I'm assuming that by 'use' you mean read, but what I'll explain for the read case can basically be reversed for the write case.
So you end up with a byte[]. This could represent any kind of data, which may need special types of conversion (character, encrypted, etc). Let's pretend you want to write this data as-is to a file.
Firstly, you could create a ByteArrayInputStream, which is basically a mechanism to supply the bytes to something in sequence.
Then you could create a FileOutputStream for the file you want to create. There are many types of InputStreams and OutputStreams for different data sources and destinations.
Lastly, you would write the InputStream to the OutputStream. In this case, the array of bytes would be sent in sequence to the FileOutputStream for writing. For this I recommend using IOUtils:
byte[] bytes = ...;
ByteArrayInputStream in = new ByteArrayInputStream(bytes);
FileOutputStream out = new FileOutputStream(new File(...));
IOUtils.copy(in, out);
IOUtils.closeQuietly(in);
IOUtils.closeQuietly(out);
and in reverse
FileInputStream in = new FileInputStream(new File(...));
ByteArrayOutputStream out = new ByteArrayOutputStream();
IOUtils.copy(in, out);
IOUtils.closeQuietly(in);
IOUtils.closeQuietly(out);
byte[] bytes = out.toByteArray();
If you use the above code snippets you'll need to handle exceptions, and I recommend you do the closes in a finally block.
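For example, on Java 7 and later a try-with-resources block closes both streams automatically, even if the copy throws (the target file name is just a placeholder):
byte[] bytes = "example data".getBytes();
try (InputStream in = new ByteArrayInputStream(bytes);
     OutputStream out = new FileOutputStream("target.dat")) {
    IOUtils.copy(in, out);
} // both streams are closed here, even if copy() throws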
We can convert a byte[] array into an InputStream by using ByteArrayInputStream:
String str = "Welcome to awesome Java World";
byte[] content = str.getBytes();
InputStream is = new ByteArrayInputStream(content);
For a full example, please check here: http://www.onlinecodegeek.com/2015/09/how-to-convert-byte-into-inputstream.html
There is no conversion between InputStream/OutputStream and the bytes they are working with. They are made for binary data, and just read (or write) the bytes one by one as is.
A conversion needs to happen when you want to go from byte to char. Then you need to convert using a character set. This happens when you make String or Reader from bytes, which are made for character data.
ByteArrayOutputStream output = new ByteArrayOutputStream();
// ... write your bytes to output ...
ByteArrayInputStream input = new ByteArrayInputStream(output.toByteArray());
I do realize that my answer is way late for this question but I think the community would like a newer approach to this issue.
byte[] data = dbEntity.getBlobData();
response.getOutputStream().write(data);
I think this is better since you already have an existing OutputStream in the response object; there is no need to create a new OutputStream.
This problem seems to happen inconsistently. We are using a java applet to download a file from our site, which we store temporarily on the client's machine.
Here is the code that we are using to save the file:
URL targetUrl = new URL(urlForFile);
InputStream content = (InputStream)targetUrl.getContent();
BufferedInputStream buffered = new BufferedInputStream(content);
File savedFile = File.createTempFile("temp",".dat");
FileOutputStream fos = new FileOutputStream(savedFile);
int letter;
while((letter = buffered.read()) != -1)
fos.write(letter);
fos.close();
Later, I try to access that file by using:
ObjectInputStream keyInStream = new ObjectInputStream(new FileInputStream(savedFile));
Most of the time it works without a problem, but every once in a while we get the error:
java.io.StreamCorruptedException: invalid stream header: 0D0A0D0A
which makes me believe that it isn't saving the file correctly.
I'm guessing that the operations you've done with getContent and BufferedInputStream have treated the file like an ASCII file, converting newlines or carriage returns into carriage return + newline (0x0D0A), which has confused ObjectInputStream (which expects serialized data objects).
If you are using an FTP URL, the transfer may be occurring in ASCII mode.
Try appending ";type=I" to the end of your URL.
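For example (the host and path are placeholders):
URL targetUrl = new URL("ftp://example.com/files/data.dat;type=I"); // ";type=I" requests binary (IMAGE) transfer mode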
Why are you using ObjectInputStream to read it?
As per the javadoc:
An ObjectInputStream deserializes primitive data and objects previously written using an ObjectOutputStream.
Probably the error comes from the fact you didn't write it with ObjectOutputStream.
Try reading it with FileInputStream only.
Here's a sample for binary (although not the most efficient way).
Here's another, used for text files.
There are 3 big problems in your sample code:
You're not just treating the input as bytes
You're needlessly pulling the entire object into memory at once
You're doing multiple method calls for every single byte read and written -- use the array based read/write!
Here's a redo:
URL targetUrl = new URL(urlForFile);
InputStream is = targetUrl.openStream();
File savedFile = File.createTempFile("temp",".dat");
FileOutputStream fos = new FileOutputStream(savedFile);
int count;
byte[] buff = new byte[16 * 1024];
while((count = is.read(buff)) != -1) {
fos.write(buff, 0, count);
}
fos.close();
is.close();
You could also step back from the code and check to see if the file on your client is the same as the file on the server. If you get both files on an XP machine, you should be able to use the FC utility to do a compare (check FC's help if you need to run this as a binary compare as there is a switch for that). If you're on Unix, I don't know the file compare program, but I'm sure there's something.
If the files are identical, then you're looking at a problem with the code that reads the file.
If the files are not identical, focus on the code that writes your file.
Good luck!