I'm trying to update .ser files.
Here's a snippet, with the boilerplate (try/catch etc.) stripped out:
String file1 = "data.ser";
InputStream fis = this.getClass().getResourceAsStream(file1);
InputStream buffer = new BufferedInputStream(fis);
ObjectInputStream ois = new ObjectInputStream(buffer);
ArrayList<Object[]> list = (ArrayList<Object[]>) ois.readObject();
//Reading works fine, I got all my data in list
/*
Some operations in list
*/
OutputStream fout = new FileOutputStream(file1);
ObjectOutputStream oos = new ObjectOutputStream(fout);
//Here I tried to verify list; it's okay, all is there
oos.writeObject(list);
//I tried also some print just after that, and it works
This doesn't update my file; what could the problem be?
PS: In the IDE everything works, but when running the jar it doesn't. Is it possible that I can't change files inside a jar?
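Entries inside a jar are indeed read-only: getResourceAsStream reads from the jar itself, while new FileOutputStream("data.ser") writes a file in the current working directory, so the resource in the jar is never updated. The usual fix is to keep the working copy in a writable location outside the jar (seeding it from the resource on first run). A minimal sketch of the read-modify-write cycle against such a copy; the temp-dir location and the seed data are placeholders for this example:

```java
import java.io.*;
import java.nio.file.*;
import java.util.ArrayList;

public class SerUpdateDemo {
    public static void main(String[] args) throws Exception {
        // Writable copy outside the jar; in a real app, seed it once from
        // getClass().getResourceAsStream("/data.ser") if it does not exist yet.
        Path writable = Paths.get(System.getProperty("java.io.tmpdir"), "data.ser");

        // Stand-in for the seeded resource.
        ArrayList<Object[]> seed = new ArrayList<>();
        seed.add(new Object[]{"initial"});
        try (ObjectOutputStream oos = new ObjectOutputStream(
                new BufferedOutputStream(Files.newOutputStream(writable)))) {
            oos.writeObject(seed);
        }

        // Read the list back from the writable copy.
        ArrayList<Object[]> list;
        try (ObjectInputStream ois = new ObjectInputStream(
                new BufferedInputStream(Files.newInputStream(writable)))) {
            list = (ArrayList<Object[]>) ois.readObject();
        }

        // Modify and write back to the same writable path.
        list.add(new Object[]{"updated"});
        try (ObjectOutputStream oos = new ObjectOutputStream(
                new BufferedOutputStream(Files.newOutputStream(writable)))) {
            oos.writeObject(list);
        }
        System.out.println(list.size());
        Files.deleteIfExists(writable);
    }
}
```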
I have a very large HashMap of the format HashMap<String, List<String>>, and I want to serialize it using BufferedOutputStream because I think that it will be more efficient than with a regular OutputStream.
But how do I divide the HashMap into chunks the size of the buffer? Should I just iterate through the HashMap?
If you plan to write to a local file, chain FileOutputStream, BufferedOutputStream and ObjectOutputStream. With the setup below, BufferedOutputStream minimizes direct writes to the file system using its default buffer of 8192 bytes.
Map<String, List<String>> data = new HashMap<>();
data.put("myKey", List.of("A", "B", "C"));
File outFile = new File("out.bin");
try (FileOutputStream fos = new FileOutputStream(outFile);
     BufferedOutputStream bos = new BufferedOutputStream(fos);
     ObjectOutputStream oos = new ObjectOutputStream(bos)) {
    oos.writeObject(data);
    oos.flush();
}
Unless the output file is very large, there is no need for further chunking; the buffer handles the batching for you.
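For completeness, here is the matching round trip with the mirrored input chain (FileInputStream, BufferedInputStream, ObjectInputStream); the key and list values are just example data:

```java
import java.io.*;
import java.util.*;

public class RoundTripDemo {
    public static void main(String[] args) throws Exception {
        Map<String, List<String>> data = new HashMap<>();
        data.put("myKey", Arrays.asList("A", "B", "C"));

        // Write with the buffered chain described above.
        File outFile = new File("out.bin");
        try (ObjectOutputStream oos = new ObjectOutputStream(
                new BufferedOutputStream(new FileOutputStream(outFile)))) {
            oos.writeObject(data);
        }

        // Read back with the mirrored input chain.
        Map<String, List<String>> restored;
        try (ObjectInputStream ois = new ObjectInputStream(
                new BufferedInputStream(new FileInputStream(outFile)))) {
            restored = (Map<String, List<String>>) ois.readObject();
        }
        System.out.println(restored.get("myKey"));
        outFile.delete();
    }
}
```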
I have the below code which stores a List to File and then to a mysql blob column. Is it possible to recover that List from the database?
ArrayList<String> myList = new ArrayList<String>();
File tempFile = File.createTempFile("myFile", ".temp");
tempFile.deleteOnExit();
FileOutputStream fileOutputStream = new FileOutputStream(tempFile);
ObjectOutputStream objectOutputStream = new ObjectOutputStream(fileOutputStream);
objectOutputStream.writeUnshared(myList);
objectOutputStream.close();
String trailQuery = "INSERT INTO table (tempFile) VALUES (?)";
PreparedStatement preparedStatement = connection.prepareStatement(trailQuery);
preparedStatement.setObject(1, tempFile);
I tried to read that back, but I got a FileNotFoundException:
if (resultSet.next()) {
    InputStream fileInputStream = resultSet.getBinaryStream("tempFile");
    ObjectInputStream objectInputStream = new ObjectInputStream(fileInputStream);
    File inner = (File) objectInputStream.readObject();
    FileInputStream fileInputStreamInner = new FileInputStream(inner);
    ObjectInputStream objectInputStreamInner = new ObjectInputStream(fileInputStreamInner);
    ArrayList<String> myList = (ArrayList<String>) objectInputStreamInner.readObject();
    System.out.println(myList.size());
}
Exception
java.io.FileNotFoundException: \tmp\myFile.1708309680603860570.temp (The system cannot find the path specified)
As EJP mentioned, a java.io.File is nothing more than a holder for a filename; I had the wrong understanding, so none of the file's contents ever make it into the database.
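Since a File only holds a path, the fix is to store the serialized list bytes themselves in the blob column. A sketch of the idea; the in-memory streams stand in for the database round trip, and the JDBC call you would use is shown in a comment:

```java
import java.io.*;
import java.util.*;

public class ListBlobDemo {
    public static void main(String[] args) throws Exception {
        ArrayList<String> myList = new ArrayList<>(Arrays.asList("one", "two"));

        // Serialize the list itself to bytes -- no temp File object involved.
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(baos)) {
            oos.writeObject(myList);
        }
        byte[] blobBytes = baos.toByteArray();
        // In JDBC you would store these bytes directly, e.g.:
        //   preparedStatement.setBytes(1, blobBytes);

        // Reading back: the blob column yields an InputStream over the same bytes.
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(blobBytes))) {
            @SuppressWarnings("unchecked")
            ArrayList<String> restored = (ArrayList<String>) ois.readObject();
            System.out.println(restored.size());
        }
    }
}
```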
I'm trying to create a zip file on the fly and serve it via NanoHTTPD.
This is what I currently have:
@Override
public Response serve(IHTTPSession session) {
    String uri = session.getUri();
    if (uri.equals("/test.zip")) {
        try {
            PipedOutputStream outPipe = new PipedOutputStream();
            ZipEncryptOutputStream zeos = new ZipEncryptOutputStream(outPipe, "Foo");
            ZipOutputStream zos = new ZipOutputStream(zeos);
            File file = new File("Test.txt");
            PipedInputStream inPipe = new PipedInputStream(outPipe, 2048);
            ZipEntry ze = new ZipEntry("Test.txt");
            zos.putNextEntry(ze);
            FileInputStream fis = new FileInputStream(file);
            IOUtils.copy(fis, zos);
            fis.close();
            zos.closeEntry();
            zos.close();
            return newChunkedResponse(Response.Status.OK, "application/zip", inPipe);
But when debugging, this of course takes some time, because the whole zip is written out before the response is returned.
I guess I have to do the writing to zos in a callback of some sort and afterwards close the entry and stream? But how?
Is this an efficient way of doing it (I am targeting Android)? Creating temp files would be much easier, but it should work on low-end smartphones, and the initial wait before the (roughly 40 MB) zip starts downloading should be low.
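The usual pattern with piped streams is to do the zip writing on a separate thread, so the request thread can hand inPipe to the response immediately and the pipe is drained as it fills. A minimal sketch without the third-party ZipEncryptOutputStream layer; the entry name and contents are placeholders, and in your serve() you would return newChunkedResponse(Response.Status.OK, "application/zip", inPipe) instead of reading the pipe yourself:

```java
import java.io.*;
import java.util.zip.*;

public class PipedZipDemo {
    public static void main(String[] args) throws Exception {
        PipedOutputStream outPipe = new PipedOutputStream();
        PipedInputStream inPipe = new PipedInputStream(outPipe, 8192);

        // Producer thread: writes the zip into the pipe while the consumer reads it.
        Thread producer = new Thread(() -> {
            try (ZipOutputStream zos = new ZipOutputStream(outPipe)) {
                zos.putNextEntry(new ZipEntry("Test.txt"));
                zos.write("hello".getBytes("UTF-8"));
                zos.closeEntry();
            } catch (IOException e) {
                e.printStackTrace();
            }
        });
        producer.start();

        // Consumer side: here we read the pipe ourselves to show the zip is valid;
        // NanoHTTPD would do this when streaming the chunked response.
        try (ZipInputStream zis = new ZipInputStream(inPipe)) {
            ZipEntry entry = zis.getNextEntry();
            System.out.println(entry.getName());
            byte[] buf = new byte[64];
            int n, total = 0;
            while ((n = zis.read(buf, total, buf.length - total)) > 0) {
                total += n;
            }
            System.out.println(new String(buf, 0, total, "UTF-8"));
        }
        producer.join();
    }
}
```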
I have a few files in a local folder. I want to store the file names as keys and the content of the corresponding files as values.
HashMap<String,String> hm = new HashMap<String,String>();
hm.put(filename,filecontent);
Can someone tell me if this is the right way to do it?
When storing file contents as a String, you have to make sure the encoding is respected; I would recommend using a byte array instead:
Map<String, byte[]> hm = new HashMap<String, byte[]>();
Also: depending on how many files you are manipulating, you may want to consider using file streams to avoid keeping everything in memory.
There are a couple of steps to what you would like.
I am going to assume you have the filename already as a String.
HashMap<String, byte[]> hm = new HashMap<String, byte[]>(); // key: filename, value: file content as binary data
FileInputStream fileStream = new FileInputStream(filename); // stream over the file
byte[] buff = new byte[512]; // buffer for the read loop
ByteArrayOutputStream byteStream = new ByteArrayOutputStream(); // collects the content; toByteArray() yields the result
int read;
while ((read = fileStream.read(buff)) > 0) { // read up to 512 bytes at a time, until end of file
    byteStream.write(buff, 0, read); // write only the bytes actually read, not the whole buffer
}
fileStream.close(); // always close the stream when done
hm.put(filename, byteStream.toByteArray()); // insert filename and file content into the hashmap
As others have suggested, however, this is less than ideal. You are holding multiple files in memory for an arbitrary length of time; you can eat a lot of RAM without realizing it and quickly run into an OutOfMemoryError.
You would be better off reading the file content only when needed, so there isn't a whole file sitting in your RAM for god knows how long. The only plausible reason I could see to store file contents would be if you were reading them a lot and could afford the RAM to cache them in memory.
Update for Binary Data
HashMap<String, byte[]> hm = new HashMap<String, byte[]>();
final File folder = new File("/home/you/Desktop");
listFilesForFolder(folder);
public void listFilesForFolder(final File folder) throws IOException {
    for (final File fileEntry : folder.listFiles()) {
        if (fileEntry.isDirectory()) {
            listFilesForFolder(fileEntry);
        } else {
            String name = fileEntry.getName();
            byte[] fileData = new byte[(int) fileEntry.length()];
            DataInputStream dis = new DataInputStream(new FileInputStream(fileEntry));
            dis.readFully(fileData);
            dis.close();
            hm.put(name, fileData);
        }
    }
}
Tested with a zip file for the OP:
public static void main(String[] args) throws FileNotFoundException, IOException {
    File file = new File("D:\\try.zip");
    System.out.println(file.length());
    byte[] fileData = new byte[(int) file.length()];
    DataInputStream dis = new DataInputStream(new FileInputStream(file));
    dis.readFully(fileData);
    dis.close();
}
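On Java 7 and later, the read-fully loop can be replaced by Files.readAllBytes, which sizes the array for you. A sketch using a temporary file as a stand-in for one of your local files:

```java
import java.nio.charset.StandardCharsets;
import java.nio.file.*;
import java.util.*;

public class NioReadDemo {
    public static void main(String[] args) throws Exception {
        // Temporary stand-in file with known contents.
        Path file = Files.createTempFile("demo", ".txt");
        Files.write(file, "some content".getBytes(StandardCharsets.UTF_8));

        // One call replaces the stream/buffer loop above.
        Map<String, byte[]> hm = new HashMap<>();
        hm.put(file.getFileName().toString(), Files.readAllBytes(file));

        System.out.println(hm.get(file.getFileName().toString()).length);
        Files.delete(file);
    }
}
```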
For a class, I have to send a file of any type from my client to a server. I have to handle each packet individually and use UDP. I have managed to transfer the file from the client to the server, and I now have a file object which I cannot figure out how to save to a user specified directory.
f = new File(path + '\\' + filename);//path and filename are user specified.
FileOutputStream foutput = new FileOutputStream(f);
ObjectOutputStream output = new ObjectOutputStream(foutput);
output.writeObject(result);//result is a File
output.flush();
output.close();
Any time I run this code, it writes a new file with the appropriate name, but the text file I am testing with ends up containing gibberish. Is there any way to get the File object's contents into a file in the appropriate directory?
EDIT: As it turns out, I was misunderstanding what, exactly, a file is. I have not been transferring the data, but rather the path. How do I transfer an actual file?
ObjectOutputStream writes objects in Java's binary serialization format; only ObjectInputStream's readObject() can decode that data.
If you open the resulting file in a text editor, it is just gibberish, as you have seen.
You want this:
FileOutputStream fos = new FileOutputStream(path + '\\' + filename);
FileInputStream fis = new FileInputStream(result);
byte[] buf = new byte[1024];
int hasRead = 0;
while ((hasRead = fis.read(buf)) > 0) {
    fos.write(buf, 0, hasRead);
}
fis.close();
fos.close();
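The same raw-bytes rule applies on the wire: put the file's bytes into the datagram payload, not a serialized File object. A minimal loopback sketch of sending and receiving one packet; the payload string stands in for bytes read from your file, and a real transfer would chunk larger files across multiple datagrams:

```java
import java.net.*;
import java.nio.charset.StandardCharsets;

public class UdpBytesDemo {
    public static void main(String[] args) throws Exception {
        // Stand-in for bytes read from the file being transferred.
        byte[] fileBytes = "file contents".getBytes(StandardCharsets.UTF_8);

        try (DatagramSocket receiver = new DatagramSocket(0);  // ephemeral port
             DatagramSocket sender = new DatagramSocket()) {
            // Send the raw bytes to the receiver over loopback.
            DatagramPacket out = new DatagramPacket(fileBytes, fileBytes.length,
                    InetAddress.getLoopbackAddress(), receiver.getLocalPort());
            sender.send(out);

            // Receive and use only the bytes actually delivered (getLength()).
            byte[] buf = new byte[1024];
            DatagramPacket in = new DatagramPacket(buf, buf.length);
            receiver.receive(in);
            System.out.println(new String(in.getData(), 0, in.getLength(),
                    StandardCharsets.UTF_8));
        }
    }
}
```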
If I understand your question, how about using a FileWriter?
File result = new File("result.txt");
result.createNewFile();
FileWriter writer = new FileWriter(result);
writer.write("Hello user3821496\n"); //just an example how you can write a String to it
writer.flush();
writer.close();