I have this upload method:
try {
    Files.createDirectories(filesPath);
} catch (IOException e1) {
    e1.printStackTrace();
    return null;
}
for (MultipartFile file : files) {
    try {
        // Get the file and save it somewhere
        byte[] bytes = file.getBytes();
        Path path = Paths.get(filesPath + File.separator + file.getOriginalFilename());
        Files.write(path, bytes);
    } catch (IOException e) {
        e.printStackTrace();
        return null;
    }
}
It works well, but when I try to upload a bigger file (around 1.5 GB) I get this error:
Invalid string length
How can I fix it?
First, you need to adjust two properties if you want to allow uploads of such large files.
spring.servlet.multipart.max-file-size=-1
spring.servlet.multipart.max-request-size=-1
Then it is better to use MultipartFile.getInputStream() to read the contents of the file.
You might also use IOUtils.copy() from Apache Commons IO to simplify your job.
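As a minimal sketch of the streaming approach: instead of `file.getBytes()` (which builds the entire file in memory), pass the input stream straight to `Files.copy()`. The `save` helper and the temp-file usage below are illustrative, not part of the original controller; in the Spring handler you would call `save(file.getInputStream(), path)`.

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class StreamingSave {
    // Copies the stream to disk without buffering the whole file in memory;
    // returns the number of bytes written.
    static long save(InputStream in, Path target) throws IOException {
        return Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("upload", ".bin");
        try (InputStream in = new java.io.ByteArrayInputStream(new byte[1024])) {
            System.out.println(save(in, tmp)); // prints 1024
        }
        Files.deleteIfExists(tmp);
    }
}
```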
I am trying to save a stream of bytes in h264 format, to an h264 file.
I did it in JAVA, and the file is being saved and I can open it and see the video.
BUT, when I try the exact same code, in android, and I'm trying to save the file through the android device, the file is corrupted.
This is my code (both for android and for java):
File path = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_MOVIES);
File file = new File(path, "filename2.mp4");
FileOutputStream output2 = null;
try {
    output2 = new FileOutputStream(file, true);
    output2.write(my_stream.toByteArray());
} catch (IOException e) {
    e.printStackTrace();
} finally {
    try {
        if (output2 != null) {
            output2.close();
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Of course, only the path is different between the Java and Android versions.
Maybe that is because my_stream.toByteArray() contains only one part of the whole video. Read the video stream in a loop and write it to the output stream chunk by chunk.
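A chunk-by-chunk copy could be sketched like this (assuming the video arrives as an `InputStream`; the 8 KB buffer size is an arbitrary choice):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class ChunkedCopy {
    // Reads from the input in fixed-size chunks and writes each chunk out,
    // so the whole video never has to fit in memory at once.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[8192];
        long total = 0;
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
            total += read;
        }
        out.flush();
        return total;
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        System.out.println(copy(new ByteArrayInputStream(new byte[20000]), sink)); // prints 20000
    }
}
```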
Alternatively there is this function that will do it for you:
Files.copy(videoInputStream, filePath, StandardCopyOptions.REPLACE_EXISTING);
Or if the input is a byte array:
Files.write(outputPath, bytes, StandardOpenOption.WRITE,
        StandardOpenOption.CREATE);
Full documentation: https://docs.oracle.com/javase/7/docs/api/java/nio/file/Files.html#write(java.nio.file.Path,%20byte[],%20java.nio.file.OpenOption...)
https://docs.oracle.com/javase/7/docs/api/java/nio/file/StandardOpenOption.html
I want to read file content using this code:
String content = new String(Files.readAllBytes(Paths.get("/sys/devices/virtual/dmi/id/chassis_serial")));
On some systems this file is not present, or it is empty. How do I catch this exception? I want to print the message "No file" when the file does not exist or has no value.
The AccessDeniedException can only be thrown when using the new file API. Use an InputStream to open a stream from the source file so that you can catch that exception.
Try with this code:
try {
    final InputStream in = Files.newInputStream(Paths.get("/sys/devices/virtual/dmi/id/chassis_serial"));
} catch (NoSuchFileException ex) {
    System.out.print("File not found");
} catch (AccessDeniedException ex) {
    System.out.print("File access denied");
} catch (IOException ex) {
    ex.printStackTrace();
}
Try to use the filter file.canRead() to avoid any access exceptions.
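A sketch combining the readability check with the empty-file check (the helper name `readOrMessage` is made up for illustration):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class ReadIfPresent {
    // Returns the file contents, or "No file" when the file is missing,
    // unreadable, or empty.
    static String readOrMessage(Path p) {
        try {
            if (!Files.isReadable(p)) {
                return "No file";
            }
            byte[] bytes = Files.readAllBytes(p);
            return bytes.length > 0 ? new String(bytes) : "No file";
        } catch (IOException e) {
            return "No file";
        }
    }

    public static void main(String[] args) {
        System.out.println(readOrMessage(Paths.get("/nonexistent/chassis_serial"))); // prints No file
    }
}
```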
Create a File object and check if it exists.
If it does, then it's safe to convert that file to a byte array and check that the size is greater than 0. If it is, convert it to a String. I added some sample code below.
File myFile = new File("/sys/devices/virtual/dmi/id/chassis_serial");
byte[] fileBytes;
String content = "";
if (myFile.exists()) {
    fileBytes = Files.readAllBytes(myFile.toPath());
    if (fileBytes.length > 0) {
        content = new String(fileBytes);
    } else {
        System.out.println("No file");
    }
} else {
    System.out.println("No file");
}
I know it's not the one-liner you were looking for. Another option is just to do:
try {
    String content = new String(Files.readAllBytes(Paths.get("/sys/devices/virtual/dmi/id/chassis_serial")));
} catch (Exception e) {
    System.out.print("No file exists");
}
Read up on try/catch blocks, as MrTux suggested, as well as the Java Files and Java IO documentation.
I have a program which writes out some data. I am using this logic:
FileOutputStream outpre = new FileOutputStream(getfile());
FileChannel ch = outpre.getChannel();
ch.position(startwr);
ch.write(ByteBuffer.wrap(dsave));
outpre.close();
It writes the correct data to the correct location. The only problem is that everything before the starting write position (startwr) gets replaced with 00s, and the point at which the writing was done becomes the end of the file.
How can I write data to the file without corrupting the previous data and changing the file size?
You need to instruct the stream to either append to or overwrite the contents of the file...
Take a look at FileOutputStream(File, boolean) for more details.
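A quick sketch of that boolean append flag (note this only appends at the end of the file; it does not help with writing at an arbitrary offset; the `demo` helper is illustrative):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

public class AppendDemo {
    static String demo() throws IOException {
        File f = File.createTempFile("app", ".txt");
        // false (the default) truncates the file before writing
        try (FileOutputStream out = new FileOutputStream(f, false)) {
            out.write("AB".getBytes());
        }
        // true appends to the existing contents instead of truncating
        try (FileOutputStream out = new FileOutputStream(f, true)) {
            out.write("CD".getBytes());
        }
        String result = new String(java.nio.file.Files.readAllBytes(f.toPath()));
        f.delete();
        return result;
    }

    public static void main(String[] args) throws IOException {
        System.out.println(demo()); // prints ABCD
    }
}
```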
Updated
After some mucking around, I found that the only solution I could get close to working was...
RandomAccessFile raf = null;
try {
    raf = new RandomAccessFile(new File("C:/Test.txt"), "rw");
    raf.seek(3);
    raf.writeBytes("BB");
} catch (IOException exp) {
    exp.printStackTrace();
} finally {
    try {
        if (raf != null) {
            raf.close();
        }
    } catch (Exception e) {
    }
}
You can fix it this way: open the channel directly with only the WRITE option, which neither truncates the existing contents nor forces writes to the end of the file:
FileChannel fc = FileChannel.open(getFile().toPath(), StandardOpenOption.WRITE);
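For completeness, a sketch of an in-place overwrite at an offset using that approach (the `writeAt` helper and the temp-file demo are illustrative):

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class PositionalWrite {
    // Opens the file for writing WITHOUT truncating it, then overwrites
    // bytes starting at the given offset; the rest of the file is untouched.
    static void writeAt(Path file, long offset, byte[] data) throws IOException {
        try (FileChannel ch = FileChannel.open(file, StandardOpenOption.WRITE)) {
            ch.position(offset);
            ch.write(ByteBuffer.wrap(data));
        }
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("pw", ".bin");
        Files.write(tmp, "AAAAAA".getBytes());
        writeAt(tmp, 3, "BB".getBytes());
        System.out.println(new String(Files.readAllBytes(tmp))); // prints AAABBA
        Files.deleteIfExists(tmp);
    }
}
```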
EDIT: children is an array of directories. This code loops through the array in order to enter each directory and load all the files listed into the webs array. Then, for each file, the readFile function is supposed to read the file.
My code is:
for (File cat : children) {
    File[] webs = cat.listFiles();
    System.out.println(" Indexing category: " + cat.getName());
    for (File f : webs) {
        Web w = readFile(f);
        // Do things with w
    }
}
I'm getting this error:
org.htmlparser.util.ParserException: Error in opening a connection to 209800.webtrec
209801.webtrec
...
422064.webtrec
422071.webtrec
422087.webtrec
422089.webtrec
422112.webtrec
422125.webtrec
422127.webtrec
;
java.io.IOException: File Name Too Long
at java.io.UnixFileSystem.canonicalize0(Native Method)
at java.io.UnixFileSystem.canonicalize(UnixFileSystem.java:172)
at java.io.File.getCanonicalPath(File.java:576)
at org.htmlparser.http.ConnectionManager.openConnection(ConnectionManager.java:848)
at org.htmlparser.Parser.setResource(Parser.java:398)
at org.htmlparser.Parser.<init>(Parser.java:317)
at org.htmlparser.Parser.<init>(Parser.java:331)
at IndexGenerator.IndexGenerator.readFile(IndexGenerator.java:156)
at IndexGenerator.IndexGenerator.main(IndexGenerator.java:101)
It's strange because I don't see any of those files in that directory.
Thanks!
EDIT2: This is the readFile function. It loads the contents of the file into a string and parses it. Actually, the files are HTML files.
private static Web readFile(File file) {
    try {
        FileInputStream fin = new FileInputStream(file);
        FileChannel fch = fin.getChannel();
        // map the contents of the file into a ByteBuffer
        ByteBuffer byteBuff = fch.map(FileChannel.MapMode.READ_ONLY, 0, fch.size());
        // convert the ByteBuffer to a CharBuffer
        CharBuffer chBuff = Charset.forName("UTF-8").decode(byteBuff);
        String f = chBuff.toString();
        // Closing the input stream also closes the channel associated with it
        fin.close();
        Parser parser = new Parser(f);
        Visitor visit = new Visitor();
        parser.visitAllNodesWith((NodeVisitor) visit);
        return new Web(visit.getCat(), visit.getBody(), visit.getTitle());
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    } catch (ParserException e) {
        e.printStackTrace();
    }
    return null;
}
Okay, I finally found the solution.
It was a very stupid error. I had a file in that directory that contained the names of all the empty HTML files I had deleted in a previous task. So I was trying to parse it, and the parser interpreted it as a URL rather than an HTML file (since it has no tags and a lot of dots). I couldn't spot the file easily because I have millions of files in that folder.
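One way to guard against such stray files is to filter the directory listing by extension before parsing (a sketch; the .webtrec suffix is taken from the error output above, and the helper name is made up):

```java
import java.io.File;

public class WebtrecFilter {
    // Lists only the *.webtrec files in a directory, skipping any stray
    // bookkeeping files that the parser would otherwise choke on.
    static File[] listWebtrecs(File dir) {
        return dir.listFiles((d, name) -> name.endsWith(".webtrec"));
    }

    public static void main(String[] args) throws Exception {
        File dir = java.nio.file.Files.createTempDirectory("cat").toFile();
        new File(dir, "209800.webtrec").createNewFile();
        new File(dir, "deleted-files.txt").createNewFile();
        System.out.println(listWebtrecs(dir).length); // prints 1
    }
}
```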
I am working on an application. In that application I perform file store, retrieve, and delete operations. To identify the files on the server I use an index file (a hash map file). Every time I perform an upload operation I update the "index" file and upload it to the server along with the other files.
To perform a delete operation I first retrieve the "index" file, delete the files from the server based on the index, and, after updating the "index" file, upload it to the server again.
I can perform the file upload operation successfully, but while performing the delete operation I get a "java.io.EOFException" when I try to retrieve the "index" file.
I am using the following code to download the "index" file from the FTPS server:
// download index file
if (service.retrFile("INDEX", "") == service.OK) {
    try {
        ObjectInputStream objIn = new ObjectInputStream(new FileInputStream("INDEX"));
        try {
            Map<String, FileData> filesUploaded = (HashMap<String, FileData>) objIn.readObject();
        } catch (ClassNotFoundException ex) {
            ex.printStackTrace();
        }
        objIn.close();
    } catch (IOException ex) {
        ex.printStackTrace();
    }
}
Here service.OK is '0', returned when the client has successfully connected to the FTPS server, and FileData contains information about a file (its attributes).
I use the same code while performing the upload operation, where it works fine with no exception. But while performing the delete operation, when I retrieve the "index" file, I get an exception on the statement:
Map<String, FileData> filesUploaded = (HashMap<String, FileData>) objIn.readObject();
The exception is:
SEVERE: null
java.io.EOFException
at java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2298)
at java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2767)
at java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:798)
at java.io.ObjectInputStream.<init>(ObjectInputStream.java:298)
at com.pixelvault.gui.DeleteServerFilesDialog.startDeleting(DeleteServerFilesDialog.java:447)
I have checked that the FTPS server connections are properly closed after performing the corresponding operations.
I can't see where I am going wrong.
Please share your valuable suggestions; I am thankful for any suggestions that help me overcome this problem.
I am using org.apache.commons.net.ftp, and retrFile is a method I wrote for retrieving files from the server.
Here is the code for retrFile:
FTPSClient ftp;

public int retrFile(String filename, String savePath) {
    if (!connected) {
        return ERR;
    }
    FileOutputStream fout = null;
    InputStream bin = null;
    try {
        ftp.enterLocalPassiveMode();
        fout = new FileOutputStream(savePath + filename);
        bin = ftp.retrieveFileStream(filename);
        if (bin == null) {
            fout.close();
            return ERR;
        }
        byte[] b = new byte[ftp.getBufferSize()];
        int bytesRead = 0;
        while ((bytesRead = bin.read(b, 0, b.length)) != -1) {
            fout.write(b, 0, bytesRead);
        }
        ftp.completePendingCommand();
    } catch (FTPConnectionClosedException ex) {
        ex.printStackTrace();
        connected = false;
        return NOT_CONNECTED;
    } catch (IOException ex) {
        ex.printStackTrace();
        return ERR;
    } finally {
        try {
            if (fout != null) {
                fout.close();
            }
        } catch (IOException ex) {
            ex.printStackTrace();
            return ERR;
        }
        try {
            if (bin != null) {
                bin.close();
            }
        } catch (IOException ex) {
            ex.printStackTrace();
            return ERR;
        }
    }
    return OK;
}
Are you sure the INDEX file is correctly downloaded?
Is it present in the filesystem when the application is closed?
What FTP library are you using? I only know commons-net from Apache, and I don't recognize the retrFile method. Could it be threaded, so that the file is not completely downloaded when the readObject statement is executed?