Exception during file copy in Java

I have a function that copies a binary file:
public static void copyFile(String Src, String Dst) throws FileNotFoundException, IOException {
    File f1 = new File(Src);
    File f2 = new File(Dst);
    FileInputStream in = new FileInputStream(f1);
    FileOutputStream out = new FileOutputStream(f2);
    byte[] buf = new byte[1024];
    int len;
    while ((len = in.read(buf)) > 0) {
        out.write(buf, 0, len);
    }
    in.close();
    out.close();
}
and the second function:
private String copyDriverToSafeLocation(String driverPath) {
    String safeDir = System.getProperty("user.home");
    String safeLocation = safeDir + "\\my_pkcs11tmp.dll";
    try {
        Utils.copyFile(driverPath, safeLocation);
        return safeLocation;
    } catch (Exception ex) {
        System.out.println("Exception occured while copying driver: " + ex);
        return null;
    }
}
The second function is run for every driver found in the system.
The driver file is copied and I then try to initialize PKCS11 with that driver.
If initialization fails I move on to the next driver, copy it to the same temporary location, and so on.
The initialization is wrapped in a try/catch block.
After the first failure I am no longer able to copy the next driver to that location.
I get the exception:
Exception occured while copying driver: java.io.FileNotFoundException: C:\Users\Norbert\my_pkcs11tmp.dll (The process cannot access the file because it is being used by another process)
How can I avoid the exception and safely copy the driver file?
For those curious why I am copying the driver at all: PKCS11 has a nasty bug that prevents it from using drivers stored in a location that has "(" in the path ... and that is exactly the case I am facing.
I would appreciate your help.

I would move the try-catch block into the copyFile method. That way you can properly handle closing the streams (it is most likely the unclosed streams that are causing the JVM to hold onto the file handle). Something like this:
public static void copyFile(String Src, String Dst) {
    FileInputStream in = null;
    FileOutputStream out = null;
    try {
        in = new FileInputStream(new File(Src));
        out = new FileOutputStream(new File(Dst));
        byte[] buf = new byte[1024];
        int len;
        while ((len = in.read(buf)) > 0) {
            out.write(buf, 0, len);
        }
    } catch (Exception e) {
        System.out.println("Exception occured while copying driver: " + e);
    } finally {
        // Close both streams even if the copy failed, so the file handles are released.
        try {
            if (in != null) in.close();
        } catch (IOException ignored) {}
        try {
            if (out != null) out.close();
        } catch (IOException ignored) {}
    }
}
Then you can remove the try-catch from the copyDriverToSafeLocation method.

Or there's the Java 7 way, with try-with-resources:
public static void copyFile(String src, String dst) throws FileNotFoundException, IOException {
    try (FileInputStream in = new FileInputStream(new File(src))) {
        try (FileOutputStream out = new FileOutputStream(new File(dst))) {
            byte[] buf = new byte[1024];
            int len;
            while ((len = in.read(buf)) > 0) {
                out.write(buf, 0, len);
            }
        }
    }
}
Edit: And the Java 7 NIO way.
public static void copyFile(String src, String dst) throws FileNotFoundException, IOException {
    copyFile(new File(src), new File(dst));
}

public static void copyFile(File src, File dst) throws FileNotFoundException, IOException {
    try (FileInputStream in = new FileInputStream(src)) {
        try (FileOutputStream out = new FileOutputStream(dst)) {
            copyFile(in, out);
        }
    }
}

public static void copyFile(FileInputStream in, FileOutputStream out) throws IOException {
    FileChannel cin = in.getChannel();
    FileChannel cout = out.getChannel();
    cin.transferTo(0, cin.size(), cout);
}
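If you are already on Java 7 or later, java.nio.file.Files.copy can also replace the manual loop entirely. A minimal sketch (the overwrite option is my choice, not something from the snippets above):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public static void copyFile(String src, String dst) throws IOException {
    // Overwrites dst if it already exists.
    Files.copy(Paths.get(src), Paths.get(dst), StandardCopyOption.REPLACE_EXISTING);
}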

If the file is used by another process and locked, there is no generic solution for accessing it. Your best chance is to use FileLock, but it is platform-dependent; read the documentation, which says the results are "advisory", so be careful. You can also take a look at the ReentrantReadWriteLock class.
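As a rough illustration (not from the answer above), here is a minimal sketch of advisory locking with FileChannel.tryLock; the method name and its "probe whether the file can currently be locked" semantics are my own assumptions:

import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;
import java.nio.channels.FileLock;

public static boolean isLockable(String path) throws IOException {
    try (RandomAccessFile raf = new RandomAccessFile(path, "rw");
         FileChannel channel = raf.getChannel()) {
        // tryLock returns null when another program already holds the lock.
        FileLock lock = channel.tryLock();
        if (lock == null) {
            return false;
        }
        lock.release();
        return true;
    }
}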

I would choose to go with Apache Commons IO and their FileUtils.copyFile() routine(s).
Standard concise way to copy a file in Java?
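A minimal usage sketch, assuming commons-io is on the classpath (the method name copyDriver is just illustrative):

import java.io.File;
import java.io.IOException;
import org.apache.commons.io.FileUtils;

public static void copyDriver(String src, String dst) throws IOException {
    // FileUtils.copyFile handles buffering and closes its streams internally.
    FileUtils.copyFile(new File(src), new File(dst));
}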

I'm not sure why a problem with one file would prevent copying a different file. However, not closing a file when an exception occurs could definitely cause problems. Use try...finally to make sure you call close on every file you open.

Related

Compressing a file by independent chunks and then concatenating them into one valid archive

I wonder if it is possible to compress an arbitrary file (or folder, or any other file structure) by independent chunks and then get a valid archive (e.g. gzip) by concatenating them together. Some requirements:
java 8
chunks <= 16MB
folder structure does not change during the process
chunks are compressed independently, but order is preserved
each compressed chunk is appended to the end of the resulting archive
resulting archive should be valid and decompressible by any standard tool
It looks like, to achieve that, I would need to create an archive header first and then just append compressed blocks to it (https://www.rfc-editor.org/rfc/rfc1952); however, I'm not sure whether this is supported by any of the standard Java utilities or by 3rd-party libraries. Does anybody have any ideas on where to start?
Some background:
I have a client-server app which allows users to upload files to cloud storage. Communication is via a REST API; the client side is responsible for dividing files into chunks and uploading them one by one. It is possible to do the compression in the browser, but I wonder if we can move that load to the backend.
Yes. A concatenation of gzip files is a valid gzip file, per the standard (RFC 1952). gzip certainly handles this.
You are correct to be concerned that some code out there might not support it, since it is not very common to have concatenated gzip members. If you want to be super-safe, you can combine the gzip files into a single gzip member, without having to recompress. You do however need to read through all of the compressed data, effectively decompressing it in memory (which is still much faster than compressing). You can find an example of that in gzjoin.c.
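As a rough illustration of the point (not part of the original answer), here is a small Java sketch: each chunk is gzipped independently, the members are concatenated byte-for-byte, and java.util.zip.GZIPInputStream, which accepts multi-member streams on Java 8, reads the result back as a single stream:

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipConcatDemo {

    // Compress one chunk as its own gzip member.
    static byte[] gzipChunk(byte[] chunk) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(chunk);
        }
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] part1 = gzipChunk("hello ".getBytes(StandardCharsets.UTF_8));
        byte[] part2 = gzipChunk("world".getBytes(StandardCharsets.UTF_8));

        // Concatenate the two members into one archive.
        ByteArrayOutputStream archive = new ByteArrayOutputStream();
        archive.write(part1);
        archive.write(part2);

        // GZIPInputStream reads across member boundaries, so this prints "hello world".
        try (GZIPInputStream in = new GZIPInputStream(new ByteArrayInputStream(archive.toByteArray()))) {
            ByteArrayOutputStream decompressed = new ByteArrayOutputStream();
            byte[] buf = new byte[1024];
            int len;
            while ((len = in.read(buf)) != -1) {
                decompressed.write(buf, 0, len);
            }
            System.out.println(decompressed.toString("UTF-8"));
        }
    }
}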
You can try something like this for tar + gzip:
Maven dependency:
<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-compress</artifactId>
    <version>1.18</version>
</dependency>
Java code to compress into chunks:
import org.apache.commons.compress.archivers.tar.TarArchiveEntry;
import org.apache.commons.compress.archivers.tar.TarArchiveOutputStream;
import org.apache.commons.compress.compressors.gzip.GzipCompressorOutputStream;
import org.apache.commons.compress.utils.IOUtils;
import java.io.*;
import java.nio.file.Files;
import java.nio.file.Paths;

[..]

private static final int MAX_CHUNK_SIZE = 16000000;

public void compressTarGzChunks(String inputDirPath, String outputDirPath) throws Exception {
    PipedInputStream in = new PipedInputStream();
    final PipedOutputStream out = new PipedOutputStream(in);
    new Thread(() -> {
        try {
            int chunkIndex = 0;
            int n = 0;
            byte[] buffer = new byte[8192];
            do {
                String chunkFileName = String.format("archive-part%d.tar.gz", chunkIndex);
                try (OutputStream fOut = Files.newOutputStream(Paths.get(outputDirPath, chunkFileName));
                     BufferedOutputStream bOut = new BufferedOutputStream(fOut);
                     GzipCompressorOutputStream gzOut = new GzipCompressorOutputStream(bOut)) {
                    int currentChunkSize = 0;
                    if (chunkIndex > 0) {
                        gzOut.write(buffer, 0, n);
                        currentChunkSize += n;
                    }
                    while ((n = in.read(buffer)) != -1 && currentChunkSize + n < MAX_CHUNK_SIZE) {
                        gzOut.write(buffer, 0, n);
                        currentChunkSize += n;
                    }
                    chunkIndex++;
                }
            } while (n != -1);
            in.close();
        } catch (IOException e) {
            // logging and exception handling should go here
        }
    }).start();
    try (TarArchiveOutputStream tOut = new TarArchiveOutputStream(out)) {
        compressTar(tOut, inputDirPath, "");
    }
}

private static void compressTar(TarArchiveOutputStream tOut, String path, String base)
        throws IOException {
    File file = new File(path);
    String entryName = base + file.getName();
    TarArchiveEntry tarEntry = new TarArchiveEntry(file, entryName);
    tarEntry.setSize(file.length());
    tOut.putArchiveEntry(tarEntry);
    if (file.isFile()) {
        try (FileInputStream in = new FileInputStream(file)) {
            IOUtils.copy(in, tOut);
            tOut.closeArchiveEntry();
        }
    } else {
        tOut.closeArchiveEntry();
        File[] children = file.listFiles();
        if (children != null) {
            for (File child : children) {
                compressTar(tOut, child.getAbsolutePath(), entryName + "/");
            }
        }
    }
}
Java code to concatenate the chunks into a single archive:
public void concatTarGzChunks(List<InputStream> sortedTarGzChunks, String outputFile) throws IOException {
    try {
        try (FileOutputStream fos = new FileOutputStream(outputFile)) {
            for (InputStream in : sortedTarGzChunks) {
                int len;
                byte[] buf = new byte[1024 * 1024];
                while ((len = in.read(buf)) != -1) {
                    fos.write(buf, 0, len);
                }
            }
        }
    } finally {
        sortedTarGzChunks.forEach(is -> {
            try {
                is.close();
            } catch (IOException e) {
                // logging and exception handling should go here
            }
        });
    }
}

Close call in finally block throws NPE randomly

Here's a code snippet which randomly throws NullPointerException at in.close() (I know Java 8 supports try-with-resources, while this snippet uses the old style):
public static byte[] readReqFile(String filename) throws IOException {
    File f = new File(filename);
    if (!f.exists()) {
        throw new FileNotFoundException(filename);
    }
    ByteArrayOutputStream bos = new ByteArrayOutputStream((int) f.length());
    BufferedInputStream in = null;
    try {
        in = new BufferedInputStream(new FileInputStream(f));
        int buf_szie = (int) f.length();
        byte[] buffer = new byte[buf_szie];
        int len = 0;
        while (-1 != (len = in.read(buffer, 0, buf_szie))) {
            bos.write(buffer, 0, len);
        }
        return bos.toByteArray();
    } catch (IOException e) {
        e.printStackTrace();
        throw e;
    } finally {
        try {
            in.close(); // <-- throws NullPointerException at this line
        } catch (IOException e) {
            e.printStackTrace();
        }
        bos.close();
    }
}
public static void main(String[] args) throws IOException, InterruptedException {
    String folderA, folderB;
    while (/* find a file in folderA */) {
        // move the file into folderB
        readReqFile(/* the file in folderB */);
    }
}
The flow is like this: there's a program (called ProgramA below) creating files in folderA, and when this snippet finds a new file in folderA, it moves the file into folderB and reads it. I then find that this snippet randomly throws NullPointerException at in.close().
The interesting thing is that if ProgramA creates files regularly, say one every 0.1 seconds, the above code is fine and never throws an NPE after 1000 tries. If ProgramA creates files at a random interval from 0.1s to 3s, the above code throws an NPE after every 40-50 files.
My Java version is 1.8.0_111.
My guess is that maybe the JVM does something to optimize the code and hits some bug when dealing with random input, but it's just a guess. Does anyone know the reason?
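One plausible explanation, offered as an assumption rather than a confirmed diagnosis: if new FileInputStream(f) itself throws (for example because the file is still being moved or briefly held by another process at that moment), in is never assigned, the IOException is rethrown, and the finally block then calls in.close() on a null reference. A null-guarded variant of the read method, sketched below under a hypothetical name, at least avoids that particular NPE:

import java.io.BufferedInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;

public static byte[] readReqFileSafely(String filename) throws IOException {
    File f = new File(filename);
    ByteArrayOutputStream bos = new ByteArrayOutputStream((int) f.length());
    BufferedInputStream in = null;
    try {
        in = new BufferedInputStream(new FileInputStream(f));
        byte[] buffer = new byte[8192];
        int len;
        while ((len = in.read(buffer)) != -1) {
            bos.write(buffer, 0, len);
        }
        return bos.toByteArray();
    } finally {
        // Guard against the FileInputStream constructor having failed before 'in' was assigned.
        if (in != null) {
            in.close();
        }
    }
}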

Setting permissions for created directory to copy files into it

During its execution my program creates a directory which contains two sub-directories. Into one of these folders I need to copy a JAR file. My program resembles an installation routine. The copying of the JAR file is not the problem here, but rather the permissions of the created directories.
I tried to set the permissions of the directories (before actually creating them with the mkdirs() method) with File.setWritable(true, false) and also with the .setExecutable and .setReadable methods, but the access to the sub-directories is still denied.
Here's an excerpt of my code for the creation of one of the two sub-directories:
folderfile = new File("my/path/to/directory");
folderfile.setExecutable(true, false);
folderfile.setReadable(true, false);
folderfile.setWritable(true, false);
result = folderfile.mkdirs();
if (result) {
    System.out.println("Folder created.");
} else {
    JOptionPane.showMessageDialog(chooser, "Error");
}
File source = new File("src/config/TheJar.jar");
File destination = folderfile;
copyJar(source, destination);
And my "copyJar" method:
private void copyJar(File source, File dest) throws IOException {
    InputStream is = null;
    OutputStream os = null;
    try {
        is = new FileInputStream(source);
        os = new FileOutputStream(dest);
        byte[] buffer = new byte[1024];
        int length;
        while ((length = is.read(buffer)) > 0) {
            os.write(buffer, 0, length);
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    is.close();
    os.close();
}
At os = new FileOutputStream(dest); a FileNotFoundException is thrown, saying that access to the directory has been denied.
Does anyone have an idea what I am doing wrong or have a better solution for setting the permissions via Java? Thanks in advance!
A similar question was asked several years ago.
A possible solution for Java 7 and Unix systems is available here: How do I programmatically change file permissions?
Or, below the top answer there, an example with JNA.
I hope that helps you!
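For reference, a minimal sketch of the Java 7 POSIX approach referred to above (the method name and the rwxrwxrwx permission string are illustrative assumptions; this will not work on Windows, which does not expose POSIX permissions):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.attribute.PosixFilePermission;
import java.nio.file.attribute.PosixFilePermissions;
import java.util.Set;

public static void makeWorldWritable(String dir) throws IOException {
    Path path = Paths.get(dir);
    Files.createDirectories(path);
    // rwxrwxrwx - read/write/execute for owner, group and others.
    Set<PosixFilePermission> perms = PosixFilePermissions.fromString("rwxrwxrwx");
    Files.setPosixFilePermissions(path, perms);
}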
I solved the problem. In the end it was much easier to solve than expected.
The main problem was not a permission issue but the FileNotFoundException itself. The target assigned to the OutputStream was not really a file but just a directory, so the stream could not open it. You have to point the stream at an actual file inside the directory, created before initializing the OutputStream, and then copy your source file into that newly created file. The code:
private void copyJar(File source, File dest) throws IOException {
    InputStream is = null;
    File dest2 = new File(dest + "/TheJar.jar");
    dest2.createNewFile();
    OutputStream os = null;
    try {
        is = new FileInputStream(source);
        os = new FileOutputStream(dest2);
        byte[] buffer = new byte[1024];
        int length;
        while ((length = is.read(buffer)) > 0) {
            os.write(buffer, 0, length);
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    is.close();
    os.close();
}

Java: the system cannot find the file specified

I used Java to copy a file, but I got an exception (the system cannot find the file specified).
The code is:
public static void copyFile(String sourceFile, String destFile) {
    try {
        InputStream in = new FileInputStream(sourceFile);
        OutputStream os = new FileOutputStream(destFile);
        byte[] buffer = new byte[1024];
        int count;
        while ((count = in.read(buffer)) > 0) {
            os.write(buffer, 0, count);
        }
        in.close();
        os.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
The test code:
public static void main(String[] args) {
    String name = getFileName("D:/z/temp.txt");
    String target = "D:/tem.txt";
    copyFile(name, target);
}
The exception is java.io.FileNotFoundException: temp.txt (the system can not find the file specified).
The file 'temp.txt' exists.
The path is right, no problem there.
I guess it is a permissions problem. Can anyone come up with the answer? Thanks!
We need to see the method getFileName() to be sure, but based on the error message and the method name, I suspect the problem is just that this method returns only the name of the file, removing the path info, so that the file is, indeed, not found.
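A small sketch of what is probably happening, assuming getFileName() behaves like Paths.get(...).getFileName() (a hypothetical, since the method is not shown in the question): a bare file name is resolved against the working directory, not against D:/z.

import java.nio.file.Paths;

public static void main(String[] args) {
    // If getFileName() strips the directory, the copy only ever sees "temp.txt"...
    String name = Paths.get("D:/z/temp.txt").getFileName().toString();
    System.out.println(name); // prints "temp.txt"

    // ...and new FileInputStream("temp.txt") then looks in the current working
    // directory, which is why the file is reported as not found.
    // Passing the full path "D:/z/temp.txt" to copyFile avoids the problem.
}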

Java unzip from URL misses 2kb on file

I am trying to unzip a file from the internet using the following code. One of the files ("uq.class"), after it has been unzipped from the online source, is missing about 2 KB (the original file is 10,084 bytes; unzipped I get 8,261). All the other files seem to be completely fine, and when I copy the uq.class file from the zip and place it in manually, it functions perfectly. Can anyone explain what's going on and provide a fix? Below are the unzipping portions of the code.
public static File unpackArchive(URL url, File targetDir) throws IOException {
    if (!targetDir.exists()) {
        targetDir.mkdirs();
    }
    InputStream in = new BufferedInputStream(url.openStream(), 2048);
    // make sure we get the actual file
    File zip = File.createTempFile("arc", ".zip", targetDir);
    OutputStream out = new BufferedOutputStream(new FileOutputStream(zip), 2048);
    copyInputStream(in, out);
    out.close();
    return unpackArchive(zip, targetDir);
}

public static File unpackArchive(File theFile, File targetDir) throws IOException {
    if (!theFile.exists()) {
        throw new IOException(theFile.getAbsolutePath() + " does not exist");
    }
    if (!buildDirectory(targetDir)) {
        throw new IOException("Could not create directory: " + targetDir);
    }
    ZipFile zipFile = new ZipFile(theFile);
    for (Enumeration entries = zipFile.entries(); entries.hasMoreElements();) {
        ZipEntry entry = (ZipEntry) entries.nextElement();
        File file = new File(targetDir, File.separator + entry.getName());
        if (!buildDirectory(file.getParentFile())) {
            throw new IOException("Could not create directory: " + file.getParentFile());
        }
        if (!entry.isDirectory()) {
            copyInputStream(zipFile.getInputStream(entry), new BufferedOutputStream(new FileOutputStream(file), 2048));
        } else {
            if (!buildDirectory(file)) {
                throw new IOException("Could not create directory: " + file);
            }
        }
    }
    zipFile.close();
    theFile.delete();
    return theFile;
}

public static void copyInputStream(InputStream in, OutputStream out) throws IOException {
    byte[] buffer = new byte[1024];
    int len = in.read(buffer);
    while (len >= 0) {
        out.write(buffer, 0, len);
        len = in.read(buffer);
    }
    in.close();
    out.close();
}

public static boolean buildDirectory(File file) {
    return file.exists() || file.mkdirs();
}
I cannot directly see anything wrong with the code at first sight. What I would recommend, however, is closing your streams more safely. In your current implementation you close the in and out streams in the same place; close statements can throw exceptions, as can read and write statements. If any one of those fails, your files will be left open, and in time your application will run out of file descriptors. You're better off doing the closing in a finally block; that way you're sure they get closed.
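For illustration, a minimal sketch of copyInputStream with the closes moved into a finally block, so a failure while reading or writing, or while closing one stream, cannot leave the other stream open:

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public static void copyInputStream(InputStream in, OutputStream out) throws IOException {
    try {
        byte[] buffer = new byte[1024];
        int len;
        while ((len = in.read(buffer)) >= 0) {
            out.write(buffer, 0, len);
        }
    } finally {
        // Close each stream independently so a failing close of 'in' does not skip closing 'out'.
        try {
            in.close();
        } finally {
            out.close();
        }
    }
}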
I don't know why I can't sign in, but I figured out the issue. I did the whole cart-before-the-horse thing: I extracted the proper file, then extracted the old file over it, so I kept re-integrating the older file. Five hours of programming out the window. Remember, kiddies, proper programming architecture saves you A TON of headaches.
