What is the default storage location of a File object in Java?

In this code I didn't mention the path for the file hello.xls. I am reading values from it, but I don't know where it gets stored. Is it stored in the JVM memory or somewhere else? If so, what is the maximum size? I am using a Unix box.
Sample Java code:
File f = new File("hello.xls");
InputStream in = new FileInputStream(f);
If it is stored somewhere on the server, please suggest how to read values from, and write values to, the same Excel sheet without storing the file on the server.

The default storage location for a File object created with a relative pathname is the directory returned by:
System.getProperty("user.dir"); // the current directory of the user running the program, not where the program itself is located
It's the directory from which java was run, i.e. where you started the JVM.

According to the javadocs, if you don't specify a path in the File constructor, the file is assumed to be in the directory pointed to by the "system property user.dir, and is typically the directory in which the Java virtual machine was invoked."
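A quick way to see this for yourself is to print where a relative File actually points; the file need not exist, since a File is just a pathname:

```java
import java.io.File;

public class WorkingDirDemo {
    public static void main(String[] args) {
        // A relative File is resolved against user.dir (the directory the
        // JVM was launched from), not the directory holding the class/JAR.
        File f = new File("hello.xls");
        System.out.println("user.dir    : " + System.getProperty("user.dir"));
        System.out.println("resolves to : " + f.getAbsolutePath());
    }
}
```

Running this from two different directories prints two different absolute paths for the same hello.xls.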

Related

Permission issue to file written to shared drive (NAS) using Java filewriter

I have a shared drive (NAS) attached to my Linux server, and I am able to create and write to a file on it using the following Java code.
String filePath = remotePath + fileName;
BufferedWriter fileWriter = new BufferedWriter(new FileWriter(filePath));
fileWriter.write(fileContents);
fileWriter.close();
File file = new File(filePath);
file.setExecutable(true);
file.setWritable(true);
file.setReadable(true);
I have also logged the permission attributes using canExecute(), canWrite(), canRead(), and all of them are logged as true.
But the newly created file does not inherit the folder's permissions. When the user tries to access (read/delete) the files using a Linux script, it gives permission denied.
The user running the script is the folder owner, while the file shows its owner as root. Due to policy, the user doesn't have sudo rights. How can I make it accessible?
If I understand this right, your Java process is running as root. The created file is owned by the user that runs the process, which in your case is root.
I see two options for you:
Let the Java process that creates the file run as the user that owns the directory. So the files will be owned and accessible by the user.
If the Java process must run as root, then you need to change the owner of the file after it has been written. See Change file owner group under Linux with java.nio.Files for how this can be done.
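The owner change can be done with java.nio.file.Files directly; a minimal sketch, assuming a hypothetical path /mnt/nas/report.txt and hypothetical names appuser/appgroup, and note that only a privileged process may give a file away to another user:

```java
import java.io.IOException;
import java.nio.file.*;
import java.nio.file.attribute.*;

public class FixOwner {
    public static void main(String[] args) throws IOException {
        Path file = Paths.get("/mnt/nas/report.txt");   // hypothetical path
        UserPrincipalLookupService lookup =
                file.getFileSystem().getUserPrincipalLookupService();
        // Look up the target user and group by name (both names are assumptions).
        UserPrincipal owner = lookup.lookupPrincipalByName("appuser");
        GroupPrincipal group = lookup.lookupPrincipalByGroupName("appgroup");
        // Hand the file over to the directory owner.
        Files.setOwner(file, owner);
        Files.getFileAttributeView(file, PosixFileAttributeView.class)
             .setGroup(group);
    }
}
```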

Not able to move file on Linux using Java Files.move

I am trying to move files (on Linux) using Java code; this happens when trying to move a single file. The issue looks similar to
java.nio.Files.move() - DirectoryNotEmptyException on OS X
If so, how can we handle it? Should it be fixed at the Linux level or in Java?
Files.move(src, dst, StandardCopyOption.REPLACE_EXISTING);
In this, the src folder is projs/dd/output/TAR
and the dest folder is config_files/billing/xml/workArea.
The error comes even though there is no file or folder present in the target:
WARNING: FAILED_TO$Failed to get MHS user handler
java.nio.file.DirectoryNotEmptyException:
config_files/billing/xml/workArea/GBF_2017030001_000007540_00001_0000000004
at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:491)
at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262)
at java.nio.file.Files.move(Files.java:1347)
What I could see, by running stat -c "%d" "${Folder}" on Linux, is that both folders are on different file systems.
Is this the cause of the exception?
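Yes, that is the likely cause: a rename cannot cross file systems, so Files.move has to fall back to copy-and-delete, and for a non-empty directory that fallback is not supported, hence the DirectoryNotEmptyException. A sketch of a manual copy-then-delete fallback (not atomic, and file attributes are not preserved):

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.Comparator;
import java.util.stream.Stream;

public class CrossFsMove {
    // Moves a directory tree even across file systems: try a plain move
    // first, and on DirectoryNotEmptyException copy everything, then
    // delete the source. A sketch, not production code.
    static void moveTree(Path src, Path dst) throws IOException {
        try {
            Files.move(src, dst, StandardCopyOption.REPLACE_EXISTING);
        } catch (DirectoryNotEmptyException e) {
            // Copy parents before children (Files.walk is depth-first,
            // parents first).
            try (Stream<Path> paths = Files.walk(src)) {
                for (Path p : (Iterable<Path>) paths::iterator) {
                    Files.copy(p, dst.resolve(src.relativize(p)),
                               StandardCopyOption.REPLACE_EXISTING);
                }
            }
            // Delete children before parents.
            try (Stream<Path> paths = Files.walk(src)) {
                for (Path p : (Iterable<Path>)
                        paths.sorted(Comparator.reverseOrder())::iterator) {
                    Files.delete(p);
                }
            }
        }
    }
}
```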

Unable to load dependent .so file in Linux

I am new to Linux. I am trying to load an .so file in Ubuntu using Java. The file that I have specified in System.load("/home/ab/Downloads/libtesseract.so") loads fine, but its dependent .so file, placed in the same location as libtesseract.so, is not found. The error I get is an UnsatisfiedLinkError saying "liblept.so.4" could not be found. When I place liblept.so.4 in /lib, it is loaded from there. So what I understood is that it is not Java that loads the dependent .so; it has to be resolved by the OS. So I tried a simple experiment: I set the environment variable to the location of the .so file, exported the Java code into a jar, and ran the jar from that terminal (since the variable is not persistent for the entire system). It worked fine. Then I tried to do the same thing programmatically using the code below, but it is not working. Please advise. TIA
Code:
ProcessBuilder pb = new ProcessBuilder("/bin/sh");
Map<String, String> envMap = pb.environment();
envMap.put("LD_LIBRARY_PATH", "/home/ab/Downloads");
envMap.put("PATH", "/home/ab/Downloads");
Set<String> keys = envMap.keySet();
for(String key:keys)
{
System.out.println(key+" ==> "+envMap.get(key));
}
System.load("/home/ab/Downloads/libtesseract.so");
As far as I know, you can't really modify the running JVM's own environment variables in Java "on the fly"; the dynamic linker reads LD_LIBRARY_PATH only at process start. That means you should set both LD_LIBRARY_PATH and PATH before running java.
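Since the running JVM cannot change what the dynamic linker sees, one programmatic option is to re-launch the application as a child JVM whose environment does carry LD_LIBRARY_PATH. A sketch; MainApp is a hypothetical stand-in for your real main class:

```java
import java.io.IOException;

public class Relauncher {
    // Build a child JVM whose environment carries LD_LIBRARY_PATH;
    // ProcessBuilder's environment map only affects child processes.
    static ProcessBuilder relaunch(String libDir, String mainClass) {
        ProcessBuilder pb = new ProcessBuilder(
                "java", "-cp", System.getProperty("java.class.path"), mainClass);
        pb.environment().put("LD_LIBRARY_PATH", libDir);
        pb.inheritIO();
        return pb;
    }

    public static void main(String[] args)
            throws IOException, InterruptedException {
        // MainApp is hypothetical; it would call System.load(...) itself.
        Process p = relaunch("/home/ab/Downloads", "MainApp").start();
        System.exit(p.waitFor());
    }
}
```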

Flume java.lang.IllegalStateException: File has changed size since being read

I have a Java application that gathers data from different sources and writes the output into files under a specific directory.
And I have a Flume agent configured to use a spooldir source to read from that directory and write the output to Solr using MorphlineSolrSink.
The Flume agent throws the following exception:
java.lang.IllegalStateException: File has changed size since being read
Here is the configuration of the flume agent
agent02.sources = s1
agent02.sinks = solrSink
agent02.channels = ch1
agent02.channels.ch1.type = file
agent02.channels.ch1.checkpointDir=/home/flume/prod_solr_chkpoint/file-channel/checkpoint
agent02.channels.ch1.dataDirs= /home/flume/prod_solr_chkpoint/file-channel/data
agent02.sources.s1.type = spooldir
agent02.sources.s1.channels = ch1
agent02.sources.s1.spoolDir = /DataCollection/json_output/solr/
agent02.sources.s1.deserializer.maxLineLength = 100000
agent02.sinks.solrSink.type = org.apache.flume.sink.solr.morphline.MorphlineSolrSink
agent02.sinks.solrSink.channel = ch1
agent02.sinks.solrSink.batchSize = 10000
agent02.sinks.solrSink.batchDurationMillis = 10000
agent02.sinks.solrSink.morphlineFile = morphlines.conf
agent02.sinks.solrSink.morphlineId = morphline
What I understand from the exception is that the Flume agent starts working on a file before the Java application has finished writing it.
How can I fix this problem?
Edit
I don't know whether this information is valuable or not.
These configurations were working before without any problem. We faced a hard disk failure on the machine we run Flume from; after recovering from that failure, Flume throws this exception.
As stated in the documentation regarding the Spooling Directory Source:
In exchange for this reliability, only immutable, uniquely-named files
must be dropped into the spooling directory. Flume tries to detect
these problem conditions and will fail loudly if they are violated:
If a file is written to after being placed into the spooling directory, Flume will print an error to its log file and stop
processing.
If a file name is reused at a later time, Flume will print an error to its log file and stop processing.
I'd suggest your Java application dump buckets of data into temporary files, named with their creation timestamp. Once a bucket is full (i.e. a certain size is reached), move the file to the spooling directory.
Write the source file to another directory, then move (mv command) the files to the spool source directory; on the same file system mv is an atomic rename, so Flume never sees a partially written file. Don't use the copy command.
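The same write-then-rename idea can be sketched in Java as well, assuming hypothetical staging and spool paths; both directories must be on the same file system for ATOMIC_MOVE to succeed:

```java
import java.io.IOException;
import java.nio.file.*;

public class SpoolWriter {
    // Write the complete file in a staging directory, then publish it to
    // the spool directory with a single atomic rename, so the spooldir
    // source never sees a half-written file.
    static Path publish(Path stagingDir, Path spoolDir, String name,
                        byte[] data) throws IOException {
        Path tmp = stagingDir.resolve(name);
        Files.write(tmp, data);
        return Files.move(tmp, spoolDir.resolve(name),
                          StandardCopyOption.ATOMIC_MOVE);
    }

    public static void main(String[] args) throws IOException {
        // Unique, timestamped name, per the spooldir source's requirements.
        String name = "bucket-" + System.currentTimeMillis() + ".json";
        publish(Paths.get("/data/staging"),                       // assumed
                Paths.get("/DataCollection/json_output/solr"),
                name, "{\"k\":\"v\"}".getBytes());
    }
}
```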

Problems with FileOutputStream

Why does this code cause an error: access denied?
public void armazenaPerfil() throws FileNotFoundException, IOException {
    FileOutputStream out = new FileOutputStream(this.login + "_perfil.mbk");
    ObjectOutputStream objOut = new ObjectOutputStream(out);
    objOut.writeObject(this);
    System.out.println("Escrevi!");
    objOut.close();
}
The error message:
ric93_perfil.mbk (Access denied)
java.io.FileNotFoundException
at java.io.FileOutputStream.open(Native Method)
at java.io.FileOutputStream.<init>(FileOutputStream.java:212)
at java.io.FileOutputStream.<init>(FileOutputStream.java:104)
at br.uefs.ecomp.myBook.model.Perfil.armazenaPerfil(Unknown Source)
Access denied problems are the operating system saying "you are not allowed to write that": an OS-level access control / permissions issue is preventing you from reading or writing the file at the specified location.
When you write a file using a relative pathname, the JVM will attempt to write it in a location relative to the running application's current working directory. What directory that will be depends on how the JVM is launched, but if you launch from a command prompt using the java command, it will be the command shell's current directory.
You can find out what the current directory actually is using the one-liner suggested by Brendan Long:
System.out.println(new File(pathname).getAbsolutePath());
where pathname is the pathname of the file you were trying to read or write. Note that this doesn't actually check that the pathname refers to an existing file, or tell you that you should be able to create or open the file. It merely tells you what the absolute pathname for the file would be.
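If the absolute path turns out to be in a directory you cannot write to, one fix is to resolve the file against a directory you do control, for example user.home. A sketch; the serialized string is a stand-in for the Perfil instance:

```java
import java.io.*;

public class SafeSave {
    public static void main(String[] args) throws IOException {
        // Resolve against user.home instead of relying on whatever the
        // JVM's working directory happens to be.
        File target = new File(System.getProperty("user.home"),
                               "ric93_perfil.mbk");
        try (ObjectOutputStream out =
                     new ObjectOutputStream(new FileOutputStream(target))) {
            out.writeObject("perfil placeholder"); // stand-in for `this`
        }
        System.out.println("Wrote " + target.getAbsolutePath());
    }
}
```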
