I can't seem to figure out why I can't create a directory from a Java program FOR A SPECIFIC LOCATION.
The specifics: a folder on my computer is shared on the network.
Code:
File xmlDirectory = new File(sXMLOutputPath);
/*
 * TODO: if multiple threads arrive at the 1st if, it will evaluate to true;
 * the 1st thread would then create the directory, and the 2nd, being .01 sec later,
 * would fail to create the directory and throw an exception.
 * SOLUTION: Provide an additional exists() check, so that the 2nd thread will
 * recognize that the directory was already created.
 */
if (!xmlDirectory.exists()) {
    if (!xmlDirectory.mkdirs()) {
        if (!xmlDirectory.exists()) {
            throw new BillAdminException("Failed to create xml directory: " +
                    sXMLOutputPath);
        }
    }
}
This is server-side code.
In summary: if I share my folder C:\folder\etc and pass it as a JVM option to the program, the server-side program "appends" \xml\333\333.xml to it and is supposed to create that xml file on MY PC. First it creates the structure C:\folder\etc\xml\333\, and then it creates 333.xml. It fails creating C:\folder\etc\xml\333 if C:\folder\etc is passed as a shared location in the form "\\myMachine\etc", but works OK if I make that structure on some other machine, "\\OtherMachine\etc". If I pass it as "C:\folder\etc" (absolute, not shared form) it works fine, creating the directory and file on the server machine where the code is executed. I need it created on my machine (the client). What am I doing wrong when sharing the folder?
P.S. - this functionality worked about 2 months ago. However, the folder properties could have been tampered with since then. Not the Java code, though.
P.S. 2: this is not the only shared folder I pass via JVM options. There are 2 others, but they are only used for reading (not for creating subfolders).
Thanks for the help.
P.S. 3: The error I get is:
Failed to create xml directory: \\myMachine\etc\xml/333/
What smells fishy to me is that the slash before "333" is reversed. But there were no changes in the code, so the same thing would have been happening before.
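For reference, a minimal sketch of how the path could be built so the separators stay consistent (the directory and file names here are placeholders for the ones above, not my actual code); passing parent and child to File separately lets Java normalize the separator instead of concatenating "\" and "/" by hand:
// Sketch only - placeholder values, not the production code.
// File(parent, child) normalizes the separator, so the UNC base
// "\\myMachine\etc" and the appended parts don't end up mixing "\" and "/".
File xmlDirectory = new File(new File(sXMLOutputPath, "xml"), "333");
if (!xmlDirectory.exists() && !xmlDirectory.mkdirs() && !xmlDirectory.exists()) {
    throw new BillAdminException("Failed to create xml directory: "
            + xmlDirectory.getAbsolutePath());
}
File xmlFile = new File(xmlDirectory, "333.xml");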
I want to change the creation time of a directory in java (Linux).
Files are no problem:
Files.createFile(file.toPath());
Files.setAttribute(file.toPath(), "basic:lastModifiedTime", FileTime.fromMillis(creationDate), NOFOLLOW_LINKS);
Files.setAttribute(file.toPath(), "basic:creationTime", FileTime.fromMillis(creationDate), NOFOLLOW_LINKS);
That works.
But if I try to set this when creating a directory, nothing happens:
Files.createDirectory(file.toPath());
Files.setAttribute(file.toPath(), "basic:lastModifiedTime", FileTime.fromMillis(creationDate), NOFOLLOW_LINKS);
Files.setAttribute(file.toPath(), "basic:creationTime", FileTime.fromMillis(creationDate), NOFOLLOW_LINKS);
What could be the problem, or is there a better approach? (I use JDK 8)
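For reference, a minimal sketch of an alternative I could try, using BasicFileAttributeView.setTimes to set all three timestamps in one call (the path and millis value below are placeholders); as far as I know, whether basic:creationTime is actually honored depends on the file system and the JDK's provider, which may be the real issue here:
import static java.nio.file.LinkOption.NOFOLLOW_LINKS;

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.attribute.BasicFileAttributeView;
import java.nio.file.attribute.FileTime;

public class DirCreationTime {
    public static void main(String[] args) throws IOException {
        long creationDate = 1_400_000_000_000L; // placeholder millis value
        Path dir = Paths.get("testdir");        // placeholder directory
        Files.createDirectory(dir);

        FileTime time = FileTime.fromMillis(creationDate);
        BasicFileAttributeView view =
                Files.getFileAttributeView(dir, BasicFileAttributeView.class, NOFOLLOW_LINKS);
        // setTimes(lastModifiedTime, lastAccessTime, createTime); null leaves a value untouched.
        // Whether createTime is honored depends on the file system/provider (often not on Linux).
        view.setTimes(time, null, time);

        System.out.println(Files.readAttributes(dir, "basic:creationTime,basic:lastModifiedTime"));
    }
}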
Little background for context:
The application I support allows third parties to develop plugins that can leverage some of our functionality. We hand them our "externalAPI.jar"; they put it in their project, implement some interfaces, and build their own APK. We find the would-be plugin by asking the package manager for all installed applications and checking whether each has a "pluginclass.xml" in its assets directory. If it has that XML file, we expect its contents to be the canonical name of a class that implements our ExternalPluginVX interface, and using a new PathClassLoader(ApplicationInfo.sourceDir, this.getClass().getClassLoader()), we load the class, create a new instance, and start using it.
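Roughly, the discovery and loading looks like the following sketch (simplified, not our production code; error handling is collapsed, and readClassNameFrom is a hypothetical helper that extracts the class name from the XML contents):
// Simplified sketch of the discovery/loading flow described above; not the production code.
// readClassNameFrom is a hypothetical helper that parses assets/pluginclass.xml.
private void loadPlugins(Context context) {
    for (ApplicationInfo app : context.getPackageManager().getInstalledApplications(0)) {
        try {
            Context pluginContext = context.createPackageContext(app.packageName, 0);
            // By convention the asset contains the canonical name of the plugin class.
            InputStream in = pluginContext.getAssets().open("pluginclass.xml");
            String pluginClassName = readClassNameFrom(in);
            in.close();

            PathClassLoader loader =
                    new PathClassLoader(app.sourceDir, this.getClass().getClassLoader());
            Class<?> clazz = Class.forName(pluginClassName, true, loader);
            ExternalPluginVX plugin = (ExternalPluginVX) clazz.newInstance();
            // ... hand the plugin instance to the rest of the application ...
        } catch (Exception e) {
            // No pluginclass.xml, or the class could not be loaded - not a usable plugin.
        }
    }
}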
The problem:
Sometimes third parties will put
compile files ("./libs/externalAPI.jar")
in their gradle files instead of the correct syntax:
provided files ("./libs/externalAPI.jar")
The result, of course, is that things don't work properly. Sometimes they almost work, but then behave unpredictably - usually involving vicious crashes. Notably, since their APK is well-formed in its own right and the XML file is there, we'll see the plugin, load the target class successfully, instantiate it successfully, and things go haywire from there when they try to reference back to us.
The question:
Is there a way for my application to check at runtime if the other application compiled our API classes into their APK instead of using provided files like they should have?
A viable solution is to use a DexFile.
Since I already have the ApplicationInfo.sourceDir, I can just construct a DexFile and iterate through its contents.
//this variable's value assigned by iterating through context.getPackageManager().getInstalledApplications(0)
ApplicationInfo pkg;
String interfaceTheyShouldntHave = ExternalPluginVX.class.getCanonicalName(); //"com.project.external.ExternalPluginVX"
DexFile dexFile = new DexFile(pkg.sourceDir);
Enumeration<String> entries = dexFile.entries();
while (entries.hasMoreElements()) {
    String entry = entries.nextElement();
    if (entry.equals(interfaceTheyShouldntHave)) {
        Toast.makeText(ctxt, "Plugin \"" + pluginName + "\" could not be loaded. Please use 'provided files' instead of 'compile files'", Toast.LENGTH_LONG).show();
        return;
    }
}
I am generating code starting from two related metamodels. The main one has references to classes of the second one. The Acceleo execution works well when executed as an Acceleo plugin, but not when executed as a Java application. If I start the Java main class, data from the 2nd related metamodel is not visible.
The error I get is
org.eclipse.acceleo.engine.AcceleoEvaluationException: Unresolved compilation error in generation module
Here is a snippet from debug mode; target is a reference to a class of the 2nd metamodel (named peersbehavior).
---- The URI is correct, it's pointing to the exact location ----
---- But values are not retrieved ----
I had a similar problem with an ATL Model2Model transformation: the option "Allow inter-model reference" must be checked. But I can't find anything similar in Acceleo.
[EDIT]
As pointed out by the standalone documentation, I added these two lines of code to the Java class:
public void registerResourceFactories(ResourceSet resourceSet)
{
    super.registerResourceFactories(resourceSet);
    // code added by me
    resourceSet.getResourceFactoryRegistry().getExtensionToFactoryMap().put("systembehavior", new XMIResourceFactoryImpl());
    resourceSet.getResourceFactoryRegistry().getExtensionToFactoryMap().put("peerbehavior", new XMIResourceFactoryImpl());
}
Now it also works when starting the Java class. But if I export the project as a JAR and try to use it in another project, I have the same problem as before.
I solved this problem by adding this code (as noted in the [EDIT] section of my question):
public void registerResourceFactories(ResourceSet resourceSet)
{
    super.registerResourceFactories(resourceSet);
    // code added by me
    resourceSet.getResourceFactoryRegistry().getExtensionToFactoryMap().put("systembehavior", new XMIResourceFactoryImpl());
    resourceSet.getResourceFactoryRegistry().getExtensionToFactoryMap().put("peerbehavior", new XMIResourceFactoryImpl());
}
and by manually adding the compiled .emtl files to the src dir (otherwise they will not be included in the .jar).
With these modifications, the code generation works when executed as a Java application. Running the transformation as an Acceleo application doesn't work, though.
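For anyone hitting a related "package not registered" problem in a fully standalone run (plain Java or from the exported JAR), it may also be necessary to register the metamodel EPackages themselves. A minimal sketch, assuming the generated launcher exposes a registerPackages hook and that SystembehaviorPackage / PeerbehaviorPackage are hypothetical names for the generated code of my two metamodels:
@Override
public void registerPackages(ResourceSet resourceSet)
{
    super.registerPackages(resourceSet);
    // Sketch - SystembehaviorPackage / PeerbehaviorPackage stand in for the generated
    // metamodel code; they must be on the classpath of the standalone run.
    resourceSet.getPackageRegistry().put(SystembehaviorPackage.eNS_URI, SystembehaviorPackage.eINSTANCE);
    resourceSet.getPackageRegistry().put(PeerbehaviorPackage.eNS_URI, PeerbehaviorPackage.eINSTANCE);
}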
I'm working on a project for my thesis in Java which requires automatic RMI generation from an Abstract Syntax Tree. I'm using RMI as follows:
public int createProcess(CompilationUnit cu) {
    // Some Code Here
    return processid;
}
for generating RMI from the AST on each node. It automatically generates the interface file and all the Java files from the AST and puts all the methods in these files. I am able to execute the javac, rmic <remote-class>, and rmiregistry commands using ProcessBuilder. But:
How do I destroy and unbind the remote objects after process completion? Do I have to put this code at the end of each file where control exits?
public void exit() throws RemoteException
{
    try {
        // Unregister ourself
        Naming.unbind(mServerName);
        // Unexport; this will also remove us from the RMI runtime
        UnicastRemoteObject.unexportObject(this, true);
    }
    catch (Exception e) {}
}
Do I have to execute rmiregistry after every remote method/class creation, or will it automatically add the later remote methods/classes to the registry if it is already running (i.e. if ProcessBuilder is already executing the "rmiregistry" command)? For example, if nodeA creates a Process1 (an RMI class) on nodeB and then executes it via ProcessBuilder, rmiregistry will be running. Now if nodeA wants to create another Process2 on nodeB, do I have to stop that instance of rmiregistry and rerun it, or is there no need, because the registry will detect and add new bindings automatically?
Will all the RMI objects run on the same port? That is, if I create process1 and bind it to localhost/process1, and process2 to localhost/process2, can we access them via the same port?
I'm working with RMI for the first time, so I don't have any previous experience or knowledge.
Apologies, my question seemed unclear, so I tried to add more explanation by editing.
Following this tutorial Link
1. How do I destroy and unbind the remote objects after process completion?
See 2, but I don't know why you want to do so. Just leave them in existence and bound to the Registry.
2. Do I have to put this code at the end of each file where control exits?
Yes, if you want it to execute, otherwise no. Don't generate empty catch-blocks though.
3. Do I have to execute rmiregistry after every remote object creation?
No, you have to start it once, at the beginning of the containing process. The simplest way is via LocateRegistry.createRegistry().
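A minimal sketch of that life cycle, under the assumptions of a single in-process registry on the default port 1099 and a hypothetical generated remote class ProcessImpl that extends UnicastRemoteObject:
import java.rmi.registry.LocateRegistry;
import java.rmi.registry.Registry;
import java.rmi.server.UnicastRemoteObject;

public class Server {
    public static void main(String[] args) throws Exception {
        // Start the registry once, inside this JVM, on the default RMI port (1099).
        Registry registry = LocateRegistry.createRegistry(Registry.REGISTRY_PORT);

        // Later bindings go into the same running registry; no restart is needed.
        ProcessImpl process1 = new ProcessImpl(); // hypothetical generated remote class
        ProcessImpl process2 = new ProcessImpl();
        registry.rebind("process1", process1);
        registry.rebind("process2", process2);

        // ... remote calls happen here ...

        // Clean shutdown: unbind, unexport the objects, then unexport the registry
        // so nothing keeps the JVM alive.
        registry.unbind("process1");
        registry.unbind("process2");
        UnicastRemoteObject.unexportObject(process1, true);
        UnicastRemoteObject.unexportObject(process2, true);
        UnicastRemoteObject.unexportObject(registry, true);
    }
}
Both objects are bound into the one registry that is already running; creating and binding Process2 later does not require restarting rmiregistry.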
Is there a way to convert a JAR library into a standalone JAR?
I need to find a standalone Java executable that converts PDF into TIFF, and I've found these JARs: http://www.icefaces.org/JForum/posts/list/17504.page
Any ideas?
Easiest might be to create another JAR with a main() entry point, and then just use the java.exe executable to run it:
e.g.
> java.exe -cp MyJarMain.jar;MyPDFJar.jar com.mydomain.MyMain myPDF.pdf
Where MyMain is a class with a static main method.
You'll need something with a main entry point to pass in and interpret some command line arguments (myPDF.pdf in my made-up example).
You could do an assembly (are you using Maven?) and make sure the Main-Class entry in MANIFEST.MF points to the main class.
Since there is no main method, you have to write one, or write a whole new class to call the class/method TiffConver.convertPDF.
The question is how you're going to use it. From the command line, you need no executable JAR. From the GUI, maybe you want to pass a file to be converted by drag and drop? Then you should take the parameter(s) passed to main as input PDF names (if they end in .pdf) and pass the names iteratively to TiffConverter; for "a.pdf b.pdf" =>
TiffConver.convertPDF ("a.pdf", "a.tiff");
TiffConver.convertPDF ("b.pdf", "b.tiff");
TiffConverter will silently overwrite existing TIFFs, so check for that beforehand or change the code there - this is clearly a bad habit, so look out for more such things; I didn't.
/*
 * Remove target file if exists
 */
File f = new File(tif);
if (f.exists()) {
    f.delete();
}
Maybe you want to write a Swing wrapper that lets you choose the files to be converted interactively. This would be a nice idea if no filename is given.
If the user passes "a.pdf xy.tiff", you could rename the converted file to xy.tiff as an additional feature.
Without a main class, however, a standalone JAR would be magic.
However, building a native executable is almost always a bad idea. You lose portability, and you don't profit from security and performance improvements to the JVM or from fixed bugs. For multiple programs you always need an independent bugfix, which you might have to manage yourself if you don't have a package manager, as most Linux distros do.
After clearing up some questions:
public static void main(String[] args) {
    if (args.length == 1 && args[0].endsWith(".pdf")) {
        String target = args[0].replaceAll("\\.pdf$", ".tif");
        convertPDF(args[0], target);
    }
}
Put this method into TiffConvert. It will allow you to convert a single PDF file and generate a TIFF file with the same basename but ending in .tif, silently overwriting an existing one of the same name.
I guess you now need to know how to start it?