Finding classpath of class invoking a library - java

I am using Google Reflections in a Java library I am developing.
The reason I use Reflections is that I want to find all the classes with a particular annotation.
Simplifying things, my library has a method returning those classes, which is invoked with a line like:
//this method uses the Reflections library
Repository.getDefault().getMyAnnotatedClasses()
In the current version of my library, I require the user to explicitly add the package name where the Reflections library needs to look for the annotated classes:
Repository.getDefault().addSearchPath("...");
In this way Reflections will look for classes located only in that package.
If the user of my library does not add this search path, I configure Reflections to search in all the classes in the system class loader. Obviously this solution is quite inefficient. However, I really want to get rid of the requirement of asking the user to ALWAYS set the search path.
- A side note in case it is important: in Reflections you can configure a search path with a package name, a URL (the classpath location where the classes can be found), or a class loader.
So my question is: is there a way, from my library code, to find the classpath of the class invoking my library?
That way, if the user has not explicitly set a search location, I could add the caller's location by default, which seems better than falling back to the entire system class loader as the search path.
I know I could manually inspect the method call stack from my library code, but this feels like a bit of a hacky, dirty solution, so I am looking for alternative ideas.
Thanks in advance.

You can find the caller of a method by analyzing the stack trace. Although this is not a good idea in general IMHO, as it breaks encapsulation and is sensitive to managed environments (proxies and so on), it will work. Look for more details in this question.
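If you do go the stack-trace route, a minimal sketch might look like the following. Note that the index into the stack trace is fragile and depends on how your public API is layered, which is part of why this is considered a hack:
import java.net.URL;

public class CallerLocator {

    // Returns the code-source URL (JAR or classes directory) of the class that
    // called into the library. Reflections can accept a URL as a search path.
    public static URL callerLocation() throws ClassNotFoundException {
        StackTraceElement[] stack = Thread.currentThread().getStackTrace();
        // stack[0] = Thread.getStackTrace, stack[1] = callerLocation,
        // stack[2] = the library method that called us, stack[3] = the user's class.
        // Adjust the index to match your own call chain.
        String callerClassName = stack[3].getClassName();
        Class<?> caller = Class.forName(callerClassName);
        // getCodeSource() can be null for classes loaded by the bootstrap loader.
        return caller.getProtectionDomain().getCodeSource().getLocation();
    }
}
You could then pass that URL to Reflections as the search path instead of scanning everything reachable from the system class loader.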
What else you can do, to avoid the inefficiency of scanning the whole classpath every time, is to scan and save the metadata once at compile time with Reflections, and then collect it at bootstrap time.
To scan and save once as XML:
new Reflections(...).save([somepath]);
And then collect it without scanning:
Reflections reflections = Reflections.collect([somepath]);
If you're using Maven, you can automate it and do it as part of every build using the Reflections-Maven plugin.
Look for more info under "collect pre-scanned metadata" on the Reflections UseCases wiki page.

Related

Is there a way to use annotations to build a java class, based on the properties of multiple java classes?

I want to create a generic log history table for all operations and entities in a Spring Data JPA project. For this, I was wondering whether it would be possible to get all the properties of my entities at compile time to generate this generic log entity class.
I don't know much about annotations, but I know they can be used to generate source files, so I believe this isn't an impossible idea.
Could someone give me some direction? If it's possible, it would be nice to point me to a good starting point, or to something already available that matches my intent.
Annotations themselves do not generate source files -- they mark points for other tools to enhance/enrich the annotated code, or act as marker interfaces.
However, you can definitely use an annotation scanner to scan files and get all the fields.
Then what is left is generating a class from this.
(and then compile it). Be aware that this is a multi-step process, and it may seem a bit clunky: you create a file named GenericEntity, make sure it's in the proper package (so start it with package my.fun.project), write the imports, and write the Java class, all as strings which you send to the file.
From your scan you have an annotated field / class, and you can get its type and name (see the Reflections library if necessary) and write that to your file as well. Then close the class properly with a }. Now you should have a file which does not give compilation errors when loaded in your IDE.
This GenericEntityGenerator then has to be executed (using a maven plugin, probably) on your source code, probably during the generate-sources phase, after which your generated class will be compiled during the compile phase.... and bob's your uncle now.
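As a rough sketch of what such a generator might look like (the package my.fun.project comes from the example above; the field map and everything else here is hypothetical, and the annotation scan is assumed to have happened already):
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Map;

// Minimal sketch: writes a GenericEntity.java source file from a map of
// field names to type names gathered by your annotation scan.
public class GenericEntityGenerator {

    public static void generate(Path outputDir, Map<String, String> fields) throws IOException {
        StringBuilder src = new StringBuilder();
        src.append("package my.fun.project;\n\n");   // hypothetical package
        src.append("public class GenericEntity {\n");
        for (Map.Entry<String, String> field : fields.entrySet()) {
            // e.g. "    private java.lang.String customerName;"
            src.append("    private ").append(field.getValue())
               .append(" ").append(field.getKey()).append(";\n");
        }
        src.append("}\n");                           // close the class properly

        Path target = outputDir.resolve("my/fun/project/GenericEntity.java");
        Files.createDirectories(target.getParent());
        Files.write(target, src.toString().getBytes(StandardCharsets.UTF_8));
    }
}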
In all, a fun project

Two JARs on buildpath with identical method names but different constructors. How can I specify which JAR's method to use?

I am building a tool from several different open source libraries. My buildpath is in the following order:
My first JAR file, stanford-corenlp-3.3.0.jar, contains a package called edu.stanford.nlp.process, which has the Morphology class.
My second JAR file, ark-tweet-nlp-0.3.2.jar, contains an identical package name (edu.stanford.nlp.process) and an identical class name, Morphology.
In both JARs, inside their respective Morphology classes there exists a method called stem(). However, the signatures of these methods are different. I want to use the stem(String, String) method from my second JAR file, but since the import statement (import edu.stanford.nlp.process.Morphology;) does not specify which JAR to use, I get an error because the class is resolved from the first JAR on the buildpath.
I don't want to change the order of my buildpath since it would throw off my other method calls.
How can I specify which JAR's Morphology class to use? Is there an import statement that specifies the JAR, along with the package.class?
EDIT: What about a way to combine my two JARs so that the two Morphology classes merge, giving me a single class with both stem() methods and their different signatures?
As several others pointed out above, it is possible to tweak Java's classloader mechanism to load classes from certain places… but this is not what you are looking for, believe me.
You hit a known problem. Instead of worrying how to tell Java to use a class from one JAR and not from the other, you should consider using a different version of ArkTweet.
Fetch the ArkTweet JAR from Maven Central. It does not contain Stanford classes.
When you notice that people package third-party classes in their JARs, I'd recommend pointing out to them that this is generally not a good idea and to encourage them to refrain from doing so. If a project provides a runnable fat-jar including all dependencies, that is fine. But, it should not be the only JAR they provide. A plain JAR or set of JARs without any third-party code should also be offered. In the rare cases that third-party code was modified and must be included, it should be done under the package namespace of the provider, not of the original third-party.
Finally, for real solutions to building modular Java applications and handling classloader isolation, check out one of the several OSGi implementations or project Jigsaw.
The default ClassLoader will only load the class from the first JAR on the classpath and ignore the copy in the second one, so this can't be done out of the box. Maybe a custom ClassLoader can help.
For more info about ClassLoaders start from here.
Good luck!
EDIT: We are looking at some horrible packaging choices causing this JAR hell as a side effect. The author of this "Ark Twitter" library decided it was a good idea to release a JAR artifact that includes a third-party library (the Stanford NLP library). This leads to unnecessarily tight coupling between Ark Twitter and the specific version of the Stanford NLP library used by it. This is a very bad practice that should be discouraged in any case: it violates the whole idea of transitive dependencies.
EDIT (continued): One possible (and hopefully working) solution is to rebuild the Ark Twitter JAR so that it does not include the aforementioned library but only its own code (basically just the cmu.arktweetnlp package), hoping that the version of the Stanford NLP library required by your project works with Ark Twitter. Ideally you should submit a pull request to the author of the library, but in the meantime you can get away with un-jarring and re-jarring the existing JAR file.
EDIT 2: Looking at the JAR file again, it's much worse than I originally thought: ALL the dependencies are repackaged in the released JAR file. This is really the worst possible way to release a library. Good luck.
I think your problem can be solved simply by using the lemma(String word, String tag) method in the current CoreNLP's Morphology class:
// morphology is an instance of edu.stanford.nlp.process.Morphology
String word = ...;
String tag = ...;
String lemma = morphology.lemma(word, tag);
WordTag wt = new WordTag(lemma, tag);
When the class was revised a couple of years ago, the method you're looking for was deleted. The feeling was that with most of the Stanford NLP code moving to using CoreLabels, methods that return WordTag are less useful (though deleting all such methods is still a work in progress).
No, there isn't. This is a weakness of Java that cannot easily be solved. You should use only one of the libraries; having both on the classpath will make Java always select the first one.
This problem is known as JAR hell.
The order in the buildpath generally determines the order in which the classloader will search for the class. In general, though, you don't want duplicates of the same class in your build path--and it sure doesn't seem like ark-tweet-nlp-0.3.2.jar should have an edu.stanford package within it.
When you load a class, it's loaded at a given address, and that address is then placed in the header of objects created from the class, so that (among other things) the methods in the class can be located.
So if you somehow load ClassA, with method abc(String), from zip file XYZ.zip, that loads into address 12345. Then (using a class loader trick) you load another ClassA, with method abc(String, String), from zip file ZYX.zip, and that loads into address 67890.
Now create an instance of the first ClassA. In its header will be the class address 12345. If you could somehow attempt to invoke the method abc(String, String) on that instance, that method would not be found in the class at 12345. (In actuality, you will not even be able to attempt the call, since the verifier will stop you: to it, the two classes are entirely different, and you're trying to use one where the other is called for, just as if their names were entirely different.)
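If, despite the warnings above, you want to experiment with the class-loader route, here is a minimal sketch. The JAR path is hypothetical, and it assumes the ark-tweet fat JAR contains everything its Morphology needs, that Morphology has a no-argument constructor, and that stem(String, String) is the method described in the question:
import java.io.File;
import java.lang.reflect.Method;
import java.net.URL;
import java.net.URLClassLoader;

public class IsolatedMorphologyDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical location of the second JAR.
        File arkJar = new File("lib/ark-tweet-nlp-0.3.2.jar");

        // Parent = null keeps the application classpath (and the first JAR's
        // Morphology) out of the lookup, so the class resolves from this JAR only.
        try (URLClassLoader isolated =
                 new URLClassLoader(new URL[] { arkJar.toURI().toURL() }, null)) {

            Class<?> morphologyClass =
                Class.forName("edu.stanford.nlp.process.Morphology", true, isolated);

            // Reflective calls only: the Morphology type imported in your code
            // belongs to a different class loader and cannot be used as a cast target.
            Object morphology = morphologyClass.getDeclaredConstructor().newInstance();
            Method stem = morphologyClass.getMethod("stem", String.class, String.class);
            Object result = stem.invoke(morphology, "running", "VBG"); // hypothetical word/tag
            System.out.println(result);
        }
    }
}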

Resolve implementation of an API dynamically

We have a system where we want the implementations of our interfaces to be consumed from a separate JAR. The scenario is that clients consume our work and provide their own implementation to override the default implementation.
The question is: what is the best way to bind/wire the actual implementation classes into our system?
One way is to let Spring wire the dependencies. That is currently not an option, since not all clients are using Spring.
I looked into some options like resolving interface implementation classes using reflection. Not very happy with that solution.
Another good old option is to configure the class name in a property and let clients set it. It looks good.
But I wanted to find a more elegant option if one is available.
Also, any idea how SLF4J / EL resolve their implementations automatically?
I'd suggest you use SPI (Service Provider Interface).
It requires creating a file that enumerates all available implementations of a specific service. This may be annoying. Fortunately, you can use this open source library that does the work for you: http://code.google.com/p/spi/
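For reference, here is a minimal sketch of the plain java.util.ServiceLoader side of SPI (the PaymentProvider interface is a hypothetical stand-in for your own API):
import java.util.ServiceLoader;

// Hypothetical service interface; in practice this lives in your API JAR.
interface PaymentProvider {
    void pay(String account, long amountCents);
}

public class ProviderLookup {
    public static void main(String[] args) {
        // Each client JAR ships a file named META-INF/services/PaymentProvider
        // (the fully qualified interface name) listing its implementation class.
        ServiceLoader<PaymentProvider> loader = ServiceLoader.load(PaymentProvider.class);
        for (PaymentProvider provider : loader) {
            System.out.println("Found implementation: " + provider.getClass().getName());
        }
    }
}
JDBC 4 drivers, for example, are discovered the same way via META-INF/services/java.sql.Driver.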
Perhaps the Reflections library is what you are looking for.
Reflections scans your classpath, indexes the metadata, allows you to query it on runtime and may save and collect that information for many modules within your project.
Using Reflections you can query your metadata such as:
get all subtypes of some type
get all types/methods/fields annotated with some annotation, optionally with annotation parameters matching
get all resources matching a regular expression
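A minimal sketch of that kind of query (the package name is a placeholder, and the annotation is defined inline just for illustration):
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.util.Set;
import org.reflections.Reflections;

public class ReflectionsQueryDemo {

    // Hypothetical marker annotation; in practice this would be your own annotation.
    @Retention(RetentionPolicy.RUNTIME)
    @interface MyAnnotation {}

    public static void main(String[] args) {
        // Scan a single package instead of the whole classpath.
        Reflections reflections = new Reflections("com.example.myapp");

        // Query the pre-built index.
        Set<Class<?>> annotated = reflections.getTypesAnnotatedWith(MyAnnotation.class);
        Set<Class<? extends Runnable>> runnables = reflections.getSubTypesOf(Runnable.class);

        System.out.println(annotated);
        System.out.println(runnables);
    }
}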

Is there a way to get all the classes that implement a certain method?

The title speaks for itself. The language is Java.
Yes, there is. It is, however, tedious and expensive work. You need to crawl through all class files and JAR files with the help of ClassLoader#getResources() and a bit of java.io.File, load each class with the help of Class#forName(), and finally check whether the method is there with Class#getMethod().
However, there are 3rd-party APIs which can take the tedious work off your hands, but it is still expensive, because loading a class causes its static initializers to be executed.
A cleaner way is to make use of annotations: annotate the methods in question and then use a library which searches for classes/methods/fields based on annotations, such as Google Reflections.
On the other hand, if the entire package name or the JAR file name is known beforehand, then the work will be less tedious and expensive (no need to do stuff recursively, nor to load all of the classes on the entire classpath).
Update: I remember I once wrote sample code to achieve something like that; you can find it here. It's a good starting point, you only need to change it a bit to check for the method.
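As a rough sketch of that approach for the case where the JAR file is known beforehand (the JAR path and method name below are hypothetical). Using Class.forName with initialize=false also sidesteps the static-initializer cost mentioned above:
import java.io.File;
import java.lang.reflect.Method;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ArrayList;
import java.util.Enumeration;
import java.util.List;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

public class MethodScanner {

    // Returns all classes inside the given JAR that declare a method with the given name.
    public static List<Class<?>> classesWithMethod(File jar, String methodName) throws Exception {
        List<Class<?>> matches = new ArrayList<>();
        try (JarFile jarFile = new JarFile(jar);
             URLClassLoader loader = new URLClassLoader(new URL[] { jar.toURI().toURL() })) {
            Enumeration<JarEntry> entries = jarFile.entries();
            while (entries.hasMoreElements()) {
                String name = entries.nextElement().getName();
                if (!name.endsWith(".class")) {
                    continue;
                }
                // "com/example/Foo.class" -> "com.example.Foo"
                String className = name.substring(0, name.length() - ".class".length()).replace('/', '.');
                try {
                    Class<?> candidate = Class.forName(className, false, loader); // no static init
                    for (Method m : candidate.getDeclaredMethods()) {
                        if (m.getName().equals(methodName)) {
                            matches.add(candidate);
                            break;
                        }
                    }
                } catch (Throwable ignored) {
                    // Some classes may fail to load (missing dependencies etc.); skip them.
                }
            }
        }
        return matches;
    }

    public static void main(String[] args) throws Exception {
        for (Class<?> c : classesWithMethod(new File("lib/some-library.jar"), "someMethod")) {
            System.out.println(c.getName());
        }
    }
}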
No, you can't, in general. If you could get a complete list of available classes you could check each of them using reflection - but you can't ask a classloader for a list of everything that's available. (For instance, it may be fetching classes over HTTP, and may not know all the files available.)
If you knew that you were interested in classes in a jar file, however, you could open the jar file, find all the class files within it and ask the classloader for those classes. It would be somewhat fiddly.
What's the bigger picture here? There may be a better way to approach the problem.
Also, in Eclipse, you can simply ask for this:
Click on the method and type Ctrl-T.

Implementing dynamic plugins in Java

I'd like to implement a dynamic plugin feature in a Java application. Ideally:
The application would define an interface Plugin with a method like getCapabilities().
A plugin would be a JAR pluginX.jar containing a class PluginXImpl implementing Plugin (and maybe some others).
The user would put pluginX.jar in a special directory or set a configuration parameter pointing to it. The user should not necessarily have to include pluginX.jar in their classpath.
The application would find PluginXImpl (maybe via the JAR manifest, maybe by reflection) and add it to a registry.
The client could get an instance of PluginXImpl, e.g., by invoking a method like getPluginWithCapabilities("X"). The user should not necessarily have to know the name of the plugin.
I've got a sense I should be able to do this with peaberry, but I can't make any sense of the documentation. I've invested some time in learning Guice, so my preferred answer would not be "use Spring Dynamic Modules."
Can anybody give me a simple idea of how to go about doing this using Guice/peaberry, OSGi, or just plain Java?
This is actually quite easy using plain Java means:
Since you don't want the user to configure the classpath before starting the application, I would first create a URLClassLoader with an array of URLs to the files in your plugin directory. Use File.listFiles to find all plugin jars and then File.toURI().toURL() to get a URL to each file. You should pass the system classloader (ClassLoader.getSystemClassLoader()) as a parent to your URLClassLoader.
If the plugin jars contain a configuration file in META-INF/services as described in the API documentation for java.util.ServiceLoader, you can now use ServiceLoader.load(Plugin.class, myUrlClassLoader) to obtain a service loader for your Plugin interface and call iterator() on it to get instances of all configured Plugin implementations.
You still have to provide your own wrapper around this to filter plugin capabilites, but that shouldn't be too much trouble, I suppose.
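Here is a minimal sketch of that approach (the plugin directory, the Plugin interface, and the capability filter are hypothetical stand-ins for whatever your application defines):
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ArrayList;
import java.util.List;
import java.util.ServiceLoader;

// Hypothetical plugin contract defined by the host application.
interface Plugin {
    List<String> getCapabilities();
}

public class PluginRegistry {

    public static List<Plugin> loadPlugins(File pluginDir) throws Exception {
        // Collect a URL for every JAR dropped into the plugin directory.
        List<URL> urls = new ArrayList<>();
        File[] jars = pluginDir.listFiles((dir, name) -> name.endsWith(".jar"));
        if (jars != null) {
            for (File jar : jars) {
                urls.add(jar.toURI().toURL());
            }
        }

        // Parent = system class loader, so plugins can see the application's classes.
        URLClassLoader pluginLoader =
            new URLClassLoader(urls.toArray(new URL[0]), ClassLoader.getSystemClassLoader());

        // Each plugin JAR lists its implementation class in
        // META-INF/services/Plugin (fully qualified interface name as the file name).
        List<Plugin> plugins = new ArrayList<>();
        for (Plugin plugin : ServiceLoader.load(Plugin.class, pluginLoader)) {
            plugins.add(plugin);
        }
        return plugins;
    }

    // Corresponds to something like getPluginWithCapabilities("X").
    public static Plugin pluginWithCapability(List<Plugin> plugins, String capability) {
        for (Plugin p : plugins) {
            if (p.getCapabilities().contains(capability)) {
                return p;
            }
        }
        return null;
    }
}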
OSGi would be fine if you want to replace the plugins at runtime, e.g. for bugfixes in a 24/7 environment. I played a while with OSGi, but it took too much time, because it wasn't a requirement, and you need a plan B if you remove a bundle.
My humble solution then was to provide a properties file with the class names of plugin descriptor classes and let the server call them to register (including querying their capabilities).
This is obviously suboptimal, but I can't wait to read the accepted answer.
Any chance you can leverage the Service Provider Interface?
The best way to implement plug-ins with Guice is with Multibindings. The linked page goes into detail on how to use multibindings to host plugins.
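For reference, a minimal multibinding sketch (the Plugin interface, PluginXImpl, and the module are all hypothetical):
import com.google.inject.AbstractModule;
import com.google.inject.Guice;
import com.google.inject.Injector;
import com.google.inject.Key;
import com.google.inject.TypeLiteral;
import com.google.inject.multibindings.Multibinder;
import java.util.Set;

public class MultibindingDemo {

    // Hypothetical plugin contract.
    interface Plugin {
        String capability();
    }

    static class PluginXImpl implements Plugin {
        @Override public String capability() { return "X"; }
    }

    // Each plugin (or the host, on the plugin's behalf) contributes a binding to the set.
    static class PluginXModule extends AbstractModule {
        @Override protected void configure() {
            Multibinder<Plugin> plugins = Multibinder.newSetBinder(binder(), Plugin.class);
            plugins.addBinding().to(PluginXImpl.class);
        }
    }

    public static void main(String[] args) {
        Injector injector = Guice.createInjector(new PluginXModule());
        // Anywhere in the application you can now inject Set<Plugin>.
        Set<Plugin> plugins = injector.getInstance(Key.get(new TypeLiteral<Set<Plugin>>() {}));
        plugins.forEach(p -> System.out.println(p.capability()));
    }
}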
Apologies if you already know this, but check out the forName method of Class. It is used, at least in JDBC, to dynamically load DBMS-specific driver classes at runtime by class name.
Then I guess it would not be difficult to enumerate all class/jar files in a directory, load each of them, and define an interface for a static method getCapabilities() (or any name you choose) that returns their capabilities/description in whatever terms and format that makes sense for your system.
