Java: Libraries within Libraries and Classpath Issues

We are having a discussion at work and an interesting point came up:
Say you are developing a small library, call it somelib. Say that somelib needs to do some logging, but you don't want to reinvent the wheel, so you decide to use a 3rd party logging library.
Additionally, you want to make integration of somelib as painless as possible, so you distribute a single JAR file (somelib.jar), which has the other logging JAR, call it logger.jar, embedded inside of it. Much like what Maven's jar-with-dependencies assembly does.
Now comes the issue: since your product is a library, what if your customer uses somelib and also happens to be using a different version of the same logging library on their own? Now we have a classpath problem.
This seems to me like it would be a common problem for people that write libraries, so what is the typical solution?
1. Do they avoid using JAR bundling methods altogether? Even if we do that, there is still an issue with a user's code expecting version X of the logging library and somelib's code expecting version Y.
2. Do they somehow insert a dummy package prefix so that the logger classes in somelib won't conflict?
3. What about dynamic loading of the logger library? (Though this still has the versioning problem from 1.)

You may consider using OSGi, or waiting for JDK 8 and its Jigsaw project.

Related

Java - make a library and import optional

I have a library that I'm using in a Java application - it's important for certain functionality, but it's optional, meaning that if the JAR file is not there, the program continues on without issue. I'd like to open source my program, but I cannot include this library, which is necessary to compile the source code since I have numerous import statements that use its API. I don't want to maintain two code sets. What is the best way to remove the physical JAR file from the open-source release but still maintain the code that supports it, so that other people can still compile the project?
The typical approach is to define a wrapper API (i.e., interfaces), include those interfaces in the open-sourced code, and then provide configuration options where one can specify the class names of classes that implement certain interfaces.
Your open-sourced code then imports the API interfaces instead of the implementation classes directly. This way, you are open-sourcing the API but not the implementation of the parts that you do not want to, or cannot, open source.
There are many examples, but take a look at JDBC API (interfaces) and JDBC drivers (implementation classes) for starters.
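To make the pattern above concrete, here is a minimal sketch under assumed names (ReportExporter as the open-sourced interface, with the implementation class name supplied via configuration); it illustrates the approach rather than any particular library's code:

// The wrapper interface ships with the open-sourced code (its own file in practice).
public interface ReportExporter {
    void export(String report);
}

// Loads an implementation by a configured class name; falls back to a no-op
// when the optional JAR (and therefore the implementation class) is absent.
public final class ReportExporters {
    public static ReportExporter load(String implClassName) {
        try {
            Class<?> cls = Class.forName(implClassName);
            return (ReportExporter) cls.getDeclaredConstructor().newInstance();
        } catch (ReflectiveOperationException e) {
            return report -> { /* optional dependency missing: do nothing */ };
        }
    }
}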
I was pretty much typing the same thing as smallworld, with one addition. If this API were necessary, you could use a project build tool like Maven to handle the dependencies in your project. If someone checks it out from source control with the POM, they can download the dependencies for themselves and you don't have to include them in the source repo.
There are probably a number of ways to fix this; here are a couple I can think of:
If you have only a couple of methods you need to invoke in the 3rd party library, you could use reflection to invoke those methods (see the sketch after these options). It creates really verbose code that is hard to read, though.
If you don't use too much of the 3rd party library's API, you could also create a separate JAR file containing just a non-functional shell of the classes in the library (just types with the same names and methods with the same signatures). You can then compile against and distribute this JAR. At run time you'd replace it with the real JAR if available.
The most common way is probably to just create a wrapper API in a separate module/project for the code that depends on the 3rd party library, and possibly distribute a pre-built JAR. This might go against your wish not to maintain two code sets, but may prove to be the best and least painful solution in the long run.
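As promised above, a hedged sketch of the reflection option; the 3rd-party class and method names here are placeholders, not a real API:

public final class OptionalLoggerBridge {
    // Invokes a method on an optional 3rd-party class reflectively, so this code
    // compiles and runs even when the library is not on the classpath.
    public static void logViaOptionalLibrary(String message) {
        try {
            Class<?> loggerClass = Class.forName("com.example.thirdparty.Logger"); // placeholder name
            Object logger = loggerClass.getDeclaredConstructor().newInstance();
            loggerClass.getMethod("info", String.class).invoke(logger, message);
        } catch (ReflectiveOperationException e) {
            // Library not present (or its API changed): skip logging.
        }
    }
}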

Consistent OSGi import of 3rd party libraries

I've been developing OSGi modules, but so far I've come across a number of issues when I've had to wrap existing JARs. An example of this is the Oracle database driver which, even though I've wrapped the JAR as a bundle, just refuses to work (it cannot find the driver class even though it's present). This is just a single example, but I've had issues with other 3rd party libraries too, and was wondering if there's a best-practice approach to using 3rd party libraries that works every time?
The problem in your case is that JDBC uses a class from the Java runtime to find the database driver (DriverManager.getConnection). This cannot work in OSGi, as the database driver is not accessible from the system classloader (which loaded the DriverManager class).
A way that works in OSGi is to use a DataSource instead: http://docs.oracle.com/javase/tutorial/jdbc/basics/sqldatasources.html . There you simply create the data source using new, and this of course works. The problem is that it makes your user bundle depend on the specific DB driver, so the best practice is to create the DataSource centrally and publish it as a service.
You can find some more details in my Apache Karaf DB Tutorial (http://www.liquid-reality.de/display/liquid/2012/01/13/Apache+Karaf+Tutorial+Part+6+-+Database+Access).
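A rough sketch of that "publish the DataSource as a service" idea, using a bundle activator; the OracleDataSource class belongs to the Oracle driver, while the JDBC URL and the service property are illustrative placeholders:

import java.util.Hashtable;
import javax.sql.DataSource;
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceRegistration;

public class DataSourceActivator implements BundleActivator {
    private ServiceRegistration<DataSource> registration;

    @Override
    public void start(BundleContext context) throws Exception {
        // Created with new, so no DriverManager / classloader lookup is involved.
        oracle.jdbc.pool.OracleDataSource ds = new oracle.jdbc.pool.OracleDataSource();
        ds.setURL("jdbc:oracle:thin:@//dbhost:1521/XE"); // placeholder URL
        Hashtable<String, Object> props = new Hashtable<>();
        props.put("osgi.jndi.service.name", "jdbc/mydb"); // illustrative marker property
        registration = context.registerService(DataSource.class, ds, props);
    }

    @Override
    public void stop(BundleContext context) {
        registration.unregister();
    }
}

Consumer bundles then look up (or get injected with) the DataSource service and never need to see the driver classes at all.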
Btw, in general this kind of factory is typically where libraries fail in OSGi. Every lib invents another, different factory system, and most of them are incompatible with the restricted classloaders of OSGi. Luckily most libs are made OSGi-ready nowadays. Most times this simply means that you can also call the factory with a concrete object that you can retrieve using an OSGi service.
My preferred approach is not to wrap the library, but to unjar it, add a manifest, and re-jar it. Jars-inside-jars tend to cause issues that are hard to debug. Unjar and re-jar can be automated with a simple ant script.
Also, I like to write MANIFEST.MF manually. If the library being wrapped is small, then it's easy enough to do that. Tools like bnd that generate MANIFEST.MF for you do not always give the right results, and if you rely on them too much you don't know what is going on under the hood.

JAR Hell Hacks for Non-OSGi Developers

Edit: After reviewing the replies, the example I used below is a tad misleading. I am looking for the case where I have two 3rd party JARs (not homegrown JARs where I have access to the source code) that both depend on different versions of the same JAR.
Original:
So I've recently familiarized myself with what OSGi is, and what ("JAR Hell") problems it addresses at its core. And, as intrigued as I am with it (and plan on migrating somewhere down the road), I just don't have it in me to begin learning what it will take to bring my projects over to it.
So, I'm now lamenting: if JAR hell happens to me, how do I solve this sans OSGi?
Obviously, the solution would almost have to involve writing my own ClassLoader, but I'm having a tough time visualizing how that would manifest itself and, more importantly, how it would solve the problem. I did some research and the consensus was that you have to write your own ClassLoader for every JAR you produce, but since I'm already having a tough time seeing the forest for the trees, that statement isn't sinking in with me.
Can someone provide a concrete example of how writing my own ClassLoader would put a band-aid on this gaping wound (I know, I know, the only real solution is OSGi)?
Say I write a new JAR called SuperJar-1.0.jar that does all sorts of amazing stuff. Say my SuperJar-1.0.jar has two other dependencies, Fizz-1.0.jar and Buzz-1.0.jar. Both Fizz and Buzz jars depend on log4j, except Fizz-1.0.jar depends on log4j-1.2.15.jar, whereas Buzz-1.0.jar depends on log4j-1.2.16.jar. Two different versions of the same jar.
How could a ClassLoader-based solution resolve this (in a nutshell)?
If you're asking this question from an "I'm building an app, how do I avoid this?" angle rather than an "I need this particular solution" angle, I would strongly prefer the Maven approach - namely, to resolve only a single version of any given dependency. In the case of log4j 1.2.15 -> 1.2.16, this will work fine - you can include only 1.2.16. Since the older version is API-compatible (it's just a patch release), it's extremely likely that Fizz 1.0 won't even notice that it's using a newer version than it expected.
You'll find that doing this makes issues far easier to debug (nothing confuses me like having multiple versions of classes or even static fields floating around - who knows which one you're dealing with?) and it doesn't need any clever classloader hacks.
But this is exactly what all the appservers out there have to deal with. Pretend that your Fizz and Buzz are web applications (WARs), and SuperJar is your appserver. SuperJar will arrange a classloader for each web app that "breaks" the normal delegation model, i.e. it will look locally (down) before looking up the hierarchy. Go read about it in any of the appservers' documentation. For example http://download.oracle.com/docs/cd/E19798-01/821-1752/beade/index.html.
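A very rough sketch of that child-first ("parent-last") idea - purely illustrative, not what any particular appserver ships; a real implementation would still delegate java.* classes to the parent and also handle resource lookups:

import java.net.URL;
import java.net.URLClassLoader;

public class ChildFirstClassLoader extends URLClassLoader {
    public ChildFirstClassLoader(URL[] urls, ClassLoader parent) {
        super(urls, parent);
    }

    @Override
    protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
        synchronized (getClassLoadingLock(name)) {
            Class<?> c = findLoadedClass(name);
            if (c == null) {
                try {
                    c = findClass(name);                 // look in this loader's own URLs first
                } catch (ClassNotFoundException e) {
                    c = super.loadClass(name, resolve);  // only then delegate up the hierarchy
                }
            }
            if (resolve) {
                resolveClass(c);
            }
            return c;
        }
    }
}

Fizz and Buzz would each be loaded through their own instance of such a loader, each pointing at its own log4j JAR, so the two versions never see each other.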
Use log4j-1.2.16. It only contains bugfixes wrt 1.2.15.
If Fizz breaks with 1.2.16, fork and patch it, then submit those patches back to the author of Fizz.
The alternative of creating custom classloaders with special delegation logic is very complex and likely to cause you many problems. I don't see why you would want to do this rather than just use OSGi. Have you considered creating an embedded OSGi framework, so you don't have to convert your whole application?
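If you do go the embedded route, a minimal launch sketch looks roughly like this (assuming an OSGi framework implementation such as Felix or Equinox is on the classpath; the bundle paths are placeholders):

import java.util.HashMap;
import java.util.Map;
import java.util.ServiceLoader;
import org.osgi.framework.BundleContext;
import org.osgi.framework.launch.Framework;
import org.osgi.framework.launch.FrameworkFactory;

public class EmbeddedOsgiLauncher {
    public static void main(String[] args) throws Exception {
        // The framework implementation advertises itself via the ServiceLoader mechanism.
        FrameworkFactory factory =
                ServiceLoader.load(FrameworkFactory.class).iterator().next();
        Map<String, String> config = new HashMap<>();
        config.put("org.osgi.framework.storage.clean", "onFirstInit");

        Framework framework = factory.newFramework(config);
        framework.start();

        BundleContext context = framework.getBundleContext();
        context.installBundle("file:lib/fizz-1.0.jar").start();   // placeholder paths
        context.installBundle("file:lib/buzz-1.0.jar").start();

        framework.stop();
        framework.waitForStop(0);
    }
}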

Using ServiceLoader on Android

I am very new to Java and Android development and, to learn, I am trying to start with an application that gathers statistics and information like Munin does. I want to be able to load "plugins" in my application. These plugins are already in the application, but I don't want to have to invoke them all separately; I want to be able to iterate over them. I was trying to use ServiceLoader but could never get the META-INF/services entries into my APK, so I am wondering if it is possible to use ServiceLoader on Android.
Thanks
EDIT: I am asking about java.util.ServiceLoader. I think it should work, but I can't figure out how to get my services folder into META-INF in the APK.
There is an open bug report against this issue. See https://code.google.com/p/android/issues/detail?id=59658
The META-INF folder is deliberately excluded from the APK by ApkBuilder; the only comment in ApkBuilder.java is "we need to exclude some other folder (like /META-INF)" but there is no other explanation.
Even after adding META-INF with Ant, you will still run into trouble if you want to use ProGuard, which refuses to rewrite the contents of META-INF/services/* files or rename them (that's another story; the author wants to keep ProGuard agnostic).
However, people using Maven may want to check https://github.com/pa314159/maven-android-plugin (the branch named "modified"), which tries to solve both issues. It is a fork of the original "android-maven-plugin" that I modified a month ago for my own Android projects.
It also provides a patch for ProGuard 4.7.
Hope this helps, any feedback is welcome.
I've figured out a solution that may work for some situations. Instead of ServiceLoader, I'm using the org.openide.util.Lookup class / library that comes with NetBeans - it is a superset of ServiceLoader. It does not require NetBeans itself and seems to work ok with Eclipse. It is necessary to replace whatever ServiceLoader functionality you are using in your application with Lookup equivalents, and add the org-openide-util-lookup library. Then, you can just do something like this:
Lookup lookup = new ProxyLookup(Lookup.getDefault(),
Lookups.metaInfServices(myClass.getClassLoader(), "services/"));
And move your ServiceLoader files from META-INF/services/ to services/.
Note that, because of the ProxyLookup, this will continue to work on standard Java environments unchanged (i.e., in those cases it will continue to look in META-INF/services).
Here is a link to the documentation for the library: http://bits.netbeans.org/dev/javadoc/org-openide-util-lookup/org/openide/util/lookup/Lookups.html
UPDATE
After working with this for a couple of days, it seems to function well - I move between environments (standard Java and Android) and it works properly in each location. The primary downside is having to manually copy the files to the /services directory.
It is possible. You may want to check http://developer.android.com/reference/java/util/ServiceLoader.html
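For reference, plain java.util.ServiceLoader usage looks like the sketch below (the MyPlugin interface and the package are hypothetical); the Android-specific difficulty discussed in this thread is only about getting the META-INF/services/ entry into the APK:

package com.example;

import java.util.ServiceLoader;

// Hypothetical plugin contract; implementations are listed, one fully-qualified
// class name per line, in META-INF/services/com.example.MyPlugin.
interface MyPlugin {
    String describe();
}

public class PluginLister {
    public static void main(String[] args) {
        for (MyPlugin plugin : ServiceLoader.load(MyPlugin.class)) {
            System.out.println(plugin.describe());
        }
    }
}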
ServiceLoader is stuff from the Java language that is not really relevant on Android. I recommend not using it. If you just want to find a list of classes within your .apk to load, there are all kinds of ways to do this -- put an XML file in res/xml that lists them, use reflection, annotations, etc.
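A hedged sketch of the "list the class names yourself and instantiate them reflectively" alternative; the plugin interface and class names are placeholders, and on Android the list could equally come from a res/xml resource:

import java.util.ArrayList;
import java.util.List;

public class ReflectivePluginRegistry {
    // Hypothetical plugin contract.
    public interface Plugin {
        void collectStats();
    }

    // Placeholder class names; in practice these would be read from a resource.
    private static final String[] PLUGIN_CLASS_NAMES = {
            "com.example.stats.CpuPlugin",
            "com.example.stats.MemoryPlugin",
    };

    public static List<Plugin> loadPlugins() {
        List<Plugin> plugins = new ArrayList<>();
        for (String name : PLUGIN_CLASS_NAMES) {
            try {
                plugins.add((Plugin) Class.forName(name)
                        .getDeclaredConstructor()
                        .newInstance());
            } catch (ReflectiveOperationException | ClassCastException e) {
                // Plugin class missing or not a Plugin implementation: skip it.
            }
        }
        return plugins;
    }
}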

Determining what minimal jars are needed for a feature

How do you determine what jars are needed for such and such feature of a framework? For example, what jars would be needed out of all those available for Spring in order to support only dependency injection?
There are tools that create minimal JARs by figuring out which classes are actually used in an application by statically analyzing the code, then creating a new JAR containing only those classes. (I recall using Zelix KlassMaster to do this, but there are many alternatives.)
The problems with using these tools for a DI framework like Spring include:
The existing tools only trace static dependencies. If you dynamically load classes, you have to specifically tell the analyser about each one. DI frameworks in general, and Spring in particular, are replete with dynamic loading, including dynamic loading that is opaque to application code.
The existing tools work by creating a new output JAR, not by telling you which of the input JARs are unused. While repackaging the JARs is OK if you are creating a shrink-wrapped application from a closed-source codebase, it is undesirable in general, and potentially problematic with some open-source licenses. You certainly don't want to do this with Spring.
In theory, someone could write a tool to help. In practice, the tool would need to (for example) know how to extract dynamic class dependencies from Spring configurations expressed in annotations, in XML, and in bean descriptors created at runtime from higher-order configuration (Spring Security does this, for example). That is a big ask. And even then you have the problem that a "small" change to the wiring made on the installation platform could fail due to a required JAR having been left out by the JAR pruning process.
In my view, the more practical alternatives are:
If you use Maven / Ivy to manage your dependencies, look at the dependency graphs, strip out dependencies that appear to be no longer needed ... and test, test, test.
Manually strip out JARs that appear to be unused ... and test, test, test.
Don't worry about it. A moderate level of unused JAR cruft might add a second or three to deployment and webapp startup times, but that generally doesn't matter. (But if it does ... see above.)
This is why some older Java projects end up having 600 JARs and a 200 MB WAR file for a 10,000-line application. Kind of a pain if you don't manage it carefully...
You should really ask the framework provider or read the documentation. Statically analyzing what JARs are required might not be enough in some cases (dynamic loading), and sometimes you might end up with too many JARs.
I once added some FTP helper stuff to a sort of "utility" library. It depended on an Apache FTP JAR. If you never used the FTP features in the library, you would not need the FTP JAR, but static analysis of the code might say you do. This is something you should document.
