Why isn't my ResourceBundleControlProvider being loaded? - java

I thought I would use the new ResourceBundleControlProvider framework in Java 8 to fix something which Oracle themselves will never fix - the default encoding used when reading resource bundles.
So I made a control:
package com.acme.resources;

import java.io.IOException;
import java.util.Locale;
import java.util.ResourceBundle;

public class AcmeResourceBundleControl extends ResourceBundle.Control
{
    @Override
    public ResourceBundle newBundle(String baseName, Locale locale, String format,
                                    ClassLoader loader, boolean reload)
        throws IllegalAccessException, InstantiationException, IOException
    {
        throw new UnsupportedOperationException("TODO");
    }
}
Then I made a provider:
package com.acme.resources;

import java.util.ResourceBundle;
import java.util.spi.ResourceBundleControlProvider;

public class AcmeResourceBundleControlProvider implements ResourceBundleControlProvider
{
    private static final ResourceBundle.Control CONTROL = new AcmeResourceBundleControl();

    @Override
    public ResourceBundle.Control getControl(String baseName)
    {
        if (baseName.startsWith("com.acme."))
        {
            return CONTROL;
        }
        else
        {
            return null;
        }
    }
}
Then in META-INF/services/java.util.spi.ResourceBundleControlProvider:
com.acme.resources.AcmeResourceBundleControlProvider
Then I just tried to run our application from IDEA and found that it never loads my provider (otherwise the exception would be thrown).
I have checked the names and they all seem to match up. I have checked the compiler output directory IDEA is using and it does contain the service file. I wrote a simple test program which just tries to look up the service:
public static void main(String[] args)
{
    for (ResourceBundleControlProvider provider :
             ServiceLoader.load(ResourceBundleControlProvider.class))
    {
        System.out.println(provider.getClass());
    }
}
This does print out one entry which is the name of my implementation class. So the issue is not in the service file.
If I set a breakpoint inside ResourceBundle, I seem to be able to access the custom provider class, yet initial forays into the debugger show that ServiceLoader isn't finding any implementations, and I can't figure out why. I'm sure there is some dodgy class loader magic going on which results in my class not being loaded. :(
Some scary wording in the Javadoc makes it sound like the provider might have to be installed as a global extension. If that really is the case, it's a bit of a shame, because it seemed like a useful way to override the default (and in my opinion broken) behaviour. But I also read the tutorial on the matter, and it didn't seem to describe anything like that (unless the good behaviour was pulled out of Java 8 at the very last minute and the docs are out of date!)

The tutorial does state that the JAR containing the ResourceBundleControlProvider must be in the JVM's system extension directory. Section 6 of the tutorial describes the requirement:
java -Djava.ext.dirs=lib -cp build RBCPTest
When you install a Java extension, you typically put the JAR file of the extension in the lib/ext directory of your JRE. However, this command specifies the directory that contains Java extensions with the system property java.ext.dirs.
The JavaDoc for ServiceLoader.loadInstalled() also states that providers on the application's class path are ignored.
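The difference is easy to see by running the two lookup methods side by side: ServiceLoader.load() uses the context class loader and therefore sees the application classpath (which is why the question's test program found the provider), while ServiceLoader.loadInstalled() only consults installed extensions. A minimal sketch (the class name is just for illustration):

import java.util.ServiceLoader;
import java.util.spi.ResourceBundleControlProvider;

public class ProviderLookupTest {
    public static void main(String[] args) {
        // Sees providers on the application classpath - this is what the question's test did.
        System.out.println("ServiceLoader.load():");
        for (ResourceBundleControlProvider p : ServiceLoader.load(ResourceBundleControlProvider.class)) {
            System.out.println("  " + p.getClass());
        }

        // Ignores the application classpath and only sees installed extensions -
        // this is what ResourceBundle actually uses.
        System.out.println("ServiceLoader.loadInstalled():");
        for (ResourceBundleControlProvider p : ServiceLoader.loadInstalled(ResourceBundleControlProvider.class)) {
            System.out.println("  " + p.getClass());
        }
    }
}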

Your problem is that the java.util.ResourceBundle that comes with the JVM calls ServiceLoader.loadInstalled(ResourceBundleControlProvider.class) in its static initializer to obtain the list of providers, and uses that list from then on.
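If installing the provider as an extension is not acceptable, one possible workaround (my own suggestion, not something stated in the answers above) is to skip the provider mechanism entirely and pass the custom Control explicitly at each call site, since ResourceBundle.getBundle accepts a Control directly:

import java.util.Locale;
import java.util.ResourceBundle;

public class ExplicitControlExample {
    public static void main(String[] args) {
        // "com.acme.resources.Messages" and "greeting" are hypothetical names;
        // AcmeResourceBundleControl is the Control from the question and would
        // need a real newBundle() implementation rather than the TODO stub.
        ResourceBundle bundle = ResourceBundle.getBundle(
                "com.acme.resources.Messages",
                Locale.getDefault(),
                new com.acme.resources.AcmeResourceBundleControl());
        System.out.println(bundle.getString("greeting"));
    }
}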

Related

Java compilation issue on Linux, using Windows specific

I encountered a compilation issue under Linux. I'm compiling Java programs on Linux; the target use is both Linux and Windows.
The code checks at runtime whether platform-specific classes are available (as shown in the code below), so if it is running under Linux, the Windows-specific code will not be executed.
The issue arises from the use of the platform-specific class Win32MediaTray. The compile error reported is:
PrinterScanner.java:9: error: cannot find symbol
import sun.print.Win32MediaTray;
^
Is it possible to compile this under Linux, or is it just impossible? I can use some workaround (reflection?).
Needless to say, the compilation under Windows gives no errors.
Thank you for your help.
For reference, the code behind this issue is the following:
private String getTrayName(Media media) {
    String result = "id:" + media.getValue();
    boolean isWin32 = media.getClass().getName().equals("sun.print.Win32MediaTray");
    if (isWin32) {
        Win32MediaTray w32 = (Win32MediaTray) media;
        result = result + ",winId:" + w32.winID;
    }
    return result;
}
I believe that the class you are trying to use is sun.print.Win32MediaTray.
And the answer is that you cannot use it ... or compile a class that uses it ... on a Linux release of Java. That class is not included in the rt.jar file on a Linux release of Java.
Furthermore, you shouldn't be using it. The Java documentation makes it very clear that application code should not make use of classes in the sun.* package hierarchy.
If you have no choice but to do this, then your best bet is to use reflection to fetch the value of that winID field. You'll also need to deal with the case where the media object is not an instance of the Win32MediaTray class. Beware that you are relying on implementation details that Oracle specifically says you shouldn't rely on. There is a risk that they will change (without notice!) in some future Windows release of Java.
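A minimal sketch of that reflection approach, assuming the field really is the public int winID shown in the question's code:

import javax.print.attribute.standard.Media;

public class TrayNames {
    static String getTrayName(Media media) {
        String result = "id:" + media.getValue();
        // Check the concrete class by name so the Windows-only type is never referenced directly.
        if (media.getClass().getName().equals("sun.print.Win32MediaTray")) {
            try {
                // Read the public winID field reflectively instead of casting to Win32MediaTray.
                Object winId = media.getClass().getField("winID").get(media);
                result = result + ",winId:" + winId;
            } catch (ReflectiveOperationException e) {
                // The field is an implementation detail and may vanish; fall back to the plain id.
            }
        }
        return result;
    }
}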
The other alternatives are:
Implement your own platform adapter classes with a different one for each platform. These have to be compiled separately on each platform, and then dynamically loaded.
Implement separate codebases for each platform.
To make the compiler happy, you could implement a dummy class named sun.print.Win32MediaTray and make it available on both the compile-time and runtime classpath. The class doesn't need to work; it only has to be API-compatible (same signatures and return types - in this case you only really need to extend Media and have a public int winID), so that you can satisfy both the compiler and the verifier.
At runtime, the version included in rt.jar should be loaded on Windows thanks to loading delegation. On Linux, the dummy version is the only one available, but you stated that the program checks for the platform and executes another branch of code, so it shouldn't cause your program to fail.
For example, with the following class on the classpath:
package sun.print;

import javax.print.attribute.standard.Media;

public class Win32MediaTray extends Media {
    public int winID = 0xBADC0DE;

    protected Win32MediaTray(int value) {
        super(value);
    }

    static {
        System.out.println("Won't see me on Windows");
    }
}
I managed to run this program on Windows:
import javax.print.DocFlavor;
import javax.print.PrintService;
import javax.print.PrintServiceLookup;
import javax.print.attribute.standard.Media;
import sun.print.Win32MediaTray;

public class Main {
    public static void main(String[] args) {
        PrintService[] services = PrintServiceLookup.lookupPrintServices(null, null);
        for (PrintService svc : services) {
            DocFlavor flavor = DocFlavor.SERVICE_FORMATTED.PAGEABLE;
            Object o = svc.getSupportedAttributeValues(Media.class, flavor, null);
            if (o != null && o.getClass().isArray()) {
                for (Media media : (Media[]) o) {
                    if (media instanceof Win32MediaTray)
                        System.out.println(((Win32MediaTray) media).winID);
                }
            }
        }
    }
}
The message in the static initializer is not printed on Windows, because the definition that is actually loaded is the one from rt.jar. Obviously, the code can be compiled on any platform.
How about putting the code that uses Windows-specific stuff into a separate jar? Then you can compile and include that jar on Windows, and leave it off on other systems.
One standard way to do this is to have one or more interfaces used by your application code; you can have a factory provide the implementing classes or inject them with Spring or whatever. But I think rather than "how can I compile this on Linux" your question should be "I have this Windows dependency in an app targeted at multiple operating systems, how do I handle it?"
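A minimal sketch of that interface-plus-factory idea; all names here (TrayNamer, TrayNamerFactory, com.example.print.WindowsTrayNamer) are illustrative, not from the question:

// TrayNamer.java - platform-neutral interface used by the application code.
public interface TrayNamer {
    String getTrayName(javax.print.attribute.standard.Media media);
}

// TrayNamerFactory.java - picks an implementation at runtime. The Windows implementation
// lives in a separate jar and is loaded reflectively, so nothing here references sun.print.*.
public final class TrayNamerFactory {
    public static TrayNamer create() {
        if (System.getProperty("os.name", "").startsWith("Windows")) {
            try {
                return (TrayNamer) Class.forName("com.example.print.WindowsTrayNamer")
                                        .getDeclaredConstructor().newInstance();
            } catch (ReflectiveOperationException e) {
                // Windows jar not on the classpath; fall through to the default below.
            }
        }
        return media -> "id:" + media.getValue();  // platform-neutral fallback
    }
}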

How to access package via Package.getPackage(...)?

In my current project I would like to store some configuration data in a package annotation and access it from some CDI producers. If the annotation is not found in the current package, the producers will search for it upward in the package hierarchy.
So far, so good. Unfortunately, it seems that I can access an existing package via Package.getPackage("my.package") only after the first access to one of its classes or interfaces.
The following example illustrates this behaviour:
Class in package a.b
package a.b;
public class ClassInMitte {
}
Example program to access the package a.b
package other;

public class Refl {
    public static void main(String[] args)
    {
        Package viaName = Package.getPackage("a.b");
        System.out.println(viaName.getName());
        System.out.println(viaName.hashCode());
    }
}
Running Refl results in a NullPointerException. But if I add new ClassInMitte() as the first statement, I can access the package information. Somehow I must access the content of a package before I can access the package information itself. This makes sense, since otherwise the class loaders would have to scan the whole classpath at startup.
But nevertheless, is there an easy way to access package information without first accessing the content of the package? I know I could use frameworks like Reflections, but a 'lightweight' solution would be my preferred one.
Package.getPackage only returns packages that are already known to the current class loader, and the only way to do that is by loading a class from that package. It's basically a wrapper for ClassLoader.getPackage.
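In other words, the lightweight workaround is to load (or reference) any class from the package first and only then ask for the Package. A minimal sketch based on the question's classes:

package other;

public class Refl {
    public static void main(String[] args) throws ClassNotFoundException {
        // Loading one class from the package also registers the package with the class loader.
        Class<?> trigger = Class.forName("a.b.ClassInMitte");

        Package viaName = Package.getPackage("a.b");   // no longer null
        System.out.println(viaName.getName());

        // Equivalent, without the string lookup:
        System.out.println(trigger.getPackage().getName());
    }
}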

Different result between eclipse and text editor

I use the following code to test the BouncyCastle crypto library:
import java.security.Security;

public class SimpleTest {
    public static void main(String[] args)
    {
        String providerName = "BC";
        if (Security.getProvider(providerName) == null)
        {
            System.out.println(providerName + "provider not installed");
        }
        else
        {
            System.out.println(providerName + "is installed");
        }
    }
}
However, Eclipse shows "BCprovider not installed", and my EditPlus shows "BC is installed".
What makes this difference?
There is a library missing somewhere that your Eclipse setup cannot find. You need to revise your build path configuration in Eclipse.
The difference?
I guess different Java runtimes may have different error messages.
IMHO, unless you check that the underlying JREs are the same, the difference is not between Eclipse and the text editor, but between, say, Java SE 7u7 and Java SE 6u35.
Anyway, that is not your actual problem; the actual problem is deploying the provider jar on your classpath and registering it.
Actually, the difference is likely to be in the static security provider configuration of the Java runtime. You need to register providers somehow to be able to use them. If you cannot change the Java runtime's configuration, you can register them dynamically, provided they have been signed properly.
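A minimal sketch of dynamic registration, assuming the BouncyCastle provider jar is on the classpath:

import java.security.Security;
import org.bouncycastle.jce.provider.BouncyCastleProvider;

public class SimpleTest {
    public static void main(String[] args) {
        // Register BouncyCastle at runtime instead of relying on the java.security file.
        Security.addProvider(new BouncyCastleProvider());

        String providerName = "BC";
        if (Security.getProvider(providerName) == null) {
            System.out.println(providerName + " provider not installed");
        } else {
            System.out.println(providerName + " is installed");
        }
    }
}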

NoClassDefFoundError

I have an issue where a NoClassDefFoundError is being thrown. It puzzles me since I am using interfaces, and no class definition should be available. I have read through some posts which point to the classpath, but I don't believe that to be the issue here (although I may be wrong). I am using the NetBeans 6.9.1 IDE.
I have created a sample setup to reproduce the issue. Four projects: Interfaces, Objects, Locator and Consumer. Below you will find the implementations.
At runtime the consumer complains about the missing SomeObject implementation, which it should not be aware of, since it accepts the interface.
Exception in thread "main"
java.lang.NoClassDefFoundError:
objects/SomeObject
What am I missing?
package interfaces;
public interface ISomeInterface { }

package objects;
import interfaces.ISomeInterface;
public class SomeObject implements ISomeInterface { }

package locator;
import interfaces.ISomeInterface;
import objects.SomeObject;
public class Locator {
    public static ISomeInterface LocateImplementation() {
        return new SomeObject();
    }
}

package consumer;
import interfaces.ISomeInterface;
import locator.Locator;
public class Main {
    public static void main(String[] args) {
        ISomeInterface object = Locator.LocateImplementation();
    }
}
You can get a NoClassDefFoundError exception with interfaces just as you can with classes. Consider the "Class" in the name of the exception to be the .class file that is generated from compiling a class or interface, not a Java class.
This is saying that the class/interface objects.SomeObject isn't visible on your classpath. Check the location of that .class file and ensure that it's on your classpath - if you're positive it's there, give us some screen shots or something that might help to debug the problem.
Think of NoClassDefFoundError as a runtime linkage problem. JRE loaded one class (or an interface) and it references another class (or an interface), but that referenced class isn't found.
The only way this can happen is if you have packaging/classpath issues such that your runtime environment doesn't reflect how things were at build time.
If you are launching this from IDE, make sure that you aren't ignoring any errors and launching anyway. Some classes will not be generated that way.
Usually I run into these problems not when a class is missing, but when there is an error in the static initializers.
Try running your code in a debugger, and set the exception breakpoint to break when any exception is thrown, whether caught or not. I bet you have an uncaught exception in the static initializer for some reason.
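A minimal sketch of that failure mode (the class names are illustrative): a broken static initializer throws ExceptionInInitializerError on first use, and every later attempt to use the class fails with NoClassDefFoundError.

public class InitFailureDemo {
    static class Broken {
        // Not a compile-time constant, so it is evaluated in the static initializer and throws.
        static final int VALUE = Integer.parseInt("not a number");
    }

    public static void main(String[] args) {
        try {
            System.out.println(Broken.VALUE);
        } catch (Throwable t) {
            System.out.println("first use:  " + t);   // ExceptionInInitializerError
        }
        try {
            System.out.println(Broken.VALUE);
        } catch (Throwable t) {
            System.out.println("second use: " + t);   // NoClassDefFoundError
        }
    }
}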
In the LocateImplementation() method you are returning new SomeObject(), so the JVM needs to have its definition when the method is called; I think it is missing.
You should check whether your SomeObject class is on the classpath, because the JVM will be running the code below:
ISomeInterface object = Locator.LocateImplementation();
and when it does, it will call Locator.LocateImplementation(). That code internally tries to instantiate your SomeObject class, which it does not find on the classpath.
So your understanding below:
It puzzles me since I am using interfaces, and no class definition should be available.
is not really valid.
Any interface must be declared inside a class:
public class Calbacks {
    public interface IBaseFragmentInterface {
        void NotifyMainActivity();
    }
}

Overloaded package-private method causes compilation failure - Is this a JLS oddity or javac bug?

I've come across an oddity of the JLS, or a javac bug (not sure which). Please read the following and provide an explanation, citing the relevant JLS passage or Sun bug ID, as appropriate.
Suppose I have a contrived project with code in three "modules" -
API - defines the framework API - think Servlet API
Impl - defines the API implementation - think Tomcat Servlet container
App - the application I wrote
Here are the classes in each module:
API - MessagePrinter.java
package api;

public class MessagePrinter {
    public void print(String message) {
        System.out.println("MESSAGE: " + message);
    }
}
API - MessageHolder.java (yes, it references an "impl" class - more on this later)
package api;

import impl.MessagePrinterInternal;

public class MessageHolder {
    private final String message;

    public MessageHolder(String message) {
        this.message = message;
    }

    public void print(MessagePrinter printer) {
        printer.print(message);
    }

    /**
     * NOTE: Package-private visibility.
     */
    void print(MessagePrinterInternal printer) {
        printer.print(message);
    }
}
Impl - MessagePrinterInternal.java - This class depends on an API class. As the name suggests, it is intended for "internal" use elsewhere in my little framework.
package impl;

import api.MessagePrinter;

/**
 * An "internal" class, not meant to be added to your
 * application classpath. Think the Tomcat Servlet API implementation classes.
 */
public class MessagePrinterInternal extends MessagePrinter {
    public void print(String message) {
        System.out.println("INTERNAL: " + message);
    }
}
Finally, the sole class in the App module...MyApp.java
import api.MessageHolder;
import api.MessagePrinter;

public class MyApp {
    public static void main(String[] args) {
        MessageHolder holder = new MessageHolder("Hope this compiles");
        holder.print(new MessagePrinter());
    }
}
So, now I attempt to compile my little application, MyApp.java. Suppose my API classes are exported via a jar, say api.jar, and being a good citizen I only referenced that jar on my classpath - not the Impl classes shipped in impl.jar.
Now, obviously there is a flaw in my framework design in that the API classes shouldn't have any dependency on "internal" implementation classes. However, what came as a surprise is that MyApp.java didn't compile at all.
javac -cp api.jar src\MyApp.java
src\MyApp.java:11: cannot access impl.MessagePrinterInternal
class file for impl.MessagePrinterInternal not found
        holder.print(new MessagePrinter());
              ^
1 error
The problem is that the compiler is trying to resolve the version print() to use, due to method overloading. However, the compilation error is somewhat unexpected, as one of the methods is package-private, and therefore not visible to MyApp.
So, is this a javac bug, or some oddity of the JLS?
Compiler: Sun javac 1.6.0_14
There is nothing wrong with the JLS or javac. Of course this doesn't compile, because your class MessageHolder references MessagePrinterInternal, which is not on the compile classpath, if I understand your explanation correctly. You have to break this reference to the implementation, for example with an interface in your API.
EDIT 1: For clarification: This has nothing to do with the package-visible method as you seem to think. The problem is that the type MessagePrinterInternal is needed for compilation, but you don't have it on the classpath. You cannot expect javac to compile source code when it doesn't have access to referenced classes.
EDIT 2: I reread the code and this is what seems to be happening: when MyApp is compiled, the compiler tries to load class MessageHolder. Class MessageHolder references MessagePrinterInternal, so it tries to load that as well and fails. I am not sure this is specified in the JLS; it might also depend on the JVM. In my experience with the Sun JVM, you need to have at least all statically referenced classes available when a class is loaded; that includes the types of fields, anything in the method signatures, extended classes and implemented interfaces. You could argue that this is counter-intuitive, but I would respond that in general there is very little you can do with a class where such information is missing: you cannot instantiate objects, you cannot use the metadata (the Class object), etc. With that background knowledge, I would say the behavior you see is expected.
First off I would expect the things in the api package to be interfaces rather than classes (based on the name). Once you do this the problem will go away since you cannot have package access in interfaces.
The next thing is that, AFAIK, this is a Java oddity (in that it doesn't do what you would want). If you get rid of the public method and keep only the package-private one, you will get the same thing.
Changing everything in the api package to be interfaces will fix your problem and give you a cleaner separation in your code.
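A minimal sketch of that fix, keeping the question's names (the exact split into interface and implementation is my own suggestion): once MessagePrinter is an interface in the API and the overload taking the internal type is gone, compiling MyApp no longer needs anything from impl.jar. The application would then obtain a MessagePrinter implementation from the framework rather than instantiating one directly.

// api/MessagePrinter.java
package api;

public interface MessagePrinter {
    void print(String message);
}

// api/MessageHolder.java
package api;

public class MessageHolder {
    private final String message;

    public MessageHolder(String message) {
        this.message = message;
    }

    public void print(MessagePrinter printer) {
        printer.print(message);
    }
}

// impl/MessagePrinterInternal.java
package impl;

import api.MessagePrinter;

public class MessagePrinterInternal implements MessagePrinter {
    public void print(String message) {
        System.out.println("INTERNAL: " + message);
    }
}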
I guess you can always argue that javac could be a little bit smarter, but it has to stop somewhere. It's not human; a human can always be smarter than a compiler, and you can always find examples that make perfect sense to a human but dumbfound a compiler.
I don't know the exact spec on this matter, and I doubt the javac authors made any mistake here. But who cares? Why not put all dependencies on the classpath, even if some of them are superficial? Doing that consistently makes our lives a lot easier.
