While searching through the Java Language Specification to answer this question, I learned that
Before a class is initialized, its direct superclass must be
initialized, but interfaces implemented by the class are not
initialized. Similarly, the superinterfaces of an interface are not
initialized before the interface is initialized.
For my own curiosity, I tried it and, as expected, the interface InterfaceType was not initialized.
public class Example {
    public static void main(String[] args) throws Exception {
        InterfaceType foo = new InterfaceTypeImpl();
        foo.method();
    }
}

class InterfaceTypeImpl implements InterfaceType {
    @Override
    public void method() {
        System.out.println("implemented method");
    }
}

class ClassInitializer {
    static {
        System.out.println("static initializer");
    }
}

interface InterfaceType {
    public static final ClassInitializer init = new ClassInitializer();
    public void method();
}
This program prints
implemented method
However, if the interface declares a default method, then initialization does occur. Consider the InterfaceType interface given as
interface InterfaceType {
    public static final ClassInitializer init = new ClassInitializer();
    public default void method() {
        System.out.println("default method");
    }
}
then the same program above would print
static initializer
implemented method
In other words, the static field of the interface is initialized (step 9 in the Detailed Initialization Procedure) and the static initializer of the type being initialized is executed. This means that the interface was initialized.
I could not find anything in the JLS to indicate that this should happen. Don't get me wrong, I understand that this should happen in case the implementing class doesn't provide an implementation for the method, but what if it does? Is this condition missing from the Java Language Specification, did I miss something, or am I interpreting it wrongly?
This is a very interesting issue!
It seems like JLS section 12.4.1 ought to cover this definitively. However, the behavior of the Oracle JDK and OpenJDK (javac and HotSpot) differs from what's specified there. In particular, Example 12.4.1-3 from this section covers interface initialization. The example is as follows:
interface I {
    int i = 1, ii = Test.out("ii", 2);
}

interface J extends I {
    int j = Test.out("j", 3), jj = Test.out("jj", 4);
}

interface K extends J {
    int k = Test.out("k", 5);
}

class Test {
    public static void main(String[] args) {
        System.out.println(J.i);
        System.out.println(K.j);
    }

    static int out(String s, int i) {
        System.out.println(s + "=" + i);
        return i;
    }
}
Its expected output is:
1
j=3
jj=4
3
and indeed I get the expected output. However, if a default method is added to interface I,
interface I {
    int i = 1, ii = Test.out("ii", 2);
    default void method() { } // causes initialization!
}
the output changes to:
1
ii=2
j=3
jj=4
3
which clearly indicates that interface I is being initialized where it wasn't before! The mere presence of the default method is enough to trigger the initialization. The default method doesn't have to be called, overridden, or even mentioned; conversely, the presence of an abstract method alone does not trigger initialization.
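For reference, a minimal variation illustrating that last observation (the Test driver class is unchanged from the JLS example above):
interface I {
    int i = 1, ii = Test.out("ii", 2);

    void method();               // abstract method only: output stays 1, j=3, jj=4, 3
    // default void method() { } // a default method instead would add "ii=2", as shown above
}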
My speculation is that the HotSpot implementation wanted to avoid adding class/interface initialization checking into the critical path of the invokevirtual call. Prior to Java 8 and default methods, invokevirtual could never end up executing code in an interface, so this didn't arise. One might think this is part of the class/interface preparation stage (JLS 12.3.2) which initializes things like method tables. But perhaps this went too far and accidentally did full initialization instead.
I've raised this question on the OpenJDK compiler-dev mailing list. There's been a reply from Alex Buckley (editor of the JLS) in which he raises more questions directed at the JVM and lambda implementation teams. He also notes that there's a bug in the spec: the clause "T is a class and a static method declared by T is invoked" should also apply if T is an interface. So, it might be that there are both specification and HotSpot bugs here.
Disclosure: I work for Oracle on OpenJDK. If people think this gives me an unfair advantage at getting the bounty attached to this question, I'm willing to be flexible about it.
The interface is not initialized because its field InterfaceType.init, which is initialized with a non-constant value (a constructor call), is never used anywhere.
It is known at compile time that this field of the interface is not used anywhere, and the interface does not contain any default method (in Java 8), so there is no need to initialize or load the interface.
The interface will be initialized in the following cases:
a (non-constant) static field of the interface is used in your code;
the interface contains a default method (Java 8).
In the case of default methods: you are implementing InterfaceType, so if InterfaceType contains any default methods, they are inherited (that is, used) by the implementing class, and initialization comes into the picture.
But if you access a constant field of the interface (one initialized with a compile-time constant value), interface initialization is not required.
Consider following code.
public class Example {
    public static void main(String[] args) throws Exception {
        InterfaceType foo = new InterfaceTypeImpl();
        System.out.println(InterfaceType.init);
        foo.method();
    }
}

class InterfaceTypeImpl implements InterfaceType {
    @Override
    public void method() {
        System.out.println("implemented method");
    }
}

class ClassInitializer {
    static {
        System.out.println("static initializer");
    }
}

interface InterfaceType {
    public static final ClassInitializer init = new ClassInitializer();
    public void method();
}
In the above case, the interface will be initialized and loaded because you are using the field InterfaceType.init.
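For contrast, here is a sketch (reusing ClassInitializer from the question, with a hypothetical constant field added) showing that reading a true compile-time constant does not trigger interface initialization, because such reads are inlined into the using class by the compiler:
interface InterfaceType {
    ClassInitializer init = new ClassInitializer(); // non-constant: reading it triggers initialization
    int MAX_MEMBERS = 42;                           // compile-time constant

    void method();
}

class ConstantAccess {
    public static void main(String[] args) {
        // Prints only "42"; "static initializer" does NOT appear,
        // because MAX_MEMBERS is a constant variable and is inlined.
        System.out.println(InterfaceType.MAX_MEMBERS);
    }
}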
I am not giving the default method example, as you have already given it in your question.
The language specification and an example are given in JLS 12.4.1 (the example does not contain default methods).
I cannot find anything in the JLS about default methods here; there may be two possibilities:
the JLS authors forgot to consider the case of default methods (a specification bug), or
they simply treat default methods as non-constant members of the interface (but this is mentioned nowhere, which is again a specification bug).
The instanceKlass.cpp file from the OpenJDK contains the initialization method InstanceKlass::initialize_impl that corresponds to the Detailed Initialization Procedure in the JLS, which is analogously found in the Initialization section in the JVM Spec.
It contains a new step that is mentioned neither in the JLS nor in the JVM book that is referred to in the code:
// refer to the JVM book page 47 for description of steps
...
if (this_oop->has_default_methods()) {
  // Step 7.5: initialize any interfaces which have default methods
  for (int i = 0; i < this_oop->local_interfaces()->length(); ++i) {
    Klass* iface = this_oop->local_interfaces()->at(i);
    InstanceKlass* ik = InstanceKlass::cast(iface);
    if (ik->has_default_methods() && ik->should_be_initialized()) {
      ik->initialize(THREAD);
      ....
    }
  }
}
So this initialization has been implemented explicitly as a new Step 7.5. This indicates that this implementation followed some specification, but it seems that the written specification on the website has not been updated accordingly.
EDIT: As a reference, the commit (from October 2012!) where the respective step has been included in the implementation: http://hg.openjdk.java.net/jdk8/build/hotspot/rev/4735d2c84362
EDIT2: Coincidentally, I found this Document about default methods in hotspot which contains an interesting side note at the end:
3.7 Miscellaneous
Because interfaces now have bytecode in them, we must initialize them at the
time that an implementing class is initialized.
I'll try to make the case that interface initialization should not cause side effects that subtypes depend on; therefore, whether this is a bug or not, and whichever way Java fixes it, the order in which interfaces are initialized should not matter to the application.
In the case of a class, it is well accepted that it can cause side effects that subclasses depend on. For example
class Foo {
    static {
        Bank.deposit($1000);
        // ...
    }
}
Any subclass of Foo would expect to see $1000 in the bank anywhere in the subclass code. That is why the superclass is initialized before the subclass.
Shouldn't we do the same thing for superinterfaces as well? Unfortunately, the order of superinterfaces is not supposed to be significant, so there is no well-defined order in which to initialize them.
So we had better not establish this kind of side effect in interface initialization. After all, interfaces are not meant for these features (static fields/methods) that we pile on for convenience.
Therefore, if we follow that principle, the order in which interfaces are initialized will be of no concern to us.
In English, a homograph pair is two words that have the same spelling but different meanings.
In software engineering, a pair of homographic methods is two methods with the same name but different requirements. Let's see a contrived example to make the question as clear as possible:
interface I1 {
    /** return 1 */
    int f();
}

interface I2 {
    /** return 2 */
    int f();
}
interface I12 extends I1, I2 {}
How can I implement I12? C# has a way to do this, but Java doesn't. So the only way around is a hack. How can it be done with reflection/bytecode tricks/etc. most reliably (i.e. it doesn't have to be a perfect solution, I just want the one that works best)?
Note that some existing closed source massive piece of legacy code which I cannot legally reverse engineer requires a parameter of type I12 and delegates the I12 both to code that has I1 as a parameter, and code that has I2 as a parameter. So basically I need to make an instance of I12 that knows when it should act as I1 and when it should act as I2, which I believe can be done by looking at the bytecode at runtime of the immediate caller. We can assume that no reflection is used by the callers, because this is straightforward code. The problem is that the author of I12 didn't expect that Java merges f from both interfaces, so now I have to come up with the best hack around the problem. Nothing calls I12.f (obviously if the author wrote some code that actually calls I12.f, he would have noticed the problem before selling it).
Note that I'm actually looking for an answer to this question, not how to restructure the code that I can't change. I'm looking for the best heuristic possible or an exact solution if one exists. See Gray's answer for a valid example (I'm sure there are more robust solutions).
Here is a concrete example of how the problem of homographic methods within two interfaces can happen. And here is another concrete example:
I have the following 6 simple classes/interfaces. It resembles a business around a theater and the artists who perform in it. For simplicity and to be specific, let's assume they are all created by different people.
Set represents a set, as in set theory:
interface Set {
    /** Complements this set,
        i.e.: all elements in the set are removed,
        and all other elements in the universe are added. */
    public void complement();

    /** Remove an arbitrary element from the set */
    public void remove();

    public boolean empty();
}
HRDepartment uses Set to represent employees. It uses a sophisticated process to decide which employees to hire/fire:
import java.util.Random;

class HRDepartment {
    private Random random = new Random();
    private Set employees;

    public HRDepartment(Set employees) {
        this.employees = employees;
    }

    public void doHiringAndLayingoffProcess() {
        if (random.nextBoolean())
            employees.complement();
        else
            employees.remove();

        if (employees.empty())
            employees.complement();
    }
}
The universe of a Set of employees would probably be the employees who have applied to the employer. So when complement is called on that set, all the existing employees are fired, and all the other ones that applied previously are hired.
Artist represents an artist, such as a musician or an actor. An artist has an ego. This ego can increase when others compliment him:
interface Artist {
    /** Complements the artist. Increases ego. */
    public void complement();

    public int getEgo();
}
Theater makes an Artist perform, which possibly causes the Artist to be complemented. The theater's audience can judge the artist between performances. The higher the ego of the performer, the more likely the audience will like the Artist, but if the ego goes beyond a certain point, the artist will be viewed negatively by the audience:
import java.util.Random;

public class Theater {
    private Artist artist;
    private Random random = new Random();

    public Theater(Artist artist) {
        this.artist = artist;
    }

    public void perform() {
        if (random.nextBoolean())
            artist.complement();
    }

    public boolean judge() {
        int ego = artist.getEgo();
        if (ego > 10)
            return false;
        return (ego - random.nextInt(15) > 0);
    }
}
ArtistSet is simply an Artist and a Set:
/** A set of associated artists, e.g: a band. */
interface ArtistSet extends Set, Artist {
}
TheaterManager runs the show. If the theater's audience judges the artist negatively, the theater talks to the HR department, which will in turn fire artists, hire new ones, etc:
class TheaterManager {
    private Theater theater;
    private HRDepartment hr;

    public TheaterManager(ArtistSet artists) {
        this.theater = new Theater(artists);
        this.hr = new HRDepartment(artists);
    }

    public void runShow() {
        theater.perform();
        if (!theater.judge()) {
            hr.doHiringAndLayingoffProcess();
        }
    }
}
The problem becomes clear once you try to implement an ArtistSet: both superinterfaces specify that complement should do something else, so you have to implement two complement methods with the same signature within the same class, somehow. Artist.complement is a homograph of Set.complement.
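To make the clash concrete, here is a sketch (the class name and method bodies are placeholders, not from the question) of what any implementation is forced into:
class Band implements ArtistSet {
    // Only ONE complement() is possible, yet it must serve both the Set
    // meaning (invert membership) and the Artist meaning (praise the artist).
    @Override
    public void complement() {
        // which behavior?
    }

    @Override public void remove()   { /* ... */ }
    @Override public boolean empty() { return true; }
    @Override public int getEgo()    { return 0; }
}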
New idea, kinda messy...
public class MyArtistSet implements ArtistSet {
    public void complement() {
        StackTraceElement[] stackTraceElements = Thread.currentThread().getStackTrace();
        // the last element in stackTraceElements is the least recent method invocation
        // so we want the one near the top, probably index 1, but you might have to play
        // with it to figure it out: could do something like this
        boolean callCameFromHR = false;
        boolean callCameFromTheater = false;
        for (int i = 0; i < 3; i++) {
            if (stackTraceElements[i].getClassName().contains("Theater")) {
                callCameFromTheater = true;
            }
            if (stackTraceElements[i].getClassName().contains("HRDepartment")) {
                callCameFromHR = true;
            }
        }
        if (callCameFromHR && callCameFromTheater) {
            // problem
        } else if (callCameFromHR) {
            // respond one way
        } else if (callCameFromTheater) {
            // respond another way
        } else {
            // it didn't come from either
        }
    }

    // remove(), empty(), and getEgo() still need to be implemented
    // for this class to compile; they are omitted here for brevity.
}
Despite Gray Kemmey's valiant attempt, I would say the problem as you have stated it is not solvable. As a general rule, given an ArtistSet, you cannot know whether the code calling it was expecting an Artist or a Set.
Furthermore, even if you could, according to your comments on various other answers, you actually have a requirement to pass an ArtistSet to a vendor-supplied function, meaning that function has not given the compiler or humans any clue as to what it is expecting. You are completely out of luck for any sort of technically correct answer.
As practical programming matter for getting the job done, I would do the following (in this order):
File a bug report with whoever created an interface requiring ArtistSet and whoever generated the ArtistSet interface itself.
File a support request with the vendor supplying the function requiring an ArtistSet and ask them what they expect the behavior of complement() to be.
Implement the complement() function to throw an exception.
public class Sybil implements ArtistSet {
    public void complement() {
        throw new UnsupportedOperationException("What am I supposed to do?");
    }
    ...
}
Because seriously, you don't know what to do. What would be the correct thing to do when called like this (and how do you know for sure)?
class TalentAgent {
    public void pr(ArtistSet artistSet) {
        artistSet.complement();
    }
}
By throwing an exception you have a chance at getting a stack trace that gives you a clue as to which of the two behaviors the caller is expecting. With luck nobody calls that function, which is why the vendor got as far as shipping code with this problem. With less luck but still some, they handle the exception. If not even that, well, at least now you will have a stack trace you can review to decide what the caller was really expecting and possibly implement that (though I shudder to think of perpetuating a bug that way; I've explained how I would do it in this other answer).
BTW, for the rest of the implementation I would delegate everything to actual Artist and Set objects passed in via the constructor so this can be easily pulled apart later.
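A sketch of what that delegation could look like, fleshing out the Sybil class above (the constructor parameters are assumptions, and the ambiguous method still fails loudly until the vendor clarifies its contract):
public class Sybil implements ArtistSet {
    private final Artist artistDelegate;
    private final Set setDelegate;

    public Sybil(Artist artistDelegate, Set setDelegate) {
        this.artistDelegate = artistDelegate;
        this.setDelegate = setDelegate;
    }

    // The ambiguous method: we genuinely don't know which contract applies.
    @Override public void complement() {
        throw new UnsupportedOperationException("What am I supposed to do?");
    }

    // Everything unambiguous is delegated, so the class is easy to pull apart later.
    @Override public int getEgo()    { return artistDelegate.getEgo(); }
    @Override public void remove()   { setDelegate.remove(); }
    @Override public boolean empty() { return setDelegate.empty(); }
}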
How to Solve For Your Specific Case
ArtistSet is simply an Artist and a Set:
/** A set of associated artists, e.g: a band. */
interface ArtistSet extends Set, Artist { }
From an OO perspective, that's not a useful declaration. An Artist is a type of noun, a "thing" that has defined properties and actions (methods).
A Set is an aggregate of things - a collection of unique elements. Instead, try:
ArtistSet is simply a Set of Artists.
/** A set of associated artists, e.g: a band. */
interface ArtistSet extends Set<Artist> { };
Then, for your particular case, the homonym methods are on interfaces that are never combined within the one type, so you have no clash and can program away...
Further, you don't need to declare ArtistSet because you aren't actually extending Set with any new declarations. You're just instantiating a type parameter, so you can replace all usage with Set<Artist>.
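For illustration, a minimal standalone sketch using java.util.Set directly (the HiringExample class and its methods are hypothetical; Artist is the interface from the question). The two complement declarations never meet on one type, so there is no clash:
import java.util.HashSet;
import java.util.Set;

class HiringExample {
    // Set operations stay on the collection, Artist operations stay on the element.
    void hire(Set<Artist> band, Artist newMember) {
        band.add(newMember);     // java.util.Set behavior
        newMember.complement();  // Artist behavior ([SIC] "compliment")
    }

    Set<Artist> newBand() {
        return new HashSet<>();
    }
}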
How to Solve For the More General Case
For this clash the method names don't even need to be homographic in the English-language sense; they can be the same word with the same English meaning, used in different contexts in Java. A clash occurs if you have two interfaces that you wish to apply to a type but they contain the same declaration (e.g. method signature) with conflicting semantic/processing definitions.
Java does not allow you to implement the behaviour you request, so you must have an alternative work-around. Java doesn't allow a class to provide multiple implementations for the same method signature from multiple different interfaces (implementing the same method multiple times with some form of qualification/alias/annotation to distinguish). See Java overriding two interfaces, clash of method names and Java - Method name collision in interface implementation.
Avoid use of Inheritance (extends or implements) and instead use Object Composition (see http://en.wikipedia.org/wiki/Composition_over_inheritance).
E.g. If you have the following
interface TV {
    void switchOn();
    void switchOff();
    void changeChannel(int channelNumber);
}

interface Video {
    void switchOn();
    void switchOff();
    void eject();
    void play();
    void stop();
}
Then if you have an object that is both of these things, you can combine the two in a new interface (optional) or type:
interface TVVideo {
    TV getTv();
    Video getVideo();
}

class TVVideoImpl implements TVVideo {
    TV tv;
    Video video;

    public TVVideoImpl() {
        tv = new SomeTVImpl(....);
        video = new SomeVideoImpl(....);
    }

    public TV getTv() { return tv; }
    public Video getVideo() { return video; }
}
How can I implement a class which has two superinterfaces having homographic methods?
In Java, a class which has two superinterfaces having homographic methods is considered to have only one implementation of this method. (See the Java Language Specification section 8.4.8). This allows classes to conveniently inherit from multiple interfaces that all implement the same other interface and only implement the function once. This also simplifies the language because this eliminates the need for syntax and method dispatching support for distinguishing between homographic methods based on which interface they came from.
So the correct way to implement a class which has two superinterfaces having homographic methods is to provide a single method that satisfies the contracts of both superinterfaces.
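For example (hypothetical interfaces, not from the question): when the two documented contracts are compatible, a single method can honor both.
interface Stoppable {
    /** Stop accepting new work. */
    void shutDown();
}

interface Releasable {
    /** Release any held resources. */
    void shutDown();
}

class Worker implements Stoppable, Releasable {
    // One implementation that satisfies both documented contracts:
    // it stops accepting work, then releases resources.
    @Override
    public void shutDown() {
        // stop accepting work, then release resources
    }
}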
C# has a way to do this. How can it be done in Java? Is there no construct for this?
C# defines interfaces differently than Java does and therefore has capabilities that Java does not.
In Java, the language construct is defined to mean that all interfaces get the same single implementation of homographic methods. There is no Java language construct for creating alternate behaviors of multiply-inherited interface functions based on the compile time class of the object. This was a conscious choice made by the Java language designers.
If not, how can it be done with reflection/bytecode tricks/etc most reliably?
"It" cannot be done with reflection/bytecode tricks because the information needed to decide which interface's version of the homographic method to choose is not necessarily present in the Java source code. Given:
interface I1 {
    // return ASCII character code of first character of String s
    int f(String s); // f("Hello") returns 72
}

interface I2 {
    // return number of characters in String s
    int f(String s); // f("Hello") returns 5
}

interface I12 extends I1, I2 {}

public class C {
    public static int f1(I1 i, String s) { return i.f(s); }    // f1( i, "Hi") == 72
    public static int f2(I2 i, String s) { return i.f(s); }    // f2( i, "Hi") == 2
    public static int f12(I12 i, String s) { return i.f(s); }  // f12(i, "Hi") == ???
}
According to the Java language specification, a class implementing I12 must do so in such a way that C.f1(), C.f2(), and C.f12() return the exact same result when called with the same arguments. If C.f12(i, "Hello") sometimes returned 72 and sometimes returned 5 based on how C.f12() were called, that would be a serious bug in the program and a violation of the language specification.
Furthermore, if the author of class C expected some kind of consistent behavior out of f12(), there is no bytecode or other information in class C that indicates whether it should be the behavior of I1.f(s) or I2.f(s). If the author of C.f12() had in mind C.f("Hello") should return 5 or 72, there's no way to tell from looking at the code.
Fine, so I cannot in general provide different behaviors for homographic functions using bytecode tricks, but I really have a class like my example class TheaterManager. What should I do to implement ArtistSet.complement()?
The actual answer to the actual question you asked is to create your own substitute implementation of TheaterManager that does not require an ArtistSet. You do not need to change the library's implementation, you need to write your own.
The actual answer to the other example question you cite is basically "delegate I12.f() to I2.f()" because no function that receives an I12 object goes on to pass that object to a function expecting an I1 object.
Stack Overflow is only for questions and answers of general interest
One of the stated reasons to reject a question here is that "it is only relevant to an extraordinarily narrow situation that is not generally applicable to the worldwide audience of the internet." Because we want to be helpful, the preferred way to handle such narrow questions is to revise the question to be more broadly applicable. For this question I have taken the approach of answering the broadly applicable version of the question rather than actually editing the question to remove what makes it unique to your situation.
In the real world of commercial programming any Java library that has a broken interface like I12 would not accumulate even dozens of commercial clients unless it could be used by implementing I12.f() in one of these ways:
delegate to I1.f()
delegate to I2.f()
do nothing
throw an exception
pick one of the above strategies on a per-call basis based on the values of some members of the I12 object (see the sketch below)
If thousands or even only a handful of companies are using this part of this library in Java then you can be assured they have used one of those solutions. If the library is not in use by even a handful of companies then the question is too narrow for Stack Overflow.
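A hypothetical sketch of that last strategy, reusing the I1/I2/I12 declarations above: the object itself carries a flag saying which contract it honors.
class ConfigurableI12 implements I12 {
    private final boolean actAsI1;

    ConfigurableI12(boolean actAsI1) {
        this.actAsI1 = actAsI1;
    }

    @Override
    public int f(String s) {
        return actAsI1
                ? s.charAt(0)   // I1 contract: code of the first character
                : s.length();   // I2 contract: number of characters
    }
}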
OK, TheaterManager was an oversimplification. In the real case it is too hard for me to replace that class and I don't like any of the practical solutions you've outlined. Can't I just fix this with fancy JVM tricks?
It depends on what you want to fix. If you want to fix your specific library, you can intercept all the calls to I12.f(), parse the stack to determine the caller, and choose a behavior based on that. You can access the stack via Thread.currentThread().getStackTrace().
If you run across a caller you do not recognize you may have a hard time figuring out which version they want. For example you may be called from a generic (as was the actual case in the other specific example you gave), like:
import java.util.List;

public class TalentAgent<T extends Artist> {
    public void butterUp(List<T> people) {
        for (T a : people) {
            a.complement();
        }
    }
}
In Java, generics are implemented via erasure, meaning all type-parameter information is thrown away at compile time. There is no class or method signature difference between a TalentAgent<Artist> and a TalentAgent<Set>, and the formal type of the people parameter is just List. There is nothing in the class interface or method signature of the caller to tell you what to do by looking at the stack.
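A minimal, standalone demonstration of erasure (the class and variable names are just for illustration):
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    public static void main(String[] args) {
        List<String> strings = new ArrayList<>();
        List<Integer> numbers = new ArrayList<>();

        // Both print "class java.util.ArrayList": the type parameter is
        // erased at compile time, so reflection cannot tell them apart.
        System.out.println(strings.getClass());
        System.out.println(numbers.getClass());
        System.out.println(strings.getClass() == numbers.getClass()); // true
    }
}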
So you would need to implement multiple strategies, one of which would be decompiling the code of the calling method looking for clues that the caller is expecting one class or another. It would have to be very sophisticated to cover all the ways this could happen, because among other things you have no way of knowing in advance what class it actually expecting, only that it is expecting a class that implements one of the interfaces.
There are mature and extremely sophisticated open source bytecode utilities, including one that automatically generates a proxy for a given class at runtime (written long before there was support for that in the Java language), so the fact that there isn't an open source utility for handling this case speaks volumes about the ratio of effort to usefulness in pursuing this approach.
Okay, after much research, I have another idea to fully accommodate the situation. Since you can't directly modify their code... you can force the modifications yourself.
DISCLAIMER: The example code below is very simplified. My intention is to show the general method of how this might be done, not to produce functioning source code to do it (since that's a project in itself).
The issue is that the methods are homographic. So to solve it, we can just rename the methods. Simple, right? We can use the java.lang.instrument package to achieve this. As you'll see in the linked documentation, it allows you to make an "agent" which can directly modify classes as they're loaded, or re-modify them even if they've already been loaded.
Essentially, this requires you to make two classes:
An agent class which preprocesses and reloads classes; and,
A ClassFileTransformer implementation which specifies the changes you want to make.
The agent class must have either a premain() or agentmain() method defined, based on whether you want it to begin its processing as the JVM starts up or after it is already running. Examples of this are in the package documentation above. These methods give you access to an Instrumentation instance, which will allow you to register your ClassFileTransformer. So it might look something like this:
InterfaceFixAgent.java
import java.lang.instrument.Instrumentation;
import java.lang.instrument.UnmodifiableClassException;

public class InterfaceFixAgent {

    public static void premain(String agentArgs, Instrumentation inst) {
        // Register an ArtistTransformer
        inst.addTransformer(new ArtistTransformer());

        // In case the Artist interface or its subclasses
        // have already been loaded by the JVM
        try {
            for (Class<?> clazz : inst.getAllLoadedClasses()) {
                if (Artist.class.isAssignableFrom(clazz)) {
                    inst.retransformClasses(clazz);
                }
            }
        } catch (UnmodifiableClassException e) {
            // TODO logging
            e.printStackTrace();
        }
    }
}
ArtistTransformer.java
import java.lang.instrument.ClassFileTransformer;
import java.lang.instrument.IllegalClassFormatException;
import java.security.ProtectionDomain;

public class ArtistTransformer implements ClassFileTransformer {

    private static final byte[] BYTES_TO_REPLACE = "complement".getBytes();
    private static final byte[] BYTES_TO_INSERT = "compliment".getBytes();

    @Override
    public byte[] transform(ClassLoader loader, String className,
            Class<?> classBeingRedefined, ProtectionDomain protectionDomain,
            byte[] classfileBuffer) throws IllegalClassFormatException {
        if (Artist.class.isAssignableFrom(classBeingRedefined)) {
            // Loop through the classfileBuffer, find sequences of bytes
            // which match BYTES_TO_REPLACE, replace them with BYTES_TO_INSERT,
            // and return the modified buffer.
        }
        return classfileBuffer;
    }
}
This is, of course, simplified. It will replace the word "complement" with "compliment" in any class which extends or implements Artist, so you will very likely need to further conditionalize it (for example, if Artist.class.isAssignableFrom(classBeingRedefined) && Set.class.isAssignableFrom(classBeingRedefined), you obviously don't want to replace every instance of "complement" with "compliment", as the "complement" for Set is perfectly legitimate).
So, now we've corrected the Artist interface and its implementations. The typo is gone, the methods have two different names, so there is no homography. This allows us to have two different implementations in our CommunityTheatre class now, each of which will properly implement/override the methods from the ArtistSet.
Unfortunately, we've now created another (possibly even bigger) issue. We've just broken all the previously-legitimate references to complement() from classes implementing Artist. To fix this, we will need to create another ClassFileTransformer which replaces these calls with our new method name.
This is somewhat more difficult, but not impossible. Essentially, the new ClassFileTransformer (let's say we call it the OldComplementTransformer) will have to perform the following steps:
Find the same string of bytes as before (the one representing the old method name, "complement");
Get the bytes before this which represent the object reference calling the method;
Convert those bytes into an Object;
Check to see if that Object is an Artist; and,
If so, replace those bytes with the new method name.
Once you've made this second transformer, you can modify the InterfaceFixAgent to accommodate it. (I also simplified the retransformClasses() call, since in the example above we perform the needed check within the transformer itself.)
InterfaceFixAgent.java (modified)
import java.lang.instrument.Instrumentation;
import java.lang.instrument.UnmodifiableClassException;

public class InterfaceFixAgent {

    public static void premain(String agentArgs, Instrumentation inst) {
        // Register our transformers
        inst.addTransformer(new ArtistTransformer());
        inst.addTransformer(new OldComplementTransformer());

        // Retransform the classes that have already been loaded
        try {
            inst.retransformClasses(inst.getAllLoadedClasses());
        } catch (UnmodifiableClassException e) {
            // TODO logging
            e.printStackTrace();
        }
    }
}
And now... our program is good to go. It certainly wouldn't be easy to code, and it will be utter hell to QA and test. But it's certainly robust, and it solves the issue. (Technically, I suppose it avoids the issue by removing it, but... I'll take what I can get.)
Other ways we might have solved the problem:
The Unsafe API
A native method written in C
Both of these would allow you to directly manipulate bytes in memory. A solution could certainly be designed around these, but I believe it would be much more difficult and much less safe. So I went with the route above.
I think this solution could even be made more generic into an incredibly useful library for integrating code bases. Specify which interface and which method you need refactored in a variable, a command line argument, or a configuration file, and let her loose. The library that reconciles conflicting interfaces in Java at runtime. (Of course, I think it would still be better for everyone if they just fixed the bug in Java 8.)
Here's what I'd do to remove the ambiguity:
interface Artist {
    void complement(); // [SIC] from OP, really "compliment"
    int getEgo();
}

interface Set {
    void complement(); // as in Set Theory
    void remove();
    boolean empty();   // [SIC] from OP, I prefer: isEmpty()
}

/**
 * This class is to represent a Set of Artists (as a group) -OR-
 * act like a single Artist (with some aggregate behavior). I
 * choose to implement NEITHER interface so that a caller is
 * forced to designate, for any given operation, which type's
 * behavior is desired.
 */
class GroupOfArtists { // does NOT implement either

    private final Set setBehavior = new Set() {
        @Override public void remove() { /*...*/ }
        @Override public boolean empty() { return true; /* TODO */ }
        @Override public void complement() {
            // implement Set-specific behavior
        }
    };

    private final Artist artistBehavior = new Artist() {
        @Override public int getEgo() { return Integer.MAX_VALUE; /* TODO */ }
        @Override public void complement() {
            // implement Artist-specific behavior
        }
    };

    Set asSet() {
        return setBehavior;
    }

    Artist asArtist() {
        return artistBehavior;
    }
}
If I were passing this object to the HR department, I'd actually give it the value returned from asSet() to hire/fire the entire group.
If I were passing this object to the Theater for a performance, I'd actually give it the value returned from asArtist() to be treated as talent.
This works as long as YOU are in control of talking to the different components directly...
But I realize that your problem is a single third-party vendor has created a component, TheaterManager, that expects one object for both of these functions and it won't know about the asSet and asArtist methods. The problem is not with the vendors that created Set and Artist, it is the vendor that combined them instead of using a Visitor pattern or just specifying an interface that would mirror the asSet and asArtist methods I made above. If you can convince your one vendor "C" to fix that interface, your world will be a lot happier.
Good luck!
Dog, I have a strong feeling you are leaving out some details that are crucial to the solution. This often happens on SO because
people need to leave out a lot of details to get the question to a reasonable size and scope,
people do not fully understand the problem and the solution (which is why they are asking for help) so they cannot be sure which details are important and which are not, and
the reason the person cannot solve the problem on their own is because they do not understand the importance of this detail, which is the same reason they left it out.
I've said in another answer what I would do about ArtistSet. But keeping the above in mind, I will give you another solution to a slightly different problem. Let's say I had code from a bad vendor:
package com.bad;

public interface IAlpha {
    public String getName();

    // Sort Alphabetically by Name
    public int compareTo(IAlpha other);
}
This is bad because you should declare a function returning a Comparator<IAlpha> to implement the sorting strategy, but whatever. Now I get code from a worse company:
package com.worse;

import com.bad.IAlpha;

// an Alpha ordered by name length
public interface ISybil extends IAlpha, Comparable<IAlpha> {}
This is worse, because it is totally wrong, in that it overrides behavior incompatibly. An ISybil orders itself by name length, but an IAlpha orders itself alphabetically, except an ISybil is an IAlpha. They were misled by the anti-pattern of IAlpha when they could and should have done something like:
public interface ISybil extends IAlpha {
    public Comparator<IAlpha> getLengthComparator();
}
However, this situation is still much better than ArtistSet because here the expected behavior is documented. There is no confusion about what ISybil.compareTo() should do. So I would create classes as follows. A Sybil class that implements compareTo() as com.worse expects and delegates everything else:
package com.hack;

import com.bad.IAlpha;
import com.worse.ISybil;

public class Sybil implements ISybil {
    private final Alpha delegate;

    public Sybil(Alpha delegate) { this.delegate = delegate; }

    public Alpha getAlpha() { return delegate; }

    public String getName() { return delegate.getName(); }

    public int compareTo(IAlpha other) {
        return delegate.getName().length() - other.getName().length();
    }
}
and an Alpha class that works exactly like com.bad said it should:
package com.hack;

import com.bad.IAlpha;

public class Alpha implements IAlpha {
    private String name;
    private final Sybil sybil;

    public Alpha(String name) {
        this.name = name;
        this.sybil = new Sybil(this);
    }

    // Sort Alphabetically
    public int compareTo(IAlpha other) {
        return name.compareTo(other.getName());
    }

    public String getName() { return name; }

    public Sybil getSybil() { return sybil; }
}
Note that I included type conversion methods: Alpha.getSybil() and Sybil.getAlpha(). This is so I could create my own wrappers around any com.worse vendor's methods that take or return Sybils so I can avoid polluting my code or any other vendor's code with com.worse's breakage. So if com.worse had:
public ISybil breakage(ISybil broken);
I could write a function
public Alpha safeDelegateBreakage(Alpha alpha) {
    return breakage(alpha.getSybil()).getAlpha();
}
and be done with it, except I would still complain vociferously to com.worse and politely to com.bad.
Given the following code:
LinkedList list = mock(LinkedList.class);
doCallRealMethod().when(list).clear();
list.clear();
by executing this test, a NullPointerException is thrown from the first line of LinkedList#clear:
public void clear() {
    Entry<E> e = header.next;
    while (e != header) {
        Entry<E> next = e.next;
        // Code omitted.
but header has been instantiated before:
private transient Entry<E> header = new Entry<E>(null, null, null);
Could someone please explain what's happening during mock creation?
UPDATE:
Having read all the answers, especially Ajay's, I looked into the Objenesis source code and found out that it uses the Reflection API to create the proxy instance (through CGLIB), thereby bypassing all constructors in the hierarchy up to java.lang.Object.
Here is the sample code to simulate the issue:
import java.lang.reflect.Constructor;

import org.junit.Test;

import sun.reflect.ReflectionFactory;

public class ReflectionConstructorTest {

    @Test
    public void testAgain() {
        try {
            // java.lang.Object default constructor
            Constructor javaLangObjectConstructor = Object.class
                    .getConstructor((Class[]) null);
            Constructor mungedConstructor = ReflectionFactory
                    .getReflectionFactory()
                    .newConstructorForSerialization(CustomClient.class, javaLangObjectConstructor);
            mungedConstructor.setAccessible(true);

            // Creates new client instance without calling its constructor
            // Thus "name" is not initialized.
            Object client = mungedConstructor.newInstance((Object[]) null);

            // this will print "CustomClient"
            System.out.println(client.getClass());
            // this will print "CustomClient: null". name is null.
            System.out.println(client.toString());
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
class CustomClient {
    private String name;

    CustomClient() {
        System.out.println(this.getClass().getSimpleName() + " - Constructor");
        this.name = "My Name";
    }

    @Override
    public String toString() {
        return this.getClass().getSimpleName() + ": " + name;
    }
}
You are only asking Mockito to call the real thing on clear; the underlying object is still a fake created by Mockito for you. If you need a real LinkedList then just use a LinkedList; only the most heated purist of BDD would tell you to mock everything around you. I mean, you are not mocking Strings, are you?
The Mockito author himself has said that calling the real thing should be used sparingly, usually only for testing legacy code.
If you need to spy on the real object (track the invocations) then Mockito has a feature for this too:
List list = new LinkedList();
List spy = spy(list);
With spy, you can still stub a method if you need. It basically works like a mock, but isn't ;)
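A minimal sketch of stubbing on a spy (the literal values are illustrative only; doReturn is used so the real method isn't invoked while setting up the stub):
import static org.mockito.Mockito.*;

import java.util.LinkedList;
import java.util.List;

public class SpyExample {
    public static void main(String[] args) {
        List<String> spy = spy(new LinkedList<String>());

        spy.add("one");                  // real method runs, element is stored

        // Stub a single method; doReturn avoids calling the real size() here.
        doReturn(100).when(spy).size();

        System.out.println(spy.get(0));  // "one"  (real behavior)
        System.out.println(spy.size());  // 100    (stubbed)
        verify(spy).add("one");          // invocations are tracked like a mock
    }
}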
Your reasoning is flawless.
The key issue is that you are not operating on the actual LinkedList object. Here is what is happening behind the scenes:
The object that you are given by Mockito's mock() is a proxy object generated by the CGLIB Enhancer.
For me it is something like java.util.LinkedList$$EnhancerByMockitoWithCGLIB$$cae81a28,
which kind of acts like a Proxy, albeit with the fields set to default values (null, 0, etc.).
When you mock a class, the object you are using is a fake; therefore its fields are not initialized and the methods don't work as expected. You could use reflection to set a value for header, but I really wouldn't recommend this. As theadam said, the best thing to do would be to just use a real list.
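A minimal sketch of that recommendation (a hypothetical JUnit 4 test): use the real collection, which initializes header in its constructor, and reserve mocking for your own collaborators.
import static org.junit.Assert.assertTrue;

import java.util.LinkedList;
import java.util.List;

import org.junit.Test;

public class ClearTest {
    @Test
    public void clearEmptiesTheList() {
        List<String> list = new LinkedList<>(); // real list: header is initialized
        list.add("a");
        list.clear();                           // no NullPointerException
        assertTrue(list.isEmpty());
    }
}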