Mocking class with existing class in JMockit (replacement for `redefineMethods`) - java

I've got two classes as input and want to mock one with the other. That used to be very simple in JMockit; one just called
Mockit.redefineMethods(originalClass, mockingClass);
But in version 0.999 this deprecated method was removed. I need features of a newer version of JMockit, so I cannot use the older versions any more.
I gather from the deprecation message that the proposed "modern" way to do this is to define a MockUp<originalClass> and use it as the mockingClass.
Unfortunately, I get both values as input parameters at runtime (declared as Class<?>), so writing such a MockUp subclass by hand is not an option.
Is there any way to emulate what Mockit.redefineMethods() did before version 0.999, even if it might not be the most elegant solution to this issue?
EDIT
What I get as input is a Map<Class<?>, Class<?>> mockedClasses of classes to be mocked pointing to classes mocking them. These are then iterated over and passed to Mockit:
for (Map.Entry<Class<?>, Class<?>> entry : mockedClasses.entrySet()) {
Mockit.redefineMethods(entry.getKey(), entry.getValue());
}
After that, the test code is executed, then the mocking is disabled again, using restoreOriginalDefinition() instead of redefineMethods() in a similar way.

Ok, the question is clearer now. And the answer is that there is no way to mock a class with another arbitrary class; you have to define the mock-up class as a subclass of MockUp. The very old Mockit.redefineMethods(Class, Class) (removed from the API 4.5+ years ago) only accepted arbitrary classes because that initial API also had to support Java 1.4 test code, which has not been supported since 0.999, the release that started requiring generics and/or annotations.
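For reference, here's a minimal sketch of what that looks like with the MockUp API (the ServiceToFake class and the JUnit usage are made up for illustration; this targets the post-0.999 API):

import mockit.Mock;
import mockit.MockUp;
import org.junit.Assert;
import org.junit.Test;

public class ServiceToFakeTest {
    // Hypothetical class whose behaviour the test replaces
    static class ServiceToFake {
        String answer() { return "real"; }
    }

    @Test
    public void fakesTheMethod() {
        // Applying the fake redefines ServiceToFake.answer() for the rest of this test
        new MockUp<ServiceToFake>() {
            @Mock
            String answer() { return "fake"; }
        };

        Assert.assertEquals("fake", new ServiceToFake().answer());
    }
}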

Related

Is there a way to add a method to a class definition at runtime? [duplicate]

This question already has answers here:
Can a Java class add a method to itself at runtime?
(11 answers)
Closed 2 years ago.
There is a much better question (linked below). My question encouraged bad coding practices without outlining the risks of those practices.
Can a Java class add a method to itself at runtime?
The original question was the following:
Is there a way to add a method to a class definition at runtime?
For example, let's say I had the following interface:
public interface Singleton<T> {
    @StaticContract
    T getInstanceStatic();
}
At runtime, I would scan all classes for methods with the annotation "StaticContract" and add a static version of the implemented method to the class definition. However, I have no idea how I would go about doing this or if this is even possible.
In my current implementation, if runtime reflection doesn't find a static method for an annotated method during initialization, I throw a NoSuchMethodError. The big problem is that a developer might not know they are supposed to create a static method if they aren't familiar with the interface. A non-static getInstanceStatic() doesn't really make sense for singletons; it just serves as a reminder to create the static method.
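Roughly, the check in question might look like this (a simplified sketch, not the actual code):

import java.lang.reflect.Method;
import java.lang.reflect.Modifier;

// Hypothetical helper illustrating the check described above
static void requireStaticCounterpart(Class<?> clazz) {
    try {
        Method m = clazz.getDeclaredMethod("getInstanceStatic");
        if (!Modifier.isStatic(m.getModifiers())) {
            throw new NoSuchMethodError(clazz.getName() + ".getInstanceStatic() must be static");
        }
    } catch (NoSuchMethodException e) {
        throw new NoSuchMethodError(clazz.getName() + " declares no static getInstanceStatic()");
    }
}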
Combined with the ability to recover the erased type using reflection, this would allow me to use generics for far more than they were intended for. For example, you would no longer have to define and pass a factory object; you could just define the method in the class that the factory produces.
Also, if there isn't a way to do this during runtime, is there a way to do it during compile time?
What you want is possible!
But not like this. The answer to your actual question is a simple, flat out 'No'. But you don't want what you describe in your question.
Let me elaborate.
Let's first say that you could add methods at runtime. You can't*, but let's say you could.
That would accomplish nothing whatsoever; given:
public class Example implements Singleton<Example> {
    @StaticContract public Example getInstanceStatic() { return new Example(); }
}
We can already see issues here (this method is public; it has to be, that's the rule for implementing interface methods. But given that you want this to be a singleton, that'd be very bad news).
But let's carry on for a moment. The idea is that you want to be able to write, in other code:
Example.instance();
but - how? The compiler won't LET YOU do that, because the method isn't there, and if we go with your plan (of adding the method at runtime), then at compile time it'll never be there, and javac will refuse to compile this. If somehow it DID compile this, then at runtime, where you pull your magic trick and somehow add this method, all would be well, but that's a moot point - short of hacking together a class file with a bytecode editor, there's no way to obtain a class file with the compiled version of Example.instance().
You don't want to add this at runtime.
But maybe you want to add it at compile time.
And THAT? That you can do!
Strategy #1: Lombok
Project Lombok lets you write @UtilityClass, which makes a class act singleton-esque. Lombok intentionally does not have @Singleton because, as a concept, singletons are so universally derided as bad code style. I guess you could fork Lombok and add it if you must have this.
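For completeness, a minimal sketch of what @UtilityClass usage looks like (class and method names are made up):

import lombok.experimental.UtilityClass;

@UtilityClass
public class Greetings {
    // Lombok makes the class final, adds a private constructor,
    // and turns this method into a static one
    public String hello(String name) {
        return "Hello, " + name;
    }
}

// callers simply write: Greetings.hello("world")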
Strategy #2: Annotation Processors
Other than Lombok, annotation processors cannot add things to existing source files. But they can make new ones! Given this source file (actual bytes on disk):
@SingletonizeMe
public class Example {
    Example() {} // without lombok you're going to have to write this yourself to ensure nobody outside of the package can instantiate this...
}
then you can write an annotation processor so that javac will automatically produce this file:
// generated code
package same.pkg.as.your.example;
public class ExampleUtil {
public static final Example EXAMPLE_INSTANCE = new Example();
}
and compile it as part of the build, and any code that contains ExampleUtil.EXAMPLE_INSTANCE will just be compiled right along, without any complaints. Annotation Processors solve the problem of 'okay, maybe at runtime this would work but how do I explain to javac to just do what I want without it refusing to compile code that it thinks stands no chance of working at runtime?'.
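A rough sketch of such a processor, following the names in the example above (processor registration and error handling are left out; treat it as an outline rather than production code):

import java.io.IOException;
import java.io.Writer;
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic;

@SupportedAnnotationTypes("same.pkg.as.your.example.SingletonizeMe")
@SupportedSourceVersion(SourceVersion.RELEASE_8)
public class SingletonizeProcessor extends AbstractProcessor {
    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        for (TypeElement annotation : annotations) {
            for (Element annotated : roundEnv.getElementsAnnotatedWith(annotation)) {
                TypeElement type = (TypeElement) annotated;
                String pkg = processingEnv.getElementUtils().getPackageOf(type).getQualifiedName().toString();
                String simple = type.getSimpleName().toString();
                String utilName = simple + "Util";
                try (Writer w = processingEnv.getFiler()
                        .createSourceFile(pkg + "." + utilName, type).openWriter()) {
                    // Emit the generated companion class shown above
                    w.write("package " + pkg + ";\n"
                          + "public class " + utilName + " {\n"
                          + "    public static final " + simple + " "
                          + simple.toUpperCase() + "_INSTANCE = new " + simple + "();\n"
                          + "}\n");
                } catch (IOException e) {
                    processingEnv.getMessager().printMessage(Diagnostic.Kind.ERROR, e.toString(), annotated);
                }
            }
        }
        return true;
    }
}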
Strategy #3: Dependency injection systems
From Dagger to Spring to Guice, there are tons of libraries out there that do 'dependency injection', and pretty much all of them have an option to inject things singleton-style. Give those three libraries a quick look; it should be fairly obvious how that works once you follow their getting-started tutorials.
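As a hedged illustration of that option, a Guice-flavoured binding could look roughly like this (it assumes the Example constructor from above is reachable by the injector):

import com.google.inject.AbstractModule;
import com.google.inject.Guice;
import com.google.inject.Injector;
import com.google.inject.Scopes;

public class AppModule extends AbstractModule {
    @Override
    protected void configure() {
        // one shared Example instance per injector
        bind(Example.class).in(Scopes.SINGLETON);
    }
}

// Usage:
// Injector injector = Guice.createInjector(new AppModule());
// Example a = injector.getInstance(Example.class);
// Example b = injector.getInstance(Example.class); // same instance as a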
*) You'd think the answer is yes, what with instrumentation and the ability to use agent technology to reload a class file. But is that 'adding a method to a class'? No, it is not - it is reloading a class, which does not normally work if you try to add any new members; the hot code replace tech built into VMs doesn't let you change (or add, or remove) any signatures.

Java's default interface methods collide with private instance methods? [duplicate]

This question already has answers here:
Do Java 8 default methods break source compatibility?
(5 answers)
Closed 6 years ago.
So as far as I know, the main idea behind the new interface default methods of Java 8 is to support interface evolution, i.e. extending an interface without breaking existing implementations.
But what just occurred to me is that all these new default interface methods in the API actually have the potential to break existing code. Namely, my implementation breaks if, in a class, I am implementing an interface X, and that interface X now has a new default method with the same signature as some private instance method that already existed in my class. In this case the compiler thinks I'm overriding the interface method while reducing its visibility, which is not allowed. So what if I have some implementation of Iterable and came up with some private forEach utility method? Now when I update to Java 8 I can no longer compile.
Is it just me that is a bit shocked that Oracle actually released a not fully backward-compatible API update? Has something like this ever happened in the past, where upgrading to a new compilation version can make some of your code no longer compile? Because if so, I'm not aware of it. And what are your opinions on this?
edit: Oh wait, what I said might have a flaw... I mentioned the example with the Iterable#forEach method, but actually, this method takes a parameter whose type is also only introduced with Java 8, so there is no way I could have defined such a method previously. Now, my next question: could it be that ALL new default methods take some new type, to guarantee they cannot collide with any pre-Java-8 existing instance method?
Cheers
Oracle had to choose between letting the language and APIs stagnate and risking some backward incompatibilities. Yes, default methods can cause problems with existing extending interfaces and implementations. That's well known.
Has that already happened in the past? Yes: the JDBC interfaces have had new methods added several times; assert was not a keyword but has been one since Java 1.4; enum was not a keyword before 1.5; etc.
EDIT
Examples of backward incompatibilities:
If you have an interface MyCollection extending Collection and having a method stream(), it will conflict with the new default stream() method, because it has the same signature but a different return type.
If you have an interface or class extending/implementing List<E> and having a method void sort(Comparator<E> c), it will conflict with the new default method void sort(Comparator<? super E>).
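A minimal illustration of the first conflict (hypothetical interface name; this compiled before Java 8 and fails with a Java 8 compiler):

import java.util.Collection;

// Compiled fine before Java 8; fails afterwards because java.util.Collection
// now has a default stream() method with the same signature but an
// incompatible return type (java.util.stream.Stream<E>).
interface MyCollection<E> extends Collection<E> {
    String stream();
}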
So what if I have some implementation of Iterable and came up with some private forEach utility method?
This isn't a problem, because that would only overload the method. You can't have had a forEach(Consumer), as that interface didn't exist before.
Is it just me that is a bit shocked that Oracle actually released a not fully downwards-compatible API update?
In each major version there are changes which could break backward compatibility. In Java 5, the keyword enum was added, which meant that if you had a variable called enum, your code would break.
Has something like this ever happened in the past, that upgrading to a new compilation version can make some of your code no longer compile?
Some APIs have changed, one of the oldest changes was a fix to String.hashCode() in Java 1.2.
In my opinion this default-method feature should not be used by us Java developers. It was probably just the only way to extend the existing interfaces without breaking backward compatibility.
But this is just my opinion.

Kotlin reflection interoperability with Java

What are the caveats that a developer should be aware of while writing reflective code that works both with Java and Kotlin?
For example, I have an existing library that uses reflection and it works well with Java. However, when I use the same with Kotlin, my reflective code doesn't seem to pick up the annotations on fields.
Here are some of the differences that I noticed.
1. Acquiring a Class instance
// Example 1.1 - Java
Class<?> userClass = User.class; // From a class name
userClass = userInstance.getClass(); // OR from an instance
Getting a Java class instance in Kotlin
// Example 1.2 - Kotlin
val userClass = userInstance.javaClass // From an instance
I'm unable to use the .class facility or the .getClass() method in Kotlin as we do in Java.
2. Delegates
When I use delegated properties in a Kotlin class, the properties that I retrieve have the $delegate suffix. This is a bit contrary to the fields that we get in Java (I do understand Kotlin does not have fields, only properties). How does this affect meta-programming?
However, with delegates I see that most of the methods retain their behavior as they do in Java. Are there any other differences that I have to be aware of?
Making Java and Kotlin interoperable, for me, would require understanding point 1 discussed above, plus other limitations/differences that Kotlin brings to meta-programming.
For example, I have an existing library that uses reflection and it works well with Java. However, when I use the same with Kotlin, my reflective code doesn't seem to pick up the annotations on fields.
Can it be because the fields are private now?
Anyway, there are issues with annotations on fields at the moment; this will be fixed in one of the upcoming milestones.
Some other relevant issues:
https://youtrack.jetbrains.com/issue/KT-5967
https://youtrack.jetbrains.com/issue/KT-4169
https://youtrack.jetbrains.com/issue/KT-3625
I'm unable to use the .class facility or the .getClass() method in Kotlin as we do in Java.
Only the syntax is different: javaClass<C>() works exactly the same as C.class, and x.javaClass does the same thing as x.getClass()
When I use delegated properties in a Kotlin class, the properties that I retrieve have the $delegate suffix.
Minor correction: the fields have the $delegate suffix, not the properties.
However, with delegates I see that most of the methods retain their behavior as they do in Java. Are there any other differences that I have to be aware of?
The docs here give you a detailed description of how delegated properties are implemented.
Making Java and Kotlin interoperable, for me, would require understanding point 1 discussed above, plus other limitations/differences that Kotlin brings to meta-programming.
The more your Kotlin code resembles Java code, the smaller the difference from the reflection point of view. If you write idiomatic Kotlin, e.g. using default parameter values, traits, properties, delegates, top-level functions, extensions etc., the classes you get differ from idiomatic Java; otherwise they are closely aligned.

Does the Reflections library ignore the RetentionPolicy?

To understand Java annotations I tried some hands-on experiments and ended up with a few doubts; even after looking at the execution I am still confused. Here is what I am doing.
Define an annotation:
#Retention(RetentionPolicy.CLASS)
#Target(value=ElementType.TYPE)
public #interface Command {
}
Now I initialize the commands
Reflections reflections = new Reflections(CMDS_PACKAGE);
Set<Class<?>> allClasses = reflections.getTypesAnnotatedWith(Command.class); // line 2
for (Class<?> clazz : allClasses) {
    MYCommand cmd = (MYCommand) clazz.newInstance();
    System.out.println(cmd.getClass().getAnnotation(Command.class)); // line 6
    log.info("loading Command [ {} ]", clazz.getCanonicalName());
}
When I run the program, line 6 displays null.
When the policy is RetentionPolicy.RUNTIME, line 6 displays the correct Command.
During this whole process, line 2 still gives me the correct annotated classes irrespective of the policy. So does this mean that the Reflections library ignores the RetentionPolicy?
I am really confused, even after reading most of the tutorials.
The actual question for me is: why this difference in behaviour? When the annotation has the RetentionPolicy.CLASS policy, it should not have been available to me at runtime. Is my understanding wrong, or can anyone please share their valuable input on understanding both of these?
Yes, the Reflections library (note the s: Reflections, not Reflection) does ignore the visibility of annotations by default.
This can be changed using the org.reflections.adapters.JavassistAdapter#includeInvisibleTag flag. Something like:
JavassistAdapter mdAdapter = new JavassistAdapter();
mdAdapter.includeInvisibleTag = false;
new Reflections(new ConfigurationBuilder()
...
.setMetadataAdapter(mdAdapter)
...
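Put together with the setup from the question, the whole configuration might look roughly like this (the ClasspathHelper URL setup is an assumption; CMDS_PACKAGE is the constant from the question):

import org.reflections.Reflections;
import org.reflections.adapters.JavassistAdapter;
import org.reflections.util.ClasspathHelper;
import org.reflections.util.ConfigurationBuilder;

JavassistAdapter mdAdapter = new JavassistAdapter();
mdAdapter.includeInvisibleTag = false;

Reflections reflections = new Reflections(new ConfigurationBuilder()
        .setUrls(ClasspathHelper.forPackage(CMDS_PACKAGE)) // CMDS_PACKAGE from the question
        .setMetadataAdapter(mdAdapter));
// getTypesAnnotatedWith(Command.class) should now skip CLASS-retention annotations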
Another option would be to use the JavaReflectionAdapter instead.
HTH
In the first place, the RetentionPolicy dictates what a conforming compiler has to do. Annotations with RetentionPolicy.SOURCE do not make it into the class file, while the other two, RetentionPolicy.CLASS and RetentionPolicy.RUNTIME, are stored within the class file, but using different attributes so that a distinction can be made between them when reading the class file.
The documentation of RetentionPolicy.CLASS says:
Annotations are to be recorded in the class file by the compiler but need not be retained by the VM at run time. This is the default behavior.
Here, the responsibility is clearly documented, the VM shall not retain them and the Reflection API built into the JRE conforms to it. Though “need not be retained” does not sound like a strong requirement.
But third-party libraries like the Reflections library you are using are free to implement whatever they want when parsing a class file. Since the documentation for the method you have called simply says "get types annotated with a given annotation", the behavior isn't wrong, as the type does have that annotation.
And you are able to find out the RetentionPolicy of that annotation even before invoking that method, by analyzing the annotations of the annotation itself. So it makes little sense to invoke the method when you already know that the annotation has RetentionPolicy.CLASS, and then be bothered that the method does something instead of nothing.
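With the plain Reflection API that check is short (using the Command annotation from the question; a missing @Retention means CLASS by default):

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

Retention retention = Command.class.getAnnotation(Retention.class);
RetentionPolicy policy = (retention != null) ? retention.value() : RetentionPolicy.CLASS; // CLASS is the default
// only expect getAnnotation(...) on instances to work when policy == RetentionPolicy.RUNTIME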
But, of course, it would be better if that behavior was documented completely. So you might ask the author of that 3rd party library to improve the documentation.

Mocking of aggregates with any Java mocking framework

Are there any mocking frameworks that can do "full" mocking of every child in an aggregate? For example:
final Report report = createMock(Report.class);
expect(report.getReportSides().get(0).getSideGroup().get(1)).andStubReturn(createSomething());
I want this call with these indices to be mocked without me having to do anything else, and before I start to write some massive testing code... is this possible in any framework - EasyMock, PowerMock, Mockito, etc.?
(The class example is a legacy class auto-generated from a customers XML, hence the weird class structure, and the absence of domain service layer).
I'm sure you know it's strongly advised not to mock values, but with legacy code there can be funky stuff.
Anyway, the following declaration might do a large part of the job:
mock(Report.class, RETURNS_DEEP_STUBS)
However, you seem to have collections in your aggregate (report.getReportSides().get(0).getSideGroup().get(1)), and due to generic type erasure Mockito and other frameworks cannot infer the runtime type that should be in those collections, so the RETURNS_DEEP_STUBS answer will create a mock matching the return type read through reflection, which in the case of Java collections will most likely be a mock of Object itself. So you'll have to deal with those manually.
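For example, one way to do that by hand, assuming the element types are called ReportSide and SideGroup (the real names come from your generated classes):

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import java.util.Arrays;
import java.util.Collections;

Report report = mock(Report.class);
ReportSide side = mock(ReportSide.class);
SideGroup wanted = createSomething(); // the value you want at index 1

when(report.getReportSides()).thenReturn(Collections.singletonList(side));
when(side.getSideGroup()).thenReturn(Arrays.asList(mock(SideGroup.class), wanted));

// report.getReportSides().get(0).getSideGroup().get(1) now returns `wanted`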
As a side note, there has been progress on generic types in the Mockito trunk: it can retrieve more of the generic information embedded in the class. It's clearly nowhere near full runtime introspection (impossible with current versions of Java), but it gets closer.
With the upgraded RETURNS_DEEP_STUBS you could do:
public interface A<K extends MyKeyType> extends Map<K, MyValueType> {}
deepStubMock.entrySet().iterator().next()
.getValue().someValueTypeMethod().eventuallyFollowedByAnotherMethod();
EDIT : looks like David answered before me in the comment :)
