So I have a class Foo that should eventually adjust and reload classes. It has a method for that, too:
private void redefineClass(String classname, byte[] bytecode) {
    ClassFileLocator cfl = ClassFileLocator.Simple.of(classname, bytecode);
    Class<?> clazz;
    try {
        clazz = Class.forName(classname);
    } catch (ClassNotFoundException e) {
        throw new RuntimeException(e);
    }
    Debug._print("REDEFINING %s", clazz.getName());
    new ByteBuddy()
            .redefine(clazz, cfl)
            .make()
            .load(clazz.getClassLoader(), ClassReloadingStrategy.fromInstalledAgent());
}
To test it, I simply load the classes from .class files into a byte[] (using ASM):
private byte[] getBytecode(String classname) {
    try {
        Path p = Paths.get(LayoutConstants.SRC_DIR).resolve(classname.replace(".", "/") + ".class");
        File f = p.toFile();
        InputStream is = new FileInputStream(f);
        ClassReader cr = new ClassReader(is);
        ClassWriter cw = new ClassWriter(cr, 0);
        cr.accept(cw, 0);
        return cw.toByteArray();
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
}
and pass it on to redefineClass above.
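For completeness, the wiring between the two helpers is essentially just this (a hypothetical driver; the class name is taken from the failing case below):
String classname = "parc.util.Vector$1";
redefineClass(classname, getBytecode(classname));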
Seems to work for quite a few classes ... not for all, though:
REDEFINING parc.util.Vector$1
Exception in thread "Thread-0" java.lang.InternalError: Enclosing method not found
at java.lang.Class.getEnclosingMethod(Class.java:952)
at sun.reflect.generics.scope.ClassScope.computeEnclosingScope(ClassScope.java:50)
at sun.reflect.generics.scope.AbstractScope.getEnclosingScope(AbstractScope.java:74)
at sun.reflect.generics.scope.AbstractScope.lookup(AbstractScope.java:90)
at sun.reflect.generics.factory.CoreReflectionFactory.findTypeVariable(CoreReflectionFactory.java:110)
at sun.reflect.generics.visitor.Reifier.visitTypeVariableSignature(Reifier.java:165)
at sun.reflect.generics.tree.TypeVariableSignature.accept(TypeVariableSignature.java:43)
at sun.reflect.generics.visitor.Reifier.reifyTypeArguments(Reifier.java:68)
at sun.reflect.generics.visitor.Reifier.visitClassTypeSignature(Reifier.java:138)
at sun.reflect.generics.tree.ClassTypeSignature.accept(ClassTypeSignature.java:49)
at sun.reflect.generics.repository.ClassRepository.getSuperInterfaces(ClassRepository.java:100)
at java.lang.Class.getGenericInterfaces(Class.java:814)
at net.bytebuddy.description.type.TypeList$Generic$OfLoadedInterfaceTypes$TypeProjection.resolve(TypeList.java:722)
at net.bytebuddy.description.type.TypeDescription$Generic$LazyProjection.accept(TypeDescription.java:5308)
at net.bytebuddy.description.type.TypeList$Generic$AbstractBase.accept(TypeList.java:249)
at net.bytebuddy.dynamic.scaffold.InstrumentedType$Factory$Default$1.represent(InstrumentedType.java:221)
at net.bytebuddy.ByteBuddy.redefine(ByteBuddy.java:698)
at net.bytebuddy.ByteBuddy.redefine(ByteBuddy.java:676)
at parc.Foo.redefineClass(Foo.java:137)
Disassembling Vector$1 gives me class Vector$1 implements java/util/Enumeration, which indicates it's this class:
/**
 * Returns an enumeration of the components of this vector. The
 * returned {@code Enumeration} object will generate all items in
 * this vector. The first item generated is the item at index {@code 0},
 * then the item at index {@code 1}, and so on.
 *
 * @return an enumeration of the components of this vector
 * @see Iterator
 */
public Enumeration<E> elements() {
    return new Enumeration<E>() {
        int count = 0;

        public boolean hasMoreElements() {
            return count < elementCount;
        }

        public E nextElement() {
            synchronized (Vector.this) {
                if (count < elementCount) {
                    return elementData(count++);
                }
            }
            throw new NoSuchElementException("Vector Enumeration");
        }
    };
}
except I still have no idea what to do with that information.
For some reason the instrumented code that was saved to file can be loaded and used but can't be REloaded.
How do I find out why?
EDIT: I should mention that the project I'm working on requires Java 7.
I tested several Java versions and could not find any problems with Class.getEnclosingMethod and Class.getGenericInterfaces for a local class implementing a generic interface, like in the Vector.elements()/Enumeration<E> case. Perhaps the problems arise because the class file has already been manipulated.
But it seems that whatever the ByteBuddy frontend is doing under the hood involving Class.getGenericInterfaces is just overkill for your use case, as you already have the intended resulting bytecode.
I suggest going one level down and use
ClassReloadingStrategy s = ClassReloadingStrategy.fromInstalledAgent();
s.load(clazz.getClassLoader(),
Collections.singletonMap(new TypeDescription.ForLoadedType(clazz), bytecode));
to skip these operations and just activate your byte code.
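Put together, the asker's redefineClass could then look roughly like this (just a sketch; it reuses the classname/bytecode parameters from the question and the agent-based strategy from above):
private void redefineClass(String classname, byte[] bytecode) {
    Class<?> clazz;
    try {
        clazz = Class.forName(classname);
    } catch (ClassNotFoundException e) {
        throw new RuntimeException(e);
    }
    ClassReloadingStrategy s = ClassReloadingStrategy.fromInstalledAgent();
    s.load(clazz.getClassLoader(),
            Collections.singletonMap(new TypeDescription.ForLoadedType(clazz), bytecode));
}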
When the class loading strategy is based on ClassReloadingStrategy.Strategy.REDEFINITION you can also use
ClassReloadingStrategy s = ClassReloadingStrategy.fromInstalledAgent();
s.reset(ClassFileLocator.Simple.of(classname, bytecode), clazz);
as it will use the bytecode retrieved through the ClassFileLocator as base.
Looking at the byte-buddy code, I assume that ClassReloadingStrategy.fromInstalledAgent() will return a ClassReloadingStrategy configured with Strategy.REDEFINITION, which does not support anonymous classes. Use Strategy.RETRANSFORMATION instead.
ClassReloadingStrategy strat = new ClassReloadingStrategy(
        (Instrumentation) ClassLoader.getSystemClassLoader()
                .loadClass("net.bytebuddy.agent.Installer")
                .getMethod("getInstrumentation")
                .invoke(null),
        Strategy.RETRANSFORMATION);
You may consider filing a bug report; the default behavior does not match the comment, which says that the default is Strategy.RETRANSFORMATION.
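If the byte-buddy-agent artifact is already on the classpath, the reflective lookup of the Installer class can probably be avoided; the following is only a sketch and assumes a Byte Buddy version that exposes ByteBuddyAgent.getInstrumentation():
ClassReloadingStrategy strat = new ClassReloadingStrategy(
        ByteBuddyAgent.getInstrumentation(),
        Strategy.RETRANSFORMATION);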
Related
I would like to know how to create a Closure object at run-time from within a Java application, where the content of the Closure is not known ahead of time. I have found a solution but I doubt that it is optimal.
Background: I have written some Groovy code that parses a Domain Specific Language. The parsing code is statically compiled and included in a Java application. In the parser implementation I have classes acting as delegates for specific sections of the DSL. These classes are invoked using the following pattern:
class DslDelegate {
    private Configuration configuration

    def section(@DelegatesTo(SectionDelegate) Closure cl) {
        cl.delegate = new SectionDelegate(configuration)
        cl.resolveStrategy = Closure.DELEGATE_FIRST
        cl()
    }
}
I wish to call such a method directly from Java code. I am able to create a new DslDelegate object and then invoke the section() method. However I need to create and pass an argument that is an instance of Closure. I want the content to be initialised from a String object.
My Solution: The following Java code (utility) is working but I am asking for improvements. Surely this can be done in a cleaner or more efficient manner?
/**
 * Build a Groovy Closure dynamically.
 *
 * @param strings
 *            an array of strings for the text of the Closure
 * @return a Groovy Closure comprising the specified text from {@code strings}
 * @throws IOException
 */
public Closure<?> buildClosure(String... strings) throws IOException {
    Closure<?> closure = null;
    // Create a method returning a closure
    StringBuilder sb = new StringBuilder("def closure() { { script -> ");
    sb.append(String.join("\n", strings));
    sb.append(" } }");
    // Create an anonymous class for the method
    GroovyClassLoader loader = new GroovyClassLoader();
    Class<?> groovyClass = loader.parseClass(sb.toString());
    try {
        // Create an instance of the class
        GroovyObject groovyObject = (GroovyObject) groovyClass.newInstance();
        // Invoke the object's method and thus obtain the closure
        closure = (Closure<?>) groovyObject.invokeMethod("closure", null);
    } catch (InstantiationException | IllegalAccessException e) {
        throw new RuntimeException(e);
    } finally {
        loader.close();
    }
    return closure;
}
You can use GroovyShell to create a Closure from strings:
public Closure<?> buildClosure(String... strings) {
    String scriptText = "{ script -> " + String.join("\n", strings) + " }";
    return (Closure<?>) new GroovyShell().evaluate(scriptText);
}
Thanks to @hzpz I've solved a similar task, but I made it a bit more elegant and easier to use. In my case the closure might accept any arguments, so I put the argument list into the closure's code. Let's say the closure is dynamically created in a String and looks like this:
script1 = 'out,a,b,c-> out.println "a=${a}; b=${b}; c=${c}"; return a+b+c;'
Now, create a new method on the String class:
String.metaClass.toClosure = {
return (Closure) new GroovyShell().evaluate("{${delegate}}")
}
Now I can create a closure from a String, a file, or anything else.
println script1.toClosure()(out,1,2,3)
or
println (new File('/folder/script1.groovy')).getText('UTF-8').toClosure()(out,1,2,3)
If I call the toBytecode() method in my context, it throws
java.lang.RuntimeException: remaper.by.moofMonkey.Main class is frozen
at javassist.CtClassType.checkModify(CtClassType.java:515)
at javassist.CtClass.getClassFile(CtClass.java:524)
at com.moofMonkey.Main.writeFile(Main.java:340)
at com.moofMonkey.Main.saveClasses(Main.java:324)
at com.moofMonkey.Main.main(Main.java:309)
My context:
.....
for (CtClass cl : modClasses) {
    cl.stopPruning(true);
    writeFile(cl, "./ModifiedClasses"); // cl.writeFile("./ModifiedClasses");
    cl.stopPruning(false);
}
.....
public static void writeFile(CtClass cl, String directoryName) throws Throwable {
    System.out.println(">> " + cl.getName());
    byte[] bc = cl.toBytecode();
    String s = cl.getClassFile().getSourceFile();
    int index = new String(bc).indexOf(s);
    for (int i = 0; i < s.length(); i++) // KILL SOURCEFILE (c) moofMonkey
        bc[index + i] = '-';
    DataOutputStream out = cl.makeFileOutput(directoryName);
    out.write(bc);
    out.flush();
    out.close();
}
But if I call the analog of writeFile(), cl.writeFile(), everything works!
I can do this:
1. Save the file
2. Read the bytes from it
3. Do what I need
4. Save the file
Having a look at the javadoc of CtClass.toBytecode() reveals:
Once this method is called, further modifications are not possible any more.
If you change the call order to
String s = cl.getClassFile().getSourceFile();
byte[] bc = cl.toBytecode();
you can call toBytecode.
The exception isn't coming where you call toBytecode but in the next source line, where you call getClassFile. The documentation says that you aren't allowed to call this on a frozen class.
There's a method called getClassFile2 which seems intended to work around this problem:
Returns a class file for this class (read only). Normal applications do not need calling this method. Use getClassFile().
The ClassFile object obtained by this method is read only. Changes to this object might not be reflected on a class file generated by toBytecode(), toClass(), etc.
This method is available even if isFrozen() is true. However, if the class is frozen, it might be also pruned.
The first paragraph suggests that if there's some way to restructure your code so it doesn't need to get a class file for a frozen class, that might be better (or at least better-thought-of by the creators of Javassist).
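As a sketch of that restructuring, the first two lines of writeFile() could read the source file name through the read-only view instead, which is allowed even on frozen classes:
byte[] bc = cl.toBytecode();
// getClassFile2() returns a read-only ClassFile and works even when the class is frozen
String s = cl.getClassFile2().getSourceFile();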
Motivation:
To aid in remote debugging (Java), it's useful to be able to request remote servers to send over arbitrary objects to my local machine for inspection. However, this means that the remote server must be able to serialize an arbitrary java object that is not known in advance at runtime.
In particular, I would like to be able to serialize even those objects which don't implement Serializable. I stumbled upon JBossSerialization, which claimed that with JBossSerialization...
...You can serialize classes that are not implementing Serializable
Great! And even better, I managed to find the code that supposedly demonstrates how to do this.
Problem
So pinching the code from schabell.org, I wrote a quick test to check that I could serialize and deserialize without problems:
import org.jboss.serial.io.JBossObjectInputStream;
import org.jboss.serial.io.JBossObjectOutputStream;
import java.io.*;

class MyObj { // Test class which doesn't implement Serializable
    public int x;
    MyObj(int x) { this.x = x; }
}

public class SerializationTest {
    public static void main(String[] args) {
        MyObj obj = new MyObj(1);
        byte[] byteArray = getByteArrayFromObject(obj); // Try to serialize
        MyObj result = (MyObj) getObjectFromByteArray(byteArray); // Try to deserialize
        System.out.println(result.x);
    }

    // Code that I pinched from website below (http://www.schabell.org/2009/03/jboss-serialization-simple-example.html):
    public static Object getObjectFromByteArray(byte[] bytes) {
        Object result = null;
        try {
            ByteArrayInputStream bais = new ByteArrayInputStream(bytes);
            ObjectInputStream ois = new JBossObjectInputStream(bais);
            result = ois.readObject(); // ERROR HERE!!!
            ois.close();
        } catch (IOException ioEx) {
            ioEx.printStackTrace();
        } catch (ClassNotFoundException cnfEx) {
            cnfEx.printStackTrace();
        }
        return result;
    }

    public static byte[] getByteArrayFromObject(Object obj) {
        byte[] result = null;
        try {
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            ObjectOutputStream oos = new JBossObjectOutputStream(baos);
            oos.writeObject(obj);
            oos.flush();
            oos.close();
            baos.close();
            result = baos.toByteArray();
        } catch (IOException ioEx) {
            ioEx.printStackTrace();
        }
        return result;
    }
}
The problem is that the test failed. Debugging indicated that I could only serialize, but not deserialize. The call to ois.readObject() on line 26 is the culprit and gives a SerializationException:
org.jboss.serial.exception.SerializationException: Could not create instance of MyObj - MyObj
at org.jboss.serial.classmetamodel.ClassMetaData.newInstance(ClassMetaData.java:342)
at org.jboss.serial.persister.RegularObjectPersister.readData(RegularObjectPersister.java:239)
at org.jboss.serial.objectmetamodel.ObjectDescriptorFactory.readObjectDescriptionFromStreaming(ObjectDescriptorFactory.java:412)
at org.jboss.serial.objectmetamodel.ObjectDescriptorFactory.objectFromDescription(ObjectDescriptorFactory.java:82)
at org.jboss.serial.objectmetamodel.DataContainer$DataContainerDirectInput.readObject(DataContainer.java:643)
at org.jboss.serial.io.JBossObjectInputStream.readObjectOverride(JBossObjectInputStream.java:163)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:364)
at SerializationTest.getObjectFromByteArray(SerializationTest.java:44)
at SerializationTest.main(SerializationTest.java:15)
Caused by: java.lang.InstantiationException: MyObj
at java.lang.Class.newInstance(Class.java:359)
at org.jboss.serial.classmetamodel.ClassMetaData.newInstance(ClassMetaData.java:334)
... 8 more
Does anyone know what's going wrong here and how I can get round this?
Or indeed if JBossSerialization isn't the right tool for this, what is?
Edit:
As @Dima points out, the SerializationException is caused by a lack of a public default constructor of the MyObj class. However, adding a default constructor to MyObj isn't an option as I'd like to be able to serialize arbitrary objects, including those without a default constructor.
Well, it is actually impossible to do what you want in a way that would be both safe and universal.
You can take a look at Kryo, as someone suggested in comments as well. It does have a way to instantiate objects without invoking a constructor, but it is off by default and there is a good reason for it.
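For reference, here is roughly what that opt-in looks like with Kryo plus Objenesis; the exact class names differ between Kryo versions, so treat this as a sketch rather than the definitive API:
Kryo kryo = new Kryo();
// Opt in to constructor-less instantiation; this is exactly the behavior
// the rest of this answer warns about.
kryo.setInstantiatorStrategy(
        new Kryo.DefaultInstantiatorStrategy(new StdInstantiatorStrategy()));

ByteArrayOutputStream baos = new ByteArrayOutputStream();
Output output = new Output(baos);
kryo.writeObject(output, new MyObj(1));
output.close();

Input input = new Input(new ByteArrayInputStream(baos.toByteArray()));
MyObj copy = kryo.readObject(input, MyObj.class); // no constructor runs here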
Consider this for example:
public class CanonicalObject {
    public static HashMap<String, CanonicalObject> canons = new HashMap<>();

    public String name;

    private CanonicalObject(String name) {
        this.name = name;
        canons.put(name, this);
    }

    public static synchronized CanonicalObject getCanonicalInstance(String name) {
        CanonicalObject co = canons.get(name);
        return co == null ? new CanonicalObject(name) : co;
    }
}
(This is a "semi-real-life" example, in that there are real uses for this pattern. I am aware of the "memory leak"; there are ways to avoid it in real applications, but they are irrelevant to this example, so I am just ignoring that issue for the sake of simplicity.)
If you serialize an instance of this object, then when you deserialize it on the other end, the whole "canonicalization" part will be skipped, which can cause subtle problems in the application that are really hard to diagnose, such as comparisons like if(canon1 != canon2) fireMissile() resulting in "friendly fire" and, possibly, World War III.
Note that the problem here is broader than just a constructor not being invoked by deserialization: the canons.put call could very well be put into getCanonicalInstance() instead of the constructor, and that would present the same problem even if the constructor was invoked.
This is an illustration of why, as a matter of policy, you should not be serializing objects that are not designed to be serialized. It can sometimes work, but when it does not, it results in situations that are really hard to detect, and usually even harder to fix.
I can't seem to find the answer anywhere. I'm trying to obtain a socket in Java and hand over its file descriptor number so that I can use it in a C binary (the fd would be passed as an argument).
I've obtained the FileDescriptor using reflection... but can't access the actual number anywhere.
I know other people have suggested JNI, but I'd like to keep it within Java if possible (and couldn't fully figure out how to do it)
In Java 7, you can cast a SocketInputStream to a FileInputStream, and call getFD() to get the FileDescriptor object.
Then you can use reflection to access the FileDescriptor object's private int fd field. (You use the Class.getDeclaredField(...) method to get the Field, call Field.setAccessible(true), and then get the field's value using Field.getInt(...).)
Beware that you may be making your code platform dependent by doing this. There are no guarantees that the particular private field will be present in older ... or forth-coming versions of Java, or in implementations of Java done by other vendors / suppliers.
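Putting those two steps together, here is a rough sketch of a hypothetical helper (Oracle/OpenJDK 7 specific; it relies on SocketInputStream extending FileInputStream and on the private fd field being present):
static int getSocketFd(Socket socket) throws Exception {
    // Works because the JDK 7 SocketInputStream extends FileInputStream
    FileInputStream in = (FileInputStream) socket.getInputStream();
    FileDescriptor fdObj = in.getFD();
    // Read the private "int fd" field reflectively
    Field fdField = FileDescriptor.class.getDeclaredField("fd");
    fdField.setAccessible(true);
    return fdField.getInt(fdObj);
}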
Stephen C's answer addresses how to get a FileDescriptor, but here's a method to get the file descriptor number from that object. On Windows, FileDescriptor uses a long handle instead of an int fd internally, so this method first checks whether handle is used and returns that if so; otherwise it falls back to returning fd. I haven't tested this with sockets as the OP is using, but I imagine Windows JVMs still use handle.
public static long fileno(FileDescriptor fd) throws IOException {
    try {
        if (fd.valid()) {
            // windows builds use long handle
            long fileno = getFileDescriptorField(fd, "handle", false);
            if (fileno != -1) {
                return fileno;
            }
            // unix builds use int fd
            return getFileDescriptorField(fd, "fd", true);
        }
    } catch (IllegalAccessException e) {
        throw new IOException("unable to access handle/fd fields in FileDescriptor", e);
    } catch (NoSuchFieldException e) {
        throw new IOException("FileDescriptor in this JVM lacks handle/fd fields", e);
    }
    return -1;
}

private static long getFileDescriptorField(FileDescriptor fd, String fieldName, boolean isInt)
        throws NoSuchFieldException, IllegalAccessException {
    Field field = FileDescriptor.class.getDeclaredField(fieldName);
    field.setAccessible(true);
    long value = isInt ? field.getInt(fd) : field.getLong(fd);
    field.setAccessible(false);
    return value;
}
I'm verifying that a function was called using Mockito, but Mockito is telling me that the function I'm verifying was never called and that other functions were called. But it seems to me that I'm calling the right function...
Here's the stack trace for the error I'm getting:
Wanted but not invoked:
relationshipAutoIndexer.getAutoIndex();
-> at org.whispercomm.manes.server.graph.DataServiceImplTest.testInitIndices(DataServiceImplTest.java:117)
However, there were other interactions with this mock:
-> at org.whispercomm.manes.server.graph.DataServiceImpl.updateIndexProperties(DataServiceImpl.java:136)
-> at org.whispercomm.manes.server.graph.DataServiceImpl.updateIndexProperties(DataServiceImpl.java:144)
-> at org.whispercomm.manes.server.graph.DataServiceImpl.updateIndexProperties(DataServiceImpl.java:148)
-> at org.whispercomm.manes.server.graph.DataServiceImpl.updateIndexProperties(DataServiceImpl.java:149)
-> at org.whispercomm.manes.server.graph.DataServiceImpl.initIndices(DataServiceImpl.java:121)
at org.whispercomm.manes.server.graph.DataServiceImplTest.testInitIndices(DataServiceImplTest.java:117)
It occurs at
verify(relAutoIndexer).getAutoIndex();
of the test class code shown below.
Here is my code (I have a tendency to leave things out by accident. Please ask me for any code you think I'm missing and I'll add it):
public DataServiceImpl(GraphDatabaseService graphDb) {
    super();
    this.graphDb = graphDb;
    unarchivedParent = new UnarchivedParent(graphDb.createNode());
    archivedParent = new ArchivedParent(graphDb.createNode());
    packetParent = new PacketParent(graphDb.createNode());
    userParent = new UserParent(graphDb.createNode());
    this.initIndices();
}
/**
 * Initializes the node and relationship indexes.
 *
 * Updates the set of indexed properties to match {@link DataServiceImpl}
 * .NODE_KEYS_INDEXABLE and {@link DataServiceImpl}.REL_KEYS_INDEXABLE.
 *
 * Note: auto indices can also be configured at database creation time and
 * just retrieved at runtime. We might want to switch to that later.
 */
private void initIndices() {
    /* Get the auto-indexers */
    AutoIndexer<Node> nodeAutoIndexer = this.graphDb.index()
            .getNodeAutoIndexer();
    AutoIndexer<Relationship> relAutoIndexer = this.graphDb.index()
            .getRelationshipAutoIndexer();
    this.updateIndexProperties(nodeAutoIndexer,
            DataServiceImpl.NODE_KEYS_INDEXABLE);
    this.nodeIndex = nodeAutoIndexer.getAutoIndex();
    this.updateIndexProperties(relAutoIndexer,
            DataServiceImpl.REL_KEYS_INDEXABLE);
    this.relIndex = relAutoIndexer.getAutoIndex();
}
/**
 * Sets the indexed properties of an {@link AutoIndexer} to the specified
 * set, removing old properties and adding new ones.
 *
 * @param autoIndexer
 *            the AutoIndexer to update.
 * @param properties
 *            the properties to be indexed.
 * @return autoIndexer, the given AutoIndexer (useful for chaining calls).
 */
private <T extends PropertyContainer> AutoIndexer<T> updateIndexProperties(
        AutoIndexer<T> autoIndexer, Set<String> properties) {
    Set<String> indexedProps = autoIndexer.getAutoIndexedProperties();
    // Remove unneeded properties.
    for (String prop : difference(indexedProps, properties)) {
        autoIndexer.stopAutoIndexingProperty(prop);
    }
    // Add new properties.
    for (String prop : difference(properties, indexedProps)) {
        autoIndexer.startAutoIndexingProperty(prop);
    }
    // Enable the index, if needed.
    if (!autoIndexer.isEnabled()) {
        autoIndexer.setEnabled(true);
    }
    return autoIndexer;
}
And here's the code for the test class:
@Before
public void setup() {
    nA = mock(Node.class);
    nB = mock(Node.class);
    packetA = new PacketWrapper(nA);
    packetB = new PacketWrapper(nB);
    RelA = mock(Relationship.class);
    RelB = mock(Relationship.class);
    graphDb = mock(GraphDatabaseService.class);
    nodeAutoIndexer = (AutoIndexer<Node>) mock(AutoIndexer.class);
    relAutoIndexer = mock(RelationshipAutoIndexer.class);
}

@After
public void tearDown() {
    packetA = null;
    packetB = null;
}
/*
 * ---------------- Test initIndices() ---------------
 */
// TODO
@Test
public void testInitIndices() throws IllegalArgumentException, IllegalAccessException, InvocationTargetException, NoSuchMethodException {
    IndexManager indexManager = mock(IndexManager.class);
    when(graphDb.index()).thenReturn(indexManager);
    when(indexManager.getNodeAutoIndexer()).thenReturn(nodeAutoIndexer);
    when(graphDb.index().getRelationshipAutoIndexer()).thenReturn(relAutoIndexer);
    dataService = new DataServiceImpl(graphDb);
    verify(nodeAutoIndexer, atLeastOnce()).getAutoIndex();
    verify(relAutoIndexer).getAutoIndex();
}
Mockito, up to version 1.8.5, had a bug in the case of polymorphic dispatch. It was fixed, and the fix is available in the first release candidate of version 1.9.0. See issue 200.
So how does this happen in your code base? Note you are mocking these two types:
nodeAutoIndexer = (AutoIndexer<Node>) mock(AutoIndexer.class);
relAutoIndexer = mock(RelationshipAutoIndexer.class);
AutoIndexer happens to be a generic parent interface; in this interface there is the method ReadableIndex<T> getAutoIndex(). RelationshipAutoIndexer is a subtype of AutoIndexer where the generic part is fixed to Relationship, and it overrides the getAutoIndex() method to return the covariant type ReadableRelationshipIndex.
See AutoIndexer and RelationshipAutoIndexer.
Well, in your calling code you have these lines:
AutoIndexer<Node> nodeAutoIndexer = this.graphDb.index().getNodeAutoIndexer();
AutoIndexer<Relationship> relAutoIndexer = this.graphDb.index().getRelationshipAutoIndexer();
this.nodeIndex = nodeAutoIndexer.getAutoIndex();
this.relIndex = relAutoIndexer.getAutoIndex();
Both nodeAutoIndexer in your production code and the mock nodeAutoIndexer in your test code have a reference of type AutoIndexer<Node>, so there's no problem regarding polymorphic dispatch.
However, relAutoIndexer in your production code is referenced by the type AutoIndexer<Relationship>, while the mock relAutoIndexer in your test code is referenced by the type RelationshipAutoIndexer, so the wrong call is registered on the mock and verification then fails.
Your solution is either to upgrade the Mockito version (1.9.0 RC1 is very stable and a final release should be coming your way), or to migrate your reference type (in your production code) from:
AutoIndexer<Relationship> relAutoIndexer = this.graphDb.index().getRelationshipAutoIndexer();
to:
RelationshipAutoIndexer relAutoIndexer = this.graphDb.index().getRelationshipAutoIndexer();
A few other remarks.
You don't actually need to write an @After method here, as JUnit creates a new test instance for each method run, so your method just adds cleanup that would effectively happen anyway. Note this isn't the case with TestNG.
Instead of creating your mocks in the before method, you might want to use Mockito annotations. Don't forget the runner.
For example:
@RunWith(MockitoJUnitRunner.class)
public class YourTest {
    @Mock SomeType someTypeMock;

    // ...
}
The stubbing code is a bit ugly for several reasons.
You should write consistent stubs.
Why not write this in a cleaner way, for example referencing indexManager in both cases:
IndexManager indexManager = mock(IndexManager.class);
when(graphDb.index()).thenReturn(indexManager);
when(indexManager.getNodeAutoIndexer()).thenReturn(nodeAutoIndexer);
when(indexManager.getRelationshipAutoIndexer()).thenReturn(relAutoIndexer);
Or don't reference it at all:
IndexManager indexManager = mock(IndexManager.class);
when(graphDb.index()).thenReturn(indexManager);
when(graphDb.index().getNodeAutoIndexer()).thenReturn(nodeAutoIndexer);
when(graphDb.index().getRelationshipAutoIndexer()).thenReturn(relAutoIndexer);
Also, having a mock that returns a mock is usually a sign of a design smell: you are breaking the Law of Demeter, and breaking it means you will experience difficult testing, bad maintainability, and difficult evolution. Put plainly, it will cost you money. Don't write legacy code! If you are practicing TDD or BDD, you will identify these issues at design time for your own code, which helps prevent them.
However, if you are dealing with legacy code, you can use the deep-stubs syntax.
Using the static method you could write this:
GraphDatabaseService graphdb = mock(GraphDatabaseService.class, RETURNS_DEEP_STUBS);
Or using the annotation you could write this:
#Mock(answer = RETURNS_DEEP_STUBS) GraphDatabaseService graphdb;
And the stubbing:
when(graphDb.index().getNodeAutoIndexer()).thenReturn(nodeAutoIndexer);
when(graphDb.index().getRelationshipAutoIndexer()).thenReturn(relAutoIndexer);