Is there any way to let FindBugs check and warn me if a CheckForNull annotation is present on the implementation of a method in a class, but not on the declaration of the method in the interface?
import javax.annotation.CheckForNull;
interface Foo {
    public String getBar();
}

class FooImpl implements Foo {
    @CheckForNull
    @Override
    public String getBar() {
        return null;
    }
}

public class FindBugsDemo {
    public static void main(String[] args) {
        Foo foo = new FooImpl();
        System.out.println(foo.getBar().length());
    }
}
I just discovered a bug in my application due to a missing null check that was not spotted by FindBugs, because @CheckForNull was only present on FooImpl but not on Foo, and I don't want to hunt down all other occurrences of this problem manually.
Yes, you can write your own detectors and package them in your own plugin jar file. See fb-contrib.sf.net as an example of an auxiliary FindBugs plugin.
The one issue is that FindBugs relies on BCEL 5.2, which doesn't have annotation support yet. You still get the attributes passed to you that represent the annotations, but it's more manual coding than will be the case in 5.3 (or whatever the next version of BCEL is).
Related
I have written some unit tests for a static method. The static method takes only one argument. The argument's type is a final class. In terms of code:
public class Utility {
    public static Optional<String> getName(Customer customer) {
        // method's body.
    }
}

public final class Customer {
    // class definition
}
So for the Utility class I have created a test class UtilityTests in which I have written tests for this method, getName. The unit testing framework is TestNG and the mocking library that is used is Mockito. So a typical test has the following structure:
public class UtilityTests {

    @Test
    public void getNameTest() {
        // Arrange
        Customer customerMock = Mockito.mock(Customer.class);
        Mockito.when(...).thenReturn(...);

        // Act
        Optional<String> name = Utility.getName(customerMock);

        // Assert
        Assert.assertTrue(...);
    }
}
What is the problem?
Whereas the tests run successfully locally, inside IntelliJ, they fail on Jenkins (when I push my code to the remote branch, a build is triggered and the unit tests run at the end). The error message is something like the following:
org.mockito.exceptions.base.MockitoException:
Cannot mock/spy class com.packagename.Customer
Mockito cannot mock/spy because :
 - final class
What have I tried?
I searched a bit in order to find a solution, but I didn't manage to. I note here that I am not allowed to change the fact that Customer is a final class. In addition, I would like, if possible, not to change its design at all (e.g. creating an interface that would hold the methods I want to mock and stating that the Customer class implements that interface, as Jose correctly pointed out in his comment). The thing I tried is the second option mentioned at mockito-final. Despite the fact that this fixed the problem, it broke some other unit tests that cannot be fixed in any apparent way.
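For reference, the second option mentioned there is, as far as I understand it, Mockito's inline mock maker, which is enabled by adding a plain-text resource file (path shown for a standard Maven layout):

src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker

containing the single line:

mock-maker-inline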
Questions
So here are the two questions I have:
How is that possible in the first place? Shouldn't the test fail both locally and on Jenkins?
How can this be fixed, given the constraints I mentioned above?
Thanks in advance for any help.
An alternative approach would be to use the 'method to class' pattern.
Move the methods out of the Customer class into another class/classes, say CustomerSomething, e.g. CustomerFinances (or whatever its responsibility is); see the sketch after the link below.
Add a constructor to Customer.
Now you don't need to mock Customer, just the CustomerSomething class! You may not need to mock that either if it has no external dependencies.
Here's a good blog on the topic: https://simpleprogrammer.com/back-to-basics-mock-eliminating-patterns/
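A rough sketch of that refactoring (the field and class names here, such as CustomerFinances and balance, are made up purely for illustration):

public final class Customer {
    private final double balance;

    public Customer(double balance) { // plain data holder with a constructor, nothing to mock
        this.balance = balance;
    }

    public double getBalance() {
        return balance;
    }
}

// The behavior that used to live on Customer moves here. This class is not final,
// so it can be mocked, and since it has no external dependencies it usually
// doesn't need to be mocked at all.
class CustomerFinances {
    public boolean isOverdrawn(Customer customer) {
        return customer.getBalance() < 0;
    }
}

In a test you can then construct a real Customer via its constructor and either use CustomerFinances directly or mock it.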
How is that possible in the first place? Shouldn't the test fail both locally and on Jenkins?
It's obviously environment-specific; the only question is how to determine the cause of the difference.
I'd suggest checking the org.mockito.internal.util.MockUtil#typeMockabilityOf method and comparing which mockMaker is actually used in each environment, and why.
If the mockMaker is the same, compare the classes loaded by the IDE client vs. the Jenkins client: do they differ at the time of test execution?
How can this be fixed, given the constraints I mentioned above?
The following code is written assuming OpenJDK 12 and Mockito 2.28.2, but I believe you can adjust it to whichever versions you actually use.
import static org.junit.Assert.assertEquals;

import java.util.Optional;
import org.junit.Rule;
import org.junit.Test;
import org.mockito.Mockito;
import org.mockito.junit.MockitoJUnit;
import org.mockito.junit.MockitoRule;

public class UtilityTest {

    @Rule
    public InlineMocksRule inlineMocksRule = new InlineMocksRule();

    @Rule
    public MockitoRule mockitoRule = MockitoJUnit.rule();

    @Test
    public void testFinalClass() {
        // Given
        String testName = "Ainz Ooal Gown";
        Client client = Mockito.mock(Client.class);
        Mockito.when(client.getName()).thenReturn(testName);

        // When
        String name = Utility.getName(client).orElseThrow();

        // Then
        assertEquals(testName, name);
    }

    static final class Client {
        final String getName() {
            return "text";
        }
    }

    static final class Utility {
        static Optional<String> getName(Client client) {
            return Optional.ofNullable(client).map(Client::getName);
        }
    }
}
With a separate rule for inline mocks:
import org.junit.rules.TestRule;
import org.junit.runner.Description;
import org.junit.runners.model.Statement;
import org.mockito.internal.configuration.plugins.Plugins;
import org.mockito.internal.util.MockUtil;
import java.lang.invoke.MethodHandles;
import java.lang.invoke.VarHandle;
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
public class InlineMocksRule implements TestRule {

    private static Field MOCK_MAKER_FIELD;

    static {
        try {
            MethodHandles.Lookup lookup = MethodHandles.privateLookupIn(Field.class, MethodHandles.lookup());
            VarHandle modifiers = lookup.findVarHandle(Field.class, "modifiers", int.class);

            MOCK_MAKER_FIELD = MockUtil.class.getDeclaredField("mockMaker");
            MOCK_MAKER_FIELD.setAccessible(true);

            int mods = MOCK_MAKER_FIELD.getModifiers();
            if (Modifier.isFinal(mods)) {
                modifiers.set(MOCK_MAKER_FIELD, mods & ~Modifier.FINAL);
            }
        } catch (IllegalAccessException | NoSuchFieldException ex) {
            throw new RuntimeException(ex);
        }
    }

    @Override
    public Statement apply(Statement base, Description description) {
        return new Statement() {
            @Override
            public void evaluate() throws Throwable {
                Object oldMaker = MOCK_MAKER_FIELD.get(null);
                MOCK_MAKER_FIELD.set(null, Plugins.getPlugins().getInlineMockMaker());
                try {
                    base.evaluate();
                } finally {
                    MOCK_MAKER_FIELD.set(null, oldMaker);
                }
            }
        };
    }
}
Make sure you run the test with the same arguments. Check whether your IntelliJ run configurations match Jenkins (see https://www.jetbrains.com/help/idea/creating-and-editing-run-debug-configurations.html). You can also try running the test on your local machine from the terminal with the same arguments as on Jenkins; if it fails there, the problem is in the arguments.
Question:
Is it possible to access elements annotated with a @Target(ElementType.TYPE_USE) annotation via an annotation processor?
Is it possible to access the annotated type bounds via an annotation processor?
Links to related documentation I missed are highly appreciated.
Context:
The annotation:
@Target(ElementType.TYPE_USE)
@Retention(RetentionPolicy.SOURCE)
public @interface TypeUseAnno {}
An example class:
public class SomeClass extends HashMap<@TypeUseAnno String, String> {}
The processor:
@SupportedSourceVersion(SourceVersion.RELEASE_8)
@SupportedAnnotationTypes("base.annotations.TypeUseAnno")
public class Processor extends AbstractProcessor {

    @Override
    public synchronized void init(ProcessingEnvironment processingEnv) {
        super.init(processingEnv);
        this.processingEnv.getMessager().printMessage(Diagnostic.Kind.WARNING, "Initialized.");
    }

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        this.processingEnv.getMessager().printMessage(Diagnostic.Kind.WARNING, "Invoked.");
        for (TypeElement annotation : annotations) {
            this.processingEnv.getMessager().printMessage(Diagnostic.Kind.WARNING, "" + roundEnv.getElementsAnnotatedWith(annotation));
        }
        return true;
    }
}
Compiling the above SomeClass with Processor on the classpath will show the "Initialized." message, but the process(...) method is never invoked.
Adding another annotation with @Target(ElementType.PARAMETER) to the processor works fine when the annotation is present on a method parameter. If the method parameter is annotated with @TypeUseAnno, the processor will again ignore the element.
The TYPE_USE annotations are a bit tricky, because the compiler treats them differently than the "old-style" annotation usages.
So, as you correctly observed, they are not passed to the annotation processor, and your process() method will never receive them.
So how to use them at compilation time?
In Java 8, where these annotations were introduced, a new way to hook into Java compilation was also introduced: you can now attach a listener to compilation tasks and trigger your own traversal of the source code. So the task of accessing the annotations splits into two parts:
Hook into the compiler.
Implement your analyzer.
Ad 1.
There are two options for hooking into the compiler in Java 8:
Using the new compiler plugin API.
Using an annotation processor.
I haven't used option 1 much, because it needs to be explicitly specified as a javac parameter, so I'll describe option 2.
You have to attach a TaskListener at the proper compilation phase. There are various phases; the following one is the only one during which the syntax tree representing the full source code, including method bodies, is accessible (remember that TYPE_USE annotations can be used even on local variable declarations).
@SupportedSourceVersion(SourceVersion.RELEASE_8)
public class EndProcessor extends AbstractProcessor {

    @Override
    public synchronized void init(ProcessingEnvironment env) {
        super.init(env);
        Trees trees = Trees.instance(env);
        JavacTask.instance(env).addTaskListener(new TaskListener() {

            @Override
            public void started(TaskEvent taskEvent) {
                // Nothing to do on task started event.
            }

            @Override
            public void finished(TaskEvent taskEvent) {
                if (taskEvent.getKind() == ANALYZE) {
                    new MyTreeScanner(trees).scan(taskEvent.getCompilationUnit(), null);
                }
            }
        });
    }

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        // We don't care about this method, as it will never be invoked for our annotation.
        return false;
    }
}
Ad 2.
Now MyTreeScanner can scan the full source code and find the annotations. That applies whether you used the Plugin or the AnnotationProcessor approach. This is still tricky: you have to implement a TreeScanner, or more typically extend TreePathScanner.
This represents a visitor pattern, where you have to decide properly which elements are of interest to visit.
Let's give a simple example that reacts to local variable declarations:
class MyTreeScanner extends TreePathScanner<Void, Void> {

    private final Trees trees;

    public MyTreeScanner(Trees trees) {
        this.trees = trees;
    }

    @Override
    public Void visitVariable(VariableTree tree, Void aVoid) {
        super.visitVariable(tree, aVoid);
        // This method might be invoked in case of
        // 1. field definition
        // 2. method parameter
        // 3. local variable declaration
        // Therefore you have to filter out somehow what you don't need.
        if (tree.getKind() == Tree.Kind.VARIABLE) {
            Element variable = trees.getElement(trees.getPath(getCurrentPath().getCompilationUnit(), tree));
            TypeUseAnno annotation = variable.getAnnotation(TypeUseAnno.class);
            // Here you have your annotation.
            // You can process it now.
        }
        return aVoid;
    }
}
This is very brief introduction. For real examples you can have a look at following project source code:
https://github.com/c0stra/fluent-api-end-check/tree/master/src/main/java/fluent/api/processors
It's also very important to have good tests while developing such features, so you can debug, reverse engineer and solve all the tricky issues you'll face in this area ;)
For that you can also get inspired here:
https://github.com/c0stra/fluent-api-end-check/blob/master/src/test/java/fluent/api/EndProcessorTest.java
One last remark: since these annotations are really handled differently by javac, there are some limitations. For example, this approach is not suitable for triggering Java code generation, because the compiler doesn't pick up files created during this phase for further compilation.
Let's say I define a custom annotation called @Unsafe.
I'd like to provide an annotation processor which will detect references to methods annotated with @Unsafe and print a warning.
For example, given this code ...
public class Foo {
    @Unsafe
    public void doSomething() { ... }
}

public class Bar {
    public static void main(String[] args) {
        new Foo().doSomething();
    }
}
... I want the compiler to print something like:
WARN > Bar.java, line 3 : Call to Unsafe API - Foo.doSomething()
It is very similar in spirit to @Deprecated, but my annotation is communicating something different, so I can't use @Deprecated directly. Is there a way to achieve this with an annotation processor? The annotation processor API seems to be more focused on the entities applying the annotations (Foo.java in my example) than on entities which reference annotated members.
This question provides a technique to achieve it as a separate build step using ASM. But I'm wondering if I can do it in a more natural way with javac & annotation processing?
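For reference, a minimal definition of such an annotation might look like the following; the retention policy and target are assumptions made for illustration, not something specified in the question:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Hypothetical marker annotation. CLASS retention keeps it in the compiled class
// files, so callers compiled later can still be checked against it.
@Retention(RetentionPolicy.CLASS)
@Target(ElementType.METHOD)
public @interface Unsafe {
}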
I think I could have technically achieved my goal using the response from @mernst, so I appreciate the suggestion. However, I found another route that worked better for me, as I'm working on a commercial product and cannot incorporate the Checker Framework (its GPL license is incompatible with ours).
In my solution, I use my own "standard" Java annotation processor to build a listing of all the methods annotated with @Unsafe.
Then, I developed a javac plugin. The Plugin API makes it easy to find every invocation of any method in the AST. By using some tips from this question, I was able to determine the class and method name from the MethodInvocationTree AST node. Then I compare those method invocations with the earlier "listing" I created of methods annotated with @Unsafe and issue warnings where required.
Here is an abbreviated version of my javac Plugin.
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import com.sun.source.tree.MethodInvocationTree;
import com.sun.source.util.JavacTask;
import com.sun.source.util.Plugin;
import com.sun.source.util.TaskEvent;
import com.sun.source.util.TaskEvent.Kind;
import com.sun.tools.javac.tree.JCTree;
import com.sun.tools.javac.tree.TreeInfo;
import com.sun.source.util.TaskListener;
import com.sun.source.util.TreeScanner;
public class UnsafePlugin implements Plugin, TaskListener {

    @Override
    public String getName() {
        return "UnsafePlugin";
    }

    @Override
    public void init(JavacTask task, String... args) {
        task.addTaskListener(this);
    }

    @Override
    public void finished(TaskEvent taskEvt) {
        if (taskEvt.getKind() == Kind.ANALYZE) {
            taskEvt.getCompilationUnit().accept(new TreeScanner<Void, Void>() {
                @Override
                public Void visitMethodInvocation(MethodInvocationTree methodInv, Void v) {
                    Element method = TreeInfo.symbol((JCTree) methodInv.getMethodSelect());
                    TypeElement invokedClass = (TypeElement) method.getEnclosingElement();
                    String className = invokedClass.toString();
                    String methodName = methodInv.getMethodSelect().toString().replaceAll(".*\\.", "");
                    System.out.println("Method Invocation: " + className + " : " + methodName);
                    return super.visitMethodInvocation(methodInv, v);
                }
            }, null);
        }
    }

    @Override
    public void started(TaskEvent taskEvt) {
    }
}
Note - in order for the javac plugin to be invoked, you must provide arguments on the command line:
javac -processorpath build/unsafe-plugin.jar -Xplugin:UnsafePlugin
Also, you must have a file META-INF/services/com.sun.source.util.Plugin in unsafe-plugin.jar containing the fully qualified name of the plugin:
com.unsafetest.javac.UnsafePlugin
Yes, this is possible using annotation processing.
One complication is that a standard annotation processor does not descend into method bodies (it only examines the method declaration). You want an annotation processor that examines every line of code.
The Checker Framework is designed to build such annotation processors. You just need to define a callback that, given a method call, issues a javac warning if the call is not acceptable. (In your case, the check is simply whether the method's declaration has an @Unsafe annotation.) The Checker Framework runs that callback on every method call in the program.
The AbstractProcessor below processes greghmerrill's @Unsafe annotation and emits warnings on calls to @Unsafe-annotated methods.
It is a slight modification of greghmerrill's own answer, which was great, but I had some problems getting my IDE's incremental compiler (I am using NetBeans) to show the warnings/errors emitted from the plugin: only those printed from the processor were shown, though the behaviour was as expected when I ran 'mvn clean compile' (I am using Maven). Whether this is due to some mistake on my part, or points to a difference between Plugins and AbstractProcessors/the phases of the compilation process, I do not know.
Anyway:
package com.hervian.annotationutils.target;
import com.sun.source.tree.MethodInvocationTree;
import com.sun.source.util.*;
import com.sun.tools.javac.tree.JCTree;
import com.sun.tools.javac.tree.TreeInfo;
import java.util.Set;
import javax.annotation.processing.*;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.*;
import javax.tools.Diagnostic;
@SupportedAnnotationTypes({"com.hervian.annotationutils.target.Unsafe"})
@SupportedSourceVersion(SourceVersion.RELEASE_8)
public class UnsafeAnnotationProcessor extends AbstractProcessor implements TaskListener {

    Trees trees;

    @Override
    public synchronized void init(ProcessingEnvironment processingEnv) {
        super.init(processingEnv);
        trees = Trees.instance(processingEnv);
        JavacTask.instance(processingEnv).setTaskListener(this);
    }

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        // Process @Unsafe annotated methods if needed
        return true;
    }

    @Override
    public void finished(TaskEvent taskEvt) {
        if (taskEvt.getKind() == TaskEvent.Kind.ANALYZE) {
            taskEvt.getCompilationUnit().accept(new TreeScanner<Void, Void>() {
                @Override
                public Void visitMethodInvocation(MethodInvocationTree methodInv, Void v) {
                    Element method = TreeInfo.symbol((JCTree) methodInv.getMethodSelect());
                    Unsafe unsafe = method.getAnnotation(Unsafe.class);
                    if (unsafe != null) {
                        JCTree jcTree = (JCTree) methodInv.getMethodSelect();
                        trees.printMessage(Diagnostic.Kind.WARNING, "Call to unsafe method.", jcTree, taskEvt.getCompilationUnit());
                    }
                    return super.visitMethodInvocation(methodInv, v);
                }
            }, null);
        }
    }

    @Override
    public void started(TaskEvent taskEvt) { }
}
When the annotation is used and a call is made to the annotated method, the warning shows up at the call site.
One needs to remember to add the fully qualified class name of the annotation processor to a META-INF/services file named javax.annotation.processing.Processor. This makes it available to the ServiceLoader framework.
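Based on the package and class name shown in the snippet above, that services file would contain a single line. For a Maven-style layout the path and content would be roughly:

src/main/resources/META-INF/services/javax.annotation.processing.Processor

containing:

com.hervian.annotationutils.target.UnsafeAnnotationProcessor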
Maven users having trouble with the com.sun.* imports may find this answer from AnimeshSharma helpful.
I keep my annotation + annotation processor in a separate project. I had to disable annotation processing by adding the following to the pom:
<build>
    <pluginManagement>
        <plugins>
            <plugin>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <compilerArgument>-proc:none</compilerArgument>
                </configuration>
            </plugin>
        </plugins>
    </pluginManagement>
</build>
Using the annotation and having the processor do its work was simple: in my other project (the one the screenshot of method foo() is from) I simply added a dependency on the project containing the annotation and processor.
Lastly, it should be mentioned that I am new to AbstractProcessors and TaskListeners. I do not, for example, have an overview of the performance or robustness of the code. The goal was simply to get it to work and provide a stub for similar projects.
Here's my use case:
I need to do some generic operation before and after each method of a given class, which is based on the parameter(s) of the method. For example:
void process(Processable object) {
    LOGGER.log(object.getDesc());
    object.process();
}

class BaseClass {
    String method1(Object o) { // o may or may not be Processable (add process logic only in the former case)
        if (o instanceof Processable) {
            Processable p = (Processable) o;
            LOGGER.log(p.getDesc());
            p.process();
        }
        // method logic
    }
}
My BaseClass has a lot of methods and I know for a fact that the same functionality will be added to several similar classes as well in future.
Is something like the following possible?
@MarkForProcessing
String method1(@Process Object o) {
    // method logic
}
PS: Can AspectJ/Guice be used? I also want to know how to implement this from scratch, for understanding.
Edit: I forgot to mention what I have tried (not complete or working):
public @interface MarkForProcessing {
    String getMetadata();
}
final public class Handler {
    public boolean process(Object instance) throws Exception {
        Class<?> clazz = instance.getClass();
        for (Method m : clazz.getDeclaredMethods()) {
            if (m.isAnnotationPresent(MarkForProcessing.class)) {
                MarkForProcessing annotation = m.getAnnotation(MarkForProcessing.class);
                Class<?> returnType = m.getReturnType();
                Class<?>[] inputParamTypes = m.getParameterTypes();
                Class<?> inputType = null;
                // We are interested in just the 1st param
                if (inputParamTypes.length != 0) {
                    inputType = inputParamTypes[0];
                }
                // But all I have access to here is just the types; I need access to the actual method arguments.
            }
        }
        return false;
    }
}
Yes, it can be done. Yes, you can use AspectJ. No, Guice would only be tangentially related to this problem.
The traditional aspect approach creates a proxy which is basically a subclass of the class you've given it (e.g. a subclass of BaseClass) but that subclass is created at runtime. The subclass delegates to the wrapped class for all methods. However, when creating this new subclass you can specify some extra behavior to add before or after (or both) the call to the wrapped class. In other words, if you have:
public class Foo {
    public void doFoo() {...}
}
Then the dynamic proxy would be a subclass of Foo created at runtime that looks something like:
public class Foo$Proxy extends Foo {
    public void doFoo() {
        //Custom pre-invocation code
        super.doFoo();
        //Custom post-invocation code
    }
}
Actually creating a dynamic proxy is a magical process known as bytecode manipulation. If you want to do that yourself you can use tools such as cglib or ASM. Or you can use JDK dynamic proxies. The main downside to JDK proxies is that they can only wrap interfaces.
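As an illustration of the JDK-proxy route (interface-based, so it assumes a hypothetical FooService interface rather than the concrete Foo above), a minimal sketch might look like this:

import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

interface FooService { // hypothetical interface; JDK proxies can only wrap interfaces
    void doFoo();
}

class LoggingHandler implements InvocationHandler {
    private final FooService target;

    LoggingHandler(FooService target) {
        this.target = target;
    }

    @Override
    public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
        System.out.println("before " + method.getName()); // custom pre-invocation code
        Object result = method.invoke(target, args);       // delegate to the wrapped instance
        System.out.println("after " + method.getName());   // custom post-invocation code
        return result;
    }
}

It would be used like this, where RealFooService stands for whatever implementation you want to wrap:

FooService proxy = (FooService) Proxy.newProxyInstance(
        FooService.class.getClassLoader(),
        new Class<?>[] {FooService.class},
        new LoggingHandler(new RealFooService()));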
AOP tools like AspectJ provide an abstraction on top of the raw bytecode manipulation for doing the above (you can do a lot with bytecode manipulation; adding behavior before and after methods is all that aspects allow). Typically you define 'Aspects', which are classes that have special methods called 'advice', along with a 'pointcut' which defines when to apply that advice. In other words you may have:
@Aspect
public class FooAspect {

    @Around("@annotation(MarkForProcessing)")
    public void doProcessing(final ProceedingJoinPoint joinPoint) throws Throwable {
        //Do some before processing
        joinPoint.proceed(); //Invokes the underlying method
        //Do some after processing
    }
}
The aspect is FooAspect, the advice is doProcessing, and the pointcut is "@annotation(MarkForProcessing)", which matches all methods that are annotated with @MarkForProcessing. It's worth pointing out that the ProceedingJoinPoint will have a reference to the actual parameter values (unlike java.lang.reflect.Method).
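For instance, the advice could use getArgs() on the join point to apply the question's Processable logic to the actual argument values. A rough sketch, assuming the Processable type and LOGGER from the question (note that here the advice returns the underlying method's result):

@Around("@annotation(MarkForProcessing)")
public Object doProcessing(final ProceedingJoinPoint joinPoint) throws Throwable {
    // Inspect the actual argument values before invoking the method.
    for (Object arg : joinPoint.getArgs()) {
        if (arg instanceof Processable) {
            LOGGER.log(((Processable) arg).getDesc());
            ((Processable) arg).process();
        }
    }
    return joinPoint.proceed(); // invoke the underlying method and pass its result through
}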
The last step is actually applying your aspect to an instance of your class. Typically this is either done with a container (e.g. Guice or Spring). Most containers have some way of knowing about a collection of aspects and when to apply them to classes constructed by that container. You can also do this programmatically. For example, with AspectJ you would do:
AspectJProxyFactory factory = new AspectJProxyFactory(baseClassInstance);
factory.addAspect(FooAspect.class);
BaseClass proxy = factory.getProxy();
Last, but not least, there are AOP implementations which use compile-time "weaving" which is a second compilation step run on the class files that applies the aspects. In other words, you don't have to do the above or use a container, the aspect will be injected into the class file itself.
I have the following classes:
public interface IDataSource<T> {
    public List<T> getData(int numberOfEntries);
}

public class MyDataSource implements IDataSource<MyData> {
    public List<MyData> getData(int numberOfEntries) {
        ...
    }
}

public class MyOtherDataSource implements IDataSource<MyOtherData> {
    public List<MyOtherData> getData(int numberOfEntries) {
        ...
    }
}
I would like to use a factory that return the correct implementation based on the data type. I wrote the following but I get "Unchecked cast" warnings:
public static <T> IDataSource<T> getDataSource(Class<T> dataType) {
    if (dataType.equals(MyData.class)) {
        return (IDataSource<T>) new MyDataSource();
    } else if (dataType.equals(MyOtherData.class)) {
        return (IDataSource<T>) new MyOtherDataSource();
    }
    return null;
}
Am I doing it wrong? What can I do to get rid of the warnings?
I am not aware of any way to get rid of those warnings without @SuppressWarnings("unchecked").
You are passing in a Class object so T can be captured. But you are forced to check the Class at runtime to determine which IDataSource<T> to return. At this time, type erasure has long since occurred.
At compile time, Java can't be sure of type safety. It can't guarantee that the T in the Class at runtime would be the same T in the IDataSource<T> returned, so it produces the warning.
This looks like one of those times when you're forced to annotate the method with @SuppressWarnings("unchecked") to remove the warning. That warning is there for a reason, so it is up to you to provide and ensure type safety. As written, it looks like you have provided type safety.
@SuppressWarnings("unchecked")
public static <T> IDataSource<T> getDataSource(Class<T> dataType) {
You're doing it right, and you should simply suppress the warnings. Factories are one of the tricky areas in generics where you really do need to manually cast to a generic type, and you have to ensure via whatever means that the returned value matches the Class<T> you pass in. For example, in this case you're hard-coding a couple of IDataSource implementations, so I would recommend writing unit tests that verify that the types are correct so that if the MyData implementation changes in an incompatible way, you'll get an error on build.
Just annotate the getDataSource method with @SuppressWarnings("unchecked"), and it's always a good idea to add an explanatory comment when suppressing warnings.
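A rough sketch of the kind of unit test suggested above, using TestNG as elsewhere in this thread; the Factory class name is made up here, since the question only shows a static method:

import org.testng.Assert;
import org.testng.annotations.Test;

public class DataSourceFactoryTest {

    @Test
    public void returnsMatchingImplementationForEachDataType() {
        // If an implementation changes incompatibly, these assertions fail at build time.
        Assert.assertTrue(Factory.getDataSource(MyData.class) instanceof MyDataSource);
        Assert.assertTrue(Factory.getDataSource(MyOtherData.class) instanceof MyOtherDataSource);
    }
}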
Generics are for compile-time type safety. They can't be used for runtime type determination like that. To get rid of the warning, you can do something like @SuppressWarnings("unchecked") or use the -Xlint:-unchecked compiler flag, as described in the "Raw Types" part of the Java tutorial.
The other answers have answered the problem as you posed it. But I'd like to take a step back to understand what you're trying to accomplish with this factory method. This factory basically provides a map from data types to IDataSource implementations. Dependency injection might be a more appropriate pattern, since this is a small, well-known set of data types and implementations (as indicated by your example).
Let's say you want to store all Widgets in Mongo but all Gadgets in Mysql, you might have two classes: a MongoWidgetDataSource that implements IDataSource<Widget> and a MysqlGadgetDataSource that implements IDataSource<Gadget>.
Instead of hardcoding a factory method call like MyFactory.getDataSource(Widget.class) inside a data consumer, I would inject the appropriate IDataSource dependency. We might have MyService that does something with widgets (stored in mongo). Using a factory as you proposed would look like this:
public class MyService {
    public void doSomething() {
        String value = MyFactory.getDataSource(Widget.class).getSomething();
        // do something with data returned from the source
    }
}
Instead, you should inject the appropriate data source as a constructor arg into the service:
public class MyService {
    private final IDataSource<Widget> widgetDataSource;

    public MyService(IDataSource<Widget> widgetDataSource) {
        this.widgetDataSource = widgetDataSource;
    }

    public void doSomething() {
        String value = widgetDataSource.getSomething();
        // now do something with data returned from the source
    }
}
This has the added benefit of making your code more reusable and easier to unit test (mock dependencies).
Then, where you instantiate MyService, you can also wire up your data sources. Many projects use a dependency injection framework (like Guice) to make this easier, but it's not a strict requirement. Personally, though, I never work on a project of any real size or duration without one.
If you don't use a DI framework, you just instantiate the dependencies when you create the calling service:
public static void main(String[] args) {
    IDataSource<Widget> widgetDataSource = new MongoWidgetDataSource();
    IDataSource<Gadget> gadgetDataSource = new MysqlGadgetDataSource();
    MyService service = new MyService(widgetDataSource, gadgetDataSource);
    service.doSomething();
}
In Guice, you would wire up these data sources like this:
public class DataSourceModule extends AbstractModule {
    @Override
    protected void configure() {
        bind(new TypeLiteral<IDataSource<Widget>>() {}).to(MongoWidgetDataSource.class);
        bind(new TypeLiteral<IDataSource<Gadget>>() {}).to(MysqlGadgetDataSource.class);
    }
}
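Bootstrapping that module might then look roughly like this, assuming MyService's constructor is annotated with @Inject so Guice can build it:

import com.google.inject.Guice;
import com.google.inject.Injector;

public class Main {
    public static void main(String[] args) {
        // Guice constructs MyService, injecting the IDataSource bindings from DataSourceModule.
        Injector injector = Guice.createInjector(new DataSourceModule());
        MyService service = injector.getInstance(MyService.class);
        service.doSomething();
    }
}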
Dependency inversion is a bit of a different way to think about the problem, but it can lead to a much more decoupled, reusable and testable code base.
This seems to work:
public static <T> IDataSource<T> getDataSource(MyData dataType) {
    System.out.println("Make MyDataSource");
    return (IDataSource<T>) new MyDataSource();
}

public static <T> IDataSource<T> getDataSource(MyOtherData dataType) {
    System.out.println("Make MyOtherDataSource");
    return (IDataSource<T>) new MyOtherDataSource();
}

public void test() {
    IDataSource<MyData> myDataSource = getDataSource((MyData) null);
    IDataSource<MyOtherData> myOtherDataSource = getDataSource((MyOtherData) null);
}
You may prefer to create empty archetype instances rather than casting null as I have, but I think this is a viable technique.