Decoupling a Factory to allow for Custom Factory Implementations - java

I'm currently working on a library and I've tried to abstract parts of the code through interfaces as much as possible. However, some areas need to return concretions, as I can see no other way of returning the data cleanly. For example, one area of code needs to return a key object:
public IKey generateKey() {
    return new Key(param1, param2, param3);
}
My solution to this currently is to create a Factory class with static methods for returning concretions:
public IKey generateKey() {
    return KeyFactory.get(param1, param2, param3);
}
But now I feel like I have high coupling between the factory and the codebase, as there are a few lines in many parts of the code that request objects from the factory. It also just feels lazy, as I can whip up a factory function to hand me a concrete class in almost any situation.
My biggest problem is in allowing people to replace classes by creating their own classes that implement the current interfaces. I need to allow users to create their OWN factories for their OWN class implementations. If I make an interface for my factory and allow users to create their own factory implementing that interface, then I need a higher-up factory to create THAT factory, and so on...
I need users to be able to create their own custom classes implementing the correct interfaces, create their own factory that can return their custom classes to other areas of the code where needed, and have it all work seamlessly together.
Edit: This is the idea I have currently; however, injecting the factory seems like an unconventional thing to do when using a library in Java.
public class Key implements IKey {
    public Key(String param1) { // placeholder parameter type
        //do something
    }
    //other methods
}

public interface IKey {
    //Other methods
}

public class RicksFactory implements IFactory {
    @Override
    public IKey getKey(String param1) {
        return new Key(param1);
    }
}

public interface IFactory {
    IKey getKey(String param1);
}

//This class is a singleton, so other classes in the
//library can find it without having to do dependency
//injection down all the levels of classes.
public class TopClassInLibrary {
    //Singleton init

    private IFactory factory;

    public void setFactory(IFactory factory) {
        this.factory = factory;
    }

    public IFactory getFactory() {
        return factory;
    }
}

public class Main {
    public static void main(String[] args) {
        TopClassInLibrary.getInstance().setFactory(new RicksFactory());
    }
}
**Edit 2:** I think I've figured out a nice solution now. Would appreciate it if someone could tell me if it's good or not? Thanks
public class Key implements IKey {
    public Key(String param1) { // placeholder parameter type
        //do something
    }
    //other methods
}

public interface IKey {
    //Other methods
}

public class RicksFactory implements IFactory {
    @Override
    public IKey getKey(String param1) {
        return new Key(param1);
    }
}

public interface IFactory {
    IKey getKey(String param1);
}

public class TopClassInLibrary {

    private static TopClassInLibrary topClass;

    public static TopClassInLibrary getInstance() {
        if (topClass == null)
            topClass = new TopClassInLibrary();
        return topClass;
    }

    private IFactory factory;

    public void setFactory(IFactory factory) {
        this.factory = factory;
    }

    public IFactory getFactory() {
        if (factory == null)
            factory = new RicksFactory();
        return factory;
    }
}

public class Main {
    public static void main(String[] args) {
        //Below not needed now for the default implementation
        //TopClassInLibrary.getInstance().setFactory(new RicksFactory());
        IKey myKey = TopClassInLibrary.getInstance().getFactory().getKey(param1);
    }
}
So in this setup, TopClassInLibrary never needs to be instantiated or touched by external code using the library: it creates its own instance when requested, and creates the default factory if a custom one has not been set.

Your solution is the abstract factory pattern, a perfect match in your case.
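To make that concrete, here is a minimal sketch of the pattern as it applies to the setup above. CustomKey and CustomFactory are illustrative names, not part of the original code; the point is that the library codes only against IFactory and IKey, while a user registers their own implementation once at startup.

// Hypothetical user code: a custom key plus a factory that produces it.
public class CustomKey implements IKey {
    public CustomKey(String param1) {
        // user-specific behaviour
    }
}

public class CustomFactory implements IFactory {
    @Override
    public IKey getKey(String param1) {
        return new CustomKey(param1);
    }
}

// Registered once at startup; every getFactory().getKey(...) call inside
// the library now produces CustomKey instances:
TopClassInLibrary.getInstance().setFactory(new CustomFactory());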


Code refactoring: how to decouple two static functions without making the parent functions non-static

Context:
We are refactoring the code in order to move to microservices. We have multiple products (A, B, C) plus some common code for A, B, C in a monolithic service. Now we are creating a new sandbox for the common code.
Problem:
User.java
class User {
    public static void init() {
        List<Items> users = Items.getItemsList();
    }
}
Items.java
class Items {
    public static List<Items> getItemsList() {
        //many static functions and dependencies
        return items;
    }
}
So here, both functions are static, and I want to move only User.java to the new sandbox, not Items.java. How can I break this dependency, given that I cannot make User.init() non-static?
Assuming sandbox means an independent project that produces a jar, then Items must also exist in the sandbox, otherwise it won't compile.
But you could extract an interface from Items to something such as IItems (forgive the terrible name).
public interface IItems {
    // methods...
}
which is included in the sandbox.
And create an interface for a factory such as:
public interface IItemsFactory {
    List<IItems> create();
}
which is also included in the sandbox.
The ugly part is keeping User.init() static. Using a hacky IoC pattern, set an implementation of IItemsFactory into User; the factory reference will also have to be static. So User becomes something like:
public class User {

    private static volatile IItemsFactory factory;

    public static void setFactory(IItemsFactory factory) {
        User.factory = factory;
    }

    public static void init() {
        List<IItems> users = factory.create();
    }
}
The A, B, and C projects are responsible for providing an implementation of IItemsFactory and setting it before calling User.init().
This is half-baked, and those static methods need to go away during the next refactoring iteration. Still use the IoC pattern, but inject the factory through the User constructor.
public class User {

    private final IItemsFactory factory;

    public User(IItemsFactory factory) {
        this.factory = factory;
    }

    public void init() {
        List<IItems> users = factory.create();
    }
}
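Since IItemsFactory ends up with a single abstract method, the injected variant is also easy to wire with a lambda. A brief sketch, assuming Items implements the extracted IItems interface (the copy into a new ArrayList is needed because a List<Items> is not a List<IItems>):

import java.util.ArrayList;
import java.util.Collections;

// Production wiring in A, B or C: adapt the legacy static call.
User user = new User(() -> new ArrayList<>(Items.getItemsList()));
user.init();

// Test wiring: no dependency on Items at all.
User testUser = new User(Collections::emptyList);
testUser.init();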

Handling of multiple delegates

I have a Java application to which I want to add an extension for executing Groovy scripts. So far, so good: the parsing, compiling and execution are not the problem!
For simplicity's sake I want to keep the Groovy syntax as simple as possible (e.g. no OO skills required). Furthermore, the Groovy scripts shall be able to access library functions which are initialized by the Java classes. This is the part where @Delegate comes into play!
Currently, I came up with two different solutions which are not completely satisfying for me:
GroovyService.java
public interface GroovyService { }
MyService.java
public class MyService implements GroovyService {

    public static final MyService INSTANCE = new MyService();

    private MyService() { /* ... */ }

    public void method1() { /* ... */ }
    public void method2() { /* ... */ }
}
Solution #1 - For each delegated method define a method shortcut
ServicesFacade.java
public class ServicesFacade {

    public static final ServicesFacade INSTANCE = new ServicesFacade();

    @Delegate MyService myService;
    // Further @Delegate'd services ...

    private ServicesFacade() {
        myService = MyService.INSTANCE;
    }
}
GroovyScript.groovy
def method1 = myService.&method1
def method2 = myService.&method2

if (method1()) {
    method2()
}
The code part with the method shortcuts could be prepended to the string content read from the Groovy file. Without the shortcuts it would fulfill my expectations, but I'm looking for a solution where I don't have to keep track of all the shortcuts.
Solution #2 - Use a list of the service type and the method wildcard access
ServicesFacade.java
public class ServicesFacade {

    public static final ServicesFacade INSTANCE = new ServicesFacade();

    @Delegate private final List<GroovyService> services = new ArrayList<>();

    private ServicesFacade() {
        this.services.add(MyService.INSTANCE);
    }

    public void addService(GroovyService service) {
        this.services.add(service);
    }
}
GroovyScript.groovy
if (services*.method1()) {
    services*.method2()
}
The advantage of this solution is that I can use a fixed member name for any service (services*), but I'm not so impressed by the syntax.
The Groovy scripts are used as follows:
CompilerConfiguration compilerConfiguration = new CompilerConfiguration();
compilerConfiguration.setScriptBaseClass(DelegatingScript.class.getName());
GroovyShell groovyShell = new GroovyShell(compilerConfiguration);

DelegatingScript script = (DelegatingScript) groovyShell.parse(fileContent);
if (script != null) {
    script.setDelegate(ServicesFacade.INSTANCE);
    scripts.add(script);
}

/* ... */

scripts.forEach(s -> s.run());
Is there a better way of achieving a direct method call to the delegated methods?
I came up with a good solution, in which I wrote a Script class analogous to DelegatingScript. It looks as follows:
import groovy.lang.Binding;
import groovy.lang.MetaClass;
import groovy.lang.MissingMethodException;
import org.codehaus.groovy.runtime.InvokerHelper;

import java.util.HashMap;
import java.util.Map;

public abstract class MultiDelegatingScript extends groovy.lang.Script {

    private final Map<Object, MetaClass> delegateMap = new HashMap<>();

    protected MultiDelegatingScript() {
        super();
    }

    protected MultiDelegatingScript(Binding binding) {
        super(binding);
    }

    public void setDelegate(Object delegate) {
        this.delegateMap.put(delegate, InvokerHelper.getMetaClass(delegate.getClass()));
    }

    @Override
    public Object invokeMethod(String name, Object args) {
        for (Map.Entry<Object, MetaClass> delegate : this.delegateMap.entrySet()) {
            try {
                // Try to invoke the delegated method
                return delegate.getValue().invokeMethod(delegate.getKey(), name, args);
            } catch (MissingMethodException mme) {
                // Method not found in this delegate -> try the next one
            }
        }
        // No delegate method found -> invoke the superclass method for further handling
        return super.invokeMethod(name, args);
    }
}
Using this class instead of DelegatingScript will completely fulfill my expectations!
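For completeness, a sketch of how this base class might be wired up, patterned on the GroovyShell snippet from the question; AnotherService is a hypothetical second delegate, and any object registered via setDelegate(...) works the same way:

CompilerConfiguration config = new CompilerConfiguration();
// Use the multi-delegating base class instead of DelegatingScript.
config.setScriptBaseClass(MultiDelegatingScript.class.getName());
GroovyShell shell = new GroovyShell(config);

MultiDelegatingScript script = (MultiDelegatingScript) shell.parse(fileContent);
// Register as many delegates as needed; invokeMethod() tries each in turn.
script.setDelegate(MyService.INSTANCE);
script.setDelegate(AnotherService.INSTANCE); // hypothetical second service
script.run();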

Mock inherited method in Mockito Java

My class structure is as follows:
public class MyParentClass {
    void doSomethingParent() {
        System.out.println("something in parent");
    }
}

public class MyClass extends MyParentClass {

    protected String createDummyRequest(Holder myHolder) {
        //...
        super.doSomethingParent(); //I want to avoid this
        //...
        callingDB();
        return "processedOutput";
    }

    private void callingDB() {
        System.out.println("Calling to DB");
    }
}
Then my unit test:
public class UnitTest {
    public void testCreateDummyRequest() {
        //create my mock holder
        Holder mockHolder = new Holder();

        MyClass mockObj = Mockito.mock(MyClass.class);
        //mock doSomethingParent()
        //mock callingDB()

        //mockObj is a full mock, but I need to run my real method:
        //Mockito.when(mockObj.createDummyRequest(mockHolder)).thenCallRealMethod();
        mockObj.createDummyRequest(mockHolder);
        //Problem: doSomethingParent() is getting called though I have mocked it
    }
}
How do I prevent the call to super.doSomethingParent() in my method (the method for which I am writing my test)?
With this class structure, mocking and testing is really hard. If possible, I'd advise changing the structure, as in most cases a class structure that's hard to mock and test is equally hard to extend and maintain.
So if you could change your class structure to something similar to:
public class MyClass {

    private DoSomethingProvider doSomethingProvider;
    private DbConnector dbConnector;

    public MyClass(DoSomethingProvider p, DbConnector c) {
        doSomethingProvider = p;
        dbConnector = c;
    }

    protected String createDummyRequest(Holder myHolder) {
        //...
        doSomethingProvider.doSomethingParent();
        //...
        dbConnector.callingDB();
        return "processedOutput";
    }
}
Then you could easily create your instance with mocks of DoSomethingProvider and DbConnector and voila....
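A sketch of what that test could look like, assuming JUnit 4 plus the two collaborator interfaces introduced above:

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

import org.junit.Test;

public class MyClassTest {

    @Test
    public void testCreateDummyRequest() {
        DoSomethingProvider provider = mock(DoSomethingProvider.class);
        DbConnector connector = mock(DbConnector.class);

        MyClass obj = new MyClass(provider, connector);
        obj.createDummyRequest(new Holder());

        // Only the collaborators are mocked; the method under test runs for real.
        verify(provider).doSomethingParent();
        verify(connector).callingDB();
    }
}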
If you can't change your class structure you need to use Mockito.spy instead of Mockito.mock to stub specific method calls but use the real object.
public void testCreateDummyRequest() {
    //create my mock holder
    Holder mockHolder = new Holder();

    MyClass mockObj = Mockito.spy(new MyClass());
    Mockito.doNothing().when(mockObj).doSomethingParent();
    mockObj.createDummyRequest(mockHolder);
}
Note: using the super keyword prevents Mockito from stubbing that method call. I don't know if there is a way to stub calls to super. If possible (as in, you didn't override the parent method in your class), just omit the keyword.
I faced a similar issue and found that using spy() can help.
public class UnitTest {

    private MyClass myObj;

    @Before
    public void setUp() throws Exception {
        MockitoAnnotations.initMocks(this);
        myObj = spy(new MyClass());
    }

    @Test
    public void mockedSuperClassMethod() {
        doNothing().when((MyParentClass) myObj).doSomethingParent();
        //...
    }
}
This approach works for me.
I found another approach, which turned out to be very useful in my case.
In the case I had, I needed to create a new class extending another that included a very complex (legacy code) protected final method. Due to the complexity, it wasn't really possible to refactor to composition, so here's what I came up with.
Let's say I have the following:
abstract class Parent {

    public abstract void implementMe();

    protected final void doComplexStuff( /* a long parameter list */ ) {
        // very complex legacy logic
    }
}

class MyNewClass extends Parent {

    @Override
    public void implementMe() {
        // custom stuff
        doComplexStuff(/* a long parameter list */); // calling the parent
        // some more custom stuff
    }
}
Here's how I rearranged this code:
abstract class Parent {

    public abstract void implementMe();

    protected final void doComplexStuff( /* a long parameter list */ ) {
        // very complex legacy logic
    }
}

interface ComplexStuffExecutor {
    void executeComplexStuff(/* a long parameter list, matching the one from doComplexStuff */);
}

class MyNewClass extends Parent {

    private final ComplexStuffExecutor complexStuffExecutor;

    MyNewClass() {
        this.complexStuffExecutor = this::doComplexStuff;
    }

    MyNewClass(ComplexStuffExecutor complexStuffExecutor) {
        this.complexStuffExecutor = complexStuffExecutor;
    }

    @Override
    public void implementMe() {
        // custom stuff
        complexStuffExecutor.executeComplexStuff(/* a long parameter list */); // either the parent method or the injected executor
        // some more custom stuff
    }
}
When creating an instance of MyNewClass for "production" purposes, I can use the default constructor.
When writing unit tests, however, I use the constructor where I can inject a ComplexStuffExecutor, provide a mock there, and only test my custom logic from MyNewClass, i.e.:
class MyNewClassTest {

    @Test
    void testImplementMe() {
        ComplexStuffExecutor complexStuffExecutor = Mockito.mock(ComplexStuffExecutor.class);
        doNothing().when(complexStuffExecutor).executeComplexStuff(/* expected parameters */);
        MyNewClass systemUnderTest = new MyNewClass(complexStuffExecutor);
        // perform tests
    }
}
At first glance, it seems like adding boilerplate just to make the code testable. However, I can also see it as an indicator of what the code should actually look like. Perhaps one day someone (who finds the courage and budget ;) ) could refactor the code, e.g. implement ComplexStuffExecutor with the logic from doComplexStuff in Parent, inject it into MyNewClass, and get rid of the inheritance.
Here is how it can be done:
public class BaseController {
    public void method() {
        validate(); // I don't want to run this!
    }
}

public class JDrivenController extends BaseController {
    public void method() {
        super.method();
        load(); // I only want to test this!
    }
}

@Test
public void testSave() {
    JDrivenController spy = Mockito.spy(new JDrivenController());

    // Prevent/stub logic in super.method()
    Mockito.doNothing().when((BaseController) spy).validate();

    // When
    spy.method();

    // Then
    verify(spy).load();
}
Source: https://blog.jdriven.com/2013/05/mock-superclass-method-with-mockito/

Enum switch to handle interface method calls... bad practice?

I asked this question but I thought maybe this should be a separate question. Given the following class, is this the best way to handle interface-specific method calls based on an enum type? Thanks
@Component
public class HelloWorldImpl implements HelloWorld {

    private enum MyEnum {
        WALK, RUN, JOG, SKIP
    }

    @Autowired
    @Qualifier("walkService")
    private ActivityService walkService;

    @Autowired
    @Qualifier("runService")
    private ActivityService runService;

    @Override
    public void executeMe() {
        for (MyEnum mode : MyEnum.values()) {
            switch (mode) {
                case RUN:
                    runService.execute();
                    break;
                case WALK:
                    walkService.execute();
                    break;
                // etc....
            }
        }
    }
}
I was trying to determine whether there is a way I could just use the interface (i.e. ActivityService) to call the execute method, instead of being specific to the "MODE" (i.e. switch/if). I was just thinking about what happens if I add a new "MODE": I will have to remember to add a section to this switch statement. Any help is greatly appreciated.
Update: this exact pattern is suggested here.
I doubt you can make it any better. Well, you could by using the Factory pattern, but that seems like overkill here.
Take a look at: http://grepcode.com/file/repository.grepcode.com/java/root/jdk/openjdk/6-b14/java/util/Calendar.java#Calendar.getInstance%28java.util.Locale%29
They use if statements in there. Seems like your code goes one better.
In order to evolve code in a factory scenario:
a) the caller has to know something about the "kind" of concrete implementation needed
b) for each "kind" of service a subclass is needed
Perhaps the only thing to criticize in your implementation is that the "kind" is hidden by a HelloWorldImpl that "knows" which service to return. It's probably more explicit to use subclasses directly, because the method executeMe says nothing about what kind of service will be chosen at runtime (it depends on the enum).
You'd better add a method to the enum itself:
private enum MyEnum {
    WALK {
        @Override
        public void execute() {
            ...
        }
    },
    RUN {
        @Override
        public void execute() {
            ...
        }
    };

    public abstract void execute();
}
That way, there's no way you can add a new enum value without implementing its associated execute() method.
And the method becomes:
public void executeMe() {
    MyEnum myEnum = MyEnum.WALK;
    myEnum.execute();
}
You don't need such a switch statement :)

@Override
public void executeMe() {
    runService.execute();
}

All you need to do is call the method on the interface, and the JVM will run whichever implementation is assigned to your service variable. That is the beauty of interfaces and the exact reason they exist.
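To spell that out (using the RunServiceImpl/WalkServiceImpl names from the next answer for illustration): the choice happens where the variable is assigned, never at the call site.

// Wiring time decides what runs; the call site never switches on a mode.
ActivityService service = new RunServiceImpl(); // or new WalkServiceImpl()
service.execute(); // dispatches to whichever implementation was assigned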
Define a mapping of enumKey => concreteActivityServiceBean, something like this in your Spring app context:
<util:map id="activityServiceMapping" key-type="java.lang.String"
          value-type="com.somePackage.ActivityService" map-class="java.util.HashMap">
    <entry key="RUN" value-ref="runServiceImpl" />
    <entry key="WALK" value-ref="walkServiceImpl" />
</util:map>
#Component("runServiceImpl")
class RunServiceImpl implements ActivityService {
#Override
public void execute(){ ... }
}
#Component("walkServiceImpl")
class WalkServiceImpl implements ActivityService {
#Override
public void execute(){ ... }
}
And conditionally select the implementation to execute:
@Component
class HelloWorldImpl implements HelloWorld {

    @Resource(name = "activityServiceMapping")
    private Map<String, ActivityService> activityServices;

    @Override
    public void executeMe() {
        ActivityService activityService = activityServices.get("WALK"); // or "RUN", or use the enum values....
        activityService.execute();
    }
}
I think you should try to refactor your class so you only need one instance of the ActivityService class. Your code would then look something like this:
@Component
public class HelloWorldImpl implements HelloWorld {

    private enum MyEnum {
        WALK, RUN, JOG, SKIP
    }

    @Autowired
    private ActivityService activityService;

    @Override
    public void executeMe() {
        MyEnum myEnum = MyEnum.WALK;
        activityService.execute(myEnum);
    }
}
But it is hard to say whether this is a viable option without knowing more about the responsibilities of ActivityService.
Or, if you really just want the runner class to execute the correct type every time, without DI, class-selection code, ifs or switches, then ensure that the correct class is instantiated prior to executing it:
ActionExecutor actionExecutor =
        (ActionExecutor) Class.forName("com.package.name." + action.name()).newInstance();
actionExecutor.execute();
Voila! Problem solved as long as you have a class for every possible action and those classes have a default constructor.
I had faced a similar problem and found a solution that is more generic than the accepted answer.
The first step is to create an interface:
public interface ActivityExecutor {
    void execute();
}
Now, all the classes required for execution must implement this interface:
public class WalkExecutor implements ActivityExecutor {

    @Autowired
    private WalkService walkService;

    public void execute() {
        walkService.execute();
    }
}

public class RunExecutor implements ActivityExecutor {

    @Autowired
    private RunService runService;

    public void execute() {
        runService.execute();
    }
}
Now the enums are declared in the following way:
private enum MyEnum {
    WALK {
        @Override
        public String getClassName() {
            return "com.basepackage.WalkExecutor";
        }
    },
    RUN {
        @Override
        public String getClassName() {
            return "com.basepackage.RunExecutor";
        }
    };

    public abstract String getClassName();
}
In the processing part, do the following (appContext is the Spring ApplicationContext, initialized elsewhere):
String className = MyEnum.WALK.getClassName();
Class<?> clazz = Class.forName(className);
ActivityExecutor activityExecutor = (ActivityExecutor) appContext.getBean(clazz);
activityExecutor.execute(); // executes the required Service
Another way of fixing this problem could be:
public enum ExecutorType {
    WALK, RUN
}

interface Executor {
    void execute();
    ExecutorType type();
}
Here we are able to use DI and create a CDI/Spring bean:
final class WalkExecutor implements Executor {

    @Override
    public void execute() {
        /* some logic */
    }

    @Override
    public ExecutorType type() {
        return ExecutorType.WALK;
    }
}
Then we can access the valid executor for a given type:
public final class ExecutorService {

    private final Map<ExecutorType, Executor> executorMap;

    ExecutorService(List<Executor> executors) {
        this.executorMap = executors.stream()
                .collect(Collectors.toMap(Executor::type, Function.identity()));
    }

    public void execute(ExecutorType type) {
        executorMap.get(type).execute();
    }
}
Additionally, we can ensure that every executor type is implemented, using either an integration test or a configuration class.
A configuration class using Spring:
@Configuration
class ExecutorConfiguration {

    /* other bean definitions */

    @Bean
    ExecutorService executorService(List<Executor> executors) {
        if (!allExecutorsImplemented(executors)) {
            throw new RuntimeException("Invalid executor configuration");
        }
        return new ExecutorService(executors);
    }

    private boolean allExecutorsImplemented(List<Executor> executors) {
        return executors.stream().map(Executor::type).distinct().count()
                == ExecutorType.values().length;
    }
}

Allowing object construction only from some packages

I work on a game-like system. Users can submit .class and .java files for customized behaviour. Some objects are delivered to the user via callback, but if the user could construct these objects himself (with custom parameters), it would give him an advantage. I will disallow reflection for the user and seal my packages. I can get this working if I abandon all package structure (and make the constructors package-private), but I would prefer not to do so.
Here is an example:
sscce.mycode.a.SomeClass.java:
package sscce.mycode.a;

import sscce.mycode.b.RestrictedObject;
import sscce.usercode.SomeUserClass;

public class SomeClass {
    public static void main(String[] args) {
        SomeUserClass userClass = new SomeUserClass();
        // If I can create it from here, anyone can...
        RestrictedObject object = new RestrictedObject();
        userClass.someMethod(object);
    }
}
sscce.mycode.b.Interface.java:
package sscce.mycode.b;

public interface Interface {
    public void someMethod(RestrictedObject restrictedObject);
}
sscce.mycode.b.RestrictedObject.java:
package sscce.mycode.b;

public class RestrictedObject {
    public RestrictedObject() {}
}
sscce.usercode.SomeUserClass.java:
package sscce.usercode;

import sscce.mycode.b.Interface;
import sscce.mycode.b.RestrictedObject;

public class SomeUserClass implements Interface {
    @Override
    public void someMethod(RestrictedObject restrictedObject) {
        // It receives an instance, but cannot create one.
        System.out.println("Got " + restrictedObject);
    }
}
Motivation: Having everything in one package sounds messy...
Does anyone have ideas on how to accomplish this without flattening the packages?
Thanks in advance for any solutions, ideas or comments, Till
You could do it the following way; however, you should carefully consider whether you really want to use this approach, as it is very slow and, quite frankly, bad practice.
I'll put it up anyway to show how you can do it:
import java.util.HashSet;
import java.util.Set;

public final class Secured {

    private static final Set<Class<?>> allowedCallers = new HashSet<>();

    static {
        allowedCallers.add(Allowed.class);
    }

    private static final class SecurityManagerExtension extends SecurityManager {

        private static final int OFFSET = 4;

        @Override
        protected Class<?>[] getClassContext() {
            return super.getClassContext();
        }

        private Class<?> getCaller() {
            try {
                return getClassContext()[OFFSET];
            } catch (ArrayIndexOutOfBoundsException e) {
                return null;
            }
        }
    }

    private Secured() {
        // protect against reflection attack
        Class<?> caller = new SecurityManagerExtension().getCaller();
        if (!this.getClass().equals(caller)) {
            throw new IllegalStateException();
        }
        System.out.println("Secured instance constructed!");
    }

    public static Secured createInstance() {
        // this gets the class of the calling code
        Class<?> caller = new SecurityManagerExtension().getCaller();
        if (allowedCallers.contains(caller)) {
            System.out.println("Created instance by '" + caller + "'!");
            return new Secured();
        } else {
            System.out.println("No instance created because call was made by '" + caller + "'!");
            return null;
        }
    }
}
Note the final keyword on the class to prevent subclassing. If you need to subclass the class yourself, move the final keyword to the factory method.
Also note that this is not protected against serialization attacks.
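To illustrate the calling side, here is a brief sketch. The Allowed class is referenced in the static initializer above but was not shown, so its shape here is assumed; note also that whether OFFSET = 4 lines up depends on the exact call depth, which is part of what makes this approach fragile.

// Hypothetical caller that is registered in Secured's allowedCallers set.
public class Allowed {
    public Secured obtain() {
        // Succeeds: the stack walk in createInstance() sees Allowed as the caller.
        return Secured.createInstance();
    }
}

// Any class not in the set gets null back from createInstance().
public class NotAllowed {
    public Secured obtain() {
        return Secured.createInstance();
    }
}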
