I asked this question before, but I thought this should be a separate question. Given the following class, is this the best way to handle interface-specific method calls based on an enum type? Thanks
@Component
public class HelloWorldImpl implements HelloWorld {

    private enum MyEnum {
        WALK, RUN, JOG, SKIP
    }

    @Autowired
    @Qualifier("walkService")
    private ActivityService walkService;

    @Autowired
    @Qualifier("runService")
    private ActivityService runService;

    @Override
    public void executeMe() {
        MyEnum myEnum = MyEnum.WALK;
        for (MyEnum mode : MyEnum.values()) {
            switch (mode) {
                case RUN:
                    runService.execute();
                    break;
                case WALK:
                    walkService.execute();
                    break;
                // etc....
            }
        }
    }
}
I was trying to determine if there is a way I could just use the interface (i.e. ActivityService) to call the execute method, instead of being specific to the "MODE" (i.e. a switch / if). I was just thinking about what happens if I add a new "MODE": I will have to remember to add a section to this switch statement. Any help is greatly appreciated.
**Update**
This exact pattern is suggested here.
I doubt you can make it any better. Well, you could by using the Factory pattern, but that seems to be overkill here.
Take a look at : http://grepcode.com/file/repository.grepcode.com/java/root/jdk/openjdk/6-b14/java/util/Calendar.java#Calendar.getInstance%28java.util.Locale%29
They use If statements in there. Seems like your code goes one better.
In order to evolve code in a factory scenario:
a) The caller has to know something about the "kind" of concrete implementation needed
b) For each "kind" of service a subclass is needed
Perhaps the only thing to criticize in your implementation is that the "kind" is hidden by a HelloWorldImpl that "knows" which service to return. It's probably more explicit to use subclasses directly, because the method "executeMe" says nothing about what kind of service will be chosen at runtime (it depends on the enum).
You'd better add a method to the enum itself:
private enum MyEnum {
    WALK {
        @Override
        public void execute() {
            ...
        }
    },
    RUN {
        @Override
        public void execute() {
            ...
        }
    };

    public abstract void execute();
}
That way, there's no way you can add a new enum value without implementing its associated execute() method.
And the method becomes:
public void executeMe() {
    MyEnum myEnum = MyEnum.WALK;
    myEnum.execute();
}
You don't need such a switch statement :)
@Override
public void executeMe() {
    runService.execute();
}
All you need to do is call the method on the interface, and the JVM will run whichever implementation is assigned to your service variable. That is the beauty of interfaces, and the exact reason they exist.
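A minimal illustration of that point; the class names and return values here are made up for the example:

```java
// The caller is written purely against the interface.
interface ActivityService {
    String execute();
}

class WalkService implements ActivityService {
    @Override
    public String execute() { return "walk"; }
}

class RunService implements ActivityService {
    @Override
    public String execute() { return "run"; }
}

class Caller {
    // Whichever implementation is passed in decides what runs at runtime.
    static String call(ActivityService service) {
        return service.execute();
    }
}
```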
Define a mapping of enumKey => concreteActivityServiceBean,
something like this in your Spring app context:
<util:map id="activityServiceMapping" key-type="java.lang.String" value-type="com.somePackage.ActivityService" map-class="java.util.HashMap">
    <entry key="RUN" value-ref="runServiceImpl" />
    <entry key="WALK" value-ref="walkServiceImpl" />
</util:map>
@Component("runServiceImpl")
class RunServiceImpl implements ActivityService {
    @Override
    public void execute() { ... }
}

@Component("walkServiceImpl")
class WalkServiceImpl implements ActivityService {
    @Override
    public void execute() { ... }
}
And conditionally select the implementation to execute:
@Component
class HelloWorldImpl implements HelloWorld {

    @Resource(name = "activityServiceMapping")
    private Map<String, ActivityService> activityServices;

    @Override
    public void executeMe() {
        ActivityService activityService = activityServices.get("WALK"); // or "RUN" or use the ENUM values....
        activityService.execute();
    }
}
I think you should try to refactor your class, so you only need one instance of the ActivityService class. Your code would then look something like this:
@Component
public class HelloWorldImpl implements HelloWorld {

    private enum MyEnum {
        WALK, RUN, JOG, SKIP
    }

    @Autowired
    private ActivityService activityService;

    @Override
    public void executeMe() {
        MyEnum myEnum = MyEnum.WALK;
        activityService.execute(myEnum);
    }
}
But it is hard to say whether this is a viable option, without knowing more about the responsibilities of ActivityService.
Or, if you really just want the runner class to execute the correct type every time, without using DI or any class-selection code, ifs, or switches, then ensure that the correct class is instantiated prior to executing it.
ActionExecutor actionExecutor = (ActionExecutor)Class.forName("com.package.name." + action.name()).newInstance();
actionExecutor.execute();
Voila! Problem solved as long as you have a class for every possible action and those classes have a default constructor.
I faced a similar problem. I found a solution that is more generic than the accepted answer.
The first step is to create an Interface.
public interface ActivityExecutor {
    public void execute();
}
Now, all the classes required for execution must implement this interface:
public class WalkExecutor implements ActivityExecutor {

    @Autowired
    private WalkService walkService;

    public void execute() {
        walkService.execute();
    }
}

public class RunExecutor implements ActivityExecutor {

    @Autowired
    private RunService runService;

    public void execute() {
        runService.execute();
    }
}
Now the enum is declared in the following way:
private enum MyEnum {
    WALK {
        @Override
        public String getClassName() {
            return "com.basepackage.WalkExecutor";
        }
    },
    RUN {
        @Override
        public String getClassName() {
            return "com.basepackage.RunExecutor";
        }
    };

    public abstract String getClassName();
}
In the processing part, do the following.
private static ApplicationContext appContext;

String className = MyEnum.WALK.getClassName();
Class<?> clazz = Class.forName(className);
ActivityExecutor activityExecutor = (ActivityExecutor) appContext.getBean(clazz);
activityExecutor.execute(); // executes the required Service
Another way of fixing this problem could be:
public enum ExecutorType {
    WALK, RUN
}

interface Executor {
    void execute();
    ExecutorType type();
}
Here we are able to use DI and create a CDI/Spring bean:
final class WalkExecutor implements Executor {

    @Override
    public void execute() {
        /** some logic **/
    }

    @Override
    public ExecutorType type() {
        return ExecutorType.WALK;
    }
}
Then we can access the valid executor for a given type:
public final class ExecutorService {

    private final Map<ExecutorType, Executor> executorMap;

    ExecutorService(List<Executor> executors) {
        this.executorMap = executors.stream()
                .collect(Collectors.toMap(Executor::type, Function.identity()));
    }

    public void execute(ExecutorType type) {
        executorMap.get(type).execute();
    }
}
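Outside of Spring, the same type-to-executor registry can be built with plain Java. A runnable sketch (the EnumMap and the toy executors are illustrative additions, not part of the answer above):

```java
import java.util.EnumMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

enum ExecutorType { WALK, RUN }

interface Executor {
    String execute();
    ExecutorType type();
}

class WalkExecutor implements Executor {
    public String execute() { return "walk"; }
    public ExecutorType type() { return ExecutorType.WALK; }
}

class RunExecutor implements Executor {
    public String execute() { return "run"; }
    public ExecutorType type() { return ExecutorType.RUN; }
}

class ExecutorRegistry {
    private final Map<ExecutorType, Executor> executorMap;

    ExecutorRegistry(List<Executor> executors) {
        // Index each executor under its self-declared type; an EnumMap
        // keeps lookups cheap and restricts keys to the enum constants.
        this.executorMap = executors.stream()
                .collect(Collectors.toMap(Executor::type, Function.identity(),
                        (a, b) -> a, () -> new EnumMap<>(ExecutorType.class)));
    }

    String execute(ExecutorType type) {
        return executorMap.get(type).execute();
    }
}
```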
Additionally, we can ensure that every Executor type is implemented, using either an integration test or a configuration class.
Configuration class using Spring:
@Configuration
class ExecutorConfiguration {

    /** other bean definitions **/

    @Bean
    ExecutorService executorService(List<Executor> executors) {
        if (!allExecutorsImplemented(executors)) {
            throw new RuntimeException("Invalid executor configuration");
        }
        return new ExecutorService(executors);
    }

    private boolean allExecutorsImplemented(List<Executor> executors) {
        return executors.stream().map(Executor::type).distinct().count() == ExecutorType.values().length;
    }
}
I have a method which is private. Now, I do not want this private method to be called while unit testing the execute() method. I have tried PowerMockito and others, but with all types of mocking it still enters the private method. Please suggest a workable test case; I would appreciate it.
@Component
public class Employee implements SuperClass {

    @Autowired
    private FileTraverse fileTraverse;

    @Override
    public void execute() throws Exception {
        List<String> traverse = fileTraverse.getFiles();
        Boolean t = isFileTraversed(traverse);
    }

    private Boolean isFileTraversed(List<String> param1) {
        Boolean flag;
        //do some DB operation and return flag;
    }
}
@glytching is right. The best option is to extract the method into a new service/component and create a mock for it. That way your code is testable and you can re-use the component...
BUT in case you have only one use case for this method and you don't want to create a service/component just for one helper method, you can change the method's visibility level from private to protected or package-private. Then you can override this method in a subclass for testing and work with that subclass. What you should do:
create a subclass of the class that you want to test and use an instance of this subclass instead of the target class.
// service that you have and need to test
public class MainService {

    @Autowired
    private SecondService secondService;

    public Object getResultFromMainService() {
        return getResultFromMainServiceFromPrivate();
    }

    // here I changed 'private' into package-private
    Object getResultFromMainServiceFromPrivate() {
        return secondService.getResult();
    }
}
@RunWith(SpringRunner.class)
@ContextConfiguration(classes = ServiceOverrideTestConfiguration.class)
public class MainServiceTest {

    @Autowired
    @Qualifier("subMainService") // or add @Primary and don't use Qualifier
    private MainService service;

    @Autowired
    private SecondService secondService;

    @Test
    public void test() {
        // here, getResultFromMainService internally calls the overridden
        // method that we can change
        Object result = service.getResultFromMainService();
        assertNotNull(result);
    }
}
@ContextConfiguration
@Import(ApplicationConfigure.class)
class ServiceOverrideTestConfiguration {

    @Bean("subMainService") // or add @Primary and don't use Qualifier
    MainService mainServiceSubBean() {
        return new MainServiceUnderTest();
    }
}
class MainServiceUnderTest extends MainService {
    @Override
    Object getResultFromMainServiceFromPrivate() {
        return "SOME DEFAULT";
    }
}
Please consider this approach only as a workaround for rare cases when you need to mock/stub some method and you can't use PowerMock or any other such library. Better to refactor and bring testability into your code.
Don't mock private methods.
See the suggestion below:
@Component
public class Employee implements SuperClass {

    @Autowired
    private FileTraverse fileTraverse;

    @Override
    public void execute() throws Exception {
        List<String> traverse = fileTraverse.getFiles();
        Boolean t = isFileTraversed(traverse);
    }

    private Boolean isFileTraversed(List<String> param1) {
        Boolean flag;
        //do some DB operation and return flag;
    }
}
So inside isFileTraversed - you will have a DB operation. This operation will probably be executed through a DAO/Repository object.
So your code will probably look like:
@Component
public class Employee implements SuperClass {

    @Autowired
    private FileTraverse fileTraverse;

    @Autowired
    private DatabaseAccessDao dbAccess;

    @Override
    public void execute() throws Exception {
        List<String> traverse = fileTraverse.getFiles();
        Boolean t = isFileTraversed(traverse);
    }

    private Boolean isFileTraversed(List<String> param1) {
        Boolean flag;
        flag = dbAccess.checkFileTraversed(param1);
        return flag;
    }
}
What you need to do is to mock the public checkFileTraversed() method on the DatabaseAccessDao class.
1) Don't use @Autowired on fields - prefer constructor injection.
2) Are you sure you want to return a Boolean? Is null allowed as a return value? If not, consider using the primitive boolean type.
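To make that concrete without any mocking library: once the DB call sits behind a collaborator interface and is injected through the constructor, a hand-rolled fake is enough to drive the private method from a test of execute(). The names mirror the answer above, but the simplified signatures are illustrative:

```java
import java.util.List;

// Single-method interface, so a lambda can serve as the test fake.
interface DatabaseAccessDao {
    boolean checkFileTraversed(List<String> files);
}

class Employee {
    private final DatabaseAccessDao dbAccess;

    // Constructor injection makes the collaborator easy to replace in tests.
    Employee(DatabaseAccessDao dbAccess) {
        this.dbAccess = dbAccess;
    }

    boolean execute(List<String> traverse) {
        return isFileTraversed(traverse);
    }

    // Stays private; it is exercised indirectly through execute().
    private boolean isFileTraversed(List<String> files) {
        return dbAccess.checkFileTraversed(files);
    }
}
```

In a test, `new Employee(files -> true)` stands in for the real DAO, and the assertion is made on the result of execute() rather than on the private method itself.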
Everybody else is right. You should try to avoid mocking private methods as much as you can. And if you really need to mock it, just drop the private to put it in default scope.
BUT
For the sake of completeness, you can indeed do it with PowerMock. Here is an example using PowerMock and EasyMock.
public class Employee {

    public void execute() {
        // If our mock is working, isFileTraversed will return false
        assertThat(isFileTraversed(Collections.emptyList())).isFalse();
    }

    private Boolean isFileTraversed(List<String> param1) {
        return true;
    }
}
@RunWith(PowerMockRunner.class)
@PrepareForTest(Employee.class)
public class EmployeeTest {

    @Test
    public void execute() throws Exception {
        Employee employee = PowerMock.createPartialMockForAllMethodsExcept(Employee.class, "execute");
        PowerMock.expectPrivate(employee, "isFileTraversed", Collections.emptyList()).andReturn(false);

        PowerMock.replay(employee);
        employee.execute();
        PowerMock.verify(employee);
    }
}
I have a Java application to which I want to add an extension for executing Groovy scripts. So far, so good: the parsing, compiling and execution are not the problem!
For simplicity's sake I want to keep the Groovy syntax as simple as possible (e.g. no OO skills required). Furthermore, the Groovy scripts shall be able to access library functions which are initialized by the Java classes. This is the part where the @Delegate comes into play!
Currently, I came up with two different solutions which are not completely satisfying for me:
GroovyService.java
public interface GroovyService { }
MyService.java
public class MyService implements GroovyService {

    public static final MyService INSTANCE = new MyService();

    private MyService() { /* ... */ }

    public void method1() { /* ... */ }
    public void method2() { /* ... */ }
}
Solution #1 - For each delegated method define a method shortcut
ServicesFacade.java
public class ServicesFacade {

    public static final ServicesFacade INSTANCE = new ServicesFacade();

    @Delegate MyService myService;
    // Further @Delegate of services ...

    private ServicesFacade() {
        myService = MyService.INSTANCE;
    }
}
GroovyScript.groovy
def method1 = myService.&method1
def method2 = myService.&method2

if (method1()) {
    method2()
}
The code part with the method shortcuts could be prepended to the string read from the Groovy file content. Without the shortcuts it would fulfill my expectations, but I'm looking for a solution in which I don't have to keep track of all the shortcuts.
Solution #2 - Use a list of the service type and the method wildcard access
ServicesFacade.java
public class ServicesFacade {

    public static final ServicesFacade INSTANCE = new ServicesFacade();

    @Delegate private final List<GroovyService> services = new ArrayList<>();

    private ServicesFacade() {
        this.services.add(MyService.INSTANCE);
    }

    public void addService(GroovyService service) {
        this.services.add(service);
    }
}
GroovyScript.groovy
if (services*.method1()) {
services*.method2()
}
The advantage of this solution is that I can use a fixed member name for any service (services*), but I'm not so impressed by the syntax.
The groovy scripts are used as follows:
CompilerConfiguration compilerConfiguration = new CompilerConfiguration();
compilerConfiguration.setScriptBaseClass(DelegatingScript.class.getName());
GroovyShell groovyShell = new GroovyShell(compilerConfiguration);

DelegatingScript script = (DelegatingScript) groovyShell.parse(fileContent);
if (script != null) {
    script.setDelegate(ServicesFacade.INSTANCE);
    scripts.add(script);
}

/* ... */

scripts.forEach(s -> {
    s.run();
});
Is there a better way to achieve a direct method call of the delegated methods?
I came up with a good solution: I wrote my own Script class, analogous to DelegatingScript. It looks as follows:
import groovy.lang.Binding;
import groovy.lang.MetaClass;
import groovy.lang.MissingMethodException;
import org.codehaus.groovy.runtime.InvokerHelper;

import java.util.HashMap;
import java.util.Map;

public abstract class MultiDelegatingScript extends groovy.lang.Script {

    private final Map<Object, MetaClass> delegateMap = new HashMap<>();

    protected MultiDelegatingScript() {
        super();
    }

    protected MultiDelegatingScript(Binding binding) {
        super(binding);
    }

    public void setDelegate(Object delegate) {
        this.delegateMap.put(delegate, InvokerHelper.getMetaClass(delegate.getClass()));
    }

    @Override
    public Object invokeMethod(String name, Object args) {
        for (Map.Entry<Object, MetaClass> delegate : this.delegateMap.entrySet()) {
            try {
                // Try to invoke the delegating method
                return delegate.getValue().invokeMethod(delegate.getKey(), name, args);
            } catch (MissingMethodException mme) {
                // Method not found in delegating object -> try the next one
                continue;
            }
        }
        // No delegating method found -> invoke super class method for further handling
        return super.invokeMethod(name, args);
    }
}
Using this class instead of DelegatingScript will completely fulfill my expectations!
My class structure is as follows:
public class MyParentClass {
    void doSomethingParent() {
        System.out.println("something in parent");
    }
}

public class MyClass extends MyParentClass {

    protected String createDummyRequest(Holder myHolder) {
        //...
        super.doSomethingParent(); //I want to avoid this
        //...
        callingDB();
        return "processedOutput";
    }

    private void callingDB() {
        System.out.println("Calling to DB");
    }
}
Then my unit test:
public class UnitTest {

    public void testCreateDummyRequest() {
        //create my mock holder
        Holder mockHolder = new Holder();
        MyClass mockObj = Mockito.mock(MyClass.class);
        //mock doSomethingParent()
        //mock callingDB()
        //as mockObj is a full mock, but I need to run my real method
        //Mockito.when(mockObj.createDummyRequest(mockHolder)).thenCallRealMethod();
        mockObj.createDummyRequest(mockHolder);
        //Problem: doSomethingParent() is getting called though I have mocked it
    }
}
How do I prevent the calling of the super.doSomethingParent() in my method? (method which I am writing my test)
With this class structure, mocking and testing is really hard. If possible, I'd advise changing the structure, as in most cases a class structure that's hard to mock and test is equally hard to extend and maintain.
So if you could change your class structure to something similar to:
public class MyClass {

    private DoSomethingProvider doSomethingProvider;
    private DbConnector dbConnector;

    public MyClass(DoSomethingProvider p, DbConnector c) {
        doSomethingProvider = p;
        dbConnector = c;
    }

    protected String createDummyRequest(Holder myHolder) {
        //...
        doSomethingProvider.doSomethingParent();
        //...
        dbConnector.callingDB();
        return "processedOutput";
    }
}
Then you could easily create your instance with mocks of DoSomethingProvider and DbConnector and voila....
If you can't change your class structure you need to use Mockito.spy instead of Mockito.mock to stub specific method calls but use the real object.
public void testCreateDummyRequest() {
    //create my mock holder
    Holder mockHolder = new Holder();
    MyClass mockObj = Mockito.spy(new MyClass());
    Mockito.doNothing().when(mockObj).doSomethingParent();
    mockObj.createDummyRequest(mockHolder);
}
Note: Using the super keyword prevents Mockito from stubbing that method call. I don't know if there is a way to stub calls to super. If possible (as in, you didn't override the parent method in your class), just omit the keyword.
I faced similar issue, so I find out that using spy() can hepld.
public class UnitTest {

    private MyClass myObj;

    @Before
    public void setUp() throws Exception {
        MockitoAnnotations.initMocks(this);
        myObj = spy(new MyClass());
    }

    @Test
    public void mockedSuperClassMethod() {
        doNothing().when((MyParentClass) myObj).doSomethingParent();
        //...
    }
}
This approach works for me.
I found another approach, which turned out to be very useful in my case.
In the case I had, I needed to create a new class extending another, which included a very complex (legacy code) protected final method. Due to the complexity, it wasn't really possible to refactor to use composition, so here's what I came up with.
Let's say I have the following:
abstract class Parent {

    public abstract void implementMe();

    protected final void doComplexStuff( /* a long parameter list */) {
        // very complex legacy logic
    }
}

class MyNewClass extends Parent {

    @Override
    public void implementMe() {
        // custom stuff
        doComplexStuff(/* a long parameter list */); // calling the parent
        // some more custom stuff
    }
}
Here's how I rearranged this code:
abstract class Parent {

    public abstract void implementMe();

    protected final void doComplexStuff( /* a long parameter list */) {
        // very complex legacy logic
    }
}

interface ComplexStuffExecutor {
    void executeComplexStuff(/* a long parameter list, matching the one from doComplexStuff */);
}

class MyNewClass extends Parent {

    private final ComplexStuffExecutor complexStuffExecutor;

    MyNewClass() {
        this.complexStuffExecutor = this::doComplexStuff;
    }

    MyNewClass(ComplexStuffExecutor complexStuffExecutor) {
        this.complexStuffExecutor = complexStuffExecutor;
    }

    @Override
    public void implementMe() {
        // custom stuff
        complexStuffExecutor.executeComplexStuff(/* a long parameter list */); // either calling the parent or the injected ComplexStuffExecutor
        // some more custom stuff
    }
}
When creating instance of MyNewClass for "production" purposes, I can use the default constructor.
When writing unit tests, however, I'd use the constructor, where I can inject ComplexStuffExecutor, provide a mock there and only test my custom logic from MyNewClass, i.e.:
class MyNewClassTest {

    @Test
    void testImplementMe() {
        ComplexStuffExecutor complexStuffExecutor = Mockito.mock(ComplexStuffExecutor.class);
        doNothing().when(complexStuffExecutor).executeComplexStuff(/* expected parameters */);
        MyNewClass systemUnderTest = new MyNewClass(complexStuffExecutor);
        // perform tests
    }
}
At first glance, it seems like adding boilerplate code just to make the code testable. However, I can also see it as an indicator of how the code should actually look. Perhaps one day someone (who finds the courage and budget ;) ) could refactor the code, e.g. implement ComplexStuffExecutor with the logic from doComplexStuff in Parent, inject it into MyNewClass, and get rid of the inheritance.
Here is how it can be done:
public class BaseController {
    public void method() {
        validate(); // I don't want to run this!
    }
}

public class JDrivenController extends BaseController {
    public void method() {
        super.method();
        load(); // I only want to test this!
    }
}

@Test
public void testSave() {
    JDrivenController spy = Mockito.spy(new JDrivenController());

    // Prevent/stub logic in super.method()
    Mockito.doNothing().when((BaseController) spy).validate();

    // When
    spy.method();

    // Then
    verify(spy).load();
}
Source: https://blog.jdriven.com/2013/05/mock-superclass-method-with-mockito/
Is there a way to always execute a function before any other function of a class is called?
I have a class where I need to refresh some fields always before any function is called:
public class Example {

    private int data;

    public void function1() {
    }

    public void function2() {
    }

    //@BeforeOtherFunction
    private void refresh() {
        // refresh data
    }
}
Because it seems to be bad programming, I don't want to call refresh at the beginning of every other function. Since other people are going to work on this project as well, there would be the danger that somebody extends the class and doesn't call refresh.
JUnit has a solution for this with the @Before annotation. Is there a way to do this in other classes as well?
And by the way: if you know a programming pattern which solves this problem another way than executing a function every time any function is called, that would be very helpful, too!
Use a dynamic proxy, in which you can filter for the methods before which your specific "before" method should be called, and call it in those cases before dispatching the call. Please see the answer to How do I intercept a method invocation with standard java features (no AspectJ etc)?
UPDATE:
A separate interface is needed for the proxy. The refresh() method cannot remain private: it must be public and part of the interface (which is not nice here) so that it can be called from the proxy.
package CallBefore;
public interface ExampleInterface {
    void function1();
    void function2();
    void otherFunction();
    void refresh();
}
Your class implements that interface:
package CallBefore;

public class Example implements ExampleInterface {

    @Override
    public void function1() {
        System.out.println("function1() has been called");
    }

    @Override
    public void function2() {
        System.out.println("function2() has been called");
    }

    @Override
    public void otherFunction() {
        System.out.println("otherFunction() has been called");
    }

    @Override
    public void refresh() {
        System.out.println("refresh() has been called");
    }
}
The proxy which does the trick. It filters the needed methods and calls refresh().
package CallBefore;

import java.lang.reflect.InvocationHandler;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;

public class ExampleProxy implements InvocationHandler {

    private ExampleInterface obj;

    public static ExampleInterface newInstance(ExampleInterface obj) {
        return (ExampleInterface) java.lang.reflect.Proxy.newProxyInstance(obj.getClass().getClassLoader(),
                obj.getClass().getInterfaces(), new ExampleProxy(obj));
    }

    private ExampleProxy(ExampleInterface obj) {
        this.obj = obj;
    }

    @Override
    public Object invoke(Object proxy, Method m, Object[] args) throws Throwable {
        Object result;
        try {
            if (m.getName().startsWith("function")) {
                obj.refresh();
            }
            result = m.invoke(obj, args);
        } catch (InvocationTargetException e) {
            throw e.getTargetException();
        } catch (Exception e) {
            throw new RuntimeException("unexpected invocation exception: " + e.getMessage());
        }
        return result;
    }
}
The usage:
package CallBefore;

public class Main {
    public static void main(String[] args) {
        ExampleInterface proxy = ExampleProxy.newInstance(new Example());

        proxy.function1();
        proxy.function2();
        proxy.otherFunction();
        proxy.refresh();
    }
}
Output:
refresh() has been called
function1() has been called
refresh() has been called
function2() has been called
otherFunction() has been called
refresh() has been called
This may not solve your exact problem, but at least it could be a starting point if you are allowed to consider a re-design. Below is a simple implementation, but with some small touches I believe you can achieve a more elegant solution. BTW, this is called the Dynamic Proxy pattern.
First thing you need is an interface for your class.
public interface Interface {
    void hello(String name);
    void bye(String name);
}

public class Implementation implements Interface {

    @Override
    public void hello(String name) {
        System.out.println("Hello " + name);
    }

    @Override
    public void bye(String name) {
        System.out.println("Bye " + name);
    }
}
Then the java.lang.reflect.Proxy class comes to help. This class is able to create an instance of a given interface at runtime. It also accepts an InvocationHandler, which helps you capture method calls, and looks like this.
public class InvocationHandlerImpl implements InvocationHandler {

    private final Object instance;

    public InvocationHandlerImpl(Object instance) {
        this.instance = instance;
    }

    @Override
    public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
        Object result;
        try {
            System.out.println("Before");
            result = method.invoke(instance, args);
            System.out.println("After");
        } catch (Exception e) {
            e.printStackTrace();
            throw e;
        } finally {
            System.out.println("finally");
        }
        return result;
    }
}
After all your client code will look like this.
Interface instance = new Implementation();
Interface proxy = (Interface) Proxy.newProxyInstance(
        Interface.class.getClassLoader(),
        new Class[] { Interface.class },
        new InvocationHandlerImpl(instance));

proxy.hello("Mehmet");
proxy.bye("Mehmet");
Output for this code is
Before
Hello Mehmet
After
finally
Before
Bye Mehmet
After
finally
I would define getters for every field and do the refresh inside the getter. If you want to avoid unrefreshed access to your private fields entirely, put them in a superclass (together with the getters which call refresh).
Depending on your project structure, it may be also sensible to introduce a separate class for all data that is regularly refreshed. It can offer getters and avoid that anyone accesses the non-refreshed fields.
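A sketch of that separate-class idea; the counter standing in for the real refresh logic is invented for illustration:

```java
// All regularly-refreshed state lives in one class, and the only way
// to read it is through a getter that refreshes first.
class RefreshedData {
    private int data;
    private int refreshCount;

    int getData() {
        refresh(); // every read is preceded by a refresh
        return data;
    }

    int getRefreshCount() {
        return refreshCount;
    }

    private void refresh() {
        refreshCount++;       // stand-in for re-reading the real source
        data = refreshCount;
    }
}
```

Because the field is private and the getter is the sole access path, no caller (or subclass) can forget to refresh.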
Not in Java SE, but if you are using Java EE, you could use interceptors.
For standalone applications, you could consider using a bytecode manipulation framework, like javassist.
You can have a protected getter method for data. Access the getData method instead of using the data field. Child classes will see only getData and will have updated data every time.
public class Example {

    private int data;

    public void function1() {
    }

    public void function2() {
    }

    protected int getData() {
        refresh();
        return data;
    }

    //@BeforeOtherFunction
    private void refresh() {
        // refresh data
    }
}
It is better to write another method, made protected (accessible to the child classes), which first calls the refresh method and then calls the function.
This way the data is refreshed every time before the function is called (as per your requirement).
e.g.:
protected void callFunction1() {
    refresh();
    function();
}
Thanks,
Rajesh
You should use a Decorator in this case. A Decorator is a good choice for something like an interceptor. Example here: https://msdn.microsoft.com/en-us/library/dn178467(v=pandp.30).aspx
I'm currently working on a library and I've tried to abstract parts of the code through the use of interfaces as much as possible. However, some areas need to return concretions, as I can see no other way of returning the data in a clean way. For example, one area of code needs to return a key object:
public IKey GenerateKey() {
    return new Key(param1, param2, param3);
}
My solution to this currently is to create a Factory class with static methods for returning concretions:
public IKey GenerateKey() {
    return KeyFactory.Get(param1, param2, param3);
}
But now I feel like I have high coupling between the factory and the codebase, as there's a few lines in many parts that request objects from the factory. It also just feels lazy, as I can whip up some factory function to give me a concrete class in many situations.
My biggest problem is in allowing people to replace classes by creating their own classes implementing current interfaces. I need to allow users to create their OWN factories for their OWN class implementations. If I make an interface for my factory, and allow users to create their own factory implementing the interface, then I need a higher-up factory to create THAT factory, and so on...
I need users to be able to create their own custom classes implementing the correct interfaces, create their own factory that can return their custom class by other areas of the code where needed, and it all to work seamlessly together.
Edit: This is the idea I have currently, however injecting the factory seems like an unconventional thing to do when using a library in Java.
public class Key implements IKey {
    public Key(param1) {
        //do something
    }
    //other methods
}

public interface IKey {
    //Other methods
}

public class RicksFactory implements IFactory {
    public IKey getKey(param1) {
        return new Key(param1);
    }
}

public interface IFactory {
    IKey getKey(param1);
}

//This class is a singleton, for other classes in
//the library to find it without having to do dependency
//injection down all the levels of classes.
public class TopClassInLibrary {

    //Singleton init

    private IFactory factory;

    public void SetFactory(IFactory factory) {
        this.factory = factory;
    }

    public IFactory GetFactory() {
        return factory;
    }
}

public class Main {
    public static void main(String[] args) {
        TopClassInLibrary.GetInstance().SetFactory(new RicksFactory());
    }
}
**Edit 2:** I think I've figured out a nice solution now; I'd appreciate it if someone could tell me whether it's good or not. Thanks
public class Key implements IKey {
    public Key(param1) {
        //do something
    }
    //other methods
}

public interface IKey {
    //Other methods
}

public class RicksFactory implements IFactory {
    public IKey getKey(param1) {
        return new Key(param1);
    }
}

public interface IFactory {
    IKey getKey(param1);
}

public class TopClassInLibrary {

    private static TopClassInLibrary topClass;

    public static TopClassInLibrary GetInstance() {
        if (topClass == null)
            topClass = new TopClassInLibrary();
        return topClass;
    }

    private IFactory factory;

    public void SetFactory(IFactory factory) {
        this.factory = factory;
    }

    public IFactory GetFactory() {
        if (factory == null)
            factory = new RicksFactory();
        return factory;
    }
}

public class Main {
    public static void main(String[] args) {
        //Below not needed now for default implementation
        //TopClassInLibrary.GetInstance().SetFactory(new RicksFactory());
        IKey myKey = TopClassInLibrary.GetInstance().GetFactory().getKey(param1);
    }
}
So in this setup, TopClassInLibrary never needs to be instantiated or touched by exterior code using the library, as it creates its own instance when requested, and creates the default factory if a custom one has not been set.
Your solution is the Abstract Factory pattern, a perfect match in your case.