Is it a bad practice to have a Spring Service break down its functionality by implementing multiple interfaces and then having Spring inject that one Service instance using the interface that declares only the required methods where needed?
Like:
public interface OperationsService1 {
    public void operation1();
    public void operation2();
}

public interface OperationsService2 {
    public void operation3();
    public void operation4();
}

@Service
public class OperationsServiceImpl implements OperationsService1, OperationsService2 {
    public void operation1() {}
    public void operation2() {}
    public void operation3() {}
    public void operation4() {}
}
and then in the calling class:
@Autowired
private OperationsService1 ops1;
or
@Autowired
private OperationsService2 ops2;
This is more a matter of design than of Spring, from my point of view. Generally, a class should be responsible for a single piece of functionality (see the Single Responsibility Principle). So one service class should implement one service interface.
and then having Spring inject the Service instance using the interface
that declares only the required methods where needed?
First of all, I feel like you are a bit confused: in your example there won't be a separate instance for each interface. When you write
@Autowired
private OperationsService1 ops1;
@Autowired
private OperationsService2 ops2;
both fields will point to the same OperationsServiceImpl bean, because Spring beans are singletons by default. What you have here is one instance and two interfaces that reference it. Autowiring by interface just means that through the first interface you can call only some of the bean's methods, and through the second interface some other methods of the same bean.
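To make that concrete, here is a minimal sketch (a hypothetical component, assuming the default singleton scope and no special proxying) showing that both injected references resolve to the same object:
@Component
public class SingletonDemo {

    @Autowired
    private OperationsService1 ops1;

    @Autowired
    private OperationsService2 ops2;

    @PostConstruct
    public void verify() {
        // Both interfaces are implemented by the single OperationsServiceImpl bean,
        // so the two references point to the same instance.
        System.out.println(ops1 == (Object) ops2); // prints "true"
    }
}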
Is it a good practice?
I don't think so. Usually one would use an interface with multiple object instances offering different functionality, which is not the case here, as explained above. It's going to get even messier if other classes start implementing these interfaces and you have to use @Qualifier to distinguish between them. If you want a clean solution, separate OperationsServiceImpl into two classes, each implementing its corresponding interface. It would be less complex and easier for new developers to maintain.
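A minimal sketch of that split (the *Impl class names are just placeholders):
@Service
public class OperationsService1Impl implements OperationsService1 {
    public void operation1() { /* ... */ }
    public void operation2() { /* ... */ }
}

@Service
public class OperationsService2Impl implements OperationsService2 {
    public void operation3() { /* ... */ }
    public void operation4() { /* ... */ }
}
Each consumer then depends only on the interface it actually needs, and @Qualifier is never required because every interface has exactly one implementation.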
Related
I've been learning a lot about Design Patterns lately, specifically Dependency Injection. I'm pretty sure that abstract factories are a good way of instantiating objects that have dependencies. However, I'm not sure how to tell lower-level objects which factories they are supposed to use.
Consider following simplified example:
I have a class MainProgram (I just made this up to represent that there is other code in my program).
At some point during runtime I want to instantiate an IGeneticAlgorithm with an abstract factory:
public class MainProgram {
    private AbstractGeneticAlgorithm geneticAlgorithm;
    private IGeneticAlgorithmFactory geneticAlgorithmFactory;

    public MainProgram(IGeneticAlgorithmFactory geneticAlgorithmFactory) {
        this.geneticAlgorithmFactory = geneticAlgorithmFactory;
    }

    private void makeGeneticAlgorithm() {
        geneticAlgorithm = geneticAlgorithmFactory.getInstance();
    }

    public static void main(String[] args) {
        MainProgram mainProgram = new MainProgram(new FastGeneticAlgorithmFactory());
        //...
    }
}
public interface IGeneticAlgorithmFactory {
    public IGeneticAlgorithm getInstance();
}

public class FastGeneticAlgorithmFactory implements IGeneticAlgorithmFactory {
    public IGeneticAlgorithm getInstance() {
        return new FastGeneticAlgorithm();
    }
}
public abstract class AbstractGeneticAlgorithm {
    private IIndividual individual;
    private IIndividualFactory individualFactory;

    private void makeIndividual() {
        individual = individualFactory.getInstance();
    }
    //...
}
At some point during runtime I want to instantiate an IIndividual in my GeneticAlgorithm. The IIndividual can't be instantiated at startup. The need to instantiate IIndividuals during runtime comes from the way genetic algorithms work: after each step of selection, recombination and mutation, new individuals have to be instantiated (for more information see https://en.wikipedia.org/wiki/Genetic_algorithm). I chose to give the AbstractGeneticAlgorithm only one IIndividual here to keep the example simple.
public class FastGeneticAlgorithm extends AbstractGeneticAlgorithm {
    private IIndividual individual;
    private IIndividualFactory individualFactory;
}
public interface IIndividualFactory {
    public IIndividual getInstance();
}

public class SmallIndividualFactory implements IIndividualFactory {
    public IIndividual getInstance() {
        return new SmallIndividual();
    }
    //...
}
public interface IIndividual {
    //...
}

public class SmallIndividual implements IIndividual {
    //...
}
Making the SmallIndividualFactory a static variable in the FastGeneticAlgorithm doesn't seem to me like good practice. And passing the SmallIndividualFactory to Main, so that Main can pass it down to FastGeneticAlgorithm also doesn't seem right.
My question is how to solve this? Thank you.
When it comes to using Dependency Injection, the Abstract Factory pattern is often over-used. This doesn't mean that it's a bad pattern per se, but in many cases there are more suitable alternatives. This is described in detail in Dependency Injection Principles, Practices, and Patterns (paragraph 6.2), where it is explained that:
Abstract Factories should not be used to create short-lived, stateful dependencies, since a consumer of a dependency should be oblivious to its lifetime; from the perspective of the consumer, there should conceptually be only one instance of a service.
Abstract Factories are often Dependency Inversion Principle (DIP) violations, because their design often doesn't suit the consumer, while the DIP states that "the abstracts are owned by the upper/policy layers", meaning that the consumer of the abstraction should dictate its shape and define the abstraction in a way that suits its needs the most. Letting the consumer depend on both a factory dependency and the dependency it produces complicates the consumer.
This means that:
Abstract Factories with a parameterless create method should be avoided, because a parameterless factory implies that the dependency is short-lived and its lifetime is controlled by the consumer. Instead, Abstract Factories should be reserved for dependencies that conceptually require runtime data (provided by the consumer) to be created.
But even in case a factory method contains parameters, care must be taken to make sure that the Abstract Factory is really required. The Proxy pattern is often (but not always) better suited, because it allows the consumer to have a single dependency, instead of depending on both the factory and its product.
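As a rough sketch of that Proxy idea (the class name below is made up for illustration and is not part of the question's code): instead of injecting an IIndividualFactory next to the product it creates, the consumer can depend on the IIndividual abstraction alone, while a proxy implementation hides the creation of the short-lived object.
// Illustrative only: a proxy that implements the consumed abstraction and
// creates the real (short-lived) instance lazily, so the consumer needs just
// one dependency instead of a factory plus its product.
public class LazyIndividualProxy implements IIndividual {
    private IIndividual target;

    private IIndividual target() {
        if (target == null) {
            target = new SmallIndividual(); // created on first use
        }
        return target;
    }

    // Each IIndividual method would simply delegate to target(), e.g.:
    // public void mutate() { target().mutate(); }
}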
Dependency Injection promotes composition of classes in the start-up path of the application, a concept the book refers to as the Composition Root. The Composition Root is a location close to that application's entry point (your Main method) and it knows about every other module in the system.
Because the Composition Root takes a dependency on all other modules in the system, it typically makes little sense to consume Abstract Factories within the Composition Root. For instance, if you defined an IXFactory abstraction to produce IX dependencies, but the Composition Root is the sole consumer of that IXFactory abstraction, you are decoupling something that doesn't require decoupling: the Composition Root intrinsically knows about every other part of the system anyway.
This seems to be the case with your IGeneticAlgorithmFactory abstraction. Its sole consumer appears to be your Composition Root. If that is true, this abstraction and its implementation can simply be removed, and the code inside its getInstance method can be moved into the MainProgram class (which functions as your Composition Root).
It's hard for me to tell whether or not your IIndividual implementations require a factory (it has been at least 14 years since I implemented a genetic algorithm at university), but they seem more like runtime data than 'real' dependencies. So a factory might make sense here, although you should verify whether their creation and implementation really must be hidden behind an abstraction. I could imagine the application being sufficiently loosely coupled if FastGeneticAlgorithm created SmallIndividual instances directly. This, however, is just a wild guess.
On top of that, best practice is to apply Constructor Injection, which prevents Temporal Coupling. Furthermore, refrain from declaring the implementation's dependencies in the abstraction, as your AbstractGeneticAlgorithm does. That makes the abstraction a Leaky Abstraction (which is a DIP violation). Instead, declare the dependencies as constructor arguments on the implementation (FastGeneticAlgorithm in your case).
But even with the existence of the IIndividualFactory, your code can be simplified by following best practices as follows:
// Use interfaces rather than base classes. Prefer Composition over Inheritance.
public interface IGeneticAlgorithm { ... }

public interface IIndividual { ... }

public interface IIndividualFactory {
    public IIndividual getInstance();
}

// Implementations
public class FastGeneticAlgorithm implements IGeneticAlgorithm {
    private IIndividualFactory individualFactory;

    // Use Constructor Injection to declare the implementation's dependencies
    public FastGeneticAlgorithm(IIndividualFactory individualFactory) {
        this.individualFactory = individualFactory;
    }
}

public class SmallIndividual implements IIndividual { }

public class SmallIndividualFactory implements IIndividualFactory {
    public IIndividual getInstance() {
        return new SmallIndividual();
    }
}

public class Program {
    public static void main(String[] args) {
        IGeneticAlgorithm algorithm = createAlgorithm();
        algorithm.makeIndividual();
    }

    private static IGeneticAlgorithm createAlgorithm() {
        // Build the complete object graph inside the Composition Root
        return new FastGeneticAlgorithm(new SmallIndividualFactory());
    }
}
I have recently reviewed a few Spring projects.
In some of them I saw interfaces in the DAO and service layers that were created for just one class, like this:
public interface EmployeeDao {
    //some method declarations
}

public class EmployeeDaoImp implements EmployeeDao {
    // method overrides
}

public interface CompanyDao {
    //some method declarations
}

public class CompanyDaoImp implements CompanyDao {
    // method overrides
}
There is no need for polymorphism in this case.
Why do we need these interfaces? What is the advantage of them?
I hope I could express myself.
Spring uses interfaces a lot because it follows the design principle "Program to an interface, not an implementation".
See Programming to an interface
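One concrete benefit (a hypothetical sketch; EmployeeService and InMemoryEmployeeDao are made-up names, only EmployeeDao comes from the question): consumers depend only on the interface, so the real DAO can be swapped for a stub in tests, or for a different implementation later, without touching the consumer.
@Service
public class EmployeeService {

    private final EmployeeDao employeeDao;

    // The service only knows the interface; Spring decides which implementation to inject.
    public EmployeeService(EmployeeDao employeeDao) {
        this.employeeDao = employeeDao;
    }
}

// In a unit test, the same service can be constructed with a hand-written stub:
// EmployeeService service = new EmployeeService(new InMemoryEmployeeDao());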
I am working on GWT project with JDK7. It has two entryPoints (two clients) that are located in separate packages of the project. Clients share some code that is located in /common package, which is universal and accessible to both by having the following line in their respective xml-build files:
<source path='ui/common' />
Both clients have their own specific implementations of the Callback class, which serve their running environments and perform various actions in case of failure or success. I have the following abstract class that implements the AsyncCallback interface and then gets extended by its respective client.
public abstract class AbstractCallback<T> implements AsyncCallback<T> {
    public void handleSuccess( T result ) {}
    ...
}
Here are the client's classes:
public class Client1Callback<T> extends AbstractCallback<T> {...}
and
public class Client2Callback<T> extends AbstractCallback<T> {...}
In the common package, that also contains these callback classes, I am working on implementing the service layer that serves both clients. Clients use the same back-end services, just handle the results differently. Based on the type of the client I want to build a corresponding instance of AbstractCallback child without duplicating anonymous class creation for each call. I am going to have many declarations that will look like the following:
AsyncCallback<MyVO> nextCallback = isClient1
    ? new Client1Callback<MyVO>("ABC") {
          public void handleSuccess(MyVO result) {
              doThatSameAction(result);
          }
      }
    : new Client2Callback<MyVO>("DEF") {
          public void handleSuccess(MyVO result) {
              doThatSameAction(result);
          }
      };
That will result in very verbose code.
The intent (in pseudo-code) is to have the below instead:
AsyncCallback<MyVO> nextCallback = new CallbackTypeResolver.ACallback<MyVO>(clientType, "ABC") {
    public void handleSuccess(MyVO result) {
        doThatSameAction(result);
    }
};
I was playing with the factory pattern to get the right child instance, but quickly realized that I am not able to override handleSuccess() method after the instance is created.
I think the solution may come from one of two sources:
A different GWT way of dealing with custom Callback implementations; let's call it an existing alternative solution.
Java generics/type-juggling magic.
I may be missing something obvious, and would appreciate any advice.
I've read some articles here and on Oracle's site about type erasure for generics, so I understand that my question may have no direct answer.
Refactor out the handleSuccess behavior into its own class.
The handleSuccess behavior is a separate concern from what else is going on in the AsyncCallback classes; therefore, separate it out into a more useful form. See Why should I prefer composition over inheritance?
Essentially, by doing this refactoring, you are transforming an overridden method into injected behavior that you have more control over. Specifically, you would have instead:
public interface SuccessHandler<T> {
    public void handleSuccess(T result);
}
Your callback would look something like this:
public abstract class AbstractCallback<T> implements AsyncCallback<T> {
    private final SuccessHandler<T> handler; // Inject this in the constructor

    // etc.

    // not abstract anymore
    public void handleSuccess(T result) {
        handler.handleSuccess(result);
    }
}
Then your pseudocode callback creation statement would be something like:
AsyncCallback<MyVO> nextCallback = new CallbackTypeResolver.ACallback<MyVO>(
    clientType,
    "ABC",
    new SuccessHandler<MyVO>() {
        public void handleSuccess(MyVO result) {
            doThatSameAction(result);
        }
    });
The implementations of SuccessHandler don't have to be anonymous; they can be top-level classes or even inner classes, based on your needs. There's a lot more you can do once you're using this injection-based approach, including creating these handlers with automatically injected dependencies using Gin and Guice Providers. (Gin is a project that integrates Guice, a dependency injection framework, with GWT.)
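For example, a reusable top-level handler could replace the repeated anonymous classes (a sketch; SameActionHandler and ActionPerformer are made-up names standing in for wherever doThatSameAction actually lives):
// Made-up collaborator interface standing in for the object that owns doThatSameAction.
public interface ActionPerformer {
    void doThatSameAction(MyVO result);
}

// Shared by both clients; only the surrounding callback differs per client.
public class SameActionHandler implements SuccessHandler<MyVO> {
    private final ActionPerformer performer;

    public SameActionHandler(ActionPerformer performer) {
        this.performer = performer;
    }

    public void handleSuccess(MyVO result) {
        performer.doThatSameAction(result);
    }
}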
In all of the Guice examples I have found, getting an instance involves calling Injector.getInstance() with the concrete class as a parameter. Is there a way to get an instance from Guice using only the interface?
public interface Interface {}
public class Concrete implements Interface {}
Interface instance = injector.getInstance(Interface.class);
Thanks
Actually that's exactly what Guice is made for.
In order to make getInstance() work with an interface you'll need to first bind an implementation of that interface in your module.
So you'll need a class that looks something like this:
public class MyGuiceModule extends AbstractModule {
    @Override
    protected void configure() {
        bind(Interface.class).to(Concrete.class);
    }
}
Then when you create your injector you just need to pass an instance of your module in:
Injector injector = Guice.createInjector(new MyGuiceModule());
Now your call to injector.getInstance(Interface.class) should return a new instance of Concrete using the default constructor.
Of course there are many, many more ways you can configure bindings, but this is probably the most straightforward.
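For instance, a couple of other common binding styles look like this (a sketch; the module name is made up, and the alternatives are commented out because they would conflict with the active binding):
public class OtherBindingsModule extends AbstractModule {
    @Override
    protected void configure() {
        // Bind to a pre-built instance instead of letting Guice construct it.
        bind(Interface.class).toInstance(new Concrete());

        // Or bind in singleton scope so every injection point shares one Concrete:
        // bind(Interface.class).to(Concrete.class).in(Singleton.class);
    }

    // Or use a @Provides method when construction needs custom logic:
    // @Provides
    // Interface provideInterface() {
    //     return new Concrete();
    // }
}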
It works for interfaces as well:
bind(Interface.class).to(Concrete.class);
Without using a Module, you can also specify the implementation class to be used by default, directly in the interface declaration:
@ImplementedBy(Concrete.class)
public interface Interface {}
This doesn't necessarily fit every situation, but I find it comes in handy most of the time.
Additionally, when using the @ImplementedBy annotation, you can still override the implementation class by binding another concrete class in a module. That can also be useful.
I am interested in getting the class being proxied from Spring, rather than the proxy.
i.e.:
public class FooImpl extends AbstractFoo<KittyCat> {
    @Transactional
    public void doStuff() {
        getBar();
        // java.lang.ClassCastException: $Proxy26 cannot be cast to
        // com.my.foo.Bar
    }
}

public abstract class AbstractFoo<T extends AbstractBar> {
    public String barBeanName;

    protected T getBar() {
        // java.lang.ClassCastException: $Proxy26 cannot be cast to
        // com.my.foo.Bar
        return (T) appContext.getBean(barBeanName);
    }
}
public class KittyCat extends AbstractBar {
...
}
public abstract class AbstractBar {
...
}
Are you trying to get the proxied bean only because of the ClassCastException? If you could cast to Bar, would you be happy with that?
When Spring creates a proxy, it checks to see if the bean class implements any interfaces. If it does, then the generated proxy will also implement those interfaces, but it will not extend the target bean's class. It does this using a standard java.lang.reflect.Proxy. This seems to be the case in your example.
If the target bean's class does not implement any interfaces, then Spring will use CGLIB to generate a proxy class which is a subclass of the target bean's class. This is sort of a stop-gap measure for proxying non-interface beans.
You can force Spring to always proxy the target class, but how you do that depends on how you created the Bar proxy to begin with, and you haven't told us that.
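For example, if the proxy comes from annotation-driven transaction management, class-based (CGLIB) proxying can usually be forced via the proxy-target-class setting. A sketch in Java config (adjust to however your Bar proxy is actually created; the config class name is made up):
// Forces subclass-based (CGLIB) proxies so the bean can be cast to its concrete class.
// XML equivalent: <tx:annotation-driven proxy-target-class="true"/>
@Configuration
@EnableTransactionManagement(proxyTargetClass = true)
public class AppConfig {
    //...
}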
The generally preferred solution is to refer to your proxied beans by their interfaces, and then everything works nicely. If your Bar class implements an interface, could your Foo not refer to that interface instead?