In Spring I have a controller, a service interface which provides the methods this controller can access. The controller invokes various implementation methods of the service.
To achieve the same 'separation of design' in Scala, is the following the correct implementation?
Define the Scala controller, and define a Scala trait which acts as the service interface. Define a new class which extends this trait and provides the implementations of the service methods. The controller then instantiates this new class and calls the service methods through it.
Is this good design, and is it how Spring MVC is used in practice?
As has been commented by others, 'good design' is a flexible concept depending on other factors. I shall not add to that discussion but offer an overview of our approach instead.
We started with a conventional Java & Spring webapp, although we chose Jersey instead of Spring MVC. Later, we recoded in Scala, which went well. We deliberately kept to a Java-like style of Scala - this may be seen as uncool but it works well and makes it easy to train up new colleagues.
Then we decided to drop Spring, along with its XML and the whole shebang of transitive dependencies. This was easy because we already had a set of services and controllers that were all classes with constructor-injected dependencies (all TDD of course). All we had to do was write a new Bootstrap class that instantiates the services and controllers, supplying the necessary concrete classes in each constructor parameter list. Conveniently, the Bootstrap class is essentially a transliteration of the original Spring wiring into (quite simple) Scala. The Bootstrap class is started from web.xml when the app starts. (This approach will be familiar to anyone who has used Pico Container.)
In our case, we didn't need to use traits much in our service layers; clean design of concrete classes driven by TDD was sufficient. But our approach would also work well with pluggable abstractions for the services, if that were necessary.
Now we have a webapp with no XML except web.xml, purely in Scala so it's easy to navigate and modify, and with far fewer external dependencies. This worked very well for us.
"Good design" is quite subjective and the meaning of "good design" changes over time for each programmer. There are a few things that most people consider best practices, yet even best practices have conflicts. My personal opinion is that a programmer should continue to learn these best practices and more importantly keep molding his code until it reaches the best shape for that situation. That point however, where it's the 'best' shape keeps changing as the programmer keeps learning.
I can not tell you what "good design" design is in your case as I don't know the situation. On top of that, I am not you, so my "good design" is not the best for you. I would suggest you find it yourself with the help of some questions:
Who are you programming for?
How long will your code live?
Who will maintain your code?
Do you want to create automatic tests and are you willing to change your design for that?
Do you need more than one implementation of a single principle?
What style feels right for you at this point in time?
How often will the code change?
Do you want to take the future into account, or only create what is needed right now?
What libraries do you like to use?
How much time do you want to spend?
In my Spring project I have many simple services for fetching data (just simple CRUD). The design chosen by the developers who started this project was to create an interface and an implementation for each service, like
public interface UserService
and then implementation like
public class UserServiceImpl implements UserService
Since there is no chance that UserService will ever have more than one implementation, I'm really sick of these Impl suffixes, and the more I read (e.g. this article) the more I realise that I have reasons to be sick of them.
I had a discussion with a friend from the team last week and I shared my thoughts with him, but what he answered was 'basically you're right, but Spring likes interfaces and works better with them than with classes'.
Unfortunately I'm not an expert in Spring, and although I tried to look for some arguments, I was not able to find an answer to whether he was right.
Are there some strong arguments for using such an approach in Spring, i.e. having an interface for every little service class?
I can tell from real-world projects that it works well without interfaces, having only the implementing class. Following the principle "You aren't gonna need it" (YAGNI), you simplify your code if you follow that rule. Dependency injection also works well with classes; interfaces are not a requirement for it.
Sure, you can write and reuse test implementations, but you can do the same with mocks, e.g. with Mockito, and override the behavior of your implementation class for test cases.
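For example, a minimal sketch with Mockito and JUnit 5 (the UserService class and findName method below are made up purely for illustration):

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;

class UserServiceTest {

    // Hypothetical concrete service class, with no interface behind it.
    static class UserService {
        String findName(long id) { return "real lookup for " + id; }
    }

    @Test
    void behaviourCanBeOverriddenOnTheConcreteClass() {
        // Mockito mocks the concrete (non-final) class directly.
        UserService users = mock(UserService.class);
        when(users.findName(1L)).thenReturn("alice");

        assertEquals("alice", users.findName(1L));
    }
}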
I've gone through all the answers here, but would like to add more on proxies.
Spring AOP can use either a JDK dynamic proxy or a CGLIB proxy:
If the class implements an interface, Spring will use a JDK dynamic proxy (preferred whenever you have a choice).
If the class does not implement an interface, Spring will use a CGLIB proxy.
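If you want class-based proxies regardless, Spring lets you force CGLIB; a minimal sketch (the configuration class name is arbitrary):

import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.EnableAspectJAutoProxy;

// proxyTargetClass = true forces subclass-based (CGLIB) proxies,
// even for beans that do implement an interface.
@Configuration
@EnableAspectJAutoProxy(proxyTargetClass = true)
public class AopConfig {
}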
Wherever you want to reap the benefits of the dependency injection (DI) pattern, you need to program against abstractions, usually an interface.
There are more benefits to DI, but the most persuasive seems to be that it allows unit testing. There your interfaces will get at least one more implementation (the mock implementation) when you want to test your class in isolation from its dependencies (those production implementations of the interfaces).
That said, that doesn't mean every class must implement some interface. Some parts of code can be tightly coupled together without problem.
Note that using Spring or not doesn't play a role in the decision whether to use DI or not.
It isn't a must, and this may be opinion-based, but you add an interface to enable future flexibility of the service.
Although you don't see a real usage yet, it will allow you to use a different implementation of specific services inside unit/integration tests.
You can add a test implementation and use it instead of the real service when executing tests (for example by using a different Spring profile), as sketched below.
This can also be done using mocks, as @Simulant points out.
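As a rough sketch of the Spring-profile approach (all names below are invented for illustration):

import org.springframework.context.annotation.Profile;
import org.springframework.stereotype.Service;

interface GreetingService {
    String greet();
}

// Registered only when the "production" profile is active.
@Service
@Profile("production")
class DefaultGreetingService implements GreetingService {
    public String greet() { return "hello from the real service"; }
}

// Swapped in when tests activate the "test" profile.
@Service
@Profile("test")
class StubGreetingService implements GreetingService {
    public String greet() { return "hello from the stub"; }
}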
Actually it's not needed; currently, microservices and small code bases are popular.
So normally, in a REST API backend, you really do not get the chance to have several implementations of a certain interface.
In this situation, a concrete class with @Service is enough.
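That is, something along these lines (hypothetical class, no interface and no Impl suffix):

import org.springframework.stereotype.Service;

// Injected by its concrete type; no UserService/UserServiceImpl pair needed.
@Service
public class UserService {
    // plain CRUD methods live directly in this class
}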
As others have suggested, it really depends on the use case. Spring, and Java in general, started out verbose, with designs where interfaces were supposed to define what the clients of the implementation classes can see, but I am finding less and less verbose code these days, especially with Spring Boot and libraries like Lombok.
So it is not mandatory to create interfaces for services and DAOs, but it is preferred if you are working on a fairly medium-sized code base where there are multiple developers and possibly clients consuming those APIs outside of the application as well. But if you are working on a small or proof-of-concept project, you can build a CRUD application in a single Java class as well.
I'm trying to do something clever. I am creating a weather application in which we can replace the weather API with another weather API without affecting the code base. So I started with a Maven project with multiple modules.
I have a Base module that contains the Interface class and the Base class. The Interface class contains the calls to the APIs (all calls are similar, if not exact) and the Base class contains the properties to the APIs (again, all properties are similar, if not exact).
I have a module for each of the two weather APIs we are testing with plans to create more modules for new weather APIs as we grow the application.
Finally, I have created a Core module (includes main) to implement the specific module class for the weather API I want to test.
Now, I know the simplest way to do this would be to use a switch statement and enumeration. But I want to know if there is a more clever way to do this. Maybe using a Pattern? Any suggestions?
This is a learning process for me. I want to discover how a real Java Guru would implement the appropriate module and class based on a specified configuration.
Thank you for your suggestions.
I'm trying to do something clever. I am creating a weather application
in which we can replace the weather API with another weather API
without affecting the code base.
Without reading further down, this first statement makes me think of a plugin architecture design. But in the process of software design, decisions must not be rushed: the more you delay them, the more information you have and the better informed the decision can be. For now it is just an idea to keep in mind.
I have a Base module that contains the Interface class and the Base
class. The Interface class contains the calls to the APIs (all calls
are similar, if not exact) and the Base class contains the properties
to the APIs (again, all properties are similar, if not exact).
When different modules share behaviour/state, it is a good idea to refactor them and produce abstract base classes and interfaces, so you are on the right track. But if there are differences, those shouldn't be refactored into the base module. The reason behind that is simple: maintainability. If you start adding if clauses or switches to deal with these differences, you have just introduced coupling between modules, and you'll always be making changes in the base module whenever you add or modify other modules, which is not desirable at all.
This is reflected by the Open/Closed principle from the SOLID principles, which states that a class should be open for extension but closed for modification.
So after you've refactored the common behaviour into the base modules, then each new API should extend the base module, as you did.
Finally, I have created a Core module (includes main) to implement the
specific module class for the weather API I want to test.
Now, I know the simplest way to do this would be to use a switch
statement and enumeration. But I want to know if there is a more
clever way to do this. Maybe using a Pattern? Any suggestions?
Indeed, making use of a switch makes it work, but it's not a clean design at all, for the same reason as before: adding, modifying or removing modules would require modifying this module as well, and this code can potentially break.
One possible solution would be to delegate this responsibility to a new component and make use of a creational design pattern like the Abstract Factory, which provides an interface for instantiating components without specifying their concrete classes.
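As a rough sketch of that idea (a simple registry-based factory rather than a full Abstract Factory), where IWeather and ForecastIO come from the question and OpenWeather is an invented second provider:

import java.util.Map;
import java.util.function.Supplier;

// Sketch only: IWeather is the shared interface from the question;
// ForecastIO and OpenWeather stand in for the per-API modules.
interface IWeather { /* common weather-API operations */ }
class ForecastIO implements IWeather { }
class OpenWeather implements IWeather { }

class WeatherProviderFactory {
    // The registry replaces the switch: existing entries never change,
    // and new providers are added without touching the callers.
    private static final Map<String, Supplier<IWeather>> PROVIDERS = Map.of(
            "forecastio", ForecastIO::new,
            "openweather", OpenWeather::new);

    static IWeather forName(String configuredName) {
        return PROVIDERS.get(configuredName).get();
    }
}

The Core module would then only ask the factory for whatever provider name appears in its configuration.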
As for the architecture, so far, the plugin architecture still makes sense, but what if the different modules extend the base contract adding more features? One option is to use the Facade pattern to adapt the module calls and provide an output that implements an interface that clients expect.
Then again, with the provided details this is the solution I'd suggest, but the scenario should be studied carefully and in greater detail in order to be sure that these are the right tools for the job before committing to them.
In addition to Salvador Juan Martinez's answer...
To implement a plugin architecture Java's Jar File Specification provides support for service provider interfaces (SPI) and how they are looked up.
As of Java 1.6 you can use the ServiceLoader to look up service providers. For Java 1.5 and earlier you must do it on your own or use a library, e.g. commons-discovery.
The usage is quite simple. In your case, put a META-INF/services/com.a2i.weatherbase.IWeather file in each plugin module.
In the Weather Forecast IO module the file should contain only one line
com.a2i.weatherforecastio.ForecastIO
The line must be the fully qualified name of an IWeather implementation class.
Do the same for the other module and you can load the implementations via ServiceLoader.
ServiceLoader<IWeather> weatherServicesLoader = ServiceLoader.load(IWeather.class);
Iterator<IWeather> weatherServices = weatherServicesLoader.iterator();
Now it depends on your runtime classpath how many services will be found. Try to add and remove module jar archives from the classpath and run your application.
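For instance, a small sketch that picks the discovered provider matching a configured class name (the selector class itself is invented; IWeather and the package names come from the question):

import java.util.ServiceLoader;

import com.a2i.weatherbase.IWeather;

public class WeatherProviderSelector {

    // Returns the provider on the classpath whose class name matches the configured one,
    // e.g. "com.a2i.weatherforecastio.ForecastIO".
    public static IWeather select(String configuredClassName) {
        for (IWeather candidate : ServiceLoader.load(IWeather.class)) {
            if (candidate.getClass().getName().equals(configuredClassName)) {
                return candidate;
            }
        }
        throw new IllegalStateException("No IWeather provider found for " + configuredClassName);
    }
}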
EDIT
I wrote a blog about a pluggable architecture with standard java. See http://www.link-intersystems.com/blog/2016/01/02/a-plug-in-architecture-implemented-with-java/
Source code is also available at https://github.com/link-intersystems/blog/tree/master/java-plugin-architecture
One solution is to define a common interface with all the identified common operations. The extensions/plugins need to implement that interface and provide the implementations of the common operations.
You can use an abstract factory design pattern to hook up the exact implementation at runtime based on the input parameters.
Interfaces and abstract classes are always good in such scenarios, Thanks.
I have a few questions about Spring paradigm in Java:
1) Suppose I have an application where I write everything to interfaces, and then at the very last moment, somewhere in my actual main() or maybe in a config file, I define my specific classes to be used. Have I achieved the same objective as Spring? In that case, why do I need Spring's DI? Writing to interfaces, and leaving specifics till the very last moment, is standard practice that programmers have been using for decades.
2) If the objection is to new'ing objects at some (final) point in time, this has to be done at some point in my interface-driven app, but what's wrong with that? How does having a "new" statement make a class unusable or untestable - or is it just readability/transparency?
3) People say that declaratively using objects "gets rid of dependencies." But we still have a dependency: we have to import a new class, even if we don't "new" it, before we can compile the code?
Some people, like me, prefer to configure the wiring of dependencies and interface implementations using Spring XML rather than hardcode them. All the wirings are in one place (assuming you are not using annotations) and I can also argue that modifying the configuration of the XML file is easier than modifying code. You can also tweak the Spring file between runs of your application if there is something that needs to change.
Spring is a good framework that has been around for a while. I find it's really really good at Dependency Injection (DI). While there is nothing "wrong" with your approach in #1, I think using Spring will give you a more robust implementation. Why reinvent the wheel?
I have been learning Spring for quite some time (just learning, without actual hands-on experience on a real project). In my understanding, Spring provides a DI framework, which allows a centralized way of connecting/wiring all the classes in one place. The classes themselves do not compose/instantiate other components.
I can understand that DI allows easier unit testing of each component, as they depend on interfaces.
My question is: why does wiring all the classes in a centralized way (externally) help in the development process (besides testing), compared to the traditional way (each class instantiating its own dependencies)?
This link on DI explains it pretty well:
http://en.wikipedia.org/wiki/Dependency_injection#Motivation
The primary purpose of the dependency injection pattern is to allow selection among multiple implementations of a given dependency interface at runtime, or via configuration files, instead of at compile time. The pattern is particularly useful for providing "mock" test implementations of complex components when testing; but is often used for locating plugin components, or locating and initializing software services.
Unit testing of components in large software systems is difficult, because components under test often require the presence of a substantial amount of infrastructure and set up in order to operate at all. Dependency injection simplifies the process of bringing up a working instance of an isolated component for testing. Because components declare their dependencies, a test can automatically bring up only those dependent components required to perform testing.
It improves the quality of your code by reducing coupling between classes.
If a class instantiates an instance of another class, then there is a direct dependency between the two classes (= tight coupling). For example, if class A has a has-a relationship with interface B, and class A handles the instantiation of B, then class A must specify a concrete implementation to instantiate, and those classes become tightly coupled.
Let's say we have the following interface:
interface B{}
and then the following Class
class A {
    private B b = new BImpl(); // A picks the concrete implementation itself
    ...
}
In the above example (without DI), Class A has an explicit dependency on BImpl, which means if you ever want to use a different implementation of B then you also have to change Class A.
DI (and loose coupling in general) aims to remove these kind of dependencies and have a code base where changes to one part of the code do not "ripple" through the entire application requiring lots of changes. The above example is pretty trivial, but if you have a medium to large size codebase with tight coupling this problem can get pretty bad.
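For contrast, a minimal sketch of the same classes with the dependency injected through the constructor (the concrete BImpl is then chosen by whoever wires the application, whether that is a main method or a DI container):

class A {
    private final B b;

    // A no longer picks the implementation; the caller (or container) supplies it.
    A(B b) {
        this.b = b;
    }
}

// wiring, e.g. in main or in container configuration:
// A a = new A(new BImpl());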
why does wiring all the classes in centralize way
Centralized configuration is an implementation detail rather than part of DI. Guice for example can spread the configuration about a bit (I've not used Spring in anger so I can not comment on it).
why does wiring all the classes ... externally
As this allows you to change the implementation. DI is not the only way but it is the most popular. Factories and Service Locators are the main alternatives. Without some way of swapping out the implementation testing becomes near impossible.
development process (besides testing)
Testing is a very important part. It alone is a good reason to separate creation and use.
Unlike the other two methods above and direct initialization, DI also makes the dependencies visible (especially constructor injection), which can help other users of the class. By making the dependencies so visible, it can also warn you when your class is doing too much (as it will require a lot of dependencies).
The DI concept is independent of Spring. Dependency injection is possible even without using Spring, i.e. manual injection of dependencies is also possible.
Please refer to the Java example given on the wiki: http://en.wikipedia.org/wiki/Dependency_injection#Manually_injected_dependency
DI's main purpose is loose coupling.
Spring provides IoC (Inversion of Control). When we use Spring to inject dependencies using the Spring IoC container, we get features like:
1) Loose coupling, which reduces the time needed to add new features. Coding to interfaces provides this: add a new service which complies with the interface and swap it in via the bean configuration.
2) No code changes or recompilation required when changing a dependency.
3) Easier and faster testing. Hence you can cover more cases in the same time frame, which leads to a better product.
4) Spring provides lots of different templates to make developers' lives easier. All these templates use the DI/IoC concept, which leads to a faster development cycle. Such templates are available for batch processing, JMS, JMX, JDBC operations and many more.
Annotations are becoming popular. Spring 3 supports them. CDI depends on them heavily (I can not use CDI without annotations, right?)
My question is why?
I heard several issues:
"It helps get rid of XML". But what is bad about xml? Dependencies are declarative by nature, and XML is very good for declarations (and very bad for imperative programming).
With good IDE (like idea) it is very easy to edit and validate xml, is not it?
"In many cases there is only one implementation for each interface". That is not true!
Almost all interfaces in my system has mock implementation for tests.
Any other issues?
And now my pluses for XML:
You can inject anything anywhere (not only code that has annotations)
What should I do if I have several implementations of one interface? Use qualifiers? But that forces my class to know what kind of injection it needs, which is not good for design.
XML based DI makes my code clear: each class has no idea about injection, so I can configure it and unit-test it in any way.
What do you think?
I can only speak from experience with Guice, but here's my take. The short of it is that annotation-based configuration greatly reduces the amount you have to write to wire an application together and makes it easier to change what depends on what... often without even having to touch the configuration files themselves. It does this by making the most common cases absolutely trivial at the expense of making certain relatively rare cases slightly more difficult to handle.
I think it's a problem to be too dogmatic about having classes have "no idea about injection". There should be no reference to the injection container in the code of a class. I absolutely agree with that. However, we must be clear on one point: annotations are not code. By themselves, they change nothing about how a class behaves... you can still create an instance of a class with annotations as if they were not there at all. So you can stop using a DI container completely and leave the annotations there and there will be no problem whatsoever.
When you choose not to provide metadata hints about injection within a class (i.e. annotations), you are throwing away a valuable source of information on what dependencies that class requires. You are forced to either repeat that information elsewhere (in XML, say) or to rely on unreliable magic like autowiring which can lead to unexpected issues.
To address some of your specific questions:
It helps get rid of XML
Many things are bad about XML configuration.
It's terribly verbose.
It isn't type-safe without special tools.
It mandates the use of string identifiers. Again, not safe without special tool support.
Doesn't take any advantage of the features of the language, requiring all kinds of ugly constructs to do what could be done with a simple method in code.
That said, I know a lot of people have been using XML for long enough that they are convinced that it is just fine and I don't really expect to change their minds.
In many cases there is only one implementation for each interface
There is often only one implementation of each interface for a single configuration of an application (e.g. production). The point is that when starting up your application, you typically only need to bind an interface to a single implementation. It may then be used in many other components. With XML configuration, you have to tell every component that uses that interface to use this one particular binding of that interface (or "bean" if you like). With annotation-based configuration, you just declare the binding once and everything else is taken care of automatically. This is very significant, and dramatically reduces the amount of configuration you have to write. It also means that when you add a new dependency to a component, you often don't have to change anything about your configuration at all!
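In Guice, for example, that single production binding is one line in a module; a minimal sketch (PaymentService and CreditCardPaymentService are invented names):

import com.google.inject.AbstractModule;
import com.google.inject.Guice;
import com.google.inject.Injector;

// Hypothetical types, used only to show a single binding.
interface PaymentService { }
class CreditCardPaymentService implements PaymentService { }

public class ProductionModule extends AbstractModule {
    @Override
    protected void configure() {
        // Declared once; every component that injects PaymentService gets this implementation.
        bind(PaymentService.class).to(CreditCardPaymentService.class);
    }

    public static void main(String[] args) {
        Injector injector = Guice.createInjector(new ProductionModule());
        PaymentService service = injector.getInstance(PaymentService.class);
    }
}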
That you have mock implementations of some interface is irrelevant. In unit tests you typically just create the mock and pass it in yourself... it's unrelated to configuration. If you set up a full system for integration tests with certain interfaces using mocks instead... that doesn't change anything. For the integration test run of the system, you're still only using 1 implementation and you only have to configure that once.
XML: You can inject anything anywhere
You can do this easily in Guice and I imagine you can in CDI too. So it's not like you're absolutely prevented from doing this by using an annotation-based configuration system. That said, I'd venture to say that the majority of injected classes in the majority of applications are classes that you can add an @Inject to yourself if it isn't already there. The existence of a lightweight standard Java library for annotations (JSR-330) makes it even easier for more libraries and frameworks to provide components with an @Inject annotated constructor in the future, too.
More than one implementation of an interface
Qualifiers are one solution to this, and in most cases should be just fine. However, in some cases you do want to do something where using a qualifier on a parameter in a particular injected class would not work... often because you want to have multiple instances of that class, each using a different interface implementation or instance. Guice solves this with something called PrivateModules. I don't know what CDI offers in this regard. But again, this is a case that is in the minority and it's not worth making the rest of your configuration suffer for it as long as you can handle it.
I have the following principle: configuration-related beans are defined with XML. Everything else - with annotations.
Why? Because you don't want to change configuration in classes. On the other hand, it's much simpler to write @Service and @Inject in the class that you want to enable.
This does not interfere with testing in any way - annotations are only metadata that is parsed by the container. If you like, you can set different dependencies.
As for CDI - it has an extension for XML configuration, but you are right it uses mainly annotations. That's something I don't particularly like in it though.
In my opinion, this is more a matter of taste.
1) In our project (using Spring 3), we want the XML configuration files to be just that: configuration. If something doesn't need to be configured (from the end-user perspective) and no other issue forces it to be done in XML, don't put the bean definitions/wirings into the XML configuration; use @Autowired and such.
2) With Spring, you can use @Qualifier to pick a certain implementation of the interface if multiple exist. Yes, this means you have to name the actual implementations, but I don't mind.
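A rough sketch of what that looks like (all names here are invented):

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.stereotype.Service;

interface Notifier { void send(String message); }

@Service("emailNotifier")
class EmailNotifier implements Notifier {
    public void send(String message) { /* send an e-mail */ }
}

@Service("smsNotifier")
class SmsNotifier implements Notifier {
    public void send(String message) { /* send an SMS */ }
}

@Service
class AlertService {
    private final Notifier notifier;

    // The qualifier names which implementation of the interface is wanted.
    AlertService(@Qualifier("smsNotifier") Notifier notifier) {
        this.notifier = notifier;
    }
}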
In our case, using XML for handling all the DI would bloat the XML configuration files a lot, although it could be done in a separate XML file (or files), so that's not a very strong point ;). As I said, it's a matter of taste, and I just think it's easier and cleaner to handle the injections via annotations (you can see what services/repositories/whatever a class uses just by looking at the class, instead of going through the XML file looking for the bean declaration).
Edit: Here's an opinion about @Autowired vs. XML that I completely agree with: Spring @Autowired usage
I like to keep my code clear, as you pointed out. XML fits better, at least for me, with the IoC principle.
The fundamental principle of Dependency Injection for configuration is that application objects should not be responsible for looking up the resources or collaborators they depend on. Instead, an IoC container should configure the objects, externalizing resource lookup from application code into the container. (J2EE Development without EJB - Rod Johnson - page 131)
Again, it's just my point of view, no fundamentalism there :)
EDIT: Some useful discussions out there:
http://forum.springsource.org/showthread.php?t=95126
http://www.theserverside.com/discussions/thread.tss?thread_id=61217
"But what is bad about xml?" It's yet another file to manage and yet another place to have to go look for a bug. If your annotations are right next to your code it's much easier to mange and debug.
Like all things, dependency injection should be used in moderation. Moreover, all trappings of the injections should be segregated from the application code and relegated to the code associated with main.
In general, applications should have a boundary that separates the abstract application code from the concrete implementation details. All the source code dependencies that cross that boundary should point towards the application. I call the concrete side of that boundary the main partition, because that's where 'main' (or its equivalent) should live.
The main partition consists of factory implementations, strategy implementations, etc. And it is on this side of the boundary that the dependency injection framework should do its work. Those injected dependencies can then be passed across the boundary into the application by normal means (e.g. as arguments).
The number of injected dependencies should be relatively small. A dozen or less. In which case, the decision between XML or annotations is moot.
Also don't forget Spring JavaConfig.
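That is, the wiring expressed in plain Java; a small sketch, reusing the UserService/UserServiceImpl names from the earlier question:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Minimal stand-ins so the example is self-contained.
interface UserService { }
class UserServiceImpl implements UserService { }

@Configuration
public class AppConfig {

    // The wiring lives in code: type-safe, refactorable and IDE-navigable.
    @Bean
    public UserService userService() {
        return new UserServiceImpl();
    }
}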
In my case the developers writing the application are different from the ones configuring it (different departments, different technologies/languages), and the latter group doesn't even have access to the source code (which is the case in many enterprise setups). That makes Guice unusable, since I would have to expose source code rather than have them consume the XML files configured by the developers implementing the app.
Overall I think it is important to recognize that providing the components and assembling/configuring an application are two different exercises and provide if needed this separation of concerns.
I just have a couple of things to add to what's already here.
To me, DI configuration is code. I would like to treat it as such, but the very nature of XML prevents this without extra tooling.
Spring JavaConfig is a major step forward in this regard, but it still has complications. Component scanning, auto-magic selection of interface implementations, and semantics around CGLIB interception of @Configuration annotated classes make it more complex than it needs to be. But it's still a step forward from XML.
The benefit of separating IoC metadata from application objects is overstated, especially with Spring. Perhaps if you confined yourself to the Spring IoC container only, this would be true. But Spring offers a wide application stack built on the IoC container (Security, Web MVC, etc). As soon as you leverage any of that, you're tied to the container anyway.
XML's only benefit is a declarative style that is clearly separated from the application code itself, which stays independent of DI concerns. The downsides are verbosity, poor refactoring robustness and failures that only show up at runtime. There is only generic (XML) tool support, which offers little compared to IDE support for e.g. Java. Besides this, XML comes with a performance overhead, so it is usually slower than code-based solutions.
Annotations are often said to be more intuitive and robust when refactoring application code, they benefit from better IDE guidance (as Guice provides), and they are generally considered to run faster than XML. But they mix application code with DI concerns: the application becomes dependent on a framework, and clear separation is almost impossible. Annotations are also limited when describing different injection behaviour in the same place (constructor, field) depending on other circumstances (e.g. the robot legs problem). Moreover, they don't let you treat external classes (library code) like your own source.
Both techniques have serious downsides. Therefore I recommend using Silk DI. It is declaratively defined in code (great IDE support) but 100% separated from your application code (no framework dependency). It lets you treat all code the same, no matter whether it comes from your source or an external library. Problems like the robot legs problem are easy to solve with ordinary bindings. Furthermore, it has good support for adapting it to your needs.