Java Spring, autowired singleton components and performance

Our Java Spring app exposes a REST API and is fairly typical.
It is built in layers:
RestController classes
Service classes
Repository classes
Oracle DB
Those layers use the annotations @RestController, @Service, etc., with @Autowired to access the lower level.
Now, all those end up being singletons. But since there is no state, it should not be a problem. Multiple requests can proceed concurrently using the same singleton objects (in different threads of execution). At least, that's my understanding.
Now, for one of our new REST API endpoints, we have to call a third-party API, so the data path will look like this:
RestController
Service
CustomRestClient
third-party server somewhere
In CustomRestClient, there is an @Autowired RestTemplate instance. Because everything is a singleton, this RestTemplate will be one as well.
Now I checked, and it is thread-safe, so it should work without any concurrent-access exceptions. However, I wonder:
Will that constitute a bottleneck, with all parallel requests having to wait to use this one shared RestTemplate because of some internal locks or something? How can I make sure no thread has to wait?
(By the way, one reason the RestTemplate is autowired is to make it easy to mock in unit tests.)
I guess my question is really about how to make sure this Spring singleton architecture can serve lots of parallel requests even when one layer has to use an injected object that may have state. And if that's impossible and that object has to be instantiated with "new", how can we still unit-test the class using the mock approach?
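For reference, here is a minimal sketch of the kind of client described above (the method name and URL are only illustrative), with the RestTemplate passed in through the constructor so that production code gets the shared singleton while a unit test can hand in a mock:
@Component
public class CustomRestClient {

    private final RestTemplate restTemplate;

    // Constructor injection: Spring supplies the singleton RestTemplate;
    // a unit test can pass a Mockito mock instead.
    public CustomRestClient(RestTemplate restTemplate) {
        this.restTemplate = restTemplate;
    }

    public String fetchFromThirdParty(String id) {
        // RestTemplate keeps no per-request state, so concurrent threads
        // can safely share this single instance.
        return restTemplate.getForObject("https://third-party.example/api/{id}", String.class, id);
    }
}
In a test, the class can then be instantiated directly with a mocked RestTemplate, without starting a Spring context.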

Related

Spring Boot: How to make a generic class that is not tightly coupled to the framework?

Front-End dev here working on my first Java Spring Boot API. I've been reading many articles on the "best practices" in Spring/Spring Boot and have been attempting to refactor my code to follow those practices.
Below I have an example of a generic class I use to handle all HTTP requests for my various services. Originally I had this class annotated with the @Component annotation, but as I mentioned I am trying to learn and follow Spring "best practices." In particular I am interested in implementing what this article on best practices describes (numbers 3 & 4 in the article). That says one should avoid using @Component, because we don't want to be tightly coupled to the Spring framework and we want to avoid "entire class path scanning."
@Slf4j
public class HttpService {

    private HttpServletRequest request;
    private RestTemplate restTemplate;

    public HttpService(HttpServletRequest request, RestTemplateBuilder restTemplateBuilder) { ... }

    public String get(String url, String id) { ... }
}
With the @Component annotation my service works as expected, but when I remove it I get the exception:
org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type
This seems to be a pretty common exception in Java, as there are a LOT of questions about it, but those solutions have not worked for me.
The question I'm hoping the community can help me answer has two parts:
How do I correctly make this a usable generic class that is not tightly coupled to Spring?
Is this actually a correct approach for Spring development or am I reading too deeply into this article?
That says one should avoid using @Component because we don't want to be tightly coupled to the Spring framework ...
It says we don't want our domain classes to be tightly coupled to Spring. There is a difference between HttpService (which is bound to the web context and depends on Spring's RestTemplateBuilder) and MySuperSpecificDomainCalculator (which should keep working regardless of the context it is put in).
... and we want to avoid "entire classpath scanning."
I see nothing extremely evil in using classpath scanning. You may let Spring scan a small set of packages. You may point exactly to where your Spring-dependent classes reside. A @Configuration class with @Bean methods is an alternative, and sometimes the only one.
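For example, scanning can be restricted to a single package (the package name below is made up):
@Configuration
@ComponentScan(basePackages = "com.example.myapp.web")
public class WebConfig {
    // Only classes under com.example.myapp.web are picked up as components.
}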
How do I correctly make this a usable generic class that is not tightly coupled to Spring?
You are designing a web layer. You are choosing a technology to use. You decided to go with Spring Web. At some point, you will have to bind Spring's classes to yours. As you already did. HttpService is dependent on Spring regardless of the annotation. It takes RestTemplateBuilder which is a Spring class.
And there is nothing wrong with that. You need to write such classes to integrate the framework into your application and to make it work for you. Just make sure the line between the Spring world and your domain world is well-defined.
Is this actually a correct approach for Spring development or am I reading too deeply into this article?
The article is reasonable. Your HttpService looks like a valid @Component to me. Don't overthink it.
Looking at the article you have linked, I think there is a misunderstanding of the term "domain". The "domain" is where the application generates its value. Taking Steam as an example, the capability to search and buy games generates (business) value. An HTTP service, on the other hand, is a technical detail. It is necessary, yes, but it generates no value on its own. So points 3 and 4 of the article do not apply to the HttpService.
In a typical web application, you have at least three layers:
One layer providing the API: those are your endpoints and all objects that are used by the API. This includes, for example, entities representing the requests and responses passed through/returned by the endpoints.
One layer providing Persistence: Typically, these are repositories, DAOs, or whatever persistence model you want to use. Again, everything needed by the persistence unit should be included in this layer.
One layer holding the actual domain: this is what we talk about when we talk about the "domain". Here we see what the application actually does.
Now comes the tricky part: the web and the database are nothing more than I/O channels. They could be a CLI and some files, HTTP requests/responses and a database, or whatever will be developed in the future. A good software design allows for easy swapping of the I/O channel while preserving the domain-specific code. You see this in many applications:
OpenShift has a REST API and a CLI
AWS has a REST API and several CLIs, many are 3rd party
Steam has a web storefront, a desktop app and a mobile client
Keycloak allows pulling authentication information from different sources (LDAP, OAuth2/OIDC, database,...)
But to get this flexibility, we have to segregate somewhere. What is a technicality? What is domain? How cleanly do I want/have to decouple? In your concrete case: does it matter whether you bind yourself to Spring Boot within your service?
To achieve such an architecture, we use patterns and principles. A well-known set of principles are the SOLID principles. How strictly one should follow those principles is a personal matter everyone has to decide for her-/himself. My personal opinion is that a pragmatic approach is often sufficient.
I also want to pick up @chrylis' comment on abstraction. We as software engineers like to indulge ourselves in technicalities and create a super-awesome abstraction that can basically deal with everything. But that is not the point. It does not have to deal with everything, only with what it will be used for. If you work for clients: that is not what the client is paying you for. And abstraction always comes at a cost. In most cases, this cost is complexity and/or readability. Or, in the words of wo*men wiser than me: KISS and YAGNI.
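To make the segregation a bit more concrete, here is a rough sketch along the lines of the Steam example above (all names are hypothetical): the domain exposes an interface that knows nothing about HTTP or the CLI, and each I/O channel is a thin adapter around it.
// Domain layer: no Spring, no HTTP, no CLI.
public interface GameSearchService {
    List<Game> search(String query);
}

// Web adapter: translates an HTTP request into a domain call.
@RestController
public class GameSearchController {

    private final GameSearchService gameSearchService;

    public GameSearchController(GameSearchService gameSearchService) {
        this.gameSearchService = gameSearchService;
    }

    @GetMapping("/games")
    public List<Game> search(@RequestParam String query) {
        return gameSearchService.search(query);
    }
}

// CLI adapter: same domain logic, different I/O channel.
public class GameSearchCommand {

    private final GameSearchService gameSearchService;

    public GameSearchCommand(GameSearchService gameSearchService) {
        this.gameSearchService = gameSearchService;
    }

    public void run(String query) {
        gameSearchService.search(query).forEach(System.out::println);
    }
}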
Adding @Component to your class forces anyone who uses your class to know about Spring (i.e. Spring becomes a compile-time dependency).
The easiest alternative is to create a separate class annotated with @Configuration in your app, and let it handle registering your class as a Spring bean.
For example:
@Configuration
public class MyConfiguration {

    @Bean
    public HttpService httpService(HttpServletRequest request, RestTemplateBuilder restTemplateBuilder) {
        // pass in whatever the HttpService constructor needs
        return new HttpService(request, restTemplateBuilder);
    }
}
This keeps your HttpService class free from Spring dependencies (assuming it doesn't use any other Spring annotations such as @Autowired), but lets it behave as a Spring bean in your own application.
Note that your class still depends on RestTemplateBuilder, which itself is a Spring Boot class, which means your class (and anyone who uses it) will still require Spring.
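Because HttpService is then a plain class, a unit test can construct it directly with mocks and no Spring context. A sketch using Mockito, assuming the two-argument constructor shown in the question:
// No Spring context needed; mocks stand in for the constructor arguments.
HttpServletRequest request = Mockito.mock(HttpServletRequest.class);
RestTemplateBuilder builder = Mockito.mock(RestTemplateBuilder.class);
Mockito.when(builder.build()).thenReturn(new RestTemplate());

HttpService service = new HttpService(request, builder);
// exercise service.get(...) and assert on the result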

Clean way to wire variables after serialization?

I've got a web application (Tomcat 7, Spring 4.2, ZK 7). As I have two servers that can "take over" each other's sessions, serialization of the sessions is required, which leads to the problem that I have to somehow re-initialize the Spring services after deserialization. Due to the structure of ZK, the Composers (a kind of Controller) need to be serialized (and these Composers use Services).
For example, let's say I have an object that needs to be serialized. This object has a reference to a Spring service (which cannot be serialized, since in the end there's a reference to a DataSource, SqlSessionTemplate, etc. - none of them Serializable).
So, how do I solve this problem elegantly? Is there some way to integrate Spring into the deserialization process so that Spring automatically re-wires my (transient, autowired) variables after (or even while) deserialization?
The current solution is to have a singleton bean lying around that has an @Autowired reference to the ApplicationContext, so that I can access it via getInstance() to get a reference to a Service, but this solution is not very elegant and also makes testing more complex (since I prefer to unit test without loading a Spring context).
Is there some other, preferably better, way to do this?
It seems that the most obvious and elegant answer is to declare the ScopedProxyMode of a bean, which wraps it in a proxy and dynamically resolves the non-serializable dependencies, for example:
@Scope(proxyMode = ScopedProxyMode.TARGET_CLASS)
More can be found in the Spring documentation here. This has also been discussed here on Stack Overflow already (with a link to the presentation where they announced it).
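A sketch of that declaration (class names are illustrative): the composer only ever holds the lightweight proxy, and the proxy resolves the real, non-serializable bean again after deserialization.
// The service bean is declared with a scoped proxy; composers keep a
// reference to the proxy rather than to the real bean.
@Service
@Scope(proxyMode = ScopedProxyMode.TARGET_CLASS)
public class CustomerService {
    // depends on DataSource, SqlSessionTemplate, ... (none of them Serializable)
}

public class CustomerComposer implements Serializable {

    @Autowired
    private CustomerService customerService; // injected as the proxy
}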

Which approach should I use to inject 10+ services in Spring controller?

I am developing a Spring MVC application.
I have a controller in which I am injecting more than 10 services.
I am exposing 10+ URLs from this controller, and each method uses only one or two of those service objects.
I was thinking of 2 approaches.
Directly inject all services using @Autowired.
Fetch the service from the ApplicationContext at runtime whenever required.
Please suggest which approach is better, or whether both approaches are equal in terms of memory usage and time.
Thanks
The best approach in most cases is to break up the controller into multiple controllers.
Having too many dependencies is a Code Smell since your controller most likely violates the Single Responsibility Principle.
Both using @Autowired for many dependencies and using the ApplicationContext to dynamically retrieve the dependency are mediocre solutions in most cases and should be avoided whenever possible. What you should do is break up the controller and then use @Autowired (preferably constructor rather than field injection - check out this for more details) to make Spring inject the dependencies.
In the case you describe, you should not be worried about the performance or the memory consumption of your proposed solutions, but about the maintainability of the code.
Although dynamic lookup of a dependency from the ApplicationContext will be a little slower than accessing a dependency injected by the container, in almost all cases you will never be able to tell the difference. As I mentioned above, the first concern you should be looking at is code maintainability, not micro-performance/memory issues.
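As a sketch of that direction (the controller, service, and endpoint names are made up): split the endpoints into smaller controllers and let each one receive only the few services it actually needs through its constructor.
@RestController
@RequestMapping("/orders")
public class OrderController {

    private final OrderService orderService;
    private final InvoiceService invoiceService;

    // Constructor injection: Spring supplies the two services;
    // a unit test can pass mocks directly.
    @Autowired
    public OrderController(OrderService orderService, InvoiceService invoiceService) {
        this.orderService = orderService;
        this.invoiceService = invoiceService;
    }

    @GetMapping("/{id}")
    public Order findOrder(@PathVariable long id) {
        return orderService.find(id);
    }
}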

Can I pool controllers in a Spring MVC application, and if so, how will it affect performance?

In Spring 3.0, controllers can be created simply by annotating the class with @Controller, and they are singletons by default.
So to cater to many requests, the container will have only one object of that type.
On the other hand, if it is a prototype, then many objects will be created and hence resource utilization will be poor.
Please correct me if I am wrong. My question is: can I pool the controllers, and if I can, will it improve concurrency and throughput?
You are correct that all Controllers are singletons by default.
Unless your Controller is stateful there is no need to have a pool of instances. Your web container will be using a managed pool of Threads to handle requests, each of which can access the Controller at the same time (due to there being no shared state). I would suggest that tuning your web container will give you better results for concurrency and throughput.
If your Controllers are stateful then there is still no need for a pool of instances. Instead you should probably manage the state within Session or Request scoped beans and rely on Spring to inject these into the Controller on each request ensuring that multiple Threads of execution do not interfere with one another.
Given your current level of understanding you should be fairly comfortable with the different scopes. I would also suggest reading about and understanding how Spring makes use of proxies to inject scoped beans into Controllers.
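A sketch of the stateful case (bean and field names are made up): the per-request state lives in a request-scoped bean, and the singleton Controller only ever sees a proxy that resolves to the current request's instance.
// Holds state for a single request; a new instance exists per request.
@Component
@Scope(value = "request", proxyMode = ScopedProxyMode.TARGET_CLASS)
public class RequestState {

    private String correlationId;

    public String getCorrelationId() { return correlationId; }

    public void setCorrelationId(String correlationId) { this.correlationId = correlationId; }
}

@Controller
public class ReportController {

    // The injected reference is a proxy; each thread's calls are routed
    // to the bean bound to its own request.
    @Autowired
    private RequestState requestState;
}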

Java web application layout, please explain some design principles/patterns

I'm looking at this Java web application that is using Hibernate, JSPs, and the Spring Framework (from what I can tell!).
The file layout is like this:
classes/com/example/project1
inside project1
/dao
_entity_Dao.java
/dao/hibernate/
_entity_DaoHibernate.java
/factory
DaoFactory.java
DaoFactoryImpl.java
/managers
_entity_Manager.java
/managers/impl
_entity_ManagerImpl.java
/model
_entity_.java
/service
_xxxx_Service.java
/service/impl/
_xxxx_ServiceImpl.java
Have you guys read about this sort of layout somewhere? Is it considered a best practice?
What is the difference between a Factory and a Manager and a Service? (high level)
For a typical layout of an application built with Spring, I'd look at the example web applications that ship with it (meaning Spring).
Using things like DaoFactory is definitely not a best practice; the DAOs should get injected instead. In general you should not need factories with Spring except in some unusual cases. Injection is done when the web application starts up: Spring reads the configuration information, constructs all the objects, and plugs them in according to the configuration XML and/or annotations (this assumes singleton scope for your objects, which is usual for stateless things like DAOs and services; things scoped as prototypes get new copies created as the application asks for them).
In Spring applications a service is similar to a Stateless Session Bean: it is a transactional layer encompassing the application logic for a use case. So if your user takes an action that causes several different tables to be updated, you can inject the DAOs into that service, have a method on that service do the updates on the DAOs, and configure Spring to wrap that service in a proxy that makes that method transactional.
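A rough sketch of such a service (the DAO and method names are invented): the DAOs are injected instead of being obtained from a factory, and the use-case method is marked transactional so that both updates commit or roll back together.
@Service
public class OrderService {

    private final OrderDao orderDao;
    private final AuditDao auditDao;

    @Autowired
    public OrderService(OrderDao orderDao, AuditDao auditDao) {
        this.orderDao = orderDao;
        this.auditDao = auditDao;
    }

    @Transactional
    public void placeOrder(Order order) {
        // Spring wraps this service in a proxy so that both updates
        // run in a single transaction.
        orderDao.save(order);
        auditDao.recordOrderPlaced(order.getId());
    }
}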
I've seen manager used as another name for what I described as service. Here I don't know what they're doing with it.
I don't like the idea of combining your interfaces and impls in one project. Just because you want to consume the interface doesn't mean you want to consume the impl and its cumbersome transitive dependencies. The main reason is that there will be more than one impl (hypothetically, e.g. JPA/JDBC/Hibernate, or Axis2/CXF, etc.). The interfaces should not be bound to the implementation, otherwise the point is lost. This also allows for easy dependency injection, as the impls simply reside on the classpath, and then something like a proxy or Spring (for example) can inject the implementations.
In all likelihood, all you need is:
Interface Project
dao
EntityDao
types
Entity
HibernateImpl Project
dao
EntityHibernateDao
src/main/resources/
EntityMapping.cfg.xml
