Spring's JavaConfig and Prototype Beans

I've moved my code from Spring's XML configuration to Java Configuration. I have everything working, but I have a question about how I implemented prototype beans - mainly, while what I'm doing works, is it the best way to do this? Somehow it just feels off!
I wrote the bean class this way:
@Component
@Scope("prototype")
public class ProtoBean {
    ...
}
Then to use the bean - this is the part that I'm just not sure about, although it does work:
@Component
public class BeanUser implements ApplicationContextAware {

    ApplicationContext context;

    @Override
    public void setApplicationContext(ApplicationContext context) throws BeansException {
        this.context = context;
    }

    public void getProtoBean() {
        ProtoBean protoBean = context.getBean(ProtoBean.class);
    }
}
This gets me a prototyped bean, and in unit tests I just mocked the context, called setApplicationContext with the mock, and had the getBean call of the mock return a mock ProtoBean. So all is well.
I did this in the XML by using a factory, but that didn't seem to work too well, so this is where I ended up. But is there a way to do this without the context? Or just a better way?
Thanks!

I don't think this is so much an issue of Spring XML vs. Java-based configuration, but one of matching dependency scopes. Since Spring can only do dependency injection on the singleton-scoped bean at creation time, you have to look up the prototype-scoped bean on demand. Of course the current bean-lookup approach works, but it creates a dependency on the ApplicationContext. I can suggest a few other possibilities, but the root of the issue is really what is involved in producing a ProtoBean and what trade-offs you are willing to accept.
You could make BeanUser itself prototype-scoped, which would allow you to wire in the ProtoBean as a member. The trade-off, of course, is that you now have the same problem with the clients of BeanUser, but sometimes that is not an issue.
Another path could be using something like a singleton-scoped ProtoBeanFactory to provide ProtoBean instances, and hiding dependency lookups within the ProtoBeanFactory.
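A minimal sketch of that factory idea (ProtoBeanFactory and its createProtoBean() method are illustrative names; the point is that the ApplicationContext dependency stays confined to one place):
@Component
public class ProtoBeanFactory implements ApplicationContextAware {

    private ApplicationContext context;

    @Override
    public void setApplicationContext(ApplicationContext context) throws BeansException {
        this.context = context;
    }

    // Each call yields a new instance because ProtoBean is prototype-scoped.
    public ProtoBean createProtoBean() {
        return context.getBean(ProtoBean.class);
    }
}

@Component
public class BeanUser {

    @Autowired
    private ProtoBeanFactory protoBeanFactory;

    public void useProtoBean() {
        ProtoBean protoBean = protoBeanFactory.createProtoBean();
        // ...
    }
}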
Finally, you could use a scoped-proxy bean to effectively hide the factory. It uses AOP to do this, and it isn't always clear to others what sort of voodoo you have going on. With XML you'd use <aop:scoped-proxy/> on the bean declaration. For annotations you'd use:
@Scope(proxyMode = ScopedProxyMode.TARGET_CLASS, value = "prototype")
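Applied to the original example, a sketch might look like this (doWork() is just an illustrative method; note that every method call on the injected proxy is delegated to a freshly resolved prototype instance):
@Component
@Scope(value = "prototype", proxyMode = ScopedProxyMode.TARGET_CLASS)
public class ProtoBean {

    public void doWork() {
        // ...
    }
}

@Component
public class BeanUser {

    // Looks like a plain dependency, but is actually an AOP proxy.
    @Autowired
    private ProtoBean protoBean;

    public void useProtoBean() {
        // Each call goes to a new ProtoBean instance behind the proxy.
        protoBean.doWork();
    }
}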

Related

Prevent injection of bean with narrower scope using Spring

I'm working on a Spring application using beans of different scopes. Many beans are singletons, others are request or custom scoped. Especially with those custom scopes it is sometimes difficult to find out which scope can be safely injected into which other scope, or when e.g. a Provider<T> needs to be used.
I am aware that I can just create scoped proxies for all beans that are not singletons, but in many cases that does not seem to be necessary. For example, a bean might only be supposed to be injected into other beans of the same scope, but not everyone working on the project might be aware of that. Thus, it would be great if one could somehow prevent "misuse" of those beans, especially since one might not always recognize the mistake in time.
So my question is: Is there some way to define which scope can be safely injected into which other scope, and then prevent beans with narrower scope from being directly (without using Provider<T>) injected into e.g. singleton beans?
It looks like this can be achieved fairly simply using a custom BeanPostProcessor. Within postProcessBeforeInitialization, you can simply check the scope of the bean and the scope of all its dependencies. Here is a simple example:
@Component
public class BeanScopeValidator implements BeanPostProcessor {

    private final ConfigurableListableBeanFactory configurableBeanFactory;

    @Autowired
    public BeanScopeValidator(ConfigurableListableBeanFactory configurableBeanFactory) {
        this.configurableBeanFactory = configurableBeanFactory;
    }

    @Override
    public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
        String beanScope = configurableBeanFactory.getBeanDefinition(beanName).getScope();
        String[] dependenciesForBean = configurableBeanFactory.getDependenciesForBean(beanName);
        for (String dependencyBeanName : dependenciesForBean) {
            String dependencyBeanScope = configurableBeanFactory.getBeanDefinition(dependencyBeanName).getScope();
            // TODO: Check if the scopes are compatible and throw an exception
        }
        return bean;
    }
}
This example is still very basic and is not really convenient to use. Most prominently, it lacks the capability of defining which scope can be injected into which other scope. Thus I've created a more complete example here. Using this project, the following injections are allowed by default (a sketch of such a compatibility check follows the list):
Singletons can be injected into everything
Everything can be injected into prototypes
AOP proxies can be injected into everything
Everything can be injected into beans of the same scope
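For the TODO in the BeanPostProcessor above, a minimal compatibility check along the lines of those default rules might look like this (a sketch of my own, not code from the linked project; AOP-proxy handling is omitted):
// Inside BeanScopeValidator, replacing the TODO:
if (!isInjectionAllowed(dependencyBeanScope, beanScope)) {
    throw new BeanCreationException(beanName, "Bean '" + dependencyBeanName + "' (scope '"
            + dependencyBeanScope + "') must not be injected into bean '" + beanName
            + "' (scope '" + beanScope + "')");
}

// True if a bean of dependencyScope may be injected into a bean of targetScope.
// Note: getScope() returns "" for beans that default to singleton scope.
private boolean isInjectionAllowed(String dependencyScope, String targetScope) {
    boolean dependencyIsSingleton = dependencyScope.isEmpty()
            || ConfigurableBeanFactory.SCOPE_SINGLETON.equals(dependencyScope);
    return dependencyIsSingleton                                           // singletons can go anywhere
            || ConfigurableBeanFactory.SCOPE_PROTOTYPE.equals(targetScope) // prototypes may receive anything
            || dependencyScope.equals(targetScope);                        // same scope is fine
}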
If you want to allow a bean to be injected into another scope, it needs to be explicitly allowed by using a respective annotation:
@Bean
@Scope("prototype")
@InjectableInto("singleton")
MyBean getMyBean() {
    //...
}

How Singleton bean can be Autowired in different places spring boot

I'm confused at this point. I know all Spring Boot application beans are singletons by default, and according to my understanding, if we have a class annotated with the @Service annotation, that bean can be @Autowired in only one class (correct me if I'm wrong). Here is code that works fine, but I'm trying to understand how it works: how can one bean be @Autowired in two different classes?
How can the SampleService2 bean be @Autowired in SampleController2 and SampleController3 at the same time?
And is this the recommended approach? In this case, can two threads change the data inside the bean in parallel?
SampleController2
@RestController
@RequestMapping(value = "samplemock")
public class SampleController2 {

    @Autowired
    private SampleService2 sampleservice2;

    @RequestMapping(value = "/mock1", method = RequestMethod.GET)
    public void mockCall1() {
        sampleservice2.m1();
    }
}
SampleController3
@RestController
@RequestMapping(value = "samplemock2")
public class SampleController3 {

    @Autowired
    private SampleService2 sampleservice2;

    @RequestMapping(value = "/mock1", method = RequestMethod.GET)
    public void mockCall1() {
        sampleservice2.m1();
    }
}
SampleService2
@Service
public class SampleService2 {
    public void m1() {
        System.out.println("bean is autowired");
    }
}
Here is a simplified view of what Spring does on startup:
// Create bean: sampleService2
SampleService2 sampleService2 = new SampleService2();

// Create bean: sampleController2
SampleController2 sampleController2 = new SampleController2();
sampleController2.sampleservice2 = sampleService2; // because @Autowired

// Create bean: sampleController3
SampleController3 sampleController3 = new SampleController3();
sampleController3.sampleservice2 = sampleService2; // because @Autowired
As you can see, the singleton bean sampleService2 is autowired into both sampleController2 and sampleController3.
The beans are added to a repository, so you can look them up by name or type at any later point in time.
By default, as you mentioned, all Spring beans are singletons, but your second assumption is wrong: the same bean can be autowired in many other beans.
In fact that's the whole point of them being singletons.
That also means that two different threads could indeed change the state of the same bean. For that reason you would most of the time want to keep your beans stateless.
If you really ever need to have one different instance of a bean for each place where it is autowired, you can change the scope of that bean to prototype. See Spring bean scopes docs.
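For illustration, a minimal sketch of that scope change applied to the service from the question (each injection point then receives its own instance):
@Service
@Scope("prototype")
public class SampleService2 {
    public void m1() {
        System.out.println("bean is autowired");
    }
}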
The intention behind dependency injection and inversion of control is simple:
You define your injectables (like services) once, and they are instantiated once (unless you specify otherwise).
Those injectables are then used everywhere applicable, and you don't control their lifecycle, scope or state.
While I feel like the last point answers your primary question fairly tacitly, I'll elaborate - in a DI context, the only things that really matter are enforceable contracts. That is to say, if your service subscribes to a specific type of contract, and you have a component which wishes to inject a service that fulfills that contract, then your DI layer should faithfully register a service which can fulfill that contract.
You get into fun and exciting stuff with bean priority, qualifiers and application profiles at that point, but this is the general idea.
For a concrete example: javax.sql.DataSource is an interface which is implemented by many JDBC-backed solutions, such as MySQL, Postgres, Oracle, and others. If you wish to have two different beans which talk to two different databases, but you want to be able to use those interchangeably, then you define a bean of type DataSource to use and configure which data source gets created. Again, this does involve things like #Qualifier to ensure you wire in the most specific bean at the most appropriate time.
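A rough sketch of that idea (the bean names, JDBC URLs, and the DataSourceBuilder usage here are purely illustrative, not from the question):
@Configuration
public class DataSourceConfig {

    @Bean
    @Qualifier("ordersDb")
    public DataSource ordersDataSource() {
        return DataSourceBuilder.create()
                .url("jdbc:postgresql://localhost:5432/orders") // illustrative URL
                .build();
    }

    @Bean
    @Qualifier("reportingDb")
    public DataSource reportingDataSource() {
        return DataSourceBuilder.create()
                .url("jdbc:mysql://localhost:3306/reporting") // illustrative URL
                .build();
    }
}

@Service
public class ReportingService {

    private final DataSource dataSource;

    // Picks the MySQL-backed bean; both beans satisfy the DataSource contract.
    public ReportingService(@Qualifier("reportingDb") DataSource dataSource) {
        this.dataSource = dataSource;
    }
}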
Also, that last point is fairly important to answer this part of your question:
... and in this case two threads can parallely change the data inside bean?
It is very unwise to create an injectable bean with its own inherent state. That is, if you have SampleService attach itself to some sort of cached state with a collection inside of it, you're basically violating expectations since you don't know when or how often that collection is going to have elements added to it or removed from it.
The better convention is to have beans which can reference stateful services, but don't store that state in the bean itself (such as a database connection, but not entire database tables).

How to create a bean by type in Spring?

In my ApplicationContext I have several beans being created in the same style, so I have a lot of duplicated code writing a FactoryBean for each of these beans. Those beans have common ground: they all implement one special interface.
I would like to move all that bean creation to one factory. That factory would have to provide a method like this:
<T extends CommonInterface> T createInstance(Class<T> clazz);
There I could implement all the instantiation necessary to create one of my special beans.
My implementation would be called by Spring for
@Autowired
private MyCommonInterfaceImplementation impl;
in that way
createInstance(MyCommonInterfaceImplementation.class)
So far I have looked at BeanFactory and FactoryBean; neither seems to be what I'm searching for.
Any suggestions?
Why not use @Bean?
@Bean
public MyCommonInterfaceImplementation getMyCommonInterfaceImplementation() {
    return MyBeanFactory.createInstance(MyCommonInterfaceImplementation.class);
}

// should autowire here
@Autowired
private MyCommonInterfaceImplementation impl;
Basically you need the @Bean annotation on a "factory" only if you need some special handling during the creation of a bean.
If everything can be @Autowired, whether by setters, fields, or one constructor, and nothing else needs to be done on a bean during initialization, you can simply declare the annotation @Component on each implementation of your interface. This works as long as you have component scanning active inside your application. The result will be that Spring creates a bean for each component, which you can then use.
I'm writing this on a mobile so showing code is not the best. Just follow some tutorial on @ComponentScan, or if you need, let me know and I can augment this answer with an example.
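For what it's worth, a minimal sketch of that setup (the configuration class name is illustrative; the other names come from the question and the answer below):
// Each implementation just needs @Component; no factory code required.
@Component
public class MyCommonInterfaceImplementation implements CommonInterface {
    // fields can be @Autowired, or dependencies supplied via constructor injection
}

// Component scanning picks up every @Component in the listed package.
@Configuration
@ComponentScan("some.package.path")
public class AppConfig {
}

// And then, in any other bean:
@Autowired
private MyCommonInterfaceImplementation impl;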
As of Spring 4.3 you no longer have to annotate your bean classes; you can let them be instantiated via a component scan.
@Configuration
@ComponentScan(
        value = "some.package.path",
        includeFilters = {
                @Filter(type = ASSIGNABLE_TYPE, value = {
                        MyClass1.class,
                        MyClass2.class,
                        MyClass3.class
                })
        })
This actually creates beans for the three classes listed there. The example should work without filters as well (everything in the package becomes a bean). This works as long as the classes have a single constructor that can be used for autowiring. I don't think it is possible to filter for all implementations of a particular interface and then register a bean.
To do that, you might do something with a ContextListener and e.g. use reflection to find out what classes to instantiate and then use context.autowire(..) to inject any dependencies from your context. A bit hacky but it might work.
@Override
public void onApplicationEvent(ContextRefreshedEvent event) {
    ApplicationContext context = event.getApplicationContext();
    MyClass bean = (MyClass) context
            .getAutowireCapableBeanFactory()
            .autowire(MyClass.class, Autowire.BY_NAME.value(), true);
    ...
}
That still leaves the problem of how to get the bean registered in the context of course.
You might also be able to adapt the answer to this SO question on how to add beans programmatically.
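A sketch of what such programmatic registration could look like (AppConfig and the bean name are illustrative; AUTOWIRE_BY_TYPE autowires setter dependencies by type):
AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext();
context.register(AppConfig.class); // your existing configuration

// Register an extra bean definition before refresh().
GenericBeanDefinition definition = new GenericBeanDefinition();
definition.setBeanClass(MyClass.class);
definition.setAutowireMode(AbstractBeanDefinition.AUTOWIRE_BY_TYPE);
context.registerBeanDefinition("myClass", definition);

context.refresh();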
Finally, the best approach I've found is using a ConfigurationClassPostProcessor. As an example I've used https://github.com/rinoto/spring-auto-mock
But, since it is quite complicated and "too much magic" to create beans from nothing, we decided to explicitly create those beans via @Bean.
Thanks for your answers.

Strategy using different beans in jsf

I want to use 2 different beans (Spring) for one JSF page. I do not like writing every method into one bean, so I tried to separate them into two beans, JobEditDataBean and JobEditActionBean.
I want to use the JobEditDataBean just as a "container" for my data objects and move the action stuff (like saving, updating, etc.) to the action bean.
What I have done so far (and what seems to work, but feels wrong) is the following:
public class JobEditDataBean {

    @Autowired
    JobEditActionBean actionBean;

    // some objects...

    @PostConstruct
    public void init() {
        actionBean.setJobEditDataBean(this);
        // do something ...
    }
}

public class JobEditActionBean {

    JobEditDataBean dataBean;

    // some objects...
}
Do you have any hints or tips on how this can be done better or more cleanly?
Indeed, you don't need to have one bean per page. You can use as many beans as you want for any page; whenever an expression like #{someMB} is found in your XHTML, JSF will find a bean with that name and create a new instance if necessary.
If you need to inject one bean into another, just use @Autowired as you already do:
@Component
@Scope("request")
public class JobEditActionBean {

    @Autowired
    private JobEditDataBean dataBean;

    @PostConstruct
    public void init() {
        // dataBean.youCanUseDataBeanMethodsHereAlready()
    }
}
You just have to make sure both beans are in the Spring container (annotating them with @Component will do) and choose the right scope for each one. Beware of the scopes of the beans you are injecting, because it usually only makes sense to inject beans of broader scope into beans of the same or narrower scope.
Having said that, I recommend reading the following thread about choosing the right scopes:
How to choose the right bean scope?
One more thing: this is only valid if your JSF beans are really being managed by the Spring container (that was my assumption after you used @Autowired). If you are letting the JSF container manage the beans (using @ManagedBean with @RequestScoped or @ViewScoped, for example), the way you inject them is with a @ManagedProperty annotation:
...
@ManagedProperty("#{jobEditDataBean}")
private JobEditDataBean dataBean;

Spring session-scoped beans (controllers) and references to services, in terms of serialization

A standard case: you have a controller (@Controller) with @Scope("session").
Classes put in the session are usually expected to implement Serializable so that they can be stored physically, for example in case the server is restarted.
If the controller implements Serializable, this means all services (other Spring beans) it refers to will also be serialized. They are often proxies with references to transaction managers, entity manager factories, etc.
It is not unlikely that some service, or even a controller, holds a reference to the ApplicationContext by implementing ApplicationContextAware, so this can effectively mean that the whole context is serialized. And given that it holds many connections - i.e. things that are not serializable by nature - it will be restored in a corrupt state.
So far I've mostly ignored these issues. Recently I thought of declaring all my Spring dependencies transient and getting them back in readResolve() via static utility classes like WebApplicationContextUtils and such that hold the request/ServletContext in a ThreadLocal. This is tedious, but it guarantees that, when the object is deserialized, its dependencies will be "up to date" with the current application context.
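A sketch of what that transient-plus-readResolve approach could look like (class names are illustrative, and it assumes the ServletContext is reachable through the FacesContext at deserialization time):
@Controller
@Scope("session")
public class ReportController implements Serializable {

    // Not serialized; re-resolved after deserialization in readResolve().
    @Autowired
    private transient ReportService reportService;

    private Object readResolve() {
        ServletContext servletContext = (ServletContext) FacesContext.getCurrentInstance()
                .getExternalContext().getContext();
        WebApplicationContext context =
                WebApplicationContextUtils.getRequiredWebApplicationContext(servletContext);
        this.reportService = context.getBean(ReportService.class);
        return this;
    }
}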
Is there any accepted practice for this, or any guidelines for serializing parts of the Spring context?
Note that in JSF, managed beans (~controllers) are stateful (unlike in action-based web frameworks). So perhaps my question applies more to JSF than to Spring MVC.
In this presentation (around 1:14) the speaker says that this issue is resolved in Spring 3.0 by providing a proxy of non-serializable beans, which obtains an instance from the current application context (on deserialization).
It appears that the bounty didn't attract a single answer, so I'll document my limited understanding:
@Configuration
public class SpringConfig {

    @Bean
    @Scope(proxyMode = ScopedProxyMode.TARGET_CLASS)
    MyService myService() {
        return new MyService();
    }

    @Bean
    @Scope("request")
    public IndexBean indexBean() {
        return new IndexBean();
    }

    @Bean
    @Scope("request")
    public DetailBean detailBean() {
        return new DetailBean();
    }
}

public class IndexBean implements Serializable {

    @Inject MyService myService;

    public void doSomething() {
        myService.sayHello();
    }
}

public class MyService {
    public void sayHello() {
        System.out.println("Hello World!");
    }
}
Spring will then not inject the naked MyService into IndexBean, but a serializable proxy to it. (I tested that, and it worked).
However, the Spring documentation says:
You do not need to use the <aop:scoped-proxy/> in conjunction with beans that are scoped as singletons or prototypes. If you try to create a scoped proxy for a singleton bean, the BeanCreationException is raised.
At least when using Java-based configuration, the bean and its proxy can be instantiated just fine, i.e. no exception is thrown. However, it looks like using scoped proxies to achieve serializability is not the intended use of such proxies. As such, I fear Spring might fix that "bug" and prevent the creation of scoped proxies through Java-based configuration, too.
Also, there is a limitation: the class name of the proxy is different after a restart of the web application (because the class name of the proxy is based on the hashCode of the advice used to construct it, which in turn depends on the hashCode of an interceptor's class object; Class.hashCode does not override Object.hashCode, which is not stable across restarts). Therefore the serialized sessions cannot be used by other VMs or across restarts.
I would expect to scope controllers as 'singleton', i.e. once per application, rather than in the session.
Session-scoping is typically used more for storing per-user information or per-user features.
Normally I just store the 'user' object in the session, and maybe some beans used for authentication or such. That's it.
Take a look at the Spring docs for configuring some user data in session scope, using an AOP proxy:
http://static.springsource.org/spring/docs/2.5.x/reference/beans.html#beans-factory-scopes-other-injection
Hope that helps
I recently combined JSF with Spring. I use RichFaces and the @KeepAlive feature, which serializes the JSF bean backing the page. There are two ways I have gotten this to work.
1) Use @Component("session") on the JSF backing bean
2) Get the bean from the ELContext whenever you need it, something like this:
@SuppressWarnings("unchecked")
public static <T> T getBean(String beanName) {
    return (T) FacesContext.getCurrentInstance().getApplication().getELResolver()
            .getValue(FacesContext.getCurrentInstance().getELContext(), null, beanName);
}
After trying all the different alternatives suggested, all I had to do was add <aop:scoped-proxy/> to my bean definition and it started working.
<bean id="securityService"
class="xxx.customer.engagement.service.impl.SecurityContextServiceImpl">
<aop:scoped-proxy/>
<property name="identityService" ref="identityService" />
</bean>
securityService is injected into my managed bean, which is view scoped. This seems to work fine. According to the Spring documentation this is supposed to throw a BeanCreationException since securityService is a singleton. However, this does not seem to happen and it works fine. Not sure whether this is a bug or what the side effects would be.
Serialization of dynamic proxies works well, even between different JVMs, e.g. as used for session replication.
@Configuration
public class SpringConfig {

    @Bean
    @Scope(proxyMode = ScopedProxyMode.INTERFACES)
    MyService myService() {
        return new MyService();
    }
    .....
You just have to set the id of the ApplicationContext before the context is refreshed (see: org.springframework.beans.factory.support.DefaultListableBeanFactory.setSerializationId(String))
AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext();
// all other initialisation part ...
// before! refresh
ctx.setId("portal-lasg-appCtx-id");
// now refresh ..
ctx.refresh();
ctx.start();
Works fine on Spring version 4.1.2.RELEASE.
