I know that two requests with the same content are handled on different threads, and I thought that different threads would create different instances of a class annotated with @Controller. But when I run the code below, I find that my assumption is wrong.
Test code:
@Controller
@RequestMapping("test")
public class TestADEDSAController {

    private String string = "";

    @RequestMapping("controllerTest")
    @ResponseBody
    public String controllerTest(@RequestParam String string) {
        return this.string += string;
    }
}
The first time, the response content is:
test
The second time:
testtest
It seems that there is only one instance of the controller in the JVM.
Is it true that there is always only one instance per @Controller class in the JVM? Also, where can I find a detailed explanation of this behavior?
By default, every bean in Spring is created as a singleton (one instance per IoC container).
This is from the Javadoc:
(Default) Scopes a single bean definition to a single object instance per Spring IoC container.
By default, Spring creates a single shared instance of each bean; the default bean scope is singleton. If you need a new instance created on every request, you should define the bean scope as prototype. This can be done either by annotating the class with @Scope("prototype") or by defining the scope in the Spring configuration XML as below:
<bean id="controllerId" class="com.package.name.TestADEDSAController" scope="prototype"/>
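For comparison, here is a minimal annotation-only sketch of the same idea, reusing the controller class from the question (the scope annotation simply sits next to @Controller):
import org.springframework.context.annotation.Scope;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;

@Controller
@Scope("prototype")
@RequestMapping("test")
public class TestADEDSAController {
    // with prototype scope, the container creates a new instance of this
    // controller every time the bean is requested
}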
Please go through https://docs.spring.io/spring/docs/3.0.0.M3/reference/html/ch04s04.html to gain a better understanding of bean scopes in Spring.
No, by default beans are singletons. That means your objects must be thread-safe.
So it is bad practice to keep mutable values like Strings in your controller (except constants).
Your fields could get corrupted if two threads go in there at the same time.
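As a hedged sketch, reusing the class from the question, a stateless version simply drops the field, so there is nothing for concurrent requests to corrupt:
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.ResponseBody;

@Controller
@RequestMapping("test")
public class TestADEDSAController {

    @RequestMapping("controllerTest")
    @ResponseBody
    public String controllerTest(@RequestParam String string) {
        // no shared mutable field: each request only works with its own parameter
        return string;
    }
}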
The default scope is "singleton", so if you need Spring to create a new instance each time, you can use the @Scope("prototype") annotation in addition to @Controller. There are also other web-aware scopes like request, session and global session. Read here for examples.
Related
By default, in Spring Boot, when we declare a method with @Bean, the object instance will be shared across all objects that request that class via @Autowired...
What if I want Spring to deliver a different instance to each class that autowires that object?
I mean, instead of sharing one single instance of a bean, have multiple "disposable" beans, one for each requester of that object?
Why do I want that?
The reason is quite simple: RestTemplateBuilder is a common bean used in most Spring applications, and by its nature this builder is STATEFUL, which means that any change one class makes to its structure will cause side effects for all other objects that use it.
If you want a different instance for each class you inject it into, you should use the scope annotation as follows:
@Bean
@Scope("prototype")
public Person personPrototype() {
    return new Person();
}
You can also use the constant, as follows:
@Scope(value = ConfigurableBeanFactory.SCOPE_PROTOTYPE)
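Put together, a self-contained sketch could look like the following (Person comes from the snippet above; PersonConfig is a hypothetical configuration class, and the ConfigurableBeanFactory constant just replaces the "prototype" string literal):
import org.springframework.beans.factory.config.ConfigurableBeanFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Scope;

@Configuration
public class PersonConfig {

    // every injection point (and every getBean call) receives its own Person instance
    @Bean
    @Scope(value = ConfigurableBeanFactory.SCOPE_PROTOTYPE)
    public Person personPrototype() {
        return new Person();
    }
}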
I'm confused at this point. I know that all beans in a Spring Boot application are singletons, and according to my understanding, if we have a class annotated with @Service, that bean can be @Autowired into only one class (correct me if I'm wrong). Here is code that works fine, but I'm trying to understand how it works: how can one bean be @Autowired into two different classes?
How can the SampleService2 bean be @Autowired into SampleController2 and SampleController3 at the same time?
Is this a recommended approach? And in this case, can two threads change the data inside the bean in parallel?
SampleController2
@RestController
@RequestMapping(value="samplemock")
public class SampleController2 {

    @Autowired
    private SampleService2 sampleservice2;

    @RequestMapping(value="/mock1", method=RequestMethod.GET)
    public void mockCall1() {
        sampleservice2.m1();
    }
}
SampleController3
@RestController
@RequestMapping(value="samplemock2")
public class SampleController3 {

    @Autowired
    private SampleService2 sampleservice2;

    @RequestMapping(value="/mock1", method=RequestMethod.GET)
    public void mockCall1() {
        sampleservice2.m1();
    }
}
SampleService2
@Service
public class SampleService2 {

    public void m1() {
        System.out.println("bean is autowired");
    }
}
Here is a simplified view of what Spring does on startup:
// Create bean: sampleService2
SampleService2 sampleService2 = new SampleService2();

// Create bean: sampleController2
SampleController2 sampleController2 = new SampleController2();
sampleController2.sampleservice2 = sampleService2; // because @Autowired

// Create bean: sampleController3
SampleController3 sampleController3 = new SampleController3();
sampleController3.sampleservice2 = sampleService2; // because @Autowired
As you can see, the singleton bean sampleService2 is autowired into both sampleController2 and sampleController3.
The beans are added to a repository, so you can look them up by name or type at any later point in time.
By default, as you mentioned, all Spring beans are singletons, but your second assumption is wrong: the same bean can be autowired into many other beans.
In fact, that's the whole point of them being singletons.
That also means two different threads could indeed change the state of the same bean. For that reason you will usually want to keep your beans stateless.
If you really ever need to have one different instance of a bean for each place where it is autowired, you can change the scope of that bean to prototype. See Spring bean scopes docs.
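As a hedged sketch of how you could confirm this yourself (assuming a Spring Boot application and the classes from the question), fetch the service from the ApplicationContext twice and compare the references; SingletonCheck is a hypothetical helper, not part of the question's code:
import org.springframework.boot.CommandLineRunner;
import org.springframework.context.ApplicationContext;
import org.springframework.stereotype.Component;

@Component
public class SingletonCheck implements CommandLineRunner {

    private final ApplicationContext context;

    public SingletonCheck(ApplicationContext context) {
        this.context = context;
    }

    @Override
    public void run(String... args) {
        SampleService2 first = context.getBean(SampleService2.class);
        SampleService2 second = context.getBean(SampleService2.class);
        // both lookups return the single shared instance
        System.out.println(first == second);
    }
}
It should print true, which is the same instance that was injected into SampleController2 and SampleController3.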
The intention behind dependency injection and inversion of control is simple:
You define your injectables (like services) once, and they are instantiated once (unless you specify otherwise).
Those injectables are then used everywhere applicable, and you don't control their lifecycle, scope or state.
While I feel like the last point answers your primary question fairly tacitly, I'll elaborate - in a DI context, the only thing that really matters are enforceable contracts. That is to say, if your service subscribes to a specific type of contract, and you have a component which wishes to inject a service which fulfills that contract, then your DI layer should faithfully register a service which can fulfill that contract.
You get into fun and exciting stuff with bean priority, qualifiers and application profiles at that point, but this is the general idea.
For a concrete example: javax.sql.DataSource is an interface which is implemented by many JDBC-backed solutions, such as MySQL, Postgres, Oracle, and others. If you wish to have two different beans which talk to two different databases, but you want to be able to use those interchangeably, then you define a bean of type DataSource to use and configure which data source gets created. Again, this does involve things like @Qualifier to ensure you wire in the most specific bean at the most appropriate time.
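A hedged sketch of that idea, with made-up bean names (primaryDb, reportingDb), placeholder JDBC URLs, and spring-jdbc's DriverManagerDataSource used only as a stand-in implementation:
import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

@Configuration
public class DataSourceConfig {

    @Bean
    @Qualifier("primaryDb")
    public DataSource primaryDataSource() {
        // placeholder URL: point this at your real primary database
        return new DriverManagerDataSource("jdbc:postgresql://localhost:5432/app");
    }

    @Bean
    @Qualifier("reportingDb")
    public DataSource reportingDataSource() {
        // placeholder URL for the second database
        return new DriverManagerDataSource("jdbc:mysql://localhost:3306/reports");
    }
}
A consumer then picks the one it needs by putting @Qualifier("reportingDb") (or "primaryDb") on its injection point.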
Also, that last point is fairly important to answer this part of your question:
... and in this case two threads can parallely change the data inside bean?
It is very unwise to create an injectable bean with its own inherent state. That is, if you have SampleService attach itself to some sort of cached state with a collection inside of it, you're basically violating expectations since you don't know when or how often that collection is going to have elements added to it or removed from it.
The better convention is to have beans which can reference stateful services, but don't store that state in the bean itself (such as a database connection, but not entire database tables).
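As a hedged illustration of that convention (ReportService and the orders table are hypothetical), the stateful resource sits behind an injected dependency while the bean itself keeps no mutable fields, so concurrent requests cannot interfere with each other:
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import javax.sql.DataSource;
import org.springframework.stereotype.Service;

@Service
public class ReportService {

    // the stateful resource (a pooled connection source) lives behind this reference,
    // not in a field that request threads would mutate
    private final DataSource dataSource;

    public ReportService(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    public int countOrders() throws SQLException {
        try (Connection connection = dataSource.getConnection();
             Statement statement = connection.createStatement();
             ResultSet resultSet = statement.executeQuery("SELECT COUNT(*) FROM orders")) {
            resultSet.next();
            // the result stays local to the calling thread, never cached in the bean
            return resultSet.getInt(1);
        }
    }
}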
I am trying to understand the purpose of Spring-created beans. Are they just global shared objects, declared like
@Component
public class MySpringBean {}
and later used anywhere, for example inside some class:
public class MyClass {
    @Autowired
    MySpringBean mySpringBean;
}
Can their internal creation/implementation be thought of like this:
public class MyApp {
    MySpringBean mySpringBean;
}
and used in MyClass like:
public class MyClass {
    MySpringBean mySpringBean = MyApp.mySpringBean;
}
It's the object, valid only for that class hierarchy. In your case Spring just creates an object for mySpringBean and keeps it available for MyClass. Internally it's more like
MySpringBean mySpringBean = new MySpringBean();
But actually
all Spring beans are managed - they "live" inside a container, called "application context".
Autowiring happens by placing an instance of one bean into the desired field in an instance of another bean. Both classes should be beans, i.e. they should be defined to live in the application context.
so in your case both mySpringBean and an instance of MyClass will be in the application context.
Based on your question, I believe you should learn how beans are managed by Spring (that is, how Spring manages the life cycle of beans from initialization to destruction). Also note that you don't have to go into too much detail (well, that's what the framework provides for you). Yes, it's true that initialization involves the new operator. These objects live inside the container and Spring wires them in wherever they are called for. Since the beans are managed by Spring, you can implement life-cycle callback methods too.
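A minimal sketch of such callbacks, reusing MySpringBean from the question (on recent Spring versions these annotations live in jakarta.annotation instead of javax.annotation):
import javax.annotation.PostConstruct;
import javax.annotation.PreDestroy;
import org.springframework.stereotype.Component;

@Component
public class MySpringBean {

    @PostConstruct
    public void init() {
        // called once, after Spring has created the bean and injected its dependencies
        System.out.println("MySpringBean is ready");
    }

    @PreDestroy
    public void cleanUp() {
        // called when the application context is shutting down
        System.out.println("MySpringBean is being destroyed");
    }
}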
Given the following classes:
XRepository, which declares a constructor with one argument (a simple one, not autowired) and has some autowired fields.
XService, which uses XRepository as an autowired dependency.
XProcessor, which uses XService as an autowired dependency.
I have to initialize XProcessor at runtime with a specific value that will be used in the XRepository constructor. Different calls will pass different arguments, so the injection has to happen at runtime.
Any idea how to achieve that using code configuration or annotations?
Remember that Spring needs to inject all the constructor parameters of Spring managed beans.
I believe you have two options:
Parse your URL info in the controller and pass it through parameters down to the persistence layer. This would be my preferred mechanism. You can create a special DTO for passing various pieces of information down and keep your method signatures concise.
Your situation can also be solved with a request-scoped bean. You would create one bean like this:
@Component
@Scope("request")
public class XRequestContext {
    private String urlPart;
}
You would then autowire this component into XProcessor and XRepository. Each request to your application creates a new instance of XRequestContext; you parse your info in XProcessor and store it in the XRequestContext.
In XRepository you use the XRequestContext instance to retrieve the information stored by XProcessor.
You can read about request scope in the Spring docs. It behaves like a ThreadLocal per request thread.
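A hedged sketch of that second option, using the class name from this answer; the scoped proxy is an assumption added here so the request-scoped bean can be injected into singleton beans like XProcessor and XRepository:
import org.springframework.context.annotation.Scope;
import org.springframework.context.annotation.ScopedProxyMode;
import org.springframework.stereotype.Component;

// one instance per HTTP request; the proxy lets singletons hold a stable reference
@Component
@Scope(value = "request", proxyMode = ScopedProxyMode.TARGET_CLASS)
public class XRequestContext {

    private String urlPart;

    public String getUrlPart() {
        return urlPart;
    }

    public void setUrlPart(String urlPart) {
        this.urlPart = urlPart;
    }
}
XProcessor would call setUrlPart(...) after parsing the request, and XRepository would read the value back with getUrlPart() during that same request.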
I want to use two different (Spring) beans for one JSF page. I don't want to put every method into one bean, so I tried to separate things into two beans, JobEditDataBean and JobEditActionBean.
I want to use JobEditDataBean just as a "container" for my data objects and move the action stuff (saving, updating, etc.) to the action bean.
What I have done so far (which seems to work, but feels wrong) is the following:
public class JobEditDataBean {

    @Autowired
    JobEditActionBean actionBean;

    // some objects...

    @PostConstruct
    public void init() {
        actionBean.setJobEditDataBean(this);
        // do something ...
    }
}

public class JobEditActionBean {

    JobEditDataBean dataBean;

    // some objects...
}
Do you have any hints or tips on how this can be done better or more cleanly?
Indeed, you don't need one bean per page. You can use as many beans as you want for any page; whenever an expression like #{someMB} is found in your XHTML, JSF will look up a bean with that name and create a new instance if necessary.
If you need to inject one bean into another, just use @Autowired:
@Component
@Scope("request")
public class JobEditActionBean {

    @Autowired
    private JobEditDataBean dataBean;

    @PostConstruct
    public void init() {
        // dataBean.youCanUseDataBeanMethodsHereAlready()
    }
}
You just have to make sure both beans are in the Spring container (annotating them with @Component will do) and choose the right scope for each one. Be aware of the scopes of the beans you are injecting: it usually only makes sense to inject beans of broader scope into beans of the same or a more restricted scope.
Having said that, I recommend reading the following thread about choosing the right scopes:
How to choose the right bean scope?
One more thing: this is only valid if your JSF beans are really being managed by the Spring container (that was my assumption after you used @Autowired). If you are letting the JSF container manage the beans (using @ManagedBean with @RequestScoped or @ViewScoped, for example), the way to inject them is with a @ManagedProperty annotation:
...
@ManagedProperty("#{jobEditDataBean}")
private JobEditDataBean dataBean;
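A hedged sketch of how that fragment might sit in a full JSF-managed bean (the class name and request scope are taken from the answer's own example; the setter is required because JSF injects @ManagedProperty values through it):
import javax.faces.bean.ManagedBean;
import javax.faces.bean.ManagedProperty;
import javax.faces.bean.RequestScoped;

@ManagedBean
@RequestScoped
public class JobEditActionBean {

    @ManagedProperty("#{jobEditDataBean}")
    private JobEditDataBean dataBean;

    // JSF injects the data bean via this setter, so it must be present
    public void setDataBean(JobEditDataBean dataBean) {
        this.dataBean = dataBean;
    }

    public JobEditDataBean getDataBean() {
        return dataBean;
    }
}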