I am using an ObjectProvider to create instances of a prototype-scoped bean via its getObject() method. Something like this:
@Configuration
class Config {

    @Bean
    @Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE)
    SomeType typeOne() {
        return new SomeType();
    }

    @Bean
    @Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE)
    SomeType typeTwo(String param) {
        return new SomeType(param);
    }
}
@Service
class Service {

    private ObjectProvider<SomeType> objectProvider;

    public Service(ObjectProvider<SomeType> objectProvider) {
        this.objectProvider = objectProvider;
    }

    @Override
    public String performAction() {
        return getSomeType().doAction();
    }

    private SomeType getSomeType() {
        return objectProvider.getObject();
    }
}
But since there are two beans of the type the ObjectProvider is trying to get (SomeType), I get a NoUniqueBeanDefinitionException. (And I do need the other bean of the same type, because that is the one I need to pass parameters to via objectProvider.getObject(Object... params).)
Playing around and debugging Spring, I saw that if you name your ObjectProvider field exactly like your bean, then it works, something like:
private ObjectProvider<SomeType> typeOne;
My question is, are there other ways to use an ObjectProvider and manage to resolve ambiguity, or is this approach the way to go?
The short answer is that you just need to properly qualify the ObjectProvider you want injected, like this:
public Service(@Qualifier("typeOne") ObjectProvider<SomeType> objectProvider) {
    this.objectProvider = objectProvider;
}
With Spring Java configuration, when you define a bean via a method and don't specify its name with @Bean("NAME"), Spring uses the method name as the bean name.
Similarly, when injecting a bean that is not qualified with @Qualifier("NAME"), Spring uses the name of the injected variable; if a bean with that name doesn't exist or the type is not unique, you get an exception telling you about it (like the NoUniqueBeanDefinitionException you are facing).
So, if you match the bean name and the injected variable name you don't need to be more specific, but if you don't, @Qualifier is there to the rescue :D
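For the parameterized bean you can do the same thing with a second, separately qualified provider and pass the constructor argument through getObject(Object... args). A minimal sketch (the field names and the parameter value are made up for illustration):

@Service
class Service {

    private final ObjectProvider<SomeType> typeOneProvider;
    private final ObjectProvider<SomeType> typeTwoProvider;

    public Service(@Qualifier("typeOne") ObjectProvider<SomeType> typeOneProvider,
                   @Qualifier("typeTwo") ObjectProvider<SomeType> typeTwoProvider) {
        this.typeOneProvider = typeOneProvider;
        this.typeTwoProvider = typeTwoProvider;
    }

    private SomeType plainInstance() {
        return typeOneProvider.getObject();        // backed by the no-arg typeOne() factory method
    }

    private SomeType parameterizedInstance(String param) {
        return typeTwoProvider.getObject(param);   // the argument is passed to typeTwo(String param)
    }
}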
I have the following @Configuration class, in which I declare a @Bean that depends on an @Autowired list of beans. However, this list is not complete when I access it. All @Bean definitions have been executed except the one defined in the same class.
@Configuration
public class MyConfig {

    @Autowired
    List<RequiredBean> requiredBeans;

    @Bean(name = "ProblemHere")
    public CustomObject customObject() {
        log.info("requiredBeans: {}", requiredBeans.size()); // 1 (debugging, I can see it is the one defined in "AnotherConfigClass")
        return new CustomObject();
    }

    @Bean(name = "reqBeanInsideClass")
    public RequiredBean reqBean() {
        // this method does not get executed
        return new RequiredBean();
    }
}
Having other classes like:
@Configuration
public class AnotherConfigClass {

    @Bean(name = "ThisOneGetsExecuted")
    public RequiredBean reqBean() {
        // this gets executed, and therefore, added to the list
        return new RequiredBean();
    }
}
Probably, the easiest solution would be to add @DependsOn("reqBeanInsideClass").
However:
I wonder why it works for all @Beans defined in other classes, but not for the one defined in this class.
I'm not really sure it works exactly like that, and I'm afraid that later on another @Bean won't get executed.
I guess the correct approach should be something like
@DependsOn(List<RequiredBean>) // Obviously this does not work
How should I solve this?
Update
I have copied the exact same class twice, in order to see what would happen, so now I also have:
@Configuration
public class MyConfig2 {

    @Autowired
    List<RequiredBean> requiredBeans;

    @Bean(name = "ProblemHere2")
    public CustomObject customObject() {
        log.info("requiredBeans: {}", requiredBeans.size());
        return new CustomObject();
    }

    @Bean(name = "reqBeanInsideClass2")
    public RequiredBean reqBean() {
        // this method does not get executed
        return new RequiredBean();
    }
}
Amazingly, by doing this, both @Bean methods (ProblemHere & ProblemHere2) are called before both the reqBeanInsideClass and reqBeanInsideClass2 methods.
For some reason, I guess, Spring Boot is able to recognize the @Beans required by a class as long as they are defined in another class.
Does this sound logical to anyone?
Can you not utilize the array input for @DependsOn rather than passing a single value, since it accepts String[]? That would wait for all the beans explicitly declared in the array before initializing, though it has to be defined manually.
@Configuration
public class MyConfig {

    @Autowired
    List<RequiredBean> requiredBeans;

    @Bean(name = "customObject")
    @DependsOn({"reqBeanInsideClass", "ThisOneGetsExecuted"})
    public CustomObject customObject() {
        log.info("requiredBeans: {}", requiredBeans.size());
        return new CustomObject();
    }

    @Bean(name = "reqBeanInsideClass")
    public RequiredBean reqBean() {
        return new RequiredBean();
    }
}
An @Autowired list of beans works the same way as a single bean of that type: via Spring's injection it will contain all beans of that type (or of that superclass). The problem is that the ordering of bean initialization is not controlled properly; @DependsOn with an array of bean names should resolve this!
Or
You can make the CustomObject bean @Lazy, so it will be initialized only when it is first used, after context initialization is done. The bean must not be used within another non-lazy bean, I think. Just call some logic where an @Autowired CustomObject is used; the bean should be instantiated at that moment, and by then the list will contain all possible RequiredBeans.
@Lazy
@Bean(name = "customObject")
public CustomObject customObject() {
    log.info("requiredBeans: {}", requiredBeans.size());
    return new CustomObject();
}
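To illustrate the timing, here is a minimal sketch of a consumer that defers resolution; marking the injection point @Lazy as well makes Spring inject a proxy, so the bean is only created on first real use (the class and method names are made up):

@Service
public class CustomObjectConsumer {

    @Lazy
    @Autowired
    private CustomObject customObject; // a lazy-resolution proxy is injected here

    public void doWork() {
        // the first real call triggers creation of the customObject bean;
        // by then all RequiredBean definitions have been processed
        customObject.toString();
    }
}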
I am building a library based on the Spring Framework and I want to allow users to invoke the library's methods in parallel.
In my main class I autowire the service class:
@Autowired
private ExportListCommand exportList;
And here is the implementation of the library's method:
public ResponseContainer<ExportListResponse> exportList(ExportListOptions options) {
    exportList.setoAuthClient(oAuthClient);
    ResponseContainer<ExportListResponse> result = exportList.executeCommand(options);
    return result;
}
ExportListCommand is defined as a Bean:
@Bean
@Scope("prototype")
public ExportListCommand exportList() {
    return new ExportListCommand();
}
When I, as a library user, run two exportList methods in parallel, Spring creates only a single ExportListCommand bean, since it is autowired only once. But in reality I need two independent ExportListCommand beans. I also tried to change @Scope(value="prototype") to @Scope(value="prototype", proxyMode=ScopedProxyMode.TARGET_CLASS), but that also does not work as I need: Spring creates an ExportListCommand bean for each method invocation, and I lose the oAuthClient value since I get a new object.
I made it work only with the AnnotationConfigApplicationContext.getBean() approach, which I would like to avoid.
What are my options? Thanks.
I believe you are looking to work with a 'factory' object.
There are two primary ways I would consider this from a Spring standpoint.
The 'Java' way: Create a factory object that will return instances of ExportListCommand
This factory would look something like this:
class ExportListCommandFactory {
    ExportListCommand newInstance() {
        return new ExportListCommand();
    }
}
and would be used in your method like this:
@Autowired
private ExportListCommandFactory commandFactory;

public ResponseContainer<ExportListResponse> exportList(ExportListOptions options) {
    final ExportListCommand exportList = commandFactory.newInstance();
    exportList.setoAuthClient(oAuthClient);
    ResponseContainer<ExportListResponse> result = exportList.executeCommand(options);
    return result;
}
Of course, doing this would require that you change your configuration to contain a bean that is an ExportListCommandFactory rather than an ExportListCommand.
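The configuration change could look something like this (a sketch; the factory bean simply replaces the prototype ExportListCommand bean shown earlier):

@Bean
public ExportListCommandFactory commandFactory() {
    return new ExportListCommandFactory();
}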
Alternatively, you could consider...
The 'Spring' way: Use FactoryBean
The only thing you should need to do here is, in your main class, autowire a FactoryBean<ExportListCommand> instead of the ExportListCommand, and in the method where you need an instance, consult the factory to get it.
@Autowired
private FactoryBean<ExportListCommand> commandFactory;

public ResponseContainer<ExportListResponse> exportList(ExportListOptions options) throws Exception { // FactoryBean.getObject() declares a checked Exception
    final ExportListCommand exportList = commandFactory.getObject();
    exportList.setoAuthClient(oAuthClient);
    ResponseContainer<ExportListResponse> result = exportList.executeCommand(options);
    return result;
}
You shouldn't need to change your configuration, as FactoryBean is a special bean that will consult the ApplicationContext/BeanFactory for the instance at each invocation of getObject().
The following example shows explicit wiring of dependencies using Spring Java config that results in a different bean being wired in when using an interface for a Spring configuration class.
This seems like it shouldn't occur, or at least should give the usual warning that there are two candidate beans for autowiring and Spring doesn't know which one to select.
Any thoughts on this issue? My guess is that there is no real namespacing between configuration classes, as is implied by the syntax "this.iConfig.a()". Could this be considered a bug (if only for not warning about the two candidate beans)?
public class Main
{
    public static void main( final String[] args )
    {
        final ApplicationContext context = new AnnotationConfigApplicationContext( IConfigImpl.class, ServiceConfig.class );
        final Test test = context.getBean( Test.class );
        System.out.println( test );
    }
}
public class Test
{
    private final String string;

    public Test( final String param )
    {
        this.string = param;
    }

    public String toString()
    {
        return this.string;
    }
}
@Configuration
public interface IConfig
{
    @Bean
    public String a();
}

@Configuration
public class IConfigImpl implements IConfig
{
    @Bean
    public String a()
    {
        return "GOOD String";
    }
}

@Configuration
public class ServiceConfig
{
    @Autowired
    IConfig iConfig;

    @Bean
    Test test()
    {
        return new Test( this.iConfig.a() );
    }

    @Bean
    String a()
    {
        return "BAD String";
    }
}
In this case, I would expect "GOOD String" to always be wired into the Test object, but flipping the order of IConfigImpl.class, ServiceConfig.class in the context loader changes which string is loaded.
Tested with Spring 4.0.7
EDIT: Further testing shows this has nothing to do with inherited configs. The same thing happens if you drop the IConfig interface.
I believe this has been the behavior of Spring for years.
If you redefine a bean, the one that is loaded last wins.
Another question would be how to control the order of bean loading when Java configs are used. Check out this article http://www.java-allandsundry.com/2013/04/spring-beans-with-same-name-and.html which shows how to control the ordering by using @Import of the other Spring Java config.
The solution is actually simple - if you need to override a previously
defined bean (without, say, the flexibility of autowiring with a
different bean name), either use the XML bean configuration for both
the bean being overridden and the overriding bean, or use
@Configuration. XML bean configuration is the first example in this
entry; the one with @Configuration would be something like this:
@Configuration
public class Context1JavaConfig {

    @Bean
    public MemberService memberService() {
        return new MemberSvcImpl1();
    }
}

@Configuration
@Import(Context1JavaConfig.class)
public class Context2JavaConfig {

    @Bean
    public MemberService memberService() {
        return new MemberSvcImpl2();
    }
}
Stepan has mentioned the issue of order. The following is about your comment on their answer
Overriding beans of the same name makes sense, but in this case, I'm
specifically referencing the bean as specified in the iConfig
configuration. I would expect to get the one specified there.
In order to implement @Configuration and the caching of beans, so that calls like the following work,
@Configuration
class Example {

    @Bean
    public UncaughtExceptionHandler uncaughtExceptionHandler() {
        return (thread, throwable) -> System.out.println(thread + " => " + throwable.getMessage());
    }

    @Bean
    @Scope(value = ConfigurableBeanFactory.SCOPE_PROTOTYPE)
    public Thread newThread() {
        Thread thread = new Thread();
        thread.setUncaughtExceptionHandler(uncaughtExceptionHandler()); // <<<<<< allowing this
        return thread;
    }
}
Spring actually uses CGLIB to create a proxy subtype of the @Configuration-annotated class. This proxy maintains a reference to the backing ApplicationContext and uses that to resolve a bean.
So the call in your example
return new Test(this.iConfig.a());
isn't really invoking IConfigImpl#a(). It invokes this code (as of 4.2) from the proxy interceptor. The code uses the corresponding Method to determine the target bean name and uses the ApplicationContext's BeanFactory to resolve the bean. Since the bean definition for the bean named a has already been overridden, that new bean definition gets used, and that bean definition uses the ServiceConfig#a() method as its factory method.
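Conceptually, the intercepted call ends up as a by-name lookup against the current bean definitions. A rough illustration only (not the actual Spring internals; it assumes a reference to the ApplicationContext is in scope):

// What this.iConfig.a() effectively resolves to once the definition of "a" has been overridden
String a = context.getBean("a", String.class); // "BAD String" when ServiceConfig is registered last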
This is described in the documentation, here
All @Configuration classes are subclassed at startup-time with CGLIB.
In the subclass, the child method checks the container first for any
cached (scoped) beans before it calls the parent method and creates a
new instance.
Could this be considered a bug [...]?
I don't believe so. The behavior is documented.
Suppose I have a class Fruit and its two subclasses, Apple and Grape:
class Fruit {
    public void grind() { }
}

class Apple extends Fruit { }

class Grape extends Fruit { }
In the Spring properties file, I have a property that decides which bean to register at startup. At any given time, I'll only have either an Apple or a Grape instance registered as a bean. The property is:
# This can be either apple or grape
app.fruit = apple
In the Java configuration file, I'm binding a String attribute to this property using @Value, and based on that, I'll create the appropriate instance. I'm trying to use the factory pattern here. So, I have a FruitFactory like this:
class FruitFactory {

    private Map<String, Fruit> map = new HashMap<String, Fruit>();

    public FruitFactory() {
        map.put("apple", new Apple());
        map.put("grape", new Grape());
    }

    public Fruit getFruit(String fruit) {
        return map.get(fruit);
    }
}
And here's my spring configuration class:
class SpringConfig {

    @Value("${app.fruit}")
    private String fruitType;

    @Bean
    public FruitFactory fruitFactory() {
        return new FruitFactory();
    }

    @Bean
    public Fruit getFruit() {
        return fruitFactory().getFruit(fruitType);
    }
}
So, here are my questions:
Will the instances stored in the map inside the factory be Spring-managed beans? Is there any issue with the implementation? I've tried it and it is working fine, but I'm not sure whether the instances are really Spring-managed.
I was trying to implement it in a better way, so that when a new fruit comes, I don't have to modify my factory. One way is to provide a register() method in the factory and let all the Fruit subclasses invoke it. But the issue is when and how will the subclasses be loaded? I won't be using the classes before putting their instances into the map. Can anyone suggest a better way?
Edit:
As suggested in the comments and the answer, I've tried using @Profile instead of the factory pattern, but I'm facing some issues with it. Here's what I have:
@Configuration
@Profile("apple")
class AppleProfile {

    @Bean
    public Fruit getApple() {
        return new Apple();
    }
}

@Configuration
@Profile("grape")
class GrapeProfile {

    @Bean
    public Fruit getGrape() {
        return new Grape();
    }
}
And in a ServletListener, I've set the active profile:
class MyServletListener implements ServletContextListener {

    @Value("${app.fruit}")
    private String fruitType;

    public void contextInitialized(ServletContextEvent contextEvent) {
        // Get the Spring context and autowire this listener so @Value is populated
        WebApplicationContext context = WebApplicationContextUtils.getRequiredWebApplicationContext(contextEvent.getServletContext());
        context.getAutowireCapableBeanFactory().autowireBean(this);

        ConfigurableEnvironment configEnvironment = (ConfigurableEnvironment) context.getEnvironment();
        logger.debug("Setting active profile: " + fruitType);
        configEnvironment.setActiveProfiles(fruitType);
    }

    public void contextDestroyed(ServletContextEvent contextEvent) {
        // no-op
    }
}
This is properly setting the active profile, which I can see. The only issue is that the listener is declared before the ContextLoaderListener, and by the time this is executed, the beans have already been created. Is there any alternative?
Will the instances stored in the map inside the factory be
Spring-managed beans?
Making the FruitFactory a managed bean
@Bean
public FruitFactory fruitFactory() {
    return new FruitFactory();
}
doesn't make any of the objects it's referring to managed beans. However, this
@Bean
public Fruit getFruit() {
    return fruitFactory().getFruit(fruitType);
}
does make that one returned Fruit a managed bean. @Bean marks a method as a bean definition and bean factory (it creates the bean). The object you return will be managed by Spring's bean lifecycle.
Is there any issue with the implementation?
It seems weird that you're creating a FruitFactory bean but also a Fruit from that same FruitFactory. Are you even going to inject the FruitFactory elsewhere in the application?
I was trying to implement it in a better way, so that when a new fruit
comes, I don't have to modify my factory
Seriously, your factory is messing everything up. Spring already does its job, and more! Annotations make your life easier. You can give an identifier to the @Bean. You can qualify the bean with @Qualifier (and then also qualify the injection target with @Qualifier). You can set a @Profile for when and under which conditions the bean should be initialized.
But the issue is when and how will the subclasses be loaded? I won't
be using the classes before putting their instances into the map.
Can anyone suggest a better way?
You can use bean init methods, which you specify as a @Bean annotation attribute, or a @PostConstruct-annotated method, to do post-initialization logic. You can use these to register the beans with the factory, which you'll have injected (but that design doesn't sound right to me; you'd have to show us more).
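For example, a minimal sketch of the registration idea, assuming the factory gains a register(String, Fruit) method and each fruit is itself a component (both of which are assumptions, not part of your current code):

@Component
class Apple extends Fruit {

    @Autowired
    private FruitFactory fruitFactory;

    @PostConstruct
    void registerWithFactory() {
        // hypothetical register() method on the factory
        fruitFactory.register("apple", this);
    }
}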
You should also look into InitializingBean and FactoryBean.
For setting the active profile, one possibility is the following. Create an ApplicationContextInitializer which sets the active profile by reading from a .properties file. You won't be able to use @PropertySources here because this isn't a bean.
Something like
public class ProfileContextInitializer implements ApplicationContextInitializer<ConfigurableApplicationContext> {

    @Override
    public void initialize(ConfigurableApplicationContext applicationContext) {
        PropertySource<Map<String, Object>> source = null;
        try {
            source = new ResourcePropertySource("spring.properties");
            String profile = (String) source.getProperty("active.profile");
            System.out.println(profile);
            ConfigurableEnvironment env = applicationContext.getEnvironment();
            env.setActiveProfiles(profile);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
You can register this in your deployment descriptor
<context-param>
    <param-name>contextInitializerClasses</param-name>
    <param-value>com.yourapp.ProfileContextInitializer</param-value>
</context-param>

<listener>
    <listener-class>org.springframework.web.context.ContextLoaderListener</listener-class>
</listener>
When the ContextLoaderListener is created, it will pick up and instantiate your class and call its initialize method. This is done before the WebApplicationContext is refreshed.
You should probably just set a VM argument for the active profile and avoid all of this.
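For example, the standard Spring property can be passed as a JVM argument when starting the application (adjust the value to the profile you need):

-Dspring.profiles.active=apple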
I know it's possible to inject a request-scoped bean into a singleton bean in Spring, so I know what I'm trying to do will work; I'm just wondering if there is a way to express it more concisely, without so many extra unnecessary class definitions. I'm new to Spring annotations, so maybe there's an annotation I don't know about.
I have an abstract class that will be extended maybe 100 times in my application as different singleton spring beans. Take this class definition for an example:
/** The abstract class with a field that needs to be request-specific **/
public abstract class AbstractSingletonBean {

    private SampleState state;

    public SampleState getState() { return state; }

    public void setState(SampleState state) { this.state = state; }

    // Other fields that are just singleton here
}
And an example of what one of the bean definitions might look like:
@Component
public class SampleSingletonBean extends AbstractSingletonBean {

    @Resource(name = "sampleState")
    public void setState(SampleState state) { super.setState(state); }
}
Now of course we need a bean called sampleState. So I have to create two classes: a base class to define the fields in SampleState, and then a request-scoped bean definition. This is because each extension of AbstractSingletonBean will need its own request-scoped instance of the state field.
Here might be the base class:
public class SampleState {

    private String fieldOne;

    public String getFieldOne() { return fieldOne; }

    public void setFieldOne(String fieldOne) { this.fieldOne = fieldOne; }
}
And here is this silly bean definition:
#Component ("sampleState")
#Scope(value = "request", proxyMode = ScopedProxyMode.TARGET_CLASS)
public class SampleStateBean extends SampleState {}
The thing that bothers me is that if I have 100 extensions of AbstractSingletonBean, I'll need 100 extensions of SampleStateBean containing just boilerplate code to make them request-scoped. Is there a way to just override setState() in the extensions of AbstractSingletonBean and indicate to Spring that it should create a new request-scoped bean on the fly and inject it there? Then my SampleSingletonBean could look like this:
@Component
public class SampleSingletonBean extends AbstractSingletonBean {

    @Resource
    @Scope(value = "request", proxyMode = ScopedProxyMode.TARGET_CLASS)
    public void setState(SampleState state) { super.setState(state); }
}
Of course this doesn't work because @Resource needs to refer to a bean that already exists. Is there another annotation to accomplish this without creating a new class for every SampleState bean?
Spring can inject into abstract classes too. So you can move the injection of the SampleState to the abstract class, if each AbstractSingletonBean descendant needs just a SampleState (as in your example).
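A sketch of what that could look like, moving the setter injection up into the abstract class so the subclasses need no injection code at all:

public abstract class AbstractSingletonBean {

    private SampleState state;

    public SampleState getState() { return state; }

    @Resource(name = "sampleState")
    public void setState(SampleState state) { this.state = state; }
}

@Component
public class SampleSingletonBean extends AbstractSingletonBean {
    // nothing to override; the request-scoped proxy is injected through the inherited setter
}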
It doesn't look like this was available out of the box, so I created an annotation I call @AnonymousRequest that I put on the field I want, and a BeanDefinitionRegistryPostProcessor to do the work of creating the bean. It basically goes like this:
for each bean in the BeanFactory
    if the bean class has the AnonymousRequest annotation
        create a request-scoped bean from the field class
        create a singleton bean to be the request-scoped bean's wrapper
        set the annotated property value to the singleton wrapper
It took a lot of work to figure out how Spring registers request-scoped beans. You create the bean definition you want as a request-scoped bean. Then you create a singleton bean of type RootBeanDefinition that acts as a wrapper for the request-scoped bean, and set a property on the wrapper called "targetBeanName" to whatever you named the request-scoped bean ("scopedTarget." + the singleton bean name, by convention).
So this could probably be improved by someone who actually knows this stuff but here's what I came up with:
public void createRequestBeanFromSetterMethod(String containingBeanName, BeanDefinition containingBean, Method method, BeanDefinitionRegistry registry)
{
    String fieldName = ReflectionUtil.getFieldNameFromSetter(method.getName());
    String singletonBeanName = containingBeanName + "_" + fieldName;
    String requestBeanName = "scopedTarget." + singletonBeanName;

    BeanDefinition requestBean = createAnonymousRequestBean(ReflectionUtil.getFieldTypeFromSetter(method), containingBean);

    RootBeanDefinition singletonBean = new RootBeanDefinition();
    singletonBean.setBeanClass(ScopedProxyFactoryBean.class);
    singletonBean.getPropertyValues().addPropertyValue("targetBeanName", requestBeanName);

    registry.registerBeanDefinition(singletonBeanName, singletonBean);
    registry.registerBeanDefinition(requestBeanName, requestBean);

    containingBean.getPropertyValues().addPropertyValue(fieldName, new RuntimeBeanReference(singletonBeanName));
}
private BeanDefinition createAnonymousRequestBean(Class<?> beanType, BeanDefinition parentBean)
{
    BeanDefinition newBean = null;
    if (parentBean != null)
    {
        newBean = new GenericBeanDefinition(parentBean);
    }
    else
    {
        newBean = new GenericBeanDefinition();
    }

    if (beanType != null)
    {
        newBean.setBeanClassName(beanType.getName());
    }

    newBean.setScope("request");
    newBean.setAutowireCandidate(false);

    // This would have come from the Proxy annotation...could add support for different values
    String proxyValue = "org.springframework.aop.framework.autoproxy.AutoProxyUtils.preserveTargetClass";
    BeanMetadataAttribute attr = new BeanMetadataAttribute(proxyValue, true);
    newBean.setAttribute(proxyValue, attr);

    return newBean;
}
It seems to work! I now effectively have a request-scoped bean, created just before context initialization, that is localized to this one containing bean. It's a request-scoped property, more or less.
You can try defining a single SampleState request-scoped bean and then use Spring's lookup-method injection to obtain that bean wherever you want. That works just fine with prototype-scoped beans; fingers crossed it would work with request scope as well.
AFAIK, there is no annotation support for lookup methods as of now, so either use the XML equivalent,
or have a look at javax.inject.Provider (relevant question here).
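For the javax.inject.Provider route, a minimal sketch (it assumes the javax.inject API is on the classpath so Spring's JSR-330 support applies, and that sampleState is still the request-scoped bean from the question):

@Component
public class SampleSingletonBean extends AbstractSingletonBean {

    @Autowired
    private Provider<SampleState> stateProvider; // javax.inject.Provider

    public SampleState currentState() {
        // each call resolves the sampleState bean for the current request
        return stateProvider.get();
    }
}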