I am trying to write up an integration test case with Spring Boot Test.
I customize the ConversionService to know about the new java.time types:
@Configuration
public class ConversionServiceConfiguration {

    @Bean
    public static ConversionService conversionService() {
        final FormattingConversionService reg = new DefaultFormattingConversionService();
        new DateTimeFormatterRegistrar().registerFormatters(reg);
        return reg;
    }
}
and then later expect it to work:
@Component
class MyServiceConfig {

    @Value("${max-watch-time:PT20s}")
    private Duration maxWatchTime = Duration.ofSeconds(20);
}
When running under the normal SpringApplication.run this seems to work fine. However, in my test case:
@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT, classes = {
        MyServiceMain.class,
        AttachClientRule.class
})
public class MyTest {

    @Inject
    @Rule
    public AttachClientRule client;

    @Test(expected = IllegalArgumentException.class)
    public void testBad() throws Exception {
        client.doSomethingIllegal();
    }
}
it blows up:
Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'AttachClientRule': Unsatisfied dependency expressed through constructor parameter 0:
Error creating bean with name 'MyServiceConfig': Unsatisfied dependency expressed through field 'maxWatchTime': Failed to convert value of type [java.lang.String] to required type [java.time.Duration];
nested exception is java.lang.IllegalStateException: Cannot convert value of type [java.lang.String] to required type [java.time.Duration]: no matching editors or conversion strategy found;
Peering deep into the guts of the TypeConverterDelegate that does the actual conversion, it appears to pick up the ConversionService it uses from a field on the DefaultListableBeanFactory. Setting a watchpoint on where that field is set, I land in the AbstractApplicationContext.refresh() method:
// Allows post-processing of the bean factory in context subclasses.
postProcessBeanFactory(beanFactory);
// Invoke factory processors registered as beans in the context.
invokeBeanFactoryPostProcessors(beanFactory);
// Register bean processors that intercept bean creation.
registerBeanPostProcessors(beanFactory);
// Initialize message source for this context.
initMessageSource();
// Initialize event multicaster for this context.
initApplicationEventMulticaster();
// Initialize other special beans in specific context subclasses.
onRefresh(); // <--- MyServiceConfig initialized here
// Check for listener beans and register them.
registerListeners();
// Instantiate all remaining (non-lazy-init) singletons.
finishBeanFactoryInitialization(beanFactory); // <--- DefaultListableBeanFactory.conversionService set here!!!
// Last step: publish corresponding event.
finishRefresh();
So the @Value injection is happening before the ConversionService is applied to the BeanFactory. No bueno!
I've found what seems to be a workaround:
@Configuration
public class ConversionServiceConfiguration implements BeanFactoryPostProcessor {

    @Bean
    public static ConversionService conversionService() {
        final FormattingConversionService reg = new DefaultFormattingConversionService();
        new DateTimeFormatterRegistrar().registerFormatters(reg);
        return reg;
    }

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {
        beanFactory.setConversionService(conversionService());
    }
}
This forces the ConversionService to be registered earlier, but it doesn't feel like the right solution (at least it isn't documented as such).
Where have I gone wrong?
Spring 4.3.0, Spring Boot 1.4.0M3
EDIT
And now I've discovered another way for it to fail! Unless I also make the same configuration class implement EnvironmentAware and add:
@Override
public void setEnvironment(Environment environment) {
    ((AbstractEnvironment) environment).setConversionService(conversionService());
}
I find that the PropertySourcesPropertyResolver uses the wrong (default) ConversionService. This is driving me mad!
Caused by: java.lang.IllegalArgumentException: Cannot convert value [PT15s] from source type [String] to target type [Duration]
at org.springframework.core.env.PropertySourcesPropertyResolver.getProperty(PropertySourcesPropertyResolver.java:94)
at org.springframework.core.env.PropertySourcesPropertyResolver.getProperty(PropertySourcesPropertyResolver.java:65)
at org.springframework.core.env.AbstractPropertyResolver.getProperty(AbstractPropertyResolver.java:143)
at org.springframework.core.env.AbstractEnvironment.getProperty(AbstractEnvironment.java:546)
at com.mycorp.DoSomething.go(DoSomething.java:103)
The Spring Boot developers have confirmed that this is poorly documented and does not work as specified: https://github.com/spring-projects/spring-boot/issues/6222
Try removing the static keyword from the conversionService bean definition.
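For illustration, that would look like the configuration above with the static modifier dropped (a sketch of the suggestion only; not verified against the timing issue described in the question):
@Configuration
public class ConversionServiceConfiguration {

    // Non-static variant of the bean method shown in the question (sketch only)
    @Bean
    public ConversionService conversionService() {
        final FormattingConversionService reg = new DefaultFormattingConversionService();
        new DateTimeFormatterRegistrar().registerFormatters(reg);
        return reg;
    }
}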
Related
I'm trying to test a class that depends on a bean, which I'd like to mock. This bean requires a String value that is read from my application.yml file via @ConfigurationProperties, and that is likely the problem, since other beans mocked in the same test class work just fine. Running the application normally also works fine, so the error seems to be related to @MockBean in some way.
I have this configuration class which gets the value from the application.yml:
@Data
@ConfigurationProperties("some_api")
public class SomeApiDaoConfig {
    private String url;
}
Also, the value is set in the integrationTest application.yml file:
some_api:
  url: http://localhost:8082
And also this factory, which creates the bean:
@Factory
public class SomeApiDaoFactory {

    @Singleton
    public SomeApiDao someApiDao(SomeApiDaoConfig someApiDaoConfig) {
        return new SomeApiDao(someApiDaoConfig.getUrl());
    }
}
The test class is basically:
@MicronautTest(packages = {"<<path to someApiDao>>"})
public class ServiceTest {

    @Inject private BlockingStub blockingStub;
    @Inject private AnotherDao anotherDao;
    @Inject private SomeApiDao someApiDao;

    @BeforeEach
    void setUp() {
        MockitoAnnotations.initMocks(this);
    }

    // ... (tests)

    @MockBean(AnotherDao.class)
    AnotherDao anotherDao() {
        return mock(AnotherDao.class);
    }

    @MockBean(SomeApiDao.class)
    SomeApiDao someApiDao() {
        return mock(SomeApiDao.class);
    }
}
When I run the tests, however, this error pops up while it tries to initialize the SomeApiDao bean:
Failed to inject value for parameter [url] of class: <path to test>.$ServiceTest$SomeApiDao3Definition$Intercepted
Message: No bean of type [java.lang.String] exists. Make sure the bean is not disabled by bean requirements (enable trace logging for 'io.micronaut.context.condition' to check) and if the bean is enabled then ensure the class is declared a bean and annotation processing is enabled (for Java and Kotlin the 'micronaut-inject-java' dependency should be configured as an annotation processor).
Path Taken: new GrpcEmbeddedServer(ApplicationContext applicationContext,ApplicationConfiguration applicationConfiguration,GrpcServerConfiguration grpcServerConfiguration,[ServerBuilder serverBuilder],ApplicationEventPublisher eventPublisher,ComputeInstanceMetadataResolver computeInstanceMetadataResolver,List metadataContributors) --> ServerBuilder.serverBuilder(GrpcServerConfiguration configuration,[List serviceList],List interceptors,List serverTransportFilters) --> new Service([SomeApiDao someApiDao]) --> new $ServiceTest$SomeApiDao3Definition$Intercepted([String url],BeanContext beanContext,Qualifier qualifier,Interceptor[] interceptors)
io.micronaut.context.exceptions.DependencyInjectionException: Failed to inject value for parameter [url] of class: <path to test>.$ServiceTest$SomeApiDao3Definition$Intercepted
@Data
@ConfigurationProperties("some_api")
public class SomeApiDaoConfig {
    private String url;
}
I'll assume @Data is the Lombok annotation and that it's creating a constructor argument for the url. Micronaut did not support injecting configuration keys into constructor arguments until 1.3, and it requires the constructor to be annotated with @ConfigurationInject.
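For reference, a minimal sketch of what constructor binding looks like on Micronaut 1.3+ (my illustration, not code from the original question):
import io.micronaut.context.annotation.ConfigurationInject;
import io.micronaut.context.annotation.ConfigurationProperties;

@ConfigurationProperties("some_api")
public class SomeApiDaoConfig {

    private final String url;

    // Binds the "some_api.url" configuration key to the constructor parameter (Micronaut 1.3+)
    @ConfigurationInject
    public SomeApiDaoConfig(String url) {
        this.url = url;
    }

    public String getUrl() {
        return url;
    }
}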
I'm trying to get an application working in Spring Boot, but I'm running into injection errors. I have a @Service with a few @Autowired classes. The classes are just POJOs with a public setDataSource method that I use to set the DataSource at runtime. See below:
@Bean
@Qualifier("datasetDao")
public com.lexi.dao.core.DatasetDAO getDatasetDao() throws NamingException {
    DatasetDAOImpl ds = new DatasetDAOImpl();
    ds.setDataSource(createAuthReadDataSoure());
    return ds;
}

@Bean
public LicenseDAO getLicenseDao() throws NamingException {
    LicenseDAOImpl ds = new LicenseDAOImpl();
    ds.setReadDataSource(createOnlineDSReadDataSoure());
    ds.setWriteDataSource(createOnlineDSWriteDataSoure());
    ds.setDistribDataSource(createAuthReadDataSoure());
    return ds;
}
I have a Service defined like this:
@Service
public class LicenseService {

    @Autowired
    @Qualifier("datasetDao")
    private DatasetDAO datasetDao;

    @Autowired
    private LicenseDAO licenseDao;
}
However, when I run the application I get this:
***************************
APPLICATION FAILED TO START
***************************
Description:
Field datasetDao in com.wk.online.services.LicenseService required a single bean, but 3 were found:
- createAuthReadDataSoure: defined by method 'createAuthReadDataSoure' in com.wk.online.ws.OnlineWsApplication
- createOnlineDSReadDataSoure: defined by method 'createOnlineDSReadDataSoure' in com.wk.online.ws.OnlineWsApplication
- createOnlineDSWriteDataSoure: defined by method 'createOnlineDSWriteDataSoure' in com.wk.online.ws.OnlineWsApplication
Action:
Consider marking one of the beans as #Primary, updating the consumer to accept multiple beans, or using #Qualifier to identify the bean that should be consumed
I tried to add a @Qualifier but that didn't seem to jibe with Spring. What am I missing? I've been at this for a while and figure I'm doing something very stupid.
When defining the bean, you need to specify a name, not a qualifier; the @Qualifier annotation should be used where you autowire it:
@Bean(name = "datasetDao")
public com.lexi.dao.core.DatasetDAO getDatasetDao() throws NamingException {
    DatasetDAOImpl ds = new DatasetDAOImpl();
    ds.setDataSource(createAuthReadDataSoure());
    return ds;
}
Do you have the @Bean annotation on the following methods in the OnlineWsApplication class?
createAuthReadDataSoure
createOnlineDSReadDataSoure
createOnlineDSWriteDataSoure
If yes, get rid of them.
The full code of OnlineWsApplication would be very useful to investigate this.
In the bean definition, instead of
@Bean
@Qualifier("datasetDao")
Try using the following:
@Bean(name = "datasetDao")
I have a Spring project where multiple beans may have the same bean name.
In order to avoid ConflictingBeanDefinitionException, the project has an overridden ContextNamespaceHandler.
public class ContextNamespaceHandler extends NamespaceHandlerSupport {

    @Override
    public void init() {
        registerBeanDefinitionParser("component-scan", new ComponentScanBeanDefinitionParser() {
            @Override
            protected ClassPathBeanDefinitionScanner createScanner(XmlReaderContext readerContext, boolean useDefaultFilters) {
                return new ClassPathBeanDefinitionScanner(readerContext.getRegistry(), useDefaultFilters) {
                    @Override
                    protected boolean checkCandidate(String beanName, BeanDefinition beanDefinition) throws IllegalStateException {
                        return true;
                    }
                };
            }
        });
    }
}
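For context, a custom handler like this only takes effect for XML namespace parsing (<context:component-scan/>); it is typically registered by shadowing the handler mapping in a META-INF/spring.handlers entry along these lines (an assumption about this project's setup, not shown in the original post):

http\://www.springframework.org/schema/context=some.package.ContextNamespaceHandler

Annotation-driven @ComponentScan never goes through that namespace handler, which is why the custom scanner gets bypassed.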
I'm using Swagger/Springfox to generate API documentation for the project.
@Configuration
@EnableWebMvc
@EnableSwagger2
@ComponentScan(basePackages = { "some.package", "some.other.package" })
public class SwaggerConfig {
    // ...
}
@ComponentScan is causing a ConflictingBeanDefinitionException because it uses the default ClassPathBeanDefinitionScanner instead of the overridden one.
Caused by: org.springframework.context.annotation.ConflictingBeanDefinitionException: Annotation-specified bean name 'xxx' for bean class [xxx] conflicts with existing, non-compatible bean definition of same name and class [xxx]
at org.springframework.context.annotation.ClassPathBeanDefinitionScanner.checkCandidate(ClassPathBeanDefinitionScanner.java:320)
at org.springframework.context.annotation.ClassPathBeanDefinitionScanner.doScan(ClassPathBeanDefinitionScanner.java:259)
at org.springframework.context.annotation.ComponentScanAnnotationParser.parse(ComponentScanAnnotationParser.java:140)
at org.springframework.context.annotation.ConfigurationClassParser.doProcessConfigurationClass(ConfigurationClassParser.java:262)
at org.springframework.context.annotation.ConfigurationClassParser.processConfigurationClass(ConfigurationClassParser.java:226)
at org.springframework.context.annotation.ConfigurationClassParser.parse(ConfigurationClassParser.java:193)
at org.springframework.context.annotation.ConfigurationClassParser.parse(ConfigurationClassParser.java:163)
... 22 more
Is there a way to override the ClassPathBeanDefinitionScanner used by @ComponentScan, or another way to suppress the ConflictingBeanDefinitionException?
The best approach here is usually to make each bean name unique by specifying the bean name in the annotation, i.e.:
@Service("someService")
Now, you can specify the name of the bean you want to use when autowiring:
@Autowired
@Qualifier("someService")
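Put together, a small sketch with hypothetical class names (not from the original project):
@Service("someService")
public class SomeServiceImpl implements SomeService {
    // ...
}

@Service("otherService")
public class OtherServiceImpl implements SomeService {
    // ...
}

@Component
public class SomeConsumer {

    // Picks the implementation registered under the bean name "someService"
    @Autowired
    @Qualifier("someService")
    private SomeService someService;
}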
Even when overridden components exist for generic objects, it's often a good idea to specify a name to make a clear distinction. Suppressing exceptions is generally considered bad practice. Hope this helps!
The following example shows explicit wiring of dependencies using Spring Java config that results in a different bean being wired in when using an interface for a Spring configuration class.
This seems like it shouldn't occur, or at least should give the usual warning that there are two candidate beans for autowiring and it doesn't know which to select.
Any thoughts on this issue? My guess is that there is no real namespacing between configuration classes, despite what the syntax "this.iConfig.a()" implies. Could this be considered a bug (if only for not warning about the two candidate beans)?
public class Main
{
    public static void main( final String[] args )
    {
        final ApplicationContext context = new AnnotationConfigApplicationContext( IConfigImpl.class, ServiceConfig.class );
        final Test test = context.getBean( Test.class );
        System.out.println( test );
    }
}
public class Test
{
    private final String string;

    public Test( final String param )
    {
        this.string = param;
    }

    public String toString()
    {
        return this.string;
    }
}
@Configuration
public interface IConfig
{
    @Bean
    public String a();
}

@Configuration
public class IConfigImpl implements IConfig
{
    @Bean
    public String a()
    {
        return "GOOD String";
    }
}
@Configuration
public class ServiceConfig
{
    @Autowired
    IConfig iConfig;

    @Bean
    Test test()
    {
        return new Test( this.iConfig.a() );
    }

    @Bean
    String a()
    {
        return "BAD String";
    }
}
In this case, I would expect "GOOD String" to always be wired into the Test object, but flipping the order of IConfigImpl.class and ServiceConfig.class in the context loader changes which string is loaded.
Tested with Spring 4.0.7
EDIT: Further testing shows this has nothing to do with inherited configs. The same thing happens if you drop the IConfig interface.
I believe this has been the behavior of Spring for years.
If you redefine a bean, the one that is loaded last wins.
Another question would be how to control the order of bean loading when Java configs are used. Check out this article, http://www.java-allandsundry.com/2013/04/spring-beans-with-same-name-and.html, which shows how to control the ordering by using @Import of the other Spring Java config.
The solution is actually simple: if you need to override a previously defined bean (without, say, the flexibility of autowiring with a different bean name), either use XML bean configuration for both the bean being overridden and the overriding bean, or use @Configuration. The XML bean configuration is the first example in this entry; the one with @Configuration would be something like this:
@Configuration
public class Context1JavaConfig {

    @Bean
    public MemberService memberService() {
        return new MemberSvcImpl1();
    }
}

@Configuration
@Import(Context1JavaConfig.class)
public class Context2JavaConfig {

    @Bean
    public MemberService memberService() {
        return new MemberSvcImpl2();
    }
}
Stepan has mentioned the issue of order. The following is about your comment on their answer:
Overriding beans of the same name makes sense, but in this case, I'm specifically referencing the bean as specified in the iConfig configuration. I would expect to get the one specified there.
In order to implement @Configuration and the caching of beans, so that calls like the following work,
@Configuration
class Example {

    @Bean
    public UncaughtExceptionHandler uncaughtExceptionHandler() {
        return (thread, throwable) -> System.out.println(thread + " => " + throwable.getMessage());
    }

    @Bean
    @Scope(value = ConfigurableBeanFactory.SCOPE_PROTOTYPE)
    public Thread newThread() {
        Thread thread = new Thread();
        thread.setUncaughtExceptionHandler(uncaughtExceptionHandler()); // <<<<<< allowing this
        return thread;
    }
}
Spring actually uses CGLIB to create a proxy subtype of the @Configuration annotated class. This proxy maintains a reference to the backing ApplicationContext and uses that to resolve a bean.
So the call in your example
return new Test(this.iConfig.a());
isn't really invoking IConfigImpl#a(). It invokes this code (as of 4.2) from the proxy interceptor. The code uses the corresponding Method to determine the target bean name and uses the ApplicationContext's BeanFactory to resolve the bean. Since the bean definition for a bean named a has already been overridden, that new bean definition gets used. That bean definition uses the ServiceConfig#a() method as its factory method.
This is described in the documentation, here
All @Configuration classes are subclassed at startup-time with CGLIB. In the subclass, the child method checks the container first for any cached (scoped) beans before it calls the parent method and creates a new instance.
Could this be considered a bug [...]?
I don't believe so. The behavior is documented.
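For completeness, a sketch (not from the answer above) of one way to make the result deterministic: let the a bean come only from IConfigImpl and drop the duplicate definition in ServiceConfig.
@Configuration
@Import(IConfigImpl.class)
public class ServiceConfig
{
    @Autowired
    IConfig iConfig;

    @Bean
    Test test()
    {
        // Resolves a() through the container; with only one definition of "a",
        // the result no longer depends on registration order.
        return new Test( this.iConfig.a() );
    }

    // Note: no local a() bean here, so "GOOD String" is the only candidate.
}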
I'm new to Spring and am working on an app that uses multiple application contexts, configured using annotations.
I have one context where I am creating 3 singleton beans, one of which I want to pass as an argument into the factory method for a prototype bean which will live in a different application context.
This other application context is created as one of the singleton beans within this original context.
The problem I am seeing is that, at the point at which I try to use getBean() to create this other bean that lives in this second context (see the 'someBean()' factory method below), I get an exception from the framework:
Error creating bean with name 'someBean' defined in class
org.imaginary.SpringAppDependencyConfiguration: Instantiation of bean
failed; nested exception is...Unsatisfied
dependency expressed through constructor argument with index 0 of type
[org.imaginary.ISomeDependency]: : No qualifying bean of type
[org.imaginary.ISomeDependency] found for dependency: expected at
least 1 bean which qualifies as autowire candidate for this
dependency.
What have I jacked up here?
The config for the original context looks like so:
@Configuration
public class SpringAppDependencyConfiguration
{
    @Autowired
    private ISomeDependency someDependency;

    @Autowired
    private AnnotationConfigApplicationContext otherSpringContext;

    @Bean(destroyMethod="close")
    public ISomeDependency someDependency()
    {
        return new SomeDependencyImpl( 13 );
    }

    @Bean(destroyMethod="close")
    public AnnotationConfigApplicationContext otherSpringContext()
    {
        return new AnnotationConfigApplicationContext(OtherContextDependencyConfiguration.class);
    }

    @Bean
    @DependsOn( { "otherSpringContext", "someDependency" } )
    public ISomeBean someBean() throws Exception
    {
        if ( !otherSpringContext.getBeansOfType( ISomeOtherBean.class ).containsKey( "SomeOtherBean" ) )
        {
            throw new Exception("SpringAppDependencyConfiguration.someBean(): " +
                    "unable to find SomeOtherBean implementation");
        }
        ISomeOtherBean someOtherBean = (ISomeOtherBean) otherSpringContext.getBean( "SomeOtherBean", someDependency );
        return new SomeBeanImpl( someDependency, someOtherBean );
    }
}
The config for the other application context looks like so:
@Configuration
public class OtherContextDependencyConfiguration
{
    @Bean
    @Scope("prototype")
    public ISomeOtherBean someOtherBean(ISomeDependency theDependency) throws Exception
    {
        return new SomeOtherBeanImpl(theDependency);
    }
}
So, I think I learned that one way I can get the result I was looking for is to create this second context in such a way that it treats the initial context as a parent.
Supporting this required a different conceptual organization of the classes, so I will try to post an update to this question with my current approach once I have a digestible version of it.
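In the meantime, a minimal sketch of the parent/child wiring described above, using the configuration classes from the question for illustration (the exact bootstrap code is my assumption, and as noted the configurations would need some reorganization):
// Build the original context first; it defines someDependency and the other singletons.
AnnotationConfigApplicationContext parent =
        new AnnotationConfigApplicationContext( SpringAppDependencyConfiguration.class );

// Build the second context as a child so it can resolve beans from the parent.
AnnotationConfigApplicationContext child = new AnnotationConfigApplicationContext();
child.setParent( parent );
child.register( OtherContextDependencyConfiguration.class );
child.refresh();

// The prototype factory method's ISomeDependency parameter can now be satisfied
// by the someDependency bean defined in the parent context.
ISomeOtherBean someOtherBean = child.getBean( ISomeOtherBean.class );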