I'm trying to write a basic controller test in a Micronaut (3.2.7) application. When I run it, it fails to start because it also wants to create DB-related beans; micronaut-hibernate-jpa, Flyway, etc. are in the pom.xml.
Can I configure the context somehow so it doesn't pick up the HikariPool, Flyway, and JPA-related beans?
11:46:23.820 [main] INFO i.m.context.env.DefaultEnvironment - Established active environments: [test]
11:46:24.112 [main] WARN i.m.c.h.j.JpaConfiguration$EntityScanConfiguration - Runtime classpath scanning is no longer supported. Use @Introspected to declare the packages you want to index at build time. Example @Introspected(packages="foo.bar", includedAnnotations=Entity.class)
11:46:24.133 [main] INFO com.zaxxer.hikari.HikariDataSource - HikariPool-1 - Starting...
11:46:25.197 [main] ERROR com.zaxxer.hikari.pool.HikariPool - HikariPool-1 - Exception during pool initialization.
org.postgresql.util.PSQLException: FATAL: password authentication failed for user "postgres"
The code:
import io.micronaut.context.ApplicationContext;
import io.micronaut.http.client.HttpClient;
import io.micronaut.runtime.server.EmbeddedServer;
import org.junit.jupiter.api.AfterAll;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;

class HelloTest {

    private static EmbeddedServer server;
    private static HttpClient client;

    @BeforeAll
    public static void setupServer() {
        server = ApplicationContext.run(EmbeddedServer.class);
        client = server
                .getApplicationContext()
                .createBean(HttpClient.class, server.getURL());
    }

    @AfterAll
    public static void stopServer() {
        if (server != null) {
            server.stop();
        }
        if (client != null) {
            client.stop();
        }
    }

    @Test
    void testHelloWorldResponse() {
        ...
    }
}
I tried to exclude configurations like this, but with no luck:
server = ApplicationContext.builder("test")
        .exclude("io.micronaut.configuration.hibernate.jpa", "io.micronaut.configuration.jdbc.hikari")
        .run(EmbeddedServer.class);
Note: if I remove everything from application.yml, the test works. It looks like the default properties are resolved in tests as well, which turns on JPA, metrics, etc. So I guess the test somehow needs to ignore the default settings too.
You can override all of your (default) application.yml settings with (test-)environment-specific property files: https://docs.micronaut.io/latest/guide/index.html#_included_propertysource_loaders
So you can just provide a dedicated application-mycustomtest.yml as part of your test resources, in which you override all default settings.
Then you can specify, as part of the test, which environments should be active:
@MicronautTest(environments = {"mycustomtest"})
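For illustration, a minimal sketch of such a test (the class name, endpoint path, and expected body are assumptions here; it also assumes an application-mycustomtest.yml under src/test/resources that overrides or omits the datasource, Flyway, and JPA settings):

import static org.junit.jupiter.api.Assertions.assertEquals;

import io.micronaut.http.client.HttpClient;
import io.micronaut.http.client.annotation.Client;
import io.micronaut.test.extensions.junit5.annotation.MicronautTest;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Test;

@MicronautTest(environments = {"mycustomtest"})
class HelloControllerTest {

    @Inject
    @Client("/")
    HttpClient client; // @MicronautTest manages the embedded server and context for us

    @Test
    void testHelloWorldResponse() {
        // assumes the controller under test exposes GET /hello returning "Hello World"
        String body = client.toBlocking().retrieve("/hello");
        assertEquals("Hello World", body);
    }
}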
Asked the Micronaut team on Gitter, and currently the only option is to not have a default configuration and instead keep multiple configuration files for controller, repository and e2e testing.
I want to create a microservice with Spring Boot. For persistence I use a MariaDB database. To wait for the database, which is running in a Docker container, I implemented the following code as shown here:
@Bean
public DatabaseStartupValidator databaseStartupValidator(DataSource dataSource) {
    var dsv = new DatabaseStartupValidator();
    dsv.setDataSource(dataSource);
    dsv.setTimeout(60);  // give up after 60 seconds
    dsv.setInterval(7);  // retry every 7 seconds
    dsv.setValidationQuery(DatabaseDriver.MYSQL.getValidationQuery());
    return dsv;
}
The code is working very well; my application now waits for the database connection. But I get an exception at startup of the application:
java.sql.SQLNonTransientConnectionException: Could not connect to Host ....
...
...
...
On the next line I get a message that it will wait for the database:
2021-04-07 21:29:40.816 INFO 16569 --- [ main] o.s.j.support.DatabaseStartupValidator : Database has not started up yet - retrying in 7 seconds (timeout in 57.65 seconds)
After that the application starts as expected. So I think everything is working fine, but what do I have to do to suppress the exception? In the linked article it works without an exception. Do I have to implement the "dependsOnPostProcessor" function? Which dependency do I have to use? Sorry, possibly a dumb question, I am new to Spring Boot.
To get rid of that exception you can add the directive below to your application.properties file:
logging.level.com.zaxxer.hikari=OFF
Keep in mind that if the application is never able to reach the DB, your Spring application still crashes after a while because of that exception. In addition, the directive above prevents you from seeing any logging activity related to Hikari.
In summary, you hide the appearance of the exception for as long as possible, until the application dies due to the timeout.
Hoping this clarifies the case a bit.
Yes indeed you need to add the "depends-on" for the beans that rely on the data source. Note the following part of the documentation:
To be referenced via "depends-on" from beans that depend on database startup, like a Hibernate SessionFactory or custom data access objects that access a DataSource directly.
If I understand it correctly, this means that beans such as the EntityManagerFactory, which rely on the database, will now have to go through the DatabaseStartupValidator bean and wait for the DB to start up. I don't know what caused your exception, but usually there is an EntityManagerFactory involved, so try adding the depends-on for this object at least.
This is how the linked article is doing it:
@Bean
public static BeanFactoryPostProcessor dependsOnPostProcessor() {
    return bf -> {
        // Let beans that need the database depend on the DatabaseStartupValidator,
        // like the JPA EntityManagerFactory or Flyway
        String[] flyway = bf.getBeanNamesForType(Flyway.class);
        Stream.of(flyway)
              .map(bf::getBeanDefinition)
              .forEach(it -> it.setDependsOn("databaseStartupValidator"));

        String[] jpa = bf.getBeanNamesForType(EntityManagerFactory.class);
        Stream.of(jpa)
              .map(bf::getBeanDefinition)
              .forEach(it -> it.setDependsOn("databaseStartupValidator"));
    };
}
You may not necessarily have Flyway configured, but the main thing to note is that the dependency itself is referenced by the bean name databaseStartupValidator, which is the name of the method that creates the bean.
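As a hedged alternative sketch: if you define your own beans that hit the DataSource directly at startup, you can also declare the dependency explicitly with @DependsOn instead of (or in addition to) the post-processor above. The bean and class names below are purely illustrative:

import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.DependsOn;

// MyStartupDao is a hypothetical class that runs queries against the DataSource as soon as it is created.
@Bean
@DependsOn("databaseStartupValidator") // wait for the validator before touching the database
public MyStartupDao myStartupDao(DataSource dataSource) {
    return new MyStartupDao(dataSource);
}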
I have a requirement to send a message to a RabbitMQ instance with a JPA entity after persisting/flushing it, which led me to configure the rabbitTemplate as channelTransacted.
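For context, a hedged sketch of what that producer-side configuration typically looks like (bean name illustrative; setChannelTransacted is the standard Spring AMQP setter):

import org.springframework.amqp.rabbit.connection.ConnectionFactory;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.context.annotation.Bean;

// Sketch only: a RabbitTemplate whose sends are committed or rolled back together with
// the surrounding (JPA) transaction, which is what "channelTransacted" means here.
@Bean
public RabbitTemplate rabbitTemplate(ConnectionFactory connectionFactory) {
    RabbitTemplate template = new RabbitTemplate(connectionFactory);
    template.setChannelTransacted(true); // enlist the channel in the ongoing Spring transaction
    return template;
}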
The consumer is external, but, to create an integration test, I have added an embedded broker (Apache QPid) and a listener to be able to check that the messages are sent.
As indicated by the documentation, I seem to have run into a deadlock:
If we have producers and consumers in the same application, we may end up with a deadlock when producers are blocking the connection (because there are no resources on the Broker any more) and consumers cannot free them (because the connection is blocked). [...]
A separate CachingConnectionFactory is not possible for transactional producers that execute on a consumer thread, since they should reuse the Channel associated with the consumer transactions.
If I set rabbitTemplate.channelTransacted = false, the listener gets invoked fine, otherwise harness.getNextInvocationDataFor just waits until it times out.
What I'm hoping is that there's still a way to do this kind of integration test and that perhaps I configured something wrong.
I've tried with both the simple and the direct listener types; it didn't make any difference:
queues:
  transactions: 'transactions'
spring:
  rabbitmq:
    host: rabbitmq
    username: guest
    password: guest
    dynamic: true # automatically create queues and exchanges on the RabbitMQ server
    template:
      routing-key: ${queues.transactions}
      retry.enabled: true
      # mandatory: true # interesting only for cases where a reply is expected
    # publisher-confirms: true # does not work in transacted mode
    publisher-returns: true # required to get notifications in case of send problems
    # used for integration tests
    listener:
      type: direct
      direct:
        retry:
          enabled: true
          stateless: false # needed when transacted mode is enabled
          max-attempts: 1 # since this is used just for integration tests, we don't want more
I'm using Spring Boot 2.1.3 with the spring-boot-starter-amqp, which pulls in spring-rabbit 2.1.4, and Apache Qpid 7.1.1 as the embedded broker for the test:
@RunWith(SpringRunner.class)
@SpringBootTest(properties = "spring.main.allow-bean-definition-overriding=true")
@AutoConfigureTestDatabase
@Transactional
@ActiveProfiles("test")
public class SalesTransactionGatewayTest {

    private static final String TRANSACTIONS_LISTENER = "transactions";

    @TestConfiguration
    @RabbitListenerTest(spy = false, capture = true)
    public static class Config {

        @Bean
        public SystemLauncher broker() throws Exception {
            SystemLauncher broker = new SystemLauncher();
            Map<String, Object> attributes = new HashMap<>();
            attributes.put(SystemConfig.TYPE, "Memory");
            attributes.put(SystemConfig.INITIAL_CONFIGURATION_LOCATION, "classpath:qpid-config.json");
            attributes.put(SystemConfig.STARTUP_LOGGED_TO_SYSTEM_OUT, false);
            broker.startup(attributes);
            return broker;
        }

        @Bean
        public Listener listener() {
            return new Listener();
        }
    }

    public static class Listener {
        @RabbitListener(id = TRANSACTIONS_LISTENER, queues = "${queues.transactions}")
        public void receive(SalesTransaction transaction) {
            Logger.getLogger(Listener.class.getName()).log(Level.INFO, "Received tx: {0}", transaction);
        }
    }

    @Before
    public void setUp() {
        // this makes the test work, setting it to `true` makes it deadlock
        rabbitTemplate.setChannelTransacted(false);
    }

    @Test
    public void shouldBeSentToGateway() throws Exception {
        SalesTransaction savedTransaction = service.saveTransaction(salesTransaction);

        InvocationData invocationData = this.harness.getNextInvocationDataFor(TRANSACTIONS_LISTENER, 10, TimeUnit.SECONDS);
        assertNotNull(invocationData);
        assertEquals(salesTransaction, invocationData.getArguments()[0]);
    }
}
11:02:56.844 [SimpleAsyncTaskExecutor-1] INFO org.springframework.amqp.rabbit.listener.DirectMessageListenerContainer - SimpleConsumer [queue=transactions, consumerTag=sgen_1 identity=16ef3497] started
Mar 25, 2019 11:02:57 AM AmqpSalesTransactionGateway send
INFO: Sending transaction: 01a92a56-c93b-4d02-af66-66ef007c2817 w/ status COMPLETED
11:02:57.961 [main] INFO org.springframework.amqp.rabbit.connection.CachingConnectionFactory - Attempting to connect to: [localhost:5672]
11:02:57.972 [main] INFO org.springframework.amqp.rabbit.connection.CachingConnectionFactory - Created new connection: rabbitConnectionFactory.publisher#6d543ec2:0/SimpleConnection#79dd79eb [delegate=amqp://guest#127.0.0.1:5672/, localPort= 56501]
java.lang.AssertionError
at org.junit.Assert.fail(Assert.java:86)
at org.junit.Assert.assertTrue(Assert.java:41)
at org.junit.Assert.assertNotNull(Assert.java:712)
at org.junit.Assert.assertNotNull(Assert.java:722)
I'm using spring-cloud-aws-autoconfigure:2.1.0.RELEASE to connect to AWS. However, when the app is running in an environment other than AWS, I don't want the auto-configuration to take place.
I tried turning off the auto-configuration as suggested here and here with a Java configuration class, and also with the spring.autoconfigure.exclude property in my yml file like this:
spring:
  autoconfigure:
    exclude:
      - org.springframework.cloud.aws.autoconfigure.context.ContextCredentialsAutoConfiguration
      - org.springframework.cloud.aws.autoconfigure.context.ContextInstanceDataAutoConfiguration
      - org.springframework.cloud.aws.autoconfigure.context.ContextStackAutoConfiguration
      - org.springframework.cloud.aws.autoconfigure.messaging.MessagingAutoConfiguration
But none of those solutions seems to work. The auto-configuration still takes place and, consequently, the app fails to start.
Found a solution: I added this directly to my main application class:
import org.springframework.cloud.aws.autoconfigure.context.*;
import org.springframework.cloud.aws.autoconfigure.mail.MailSenderAutoConfiguration;

@SpringBootApplication
@EnableAutoConfiguration(exclude = {
        ContextCredentialsAutoConfiguration.class,
        ContextInstanceDataAutoConfiguration.class,
        ContextRegionProviderAutoConfiguration.class,
        ContextResourceLoaderAutoConfiguration.class,
        ContextStackAutoConfiguration.class,
        MailSenderAutoConfiguration.class,
})
public class MyApplication {

    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }
}
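A presumably equivalent, slightly more compact variant (same classes and imports assumed as above) is to use the exclude attribute that @SpringBootApplication itself exposes, avoiding the extra @EnableAutoConfiguration annotation:

// Sketch only: excludes the same auto-configuration classes as above.
@SpringBootApplication(exclude = {
        ContextCredentialsAutoConfiguration.class,
        ContextInstanceDataAutoConfiguration.class,
        ContextRegionProviderAutoConfiguration.class,
        ContextResourceLoaderAutoConfiguration.class,
        ContextStackAutoConfiguration.class,
        MailSenderAutoConfiguration.class,
})
public class MyApplication {
    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }
}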
Found a solution: I excluded every class I found in the autoconfiguration jar:
spring:
  autoconfigure:
    exclude:
      - org.springframework.cloud.aws.autoconfigure.cache.ElastiCacheAutoConfiguration
      - org.springframework.cloud.aws.autoconfigure.context.ContextCredentialsAutoConfiguration
      - org.springframework.cloud.aws.autoconfigure.context.ContextInstanceDataAutoConfiguration
      - org.springframework.cloud.aws.autoconfigure.context.ContextRegionProviderAutoConfiguration
      - org.springframework.cloud.aws.autoconfigure.jdbc.AmazonRdsDatabaseAutoConfiguration
      - org.springframework.cloud.aws.autoconfigure.mail.MailSenderAutoConfiguration
      - org.springframework.cloud.aws.autoconfigure.messaging.MessagingAutoConfiguration
      - org.springframework.cloud.aws.autoconfigure.metrics.CloudWatchExportAutoConfiguration
I have a Spring Batch project where I want to catch the exception thrown when the application cannot find the datasource. I've already bypassed this, so it uses 'in memory DAO objects' instead of tables, but it still throws an exception when the datasource is not found.
I want to catch that exception and throw my own error code, but I have no idea where the try/catch block must be placed.
Here is a piece of the error log:
2016-11-24 09:25:36.171 INFO 36770 --- [main] c.d.d.e.config.ReaderConfiguration : Configuring FlatFileItemReader for [MAP]
2016-11-24 09:25:51.664 ERROR 36770 --- [main] o.a.tomcat.jdbc.pool.ConnectionPool : Unable to create initial connections of pool.
com.microsoft.sqlserver.jdbc.SQLServerException: The TCP/IP connection to the host [***], port 1433 has failed. Error: "null. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.".
at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDriverError(SQLServerException.java:190) ~[sqljdbc4.jar:na]
This is overridden to bypass table creation. Also, I use two datasources and this class needed to be here anyway.
import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean;
import org.springframework.stereotype.Component;

@Component
public class MyBatchConfigurer extends DefaultBatchConfigurer {

    // Spring Batch needs this in order to allow using more than one datasource
    @Override
    public JobRepository createJobRepository() throws Exception {
        return new MapJobRepositoryFactoryBean().getObject();
    }
}
Note that I even tried to place a try/catch around the main method; it still throws the exception above, and only then reaches the breakpoint inside the catch block.
Also, I tried creating the datasource manually, but to no avail. What's more, ApplicationEvent doesn't seem to work either.
This is a log from when the datasource is found:
2016-10-25 16:05:13 [main] INFO c.d.d.e.config.CompanyConfiguration - Configure FlatFileItemReader for [IW]
2016-10-25 16:05:13 [main] INFO o.s.jdbc.datasource.init.ScriptUtils - Executing SQL script from class path resource [org/springframework/batch/core/schema-sqlserver.sql]
2016-10-25 16:05:13 [main] INFO o.s.jdbc.datasource.init.ScriptUtils - Executed SQL script from class path resource [org/springframework/batch/core/schema-sqlserver.sql] in 49 ms.
2016-10-25 16:05:13 [main] INFO o.s.b.f.config.PropertiesFactoryBean - Loading properties file from URL [jar:file:/home/etl/d-d/d-e-1.0.4-SNAPSHOT.jar!/lib/spring-integration-core-4.2.5.RELEASE.jar!/META-INF/spring.integration.default.properties]
Is there any way I can know, in my program, the full path of the file loaded through Spring's @PropertySource annotation?
I need it to show in the logs so that one can know which property file is being used in the application.
This information is already logged by StandardServletEnvironment. You can set the log level to DEBUG for the org.springframework.web.context.support.StandardServletEnvironment class to show the details in your logs.
If you use Spring Boot, you can simply add the following line to your application.properties file:
logging.level.org.springframework.web.context.support.StandardServletEnvironment = DEBUG
The below seems to work, though I am not sure if the Environment instance is always of type ConfigurableEnvironment:
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationListener;
import org.springframework.context.event.ContextRefreshedEvent;
import org.springframework.core.env.ConfigurableEnvironment;
import org.springframework.core.env.Environment;
import org.springframework.core.env.MutablePropertySources;
import org.springframework.core.env.PropertySource;
import org.springframework.stereotype.Component;

@Component
public class MyListener implements ApplicationListener<ContextRefreshedEvent> {

    @Autowired
    private Environment env;

    private static final Logger log = LoggerFactory.getLogger(MyListener.class);

    @Override
    public void onApplicationEvent(ContextRefreshedEvent event) {
        if (env instanceof ConfigurableEnvironment) {
            MutablePropertySources propertySources = ((ConfigurableEnvironment) env).getPropertySources();
            for (PropertySource<?> ps : propertySources) {
                log.info(ps.getName()); // if only file-based sources are needed, check if instanceof ResourcePropertySource
            }
        }
    }
}
Edit: you don't really need all this. As already answered by Selim, enabling the proper logging does the trick:
log4j.logger.org.springframework.core.env.MutablePropertySources=DEBUG
In current ('21) versions of Spring Boot, neither of the two above suggestions for the logging level seems to work. Moreover, if the file is actually NOT loaded, because it is NOT found or for whatever other reason, nothing is printed anyway.
At the moment, with my ROOT logger set to DEBUG (logging.level.root=DEBUG in application.properties), the only thing I see in the log file when the file is loaded correctly and the @Value annotated property is resolved successfully is:
2021-07-23 11:06:10.299 DEBUG 16776 --- [  restartedMain] o.s.b.f.s.DefaultListableBeanFactory : Creating shared instance of singleton bean 'bahblahService'
2021-07-23 11:06:10.302 DEBUG 16776 --- [  restartedMain] o.s.c.e.PropertySourcesPropertyResolver : Found key 'blahblah.username' in PropertySource 'class path resource [custom-local.properties]' with value of type String