We are getting a lot of failures in our Spring Boot app using Spring Data Neo4j, all caused by this error:
org.neo4j.driver.exceptions.ServiceUnavailableException: Connection pool for server server-url.example.com:xxxx is closed while acquiring a connection.
Does anyone have an idea where this error comes from? Do we have to tweak the connection pools on the database server side or on the application side?
We are using Neo4j 4.0.11 and spring-data-neo4j 6.1.1 (via spring-boot-starter-data-neo4j 2.5.0). The connection is made over the Bolt protocol.
EDIT:
Here is some additional info about our configuration:
application.yml:
server:
  port: 9080
  error.include-message: always
management:
  endpoint:
    health:
      group:
        readiness:
          include: neo4j
        liveness:
          include: neo4j
logging.level:
  root: WARN
  org.springframework: INFO
  io.package.directions: INFO
spring:
  application.name: directions
  cache:
    cache-names: nodes, records
    caffeine.spec: maximumSize=1000
  config.location: classpath:/config/
  profiles.active: prod
  jmx.enabled: false
  jackson.serialization:
    FAIL_ON_EMPTY_BEANS: false
    WRITE_DATES_AS_TIMESTAMPS: true
    WRITE_DATE_TIMESTAMPS_AS_NANOSECONDS: false
spring.neo4j:
  uri: ${NEO4J_URL}
  authentication:
    username: ${NEO4J_USERNAME}
    password: ${NEO4J_PASSWORD}
mail:
  api: "http://${MAIL_SERVICE_HOST:mail}:${MAIL_SERVICE_PORT:9000}"
We connect to another Kubernetes pod using the Bolt protocol.
Here is our fairly simple transaction management bean:
@Configuration
@EnableTransactionManagement
public class TxConfig {

    final Driver driver;

    public TxConfig(Driver driver) {
        this.driver = driver;
    }

    @Bean
    public ReactiveTransactionManager reactiveTransactionManager() {
        return new ReactiveNeo4jTransactionManager(driver);
    }
}
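If the fix belongs on the application side, I assume it would mean overriding the auto-configured driver with explicit pool settings. Here is a minimal sketch of what we could try; the pool size and timeouts are placeholders, not values we have validated:

import java.util.concurrent.TimeUnit;

import org.neo4j.driver.AuthTokens;
import org.neo4j.driver.Config;
import org.neo4j.driver.Driver;
import org.neo4j.driver.GraphDatabase;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class DriverConfig {

    @Bean
    public Driver neo4jDriver(@Value("${NEO4J_URL}") String uri,
                              @Value("${NEO4J_USERNAME}") String username,
                              @Value("${NEO4J_PASSWORD}") String password) {
        Config config = Config.builder()
                // Placeholder values: the driver defaults are a pool of 100 connections
                // and a 60 second acquisition timeout.
                .withMaxConnectionPoolSize(50)
                .withConnectionAcquisitionTimeout(60, TimeUnit.SECONDS)
                .withMaxConnectionLifetime(30, TimeUnit.MINUTES)
                .build();
        return GraphDatabase.driver(uri, AuthTokens.basic(username, password), config);
    }
}

If I read the Spring Boot docs correctly, the same knobs should also be reachable through the spring.neo4j.pool.* properties instead of a custom bean, but I have not confirmed whether that changes anything for our error.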
I'm playing around with Spring Boot and the reactive database driver R2DBC. In my main application I'm using Postgres as the database, and now I want to use H2 for the tests. The Flyway migration works with this setup, but the Spring application is not able to insert records.
Here is my setup and code:
@SpringBootTest
class CustomerRepositoryTest {

    @Autowired
    CustomerRepository repository;

    @Test
    void insertToDatabase() {
        repository.saveAll(List.of(new Customer("Jack", "Bauer"),
                new Customer("Chloe", "O'Brian"),
                new Customer("Kim", "Bauer"),
                new Customer("David", "Palmer"),
                new Customer("Michelle", "Dessler")))
            .blockLast(Duration.ofSeconds(10));
    }
}
Here is the error that I'm getting
:: Spring Boot :: (v2.3.4.RELEASE)
2020-10-14 15:59:18.538 INFO 25279 --- [ main] i.g.i.repository.CustomerRepositoryTest : Starting CustomerRepositoryTest on imalik8088.fritz.box with PID 25279 (started by imalik in /Users/imalik/code/private/explore-java/spring-example)
2020-10-14 15:59:18.540 INFO 25279 --- [ main] i.g.i.repository.CustomerRepositoryTest : No active profile set, falling back to default profiles: default
2020-10-14 15:59:19.108 INFO 25279 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data R2DBC repositories in DEFAULT mode.
2020-10-14 15:59:19.273 INFO 25279 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 160ms. Found 1 R2DBC repository interfaces.
2020-10-14 15:59:19.894 INFO 25279 --- [ main] o.f.c.internal.license.VersionPrinter : Flyway Community Edition 6.5.0 by Redgate
2020-10-14 15:59:20.052 INFO 25279 --- [ main] o.f.c.internal.database.DatabaseFactory : Database: jdbc:h2:mem:///DBNAME (H2 1.4)
2020-10-14 15:59:20.118 INFO 25279 --- [ main] o.f.core.internal.command.DbValidate : Successfully validated 1 migration (execution time 00:00.022s)
2020-10-14 15:59:20.131 INFO 25279 --- [ main] o.f.c.i.s.JdbcTableSchemaHistory : Creating Schema History table "PUBLIC"."flyway_schema_history" ...
2020-10-14 15:59:20.175 INFO 25279 --- [ main] o.f.core.internal.command.DbMigrate : Current version of schema "PUBLIC": << Empty Schema >>
2020-10-14 15:59:20.178 INFO 25279 --- [ main] o.f.core.internal.command.DbMigrate : Migrating schema "PUBLIC" to version 1.0.0 - schma
2020-10-14 15:59:20.204 INFO 25279 --- [ main] o.f.core.internal.command.DbMigrate : Successfully applied 1 migration to schema "PUBLIC" (execution time 00:00.036s)
2020-10-14 15:59:20.689 INFO 25279 --- [ main] i.g.i.repository.CustomerRepositoryTest : Started CustomerRepositoryTest in 2.466 seconds (JVM running for 3.326)
2020-10-14 15:59:21.115 DEBUG 25279 --- [ main] o.s.d.r2dbc.core.DefaultDatabaseClient : Executing SQL statement [INSERT INTO customer (first_name, last_name) VALUES ($1, $2)]
org.springframework.data.r2dbc.BadSqlGrammarException: executeMany; bad SQL grammar [INSERT INTO customer (first_name, last_name) VALUES ($1, $2)]; nested exception is io.r2dbc.spi.R2dbcBadGrammarException: [42102] [42S02] Tabelle "CUSTOMER" nicht gefunden
Table "CUSTOMER" not found; SQL statement:
INSERT INTO customer (first_name, last_name) VALUES ($1, $2) [42102-200]
My src/test/resources/application.yaml looks like this:
spring:
  r2dbc:
    url: r2dbc:h2:mem:///DBNAME?options=DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE
    username: sa
    password:
  flyway:
    url: jdbc:h2:mem:///DBNAME
    baseline-on-migrate: true
    user: sa
    password:
Any ideas what's missing or what's wrong with the setup? If further information is needed, please let me know.
Addition/Solution:
The URL pattern differs between JDBC and R2DBC. The working solution for me is as follows:
url: r2dbc:h2:file:///./tmp/test-database
url: jdbc:h2:file:./tmp/test-database
And in order to set up Flyway, you have to configure it explicitly:
// Flyway is not compatible with r2dbc yet, therefore this config class is created
@Configuration
public class FlywayConfig {

    private final Environment env;

    public FlywayConfig(final Environment env) {
        this.env = env;
    }

    @Bean(initMethod = "migrate")
    public Flyway flyway() {
        return new Flyway(Flyway.configure()
                .baselineOnMigrate(true)
                .dataSource(
                        env.getRequiredProperty("spring.flyway.url"),
                        env.getRequiredProperty("spring.flyway.user"),
                        env.getRequiredProperty("spring.flyway.password"))
        );
    }
}
I've faced the same issue setting up and accessing an in-memory H2 database for tests:
Liquibase for database migration, using the JDBC driver
Tests of a reactive CRUD repository, using the R2DBC driver
Error encountered:
org.springframework.data.r2dbc.BadSqlGrammarException: executeMany; bad SQL grammar [INSERT INTO MY_TABLE... Table "MY_TABLE" not found ...
Inspired by Chris's solution, I configured my src/test/resources/application.properties file as follows:
spring.r2dbc.url=r2dbc:h2:mem:///~/db/testdb
spring.r2dbc.username=sa
spring.r2dbc.password=
spring.liquibase.url=jdbc:h2:mem:~/db/testdb;DB_CLOSE_DELAY=-1
spring.liquibase.user=sa
spring.liquibase.password=
spring.liquibase.enabled=true
I am currently having the same problem using R2DBC with Liquibase. I suspect that the JDBC URL points to a different database due to the slightly different syntax between R2DBC and JDBC. I can manage to get H2 running from the file system, though...
url: r2dbc:h2:file:///~/db/testdb
...
url: jdbc:h2:file:~/db/testdb
EDIT:
In non-reactive Spring Data I'd usually populate the schema of an in-memory H2 database using a schema.sql/data.sql pair. This is also possible with R2DBC, but you have to configure the populator yourself.
It's also shown in the Getting Started R2DBC tutorial. Basically, you have to register a ConnectionFactoryInitializer bean:
@Bean
public ConnectionFactoryInitializer initializer(@Qualifier("connectionFactory") ConnectionFactory connectionFactory) {
    var initializer = new ConnectionFactoryInitializer();
    initializer.setConnectionFactory(connectionFactory);
    var populator = new CompositeDatabasePopulator();
    populator.addPopulators(new ResourceDatabasePopulator(new ClassPathResource("schema.sql")));
    populator.addPopulators(new ResourceDatabasePopulator(new ClassPathResource("data.sql")));
    initializer.setDatabasePopulator(populator);
    return initializer;
}
I was able to get it working.
First of all, I created the following test configuration class (because I want to execute tests only against H2; in production mode I am using PostgreSQL):
@TestConfiguration
public class TestConfig {

    @Bean
    @Profile("test")
    public ConnectionFactory connectionFactory() {
        System.out.println(">>>>>>>>>> Using H2 in mem R2DBC connection factory");
        return H2ConnectionFactory.inMemory("testdb");
    }

    @Bean(initMethod = "migrate")
    @Profile("test")
    public Flyway flyway() {
        System.out.println("####### Using H2 in mem Flyway connection");
        return new Flyway(Flyway.configure()
                .baselineOnMigrate(true)
                .dataSource(
                        "jdbc:h2:mem:testdb",
                        "sa",
                        "")
        );
    }
}
As you can see in the code above, both beans are scoped to the "test" profile only. As you can imagine, I have pretty much the same beans in a regular ApplicationConfiguration class, but annotated with @Profile("default") and configured to use PostgreSQL.
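For illustration, the default-profile counterpart could look roughly like the sketch below; the connection values are placeholders, not my exact production code:

import io.r2dbc.postgresql.PostgresqlConnectionConfiguration;
import io.r2dbc.postgresql.PostgresqlConnectionFactory;
import io.r2dbc.spi.ConnectionFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;

@Configuration
public class ApplicationConfiguration {

    @Bean
    @Profile("default")
    public ConnectionFactory connectionFactory() {
        // Placeholder connection values; replace them with the real PostgreSQL settings.
        return new PostgresqlConnectionFactory(PostgresqlConnectionConfiguration.builder()
                .host("localhost")
                .port(5432)
                .database("carts")
                .username("postgres")
                .password("postgres")
                .build());
    }
}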
The second thing is that I created an annotation which combines several other annotations, so that I don't repeat myself and can easily pick up the beans declared in the TestConfig class:
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Documented
@Inherited
@SpringBootTest
@ActiveProfiles("test")
@Import(TestConfig.class)
public @interface IntegrationTest {
}
Now the test itself:
@IntegrationTest
class CartsIntegrationTest {
    // test methods here ....
}
I believe the main hint is to use H2ConnectionFactory.inMemory("testdb");
Flyway currently only supports the blocking JDBC APIs and is not compatible with the reactive R2DBC; if possible, do not mix them in the same application.
Try to register a ConnectionFactoryInitializer to initialize the database schema and data as @Chris posted; my working example can be found here.
Alternatively, try nkonev/r2dbc-migrate, which aims to bring Flyway-style migrations to the R2DBC world.
There were 2 issues I was experiencing in my project.
I needed to include the dependency:
<dependency>
    <groupId>io.r2dbc</groupId>
    <artifactId>r2dbc-h2</artifactId>
    <scope>test</scope>
</dependency>
I needed to change the value for spring.r2dbc.url to r2dbc:h2:mem:///test_db
With these changes, R2DBC worked with an in-memory H2 database for testing. See also:
https://github.com/r2dbc/r2dbc-h2
I have an application with two submodule projects included in it.
rootProject.name = 'fete-bird-product'
include 'product-migration'
include 'Data'
The primary application, fete-bird-product, has the application.yml file:
spring:
  application:
    name: PRODUCT-SERVICE
  profiles:
    active: dev
  data:
    mongodb:
      uri: mongodb://127.0.1:27017/FeteBird-Product
app:
  db:
    migrations:
      enabled: true
server:
  port: 8083
From the submodule project product-migration, I want to access the property below in the main class:
data:
  mongodb:
    uri: mongodb://127.0.1:27017/FeteBird-Product
I tried something like this, but it didn't work:
@PropertySource(value = "classpath:PRODUCT-SERVICE")
public class Configuration {

    @Value("${spring.data.mongodb.uri}")
    private static String MongoUri;

    public static String getMongoUri() {
        return MongoUri;
    }
}
How can we access the primary application's properties from the submodule?
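To make the goal concrete, here is a sketch of the kind of access I am after from the product-migration module; the class name is illustrative and this is not working code I have:

import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

@Component
public class MigrationProperties {

    // Expected to be filled from the primary application's application.yml,
    // assuming this class is registered as a bean in the main application context.
    private final String mongoUri;

    public MigrationProperties(@Value("${spring.data.mongodb.uri}") String mongoUri) {
        this.mongoUri = mongoUri;
    }

    public String getMongoUri() {
        return mongoUri;
    }
}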
I want to move our Quartz Scheduling configuration to our application.yml instead of maintaining a separate quartz.properties file.
Our Spring Boot application runs and picks up the configuration as expected when using quartz.properties file, but it doesn't pick up the config from application.yml.
Scheduler bean:
@SpringBootApplication
public class MyApp {

    public static void main(String[] args) {
        SpringApplication.run(MyApp.class, args);
    }

    ...

    @Bean
    public Scheduler scheduler(SomeCustomConfig cfg, RestTemplate restTemplate) throws SchedulerException {
        //StdSchedulerFactory schedulerFactory = new StdSchedulerFactory();
        //schedulerFactory.initialize("quartz.properties");
        Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();
        scheduler.getContext().put("restTemplate", restTemplate);
        scheduler.getContext().put("cfg", cfg);
        return scheduler;
    }
}
Pertinent application.yml:
spring:
  application.name: myApp
  quartz:
    properties:
      org:
        quartz:
          scheduler:
            instanceId: AUTO
          threadPool:
            threadCount: 5
          plugin:
            shutdownhook:
              class: org.quartz.plugins.management.ShutdownHookPlugin
              cleanShutdown: TRUE
          jobStore:
            class: org.quartz.impl.jdbcjobstore.JobStoreTX
            driverDelegateClass: org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
            tablePrefix: my_schema.
            isClustered: true
            dataSource: myDataSource
          dataSource:
            myDataSource:
              driver: org.postgresql.Driver
              URL: jdbc:postgresql://localhost/myDataSource
              user: removed
              password: removed
Our quartz.properties was:
org.quartz.scheduler.instanceId = AUTO
org.quartz.plugin.shutdownhook.class = org.quartz.plugins.management.ShutdownHookPlugin
org.quartz.plugin.shutdownhook.cleanShutdown = TRUE
org.quartz.threadPool.threadCount = 5
org.quartz.jobStore.class = org.quartz.impl.jdbcjobstore.JobStoreTX
org.quartz.jobStore.driverDelegateClass = org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
org.quartz.jobStore.tablePrefix = my_schema.
org.quartz.jobStore.isClustered = true
org.quartz.jobStore.dataSource = myDataSource
org.quartz.dataSource.myDataSource.driver = org.postgresql.Driver
org.quartz.dataSource.myDataSource.URL = jdbc:postgresql://localhost/myDataSource
org.quartz.dataSource.myDataSource.user = removed
org.quartz.dataSource.myDataSource.password = removed
I feel like I'm missing something?
Instead of
spring:
  quartz:
    properties:
      org:
        quartz:
          jobStore:
            isClustered: true
Use this layout:
spring:
  quartz:
    properties:
      org.quartz.jobStore:
        isClustered: true
      org.quartz.scheduler:
        instanceId: AUTO
With the latter layout, I get:
2019-09-06 13:45:19.919 INFO PID --- [ main] o.q.c.QuartzScheduler : {} Scheduler meta-data: Quartz Scheduler (v2.3.0) 'quartzScheduler' with instanceId '0157799997'
Scheduler class: 'org.quartz.core.QuartzScheduler' - running locally.
NOT STARTED.
Currently in standby mode.
Number of jobs executed: 0
Using thread pool 'org.quartz.simpl.SimpleThreadPool' - with 10 threads.
Using job-store 'org.springframework.scheduling.quartz.LocalDataSourceJobStore' - which supports persistence. and is clustered.
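If the scheduler is still built by hand with StdSchedulerFactory, as in the question, the same spring.quartz.properties block can also be handed to it programmatically. The following is only a sketch of that idea; I have not run it against the question's exact setup, so treat the property binding as an assumption:

import java.util.Properties;

import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.impl.StdSchedulerFactory;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class QuartzConfig {

    // Binds everything under spring.quartz.properties into a flat Properties object.
    @Bean
    @ConfigurationProperties(prefix = "spring.quartz.properties")
    public Properties quartzProperties() {
        return new Properties();
    }

    @Bean
    public Scheduler scheduler(Properties quartzProperties) throws SchedulerException {
        StdSchedulerFactory factory = new StdSchedulerFactory();
        factory.initialize(quartzProperties);
        return factory.getScheduler();
    }
}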
Your application.yml configuration targets spring-boot-starter-quartz, but I think you are using org.quartz-scheduler independently. So you should configure your application.yml something like this:
spring:
  application.name: myApp
org:
  quartz:
    scheduler:
      instanceId: AUTO
    threadPool:
      threadCount: 5
    plugin:
      shutdownhook:
        class: org.quartz.plugins.management.ShutdownHookPlugin
        cleanShutdown: TRUE
    jobStore:
      class: org.quartz.impl.jdbcjobstore.JobStoreTX
      driverDelegateClass: org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
      tablePrefix: my_schema.
      isClustered: true
      dataSource: myDataSource
    dataSource:
      myDataSource:
        driver: org.postgresql.Driver
        URL: jdbc:postgresql://localhost/myDataSource
        user: removed
        password: removed
I have recently worked on a Spring Boot Quartz application and faced a similar issue where quartz.properties was not being detected, since I was using application.yml to hold the application's environment variables:
spring:
  quartz:
    properties:
      org.quartz.scheduler:
        instanceName: ${QUARTZ_SCHEDULER_INSTANCE_NAME:Scheduler}
        instanceId: ${QUARTZ_SCHEDULER_INSTANCE_ID:AUTO}
        makeSchedulerThreadDaemon: ${QUARTZ_SCHEDULER_MAKE_THREAD_DAEMON:true}
      org.quartz.jobStore:
        class: ${QUARTZ_JOBSTORE_CLASS:org.quartz.impl.jdbcjobstore.JobStoreTX}
        driverDelegateClass: ${QUARTZ_JOBSTORE_DRIVER:org.quartz.impl.jdbcjobstore.PostgreSQLDelegate}
        tablePrefix: ${QUARTZ_JOBSTORE_TABLE_PREFIX:qrtz_}
        isClustered: ${QUARTZ_JOBSTORE_ISCLUSTER:false}
        dataSource: ${QUARTZ_JOBSTORE_DATASOURCE:myDS}
        misfireThreshold: ${QUARTZ_JOBSTORE_MISFIRE_THRESHOLD:25000}
      org.quartz.threadPool:
        class: ${QUARTZ_THREADPOOL_CLASS:org.quartz.simpl.SimpleThreadPool}
        makeThreadsDaemons: ${QUARTZ_THREADPOOL_DAEMON:true}
        threadCount: ${QUARTZ_THREADPOOL_COUNT:20}
        threadPriority: ${QUARTZ_THREADPOOL_PRIORITY:5}
      org.quartz.dataSource:
        myDS:
          driver: ${SPRING_DATASOURCE_DRIVER:org.postgresql.Driver}
          URL: ${SPRING_DATASOURCE_URL:jdbc:postgresql://localhost:5432/postgres}
          user: ${SPRING_DATASOURCE_USERNAME:postgres}
          password: ${SPRING_DATASOURCE_PASSWORD:postgres}
          maxConnections: ${SPRING_DATASOURCE_MAX_CONNECTION:20}
          validationQuery: ${SPRING_DATASOURCE_VALIDATION_QUERY:select 1}
By using the above configuration in this format, I was not only able to trigger Quartz jobs, I was also able to store them in the database.
I am trying to wire a DataSource to get its properties from the application.yml file, but the DataSourceBuilder is not reading those properties. I referred to Stack Overflow as well as the Spring Boot docs but could not see anything missing in my code.
I am pasting the code below; it uses Spring Boot 1.4.3.RELEASE.
@SpringBootApplication
@EnableConfigurationProperties
@ComponentScan
public class MyApplication {

    @Bean(name = "dmDs")
    @Primary
    @ConfigurationProperties("spring.datasource")
    public DataSource dmDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    public String aBean() {
        DataSource ds = dmDataSource(); // creates a datasource with URL, username and password empty.
        return new String("");
    }
}
The application config file is as shown below:
spring:
  autoconfigure:
    exclude:
      - org.springframework.boot.autoconfigure.liquibase.LiquibaseAutoConfiguration
      - org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaAutoConfiguration
      - org.springframework.boot.autoconfigure.jms.JndiConnectionFactoryAutoConfiguration
      - org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration
  profiles:
    active: test
---
spring:
  profiles: test
  datasource:
    url: jdbc:oracle:thin:SOME_URL
    driver-class-name: oracle.jdbc.OracleDriver
    password: test
    username: test
datacollector:
  datasource:
    driver-class-name: oracle.jdbc.OracleDriver
    url: jdbc:oracle:thin:#SOME_URL
    username: user
    password: pass
I see in the logs that the properties are read from the application.yml file
[main] o.s.c.e.PropertySourcesPropertyResolver : Found key 'spring.datasource.url' in [applicationConfig: [classpath:/application.yml]] with type [String]
[main] o.s.c.e.PropertySourcesPropertyResolver : Found key 'spring.datasource.driver-class-name' in [applicationConfig: [classpath:/application.yml]] with type [String]
[main] o.s.c.e.PropertySourcesPropertyResolver : Found key 'spring.datasource.password' in [applicationConfig: [classpath:/application.yml]] with type [String]
[main] o.s.c.e.PropertySourcesPropertyResolver : Found key 'spring.datasource.username' in [applicationConfig: [classpath:/application.yml]] with type [String]
JdbcTemplateAutoConfiguration matched:
- #ConditionalOnClass found required classes 'javax.sql.DataSource', 'org.springframework.jdbc.core.JdbcTemplate' (OnClassCondition)
- #ConditionalOnSingleCandidate (types: javax.sql.DataSource; SearchStrategy: all) found a primary bean from beans 'cipDs', 'dmDs' (OnBeanCondition)
I am running the application as shown below:
public static void main(String[] args) {
    SpringApplication.run(new Object[]{DecisionManagementApplication.class, ApplicationConfig.class}, args);
}
If you want to use the bean Spring has created within its container, you need to inject it; you cannot use new.
Try:
@Bean
@Autowired
public String aBean(final DataSource myDS)
{
    return new String("Check myDS properties now");
}
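Since the condition report in the question mentions two DataSource candidates ('cipDs', 'dmDs'), it may also be worth qualifying the injection point explicitly. A small sketch, assuming the bean names from the question (@Qualifier is org.springframework.beans.factory.annotation.Qualifier):

@Bean
public String aBean(@Qualifier("dmDs") final DataSource dmDs)
{
    // dmDs is the @Primary "dmDs" bean bound from spring.datasource.*
    return new String("Check dmDs properties now");
}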