Spring Cloud Data Flow datasource overrides Spring Batch app datasource - java

I'm setting up an instance of Spring Cloud Data Flow. I've run the following command:
java -jar spring-cloud-dataflow-server-2.9.2.jar \
--spring.cloud.dataflow.features.streams-enabled=false \
--spring.cloud.dataflow.features.schedules-enabled=true \
--spring.datasource.url=jdbc:postgresql://localhost:5432/batch \
--spring.datasource.username=postgres \
--spring.datasource.password=postgres \
--spring.datasource.driver-class-name=org.postgresql.Driver \
--spring.datasource.initialization_mode=always
I've developed a batch job using Spring Batch to be deployed on this platform. The job uses two data sources: batch for the Spring Batch and Spring Cloud Task metadata, and app_db for my business logic. When I run the app locally, it persists metadata in batch and my business data in app_db, as expected. The problem is when I try to execute the job inside Spring Cloud Data Flow: the platform overrides my configured business-logic database and uses only the batch database, which is supposed to store metadata only.
application.yaml
spring:
  batch:
    datasource:
      url: jdbc:postgresql://localhost:5432/batch
      username: postgres
      password: postgres
  datasource:
    url: jdbc:postgresql://localhost:5432/app_db
    username: postgres
    password: postgres
DatasourceConfiguration
@Configuration
public class DatasourceConfiguration {

    @Bean
    @ConfigurationProperties("spring.datasource")
    @Primary
    public DataSourceProperties dataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    @Primary
    public DataSource dataSource(DataSourceProperties dataSourceProperties) {
        return dataSourceProperties.initializeDataSourceBuilder().build();
    }

    @Bean(name = "batchDataSourceProperties")
    @ConfigurationProperties("spring.batch.datasource")
    public DataSourceProperties batchDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean(name = "batchDataSource")
    public DataSource batchDataSource(
            @Qualifier("batchDataSourceProperties") DataSourceProperties batchDataSourceProperties) {
        return batchDataSourceProperties.initializeDataSourceBuilder().build();
    }
}
@SpringBootApplication
@EnableTask
@EnableBatchProcessing
public class BatchApplication {

    @Bean
    public TaskConfigurer taskConfigurer(@Qualifier("batchDataSource") DataSource dataSource) {
        return new DefaultTaskConfigurer(dataSource);
    }

    @Bean
    public BatchConfigurer batchConfigurer(@Qualifier("batchDataSource") DataSource dataSource) {
        return new DefaultBatchConfigurer(dataSource);
    }

    public static void main(String[] args) {
        SpringApplication.run(BatchApplication.class, args);
    }
}
Job
@Bean
public Job startJob(JobBuilderFactory jobBuilderFactory, DataSource dataSource) {
    try {
        System.out.println(dataSource.getConnection().getMetaData().getURL());
    } catch (Exception e) {
        // TODO: handle exception
    }
    // ... the rest of the job definition is omitted here
}
When I print the data source URL, jdbc:postgresql://localhost:5432/app_db is printed when the job is executed locally and jdbc:postgresql://localhost:5432/batch is printed when the job (task) is executed from SCDF.
I want to know how Data Flow is overriding the application's spring.datasource even though I am not passing any arguments when executing the task. Please suggest a solution to avoid the datasource being overridden.
One solution I am thinking of is creating an AppDatasourceConfiguration (app.datasource) and using it. But is there a way to keep using spring.datasource without it being overridden by SCDF?
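For reference, the workaround I have in mind would look roughly like this (just a sketch; the app.datasource prefix is my own naming, not something defined by SCDF). The idea is to bind the business database to its own property namespace so that the spring.datasource.* values SCDF passes to the launched task only reach the task/batch metadata store:

import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.autoconfigure.jdbc.DataSourceProperties;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

@Configuration
public class AppDatasourceConfiguration {

    // Business database, bound to the custom app.datasource.* properties.
    @Bean
    @Primary
    @ConfigurationProperties("app.datasource")
    public DataSourceProperties appDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    @Primary
    public DataSource appDataSource(
            @Qualifier("appDataSourceProperties") DataSourceProperties properties) {
        return properties.initializeDataSourceBuilder().build();
    }

    // Metadata database, bound to spring.datasource.* so that whatever SCDF
    // injects at launch time lands here (and only here).
    @Bean(name = "batchDataSourceProperties")
    @ConfigurationProperties("spring.datasource")
    public DataSourceProperties batchDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean(name = "batchDataSource")
    public DataSource batchDataSource(
            @Qualifier("batchDataSourceProperties") DataSourceProperties properties) {
        return properties.initializeDataSourceBuilder().build();
    }
}

With this, the TaskConfigurer and BatchConfigurer above would keep pointing at batchDataSource, and the business logic would use the primary appDataSource configured under app.datasource in application.yaml. I would still prefer to keep spring.datasource for the business database if that is possible.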

Related

Can I run Liquibase database migrations after the app has been initialized with Spring Boot?

Context
I am trying to start my Spring app without a database (so that when no database is available at initialization the app won't be stopped). I managed to do this with the following properties in app.prop:
# DB should not kill the app
# the app should continue if a SQL init error arises
spring.sql.init.continue-on-error=true
# don't initialize the Liquibase bean at startup; without this the app crashes anyway
spring.liquibase.enabled=false
spring.jpa.hibernate.ddl-auto=none
Now the only thing I need to do is figure out a way to execute the Liquibase migration files once the app does make a successful connection with the db. For this I understand I need to customize the Liquibase bean; the following code shows my progress so far:
@Configuration
public class Config {

    @Value("${postgres.host}")
    private String host;
    @Value("${postgres.port}")
    private Integer port;
    @Value("${postgres.database}")
    private String database;
    @Value("${postgres.user}")
    private String user;
    @Value("${postgres.password}")
    private String password;
    @Value("${spring.liquibase.change-log}")
    private String changelog;

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName("org.postgresql.Driver");
        dataSource.setUrl(String.format("jdbc:postgresql://%s:%d/%s", host, port, database));
        dataSource.setUsername(user);
        dataSource.setPassword(password);
        return dataSource;
    }

    @Bean
    public SpringLiquibase liquibase() {
        SpringLiquibase liquibase = new SpringLiquibase();
        liquibase.setDataSource(dataSource());
        liquibase.setChangeLog(changelog);
        return liquibase;
    }
}
Preferably, if the database is down the bean should not be created, and if the database is running / the server establishes a connection with the db at some point, the bean should be brought into the context and execute the migration files. I don't know if that is possible as I am a newbie, but let me know if you have any suggestions.
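One direction that might work (a sketch, not a verified solution): keep spring.liquibase.enabled=false and trigger SpringLiquibase manually from a scheduled retry once a connection can actually be opened. The class below is hypothetical (DeferredLiquibaseRunner is my own name), assumes @EnableScheduling is active on some configuration class, and reuses the DataSource bean from the Config class above:

import java.sql.Connection;
import java.sql.SQLException;

import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.core.io.ResourceLoader;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

import liquibase.integration.spring.SpringLiquibase;

@Component
public class DeferredLiquibaseRunner {

    private final DataSource dataSource;
    private final ResourceLoader resourceLoader;
    private final String changelog;
    private volatile boolean migrated = false;

    public DeferredLiquibaseRunner(DataSource dataSource,
                                   ResourceLoader resourceLoader,
                                   @Value("${spring.liquibase.change-log}") String changelog) {
        this.dataSource = dataSource;
        this.resourceLoader = resourceLoader;
        this.changelog = changelog;
    }

    // Retry every 30 seconds; once a connection can be opened, run the changelog once.
    @Scheduled(fixedDelay = 30_000)
    public void migrateWhenDatabaseIsUp() throws Exception {
        if (migrated) {
            return;
        }
        try (Connection ignored = dataSource.getConnection()) {
            SpringLiquibase liquibase = new SpringLiquibase();
            liquibase.setDataSource(dataSource);
            liquibase.setChangeLog(changelog);
            liquibase.setResourceLoader(resourceLoader);
            liquibase.afterPropertiesSet(); // this is what actually runs the changelog
            migrated = true;
        } catch (SQLException e) {
            // database still unreachable; try again on the next tick
        }
    }
}

This does not prevent the bean from existing while the database is down, but it defers the actual migration until a connection succeeds, which seems to match the requirement.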

datasource.getConnection() not working in Spring Boot application

My db properties are kept in application-test.properties (I am running the Spring Boot application with the test profile) and the DataSource is injected with the @Autowired annotation. A NullPointerException is thrown when I try to call datasource.getConnection().
I have looked at similar questions, and most of them involve solutions with XML bean configurations. In my case I am not explicitly using any bean configuration. All datasource properties are kept in the application-test.properties file and I access them through the injected DataSource. I am a newbie to Spring Boot and any help would be great.
My repository class
@Repository
public class ActualUserDetailsDAO {

    @Autowired
    DataSource dataSource;

    public String getPriorityType(String idNo) throws Exception {
        Connection con = null;
        PreparedStatement ps = null;
        ResultSet rs = null;
        String cxPriorityType = null;
        int count = 0;
        try {
            con = dataSource.getConnection();
            String sql = ConfigurationHandler.getInstance().getConfigValue("sample.query");
            ......................
        } catch (SQLException e) {
            ................
        } catch (Exception e) {
            ..............
        } finally {
            .................
        }
        return cxPriorityType;
    }
}
My application properties
spring.main.banner-mode=off
server.port=8180
# Datasource settings
spring.datasource.initialize=true
spring.datasource.type=org.apache.tomcat.jdbc.pool.DataSource
spring.datasource.driver-class-name=oracle.jdbc.driver.OracleDriver
spring.datasource.name=camst2
spring.datasource.url=jdbc:oracle:thin:@..................
spring.datasource.username=username
spring.datasource.password=password
# Tomcat JDBC settings
spring.datasource.tomcat.initial-size=10
spring.datasource.tomcat.max-active=100
spring.datasource.tomcat.min-idle=10
spring.datasource.tomcat.max-idle=100
#spring.datasource.tomcat.max-wait=6000
spring.datasource.tomcat.max-wait=30000
#spring.datasource.tomcat.test-on-connect=true
#spring.datasource.tomcat.test-on-borrow=true
#spring.datasource.tomcat.test-on-return=true
# Tomcat AccessLog
server.tomcat.accesslog.suffix=.log
server.tomcat.accesslog.prefix=access_log
server.tomcat.accesslog.enabled=true
server.tomcat.accesslog.directory=/tomcat/logs
server.tomcat.accesslog.pattern=%h %l %u %t %r %s %b %D
My application class
@SpringBootApplication
public class Application {

    @Autowired
    DataSource dataSource;

    public static void main(String[] args) throws Exception {
        SpringApplication.run(Application.class, args);
    }
}
I found the solution. The problem was in my controller class. I was creating an instance of my repository class myself. I should have used @Autowired instead.
@RestController
public class ActualUserDetails implements ActualUserDetailsInt {

    @RequestMapping(value = "/foo", method = RequestMethod.GET, produces = MediaType.APPLICATION_JSON_VALUE)
    public ResponseEntity<Object> getActualUserDetails(@PathVariable("idNo") String idNo, @RequestParam("lob") String lob,
            @RequestParam("offerSellingType") String offerSellingType) {
        //do something
        ActualUserDetailsDAO actualUserDetailsDAO = new ActualUserDetailsDAO();
        actualUserDetailsDAO.getPriorityType(idNo);
        //do something
I changed it to the following.
@RestController
public class ActualUserDetails implements ActualUserDetailsInt {

    @Autowired
    ActualUserDetailsDAO actualUserDetailsDAO;

    @RequestMapping(value = "/foo", method = RequestMethod.GET, produces = MediaType.APPLICATION_JSON_VALUE)
    public ResponseEntity<Object> getActualUserDetails(@PathVariable("idNo") String idNo,
            @RequestParam("lob") String lob,
            @RequestParam("offerSellingType") String offerSellingType) {
        //do something
        actualUserDetailsDAO.getPriorityType(idNo);
        //do something
Manually creating an object of my repository class did not pick up the dataSource defined inside it. Autowiring the repository class into my controller class solved the problem.
If your data source is not being detected for any reason, I strongly recommend taking a deeper look at your code.
Here are some things to check when this kind of error happens:
Look for the correct folder structure (the application properties file resides under the resources folder).
If you are running Spring with a different profile (say the test profile), make sure the relevant configuration is written in application-test.properties.
Check for the proper annotations in the relevant classes.
Make sure your application properties are not overridden by any other configuration.
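A related note on the fix above: constructor injection makes this kind of mistake harder, because the controller cannot be built without Spring supplying the repository. A sketch of the same controller using constructor injection (same class names as above, request mapping unchanged):

@RestController
public class ActualUserDetails implements ActualUserDetailsInt {

    private final ActualUserDetailsDAO actualUserDetailsDAO;

    // Spring injects the repository through the constructor, so an un-managed
    // instance created with "new" cannot slip in.
    public ActualUserDetails(ActualUserDetailsDAO actualUserDetailsDAO) {
        this.actualUserDetailsDAO = actualUserDetailsDAO;
    }

    // ... request mappings as above, calling actualUserDetailsDAO.getPriorityType(idNo)
}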

Why does my custom DataSource getConnection() throw "SQLException: The url cannot be null"?

I am writing a Spring Boot (Batch) application, that should exit with a specific exit code. A requirement is to return an exit code, when the database cannot be connected.
My approach is to handle this exception as early as possible by explicitly creating a DataSource bean, calling getConnection() and catch and throw a custom exception that implements ExitCodeGenerator. The configuration is as follows:
@Configuration
@EnableBatchProcessing
public class BatchConfiguration {
    ...
    @Bean
    @Primary
    @ConfigurationProperties("spring.datasource")
    public DataSourceProperties dataSourceProps() {
        return new DataSourceProperties();
    }

    @Bean
    @ConfigurationProperties("spring.datasource")
    public DataSource customDataSource(DataSourceProperties props) {
        DataSource ds = props.initializeDataSourceBuilder().create().build();
        try {
            ds.getConnection();
        } catch (SQLException e) {
            throw new DBConnectionException(e); // implements ExitCodeGenerator interface
        }
        return ds;
    }
    ...
}
I want to reuse as much of the Spring Boot auto-configuration as possible; that's why I use @ConfigurationProperties. I do not know if this is the way to go.
A call on DataSourceProperties.getUrl() returns the configured url (from my properties file):
spring.datasource.url=jdbc:oracle:....
But why does Spring Boot throw this exception when I call DataSource.getConnection():
java.sql.SQLException: The url cannot be null
at java.sql.DriverManager.getConnection(DriverManager.java:649) ~[?:1.8.0_141]
at java.sql.DriverManager.getConnection(DriverManager.java:208) ~[?:1.8.0_141]
at org.apache.tomcat.jdbc.pool.PooledConnection.connectUsingDriver(PooledConnection.java:308) ~[tomcat-jdbc-8.5.15.jar:?]
at org.apache.tomcat.jdbc.pool.PooledConnection.connect(PooledConnection.java:203) ~[tomcat-jdbc-8.5.15.jar:?]
at org.apache.tomcat.jdbc.pool.ConnectionPool.createConnection(ConnectionPool.java:735) ~[tomcat-jdbc-8.5.15.jar:?]
at org.apache.tomcat.jdbc.pool.ConnectionPool.borrowConnection(ConnectionPool.java:667) ~[tomcat-jdbc-8.5.15.jar:?]
at org.apache.tomcat.jdbc.pool.ConnectionPool.init(ConnectionPool.java:482) [tomcat-jdbc-8.5.15.jar:?]
at org.apache.tomcat.jdbc.pool.ConnectionPool.<init>(ConnectionPool.java:154) [tomcat-jdbc-8.5.15.jar:?]
at org.apache.tomcat.jdbc.pool.DataSourceProxy.pCreatePool(DataSourceProxy.java:118) [tomcat-jdbc-8.5.15.jar:?]
at org.apache.tomcat.jdbc.pool.DataSourceProxy.createPool(DataSourceProxy.java:107) [tomcat-jdbc-8.5.15.jar:?]
at org.apache.tomcat.jdbc.pool.DataSourceProxy.getConnection(DataSourceProxy.java:131) [tomcat-jdbc-8.5.15.jar:?]
at com.foo.bar.BatchConfiguration.customDataSource(BatchConfiguration.java:xxx) [main/:?]
...
Or do you know some cleaner way of handling this situation?
Thanks
Edit: Spring Boot version is 1.5.4
The error is subtle and lies in the line
DataSource ds = props.initializeDataSourceBuilder().create().build();
Calling create() creates a new DataSourceBuilder and erases the preconfigured properties.
props.initializeDataSourceBuilder() already returns a DataSourceBuilder with all the properties (url, username, etc.) set, so you only have to add new properties or call build() directly. The solution is to remove create():
DataSource ds = props.initializeDataSourceBuilder().build();
In this context the dataSourceProps() method bean can be removed too.
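Putting that together, the corrected bean would look roughly like this (a sketch that keeps your dataSourceProps() bean; DBConnectionException is your exception that implements ExitCodeGenerator):

@Bean
public DataSource customDataSource(DataSourceProperties props) {
    // initializeDataSourceBuilder() already carries the bound url, username,
    // password and driver class, so build() alone is enough.
    DataSource ds = props.initializeDataSourceBuilder().build();
    try {
        ds.getConnection().close(); // fail fast if the database is unreachable
    } catch (SQLException e) {
        throw new DBConnectionException(e); // implements ExitCodeGenerator
    }
    return ds;
}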
It looks like you don't set any values on your DataSource.
props.initializeDataSourceBuilder().create().build(); does not copy your property values into the datasource. It just creates and builds one.
Try setting your values manually using the static DataSourceBuilder. You can get the values from your dataSourceProps bean like this:
@Configuration
@EnableBatchProcessing
public class BatchConfiguration {
    ...
    @Bean
    @Primary
    @ConfigurationProperties("spring.datasource")
    public DataSourceProperties dataSourceProps() {
        return new DataSourceProperties();
    }

    @Bean
    @ConfigurationProperties("spring.datasource")
    public DataSource customDataSource(DataSourceProperties props) {
        DataSource ds = DataSourceBuilder.create()
                .driverClassName(dataSourceProps().getDriverClassName())
                .url(dataSourceProps().getUrl())
                .username(dataSourceProps().getUsername())
                .password(dataSourceProps().getPassword())
                .build();
        try {
            ds.getConnection();
        } catch (SQLException e) {
            throw new DBConnectionException(e); // implements ExitCodeGenerator interface
        }
        return ds;
    }
    ...
}

How can I run the Flyway migration before Hibernate validation?

I use Flyway + Hibernate validate. I have a Flyway bean:
@Component
public class DbMigration {
    private static final Logger LOG = LoggerFactory.getLogger(DbMigration.class);

    private final Config config;

    @Autowired
    public DbMigration(Config config) {
        this.config = config;
    }

    public void runMigration() {
        try {
            Flyway flyway = new Flyway();
            flyway.configure(properties());
            int migrationApplied = flyway.migrate();
            LOG.info("[" + migrationApplied + "] migrations are applied");
        } catch (FlywayException ex) {
            throw new DatabaseException("Exception during database migrations: ", ex);
        }
    }

    public Properties properties() {
        // my prop
    }
}
And in my Application class I do this:
public static void main(String[] args) {
    try {
        AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext(ApplicationConfiguration.class);
        context.getBean(DbMigration.class).runMigration();
But Hibernate starts before runMigration(), and the validation throws an exception. How can I make it run in this order?
run the migration
start Hibernate validation
EDIT:
@Bean
@Autowired
public LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource datasource) {
    log.info("entityManagerFactory start");
    dbMigration.runMigration();
But I think this is a bad approach.
In your Spring application configuration, if you have an entity manager factory bean definition, you can make it depend on the Flyway bean so that it gets initialized after it. Something like:
@Bean
@DependsOn("flyway")
public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
    // Initialize EntityManagerFactory here
}
The flyway bean configuration can be something like:
@Bean(initMethod = "migrate")
public Flyway flyway() {
    Flyway flyway = new Flyway();
    // configure bean here
    return flyway;
}

Spring-Data-Neo4J: How do I log into the remote server?

I'm using Spring-Data-Neo4j 4.0.0.M1, and trying to connect to the server. I'm getting an exception:
Caused by: org.apache.http.client.HttpResponseException: Unauthorized
I have a password on the server interface, but I'm not sure how to tell Spring about it.
@Configuration
@EnableNeo4jRepositories(basePackages = "com.noxgroup.nitro.persistence")
@EnableTransactionManagement
public class MyConfiguration extends Neo4jConfiguration {

    @Bean
    public Neo4jServer neo4jServer() {
        /*** I was quite surprised not to see an overloaded parameter here ***/
        return new RemoteServer("http://localhost:7474");
    }

    @Bean
    public SessionFactory getSessionFactory() {
        return new SessionFactory("org.my.software.domain");
    }

    @Bean
    ApplicationListener<BeforeSaveEvent> beforeSaveEventApplicationListener() {
        return new ApplicationListener<BeforeSaveEvent>() {
            @Override
            public void onApplicationEvent(BeforeSaveEvent event) {
                if (event.getEntity() instanceof User) {
                    User user = (User) event.getEntity();
                    user.encodePassword();
                }
            }
        };
    }
}
Side Note
4.0.0 Milestone 1 is absolutely fantastic. If anyone is using 3.x.x, I'd recommend checking it out!
The username and password are currently passed via system properties,
e.g.
-Drun.jvmArguments="-Dusername=<usr> -Dpassword=<pwd>"
or
System.setProperty("username", "neo4j");
System.setProperty("password", "password");
https://jira.spring.io/browse/DATAGRAPH-627 is open (not targeted for 4.0 RC1, though); please feel free to add comments or vote it up.
