Currently I have a setup like the one below. When the batch job runs locally, it creates the necessary metadata tables automatically using the data-source property values, since initialize-schema is set to always. Liquibase also runs and creates any tables listed in its changelog.
Here is my application.yml file:
spring:
  batch:
    initialize-schema: always
    job:
      enabled: true
  liquibase:
    url: db_url
    user: deploy_user
    password: deploy_pass
    change-log: classpath:db/changelog/db.changelog-master.yaml
    enabled: true
data-source:
  mysql:
    user: r_user
    password: r_pass
    jdbc-url: db_url
Here is my db.changelog-master.yaml file.
databaseChangeLog:
  - changeSet:
      dbms: mysql
      id: create-sample-table
      author: me
      sql: |
        CREATE TABLE sample_table (
          sample_id VARCHAR(255) NOT NULL,
          sample_text TEXT,
          PRIMARY KEY (sample_id)
        ) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin;
MySQL data source config:
@Configuration
public class DataSourceConfiguration {

    @Primary
    @Bean(name = "mySQLDataSource")
    @ConfigurationProperties("data-source.mysql")
    public DataSource mySQLDataSource() {
        return DataSourceBuilder.create().type(HikariDataSource.class).build();
    }
}
Liquibase Configuration (probably posting more than what's needed):
@Configuration
@EnableConfigurationProperties(LiquibaseProperties.class)
public class LiquibaseConfiguration {

    private static final Logger LOG = LoggerFactory.getLogger(LiquibaseConfiguration.class);

    @Autowired
    private LiquibaseProperties liquibaseProperties;

    public DataSource liquibaseDataSource() {
        DataSourceBuilder factory = DataSourceBuilder
                .create()
                .url(liquibaseProperties.getUrl())
                .username(liquibaseProperties.getUser())
                .password(liquibaseProperties.getPassword());
        return factory.build();
    }

    public void testLiquibaseConnection() throws SQLException {
        LOG.info("Testing connection to Liquibase (in case PCF restarts and we have stale dynamic secrets)...");
        liquibaseDataSource().getConnection();
        LOG.info("Testing connection to Liquibase (in case PCF restarts and we have stale dynamic secrets)... Succeeded");
    }

    @Bean
    public SpringLiquibase liquibase() {
        try {
            testLiquibaseConnection();
        } catch (Exception ex) {
            LOG.warn("WARNING: Could not connect to the database using " + liquibaseProperties.getUser()
                    + ", so we will be skipping the Liquibase migration for now.", ex);
            return null;
        }
        SpringLiquibase liquibase = new SpringLiquibase();
        liquibase.setChangeLog(this.liquibaseProperties.getChangeLog());
        liquibase.setContexts(this.liquibaseProperties.getContexts());
        liquibase.setDataSource(liquibaseDataSource());
        liquibase.setDefaultSchema(this.liquibaseProperties.getDefaultSchema());
        liquibase.setDropFirst(this.liquibaseProperties.isDropFirst());
        liquibase.setShouldRun(this.liquibaseProperties.isEnabled());
        liquibase.setLabels(this.liquibaseProperties.getLabels());
        liquibase.setChangeLogParameters(this.liquibaseProperties.getParameters());
        return liquibase;
    }
}
The issue is that we have different credentials for creating/deploying tables and for reading/writing to tables in our deployed environments. So the above setup works for creating tables via Liquibase, but fails to create the metadata tables on deployment because the data-source credentials are wrong for that. Our current workaround to get the metadata tables created is to deploy with the data-source properties set to the deploy credentials, run the job to initialize the tables, and then redeploy with the read/write credentials. (We can't just leave the deploy credentials in place for reads because they have a very short TTL.)
Is it possible to create the metadata tables for Spring Batch via Liquibase automatically? Specifically, without adding the creation SQL manually to the changelog files?
UPDATE:
Using veljkost's answer below, a changelog file that looks like this works:
databaseChangeLog:
  - changeSet:
      dbms: mysql
      id: create-spring-batch-metadata
      author: dev.me
      changes:
        - sqlFile:
            encoding: UTF-8
            path: classpath:/org/springframework/batch/core/schema-mysql.sql
            relativeToChangelogFile: false
            splitStatements: true
            stripComments: true
Yes, you can reference the schema files that already ship with the Spring Batch project. In the org.springframework.batch.core package you can find schema-*.sql files, where * is the name of the targeted DB. Since you are running on MySQL, your change set would look something like this:
- changeSet:
    id: 1234
    author: adam.sandler
    changes:
      - sqlFile:
          encoding: utf8
          path: classpath:/org/springframework/batch/core/schema-mysql.sql
          relativeToChangelogFile: false
          splitStatements: true
          stripComments: true
To create the schema automatically without using Liquibase, add
spring.batch.initialize-schema=always
to your application.properties file; Spring Boot will then run the Spring Batch schema scripts against the configured data source on startup.
Related
I'm trying to connect to a PostgreSQL instance on Google Cloud from a Spring Boot application, using beans.
I created a bean:
@Bean
@Primary
@ConfigurationProperties(prefix = "spring.datasource")
fun createConnectionPool(): DataSource {
    val config = HikariConfig()
    config.jdbcUrl = String.format("jdbc:postgresql:///%s", DB_NAME)
    config.username = DB_USER
    config.password = DB_PASS
    config.addDataSourceProperty("cloudSqlInstance", INSTANCE_CONNECTION_NAME)
    return HikariDataSource(config)
}
and the application.yml file looks like this:
spring:
  jpa:
    properties:
      hibernate:
        jdbc:
          lob:
            non_contextual_creation: true
  cloud:
    gcp:
      projectId: my-project-id
      sql:
        instance-connection-name: "my-instance"
        databaseName: "my-database-name"
but I am getting this error:
could not obtain connection to query metadata
o.p.u.PSQLException: The server requested password-based authentication, but no password was provided by plugin null
but I gave it the password :(
I want to mention that I cannot put the password in application.yml, because it comes from an external source and I cannot access it in the application.yml file.
What am I doing wrong? Any idea is welcome. Thanks.
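One thing worth checking, offered as a hedged guess rather than a confirmed fix: the Cloud SQL JDBC connector only engages when the socketFactory pool property is set; without it the driver attempts a direct connection and authentication can fail in exactly this way. A minimal Java sketch, assuming the com.google.cloud.sql:postgres-socket-factory dependency is on the classpath and reusing the question's placeholder constants:

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import javax.sql.DataSource;

public class CloudSqlDataSourceFactory {

    // Placeholders mirror the question; supply values from your external secret source.
    static DataSource create(String dbName, String dbUser, String dbPass, String instanceConnectionName) {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl(String.format("jdbc:postgresql:///%s", dbName));
        config.setUsername(dbUser);
        config.setPassword(dbPass);
        // Route connections through the Cloud SQL socket factory instead of plain TCP.
        config.addDataSourceProperty("socketFactory", "com.google.cloud.sql.postgres.SocketFactory");
        config.addDataSourceProperty("cloudSqlInstance", instanceConnectionName);
        return new HikariDataSource(config);
    }
}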
I am using Spring Boot and Spring JDBC with Derby. Below is the code snippet that initializes the embedded database.
@Bean
public DataSource dataSource() {
    // no need to shut down explicitly; EmbeddedDatabaseFactoryBean takes care of this
    EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder();
    EmbeddedDatabase db = builder
            .setType(EmbeddedDatabaseType.DERBY) // .H2 or .DERBY
            .addScript("db/sql/create-db.sql")
            .addScript("db/sql/insert-data.sql")
            .build();
    return db;
}
But when I run the application, the tables are dropped and recreated each time, so all the data inserted in the last run is lost. I don't want to drop the tables. How can I achieve this?
Set the property below in your application.properties/application.yml:
spring.jpa.hibernate.ddl-auto=update
As per the Spring documentation here:
spring.jpa.hibernate.ddl-auto
DDL mode. This is actually a shortcut for the "hibernate.hbm2ddl.auto"
property. Defaults to "create-drop" when using an embedded database
and no schema manager was detected. Otherwise, defaults to "none".
As you are using an embedded database, it defaults to create-drop.
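Worth noting: spring.jpa.hibernate.ddl-auto only controls Hibernate's schema generation; the addScript(...) calls above still run on every startup, and EmbeddedDatabaseBuilder creates an in-memory Derby database, so its data cannot survive a restart in any case. If the goal is data that persists between runs, one alternative is a file-based Derby database; a minimal sketch, where the database path and bean placement are assumptions:

import javax.sql.DataSource;
import org.apache.derby.jdbc.EmbeddedDriver;
import org.springframework.jdbc.datasource.SimpleDriverDataSource;

@Bean
public DataSource dataSource() {
    SimpleDriverDataSource dataSource = new SimpleDriverDataSource();
    dataSource.setDriverClass(EmbeddedDriver.class);
    // ";create=true" creates the database directory only when it does not
    // exist yet, so tables and data from previous runs are kept.
    dataSource.setUrl("jdbc:derby:data/appdb;create=true");
    return dataSource;
}

With this setup the create/insert scripts would need to be applied once (or guarded), since Spring no longer re-creates the database on each start.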
The application has a default Spring data source specified in application.yml:
spring:
  datasource:
    type: com.zaxxer.hikari.HikariDataSource
    url: jdbc:oracle:thin:@localhost:1521:xe
    username: system
    password: oracle
    hikari:
      poolName: Hikari
      auto-commit: false
I have added configuration options for a second data source, used for a completely different purpose (JdbcTemplate).
faas20:
  ds:
    url: jdbc:oracle:thin:@tldb0147vm.group.net:1760:tdb
    username: ...
    password: ...
Then I add two data sources, one named and the other default. Without the default one, Liquibase fails to start.
@Configuration
public class LegacyConfiguration {

    @Bean(name = "faas20")
    @ConfigurationProperties(prefix = "faas20.ds")
    public DataSource legacyDataSource() {
        return DataSourceBuilder
                .create()
                .build();
    }

    @Bean
    public DataSource defaultDataSource() {
        return DataSourceBuilder
                .create()
                .build();
    }
}
Startup of the application fails though.
The application now cannot build the default EntityManagerFactory.
Why would that be affected?
Parameter 0 of constructor in impl.OrderServiceImpl required a bean named 'entityManagerFactory' that could not be found.
Consider defining a bean named 'entityManagerFactory' in your configuration.
Without the two data sources present, the application and Liquibase start up as they should.
Edit:
I am not clear on how to configure the two separate data sources:
Default data source for JPA
Additional data source for use in JDBC (and potentially other JPA classes)
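A common cause of this particular failure, offered as a hedged guess: once a second DataSource bean exists, Spring Boot's JPA auto-configuration can no longer choose one unambiguously, so the entityManagerFactory bean is never created. Marking the default data source @Primary, and binding it to spring.datasource so it actually receives its URL and credentials (both are assumptions, not shown in the original config), usually restores startup:

@Configuration
public class LegacyConfiguration {

    @Bean(name = "faas20")
    @ConfigurationProperties(prefix = "faas20.ds")
    public DataSource legacyDataSource() {
        return DataSourceBuilder.create().build();
    }

    // @Primary lets JPA and Liquibase auto-configuration pick this bean
    // unambiguously when more than one DataSource is present.
    @Bean
    @Primary
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource defaultDataSource() {
        return DataSourceBuilder.create().build();
    }
}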
I want to connect to multiple MySQL databases in my Spring Boot application. One of the DBs is mapped as entities, while from the other DB I fetch data in query form. So whenever I write a custom query it should go to one DB, and whenever I use repository methods it should use the other.
Change your application.properties file as follows:
#first db
spring.datasource.url = [url]
spring.datasource.username = [username]
spring.datasource.password = [password]
spring.datasource.driverClassName = oracle.jdbc.OracleDriver
#second db ...
spring.secondDatasource.url = [url]
spring.secondDatasource.username = [username]
spring.secondDatasource.password = [password]
spring.secondDatasource.driverClassName = oracle.jdbc.OracleDriver
And change your configuration file, i.e. add the following beans:
@Bean
@Primary
@ConfigurationProperties(prefix = "spring.datasource")
public DataSource primaryDataSource() {
    return DataSourceBuilder.create().build();
}

@Bean
@ConfigurationProperties(prefix = "spring.secondDatasource")
public DataSource secondaryDataSource() {
    return DataSourceBuilder.create().build();
}
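To round this out, a sketch of how the second data source might then be consumed; the bean wiring below is an assumption, not part of the original answer. Repositories and JPA use the @Primary data source automatically, while custom queries go through a JdbcTemplate wired explicitly to the secondary one:

@Bean
public JdbcTemplate secondaryJdbcTemplate(@Qualifier("secondaryDataSource") DataSource dataSource) {
    // Custom queries issued through this template hit the second database.
    return new JdbcTemplate(dataSource);
}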
In my tests I need to test against different databases (MySQL, Oracle, etc.), and I would like to know if that is possible with SpringRunner.
I'm using the @SqlGroup and @Sql annotations, but I haven't discovered how to indicate which database a script file (SQL) corresponds to.
Example:
@Sql(executionPhase = Sql.ExecutionPhase.BEFORE_TEST_METHOD, scripts = "classpath:tenantBeforeTestRun.sql")
This annotation configures my test to execute the script for all database types, but this file doesn't work on Oracle.
The @Sql annotation lets you define a @SqlConfig, which contains a data source bean name.
Then you can define several data source beans, possibly with different drivers, and refer to them from different @Sql annotations. This might be helpful: Spring Boot Multiple Datasource
@Sql(..., config = @SqlConfig(dataSource = "db1", ...))
application.properties:
#first db
spring.db1.url = [url]
spring.db1.username = [username]
spring.db1.password = [password]
spring.db1.driverClassName = oracle.jdbc.OracleDriver
#second db ...
spring.secondDatasource.url = [url]
spring.secondDatasource.username = [username]
spring.secondDatasource.password = [password]
spring.secondDatasource.driverClassName = oracle.jdbc.OracleDriver
Then, somewhere in a @Configuration class:
#Bean(name = "db1")
#ConfigurationProperties(prefix="spring.db1")
public DataSource secondaryDataSource() {
return DataSourceBuilder.create().build();
}
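And a sketch of a test that targets the named data source; the class and script names are illustrative assumptions:

@RunWith(SpringRunner.class)
@SpringBootTest
public class TenantOracleTest {

    @Test
    @Sql(executionPhase = Sql.ExecutionPhase.BEFORE_TEST_METHOD,
         scripts = "classpath:tenantBeforeTestRun-oracle.sql",
         config = @SqlConfig(dataSource = "db1"))
    public void runsAgainstOracle() {
        // Assertions against the Oracle-backed data source go here.
    }
}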