How can I start flyway migration before hibernate validation? - java

I use flyway + hibernate validate. I have flyway bean:
@Component
public class DbMigration {

    private static final Logger LOG = LoggerFactory.getLogger(DbMigration.class);

    private final Config config;

    @Autowired
    public DbMigration(Config config) {
        this.config = config;
    }

    public void runMigration() {
        try {
            Flyway flyway = new Flyway();
            flyway.configure(properties());
            int migrationApplied = flyway.migrate();
            LOG.info("[" + migrationApplied + "] migrations are applied");
        } catch (FlywayException ex) {
            throw new DatabaseException("Exception during database migrations: ", ex);
        }
    }

    public Properties properties() {
        // my properties
    }
}
And in my Application class I do:
public static void main(String[] args) {
    try {
        AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext(ApplicationConfiguration.class);
        context.getBean(DbMigration.class).runMigration();
But Hibernate starts before runMigration(), and validation throws an exception. How can I make it run in this order?
1. run the migration
2. start Hibernate validation
EDIT:
@Bean
@Autowired
public LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource datasource) {
    log.info("entityManagerFactory start");
    dbMigration.runMigration();
But I think this is a bad approach.

In your Spring application configuration, if you have an entity manager factory bean definition, you can make it depend on the Flyway bean so that it is initialized only after the migrations have run. Something like:
@Bean
@DependsOn("flyway")
public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
    // Initialize EntityManagerFactory here
}
The Flyway bean configuration can be something like:
@Bean(initMethod = "migrate")
public Flyway flyway() {
    Flyway flyway = new Flyway();
    // configure the bean here
    return flyway;
}
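Putting the two pieces together, a minimal combined configuration might look like the sketch below. This is an illustrative example only: the class name, the `setDataSource` call (pre-Flyway-6 API, matching the question's `new Flyway()` style), and the entity package are assumptions, not code from the original post.

```java
import javax.sql.DataSource;
import org.flywaydb.core.Flyway;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.DependsOn;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;

@Configuration
public class PersistenceConfig {

    // initMethod = "migrate" makes Spring run the migrations as soon as this
    // bean is created, i.e. before any bean that @DependsOn it.
    @Bean(initMethod = "migrate")
    public Flyway flyway(DataSource dataSource) {
        Flyway flyway = new Flyway();
        flyway.setDataSource(dataSource); // pre-Flyway-6 setter API, as in the question
        return flyway;
    }

    // Hibernate's schema validation runs when this factory is built, so
    // forcing it to wait for the "flyway" bean fixes the ordering problem.
    @Bean
    @DependsOn("flyway")
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource dataSource) {
        LocalContainerEntityManagerFactoryBean emf = new LocalContainerEntityManagerFactoryBean();
        emf.setDataSource(dataSource);
        emf.setPackagesToScan("com.example.entities"); // illustrative package name
        return emf;
    }
}
```

With this wiring the manual `context.getBean(DbMigration.class).runMigration()` call in `main` becomes unnecessary; the container enforces the order itself.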

Related

Spring Boot force reload DB connection on INI file read

I have an application with a UI which will manage some facets of a Spring Boot application while it is live.
First of all, there will be an INI file name passed in when the application starts, which will contain a username, password, and host for the DB connection.
I have successfully implemented this dynamic database capability by starting the Spring Boot application after the initial INI file load.
However, I need to be able to change the @Primary data source on the fly during execution.
The user will click a button in the UI, the INI file will load from a source specified in the UI, and I want the DB connection to drop, switch to the new properties read from the INI file, and reconnect.
It seems to me that the @RefreshScope + actuator approach will not work for me, since I am refreshing from a UI inside the application and not from an endpoint.
AbstractRoutingDataSource seems to require knowing the DB connection properties for the various sources at compile time, and it is a lot more complex than I think is necessary to solve a simple problem such as this. I would expect there to be some class that allows a simple reload by telling it to call getDataSource again and reinitialize.
Configuration class:
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
        entityManagerFactoryRef = "smsEntityManagerFactory",
        transactionManagerRef = "smsTransactionManager",
        basePackages = { "com.conceptualsystems.sms.db.repository" })
public class JpaConfig {

    @Bean(name="SMSX")
    @Primary
    public DataSource getDataSource() {
        Logger logger = LoggerFactory.getLogger(this.getClass());
        logger.error("DATABASE INITIALIZING: getDataSource() called!");
        DataSourceBuilder builder = DataSourceBuilder.create();
        builder.driverClassName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
        builder.username(IniSettings.getInstance().getIniFile().getDbUser());
        builder.password(IniSettings.getInstance().getIniFile().getDbPassword());
        String host = IniSettings.getInstance().getIniFile().getDbPath();
        String db = IniSettings.getInstance().getIniFile().getDbName();
        String connectionString = "jdbc:sqlserver://" + host + ";databaseName=" + db;
        logger.info("Connecting [" + connectionString + "] as [" +
                IniSettings.getInstance().getIniFile().getDbUser() + ":" +
                IniSettings.getInstance().getIniFile().getDbPassword() + "]");
        builder.url(connectionString);
        return builder.build();
    }

    @Bean(name = "smsEntityManagerFactory")
    @Primary
    public LocalContainerEntityManagerFactoryBean smsEntityManagerFactory(
            EntityManagerFactoryBuilder builder,
            @Qualifier("SMSX") DataSource dataSource) {
        return builder
                .dataSource(dataSource)
                .persistenceUnit("smsEntityManagerFactory")
                .packages("com.conceptualsystems.sms.db.entity")
                .build();
    }

    @Bean(name = "smsTransactionManager")
    @Primary
    public PlatformTransactionManager smsTransactionManager(
            @Qualifier("smsEntityManagerFactory") EntityManagerFactory entityManagerFactory) {
        return new JpaTransactionManager(entityManagerFactory);
    }
}
main entry point:
public static void main(String[] args) {
    try {
        UIManager.setLookAndFeel(new FlatLightLaf());
    } catch (Exception ex) {
        System.err.println("Failed to initialize LaF");
    }
    javax.swing.SwingUtilities.invokeLater(() -> createAndShowSplash());
    System.out.println("Scrap Management System v" + VERSION);
    String iniFilename = null;
    String user = null;
    String pass = null;
    for (String arg : args) {
        if (arg.startsWith(ARG_HELP)) {
            System.out.println(HELP_TEXT);
            System.exit(0);
        }
        if (arg.startsWith(ARG_INI)) {
            iniFilename = arg.substring(ARG_INI.length());
            System.out.println("User entered INI file location on command line: " + iniFilename);
        }
        if (arg.startsWith(ARG_USER)) {
            user = arg.substring(ARG_USER.length());
            System.out.println("User entered DB username on command line: " + user);
        }
        if (arg.startsWith(ARG_PSWD)) {
            pass = arg.substring(ARG_PSWD.length());
            System.out.println("User entered DB password on command line: [****]");
        }
    }
    mIniFile = new IniFile(iniFilename);
    try {
        mIniFile.load();
    } catch (Exception e) {
        System.out.println("Error loading INI file!");
        e.printStackTrace();
    }
    IniSettings.getInstance().setIniFile(mIniFile);
    System.out.println("INI file set completed, starting Spring Boot Application context...");
    SplashFrame.getInstance().enableSiteSelection();
    mApplicationContext = new SpringApplicationBuilder(Main.class)
            .web(WebApplicationType.SERVLET)
            .headless(false)
            .bannerMode(Banner.Mode.LOG)
            .run(args);
    try {
        IniSettings.getInstance().setIniSource(new IniJPA());
        IniSettings.getInstance().load();
    } catch (Exception e) {
        System.out.println("Unable to setup INI from database!");
        e.printStackTrace();
    }
}
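One common pattern for the on-the-fly swap described above is to keep a single stable @Primary bean and replace only its delegate when the INI file is reloaded. The sketch below uses Spring's DelegatingDataSource for this; the class name, the `reload` method, and the wiring are illustrative assumptions, not code from the original post.

```java
import java.io.Closeable;
import java.io.IOException;
import javax.sql.DataSource;
import org.springframework.jdbc.datasource.DelegatingDataSource;

// Hypothetical wrapper: the @Primary bean stays the same object for the whole
// application lifetime; only its delegate is replaced on reload.
public class ReloadableDataSource extends DelegatingDataSource {

    public ReloadableDataSource(DataSource initial) {
        super(initial);
    }

    // Called from the UI handler after the new INI file has been read and a
    // fresh DataSource has been built from its properties.
    public synchronized void reload(DataSource fresh) {
        DataSource old = getTargetDataSource();
        setTargetDataSource(fresh);
        if (old instanceof Closeable) {        // e.g. HikariDataSource
            try {
                ((Closeable) old).close();     // drop the old pool's connections
            } catch (IOException ignored) {
            }
        }
    }
}
```

Because the JPA beans hold a reference to the wrapper rather than the concrete pool, nothing else needs to be rewired; new `getConnection()` calls simply go to the new target. Connections already handed out keep belonging to the old pool until they are returned.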

flapdoodle.embed.mongo always starts with the Spring Boot main application in Eclipse; how can I remove it?

I have got a problem. A simple Spring Boot application works fine with an existing MongoDB configuration.
For integration tests, I added the required configuration for embedded MongoDB with flapdoodle. All the unit tests execute properly. However, when I run the main Spring Boot application, it by default picks up the flapdoodle embedded MongoDB configuration. As a result, the embedded MongoDB never exits, and it is still running while the JUnit test cases run. I provide below the code snippet.
Whenever I start the main Spring Boot application, it still runs the embedded MongoDB, and I always see the following lines in the console.
Download PRODUCTION:Windows:B64 START
Download PRODUCTION:Windows:B64 DownloadSize: 231162327
Download PRODUCTION:Windows:B64 0% 1% 2% 3% 4% 5% 6% 7% 8%
I provide the MongoDB configuration which should be picked up when running the main Spring Boot application.
@Slf4j
@Configuration
public class NoSQLAutoConfiguration {

    @Autowired
    private NoSQLEnvConfigProperties configProperties;

    /**
     * Morphia.
     *
     * @return the morphia
     */
    private Morphia morphia() {
        final Morphia morphia = new Morphia();
        morphia.mapPackage(DS_ENTITY_PKG_NAME);
        return morphia;
    }

    @Bean
    public Datastore datastore(@Autowired @Qualifier("dev") MongoClient mongoClient) {
        String dbName = configProperties.getDatabase();
        final Datastore datastore = morphia().createDatastore(mongoClient, dbName);
        datastore.ensureIndexes();
        return datastore;
    }

    /**
     * Mongo client.
     *
     * @return the mongo client
     */
    @Primary
    @Bean(name = "dev")
    public MongoClient mongoClient() {
        MongoClient mongoClient = null;
        String dbHost = configProperties.getHost();
        int dbPort = configProperties.getPort();
        String database = configProperties.getDatabase();
        log.debug("MongoDB Host: {} - MongoDB Port: {}", dbHost, dbPort);
        List<ServerAddress> serverAddresses = new ArrayList<>();
        serverAddresses.add(new ServerAddress(dbHost, dbPort));
        MongoClientOptions options = getMongoOptions();
        String dbUserName = configProperties.getMongodbUsername();
        String encRawPwd = configProperties.getMongodbPassword();
        char[] dbPwd = null;
        try {
            dbPwd = Util.decode(encRawPwd).toCharArray();
        } catch (Exception ex) {
            // Ignore exception
            dbPwd = null;
        }
        Optional<String> userName = Optional.ofNullable(dbUserName);
        Optional<char[]> password = Optional.ofNullable(dbPwd);
        if (userName.isPresent() && password.isPresent()) {
            MongoCredential credential = MongoCredential.createCredential(dbUserName, database, dbPwd);
            List<MongoCredential> credentialList = new ArrayList<>();
            credentialList.add(credential);
            mongoClient = new MongoClient(serverAddresses, credentialList, options);
        } else {
            log.debug("Connecting to local Mongo DB");
            mongoClient = new MongoClient(dbHost, dbPort);
        }
        return mongoClient;
    }

    private MongoClientOptions getMongoOptions() {
        MongoClientOptions.Builder builder = MongoClientOptions.builder();
        builder.maxConnectionIdleTime(configProperties.getMongodbIdleConnection());
        builder.minConnectionsPerHost(configProperties.getMongodbMinConnection());
        builder.connectTimeout(configProperties.getMongodbConnectionTimeout());
        return builder.build();
    }
}
For integration testing, I have the configuration for embedded MongoDB, which is part of src/test.
@TestConfiguration
public class MongoConfiguration implements InitializingBean, DisposableBean {

    MongodExecutable executable;

    private static final String DBNAME = "embeded";
    private static final String DBHOST = "localhost";
    private static final int DBPORT = 27019;

    @Override
    public void afterPropertiesSet() throws Exception {
        IMongodConfig mongodConfig = new MongodConfigBuilder().version(Version.Main.PRODUCTION)
                .net(new Net(DBHOST, DBPORT, Network.localhostIsIPv6())).build();
        MongodStarter starter = MongodStarter.getDefaultInstance();
        executable = starter.prepare(mongodConfig);
        executable.start();
    }

    private Morphia morphia() {
        final Morphia morphia = new Morphia();
        morphia.mapPackage(DS_ENTITY_PKG_NAME);
        return morphia;
    }

    @Bean
    public Datastore datastore(@Autowired @Qualifier("test") MongoClient mongoClient) {
        final Datastore datastore = morphia().createDatastore(mongoClient, DBNAME);
        datastore.ensureIndexes();
        return datastore;
    }

    @Bean(name = "test")
    public MongoClient mongoClient() {
        return new MongoClient(DBHOST, DBPORT);
    }

    @Override
    public void destroy() throws Exception {
        executable.stop();
    }
}
Please help me remove this embedded Mongo configuration when running the main Spring Boot application in Eclipse.
I also provide my main application below.
@EnableAspectJAutoProxy
@EnableSwagger2
@SpringBootApplication(scanBasePackages = { "com.blr.app" })
public class ValidationApplication {

    /**
     * The main method.
     *
     * @param args the arguments
     */
    public static void main(String[] args) {
        SpringApplication.run(ValidationApplication.class, args);
    }
}
From the code I can see that you have not added any profile to the MongoConfiguration class, so during an Eclipse build this class is also picked up by the Spring framework. Add the lines below to the class so that it is only picked up while running Spring Boot tests, and the actual Mongo configuration class is picked up while running the main Spring Boot app. This is why Spring has the concept of profiles: add the appropriate profile for each environment.
@Profile("test")
@ActiveProfiles("test")
So the final code will look like this:
@Profile("test")
@ActiveProfiles("test")
@TestConfiguration
public class MongoConfiguration implements InitializingBean, DisposableBean {
    ...
    ...
}
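For completeness, the tests then need to activate that profile; @ActiveProfiles is normally placed on the test class itself. The sketch below is illustrative (the test class name is invented), and note that a top-level @TestConfiguration class is excluded from component scanning, so tests pull it in explicitly with @Import:

```java
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.context.annotation.Import;
import org.springframework.test.context.ActiveProfiles;

// Hypothetical integration test: activating the "test" profile makes Spring
// select the @Profile("test") MongoConfiguration instead of the production
// NoSQLAutoConfiguration.
@SpringBootTest
@ActiveProfiles("test")
@Import(MongoConfiguration.class) // @TestConfiguration classes are not auto-scanned
class ValidationApplicationIT {
    // ... tests using the embedded MongoDB ...
}
```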

Spring Boot Rest + ClearDB mySQL "exceeded 'max_user_connections'" error

I keep randomly getting this error every once in a while: "java.sql.SQLSyntaxErrorException: User '{key}' has exceeded the 'max_user_connections' resource (current value: 10)".
I have tried googling for help, but all I can find is:
"increase the max connections limit" (which can't be done in free ClearDB)
"adjust the maxActive amount" or "release old connections" (neither of which I can find how to do in Spring Boot)
Here's what my code looks like:
// application.properties
# Connect to heroku ClearDB MySql database
spring.datasource.url=jdbc:mysql://{heroku_url}?reconnect=true
spring.datasource.username={user}
spring.datasource.password={password}
# Hibernate ddl auto (create, create-drop, update)
spring.jpa.hibernate.ddl-auto=update
#MySQL DIALECT
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.MySQL5Dialect
spring.jpa.open-in-view=false
server.port=8080
@Configuration
public class DatabaseConfig {

    @Value("${spring.datasource.url}")
    private String dbUrl;

    @Bean
    public DataSource dataSource() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl(dbUrl);
        return new HikariDataSource(config);
    }
}
EDIT 1: I was following PauMAVA's instructions as best I could and came up with this code, which for some reason fails:
@Configuration
public class DatabaseConfig {

    @Value("${spring.datasource.url}")
    private String dbUrl;

    public static DataSource ds;

    @Bean
    public DataSource dataSource() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl(dbUrl);
        DataSource ds = new HikariDataSource(config);
        DatabaseConfig.ds = ds;
        return ds;
    }
}
// Main class
public static void main(String[] args) {
    SpringApplication.run(BloggerApplication.class, args);
    Runtime.getRuntime().addShutdownHook(new Thread(new Runnable() {
        public void run() {
            DataSource ds = DatabaseConfig.ds;
            if (ds != null) {
                try {
                    ds.getConnection().close();
                } catch (SQLException e) {
                    e.printStackTrace();
                }
            }
        }
    }, "Shutdown-thread"));
}
Whenever you create a connection object in your code, it is advisable to close it in a finally block. This way the number of connections does not get exhausted.
Hope this helps!
You should close the DataSource on application termination so that no unused connections remain open. Note that javax.sql.DataSource itself has no close() method; you need the concrete pool type, e.g. HikariDataSource:
public void close(HikariDataSource ds) {
    if (ds != null) {
        ds.close();
    }
}
But do this only on program termination as stated here.
To use the data source later (when closing) you can keep it as a field in your class:
private HikariDataSource ds;

@Bean
public DataSource dataSource() {
    HikariConfig config = new HikariConfig();
    config.setJdbcUrl(dbUrl);
    HikariDataSource ds = new HikariDataSource(config);
    this.ds = ds;
    return ds;
}
If you are going to have more than one data source you can take a List-based approach:
private List<HikariDataSource> activeDataSources = new ArrayList<>();

public DataSource dataSource() {
    HikariConfig config = new HikariConfig();
    config.setJdbcUrl(dbUrl);
    HikariDataSource ds = new HikariDataSource(config);
    this.activeDataSources.add(ds);
    return ds;
}

public void closeAllDataSources() {
    for (HikariDataSource ds : this.activeDataSources) {
        if (ds != null) {
            ds.close();
        }
    }
    this.activeDataSources = new ArrayList<>();
}
To execute a function on program close refer to this.
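Beyond closing pools at shutdown, the error itself ("max_user_connections (current value: 10)") usually means the pool is allowed to open more connections than the account permits. Capping HikariCP below the limit may help; the sketch below is an assumption on top of the answer above, and the specific values are illustrative:

```java
import javax.sql.DataSource;
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

// Sketch of a pool configuration that stays under ClearDB's 10-connection cap.
public DataSource dataSource(String dbUrl) {
    HikariConfig config = new HikariConfig();
    config.setJdbcUrl(dbUrl);
    config.setMaximumPoolSize(5);   // never open more than 5 connections
    config.setMinimumIdle(1);       // let the pool shrink when idle
    config.setIdleTimeout(30_000);  // retire idle connections after 30 s
    return new HikariDataSource(config);
}
```

In Spring Boot the same cap can be expressed as `spring.datasource.hikari.maximum-pool-size=5` in application.properties, without a custom DataSource bean.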

Connecting quartz datasource to spring boot bean

I have set up a connection to our database using a bean in Spring Boot. This all works correctly in our normal application.
@Bean(name="MoliDBConfig")
@Primary
public DataSource dataSource() throws SQLException {
I would like to connect to the same datasource from Quartz, but am getting a JNDI error. (As an aside, it is worth noting that I have managed to connect to a datasource from Quartz by providing the config details manually; see the commented-out code in quartz.properties below.)
2019-03-19T10:51:52.342+00:00 [APP/PROC/WEB/0] [OUT] ERROR 2019-03-19 10:51:52.333 - o.q.u.JNDIConnectionProvider 126 Error looking up datasource: Need to specify class name in environment or system property, or as an applet parameter, or in an application resource file: java.naming.factory.initial
javax.naming.NoInitialContextException: Need to specify class name in environment or system property, or as an applet parameter, or in an application resource file: java.naming.factory.initial
    at javax.naming.spi.NamingManager.getInitialContext(NamingManager.java:662) ~[?:1.8.0_202]
    at javax.naming.InitialContext.getDefaultInitCtx(InitialContext.java:313) ~[?:1.8.0_202]
    at javax.naming.InitialContext.getURLOrDefaultInitCtx(InitialContext.java:350) ~[?:1.8.0_202]
    at javax.naming.InitialContext.lookup(InitialContext.java:417) ~[?:1.8.0_202]
    at org.quartz.utils.JNDIConnectionProvider.init(JNDIConnectionProvider.java:124) [quartz-2.3.0.jar!/:?]
    at org.quartz.utils.JNDIConnectionProvider.(JNDIConnectionProvider.java:102) [quartz-2.3.0.jar!/:?]
    at org.quartz.impl.StdSchedulerFactory.instantiate(StdSchedulerFactory.java:995) [quartz-2.3.0.jar!/:?]
    at org.quartz.impl.StdSchedulerFactory.getScheduler(StdSchedulerFactory.java:1559) [quartz-2.3.0.jar!/:?]
    at com.xxx.d3.moli.schedule.QrtzScheduler.scheduler(QrtzScheduler.java:52) [classes/:?]
    at com.xxx.d3.moli.schedule.QrtzScheduler$$EnhancerBySpringCGLIB$$aa50aa7b.CGLIB$scheduler$1() [classes/:?]
    at com.xxx.d3.moli.schedule.QrtzScheduler$$EnhancerBySpringCGLIB$$aa50aa7b$$FastClassBySpringCGLIB$$374ea1c1.invoke() [classes/:?]
    at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:228) [spring-core-4.3.22.RELEASE.jar!/:4.3.22.RELEASE]
    at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:358) [spring-context-4.3.22.RELEASE.jar!/:4.3.22.RELEASE]
    at com.xxx.d3.moli.schedule.QrtzScheduler$$EnhancerBySpringCGLIB$$aa50aa7b.scheduler() [classes/:?]
quartz.properties
# Configure Main Scheduler Properties
org.quartz.scheduler.instanceName = MyClusteredScheduler
org.quartz.scheduler.instanceId = AUTO
# thread-pool
org.quartz.threadPool.class=org.quartz.simpl.SimpleThreadPool
org.quartz.threadPool.threadCount=2
org.quartz.threadPool.threadPriority = 5
org.quartz.threadPool.threadsInheritContextClassLoaderOfInitializingThread=true
# job-store
#org.quartz.jobStore.class = org.quartz.impl.jdbcjobstore.JobStoreTX
org.quartz.jobStore.class = org.springframework.scheduling.quartz.LocalDataSourceJobStore
#org.quartz.jobStore.dataSource = myDS
org.quartz.jobStore.dataSource = managedTXDS
org.quartz.jobStore.driverDelegateClass = org.quartz.impl.jdbcjobstore.oracle.OracleDelegate
org.quartz.jobStore.useProperties = false
org.quartz.jobStore.tablePrefix = QRTZ_
org.quartz.jobStore.misfireThreshold = 60000
org.quartz.jobStore.isClustered = true
org.quartz.jobStore.clusterCheckinInterval = 20000
# Configure Datasources
org.quartz.dataSource.managedTXDS.jndiURL=java:comp/env/jdbc/MoliDBConfig
org.quartz.dataSource.myDS.driver = oracle.jdbc.driver.OracleDriver
org.quartz.dataSource.myDS.URL = jdbc:oracle:thin:@ldap://oid.xxx.com:389/odsod012,cn=OracleContext,dc=xxx,dc=com
org.quartz.dataSource.myDS.user = MOLI_QRTZ_SCHED
org.quartz.dataSource.myDS.password = MOLI_QRTZ_SCHED
org.quartz.dataSource.myDS.maxConnections = 5
org.quartz.dataSource.myDS.validationQuery=select 0 from dual
# A different classloader is needed to work with Spring Boot dev mode,
# see https://docs.spring.io/spring-boot/docs/current/reference/html/using-boot-devtools.html#using-boot-devtools-known-restart-limitations
# and https://github.com/quartz-scheduler/quartz/issues/221
org.quartz.scheduler.classLoadHelper.class=org.quartz.simpl.ThreadContextClassLoadHelper
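A likely cause of the stack trace above (an assumption, not stated in the original post): the `org.quartz.dataSource.managedTXDS.jndiURL` entry makes Quartz's JNDIConnectionProvider perform an InitialContext lookup, which fails in a standalone Spring Boot app that has no JNDI environment. LocalDataSourceJobStore is instead designed to receive its DataSource from Spring's SchedulerFactoryBean, in which case the `org.quartz.jobStore.dataSource` and `org.quartz.dataSource.*` properties are not needed at all. A minimal sketch (bean and parameter names are illustrative):

```java
import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.scheduling.quartz.SchedulerFactoryBean;

// Sketch: hand the Spring-managed DataSource straight to Quartz. With this,
// LocalDataSourceJobStore uses it directly and no JNDI lookup ever happens.
@Bean
public SchedulerFactoryBean quartzScheduler(DataSource moliDataSource) {
    SchedulerFactoryBean factory = new SchedulerFactoryBean();
    factory.setDataSource(moliDataSource); // the @Primary "MoliDBConfig" bean
    return factory;
}
```

This is essentially what the larger configuration at the end of this question arrives at.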
And my quartz config file
@Configuration
@Profile({"oracle-cloud","mysql-cloud"})
public class QrtzScheduler {

    private static final Logger LOGGER = LogManager.getLogger(QrtzScheduler.class);

    @Autowired
    private ApplicationContext applicationContext;

    @PostConstruct
    public void init() {
        LOGGER.info("Hello world from Quartz...");
    }

    @Bean
    public SpringBeanJobFactory springBeanJobFactory() {
        AutoWiringSpringBeanJobFactory jobFactory = new AutoWiringSpringBeanJobFactory();
        LOGGER.debug("Configuring Job factory");
        jobFactory.setApplicationContext(applicationContext);
        return jobFactory;
    }

    @Bean
    public Scheduler scheduler(Trigger trigger, JobDetail job) throws SchedulerException, IOException {
        StdSchedulerFactory factory = new StdSchedulerFactory();
        factory.initialize(new ClassPathResource("quartz/quartz.properties").getInputStream());
        LOGGER.debug("Getting a handle to the Scheduler");
        Scheduler scheduler = factory.getScheduler();
        scheduler.setJobFactory(springBeanJobFactory());
        if (scheduler.checkExists(job.getKey())) {
            scheduler.deleteJob(job.getKey());
        }
        scheduler.scheduleJob(job, trigger);
        LOGGER.debug("Starting Scheduler threads");
        scheduler.start();
        return scheduler;
    }

    @Bean
    public JobDetail jobDetail() {
        return JobBuilder.newJob()
                .ofType(ScheduledJob.class)
                .storeDurably()
                .withIdentity(JobKey.jobKey("Qrtz_Job_Detail"))
                .withDescription("Invoke Sample Job service...")
                .build();
    }

    @Bean
    public Trigger trigger(JobDetail job) {
        int frequencyInMin = 5;
        LOGGER.info("Configuring trigger to fire every {} minutes", frequencyInMin);
        return TriggerBuilder.newTrigger().forJob(job)
                .withIdentity(TriggerKey.triggerKey("Qrtz_Trigger"))
                .withDescription("Sample trigger")
                .withSchedule(simpleSchedule().withIntervalInMinutes(frequencyInMin).repeatForever())
                .build();
    }
}
What is wrong with my approach? (The documentation at quartz-scheduler.org all appears to be down) :-(
So I changed over to this:
@Configuration
@Profile({"oracle-cloud","mysql-cloud"})
public class QrtzScheduler {

    private static final Logger LOGGER = LogManager.getLogger(QrtzScheduler.class);

    @Autowired
    private ApplicationContext applicationContext;

    @Autowired
    @Qualifier("MoliDBConfig")
    private DataSource dataSource;

    @Value("${app.repeatInterval}")
    private int repeatInterval;

    @Value("${org.quartz.scheduler.instanceName}")
    private String instanceName;

    @Value("${org.quartz.scheduler.instanceId}")
    private String instanceId;

    @Value("${org.quartz.threadPool.threadCount}")
    private String threadCount;

    @Value("${org.quartz.threadPool.class}")
    private String threadClass;

    @Value("${org.quartz.threadPool.threadPriority}")
    private String threadPriority;

    @Value("${org.quartz.threadPool.threadsInheritContextClassLoaderOfInitializingThread}")
    private String threadsInheritContextClassLoaderOfInitializingThread;

    @Value("${org.quartz.jobStore.class}")
    private String jobStoreClass;

    @Value("${org.quartz.jobStore.driverDelegateClass}")
    private String jobStoreDriverDelegateClass;

    @Value("${org.quartz.jobStore.useProperties}")
    private String jobStoreUseProperties;

    @Value("${org.quartz.jobStore.tablePrefix}")
    private String jobStoreTablePrefix;

    @Value("${org.quartz.jobStore.misfireThreshold}")
    private String jobStoreMisfireThreshold;

    @Value("${org.quartz.jobStore.isClustered}")
    private String jobStoreIsClustered;

    @Value("${org.quartz.jobStore.clusterCheckinInterval}")
    private String jobStoreClusterCheckinInterval;

    @Value("${org.quartz.scheduler.classLoadHelper.class}")
    private String schedulerClassLoadHelperClass;

    @PostConstruct
    public void init() {
        LOGGER.info("Hello world from Quartz...");
    }

    @Bean
    public SpringBeanJobFactory jobFactory() {
        AutoWiringSpringBeanJobFactory jobFactory = new AutoWiringSpringBeanJobFactory();
        LOGGER.debug("Configuring Job factory");
        jobFactory.setApplicationContext(applicationContext);
        return jobFactory;
    }

    @Bean
    public SchedulerFactoryBean schedulerFactoryBean() {
        LOGGER.debug("Configuring schedulerFactoryBean");
        SchedulerFactoryBean factory = new SchedulerFactoryBean();
        factory.setOverwriteExistingJobs(true);
        factory.setJobFactory(jobFactory());
        Properties quartzProperties = new Properties();
        quartzProperties.setProperty("org.quartz.scheduler.instanceName", instanceName);
        quartzProperties.setProperty("org.quartz.scheduler.instanceId", instanceId);
        quartzProperties.setProperty("org.quartz.threadPool.threadCount", threadCount);
        quartzProperties.setProperty("org.quartz.threadPool.class", threadClass);
        quartzProperties.setProperty("org.quartz.threadPool.threadPriority", threadPriority);
        quartzProperties.setProperty("org.quartz.threadPool.threadsInheritContextClassLoaderOfInitializingThread", threadsInheritContextClassLoaderOfInitializingThread);
        quartzProperties.setProperty("org.quartz.jobStore.class", jobStoreClass);
        quartzProperties.setProperty("org.quartz.jobStore.driverDelegateClass", jobStoreDriverDelegateClass);
        quartzProperties.setProperty("org.quartz.jobStore.useProperties", jobStoreUseProperties);
        quartzProperties.setProperty("org.quartz.jobStore.tablePrefix", jobStoreTablePrefix);
        quartzProperties.setProperty("org.quartz.jobStore.misfireThreshold", jobStoreMisfireThreshold);
        quartzProperties.setProperty("org.quartz.jobStore.isClustered", jobStoreIsClustered);
        quartzProperties.setProperty("org.quartz.jobStore.clusterCheckinInterval", jobStoreClusterCheckinInterval);
        quartzProperties.setProperty("org.quartz.scheduler.classLoadHelper.class", schedulerClassLoadHelperClass);
        factory.setDataSource(dataSource);
        factory.setQuartzProperties(quartzProperties);
        factory.setTriggers(moliJobTrigger().getObject());
        return factory;
    }

    @Bean(name = "moliJobTrigger")
    public SimpleTriggerFactoryBean moliJobTrigger() {
        long minute = 60000;
        long repeatIntervalInMin = repeatInterval * minute;
        LOGGER.debug("Configuring jobTrigger");
        SimpleTriggerFactoryBean factoryBean = new SimpleTriggerFactoryBean();
        factoryBean.setJobDetail(moliJobDetails().getObject());
        factoryBean.setStartDelay(minute);
        factoryBean.setRepeatInterval(repeatIntervalInMin);
        factoryBean.setRepeatCount(SimpleTrigger.REPEAT_INDEFINITELY);
        factoryBean.setMisfireInstruction(SimpleTrigger.MISFIRE_INSTRUCTION_RESCHEDULE_NEXT_WITH_REMAINING_COUNT);
        return factoryBean;
    }

    @Bean(name = "moliJobDetails")
    public JobDetailFactoryBean moliJobDetails() {
        LOGGER.debug("Configuring jobDetails");
        JobDetailFactoryBean jobDetailFactoryBean = new JobDetailFactoryBean();
        jobDetailFactoryBean.setJobClass(ScheduledAutomaticMonitoringJob.class);
        jobDetailFactoryBean.setDescription("Moli_Quartz_Description");
        jobDetailFactoryBean.setDurability(true);
        jobDetailFactoryBean.setName("Moli_Quartz_Name");
        return jobDetailFactoryBean;
    }
}
application.yml
org:
  quartz:
    scheduler:
      instanceName: MyClusteredScheduler
      instanceId: AUTO
      classLoadHelper:
        class: org.quartz.simpl.ThreadContextClassLoadHelper
    threadPool:
      class: org.quartz.simpl.SimpleThreadPool
      threadCount: 10
      threadPriority: 5
      threadsInheritContextClassLoaderOfInitializingThread: true
    jobStore:
      class: org.springframework.scheduling.quartz.LocalDataSourceJobStore
      dataSource: app.dataSource
      driverDelegateClass: org.quartz.impl.jdbcjobstore.oracle.OracleDelegate
      useProperties: false
      tablePrefix: COT_QRTZ_
      misfireThreshold: 60000
      isClustered: true
      clusterCheckinInterval: 20000
Also needed this class:
import org.quartz.spi.TriggerFiredBundle;
import org.springframework.beans.factory.config.AutowireCapableBeanFactory;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.scheduling.quartz.SpringBeanJobFactory;

/**
 * Adds auto-wiring support to quartz jobs.
 */
public final class AutoWiringSpringBeanJobFactory extends SpringBeanJobFactory implements ApplicationContextAware {

    private AutowireCapableBeanFactory beanFactory;

    public void setApplicationContext(ApplicationContext applicationContext) {
        beanFactory = applicationContext.getAutowireCapableBeanFactory();
    }

    @Override
    protected Object createJobInstance(final TriggerFiredBundle bundle) throws Exception {
        final Object job = super.createJobInstance(bundle);
        beanFactory.autowireBean(job);
        return job;
    }
}

Spring batch NoClassDefFoundError: oracle/xdb/XMLType

I have a Spring Batch project which connects to an Oracle SQL database and allows exporting/importing some data via xls files.
In my job, I first do a delete in the table before importing the data.
Sometimes the job fails because there are problems in the xls to import.
For example: if I have duplicate lines, I will get a SQLException for the duplicates when the job inserts the lines into the database.
In that case I simply want to commit nothing (especially the delete part):
If the job succeeds -> commit
If the job fails -> rollback
So I found that I have to set autocommit to false.
I have my datasource loaded at the beginning of my job, so I do:
dataSource.getConnection().setAutoCommit(false);
The instruction works, but when I launch the job, I get this error:
ERROR o.s.batch.core.step.AbstractStep -
Encountered an error executing step step_excel_sheet_1551274910254 in job importExcelJob
org.springframework.beans.factory.BeanCreationException:
Error creating bean with name 'scopedTarget.xlsListener'
defined in class path resource [com/adeo/config/ImportExcelConfig.class]:
Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException:
Failed to instantiate [org.springframework.batch.core.StepExecutionListener]:
Factory method 'xlsListener' threw exception; nested exception is
java.lang.NoClassDefFoundError: oracle/xdb/XMLType
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:599)
~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
The job config is:
@Configuration
public class ImportExcelConfig {

    private static final Logger LOG = LoggerFactory.getLogger("ImportExcelConfig");

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Resource(name = "dataSource")
    private DataSource dataSource;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Bean(name = "importExcelJob")
    public Job importExcel(@Qualifier("xlsPartitionerStep") Step xlsPartitionerStep) throws Exception {
        return jobBuilderFactory.get("importExcelJob").start(xlsPartitionerStep).build();
    }

    @Bean(name = "xlsPartitionerStep")
    public Step xlsPartitionerStep(@Qualifier("xlsParserSlaveStep") Step xlsParserSlaveStep, XlsPartitioner xlsPartitioner) {
        return stepBuilderFactory.get("xls_partitioner_step_builder")
                .partitioner(xlsParserSlaveStep)
                .partitioner("xls_partitioner_step_builder", xlsPartitioner)
                .gridSize(3)
                .build();
    }

    @Bean(name = "xlsParserSlaveStep")
    @StepScope
    public Step xlsParserSlaveStep(@Qualifier("step") Step step, XlsSheetPartitioner xlsPartitioner) throws Exception {
        return stepBuilderFactory.get("sheet_partitioner_" + System.currentTimeMillis())
                .partitioner(step)
                .partitioner("sheet_partitioner_" + System.currentTimeMillis(), xlsPartitioner)
                .gridSize(3)
                .build();
    }

    @Bean(name = "step")
    @StepScope
    public Step step(@Qualifier("xlsReader") PoiItemReader xlsReader,
                     @Qualifier("jdbcWriter") ItemWriter jdbcWriter,
                     @Qualifier("xlsListener") StepExecutionListener xlsListener
    ) throws Exception {
        return ((SimpleStepBuilder) stepBuilderFactory
                .get("step_excel_sheet_" + System.currentTimeMillis())
                .<Object, Map>chunk(1000)
                .reader(xlsReader)
                .writer(jdbcWriter)
                .listener(xlsListener)
        ).build();
    }

    @Bean(name = "xlsListener")
    @StepScope
    @DependsOn
    public StepExecutionListener xlsListener() {
        XlsStepExecutionListener listener = new XlsStepExecutionListener();
        listener.setDataSource(dataSource);
        listener.afterPropertiesSet();
        return listener;
    }

    @Bean(name = "jdbcWriter")
    @StepScope
    @DependsOn
    public ItemWriter<Map> jdbcWriter(@Value("#{stepExecutionContext[sheetConfig]}") SheetConfig sheetConfig) throws IOException, ClassNotFoundException {
        JdbcBatchItemWriter<Map> writer = new JdbcBatchItemWriter<>();
        writer.setItemPreparedStatementSetter(preparedStatementSetter());
        String sql = sheetConfig.getSqlInsert().replaceAll("#TABLE#", sheetConfig.getTable());
        LOG.info(sql);
        writer.setSql(sql);
        writer.setDataSource(dataSource);
        writer.afterPropertiesSet();
        return writer;
    }

    @Bean
    @StepScope
    public ItemPreparedStatementSetter preparedStatementSetter() {
        return new ItemPreparedStatementSetter();
    }

    @Bean
    public ItemProcessor testProcessor() {
        return new TestProcessor();
    }

    @Bean(name = "xlsReader")
    @StepScope
    @DependsOn
    public PoiItemReader xlsReader(@Value("#{stepExecutionContext[sheetConfig]}") SheetConfig sheetConfig,
                                   @Value("#{stepExecutionContext[xls]}") File xlsFile) throws IOException {
        PoiItemReader reader = new PoiItemReader();
        reader.setResource(new InputStreamResource(new PushbackInputStream(new FileInputStream(xlsFile))));
        reader.setRowMapper(mapRowMapper());
        reader.setSheet(sheetConfig.getSheetIndex());
        reader.setLinesToSkip(sheetConfig.getLinesToSkip());
        return reader;
    }

    @Bean
    @StepScope
    @DependsOn
    public RowMapper mapRowMapper() throws IOException {
        return new MapRowMapper();
    }
}
The listener is:
public class XlsStepExecutionListener implements StepExecutionListener, InitializingBean {

    private final static Logger LOGGER = LoggerFactory.getLogger(XlsStepExecutionListener.class);

    @Value("#{stepExecutionContext[sheetConfig]}")
    private SheetConfig config;

    @Value("#{jobParameters['isFull']}")
    private boolean isFull;

    @Value("#{stepExecutionContext[supp]}")
    private String supp;

    private DataSource dataSource;

    @Override
    public void afterPropertiesSet() {
        Assert.notNull(dataSource, "dataSource must be provided");
    }

    @Override
    public void beforeStep(StepExecution stepExecution) {
        LOGGER.info("Start - Import sheet {}", config.sheetName);
        dataSource.getConnection().setAutoCommit(false);
        JdbcTemplate jt = new JdbcTemplate(dataSource);
        if (config.sqlDelete != null) {
            // DELETE DATA
            LOGGER.info("beforeStep - PURGE DATA " + config.getSqlDelete().replaceAll("#TABLE#", config.getTable()));
            jt.update(config.getSqlDelete().replaceAll("#TABLE#", config.getTable()), supp);
        }
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        LOGGER.info("End - Import sheet {}", config.sheetName);
        // TODO:
        // If status failed -> rollback, if status success -> commit
        return ExitStatus.COMPLETED;
    }

    public DataSource getDataSource() {
        return dataSource;
    }

    public void setDataSource(DataSource dataSource) {
        this.dataSource = dataSource;
    }
}
In the pom.xml, I have the Oracle jar:
<dependency>
    <groupId>com.oracle</groupId>
    <artifactId>ojdbc6</artifactId>
    <version>11.2.0.3</version>
</dependency>
I see that the class XMLType is in another Oracle jar, but I don't understand why I need to add that jar when I am simply changing the auto-commit mode.
Also, I see that the same exception happens for ALL the methods I can call from getConnection().XXXX, so it is not specific to auto-commit.
Thank you
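The commit-on-success behaviour described at the top of this question is usually obtained by running the delete and the inserts inside one transaction, rather than toggling auto-commit: note that `dataSource.getConnection()` borrows a fresh connection from the pool, so setting auto-commit on it does not affect the connections the batch writer actually uses. The sketch below uses Spring's TransactionTemplate; it is an illustrative alternative (the class and method names are invented), not the asker's code:

```java
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.support.TransactionTemplate;

// Hypothetical helper: delete + insert run in one transaction, so any
// SQLException during the inserts (e.g. a duplicate key) rolls the
// delete back as well.
public class ImportRunner {

    private final TransactionTemplate tx;
    private final JdbcTemplate jdbc;

    public ImportRunner(PlatformTransactionManager txManager, JdbcTemplate jdbc) {
        this.tx = new TransactionTemplate(txManager);
        this.jdbc = jdbc;
    }

    public void run(String deleteSql, String insertSql, Object[][] rows) {
        tx.execute(status -> {
            jdbc.update(deleteSql);           // rolled back if anything below fails
            for (Object[] row : rows) {
                jdbc.update(insertSql, row);  // duplicate key -> exception -> rollback
            }
            return null;
        });
    }
}
```

In a Spring Batch job the same effect is normally achieved by doing the delete inside a transactional tasklet step rather than in a StepExecutionListener, since listener callbacks run outside the chunk transaction.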
