After going through @KevinBowersox's Infinite Skills course on Spring Data for Java Developers, the only part that didn't seem to work as advertised was the Async methods. Mind you, at the beginning of the course he covered both XML and Java configuration, but he proceeded to use XML configuration throughout the rest of the course, whereas I kept using Java configuration for each of the exercises and was able to get all the other parts to work. One minor difference is that I am using IntelliJ IDEA rather than STS, which he uses throughout the course.
If anyone familiar with Spring Data Async Queries or the segment of his course (https://www.safaribooksonline.com/library/view/spring-data-for/9781771375924/video241705.html) has some insight into what might be missing, please let me know.
Here are the relevant bits:
/* Application.java */
@EnableAsync
public class Application {
public static void main(String[] args) throws ParseException {
try (AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext(
DataConfiguration.class)) {
BookRepository repository = context.getBean(BookRepository.class);
// TODO: Make method Async
for (long x = 0; x < 4; x++) {
repository.findByIds(x);
}
}
}
}
/* BaseRepository.java */
@NoRepositoryBean
public interface BaseRepository<T, ID extends Serializable> extends JpaRepository<T, ID> {
@Override
@Async("executor")
List<T> findByIds(ID... ids);
}
/* ExtendedRepositoryImpl.java */
public class ExtendedRepositoryImpl<T, ID extends Serializable>
extends SimpleJpaRepository<T, ID> implements BaseRepository<T, ID> {
private JpaEntityInformation<T, ?> entityInformation;
private final EntityManager entityManager;
public ExtendedRepositoryImpl(
JpaEntityInformation<T, ?> entityInformation,
EntityManager entityManager) {
super(entityInformation, entityManager);
this.entityInformation = entityInformation;
this.entityManager = entityManager;
}
@Override
public List<T> findByIds(ID... ids) {
Query query = this.entityManager.createQuery("select e from " + this.entityInformation.getEntityName()
+ " e where e." + this.entityInformation.getIdAttribute().getName() + " in :ids");
query.setParameter("ids", Arrays.asList(ids));
long wait = new Random().nextInt(10000-1) +1;
System.out.println(wait);
try {
Thread.sleep(wait);
}
catch (InterruptedException e) {
e.printStackTrace();
}
System.out.println("Executing query for ID: " + Arrays.toString(ids));
return (List<T>) query.getResultList();
}
}
/* DataConfiguration.java (aka AppConfig.java) */
@EnableJpaRepositories(
basePackages = {"com.infiniteskills.springdata.async"},
repositoryBaseClass = com.infiniteskills.springdata.async.data.repository.ExtendedRepositoryImpl.class,
repositoryImplementationPostfix = "CustomImpl")
@EnableJpaAuditing(auditorAwareRef = "customAuditorAware")
@EnableAsync
@EnableTransactionManagement
@ComponentScan("com.infiniteskills.springdata.async")
@Configuration
public class DataConfiguration implements AsyncConfigurer {
@Bean
public CustomAuditorAware customAuditorAware() {
return new CustomAuditorAware();
}
@Bean
public DataSource dataSource() {
EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder();
return builder.setType(EmbeddedDatabaseType.H2).build();
}
@Bean
public EntityManagerFactory entityManagerFactory() {
HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
// Generate tables in database
vendorAdapter.setGenerateDdl(true);
Properties jpaProperties = new Properties();
jpaProperties.put("hibernate.hbm2ddl.auto", "create-drop");
//jpaProperties.put("hibernate.dialect", "org.hibernate.dialect.HSQLDialect");
//jpaProperties.put("hibernate.connection.driver_class", "org.h2.Driver");
// After DDL has been run, run init script to populate table with data.
jpaProperties.put("hibernate.hbm2ddl.import_files", "init.sql");
// Entity Manager Factory Bean
LocalContainerEntityManagerFactoryBean entityManagerFactoryBean = new LocalContainerEntityManagerFactoryBean();
entityManagerFactoryBean.setDataSource(dataSource());
entityManagerFactoryBean.setPackagesToScan("com.infiniteskills.springdata.async");
entityManagerFactoryBean.setJpaVendorAdapter(vendorAdapter);
entityManagerFactoryBean.setJpaProperties(jpaProperties);
entityManagerFactoryBean.afterPropertiesSet();
return entityManagerFactoryBean.getObject();
}
@Bean
public PlatformTransactionManager transactionManager() {
JpaTransactionManager transactionManager = new JpaTransactionManager();
transactionManager.setEntityManagerFactory(entityManagerFactory());
return transactionManager;
}
@Override
@Bean(name = "executor")
public Executor getAsyncExecutor() {
ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
executor.setCorePoolSize(10);
executor.setMaxPoolSize(10);
executor.setQueueCapacity(100);
executor.setThreadNamePrefix("executor-");
executor.initialize();
return executor;
}
@Override
@Bean
public AsyncUncaughtExceptionHandler getAsyncUncaughtExceptionHandler() {
return new SimpleAsyncUncaughtExceptionHandler();
}
}
The method has to have a return type of void or Future in order to be called asynchronously.
You can read up on it in the documentation here: https://docs.spring.io/spring/docs/current/spring-framework-reference/html/scheduling.html#scheduling-annotation-support-async
@EnableAsync is used on any of your @Configuration classes.
From the docs
To be used together with @Configuration classes as follows, enabling annotation-driven async processing for an entire Spring application context:
So annotate your Application class with @Configuration.
Hope this helps.
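To see why the return type matters, here is a minimal sketch in plain java.util.concurrent (no Spring; the class and method names are made up for illustration) of what the @Async proxy effectively does: it submits the method body to the executor and hands the caller a Future immediately, which is only possible when the declared return type is void or Future.

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class AsyncReturnTypeSketch {
    // Stand-in for the "executor" bean defined in DataConfiguration.
    static final ExecutorService executor = Executors.newFixedThreadPool(4);

    // Roughly what the @Async proxy does: run the body on the pool and
    // return a Future handle to the caller right away. A plain List<T>
    // return type would force the caller to block for the value.
    static Future<List<Long>> findByIdsAsync(Long... ids) {
        return executor.submit(() -> List.of(ids)); // stand-in for the real query
    }

    public static void main(String[] args) throws Exception {
        Future<List<Long>> result = findByIdsAsync(1L, 2L, 3L);
        System.out.println(result.get()); // block only when the value is needed
        executor.shutdown();
    }
}
```

With Spring, the equivalent signature change would be returning a Future (or CompletableFuture) from the repository method instead of a bare List.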
Since I kept using Java configuration in the exercise, I initially wasn't sure whether I was using all the correct annotations equivalent to the demonstrated XML configuration. And IntelliJ IDEA can offer different code suggestions than STS...
Anyway, the Spring Data Async Queries segment of the course (https://www.safaribooksonline.com/library/view/spring-data-for/9781771375924/video241705.html) made me question my sanity.
Basically, it turns out my Application.java class:
public class Application {
public static void main(String[] args) throws ParseException {
try (AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext(
DataConfiguration.class)) {
BookRepository repository = context.getBean(BookRepository.class);
// TODO: Make method Async
for (long x = 0; x < 4; x++) {
repository.findByIds(x);
}
}
}
}
should have done without the try-with-resources syntax and instead looked like this:
public class Application {
public static void main(String[] args) throws ParseException {
AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext(
DataConfiguration.class);
BookRepository repository = context.getBean(BookRepository.class);
for (long x = 0; x < 4; x++) {
repository.findByIds(x);
}
}
}
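A rough plain-Java analogue (no Spring; names invented for illustration) of why the try-with-resources version swallowed the async calls: closing the context at the end of the try block shuts down the task executor before the submitted work has had a chance to run.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class EarlyShutdownSketch {
    public static void main(String[] args) throws Exception {
        // The pool plays the role of the Spring context's async executor.
        ExecutorService pool = Executors.newFixedThreadPool(2);
        pool.submit(() -> {
            Thread.sleep(200); // stand-in for the slow repository query
            System.out.println("query finished");
            return null;
        });
        // Analogous to context.close() at the end of try-with-resources:
        // the pending task is cancelled or interrupted and never completes.
        pool.shutdownNow();
        pool.awaitTermination(1, TimeUnit.SECONDS);
        System.out.println("pool shut down");
    }
}
```

"query finished" never prints: the task is either removed from the queue or interrupted mid-sleep, which mirrors the course exercise printing nothing from the async queries.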
Related
I have created a project with Spring Boot. I have a HikariConfig that creates the data source for connection pooling, with the autocommit property set to false. I am doing batch inserts with JdbcTemplate, running inside a method annotated with @Transactional for the DataSourceTransactionManager. I am unable to see the data inserted in the DB after the program executes. If I set autocommit to true in the HikariConfig, it works fine.
@SpringBootApplication
public class Application {
public static void main(String[] args) {
SpringApplication.run(Application.class, args);
}
}
@Component
@EnableTransactionManagement
public class DataSourceConfig {
@Bean(name = "dateSourceForSqlServer")
public DataSource dataSourceForSqlServer () {
HikariConfig hikariConfig = new HikariConfig();
hikariConfig.setConnectionTimeout(10000L);
hikariConfig.setIdleTimeout(10000L);
hikariConfig.setMinimumIdle(1);
hikariConfig.setMaximumPoolSize(1);
hikariConfig.setMaxLifetime(600000L);
hikariConfig.setConnectionTestQuery("select 1");
hikariConfig.setValidationTimeout(4000L);
hikariConfig.setJdbcUrl("jdbc:sqlserver://localhost:1433;database=invt_mgmt");
hikariConfig.setUsername("sa");
hikariConfig.setPassword("sql_server_pass_123");
hikariConfig.setDriverClassName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
hikariConfig.setAutoCommit(false);
return new HikariDataSource(hikariConfig);
}
@Bean(name = "jdbcTemplateForSqlServer")
public JdbcTemplate jdbcTemplateForSqlServer () {
JdbcTemplate jdbcTemplate = new JdbcTemplate();
jdbcTemplate.setDataSource(dataSourceForSqlServer());
return jdbcTemplate;
}
@Primary
@Bean(name = "invtMgmtTxMangerForSqlServer")
public DataSourceTransactionManager transactionManager() {
DataSourceTransactionManager manager = new DataSourceTransactionManager();
manager.setDataSource(dataSourceForSqlServer());
return manager;
}
}
@Component
public class StartBean {
@Autowired
private Business business;
@PostConstruct
public void startApp() throws SQLException {
business.insertContainerHierarchy();
business.insertContainerHierarchy();
}
}
@Component
public class Business {
@Autowired
@Qualifier("jdbcTemplateForSqlServer")
private JdbcTemplate jdbcTemplateForSqlServer;
String insertIntStudent = "INSERT INTO student (id, name) Values(?, ?)";
@Transactional(value = "invtMgmtTxMangerForSqlServer")
public void insertContainerHierarchy () throws SQLException {
System.out.println(TransactionSynchronizationManager.isActualTransactionActive());
System.out.println(TransactionSynchronizationManager.getCurrentTransactionName());
Date start = new Date();
for (int i = 0; i < 500; i++) {
System.out.println(i);
jdbcTemplateForSqlServer.batchUpdate(insertIntStudent, new BatchPreparedStatementSetter() {
@Override
public void setValues(PreparedStatement ps, int i) throws SQLException {
ps.setInt(1, i);
ps.setString(2, String.valueOf(i));
}
@Override
public int getBatchSize() {
return 1000;
}
});
}
System.out.println(new Date().getTime() - start.getTime());
}
}
I have used TransactionSynchronizationManager.isActualTransactionActive(), which returns true when the method is executed.
Q1. Why is the data not getting inserted? Isn't the transaction supposed to commit automatically once the method has executed?
Q2. If a Spring transaction is being used, does the database connection's autocommit value make any difference?
Q3. How am I currently able to insert with autocommit set to true?
You are trying to invoke a transaction-wrapped proxy from within the @PostConstruct method. At that point all the initialization may be complete for that bean, but not necessarily for the rest of the context; not all proxies may be set yet.
I would suggest implementing the ApplicationListener<ContextRefreshedEvent> interface in order to trigger any data creation inside that class. This will ensure it is called after the entire context has been set up:
@Component
public class OnContextInitialized implements ApplicationListener<ContextRefreshedEvent> {
@Autowired
private Business business;
@Override
public void onApplicationEvent(ContextRefreshedEvent event) {
business.insertContainerHierarchy();
business.insertContainerHierarchy();
}
}
I have a Spring Boot project that uses datasource routing across three different datasources.
This is my configuration:
@Configuration
@EnableCaching
public class CachingConfiguration extends CachingConfigurerSupport {
@Override
public KeyGenerator keyGenerator() {
return new EnvironmentAwareCacheKeyGenerator();
}
}
--
public class DatabaseContextHolder {
private static final ThreadLocal<DatabaseEnvironment> CONTEXT =
new ThreadLocal<>();
public static void set(DatabaseEnvironment databaseEnvironment) {
CONTEXT.set(databaseEnvironment);
}
public static DatabaseEnvironment getEnvironment() {
return CONTEXT.get();
}
public static void clear() {
CONTEXT.remove();
}
}
--
@Configuration
@EnableJpaRepositories(basePackageClasses = UsuarioRepository.class,
entityManagerFactoryRef = "customerEntityManager",
transactionManagerRef = "customerTransactionManager")
@EnableTransactionManagement
public class DatasourceConfiguration {
@Bean
@ConfigurationProperties(prefix = "spring.ciclocairu.datasource")
public DataSource ciclocairuDataSource() {
return DataSourceBuilder.create().build();
}
@Bean
@ConfigurationProperties(prefix = "spring.palmas.datasource")
public DataSource palmasDataSource() {
return DataSourceBuilder.create().build();
}
@Bean
@ConfigurationProperties(prefix = "spring.megabike.datasource")
public DataSource megabikeDataSource() {
return DataSourceBuilder.create().build();
}
@Bean
@Primary
public DataSource customerDataSource() {
DataSourceRouter router = new DataSourceRouter();
final HashMap<Object, Object> map = new HashMap<>(3);
map.put(DatabaseEnvironment.CICLOCAIRU, ciclocairuDataSource());
map.put(DatabaseEnvironment.PALMAS, palmasDataSource());
map.put(DatabaseEnvironment.MEGABIKE, megabikeDataSource());
router.setTargetDataSources(map);
return router;
}
@Autowired(required = false)
private PersistenceUnitManager persistenceUnitManager;
@Bean
@Primary
@ConfigurationProperties("spring.jpa")
public JpaProperties customerJpaProperties() {
return new JpaProperties();
}
@Bean
@Primary
public LocalContainerEntityManagerFactoryBean customerEntityManager(
final JpaProperties customerJpaProperties) {
EntityManagerFactoryBuilder builder =
createEntityManagerFactoryBuilder(customerJpaProperties);
return builder.dataSource(customerDataSource()).packages(Users.class)
.persistenceUnit("customerEntityManager").build();
}
@Bean
@Primary
public JpaTransactionManager customerTransactionManager(
@Qualifier("customerEntityManager") final EntityManagerFactory factory) {
return new JpaTransactionManager(factory);
}
private JpaVendorAdapter createJpaVendorAdapter(
JpaProperties jpaProperties) {
AbstractJpaVendorAdapter adapter = new HibernateJpaVendorAdapter();
adapter.setShowSql(jpaProperties.isShowSql());
adapter.setDatabase(jpaProperties.getDatabase());
adapter.setDatabasePlatform(jpaProperties.getDatabasePlatform());
//adapter.setGenerateDdl(jpaProperties.isGenerateDdl());
return adapter;
}
private EntityManagerFactoryBuilder createEntityManagerFactoryBuilder(
JpaProperties customerJpaProperties) {
JpaVendorAdapter jpaVendorAdapter =
createJpaVendorAdapter(customerJpaProperties);
return new EntityManagerFactoryBuilder(jpaVendorAdapter,
customerJpaProperties.getProperties(), this.persistenceUnitManager);
}
}
--
public class DataSourceRouter extends AbstractRoutingDataSource {
@Override
protected Object determineCurrentLookupKey() {
if(DatabaseContextHolder.getEnvironment() == null)
DatabaseContextHolder.set(DatabaseEnvironment.CICLOCAIRU);
return DatabaseContextHolder.getEnvironment();
}
}
--
public class EnvironmentAwareCacheKeyGenerator implements KeyGenerator {
@Override
public Object generate(Object target, Method method, Object... params) {
String key = DatabaseContextHolder.getEnvironment().name() + "-" + (
method == null ? "" : method.getName() + "-") + StringUtils
.collectionToDelimitedString(Arrays.asList(params), "-");
return key;
}
}
I set the datasource using
DatabaseContextHolder.set(DatabaseEnvironment.CICLOCAIRU);
Now to the problem:
For example, take two users on different datasources, 1 and 2. One user on datasource 1 sends a request. The other user is on datasource 2, but their next request gets datasource 1 instead of datasource 2. I thought that ThreadLocal<DatabaseEnvironment> CONTEXT = new ThreadLocal<>(); was exclusive to each request, but that does not seem to be so.
I am sorry if this is not clear.
What I really need is for the DataSourceRouter to be exclusive to each request, so that one request does not interfere with another.
Am I wrong about how the DataSourceRouter works, or is my code bad?
The issue probably occurs because of the server's thread pool: you have a given number of threads, and requests are served by rolling among them.
When the server recycles a thread, the thread-local variable still holds the value set during the previous request, so you need to clear that value after each request, leaving the thread in a clean state.
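The leak can be reproduced without a servlet container at all. This is a minimal sketch (plain java.util.concurrent; the datasource name is just an example) where a single-thread pool plays the role of the container recycling one worker thread across successive requests:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ThreadLocalRecyclingDemo {
    // Same idea as DatabaseContextHolder, reduced to a runnable example.
    static final ThreadLocal<String> CONTEXT = new ThreadLocal<>();

    public static void main(String[] args) throws Exception {
        // One worker thread serving "requests" in turn, like a container pool.
        ExecutorService pool = Executors.newFixedThreadPool(1);

        // "Request 1" selects a datasource and never cleans up.
        pool.submit(() -> CONTEXT.set("CICLOCAIRU")).get();

        // "Request 2" on the recycled thread inherits the stale value.
        System.out.println("request 2 sees: " + pool.submit(() -> CONTEXT.get()).get());

        // The fix: clear the ThreadLocal at the end of every request,
        // e.g. in the finally block of a servlet filter or interceptor.
        pool.submit(() -> { CONTEXT.remove(); return null; }).get();
        System.out.println("after cleanup: " + pool.submit(() -> CONTEXT.get()).get());
        pool.shutdown();
    }
}
```

In the routing setup above, the equivalent fix is calling DatabaseContextHolder.clear() in a finally block once each request finishes.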
My Spring Batch job is started every 5 minutes - basically it reads a string, uses it as a parameter in a SQL query, and prints out the resulting list. Mostly it seems to run OK, but I notice sporadic errors in my logs every 5-10 runs:
2017-05-05 11:13:26.101 INFO 9572 --- [nio-8081-exec-8] c.u.r.s.AgentCollectorServiceImpl : Could not open JPA EntityManager for transaction; nested exception is java.lang.IllegalStateException: Transaction already active
My job is started like this from my AgentCollectorServiceImpl class:
@Override
public void addReportIds(List<Integer> reportIds) {
try {
JobParameters jobParameters = new JobParametersBuilder()
.toJobParameters();
jobLauncher.run(job, jobParameters);
} catch (Exception e) {
log.info(e.getMessage());
}
}
My BatchConfig class looks like
@Configuration
@EnableBatchProcessing
@Import(AppConfig.class)
public class BatchConfig {
@Autowired
private JobBuilderFactory jobBuilderFactory;
@Autowired
private StepBuilderFactory stepBuilderFactory;
@Autowired
private AppConfig appConfig;
@Bean
public Reader reader() {
return new Reader();
}
@Bean
public Processor processor() {
return new Processor();
}
@Bean
public Writer writer() {
return new Writer();
}
@Bean
public Job job() {
return jobBuilderFactory.get("job")
.incrementer(new RunIdIncrementer())
.flow(step1())
.end()
.build();
}
@Bean
public Step step1() {
return stepBuilderFactory.get("step1")
.<String, String> chunk(1)
.reader(reader())
.processor(processor())
.writer(writer())
.build();
}
}
My AppConfig class looks like
@Configuration
@PropertySource("classpath:application.properties")
@ComponentScan
public class AppConfig {
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
LocalContainerEntityManagerFactoryBean em = new LocalContainerEntityManagerFactoryBean();
em.setDataSource(organizationDataSource());
em.setPackagesToScan(new String[]{"com.organization.agentcollector.model"});
JpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
em.setJpaVendorAdapter(vendorAdapter);
em.setJpaProperties(additionalProperties());
return em;
}
Properties additionalProperties() {
Properties properties = new Properties();
properties.setProperty("hibernate.dialect", "com.organization.agentcollector.config.SQLServerDialectOverrider");
return properties;
}
@Bean
JpaTransactionManager transactionManager(final EntityManagerFactory emf) {
JpaTransactionManager transactionManager = new JpaTransactionManager();
transactionManager.setEntityManagerFactory(entityManagerFactory().getObject());
return transactionManager;
}
}
My Processor class looks like
public class Processor implements ItemProcessor<String, String> {
private final Logger log = LoggerFactory.getLogger(Processor.class);
@Autowired
EventReportsDAOImpl eventReportsDAOImpl;
@Override
public String process(String reportIdsJson) throws Exception {
String eventReportsJson = eventReportsDAOImpl.listEventReportsInJsonRequest(reportIdsJson);
//System.out.println(returnContent+"PROCESSOR");
return eventReportsJson;
}
}
My DAOImpl class looks like
@Component
@Transactional
public class EventReportsDAOImpl implements EventReportsDAO {
@PersistenceContext
private EntityManager em;
@Override
public EventReports getEventReports(Integer reportId) {
return null;
}
@Override
public String listEventReportsInJsonRequest(String reportIds) {
System.out.println("Event Report reportIds processing");
ArrayList<EventReports> erArr = new ArrayList<EventReports>();
String reportIdsList = reportIds.substring(1, reportIds.length() - 1);
//System.out.println(reportIdsList);
try {
StoredProcedureQuery q = em.createStoredProcedureQuery("sp_get_event_reports", "eventReportsResult");
q.registerStoredProcedureParameter("reportIds", String.class, ParameterMode.IN);
q.setParameter("reportIds", reportIdsList);
boolean isResultSet = q.execute();
erArr = (ArrayList<EventReports>) q.getResultList();
} catch (Exception e) {
System.out.println("No event reports found for list " + reportIdsList);
}
return erArr.toString();
}
}
I thought Spring would manage transactions automatically. The error seems to suggest that a transaction is not being properly closed?
One thing I tried was removing all @Transactional annotations from my code, as I read that @EnableBatchProcessing already injects a transaction manager into each step - but when I did this, I saw the 'Transaction already active' error much more frequently.
Any advice on how to fix this is appreciated, thank you!
The @Transactional annotation establishes a transactional scope that dictates when a transaction starts and ends, also called its boundary. If you operate outside of this boundary, you'll receive errors.
First off, I found this bit of documentation the most helpful on Spring transactions: http://docs.spring.io/spring-framework/docs/4.2.x/spring-framework-reference/html/transaction.html specifically this section
Secondly, you may wish to enable trace level logs and potentially the SQL statements to help debug this. In order to do so I added the following to my application.properties:
spring.jpa.properties.hibernate.show_sql=false
spring.jpa.properties.hibernate.use_sql_comments=true
spring.jpa.properties.hibernate.format_sql=true
spring.jpa.properties.hibernate.type=trace
spring.jpa.show-sql=true
logging.level.org.hibernate=TRACE
There will be a LOT of output here, but you'll get a good idea of what's happening behind the scenes.
Third, and the most important part for me in learning how to use @Transactional: every call to the DAO creates a new session - or reuses the existing session if it is within the same transactional scope. Refer to the documentation above for examples of this.
I have a QuartzJobConfig class where I register my Spring Quartz beans.
I followed the instructions for SchedulerFactoryBean, JobDetailFactoryBean and CronTriggerFactoryBean.
My jobs are configured in a YAML file outside the application, which means I have to create the beans dynamically when the application starts.
My Config:
channelPartnerConfiguration:
channelPartners:
- code: Job1
jobConfigs:
- schedule: 0 * * ? * MON-FRI
name: Job1 daily
hotel: false
allotment: true
enabled: true
- schedule: 30 * * ? * MON-FRI
name: Job2 weekly
hotel: true
allotment: false
enabled: true
...
My Config Class:
@Configuration
public class QuartzJobConfig implements IJobClass {
@Autowired
ChannelPartnerProperties channelPartnerProperties;
@Autowired
private ApplicationContext applicationContext;
@Bean
public SchedulerFactoryBean quartzScheduler() {
SchedulerFactoryBean quartzScheduler = new SchedulerFactoryBean();
quartzScheduler.setOverwriteExistingJobs(true);
quartzScheduler.setSchedulerName("-scheduler");
AutowiringSpringBeanJobFactory jobFactory = new AutowiringSpringBeanJobFactory();
jobFactory.setApplicationContext(applicationContext);
quartzScheduler.setJobFactory(jobFactory);
// point 1
List<Trigger> triggers = new ArrayList<>();
for(ChannelPartner ch : channelPartnerProperties.getChannelPartners()){
for(JobConfig jobConfig : ch.getJobConfigs()){
triggers.add(jobTrigger(ch, jobConfig).getObject());
}
}
quartzScheduler.setTriggers(triggers.stream().toArray(Trigger[]::new));
return quartzScheduler;
}
@Bean
public JobDetailFactoryBean jobBean(ChannelPartner ch, JobConfig jobConfig) {
JobDetailFactoryBean jobDetailFactoryBean = new JobDetailFactoryBean();
jobDetailFactoryBean.setJobClass(findJobByConfig(jobConfig));
jobDetailFactoryBean.setGroup("mainGroup");
jobDetailFactoryBean.setName(jobConfig.getName());
jobDetailFactoryBean.setBeanName(jobConfig.getName());
jobDetailFactoryBean.getJobDataMap().put("channelPartner", ch);
return jobDetailFactoryBean;
}
@Bean
public CronTriggerFactoryBean jobTrigger(ChannelPartner ch, JobConfig jobConfig) {
CronTriggerFactoryBean cronTriggerFactoryBean = new CronTriggerFactoryBean();
cronTriggerFactoryBean.setJobDetail(jobBean(ch, jobConfig).getObject());
cronTriggerFactoryBean.setCronExpression(jobConfig.getSchedule());
cronTriggerFactoryBean.setGroup("mainGroup");
return cronTriggerFactoryBean;
}
@Override
public Class<? extends Job> findJobByConfig(JobConfig jobConfig) {
if(isAllotmentJob(jobConfig) && isHotelJob(jobConfig)){
return HotelAndAllotmentJob.class;
}
if(isAllotmentJob(jobConfig)){
return AllotmentJob.class;
}
if(isHotelJob(jobConfig)){
return HotelJob.class;
}
return HotelAndAllotmentJob.class;
}
private boolean isAllotmentJob(JobConfig jobConfig){
return jobConfig.isAllotment();
}
private boolean isHotelJob(JobConfig jobConfig) {
return jobConfig.isHotel();
}
}
My problem is that the creation of the beans inside the iteration (point 1) happens just one time. After the first iteration it no longer goes into the jobTrigger(ch, jobConfig) method (more or less understandable because of the bean name, if I am right).
What I was thinking: because I use Spring's Quartz factories, the jobDetailFactoryBean.setBeanName() method could be used to create more beans with different names.
I am not sure how to solve this problem. The code works and the first created job executes correctly, but I need more jobs.
How can I create the different jobs dynamically?
Edit:
My full configuration classes:
@Configuration
@ConfigurationProperties(prefix = "channelPartnerConfiguration", locations = "classpath:customer/channelPartnerConfiguration.yml")
public class ChannelPartnerProperties {
@Autowired
private List<ChannelPartner> channelPartners;
public List<ChannelPartner> getChannelPartners() {
return channelPartners;
}
public void setChannelPartners(List<ChannelPartner> channelPartners) {
this.channelPartners = channelPartners;
}
}
@Configuration
public class ChannelPartner {
private String code;
private String contracts;
private Boolean includeSpecialContracts;
private String touroperatorCode = "EUTO";
@Autowired
private PublishConfig publishConfig;
@Autowired
private BackupConfig backupConfig;
@Autowired
private List<JobConfig> jobConfigs;
//getter/setter
@Configuration
public class JobConfig {
private String schedule;
private boolean hotelEDF;
private boolean allotmentEDF;
private boolean enabled;
private String name;
//getter/setter
Added project to github for better understanding of the problem
The reason your list contains null values is that the getObject method you are calling should return the CronTrigger, which is only initialized in the afterPropertiesSet method called by Spring when it has finished initializing the context. You can call this method manually on your CronTriggerFactoryBean yourself, which allows you to keep it as a private method:
// Just to clarify, no annotations here
private CronTriggerFactoryBean jobTrigger(ChannelPartner ch, JobConfig jobConfig) throws ParseException {
CronTriggerFactoryBean cronTriggerFactoryBean = new CronTriggerFactoryBean();
cronTriggerFactoryBean.setJobDetail(jobBean(ch, jobConfig).getObject());
cronTriggerFactoryBean.setCronExpression(jobConfig.getSchedule());
cronTriggerFactoryBean.setGroup("mainGroup");
cronTriggerFactoryBean.setBeanName(jobConfig.getName() + "Trigger");
cronTriggerFactoryBean.afterPropertiesSet();
return cronTriggerFactoryBean;
}
I'm sure there are many other ways of doing this as well; as you mentioned, you did a workaround for it. If this is not what you want or need, I can look some more to see if I can find a better way.
Your jobTrigger() and jobBean() methods are not actual beans but factory methods that you are using, given some inputs, to construct CronTriggers and JobDetails to register in the loop found in your quartzScheduler bean by invoking triggers.add(..).
Remove the @Bean and @Scope annotations from the jobTrigger() and jobBean() methods (ideally reduce their visibility too: package-private if not private) and you should be good to go.
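Why the loop only ever produced one trigger can be shown with a toy model (plain Java, no Spring; the cache and names are illustrative only) of how Spring serves singleton @Bean methods from a per-name cache:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

public class SingletonBeanCachingSketch {
    // Toy model of Spring's singleton @Bean handling: calls to a @Bean method
    // are intercepted and answered from a per-name cache, so invoking it in a
    // loop keeps returning the first instance instead of creating new ones.
    static final Map<String, Object> singletonCache = new HashMap<>();

    static Object beanMethod(String beanName, Supplier<Object> factory) {
        return singletonCache.computeIfAbsent(beanName, n -> factory.get());
    }

    public static void main(String[] args) {
        Object first = beanMethod("jobTrigger", Object::new);
        Object second = beanMethod("jobTrigger", Object::new);
        System.out.println("@Bean method reused instance: " + (first == second));

        // A plain (non-@Bean) factory method creates a fresh object per call,
        // which is what the loop in quartzScheduler() actually needs.
        Object third = new Object();
        Object fourth = new Object();
        System.out.println("plain factory reused instance: " + (third == fourth));
    }
}
```

This is why dropping @Bean from jobTrigger() and jobBean() makes the loop work: each call then really constructs a new object.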
After many different tries to get this code working, I found a working solution. It's just a workaround, but maybe it gives some hints toward finding the right, non-workaround solution.
What I did:
I changed all my @Configuration classes to @Component, except ChannelPartnerProperties and QuartzJobConfig.
I put @Scope(scopeName = ConfigurableBeanFactory.SCOPE_PROTOTYPE) on my jobBean() and jobTrigger() methods.
I deleted the method parameters of both.
I don't have @Scope(scopeName = ConfigurableBeanFactory.SCOPE_PROTOTYPE) anywhere else in my code.
I created three counters: one for iterating the channelPartners, one for the jobConfigs, and one for the trigger group names.
I don't use the local objects in my loops anymore, but use the counters to get the right objects from my @Autowired channelPartnerProperties, which holds all the entries of my YAML file.
After that my QuartzJobConfig class looks like this:
@Configuration
public class QuartzJobConfig implements IJobClass {
private static int channelPartnerCount = 0;
private static int jobCount = 0;
private static int groupCounter = 0;
@Autowired
ChannelPartnerProperties channelPartnerProperties;
@Autowired
private ApplicationContext applicationContext;
@Bean
public SchedulerFactoryBean quartzScheduler() {
SchedulerFactoryBean quartzScheduler = new SchedulerFactoryBean();
quartzScheduler.setOverwriteExistingJobs(true);
quartzScheduler.setSchedulerName("-scheduler");
AutowiringSpringBeanJobFactory jobFactory = new AutowiringSpringBeanJobFactory();
jobFactory.setApplicationContext(applicationContext);
quartzScheduler.setJobFactory(jobFactory);
List<CronTrigger> triggers = new ArrayList<>();
for (ChannelPartner ch : channelPartnerProperties.getChannelPartners()) {
for (JobConfig jobConfig : ch.getJobConfigs()) {
triggers.add(jobTrigger().getObject());
jobCount++;
groupCounter++;
}
channelPartnerCount++;
jobCount = 0;
}
quartzScheduler.setTriggers(triggers.stream().toArray(Trigger[]::new));
return quartzScheduler;
}
@Bean
@Scope(scopeName = ConfigurableBeanFactory.SCOPE_PROTOTYPE)
public JobDetailFactoryBean jobBean() {
JobDetailFactoryBean jobDetailFactoryBean = new JobDetailFactoryBean();
jobDetailFactoryBean.setJobClass(findJobByConfig(
channelPartnerProperties.getChannelPartners().get(channelPartnerCount).getJobConfigs().get(jobCount)));
jobDetailFactoryBean.setGroup("mainGroup" + groupCounter);
jobDetailFactoryBean.setName(channelPartnerProperties.getChannelPartners().get(channelPartnerCount)
.getJobConfigs().get(jobCount).getName());
jobDetailFactoryBean.setBeanName(channelPartnerProperties.getChannelPartners().get(channelPartnerCount)
.getJobConfigs().get(jobCount).getName());
jobDetailFactoryBean.getJobDataMap().put("channelPartner",
channelPartnerProperties.getChannelPartners().get(channelPartnerCount));
return jobDetailFactoryBean;
}
@Bean
@Scope(scopeName = ConfigurableBeanFactory.SCOPE_PROTOTYPE)
public CronTriggerFactoryBean jobTrigger() {
CronTriggerFactoryBean cronTriggerFactoryBean = new CronTriggerFactoryBean();
cronTriggerFactoryBean.setJobDetail(jobBean().getObject());
cronTriggerFactoryBean.setCronExpression(channelPartnerProperties.getChannelPartners().get(channelPartnerCount)
.getJobConfigs().get(jobCount).getSchedule());
cronTriggerFactoryBean.setGroup("mainGroup" + groupCounter);
cronTriggerFactoryBean.setBeanName(channelPartnerProperties.getChannelPartners().get(channelPartnerCount)
.getJobConfigs().get(jobCount).getName() + "Trigger" + groupCounter);
return cronTriggerFactoryBean;
}
@Override
public Class<? extends Job> findJobByConfig(JobConfig jobConfig) {
if (isAllotmentJob(jobConfig) && isHotelJob(jobConfig)) {
return HotelAndAllotmentEdfJob.class;
}
if (isAllotmentJob(jobConfig)) {
return AllotmentEdfJob.class;
}
if (isHotelJob(jobConfig)) {
return HotelEdfJob.class;
}
return HotelAndAllotmentEdfJob.class;
}
private boolean isAllotmentJob(JobConfig jobConfig) {
return jobConfig.isAllotmentEDF();
}
private boolean isHotelJob(JobConfig jobConfig) {
return jobConfig.isHotelEDF();
}
}
All the jobs defined in my YAML configuration get initialized and executed as defined.
It's a working solution, but a workaround. Maybe we'll find a better one.
I am trying to write a test for custom spring data repository. I'm also using QueryDSL.
I am new to spring-data. I use Spring's support for an HSQL DB in testing, and MySQL for dev.
Problem: I do not see updated data in tests if I use custom repository.
public interface AuctionRepository extends AuctionRepositoryCustom, CrudRepository<Auction, Long>, QueryDslPredicateExecutor<Auction> {
// needed for spring data crud
}
.
public interface AuctionRepositoryCustom {
long renameToBestName();
}
.
public class AuctionRepositoryImpl extends QueryDslRepositorySupport implements AuctionRepositoryCustom {
private static final QAuction auction = QAuction.auction;
public AuctionRepositoryImpl() {
super(Auction.class);
}
@Override
public long renameToBestName() {
return update(auction)
.set(auction.name, "BestName")
.execute();
}
}
My test
Somehow it fails at the last line:
public class CustomAuctionRepositoryImplTest extends AbstractIntegrationTest {
@Inject
AuctionRepository auctionRepository;
@Test
public void testDoSomething() {
Auction auction = auctionRepository.findOne(26L);
assertEquals("EmptyName", auction.getName());
// test save
auction.setName("TestingSave");
auctionRepository.save(auction);
Auction saveResult = auctionRepository.findOne(26L);
assertEquals("TestingSave", saveResult.getName());
// test custom repository
long updatedRows = auctionRepository.renameToBestName();
assertTrue(updatedRows > 0);
Auction resultAuction = auctionRepository.findOne(26L);
assertEquals("BestName", resultAuction.getName()); // FAILS expected:<[BestNam]e> but was:<[TestingSav]e>
}
}
I can't figure out why the data doesn't update when using the custom repository. If I start the application in dev mode and call renameToBestName() through a controller, everything works as expected and the name changes.
Below is Test Configuration if needed
@RunWith(SpringJUnit4ClassRunner.class)
@Transactional
@ActiveProfiles("test")
@ContextConfiguration(classes = {TestBeans.class, JpaConfig.class, EmbeddedDataSourceConfig.class})
@ComponentScan(basePackageClasses = IntegrationTest.class, excludeFilters = @Filter({Configuration.class}))
public abstract class AbstractIntegrationTest {
}
.
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(basePackageClasses = Application.class)
class JpaConfig {
@Value("${hibernate.dialect}")
private String dialect;
@Value("${hibernate.hbm2ddl.auto}")
private String hbm2ddlAuto;
@Value("${hibernate.isShowSQLOn}")
private String isShowSQLOn;
@Autowired
private DataSource dataSource;
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
LocalContainerEntityManagerFactoryBean entityManagerFactory = new LocalContainerEntityManagerFactoryBean();
entityManagerFactory.setDataSource(dataSource);
entityManagerFactory.setPackagesToScan("auction");
entityManagerFactory.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
Properties jpaProperties = new Properties();
jpaProperties.put(org.hibernate.cfg.Environment.DIALECT, dialect);
if ( !hbm2ddlAuto.isEmpty()) {
jpaProperties.put(org.hibernate.cfg.Environment.HBM2DDL_AUTO, hbm2ddlAuto);
}
jpaProperties.put(org.hibernate.cfg.Environment.SHOW_SQL, isShowSQLOn);
jpaProperties.put(org.hibernate.cfg.Environment.HBM2DDL_IMPORT_FILES_SQL_EXTRACTOR, "org.hibernate.tool.hbm2ddl.MultipleLinesSqlCommandExtractor");
entityManagerFactory.setJpaProperties(jpaProperties);
return entityManagerFactory;
}
@Bean
public PlatformTransactionManager transactionManager() {
return new JpaTransactionManager();
}
}
This is because the update query issued through your code is defined not to evict the objects it potentially touches from the EntityManager. Read more on that in this answer.
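The failing assertion can be reproduced without JPA at all. This is a toy analogue (plain collections; the method names mirror the question but the code is illustrative only) of a first-level cache that a bulk update bypasses, and of how clearing the context (what EntityManager.clear() would do after the bulk query) makes the subsequent findOne see the new value:

```java
import java.util.HashMap;
import java.util.Map;

public class BulkUpdateStalenessSketch {
    // The "persistence context" caches loaded entities by id, while a bulk
    // UPDATE writes straight to the "database" table.
    static final Map<Long, String> database = new HashMap<>();
    static final Map<Long, String> persistenceContext = new HashMap<>();

    static String findOne(Long id) {
        // First-level cache: a managed instance wins over the database row.
        return persistenceContext.computeIfAbsent(id, database::get);
    }

    static void renameToBestName() {
        // Bulk update: bypasses the cache entirely, like the QueryDSL update().
        database.replaceAll((id, name) -> "BestName");
    }

    public static void main(String[] args) {
        database.put(26L, "TestingSave");
        System.out.println(findOne(26L));  // loads and caches "TestingSave"
        renameToBestName();
        System.out.println(findOne(26L));  // still the stale cached value
        persistenceContext.clear();        // analogue of EntityManager.clear()
        System.out.println(findOne(26L));  // now sees "BestName"
    }
}
```

The second print is the test failure from the question: the repository's findOne(26L) returns the still-managed, stale entity rather than re-reading the row the bulk update changed.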