I am following the example from this tutorial to run tasks in parallel using @EnableScheduling.
Here is my code:
@EnableScheduling
@Configuration
public class MyJobScheduler implements SchedulingConfigurer {

    @Bean(destroyMethod = "shutdown")
    public Executor taskExecutor() {
        return Executors.newScheduledThreadPool(5);
    }

    @Override
    public void configureTasks(ScheduledTaskRegistrar taskRegistrar) {
        taskRegistrar.setScheduler(taskExecutor());
    }

    @Bean
    @Scope("prototype")
    public MyJobExecutor createMyJobExecutor() {
        return new MyJobExecutor();
    }

    @PostConstruct
    public void registerServices() {
        for (int i = 0; i < 5; i++) {
            createMyJobExecutor();
        }
    }
}
public class MyJobExecutor {

    private static final Logger logger = LoggerFactory.getLogger(MyJobExecutor.class);

    @Autowired
    MyService myService;

    @Scheduled(fixedDelayString = "10000", initialDelayString = "30000")
    public void runJob() {
        try {
            logger.info("MyJobExecutor executing...");
            myService.myJobTask();
        } catch (Exception e) {
            logger.error("MyJobExecutor failed.", e);
        }
    }
}
@EnableScheduling
@Configuration
public class MyJobScheduler2 implements SchedulingConfigurer {

    @Bean(destroyMethod = "shutdown")
    public Executor taskExecutor() {
        return Executors.newScheduledThreadPool(2);
    }

    @Override
    public void configureTasks(ScheduledTaskRegistrar taskRegistrar) {
        taskRegistrar.setScheduler(taskExecutor());
    }

    @Bean
    @Scope("prototype")
    public MyJobExecutor2 createMyJobExecutor() {
        return new MyJobExecutor2();
    }

    @PostConstruct
    public void registerServices() {
        for (int i = 0; i < 2; i++) {
            createMyJobExecutor();
        }
    }
}
public class MyJobExecutor2 {

    private static final Logger logger = LoggerFactory.getLogger(MyJobExecutor2.class);

    @Autowired
    MyService2 myService;

    @Scheduled(fixedDelayString = "10000", initialDelayString = "30000")
    public void runJob() {
        try {
            logger.info("MyJobExecutor2 executing...");
            myService.myJobTask();
        } catch (Exception e) {
            logger.error("MyJobExecutor2 failed.", e);
        }
    }
}
When I have just one scheduler configured, MyJobScheduler, I get the expected number of jobs running in parallel. However, I need 2 scheduled jobs. When I add MyJobScheduler2, here are the logs...
21:03:25.291 [pool-2-thread-1] INFO c.t.m.e.s.e.p.p.MyJobExecutor2 - MyJobExecutor2 executing...
21:03:25.459 [pool-2-thread-2] INFO c.t.m.e.s.e.p.p.MyJobExecutor2 - MyJobExecutor2 executing...
21:03:25.537 [pool-2-thread-1] INFO c.t.m.e.s.e.p.p.MyJobExecutor - MyJobExecutor executing...
21:03:25.680 [pool-2-thread-2] INFO c.t.m.e.s.e.p.p.MyJobExecutor - MyJobExecutor executing...
I was expecting MyJobScheduler to use 5 threads and MyJobScheduler2 to use 2 threads. It seems like both of these schedulers are sharing the same thread pool and getting limited to 2 threads, rather than each getting its own separate pool. What could be going on here?
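For what it's worth, a possible explanation that matches these logs: both configuration classes declare a @Bean method named taskExecutor, so the two definitions share one bean name (and, depending on the bean-overriding settings, one silently overrides the other), and there is only a single ScheduledTaskRegistrar, so the last configureTasks() call decides which executor every @Scheduled method uses. A minimal sketch, assuming the goal is simply to let all seven jobs run in parallel (the class name below is illustrative, not from the tutorial), is one SchedulingConfigurer with a pool sized for every job:

// Illustrative sketch: every @Scheduled method runs on the single scheduler
// registered with the ScheduledTaskRegistrar, so size one pool for all jobs.
@EnableScheduling
@Configuration
public class SchedulingConfig implements SchedulingConfigurer {

    @Bean(destroyMethod = "shutdown")
    public ScheduledExecutorService scheduledTaskExecutor() {
        // 5 threads for the MyJobExecutor jobs + 2 for the MyJobExecutor2 jobs
        return Executors.newScheduledThreadPool(7);
    }

    @Override
    public void configureTasks(ScheduledTaskRegistrar taskRegistrar) {
        taskRegistrar.setScheduler(scheduledTaskExecutor());
    }
}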
I am completely new to Spring. I am trying to call a method on a separate thread using Spring's @Async annotation. This is the code I have tried after looking around a bit:
public class MyClass {

    private static final Logger LOGGER = LoggerFactory.getLogger(MyClass.class);

    @Async("threadPoolTaskExecutor")
    public void asyncMethodWithVoidReturnType() {
        System.out.println("Hello from sout! Execute method asynchronously. " + Thread.currentThread().getName());
        LOGGER.info("Hello from logger! Execute method asynchronously. " + Thread.currentThread().getName());
    }
}
and the runner:
@SpringBootApplication
@EnableAsync
public class SpringTestRunner {

    @Bean("threadPoolTaskExecutor")
    public Executor getAsyncExecutor() {
        return new ThreadPoolTaskExecutor();
    }

    public static void main(String[] args) {
        SpringApplication.run(SpringTestRunner.class, args);
    }
}
When running the main class, I do not see any output from the thread. Why is that?
As you can tell, this is a very basic question, so please explain as much as you can.
You don't see any output because you never call your asyncMethodWithVoidReturnType method. Here is an example that triggers it from a scheduler:
Writer:
@Slf4j
@Service
public class AsyncWriter {

    @Async("threadPoolTaskExecutor")
    @SneakyThrows
    public void asyncWrite() {
        log.info("Hello! " + Thread.currentThread().getName());
        Thread.sleep(1000);
    }
}
Scheduler:
@Slf4j
@Service
@RequiredArgsConstructor
public class Scheduler {

    private final AsyncWriter asyncWriter;

    @Scheduled(fixedDelay = 1000)
    public void scheduledWrite() {
        log.info("Scheduler");
        for (int i = 0; i < 10; i++) {
            asyncWriter.asyncWrite();
        }
    }
}
Runner:
@SpringBootApplication
@EnableAsync
@EnableScheduling
public class SpringTestRunner {

    @Bean("threadPoolTaskExecutor")
    public Executor getAsyncExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(10);
        return executor;
    }

    public static void main(String[] args) {
        SpringApplication.run(SpringTestRunner.class, args);
    }
}
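As a side note, a minimal alternative sketch (not part of the answer above), assuming the goal is just to see the async method run once at startup: make MyClass a Spring bean (for example with @Service) and call it through an injected reference, since @Async only applies when the call goes through the Spring proxy.

@SpringBootApplication
@EnableAsync
public class SpringTestRunner {

    @Bean("threadPoolTaskExecutor")
    public Executor getAsyncExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(10);
        return executor;
    }

    // Hypothetical trigger: runs once after startup and calls the async method
    // through the Spring-managed MyClass bean so the @Async proxy is applied.
    @Bean
    public CommandLineRunner asyncTrigger(MyClass myClass) {
        return args -> myClass.asyncMethodWithVoidReturnType();
    }

    public static void main(String[] args) {
        SpringApplication.run(SpringTestRunner.class, args);
    }
}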
I am not really familiar with Spring's live reload feature. However, I noticed that every time I save changes to my Java code, Java Mission Control (JMC) or Eclipse's debug mode shows that the thread I had already spawned is spawned again, resulting in two threads (one from before the reload plus one after it).
I am currently using a ThreadPoolTaskExecutor to spawn my threads; basically I am letting Spring manage them. In this case, how do I force shutdown/interrupt of the threads I spawned when a live reload occurs?
Below is my source.
ApplicationThreadingConfiguration.java
@Configuration
public class ApplicationThreadingConfiguration {

    private static final Logger logger = LoggerFactory.getLogger(ApplicationThreadingConfiguration.class);

    @Autowired
    MyProperties prop;

    @Bean(name = "myThread")
    public TaskExecutor taskExecutor() {
        logger.info(prop.toString());
        final ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(prop.getCorePoolSize());
        executor.setMaxPoolSize(prop.getMaxPoolSize());
        executor.setQueueCapacity(prop.getQueueCapacity());
        executor.setThreadNamePrefix("MyThread-");
        executor.setWaitForTasksToCompleteOnShutdown(true);
        executor.initialize();
        return executor;
    }
}
AppEventConfiguration.java (this spawns my threads after the Spring context is loaded)
@Configuration
public class AppEventConfiguration {

    private static final Logger logger = LoggerFactory.getLogger(AppEventConfiguration.class);

    @Autowired
    private MyService service;

    @Autowired
    private ApplicationContext context;

    @Autowired
    @Qualifier("myThread")
    private TaskExecutor executor;

    @EventListener(ApplicationReadyEvent.class)
    public void onApplicationReadyEvent() {
        service.getSomethingFromDB().stream().forEach(dto -> {
            logger.info("Id: {}", dto.getId());
            MyRunnableThread t = this.context.getBean(MyRunnableThread.class);
            t.setMyId(dto.getId());
            executor.execute(t);
        });
    }
}
MyRunnableThread.java
@Component
@Scope("prototype")
public class MyRunnableThread implements Runnable {

    private static final Logger logger = LoggerFactory.getLogger(MyRunnableThread.class);

    private long myId;

    @Override
    public void run() {
        while (true) {
            try {
                logger.info("Do something here on ID: {}", this.myId);
                Thread.sleep(5000);
            } catch (InterruptedException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
        }
    }

    public void setMyId(long id) {
        this.myId = id;
    }
}
myproperties.properties
myprop.core-pool-size=80
myprop.max-pool-size=100
myprop.queue-capacity=100
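A hedged sketch of one way to get there, reusing the names from the question: when the application context closes (which is what a restart does to the old context), Spring destroys the ThreadPoolTaskExecutor bean; its shutdown only interrupts running tasks when waitForTasksToCompleteOnShutdown is false, and the Runnable itself has to treat the interrupt as a signal to exit rather than swallow it.

// 1) In ApplicationThreadingConfiguration: do not wait for running tasks on
//    shutdown, so the underlying pool is stopped with shutdownNow() and the
//    worker threads receive an interrupt when the context closes.
executor.setWaitForTasksToCompleteOnShutdown(false);

// 2) In MyRunnableThread: leave the loop once interrupted instead of looping on.
@Override
public void run() {
    while (!Thread.currentThread().isInterrupted()) {
        try {
            logger.info("Do something here on ID: {}", this.myId);
            Thread.sleep(5000);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // restore the flag and exit the loop
        }
    }
}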
I have created a simple service like the one below, from which I have to take some data from the database later.
package com.spring.scheduler.example.springscheduler;

import org.springframework.stereotype.Service;

@Service
public class ExampleService {

    private String serviceName;
    private String repository;

    public String getServiceName() {
        return serviceName;
    }

    public void setServiceName(String serviceName) {
        this.serviceName = serviceName;
    }

    public String getRepository() {
        return repository;
    }

    public void setRepository(String repository) {
        this.repository = repository;
    }
}
Here is my task scheduler created to run different threads.
@Component
public class TaskSchedulerService {

    @Autowired
    ThreadPoolTaskScheduler threadPoolTaskScheduler;

    public ScheduledFuture<?> job1;
    public ScheduledFuture<?> job2;

    @Autowired
    ApplicationContext applicationContext;

    @PostConstruct
    public void job1() {
        //NewDataCollectionThread thread1 = new NewDataCollectionThread();
        NewDataCollectionThread thread1 = applicationContext.getBean(NewDataCollectionThread.class);
        AutowireCapableBeanFactory factory = applicationContext.getAutowireCapableBeanFactory();
        factory.autowireBean(thread1);
        factory.initializeBean(thread1, null);
        job1 = threadPoolTaskScheduler.scheduleAtFixedRate(thread1, 1000);
    }
}
This is the thread I am trying to call from the scheduler. I tried to create the service instance forcibly by using the application context, but it is not created.
@Configurable
@Scope("prototype")
public class NewDataCollectionThread implements Runnable {

    private static final Logger LOGGER = LoggerFactory.getLogger(NewDataCollectionThread.class);

    @Autowired
    private ExampleService exampleService;

    @Override
    public void run() {
        LOGGER.info("Called from thread : NewDataCollectionThread");
        System.out.println(Thread.currentThread().getName() + " The Task1 executed at " + new Date());
        try {
            exampleService.setRepository("sdasdasd");
            System.out.println("Service Name :: " + exampleService.getServiceName());
            Thread.sleep(10000);
        } catch (InterruptedException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
Kindly suggest possible ways to achieve this.
Try something like this:
@Autowired
private AutowireCapableBeanFactory beanFactory;

@PostConstruct
public void job1() {
    NewDataCollectionThread thread1 = new NewDataCollectionThread();
    beanFactory.autowireBean(thread1);
    job1 = threadPoolTaskScheduler.scheduleAtFixedRate(thread1, 1000);
}
With this, the ExampleService was injected into NewDataCollectionThread successfully.
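For completeness, the @Autowired ThreadPoolTaskScheduler in TaskSchedulerService assumes a scheduler bean is defined somewhere; a minimal sketch of such a configuration (class name, pool size, and prefix are illustrative, not from the original code):

@Configuration
public class SchedulerConfig {

    @Bean
    public ThreadPoolTaskScheduler threadPoolTaskScheduler() {
        ThreadPoolTaskScheduler scheduler = new ThreadPoolTaskScheduler();
        scheduler.setPoolSize(5);                       // illustrative size
        scheduler.setThreadNamePrefix("TaskScheduler-");
        return scheduler;
    }
}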
I'm trying to figure out why my scheduled jobs are not executed in parallel. Maybe there is something wrong with my transaction management? The method JobScheduledExecutionService.execute() is annotated with @Scheduled(fixedRate = 250), so it should be fired every 250 ms regardless of whether the previous job has finished. According to the logs, it is not working as expected.
Logs: https://pastebin.com/M6FaXpeE
My code is below.
@Service
@Slf4j
public class JobExecutionService {

    private final TransactionalJobExecutionService transactionalJobExecutionService;

    @Autowired
    public JobExecutionService(TransactionalJobExecutionService transactionalJobExecutionService) {
        this.transactionalJobExecutionService = transactionalJobExecutionService;
    }

    public void execute() {
        TestJob job = transactionalJobExecutionService.getJob();
        executeJob(job);
        transactionalJobExecutionService.finishJob(job);
    }

    private void executeJob(TestJob testJob) {
        log.debug("Execution-0: {}", testJob.toString());
        Random random = new Random();
        try {
            Thread.sleep(random.nextInt(3000) + 200);
        } catch (InterruptedException e) {
            log.error("Error", e);
        }
        log.debug("Execution-1: {}", testJob.toString());
    }
}
@Service
@Slf4j
public class JobScheduledExecutionService {

    private final JobExecutionService jobExecutionService;

    @Autowired
    public JobScheduledExecutionService(JobExecutionService jobExecutionService) {
        this.jobExecutionService = jobExecutionService;
    }

    @Scheduled(fixedRate = 250)
    public void execute() {
        log.trace("Job fired");
        jobExecutionService.execute();
    }
}
@Service
@Slf4j
@Transactional
public class TransactionalJobExecutionService {

    private final Environment environment;
    private final TestJobRepository testJobRepository;
    private final TestJobResultRepository testJobResultRepository;

    @Autowired
    public TransactionalJobExecutionService(Environment environment, TestJobRepository testJobRepository, TestJobResultRepository testJobResultRepository) {
        this.environment = environment;
        this.testJobRepository = testJobRepository;
        this.testJobResultRepository = testJobResultRepository;
    }

    public TestJob getJob() {
        TestJob testJob = testJobRepository.findFirstByStatusOrderByIdAsc(0);
        testJob.setStatus(1);
        testJobRepository.save(testJob);
        return testJob;
    }

    public void finishJob(TestJob testJob) {
        testJobResultRepository.save(
            new TestJobResult(
                null,
                testJob.getId(),
                environment.getProperty("local.server.port")
            )
        );
    }
}
@Configuration
public class SchedulingConfigurerConfiguration implements SchedulingConfigurer {

    @Override
    public void configureTasks(ScheduledTaskRegistrar taskRegistrar) {
        ThreadPoolTaskScheduler taskScheduler = new ThreadPoolTaskScheduler();
        taskScheduler.setPoolSize(32);
        taskScheduler.initialize();
        taskRegistrar.setTaskScheduler(taskScheduler);
    }
}
The reason is that the scheduler fires only one event at a time, which is executed by a single thread, and I don't see you spawning multiple threads in your logic for parallel execution. The call to jobExecutionService.execute() inside execute() of JobScheduledExecutionService runs on that one thread, so overall it ends up being sequential execution.
It seems you need to put multi-threaded (Callable/Future based) logic in JobExecutionService.execute(): pick the job via transactionalJobExecutionService.getJob() and call executeJob() from there, as sketched below. Hope this helps.
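A minimal sketch of that suggestion (the dedicated pool and its size are assumptions, not part of the original code): let the @Scheduled method only dispatch work, and run the actual job on a separate executor so executions can overlap.

@Service
@Slf4j
public class JobScheduledExecutionService {

    private final JobExecutionService jobExecutionService;
    // Hypothetical dedicated pool for job execution; the size is arbitrary.
    private final ExecutorService jobPool = Executors.newFixedThreadPool(8);

    @Autowired
    public JobScheduledExecutionService(JobExecutionService jobExecutionService) {
        this.jobExecutionService = jobExecutionService;
    }

    @Scheduled(fixedRate = 250)
    public void execute() {
        log.trace("Job fired");
        // submit() returns immediately, so the 250 ms trigger is not blocked;
        // the Future could be kept if the result needs to be inspected later.
        Future<?> result = jobPool.submit(jobExecutionService::execute);
    }
}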
I am using Spring 4. I use this to execute a task periodically for WebSockets:
private TaskScheduler scheduler = new ConcurrentTaskScheduler();
In my class:
@Configuration
@EnableWebSocketMessageBroker
@EnableScheduling
@Component
public class WebSocketConfig implements WebSocketMessageBrokerConfigurer {

    @Autowired
    private SimpMessagingTemplate template;

    private TaskScheduler scheduler = new ConcurrentTaskScheduler();

    public void registerStompEndpoints(StompEndpointRegistry registry) {
        registry.addEndpoint("/simplemessages").withSockJS();
    }

    public void configureMessageBroker(MessageBrokerRegistry config) {
        config.enableSimpleBroker("/topic/", "/queue/");
        config.setApplicationDestinationPrefixes("/app");
    }

    @PostConstruct
    private void broadcastTimePeriodically() {
        scheduler.scheduleAtFixedRate(new Runnable() {
            public void run() {
                String statStr = "Server Response" + new Date();
                System.out.println("thread schedular run time :" + Hello.printTime());
                try {
                    template.convertAndSend("/topic/simplemessagesresponse", statStr);
                } catch (MessagingException e) {
                    System.err.println("!!!!!! websocket timer error :>" + e.toString());
                }
            }
        }, 4000);
    }

    @PreDestroy
    private void destroyServices() {
        scheduler = null; // how to destroy ?
    }

    public void configureClientInboundChannel(ChannelRegistration registration) {
    }

    public void configureClientOutboundChannel(ChannelRegistration registration) {
        registration.taskExecutor().corePoolSize(4).maxPoolSize(10);
    }

    public boolean configureMessageConverters(List<MessageConverter> arg0) {
        // TODO Auto-generated method stub
        return true;
    }

    @Override
    public void configureWebSocketTransport(WebSocketTransportRegistration arg0) {
    }
}
I want to know two things:
I found that the scheduled task runs twice within 4000 milliseconds. How is that happening and how can I stop it?
I run this application in Tomcat. As you can see, the method destroyServices() needs to destroy the scheduler. The problem is that even after Tomcat is restarted, the previously running thread keeps running. When Tomcat goes down, that thread should be terminated as well. How can I destroy it when Tomcat shuts down or the system crashes?
The following code snippet is from the documentation of @EnableScheduling:
@Configuration
@EnableScheduling
public class AppConfig implements SchedulingConfigurer {

    @Override
    public void configureTasks(ScheduledTaskRegistrar taskRegistrar) {
        taskRegistrar.setScheduler(taskExecutor());
    }

    @Bean(destroyMethod = "shutdown")
    public Executor taskExecutor() {
        return Executors.newScheduledThreadPool(100);
    }
}
I think you should get the bean named taskExecutor (in this case) and call its shutdown method (the exact method depends on your configuration).
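Building on that, a hedged sketch (class and bean names are illustrative): manage the scheduler as a Spring bean instead of creating it with new ConcurrentTaskScheduler(), so the container shuts it down when the web application context is destroyed (including on Tomcat shutdown), and inject it into WebSocketConfig rather than constructing it there. As for the task firing twice, one assumption worth checking is whether the configuration class is registered in more than one application context (it carries both @Configuration and @Component), since the @PostConstruct method would then schedule the task once per context.

@Configuration
@EnableScheduling
public class BroadcastSchedulerConfig {

    // ThreadPoolTaskScheduler is lifecycle-aware: the container shuts it down
    // when the application context is closed.
    @Bean(destroyMethod = "shutdown")
    public ThreadPoolTaskScheduler broadcastScheduler() {
        ThreadPoolTaskScheduler scheduler = new ThreadPoolTaskScheduler();
        scheduler.setPoolSize(1);
        scheduler.setThreadNamePrefix("broadcast-");
        return scheduler;
    }
}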