When the Task includes two lines that open a stream and then close it, the Task returns correctly and its succeeded method runs, but the Service's setOnSucceeded handler does not run. Why?
When the two lines that open and close the stream are commented out in the Task, we do get the expected behavior: System.out.println("in the succeed method of the Service") does print.
The Main class:
package Test;
import javafx.application.Application;
import javafx.concurrent.Worker;
import javafx.stage.Stage;
public class Main extends Application {

    public static void main(String[] args) {
        launch(args);
    }

    @Override
    public void start(Stage primaryStage) {
        MyService pluginUpdateService = new MyService();
        pluginUpdateService.setOnFailed(e -> {
            System.out.println("task failed");
        });
        pluginUpdateService.setOnSucceeded(e -> {
            System.out.println("in the succeed method of the Service"); // this does not print, when MyTask opens the stream (see code of MyTask)
        });
        // launching the service
        if (pluginUpdateService.getState() == Worker.State.READY) {
            pluginUpdateService.start();
        }
    }
}
The service MyService:
package Test;
import javafx.concurrent.Service;
import javafx.concurrent.Task;

public class MyService extends Service<Void> {

    public MyService() {
    }

    @Override
    protected Task<Void> createTask() {
        Task<Void> updater = new MyTask();
        return updater;
    }
}
The task MyTask:
package Test;
import java.io.IOException;
import java.io.InputStream;
import java.net.MalformedURLException;
import java.net.URL;
import javafx.concurrent.Task;
public class MyTask extends Task<Void> {

    public MyTask() {
    }

    @Override
    protected Void call() throws MalformedURLException, IOException {
        InputStream in;
        URL url = new URL("https://clementlevallois.net");
        in = url.openStream(); // when this line is commented out, the Service does trigger its setOnSucceeded method as expected
        in.close(); // when this line is commented out, the Service does trigger its setOnSucceeded method as expected
        System.out.println("about to return from the call method of the Task");
        return null;
    }

    @Override
    protected void succeeded() {
        super.succeeded();
        System.out.println("succeeded - sent from the succeeded method of the Task");
    }

    @Override
    protected void cancelled() {
        super.cancelled();
        System.out.println("cancelled");
        updateMessage("Cancelled!");
    }

    @Override
    protected void failed() {
        super.failed();
        System.out.println("failed");
        updateMessage("Failed!");
    }
}
I am using the code sample below, where I call the cancelWF method to cancel the execution of the workflow. The doCatch method is successfully invoked with the RuntimeException("Simply cancel"), but on the Amazon SWF console the workflow does not end immediately; it waits until the timeout and ends with a WorkflowExecutionTerminated event.
The whole project is available here if you want more info.
package aws.swf;
import aws.swf.common.Constants;
import aws.swf.common.DelayRequest;
import aws.swf.common.MyActivityClient;
import aws.swf.common.MyActivityClientImpl;
import aws.swf.common.MyWorkflow;
import aws.swf.common.SWFClient;
import com.amazonaws.services.simpleworkflow.AmazonSimpleWorkflow;
import com.amazonaws.services.simpleworkflow.flow.WorkflowWorker;
import com.amazonaws.services.simpleworkflow.flow.annotations.Asynchronous;
import com.amazonaws.services.simpleworkflow.flow.core.Promise;
import com.amazonaws.services.simpleworkflow.flow.core.TryCatch;
import java.util.concurrent.CancellationException;
public class D_CancelWorkflow implements MyWorkflow {

    private TryCatch tryCatch;

    private final MyActivityClient activityClient = new MyActivityClientImpl();

    @Override
    public void sum() {
        tryCatch = new TryCatch() {
            @Override
            protected void doTry() throws Throwable {
                System.out.printf("[WF %s] Started exec\n", D_CancelWorkflow.this);
                Promise<Integer> result = activityClient.getNumWithDelay(new DelayRequest("req1", 1));
                cancelWF(result);
                newDelayRequest(result);
            }

            @Override
            protected void doCatch(Throwable e) throws Throwable {
                if (e instanceof CancellationException) {
                    System.out.printf("[WF %s] Cancelled With message [%s]\n",
                            D_CancelWorkflow.this, e.getCause().getMessage());
                } else {
                    e.printStackTrace();
                }
                rethrow(e);
            }
        };
    }

    @Asynchronous
    private void newDelayRequest(Promise<Integer> num) {
        activityClient.getNumWithDelay(new DelayRequest("req2", 1));
    }

    @Asynchronous
    private void cancelWF(Promise<Integer> ignore) {
        System.out.printf("[WF %s] Cancelling WF\n", D_CancelWorkflow.this);
        this.tryCatch.cancel(new RuntimeException("Simply cancel"));
    }

    public static void main(String[] args) throws Exception {
        AmazonSimpleWorkflow awsSwfClient = new SWFClient().getClient();
        WorkflowWorker workflowWorker =
                new WorkflowWorker(awsSwfClient, Constants.DOMAIN, Constants.TASK_LIST);
        workflowWorker.addWorkflowImplementationType(D_CancelWorkflow.class);
        workflowWorker.start();
    }
}
This is the event history for one of my executions:
I am a novice with Spring Batch.
My question is how to catch exceptions with the skip method in Spring Batch.
As far as I know, we can use the skip method to skip certain exceptions when they happen in Spring Batch.
But how can I get the exception message when using the skip method?
Somebody suggested that I use a SkipListener; this class has three callback methods like onSkipInProcess(), but it was of no use to me.
And ItemProcessListener did not work either.
The code is below (I use the skip method to ignore the exceptions and two listeners to receive the exception info):
Step mainStep = stepBuilder.get("run")
        .<ItemProcessing, ItemProcessing>chunk(5)
        .faultTolerant()
        .skip(IOException.class).skip(SocketTimeoutException.class) // skip IOException here
        .skipLimit(2000)
        .reader(reader)
        .processor(processor)
        .writer(writer)
        .listener(stepExecListener)
        .listener(new ItemProcessorListener()) // add process listener
        .listener(new SkipExceptionListener()) // add skip exception listener
        .build();
The ItemProcessorListener is below (this class implements ItemProcessListener):
public class ItemProcessorListener implements ItemProcessListener<Object, Object> {

    private static final Logger logger = LoggerFactory.getLogger(ItemProcessorListener.class);

    @Override
    public void beforeProcess(Object item) {
        // TODO Auto-generated method stub
    }

    @Override
    public void afterProcess(Object item, Object result) {
        logger.info("invoke remote finished, item={},result={}", item, result);
    }

    @Override
    public void onProcessError(Object item, Exception e) {
        logger.error("invoke remote error, item={},exception={},{}", item, e.getMessage(), e);
    }
}
The SkipExceptionListener is below (it implements SkipListener<Object, Object>):
public class SkipExceptionListener implements SkipListener<Object, Object> {

    private static final Logger logger = LoggerFactory.getLogger(SkipExceptionListener.class);

    @Override
    public void onSkipInRead(Throwable t) {
        // TODO Auto-generated method stub
    }

    @Override
    public void onSkipInWrite(Object item, Throwable t) {
        // TODO Auto-generated method stub
    }

    @Override
    public void onSkipInProcess(Object item, Throwable t) {
        logger.info("invoke remote finished,item={},itemJsonStr={},errMsg={},e={}",
                item,
                JSONObject.toJSONString(item),
                t.getMessage(),
                t);
    }
}
The issue is that none of the loggers print anything. The skip method itself actually works well: I can see the skip count in the batch_step_execution table. I am just not sure whether these two listeners are ever called back. Can anybody tell me what I should do, or is there another way? Thanks a lot.
How can I catch the exception message with the skip method in Spring Batch?
You can do that by implementing the SkipListener interface or extending the SkipListenerSupport class. All methods in the SkipListener interface have a Throwable parameter, which is the exception that was thrown and caused the item to be skipped. This is where you can get the exception message. Here is an example:
import java.util.Arrays;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.SkipListener;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.support.ListItemReader;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
@EnableBatchProcessing
public class MyJob {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    @Bean
    public ItemReader<Integer> itemReader() {
        return new ListItemReader<>(Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8, 9, 10));
    }

    @Bean
    public ItemWriter<Integer> itemWriter() {
        return items -> {
            for (Integer item : items) {
                System.out.println("item = " + item);
            }
        };
    }

    @Bean
    public ItemProcessor<Integer, Integer> itemProcessor() {
        return item -> {
            if (item.equals(7)) {
                throw new IllegalArgumentException("Sevens are not accepted!!");
            }
            return item;
        };
    }

    @Bean
    public Step step() {
        return steps.get("step")
                .<Integer, Integer>chunk(5)
                .reader(itemReader())
                .processor(itemProcessor())
                .writer(itemWriter())
                .faultTolerant()
                .skip(IllegalArgumentException.class)
                .skipLimit(3)
                .listener(new MySkipListener())
                .build();
    }

    @Bean
    public Job job() {
        return jobs.get("job")
                .start(step())
                .build();
    }

    public static class MySkipListener implements SkipListener<Integer, Integer> {

        @Override
        public void onSkipInRead(Throwable t) {
        }

        @Override
        public void onSkipInWrite(Integer item, Throwable t) {
        }

        @Override
        public void onSkipInProcess(Integer item, Throwable t) {
            System.out.println("Item " + item + " was skipped due to: " + t.getMessage());
        }
    }

    public static void main(String[] args) throws Exception {
        ApplicationContext context = new AnnotationConfigApplicationContext(MyJob.class);
        JobLauncher jobLauncher = context.getBean(JobLauncher.class);
        Job job = context.getBean(Job.class);
        jobLauncher.run(job, new JobParameters());
    }
}
In this example, MySkipListener implements SkipListener and gets the message from the exception, as you are trying to do. The example reads numbers from 1 to 10 and skips number 7. If you run the main method, you should see the following output:
item = 1
item = 2
item = 3
item = 4
item = 5
item = 6
item = 8
item = 9
item = 10
Item 7 was skipped due to: Sevens are not accepted!!
I hope this helps.
I couldn't get it to work either by implementing SkipListener (it would be nice to know why), but in the end I went with the annotation approach, which is only briefly mentioned in the docs. Somebody also had a similar issue here using the interface approach (question), and the answer there uses the annotation method instead of implementing the interface.
Example bean:
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.core.annotation.OnSkipInProcess;
import org.springframework.batch.core.annotation.OnSkipInRead;
import org.springframework.batch.core.annotation.OnSkipInWrite;
import org.springframework.stereotype.Component;

@Component
public class CustomSkippedListener {

    private static final Logger LOGGER = LoggerFactory.getLogger(CustomSkippedListener.class);

    @OnSkipInRead
    public void onSkipInRead(Throwable throwable) {
    }

    @OnSkipInWrite
    public void onSkipInWrite(FooWritingDTO fooWritingDTO, Throwable throwable) {
        LOGGER.info("blabla" + throwable.getMessage());
    }

    @OnSkipInProcess
    public void onSkipInProcess(FooLoaderDTO fooLoaderDTO, Throwable throwable) {
        LOGGER.info("blabla" + throwable.getMessage());
    }
}
Then autowire the bean and include it in the step chain as you did with your other listeners.
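For reference, registering it could look roughly like the sketch below. This is only an illustration that reuses the names from your question (stepBuilder, reader, processor, writer, ItemProcessing) and assumes the bean above is injected as customSkippedListener:
@Autowired
private CustomSkippedListener customSkippedListener;

@Bean
public Step mainStep() {
    return stepBuilder.get("run")
            .<ItemProcessing, ItemProcessing>chunk(5)
            .reader(reader)
            .processor(processor)
            .writer(writer)
            .faultTolerant()
            .skip(IOException.class)
            .skip(SocketTimeoutException.class)
            .skipLimit(2000)
            // an annotation-based listener is registered like any other listener object
            .listener(customSkippedListener)
            .build();
}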
As the question title says, I'm trying to write a bunch of files (3 in this case) to the filesystem after aggregating them and then splitting them.
How can I do this in Camel? I'm using 2.16.5, as this is the Camel version of the project I'm working on.
This is the code that, to my mind, is supposed to do this:
// reads strings and appends each one to its own file
import org.apache.camel.builder.RouteBuilder;

public class StringRoute extends RouteBuilder {

    public void configure() throws Exception {
        from("direct:readStrings")
                .split(body())
                // appending each message to a different file
                .to("file:input?fileExist=Append&fileName=${body}.txt");
    }
}
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.processor.aggregate.GroupedExchangeAggregationStrategy;
public class AggregatorRoute extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        from("file:input")
                .aggregate(new GroupedExchangeAggregationStrategy())
                .constant(true)
                .completionSize(3)
                .to("direct:splitFiles");

        from("direct:splitFiles")
                .split(body())
                .to("direct:writeFiles");
    }
}
import org.apache.camel.Exchange;
import org.apache.camel.Processor;
import org.apache.camel.builder.RouteBuilder;

public class FileWriter extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        from("direct:writeFiles")
                .process(new Processor() {
                    @Override
                    public void process(Exchange exchange) throws Exception {
                        System.out.println("In FileWriter: " + exchange.getIn().getBody());
                    }
                })
                .to("file:output?fileName=${body}.txt");
    }
}
// Main
import routes.AggregatorRoute;
import routes.FileWriter;
import routes.StringRoute;
import org.apache.camel.CamelContext;
import org.apache.camel.ProducerTemplate;
import org.apache.camel.impl.DefaultCamelContext;
import java.util.ArrayList;
import java.util.List;
public class Main {

    public static void main(String[] args) {
        CamelContext camelContext = new DefaultCamelContext();
        try {
            camelContext.addRoutes(new StringRoute());
            camelContext.start();
            ProducerTemplate producerTemplate = camelContext.createProducerTemplate();
            List<String> list = new ArrayList<String>();
            String a = "abc";
            String b = "cdb";
            String c = "efg";
            list.add(a); list.add(b); list.add(c);
            producerTemplate.sendBody("direct:readStrings", list);
            camelContext.addRoutes(new AggregatorRoute());
            camelContext.addRoutes(new FileWriter());
            Thread.sleep(10000);
            camelContext.stop();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
As you can see in the attached screenshot (result.png), there are no files in the output directory.
You must have got the exception No consumers available on endpoint: Endpoint[direct://splitFiles], because the definition of direct:splitFiles came after the point where it is referenced from the file:input route.
So, as a first step, I moved that definition up.
import java.util.ArrayList;
import java.util.List;
import org.apache.camel.Exchange;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.processor.aggregate.AggregationStrategy;
public class AggregatorRoute extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        from("direct:splitFiles")
                .split(body())
                .convertBodyTo(String.class)
                .to("direct:writeFiles");

        from("file:input")
                .convertBodyTo(String.class)
                .log("current message:: ${body}")
                .aggregate(new AggregationStrategy() {
                    @Override
                    public Exchange aggregate(Exchange oldExchange, Exchange newExchange) {
                        String newMessage = newExchange.getIn().getBody(String.class);
                        List<String> messageList = null;
                        if (oldExchange == null) {
                            messageList = new ArrayList<String>();
                        }
                        if (oldExchange != null && newExchange != null) {
                            messageList = oldExchange.getIn().getBody(List.class);
                        }
                        messageList.add(newMessage);
                        newExchange.getOut().setBody(messageList);
                        return newExchange;
                    }
                })
                .constant(true)
                .completionSize(3)
                .log("current message:: ${body}")
                .to("direct:splitFiles");
    }
}
You should use GroupedMessageAggregationStrategy instead of GroupedExchangeAggregationStrategy if you just want to aggregate the messages rather than the whole exchanges.
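If you go that way, the aggregation route could be reduced to something like the sketch below (inside configure(), with the direct:splitFiles route kept as above). This is only an illustration: it assumes a Camel version that provides org.apache.camel.processor.aggregate.GroupedMessageAggregationStrategy (check whether your 2.16.5 build has it); the aggregated exchange then carries a List of Message objects, which the existing split(body()) step iterates over:
from("file:input")
        .convertBodyTo(String.class)
        .log("current message:: ${body}")
        // the aggregated exchange body becomes a List of Message objects
        .aggregate(constant(true), new GroupedMessageAggregationStrategy())
        .completionSize(3)
        .to("direct:splitFiles");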
Finally, in Main.java I also flipped the order of route registration to get rid of the No consumers exception for direct:writeFiles:
camelContext.addRoutes(new FileWriter());
camelContext.addRoutes(new AggregatorRoute());
import io.vertx.core.AbstractVerticle;
import io.vertx.core.Vertx;
import io.vertx.ext.web.Router;
public class app extends AbstractVerticle {

    @Override
    public void start() throws Exception {
        Router router = Router.router(vertx);
        router.route().handler(routingContext -> {
            routingContext.response()
                    .putHeader("content-type", "text/html")
                    .end("hello vert.x");
        });
        vertx.createHttpServer().requestHandler(router::accept).listen(8888);
    }

    public static void main(String[] args) {
        Vertx vertx = Vertx.vertx();
        vertx.deployVerticle(new app());
    }
}
Then this error occurs: java.lang.NoSuchMethodError: io.netty.resolver.HostsFileParser.parseSilently()Ljava/util/Map;
Try it this way:
import io.vertx.core.AbstractVerticle;
import io.vertx.ext.web.Router;
public class Server extends AbstractVerticle {

    @Override
    public void start() throws Exception {
        Router router = Router.router(vertx);
        router.route().handler(routingContext -> {
            routingContext.response().putHeader("content-type", "text/html").end("hello vert.x");
        });
        vertx.createHttpServer().requestHandler(router::accept).listen(8080);
    }
}
Run mvn clean package, then java -jar /path/to/jar.
Also, app is not a conventional Java class name: class names start with a capital letter, e.g. App.
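For completeness, if you launch it from a plain main method, a minimal entry point could look like the sketch below; it simply mirrors the main method from the question, with the verticle renamed to Server and the launcher class named App (an assumed name):
import io.vertx.core.Vertx;

public class App {

    public static void main(String[] args) {
        // create a Vert.x instance and deploy the HTTP server verticle (listening on port 8080)
        Vertx vertx = Vertx.vertx();
        vertx.deployVerticle(new Server());
    }
}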