Retrieving Jobs and/or Processes - Java

Is there an easy way to retrieve a job and check e.g. the status with Play?
I have a few encoding jobs/downloading jobs which run for a long time. In some cases I want to cancel them.
Is there a way to retrieve a list of Jobs or something?
E.g. one job calls the FFMPEG encoder using ProcessBuilder. I would like to be able to get this job and kill the process if it is no longer required (e.g. the wrong file was uploaded and I don't want to wait an hour for it to finish). If I can get a handle to the job, then I can get to the process as well.
I am using Play 1.2.4

See JobsPlugin.java to see how to list all the scheduledJobs.
Getting the task currently being executed is trickier, but you can find your jobs in the JobsPlugin.scheduledJobs list by checking each entry's class, then call a method on your custom Job to tell it to cancel.
Something like:
for (Job<?> job : JobsPlugin.scheduledJobs) {
    if (job instanceof MyJob) {
        ((MyJob) job).cancelWork();
    }
}
where cancelWork is your custom method
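For completeness, here is a minimal sketch of what such a job could look like, assuming the job keeps a reference to the external process (the field, the cancelWork name, and the ffmpeg arguments are illustrative):

import play.jobs.Job;

public class MyJob extends Job {

    private volatile Process process;

    @Override
    public void doJob() throws Exception {
        // Start the external encoder and block this job until it finishes.
        ProcessBuilder pb = new ProcessBuilder("ffmpeg", "-i", "in.mp4", "out.mp4");
        process = pb.start();
        process.waitFor();
    }

    // Custom cancellation hook: destroys the external process if it is running.
    public void cancelWork() {
        Process p = process;
        if (p != null) {
            p.destroy();
        }
    }
}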

Related

Is this the correct waiting strategy when sending commands with Testcontainers?

I am using Testcontainers' DockerComposeContainer and sending shell commands using the execInContainer method once my containers are up and running:
@ClassRule
public static DockerComposeContainer<?> environment =
        new DockerComposeContainer<>(new File("docker-compose.yml"))
                .withExposedService(DB_1, DB_PORT)
                .waitingFor(SERVICE_1, Wait.defaultWaitStrategy())
                .withLocalCompose(true);
One of the commands is to simply move a file which will then be processed and I want to wait until the process inside the container processes it before checking the results in my test.
service.execInContainer("cp", "my-file.zip", "/home/user/dir");
The way I check whether the process has consumed my-file.zip, once it has been moved, is to inspect the logs:
String log = "";
while (!log.contains("Moving my-file.zip file to /home/user/dir")) {
    ExecResult cat = service.execInContainer("cat", "/my-service/logs/service.log");
    log = cat.getStdout();
}
This works, but I don't like the constant polling inside the while loop and was wondering whether there is a better way to achieve this.
I've been looking inside Testcontainers, and it makes use of the Java docker-java API, so I wondered if there is a better way to do this via that API, or whether I could do the waiting using a library like Awaitility, e.g. something like the sketch below.
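A bounded poll with Awaitility might look like this (just a sketch; the timeout and poll interval are arbitrary assumptions):

import static org.awaitility.Awaitility.await;

import java.time.Duration;

// Poll the container log with a bounded timeout instead of a tight loop.
await()
        .atMost(Duration.ofMinutes(2))
        .pollInterval(Duration.ofSeconds(2))
        .until(() -> service
                .execInContainer("cat", "/my-service/logs/service.log")
                .getStdout()
                .contains("Moving my-file.zip file to /home/user/dir"));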
Thanks for any suggestions

AS400 job queue via Java jt400

I am writing an interface between a Java application and an AS400.
For this purpose I use jt400. I have managed to get information about the system status, such as CPU usage, and to retrieve the current status of subsystems and jobs.
Now I am searching for a way to look at the different job queues inside the AS400.
For example: I would like to know how many jobs are in which queue.
Is there a solution via jt400, or a different approach to access this information via Java?
The corresponding command inside the AS400 is WRKJOBQ.
Best
LStrike
[Edit]
The following code is my filter for JobList. But how do I configure QSYSObjectPathName so that it matches WRKJOBQ?
QSYSObjectPathName path = new QSYSObjectPathName(.....);
JobList jList = new JobList(as400);
jList.addJobSelectionCriteria(JobList.SELECTION_PRIMARY_JOB_STATUS_JOBQ, true);
jList.addJobSelectionCriteria(JobList.SELECTION_JOB_QUEUE, path.getPath());
Job[] jobs = jList.getJobs(-1, 1);
System.out.println("Jobs Size: " + jobs.length);
You can use a JobList object for that, using SELECTION_JOB_QUEUE to filter the jobs.
Once your selection suits your needs, JobList#getLength() will give you the number of jobs.
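Putting it together, a minimal sketch of counting the jobs on one queue (the system name, credentials, and the QGPL/QBATCH queue are placeholders):

import com.ibm.as400.access.AS400;
import com.ibm.as400.access.JobList;
import com.ibm.as400.access.QSYSObjectPathName;

// Build the IFS path of the job queue, e.g. /QSYS.LIB/QGPL.LIB/QBATCH.JOBQ.
AS400 as400 = new AS400("mySystem", "myUser", "myPassword");
QSYSObjectPathName path = new QSYSObjectPathName("QGPL", "QBATCH", "JOBQ");

JobList jList = new JobList(as400);
// Only jobs whose primary status is "waiting on a job queue".
jList.addJobSelectionCriteria(JobList.SELECTION_PRIMARY_JOB_STATUS_JOBQ, Boolean.TRUE);
jList.addJobSelectionCriteria(JobList.SELECTION_JOB_QUEUE, path.getPath());
jList.load();
System.out.println("Jobs on QBATCH: " + jList.getLength());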

jBPM6 - Asynchronous work item and retry

Let me come directly to the use case.
I have a number of work items in my process, say A, B, C. They execute in A -> B -> C order.
In my case, B is a call to a 3rd-party web service. C should run only if B succeeds. If the call to the web service fails, the system should retry after 5 minutes, with the number of retries limited to 3.
How can I achieve this using jBPM6?
Some options that I understand from the docs are:
1) I can use a work item handler. Inside the work item, I start another thread which does the retries and finally calls the completeWorkItem() method. But in this case my process engine thread will wait unnecessarily for the completeWorkItem() call.
2) I can use a Command for the retry. But if I call a command, it will execute in another thread and the process thread will continue to C, which is not desirable.
How can I create a process so that B executes in the background and notifies the engine when it can continue executing C?
Please advise.
Thanks in advance.
Please comment if my question is not clear enough to answer.
Your question is not completely clear; however, here is an answer that will hopefully provide some clarity.
For asynchronous execution, you should follow the guidelines in the documentation: jBPM 6.0 Async Documentation
Given your process flow, if you use a Command and a process defined as A->B->C, then C will not start until the command completes.
To have commands run in parallel, you use parallel branches. For example, if Script1 and Script2 were commands, they would execute in parallel, and Email would only execute once both scripts complete.
A command signals completion by simply returning from its execute method:
public ExecutionResults execute(CommandContext ctx) throws Exception {
    // Set results if they exist. Otherwise, return empty ExecutionResults.
    ExecutionResults results = new ExecutionResults();
    // This would match the name of an output parameter for the work item.
    // results.setData("result", "result data");
    logger.info("Command finished execution: " + this.getClass());
    logger.debug("Results of executing command: {}", results);
    return results;
}
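On option 1 from the question: the handler does not actually have to block the engine thread. executeWorkItem can return immediately, and completeWorkItem can be called later from a separate executor once the retries finish; the process then stays at node B until completion is signalled. A rough sketch, assuming the session is still available when the work item is completed (the class name, retry policy, and callWebService are illustrative):

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

import org.kie.api.runtime.process.WorkItem;
import org.kie.api.runtime.process.WorkItemHandler;
import org.kie.api.runtime.process.WorkItemManager;

public class RetryingWebServiceHandler implements WorkItemHandler {

    private final ScheduledExecutorService executor = Executors.newScheduledThreadPool(1);

    @Override
    public void executeWorkItem(WorkItem workItem, WorkItemManager manager) {
        // Returning immediately keeps the engine thread free; the process
        // waits on this node until completeWorkItem is called.
        attempt(workItem, manager, 1);
    }

    private void attempt(WorkItem workItem, WorkItemManager manager, int attemptNo) {
        executor.submit(() -> {
            try {
                callWebService(workItem); // placeholder for the 3rd-party call
                manager.completeWorkItem(workItem.getId(), null); // engine continues to C
            } catch (Exception e) {
                if (attemptNo < 3) {
                    // Retry after 5 minutes, up to 3 attempts in total.
                    executor.schedule(() -> attempt(workItem, manager, attemptNo + 1),
                            5, TimeUnit.MINUTES);
                } else {
                    manager.abortWorkItem(workItem.getId());
                }
            }
        });
    }

    @Override
    public void abortWorkItem(WorkItem workItem, WorkItemManager manager) {
        // Cancellation hook; nothing to clean up in this sketch.
    }

    private void callWebService(WorkItem workItem) throws Exception {
        // Placeholder for the actual web service call.
    }
}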
Alternatively, add an XOR gateway after node B, and add a script to node B that sets the status and retry count of the web service call (on success, status_b = true; on failure, status_b = false and retry_count++).
The XOR gateway goes to C if retry_count >= 3 or status_b == true;
otherwise it goes back to B.

Spring Batch (java-config): executing a step after a JobExecutionDecider

I'm trying to configure a Flow in Spring Batch using java-config. This flow basically has to:
1. Execute an init step (which adds a record in the database);
2. Execute a decider to check file existence;
2.1. If the files exist, execute the load job (which is another flow with a bunch of steps in parallel);
3. Execute a finish step (which adds a record in the database); this should always run, even if 2.1 was not executed.
I tried to do this configuration, but the finish step never runs:
Flow flow = new FlowBuilder<SimpleFlow>("commonFlow")
        .start(stepBuilderFactory.get("initStep").tasklet(initTasklet).build())
        .next(decider)
        .on(FlowExecutionStatus.COMPLETED.getName())
        .to(splitFlow)
        .from(decider).on("*")
        .end()
        .next(stepBuilderFactory.get("finishStep").tasklet(finishTasklet).build())
        .end();
I'm able to make it work as below, but it is not elegant at all:
Step finishStep = stepBuilderFactory.get("finishStep").tasklet(finishTasklet).build();
Flow flow = new FlowBuilder<SimpleFlow>("commonFlow")
        .start(stepBuilderFactory.get("initStep").tasklet(initTasklet).build())
        .next(decider)
        .on(FlowExecutionStatus.COMPLETED.getName())
        .to(splitFlow)
        .next(finishStep)
        .from(decider).on("*")
        .to(finishStep)
        .end();
Does anybody know the right way to execute a step after a decision using java-config?
It sounds like this is being made MUCH more complicated than it needs to be. You do not need to configure a flow or decider. This is a VERY simple in-and-out job.
The simplest option is to use Spring Integration to detect the presence of a file and trigger the job.
The next simplest option is to have a Quartz or cron job check for the file and start the batch job.
Last but not least, you can have the job run and, if the ItemReader cannot find the file(s), just swallow the exception; or set a FileItemReader listener to check for files in its before method.
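On the last option, FlatFileItemReader can already tolerate a missing input file if strict mode is disabled, so no exception needs to be swallowed at all. A small sketch (the file path and line mapper are placeholders):

import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.PassThroughLineMapper;
import org.springframework.core.io.FileSystemResource;

// With strict = false, a missing resource is logged instead of failing the
// step; the reader simply returns no items and the job carries on.
FlatFileItemReader<String> reader = new FlatFileItemReader<>();
reader.setResource(new FileSystemResource("/data/input/my-file.csv"));
reader.setLineMapper(new PassThroughLineMapper());
reader.setStrict(false);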
You can use two separate Flows to achieve this.
Flow flowWithDecider = new FlowBuilder<SimpleFlow>("flow-with-decider")
        .start(decider)
        .on("ok")
        .to(okStep1)
        .next(okStep2)
        .from(decider)
        .on("failed")
        .to(failedStep)
        .build();

Flow commonFlow = new FlowBuilder<SimpleFlow>("common-flow")
        .start(commonStep)
        .build();

Job job = jobs
        .get("my-job")
        .start(flowWithDecider)
        .next(commonFlow) // this will execute after the previous flow, regardless of the decision
        .end()
        .build();

Extract JobID etc. from a Hadoop job

I am running a Hadoop jar file inside a cluster. From the documentation, I know that Hadoop manages the JobID, start time, etc. Is it possible to get these parameters so that we can show them on our web interface, just to let the user know how much time the job will consume (e.g. an estimated duration)?
All the details shown in the JobTracker UI can be obtained easily by using the APIs provided.
For the JobClient API, refer to https://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapred/JobClient.html
and for the JobStatus API, refer to https://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapred/JobStatus.html
Using a combination of JobClient and JobStatus (jobsToComplete(), getAllJobs()) you can retrieve the JobID. Once you have the JobID, you can easily get all the other details by calling the functions in the API.
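For example, a minimal sketch of listing jobs and their start times via the old mapred API (it assumes the cluster configuration is available on the classpath):

import java.io.IOException;

import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.JobStatus;

public class JobLister {
    public static void main(String[] args) throws IOException {
        // Picks up the cluster settings from the configuration on the classpath.
        JobConf conf = new JobConf();
        JobClient client = new JobClient(conf);

        for (JobStatus status : client.getAllJobs()) {
            System.out.println("Job " + status.getJobID()
                    + " started at " + status.getStartTime()
                    + ", map " + status.mapProgress()
                    + ", reduce " + status.reduceProgress());
        }
    }
}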
