Play Framework await() makes the application act weird - java

I am having some strange trouble with the method await(Future future) of the Controller.
Whenever I add an await call anywhere in my code, some GenericModels that have nothing to do with where I placed the await start loading incorrectly, and I cannot access any of their attributes.
The weirdest part is that if I change something in a completely different Java file anywhere in the project, Play recompiles (I guess) and from that moment everything works perfectly, until I clean tmp again.

When you use await in a controller, Play uses bytecode enhancement to split a single method so it can run in two steps, on two different threads. This is pretty cool, but definitely one of the 'black magic' tricks of Play 1. It is also one place where Play often acts weird and requires a restart (or, as you found, some code change); the other place it can act strange is when you change a Model class.
http://www.playframework.com/documentation/1.2.5/asynchronous#SuspendingHTTPrequests
To make it easier to deal with asynchronous code we have introduced continuations. Continuations allow your code to be suspended and resumed transparently. So you write your code in a very imperative way, as:

public static void computeSomething() {
    Promise<String> delayedResult = veryLongComputation(…);
    String result = await(delayedResult);
    render(result);
}

In fact here, your code will be executed in 2 steps, in 2 different threads. But as you see it, it's very transparent for your application code.
Using await(…) and continuations, you could write a loop:
public static void loopWithoutBlocking() {
    for(int i=0; i<=10; i++) {
        Logger.info(i);
        await("1s");
    }
    renderText("Loop finished");
}
Using only one thread (which is the default in development mode) to process requests, Play is able to run these loops concurrently for several requests at the same time.
To respond to your comment:
public static void generatePDF(Long reportId) {
    Report report = Report.findById(reportId); // look up the report (assumed model lookup)
    Promise<InputStream> pdf = new ReportAsPDFJob(report).now();
    InputStream pdfStream = await(pdf);
    renderBinary(pdfStream);
}
ReportAsPDFJob is simply a Play Job class with doJobWithResult overridden, so it returns the object. See http://www.playframework.com/documentation/1.2.5/jobs for more on jobs.
Calling job.now() returns a Future/Promise, which you can use like this: await(job.now()).
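For reference, here is a minimal sketch of what such a job might look like; the Report type and the renderPdf(...) call are placeholders, not from the original post:

import java.io.InputStream;
import play.jobs.Job;

public class ReportAsPDFJob extends Job<InputStream> {

    private final Report report; // hypothetical model class

    public ReportAsPDFJob(Report report) {
        this.report = report;
    }

    @Override
    public InputStream doJobWithResult() throws Exception {
        // Runs on a job thread; the controller's await(job.now()) resumes
        // once this method returns.
        return renderPdf(report); // placeholder for your PDF generation call
    }
}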

Related

Fast file handling in multithreading mode

I am developing a web application in Java. It has a file that is read on every page request, and I occasionally (rarely) need to change it. I want everything to work as quickly as possible. Any advice?
You can use a sort of copy-on-change. If your file is /path/myFile_v1.txt, write code similar to:
private AtomicInteger version = new AtomicInteger(1);

public String getFilePath() {
    return "/path/myFile_v" + version.get() + ".txt";
}

public synchronized void makeAChange() {
    // create a new copy of the file with some changes, then publish it
    version.incrementAndGet();
}
You can remove old copies after some time, or remove version - 2 each time you publish a new one. Reading threads are never blocked while you make changes.
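A self-contained sketch of this copy-on-change idea, with illustrative class and method names that are not from the original answer:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.concurrent.atomic.AtomicInteger;

public class VersionedFile {

    private final AtomicInteger version = new AtomicInteger(1);

    // Called on every page request: readers always see a complete, immutable copy.
    public String read() throws IOException {
        return new String(Files.readAllBytes(Paths.get(path(version.get()))));
    }

    // Called rarely: write the new content to a fresh file, then publish it by
    // bumping the version counter, so readers switch over atomically.
    public synchronized void update(String newContent) throws IOException {
        int next = version.get() + 1;
        Files.write(Paths.get(path(next)), newContent.getBytes());
        version.set(next);
        Files.deleteIfExists(Paths.get(path(next - 2))); // drop an old copy
    }

    private String path(int v) {
        return "/path/myFile_v" + v + ".txt";
    }
}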

How to cancel a Nexus task properly (as an executable script)?

I have written a cleanup script for a Nexus repository that verifies whether a component is still used by calling a REST interface of a third-party system. Long story short: this script has a significant runtime due to the number of components and the network traffic, so it needs to be cancelable, and with this requirement my problems begin.
The script was written in Groovy and executed manually through the task interface of Nexus, using a new task created from the "Execute script" skeleton. When I try to stop the task, the log only states:
admin org.sonatype.nexus.quartz.internal.task.QuartzTaskJob - Task not cancelable: 'TEST' [script]
Unfortunately the example scripts for Nexus are all short and simple, and none of them uses the cancel feature itself.
So I looked up the source of the suggested QuartzTaskJob and saw that it is a wrapper around a Nexus task and should be used automatically by the QuartzSchedulerSPI.
Therefore I implemented a dummy version of it, using the shipped tasks (e.g. RestoreMetadataTask) as examples. The first attempt had a sleep() in it, the second the nested loop you see in the code below. Both attempts had the same sad end.
import org.sonatype.nexus.scheduling.Cancelable
import org.sonatype.nexus.scheduling.TaskSupport

public class CancelableTask
    extends TaskSupport
    implements Cancelable
{
    @Override
    public String getMessage() {
        return null
    }

    @Override
    protected Void execute() throws Exception {
        log.info("Start TimeTestCancel")
        for (long i = 0; i < 1000000000000L; i++) {
            for (long j = 0; j < 1000000000000L; j++) {
                if (isCanceled()) {
                    log.info("Canceled at " + i)
                    break
                }
            }
        }
        log.info("Finished TimeTestCancel")
        return null
    }
}
Sleep and the loops: when testing on Nexus 3.7.1-02, the task just changed its status to "Blocked", showing the "Stop" button but stating "Task not cancelable" again when pressing it. I was not able to delete or change the task afterwards.
Just the loops: when testing on Nexus 3.21.1-01, it executes so fast that I cannot even try to cancel it.
So I basically ask myself: what am I missing? Is there a way to have cancelable jobs as a Groovy script at all? Or do I have to implement a Nexus plugin to achieve my goal?
Not all tasks are cancelable. You could file an issue at issues.sonatype.com to ask for this; however, the scripting interface has a security issue and is no longer enabled in newer versions of NXRM3. I suspect it will ultimately be replaced by the REST API.

Only start a function when the previous function has completed

I have 3 methods that are called from the front end. You can only call a function once you have called the previous one, but you don't necessarily need to call all of them. So you may call just f1, or f1 -> f2, or f1 -> f2 -> f3.
My problem is that on the front end you can click on a function before the previous one has even stopped running. I need each function to finish before the next one starts.
What I'm doing at the moment, which works, is busy-waiting until the previous function has finished, but I'd like a nicer solution:
f1 {
    ready1 = false
    ...
    ready1 = true
}

f2 {
    ready2 = false
    while (!ready1) { Thread.sleep(250); }
    ...
    ready2 = true
}

f3 {
    while (!ready2) { Thread.sleep(250); }
    ...
}
Is there an easy way to do this?
It sounds like you're using a web framework; if you mention which one, it probably has some built-in tools for this. One option is to use the built-in Java concurrency utilities, for example:
import java.util.concurrent.CountDownLatch;

class NoLookingBack {

    private final CountDownLatch latchB = new CountDownLatch(1);

    public void methodA() {
        // do work.
        latchB.countDown();
    }

    public void methodB() {
        try {
            latchB.await();
        } catch (InterruptedException e) {
            // do something, or declare that this method throws.
            return;
        }
        // safe to run methodB's work here: methodA has completed.
    }
}
This pattern extends to more methods by adding more latches as necessary.
Your example is flawed, and so is this solution: if methodA fails, methodB will block forever. The latch gives you the option of a timeout, after which you can return a response that indicates a failure.
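A small sketch of that timeout variant, continuing the NoLookingBack class above (the 30-second value is arbitrary):

import java.util.concurrent.TimeUnit;

public void methodB() throws InterruptedException {
    // Wait at most 30 seconds for methodA to finish instead of blocking forever.
    boolean completed = latchB.await(30, TimeUnit.SECONDS);
    if (!completed) {
        // methodA did not finish in time (or failed): report an error instead of hanging.
        throw new IllegalStateException("methodA did not complete in time");
    }
    // proceed with methodB's work
}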

Simultaneous downloading of webpages/files in EJB (Java)

I have a small problem with creating threads in EJB. OK, I understand why I cannot use them in EJB, but I don't know how to replace them with the same functionality. I am trying to download 30-40 web pages/files and I need to start downloading all of them at (approximately) the same time, because if I run them in a single thread, one after another, it takes more than 3 minutes.
I tried the @Asynchronous annotation, but nothing happened.
public void execute(String lang2, String lang1, int number) {
    Stopwatch timer = new Stopwatch().start();
    htmlCodes.add(URL2String(URLs.get(number)));
    timer.stop();
    System.out.println(number + ": " + Thread.currentThread().getName() + " " + timer.elapsedMillis() + " milliseconds");
}

private void findMatches(String searchedWord, String lang1, String lang2) {
    articles = search(searchedWord);
    for (int i = 0; i < articles.size(); i++) {
        execute(lang1, lang2, i);
    }
}
Here are two really good SO answers that can help: one gives you your options, and the other explains why you shouldn't spawn threads in an EJB. The problem with the first answer is that it doesn't cover the EJB 3.0 options in much depth, so here's a tutorial on using @Asynchronous.
No offense, but I don't see any evidence in your code that you've read this tutorial yet. Your asynchronous method should return a Future. As the tutorial says:
The client may retrieve the result using one of the Future.get methods. If processing hasn’t been completed by the session bean handling the invocation, calling one of the get methods will result in the client halting execution until the invocation completes. Use the Future.isDone method to determine whether processing has completed before calling one of the get methods.
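As a rough illustration of that pattern (the PageDownloader bean and downloadPage method below are placeholder names, not from the question):

import java.io.InputStream;
import java.net.URL;
import java.util.Scanner;
import java.util.concurrent.Future;
import javax.ejb.AsyncResult;
import javax.ejb.Asynchronous;
import javax.ejb.Stateless;

@Stateless
public class PageDownloader {

    // Runs on the container's async thread pool; the caller gets a Future immediately.
    @Asynchronous
    public Future<String> downloadPage(String url) throws Exception {
        try (InputStream in = new URL(url).openStream();
             Scanner scanner = new Scanner(in).useDelimiter("\\A")) {
            String html = scanner.hasNext() ? scanner.next() : "";
            return new AsyncResult<>(html);
        }
    }
}

The caller then starts all 30-40 downloads first, collecting the Futures, and only afterwards calls Future.get() on each one, so the downloads run concurrently instead of queuing behind each other.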

Apache JCI FilesystemAlterationMonitor processes changes for existing folder contents on startup

I am using Apache JCI's FAM (FilesystemAlterationMonitor) in a Java OSGi service to monitor and handle changes in the file system. Everything seems to be working fairly well, except that whenever I start the service (which starts FAM using the code below), FAM reports ALL the files and directories that already exist in the watched directory.
Currently I am watching /tmp
/tmp includes a subtree: /tmp/foo/bar/cat/dog
Every time I start the service (which starts FAM), it reports DirectoryCreate events for:
/tmp/foo
/tmp/foo/bar
/tmp/foo/bar/cat
/tmp/foo/bar/cat/dog
Even if no changes have been made to any part of that subtree.
Code run on service activation:
File watchFolder = new File("/tmp");
watchFolder.mkdirs();
fam = new FilesystemAlterationMonitor();
fam.setInterval(1000);
fam.addListener(watchFolder, listener);
fam.start();
// I've already tried adding:
listener.waitForFirstCheck();
Listener example:
private FileChangeListener listener = new FileChangeListener() {
public void onDirectoryChange(File pDir) { System.out.println(pDir.getAbsolutePath()); }
public void onDirectoryCreate(File pDir) { System.out.println(pDir.getAbsolutePath()); }
...
}
Yes, that's one very annoying feature of JCI: when monitoring starts, it notifies you of all the files and directories it finds, with calls to onXxxCreate(). I think you have the following options:
After starting the monitoring, wait for some time (a couple of seconds) in your FileChangeListener callback implementation before you actually process the events coming from JCI. That's what I did in a project and it works fairly well, although there is the possibility of missing an actual file creation that happens within the "grace period".
Take the sources of JCI and modify them to add two new event methods, onDirectoryFound(File) and onFileFound(File), that are only fired for files and directories found on startup of the monitoring.
Take a look at java.nio.file.WatchService, which comes with Java 7. IMO the best option, as it uses native methods internally to be notified of changes by the OS instead of starting a thread and checking periodically; with JCI, you may see delays of several seconds until changes are propagated to your callbacks. A minimal sketch follows below.
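This is a rough WatchService sketch for the /tmp example above, not from the original answer; note that only direct children of the registered directory are reported, so subdirectories would need to be registered as well:

import java.nio.file.FileSystems;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardWatchEventKinds;
import java.nio.file.WatchEvent;
import java.nio.file.WatchKey;
import java.nio.file.WatchService;

public class WatchTmp {
    public static void main(String[] args) throws Exception {
        Path dir = Paths.get("/tmp");
        WatchService watcher = FileSystems.getDefault().newWatchService();
        // Only events that happen after registration are delivered; existing
        // files and directories do not produce create events.
        dir.register(watcher,
                StandardWatchEventKinds.ENTRY_CREATE,
                StandardWatchEventKinds.ENTRY_MODIFY,
                StandardWatchEventKinds.ENTRY_DELETE);
        while (true) {
            WatchKey key = watcher.take(); // blocks until something changes
            for (WatchEvent<?> event : key.pollEvents()) {
                if (event.kind() == StandardWatchEventKinds.OVERFLOW) {
                    continue; // events were lost; context is not a Path here
                }
                System.out.println(event.kind() + ": " + dir.resolve((Path) event.context()));
            }
            if (!key.reset()) {
                break; // the watched directory is no longer accessible
            }
        }
    }
}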
Forget about WatchService. It is not intuitive, and there are issues with detecting that the folder it is monitoring has been deleted or changed. I would stay far away from it. I have worked with WatchService but prefer Apache Commons IO much more; I believe Camel uses it as well.
