Wicket: how to handle long running tasks - java

I've set up a Wicket + Hibernate + Spring web application that involves gathering some data (having some files generated and returned), storing it in a database, creating some images and displaying all of this on a web page.
This all works fine for short runs, but sometimes gathering the data (which involves some remote number crunching) takes too long (20+ minutes) and times out. I've tried to resolve this using two approaches, but both of them show some problems.
The first approach was using AjaxLazyLoadPanels and just doing everything within the getLazyLoadComponent. This worked fine for the short runs, but for the 20+ minute runs the LazyLoadComponents would not load (nice oxymoron there) due to timeouts.
The second approach involved creating an intermediate Fragment with an added AjaxSelfUpdatingTimerBehavior (duration set to 10 seconds) that polled for the files created by the number crunching. This seems to let the tasks run in the background without problems, but it fails when the returned data needs to be stored in the database. I'm using the Open Session in View pattern, but maybe this fails when attempting to store data after 20 minutes? (The solution could lie in resolving this.)
Due to the above problems I'm now reading up on alternate approaches to handle these long running tasks and came across:
org.apache.wicket.util.time.Task
org.apache.wicket.util.watch.ModificationWatcher
I'm now wondering if either of these might be better suited to solve the time-out problems I'm having in both running the tasks and storing the data in the database afterwards, or if anyone has any other solutions that might help in this situation.
I'd really like to know if a new approach is viable before I spend another day implementing something that might turn out not to work after all.
Best regards,
Tim

I know we have had success using a Panel with an attached AjaxSelfUpdatingTimerBehavior. The task and the results piece are separated from the view logic, but are made accessible to the view via a service you create. The service implementation we have used is then responsible for starting a thread pool or ExecutorService for running the individual tasks. The service can provide a way to monitor the progress/status of the particular job/call that is taking place. Once it is complete, it should also make the data available to the view. Injecting a SessionFactory into the service implementation (or an injected DAO) should be sufficient to create the Hibernate Session outside of the WebSession.
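As a rough sketch of the kind of service meant above (class and method names such as CrunchService, doNumberCrunching() and saveResult() are made up for illustration, not taken from any particular codebase):

import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Hypothetical Spring-managed service, injected into the Wicket page with @SpringBean.
public class CrunchService {

    private final ExecutorService executor = Executors.newFixedThreadPool(4);
    private final Map<String, Future<?>> jobs = new ConcurrentHashMap<String, Future<?>>();

    /** Kick off the long-running work and return an id the page can poll with. */
    public String submit(final long datasetId) {
        final String jobId = UUID.randomUUID().toString();
        jobs.put(jobId, executor.submit(new Runnable() {
            public void run() {
                Object result = doNumberCrunching(datasetId); // the 20+ minute remote crunching
                saveResult(jobId, result);                    // injected DAO/SessionFactory opens its
            }                                                 // own Hibernate Session on this thread
        }));
        return jobId;
    }

    /** Called from the AjaxSelfUpdatingTimerBehavior every few seconds. */
    public boolean isDone(String jobId) {
        Future<?> job = jobs.get(jobId);
        return job != null && job.isDone();
    }

    // Placeholders for the real work and persistence:
    private Object doNumberCrunching(long datasetId) { return new Object(); }
    private void saveResult(String jobId, Object result) { /* DAO/Hibernate call */ }
}

The Wicket panel would then call service.submit(...) once, keep the returned job id, and poll service.isDone(jobId) from the AjaxSelfUpdatingTimerBehavior, swapping in the result component when it returns true. The database write happens on the worker thread with its own Hibernate Session rather than relying on the request's Open Session in View.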

Related

Vaadin with Spring and Redis loads previous UI state on slow environment

I successfully enabled Redis in a Spring Boot + Vaadin application and it runs fine on my computer. The application is on a test run in a slower environment, and an error occurs multiple times.
WARN c.v.s.communication.ServerRpcHandler [ServerRpcHandler.java : 266] - Unexpected message id from the client. Expected: 248, got: 249
It seems to happen when the serialization/deserialization of the VaadinSession takes too long. For example, I have a page that has multiple checkboxes. I click on the first, then the second and third. After this, the warning above is thrown and a previous state of the page appears. In this case it might be without any checked checkboxes, or with one or two checked checkboxes. In rare cases it works properly.
I can't think of a solution for the problem. One thing I tried is showing the loading indicator immediately (after 100ms instead of the default 300ms), but it doesn't solve the problem.
Can I somehow configure when the serialization/deserialization occurs instead of every UI change or make it faster by leaving parts of the VaadinSession out of it? (I need the data on the current page so I can't make the ui components transient.)
We had a discussion about the problem at my workplace and we think the components are working properly. The problem occurs when a serialization is slower than the next request's deserialization. (Every UI change begins with a deserialization to get the latest state, then serializes the modified state.) My solution was creating an aspect that stores the latest VaadinSession that was sent for serialization and compares every deserialized VaadinSession to the stored one. I keep the one with the higher lastProcessedClientToServerId. This solves the issue almost every time.
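For illustration only, a minimal sketch of that kind of guard written as a Spring aspect. SessionRepository, StoredUiState and the pointcuts are hypothetical placeholders for whatever your Redis-backed session store actually exposes; this is not Vaadin or Spring Session API.

import java.util.concurrent.atomic.AtomicReference;

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class StaleSessionGuard {

    /** Hypothetical view of the state your store serializes. */
    public interface StoredUiState {
        int getLastProcessedClientToServerId();
    }

    // Last state that was handed off for serialization.
    private final AtomicReference<StoredUiState> lastSerialized = new AtomicReference<StoredUiState>();

    /** Record every state that goes out to Redis (pointcut is a placeholder). */
    @Around("execution(* com.example.session.SessionRepository.save(..)) && args(state)")
    public Object rememberSerialized(ProceedingJoinPoint pjp, StoredUiState state) throws Throwable {
        lastSerialized.set(state);
        return pjp.proceed();
    }

    /** Prefer the newer in-memory state over a stale copy read back from Redis. */
    @Around("execution(* com.example.session.SessionRepository.load(..))")
    public Object preferNewerState(ProceedingJoinPoint pjp) throws Throwable {
        Object loaded = pjp.proceed();
        StoredUiState latest = lastSerialized.get();
        if (loaded instanceof StoredUiState && latest != null
                && latest.getLastProcessedClientToServerId()
                   > ((StoredUiState) loaded).getLastProcessedClientToServerId()) {
            return latest; // the copy from Redis is older than what we last wrote
        }
        return loaded;
    }
}

The actual comparison is on lastProcessedClientToServerId, as described above; how you intercept the save and load calls depends on how your session store is wired.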

Search optimization when data owner is someone else

In my project, we have 2 REST calls which take too much time, so we are planning to optimize them. Here is how it works currently: we make the 1st call to system A and then pass the response to system B for further processing. Once we get the response from system B, we have to manipulate it further before passing it to the UI layer, and this entire process takes a lot of time. We planned on using Solr/Lucene, but since we are not the data owners, we can't implement that. Can someone please shed some light on how best this can be handled? We are using Spring MVC and Spring Web Flow. Thanks in advance!!
[EDIT:] This is not the actual scenario; I am writing this as an example for better understanding. Think of it as making a store locator call for a particular zip code to get a list of 100 stores, and then sending those 100 stores to another call to get a list of inventory, etc. So this list of stores would change for every zip code, and so would the inventory.
If your query parameters to System A / System B are frequently the same, you can add a caching framework to your code. If you use Spring 3, you can use the cache abstraction easily with a @Cacheable annotation on your code calling System A. See:
http://static.springsource.org/spring/docs/3.1.0.M1/spring-framework-reference/html/cache.html
The cache subsystem will cache the result, including the processing done in that code.
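A minimal sketch of what that can look like, assuming a hypothetical SystemAClient and a cache named "systemA" configured on your CacheManager:

import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;

@Service
public class SystemAClient {

    // Results are cached per zipCode key; repeated calls with the same
    // zip code skip both the remote call and the post-processing done here.
    @Cacheable("systemA")
    public StoreList findStores(String zipCode) {
        StoreList stores = callSystemA(zipCode);   // the expensive REST call
        return enrich(stores);                     // the "manipulation" step
    }

    private StoreList callSystemA(String zipCode) { /* RestTemplate call */ return new StoreList(); }

    private StoreList enrich(StoreList stores) { return stores; }

    /** Placeholder result type for the sketch. */
    public static class StoreList { }
}

You also need <cache:annotation-driven/> and a CacheManager bean in your Spring configuration, as described in the linked reference documentation.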

How to download images asynchronously from web server

My application screen looks similar to the image attached. I have multiple rows, and each row has a Bitmap image, a title and a description field. All of the information is fetched from our supporting web server in the form of XML. I have used the observer design pattern, which creates a separate thread for connecting to my remote server over HTTP, then downloads and parses the XML. The XML includes the URL for the image, the title and the description for each row.
I have tried a few approaches so far:
Approach 1:
Created a separate method (drawRow()), which takes care of putting the contents together by specifying the layout.
Then, using the method downloadImage(), I am trying to download the remote URL from the drawRow() method. But this doesn't work well, as it downloads on the same thread and the UI gets blocked.
Approach 2:
While searching for the above issue, I came across WebBitmapField for BlackBerry from coderholic.com.
Then I am using the code below from my drawRow() method. As I understand it, WebBitmapField uses the observer design pattern, and the image is downloaded on a thread other than the UI thread. It works fine when I have a limited number of rows, like 5 or 10. But when I have more rows to be drawn, it throws a TooManyThreads exception, as it creates a new thread for each row.
I found this link, taskworker-thread-blackberry, but I'm not very clear on how to achieve my requirement with it.
As I understand it, a BlackBerry application can create a maximum of 16 threads. So I now believe I may need to create a thread pool with a maximum size of 10.
Can anyone please help me understand and implement thread pooling on BlackBerry for my current problem?
I'd also appreciate any other approach that would fit my requirement.
Thanks in advance.
You have everything you need. So:
Create one TaskWorker for your application (use a singleton)
Implement the Task class from TaskWorker as a DownloadImageTask (simply move everything from Runnable.run() into the Task.doTask() method)
Instead of creating a new thread in Util.getWebData(), call TaskWorker.addTask()
There are probably more minor details, but you should be able to figure out how to finish it.
I also think it's better to have two methods in Callback, success(byte[] data) and error(Throwable error), to distinguish the end result and avoid converting images to String and back.
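If you end up writing the worker yourself instead of reusing the one from the article, a bare-bones version is just a small pool of long-lived threads pulling from a shared queue. The sketch below sticks to java.util and java.lang (no java.util.concurrent, which the BlackBerry Java ME runtime doesn't provide); Task and doTask() mirror the names used above, not an official API.

import java.util.Vector;

// Minimal task worker: a few long-lived threads pull tasks from a shared
// queue instead of starting one new thread per image download.
public class TaskWorker {

    /** A unit of work, e.g. a DownloadImageTask fetching one image. */
    public interface Task {
        void doTask();
    }

    private static TaskWorker instance;
    private final Vector queue = new Vector();   // pending Task objects
    private volatile boolean running = true;

    public static synchronized TaskWorker getInstance() {
        if (instance == null) {
            instance = new TaskWorker(3);        // e.g. 3 download threads
        }
        return instance;
    }

    private TaskWorker(int poolSize) {
        for (int i = 0; i < poolSize; i++) {
            new Thread(new Runnable() {
                public void run() {
                    while (running) {
                        Task task = take();
                        if (task != null) {
                            try {
                                task.doTask();
                            } catch (Exception e) {
                                // keep the worker alive if one task fails
                            }
                        }
                    }
                }
            }).start();
        }
    }

    /** Queue a task; call this instead of spawning a thread in Util.getWebData(). */
    public void addTask(Task task) {
        synchronized (queue) {
            queue.addElement(task);
            queue.notify();                      // wake one idle worker
        }
    }

    private Task take() {
        synchronized (queue) {
            while (queue.isEmpty() && running) {
                try {
                    queue.wait();
                } catch (InterruptedException e) {
                    return null;
                }
            }
            if (queue.isEmpty()) {
                return null;
            }
            Task task = (Task) queue.elementAt(0);
            queue.removeElementAt(0);
            return task;
        }
    }

    public void stop() {
        running = false;
        synchronized (queue) {
            queue.notifyAll();
        }
    }
}

Each DownloadImageTask would then perform the HTTP fetch inside doTask() and hand the resulting byte[] (or a Throwable) back to its row through the callback.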

Database management in Java

I'm making a server in Java that will provide chat services for Flash clients. The server will store data about each user in a .txt file somewhere on the server. For example, when a user logs in, information about this user is requested from the DatabaseManager class. It will then search through the database and return the information. The point is that when a lot of people log in within a short amount of time, the server is doing a lot of the same checks again and again.
The idea that I want to implement is that a connection class does something like this:
String userData = DatabaseManager.getUserData(this.username);
The DatabaseManager then doesn't search immediately; it stores this request in an array of requests, and then at a fixed interval it goes through the database once and returns the data to the clients that requested it. This way, when 15 people log in within a second, it won't go through all the information 15 times. How do I implement this?
You use a real DBMS like everyone else on the planet. I'm eager to hear a reason why someone wouldn't choose a DB for this application. I can't think of anything that would prevent it. Back in the day, RDBMS were ungainly, expensive, complicated beasts. Today, they're as readily available as tabloids at the checkout counter.
There are few excuses not to use a DB nowadays, and arguably there are more reasons to use a DB than the file system for almost any application.
As above, I'd recommend using an existing database solution like HSQLDB; you'd be far better off in the long run doing things this way rather than hacking your own solution together.
If you really want to do this anyway, have a look at the ScheduledExecutorService. You can then fire off a request to the executor service with a delay, and in that delay listen for more data and add it to the query.
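One way that batching could be structured (getUserData() returning a future here, and lookupUsers() standing in for the single pass over your .txt file, are just illustrative choices, not a fixed recipe):

import java.util.Collections;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Collects getUserData() requests and resolves them in one pass per interval.
public class BatchingDatabaseManager {

    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
    private final ConcurrentMap<String, CompletableFuture<String>> pending = new ConcurrentHashMap<>();

    public BatchingDatabaseManager() {
        // Every second, answer all requests collected since the last run.
        scheduler.scheduleAtFixedRate(this::flush, 1, 1, TimeUnit.SECONDS);
    }

    /** Called by the connection class; the future completes when the next batch runs. */
    public CompletableFuture<String> getUserData(String username) {
        return pending.computeIfAbsent(username, u -> new CompletableFuture<>());
    }

    private void flush() {
        if (pending.isEmpty()) {
            return;
        }
        Set<String> usernames = new HashSet<>(pending.keySet());
        Map<String, String> results = lookupUsers(usernames);   // one pass over the file
        for (String user : usernames) {
            CompletableFuture<String> future = pending.remove(user);
            if (future != null) {
                future.complete(results.get(user));
            }
        }
    }

    private Map<String, String> lookupUsers(Set<String> usernames) {
        // Placeholder: scan the user file once and collect the requested entries.
        return Collections.emptyMap();
    }
}

The connection class would then call manager.getUserData(username).get() (or attach a callback) and receive its answer when the next batch runs.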

Using A BlockingQueue With A Servlet To Persist Objects

First, this may be a stupid question, but I'm hoping someone will tell me so, and why. I also apologize if my explanation of what/why is lacking.
I am using a servlet to upload a HUGE (247MB) file, which is pipe (|) delimited. I grab about 5 of 20 fields, create an object, and then add it to a list. Once this is done, I pass the list to an OpenJPA transactional method called persistList().
This would be okay, except for the size of the file. It's taking forever, so I'm looking for a way to improve it. One idea I had was to use a BlockingQueue in conjunction with the persist/persistList method in a new thread. Unfortunately, my skills in Java concurrency are a bit weak.
Does what I want to do make sense? If so, has anyone done anything like it before?
Servlets should respond to requests within a short amount of time. In this case, persisting the file contents needs to be an asynchronous job, so:
The servlet should respond with some text about the upload job, expected time to complete or something like that.
The uploaded content should be written to some temp space in binary form, rather than keeping it all in memory. This is the usual way the multi-part POST libraries do their work.
You should have a separate service that blocks on a queue of pending jobs. Once it gets a job, it processes it.
The 'job' is simply some handle to the temporary file that was written when the upload happened... and any metadata like who uploaded it, job id, etc.
The persisting service needs to insert a large number of rows but make the operation appear 'atomic': either model the intermediate state as part of the table model(s), or write to temp tables.
If you are writing to temp tables, and then copying all the content to the live table, remember to have enough log space and temp space at the database level.
If you have a full J2EE stack, consider modelling the job queue as a JMS queue, so recovery makes sense. Once again, remember to have proper XA boundaries, so all the row persists fall within an outer transaction.
Finally, consider also having a status check API and/or UI, where you can determine the state of any particular upload job: Pending/Processing/Completed.
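As a rough illustration of the queue-plus-worker part only (UploadJob and processJob() are placeholder names, and the temp-table/XA/JMS points above still apply to the real persistence step):

import java.io.File;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Shared, bounded queue of pending upload jobs.
public class UploadJobQueue {

    /** Handle to an uploaded file waiting to be persisted. */
    public static class UploadJob {
        final String jobId;
        final File tempFile;     // the 247MB upload, already written to temp space
        final String uploadedBy;

        UploadJob(String jobId, File tempFile, String uploadedBy) {
            this.jobId = jobId;
            this.tempFile = tempFile;
            this.uploadedBy = uploadedBy;
        }
    }

    private final BlockingQueue<UploadJob> jobs = new LinkedBlockingQueue<UploadJob>(100);

    /** Called from the servlet after streaming the upload to a temp file. */
    public void enqueue(UploadJob job) throws InterruptedException {
        jobs.put(job);   // blocks if 100 jobs are already pending
    }

    /** Run in a background worker thread, e.g. started by a ServletContextListener. */
    public void startWorker() {
        Thread worker = new Thread(new Runnable() {
            public void run() {
                while (!Thread.currentThread().isInterrupted()) {
                    try {
                        UploadJob job = jobs.take();   // blocks until a job arrives
                        processJob(job);               // parse the temp file in chunks and call
                                                       // the OpenJPA persist logic in batches
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            }
        });
        worker.setDaemon(true);
        worker.start();
    }

    private void processJob(UploadJob job) {
        // Placeholder for the batched persistList()-style work and status updates.
    }
}

The servlet streams the upload to a temp file, calls enqueue(), and returns the job id immediately; the worker thread does the slow parsing and batched writes, and updates whatever status store backs the Pending/Processing/Completed check mentioned above.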
