My application screen looks similar to the attached image. I have multiple rows, and each row has a Bitmap image, a title, and a description field. All of this information is fetched from our supporting web server in the form of XML. I have used the observer design pattern, which creates a separate thread that connects to my remote server over HTTP, then downloads and parses the XML. The XML includes the image URL, title, and description for each row.
I have tried a few approaches so far.
Approach 1:
I created a separate method, drawRow(), which takes care of putting the contents together by specifying the layout.
Then, from drawRow(), I call downloadImage() to download the image from the remote URL. This does not work well, as the download runs on the same thread and the UI gets blocked.
Approach 2:
While searching for a solution to the above issue, I came across WebBitmapField for BlackBerry from coderholic.com.
I am now using the code below from my drawRow() method. As I understand it, WebBitmapField uses the observer design pattern and downloads the image on a thread other than the UI thread. It works fine when I have a limited number of rows, say 5 or 10, but when more rows need to be drawn it throws a TooManyThreads exception, because it creates a new thread for each row.
I found the link taskworker-thread-blackberry, but I am not clear on how to achieve my requirement with it.
As I understand it, a BlackBerry application can create a maximum of 16 threads, so I believe I may need to create a thread pool with a maximum size of, say, 10.
Can anyone please help me understand and implement thread pooling on BlackBerry for this problem?
I would also appreciate any other approach that better fits my requirement.
Thanks in advance.
You have everything you need. So:
Create one TaskWorker for your application (use a singleton).
Implement a Task subclass from TaskWorker, DownloadImageTask (simply move everything from Runnable.run() into the Task.doTask() method).
Instead of creating a new thread in Util.getWebData(), call TaskWorker.addTask().
There are probably more minor details, but you should be able to figure out how to finish it.
I also think it's better to have two methods in the Callback, success(byte[] data) and error(Throwable error), to distinguish the end result and avoid converting images to String and back. See the sketch below.
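That structure might look roughly like this, assuming the TaskWorker/Task naming from the taskworker-thread-blackberry link; the queue handling and class layout below are my own guesses, not the article's actual code (each class would live in its own source file):

```java
// Sketch only: a singleton TaskWorker that runs queued Tasks on a single
// worker thread, so every row's image download shares one background thread.
public interface Task {
    void doTask();   // move the body of Runnable.run() here (e.g. DownloadImageTask)
}

public class TaskWorker implements Runnable {
    private static TaskWorker instance;
    private final java.util.Vector queue = new java.util.Vector();

    private TaskWorker() {
    }

    public static synchronized TaskWorker getInstance() {
        if (instance == null) {
            instance = new TaskWorker();
            new Thread(instance).start();   // the one background worker thread
        }
        return instance;
    }

    public void addTask(Task task) {
        synchronized (queue) {
            queue.addElement(task);
            queue.notifyAll();              // wake the worker if it is waiting
        }
    }

    public void run() {
        while (true) {
            Task task;
            synchronized (queue) {
                while (queue.isEmpty()) {
                    try {
                        queue.wait();
                    } catch (InterruptedException e) {
                        return;
                    }
                }
                task = (Task) queue.elementAt(0);
                queue.removeElementAt(0);
            }
            // e.g. DownloadImageTask: fetch the image bytes here, then hand the
            // result back to the UI thread via the Callback's success()/error().
            task.doTask();
        }
    }
}
```

drawRow() would then call something like TaskWorker.getInstance().addTask(new DownloadImageTask(url, callback)) instead of spawning its own thread, so the number of rows no longer determines the number of threads (the DownloadImageTask constructor shown here is hypothetical).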
I want it so that when one image is selected, it detects labels, text, and faces in that single image all at once.
Just call all of the functions on your image file at once, then combine the results using something like zip in RxJava.
Alternatively, you could nest the calls (e.g. call FirebaseVision.getInstance().onDeviceTextRecognizer.processImage(image) inside the onSuccessListener of another call), although this will take much longer to complete them all.
If you provide the code of your existing attempts, Stack Overflow can help further.
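For instance, a rough sketch of the zip approach with RxJava 2; the firebase-ml-vision detector getters used here (getOnDeviceTextRecognizer(), getOnDeviceImageLabeler(), getVisionFaceDetector()) should be double-checked against your SDK version, and the DetectionResult holder class is made up for illustration:

```java
import com.google.android.gms.tasks.Task;
import com.google.firebase.ml.vision.FirebaseVision;
import com.google.firebase.ml.vision.common.FirebaseVisionImage;
import io.reactivex.Single;

public class AllAtOnceDetector {

    // Wrap a Play Services Task in an RxJava Single so the three calls can be zipped.
    private static <T> Single<T> toSingle(final Task<T> task) {
        return Single.create(emitter -> task
                .addOnSuccessListener(emitter::onSuccess)
                .addOnFailureListener(emitter::onError));
    }

    public Single<DetectionResult> detectAll(FirebaseVisionImage image) {
        return Single.zip(
                toSingle(FirebaseVision.getInstance()
                        .getOnDeviceTextRecognizer().processImage(image)),
                toSingle(FirebaseVision.getInstance()
                        .getOnDeviceImageLabeler().processImage(image)),
                toSingle(FirebaseVision.getInstance()
                        .getVisionFaceDetector().detectInImage(image)),
                DetectionResult::new);   // combined once all three results arrive
    }

    // Hypothetical holder for the combined results.
    public static class DetectionResult {
        public final Object text;
        public final Object labels;
        public final Object faces;

        public DetectionResult(Object text, Object labels, Object faces) {
            this.text = text;
            this.labels = labels;
            this.faces = faces;
        }
    }
}
```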
I have written a Java program that reads numbers from different files. The numbers are added as they are read from the files, and the sum is displayed in a browser. The browser keeps displaying the new sum produced at every step.
I know how to display static values in a browser; I can use JavaScript. But I don't know what mechanism to use to display a continuously changing value.
Any help is appreciated!
You'll have to request the data to display from the server. You can use a data-binding library like Knockout to automatically update the page as the underlying model changes, or you can just use a library like jQuery to modify the DOM on your own.
Alternatively, you could keep a pipe open to the server using the Comet model: http://en.wikipedia.org/wiki/Comet_%28programming%29. However, it can be expensive to tie up a thread on your web server for long periods of time.
Good luck.
Check out Knockout.js (http://www.knockoutjs.com/); it is a framework for automatically updating the UI when the data changes.
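On the server side, either suggestion needs an endpoint the page can ask for the current value. A minimal sketch of such an endpoint, where the SumServlet name and the shared RUNNING_SUM counter are hypothetical, not from the question:

```java
import java.io.IOException;
import java.util.concurrent.atomic.AtomicLong;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Sketch: exposes the current running sum as JSON so the browser can poll it,
// e.g. with jQuery on a timer, or bind it to a Knockout observable.
public class SumServlet extends HttpServlet {

    // Hypothetical shared counter, updated by the file-reading code as it adds numbers.
    public static final AtomicLong RUNNING_SUM = new AtomicLong();

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        resp.setContentType("application/json");
        resp.getWriter().write("{\"sum\": " + RUNNING_SUM.get() + "}");
    }
}
```

The page would then request this URL every second or so and write the returned value into the DOM (or into a Knockout observable that the markup is bound to).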
Right now I have a program with a GUI that indexes a URL I specify. I need to index 15 things at a time, and so far I have just been opening 15 windows of the program and individually inputting the URLs I want to index. However, these 15 URLs change every hour or so. I have a separate program that stores these 15 constantly changing URLs in a table in my MySQL database. I am able to fetch these URLs from my database (I store them in an ArrayList), but I am not sure how to go about multi-threading my application so that I don't have to do the manual work of inputting the URLs into my application.
My question: can someone give me an example of, or link me to a tutorial on, how to create a new thread for each URL in my ArrayList? (This ArrayList will also change, so will I need a new thread for updating it as well?)
I've looked at the Java site on concurrency and high-level concurrency, but didn't really understand the examples given (I am still a beginning programmer, so please bear with me).
Hopefully I have explained what I'm trying to do with enough detail.
Thanks in advance.
EDIT: The URLs I index change every couple of seconds, which is why I don't think I can go through my method with each URL one after another, and hence why I believe it needs to be multithreaded.
2nd EDIT (I believe these guys understand what I'm asking):
@Jon Storm There are two issues going on: 1) getting the URL list and 2) accessing said URLs. I would make the URL fetcher single-threaded and then dispatch out to a thread pool of fetchers. This dispatcher can also queue pending fetches, etc. – pst
@Jon Storm: could you please update your question to describe what you want to do more explicitly? If I understood correctly, you want to index a list of 15 URLs again and again, because the contents of the pages at these URLs change every 3 seconds. And you want to update the list of URLs to index every hour, by getting them from a database. Is that right? – JB Nizet
It seems to me that your problem is not with multi-threading, but with inputting something into a GUI from the application fetching the URLs from the database.
Why don't you simply reuse the class (or some of the code, if it's impossible to reuse the class as is) of your GUI application (i.e. the URL indexing method) inside the application which fetches the URLs from the database?
My guess is that you could very well index these 15 URLs one after the other, in a single thread. I would try doing that before trying to use threads.
The program would look like this (a code sketch follows the list):
1. Fetch the 15 URLs from the database and put them into a List.
2. Iterate through the list and index each URL.
3. Sleep for some time.
4. Go to step 1.
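A bare-bones version of that loop, where fetchUrlsFromDatabase() and indexUrl() are placeholders for your existing MySQL and indexing code:

```java
import java.util.Collections;
import java.util.List;

public class SimpleIndexerLoop {

    public void run() throws InterruptedException {
        while (true) {
            List<String> urls = fetchUrlsFromDatabase();   // step 1
            for (String url : urls) {                      // step 2
                indexUrl(url);
            }
            Thread.sleep(5000);                            // step 3, then back to step 1
        }
    }

    // Placeholder: replace with your JDBC query against the MySQL table.
    private List<String> fetchUrlsFromDatabase() {
        return Collections.emptyList();
    }

    // Placeholder: replace with your existing indexing code.
    private void indexUrl(String url) {
    }
}
```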
EDIT:
Since it seems the URLs must be indexed again and again until the list of URLs changes, I would use this algorithm (sketched in code after the list):
1. Create a thread pool using Executors.newCachedThreadPool().
2. Get the URLs from the database.
3. For each URL, create a task which will index the URL again and again, until interrupted (check that Thread.interrupted() returns false at each iteration).
4. Submit each task to the ExecutorService created at step 1, and keep the returned Future in a list.
5. Sleep/wait until the list of URLs to index changes.
6. Cancel each Future (cancel(true)) in the list of Future instances.
7. Go to step 2.
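In code it might look like this; the class name and the two placeholder methods are mine, and the one-hour sleep stands in for whatever signal tells you the URL list has changed:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class PooledIndexerLoop {

    private final ExecutorService pool = Executors.newCachedThreadPool();   // step 1

    public void run() throws InterruptedException {
        while (true) {
            List<String> urls = fetchUrlsFromDatabase();                     // step 2
            List<Future<?>> futures = new ArrayList<Future<?>>();
            for (final String url : urls) {
                futures.add(pool.submit(new Runnable() {                     // steps 3 and 4
                    public void run() {
                        // index again and again until the Future is cancelled
                        while (!Thread.currentThread().isInterrupted()) {
                            indexUrl(url);
                        }
                    }
                }));
            }
            Thread.sleep(60L * 60L * 1000L);                                 // step 5: list changes hourly
            for (Future<?> future : futures) {
                future.cancel(true);                                         // step 6: interrupt each task
            }
        }                                                                    // step 7: back to step 2
    }

    // Placeholder: replace with your JDBC query against the MySQL table.
    private List<String> fetchUrlsFromDatabase() {
        return new ArrayList<String>();
    }

    // Placeholder: replace with your existing indexing code.
    private void indexUrl(String url) {
    }
}
```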
SwingWorker, shown here, is a good choice.
First, this may be a stupid question, but I'm hoping someone will tell me if so, and why. I also apologize if my explanation of the what and why is lacking.
I am using a servlet to upload a HUGE (247 MB) file, which is pipe (|) delimited. I grab about 5 of 20 fields, create an object, then add it to a list. Once this is done, I pass the list to an OpenJPA transactional method called persistList().
This would be okay, except for the size of the file. It's taking forever, so I'm looking for a way to improve it. One idea I had was to use a BlockingQueue in conjunction with the persist/persistList method on a new thread. Unfortunately, my Java concurrency skills are a bit weak.
Does what I want to do make sense? If so, has anyone done anything like it before?
Servlets should respond to requests within a short amount of time. In this case, persisting the file contents needs to be an asynchronous job, so:
The servlet should respond with some text about the upload job, expected time to complete, or something like that.
The uploaded content should be written to some temp space in binary form, rather than keeping it all in memory. This is the usual way multi-part POST libraries do their work.
You should have a separate service that blocks on a queue of pending jobs. Once it gets a job, it processes it.
The 'job' is simply some handle to the temporary file that was written when the upload happened... and any metadata like who uploaded it, job id, etc.
The persisting service needs to persist a large number of rows but make it appear 'atomic': either model the intermediate state as part of the table model(s), or write to temp spaces.
If you are writing to temp tables, and then copying all the content to the live table, remember to have enough log space and temp space at the database level.
If you have a full J2EE stack, consider modelling the job queue as a JMS queue, so recovery makes sense. Once again, remember to have proper XA boundaries, so all the row persists fall within an outer transaction.
Finally, consider also having a status check API and/or UI, where you can determine the state of any particular upload job: Pending/Processing/Completed. A sketch of the queue-and-worker part follows below.
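In code, that might look like the following, assuming a hypothetical UploadJob handle (temp file path plus metadata) and a persistJob() method wrapping your OpenJPA persistList() call:

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class PersistService implements Runnable {

    private final BlockingQueue<UploadJob> jobs = new LinkedBlockingQueue<UploadJob>();

    // Called by the servlet after it has written the upload to temp space;
    // the servlet then returns immediately with a "job accepted" response.
    public void submit(UploadJob job) {
        jobs.offer(job);
    }

    // Run this on its own thread (or replace the queue with a JMS queue in a full Java EE stack).
    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            try {
                UploadJob job = jobs.take();   // blocks until a job is available
                persistJob(job);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return;
            }
        }
    }

    private void persistJob(UploadJob job) {
        // Placeholder: stream the pipe-delimited temp file, build the entities,
        // and call your transactional persistList() in reasonably sized batches.
    }

    // Hypothetical handle to a pending upload job.
    public static class UploadJob {
        public final String tempFilePath;
        public final String uploadedBy;

        public UploadJob(String tempFilePath, String uploadedBy) {
            this.tempFilePath = tempFilePath;
            this.uploadedBy = uploadedBy;
        }
    }
}
```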
I've set up a Wicket + Hibernate + Spring web application that involves gathering some data (having some files generated and returned), storing it in a database, creating some images, and displaying all of this on a web page.
This all works fine for short runs, but sometimes gathering the data (which involves some remote number crunching) takes too long (20+ minutes) and times out. I've tried to resolve this using two approaches, but both of them show some problems.
The first approach was using AjaxLazyLoadPanels and just doing everything within the getLazyLoadComponent. This worked fine for the short runs, but for the 20+ minute runs the LazyLoadComponents would not load (nice oxymoron there) due to timeouts.
The second approach involved creating an intermediate Fragment with an added AjaxSelfUpdatingTimerBehavior, with its duration set to 10 seconds, which polled for the files created by the number crunching. This seems to make the tasks run in the background without problems, but it fails when the returned data needs to be stored in the database. I'm using the Open Session in View pattern, but maybe this fails when attempting to store data after 20 minutes? (The solution could lie in resolving this.)
Due to the above problems I'm now reading up on alternate approaches to handle these long running tasks and came across:
org.apache.wicket.util.time.Task
org.apache.wicket.util.watch.ModificationWatcher
I'm now wondering whether either of these might be better suited to solving the time-out problems I'm having, both in running the tasks and in storing the data in the database afterwards, or whether anyone has any other solutions that might help in this situation.
I'd really like to know if a new approach is viable before I spend another day implementing something that might turn out not to work after all.
Best regards,
Tim
I know we have had success using a Panel with an attached AjaxSelfUpdatingTimerBehavior. The task and the results are separated from the view logic, but are made accessible to the view via a service you create. The service implementation we have used is then responsible for starting a ThreadPool or ExecutorService to run the individual tasks. The service can provide a way to monitor the progress/status of the particular job/call that is taking place. Once it is complete, it should also make the data available to the view. Injecting a SessionFactory (or a DAO) into the service implementation should be sufficient to create the Hibernate Session outside of a WebSession. A rough sketch of such a service is below.
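Something like this, where the class and method names (CrunchService, startJob, isDone, getResult) are made up for illustration and are not part of Wicket or Spring:

```java
import java.util.UUID;
import java.util.concurrent.Callable;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class CrunchService {

    private final ExecutorService executor = Executors.newFixedThreadPool(4);
    private final ConcurrentMap<String, Future<Result>> jobs =
            new ConcurrentHashMap<String, Future<Result>>();

    // Starts the long-running number crunching outside the request thread
    // and returns a job id the page can poll with.
    public String startJob(final Input input) {
        String jobId = UUID.randomUUID().toString();
        jobs.put(jobId, executor.submit(new Callable<Result>() {
            public Result call() throws Exception {
                return crunchAndStore(input);
            }
        }));
        return jobId;
    }

    // Polled from the panel's AjaxSelfUpdatingTimerBehavior.
    public boolean isDone(String jobId) {
        Future<Result> future = jobs.get(jobId);
        return future != null && future.isDone();
    }

    public Result getResult(String jobId) throws Exception {
        return jobs.get(jobId).get();
    }

    private Result crunchAndStore(Input input) {
        // Placeholder: run the remote number crunching, persist the results using
        // a Hibernate session obtained from an injected SessionFactory or DAO
        // (not the Open Session in View session), then build the Result.
        return new Result();
    }

    // Hypothetical input/result types for the crunching job.
    public static class Input { }
    public static class Result { }
}
```

The panel's timer behavior would call isDone(jobId) on each poll and, once it returns true, replace the placeholder component with one built from getResult(jobId).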