I want to create a blocking queue that blocks the producer based on customized rules instead of the number of items in the queue.
For example:
A producer produces some files and puts them into a queue. A consumer transfers them to a specific location after some analysis.
For the above scenario, I want the producer to wait before producing new files if the total size of the files in the queue reaches some threshold value. The queue can accept any number of files as long as their total size doesn't cross the threshold.
I would probably subclass a BlockingQueue such as the ArrayBlockingQueue and add a simple CountDownLatch which is initialized to the threshold value and enables the various take/remove methods when reaching 0.
I think you will have to implement this locking mechanism yourself. You could use wait/notify or ReentrantLock/Condition, a long variable holding the combined length and a LinkedList holding the files.
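For illustration, here is a rough sketch of that second approach (ReentrantLock/Condition plus a running byte total). The class name, the use of File objects, and File.length() as the size measure are my own assumptions, not anything from your code:

import java.io.File;
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.concurrent.locks.Condition;
import java.util.concurrent.locks.ReentrantLock;

// Hypothetical sketch: blocks the producer when the combined file size in the
// queue reaches a byte threshold, rather than when an item count is reached.
public class SizeBoundedFileQueue {
    private final Deque<File> files = new ArrayDeque<>();
    private final ReentrantLock lock = new ReentrantLock();
    private final Condition belowThreshold = lock.newCondition();
    private final Condition notEmpty = lock.newCondition();
    private final long maxTotalBytes;
    private long totalBytes = 0;

    public SizeBoundedFileQueue(long maxTotalBytes) {
        this.maxTotalBytes = maxTotalBytes;
    }

    public void put(File file) throws InterruptedException {
        lock.lock();
        try {
            // Producer waits while the threshold is already reached.
            while (totalBytes >= maxTotalBytes) {
                belowThreshold.await();
            }
            files.addLast(file);
            totalBytes += file.length();
            notEmpty.signal();
        } finally {
            lock.unlock();
        }
    }

    public File take() throws InterruptedException {
        lock.lock();
        try {
            while (files.isEmpty()) {
                notEmpty.await();
            }
            File file = files.removeFirst();
            totalBytes -= file.length();
            belowThreshold.signalAll();
            return file;
        } finally {
            lock.unlock();
        }
    }
}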
I'm writing an app for Android that processes real-time data.
My app reads binary data from a data bus (CAN), parses it, and displays it on the screen.
The app reads data in a background thread. I need to transfer data rapidly from one thread to the other, and the displayed data should be as current as possible.
I've found a nice Java queue that almost implements the required behavior: LinkedBlockingQueue. I plan to set a hard limit for this queue (about 100 messages).
The consumer thread should read data from the queue with the take() method. But the producer thread can't wait for the consumer, so it can't use the standard put() method (because it blocks).
So, I plan to put messages to my queue using the following construction:
while (!messageQueue.offer(message)) {
    messageQueue.poll();
}
That is, the oldest message should be removed from the queue to make room for the new, more current data.
Is this good practice? Or have I missed some important detail?
I can't see anything wrong with it. You know what you are doing (losing the head record). This isn't really a matter of practice; it's your call to use the API however you want. I personally prefer ArrayBlockingQueue, though (fewer temporary objects).
This should be what you're looking for: Size-limited queue that holds last N elements in Java
The top answer refers to an Apache library queue which will drop elements.
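As a sketch, the drop-oldest producer from the question, combined with the ArrayBlockingQueue suggested above, might look like this; the class name, the generic Message type, and the capacity of 100 are placeholders:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Hypothetical sketch of a "keep only the latest data" buffer.
public class LatestDataBuffer<Message> {
    private final BlockingQueue<Message> queue = new ArrayBlockingQueue<>(100);

    // Producer side: never blocks; evicts the oldest message to make room.
    public void publish(Message message) {
        while (!queue.offer(message)) {
            queue.poll();
        }
    }

    // Consumer side: blocks until a message is available.
    public Message next() throws InterruptedException {
        return queue.take();
    }
}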
My program requires me to take data from two queues and sort them into a priority queue. The first queue is for planes landing and it takes priority over the second queue of planes trying to take off. I am having trouble understanding how to set up the priority queue and take the two separate queues and sort them correctly into the priority queue.
A priority queue is automatically ordered.
That is to say, when you poll, it returns the least element according to the specified ordering (or the natural ordering if none is specified).
So if you want to use a priority queue, write a comparator so that the elements from the first queue get selected first (perhaps using a wrapper class).
A better solution might be to write your own queue. When polling from this queue, just check the first queue for available items, if none are available, check the second.
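Here is a minimal sketch of that second option; the class name and the generic T parameter (standing in for whatever plane type your program uses) are my own placeholders:

import java.util.ArrayDeque;
import java.util.Queue;

// Hypothetical sketch of a combined queue where landing planes always win.
public class RunwayQueue<T> {
    private final Queue<T> landing = new ArrayDeque<>();
    private final Queue<T> takeoff = new ArrayDeque<>();

    public void addLanding(T plane) { landing.add(plane); }
    public void addTakeoff(T plane) { takeoff.add(plane); }

    // Poll the landing queue first; only serve a takeoff
    // when no plane is waiting to land.
    public T poll() {
        T next = landing.poll();
        return (next != null) ? next : takeoff.poll();
    }
}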
I have a messaging server with two different types of threads: one reads from a client, and the other writes to another client (depending on the receiver)... (and yes, it has to be that way, I can't have the read/write in the same thread...)
I basically need to store all messages somewhere, in an ArrayList (in the Server?), holding them until the other client connects to the server.
My problem is:
I can easily read the object from the thread, however I can't see any way to move the object into a shared ArrayList so that the other thread can access it.
Input Thread ---> ArrayList ---> Output Thread
It sounds like what you really need is a thread-safe queue, not necessarily an ArrayList. The BlockingQueue interface is meant specifically for this sort of thing. Your input thread can put messages into the queue, and the output thread can remove them. If the queue is empty when the output thread tries to take a message from it, it'll automatically wait for the input thread to add a message.
There are a number of classes that implement the BlockingQueue interface, but you'll probably want to use one of these two:
ArrayBlockingQueue is based on a fixed-size array, so you have to choose a size when you construct one, and that's the limit for how many items can be held in the queue. If the queue is full when the input thread tries to put a message into it, the input thread will wait for the output thread to remove one of the messages already in the queue.
LinkedBlockingQueue doesn't require a size limit; you can have a queue that never gets "full", so the input thread can keep putting more and more messages into it even if the output thread isn't removing them fast enough to keep up. (Queueing up too many messages can eventually lead to an OutOfMemoryError.)
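A minimal sketch of that handoff might look like the following; the class name, the String message type, and the method names are placeholders for your own protocol handling:

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Hypothetical sketch: one shared queue between the input and output threads.
public class MessageRelay {
    private final BlockingQueue<String> messages = new LinkedBlockingQueue<>();

    // Called from the input thread after reading a message from the client.
    public void enqueue(String message) throws InterruptedException {
        messages.put(message);
    }

    // Called from the output thread; blocks until a message is available.
    public String nextMessage() throws InterruptedException {
        return messages.take();
    }
}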
I need a blocking queue that has a size of 1, and every time put is applied it removes the last value and adds the next one. The consumers would be a thread pool in which each thread needs to read the message as it gets put on the queue and decide what to do with it, but they shouldn't be able to take from the queue since all of them need to read from it.
I was considering just taking and putting every time the producer sends out a new message, but having only peek in the run method of the consumers will result in them constantly peeking, won't it? Ideally the message will disappear as soon as the peeking stops, but I don't want to use a timed poll as it's not guaranteed that every consumer will peek the message in time.
My other option at the moment is to iterate over the collection of consumers and call a public method on them with the message, but I really don't want to do that since the system relies on real time updates, and a large collection will take a while to iterate through completely if I'm going through each method call on the stack.
After some consideration, I think you're best off with each consumer having its own queue and the producer putting its messages on all queues.
If there are few consumers, then putting the messages on those few queues will not take too long (except when the producer blocks because a consumer can't keep up).
If there are many consumers this situation will be highly preferable over a situation where many consumers are in contention with each other.
At the very least this would be a good measure to compare alternate solutions against.
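As a point of comparison, a rough sketch of the one-queue-per-consumer layout could look like this; the class name, the generic Message type, the register/broadcast method names, and the capacity of 100 are all placeholders:

import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.LinkedBlockingQueue;

// Hypothetical sketch: every consumer gets its own queue,
// and the producer puts each message on all of them.
public class Broadcaster<Message> {
    private final List<BlockingQueue<Message>> consumerQueues = new CopyOnWriteArrayList<>();

    // Each consumer registers once and then takes from its own queue.
    public BlockingQueue<Message> register() {
        // Bounded so a slow consumer eventually back-pressures the producer;
        // 100 is a placeholder capacity.
        BlockingQueue<Message> queue = new LinkedBlockingQueue<>(100);
        consumerQueues.add(queue);
        return queue;
    }

    // Producer side: puts the message on every consumer's queue.
    public void broadcast(Message message) throws InterruptedException {
        for (BlockingQueue<Message> queue : consumerQueues) {
            queue.put(message);
        }
    }
}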
I am trying to add asynchronous output to my program.
Currently, I have an eventManager class that gets notified each frame of the position of any of the moveable objects currently present in the main loop (It's rendering a scene; some objects change from frame to frame, others are static and present in every frame). I am looking to record the state of each frame so I can add in the functionality to replay the scene.
This means that I need to store the changing information from frame to frame, and either hold it in memory or write it to disk for later retrieval and parsing.
I've done some timing experiments, and recording the state of each object to memory increased the time per frame by about 25% (not to mention the possibility of eventually hitting a memory limit). Directly writing each frame to disk takes (predictably) even longer, close to twice as long as not recording the frames at all.
Needless to say, I'd like to implement multithreading so that I won't lose frames per second in my main rendering loop because the process is constantly writing to disk.
I was wondering whether it was okay to use a regular queue for this task, or if I needed something more dedicated like the queues discussed in this question.
In my situation, there is only one producer (the main thread), and one consumer (the thread I want to asynchronously write to disk). The producer will never remove from the queue, and the consumer will never add to it - so do I need a specialized queue at all?
Is there an advantage to using a more specialized queue anyway?
Yes, a regular Queue is inappropriate. Since you have two threads, you need to worry about boundary conditions like an empty queue or a full queue (assuming you need to bound it for memory reasons), as well as visibility anomalies between the threads.
A LinkedBlockingQueue is best suited for your application. The put and take methods use different locks, so you will not have lock contention. The take method will automatically block the consumer writing to disk if it somehow catches up with the producer rendering frames.
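To make that concrete, here is a rough sketch of the render/disk-writer handoff, assuming each frame's state is already serialized to a byte[] on the render thread; the class name, the file path parameter, the empty-array stop sentinel, and the error handling are all placeholders:

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Hypothetical sketch: the render thread enqueues serialized frames,
// and a background thread drains them to disk.
public class FrameRecorder {
    private static final byte[] STOP = new byte[0];
    private final BlockingQueue<byte[]> frames = new LinkedBlockingQueue<>();
    private final Thread writerThread;

    public FrameRecorder(String path) {
        writerThread = new Thread(() -> {
            try (OutputStream out = new FileOutputStream(path)) {
                while (true) {
                    byte[] frame = frames.take();   // blocks when the queue is empty
                    if (frame == STOP) {
                        break;
                    }
                    out.write(frame);
                }
            } catch (IOException e) {
                e.printStackTrace();                // placeholder error handling
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }, "frame-writer");
        writerThread.start();
    }

    // Called once per frame from the render loop; put() on an unbounded
    // LinkedBlockingQueue effectively never blocks the producer.
    public void record(byte[] serializedFrame) throws InterruptedException {
        frames.put(serializedFrame);
    }

    // Ask the writer to drain the remaining frames and stop.
    public void close() throws InterruptedException {
        frames.put(STOP);
        writerThread.join();
    }
}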
It sounds like you don't need a special queue, but if you want the thread removing from the queue to wait until there's something to get, try a BlockingQueue. It's in the java.util.concurrent package, so it's thread-safe for sure. Here are some relevant quotes from that page:
A Queue that additionally supports operations that wait for the queue to become non-empty when retrieving an element, and wait for space to become available in the queue when storing an element.
...
BlockingQueue implementations are designed to be used primarily for producer-consumer queues, but additionally support the Collection interface.
...
BlockingQueue implementations are thread-safe.
As long as you're already profiling your code, try dropping a BlockingQueue in there and see what happens!
Good luck!
I don't think it will matter much.
If you have 25% overhead serializing a state in memory, that will still be there with a queue.
Disk will be even more expensive.
The queue blocking mechanism will be cheap in comparison.
One thing to watch for is your queue growing out of control: disk is slow no matter what, and if the consumer can't drain queue events fast enough, you're in trouble.