Re-ordering of a PriorityBlockingQueue after an element is added - Java

I use a PriorityBlockingQueue to maintain a list of objects whose order is dictated by a comparator. My requirement is as follows: first, I add N objects to the queue, and the queue keeps them ordered. Later, I change the values of objects that have already been added to the queue. The issue is that the queue is not re-ordered based on the updated values in the objects. In contrast, I observed that the ordering is refreshed when an object is removed.
Is there any way I can manually refresh the ordering of the queue, in an efficient manner, before obtaining values from it?

Not with PriorityBlockingQueue. It sounds like the option you're looking for is decrease-key, which isn't supported by the Java priority queue abstractions.
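Since decrease-key isn't available, a common workaround is to remove the affected element and add it back after mutating it, so the queue re-evaluates its position. A rough sketch (the Task class and its priority field are made up for illustration):
import java.util.Comparator;
import java.util.concurrent.PriorityBlockingQueue;

class Task {
    String name;
    int priority;
    Task(String name, int priority) { this.name = name; this.priority = priority; }
}

public class Requeue {
    public static void main(String[] args) {
        PriorityBlockingQueue<Task> queue =
                new PriorityBlockingQueue<>(11, Comparator.comparingInt((Task t) -> t.priority));
        Task a = new Task("a", 5);
        Task b = new Task("b", 1);
        queue.add(a);
        queue.add(b);

        // Mutating a queued element does NOT re-order the queue.
        a.priority = 0;

        // Workaround: remove and re-insert so the element is re-positioned.
        if (queue.remove(a)) {
            queue.add(a);
        }

        System.out.println(queue.poll().name); // "a", now the least element after the re-insert
    }
}
Note that remove(Object) is linear time, so doing this for many elements is not cheap; if you need frequent priority updates you may want a different structure entirely.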

Related

How to sort priority queues in Java

My program requires me to take data from two queues and sort them into a priority queue. The first queue is for planes landing and it takes priority over the second queue of planes trying to take off. I am having trouble understanding how to set up the priority queue and take the two separate queues and sort them correctly into the priority queue.
A priority queue is automatically ordered.
That is to say, when you poll it takes the least element according to the specified ordering (or natural ordering if none is specified).
So if you want to use a priority queue, write a comparator so that the elements from the first queue get selected first (perhaps using a wrapper class).
A better solution might be to write your own queue. When polling from this queue, just check the first queue for available items, if none are available, check the second.
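For the comparator approach mentioned above, a rough sketch might look like this (the PlaneRequest wrapper class and its fields are hypothetical; the sequence number preserves FIFO order within each source queue):
import java.util.Comparator;
import java.util.PriorityQueue;

// Hypothetical wrapper recording which source queue a plane came from.
class PlaneRequest {
    final String plane;
    final boolean landing;   // landing requests outrank take-off requests
    final long sequence;     // preserves arrival order within each source queue
    PlaneRequest(String plane, boolean landing, long sequence) {
        this.plane = plane;
        this.landing = landing;
        this.sequence = sequence;
    }
}

public class Tower {
    public static void main(String[] args) {
        Comparator<PlaneRequest> byPriority =
                Comparator.comparing((PlaneRequest r) -> !r.landing) // landing sorts first
                          .thenComparingLong(r -> r.sequence);       // then arrival order
        PriorityQueue<PlaneRequest> runway = new PriorityQueue<>(byPriority);

        runway.add(new PlaneRequest("takeoff-1", false, 1));
        runway.add(new PlaneRequest("landing-1", true, 2));
        runway.add(new PlaneRequest("takeoff-2", false, 3));

        while (!runway.isEmpty()) {
            System.out.println(runway.poll().plane); // landing-1, takeoff-1, takeoff-2
        }
    }
}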

Concurrent LinkedList vs ConcurrentLinkedQueue

I need a concurrent list that is thread-safe, is good for iteration, and returns an exact size.
I want to store auction bids for an item. So I want to be able to:
retrieve the exact number of bids for an item
add a bid to an item
retrieve all the bids for a given item
remove a bid for an item
I am planning to have it in a
ConcurrentHashMap<Item, LinkedList<ItemBid>> - LinkedList is not thread-safe but returns an exact size
ConcurrentHashMap<Item, ConcurrentLinkedQueue<ItemBid>> - ConcurrentLinkedQueue is thread-safe but its size() is not guaranteed to be exact
Is there any other, better collection that addresses the above 4 points and is thread-safe?
Well, arguably, in a thread-safe collection or map you cannot guarantee the "consistency" of the size: the happens-before relationship between read and write operations will not give you what your use case needs, namely a read of the size that reflects the exact state after the last write operation (N.B.: improved based on comments - see below).
What you can do if performance is not an issue is to use the following idiom - either:
Collections.synchronizedMap(new HashMap<YourKeyType, YourValueType>());
Collections.synchronizedList(new ArrayList<YourType>());
You'll then also need to synchronize explicitly on those objects (for example, when iterating).
This will ensure the order of operations is consistent at the cost of blocking, and you should get the last "right" size at all times.
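As a rough illustration of that idiom applied to the bid example (types simplified to String; the BidBook class and its methods are made up for this sketch):
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class BidBook {
    // One synchronized list of bids per item key (types simplified for the sketch).
    private final Map<String, List<String>> bidsByItem =
            Collections.synchronizedMap(new HashMap<>());

    public void addBid(String item, String bid) {
        bidsByItem.computeIfAbsent(item,
                k -> Collections.synchronizedList(new ArrayList<>())).add(bid);
    }

    public int bidCount(String item) {
        List<String> bids = bidsByItem.get(item);
        return bids == null ? 0 : bids.size();   // size() is itself synchronized, so it is exact
    }

    public List<String> bidsFor(String item) {
        List<String> bids = bidsByItem.get(item);
        if (bids == null) return Collections.emptyList();
        // Iteration over a synchronized list must be guarded manually.
        synchronized (bids) {
            return new ArrayList<>(bids);
        }
    }
}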
You can use LinkedBlockingQueue. It is blocking (as opposed to the CLQ), but its size is maintained as a count rather than computed by scanning the queue, as in the CLQ.

How to loop over an updatable list?

I'm building a Java Running class that will process a set of items one by one. While it is working (running), that set may be updated (items added only).
How can I loop over that list while being sure that newly added elements are taken into consideration?
Update
Following the answers, I implemented the code that I suggested on Code Review.
Answer
You should use a Queue, e.g. java.util.concurrent.ConcurrentLinkedQueue, java.util.concurrent.LinkedBlockingQueue, java.util.concurrent.ArrayBlockingQueue, or one of the other implementations that suits your needs.
They have different methods which allow you to implement different scenarios. Check the javadocs for the differences between the Queue methods (there are methods that throw exceptions or return nulls, and methods for viewing elements versus retrieving them with removal).
            Throws exception    Returns special value
Insert      add(e)              offer(e)
Remove      remove()            poll()
Examine     element()           peek()
In the case of the BlockingQueue implementations there are two additional options: blocking methods and methods that time out. The extended table of possible methods is in its javadoc.
Choose the required implementation carefully. Do you want a fixed capacity? Do you want to block on retrieval from an empty queue? Do you want your Queue bounded or unbounded? If still in doubt, look up Stack Overflow answers that explain the differences between the queue types, Google them, or check the javadocs.
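For instance, a small sketch of the difference between the exception-throwing and special-value methods, using a bounded ArrayBlockingQueue purely as an example:
import java.util.NoSuchElementException;
import java.util.concurrent.ArrayBlockingQueue;

public class QueueMethodStyles {
    public static void main(String[] args) {
        ArrayBlockingQueue<String> queue = new ArrayBlockingQueue<>(1); // bounded, capacity 1

        System.out.println(queue.offer("a"));  // true  - inserted
        System.out.println(queue.offer("b"));  // false - full, special value instead of exception
        try {
            queue.add("b");                    // add() throws when the bounded queue is full
        } catch (IllegalStateException e) {
            System.out.println("add(): " + e.getMessage());
        }

        System.out.println(queue.poll());      // "a"
        System.out.println(queue.poll());      // null - empty, special value
        try {
            queue.remove();                    // remove() throws on an empty queue
        } catch (NoSuchElementException e) {
            System.out.println("remove(): queue is empty");
        }
    }
}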
RANT
There is one problem you may or may not run into, depending on your design: how to tell whether the queue is empty because you are done producing elements. Is your producer (whatever inserts elements into your queue) just not fast enough to add items before they are consumed? Or is the queue empty because all the tasks were completed? In the latter case, if you use a blocking queue, you can end up blocked on retrieval of an element when none will ever become available. You can consider a non-blocking queue in such a case, use a "poison pill" marker element that means the producers are done producing, or, even better, decouple producer from consumer with an intermediary mediator class that holds the queue, so that producer and consumer interact only with the mediator.
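A minimal sketch of the poison-pill idea, assuming a single producer and a single consumer (the sentinel value and class names are illustrative):
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class PoisonPillDemo {
    // Sentinel element meaning "the producer is done"; never a real work item.
    private static final String POISON_PILL = "__DONE__";

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> work = new LinkedBlockingQueue<>();

        Thread consumer = new Thread(() -> {
            try {
                while (true) {
                    String item = work.take();          // blocks until something is available
                    if (item == POISON_PILL) break;     // identity check against the sentinel
                    System.out.println("processed " + item);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        for (int i = 0; i < 5; i++) {
            work.put("task-" + i);
        }
        work.put(POISON_PILL);                          // tell the consumer we are done
        consumer.join();
    }
}
With multiple consumers you would put one pill per consumer, or move to the mediator approach described above.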
Use a queue (for example, java.util.concurrent.LinkedBlockingQueue) instead of a list. Queues are specifically designed for this kind of scenario.
Don't go with List.
If you use a Queue instance then you can call remove() and you will retrieve the elements in FIFO order. If you use a List instance then you have no such guarantee.
Take the following code as an example:
ArrayList<Integer> list = new ArrayList<Integer>();
// Elements are added in one order...
list.add(5);
list.add(4);
list.add(3);
list.add(2);
list.add(1);
// ...and then rearranged in place with set(), which a Queue would not allow.
list.set(4,5);
list.set(3,4);
list.set(2,3);
list.set(1,2);
list.set(0,1);
System.out.println(list); // prints [1, 2, 3, 4, 5], not the insertion order
Iterating this list from index 0 to 4 would therefore not give you the elements in FIFO order.
Also, another difference is abstraction. With a Queue instance you don't have to worry about indexes and this makes things easier to think about if you don't need everything a List has to offer.

When to use a Queue over an ArrayList

One basic argument to use a Queue over an ArrayList is that Queue guarantees FIFO behavior.
But if I add 10 elements to an ArrayList and then iterate over the elements starting from the 0th element, then I will retrieve the elements in the same order as they were added. So essentially, that guarantees a FIFO behavior.
What is so special about Queue as compared to traditional ArrayList?
You can look at the Queue javadoc. The main difference is that a List lets you look at any element whenever you want. A Queue only lets you look at the "next" one.
Think of it as a real queue, or the line for the cash register at a grocery store. You don't ask the guy in the middle or at the end to pay next; you always ask the guy at the front, who has been waiting the longest.
It's worth noting that some lists are queues. Look at LinkedList, for example.
If I gave you a Queue instance then you would know that by iteratively calling remove() you would retrieve the elements in FIFO order. If I gave you an ArrayList instance then you can make no such guarantee.
Take the following code as an example:
ArrayList<Integer> list = new ArrayList<Integer>();
list.add(5);
list.add(4);
list.add(3);
list.add(2);
list.add(1);
list.set(4,5);
list.set(3,4);
list.set(2,3);
list.set(1,2);
list.set(0,1);
System.out.println(list);
If I were now to give you this list, then by iterating from 0 to 4 you would not get the elements in FIFO order.
Also, I would say another difference is abstraction. With a Queue instance you don't have to worry about indexes and this makes things easier to think about if you don't need everything ArrayList has to offer.
The limitations imposed on a queue (FIFO, no random access), as compared to an ArrayList, allow for the data structure to be better optimized, have better concurrency, and be a more appropriate and cleaner design when called for.
In regards to optimization and concurrency, imagine the common scenario where a producer fills a queue while a consumer drains it. If we used an ArrayList for this, then in a naive implementation each removal of the first element would cause a shift operation on the ArrayList in order to move down every other element. This is very inefficient, especially in a concurrent implementation, since the list would be locked for the duration of the entire shift operation.
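As a rough single-threaded illustration of that shift cost (timings vary by machine and this is not a proper benchmark, just a sketch):
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

public class DrainCost {
    public static void main(String[] args) {
        List<Integer> list = new ArrayList<>();
        Queue<Integer> queue = new ArrayDeque<>();
        for (int i = 0; i < 50_000; i++) {
            list.add(i);
            queue.add(i);
        }

        long t0 = System.nanoTime();
        while (!list.isEmpty()) {
            list.remove(0);            // shifts every remaining element left: O(n) per removal
        }
        long t1 = System.nanoTime();
        while (!queue.isEmpty()) {
            queue.poll();              // constant-time removal from the head
        }
        long t2 = System.nanoTime();

        System.out.printf("ArrayList drain:  %d ms%n", (t1 - t0) / 1_000_000);
        System.out.printf("ArrayDeque drain: %d ms%n", (t2 - t1) / 1_000_000);
    }
}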
In regards to design, if items are to be accessed in a FIFO fashion then using a queue automatically communicates that intention, whereas a list does not. This clarity of communication allows for easier understanding of the code, and may possibly make the code more robust and bug free.
The difference is that for a Queue, you are guaranteed to pull elements out in FIFO order. For an ArrayList, you have no idea what order the elements were added. Depending on how you use it, you could enforce FIFO ordering on an ArrayList. I could also design a wrapper for a Queue that allowed me to pull out which-ever element I wanted.
The point I'm trying to make is that these classes are designed to be good at something. You don't have to use them for that, but that's what they are designed and optimized for. Queues are very good at adding and removing elements, but bad if you need to search through them. ArrayLists, on the other hand, are a bit slower to add elements, but allow easy random access. You won't see it in most applications you write, but there is often a performance penalty for choosing one over the other.
Yes!
I would use the poll() and peek() methods of Queue: poll() retrieves and removes the head element, while peek() only examines it. These methods return the special value null if the operation fails, rather than throwing an exception, whereas remove() throws a NoSuchElementException on an empty queue.
Ref: docs.oracle.com
For example, the Queue methods poll() and remove() retrieve the head element and remove it from the Queue.
Some implementations of the Queue interface (e.g. PriorityQueue) allow you to assign a priority to the elements and to retrieve them according to that priority. That is much more than FIFO behaviour.
Consider a situation in which random processes update an ArrayList randomly and we are supposed to process the entries in FIFO order.
There is no way to do that other than changing the data structure from ArrayList to a Queue.

Java concurrency: lock efficiency

My program has 100 threads.
Every single thread does this:
1) if the arrayList is empty, add an element with certain properties to it
2) if the arrayList is not empty, iterate through its elements; if a suitable element is found (matching certain properties), take it and remove it from the arrayList
The problem here is that while one thread is iterating through the arrayList, the other 99 threads are waiting for the lock on the arrayList.
What would you suggest if I want all 100 threads to work in a lock-less manner, so they all have work to do?
Thanks
Have you looked at shared vs exclusive locking? You could use a shared lock on the list, and then have a 'deleted' property on the list elements. The predicate you use to check the list elements would need to make sure the element is not marked 'deleted' in addition to whatever other queries you have - also due to potential read-write conflicts, you would need to lock on each element as you traverse. Then periodically get an exclusive lock on the list to perform the deletes for real.
The read lock allows for a lot of concurrency on the list. The exclusive locks on each element of the list are not as nice, but you need to force the memory model to update your 'deleted' flag to each thread, so there's no way around that.
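A rough sketch of that shared/exclusive scheme (the WorkItem class and its matching property are hypothetical; ReentrantReadWriteLock provides the shared read lock, the per-element lock guards the claim, and a periodic sweep takes the write lock to do the real deletes):
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.concurrent.locks.ReentrantReadWriteLock;

class WorkItem {
    final int property;
    volatile boolean deleted;                    // visible to all threads
    WorkItem(int property) { this.property = property; }
    synchronized boolean claimIf(int wanted) {   // per-element lock guards the claim
        if (!deleted && property == wanted) {
            deleted = true;
            return true;
        }
        return false;
    }
}

class WorkList {
    private final List<WorkItem> items = new ArrayList<>();
    private final ReentrantReadWriteLock lock = new ReentrantReadWriteLock();

    void add(WorkItem item) {
        lock.writeLock().lock();
        try { items.add(item); } finally { lock.writeLock().unlock(); }
    }

    WorkItem findAndClaim(int wanted) {
        lock.readLock().lock();                  // many threads may scan concurrently
        try {
            for (WorkItem item : items) {
                if (item.claimIf(wanted)) return item;
            }
            return null;
        } finally { lock.readLock().unlock(); }
    }

    void sweep() {                               // run occasionally to drop claimed elements
        lock.writeLock().lock();
        try {
            for (Iterator<WorkItem> it = items.iterator(); it.hasNext(); ) {
                if (it.next().deleted) it.remove();
            }
        } finally { lock.writeLock().unlock(); }
    }
}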
First, if you're not running on a machine that has 64 cores or more, your 100 threads are probably a performance hog in themselves.
Then an ArrayList for what you're describing is certainly not a good choice because removing an element does not run in amortized constant time but in linear time O(n). So that's a second performance hog. You probably want to use a LinkedList instead of your ArrayList (if you insist on using a List).
Now of course I doubt very much that you need to iterate over your complete list each time you need to find one element: wouldn't another data structure be more appropriate? Maybe the elements that you put in your list have a concept of "equality", and hence a Map with O(1) lookup time could be used instead?
That's just for a start: as I showed, there are at least two serious performance issues in what you described. Maybe you should clarify your question if you want more help.
If your notion of "suitable element (matching certain properties)" can be encoded using a Comparator then a PriorityBlockingQueue would allow each thread to poll the queue, taking the next element without having to search the list or enqueuing a new element if the queue is empty.
Addendum: Thilo raises an essential point: as your approach evolves, you may want to determine empirically how many threads are optimal.
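A minimal sketch of that approach, assuming the matching properties can be boiled down to an int for the Comparator (the Job class is hypothetical):
import java.util.Comparator;
import java.util.concurrent.PriorityBlockingQueue;

public class SuitabilityQueue {
    // Hypothetical work item; "suitability" stands in for the matching properties.
    static class Job {
        final String name;
        final int suitability;
        Job(String name, int suitability) { this.name = name; this.suitability = suitability; }
    }

    public static void main(String[] args) throws InterruptedException {
        PriorityBlockingQueue<Job> jobs = new PriorityBlockingQueue<>(
                11, Comparator.comparingInt((Job j) -> j.suitability));

        jobs.add(new Job("low", 3));
        jobs.add(new Job("best", 1));
        jobs.add(new Job("mid", 2));

        Runnable worker = () -> {
            Job job = jobs.poll();   // takes the most suitable job, or null if the queue is empty
            if (job != null) {
                System.out.println(Thread.currentThread().getName() + " took " + job.name);
            }
        };
        Thread t1 = new Thread(worker);
        Thread t2 = new Thread(worker);
        t1.start(); t2.start();
        t1.join(); t2.join();
    }
}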
The key is to take the object lock on the ArrayList only when you actually need to.
A good idea would be to subclass ArrayList and synchronize the individual read, write, and delete operations.
This gives you fine-grained locking, allowing the threads to work through the ArrayList while still protecting its semantics.
Have a single thread own the array and be responsible for adding to it and iterating over it to find work to do. Once a unit of work is found, put the work on a BlockingQueue. Have all your worker threads use take() to remove work from the queue.
This allows multiple units of work to be discovered per pass through the array and they can be handed off to waiting worker threads fairly efficiently.
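A rough sketch of that owner-thread arrangement (the list contents and the notion of a "suitable" item are made up for illustration):
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class OwnerThreadDemo {
    public static void main(String[] args) {
        List<String> owned = new ArrayList<>();       // touched only by the owner thread
        BlockingQueue<String> handoff = new LinkedBlockingQueue<>();

        // Owner: scans its private list for work and hands matches to the queue.
        Thread owner = new Thread(() -> {
            for (int i = 0; i < 10; i++) owned.add("item-" + i);
            for (String item : owned) {
                if (item.endsWith("3") || item.endsWith("7")) {   // "suitable" items
                    handoff.offer(item);
                }
            }
        });

        // Workers: each blocks on take() until the owner hands it an item.
        for (int w = 0; w < 2; w++) {
            new Thread(() -> {
                try {
                    String item = handoff.take();
                    System.out.println(Thread.currentThread().getName() + " handled " + item);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }).start();
        }
        owner.start();
    }
}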
