I'm using the enhanced for loop in my Android application to iterate over a list of Status objects.
for (Status stat : status)
    timelineItems.add(new TimelineItem(stat));
This is generating the following IndexOutOfBoundsException:
java.lang.IndexOutOfBoundsException: Invalid index 0, size is 0
I realize this is happening because the size of the ArrayList is 0, which means it's empty. My question is: why did it even enter the loop in the first place? Isn't the enhanced for loop designed to guarantee that these problems don't occur, and how do I stop it?
First of all: don't use the enhanced for loop on Android if you're iterating over an ArrayList. It's slower than the regular indexed loop and uses more memory. For all other collections, it's OK.
What you're experiencing probably relates to non-synchronized modification of the collection. If the size suddenly changes while the loop is running, it'll cause errors like that one.
The only way this can happen is if somebody else (another thread) cleared the collection while the enhanced for loop was executing. The iterator identified that there was a zeroth element, but by the time it went to access it, the element had been deleted.
Related
I've been scratching my head at this for hours now, but I can't seem to figure this one out.
I'm writing a program that runs a constant loop in a Thread, which occasionally sends data back to another part of the program using an ArrayList.
I only want to send this data back after I have gathered 11 items in my ArrayList. So I used this code:
// The rest of the loop, in which I gather the values for key and velocity.
notenbuffer.add(new Noot(key, velocity));
if (notenbuffer.size() > 10) {
    System.out.println("Notenbuffer > 10, verstuur data");
    if (notenbuffer.isEmpty()) {
        System.out.println("buffer is empty!");
    } else {
        usbRead.writeNotes(notenbuffer);
        System.out.println("emptied the buffer!");
        notenbuffer.clear();
    }
}
Now for some weird reason the program never empties the ArrayList, and just keeps adding items to it. I've checked, and it does in fact reach the usbRead.writeNotes(notenbuffer) part, because this function gets called correctly. When I use the debugger, it simply skips to the beginning of the loop after this function is called.
Is there a way in which I can empty an Arraylist once it reaches a certain size?
Edit: I made a logic error. Writing if (notenbuffer.isEmpty()) here will always be false, because I am already inside an if statement that requires it to be false.
Did you put your second if inside your first by accident? (The missing indentation seems to suggest so.) Having if (notenbuffer.isEmpty()) inside the if (notenbuffer.size() > 10) block makes no sense at all logically. After all, if your List's size is > 10, then the list is obviously not empty, so if (notenbuffer.isEmpty()) can never be true. – OH GOD SPIDERS 15 mins ago
This was indeed the problem. Removing it exposed a NoClassDefFoundError that needed resolving.
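A minimal sketch of the corrected flush logic, with the dead isEmpty() branch removed. Noot and usbRead.writeNotes are the asker's types, so simple stand-ins (int[] entries and a "sent" list) are used here to keep the sketch runnable:

```java
import java.util.ArrayList;
import java.util.List;

public class NoteBuffer {
    // Stand-ins for the asker's Noot objects and usbRead.writeNotes(...).
    private final List<int[]> notenbuffer = new ArrayList<>();
    private final List<int[]> sent = new ArrayList<>();

    void add(int key, int velocity) {
        notenbuffer.add(new int[] {key, velocity});
        if (notenbuffer.size() > 10) {
            // size() > 10 already guarantees the buffer is non-empty,
            // so no isEmpty() check is needed: just send and clear.
            sent.addAll(notenbuffer); // stands in for usbRead.writeNotes(notenbuffer)
            notenbuffer.clear();
        }
    }

    int bufferSize() { return notenbuffer.size(); }
    int sentCount()  { return sent.size(); }
}
```

One caveat: writeNotes receives the very list that is cleared immediately afterwards. If it keeps the reference instead of copying the data, pass a copy, e.g. usbRead.writeNotes(new ArrayList<>(notenbuffer)).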
This question already has answers here:
How do I make my ArrayList Thread-Safe? Another approach to problem in Java?
(8 answers)
Closed 7 years ago.
This question is more about asking if my way of doing something is the "correct" way or not. I have some program that involves constantly updating graphical components. To that effect, I have the method below.
public void update(){
    for (BaseGameEntity movingEntity : movingEntityList) {
        ((MovingEntity) movingEntity).update();
    }
}
Essentially, the class containing this method has a list of all graphical objects that need updating and it loops through, calling their respective update methods.
The issue comes when I have to add new entities to this list or remove current ones from it. The addition and removal of entities is handled by a different thread, and as you can guess, this results in a ConcurrentModificationException if I try to add/remove entities while also looping through and updating their graphical components.
My ad hoc solution was to simply wrap this in a try-catch block and ignore any ConcurrentModificationException that crops up, in effect not updating at that specific time. This does exactly what I want, and no problems occur.
public void update(){
    try {
        for (BaseGameEntity movingEntity : movingEntityList) {
            ((MovingEntity) movingEntity).update();
        }
    } catch (ConcurrentModificationException e) {
        // Do nothing
    }
}
However, my question is, is this a "proper" way of handling this issue? Should I perhaps be doing something akin to what is outlined in this answer? What is the "correct" way to handle this issue, if mine is wrong? I'm not looking specifically for ways to make my arraylist thread safe such as through synchronized lists, I'm specifically asking if my method is a valid method or if there is some reason I should avoid it and actually use a synchronized list.
The proper way would be to synchronize the list with Collections.synchronizedList():
List list = Collections.synchronizedList(new ArrayList());
...
synchronized (list) {
    Iterator i = list.iterator(); // Must be in synchronized block
    while (i.hasNext())
        foo(i.next());
}
If you traverse the list far more often than you modify it, you can also use CopyOnWriteArrayList.
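A minimal sketch of why CopyOnWriteArrayList fits this read-heavy pattern: its iterators operate on a snapshot of the underlying array, so the update loop can run while another thread adds or removes entities, with no exception and no explicit locking:

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

public class CowDemo {
    // Iterators of CopyOnWriteArrayList work on a snapshot, so modifying
    // the list during iteration is safe and never throws
    // ConcurrentModificationException.
    static int[] iterateWhileAdding() {
        List<String> entities = new CopyOnWriteArrayList<>();
        entities.add("a");
        entities.add("b");
        int seen = 0;
        for (String e : entities) {
            entities.add("added-mid-loop"); // not visible to this iteration
            seen++;
        }
        return new int[] {seen, entities.size()};
    }

    public static void main(String[] args) {
        int[] r = iterateWhileAdding();
        System.out.println(r[0] + " " + r[1]); // prints "2 4"
    }
}
```

The trade-off is that every mutation copies the whole array, which is why it only pays off when reads vastly outnumber writes.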
If you don't mind occasional missing updates (or if they happen way too infrequently for the price of synchronization), your way is fine.
Is this a "proper" way of handling this issue?
If you do not mind gaining concurrency at the expense of dropping updates on error, then the answer is "yes". You do run the risk of failing to complete an update multiple times in a row when additions to and removals from the list happen often.
On the other hand, when the frequency of updates is significantly higher than the frequency of adding/removing an object, this solution sounds reasonable.
Should I perhaps be [using synchronized]?
This is also a viable solution. The difference is that an update would no longer be able to proceed while a modification of the list is in progress. This may not be desirable when the timing of calls to update is critical (yet it is not critical to update everything on every single call).
Some people consider this a duplicate of all the generic synchronization questions. I think it is not. You are asking about a very specific constellation, and whether your solution is "OK" in that sense.
Based on what you described, the actual goal seems to be clear: You want to quickly and concurrently iterate over the entities to call the update method, and avoid any synchronization overhead that may be implied by using Collections#synchronizedList or similar approaches.
Additionally, I assume that the main idea behind the solution that you proposed was that the update calls have to be done very often and as fast as possible, whereas adding or removing entities happens "rarely".
So, adding and removing elements is an exception, compared to the regular operations ;-)
And (as dasblinkenlight already pointed out in his answer) for such a setup, the solution of catching and ignoring the ConcurrentModificationException is reasonable, but you should be aware of the consequences.
It might happen that the update method of some entities is called, and then the loop bails out due to the ConcurrentModificationException. You should be absolutely sure that this does not have undesirable side-effects. Depending on what update actually does, this might, for example, cause some entities to move faster over the screen, and others to not move at all, because their update calls had been missed due to several ConcurrentModificationExceptions. This may be particularly problematic if adding and removing entities is not an operation that happens rarely: If one thread constantly adds or removes elements, then the last elements of the list may never receive an update call at all.
If you want some "justification by example": I first encountered this pattern in the JUNG Graph Library, for example, in the SpringLayout class and others. When I first saw it, I cringed a little, because at the first glance it looks horribly hacky and dangerous. But here, the justification is the same: The process has to be as fast as possible, and modifications to the graph structure (which would cause the exception) are rare. Note that the JUNG guys actually do recursive calls to the respective method when the ConcurrentModificationException happens - simply because they can't always assume the method to be called constantly by another thread. This, in turn, can have nasty side-effects: If another thread does constant modifications, and the ConcurrentModificationException is thrown each time when the method is called, then this will end with a StackOverflowError... But this is not the case for you, fortunately.
What happens when a map has to be resized to accommodate more items? What will happen if another thread calls get() when resizing is underway?
Your question about what will happen if another thread calls get() while the map is being resized reveals a general lack of intuition about the Java Memory Model. Specifically, if the map implementation you use is not thread-safe, it is irrelevant when get() is called: in the middle of a resize or while the map is sitting completely idle, the call from another thread is always broken due to write-visibility issues.
Simply put, there are only two conditions in Java: thread-safe and not thread-safe. No further details are necessary.
In short: it can fail unpredictably, possibly by throwing an exception. The reason is that HashMap is not thread-safe, so when another thread tries to read the data during a resize there is no guarantee about what it sees; the exact outcome (a wrong result, an exception, or worse) depends entirely on the implementation.
I will try to explain what will happen there. Just in case - one should not do this.
As far as I understand the HashMap implementation, if you are writing to it from a single thread it will not crash, but a concurrent reader might not find your item. With two concurrent writer threads it would crash very early because of a ConcurrentModificationException.
Why it won't crash
The resize operation in HashMap is sort of atomic: it first allocates a new table, re-indexes the items into it, and then replaces the existing table with the new one.
The table size grows but never shrinks, so even with a broken indexing function you are still within bounds.
get will never throw a NullPointerException; one can check that getEntry(Object key) never does.
Why it might not find your item
Because the table[hash(key)] operation is non-atomic, you might run into a situation where you computed the hash index against the old table, but after the resize the table is already a new one.
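If concurrent reads during resizes are a real requirement, the standard answer is to not use a plain HashMap at all. A minimal sketch with ConcurrentHashMap, which explicitly supports readers proceeding while a writer grows the table:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class SafeMapDemo {
    // ConcurrentHashMap lets get() run safely while put() triggers
    // internal resizes; a plain HashMap gives no such guarantee.
    static int concurrentFill() throws InterruptedException {
        Map<Integer, Integer> map = new ConcurrentHashMap<>();
        Thread writer = new Thread(() -> {
            for (int i = 0; i < 10_000; i++) map.put(i, i); // forces several resizes
        });
        Thread reader = new Thread(() -> {
            for (int i = 0; i < 10_000; i++) map.get(i); // safe during resizes
        });
        writer.start();
        reader.start();
        writer.join();
        reader.join();
        return map.size();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(concurrentFill()); // prints "10000"
    }
}
```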
We have a JSF 2.1 app deployed on WebLogic 10.3.4. In one of our backing beans, when we try to assign an ArrayList reference to a List instance, WebLogic ends up with a stuck thread during peak traffic to our application.
java.util.ArrayList.indexOf(ArrayList.java:210)
java.util.ArrayList.contains(ArrayList.java:199)
Has anyone faced this issue before?
It is not entirely clear what you mean, so I'm going to assume that you mean a "stuck thread", and that the thread is stuck in the sense that it is continually executing at that point.
I can think of three plausible causes.
1. The object being searched for has a buggy equals(Object) method that in some circumstances goes into an infinite loop.
2. Two (or more) threads are accessing and/or updating the list roughly simultaneously, and you are not synchronizing properly. Without proper synchronization, the threads may see inconsistent views of the data structure, which can make it behave in ways that seem impossible.
3. You've somehow set up a pathological situation in which one thread is both reading and updating the list in the (incorrect) belief that it has two distinct lists.
My bet is that it is the second problem, since "heisenbugs" like that are more likely to occur when your server is under heavy load.
Finally, it is possible that the thread is not in an infinite loop but is just taking a long time to do something. It is also possible that the loop involves other code, but each time you look, it happens to be at that point.
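If the second cause is the culprit, the fix is to make sure every access to the shared list, including reads like contains()/indexOf(), goes through the same lock. A minimal sketch (the writer/reader workload is invented for illustration, the asker's actual bean code is not shown):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class GuardedListDemo {
    // Collections.synchronizedList makes each individual call (add, remove,
    // contains, indexOf) atomic, so a concurrent reader can no longer observe
    // a half-updated internal state.
    static int hammer() throws InterruptedException {
        List<Integer> list = Collections.synchronizedList(new ArrayList<>());
        Thread writer = new Thread(() -> {
            for (int i = 0; i < 5_000; i++) {
                list.add(i);
                list.remove(Integer.valueOf(i)); // remove the element, not an index
            }
        });
        Thread reader = new Thread(() -> {
            for (int i = 0; i < 5_000; i++) {
                list.contains(i); // the call that was "stuck" in the trace
            }
        });
        writer.start();
        reader.start();
        writer.join();
        reader.join();
        return list.size(); // every add was matched by a remove
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(hammer()); // prints "0"
    }
}
```

Remember that compound operations (iterate, check-then-act) still need an explicit synchronized (list) { ... } block around them.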
I'm working on a Java class in an Android project that summarizes array entries saved in previous classes, with each entry itself being an array with multiple elements.
I have created methods to move forwards and backwards through the entries, but since there can be over 100 entries, I would like to create another method that cycles through them instead of pressing the "Next" button over and over again.
Is there a way to do this?
I've found that loops will only show the last entry, but below is the closest example I can come up with of what I need.
for (int i = Selection; i <= Qty; i++) {
    Num.setText(Integer.toString(i));
    loadNext();
    try {
        Thread.sleep(1500);
    } catch (InterruptedException e) {}
    if (Brk=true) {
        break;
    }
}
The solution closest to your original code would be to create a background thread that runs the loop, loading each item inside Activity.runOnUiThread(). You can also do a similar thing with AsyncTask and progress updates. See this article for more information on both:
http://developer.android.com/resources/articles/painless-threading.html
However, a better solution is to not have a loop at all: just use a timer, and increment your loop variable each time the timer fires.
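A pure-Java sketch of that timer approach. On Android you would typically use a Handler with postDelayed() so the ticks run on the main thread; ScheduledExecutorService is used here as a stand-in so the idea is runnable anywhere:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class SlideshowTimer {
    // Cycles through `total` entries, one tick per period, without
    // blocking the calling thread the way Thread.sleep() in a loop would.
    static int runSlideshow(int total, long periodMillis) throws InterruptedException {
        ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();
        AtomicInteger current = new AtomicInteger();
        CountDownLatch done = new CountDownLatch(1);
        timer.scheduleAtFixedRate(() -> {
            int i = current.incrementAndGet();
            // On Android, this is where Num.setText(...) and loadNext()
            // would run (posted to the UI thread); here we just count ticks.
            if (i >= total) {
                done.countDown();
            }
        }, 0, periodMillis, TimeUnit.MILLISECONDS);
        done.await();
        timer.shutdownNow();
        return current.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runSlideshow(5, 10));
    }
}
```

To stop early (the Brk case), keep the ScheduledFuture returned by scheduleAtFixedRate and call cancel(false) on it.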
It may work. However, it will freeze your UI each time you call the sleep method. In general, when dealing with UI work, never sleep in a loop like this; use the Handler class instead. There is a lot of documentation, but if after searching exhaustively you can't find a good example, just let me know.
Your break condition is wrong, and causes the loop to break at the first iteration:
if (Brk=true) {
    break;
}
Brk=true is an assignment expression, not a comparison, so it always evaluates to true. The expression would need to be Brk==true to check whether the variable's value is true. But since it is a boolean variable, you don't need a comparison at all; just reference it in the if statement:
if (Brk) {
    break;
}