When I declare an iterator as a global variable, my method does not execute - Java

My app is a recording app.
I used an ArrayList, but it caused a memory leak:
private ArrayList<OutputInputPair> pairs = new ArrayList<OutputInputPair>();
When I click the record-stop button I execute pairs.clear(); pairs = null;
but if the user never clicks the record-stop button, the memory leak remains.
So I switched to a WeakHashMap, following this question:
ArrayList<WeakReference<Runnable>> - How to tidy up best?
Previously I had declared, as a global variable,
private ArrayList<OutputInputPair> pairs = new ArrayList<OutputInputPair>();
and I changed it to
private WeakHashMap<OutputInputPair, Void> pairs = new WeakHashMap<OutputInputPair, Void>();
and also declared Iterator<OutputInputPair> iterator = pairs.keySet().iterator(); as a global variable.
According to my plan the method should run,
but when I declare the iterator as a global variable my method does not execute.
Here is my method:
public void process() {
    while (iterator.hasNext()) {
        OutputInputPair pair = iterator.next();
        // data insert on queue
    }
    while (!stopped) { // when I click record stop button, stopped is true
        while (iterator.hasNext()) {
            OutputInputPair pair = iterator.next();
            Log.d(TAG, "<<<<<process>>>>"); // not show this log
            recordstart(pair);
        }
    }
}
But if I write Iterator<OutputInputPair> iterator = pairs.keySet().iterator(); inside my method, the method executes:
@Override
public void process() {
    Iterator<OutputInputPair> iterator = pairs.keySet().iterator(); // now created locally
    while (iterator.hasNext()) {
        OutputInputPair pair = iterator.next();
        // data insert on queue
    }
    while (!stopped) { // when I click record stop button, stopped is true
        while (iterator.hasNext()) {
            OutputInputPair pair = iterator.next();
            Log.d(TAG, "<<<<<process>>>>"); // not show this log
            recordstart(pair);
        }
    }
}
With this version my method executes.
In other words, why does declaring Iterator<OutputInputPair> iterator = pairs.keySet().iterator(); as a global variable stop my method from executing?
The reason I want it as a global variable is that when I create it inside the method, the method runs indefinitely because of while (!stopped).
Please help me.
Thanks.

It's hard to say for sure without knowing more of your code's context, but if you look at the docs for HashMap.keySet it reads:
If the map is modified while an iteration over the set is in progress (except through the iterator's own remove operation), the results of the iteration are undefined.
So if you initialize an iterator over the keySet in a static variable (what you're calling "global" I guess), and then some other part of your code modifies the entries in the hash map, it could have unpredictable results.
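One common way to avoid that undefined behavior, sketched here as a minimal example rather than a drop-in fix, is to take a snapshot of the keys inside process() itself, right before iterating, so that later changes to the WeakHashMap (new entries being added, or weakly-referenced keys being garbage-collected) cannot invalidate an iterator created earlier:
public void process() {
    // Snapshot the current keys (needs java.util.List and java.util.ArrayList);
    // a plain list is unaffected by later map changes or collected keys.
    List<OutputInputPair> snapshot = new ArrayList<>(pairs.keySet());
    for (OutputInputPair pair : snapshot) {
        // data insert on queue
    }
    while (!stopped) {
        // note: unlike an exhausted iterator, this re-processes the snapshot
        // on every pass of the outer loop
        for (OutputInputPair pair : snapshot) {
            recordstart(pair);
        }
    }
}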

Related

Refreshing cache without impacting latency to access the cache

I have cache-refresh logic and want to make sure it is thread-safe and done the correct way.
public class Test {
    Set<Integer> cache = Sets.newConcurrentHashSet();

    public boolean contain(int num) {
        return cache.contains(num);
    }

    public void refresh() {
        cache.clear();
        cache.addAll(getNums());
    }
}
So I have a background thread refreshing the cache by periodically calling refresh, and multiple threads calling contain at the same time. I was trying to avoid putting synchronized in the method signatures, because refresh could take some time (imagine getNums() makes network calls and parses huge amounts of data) and contain would then be blocked.
I think this code is not good enough, because if contain is called between clear and addAll, it will always return false.
What is the best way to refresh the cache without adding significant latency to the contain call?
The best way is the functional-programming style with immutable state: instead of adding and removing elements in a shared Set, you create an entirely new Set every time the contents change (Java 9 added convenient factories for such immutable collections).
Retrofitting that onto legacy code can be awkward or infeasible, so an alternative is a single volatile Set field that contain reads, and which refresh assigns a brand-new instance to:
public class Test {
    volatile Set<Integer> cache = new HashSet<>();

    public boolean contain(int num) {
        return cache.contains(num);
    }

    public void refresh() {
        Set<Integer> privateCache = new HashSet<>();
        privateCache.addAll(getNums());
        cache = privateCache;
    }
}
Edit: we don't want or need a concurrent set here. Concurrent collections are for adding and removing elements of one shared collection from several threads at the same time, which in my opinion is a pretty useless thing to do here. What we want is to swap the old Set for a new one, and for that a volatile field is enough: readers always see either the old, fully built set or the new one, never a half-updated cache.
But, as I mentioned at the start of my answer, if you never modify collections and instead make a new one each time you want to update (a fairly cheap one-off copy compared to synchronizing every read), you never need to worry about concurrency at all, because there is no shared mutable state between threads.
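A minimal sketch of that copy-on-update style (using Collections.unmodifiableSet so it works on any Java version; getNums() here is just a placeholder for the expensive load):
import java.util.*;

public class ImmutableCache {
    // always points to a complete, never-mutated snapshot
    private volatile Set<Integer> cache = Collections.emptySet();

    public boolean contain(int num) {
        return cache.contains(num);
    }

    public void refresh() {
        // build the replacement fully, then publish it with a single volatile write
        cache = Collections.unmodifiableSet(new HashSet<>(getNums()));
    }

    private Collection<Integer> getNums() {
        return Arrays.asList(1, 2, 3); // placeholder for the real, slow source
    }
}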
How would you make sure your cache doesn't contain invalid entries when calling contains? Furthermore, you'd need to call refresh every time the result of getNums() changes, which is pretty inefficient. It would be better to control the changes to getNums() yourself and update the cache accordingly. The cache might look like:
public class MyCache {
    // a ConcurrentHashMap so that putIfAbsent can be used
    final ConcurrentHashMap<Integer, Boolean> cache = new ConcurrentHashMap<>();

    public boolean contains(Integer num) {
        return cache.containsKey(num); // containsKey, not contains: ConcurrentHashMap.contains checks values
    }

    public void add(Integer num) {
        cache.putIfAbsent(num, true);
    }

    public void clear() {
        cache.clear();
    }

    public void remove(Integer num) {
        cache.remove(num);
    }
}
Update
As @schmosel made me realize, mine was a wasted effort: it is in fact enough to initialize a completely new HashSet<> with your values in the refresh method, assuming of course that the cache field is marked volatile. In short, @Snickers3192's answer points out what you seek.
Old answer
You can also use a slightly different system.
Keep two Set<Integer>, one of which will always be empty. When you refresh the cache, you can asynchronously re-initialize the second one and then just switch the pointers. Other threads accessing the cache won't see any particular overhead in this.
From an external point of view, they will always be accessing the same cache.
private volatile int currentCache; // 0 or 1
private final Set<Integer> caches[] = new HashSet[2]; // use two caches; either one will always be empty, so not much memory is consumed
private volatile Set<Integer> cachePointer = null; // just a pointer to the current cache, must be volatile

// initialize
{
    this.caches[0] = new HashSet<>(0);
    this.caches[1] = new HashSet<>(0);
    this.currentCache = 0;
    this.cachePointer = caches[this.currentCache]; // point to cache one from the beginning
}
Your refresh method may look like this:
public void refresh() {
    // store the current cache pointer
    final int previousCache = this.currentCache;
    final int nextCache = getNextPointer();
    // you can easily compute it asynchronously;
    // in the meantime, external threads will still access the normal cache
    CompletableFuture.runAsync(() -> {
        // fill the unused cache
        caches[nextCache].addAll(getNums());
        // then switch the pointer to the just-filled cache;
        // from this point on, threads are accessing the new cache
        switchCachePointer();
        // empty the other cache, still on the async thread
        caches[previousCache].clear();
    });
}
where the utility methods are:
public boolean contains(final int num) {
    return this.cachePointer.contains(num);
}

private int getNextPointer() {
    return (this.currentCache + 1) % this.caches.length;
}

private void switchCachePointer() {
    // make cachePointer point to a new cache
    this.currentCache = this.getNextPointer();
    this.cachePointer = caches[this.currentCache];
}

How does this asynchronous call work in my example

I am learning Java and wonder whether the item in this code line:
useResult(result, item);
will be overwritten by the next call coming from
doItem(item);
Here's the example:
public void doSomeStuff() {
    // list with 100 items
    for (Item item : list) {
        doItem(item);
    }
}

private void doItem(final Item item) {
    someAsyncCall(item, new SomeCallback() {
        @Override
        public void onSuccess(final Result result) {
            useResult(result, item);
        }
    });
}
The SomeCallback runs some time in the future, on another thread.
I mean: will the item in useResult(result, item); still be the same when the callback returns?
Please advise what happens here.
I mean: will the item in useResult(result, item); still be the same when the callback returns?
Of course it will; what would be the use of it otherwise?
What you are doing is creating 100 different SomeCallback instances, each of which will process a different Item object.
A skeleton for your someAsyncCall may look like this:
public static void someAsyncCall(Item i, Callback callback) {
    CompletableFuture.runAsync(() -> { // new thread
        Result result = computeResult(i);
        callback.onSuccess(result, i);
    });
}
The point is: the Callback, at the moment of instantiation, doesn't know anything about the Item it will get as a parameter. It only knows it when Callback::onSuccess is executed in the future.
So, will Item i change (be assigned a new object)?
No, because it is effectively final within someAsyncCall (the reference is never reassigned).
You can't even assign i = new Item(), as the compiler will complain about the anonymous function accessing a non-final variable.
You could of course create a new Item and pass it to the callback
Item i2 = new Item();
callback.onSuccess(result, i2);
but then it would become one hell of a nasty library...
Nobody forbids you to do i.setText("bla") though, unless your Item class is immutable (its member fields are final themselves).
EDIT
If your question is how Java handles objects in method parameters, then the answer is: the parameters are copies of the original references.
You could try a simple swap method, void swap(Item i1, Item i2);, and you'll notice the references are swapped only within the function; as soon as you return, the caller's variables still point to their original instances.
But each copy still refers to the original instance, so changes made through it are visible to the caller.
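A minimal, self-contained sketch of that swap experiment (Box is just an illustrative stand-in for Item):
public class SwapDemo {
    static class Box {
        String label;
        Box(String label) { this.label = label; }
    }

    static void swap(Box a, Box b) {
        Box tmp = a; // only the local copies of the references are swapped
        a = b;
        b = tmp;
    }

    static void rename(Box b) {
        b.label = "renamed"; // mutating the object through the copied reference
    }

    public static void main(String[] args) {
        Box x = new Box("x"), y = new Box("y");
        swap(x, y);
        System.out.println(x.label + " " + y.label); // still "x y": the caller's references are untouched
        rename(x);
        System.out.println(x.label); // "renamed": the shared object itself was changed
    }
}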
Coming back to your example: imagine your someAsyncCall waits 10000 ms before executing the callback,
and in your for loop, right after you call doItem, you also do item.setText("bla");.
When you then print item.getText() within useResult you will get "bla", even though the text was changed after the async function was called.
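A minimal runnable sketch of that timing (all class and method names here are illustrative, not your real ones; a one-second sleep stands in for the 10000 ms):
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;

public class CaptureDemo {
    static class Item {
        private String text;
        String getText() { return text; }
        void setText(String text) { this.text = text; }
    }

    interface Callback { void onSuccess(String result); }

    static void someAsyncCall(Item item, Callback cb) {
        CompletableFuture.runAsync(() -> {
            try { TimeUnit.SECONDS.sleep(1); } catch (InterruptedException ignored) { }
            // the callback reads the shared Item only now, long after it was submitted
            cb.onSuccess("result for " + item.getText());
        });
    }

    public static void main(String[] args) throws InterruptedException {
        Item item = new Item();
        item.setText("original");
        someAsyncCall(item, result -> System.out.println(item.getText())); // prints "bla"
        item.setText("bla"); // mutated before the delayed callback fires
        Thread.sleep(2000);  // keep the JVM alive long enough to see the output
    }
}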

java.lang.IllegalStateException while trying to use MongoDB BulkWriteOperation

I have this code that dumps documents into MongoDB once an ArrayBlockingQueue fills its quota. When I run the code, it seems to only run once and then gives me a stack trace. My guess is that the BulkWriteOperation somehow has to be 'reset' or started over again.
Also, I create the BulkWriteOperations in the constructor...
bulkEvent = eventsCollection.initializeOrderedBulkOperation();
bulkSession = sessionsCollection.initializeOrderedBulkOperation();
Here's the stacktrace.
10 records inserted
java.lang.IllegalStateException: already executed
at org.bson.util.Assertions.isTrue(Assertions.java:36)
at com.mongodb.BulkWriteOperation.insert(BulkWriteOperation.java:62)
at willkara.monkai.impl.managers.DataManagers.MongoDBManager.dumpQueue(MongoDBManager.java:104)
at willkara.monkai.impl.managers.DataManagers.MongoDBManager.addToQueue(MongoDBManager.java:85)
Here's the code for the Queues:
public void addToQueue(Object item) {
    if (item instanceof SakaiEvent) {
        if (eventQueue.offer((SakaiEvent) item)) {
        } else {
            dumpQueue(eventQueue);
        }
    }
    if (item instanceof SakaiSession) {
        if (sessionQueue.offer((SakaiSession) item)) {
        } else {
            dumpQueue(sessionQueue);
        }
    }
}
And here is the code that reads from the queues, adds the items to a BulkWriteOperation (initializeOrderedBulkOperation), executes it, and dumps them to the database. Only 10 documents get written and then it fails.
private void dumpQueue(BlockingQueue q) {
    Object item = q.peek();
    Iterator itty = q.iterator();
    BulkWriteResult result = null;
    if (item instanceof SakaiEvent) {
        while (itty.hasNext()) {
            bulkEvent.insert(((SakaiEvent) itty.next()).convertToDBObject());
            // It's failing at that line ^^
        }
        result = bulkEvent.execute();
    }
    if (item instanceof SakaiSession) {
        while (itty.hasNext()) {
            bulkSession.insert(((SakaiSession) itty.next()).convertToDBObject());
        }
        result = bulkSession.execute();
    }
    System.out.println(result.getInsertedCount() + " records inserted");
}
The general documentation applies to all driver implementations in this case:
"After execution, you cannot re-execute the Bulk() object without reinitializing."
So the .execute() method effectively "drains" the current list of operations that have been sent to it, and afterwards it holds state information about how the commands were actually sent. So you cannot add more entries or call .execute() again on the same instance without reinitializing.
So after you call execute on each "Bulk" object, you need to call the initialization again:
bulkEvent = eventsCollection.initializeOrderedBulkOperation();
bulkSession = sessionsCollection.initializeOrderedBulkOperation();
Place each of those lines again, respectively, after each .execute() call in your function. Further calls to those instances can then add operations and call execute again, continuing the cycle.
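Applied to your dumpQueue method, the event branch would then look roughly like this (a sketch only; the session branch gets the same treatment):
if (item instanceof SakaiEvent) {
    while (itty.hasNext()) {
        bulkEvent.insert(((SakaiEvent) itty.next()).convertToDBObject());
    }
    result = bulkEvent.execute();
    // re-initialize so the next call to dumpQueue gets a fresh, un-executed bulk object
    bulkEvent = eventsCollection.initializeOrderedBulkOperation();
}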
Note that "Bulk" operations objects will store as many items as you want to put into them but will break up requests to the server into maximum amounts of 1000 items. After execution the state of the operations list will reflect exactly how this is done should you want to inspect that.

Remove element from ArrayList internally

I have the following code:
class Action {
    public void step(Game game) {
        // if some condition is met,
        // then remove self from the action stack
        game.actionStack.remove(this);
    }
}

class Game {
    public ArrayList<Action> actionStack;

    public Game() {
        actionStack = new ArrayList<Action>();
        actionStack.add(new Action());
        while (true) {
            for (Action action : this.actionStack) {
                action.step(this);
            }
        }
    }
}
An exception gets thrown when game.actionStack.remove(this); occurs. Is there a way to remove the element safely from inside the Action class like I want?
I'm guessing you're getting a ConcurrentModificationException because you're calling the list remove method while iterating it. You can't do that.
An easy fix is to work on a copy of the array when iterating:
for (Action action : new ArrayList<>(this.actionStack)) {
    action.step(this);
}
A slightly more efficient fix is to use an explicit Iterator and call its remove method. Perhaps have step() return a boolean indicating whether it wants to remain in the list for the next step or not:
for (Iterator<Action> it = this.actionStack.iterator(); it.hasNext(); ) {
    Action action = it.next();
    if (!action.step(this)) {
        it.remove();
    }
}
From the Java tutorial we get the following:
Iterators
...
Note that Iterator.remove is the only safe way to modify a collection during iteration; the behavior is unspecified if the underlying collection is modified in any other way while the iteration is in progress.
Use Iterator instead of the for-each construct when you need to:
Remove the current element. The for-each construct hides the iterator, so you cannot call remove. Therefore, the for-each construct is not usable for filtering.
Iterate over multiple collections in parallel.
The following method shows you how to use an Iterator to filter an arbitrary Collection — that is, traverse the collection removing specific elements.
static void filter(Collection<?> c) {
    for (Iterator<?> it = c.iterator(); it.hasNext(); )
        if (!cond(it.next()))
            it.remove();
}
This simple piece of code is polymorphic, which means that it works for any Collection regardless of implementation. This example demonstrates how easy it is to write a polymorphic algorithm using the Java Collections Framework.
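Tying this back to your Action and Game classes, here is a minimal sketch under the assumption that step() is changed to return whether the action should stay on the stack (remainingSteps is just an illustrative stand-in for your real condition):
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

class Action {
    private int remainingSteps = 3; // stand-in for "some condition met"

    // returns false once the action is finished and should be removed
    public boolean step(Game game) {
        return --remainingSteps > 0;
    }
}

class Game {
    public final List<Action> actionStack = new ArrayList<>();

    public Game() {
        actionStack.add(new Action());
        while (!actionStack.isEmpty()) {
            for (Iterator<Action> it = actionStack.iterator(); it.hasNext(); ) {
                if (!it.next().step(this)) {
                    it.remove(); // the only safe way to remove while iterating
                }
            }
        }
    }
}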
Note: I assume you implemented the equals and hashCode methods for your class.
You need to use an iterator to remove, like below:
class Game {
    public ArrayList<Action> actionStack;

    public Game() {
        actionStack = new ArrayList<Action>();
        actionStack.add(new Action());
        while (true) {
            for (Iterator<Action> it = this.actionStack.iterator(); it.hasNext(); ) {
                it.next(); // must advance before remove(), or remove() throws IllegalStateException
                it.remove();
            }
        }
    }
}
Edit: the step function only does a simple remove, so I moved that work into the Game constructor.
I suspect that you are getting a ConcurrentModificationException. I would suggest you do it like this:
class Action {
    public void step(Game game) {
        // if some condition is met,
        // then remove self from the action stack
        ArrayList<Action> tmpActionList = new ArrayList<>(game.actionStack); // work on a copy
        tmpActionList.remove(this);
        game.actionStack = tmpActionList;
    }
}
Let me know if it works.

arraylist remove eliminates following objects

I'm getting very strange behavior in my code. I have an ArrayList of the following class.
class mySocket
{
    public String name;
    public Socket sck;

    public mySocket(String n, Socket s)
    {
        this.name = n;
        this.sck = s;
    }
}
I declare the object like this
ArrayList<mySocket> handler = new ArrayList<>();
Now the problem is that when I try to remove an item using this method:
public void removeByName(String name)
{
    synchronized (this)
    {
        mySocket t;
        int i;
        for (i = 0; i < handler.size(); i++)
        {
            t = handler.get(i);
            if (t.name.equals(name))
            {
                handler.remove(i);
                break;
            }
        }
    }
}
The remove call seems to clear everything that follows the removed index.
For example:
if the ArrayList has 3 elements and I call handler.remove(1), it removes not only element 1 but also the object at index 2.
I think your issue is that you are using an indexed for loop and removing by index. In your example, if your list has 3 elements and you remove index 1, the object that was at index 2 is still there. It's just now at index 1.
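A quick demonstration of that shift (using java.util.ArrayList and java.util.Arrays):
List<String> list = new ArrayList<>(Arrays.asList("a", "b", "c"));
list.remove(1);                  // removes "b"
System.out.println(list);        // [a, c]
System.out.println(list.get(1)); // "c" is still present, just moved down to index 1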
A better way to do what you're attempting is to use an iterator or for-each loop.
// code outside the for loop stays the same
for (mySocket socket : handler) {
    if (socket.name.equals(name)) {
        handler.remove(socket);
        break;
    }
}
Is the ordering of your mySocket objects important? If not, storing them in a Map keyed by name would save you some trouble. Then you would just call handler.remove(name). This operation is safe even if name doesn't exist in the map. Also, for current uses of the handler collection that don't care about the name, you can retrieve the unordered collection of mySockets by calling map.values(). You can then iterate over that collection using an iterator or for-each as above.
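A minimal sketch of that Map-based variant (assuming one socket per name; using a ConcurrentHashMap also removes the need for the synchronized block):
private final Map<String, Socket> handler = new ConcurrentHashMap<>();

public void add(String name, Socket sck) {
    handler.put(name, sck);
}

public void removeByName(String name) {
    handler.remove(name); // safe even if the name is not present
}

public Collection<Socket> allSockets() {
    return handler.values(); // iterate over these when the name doesn't matter
}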
You CANNOT remove items from a Collection while looping through it; the result, as you have seen, is undefined.
You either have to build a list of items to be removed and use
originalList.removeAll(itemsToBeRemoved);
Or you build your loop using an iterator.
Iterator<mySocket> handlerIterator = handler.iterator();
while (handlerIterator.hasNext()) {
    mySocket t = handlerIterator.next();
    if (t.name.equals(name)) {
        handlerIterator.remove();
    }
}
