How can more than one thread execute my method? - Java

Hey, I'm sorry to be asking this here, but my lecturer won't help me with past exam paper answers.
How can more than one thread execute my oneTimeOnly() method, and what steps would I need to take to make it thread safe, i.e. executed only once, by only one thread?
public class ExampleClass {
    private volatile boolean flag = false;

    public void someOperation() {
        if (flag != true) {
            oneTimeOnly();
        }
        flag = true;
    }
}

In a literal sense, nothing prevents your method from being executed as many times as it is called; it is only at a higher level that the main logic of the method can be guaranteed to run just once. I am saying this because some teachers/interviewers love to nitpick and play mind games with their students/interviewees.
The problem with your approach is that two threads may execute your method simultaneously: both can pass the flag check before either of them sets flag to true, so oneTimeOnly() ends up running twice.
You can synchronize the method, or you can use an AtomicBoolean, which can ensure that only one thread ever sets it to true. I believe that is too advanced for your class assignment, though, so stick with a synchronized method.
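For illustration, here is a minimal sketch of the AtomicBoolean approach (the class and method names simply mirror the ones in the question; the body of oneTimeOnly() is a placeholder):
import java.util.concurrent.atomic.AtomicBoolean;

public class ExampleClass {
    private final AtomicBoolean flag = new AtomicBoolean(false);

    public void someOperation() {
        // compareAndSet atomically flips false -> true for exactly one caller;
        // every other thread sees false returned and skips oneTimeOnly()
        if (flag.compareAndSet(false, true)) {
            oneTimeOnly();
        }
    }

    private void oneTimeOnly() {
        // runs at most once across all threads
    }
}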

You should declare your method as synchronized:
public synchronized void someOperation() {
...
}
In this way you can be sure that only one thread at a time is executing your method's code, ensuring that the flag is set only once, by the first thread calling it; all other threads entering afterwards will find it already set.
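Put together, a possible synchronized version of the class from the question could look like this (a sketch; the check is moved inside the lock so the test and the update happen atomically):
public class ExampleClass {
    private boolean flag = false; // volatile is no longer needed once all access is synchronized

    public synchronized void someOperation() {
        // the check and the update are now atomic, because both happen under the same lock
        if (!flag) {
            flag = true;
            oneTimeOnly();
        }
    }

    private void oneTimeOnly() {
        // placeholder body; runs at most once
    }
}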

Make the method synchronized or use a synchronized block, like:
public synchronized void oneTimeOnly() {}

Related

synchronization and synchronized block in java

I've got a few questions about synchronization in Java.
public class A {
    private int i;

    public A(int t) { i = t; }

    public synchronized void increment() {
        i++;
    }

    public synchronized void decrement() {
        i--;
    }
}
Suppose I have a class implemented as above and I create an object (p) of type A.
I know there can only be one thread executing p.increment(), but is it possible for another thread to execute p.decrement() at the same time?
Thanks~
synchronized does not protect methods. synchronized does not protect objects. synchronized does one thing, and one thing only.
The way you wrote your increment method is actually just a shorthand way of writing this:
public void increment() {
    synchronized (this) {
        i++;
    }
}
This longer way of expressing it makes it clear that synchronized is operating on this.
So, the one thing that synchronized does is: The JVM will not allow two threads to be synchronized on the same object at the same time.
If you have an object, p, of type A, then the answer is "no". The JVM will not allow one thread to do the increment at the same time the other thread does the decrement because both threads would be trying to synchronize on the same object, p.
On the other hand, if you have one object p and another object q, both of type A, then one thread could be in a p.increment() call while another thread is in the q.decrement() call. That's because each thread would be synchronized on a different object, and that is allowed.
P.S.: synchronized actually does one other thing that pertains to the Java concept called "Happens Before". That's something you should learn about (Google is your friend) before you get much deeper into multi-threaded programming.
No. Using synchronized as a method modifier is equivalent to wrapping the method with synchronized(this).
No. The synchronized keyword will not allow multiple threads to enter those methods on the same instance of the object. Note that if you have a p1 AND a p2 of class A, you could have p1.increment() running at the same time as p2.decrement() (etc.).
Actually, since the field is only touched inside synchronized methods, access to it is synchronized as well, so no, increment() and decrement() won't execute on the same instance at the same time.
Note that the other method can still be called by another thread; it will simply block until the lock is released.
Personally, if it is only a matter of increments/decrements, I wouldn't bother synchronizing and would simply use the atomic classes.
See the documentation for the atomic classes.
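As a sketch of that last suggestion, the same class could be written with an AtomicInteger instead of synchronized methods (the getter is added here just for illustration):
import java.util.concurrent.atomic.AtomicInteger;

public class A {
    private final AtomicInteger i;

    public A(int t) { i = new AtomicInteger(t); }

    // no synchronized needed: the AtomicInteger performs the update atomically
    public void increment() { i.incrementAndGet(); }

    public void decrement() { i.decrementAndGet(); }

    public int get() { return i.get(); }
}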

Java: Concurrency inside ActionListener.actionPerformed

Today I started dealing with concurrency in Java (probably that was a bad idea...).
I read some articles about it. At the beginning I understood it, but now I am confused...
I'm going straight to the point. Suppose we have this class:
public class Test implements ActionListener {
    private boolean bool;

    public boolean process() {
        if (bool) {
            // ...
        }
        return bool;
    }

    @Override
    public void actionPerformed(ActionEvent e) {
        // We can suppose that this method belongs to a JButton or so...
        // And if I am not wrong, this method is fired on the EDT (Event
        // Dispatch Thread), so it runs in a different thread than the rest
        // of this class... So there are two threads...
        bool = ... ? true : false;
    }

    // ...
}
My confusion:
In actionPerformed() we modify the value of the field bool, and at the same time we can access it inside process() (again, if I am not wrong)... It seems to me that some error could occur because of this. My solution is that the field bool should be volatile, or the method process() should be marked synchronized and the statement
bool = ... ? true : false;
should be as:
synchronized (this)
{
bool = ... ? true : false;
}
This is a little primitive example, but we could consider the bool field to be an Object, a String, etc.; then in actionPerformed() we would set it to null, and in process() a NullPointerException could be thrown, for example...
What do you think? I also want to know why nobody seems to use synchronization inside listeners... I have not found anyone who does...
Thanks
Regards
brolinko
EDIT #1: process() is never called within the EDT... When I create an instance of Test in, for example, the main() method, then I can call process() from the main thread... So there are two threads in this case, if I am not wrong...
You're not wrong, if a variable is used by two or more threads, writes should be made visible by using the appropriate memory barriers (synchronized/volatile/java.util.concurrent).
In this case though, the question is: is process() not also called from the EDT? If it is, there are no multiple threads to talk about and hence the code works, albeit dangerously unsafe.
Update: You're saying process() isn't called from the EDT. Your code will still happen to work because, although you don't put a memory barrier on your write, the implementation of the EDT does. In theory, relying on this is a dangerous idea because there's no guarantee it will stay this way. In practice, there are so many brittle AWT applications out there that it's unlikely to change. (As pointed out by @jtahlborn, this barrier still only applies to calls made from within the EDT.)
The Java memory model and the Java multithreading rules apply to all classes and threads. The fact that your class implements ActionListener is irrelevant.
If two threads access a shared variable and you expect writes by one thread to be visible by reads from the other threads, then some kind of synchronization must be used to access the shared variable (synchronized access, make the variable volatile, or use a thread-safe object like an AtomicBoolean).
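As a minimal sketch of the volatile option applied to the question's class (the update in actionPerformed() is a made-up toggle, since the original condition was elided):
import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;

public class Test implements ActionListener {
    // volatile guarantees that a write made on the EDT is visible
    // to any other thread that later reads the field
    private volatile boolean bool;

    public boolean process() {
        return bool;            // may run on the main thread
    }

    @Override
    public void actionPerformed(ActionEvent e) {
        bool = !bool;           // hypothetical update; runs on the EDT
    }
}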

Multi-threading -- a faster way?

I have a class with a getter getInt() and a setter setInt() on a certain field, say field
Integer Int;
of an object of a class, say SomeClass.
The setInt() here is synchronized-- getInt() isn't.
I am updating the value of Int from within multiple threads.
Each thread is getting the value Int, and setting it appropriately.
The threads aren't sharing any other resources in any way.
The code executed in each thread is as follows.
public void update(SomeClass c) {
    while (<condition-1>)          // the conditions here and the calculation of
                                   // k below don't have anything to do
                                   // with the members of c
        if (<condition-2>) {
            // calculate k here
            synchronized (c) {
                c.setInt(c.getInt() + k);
                // System.out.println("in " + this.toString());
            }
        }
}
The run() method just invokes the above method on the member that was initialized in the constructor from the parameters passed to it:
public void run() { update(c); }
When I run this on large sequences, the threads aren't interleaving much -- I see one thread executing for a long time without any other thread running in between.
There must be a better way of doing this.
I can't change the internals of SomeClass, or of the class invoking the threads.
How can this be done better?
TIA.
//=====================================
EDIT:
I'm not after manipulating the execution sequence of the threads. They all have the same priority. It's just that what I see in the outcome suggests that the threads aren't sharing the execution time evenly -- once one of them takes over, it keeps executing. However, I can't see why this code should behave that way.
It's just that what I see in the outcome suggests that the threads aren't sharing the execution time evenly
Well, this is exactly what you don't want if you are after efficiency. Yanking a thread from execution and scheduling another thread is generally very costly. Therefore it's actually advantageous that once one of them takes over, it keeps executing. Of course, when this is overdone you could see higher throughput but longer response times. In theory. In practice, the JVM's thread scheduling is well tuned for almost all purposes, and you don't want to try changing it in almost all situations. As a rule of thumb, if you are interested in response times on the order of milliseconds, you probably want to stay away from messing with it.
tl;dr: It's not being inefficient, you probably want to leave it as it is.
EDIT:
Having said that, using an AtomicInteger may help performance, and it is, in my opinion, less error prone than using a lock (the synchronized keyword). You would need to be hitting that variable really hard to get a measurable benefit, though.
The JDK provides a nice solution for multi threaded int access, AtomicInteger:
http://docs.oracle.com/javase/7/docs/api/java/util/concurrent/atomic/AtomicInteger.html
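A short sketch of what that could look like, assuming the running total can live in an AtomicInteger of your own rather than inside SomeClass (the Accumulator name is made up for the example):
import java.util.concurrent.atomic.AtomicInteger;

public class Accumulator {
    private final AtomicInteger total = new AtomicInteger();

    // called concurrently from many threads; no explicit lock needed,
    // addAndGet performs the read-modify-write atomically
    public void add(int k) {
        total.addAndGet(k);
    }

    public int getTotal() {
        return total.get();
    }
}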
As Enno Shioji has pointed out, letting one thread proceed might be the most efficient way to execute your code in some scenarios.
It depends on how much cost the thread synchronization imposes in relation to the other work of your code (which we don’t know). If you have a loop like:
while (<condition-1>)
    if (<condition-2>) {
        // calculate k here
        synchronized (c) {
            c.setInt(c.getInt() + k);
        }
    }
and the test for condition-1 and condition-2 and the calculation of k is rather cheap compared to the synchronization cost, the Hotspot optimizer might decide to reduce the overhead by transforming the code to something like this:
synchronized (c) {
    while (<condition-1>)
        if (<condition-2>) {
            // calculate k here
            c.setInt(c.getInt() + k);
        }
}
(or into a rather more complicated structure, by unrolling the loop and spanning the synchronized block over multiple iterations). The bottom line is that the optimized code might block other threads for longer, but lets the thread owning the lock finish faster, resulting in an overall faster execution.
This does not mean that a single-threaded execution was the fastest way to handle your problem. It also doesn’t mean that using an AtomicInteger here would be the best option to solve the problem. It would create a higher CPU load and possibly a small acceleration but it doesn’t solve your real mistake:
It is completely unnecessary to update c within the loop at a high frequency. After all, your threads do not depend on seeing updates to c timely. It even looks like they are not using it at all. So the correct fix would be to move the update out of the loop:
int kTotal = 0;
while (<condition-1>)
    if (<condition-2>) {
        // calculate k here
        kTotal += k;
    }
synchronized (c) {
    c.setInt(c.getInt() + kTotal);
}
Now, all threads can run in parallel (assuming the code you haven’t posted here doesn’t contain inter-thread dependencies) and the synchronization cost is reduced to a minimum. You could still change it to an AtomicInteger as well but that’s not that important anymore.
Answering this:
i see one thread executing for long without any other thread running in between.
There must be a better way of doing this.
You cannot control how the threads will be executed. The JVM does this for you, and does not like you interfering in its work.
Still, you can look at yield() as an option, but that does not ensure the same thread will not be picked again.
The java.lang.Thread.yield() method causes the currently executing thread object to temporarily pause and allow other threads to execute.
I've found it better to use wait() and notify() than yield(). Check out this example (taken from a book):
class Q {
    int n;
    boolean valueSet = false;

    synchronized int get() throws InterruptedException {
        while (!valueSet)
            wait();                // wait until put() has stored a value
        valueSet = false;
        notify();                  // if a thread is waiting in put(), it is notified now
        return n;
    }

    synchronized void put(int n) throws InterruptedException {
        while (valueSet)
            wait();                // wait until the previous value has been consumed
        this.n = n;
        valueSet = true;
        notify();                  // if a thread is waiting in get(), it is resumed now
    }
}
or you could try using sleep() and joining the threads at the end of main(), but that isn't a foolproof way.
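For completeness, here is a sketch of how two threads might drive the Q class above (the PutGetDemo name and the loop bounds are made up for the example):
public class PutGetDemo {
    public static void main(String[] args) {
        Q q = new Q();

        Thread producer = new Thread(() -> {
            try {
                for (int i = 0; i < 5; i++) q.put(i);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        Thread consumer = new Thread(() -> {
            try {
                for (int i = 0; i < 5; i++) System.out.println("got " + q.get());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        consumer.start();   // order of start() calls doesn't matter;
        producer.start();   // the wait/notify handshake pairs the calls
    }
}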
You have a public void update(SomeClass c) method in your code; it is an instance method to which you pass the object as a parameter.
If every thread passes a different object, synchronized(c) in your code does nothing useful. Let me show you with an example.
So if you make different objects of this class and then run them as different threads, like this:
class A extends Thread {
    SomeClass c = new SomeClass();     // each instance creates its own c
                                       // (assuming SomeClass has a no-arg constructor)

    public void update(SomeClass c) {
        synchronized (c) {
            // ...
        }
    }

    public void run() {
        update(c);
    }

    public static void main(String args[]) {
        A t1 = new A();
        A t2 = new A();
        t1.start();
        t2.start();
    }
}
Then t1 and t2 each have their own c reference, and the object you are synchronizing on is different for the two threads: t1 locks its own c and t2 locks its own c. So the synchronization won't exclude one thread from the other.
Synchronization will work when both threads lock on something they have in common.
Something like this,
class A extends Thread {
    static SomeClass c = new SomeClass();  // one object shared by every instance
                                           // (assuming SomeClass has a no-arg constructor)

    public void update() {
        synchronized (c) {
            // ...
        }
    }

    public void run() {
        update();
    }

    public static void main(String args[]) {
        A t1 = new A();
        A t2 = new A();
        t1.start();
        t2.start();
    }
}
This way the actual concept of synchronization is applied: both threads lock on the same object.
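To make the shared-lock idea concrete, here is a self-contained sketch (the SharedCounter, Worker and Demo names are invented for the example); the same object is handed to both threads, so their synchronized blocks exclude each other:
class SharedCounter {
    int value;
}

class Worker extends Thread {
    private final SharedCounter shared;

    Worker(SharedCounter shared) { this.shared = shared; }

    @Override
    public void run() {
        for (int i = 0; i < 1000; i++) {
            synchronized (shared) {    // same object => the threads take turns here
                shared.value++;
            }
        }
    }
}

class Demo {
    public static void main(String[] args) throws InterruptedException {
        SharedCounter shared = new SharedCounter();
        Thread t1 = new Worker(shared);
        Thread t2 = new Worker(shared);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        System.out.println(shared.value); // always 2000
    }
}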

What happens to a field in java that is accessed asynchronously (inconsistently) by multiple threads?

I think I have a fairly firm grasp on using the synchronized keyword to prevent inconsistencies between threads in java, but I don't fully understand what happens if you don't use that keyword.
Say for instance that I have a field accessed/modified by two threads:
private String sharedString = "";

class OneThread extends Thread {
    private Boolean mRunning = false;

    public OneThread() {}

    public synchronized void setRunning(Boolean b) {
        mRunning = b;
    }

    @Override
    public void run() {
        while (mRunning) {
            // read or write to shared string
            sharedString = "text from thread 1";
            System.out.println("String seen from thread 1: " + sharedString);
            super.run();
        }
    }
}

class AnotherThread extends Thread {
    private Boolean mRunning = false;

    public AnotherThread() {}

    public synchronized void setRunning(Boolean b) {
        mRunning = b;
    }

    @Override
    public void run() {
        while (mRunning) {
            // read or write to shared string
            sharedString = "text from thread 2";
            System.out.println("String seen from thread 2: " + sharedString);
            super.run();
        }
    }
}
Since both of these threads access and modify the field sharedString without using the synchronized keyword, I would expect inconsistencies. What I am wondering is what actually happens, though. While debugging, I have stepped carefully through both threads in situations like this and noticed that even while one thread is paused, its state can be "sticky".
For the sake of the above example, suppose both threads are paused in the debugger. If I step through one of the threads and leave the other paused, I would expect it would operate like a single threaded application. Yet, many times right after modifying the field, the next line that accesses it retrieves the "wrong" value (a value inconsistent with what it was just modified to).
I know that this code is not good.. but I am asking the question because I'm hoping someone could provide an answer that gives some insight into what actually happens in the virtual machine when multi-threaded applications are implemented poorly. Does the thread whose field modification attempt was unsuccessful have any effect at all?
If after poorly implementing multi-threaded code we are simply in the realm of "undefined" behavior, and there is no value in learning about this behavior, I'm ok with that.. just a multi-threading noob trying to understand what I observe in the debugger.
This is due to another critical function of synchronization across threads in Java: preventing data staleness. As part of the Java Memory Model, a Java thread may cache values for shared data. There is no guarantee that a thread will ever see updates made by another thread unless either the shared mutable data is accessed in synchronized blocks or it is marked as volatile. See here for more information.
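As a minimal sketch of the staleness point above, applied to the question's running flag (the class name is made up; volatile is one possible fix, a synchronized getter/setter pair or an AtomicBoolean would also work):
class OneThreadFixed extends Thread {
    // volatile makes a write by another thread visible to this thread's loop;
    // a synchronized setter alone does not help, because run() reads the
    // field without holding the same lock
    private volatile boolean mRunning = true;

    public void setRunning(boolean b) {
        mRunning = b;
    }

    @Override
    public void run() {
        while (mRunning) {
            // read or write the shared data here
        }
    }
}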
There's really no way for the printout to be "wrong" if there is no other thread that can change the shared value (as would be the case if there are really only 2 threads and one is definitely paused). Can you by chance provide the code that "kicks off" these 2 threads (i.e. your main)?

Questions on Concurrency from Java Guide

So I've been reading about concurrency and have some questions along the way (this is the guide I followed, though I'm not sure if it's the best source):
Processes vs. Threads: Is the difference basically that a process is the program as a whole while a thread can be a (small) part of a program?
I am not exactly sure why there is an interrupted() method as well as an InterruptedException. Why should the interrupted() method even be used? It just seems to me that Java adds an extra layer of indirection.
For synchronization (and specifically the example in that link), how does adding the synchronized keyword even fix the problem? I mean, if Thread A gives back its incremented c and Thread B gives back the decremented c and stores it in some other variable, I am not exactly sure how the problem is solved. This may be answering my own question, but is it supposed to be assumed that after one of the threads returns an answer, it terminates? And if that is the case, why would adding synchronized make a difference?
I read (from some random PDF) that if you have two Threads start() subsequently, you cannot guarantee that the first thread will occur before the second thread. How would you guarantee it, though?
In synchronization statements, I am not completely sure what the point is of adding synchronized within the method. What is wrong with leaving it out? Is it because one expects both to mutate separately, but to be obtained together? Why not just have the two non-synchronized?
Is volatile just a keyword for variables, and is it synonymous with synchronized?
In the deadlock problem, how does synchronized even help the situation? What makes this situation different from starting two threads that change a variable?
Moreover, where is the "wait"/lock for the other person to bowBack? I would have thought that bow() was blocked, not bowBack().
I'll stop here because I think if I went any further without these questions answered, I will not be able to understand the later lessons.
Answers:
Yes, a process is an operating system process that has an address space, a thread is a unit of execution, and there can be multiple units of execution in a process.
The interrupt() method and InterruptedException are generally used to wake up threads that are waiting, either to have them do something or to have them terminate (see the sketch after this list).
Synchronizing is a form of mutual exclusion or locking, something very standard and required in computer programming. Google these terms and read up on that and you will have your answer.
True, this cannot be guaranteed; you would have to have some mechanism involving synchronization that the threads use to make sure they run in the desired order. This would be specific to the code in the threads.
See answer to #3
Volatile is a way to make sure that a particular variable can be properly shared between different threads. It is necessary on multi-processor machines (which almost everyone has these days) to make sure the value of the variable is consistent between the processors. It is effectively a way to synchronize a single value.
Read about deadlocking in more general terms to understand this. Once you first understand mutual exclusion and locking you will be able to understand how deadlocks can happen.
I have not read the materials that you read, so I don't understand this one. Sorry.
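As an illustration of the interrupt point above (the InterruptDemo name and the sleep durations are made up), the usual pattern is a worker that blocks, and a caller that uses interrupt() to wake it and ask it to stop:
public class InterruptDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread worker = new Thread(() -> {
            while (!Thread.currentThread().isInterrupted()) {
                try {
                    Thread.sleep(1000);            // blocking call
                } catch (InterruptedException e) {
                    // sleep() clears the interrupt flag when it throws, so restore it
                    Thread.currentThread().interrupt();
                }
            }
            System.out.println("worker stopping");
        });

        worker.start();
        Thread.sleep(100);
        worker.interrupt();                        // ask the worker to stop
        worker.join();
    }
}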
I find that the examples used to explain synchronization and volatility are contrived and difficult to understand the purpose of. Here are my preferred examples:
Synchronized:
private Value value;

public void setValue(Value v) {
    value = v;
}

public void doSomething() {
    if (value != null) {
        doFirstThing();
        int val = value.getInt(); // Will throw NullPointerException if another
                                  // thread calls setValue(null);
        doSecondThing(val);
    }
}
The above code is perfectly correct if run in a single-threaded environment. However with even 2 threads there is the possibility that value will be changed in between the check and when it is used. This is because the method doSomething() is not atomic.
To address this, use synchronization:
private Value value;
private Object lock = new Object();

public void setValue(Value v) {
    synchronized (lock) {
        value = v;
    }
}

public void doSomething() {
    synchronized (lock) { // Prevents setValue being called by another thread.
        if (value != null) {
            doFirstThing();
            int val = value.getInt(); // Cannot throw NullPointerException.
            doSecondThing(val);
        }
    }
}
Volatile:
private boolean running = true;

// Called by Thread 1.
public void run() {
    while (running) {
        doSomething();
    }
}

// Called by Thread 2.
public void stop() {
    running = false;
}
To explain this requires knowledge of the Java Memory Model. It is worth reading about in depth, but the short version for this example is that Threads have their own copies of variables which are only sync'd to main memory on a synchronized block and when a volatile variable is reached. The Java compiler (specifically the JIT) is allowed to optimise the code into this:
public void run() {
    while (true) { // Will never end
        doSomething();
    }
}
To prevent this optimisation you can set a variable to be volatile, which forces the thread to access main memory every time it reads the variable. Note that this is unnecessary if you are using synchronized statements as both keywords cause a sync to main memory.
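For completeness, the fix for the example above is that single keyword (shown as a sketch in the same fragment style):
private volatile boolean running = true; // reads now always see the latest write

// Called by Thread 1.
public void run() {
    while (running) {           // guaranteed to eventually observe stop()'s write
        doSomething();
    }
}

// Called by Thread 2.
public void stop() {
    running = false;
}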
I haven't addressed your questions directly, as Francis already did so. I hope these examples give you a better idea of the concepts than the examples you saw in the Oracle tutorial.
