Why is order of expressions in if statement important - java

Suppose I have an IF condition:
if (A || B)
    ^
    |
  left
{
// do something
}
Now suppose that A is more likely to receive a true value than B. Why do I care which one is on the left?
If I put both of them inside the IF parentheses, then I know (as the programmer of the code) that both of them are needed.
The thing is, my professor wrote in his lecture notes that I should put the variable "more likely to receive a true" on the left.
Can someone please explain the benefit? Okay, I put it on the left... what am I gaining? Run time?

It's not just about choosing the most likely condition for the left. You can also have a safeguard on the left, meaning only one order is valid. Consider
if (s == null || s.length() == 0) // if the String is null or empty.
You can't swap the order here as the first condition protects the second from throwing an NPE.
Similarly you can have
if (s != null && s.length() > 0) // if the String is not empty
The reason for choosing the operand most likely to be true for || (or most likely to be false for &&) is a micro-optimisation: it avoids the cost of evaluating the second expression. Whether this translates into a measurable performance difference is debatable.
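To make the micro-optimisation concrete, here is a small sketch (the helper names and probabilities are made up for the demo) that counts how often each operand of || actually gets evaluated when the usually-true check is placed first:

public class OrderDemo {
    static int leftCalls = 0, rightCalls = 0;

    // Hypothetical stand-ins for real checks.
    static boolean likelyTrue()  { leftCalls++;  return Math.random() < 0.9; }
    static boolean rarelyTrue()  { rightCalls++; return Math.random() < 0.1; }

    public static void main(String[] args) {
        for (int i = 0; i < 1_000_000; i++) {
            if (likelyTrue() || rarelyTrue()) {
                // do something
            }
        }
        // rightCalls only grows when likelyTrue() returned false,
        // i.e. on roughly 10% of the iterations.
        System.out.println("left: " + leftCalls + ", right: " + rightCalls);
    }
}

Swapping the two calls means the second operand is evaluated on roughly 90% of the iterations instead of 10%, which is exactly the extra work the ordering advice tries to avoid.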

I put it on the left ... what am I gaining? Run time?
Because the || operator in C++ uses short-circuit evaluation,
i.e. B is evaluated only if A evaluates to false.
However, note that in C++ short-circuit evaluation is guaranteed only for "built-in" data types and not for custom data types.

As per javadoc
The && and || operators perform Conditional-AND and Conditional-OR operations on two boolean expressions. These operators exhibit "short-circuiting" behavior, which means that the second operand is evaluated only if needed
So if the operand more likely to be true comes first in an || expression, the second operand can be short-circuited (skipped) at runtime.
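A tiny self-contained illustration of that sentence (the method names are invented for the demo):

public class ShortCircuitDemo {
    static boolean first() {
        System.out.println("first() evaluated");
        return true;
    }

    static boolean second() {
        System.out.println("second() evaluated");
        return false;
    }

    public static void main(String[] args) {
        // Prints "first() evaluated" and then "condition is true".
        // second() is never called, because first() already returned true.
        if (first() || second()) {
            System.out.println("condition is true");
        }
    }
}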

If the expression on the left is true, there is no need to evaluate the expression on the right, so it is simply skipped at run time. This is called short-circuiting. By placing the expression more likely to be true on the left, we can expect the program to perform slightly better than the other way around.

You should place the condition that is more likely to be true first, because that will cause the if statement to short-circuit: it will not evaluate the rest of the condition because it already knows the result is true. This makes the code slightly more efficient.
This is especially useful when your if statement is evaluating expensive things:
if(doExpensiveCheck1() || doExpensiveCheck2()) { }
In this case, because the checks are expensive, it is to your benefit to place the check most likely to succeed first.

In many cases there is no practical difference apart from a tiny performance improvement. Where this becomes useful is if your checks are very expensive function calls (unlikely) or you need to check things in order. Say, for example, you want to check a property on something, but need to check that the something is not null first; you might do something like:
if (a != null && a.attribute == valid)
{}

Yes, exactly: you're gaining run time. It won't seem like much for one operation, but keep in mind that operations get repeated millions of times.
The logic is: why perform two evaluations when one is enough?

At runtime, if (a || b) tests a first; if a is true it does not waste time testing b, so the evaluation is one step shorter. Therefore, if a is more likely to be true than b, this test is also more likely to skip that step. The number of evaluations saved is tiny for a single statement, but it is huge if the statement is nested in a loop of some sort (for, while, recursion, or database-related queries). For example, say we have 1 million records to test in a database, at 1 minute per record when both conditions run (30 seconds for condition A and 30 seconds for condition B). Let A have an 80% chance of being true and B a 20% chance. If A is tested first, the total time is (0.8 * 1,000,000 * 0.5 min) + (0.2 * 1,000,000 * 1 min) = 600,000 minutes; if B is tested first, it is (0.2 * 1,000,000 * 0.5 min) + (0.8 * 1,000,000 * 1 min) = 900,000 minutes. Notice, however, that the difference becomes less significant as the probability of A being true gets closer to that of B.
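A quick sketch that just re-does the arithmetic above (the record count, costs and probabilities are the example's assumptions, not measurements):

public class ExpectedCostDemo {
    public static void main(String[] args) {
        double records  = 1_000_000;
        double costOne  = 0.5;  // minutes when only the first condition runs
        double costBoth = 1.0;  // minutes when both conditions run
        double pA = 0.8;        // probability that A is true
        double pB = 0.2;        // probability that B is true

        double aFirst = pA * records * costOne + (1 - pA) * records * costBoth;
        double bFirst = pB * records * costOne + (1 - pB) * records * costBoth;

        System.out.println("A first: " + aFirst + " minutes"); // 600000.0
        System.out.println("B first: " + bFirst + " minutes"); // 900000.0
    }
}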

public class Main {
    public static void main(String[] args) {
        System.out.println("Hello World");
        Integer a = null;
        Integer b = 3;
        Integer c = 5;
        if (a != null && a == 2) {
            System.out.println("both");
        } else {
            System.out.println("false");
        }
    }
}
Output:
Hello World
false

Related

if conditionals: boolean expression variables placing order [duplicate]

When we're using an if-conditional, we specify the condition in a boolean expression, such as the following:
if (boolean expression)
If I have two variables in a boolean expression, such as (bagWeight > WEIGHT_LIMIT), does the order in which the two variables appear matter? In other words, can I swap the two variables' places, as in (WEIGHT_LIMIT < bagWeight)? Notice it would still mean the bag weight is greater than the weight limit; I have only switched which one appears first in the boolean expression. And does it depend on which one becomes the subject, i.e. the one that gets focused on and evaluated? (In this case we're trying to figure out whether the bag weight is heavier than the limit or not, so the bag weight gets evaluated against something... I would call it the subject.)
Eclipse doesn't scream at me that it's wrong, and intuitively it makes sense, but it still bothers me whether there's a more common programming practice. So my questions are: can I swap the two variables' places and it would not matter? Does it depend on the context of which one is the subject? And which is the more common programming practice?
You can freely change the order of the two variables as you prefer. Eclipse (or rather the compiler) doesn't care; it just evaluates the expression and returns a value, either true or false.
can I swap those two variables' places such as following? (WEIGHT_LIMIT < bagWeight)
Yes, it will still work exactly the same.
Order only comes into play when using short-circuit operators such as || or &&.
examples:
if (boolean1 || boolean2)
In this case, boolean1 will be evaluated first. If it evaluates to true, then boolean2 will not be evaluated, since the first one meets the criteria of the if statement.
if (boolean1 && boolean2)
In this case, if boolean1 evaluates to false, then boolean2 will never be evaluated, because the fact that boolean1 is false means that even if boolean2 were true, the condition of the if statement could never be satisfied.
Hope that helps.
The order in which Java evaluates && and || is not so important if everything is already evaluated into variables, as in your example. If these were method calls instead, then the second part of the "if" would not be called unless necessary. Examples where myMethod() would not be evaluated at all:
if (true || myMethod())
or
if (false && myMethod())
That's why you might see statements similar to this in actual code.
String myStr = null;
if (myStr != null && myStr.trim().length() > 0)
If Java were to evaluate the second part then you would get a NullPointerException when myStr is null. The fact that Java will bypass the second part keeps that from happening.
The order of the operands doesn't matter in your specific case. Since the if statement is comparing two values, the two values must be evaluated. However, there are some cases when order does matter:
|| and &&
|| and && are short-circuit logical operators. The second operand of || will not be evaluated if the first is true. The second operand of && will not be evaluated if the first is false.
++ and --
These can yield different results:
if (i++ > i)
if (i < i++)
In the first line, the left operand i++ evaluates to the old value of i, and the increment takes effect before the right operand is read, so the first operand is 1 less than the second (the comparison is false).
In the second line, the left operand i is read first; then i++ on the right also evaluates to that same old value (the increment takes effect afterwards), so the two operands are equal and the comparison is again false.
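A short runnable check of those two cases (starting from i = 0):

public class IncrementOrderDemo {
    public static void main(String[] args) {
        int i = 0;
        System.out.println(i++ > i); // false: left is 0, right is 1 (increment already applied)

        i = 0;
        System.out.println(i < i++); // false: left is 0, right is also 0 (old value of i)
    }
}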

Deciding the order in which operators will operate

While deciding the order in which operators will operate, I am confused between the following two statements:
The statement will be executed from left to right.
It will be executed according to the precedence order of the operators.
The following code executes from left to right:
int i = 5;
boolean b = i < 5 && ++i < 5; // line 2
System.out.println(i);        // prints 5
// Left-to-right execution in line 2:
// < is executed and ++ is not, even though ++ has higher precedence.
But this code below seems to follow precedence order:
int a = 1, b = 1, c = 1;
boolean result = a == b && b == c; // line 2: (a == b) && (b == c)
/* In line 2 the code does not simply run from left to right:
   first a == b is evaluated, then b == c, and then the && operator. */
I have asked part of this question here but didn't get a good enough explanation.
Can someone please clarify?
Assignment works from right to left. In your case, the boolean will be assigned whatever the expression on the right evaluates to:
a == b && b == c
Note that assignment operators have different rules than logical/bitwise operators in terms of precedence (i.e. = doesn't have the same precedence as == or &&).
I think you need to know about logical AND and OR.
something && somethingElse: imagine that expression. It returns true ONLY if both values are true. If both are false, or one of them is false, it returns false. So let's say the first part is actually false (something = false). As I said, if anything in this expression is false there is no need to go further, because no matter what the second part is, the whole thing returns false anyway. That's why in your first example it doesn't go any further: i is already not less than 5, so there is no reason to check the second part, because the result is already going to be false.
In your second example a == b is true, BUT we can't return true yet; we need to check the second part because, as I said, if the second part is false then the whole thing has to return false. So evaluation needs to go further and check the second part. Unlike the first example, where we could stop, here we need to keep checking.
When it comes to OR (||), the rules are different. Only one of the parts needs to be true (that's why it's called or: this or that, doesn't matter, one truth is enough). So with ||, if the first part is true we can return immediately, but if the first part is false we need to keep going, because the second part may be true.
i < 5 && ++i < 5; // line 2
&&
[&&] eval LHS: i < 5 -> false
[&&] skip RHS: ++i < 5, yield false
And
a == b && b == c; // line 2: (a == b) && (b == c)
&&
[&&] eval LHS: a == b -> true
[&&] eval RHS: b == c -> true, yield true
The && is a short-circuit operator: it does not evaluate the right-hand side when the left-hand side already determines the result.
Precedence matters for binary operators. The precedence of unary operators cannot reorder anything; traditionally they have higher precedence because they bind more tightly to their operand, especially the postfix operators.
The prefix operator ++ has the important characteristic that it is applied before the evaluation of the term it belongs to.
It goes without saying that the usage above is something to get shot for, hung from the nearest AVL tree, tarred and feathered, and chased out of town.

What is the difference in putting the most likely true condition in if, else-if or else

Is there any difference in putting the most probable condition in the if, the else-if, or the else branch?
Ex:
int[] a = {2, 4, 6, 9, 10, 0, 30, 0, 31, 66};
int firstCase = 0, secondCase = 0, thirdCase = 0;
for (int i = 0; i < 10; i++) {
    int m = a[i] % 5;
    if (m < 3) {
        firstCase++;
    } else if (m == 3) {
        secondCase++;
    } else {
        thirdCase++;
    }
}
What is the difference in execution time with the input
int[] a = {3, 6, 8, 7, 0, 0, 0, 0, 0, 0};
Is there any difference in putting the most probable condition in if, else-if or else?
Actually, the answer with Java is that "it depends".
You see, when you run Java code, the JVM starts out by using the interpreter while gathering statistics. One of the statistics that may be recorded is which of the paths in a branch instruction is most often taken. These statistics can then be used by the JIT compiler to influence code reordering, where this does not alter the compiled code's semantics.
So if you were to execute your code, with two different datasets (i.e. "mostly zero" and "mostly non-zero"), it is possible that the JIT compiler would compile the code differently.
Whether it can actually make this optimization depends on whether it can figure out that the reordering is valid. For example, can it deduce that the conditions being tested are mutually exclusive?
So how does this affect the complexity? Well... let's do the sums for your simplified example, assuming that the JIT compiler doesn't do anything "smart". And assume that we are not just dealing with arrays of length 10 (which would render the discussion of complexity moot).
Consider this:
For each zero, the loop does one test and one increment - say 2 operations.
For each non-zero element, the loop does two tests and one increment - say 3 operations.
So that is roughly 2*N operations for N elements when all are zero, versus 3*N operations when all are non-zero. But both are O(N) ... so the Big O complexity is not affected.
(OK I left some stuff out ... but you get the picture. One of the cases is going to be faster, but the complexity is not affected.)
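For what it's worth, here is a small sketch of that operation counting (the arrays and sizes are made up; note that a non-zero element whose remainder is below 3 would still take only one test, so an all-4 array is used to force the two-test case):

public class BranchCountDemo {
    // Counts how many of the two tests (m < 3, m == 3) actually run.
    static long countTests(int[] a) {
        long tests = 0;
        for (int value : a) {
            int m = value % 5;
            tests++;            // m < 3 is always evaluated
            if (m < 3) {
                continue;       // firstCase: the second test never runs
            }
            tests++;            // m == 3 runs only when m < 3 was false
        }
        return tests;
    }

    public static void main(String[] args) {
        int[] allZero = new int[1_000_000];        // every m is 0: one test per element
        int[] allFour = new int[1_000_000];
        java.util.Arrays.fill(allFour, 4);         // every m is 4: two tests per element

        System.out.println(countTests(allZero));   // 1000000 (roughly the 2*N case)
        System.out.println(countTests(allFour));   // 2000000 (roughly the 3*N case)
    }
}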
There's a bit more to this than you're being told.
1. 'if' versus 'else': If a condition and its converse are not equally likely, you should handle the more likely condition in the 'else' block, not the 'if' block. The 'if' block requires a conditional jump which isn't taken plus a final branch around the 'else' block; the 'else' block requires a conditional branch which is taken and no final branch at all.
2. 'if' versus 'else if' versus 'else': Obviously you should handle the most common case in the 'if' block, to avoid the second test. The same considerations as at (1) determine that the more common case as between the final 'else if' and the final 'else' should be handled in the final 'else' block.
Having said all that, unless the tests are non-trivial, or the contents of all these blocks are utterly trivial, it it is rather unlikely that any of it will make a discernible difference.
There is no difference if you only have an if-else, since the condition will always be evaluated and it does not matter whether it is almost always true or false. However, if you have an if in the else part (the else if), it is much better to put the most possible true condition in the first if. Therefore, most of the time you won't need to evaluate the condition inside the else, increasing performance.
If the most frequent case is handled by the first if, execution time will be the lowest, because only the first condition has to be evaluated.
If the most frequent case is handled by the else-if, execution time will be higher than in the first scenario but lower than in the last, because the first condition has to be checked before the second.
If the most frequent case is handled by the else, execution time will be the highest, because both preceding conditions have to be checked first.
Sure it is.
if ... else if ... checks run in the order in which they were coded. So if you place the most likely condition at the end of this checking queue, the code will work slightly slower.
But it all depends on how the conditions are built (how complex they are).
The most likely condition should go in the if, then the else if, and so on.
It's good to write the most common condition at the very first level, so that whether it is true or false it is handled first, in the least time.
If you put the most frequent condition in the middle (else if) or at the end (else), then it takes longer to reach that condition, because every preceding condition has to be checked first.

What are the cases in which it is better to use unconditional AND (& instead of &&)

I'd like to know some cases in Java (or, more generally, in programming) when it is preferable in boolean expressions to use the unconditional AND (&) instead of the conditional version (&&).
I know how they work, but I cannot think of a case where using the single & is worth it.
I have found cases in real life where both sides of the expression were really cheap, so it shaved off a nanosecond or two to avoid the branch and to use the unconditional & instead of &&. (These were extremely high-performance math utilities, though; I would almost never use this in other code, and I wouldn't have done it anyway without exhaustive benchmarking to prove it was better.)
(To give specific examples, x > 0 is going to be super cheap and side-effect-free. Why bother risking a branch misprediction to avoid a test that's going to be so cheap anyway? Sure, since it's a boolean the end result is going to be used in a branch anyway, but if (x >= 0 && x <= 10) involves two branches, and if (x >= 0 & x <= 10) involves only one.)
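As a sketch of the kind of expression being described (whether the & version really avoids a branch after JIT compilation is not guaranteed; it is just the plausible mechanism):

public class RangeCheckDemo {
    static boolean inRangeShortCircuit(int x) {
        return x >= 0 && x <= 10;  // may skip the second test; an extra branch
    }

    static boolean inRangeUnconditional(int x) {
        return x >= 0 & x <= 10;   // always evaluates both; branches only on the combined result
    }

    public static void main(String[] args) {
        System.out.println(inRangeShortCircuit(5));   // true
        System.out.println(inRangeUnconditional(5));  // true
        System.out.println(inRangeUnconditional(-1)); // false
    }
}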
The only difference is that && and || stop the evaluation as soon as it is known. So for example:
if (a != null && a.get() != null)
works well with &&, but with & you could get a NullPointerException if a is null.
The only case I can think about where you want to use & is if the second operand has a side effect, for example (probably not the best example but you get the point):
public static void main(String[] args) {
    int i = 1;
    if (i == 0 & ++i != 2) {
    }
    System.out.println(i); // 2

    i = 1;
    if (i == 0 && ++i != 2) {
    }
    System.out.println(i); // 1
}
However, this looks like smelly code to me (in both cases).
The && allows the JVM to do short-circuit evaluation. That is, if the first argument is false, then it doesn't need to bother checking the second argument.
A single & will run both sides regardless.
So, as a contrived example, you might have:
if (account.isAllowed() & logAccountAndCheckFlag(account))
// Do something
In that example, you might always want to log the fact that the owner of the account attempted to do something.
I don't think I have ever used a single & in commercial programming though.
Wikipedia has a nice description of short-circuit evaluation.
Where would you prefer non-short-circuit operators?
From the same link:
Untested second condition leads to unperformed side effect
Code efficiency
Short-circuiting can lead to errors in branch prediction on modern processors, and dramatically reduce performance (a notable example is highly optimized ray with axis aligned box intersection code in ray tracing)[clarification needed]. Some compilers can detect such cases and emit faster code, but it is not always possible due to possible violations of the C standard. Highly optimized code should use other ways for doing this (like manual usage of assembly code).
One case is if there are side effects that must happen, but that's a little ugly.
The bitwise AND (&) is mostly useful for just that: bitwise math.
Input validation is one possible case. You typically want to report all the errors in a form to the user in a single pass instead of stopping after the first one and forcing them to click submit repeatedly and only get a single error each time:
public boolean validateField(String userInput, String paramName) {
    boolean valid;
    // do validation
    if (valid) {
        // updates UI to remove error indicator (if present)
        reportValid(paramName);
    } else {
        // updates UI to indicate a problem (color change, error icon, etc)
        reportInvalid(paramName);
    }
    return valid;
}
public boolean validateAllInput(...) {
    boolean valid = true;
    valid = valid & validateField(userInput1, paramName1);
    valid = valid & validateField(userInput2, paramName2);
    valid = valid & validateField(userInput3, paramName3);
    valid = valid & validateField(userInput4, paramName4);
    valid = valid & validateField(userInput5, paramName5);
    return valid;
}
public void onSubmit() {
    if (validateAllInput(...)) {
        // go to the next page of the wizard, update the database, etc
        processUserInput(userInput1, userInput2, ... );
    }
}
public void onInput1Changed() {
    validateField(input1.getText(), paramName1);
}
public void onInput2Changed() {
    validateField(input2.getText(), paramName2);
}
...
Granted, you could trivially avoid the need for the non-short-circuiting & in validateAllInput() by refactoring the if (valid) { reportValid(); ... } logic out of validateField(); but then you'd need to call the extracted code every time validateField() was called, at a minimum adding 10 extra lines of method calls. As always, it's a case of which trade-offs work best for you.
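As an aside (not part of the original answer), Java also has the compound assignment operator &=, which does not short-circuit for booleans, so the accumulation lines in validateAllInput() above could be written a little more tightly:

valid &= validateField(userInput1, paramName1); // same as valid = valid & validateField(...)
valid &= validateField(userInput2, paramName2);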
If the expressions are trivial, you may get a micro-optimisation by using & or |, in that you are preventing a branch. I.e.
if(a && b) { }
if(!(a || b)) { }
is the same as
if (a) if (b) { }
if (!a) if (!b) { }
which has two places a branch can occur.
However using an unconditional & or |, there can be only one branch.
Whether this helps or not is highly dependent on what the code is doing.
If you use this, I suggest commenting it to make it very clear why it has been done.
There isn't any specific use of single & but you can consider the following situation.
if (x > 0 & someMethod(...))
{
// code...
}
Consider that someMethod() is doing some operation which will modify instance variables or do something which will impact behavior later in processing.
So in this case, if you use the && operator and the first condition fails, someMethod() will never be called. Here the single & operator is what you want.
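A minimal sketch of that situation (someMethod() and its side effect are hypothetical):

public class SideEffectDemo {
    static int counter = 0;

    // Pretend the caller relies on this side effect happening.
    static boolean someMethod() {
        counter++;
        return true;
    }

    public static void main(String[] args) {
        int x = -1;

        if (x > 0 && someMethod()) { }
        System.out.println(counter); // 0: && skipped the call because x > 0 is false

        if (x > 0 & someMethod()) { }
        System.out.println(counter); // 1: & always evaluates both operands
    }
}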
Because & is a bitwise operator, you can do up to 32 checks in a single operation, concurrently. This can become a significant speed gain for very specific use cases. If you need to check a large number of conditions, and you do it often, and the cost of packing/unpacking the conditions into a bitmask is amortized by the number of checks, or if you store your data on disk and in RAM in that format (it is more space efficient to store 32 conditions in a single bitmask), the & operator can give a huge speed benefit over a series of 32 individual &&s. For example, if you want to select all units that can move, are infantry, have a weapon upgrade, and are controlled by player 3, you can do:
int MASK = CAN_MOVE | INFANTRY | CAN_ATTACK | HAS_WEAPON_UPGRADE | PLAYER_3;
for (Unit u : allunits) {
    if ((u.mask & MASK) == MASK) {
        ...;
    }
}
See my other answers on a related question for more on the topic.
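A self-contained version of that idea, with made-up flag names and values (note the parentheses around u.mask & MASK in the snippet above: == binds more tightly than &):

public class UnitMaskDemo {
    // Hypothetical single-bit flags packed into one int.
    static final int CAN_MOVE           = 1 << 0;
    static final int INFANTRY           = 1 << 1;
    static final int CAN_ATTACK         = 1 << 2;
    static final int HAS_WEAPON_UPGRADE = 1 << 3;
    static final int PLAYER_3           = 1 << 4;

    public static void main(String[] args) {
        int mask      = CAN_MOVE | INFANTRY | CAN_ATTACK | HAS_WEAPON_UPGRADE | PLAYER_3;
        int unitFlags = CAN_MOVE | INFANTRY | CAN_ATTACK | HAS_WEAPON_UPGRADE | PLAYER_3;

        // One & plus one == checks all five conditions at once.
        if ((unitFlags & mask) == mask) {
            System.out.println("unit matches all criteria");
        }
    }
}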
The only benefit I can think of is when you need to invoke a method or execute code no matter whether the first expression evaluates to true or false:
public boolean update() {
    // do whatever you want here
    return true;
}
// ...
if (x == y & update()) { /* ... */ }
Although you can do this without &:
if(x == y){/* ... */}
update();

Is it bad to explicitly compare against boolean constants e.g. if (b == false) in Java?

Is it bad to write:
if (b == false) //...
while (b != true) //...
Is it always better to instead write:
if (!b) //...
while (!b) //...
Presumably there is no difference in performance (or is there?), but how do you weigh the explicitness, the conciseness, the clarity, the readability, etc between the two?
Update
To limit the subjectivity, I'd also appreciate any quotes from authoritative coding style guidelines over which is always preferable or which to use when.
Note: the variable name b is just used as an example, ala foo and bar.
It's not necessarily bad, it's just superfluous. Also, the actual variable name weighs a lot. I would prefer, for example, if (userIsAllowedToLogin) over if (b), or even worse if (flag).
As to the performance concern, the compiler optimizes it away at any way.
As to the authoritative sources, I can't find something explicitly in the Java Code Conventions as originally written by Sun, but at least Checkstyle has a SimplifyBooleanExpression module which would warn about that.
You should not use the first style. I have seen people use:
if ( b == true )
if ( b == false )
I personally find it hard to read but it is passable. However, a big problem I have with that style is that it leads to the incredibly counter-intuitive examples you showed:
if ( b != true )
if ( b != false )
That takes more effort on the part of the reader to determine the author's intent. Personally, I find including an explicit comparison to true or false redundant and thus harder to read, but that's me.
This is strongly a matter of taste.
Personally I've found that if (!a) { is a lot less readable (EDIT: to me) than if (a == false) { and hence more error prone when maintaining the code later, and I've converted to use the latter form.
Basically I dislike the choice of symbols for logic operations instead of words (C versus Pascal), because to me a = 10 and not b = 20 reads easier than a == 10 && !(b==20), but that is the way it is in Java.
Anybody who puts the "== false" approach down in favour of "!" clearly has never stared at code for too long and missed that exclamation mark. Yes, you can get code-blind.
The overriding reason why you shouldn't use the first style is because both of these are valid:
if (b = false) //...
while (b = true) //...
That is, if you accidentally leave out one character, you create an assignment instead of a comparison. An assignment expression evaluates to the value that was assigned, so the first statement above assigns the value false to b and evaluates to false. The second assigns true to b, so it always evaluates to true, no matter what you do with b inside the loop.
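A minimal illustration of why, in Java, only booleans are bitten by this typo:

public class AssignmentTypoDemo {
    public static void main(String[] args) {
        boolean b = true;

        if (b = false) {          // compiles: the assignment expression itself has type boolean
            System.out.println("never reached");
        }
        System.out.println(b);    // false: b was silently overwritten

        int i = 1;
        // if (i = 0) { }         // does not compile: int cannot be converted to boolean
    }
}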
I've never seen the former except in code written by beginners; it's always the latter, and I don't think anyone is really confused by it. On the other hand, I think
int x;
...
if(x) //...
vs
if(x != 0) //...
is much more debatable, and in that case I do prefer the second
IMHO, if you just prefix your boolean variable names with "is", they will be self-evident and more meaningful, and then you can remove the explicit comparison with true or false.
Example:
isEdited // use IsEdited in case of property names
isAuthorized // use IsAuthorized in case of property names
etc
I prefer the first, because it's clearer. The machine can read either equally well, but I try to write code for other people to read, not just the machine.
In my opinion it is simply annoying. Not something I would cause a ruckus over though.
The normal guideline is to never test against boolean. Some argue that the additional verbosity adds to clarity. The added code may help some people, but every reader will need to read more code.
This morning I lost half an hour finding a bug. The code was:
if (!strcmp(runway_in_use, "CLOSED") == IPAS_FALSE)
    printf(" ACTIVE FALSE \n");
else
    printf(" ACTIVE TRUE \n");
If it had been coded with the normal convention, I would have seen much faster that it was wrong:
if (strcmp(runway_in_use, "CLOSED"))
    printf(" ACTIVE FALSE \n");
else
    printf(" ACTIVE TRUE \n");
I prefer the long approach, but I compare using == instead of != 99% of the time.
I know this question is about Java, but I often switch between languages, and in C#, for instance, comparing with == false can help when dealing with nullable bool types. So I got this habit of comparing with true or false, but using the == operator.
I do these:
if(isSomething == false) or if(isSomething == true)
but I hate these:
if(isSomething != false) or if(isSomething != true)
for obvious readability reasons!
As long as you keep your code readable, it will not matter.
Personally, I would refactor the code so I am not using a negative test. for example.
if (b == false) {
    // false
} else {
    // true
}
or
boolean b = false;
while (b == false) {
    if (condition)
        b = true;
}
IMHO, In 90% of cases, code can be refactored so the negative test is not required.
This is my first answer on StackOverflow so be nice...
Recently, while refactoring, I noticed that two blocks of code had almost exactly the same code, but one had
for (Alert alert : alerts) {
    Long currentId = alert.getUserId();
    if (vipList.contains(currentId)) {
        customersToNotify.add(alert);
        if (customersToNotify.size() == maxAlerts) {
            break;
        }
    }
}
and the other had
for (Alert alert : alerts) {
    Long currentId = alert.getUserId();
    if (!vipList.contains(currentId)) {
        customersToNotify.add(alert);
        if (customersToNotify.size() == maxAlerts) {
            break;
        }
    }
}
so in this case it made sense to create a method which worked for both conditions, using a boolean == comparison to flip the meaning:
private void appendCustomersToNotify(List<Alert> alerts,
        List<Alert> customersToNotify, List<Long> vipList, boolean vip) {
    for (Alert alert : alerts) {
        Long currentId = alert.getUserId();
        if (vip == vipList.contains(currentId)) {
            customersToNotify.add(alert);
            if (customersToNotify.size() == maxAlerts) {
                break;
            }
        }
    }
}
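For completeness, a hypothetical call site for both cases (assuming alerts, customersToNotify and vipList exist in the surrounding class):

appendCustomersToNotify(alerts, customersToNotify, vipList, true);  // users on the VIP list
appendCustomersToNotify(alerts, customersToNotify, vipList, false); // users not on the VIP list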
I would say it is bad.
while (!b) {
    // do something
}
reads much better than
while (b != true) {
    // do something
}
One of the reasons the first one (b == false) is frowned upon is that beginners often do not realize that the second alternative (!b) is possible at all. So using the first form may point to a misconception about boolean expressions and boolean variables. This way, using the second form has become something of a shibboleth: when someone writes it, he or she probably understands what's going on.
I believe that this has caused the difference to be considered more important than it really is.
While both are valid, to me the first feels like a type error.
To me b == false looks as wrong as (i == 0) == false. It is like: huh?
Booleans are not an enum with 2 possible values. You don't compare them. Booleans are predicates and represent some truth. They have specific operators like &, |, ^, !.
To reverse the truth of an expression, use the operator '!', pronounced "not".
With proper naming, it becomes natural: !isEmpty reads "not is empty", quite readable to me.
While isEmpty == false reads something like "it is false that it is empty", which I need more time to process.
I won't go into all of the details at length because many people have already answered correctly.
Functionality-wise, it gives the same result.
As far as styling goes, it's a matter of preference, but I do believe !condition to be more readable.
For the performance argument, I have seen many say that it makes no difference, but they have nothing to justify their claims. Let's go just a bit deeper into that one. So what happens when you compare them?
First, logically:
if(condition == false)
In this case, the if compares the value it needs in order to execute with the value between the parentheses, which first has to be computed.
if(!condition)
In this case, the if works directly on the opposite (NOT) of the condition. So instead of two comparisons, there is one comparison and one NOT operation, which is faster.
I wouldn't just say this without having tested it of course. Here is a quick screenshot of the test I did. !condition is nearly twice as fast over 10 million iterations.
https://imgur.com/a/jrPVKMw
EDIT: I tested this in C#, compiled with Visual Studio. Some compilers may be smarter and optimize it properly, which would make the performance the same.
