I wonder if modern Java syntax allows enforcing a check that all known enum constants are covered: no enum element left without a case.
Like for:
enum RGB { RED, GREEN, BLUE }
compiler should forbid:
switch (rgb) {
    case RED:
    case BLUE:
    // Oops we forgot GREEN!
}
I know about default: throw new IllegalStateException(), but that happens at runtime. I'm looking for compile-time enforcement, or at least a weaker warning...
You have two options.
Use the new switch-as-an-expression form (and I'm also throwing in the new arrow case labels but that's not pertinent to your question):
int x = switch (rgb) {
    case RED -> 0xFF0000;
    case BLUE -> 0x0000FF;
    case GREEN -> 0x00FF00;
};
This will cause a compiler error if you forget a case.
NB: These were introduced as a preview feature in JDK 12 and refined in JDK 13; they can be used without the --enable-preview switch starting with JDK 14.
If that's not what you are looking for: there is no compiler option, and in general Java's compiler is not pluggable in this way (it is pluggable via the annotation processor system, but that cannot generally look at code within methods, so it cannot be used for this).
But linting tools do exist: tools that scan your code and look for such things. For example, most IDEs have them built in, and Eclipse (which I'm familiar with - others may well have this too) has an option to generate a warning or error (whichever you prefer) for missed cases - this works on the 'old' switch statement as well. There are also standalone linting tools such as SonarQube.
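If you only need side effects rather than a value, you can still lean on that exhaustiveness check by wrapping the work in a switch expression. A minimal sketch, assuming JDK 14+ (handleRed/handleGreen/handleBlue are hypothetical handler methods); the yielded dummy value is simply ignored:

int unused = switch (rgb) {
    case RED -> {
        handleRed();
        yield 0;
    }
    case GREEN -> {
        handleGreen();
        yield 0;
    }
    case BLUE -> {
        handleBlue();
        yield 0;
    }
};
// Removing any one of the three cases turns this into a compile-time error.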
I have a program that basically looks like this:
boolean[] stuffNThings;
int state = 1;
for (String string : list) {
    switch (state) {
        case 1:
            if (/*condition*/) {
                // foo
                break;
            } else {
                stuffNThings = new boolean[/*size*/];
                state = 2;
            }
            // intentional fallthrough
        case 2:
            // bar
            stuffNThings[0] = true;
    }
}
As you, a human, can see, case 2 only ever happens when there was previously a state 1 and it switched to state 2 after initialising the array. But Eclipse and the Java compiler don't see this, because it looks like pretty complex logic to them. So Eclipse complains:
The local variable stuffNThings may not have been initialized.
And if I change "boolean[] stuffNThings;" to "boolean[] stuffNThings=null;", it switches to this error message:
Potential null pointer access: The variable stuffNThings may be null at this location.
I also can't initialise it at the top, because the size of the array is only determined after the final loop in state 1.
Java thinks that the array could be null there, but I know that it can't. Is there some way to tell Java this? Or am I definitely forced to put a useless null check around it? Adding that makes the code harder to understand, because it looks like there may be a case where the value doesn't actually get set to true.
Java thinks that the array could be null there, but I know that it can't.
Strictly speaking, Java thinks that the variable could be uninitialized. If it is not definitely initialized, the value should not be observable.
(Whether the variable is silently initialized to null or left in an indeterminate state is an implementation detail. The point is, the language says you shouldn't be allowed to see the value.)
But anyway, the solution is to initialize it to null. It is redundant, but there is no way to tell Java to "just trust me, it will be initialized".
In the variations where you are getting "Potential null pointer access" messages:
It is a warning, not an error.
You can ignore or suppress a warning. (If your correctness analysis is wrong then you may get NPE's as a result. But that's your choice.)
You can turn off some or all warnings with compiler switches.
You can suppress a specific warning with a @SuppressWarnings annotation:
For Eclipse, use @SuppressWarnings("null").
For Android, use @SuppressWarnings("ConstantConditions").
Unfortunately, the warning tags are not fully standardized. However, a compiler should silently ignore a @SuppressWarnings for a warning tag that it doesn't recognize.
You may be able to restructure the code.
In your example, the code is using switch drop through. People seldom do that because it leads to code that is hard to understand. So, I'm not surprised that you can find edge-case examples involving drop-through where a compiler gets the NPE warnings a bit wrong.
Either way, you can easily avoid the need to do drop-through by restructuring your code. Copy the code in the case 2: case to the end of the case 1: case. Fixed. Move on.
Note the "possibly uninitialized" error is not the Java compiler being "stupid". There is a whole chapter of the JLS on the rules for definite assignment, etcetera. A Java compiler is not permitted to be smart about it, because that would mean that the same Java code would be legal or not legal, depending on the compiler implementation. That would be bad for code portability.
What we actually have here is a language design compromise. The language stops you from using variables that are (really) not initialized. But to do this, the "dumb" compiler must sometimes stop you using variables that you (the smart programmer) know will be initialized ... because the rules say that it should.
(The alternatives are worse: either no compile-time checks for uninitialized variables leading to hard crashes in unpredictable places, or checks that are different for different compilers.)
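A stripped-down illustration of the definite-assignment rule (a hypothetical snippet, assumed to sit inside main(String[] args)):

int n;
if (args.length > 0) {
    n = args.length;        // assigned only on this branch
}
System.out.println(n);      // rejected: n might not have been initialized

The compiler has to reject the last line even if you, the programmer, know that args will never be empty in practice.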
A distinct non-answer: when code is "so" complicated that an IDE / java compiler doesn't "see it", then that is a good indication that your code is too complicated anyway. At least for me, it wasn't obvious what you said. I had to read up and down repeatedly to convince myself that the statement given in the question is correct.
You have an if in a switch in a for. Clean code, and "single layer of abstraction" would tell you: not a good starting point.
Look at your code. What you have there is a state machine in disguise. Ask yourself whether it would be worth refactoring this on a larger scale, for example by turning it into an explicit state machine of some sort.
Another less intrusive idea: use a List instead of an array. Then you can simply create an empty list, and add elements to that as needed.
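A minimal sketch of that idea (the surrounding logic is a hypothetical placeholder): the list exists, non-null, from the start, and it does not need its size up front.

import java.util.ArrayList;
import java.util.List;

List<Boolean> stuffNThings = new ArrayList<>();   // no size needed yet
// ... later, instead of stuffNThings[0] = true:
if (stuffNThings.isEmpty()) {
    stuffNThings.add(Boolean.TRUE);
} else {
    stuffNThings.set(0, Boolean.TRUE);
}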
After just trying to execute the code regardless of Eclipse complaining, I noticed that it does indeed run without problems. So apparently it was just a warning being set to "error" level, despite not being critical.
There was a "configure problem severity" button, so I set the severity of "Potential null pointer access" to "warning" (and adjusted some other levels accordingly). Now Eclipse just marks it as warning and executes the code without complaining.
More understandable would be:
boolean[] stuffNThings;
boolean initialized = false;
for (String string : list) {
    if (!initialized) {
        if (!/*condition*/) {
            stuffNThings = new boolean[/*size*/];
            initialized = true;
        }
    }
    if (initialized) {
        // bar
        stuffNThings[0] = true;
    }
}
Two loops, one for the initialisation and one for playing with the stuff, might or might not be clearer.
It is easier on flow analysis (compared to a switch with fall-through).
Furthermore, instead of a boolean[], a BitSet might be used too (as it is not fixed-size like an array):
BitSet stuffNThings = new BitSet(/*max size*/);
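A minimal sketch of the BitSet variant (the condition is a hypothetical placeholder): the set grows as needed, can be created before the loop, and "state 2" simply becomes "the set is no longer empty", so there is no uninitialized or null variable at all.

import java.util.BitSet;

BitSet stuffNThings = new BitSet();
for (String string : list) {
    if (stuffNThings.isEmpty() && string.isEmpty() /* condition */) {
        // foo
        continue;
    }
    // bar
    stuffNThings.set(0);    // was: stuffNThings[0] = true
}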
The rule squid:S128 seems to exist to prevent fall-through in switch cases unless it is explicitly stated. It seems a reasonable rule, as forgetting a break is a common mistake.
However, fall-through is perfectly valid when wanted.
The documentation of this rule states that the only way to achieve a fall-through is to use continue:
case 4: // Use of continue statement
continue;
I have also checked the source code of SwitchCaseWithoutBreakCheck, and the implementation really does check for a continue statement:
@Override
public void visitContinueStatement(ContinueStatementTree tree) {
    super.visitContinueStatement(tree);
    markSwitchCasesAsCompliant();
}
However, the Java language does not support continue in a switch/case outside of a loop. Neither the online documentation nor ./java-checks/src/test/files/checks/SwitchCaseWithoutBreakCheck.java are valid Java programs.
Am I missing something, or is this rule fully broken, preventing the use of fall-through?
You are totally right in saying that the description here is wrong, and that you actually have no way to avoid triggering the rule if you want to use fall-through (so you might want either to mark the issue as a false positive in this case or deactivate the rule altogether).
Calling the rule "broken" is an opinion, so I won't argue on that ;)
Nevertheless, a ticket has been created to handle the issue: http://jira.sonarsource.com/browse/SONARJAVA-1169
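Until that ticket is resolved, one possible workaround (my assumption, depending on your SonarJava version, not something promised in the ticket) is to suppress the rule locally instead of deactivating it globally, either with a // NOSONAR comment on the flagged line or with the rule key on the enclosing method:

@SuppressWarnings("squid:S128")     // the fall-through below is intentional
void handle(int state) {            // hypothetical method and helpers
    switch (state) {
        case 1:
            prepare();
            // intentional fallthrough
        case 2:
            doWork();
            break;
    }
}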
I need to know how to test the default case in a switch statement with JUnit. I can't change the code itself and I'm trying for 100% coverage, but I don't know how to test my default. Help?
public Hello helloSwitch() {
    Hello hi = Hello.A;
    switch (this) {
        case A:
            hi = Hello.B;
            break;
        case B:
            hi = Hello.C;
            break;
        case C:
            hi = Hello.A;
            break;
        default:
            hi = Hello.A;
            break;
    }
    return hi;
}
I had to modify the code a fair bit, so sorry that it looks silly. I just need to know how to write a JUnit test for the default; I've tested everything else.
I can't change this code.
Edit: changed
Edit: This code isn't important; I just need to know how to write the unit test for the default.
Edit: I can't change the code itself; I'm only writing the tests. I need 100% coverage though.
Let me assume that 'hallo' is a variable set somewhere outside the given method. Let me further assume that the Enum type currently only allows values present in the switch statement.
In this case the only value left to trigger the default case would be null, which cannot occur when switching on this, so the default statement is unreachable and should not be there at all. While there might be ways to still "test" this - meaning to execute the code while running a test - this would not add any benefit.
If you have more enum constants available, then pick any that is covered by the default case.
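For illustration, a hypothetical JUnit 4 test, assuming the enum really does have a fourth constant (called D here) that no case label matches:

import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class HelloSwitchTest {
    @Test
    public void defaultCaseFallsBackToA() {
        // D is not listed in the switch, so it takes the default branch.
        assertEquals(Hello.A, Hello.D.helloSwitch());
    }
}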
As some already have mentioned:
Dead code cannot and should not be tested, but removed.
100% test coverage sounds nice, but usually is not a realistic or sensible goal.
In my opinion tests shouldn't even know about the code in a method, but test the method as a black box.
If you are using JaCoCo then maybe vote for this improvement to ignore default cases which can't be covered: https://github.com/jacoco/jacoco/issues/1211
Visiting this link can help you: http://www.dummies.com/programming/java/using-junit
"JUnit is a standardized framework for testing Java units (that is, Java classes). JUnit can be automated to take some of the work out of testing. Imagine you've created an enum type with three values: GREEN, YELLOW, and RED."
I currently have a technical point of difference with an acquaintance. In a nutshell, it's the difference between these two basic styles of Java exception handling:
Option 1 (mine):
try {
    ...
} catch (OneKindOfException e) {
    ...
} catch (AnotherKind e) {
    ...
} catch (AThirdKind e) {
    ...
}
Option 2 (his):
try {
    ...
} catch (AppException e) {
    switch (e.getCode()) {
        case Constants.ONE_KIND:
            ...
            break;
        case Constants.ANOTHER_KIND:
            ...
            break;
        case Constants.A_THIRD_KIND:
            ...
            break;
        default:
            ...
    }
}
His argument -- after I used copious links about user input validation, exception handling, assertions and contracts, etc. to back up my point of view -- boiled down to this:
"It’s a good model. I've used it since me and a friend of mine came up with it in 1998, almost 10 years ago. Take another look and you'll see that the compromises we made to the academic arguments make a lot of sense."
Does anyone have a knock-down argument for why Option 1 is the way to go?
When you use a switch statement like that, you're less object oriented. There are also more opportunities for mistakes: forgetting a break; statement, or forgetting to add a case when a new kind of exception gets thrown.
I also find your way of doing it to be MUCH more readable, and it's the standard idiom that all developers will immediately understand.
For my taste, the amount of boilerplate in your acquaintance's method, the amount of code that has nothing to do with actually handling the exceptions, is unacceptable. The more boilerplate code there is around your actual program logic, the harder the code is to read and to maintain. And using an uncommon idiom makes code more difficult to understand.
But the deal breaker, as I said above, is that when you modify the called method so that it throws an additional Exception, you will automatically know you have to modify your code because it will fail to compile. However, if you use your acquaintance's method and you modify the called method to throw a new variety of AppException, your code will not know there is anything different about this new variety and your code may silently fail by going down an inappropriate error-handling leg. This is assuming that you actually remembered to put in a default so at least it's handled and not silently ignored.
The way option 2 is coded, any unexpected exception type will be swallowed! (This can be fixed by re-throwing in the default case, but that is arguably an ugly thing to do - much better/more efficient to not catch it in the first place.)
Option 2 is a manual recreation of what option 1 most likely does under the hood, i.e. it ignores the preferred syntax of the language in favour of older constructs best avoided for maintenance and readability reasons. In other words, option 2 is reinventing the wheel using uglier syntax than that provided by the language constructs.
Clearly, both ways work; option 2 is merely obsoleted by the more modern syntax supported by option 1.
I don't know if I have a knock-down argument, but initial thoughts are:
Option 2 works only until you're trying to catch an Exception that doesn't implement getCode().
Option 2 encourages the developer to catch general exceptions. This is a problem because, if you don't implement a case statement for a given subclass of AppException, the compiler will not warn you. Of course you could run into the same problem with option 1, but at least option 1 does not actively encourage this.
With option 1, the caller has the option of selecting exactly which exception to catch, and to ignore all others. With option 2, the caller has to remember to re-throw any exceptions not explicitly caught.
Additionally, there's better self-documentation with option 1, as the method signature needs to specify exactly which exceptions are thrown, rather than a single over-riding exception.
If there's a need to have an all-encompassing AppException, the other exception types can always inherit from it.
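A sketch of that hierarchy (hypothetical class names matching Option 1; each class would normally live in its own file):

class AppException extends Exception { }
class OneKindOfException extends AppException { }
class AnotherKind extends AppException { }
class AThirdKind extends AppException { }

Callers can then catch the specific subclasses individually, as in Option 1, or catch AppException itself in the rare place where a blanket handler genuinely makes sense.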
The knock-down argument would be that it breaks encapsulation, since I now have to know something about the public interface of the Exception subclass in order to handle exceptions by it. A good example of this "mistake" in the JDK is java.sql.SQLException, which exposes getErrorCode and getSQLState methods.
It looks to me like you're overusing exceptions in either case. As a general rule, I try to throw exceptions only when both of the following are true:
An unexpected condition has occurred that cannot be handled here.
Somebody will care about the stack trace.
How about a third way? You could use an enum for the type of error and simply return it as part of the method's type. For this, you would use, for example, Option<A> or Either<A, B>.
For example, you would have:
enum Error { ONE_ERROR, ANOTHER_ERROR, THIRD_ERROR };
and instead of
public Foo mightError(Bar b) throws OneException
you will have
public Either<Error, Foo> mightError(Bar b)
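Java has no built-in Either, so here is a minimal, hypothetical sketch of one (libraries such as Vavr ship a complete implementation):

final class Either<L, R> {
    private final L left;     // the error value, if any
    private final R right;    // the success value, if any

    private Either(L left, R right) { this.left = left; this.right = right; }

    static <L, R> Either<L, R> left(L value)  { return new Either<>(value, null); }
    static <L, R> Either<L, R> right(R value) { return new Either<>(null, value); }

    boolean isLeft() { return left != null; }
    L getLeft()      { return left; }
    R getRight()     { return right; }
}

The caller then branches on the result explicitly:

Either<Error, Foo> result = mightError(b);
if (result.isLeft()) {
    // handle result.getLeft()
} else {
    // use result.getRight()
}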
Throw/catch is a bit like goto/comefrom: easy to abuse. See "Go To Statement Considered Harmful" and "Lazy Error Handling".
I think it depends on the extent to which this is used. I certainly wouldn't have "one exception to rule them all" which is thrown by everything. On the other hand, if there is a whole class of situations which are almost certainly going to be handled the same way, but you may need to distinguish between them for (say) user feedback purposes, option 2 would make sense just for those exceptions. They should be very narrow in scope - so that wherever it makes sense for one "code" to be thrown, it should probably make sense for all the others to be thrown too.
The crucial test for me would be: "would it ever make sense to catch an AppException with one code, but want to let another code remain uncaught?" If so, they should be different types.
Each checked Exception is an, um, exception condition that must be handled for the program behavior to be defined. There's no need to go into contractual obligations and whatnot, it's a simple matter of meaningfulness. Say you ask a shopkeeper how much something costs and it turns out the item is not for sale. Now, if you insist you'll only accept non-negative numerical values for an answer, there is no correct answer that could ever be provided to you. This is the point with checked exceptions, you ask that some action be performed (perhaps producing a response), and if your request cannot be performed in a meaningful manner, you'll have to plan for that reasonably. This is the cost of writing robust code.
With Option 2 you are completely obscuring the meaning of the exception conditions in your code. You should not collapse different error conditions into a single generic AppException unless they will never need to be handled differently. The fact that you're branching on getCode() indicates otherwise, so use different Exceptions for different exceptions.
The only real merit I can see with Option 2 is for cleanly handling different exceptions with the same code block. This nice blog post talks about this problem with Java. Still, this is a style vs. correctness issue, and correctness wins.
I'd support option 2 if it was:
default:
throw e;
It's a bit uglier syntax, but the ability to execute the same code for multiple exceptions (i.e. several cases in a row) is much better. The only thing that would bug me is producing a unique id when making an exception, and the system could definitely be improved.
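Spelled out, that variant would look like this (same hypothetical names as Option 2), so unknown codes propagate instead of being swallowed:

try {
    ...
} catch (AppException e) {
    switch (e.getCode()) {
        case Constants.ONE_KIND:
        case Constants.ANOTHER_KIND:
            // shared handling for several codes in a row
            break;
        default:
            throw e;    // don't silently handle codes we don't recognise
    }
}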
With option 2 you unnecessarily have to know the codes and declare constants for exceptions that could have stayed abstract when using option 1.
The second option (I guess) will fall back to the traditional form (option 1) when there is only one specific exception to catch, so I see an inconsistency there.
Use both.
The first for most of the exceptions in your code.
The second for those very "specific" exceptions you've created.
Don't struggle with little things like this.
BTW 1st is better.