Why is using labels in Java considered bad practice? I can't find a reason. All the explanations boil down to: you shouldn't use them just because you shouldn't.
It's difficult to read code containing breaks to a label. Also, a label can be accidentally moved, or code inserted at an incorrect location with respect to a label. The compiler is not able to warn you of these effects since the code remains syntactically valid.
Code that's difficult to read is difficult to maintain. Bugs will inevitably creep in.
Other control structures (break, continue, while, for, etc.) don't suffer from this.
Note that a switch transferring control to a case label doesn't suffer from these effects either: the structure of a switch block is well-defined.
The most sensible alternative to breaking out of a nested loop is to recast the code to a function and use return. You also get the added benefit of being able (potentially) to return a value back to the caller.
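For example, here is a minimal sketch contrasting the two (contains(), grid and target are just illustrative names, not anything prescribed):

// with a label: workable, but the label is easy to overlook
boolean found = false;
outer:
for (int i = 0; i < grid.length; i++) {
    for (int j = 0; j < grid[i].length; j++) {
        if (grid[i][j] == target) {
            found = true;
            break outer;
        }
    }
}

// recast as a method: return exits both loops and hands a value back to the caller
static boolean contains(int[][] grid, int target) {
    for (int[] row : grid) {
        for (int cell : row) {
            if (cell == target) {
                return true;
            }
        }
    }
    return false;
}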
I think that you are referring to labeled break and continue.
The problem is that labeled break (and continue) is a construct from imperative languages that has nothing to do with object orientation.
In object-oriented programs the flow of control is easy to follow. You cannot jump from one part of the code to an arbitrary other part; you can only call a method, continue with the current code, or exit the current block.
Jumping from position to position is a likely point of breakage in your application, where bugs can easily happen. Jumping creates what is called spaghetti code.
Labelled breaks (and plain breaks, to a lesser extent) are a more modern equivalent of the GOTO statements of older languages (FORTRAN, COBOL, BASIC). Goto statements were found to be much more likely to contain an error than all other kinds of statements combined -- the study I'm remembering measured it at 9 times more likely. This gave rise to the "structured programming" movement in the 70s and to the banning of the goto statement in some software shops at the time.
It is more important to be able to read code easily than to be able to write it without restrictions.
Labels are fine for breaking out of nested for-loops. Still, I'd suggest putting the nested loops in a separate method and breaking out with return instead.
The problem is that complex processing flows become really hard to follow.
Some background (I don't want to confuse you with a lot of messy code):
I've written a pretty large console program (my largest project so far) which helps me a lot with managing some accounts / assets and more. I'm constantly adding features, and at the same time I'm reshaping the code to clean up my messy coding style.
The console program has a lot of commands the user can type, and for every command different methods get called / objects get created or manipulated, and so on.
My keywords are saved in an ArrayList<String>, and my commands have this form: [keyword] [...n more Strings]
DESIGN PROBLEM 1:
I have a method cmdProcessor(String[] arguments) which handles the input (command) of the user; the [keyword] is always the first argument, arguments[0]. That means I have a large number of if statements of this type:
if (arguments[0].equalsIgnoreCase("keyword")) callMethod(argmts); where the String[] argmts contains the remaining arguments[1] ... [n].
Is this a good way to handle this or should I go with switch-case?
Or something else (what?)? Is it better to save the keywords in a HashMap<String, Method>?
DESIGN PROBLEM 2:
The methods (see callMethod(argmts) above) which are triggered by the entered keyword look even more chaotic. Since the same method can take different numbers and forms of arguments saved in the String[] argmts, the method is full of if (argmts.length == ...) length checks, and each of these if blocks contains a bunch of switch-case options which in turn have a lot of ifs, and so on. The last else and the default case of each switch I always use for error handling (throwing error codes, an explanation of why the pattern doesn't match, and so on).
Is this good or are there better ways?
I thought about using lots of sub-methods, which would also blow up my program and cost a lot of time, but might improve readability / overview. Is this okay, or what is the best option in such cases (lots of ifs and switch-case)?
Since I want to build more and more around this program, maybe I should start fixing bad design now, before it's too late. :)
About Design-Problem 1:
My go-to would be to register a set of handlers, based on a common interface, each implementing its specific behavior individually. This is good because the central method handling your input stays slim, and you only need to register the handler singletons once, on initialization. Disadvantage: if you forget one, it won't work. So maybe you can register them automatically (via reflection or something of the like).
Aside from that, a map is better than a list in this case, because (I assume) you don't need an ordering. You need a mapping from key to behavior, so a map seems the better fit (though even a very large set of keywords would probably not be noticeably inefficient if you stick to a list).
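A rough sketch of the handler idea combined with the map (CommandHandler, CommandProcessor and the sample keywords "add" and "list" are invented names for illustration, not anything from your code):

import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

interface CommandHandler {
    void handle(String[] argmts);   // receives arguments[1] ... [n]
}

class CommandProcessor {
    private final Map<String, CommandHandler> handlers = new HashMap<>();

    CommandProcessor() {
        // registered once, on initialization
        handlers.put("add",  argmts -> System.out.println("adding " + Arrays.toString(argmts)));
        handlers.put("list", argmts -> System.out.println("listing accounts"));
    }

    void cmdProcessor(String[] arguments) {
        CommandHandler handler = handlers.get(arguments[0].toLowerCase());
        if (handler == null) {
            System.out.println("Unknown command: " + arguments[0]);
            return;
        }
        handler.handle(Arrays.copyOfRange(arguments, 1, arguments.length));
    }
}

The central method never grows, no matter how many keywords you add; each new command is just one more entry in the map.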
About Design Problem 2:
If I were you, I'd use actual regular-expression patterns. Take a look at the java.util.regex.Pattern class. You can isolate groups and validate the values you receive. It does not spare you the exception/error handling, but it helps a lot with segmenting and interpreting the input.
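For example, a sketch for one hypothetical command format (the "transfer ... from ... to ..." syntax is invented here, just to show how groups work):

import java.util.regex.Matcher;
import java.util.regex.Pattern;

// matches the remaining arguments of e.g. "transfer 250.00 from savings to checking"
Pattern transferArgs = Pattern.compile(
        "(\\d+(?:\\.\\d{2})?)\\s+from\\s+(\\w+)\\s+to\\s+(\\w+)",
        Pattern.CASE_INSENSITIVE);

Matcher m = transferArgs.matcher(String.join(" ", argmts));
if (m.matches()) {
    String amount = m.group(1);   // "250.00"
    String source = m.group(2);   // "savings"
    String target = m.group(3);   // "checking"
    // ... call the real transfer logic here
} else {
    // fall back to your existing error handling
}

One pattern per command shape replaces the nested length checks and switch blocks, and the else branch is a single, obvious place for the error message.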
While reading "Java: The Complete Reference, Seventh Edition", on page 100,
I read this statement: "However, be careful. Too many break statements have the tendency to destructure your code."
What I don't understand is how a break statement can change or destructure my code.
Is that specific to Java, or does it apply to all programming languages?
Is it something linked to bytecode?
Thanks so much, and please don't misunderstand me :)
Too many break statements have the tendency to destructure your code
I believe that the author means that it is more difficult to follow the execution paths in your code. This is because break jumps to a line of your code that potentially can be far away from the line with the break.
The author uses the phrase "destructure your code". It's just an expression. What he actually means by it is:
Imagine writing five loops nested one inside another, with break statements in all of them. As the loops grow bigger and bigger, there is a high chance of losing track of the execution path as a developer, i.e. which loop is currently executing and which loop control has just broken out of.
Imagine the chaos it creates if you have even more loops, each with its own break, or several break statements in a few of the loops. The only thing that catches your eye is the break statement.
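A contrived sketch of the kind of nesting being described (someCondition, otherCondition and giveUp are just placeholders):

for (int i = 0; i < a; i++) {
    for (int j = 0; j < b; j++) {
        for (int k = 0; k < c; k++) {
            if (someCondition(i, j, k)) break;   // leaves only the k-loop
            if (otherCondition(i, j, k)) {
                // ... by now, which loop are we actually breaking out of?
                break;                           // still only the k-loop
            }
        }
        if (giveUp(j)) break;                    // leaves the j-loop
    }
}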
However, be careful. Too many break statements have the tendency to destructure your code.
This doesn't mean that a switch statement comparing over 100 unique values will destructure your code; in that case, that's simply what's needed.
But suppose you are using many loops (for, while, do-while, or even if-else conditions): using break excessively will produce so many permutations and combinations of execution paths that we might not see them all during development. That can then cause big trouble at run time, when particular values, or particular types of values, trigger a new, unexplored path of execution (I mean paths we never intended to create).
So the author is saying it is better to avoid too many break statements. Not all of them.
I've just read about the spaghetti code (wiki link) that the "goto" statement creates, and I wonder whether labels in Java create spaghetti code too.
I'm just interested in this because of an old question about break and labels in Java that I asked here.
Labels are so rarely needed/used that no, not really. Also, you can't jump to a label, you can only break to it, so you can't get the same kind of confusion as with filling the code with goto-wherever statements.
The main problem with labels is that they are rarely used, which means they are surprising and possibly confusing for a reader. E.g.:
http://stackoverflow.com/
System.out.println("Hello SO");
At first glance, that doesn't even look like valid Java code, but it is: http: parses as a label and //stackoverflow.com/ as a comment.
Because labels tend to be used only when they are really needed, and are sometimes not used when they should have been (IMHO), they don't lead to spaghetti code in Java in practice.
I understand the JVM optimizes some things for you (I'm not clear on which things yet), but let's say I were to do this:
while(true) {
int var = 0;
}
would doing:
int var;
while(true) {
var = 0;
}
take less space? Since you aren't declaring a new variable every time, you don't have to specify the type every time.
I understand I would really only need to put var outside the while if I wanted to use it outside that loop (instead of only being able to use it locally, as in the first example). Also, what about objects, would it be different from primitive types in that situation? I understand it's a small thing, but a build-up of this kind of stuff can cause my application to use a lot of memory/CPU. I'm trying to use the fewest operations possible, but I don't completely understand what's going on behind the scenes.
If someone could help me out, or even link me to somewhere I can learn about saving CPU by decreasing the number of operations, it would be highly appreciated. Please no books (unless they're free! :D), there's no way of getting one right now /:
Don't. Premature optimization is the root of all evil.
Instead, write your code as it makes most sense conceptually. Write it thoughtfully, yes. But don't think you can be a 'human compiler' and optimize and still write good code.
Once you have written your code (more or less naively, depending on your level of experience) you write performance tests for it. Try to think of different ways in which the code may be used (many times in a row, from front to back or reversed, many concurrent invocations etc) and try to cover these in test cases. Then benchmark your code.
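Even a crude measurement helps. A minimal sketch (doTheThingUnderTest() is a placeholder for the code path you care about; for serious work, a harness such as JMH is more trustworthy because of JIT warm-up):

long start = System.nanoTime();
for (int i = 0; i < 1_000_000; i++) {
    doTheThingUnderTest();   // the code path you want to measure
}
long elapsed = System.nanoTime() - start;
System.out.printf("average: %.1f ns per call%n", elapsed / 1_000_000.0);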
If you find that some test cases are not performing well, investigate why. Measure parts of the test case to see where the time is going. Zoom into the parts where most time is spent.
Mostly, you will find weird loops where, upon reading the code again, you will think 'that was silly to write it that way. Of course this is slow' and easily fix it. In my experience most performance problems can be solved this way and 'hardcore optimization' is hardly ever needed.
In the end you will find that 99* percent of all performance problems can be solved by touching only 1 percent of the code. The other code never comes into play. This is why you should not 'prematurely' optimize. You will be spending valuable time optimizing code that had no performance issues in the first place. And making it less readable in the process.
Numbers made up of course but you know what I mean :)
Hot Licks points out that this isn't much of an answer, so let me expand on this with some good ol' performance tips:
Keep an eye out for I/O
Most performance problems are not in pure Java. Instead they are in interfacing with other systems. In particular, disk access is notoriously slow. So is the network. So minimize their use.
Optimize SQL queries
SQL queries will add seconds, even minutes, to your program's execution time if you don't watch out. So think about them very carefully. Again, benchmark them. You can write very optimized Java code, but if it first spends ten seconds waiting for the database to run some monster SQL query, then it will never be fast.
Use the right kind of collections
Most performance problems are related to doing things lots of times. Usually when working with big sets of data. Putting your data in a Map instead of in a List can make a huge difference. Also there are specialized collection types for all sorts of performance requirements. Study them and pick wisely.
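A small sketch of the difference (User, loadUsers() and wantedId are illustrative names):

// List: finding one element means scanning, on average, half the list
List<User> users = loadUsers();
User match = null;
for (User u : users) {
    if (u.getId().equals(wantedId)) {
        match = u;
        break;
    }
}

// Map: a single hash lookup, however many entries there are
Map<String, User> usersById = new HashMap<>();
for (User u : loadUsers()) {
    usersById.put(u.getId(), u);
}
User sameMatch = usersById.get(wantedId);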
Don't write code
When performance really matters, squeezing the last 'drops' out of some piece of code becomes a science all in itself. Unless you are writing some very exotic code, chances are great there will be some library or toolkit to solve your kind of problems. It will be used by many in the real world. Tried and tested. Don't try to beat that code. Use it.
We humble Java developers are end-users of code. We take the building blocks that the language and its ecosystem provide and tie them together to form an application. For the most part, performance problems are caused by us not using the provided tools correctly, or not using any tools at all for that matter. But we really need specifics to be able to discuss those. Benchmarking gives you that specificity. And when the slow code is identified, it is usually just a matter of changing a collection from list to map, or sorting it beforehand, or dropping a join from some query, etc.
Attempting to optimise code which doesn't need to be optimised increases complexity and decreases readability.
However, there are cases where improving readability also comes with improved performance.
For example,
if a numeric value cannot be null, use a primitive instead of a wrapper. This makes it clearer that the value cannot be null but also uses less memory and reduces pressure on the GC.
use a Set when you have a collection which cannot have duplicates. Often a List is used when in fact a Set would be more appropriate; depending on the operations you perform, this can also be faster by reducing time complexity.
consider using an enum with one instance for a singleton (if you have to use singletons at all); see the sketch after this list. This is much simpler, as well as faster, than double-checked locking. Hint: try to only have stateless singletons.
writing simpler, well-structured code is also easier for the JIT to optimise. This is where trying to outsmart the JIT with more complex solutions will backfire: you end up confusing the JIT, and what you think should be faster is actually slower (and more complicated as well).
try to reduce how much you write to the console (and IO in general) in critical sections. Writing to the console is so expensive, both for the program and for the poor human having to read it, that it is worth spending more time producing concise console output.
try to use a StringBuilder when you have a loop of elements to append. Note: avoid using StringBuilder for one-liners that are just a series of append() calls, as this can actually be slower and harder to read.
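A sketch of two of the points above (ConfigHolder and items are illustrative names):

// enum-based singleton: thread-safe by the language spec, no double-checked locking needed
enum ConfigHolder {
    INSTANCE;

    String get(String key) {
        // ideally stateless: just delegate, don't cache mutable state here
        return System.getProperty(key);
    }
}

// and, elsewhere inside a method: StringBuilder in a loop
// instead of repeated String concatenation
StringBuilder sb = new StringBuilder();
for (String item : items) {
    sb.append(item).append(", ");
}
String joined = sb.toString();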
Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away.
-- Antoine de Saint-Exupéry, French writer (1900-1944)
Developers like to solve hard problems, and there is a very strong temptation to solve problems which don't need to be solved. This is very common behaviour for developers with up to about 10 years of experience (it was for me, anyway ;). After about that point you have already solved most common problems before, and you start selecting the best/minimal set of solutions that will solve a problem. This is the point you want to get to in your career, and then you will be able to develop quality software in far less time than you could before.
If you dream up an interesting problem to solve, go ahead and solve it in your own time, see what difference it makes, but don't include it in your working code unless you know (because you measured) that it really makes a difference.
However, if you find a simpler, more elegant solution to a problem, it is worth including, not because it might be faster (though it might be), but because it should make the code easier to understand and maintain, and that is usually a far more valuable use of your time. Successfully used software usually costs three times as much to maintain as it cost to develop. Do what will make life easier for the poor person who has to understand why you did something (which is harder if you didn't do it for a good reason in the first place), as that might be you one day ;)
A good example of when you might make an application slower to improve reasoning is the use of immutable values with concurrency. Immutable values are usually slower than mutable ones, sometimes much slower, but when used with concurrency, mutable state is very hard to get provably right, and you need that because testing it is good but not reliable. With concurrency you have much more CPU to burn, so the slightly higher cost of immutable objects is a very sensible trade-off. In some cases, using immutable objects allows you to avoid locks altogether and actually improves throughput, e.g. CopyOnWriteArrayList, if you have a high read-to-write ratio.
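A small sketch of that last point (the listener use case is just an illustration):

import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

// Writes copy the whole underlying array, so they are comparatively expensive...
List<Runnable> listeners = new CopyOnWriteArrayList<>();
listeners.add(() -> System.out.println("event!"));   // rare

// ...but reads and iteration need no locking at all, which pays off
// when reads vastly outnumber writes, even across many threads.
for (Runnable listener : listeners) {
    listener.run();
}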
I'm currently in a class on systems software development. We are writing a two-pass assembler for the assembly language of a fictional machine. We've implemented the tokenizer and all of the classes we need to abstractly represent the program; all that is left (besides implementing the code generator in a later phase) is to parse the tokens. Here is where I'm having a major issue. I'm choosing to implement this as a recursive descent parser, since that's the only technique I currently have experience with, but we are not allowed to stop assembly on syntax errors. For instance, if the user gives a load word instruction with invalid syntax, we are to replace it with a NOP. If the user gives a bad label, we are to simply ignore it. If the user places unknown characters in a line, we discard them.
On the one hand, it sounds easy. However, implementing this causes me to break (what I understand to be) one of the important rules of a recursive descent parser: each of my functions pulls multiple tokens before calling another function, since I need to account for all of the possible fixable syntax errors. Given that I can't stop assembly, and I must have enough information about the current context to intelligently determine what the user was intending to do, I have to handle a lot within one function.
This turns the program from a true recursive descent parser into more of a semi-finite state machine. I feel like I'm doing this badly, but I'm not sure how else to implement it. Does anyone have any suggestions/ideas?
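To make it concrete, here is a rough sketch of the kind of recovery I mean (Token, TokenType, parseLoadWord(), consume(), emitNop() and SyntaxError are invented placeholder names, not my actual classes):

void parseInstruction() {
    try {
        switch (peek().type) {
            case LOAD_WORD:
                parseLoadWord();        // may itself pull several tokens
                break;
            // ... other instruction forms
            default:
                throw new SyntaxError("unknown instruction: " + peek().text);
        }
    } catch (SyntaxError e) {
        // "panic mode": discard the rest of the line, substitute a NOP
        while (peek().type != TokenType.END_OF_LINE) {
            consume();
        }
        emitNop();
    }
}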
BTW - I'm not allowed to use tools like ANTLR, or any other parser generator.
Thanks.
My suggestion would be don't try. Poor syntax error recovery is inherent in recursive descent parsers. If you are not allowed to use a parser generator, then decent syntax error recovery is probably beyond the scope of your homework. (Check with your instructor ...)