Why do standard classes sometimes have seemingly unrelated methods? - java

While studying the standard Java library and its classes, I couldn't help noticing that some of those classes have methods that, in my opinion, have next to no relevance to the class's actual purpose.
The methods I'm talking about are, for example, Integer#getInteger, which retrieves the value of some "system property", and System#arraycopy, whose purpose is well defined by its name.
Still, both of these methods seem kinda out of place, especially the first one, which for some reason binds working with system resources to a primitive type wrapper class.
From my current point of view, such method placement policy looks like a violation of a fundamental OOP design principle: that each class must be dedicated to solving its particular set of problems and not turn itself into a Swiss army knife.
But since I don't think the Java designers are idiots, I assume there's some logic behind the decision to place those methods right where they are. So I'd be grateful if someone could explain what that logic really is.
Thanks!
Update
A few people have hinted that Java does have its illogical bits that are simply remnants of a turbulent past. Let me reformulate my question, then: why is Java so unwilling to mark its architectural flaws as deprecated? It's not as if the existing deprecated features are likely to be removed in any foreseeable future, and marking things as deprecated really helps people refrain from using them in new code.

This is a good thing to wonder about. For more recent features (such as generics, lambdas, etc.) there are several blogs and mailing-list posts that explain the choices the library designers made. These are very interesting to read.
In your case I expect the answer isn't too exciting. The exact reasons are hard to tell, but both classes have existed since JDK 1.0. In those days the quality of programming in general (and of Java and OO in particular) was perhaps lower, meaning there were fewer established practices and library makers had to invent many paradigms themselves. There were also other constraints in those times, such as object creation being expensive.
Many of those awkwardly designed methods and classes now have a better alternative (see Date and the java.time package).
You would expect arraycopy to have been added to the Arrays class, but unfortunately it is not there.
Ideally the original method would be deprecated for a while and then removed. Many libraries follow this strategy. Java, however, is very conservative about this and only deprecates things that really should not be used (such as Thread.stop()). I don't think a method has ever been removed from Java due to deprecation. This makes it fairly easy to upgrade your software to a newer version of Java, but it comes at the cost of leaving some clutter in the libraries.
The fact that Java is so conservative about keeping new JDK/JRE versions compatible with older source code and binaries is both loved and hated. For a hobby project, or a small actively developed project, upgrading to a new JVM that removes deprecated functions after a few years is not too difficult. But don't forget that many projects are not actively developed, or their developers have a hard time making changes safely, for instance because they lack a proper regression test suite. In such projects API changes cost a lot of time to adapt to, and they run the risk of introducing bugs.
Libraries also often try to support older versions of Java as well as newer ones; they would have a problem doing so if methods had been deleted.

The Integer example is probably just a design decision. If you want to interpret a property directly as an Integer, use java.lang.Integer. Otherwise you would have to provide a getter method on System for each java.lang type. Something like:
System.getPropertyAsBoolean(String)
System.getPropertyAsByte(String)
System.getPropertyAsInteger(String)
...
And for each data type, you'd require one additional method for the default:
System.getPropertyAsBoolean(String, boolean)
System.getPropertyAsByte(String, byte)
...
Since the java.lang types already have parsing abilities (Integer.valueOf(String)), I am not too surprised to find a property getter here. Convenience in trade for bending principles a tiny bit.
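To make that trade-off concrete, here is a small sketch (the class and property names are invented for the example) of how Integer.getInteger behaves compared to spelling the lookup out by hand:

public class GetIntegerDemo {
    public static void main(String[] args) {
        System.setProperty("my.app.port", "8080");                  // hypothetical property name

        Integer port = Integer.getInteger("my.app.port");           // reads and parses the property: 8080
        Integer retries = Integer.getInteger("my.app.retries", 3);  // property not set, so the default 3 is returned

        // The "unsurprising" way of spelling the same thing by hand:
        Integer same = Integer.valueOf(System.getProperty("my.app.port"));

        System.out.println(port + " " + retries + " " + same);      // 8080 3 8080
    }
}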
As for System.arraycopy, I guess it is an operation that depends on the underlying system: it probably copies memory from one location to another in a very efficient, native way. If I wanted to copy an array like that, I'd look for it in java.lang.System.
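To illustrate the difference (the class name below is made up), here is the old JDK 1.0 call next to the friendlier helpers that were later added to java.util.Arrays:

import java.util.Arrays;

public class ArrayCopyDemo {
    public static void main(String[] args) {
        int[] src = {1, 2, 3, 4, 5};
        int[] dst = new int[5];

        // The JDK 1.0 way: copy 3 elements starting at src[1] into dst starting at dst[0].
        System.arraycopy(src, 1, dst, 0, 3);            // dst is now {2, 3, 4, 0, 0}

        // The more discoverable helpers added to java.util.Arrays in Java 6:
        int[] copy = Arrays.copyOf(src, src.length);    // full copy
        int[] slice = Arrays.copyOfRange(src, 1, 4);    // {2, 3, 4}

        System.out.println(Arrays.toString(dst));
        System.out.println(Arrays.toString(copy));
        System.out.println(Arrays.toString(slice));
    }
}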

"I assume that there's some logic behind a decision to place those
methods right where they are."
While that is often true, I have found that when something's off, this assumption is typically where you are misled.
A language is in constant development, from the day someone proposes it to the day it becomes antiquated. In between those extremes are phases that the language goes through. Especially if someone is spending money on it and wants people to use it, a very peculiar phase often occurs just before or after the first release:
The "we need this to work yesterday" phase.
This is where stuff like this happens: you have an almost complete language, but the programmers need to do something to show what the language can do, or a specific application needs a feature that was not designed into the language.
So where do we add this feature?
- well, wherever it makes the most sense to that particular programmer whose task it is to "make it work yesterday".
The logic may be that this is where the function makes the most sense, since it doesn't belong anywhere else and doesn't deserve a class of its own. It could also be something like: so far we have never done an array copy without using System, so let's put arraycopy in there and save everyone an extra import.
In the next generation of the language, people will not move the feature, since some experienced programmers will complain. So the feature may be duplicated and also found in a place where it makes more sense.
Much later, it will be marked as deprecated, and deleted, if anyone ever cares to clean it up.

Related

What are the limitations and advantages of a cross compiler for C# to Java?

I want to migrate my entire C# 4.0 (.NET 2010) desktop application to Java. I don't know of any tool available for that; please suggest a good one.
Also, I would like to know the limitations and advantages of a cross compiler for C# to Java.
Please guide me out of this problem...
Saravanan.P
Cross compilers will usually produce rather messy code, and sometimes code that doesn't even compile.
Some (maybe most) will force your new code into having bindings to custom libraries from the cross compiler, and thus it will be forever tied to that product.
Your new code will be very hard to maintain and extend as a result, and may well offer poor performance compared to the old code.
In general, you would most likely be better off rewriting the application yourself (or hiring people to do so) if it is going to have to be used and maintained actively for more than a short, transitional period.
That said, for some things a cross compiler can be helpful. For example, you could start with a cross-compiled version and over time replace that codebase with newly written code; this would get you working more quickly, and you would not have to maintain two separate codebases, in two different languages, using two toolsets, at the same time.

Java refactor to Generics industry standards

There seems to be some debate over refactoring to utilize java generics within my current team. The question I have is what are the current industry standards in terms of refactoring older Java code to take advantage of some of these features? Of course by industry standards I am referring to best practices. A link to a book or a site with these listed will be awarded the answer vote as that is the least subjective way to handle this question.
I don't think that blindly following what somebody else declares to be "best practice" or "industry standard" is ever a good idea. You're in the best position to decide whether changing your code is worthwhile or not.
The questions you need to answer are what benefits will you get from upgrading the old code, what will it cost, and what are the risks?
The main benefit is that you will have improved compile-time type checking, which should help to detect bugs in new code that uses the updated code. It may even highlight bugs in existing code. Code that uses generics, while sometimes quite verbose, is typically more readable as it is explicit about which types are valid in which contexts. You'll also no longer have to suppress/ignore compiler warnings.
The cost is the amount of time it will take to make and test the necessary changes to introduce generics. Any time you make code changes there is a chance that you might introduce bugs, so that's a risk. Do the benefits outweigh the costs? That depends on how much code you have, how it's being used and what other demands you have on your time.
The papers from this MIT research group might provide you with some useful guidelines:
Refactoring Java Applications to Use Generic Libraries
Efficiently refactoring Java applications to use generic libraries
Robert Fuhrer, Frank Tip, Julian Dolby, Adam Kiezun and Markus Keller
ECOOP 2005 --- Object-Oriented Programming, 19th European Conference, (Glasgow, Scotland), July 25-29, 2005.
Refactoring techniques for migrating applications to generic Java container classes
Frank Tip, Robert Fuhrer, Julian Dolby, and Adam Kiezun
IBM T.J. Watson Research Center IBM Research Report RC 23238, (Yorktown Heights, NY, USA), June 2, 2004.
Utilizing Java generics is definitely a good idea. It is backward compatible, so the code you cannot convert will continue to work with the new code.
EDIT: I should have mentioned type erasure as the reason this is backward compatible.
Best practices for adopting generics? The first best practice is "do". Try to eliminate as many casts as you can from your code. If you want to make your life easier, use IntelliJ's "Generify" refactoring -- just point it at your entire codebase, let it do its thing, and then do a little post-cleanup.
Your team should be balancing the benefits of a large-scale refactoring against the cost and the technical and business risks of doing this ... and the other priorities that your team has.
"Best practice" arguments and opinions from people who don't understand your project and the business context are simply not relevant here.
Best Practices don't exist. That's a weird term that suggests that the door closes on the 'bestness' of a particular solution... Use generics? Yes. Immediately. It's an awkward journey, since so many of the big libraries (Hibernate, Spring) still fail to embrace them completely.. but in my experience, dealing with a mix of generics and brave-casts still makes for a better code base than not using them at all.
I'd also make it policy to convert-as-you-touch instead of some sort of giant refactoring mission.
It is usually a good idea to refactor to use generics.
It's not required, so don't treat it as an urgent task, but you can consider not using generics a mild form of technical debt - so if you have time available and the codebase is expected to have a long ongoing life, it is worth investing in upgrading it.
The main benefits are:
Better compile-time type checking, which will reduce errors
Removal of unnecessary casts from your source code, which makes the code more readable
It is still backwards compatible with old code (thanks to type erasure)
There is no real downside - the only potential issue I can think of is if you ever wanted to port the code back to earlier versions of Java without generic support. But that would be a very unusual thing to do!
If you do decide to refactor into generics then I recommend the following steps:
Turn on all your compiler warnings (Eclipse has pretty good warnings)
Add the generic type parameter to your class first, e.g. MyClass<T>
Then change the type of any method signatures / internal fields / data structures to use T
You will probably have many warnings / errors throughout the code at this point. This is OK, just work through them and fix them all. Often your IDE may be able to "quick fix" many of them.
Write / refactor test cases that demonstrate the generic features
It is fairly quick to do all this - I think I managed to refactor about 10,000 lines of Java library code to use generics in less than one day, which included updating some client code.
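To give a feel for those steps, here is a hypothetical before/after sketch (the class names are invented) of a small container once the type parameter has been pushed through its fields and method signatures:

import java.util.ArrayList;
import java.util.List;

// Before: "class Registry { private final List items = new ArrayList(); ... }" with casts at every call site.
// After adding the type parameter and using it in fields and signatures:
class Registry<T> {
    private final List<T> items = new ArrayList<T>();

    public void add(T item) {
        items.add(item);
    }

    public T get(int index) {
        return items.get(index);
    }
}

class Client {
    public static void main(String[] args) {
        Registry<String> names = new Registry<String>();
        names.add("Alice");
        String first = names.get(0);   // no cast needed; the type is checked at compile time
        System.out.println(first);
    }
}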

Writing a Java standard class library alternative from scratch

I am just curious, but I want to know if it is feasible to completely remove the Java standard class libraries that come with the JVM and start a new one from scratch [à la ClassPath].
If that is possible, what classes MUST be implemented as a minimum? (Object and String come to mind, but... I do not know.)
Does such a thing break any license? Is there any way to tell the "java" command not to use rt.jar?
Thanks in advance,
Ernesto
You can use the -Xbootclasspath option to specify your own set of core classes.
If you do go down this path, you will probably end up with a lot of problems if you intend to also use third-party libraries, as they will depend on the core API, and any inconsistencies in your version will likely cause bugs.
As an absolute minimum you'd probably have to reimplement everything in the java.lang package. As well as Object and String, the primitive wrapper classes need to be present in order for auto-boxing to work. I don't think you can replace java.lang without a fair bit of native code to make things like threads work.
In theory, "yes" it is possible, though you might also need to implement your own JVM! The relationships between the JVM and some of the low level classes (Object, Class, Thread, etc) are such that these classes have to be implemented as part of the JVM.
In practice, it is such a big task that you'd be working on it for the rest of your life, and the chances are that nobody would use your code even if you succeeded. That doesn't sound like "fun" to me.
Does such a thing break any license?
Not per se. But if you ever tried to release it while calling it "Java", Sun's lawyers would come after you for trademark infringement. You can only legally call your implementation Java if it has been validated against the Sun TCK.
But I don't want to be totally discouraging. If you want to hack on the internals of a JVM or stuff like that, the JNode project is always looking for keen new people.
No, it is not feasible at all. I mean, sure, you could do it, but you aren't going to do it better than a large corporation or open source project with years of experience and large numbers of Java gurus. It might be fun to geek it up though.

Why will there be no native properties in Java 7?

Is there any rational reason, why native properties will not be part of Java 7?
There are some high-level reasons related to schedule and resources, of course. Implementing properties and understanding all of the ramifications and intersections with other language features is a large task, similar in size to the various Java 5 language changes.
But I think the real reason Sun is not pushing properties is the same as closures:
1) There is no consensus on what the implementation should look like. Or rather, there are many competing alternatives and people who are passionate about properties disagree about crucial parts of the implementation.
2) Perhaps more importantly, there is a significant lack of consensus about whether the feature is wanted at all. While many people want properties, there are also many people that don't think it's necessary or useful (in particular, I think server-side people see properties as far less crucial to their daily life than swing programmers).
Properties history here:
http://tech.puredanger.com/java7#property
Doing properties "right" in Java will not be easy. Rémi Forax's work especially has been valuable in figuring out what this might look like, and uncovering a lot of the "gotchas" that will have to be dealt with.
Meanwhile, Java 7 has already taken too long. The closures debate was a huge, controversial distraction that wasted a lot of mind-power that could have been used to develop features (like properties) that have broad consensus of support. Eventually, the decision was made to limit major changes to modularization (Project Jigsaw). Only "small change" is being considered for the language (under Project Coin).
JavaFX has beautiful property support, so Sun clearly understands the value of properties and knows how to implement them. But having been spoiled by JavaFX properties, developers are less likely to settle for a half-baked implementation in Java. If they are worth doing, they are worth doing right.
Any given thing is "not done" by default, so no particular reason is needed for something to remain not done. Rather some compelling reason is needed to move something from "not done" to "planned" or "done". No sufficiently compelling reason has yet arisen for this language feature.
There are two more reasons to avoid properties in any language:
Properties are not very object-oriented. Making them easy to write encourages the pattern where the object just serves up its internal state and the caller manipulates it. The object should provide higher-level methods and keep its internals private. Next time you're tediously implementing a getter, consider what the caller will do with the data and whether you can just provide that functionality directly.
Properties encourage mutable state (through setters), which makes a program less parallelizable. As the number of cores goes up, we should all be trying to make our objects immutable to make concurrent reasoning easier. Next time you're tediously implementing a setter, consider removing it and making the object immutable.
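As a rough sketch of that last point (the class names are invented for illustration), compare a setter-driven class with an immutable one that offers a higher-level operation instead:

// The property-style version: internal state is served up and mutated by callers.
class MutablePoint {
    private double x, y;
    public double getX() { return x; }
    public void setX(double x) { this.x = x; }
    public double getY() { return y; }
    public void setY(double y) { this.y = y; }
}

// The immutable alternative: callers ask for what they need and get a new object back.
final class Point {
    private final double x, y;
    Point(double x, double y) { this.x = x; this.y = y; }

    Point translated(double dx, double dy) {
        return new Point(x + dx, y + dy);
    }
}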
Not enough time?
Not yet specced properly?
Difficult to add to Java due to Java's implementation?
Deemed not important enough, i.e. other things were prioritised?

Is static metaprogramming possible in Java?

I am a fan of static metaprogramming in C++. I know Java now has generics. Does this mean that static metaprogramming (i.e., compile-time program execution) is possible in Java? If so, can anyone recommend any good resources where one can learn more about it?
No, this is not possible. Generics are not as powerful as templates. For instance, a C++ template argument can be a user-defined type, a primitive type, or a value; but a Java generic type argument can only be a reference type, i.e. Object or a subtype thereof.
Edit: This is an old answer; since 2011 we have Java 7, which has Annotations that can be used for such trickery.
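A tiny illustration of that restriction (the class name is made up for the example):

import java.util.ArrayList;
import java.util.List;

public class GenericsLimits {
    public static void main(String[] args) {
        List<Integer> boxed = new ArrayList<Integer>();   // fine: type arguments must be reference types
        // List<int> primitives = new ArrayList<int>();   // does not compile: primitives are not allowed
        // There is also no Java counterpart to a C++ value parameter such as the 10 in std::array<int, 10>.
        boxed.add(42);
        System.out.println(boxed);
    }
}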
The short answer
This question is more than 10 years old, but I am still missing one answer to it. And that is: yes, but not because of generics, and not quite the same as in C++.
As of Java 6, we have the pluggable annotation processing API. Static metaprogramming is (as you already stated in your question)
compile-time program execution
If you know about metaprogramming, then you also know that this is not entirely accurate, but for the sake of simplicity we will use this definition. Please look here if you want to learn more about metaprogramming in general.
The pluggable annotation processing API is called by the compiler right after the .java files are read, but before the compiler writes the bytecode to the .class files. (I had a source for this, but I cannot find it anymore... maybe someone can help me out here?)
It allows you to do logic at compile time with pure Java code. However, the world you are coding in is quite different. Not specifically bad or anything, just different. The classes you are analyzing do not yet exist; you are working on the meta-data of the classes. But the compiler runs in a JVM, which means you can also create classes and program normally. Furthermore, you can analyze generics, because the annotation processor is called before type erasure.
The main gist of static metaprogramming in Java is that you provide meta-data (in the form of annotations) and the processor is able to find all annotated classes and process them. A simple example can be found on Baeldung, which in my opinion is quite a good source for getting started. Once you understand it, try searching for more yourself; there are plenty of good sources out there, too many to list here. Also take a look at Google AutoService, which uses an annotation processor to take away the hassle of creating and maintaining the service files. If you want to create classes, I recommend looking at JavaPoet.
Sadly though, this API does not allow us to manipulate existing source code. But if you really want to, you should take a look at Project Lombok. They do it, but it is not supported.
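To give a concrete feel for the API, here is a minimal annotation processor sketch. The annotation name com.example.GenerateStub is made up, and registering the processor (via a META-INF/services file or Google AutoService) is left out for brevity:

import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic;

@SupportedAnnotationTypes("com.example.GenerateStub")   // hypothetical annotation
@SupportedSourceVersion(SourceVersion.RELEASE_8)
public class StubProcessor extends AbstractProcessor {
    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        for (TypeElement annotation : annotations) {
            for (Element element : roundEnv.getElementsAnnotatedWith(annotation)) {
                // The class is visible only as meta-data (an Element), not as a loaded Class.
                processingEnv.getMessager().printMessage(
                        Diagnostic.Kind.NOTE,
                        "Processing " + element.getSimpleName(), element);
                // New source files could be generated here via processingEnv.getFiler().
            }
        }
        return true; // claim these annotations so no other processor handles them
    }
}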
Why is this important (further reading for those interested)
TL;DR: It is quite baffling to me why we don't use static metaprogramming as much as dynamic metaprogramming, because it has many, many advantages.
Most developers see "dynamic vs. static" and immediately jump to the conclusion that dynamic is better. Nothing wrong with that; "static" has a lot of negative connotations for developers. But in this case (and specifically for Java) it is exactly the other way around.
Dynamic metaprogramming requires reflection, which has some major drawbacks. There are quite a lot of them; in short: performance, security, and design.
Static metaprogramming (i.e. annotation processing) allows us to hook into the compiler, which already does most of the things we try to accomplish with reflection. We can also create classes in this process, which are again passed to the annotation processors. You can then (for example) generate classes that do what would normally have to be done using reflection. Furthermore, we can implement a "fail fast" system, because we can inform the compiler about errors, warnings and such.
To conclude and compare as much as possible: let us imagine Spring. Spring tries to find all @Component-annotated classes at runtime (which we could simplify by using service files generated at compile time), then generates certain proxy classes (which we could already have done at compile time) and resolves bean dependencies (which, again, we could already have done at compile time). See Jake Wharton's talk about Dagger 2, in which he explains why they switched to static metaprogramming. I still don't understand why big players like Spring don't use it.
This post is too short to fully explain those differences and why static metaprogramming would be more powerful. I am currently working on a presentation about this; if you are interested and speak German (sorry about that), you can have a look at my website, where you will find a presentation that tries to explain the differences in 45 minutes. Only the slides, though.
Take a look at Clojure. It's a LISP with Macros (meta-programming) that runs on the JVM and is very interoperable with Java.
What exactly do you mean by "static metaprogramming"? Yes, C++-style template metaprogramming is impossible in Java, but Java offers other methods, much more powerful than those from C++:
reflection
aspect-oriented programming (AspectJ)
bytecode manipulation (Javassist, ObjectWeb ASM, Java agents)
code generation (Annotation Processing Tool, template engines like Velocity)
Abstract Syntax Tree manipulations (APIs provided by popular IDEs)
possibility to run Java compiler and use compiled code even at runtime
There's no best method: each of those methods has its strengths and weaknesses.
Due to flexibility of JVM, all of those methods in Java can be used both at compilation time and runtime.
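As a small taste of the first item, reflection, here is a sketch (the class name is invented) that resolves and invokes a method chosen at runtime:

import java.lang.reflect.Method;

public class ReflectionDemo {
    public static void main(String[] args) throws Exception {
        Object target = "hello";
        // Look the method up by name at runtime and invoke it dynamically.
        Method toUpper = target.getClass().getMethod("toUpperCase");
        System.out.println(toUpper.invoke(target));   // prints HELLO
    }
}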
No. What's more, generic types are erased to their upper bound by the compiler, so you cannot create a new instance of a generic type T at runtime.
The best way to do metaprogramming in Java is to circumvent type erasure and pass in the Class<T> object of your type T. Still, this is only a hack.
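A minimal sketch of that hack, using an invented Factory class as the example:

public class Factory<T> {
    private final Class<T> type;

    public Factory(Class<T> type) {
        this.type = type;
    }

    // Erasure rules out "new T()", but the Class object lets us create instances reflectively.
    public T create() throws ReflectiveOperationException {
        return type.getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) throws ReflectiveOperationException {
        Factory<StringBuilder> factory = new Factory<StringBuilder>(StringBuilder.class);
        StringBuilder sb = factory.create();
        sb.append("built reflectively");
        System.out.println(sb);
    }
}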
If you need powerful compile-time logic for Java, one way to do that is with some kind of code generation. Since, as other posters have pointed out, the Java language doesn't provide any features suitable for doing compile-time logic, this may be your best option (iff you really do have a need for compile-time logic). Once you have exhausted the other possibilities and you are sure you want to do code-generation, you might be interested in my open source project Rjava, available at:
http://www.github.com/blak3mill3r
It is a Java code generation library written in Ruby, which I wrote in order to generate Google Web Toolkit interfaces for Ruby on Rails applications automatically. It has proved quite handy for that.
As a warning, it can be very difficult to debug Rjava code: Rjava doesn't do much checking; it just assumes you know what you're doing. That's pretty much the state of static metaprogramming anyway. I'd say it's significantly easier to debug than anything non-trivial done with C++ TMP, and it is possible to use it for the same kinds of things.
Anyway, if you were considering writing a program which outputs Java source code, stop right now and check out Rjava. It might not do what you want yet, but it's MIT licensed, so feel free to improve it, deep fry it, or sell it to your grandma. I'd be glad to have other devs who are experienced with generic programming to comment on the design.
Lombok offers a weak form of compile time metaprogramming. However, the technique they use is completely general.
See Java code transform at compile time for a related discussion
You can use a metaprogramming library for Java such as Spoon: https://github.com/INRIA/spoon/
No; generics in Java are purely a way to avoid casting from Object.
In a very reduced sense, maybe?
http://michid.wordpress.com/2008/08/13/type-safe-builder-pattern-in-java/
