So I'm pretty new to Java, and this may be a question born of some misconceptions, but I'm just curious: do all "forms" of Java (I guess different JREs? Or VMs?), for example Dalvik, which I've heard of, include an automatically imported System.out? What makes them different from the standard one I downloaded from Oracle to learn with? Please let me know if I'm not understanding any concepts correctly.
In addition to the Java Language Specification, which says in section 7.3 that
Every compilation unit implicitly imports every public type name declared in the predefined package java.lang, as if the declaration import java.lang.*; appeared at the beginning of each compilation unit immediately after any package declaration. As a result, the names of all those types are available as simple names in every compilation unit.
there is also a Java Compatibility Kit which any conforming implementation of Java must pass. Thus, anything which wants to call itself "Java" must include all the parts of Java, including System.out.
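As a concrete illustration (a trivial sketch, class name made up), java.lang members such as String and System.out are available in any conforming implementation without a single import line:

public class HelloNoImports {
    public static void main(String[] args) {
        String greeting = "Hello";       // java.lang.String - no import needed
        System.out.println(greeting);    // java.lang.System - no import needed
    }
}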
On a Java web project (running on Tomcat & JSF & Spring), a custom renderer was written to get custom converters to be called even if the value to be converted is null, as is explained here: JSF Custom Converter not called on null value
However, the SonarQube scan is detecting an issue on the import line, namely:
import com.sun.faces.renderkit.html_basic.TextRenderer;
as it is a com.sun.* package and not a standard Java API package. The rule description states:
Classes from "sun.*" packages should not be used (squid:S1191)
Classes in the sun.* or com.sun.* packages are considered implementation details, and are not part of the Java API.
They can cause problems when moving to new versions of Java because there is no backwards compatibility guarantee. Similarly, they can cause problems when moving to a different Java vendor, such as OpenJDK.
Such classes are almost always wrapped by Java API classes that should be used instead.
Noncompliant Code Example
import com.sun.jna.Native; // Noncompliant
import sun.misc.BASE64Encoder; // Noncompliant
This makes good sense and all, but I can't find a Java API wrapper for this class, only the source code and packages that the class is in... What is the appropriate action to take in this case?
I think this is something of a false positive. Using com.sun.faces does mean using internal, implementation-specific classes, but they belong to JSF rather than to the JDK. Those classes will not be removed by other JVMs or by a new JDK version; you are simply binding your code to Sun's (Oracle's) implementation of JSF, which may or may not be acceptable for you.
Looking at the code of that rule on GitHub, it looks like it can be configured to avoid such false positives by setting the exclude property to a comma-separated list. I am not sure exactly where you can do this in the UI, but https://docs.sonarqube.org/display/SONARQUBE50/Configuring+Rules might be a starting point.
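If you would rather not touch the rule configuration at all, you can also suppress the finding only where it occurs. The sketch below is hedged: the issue is raised on the import line, so a // NOSONAR comment there silences it, and depending on your SonarJava version @SuppressWarnings("squid:S1191") on the class may work as well; the class name is made up.

import com.sun.faces.renderkit.html_basic.TextRenderer; // NOSONAR - deliberate binding to the Mojarra implementation

@SuppressWarnings("squid:S1191") // may also suppress the rule, depending on the SonarJava version
public class NullAwareTextRenderer extends TextRenderer {
    // custom renderer so converters are also called for null values, as described in the question
}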
I was doing some experiments today in android source.
Let me explain the whole situation.
I compiled framework.jar from the Android source, decompiled it to generate the smali source, and kept that aside. Then I added the commits for a feature from the CyanogenMod repo to the Android source, compiled framework.jar again, and decompiled it again to see the changes in terms of smali, so that I could port them over to my ROM.
The feature I am porting requires importing certain classes, e.g. import dalvik.system.VMRuntime, plus extra code that makes use of those imported classes. My problem is that in the smali code I can only see the extra code, i.e. the usage of those classes, but not the imports. So when I port only the smali code I get java.lang.NoSuchMethodError in logcat, which shows that it cannot find that method. The reason seems clear: the necessary class is not imported. So how do I do that in smali code? I see no way to do it in smali, and because of that the newly introduced methods don't work.
Any feasible solution to this?
The only thing an import does in Java is let you mention a class without having to specify the full package name. In smali, there are no imports - everything always uses the fully qualified class name that includes the package.
As such, your problem is definitely not related to imports. It sounds like you are trying to use a method that simply doesn't exist on the device.
You can disassemble the framework jars from your device and find the definition of the dalvik.system.VMRuntime and see what methods are available. Or alternately add some reflection calls and log the info to logcat.
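For the reflection route, a small sketch like this (the log tag is arbitrary; it only lists the methods and doesn't invoke anything) can be dropped into any code path that runs on the device:

try {
    // see what dalvik.system.VMRuntime actually provides on this particular device
    Class<?> vmRuntime = Class.forName("dalvik.system.VMRuntime");
    for (java.lang.reflect.Method m : vmRuntime.getDeclaredMethods()) {
        android.util.Log.d("VMRuntimeCheck", m.toString());
    }
} catch (ClassNotFoundException e) {
    android.util.Log.d("VMRuntimeCheck", "dalvik.system.VMRuntime not found", e);
}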
It's worth noting that VMRuntime is not part of the public API, and it may not be present or consistent on all devices or future versions of Android.
(I don't know smali, so there may be a much better solution)
No Java program ever requires any import statement. To use e.g. ArrayList you need to either import it or refer to it in full, as java.util.ArrayList.
There is a significant difference between e.g. a C++ #include and a Java import. A C++ #include directly inserts code in your program, typically the declaration for a class you are using. The process of getting the declarations is divided into two stages in Java. First the compiler determines the fully qualified class name, then it uses its own library and the classpath to find the declaration for that name. Java import is used only in calculating the fully qualified class name, and so is not needed for any class that is only referred to by its fully qualified name.
Perhaps you could pre-process the code you are adding to replace e.g. VMRuntime with dalvik.system.VMRuntime etc. so that you can compile it with no imports.
Here is an example of a short program that uses java.util classes, in different ways, without any import:
public class Test {
    public static void main(String[] args) {
        java.util.List<String> list = new java.util.ArrayList<String>();
        list.add("bbb");
        list.add("aaa");
        java.util.Collections.sort(list);
        System.out.println(list);
    }
}
I have a functionality that I wish to provide to a customer for a software mockup that we are preparing - and I want to know if it's
possible
intelligent (a.k.a. not stupid)
the best thing
I want the customer to be able to write a Java class that implements my Computable interface and stick it in some predetermined folder. This folder will contain the .java files rather than .class files. Then, at runtime, I want my program to search that folder, extract all of the Computables from it, and store them in a map from the name of the Computable to the Computable object. A Computable should have only a default constructor, and the interface will have only one method, compute, which maps an array of Object to an Object.
The Java Compiler API introduced in Java SE 6 should give you what you need.
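A minimal sketch of that API is below; the source file path is hypothetical, and note that ToolProvider.getSystemJavaCompiler() returns null when only a JRE is installed:

import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;

public class CompileAtRuntime {
    public static void main(String[] args) {
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        if (compiler == null) {
            throw new IllegalStateException("No system compiler found - a JDK (not just a JRE) is required");
        }
        // compile one of the customer's .java files; the .class file lands next to the source by default
        int status = compiler.run(null, null, null, "plugins/UserComputable.java");
        System.out.println(status == 0 ? "compiled" : "compilation failed");
    }
}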
You may find Google Reflections useful to find classes implementing/extending a certain interface/superclass in the classpath. It's then as straightforward as
Reflections reflections = new Reflections("my.project.prefix");
Set<Class<? extends SomeClassOrInterface>> subTypes = reflections.getSubTypesOf(SomeClassOrInterface.class);
Then, to test if it indeed has a no-arg default constructor, just check for each if Class#newInstance() doesn't throw any exception.
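Continuing that snippet, a rough sketch of the constructor check might look like this (keying the map by simple class name is just one possible choice):

java.util.Map<String, SomeClassOrInterface> instances = new java.util.HashMap<String, SomeClassOrInterface>();
for (Class<? extends SomeClassOrInterface> type : subTypes) {
    try {
        // newInstance() fails unless there is an accessible no-arg constructor
        instances.put(type.getSimpleName(), type.newInstance());
    } catch (InstantiationException e) {
        // abstract class or no default constructor - skip it
    } catch (IllegalAccessException e) {
        // constructor not accessible - skip it
    }
}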
There are several suggestions provided as answers to this question.
Here too On-the-fly, in-memory java code compilation for Java 5 and Java 6
If it's easy enough to compile at runtime that would be fine.
You can use javax.tools to do the compilation as needed. Create dynamic applications with javax.tools may help, too. It's also possible to do it in memory.
One caveat: using the compiler creates a dependency on the JDK; the JRE alone is insufficient.
Take a look: Find Java classes implementing an interface
I think this would be simpler if you allowed your customer to type in a code declaration using something like Groovy, which is Java-ish enough, and easy to execute at runtime from a String value.
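As a rough sketch of what that could look like from the Java side (GroovyShell evaluates source supplied as a String; wiring the result up to the Computable interface is left out):

import groovy.lang.GroovyShell;

public class GroovyEvalExample {
    public static void main(String[] args) {
        GroovyShell shell = new GroovyShell();
        // evaluate a code snippet the customer could have typed in at runtime
        Object result = shell.evaluate("(1..5).sum()");
        System.out.println(result);   // 15
    }
}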
It's easy enough to iterate through the list of files in a folder. Someone mentioned that it's possible to call the Java compiler from Java (if you redistribute the JDK, which I think is a point whose legality needs checking!!). That's much of the battle.
You seem to have a fixed model in your mind where only files fulfilling a certain interface are extracted from the folder. I think this is where your method needs to give a little. The sensible way (IMO) to do this would be to compile all files in that folder, and then with their classes stashed away somewhere, you can load and reflect them and then determine which of them "do" the interface and which don't. Those that don't will have been needlessly loaded into your JVM, but unless it's intentionally very space-wasteful, code you don't execute can't harm your program.
Having determined which ones do the computable thing, you can then store those classes (or instances thereof) in a Collection and do whatever you like with them. You simply ignore the other ones.
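A rough sketch of that approach, assuming the compiled .class files end up in a flat "plugins" folder and that Computable is the interface from the question (error handling and package names are glossed over):

File pluginDir = new File("plugins");
URLClassLoader loader = new URLClassLoader(new URL[] { pluginDir.toURI().toURL() });
Map<String, Computable> computables = new HashMap<String, Computable>();
for (File f : pluginDir.listFiles()) {
    if (!f.getName().endsWith(".class")) {
        continue;                                        // skip sources and anything else
    }
    String name = f.getName().substring(0, f.getName().length() - ".class".length());
    Class<?> clazz = loader.loadClass(name);
    if (Computable.class.isAssignableFrom(clazz)) {      // keep only the ones that "do" the interface
        computables.put(name, (Computable) clazz.newInstance());
    }                                                    // classes that don't implement it are simply ignored
}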
You could use BeanShell. This library is small and doesn't require the JDK. It is used in a number of IDEs and web servers. The latest version (still in beta) appears to have the support you need for loading .java files from the classpath.
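A small sketch of embedding BeanShell (exception handling omitted; the file name is hypothetical):

bsh.Interpreter interpreter = new bsh.Interpreter();
Object quick = interpreter.eval("1 + 2");             // evaluate Java-like source from a String
System.out.println(quick);                            // 3
interpreter.source("plugins/UserComputable.java");    // or interpret a .java file directly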
I know that the package java.lang is auto-imported by every Java program we write, hence all the classes in it are automatically available to us.
My question is: why not auto-import java.util and other packages too? That would sure save some typing :)
So please explain why this is not done.
A good reason not to autoimport too much is to avoid namespace clashes. If everything in java.util was imported automatically and then you wanted to refer to a different class named 'Map', for example, you would have to refer to it by its fully-qualified name.
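A made-up illustration of that clash (the class and package names below are hypothetical): it shows why an implicit java.util.* would push one of the two Map types back to its fully qualified name.

package com.example.geo;

// your own domain class that happens to be called Map
public class Map {
    // if java.util.* were imported automatically, only one "Map" could win the simple name,
    // so the other one would have to be written out in full:
    private java.util.Map<String, String> legend = new java.util.HashMap<String, String>();
}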
In response to other answers in this thread, import does not actually modify the internal representation of your class files. In fact, here is a link to the JVM spec describing the class file structure: you can see that imports are not stored anywhere.
All good IDEs will resolve your imports automatically, prompting only when there is a conflict (two packages with the same class name).
Because java.lang holds the core Java language classes, and java.util, for example, does not.
But some other languages, like Groovy, automatically import java.util for you :)
I think the idea behind java.lang is that these classes all have some connection to the language and runtime which is special, and can't be implemented on your own. Primitive wrappers, VM security and permissions and inspection, package and class loading -- all things that must be built-in to the Java system. Everything in java.util, like collections, while incredibly useful, could be implemented in pure Java. Some parts of it (time zones come to mind) have even been implemented by third-party libraries even better.
Or at least, that was true back in the Java 1.0 days. Today, for example, Iterator is also integral to the language, since it's automatically used by for-each loops, right? But backwards-compatibility was always a big thing with Java, so we get to live with this inconsistency forever.
I ran into a namespace collision even with java.lang.System (our application contains a class named System). An explicit import solved my problem, but it took me a few minutes to work out that Eclipse wasn't automatically importing com.mycompany.classes.System for the identifier System, because System already exists in java.lang.
Anyway, it isn't a good idea to pollute the class scope with too many identifiers, because your code will org.classes.**look** com.application.**like** com.classes.**this**.
Auto-importing java.lang is a good idea because it contains the very core classes and interfaces used in Java.
#gameover, maybe every Java program needs the classes that come from java.lang, but whether a class from java.util is needed or not depends on the programmer. So Java imports java.lang by default, while we import java.util classes ourselves according to what our program needs.
The java.lang package provides the fundamental classes used to build Java programs. Object is the root of the class hierarchy, so it needs to be available to every programmer, whether beginner or expert.
Other packages, by contrast, are used to extend programs. For example, the java.util package is only needed when collection classes are used; not every program uses collections, but the basic types are essential to every Java program.
To avoid loading other classes unnecessarily, those packages are not auto-imported, while the essential java.lang package is.
There are different reasons why the java.lang package is the default:
1)
Whatever variables we declare in a Java program are ultimately objects, and the Object class is in java.lang. It is the superclass of the whole class hierarchy, and we regularly use its methods, for example in thread programming.
2)
Many of the classes required to develop a program are in java.lang, so it would need to be imported constantly; when the JDK was developed, it was made the default package.
Auto-importing everything can also add memory overhead, and sometimes names conflict.
E.g. the Date type in java.util and in java.sql.
Also, the base core Java types are defined in the java.lang package.
E.g. the basic data types and return types are all declared in java.lang, so even the most basic program needs this package imported.
Without the java.lang package, development is not possible.
But without java.util, java.io or any other package, we can still write any number of programs.
That is the reason why the java.lang package is imported by default.
I recently started learning Java and found it very strange that every Java public class must be declared in a separate file. I am a C# programmer and C# doesn't enforce any such restriction.
Why does Java do this? Were there any design considerations?
Edit (based on a few answers):
Why is Java not removing this restriction now in the age of IDEs? This will not break any existing code (or will it?).
I have just taken a C# solution and done exactly this (removed every file that had multiple public classes in it and broke them out into individual files), and it has made life much easier.
If you have multiple public classes in a file you have a few issues:
What do you name the file? After one of the public classes? Another name? People have enough issues with poor code organization and file naming conventions without adding one more.
Also, when you are browsing the file/project explorer, it's good that things aren't hidden. For example, you see one file, drill down, and there are 200 classes all mushed together. If you have one class per file, you can organize your tests better and get a feel for the structure and complexity of a solution.
I think Java got this right.
According to the Java Language Specification, Third Edition:
This restriction implies that there must be at most one such type per compilation unit. This restriction makes it easy for a compiler for the Java programming language or an implementation of the Java virtual machine to find a named class within a package; for example, the source code for a public type wet.sprocket.Toad would be found in a file Toad.java in the directory wet/sprocket, and the corresponding object code would be found in the file Toad.class in the same directory.
Emphasis is mine.
It seems like basically they wanted to translate the OS's directory separator into dots for namespaces, and vice versa.
So yes, it was a design consideration of some sort.
From Thinking in Java:
There can be only one public class per compilation unit (file).
The idea is that each compilation unit has a single public interface represented by that public class. It can have as many supporting “friendly” classes as you want. If you have more than one public class inside a compilation unit, the compiler will give you an error message.
From the specification (7.2.6)
When packages are stored in a file system (§7.2.1), the host system may choose to enforce the restriction that it is a compile-time error if a type is not found in a file under a name composed of the type name plus an extension (such as .java or .jav) if either of the following is true:
The type is referred to by code in other compilation units of the package in which the type is declared.
The type is declared public (and therefore is potentially accessible from code in other packages).
This restriction implies that there must be at most one such type per compilation unit.
This restriction makes it easy for a compiler for the Java programming language or an implementation of the Java virtual machine to find a named class within a package; for example, the source code for a public type wet.sprocket.Toad would be found in a file Toad.java in the directory wet/sprocket, and the corresponding object code would be found in the file Toad.class in the same directory.
In short: it may be about finding classes without having to load everything on your classpath.
Edit: "may choose" seems like it leaves the possibility to not follow that restriction, and the meaning of "may" is probable the one described in RFC 2119 (i.e. "optional")
In practice though, this is enforced on so many platforms and relied upon by so many tools and IDEs that I do not see any "host system" choosing not to enforce the restriction.
From "Once upon an Oak ..."
It's pretty obvious - like most things are once you know the design reasons - that without this rule the compiler would have to make an additional pass through all the compilation units (.java files) to figure out which classes were where, and that would make compilation even slower.
(Note:
the Oak Language Specification for Oak version 0.2 (postscript document): Oak was the original name of what is now commonly known as Java, and this manual is the oldest manual available for Oak (i.e. Java).
For more history on the origins of Java, please have a look at the Green Project and Java(TM) Technology: An Early History
)
It's just to avoid confusion in the sense that Java was created with simplicity in mind from the perspective of the developer. Your "primary" classes are your public classes and they are easy to find (by a human) if they are in a file with the same name and in a directory specified by the class's package.
You must recall that the Java language was developed in the mid-90s, in the days before IDEs made code navigation and searching a breeze.
If a class is only used by one other class, make it a private inner class. This way you have your multiple classes in a file.
If a class is used by several other classes, into which of those classes' files would you put it? All of them? You would end up having all your classes in a single file...
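A tiny sketch of the first case (all names made up; a static nested class is shown, but a non-static inner class serves the same purpose here):

public class Outer {
    public void run() {
        new Helper().doWork();
    }

    // lives in Outer.java, invisible to every other class
    private static class Helper {
        void doWork() {
            System.out.println("only Outer can use me");
        }
    }
}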
That's just how the language designers decided to do it. I think the main reason was to optimize the compiler's passes - the compiler does not have to guess or parse through files to locate the public classes. I think it's actually a good thing: it makes the code files much easier to find, and it forces you to stay away from putting too much into one file. I also like how Java forces you to put your code files in the same directory structure as the package - that makes it easy to locate any code file.
It is technically legal to have multiple Java top level classes in one file. However this is considered to be bad practice (in most cases), and some Java tools may not work if you do this.
The JLS says this:
When packages are stored in a file system (§7.2.1), the host system may choose to enforce the restriction that it is a compile-time error if a type is not found in a file under a name composed of the type name plus an extension (such as .java or .jav) if either of the following is true:
The type is referred to by code in other compilation units of the package in which the type is declared.
The type is declared public (and therefore is potentially accessible from code in other packages).
Note the use of may in the JLS text. This says that a compiler may reject this as invalid, or it may not. That is not a good situation if you are trying to build your Java code to be portable at the source code level. Thus, even if multiple classes in one source file works on your development platform, it is bad practice to do this.
My understanding is that this "permission to reject" is a design decision that is intended in part to make it easier to implement Java on a wider range of platforms. If (conversely) the JLS required all compilers to support source files containing multiple classes, there would be conceptual issues implementing Java on a platform which wasn't file-system based.
In practice, seasoned Java developers don't miss being able to do this at all. Modularization and information hiding are better done using an appropriate combination of packages, class access modifiers and inner or nested classes.
Why is Java not removing this restriction now in the age of IDEs? This will not break any existing code (or will it?).
Now all code is uniform: when you see a source file you know what to expect, and it is the same for every project. If Java were to remove this convention, you would have to relearn the code structure for every project you work on, whereas now you learn it once and apply it everywhere. We should not be trusting IDEs for everything.
Not really an answer to the question, but a data point nonetheless.
I grepped the headers of my personal C++ utility library (you can get it yourself from here), and almost all of the header files that actually do declare classes (some just declare free functions) declare more than one class. I like to think of myself as a pretty good C++ designer (though the library is a bit of a bodge in places - I'm its only user), so I suggest that for C++ at least, multiple classes in the same file are normal and even good practice.
It allows for simpler heuristics for going from Foobar.class to Foobar.java.
If Foobar could be in any Java file, you would have a mapping problem, which may eventually mean a full scan of all Java files to locate the definition of the class.
Personally, I have found this to be one of those strange rules which, taken together, are why Java applications can grow very large and still stay sturdy.
Well, actually it is an optional restriction according to the Java Language Specification (Section 7.6, Page No. 209), but the Oracle Java compiler enforces it as mandatory. According to the Java Language Specification,
When packages are stored in a file system (§7.2.1), the host system may choose to enforce the restriction that it is a compile-time error if a type is not found in a file under a name composed of the type name plus an extension (such as .java or .jav) if either of the following is true:
The type is referred to by code in other compilation units of the package in which the type is declared.
The type is declared public (and therefore is potentially accessible from code in other packages).
This restriction implies that there must be at most one such type per compilation unit. This restriction makes it easy for a Java compiler to find a named class within a package.
In practice, many programmers choose to put each class or interface type in its own compilation unit, whether or not it is public or is referred to by code in other compilation units.
For example, the source code for a public type wet.sprocket.Toad would be found in a file Toad.java in the directory wet/sprocket, and the corresponding object code would be found in the file Toad.class in the same directory.
To get a clearer picture, let's imagine there are two public classes, public class A and public class B, in the same source file, and that class A has a reference to the not-yet-compiled class B. While compiling (compiling-linking-loading) class A, the compiler would be forced to examine every *.java file in the current package when linking to class B, because class B does not have its own B.java file. So in that case it would be somewhat time-consuming for the compiler to find which class lies in which source file and in which class the main method lies.
So the reason for keeping one public class per source file is to make the compilation process faster: it enables a more efficient lookup of source and compiled files while resolving import statements. The idea is that if you know the name of a class, you know where it should be found for each classpath entry, so no indexing is required.
Also, as soon as we execute our application, the JVM by default looks for the public class (since it has no restrictions and can be accessed from anywhere) and then for public static void main(String args[]) in that public class; the public class acts as the initial class from which the JVM instance for the Java application (program) is started. So when we provide more than one public class in a source file, the compiler itself stops us with an error, so that the JVM can never be confused about which class should be its initial class: only one public class with public static void main(String args[]) is the initial class for the JVM.
You can read more on Why Single Java Source File Can Not Have More Than One public class