I was answering this question, where I recommended using the qualified exports ... to syntax to prevent external consumers from accessing code that is intended for internal use between modules.
But on further reflection, the only real safety check the module system performs is that the consuming module's name matches. Consider this example where I am implementing two modules:
module a {
    exports unsafe to b;
}

module b {
    requires a;
}
The package unsafe contains code that would be unsafe to expose. Is there any way to securely export this package to internal modules without exposing it externally?
In the above example, a rogue entity could simply name their module b and would gain access to the code (not secure). The JLS doesn't seem to spell out anything that can prevent it.
The hashing of modules, as pointed out by Alan, should work in your case. Though I personally like the description and the example from the jmod tool documentation, which directly answers your question:
With the --hash-modules option or the jmod hash command, you can, in each module's descriptor, record hashes of the content of the modules that are allowed to depend upon it, thus "tying" together these modules.
This lets you allow a package to be exported to one or more specifically-named modules and to no others through qualified exports. The runtime verifies if the recorded hash of a module matches the one resolved at run time; if not, the runtime returns an error.
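For illustration, here is a rough sketch of how this ties in with the declarations from the question. The jmod invocation is only indicative, so check jmod --help on your JDK for the exact options:

// module-info.java of module a: the qualified export alone only checks the
// *name* of the consuming module.
module a {
    exports unsafe to b;
}

// After packaging the modules, hashes can be recorded so that a's descriptor
// is tied to the actual content of b, roughly along the lines of:
//
//   jmod hash --module-path mods --hash-modules b
//
// At run time, a module named b whose hash does not match the recorded one
// is rejected.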
Related
In the Java module system, we can have:
module hellomodule {
    exports com.name.hello;
    requires transitive greetings;
}
By doing this, the packages exported by the greetings module effectively become part of the API exposed by hellomodule.
We may want to avoid this to a certain degree; it would be nice, for example, to allow visibility only on certain classes, perhaps the ones hellomodule uses in the signatures of its methods.
Is there any way to do this, i.e. to allow only certain classes or packages to be leaked?
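For reference, a minimal sketch of the two variants being compared (everything except the comments is taken from the snippet above; the two declarations are alternatives, not one file):

// Variant 1: plain requires - modules that read hellomodule do NOT
// automatically read greetings, so its packages stay an implementation detail.
module hellomodule {
    exports com.name.hello;
    requires greetings;
}

// Variant 2: requires transitive - every module that requires hellomodule
// also reads greetings, so greetings' exported packages effectively become
// part of hellomodule's API, which is the situation described above.
module hellomodule {
    exports com.name.hello;
    requires transitive greetings;
}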
I expected it to be possible to use, for example, Guava 19 in myModuleA and Guava 20 in myModuleB, since Jigsaw modules have their own class path.
Let's say myModuleA uses Iterators.emptyIterator(), which was removed in Guava 20, and myModuleB uses the new static method FluentIterable.of(), which wasn't available in Guava 19. Unfortunately, my test is negative: at compile time it looks fine, but at runtime the result is a NoSuchMethodError. That means the class that comes first on the class loader decides which module fails.
Encapsulation, but with this underlying coupling? I found a reason for myself: it couldn't be supported, because transitive dependencies would have the same problem as before. If a Guava class with version conflicts occurred in a signature in myModuleA and myModuleB depended on it, which class should be used?
But why can we read all over the internet that "Jigsaw - the module system - stops classpath hell"? We now have multiple smaller "similar-to-classpaths" with the same problems. It's more an uncertainty than a question.
Version Conflicts
First a correction: You say that modules have their own class path, which is not correct. The application's class path remains as it is. Parallel to it, the module path was introduced, but it essentially works the same way. In particular, all application classes are loaded by the same class loader (by default, at least).
That there is only a single class loader for all application classes also explains why there can't be two versions of the same class: The entire class loading infrastructure is built on the assumption that a fully qualified class name suffices to identify a class with a class loader.
This also points the way to the solution for multiple versions: as before, you can achieve that by using different class loaders. The module system's native way to do that is to create additional layers (each layer has its own loader).
Module Hell?
So does the module system replace class path hell with module hell? Well, multiple versions of the same library are still not possible without creating new class loaders, so this fundamental problem remains.
On the other hand, you now at least get an error at compile or launch time due to split packages, which prevents the program from subtly misbehaving at runtime - that is not so bad, either.
Theoretically it is possible to use different versions of the same library within your application. The concept that enables this: layering!
When you study Jigsaw under the hood you find a whole section dedicated to this topic.
The idea is basically that you can further group modules using these layers. Layers are constructed at runtime, and each layer has its own class loader. Meaning: it should be absolutely possible to use modules in different versions within one application - they just need to go into different layers. And as shown, this kind of "multiple version support" is actively discussed by the people working on Java/Jigsaw. It is not an obscure feature; it is meant to support different module versions under one hood.
The only disclaimer at this point: unfortunately there are no "complete" source code examples out there (that I know of), so I can only link to that Oracle presentation.
In other words: there is some sort of solution to this versioning problem on the horizon, but it will take more time to gather real-world experience with this new idea. And to be precise: you can have different layers that are isolated by different class loaders. There is no support for having "the same object" use modV1 and modV2 at the same time. You can only have two objects, one using modV1 and the other modV2.
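To make that a bit more concrete, here is a rough, untested sketch of what such a layered setup could look like. The directory names are made up, and "guava" is assumed to be the automatic module name derived from the Guava jars, so adjust both to your setup:

import java.lang.module.Configuration;
import java.lang.module.ModuleFinder;
import java.nio.file.Paths;
import java.util.Set;

public class TwoVersionsDemo {

    // Builds one extra layer, with its own class loader, from a directory
    // of modular (or automatic-module) jars.
    static ModuleLayer createLayer(String dir, String rootModule) {
        ModuleLayer parent = ModuleLayer.boot();
        Configuration cf = parent.configuration().resolve(
                ModuleFinder.of(Paths.get(dir)),   // modules of this layer
                ModuleFinder.of(),
                Set.of(rootModule));
        return parent.defineModulesWithOneLoader(cf, ClassLoader.getSystemClassLoader());
    }

    public static void main(String[] args) throws Exception {
        // One layer per library version (hypothetical directories).
        ModuleLayer v19 = createLayer("mods/guava-19", "guava");
        ModuleLayer v20 = createLayer("mods/guava-20", "guava");

        Class<?> c19 = v19.findLoader("guava").loadClass("com.google.common.collect.Iterators");
        Class<?> c20 = v20.findLoader("guava").loadClass("com.google.common.collect.Iterators");

        // Same fully qualified name, but two distinct classes in two loaders:
        System.out.println(c19 == c20);                                    // false
        System.out.println(c19.getClassLoader() == c20.getClassLoader());  // false
    }
}

Each call to createLayer produces a separate class loader, which is exactly the isolation described above: two objects, each backed by the class from its own layer.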
(German readers might want to have a look here - that publication contains another introduction to the topic of layers.)
Java 9 doesn't solve such problems. In a nutshell, what Java 9 does is extend the classic access modifiers (public, protected, package-private, private) to the JAR level.
Prior to Java 9, if a module A depended on module B, then all public classes from B were visible to A.
With Java 9, visibility can be configured, so it can be limited to a subset of packages: each module defines which packages it exports and which modules it requires.
Most of those checks are done by the compiler.
From a runtime perspective (class loader architecture), there is no big change: all application modules are loaded by the same class loader, so it's not possible to have the same class in different versions in the same JVM unless you use a modular framework like OSGi or manipulate class loaders yourself.
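You can see this for yourself with a small check (a sketch; the exact modules in the output depend on your application):

import java.util.stream.Collectors;

public class LoaderCheck {
    public static void main(String[] args) {
        // Group the modules of the boot layer by the class loader that defines them.
        // With a standard setup, all application modules fall under the single
        // application class loader; only the java.*/jdk.* modules use the
        // boot and platform loaders.
        ModuleLayer.boot().modules().stream()
                .collect(Collectors.groupingBy(m -> String.valueOf(m.getClassLoader())))
                .forEach((loader, modules) -> System.out.println(loader + " -> " + modules));
    }
}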
As others have hinted, JPMS layers can help with that. You can use them manually, but Layrry might be helpful to you: it is a fluent API and configuration-based launcher for running layered applications. It allows you to define the layer structure by means of configuration, and it will fire up the layer graph for you. It also supports the dynamic addition and removal of layers at runtime.
Disclaimer: I'm the initial creator of Layrry
In Java 9 module declarations there are two constructs:
exports com.foo;
And
opens com.foo;
exports grants compile-time access, while opens allows runtime access, such as reflection and resource loading.
opens has one leniency over exports: you can declare the whole module as open, which has the same result as explicitly opening every package:
open module com.mod {
But there is no similar construct:
exported module com.mod {
My question: why is that so? What decisions were made to allow opening a whole module at once, but not exporting it?
A module's exports define its API, which should be deliberately designed and kept stable. An "exported module" could easily and inadvertently change its API by adding, removing, or renaming packages, which would go against the stability goal. (This is essentially the same reason why there are no "wildcard exports" like exports foo.bar.*).
Open packages, on the other hand, do not really define a module's API. Sure, code can depend on functionality that is only accessible via reflection, but the Java community generally sees reflection as a "hack" when used to access internals.
Reflection's much wider (and more beneficial) use is to access an artifact in order to provide a service for it (XML/JSON serialization, persistence, dependency injection, ...). In such cases, the code reflecting over the module does not depend on it and is hence not broken by moving stuff around. There is thus less reason to keep the opened packages stable, which makes a free-for-all approach like open modules feasible.
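As a small illustration of that service-provider style of access (all names below are made up), a framework module could deep-reflect over a package that was opened (but not exported) to it:

import java.lang.reflect.Field;

public class ReflectiveAccessDemo {
    public static void main(String[] args) throws Exception {
        // Assumes the owning module declares something like:
        //   opens com.example.model to my.serializer;
        Object bean = Class.forName("com.example.model.Customer")
                .getDeclaredConstructor()
                .newInstance();

        Field name = bean.getClass().getDeclaredField("name");
        name.setAccessible(true);   // only succeeds because the package is opened
        name.set(bean, "Alice");
    }
}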
I am developing a generic Android game engine that will be used in many of my Android apps as the base system. The problem is that in all of my Java files I currently have to hardcode the package name like this:
package com.example.mygameengine;
But because I want to use the code of my generic game engine in many different apps, I need to find a way to specify the package name for the Java files at compile time because I do not want to keep several copies of my Java sources just because of differences in the package name. I want to have one central source tree and the package name should be dynamically changeable depending on the app I'm about to compile.
So is there a way to do something like this:
package $(PACKAGE_NAME)
In the Java sources where $(PACKAGE_NAME) is to be substituted with the real package name at compile time? Maybe javac has an option that allows me to specify a package name for the file it is passed instead of taking it from the file itself? Note that I'm not using Eclipse but barebones command line tools like ant and make.
EDIT: I do not understand why this is tagged as a duplicate. I've asked a fundamental question about whether the "package" directive in the Java language requires a hard-coded string argument in the source code or whether it is also possible to set this package name at compile time using a compiler directive. That's quite a different question from the one that has been linked here as the presumably "original" question, which is much more closely tied to the Android build system. My question is about the fundamentals of the Java language, not about the Android build system.
You can build your generic game engine code into a standalone JAR.
Then you can include it as a dependency in each of your Android apps that uses it.
With proper versioning you will then have just one central place where this code is stored.
You can't. And that's not the proper approach to code reuse.
Simply package your commonly used code in a jar and include that jar in every project you want (in Android you do that by adding the jar to the build path and then marking it as exported in the "Order & Export" tab).
It's not possible, because the package name is part of the core metadata about your class.
A package is a collection of related Java entities (such as classes, interfaces, exceptions, errors and enums). Packages are used for:
Resolving naming conflicts between classes by prefixing the class name with a package name. For example, com.zzz.Circle and com.yyy.Circle are two distinct classes: although they share the same class name Circle, they belong to two different packages, com.zzz and com.yyy. These two classes can be used in the same program and distinguished using the fully qualified class name, i.e. package name plus class name, as in the snippet after this list. This mechanism is called Namespace Management.
Access Control: Besides public and private, Java has two access control modifiers – protected and default – that are related to package. A protected entity is accessible by classes in the same package and its subclasses. An entity without access control modifier (i.e., default) is accessible by classes in the same package only.
For distributing a collection of reusable classes, usually in a format known as a Java Archive (JAR) file.
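For example (assuming both hypothetical Circle classes exist and have no-arg constructors):

public class NamespaceDemo {
    public static void main(String[] args) {
        // Two classes with the same simple name, distinguished by their
        // fully qualified names:
        com.zzz.Circle c1 = new com.zzz.Circle();
        com.yyy.Circle c2 = new com.yyy.Circle();
        System.out.println(c1.getClass().getName());  // com.zzz.Circle
        System.out.println(c2.getClass().getName());  // com.yyy.Circle
    }
}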
I always have doubts when creating packages: I want to take advantage of package-limited access, but at the same time I want to have similar classes divided into packages.
The problem comes when you understand that packages are not hierarchical in Java:
At first, packages appear to be hierarchical, but they are not. (source)
Imagine I have an API defined with its classes in foo.bar, where only the classes the API client needs are public. Then I have another package, foo.bar.pojos, with some internal objects the API needs. These classes have to be public so they can be accessed from foo.bar, but this means the API client could also access them if the package foo.bar.pojos is imported.
What is the common package policy that should be followed?
I've seen two ways of doing this.
The first one consists of separating the public API and the internal classes into two different artefacts (jars). The documentation is separated as well, so it's easy for the end user to distinguish what is internal from what is not. But having two jars, two source trees, etc. sometimes makes things more complex.
The second one consists of delivering a single jar, but with good documentation that makes clear what's internal and what's not. The textual documentation can explain how to use the API (and thus avoid talking about the internals), and the Javadoc can specify that a class is for internal use and is thus subject to change.
Yes, Java packages don't give you enough control over your dependencies. The classic way to deal with this is to put external APIs in one package and internal implementation classes in another, and rely on people's good sense to avoid creating dependencies on the latter.
With Maven and OSGi, you have an additional mechanism for managing dependencies between modules / bundles of packages. In the case of OSGi, you can explicitly declare some packages as not exported, and an OSGi-aware development environment will prevent people from creating harmful dependencies. Maven's module support is weaker, but at least it controls dependency cycles.
Finally, you could use custom PMD rules to enforce your project's modularization conventions ... in the same way that there are rules to discourage dependencies on Java's "com.sun.*" package tree.
It is a mess.
Using only what Java itself offers, you have to put everything in the same package. You end up with a single package (or a few) containing lots of classes, and no good way to group them for yourself (but at least that problem does not leak outside). Most people don't do that, though, and as a result the public classpath you see as a developer building on top of these libraries is littered with stuff you should never need to see.
You might like OSGi, which has (and enforces) the concept of bundle-private packages. Those are not exported to the outside world.