I've been thinking about "good practice" regarding package structure within OSGi bundles. Currently we average about 8-12 classes per bundle. One of my proposals has been to have two packages: com.company_name.osgi.services.api for API-related classes/interfaces (which is exported externally) and com.company_name.osgi.services.impl for the implementation (not exported). What are the pros and cons of this? Any other suggestions?
You might also consider putting the interfaces in com.company_name.subsystem and the implementation in com.company_name.subsystem.impl; the OSGi-specific code, if there is any, could go in com.company_name.subsystem.osgi.
Sometimes you might have multiple implementations of the same interfaces. In that case you could consider com.company_name.subsystem.impl1 and com.company_name.subsystem.impl2, for example:
com.company.scm // the SCM API
com.company.scm.git // its Git implementation
com.company.scm.svn // its Subversion implementation
com.company.scm.osgi // the place to put an OSGi Activator
In this sense the package structure can be OSGi-agnostic; if you later move to a different container, you just add an additional
com.company.scm.sca // or whatever component model you might think of
Always having api and impl in your package names can be annoying. If in doubt, use impl but not api.
It's not the number of classes that is important but the concepts. In my opinion you should have one conceptual entity per bundle. In some cases this might be just a few classes; in others, several packages with hundreds of classes.
What is important is that you separate the API and the implementation: one bundle contains the API of your concept and another the implementation. This way you can provide different implementations for a well-defined API. In some cases this is even necessary, e.g. if you want to access the services of a bundle remotely (using R-OSGi).
The API bundles are then used through code sharing, and the services from the implementation bundles through service sharing. The best way to explore those possibilities is to look at ServiceTracker.
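For example, a client bundle could track any implementation of a shared API interface roughly like this (a minimal sketch; the GreetingService interface and its greet method are invented for illustration):

import org.osgi.framework.BundleContext;
import org.osgi.util.tracker.ServiceTracker;

public class GreetingClient {
    private final ServiceTracker<GreetingService, GreetingService> tracker;

    public GreetingClient(BundleContext context) {
        // track whichever bundle registers an implementation of the API interface
        tracker = new ServiceTracker<>(context, GreetingService.class, null);
        tracker.open(); // start watching the service registry
    }

    public void greet() {
        GreetingService service = tracker.getService(); // null if no impl bundle is active
        if (service != null) {
            service.greet("world");
        }
    }
}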
In your case you could have the implementation in the same package, but with all of its classes package-private. This way you can export the package and the implementation is still not visible to the outside, even when not using OSGi (i.e. when the bundle is used as a regular jar file).
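As a rough sketch (types invented, each public type in its own source file within the exported package):

// exported package com.company.scm: the public API
public interface Scm {
    void checkout(String url);
}

// same package, package-private: invisible to clients even though the package is exported
class GitScm implements Scm {
    public void checkout(String url) { /* git-specific logic */ }
}

// a public factory that hands out the hidden implementation
public final class ScmFactory {
    public static Scm createGitScm() {
        return new GitScm();
    }
}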
I'm looking for different ways to prevent internals leaking into an API. This is a huge problem, because once these internals leak into the API you can run into unexpected incompatibility issues or frozen internals.
One of the simplest ways to do so is just to use different Maven modules: one module with the API and one module with the implementation. This way it is impossible to expose the implementation from the API.
Unfortunately, not everyone agrees this is the best approach. But are there other alternatives, e.g. using Checkstyle or other 'architecture checking' tools?
PS: Java 9 is not usable for us, since we are about to upgrade to Java 8 and that will be our lowest supported version for quite some time to come.
Following your Checkstyle idea, it is possible to set up rules which examine the import statements in source files.
Checkstyle has built-in support for that, specifically the IllegalImport and ImportControl rules.
This of course works best if public and internal classes can be easily separated by package names.
The idea for IllegalImport would be that you configure a TreeWalker in Checkstyle which looks only at your API sources and rejects imports from internal packages.
With the ImportControl rule on the other hand you can define very detailed access rules for the whole application/module in a separate XML file.
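For instance, a minimal import-control file for the ImportControl rule could look like the following (the package names are placeholders, and the DTD version may differ for your Checkstyle release):

<?xml version="1.0"?>
<!DOCTYPE import-control PUBLIC
    "-//Checkstyle//DTD ImportControl Configuration 1.4//EN"
    "https://checkstyle.org/dtds/import_control_1_4.dtd">
<import-control pkg="com.acme">
    <!-- JDK imports are fine everywhere -->
    <allow pkg="java"/>
    <subpackage name="api">
        <!-- API sources must not import implementation classes -->
        <disallow pkg="com.acme.impl"/>
    </subpackage>
</import-control>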
It is standard in Java to define an API using interfaces and to implement it using classes. That way you can change the "internals" however you want and nothing changes for the users of the API.
One alternative is to have one module (jar file) for both API and implementation (but then again, is it an API or just any kind of library?). Inside it, one separates classes and interfaces by using packages, e.g. com.acme.stuff.api and com.acme.stuff.impl, and it is important to make the classes inside the latter package package-private.
Not only does the package name tell the consuming developer "hey, this is the implementation", it is also impossible to use anything inside it (leaving reflection aside for the sake of simplicity).
But again: this works against the idea of an API, because usually the implementation should be changeable. With this approach one cannot separate API from implementation, because both are inside the same module.
If it is only about hiding internals of a library, then this is one (not the one) feasible approach.
And just in case you meant a library instead of an API, one which only exposes its "frontend" (using interfaces or abstract classes and such), use different package names, e.g. com.acme.stuff and com.acme.stuff.internal. The same visibility rules apply, of course.
Also: This way one does not need Checkstyle and other burdens.
Here is a good start: http://wiki.netbeans.org/API_Design
Key point: do not expose more than you want. Obviously, the less of the implementation is exposed in the API, the more flexibility you have in the future. There are some tricks one can use to hide the implementation but still deliver the desired functionality.
I think you don't need Checkstyle or anything like that; good old solid design and architecture should be enough. Polymorphism is all you need here.
One of the simplest ways to do so is just make use of different Maven modules; one module with API and one module with implementation. This way it is impossible to expose the implementation from the API.
Yes, I totally agree: hide as much as possible, and separate your interface into a standalone project.
I'm trying to do something clever. I am creating a weather application in which we can replace the weather API with another weather API without affecting the code base. So I started with a Maven project with multiple modules.
I have a Base module that contains the Interface class and the Base class. The Interface class contains the calls to the APIs (all calls are similar, if not exact) and the Base class contains the properties to the APIs (again, all properties are similar, if not exact).
I have a module for each of the two weather APIs we are testing with plans to create more modules for new weather APIs as we grow the application.
Finally, I have created a Core module (includes main) to implement the specific module class for the weather API I want to test.
Now, I know the simplest way to do this would be to use a switch statement and enumeration. But I want to know if there is a more clever way to do this. Maybe using a Pattern? Any suggestions?
[Pictures of the module structure just described and its UML representation appeared here.]
This is a learning process for me. I want to discover how a real Java Guru would implement the appropriate module and class based on a specified configuration.
Thank you for your suggestions.
I'm trying to do something clever. I am creating a weather application in which we can replace the weather API with another weather API without affecting the code base.
Without reading further down, this first statement makes me think of a plugin architecture design. But in software design decisions must not be rushed: the more you delay them, the more information you have, and the better informed the decision will be. For now this is just an idea to keep in mind.
I have a Base module that contains the Interface class and the Base class. The Interface class contains the calls to the APIs (all calls are similar, if not exact) and the Base class contains the properties to the APIs (again, all properties are similar, if not exact).
When different modules share behaviour/state, it is a good idea to refactor them and produce abstract base classes and interfaces, so you are on the right track. But if there are differences, those shouldn't be refactored into the base module. The reason is simple: maintainability. If you start adding if clauses or switches to deal with these differences, you introduce coupling between modules and will always have to change the base module whenever you add or modify other modules, which is not desirable at all.
This is reflected by the Open/Closed Principle from the SOLID principles, which states that a class should be open for extension but closed for modification.
So after you've refactored the common behaviour into the base modules, then each new API should extend the base module, as you did.
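As a rough sketch of that split (the question only describes its Base module, so these types are invented; IWeather reuses a name that appears later in the thread), the base module holds the shared pieces and each provider module only the part that differs:

// base module: the contract every provider implements
public interface IWeather {
    String getForecast(String location);
}

// base module: shared properties/behaviour refactored out of the providers
public abstract class BaseWeather implements IWeather {
    protected String apiKey; // a property all the weather APIs happen to share

    public final void setApiKey(String apiKey) {
        this.apiKey = apiKey;
    }
    // getForecast stays abstract: it is the part that differs per provider
}

// provider module: only the provider-specific call lives here
public class ForecastIO extends BaseWeather {
    @Override
    public String getForecast(String location) {
        // call the forecast.io HTTP endpoint using this.apiKey (omitted)
        return "{}";
    }
}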
Finally, I have created a Core module (includes main) to implement the specific module class for the weather API I want to test.
Now, I know the simplest way to do this would be to use a switch statement and enumeration. But I want to know if there is a more clever way to do this. Maybe using a Pattern? Any suggestions?
Indeed, using a switch makes it work, but it is not a clean design at all, for the same reason as before: adding, modifying or removing modules would require modifying this module as well, and this code can potentially break.
One possible solution would be to delegate this responsibility to a new component and use a creational design pattern like the Abstract Factory, which provides an interface for instantiating components without specifying their concrete classes.
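One way to sketch that, reusing the IWeather and ForecastIO names from the sketch above (the registry idea is my own illustration, not the only shape an Abstract Factory can take):

import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

public final class WeatherFactory {
    private static final Map<String, Supplier<IWeather>> PROVIDERS = new HashMap<>();

    // each provider module registers itself once; the factory never changes
    public static void register(String name, Supplier<IWeather> supplier) {
        PROVIDERS.put(name, supplier);
    }

    public static IWeather create(String name) {
        Supplier<IWeather> supplier = PROVIDERS.get(name);
        if (supplier == null) {
            throw new IllegalArgumentException("Unknown weather provider: " + name);
        }
        return supplier.get();
    }
}

A provider registers itself with WeatherFactory.register("forecastio", ForecastIO::new), and main only ever asks the factory for a name read from configuration, so adding a module no longer touches the Core module.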
As for the architecture, the plugin architecture still makes sense so far. But what if the different modules extend the base contract with extra features? One option is to use the Facade pattern to adapt the module calls and provide an output that implements the interface clients expect.
Then again, with the provided details this is the solution I'd suggest; the scenario should still be studied carefully and in greater detail to be sure these are the right tools for the job before committing to them.
In addition to Salvador Juan Martinez's answer...
To implement a plugin architecture, Java's Jar File Specification provides support for service provider interfaces (SPI) and how they are looked up.
As of Java 1.6 you can use the ServiceLoader to look up service providers. For Java 1.5 and below you must do it on your own or use a library, e.g. commons-discovery.
The usage is quite simple. In your case, put a META-INF/services/com.a2i.weatherbase.IWeather file in each plugin module.
In the Weather Forecast IO module the file should contain only one line:
com.a2i.weatherforecastio.ForecastIO
That line must be the fully qualified name of an IWeather implementation class.
Do the same for the other module and you can load the implementations via ServiceLoader.
ServiceLoader<IWeather> weatherServices = ServiceLoader.load(IWeather.class);
for (IWeather weather : weatherServices) { // iterates over every implementation found on the classpath
    System.out.println(weather.getClass().getName());
}
Now it depends on your runtime classpath how many services will be found. Try to add and remove module jar archives from the classpath and run your application.
EDIT
I wrote a blog post about a pluggable architecture with standard Java. See http://www.link-intersystems.com/blog/2016/01/02/a-plug-in-architecture-implemented-with-java/
Source code is also available at https://github.com/link-intersystems/blog/tree/master/java-plugin-architecture
One solution is to define a common interface with all the identified common operations. The extensions/plugins need to implement that interface and provide the implementations of the common operations.
You can use the Abstract Factory design pattern to hook up the exact implementation at runtime based on the input parameters.
Interfaces and abstract classes are always good in such scenarios. Thanks.
For some time I have been struggling to get an Arquillian test case running. The test involves classes rooted in JSF classes, and it ran into a ClassFormatError: Absent Code because the implementation of javax.faces.model.DataModel could not be found.
My assumption was that I need to provide my test with a JSF implementation, but the implementations I found (for example the one bundled with JBoss) do not have the javax.faces package, only com.sun, and I could find no trace of the DataModel class.
Where am I misunderstanding the way it works here? Why doesn't the impl actually implement the api?
The API:
the public types consumers can write code against
The implementation:
private types consumers should not rely on
implementation details including things like markup (e.g. HTML) and container (e.g. servlet)
This separation isn't as clean as it should be, but this is largely the intent. Separating the code into these two jars is how the developers chose to organise it, but you'll need both to use the library in most contexts.
I'm looking to create a Java 'library' (JAR) that can be used within other projects, but I'd like my library to be extensible. I'm after something OSGi-esque, where my library has 'extension points' that other JARs can hook into. The thinking is that my library will have a core set of methods that will be called by the projects it's used in, but the exact implementation of these methods might vary based on the project/context.
I've looked into OSGi (e.g. Equinox), but I'm not sure it can be used in the way I'm after. It seems to be geared towards standalone apps rather than creating a library.
What would be the best way of achieving this? Can OSGi be used in this way, and if not are there frameworks out there that will do this?
I hope all that's clear - I have a clear idea of what I want, but it's hard to articulate.
OSGi is great, but I don't think it is what you need. By using OSGi (services), you force the users of your library to use an OSGi environment too.
I think, as @Peter stated, you can do this by simply extending classes of your library in the specific project/context.
But in case you want to use OSGi, there is a simple way to achieve this, called Bundle Fragments. A fragment extends a so-called "host bundle", i.e. your library, without altering the original library. A popular use case for this is platform-specific code in your bundles.
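For illustration, a fragment's manifest is a normal bundle manifest plus a Fragment-Host header naming the library it attaches to (the symbolic names here are invented):

Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-SymbolicName: com.example.library.win32
Fragment-Host: com.example.library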
What you are calling a Java library is called a "Bundle" in an OSGi context.
An OSGi bundle is a JAR file with some special meta-information in its MANIFEST.MF file. Every OSGi bundle can have Export-Package and Import-Package headers.
Through the Export-Package manifest header you declare which packages you are exporting, and your other project can simply add the packages it wants to use from them to its Import-Package header.
Here's an example:
Bundle A manifest:
Export-Package: com.demo.exported
Bundle B manifest:
Import-Package: com.demo.exported;version="(1.0.0,2.0.0]"
This way your Bundle B (a different project) can call methods of the classes in the package it imported from Bundle A.
The version you see in the Import-Package header just declares which package versions the bundle accepts. You can have two bundles with two different implementations of some interfaces and provide the package in two different versions; both will be available.
Until now I was talking about static types.
You can also have your services exposed dynamically through Declarative Services. In that case you define an XML file (a component definition) declaring which services your bundle exposes, and in the other bundle another XML file declaring which services it requires.
These are called Provided Services and Referenced Services.
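For illustration, a Declarative Services component definition that exposes a provided service could look roughly like this (the class and interface names are invented):

<?xml version="1.0" encoding="UTF-8"?>
<scr:component xmlns:scr="http://www.osgi.org/xmlns/scr/v1.1.0" name="com.example.provider">
    <!-- the class that implements the service -->
    <implementation class="com.example.impl.MyServiceImpl"/>
    <service>
        <!-- the interface under which the service is published -->
        <provide interface="com.example.api.MyService"/>
    </service>
</scr:component>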
I think this will give you a little idea about what can be done.
And if I have misinterpreted your problem somewhere, please say so.
NOTE: And of course, OSGi is used for creating independent bundles that can be re-used in other projects; they bring modularity to your project.
As others have mentioned, you don't need OSGi or any framework for this. You can do this by employing patterns like the template method pattern or the strategy pattern. There are several other patterns for dynamically modifying/extending functionality, but these two seem to fit your description best. They do not require any framework.
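As a minimal strategy-pattern sketch (all names invented), the library defines the extension point as an interface and the host project supplies the behaviour:

// part of the library's public API: the extension point
public interface RenderStrategy {
    String render(String input);
}

// library code depends only on the interface, never on an implementation
public final class ReportWriter {
    private final RenderStrategy strategy;

    public ReportWriter(RenderStrategy strategy) {
        this.strategy = strategy;
    }

    public String write(String input) {
        return strategy.render(input.trim());
    }
}

A project using the library then plugs in its own behaviour, e.g. new ReportWriter(s -> "<p>" + s + "</p>").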
The benefit of a framework like OSGi is that it manages the wiring for you. Normally you have to write some code that glues your libraries and the extensions together; with a framework like OSGi this is automated, with minimal overhead (in the case of OSGi, the overhead is some entries in the JAR manifest).
I always hesitate when creating packages: I want to take advantage of package-limited access, but at the same time I want to divide similar classes into separate packages.
The problem comes when you understand that packages are not hierarchical in Java:
"At first, packages appear to be hierarchical, but they are not." (source)
Imagine I have an API defined in foo.bar, with only the classes the API client needs set public. Then I have another package, foo.bar.pojos, containing some internal objects the API needs; these classes must be public so that foo.bar can access them, but that means the API client can also access them by importing foo.bar.pojos.
What is the common package policy that should be followed?
I've seen two ways of doing this.
The first consists of separating the public API and the internal classes into two different artefacts (jars). The documentation is separated as well, so it's easy for the end user to distinguish what is internal from what is not. But having two jars, two source trees, etc. sometimes makes things more complex.
The second consists of delivering a single jar, but with documentation good enough to know what's internal and what's not. The textual documentation can explain how to use the API (and thus avoid mentioning the internals), and the javadoc can specify that a class is for internal use and thus subject to change.
Yes, Java packages don't give you enough control over your dependencies. The classic way to deal with this is to put external APIs in one package and internal implementation classes in another, and rely on people's good sense to avoid creating dependencies on the latter.
With Maven and OSGi, you have an additional mechanism for managing dependencies between modules/bundles of packages. In the case of OSGi, you can explicitly declare some packages as not exported, and an OSGi-aware development environment will prevent people from creating harmful dependencies. Maven's module support is weaker, but at least it controls dependency cycles.
Finally, you could use custom PMD rules to enforce your project's modularization conventions ... in the same way that there are rules to discourage dependencies on Java's "com.sun.*" package tree.
It is a mess.
Using only what Java itself offers, you have to put everything in the same package. You end up with a single package (or a few) containing lots of classes and no good way to group them for yourself (but at least that problem does not leak outside). Most people don't do that, though, and as a result your public classpath (as a developer building on top of these libraries) is littered with stuff you should never need to see.
You might like OSGi, which has (and enforces) the concept of bundle-private packages. Those are not exported to the outside world.
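In manifest terms (package names invented), only what is listed in Export-Package is visible to other bundles:

Export-Package: com.acme.stuff.api

com.acme.stuff.internal is simply not listed, so no other bundle can import it, and an OSGi-aware IDE will flag any attempt to.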