The basic problem is this: I've got a project that already uses multiple Maven modules for various sub-projects. However, one of the modules (the core module) could itself be split into multiple OSGi bundles when created. This is because the core module contains several optional dependencies, each of which has isolated Java packages where it's required. For instance, support for JSON input files is optional, as it requires the optional dependencies from Jackson. The classes that rely on the Jackson dependencies are all isolated to certain json packages within the module. Thus, in theory, I could create a minimal bundle from core that doesn't include the packages that rely on optional dependencies.
Normally, I'd simply split this module into more Maven modules to make life easier for creating bundles via Felix's maven-bundle-plugin. The problem is that I still want to create a core JAR for non-OSGi users who shouldn't have to include several extra JARs just to use optional functionality (which already requires them to provide the optional dependencies on the class path). On top of that, I don't wish to split this module up further, as it makes development on the project more tedious for the developers, especially when we're already splitting code into proper package-based modules as it is.
The way we were already trying to use OSGi was to make the API module a fragment host (in order to allow it to load a provider bundle without requiring OSGi support), then make the other bundles use said fragment host. This seemed to work well for the smaller modules outside of core, but for core we wanted to be able to provide multiple bundles from a single module so that optional dependencies wouldn't be required in the bundle itself. As it stands, for plugins, we already have a mechanism for scanning them and ignoring plugins that don't have all the classes required to load them (e.g., if a plugin requires a JPA provider but the JPA API is not available, that plugin isn't loaded). Once we can successfully split the core module into multiple bundles, I can use declarative services as the plugin mechanism in an OSGi environment (instead of the default class path JAR scanning mechanism in place for normal Java environments), so that isn't an issue.
Is there any way to do all this using Felix's maven-bundle-plugin? Or will I have to use the assembly plugin to copy subsets of the module from which bundles can be generated? Or will I have to resort to writing an Ant script (or Maven plugin) to do this? We've tried using separate Maven modules that simply import the core module as a dependency and generating a bundle from there, but the resulting bundle is always empty, regardless of the import/export package settings and embedded dependencies.
Or, is there a better way to do this? We already use the <optional>true</optional> configuration for the optional dependencies, yet the Felix plugin doesn't seem to care about that and imports all of those dependencies anyway, without marking them with the optional attribute.
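For illustration, the optional dependencies are declared in the POM like this (the Jackson coordinates are just one example from our setup):

<!-- example optional dependency; artifact coordinates are illustrative -->
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    [...]
    <optional>true</optional>
</dependency>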
Well, this is what I ended up doing to accomplish this. I'm using the maven-assembly-plugin to copy the binaries I need, filtering out the classes I don't want to include with the <fileSets/> element (similar to the <fileset/> element in Ant).
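A rough sketch of one such assembly descriptor (the json exclude pattern and the file path are just illustrations; match them to your own optional packages):

<!-- src/main/assembly/core-min.xml (hypothetical path) -->
<assembly>
    <id>core-min</id>
    <formats>
        <format>dir</format>
    </formats>
    <includeBaseDirectory>false</includeBaseDirectory>
    <fileSets>
        <fileSet>
            <directory>${project.build.outputDirectory}</directory>
            <outputDirectory>/</outputDirectory>
            <excludes>
                <!-- drop the classes that rely on the optional Jackson dependency -->
                <exclude>**/json/**</exclude>
            </excludes>
        </fileSet>
    </fileSets>
</assembly>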
For each generated assembly directory, I then use the maven-bundle-plugin with the <buildDirectory/> configuration option to specify where the bundle's class files are located.
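Each bundle then gets an execution along these lines (a sketch; the execution id and directory name are hypothetical):

<plugin>
    <groupId>org.apache.felix</groupId>
    <artifactId>maven-bundle-plugin</artifactId>
    <executions>
        <execution>
            <id>core-min-bundle</id>
            <phase>package</phase>
            <goals>
                <goal>bundle</goal>
            </goals>
            <configuration>
                <!-- points at the directory produced by the assembly above -->
                <buildDirectory>${project.build.directory}/core-min</buildDirectory>
            </configuration>
        </execution>
    </executions>
</plugin>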
It's not ideal, but it's better than writing an Ant script for a Maven project!
Related
I am in the process of creating a Maven repository for a Java module which is part of Eclipse (probably OSGi). I am trying to get the transitive dependencies of the JAR files I need using this approach (http://wiki.eclipse.org/JFace).
It uses the plug-in dependency analysis feature of Eclipse to create a tree. I am wondering how it works in the background. I can find the imported namespaces in a JAR's manifest file, but how does it find the relevant JAR file using that information?
My end goal is to export all these transitive dependency JARs and convert them into a Maven repository. Ideally I want to automate this so I don't have to do it manually whenever there is an update.
In general Eclipse PDE projects have a target platform. This target platform holds the dependencies.
In OSGi bundles, the dependencies are specified in the manifest. There are many types of dependencies; the OSGi dependency model is magnitudes more powerful than the simplistic 'require' model of other module systems. Not only does it have many types of dependencies (require bundle (the classic one), import package, require execution environment, require an implementation for an API, require a service, etc.) specified in namespaces, it also supports a powerful filter that is asserted on the properties of a capability. A capability is the opposite of a requirement.
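For example, a bundle manifest can state a requirement with a filter like this (osgi.ee is the standard execution-environment namespace; it is shown here just as an illustration):

// from a consumer bundle's manifest
Require-Capability: osgi.ee; filter:="(&(osgi.ee=JavaSE)(version>=1.6))"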
To handle these capabilities and requirements, OSGi has a resolver. It takes a set of initial requirements and finds a solution where all requirements are satisfied and all the rules are followed. The result of the resolver is a set of wires that connect bundles together.
By far the best environment to play with this is bndtools. In bndtools you can use a P2 repository directly (normally the target platform is a P2 repository). (You can also directly use a Maven repository or an OSGi standardized repository, aka OBR.) In the bndtools bndrun file you can then select one or more root bundles of your graph. The resolver then calculates the set of bundles that form a happy closure. This is all done at the GUI level.
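In a bndrun file the initial requirement looks roughly like this (the bundle symbolic name is hypothetical):

# root bundle of the graph; the resolver computes the closure from here
-runrequires: osgi.identity;filter:='(osgi.identity=com.example.myapp)'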
There is also a command-line tool called 'bnd' that you can install via brew, or download its JAR from https://bndtools.ci.cloudbees.com/job/bnd.master/lastSuccessfulBuild/artifact/biz.aQute.bnd/generated/
It is a miraculous little tool that can do wonders, but it is very badly documented, although it does have a help command. :-( (I am the author.)
I currently manage a few separate Maven projects in which I use Protobufs as a serialization format and as the wire format. I am using David Trott's maven-protoc plugin to generate the code at compile time.
All is well until I want those projects to communicate with one another (or rather, use each other's protobufs). The protobuf language has an "import" directive which does what I want, but I'm faced with the challenge of having project A export a ".proto" file (or possibly some intermediate format?) for project B to depend upon.
Maven provides a way for a project to bundle resources but AFAIK, these are meant to be used at runtime by the code and not by a goal during the compile / source generation phase — at least I haven't been able to find documentation that describes what I want to achieve.
I've found another way to achieve this, and it doesn't involve any Maven magic. Diving into the code for the maven-protoc plugin, I found that this is a supported use case: the plugin will look for and collect any .proto files in dependent JARs and unpack them into a temporary directory. That directory is then set as an import path for the protoc invocation.
All that needs to happen is for the .proto file to be included in the dependency's package, which I did by making it a resource:
projects/a/src/main/resources/a.proto
Now in projects/b/pom.xml, add 'a' as a regular Maven dependency and just import a.proto from b.proto as if it existed locally:
b.proto:
import "a.proto";
This isn't ideal, since file names may clash between various projects, but such clashes should be rare enough.
You can package your .proto files in a separate .jar/.zip in the project where they are generated, and publish them in your repository using a dedicated classifier. Using the assembly plugin might help here to publish something close to "source jars" that are built during releases.
Then, on projects using them, add the previously created artifact as a dependency.
Use the dependency plugin with the "unpack-dependencies" goal, and bind it to a phase before "compile".
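A sketch of that unpack step, assuming the .proto artifact was published with a "proto" classifier:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>unpack-protos</id>
            <!-- runs before compile so the .proto files are available for code generation -->
            <phase>generate-sources</phase>
            <goals>
                <goal>unpack-dependencies</goal>
            </goals>
            <configuration>
                <!-- "proto" is the classifier chosen when publishing; adjust to yours -->
                <includeClassifiers>proto</includeClassifiers>
                <includes>**/*.proto</includes>
                <outputDirectory>${project.build.directory}/proto</outputDirectory>
            </configuration>
        </execution>
    </executions>
</plugin>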
This is more a question about what's out there, and about future directions for resolution tools such as Ivy. Is there anything that can express class-level dependencies for packages, rather than package-level dependencies?
For example, let's say I have an apache-xyxy package that comes with an ivy.xml listing all its dependencies. But suppose I only use class WX in apache-xyxy, which doesn't require most of those dependencies. Couldn't a resolver be intelligent enough to identify that class WX can only possibly invoke a certain set of other classes (AB, DC, EF), and that none of those classes use any of the other dependencies, and so compute a minimal subset of required dependencies? This would be easier and safer than cherry-picking to remove package dependencies that aren't needed by the specific classes used, and it would also avoid breaking larger packages into smaller ones just for this reason.
Then, if I later decided to use class GH from apache-xyxy, I could do an ivy resolve, and it would dynamically bring in the additional required libraries.
When packaging compiled Java code for distribution, it's common practice to bundle Java "packages" together. It's also quite possible (but silly) to split a Java package across multiple JARs. Large frameworks (like Spring) have lots of sub-packages in different JARs so that users can pick and choose what they need at run-time... Of course, the more JAR options one has, the more complex it becomes to populate the run-time classpath...
The keyword here is "run-time"... Tools like Apache Ivy and Apache Maven are primarily designed to manage dependencies needed at build time...
Apache Maven does have a "runtime" scope for its dependencies, but it's limited to a single list of JARs. Typically this scope is used for deciding which JARs are needed for testing and for populating the lib directory of a WAR file.
Apache Ivy has a similar but more flexible mechanism called "configurations". It's possible to create as many runtime configurations as you need, and these can be used to decide which JARs are downloaded by Ivy.
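For example, an ivy.xml can declare one configuration per optional feature and map dependencies onto it (the configuration names here are made up):

<!-- ivy.xml sketch; configuration names and the revision are illustrative -->
<configurations>
    <conf name="runtime-core" description="minimal runtime"/>
    <conf name="runtime-json" extends="runtime-core" description="core plus optional JSON support"/>
</configurations>
<dependencies>
    <dependency org="com.fasterxml.jackson.core" name="jackson-databind" rev="2.1.1" conf="runtime-json->default"/>
</dependencies>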
So while it would appear Ivy has the answer, I've rarely seen Ivy used when launching programs (the one exception being Groovy's Grape annotations).
So what, you might ask, is the answer?
The future of "run-time" classpath management is either OSGi or Project Jigsaw. I'm more familiar with OSGi, where special dependency indicators are added to the JAR file's manifest, stating what its dependencies are. The idea is that when a container loads a JAR (called a "bundle") it can check and see whether the other dependencies are already loaded. These dependencies can be retrieved and loaded from a common repository. This is a fundamentally different way to launch Java. Traditionally each application is loaded onto its own isolated classpath...
Time will tell if either project catches on. In the meantime we use Apache ivy and Apache Maven to build self-contained and possibly over-bloated WAR (EAR, etc) packages.
I have 3 Java projects with the same entities.
I want to share entities between these projects because entities can evolve during the development phase.
We are thinking about building a jar with entities and sharing it using Maven (with a repository).
Maybe you have another solution?
I can also recommend using Maven to share code between projects.
Here are some tips to get started:
Use a Maven Repository Manager such as Nexus. It will help you create a stable development environment.
Every developer (and also the Continuous Integration Server user) should configure their settings file to use your Maven Repository Manager. Don't specify your repositories in the POMs; configure them only in your Maven Repository Manager.
http://www.sonatype.com/books/nexus-book/reference/maven-sect-single-group.html
Use the dependencyManagement and pluginManagement elements of your parent POMs to specify all versions of the plugins and dependencies you are using. Omit these versions in the other POMs (they will inherit them from the parent POM); see the sketch below.
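For example, in the parent POM (the shared-entities coordinates are made up):

<dependencyManagement>
    <dependencies>
        <!-- the entities JAR shared between the projects -->
        <dependency>
            <groupId>com.example</groupId>
            <artifactId>shared-entities</artifactId>
            <version>1.0.0</version>
        </dependency>
    </dependencies>
</dependencyManagement>

The other POMs then declare the dependency without a version:

<dependency>
    <groupId>com.example</groupId>
    <artifactId>shared-entities</artifactId>
</dependency>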
I also recommend using separate POMs for multi-module builds and parent POMs.
If you want to share common interfaces, classes, functionality or components, Maven is the way to go. In addition to the dependency management, you also get the added bonus of a standard project layout that will simplify things. Easy integration with most common continuous integration servers and a standard release process are further benefits.
Definitely take a look at Maven!
Making your own JAR library is definitely a good solution.
The JAR file is easy to distribute via dependency management (Maven, Ivy, Gradle, ...).
The JAR is versioned.
Projects using the library can be tested against a certain version. Otherwise it may become a problem if you change entities and forget to change a depending project. -> integration tests
Regards
Entities are the representation of a given object, am I correct? If so, the default mechanism implemented by Java is object serialization: http://en.wikipedia.org/wiki/Serialization. In the case of JAR files, if an entity changes you would have to change the JAR each time as well. That may be tedious.
Generate a standard WAR file in Roo, but then change its packaging to a JAR file.
Then from any standard WAR file you can just deploy this JAR (I'll use the JAR as a Maven dependency). I'll maintain a uniquely named applicationContext, like pizzaShop-applicationContext.xml and pizzaShop-applicationContext-jpa.xml, so from a parent Spring project I can stack up various Roo projects in this fashion.
I'll also keep their generated webapps folder to allow the generator to work more easily. (This means I have to open up the pom.xml and keep changing the packaging back to jar.) It also helps as cut-and-paste fodder for web.xml entry additions in non-Roo-generated WAR files.
This seems like it may be a confusing point about Roo: you can just mix and match these JARs as you would in any Spring project. They function like self-contained units of Spring-ness and work fine sitting side by side with other Spring JARs, all under the same webapp/web.xml context.
It's tedious, but still better than writing Spring code by hand.
Currently, when I am writing a bundle that depends on a package, I have to "import" or "depend" on a whole other bundle in Maven that contains that package.
This seems like it is counter-productive to what OSGi gives me.
For example let's say I have two bundles: BundleAPI and BundleImpl.
BundleAPI provides the API interfaces:
// BundleAPI's manifest
Export-Package: com.service.api
BundleImpl provides the implementation:
// BundleImpl's manifest
Import-Package: com.service.api
However, when I am coding BundleImpl in Eclipse, I am forced to "depend" in the Maven POM on BundleAPI itself, so that Eclipse does not complain.
//BundleImpl's POM
<dependency>
<groupId>com.service</groupId>
<artifactId>com.service.api</artifactId>
[...]
</dependency>
So, on the one hand I am depending only on the package com.service.api, while on the other I need to have the whole bundle, BundleAPI.
Is there a way to make maven or eclipse smart enough to just find the packages somewhere, instead of whole bundles?
I am very much confused as to how this works - any type of clarity here would be great. Maybe I am missing something fundamentally simple?
The key is to distinguish between build-time dependencies and runtime dependencies.
At build time you have to depend on a whole artifact, i.e. a JAR file or bundle. That's pretty much unavoidable because of the way Java compilers work. However at runtime you depend only on the packages you use in your bundle, and this is how OSGi manages runtime substitution. This is the Import-Package statement in your final bundle.
Of course as a developer you don't want to list two parallel sets of dependencies, that would be crazy. Fortunately maven-bundle-plugin is built on a tool called bnd, which calculates the Import-Package statement for you by analysing your code and discovering the actual packages used. Other tools such as bndtools (an Eclipse-based IDE for OSGi development) also use bnd in this way. Incidentally, bnd is much more reliable and accurate at this job than any human!
So, you define only the module-level dependencies that you need at build time, and the tool generates the runtime package-level dependencies.
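In the POM that means something like this, with bnd left to work out the imports (a minimal sketch; the wildcard instruction shown is also the plugin's default):

<plugin>
    <groupId>org.apache.felix</groupId>
    <artifactId>maven-bundle-plugin</artifactId>
    <extensions>true</extensions>
    <configuration>
        <instructions>
            <!-- bnd analyses the compiled classes and generates the Import-Package entries -->
            <Import-Package>*</Import-Package>
        </instructions>
    </configuration>
</plugin>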
I would recommend against using Tycho because it forces you to use Eclipse PDE, which in turn forces you to manually manage imported packages (for the sake of full disclosure, I am the author of bndtools which competes against PDE).
You cannot develop bundles like regular Java projects with Maven and Eclipse. You basically have two options.
Apache Felix Bundle Plugin: Basically you develop the project as a regular Java project and use Maven as you normally would. This plugin is then used to add all the OSGi specifics to the JAR manifest at build time to OSGi-enable it. The disadvantage of this approach is that you are working with a plain Java project in your workspace instead of a bundle, which makes running your project in the OSGi container a little extra work, since Eclipse doesn't recognize it as a plug-in project. Thus you have to manually add the JAR from the Maven build to the target platform.
Tycho: This is another Maven plugin that attempts to actually bring these two environments together, and it does a pretty good job of it. In this scenario, you actually create an Eclipse bundle/plug-in project, which obviously makes for seamless integration in Eclipse. The POM then marks the project as being of the eclipse-plugin packaging type, which effectively makes Maven resolve the project dependencies (defined in the manifest) via the target platform instead of Maven itself.
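A minimal Tycho setup looks roughly like this (a sketch; the plugin version is elided):

<!-- the project uses Eclipse plug-in packaging instead of jar -->
<packaging>eclipse-plugin</packaging>

<plugin>
    <groupId>org.eclipse.tycho</groupId>
    <artifactId>tycho-maven-plugin</artifactId>
    [...]
    <extensions>true</extensions>
</plugin>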
I would take the Tycho approach as it gives a much more integrated approach with Eclipse.
Having the whole JAR as a dependency shouldn't be a problem; that's how you have to do it with Maven anyway.