How to make a desktop application modular? - java

How do you make a Java desktop application modular? How should the modules be classified?

As a design goal, modularity means that you want to have an application composed of separate parts (modules) where each part has its area of responsibility and contains all classes concerned with that area (high cohesion), and communication between those parts happens through narrow, well-defined and -documented interfaces (loose coupling).
You achieve this by planning your design beforehand, then adjusting those plans and refactoring the code constantly during implementation.
It's useful to distinguish between technical modules such as GUI, network communication or DB access (which often form layers, though these may be sub-divided into several modules), and domain modules that contain the application-specific logic and often don't form layers.
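As a hedged illustration of such a narrow boundary (all names here are invented, not from the question), the domain module can expose a small interface that the GUI module consumes without ever seeing the implementation:

    // Lives in the domain module: a narrow, documented contract.
    public interface CustomerService {
        String displayNameFor(String customerId);
    }

    // Lives in the GUI module; shown in the same file only for brevity.
    class CustomerPanel {
        private final CustomerService customers;

        CustomerPanel(CustomerService customers) {
            this.customers = customers;
        }

        String labelText(String customerId) {
            return "Customer: " + customers.displayNameFor(customerId);
        }
    }

Swapping the persistence or business logic behind CustomerService then never forces a change in the GUI module.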

Have a look at OSGi technologies. Each module of your application (called a bundle) is a separate jar, and OSGi takes care of dependency resolution and dynamically loading bundle classpaths etc.
For desktop applications I would strongly recommend looking at DA-Launcher from www.dynamicjava.org. It makes deploying your app SOOO much easier. They also have a few things like dynamic JPA that are useful for any OSGi app.

You mean modular like Eclipse?
If you base your Java desktop application on Eclipse RCP or NetBeans RCP, you'll get modularity "for free" (almost ;-))

The answer to your question really depends on what you mean by "modular".
There are several levels of concerns that you must consider when making your application modular.
First of all, you must consider whether the "modularity" you seek is architectural modularity, deployment-time modularity or runtime modularity.
In any case every consecutive level implies all of the previous levels.
For starters - to make your application modular you have to start from architecture. Separate your concerns into well defined clean cut parts that have well defined interfaces to the "outside world". Use of good design patterns and dependency injection and designing for unit testability go a long way here towards achieving nice separation of concerns that is the bedrock of modular design.
Start small but keep the big picture in mind. When designing somewhat larger chunks (or modules) of your system, make sure they have as few overlapping areas as possible. Every module should make almost no assumptions about the environment it runs in and should serve only one single concern. Any services it requires from its peers should be explicitly provided by external initialization (preferably using dependency injection for gluing the modules together into a working app).
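A minimal, hand-rolled sketch of that kind of wiring (no framework, invented names) - the same idea Guice or Spring automates:

    public final class Main {

        interface MessageStore {                       // narrow interface one module exposes
            void save(String message);
        }

        static final class FileMessageStore implements MessageStore {
            @Override public void save(String message) {
                System.out.println("saved: " + message);   // would persist to disk for real
            }
        }

        static final class Editor {                    // module that only knows the interface
            private final MessageStore store;
            Editor(MessageStore store) { this.store = store; }    // dependency injected
            void onSaveClicked(String text) { store.save(text); }
        }

        public static void main(String[] args) {
            // "Gluing the modules together" happens in one place, at the edge of the app.
            Editor editor = new Editor(new FileMessageStore());
            editor.onSaveClicked("hello");
        }
    }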
If your architecture is modular, it is an easy task to separate the concerns into their own deployment units (in form of projects, jars, bundles, plug-ins, extensions or whatever) and you can start easily mixing and matching various modules during the deployment to get the exact feature set you need for the particular application instance. This is what I mean by deployment time modularity.
Dependency injection frameworks like Guice, the Spring Framework and others go a long way towards enabling deployment-time modularity.
Runtime modularity, the way I see it, is something akin to the modularity provided by Eclipse and NetBeans plugins or Mozilla extensions, where you can change the configuration and set of your application modules after deployment/installation.
This implies some sort of architecture and infrastructure that recognizes new plug-ins/extensions either at application initialization time or dynamically at runtime.
The latter also means that all your modules must be built with the implicit assumption that any service a module uses can disappear at any point in time, which requires extra effort to ensure the robustness of code running in such a volatile world.

I would also recommend Eclipse RCP, or have a look at NetBeans RCP. The two are very similar. One thing that separates them is that Eclipse RCP uses native GUI libraries instead of Swing, which NetBeans uses.
The trade-off is that Eclipse might be a bit faster, though you are more limited to the kinds of controls the operating system offers. NetBeans uses Swing, which may be more familiar to most Java developers, and the possibilities for developing custom controls are endless.
It's been a while since I worked with Eclipse RCP, so I'm probably wrong about developing custom controls in Eclipse RCP.
The thing they have in common is that developing smart and modular desktop apps is fun and you get professional looking apps in much less time!
Good luck!

You could also take a look at the Java Plug-in Framework:
http://jpf.sourceforge.net/
JPF can greatly improve the modularity and extensibility of your Java systems and minimize support and maintenance costs.

Have a try with Spring RCP (http://www.springsource.org/spring-rcp) when organizing the GUI part of your application...

Related

Is it appropriate to use OSGI framework on small java app?

I plan to reimplement one of my small but useful applications with the OSGi framework. I have never used it, so I ask: is it appropriate to use OSGi on a small app, and is there a big difference in speed and/or memory footprint when using such a framework? Also, if it is a good option, I would ask which implementation is best for small applications.
Thank you!
For modern computer systems, speed and memory footprint of OSGi are of no concern at all: remember that OSGi was developed for resource-constrained devices. The memory footprint is in the hundreds of kBs, and once the service resolution is done, the framework has no impact on the speed of your application (for instance, there are no proxies). In short, no worries at runtime.
I like the way a properly designed OSGi application cleans up the application's structure, by forcing you to think about your modules and services. I will stay away from all the benefits of modularization and service orientation here, just remember they apply just as well to small applications as to large. Hey, you might even start to find reusable components!
You will need to think about packaging and shipping your application: depending on your audience, you can get away with just shipping a bunch of bundles, using a shell script to get the system going (using e.g. Pax Runner), or you might need to invest in something a little more fancy, like nice application packaging with an icon.
I use karaf/iPOJO as an OSGi container to allow upgrading versions of libraries while the application is running.
However, for a small application which you can restart any time, I would keep things as simple as possible.
Today, OSGi has basically no overhead; the frameworks range from 350 KB to 1 MB. At runtime, OSGi stays out of your way.
You should take a look at bndtools, it provides a very nice development environment for OSGi bundles including launching, debugging, and testing. You can easily switch between frameworks.
You can find bndtools in the Eclipse marketplace.
OSGi still has a "little" more overhead than regular Java projects. I would instead rely on Maven modules if you want versioning.
If you choose the OSGi approach, take a look at Eclipse plug-in creation. This is based on Eclipse Equinox and can be applied to new projects quickly via the wizards Eclipse offers for creating new projects.
Good luck!

Taking the next step with java development?

I want to take the next step in java web development, I am hoping to get insight & feedback on: what my next steps should be and how best to take them.
While learning the basics of java web development, I put together a simple web app that performs simple accounting and financial calculations. The web app is on a single jvm, uses Tomcat, and has standard web functionality - i.e. login/logout, basic security, etc.
How can I make this web app more "enterprise ready" - distribute the functionality of tiers over different servers/JVMs, high availability, load balancing, etc.?
What do I need to know/learn? - i.e. EJB3 or the Spring Framework (it seems Spring is the better option), REST and/or SOAP, etc.
How would one recommend (books, websites, etc.) I learn the "requirements" (see preceding line)?
Thanks!
In my opinion, you should try different approaches to the same problem, so you can compare the pros and cons of different tools and frameworks.
For instance, try to build an application using EJB, and then the same application using Spring. Take the presentation layer of your code written with JSF and then rewrite it using Tapestry.
I think this will be very helpful to you, as you'll be able to make best decisions when choosing tools for your future developments.
A few things to consider, as food for thought:
How good is the error handling/logging of the application? For example, if the user tries to put in "X" for a currency value, what does the application do?
What is configurable within the application from the user and what is in configuration files and what is in a database with regards to configuration? Do you have passwords encrypted within the application?
What patterns would be used in building this application? Are there patterns you could see using now that you have a prototype?
Is this application ready to handle different currencies and languages?
What happens if someone leaves the screen for a few hours and tries to use a form?
What administrative functionality does the application have?
Does it handle the case where the user has JavaScript disabled?
What are the limitations of your application, IOW what can't it handle the way it is?
Have you considered trying to write a manual for the application?
EJB or Spring? gets asked quite a lot nowadays, here's a decent related question about them.
Have some real users use your application. You'll be amazed at how many "new" features/improvements can be made to your app (and the technologies you'll learn to satisfy those requirements) by having real users using it.
I would suggest the following books/tutorials are a must for every Java developer:
Manning: Spring in Action - 2nd edition
Manning: Java Persistence with Hibernate
Core JavaServer Faces
Adobe Flex (Adobe website video tutorials)
Effective Java
Apart from the standard technologies above you must be familiar with
Different testing frameworks; JUnit is a must
Build tools like Ant and Maven
Also you can build small projects by downloading trial versions of MyEclipse or Flex Builder.
I suggest that you create small java experiment projects for each new framework/library that you want to learn.
I've had good success using maven to help me quickly and consistently create java projects that I use to experiment with one technology at a time, such as Spring, Hibernate, etc. I use maven's site life cycle to record notes about what I learned and to document how to build and run each project. So, now I have 20 or so projects that I can use as baseline projects, one for each framework, to build upon.
Also, I prefer buying and reading books rather than relying on google and websites to learn new frameworks. Seems that I'm able to learn a lot faster this way.
I also suggest that you write web apps that you, yourself, would want to use. Or write a web app that solves a problem you've been having. I've found that I learn a lot more this way rather than simply copying and pasting from examples in a text book.
Hope that helps,
- Dave
Spring or Tapestry would be good options for new learning. Does your app use any web services? If not, work those in. Work with other application servers like JBoss and WebLogic and note their nuances with Java. I'd also recommend learning Maven and working that into your build/deploy process.
Have fun,
Mike
You could vastly reduce the time taken to build your apps by learning some Test-Driven Development.
Try learning JUnit - it's becoming a core skill now, even in unagile shops.
If you're focussing on the web, try out Selenium - which has a Java controller to drive your tests from Java test cases.
Investing a bit of time in TDD will pay off no matter which frameworks or apps you work on. If you learn to test drive your code, you'll end up with smaller, cleaner code and less debugging.
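A tiny test-first sketch in JUnit 4 (InterestCalculator is a hypothetical class invented for illustration; in TDD you'd write the test before the implementation, which is included here only so the sketch is self-contained):

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class InterestCalculatorTest {

        // Minimal implementation included so the example compiles on its own.
        static final class InterestCalculator {
            private final double rate;
            InterestCalculator(double rate) { this.rate = rate; }
            double afterOneYear(double principal) { return principal * (1 + rate); }
        }

        @Test
        public void addsSimpleInterestForOneYear() {
            InterestCalculator calc = new InterestCalculator(0.05);   // 5% per year
            assertEquals(105.0, calc.afterOneYear(100.0), 0.0001);
        }
    }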
You might look at AppFuse, which is a bundle of Java things together.
Or, you might take a look at a few more technologies to play with and add in:
Version Control - SVN
Tools - Ant or Maven
Framework - Spring, Seam, Struts
ORM - Hibernate or iBatis
Test Driven Development - JUnit, Emma
Continuous Integration - Hudson
I'd also read the Pragmatic Programmer and/or Code Complete.
Allow me to state that "enterprise-ready" does not necessarily imply scale-out solutions; many, many enterprise Java applications run on larger systems requiring long-running-system skills of their own.
I recommend mastering the Java language and runtime, and understanding how bytecode and class loading work in the JVM, rather than focusing on any given framework.
Speaking of frameworks and if you really have the time, try recreating an application framework yourself. Try and re-invent the wheel. IMO it is an excellent lesson in why frameworks themselves exist and teaches one to employ the features instead of always trying to work around them.
One more thing: never forget the database. I don't care whether that is Oracle, MySQL or NoSQL, but also become an equal master of the data store.

In enterprise Java/.Net projects, does every developer have all dependencies in their classpath?

On large-scale Java/.Net Enterprise projects, does every developer need to have all the components/libraries/dependencies in their classpath/local development environment, in order to make it build?
Or are they divided up into smaller sections that can be built in isolation (so that they don't need to reference all the dependencies)?
In other words: if they want to run the whole application, they need all the components; but if they are only running a subset of the app, they'll only need the corresponding subset of components.
Are large enterprise projects usually organized in the first way or the second way?
One possible organization applies if you are working on a module of the whole project that is self-contained but referenced by other modules (in other words, a leaf node in the dependency tree).
Another organization: if you dynamically load the classes you use, you can build without having any of them on your classpath. To run it, your classpath only needs access to the ones that you actually load (there might be many others that form different parts of the project that you don't load).
These are theoretical possibilities; but what's standard practice for enterprise projects, in... well, in practice?
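For concreteness, a hypothetical sketch of that dynamic-loading organization (the interface and class names are invented): the build only needs the interface, and the implementation class is resolved by name at runtime, so it never has to be on the compile-time classpath.

    public final class PluginLoader {

        public interface ReportGenerator {      // known at build time
            String generate();
        }

        // implClassName might come from a config file, e.g. "com.example.PdfReportGenerator"
        public static ReportGenerator load(String implClassName) throws Exception {
            Class<?> clazz = Class.forName(implClassName);             // runtime lookup
            return (ReportGenerator) clazz.getDeclaredConstructor().newInstance();
        }
    }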
I've expanded this to include .Net, because I think the same issues would arise there (DLL hell?)
There's a different answer to this question for every project out there. A few general points:
"Running a subset of the app" is often not possible, as very few apps are modular enough that each part of them can actually run independently.
What you sometimes have is an app core that is always required, and modules built on that core that are more or less independent of each other.
The big difference is usually not between having vs. not having all components, but between having them as source code vs. having them as JAR files.
On large apps, developers typically have only the parts they're working on in source code and the rest as JAR files
If you need runtime modularization (i.e. components are loaded and unloaded on demand at runtime), that's what OSGi is intended for.
They may need only a subset to build, and another subset to run their tests; but because all the dependencies of less-than-trivially-sized Java projects can very quickly become a nightmare to keep track of, Java developers have developed a love/hate relationship with their elaborate build systems, such as Maven, which manage their development environment for them.
For projects that do not use such a system, it is generally easiest to just include everything all the time. The trade-off is unnecessarily bloated development environments versus having to spend time to track down missing dependencies.
A good project structure will break down things so that you can run independent modules.
But in real life, most projects I've seen don't do this until someone gets fed up and takes initiative to break them down.
If you use a good dependency management infrastructure like Maven or Ivy properly, you can store compiled modules on a server and download these dependencies on an as-needed basis.
You can also get away with having many mock objects and services to help break down the testing dependencies on other product components.
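A hedged sketch of that mocking idea (invented names, JUnit 4): the module under test depends only on an interface owned by another module, so the other module does not even have to be present for the test to build and run.

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class OrderServiceTest {

        interface PriceLookup {                        // interface owned by another module
            double priceOf(String sku);
        }

        static final class OrderService {
            private final PriceLookup prices;
            OrderService(PriceLookup prices) { this.prices = prices; }
            double total(String sku, int quantity) { return prices.priceOf(sku) * quantity; }
        }

        @Test
        public void totalsUseTheLookedUpPrice() {
            PriceLookup fakePrices = sku -> 2.50;      // stub replaces the real module
            assertEquals(7.50, new OrderService(fakePrices).total("ABC", 3), 0.0001);
        }
    }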
I certainly agree with the comments that it would be "good" to separate things. But in practice, that's very rare.
Assuming that you must work in an environment which has not been separated, there's another organizational strategy, and it's what I've seen used. Since your question refers to both build and run dependencies, you don't appear to be talking about processes, but about classes and jars.
The simple solution for that is to have the complete set of built, integration-tested (or integration-test-ready, for that matter) dependencies up on a shared server.
Then developers build in their local environments the portions of the system on which they're working, using a classpath which references first their development and then the appropriate shared server.
Your question isn't very clear, but I think the answer is that every class your application needs has to be on the CLASSPATH, or the class loader will throw a ClassNotFoundException.
That's true whether you're a solo developer or working on a larger, distributed team.
In my experience, applications are packaged one way. If you only want a subset, you have to package it as such.
If you mean test cases as something separate, those usually aren't packaged with production code.
In my opinion, the question is not whether developers can work on a subset of the application, but rather how you manage the dependencies between the projects (think Eclipse projects) that make up the app. Often you might have a tree of such projects where one or more projects can depend on other projects. In such cases it's usually the role of the upstream/common project to make sure downstream projects are not broken by changes in that upstream project.
Think of it like this - let's assume you have a utils project where you put all the common/utility functionality for your application - this could be validation logic, string utilities, logging, etc. And you have a bunch of other projects that use classes from this utils.
utils
/ \
proja projb
In this case, the person working on utils should also have proja and projb in their development environment, as any change to utils can break them. However, if you're only working on projb then you might not have to include proja, as you have no dependency on that project.

Embedded OSGi or Application Bundle

I've just spent the last two days reading up all the OSGi stuff I can get my hands on and I finally think I've got my head around it.
I'm now trying to integrate it with an existing application for many reasons such as 3rd party plugins, automatic updates, not to mention that SOA just makes me happy.
I now have a decision I'm struggling to make, which is whether:
My entire application should become an OSGi bundle installed by default in the container; or
My application should launch an embedded OSGi container and interact with it for all the plugged services.
I'd prefer 1, as this lets me update the application easily and the architecture would be consistent. Of course I expect to have to refactor the application into many smaller bundles. However 2 makes things much easier in the short term, but will become awkward in the future.
For option 1) you really don't want your whole application in one bundle - you would lose all the benefit of OSGi - but that really depends on the size of your application.
It really depends on where you want to run the application and which tasks you want it to perform. You probably also want some kind of remoting to access the exposed services.
In option 1) you need to enable some kind of HTTP/servlet bundle (there is a bridge that exists for this).
In option 2) your application can run inside an application server, so you don't have to worry about that.
The first question you want to ask yourself is about the operational environment. Who is going to run the application? Do they need/want to be trained on OSGi? Are they more comfortable with the J2EE stack?
I think the best option for you is to keep your options open: there is no real difference between 1) and 2) other than what starts the OSGi framework - either your code or the framework code. Your application itself, i.e. the bundles constituting your application, will be exactly the same.
My advice would be not to worry too much about the OSGi runtime to start with, but to start on OSGi development - nothing stops you from developing "OSGi-style" and running in a standard JRE environment.
I think you want to go with option 1, and have your application consist of a set of bundles inside of an (mostly out-of-the-box) OSGi container.
It will improve modularity of your own code. You may even find that some parts of it can provide services of use outside of the original application.
It is much easier to use other bundles from inside of OSGi, than from the host application. Because the host application cannot see the bundles' classes (and the bundles can only see what you explicitly expose from the host), you have to set up a pretty convoluted classpath or resort to reflection to call bundles from outside the container.
So I'd say that even in the short run, option 1 is probably easier.
Also, I agree with Patrick's assertion that the bulk of your code does not need to care if it runs in OSGi or in a plain JVM. Especially when using Declarative Services and such the need to use OSGi interfaces and mechanisms from your code is greatly reduced: You just add a few descriptor files to the jar's META-INF.
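For a sense of how little OSGi leaks into such code, here is a hedged Declarative Services sketch (GreetingService/GreetingClient are invented names; in practice the interface would live in a separate API bundle, and build tooling such as bnd generates the component descriptor from the annotations):

    import org.osgi.service.component.annotations.Activate;
    import org.osgi.service.component.annotations.Component;
    import org.osgi.service.component.annotations.Reference;

    interface GreetingService {
        String greet(String name);
    }

    @Component
    public class GreetingClient {

        @Reference                     // the DS runtime injects a matching service
        private GreetingService greetings;

        @Activate
        void activate() {
            System.out.println(greetings.greet("OSGi"));
        }
    }

The business code stays a plain old Java object; only the annotations (or the equivalent descriptor files) tie it to the container.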
I would rather go with option 2,
Inherently your application is not a bundle, but an application.
If you want the OSGi value addition, spawn the OSGi container from within your application.
That way, if you decide to move away from OSGi at a future date, you can do so in a simple way.
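If you go that route, the standard launch API keeps the host code small. A minimal sketch (the bundle location and cache directory are placeholders; a framework implementation such as Felix or Equinox must be on the classpath for ServiceLoader to find a FrameworkFactory):

    import java.util.HashMap;
    import java.util.Map;
    import java.util.ServiceLoader;
    import org.osgi.framework.BundleContext;
    import org.osgi.framework.launch.Framework;
    import org.osgi.framework.launch.FrameworkFactory;

    public final class EmbeddedOsgiLauncher {

        public static void main(String[] args) throws Exception {
            FrameworkFactory factory =
                    ServiceLoader.load(FrameworkFactory.class).iterator().next();

            Map<String, String> config = new HashMap<>();
            config.put("org.osgi.framework.storage", "plugin-cache");   // placeholder dir

            Framework framework = factory.newFramework(config);
            framework.start();

            BundleContext context = framework.getBundleContext();
            context.installBundle("file:plugins/example-plugin.jar").start();   // placeholder

            framework.waitForStop(0);    // block until the framework is shut down
        }
    }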
Have you looked at the Spring Application server? Doesn't this allow you to manage this stuff?
I would definitely recommend 1 - the app should become an OSGi bundle (or several), and not only because of easy updating. If half of your code is in the OSGi framework and half is outside, you will have to construct a bridge for the communication between the two halves; you could also have issues with class visibility.
There are also many benefits from 1, and it is not so difficult to achieve. What I would recommend is the following:
Separate the application in as many modules as it seems logical to you.
You are not forced to have many modules - OSGi can as easily handle two bundles 10 MB each as well as 100 smaller bundles. The separation should be a result of the functionality itself - a good starting point is the UML architecture diagram you probably did before you even started implementing the stuff. The places where the different functional parts communicate with each other are exactly the places where you should think about defining interfaces instead of classes - and these interfaces will then become your OSGi services and the implementations will become the bundles - and the next time you will have to update some part you will find out it is much easier to predict the effect on the other parts of the app because you separated it clearly and declared it in the manifest of the bundles.
Separate any external/open source libraries you use in separate bundles. They will most probably be the parts that will have to be updated more often and on a different timeline than your own code. It is also more important here to define clear package dependencies, package versions, and to avoid depending on the implementation parts instead of only on interfaces!
Think about which parts of the app you want to expose to plugins. Then make OSGi services out of these parts - i.e. publish the interfaces in the OSGi registry. You don't need to implement any specific thing - you can publish any Java object. The plugins will then use the registry for the lookup.
The same goes for the plugins - think about what you want to get from plugins and define the respective interfaces that the plugins can implement and publish and your app can look up in the registry.
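A hedged sketch of that publish/lookup pattern (SpellChecker and the class names are invented; the "plugin side" would normally live in a different bundle with its own BundleContext):

    import org.osgi.framework.BundleActivator;
    import org.osgi.framework.BundleContext;
    import org.osgi.framework.ServiceReference;

    public class HostActivator implements BundleActivator {

        public interface SpellChecker {
            boolean isCorrect(String word);
        }

        @Override
        public void start(BundleContext context) {
            // Host side: publish an implementation under the interface name.
            context.registerService(SpellChecker.class, word -> !word.isEmpty(), null);

            // Plugin side: look the service up through the registry.
            ServiceReference<SpellChecker> ref = context.getServiceReference(SpellChecker.class);
            if (ref != null) {
                SpellChecker checker = context.getService(ref);
                System.out.println(checker.isCorrect("modularity"));
                context.ungetService(ref);
            }
        }

        @Override
        public void stop(BundleContext context) { }
    }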
And as a final tip - see what bundles are already available in the OSGi framework you have chosen. There are many standard OSGi interfaces defined by the OSGi spec - for configuration, logging, persistent storage, remote connections, user admin, eventing, and many more. Since they are standard, you can use them without becoming dependent on any specific OSGi implementation. And uninstall what you don't need.

What does OSGi solve?

I've read on Wikipedia and other sites about OSGi, but I don't really see the big picture. It says that it's a component-based platform, and that you can reload modules at runtime. Also the "practical example" given everywhere is the Eclipse Plugin Framework.
My questions are:
What is the clear and simple definition of OSGi?
What common problems does it solve?
By "common problems" I mean problems we face everyday, like "What can OSGi do for making our jobs more efficient/fun/simple?"
What benefits does OSGi's component system provide you? Well, here is quite a list:
Reduced Complexity - Developing with OSGi technology means developing bundles: the OSGi components. Bundles are modules. They hide their internals from other bundles and communicate through well defined services. Hiding internals means more freedom to change later. This not only reduces the number of bugs, it also makes bundles simpler to develop because correctly sized bundles implement a piece of functionality through well defined interfaces. There is an interesting blog that describes what OSGi technology did for their development process.
Reuse - The OSGi component model makes it very easy to use many third party components in an application. An increasing number of open source projects provide their JARs ready made for OSGi. However, commercial libraries are also becoming available as ready made bundles.
Real World - The OSGi framework is dynamic. It can update bundles on the fly and services can come and go. Developers used to more traditional Java see this as a very problematic feature and fail to see the advantage. However, it turns out that the real world is highly dynamic and having dynamic services that can come and go makes the services a perfect match for many real world scenarios. For example, a service could model a device in the network. If the device is detected, the service is registered. If the device goes away, the service is unregistered. There are a surprising number of real world scenarios that match this dynamic service model. Applications can therefore reuse the powerful primitives of the service registry (register, get, list with an expressive filter language, and waiting for services to appear and disappear) in their own domain. This not only saves writing code, it also provides global visibility, debugging tools, and more functionality than would have been implemented for a dedicated solution. Writing code in such a dynamic environment sounds like a nightmare, but fortunately, there are support classes and frameworks that take most, if not all, of the pain out of it. (A minimal consumer-side sketch of such a dynamic service follows this list.)
Easy Deployment - The OSGi technology is not just a standard for components. It also specifies how components are installed and managed. This API has been used by many bundles to provide a management agent. This management agent can be as simple as a command shell, a TR-69 management protocol driver, OMA DM protocol driver, a cloud computing interface for Amazon's EC2, or an IBM Tivoli management system. The standardized management API makes it very easy to integrate OSGi technology in existing and future systems.
Dynamic Updates - The OSGi component model is a dynamic model. Bundles can be installed, started, stopped, updated, and uninstalled without bringing down the whole system. Many Java developers do not believe this can be done reliably and therefore initially do not use this in production. However, after using this in development for some time, most start to realize that it actually works and significantly reduces deployment times.
Adaptive - The OSGi component model is designed from the ground up to allow the mixing and matching of components. This requires that the dependencies of components need to be specified and it requires components to live in an environment where their optional dependencies are not always available. The OSGi service registry is a dynamic registry where bundles can register, get, and listen to services. This dynamic service model allows bundles to find out what capabilities are available on the system and adapt the functionality they can provide. This makes code more flexible and resilient to changes.
Transparency - Bundles and services are first class citizens in the OSGi environment. The management API provides access to the internal state of a bundle as well as how it is connected to other bundles. For example, most frameworks provide a command shell that shows this internal state. Parts of the applications can be stopped to debug a certain problem, or diagnostic bundles can be brought in. Instead of staring at millions of lines of logging output and long reboot times, OSGi applications can often be debugged with a live command shell.
Versioning - OSGi technology solves JAR hell. JAR hell is the problem that library A works with library B;version=2, but library C can only work with B;version=3. In standard Java, you're out of luck. In the OSGi environment, all bundles are carefully versioned and only bundles that can collaborate are wired together in the same class space. This allows both bundle A and C to function with their own library. Though it is not advised to design systems with this versioning issue, it can be a life saver in some cases.
Simple - The OSGi API is surprisingly simple. The core API is only one package and less than 30 classes/interfaces. This core API is sufficient to write bundles, install them, start, stop, update, and uninstall them and includes all listener and security classes. There are very few APIs that provide so much functionality for so little API.
Small - The OSGi Release 4 Framework can be implemented in about a 300KB JAR file. This is a small overhead for the amount of functionality that is added to an application by including OSGi. OSGi therefore runs on a large range of devices: from very small, to small, to mainframes. It only asks for a minimal Java VM to run and adds very little on top of it.
Fast - One of the primary responsibilities of the OSGi framework is loading the classes from bundles. In traditional Java, the JARs are completely visible and placed on a linear list. Searching a class requires searching through this (often very long, 150 is not uncommon) list. In contrast, OSGi pre-wires bundles and knows for each bundle exactly which bundle provides the class. This lack of searching is a significant speed up factor at startup.
Lazy - Lazy in software is good and the OSGi technology has many mechanisms in place to do things only when they are really needed. For example, bundles can be started eagerly, but they can also be configured to only start when other bundles are using them. Services can be registered, but only created when they are used. The specifications have been optimized several times to allow for these kind of lazy scenarios that can save tremendous runtime costs.
Secure - Java has a very powerful fine grained security model at the bottom but it has turned out very hard to configure in practice. The result is that most secure Java applications are running with a binary choice: no security or very limited capabilities. The OSGi security model leverages the fine grained security model but improves the usability (as well as hardening the original model) by having the bundle developer specify the requested security details in an easily audited form while the operator of the environment remains fully in charge. Overall, OSGi likely provides one of the most secure application environments that is still usable short of hardware protected computing platforms.
Non Intrusive - Applications (bundles) in an OSGi environment are left to their own. They can use virtually any facility of the VM without the OSGi restricting them. Best practice in OSGi is to write Plain Old Java Objects and for this reason, there is no special interface required for OSGi services, even a Java String object can act as an OSGi service. This strategy makes application code easier to port to another environment.
Runs Everywhere - Well, that depends. The original goal of Java was to run anywhere. Obviously, it is not possible to run all code everywhere because the capabilities of the Java VMs differ. A VM in a mobile phone will likely not support the same libraries as an IBM mainframe running a banking application. There are two issues to take care of. First, the OSGi APIs should not use classes that are not available in all environments. Second, a bundle should not start if it contains code that is not available in the execution environment. Both of these issues have been taken care of in the OSGi specifications.
Source : www.osgi.org/Technology/WhyOSGi
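As promised above, a hedged consumer-side sketch of the dynamic service model (DeviceService is an invented interface; ServiceTracker is the standard support class that hides the register/unregister churn):

    import org.osgi.framework.BundleContext;
    import org.osgi.util.tracker.ServiceTracker;

    public class DeviceWatcher {

        public interface DeviceService {
            String deviceName();
        }

        private final ServiceTracker<DeviceService, DeviceService> tracker;

        public DeviceWatcher(BundleContext context) {
            // The tracker transparently follows services as they come and go.
            tracker = new ServiceTracker<>(context, DeviceService.class, null);
            tracker.open();
        }

        public String currentDevice() {
            DeviceService device = tracker.getService();   // null if no device right now
            return device == null ? "no device connected" : device.deviceName();
        }

        public void close() {
            tracker.close();
        }
    }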
I've found the following benefits from OSGi:
Each plugin is a versioned artifact that has its own classloader.
Each plugin depends on both specific jars that it contains and also other specific versioned plug-ins.
Because of the versioning and isolated classloaders, different versions of the same artifact can be loaded at the same time. If one component of your application relies on one version of a plug-in and another depends on another version, they both can be loaded at the same time.
With this, you can structure your application as a set of versioned plugin artifacts that are loaded on demand. Each plugin is a standalone component. Just as Maven helps you structure your build so it is repeatable and defined by a set of specific versions of artifacts it is created by, OSGi helps you do this at runtime.
I don't care too much about the hotplugability of OSGi modules (at least currently). It's more the enforced modularity. Not having millions of "public" classes available on the classpath at any time protects well from circular dependencies: You have to really think about your public interfaces - not just in terms of the java language construct "public", but in terms of your library/module: What (exactly) are the components, that you want to make available for others? What (exactly) are the interfaces (of other modules) you really need to implement your functionality?
It's nice, that hotplug comes with it, but I'd rather restart my usual applications than testing all combinations of hotplugability...
You can, by analogy, change the engine of your car without turning it off.
You can customize complex systems for the customers. See the power of Eclipse.
You can reuse entire components. Better than just objects.
You use a stable platform to develop component based Applications. The benefits of this are huge.
You can build components with the black-box concept. Other components don't need to know about hidden internals; they see just the published interfaces.
You can use several equal components in the same system, but in different releases, without compromising the application. OSGi solves the JAR hell problem.
With OSGi you develop with a mindset of architecting systems through component-based development (CBD).
There are a lot of benefits (these are just the ones I recalled now), available to everyone who uses Java.
edited for clarity. OSGi page gave a better simple answer than mine
A simple answer: An OSGi Service Platform provides a standardized, component-oriented computing environment for cooperating networked services. This architecture significantly reduces the overall complexity of building, maintaining and deploying applications.
The OSGi Service Platform provides the functions to change the composition on the device dynamically, across a variety of networks, without requiring a restart.
In a single application structure, say the Eclipse IDE, it's not a big deal to restart when you install a new plugin. Using the OSGi implementation fully, you should be able to add plugins at runtime, get the new functionality, and not have to restart Eclipse at all.
Again, not a big deal for every day, small application use.
But, when you start to look at multi-computer, distributed application frameworks, that's where it starts to get interesting. When you have to have 100% uptime for critical systems, the capability to hotswap components or add new functionality at runtime is useful. Granted, there are capabilities for doing this now for the most part, but OSGi is trying to bundle everything into a nice little framework with common interfaces.
Does OSGi solve common problems? I'm not sure about that. I mean, it can, but the overhead may not be worth it for simpler problems. But it's something to consider when you are starting to deal with larger, networked applications.
A few things that drive me nuts about OSGi:
1) The implementations and their context loaders have a lot of quirks, and can be somewhat async (we use Felix inside of Confluence), compared to pure Spring (no DM) where [main] pretty much runs through everything synchronously.
2) Classes are not equal after a hot load. Say, for instance, you have a Tangosol cache layer on Hibernate. It is filled with Fork.class, outside of the OSGi scope. You hot-load a new jar, and Fork has not changed. Class[Fork] != Class[Fork]. The same thing appears during serialization, for the same underlying causes. (A small sketch of this class-identity problem appears below.)
3) Clustering.
You can work around these things, but it is a major major pain, and makes your architecture look flawed.
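A minimal sketch of the class-identity problem from point 2 (the jar path and com.example.Fork are placeholders): the "same" class loaded through two different classloaders yields two distinct Class objects, so casts and caches that span a hot reload break.

    import java.net.URL;
    import java.net.URLClassLoader;

    public final class ClassIdentityDemo {

        public static void main(String[] args) throws Exception {
            URL jar = new URL("file:plugins/fork.jar");              // placeholder location

            ClassLoader first  = new URLClassLoader(new URL[] { jar }, null);
            ClassLoader second = new URLClassLoader(new URL[] { jar }, null);

            Class<?> forkA = first.loadClass("com.example.Fork");    // hypothetical class
            Class<?> forkB = second.loadClass("com.example.Fork");

            System.out.println(forkA == forkB);                      // false: different loaders
            // An instance created from forkA cannot be cast to forkB's type, which is
            // exactly what bites cached or deserialized objects across a bundle reload.
        }
    }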
And to those of you advertising the hotplugging.. OSGi's #1 Client? Eclipse. What does Eclipse do after loading the bundle?
It restarts.
OSGi makes your code throw NoClassDefFoundError and ClassNotFoundException for no apparent reason (most probably because you forgot to export a package in OSGi configuration file); since it has ClassLoaders it can make your class com.example.Foo fail to be cast to com.example.Foo since it's actually two different classes loaded by two different classloaders. It can make your Eclipse boot into an OSGi console after installing an Eclipse plugin.
For me, OSGi only added complexity (because it added one more mental model for me to grok), added annoyances because of exceptions; I never really needed the dynamicity it "offers". It was intrusive since it required OSGi bundle configuration for all modules; it was definitely not simple (in a larger project).
Because of my bad experience, I tend to stay away from that monster, thank you very much. I'd rather suffer from jar dependency hell, since that's way way more easily understandable than the classloader hell OSGi introduces.
I have yet to become a "fan" of OSGi...
I have been working with an enterprise application at Fortune 100 companies. Recently, the product we use has "upgraded" to an OSGi implementation.
starting local cba deployment...
[2/18/14 8:47:23:727 EST] 00000347 CheckForOasis
finally deployed and "the following bundles will be quiesced and then restarted"
[2/18/14 9:38:33:108 EST] 00000143 AriesApplicat I CWSAI0054I: As part of an update operation for application
51 minutes... each time code changes... The previous version (non-OSGi) would deploy in less than 5 minutes on older development machines.
on a machine with 16 gig ram and 40 free gig disk and Intel i5-3437U 1.9 GHz CPU
The "benefit" of this upgrade was sold as improving (production) deployments - an activity that we do about 4 times a year with maybe 2-4 small fix deployments a year. Adding 45 minutes per day to 15 people (QA and developers) I can't imagine ever being justified. In big enterprise applications, if your application is a core application, then changing it is, rightly so (small changes have potential for far reaching impacts - must be communicated and planned with consumers all over the enterprise), a monumental activity - wrong architecture for OSGi. If your application is not an enterprise application - i.e. each consumer can have their own tailored module likely hitting their own silo of data in their own silo'd database and running on a server that hosts many applications, then maybe look at OSGi. At least, that is my experience thus far.
If a Java-based application requires adding or removing modules (extending the base functionality of the application) without shutting down the JVM, OSGi can be employed - usually when the cost of shutting down the JVM just to update or enhance functionality is too high.
Examples:
Eclipse: provides a platform for plugins to be installed, uninstalled, updated, and to depend on one another.
AEM: a WCM application, where functionality changes are business-driven and which cannot afford downtime for maintenance.
Note: the Spring framework stopped supporting OSGi Spring bundles, considering it unnecessary complexity for transaction-based applications, or something along those lines. I personally do not consider OSGi unless it is absolutely necessary, for something big like building a platform.
I've been working with OSGi for almost 8 years or so, and I have to say that you should consider OSGi only if you have a business need to update, remove, install or replace a component at runtime. This also means that you should have a modular mindset and understand what modularity means. There are arguments that OSGi is lightweight - yes, that is true, but there are also other frameworks that are lightweight and easier to maintain and develop with. The same goes for "secure Java" and so on.
OSGi requires a solid architecture to be used correctly, and it's quite easy to build an OSGi system that could just as easily be a standalone runnable jar without any OSGi being involved.
OSGi provides the following benefits:
■ A portable and secure execution environment based on Java
■ A service management system, which can be used to register and share services across bundles and decouple service providers from service consumers
■ A dynamic module system, which can be used to dynamically install and uninstall Java modules, which OSGi calls bundles
■ A lightweight and scalable solution
It is also being used to bring additional portability to middleware and applications on the mobile side, which is available for WinMo, Symbian and Android, for example. As soon as integration with device features occurs, things can get fragmented.
At the very least, OSGi makes you THINK about modularity, code reuse, versioning and in general the plumbing of a project.
Others have already outlined the benefits in detail; here I describe the practical use cases where I have either seen or used OSGi.
In one of our applications we have an event-based flow, and the flow is defined in plugins based on the OSGi platform, so if tomorrow some client wants a different or additional flow, he just has to deploy one more plugin, configure it from our console, and he is done.
It is also used for deploying different store connectors. For example, suppose we already have an Oracle DB connector and tomorrow MongoDB needs to be connected: write a new connector, deploy it, configure the details through the console, and again you are done. Deployment of connectors is handled by the OSGi plugin framework.
There is already a quite convincing statement on its official site, which I may quote:
The key reason OSGi technology is so successful is that it provides a very mature component system that actually works in a surprising number of environments. The OSGi component system is actually used to build highly complex applications like IDEs (Eclipse), application servers (GlassFish, IBM Websphere, Oracle/BEA Weblogic, Jonas, JBoss), application frameworks (Spring, Guice), industrial automation, residential gateways, phones, and so much more.
As for the benefits to developers?
DEVELOPERS: OSGi reduces complexity by providing a modular architecture for today’s large-scale distributed systems as well as small, embedded applications. Building systems from in-house and off-the-shelf modules significantly reduces complexity and thus development and maintenance expenses. The OSGi programming model realizes the promise of component-based systems.
Please check the details in Benefits of Using OSGi.
