Java Plugin Development - java

I'm developing a Java application that has 20 plugins. Each plugin has some similar GUI menu items and menu events, but they store data in different tables in a database. Currently I'm creating separate GUI, event and model classes for each plugin by copying and pasting into different class files.
Is it wise to develop separate GUI, event and model classes for each plugin and duplicate similar methods across plugins?
I need your advice on how to create a generic GUI, event and model interface for all the plugins without making my application hard to maintain.
Thank you.

We have a plug-in system in our product and ran into the same issue: many of the plug-ins share a lot of code. We ultimately decided on the following:
Define clean interfaces (PluginABC implements MyProductInterface). These interfaces don't require a specific implementation, but...
We provide an AbstractPlugin that plugins can extend. This AbstractPlugin provides a ton of standard functionality that is useful for most plug-ins.
Example:
public interface MyProductInterface {
    public void doIt();
}

public class MyPlugin extends AbstractPlugin implements MyProductInterface {
    public void doIt() {
        // ...
        usefulMethodForMostPlugins();
        // ...
    }
}

public abstract class AbstractPlugin {
    public void usefulMethodForMostPlugins() { /* ... */ }
}

Android: 2 aar libraries with the same package

Edit: A follow-up question based on this discussion was published under the following title:
Android: How to manage common codebase in multiple libraries used by the same application
I have two Android aar library projects: LibA using ClassA, and LibB using ClassB. Both libs have the same base package, and both libs use the same class named BaseClass, which currently resides separately within each lib in a package named 'common'. BaseClass contains one method named baseMethod.
This creates two libs using a class with the same name but a different implementation.
This is what the classes look like:
ClassA:
package mybasepackage.a;

import mybasepackage.common.BaseClass;

public class ClassA {
    BaseClass baseClass;

    public ClassA() {
        this.baseClass = new BaseClass();
    }

    public String myPublicMethod() {
        return this.baseClass.baseMethod();
    }
}
ClassB:
package mybasepackage.b;

import mybasepackage.common.BaseClass;

public class ClassB {
    BaseClass baseClass;

    public ClassB() {
        this.baseClass = new BaseClass();
    }

    public String myPublicMethod() {
        return this.baseClass.baseMethod();
    }
}
BaseClass in LibA:
package mybasepackage.common;

public class BaseClass {
    public String baseMethod() {
        return "Called from ClassA";
    }
}
BaseClass in LibB:
package mybasepackage.common;

public class BaseClass {
    public String baseMethod() {
        return "Called from ClassB";
    }
}
When I try to compile both libs in the same app, it throws a duplicate-class error: "Program type already present: mybasepackage.common.BaseClass". This happens because the build cannot know which BaseClass to use, since it resides within both libs.
My goal is to allow both aar libs to compile successfully within the same app, while providing different implementations for the BaseClass. More formally, LibA and LibB should compile in the same application such that:
Calling new ClassA().myPublicMethod() will return "Called from ClassA".
Calling new ClassB().myPublicMethod() will return "Called from ClassB".
Precondition: I cannot change the base package name in one of the libs, because that would essentially create an unwanted duplication of BaseClass.
NOTE: I'm aware this may not be possible via the aar approach. If that is truly the case, I'm willing to consider other deployment architectures as long as I'll be able to compile these libs with the same common class using different implementations, as described in the question.
My goal is to allow both aar libs to compile successfully within the same app, while providing different implementations for the BaseClass
That is not possible, sorry.
I'm aware this may not be possible via the aar approach.
It has nothing to do with AARs. You cannot have two classes with the same fully-qualified class name in the same app, period. It does not matter where those duplicate classes come from.
I'm willing to consider other deployment architectures as long as I'll be able to compile these libs with the same common class using different implementations, as described in the question.
That is not possible, sorry. Again: it does not matter where the duplicate classes come from. You simply cannot have duplicate classes.
Given your precondition, you just can't do that this way. You cannot have two libraries in Java that both contribute a class with the same fully-qualified name; the duplicated package-plus-class name is what throws your error, not the simple class name alone.
What you can do, and if possible is probably the best way to handle this, is to merge the two libraries into one, add two subpackages inside, and then just import the one you need:
import mybasepackage.common.a_name.BaseClass; // class A
import mybasepackage.common.b_name.BaseClass; // class B
This avoids the duplication error because the classes now only share a simple name but live in different packages.
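For illustration, a minimal sketch of how ClassA could look after the merge, assuming the placeholder subpackage name a_name used above (the subpackage names are not from the original libraries); only the import changes, the rest of the class stays the same:
package mybasepackage.a;

import mybasepackage.common.a_name.BaseClass; // the A-specific implementation

public class ClassA {
    private final BaseClass baseClass = new BaseClass();

    public String myPublicMethod() {
        return this.baseClass.baseMethod(); // returns "Called from ClassA"
    }
}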
Another idea, if the above doesn't fit your expectation, is to change the architecture by introducing another abstraction layer in which you define BaseClass as an abstract class:
package mybasepackage.common;

public abstract class BaseClass {
    public abstract String myPublicMethod();
}
and then you implement the method inside ClassA and ClassB:
public class ClassA extends BaseClass {
    public ClassA() {
        super();
    }

    @Override
    public String myPublicMethod() {
        // logic for A
        return "Called from ClassA";
    }
}
NB: the above implementation of ClassA is just a stub and is not supposed to work as it is. Adapt it to your needs.
In any case, you cannot have two classes with the same fully-qualified name in the same app.
Just build three artifacts, because two artifacts will always require an exclude on one of the dependency sets. When the two -liba and -libb libraries depend on a third -base, -core or -common library, there are no duplicate classes. And if you want to keep the package name, just make a mybasepackage artifact depend on all of them, like a meta-package:
mybasepackage (meta-package)
 |-- mybasepackage-liba  -> mybasepackage-common
 |-- mybasepackage-libb  -> mybasepackage-common
 |-- mybasepackage-common

How to manage required, conflicting Java dependencies

Say I need (it's required) to use fizz-1.0.jar and buzz-2.3.2.jar in my Java project. Now, fizz-1.0.jar depends on foo-0.1.35.jar, and buzz-2.3.2.jar depends on foo-4.2.17.jar.
foo-0.1.35.jar contains a Widget class like so:
public class Widget {
    public int doSomething(int x) {
        return x++;
    }
}
foo-4.2.17.jar contains a heavily modified version of Widget:
public class Widget {
    public Meh makeStuff() {
        return new Meh();
    }
}
Unfortunately, both fizz-1.0.jar and buzz-2.3.2.jar make heavy use of both versions of Widget.
I can't just blindly add both versions of foo-x.y.z.jar to the classpath, because whichever Widget gets loaded first will only work for either fizz-1.0.jar or buzz-2.3.2.jar.
What are my options here? Remember I must have both fizz-1.0.jar and buzz-2.3.2.jar, and must satisfy all of their transitive dependencies.
I would recommend that you use a framework which isolates class loading.
E.g. an OSGi framework. Then you can create two bundles, one with the fizz implementation and one with the buzz implementation. They can both contain their dependent libraries, which no longer conflict because they are loaded by two different class loaders.
Example OSGi containers are "Eclipse Equinox" and "Apache Felix", but there are more.
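Outside OSGi, the same class-loader-isolation idea can be sketched with plain URLClassLoaders. This is only a rough illustration under assumed jar paths and made-up entry-point class names, not a drop-in solution:
import java.net.URL;
import java.net.URLClassLoader;

public class IsolationSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical jar locations; adjust to your layout.
        URL[] fizzSide = { new URL("file:lib/fizz-1.0.jar"), new URL("file:lib/foo-0.1.35.jar") };
        URL[] buzzSide = { new URL("file:lib/buzz-2.3.2.jar"), new URL("file:lib/foo-4.2.17.jar") };

        // Each loader sees only "its" foo version, so both Widget classes can coexist.
        try (URLClassLoader fizzLoader = new URLClassLoader(fizzSide, ClassLoader.getSystemClassLoader().getParent());
             URLClassLoader buzzLoader = new URLClassLoader(buzzSide, ClassLoader.getSystemClassLoader().getParent())) {

            // Entry-point class names are invented for the example.
            Class<?> fizzEntry = Class.forName("com.example.fizz.FizzService", true, fizzLoader);
            Class<?> buzzEntry = Class.forName("com.example.buzz.BuzzService", true, buzzLoader);

            // Talk to them reflectively, or through a small shared interface
            // that lives on the parent class loader.
        }
    }
}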
Hope that helps.

Lejos (java) and interfaces // UML suggestion

I created a project with leJOS 0.9. What I know is that I'm only able to compile and upload classes (from Java to nxj files) with the Eclipse plugin when the class has a public static void main(String[] args). But I have to get more classes and interfaces onto the Lego Mindstorms brick. Is there a way to do this? Connecting directly to the brick is not a good idea, because then Java files which cannot be run will be put on the brick.
Another option for this problem might be to change the UML design. This is the current design.
So basically there is a Robot class, and other robots like Humanoid etc. extend this Robot class. Then there are behaviours, which all implement the interface Iwalk. Every robot can get behaviours dynamically because of polymorphism.
In Humanoid.java:
package Robots;

import Behaviours.*;
import IBehaviours.*;

public class Humanoid extends Robot {
    private Iwalk walker = new ForwardLegs();

    Iwalk getWalker() {
        return walker;
    }

    public void setWalker(Iwalk walker) {
        this.walker = walker;
    }

    public void moving() {
        setWalker(walker);
        walker.move();
    }
}
In a HelloWorld.java class (not linked to any class, just to initiate things):
Humanoid asimov = new Humanoid();
asimov.setWalker(new ForwardLegs());
asimov.moving();
So two answers could exist to my question: how to put interfaces on the Lego Mindstorms brick with leJOS, or another UML design which does the same but without interfaces. Thanks in advance.
I solved the problem by not using interfaces. The reason I needed interfaces was polymorphism, which is also possible by changing the Iwalk interface to an abstract class and changing implements to extends in the behaviours.
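A minimal sketch of that change, keeping the names from the question (Iwalk, ForwardLegs, Humanoid); this only illustrates the interface-to-abstract-class switch and has not been tested on the brick:
// Iwalk becomes an abstract class instead of an interface.
public abstract class Iwalk {
    public abstract void move();
}

// Behaviours now extend instead of implement.
class ForwardLegs extends Iwalk {
    public void move() {
        // drive the legs forward here
    }
}

// The robot code stays the same: the walker field is still polymorphic.
class Humanoid /* extends Robot */ {
    private Iwalk walker = new ForwardLegs();

    public void setWalker(Iwalk walker) {
        this.walker = walker;
    }

    public void moving() {
        walker.move();
    }
}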

Cross Platform Development Suggestions

I'm working on a project in Java that will eventually run on Linux and Windows machines, and maybe Mac. My program installs/configures a VNC server, so I'm looking for suggestions on how I should implement this part of the project. Should I just have a modular design, or would it be possible to create a platform-independent architecture for this problem?
I think that if VNC configuration is different on different platforms, you should just create an interface and a hierarchy of classes that implement it, i.e.
public interface VncConfigurator {
    public void configure(Configuration configuration) throws ConfigurationException;
}

public class WindowsVncConfigurator implements VncConfigurator {
    public void configure(Configuration configuration) throws ConfigurationException {}
}

public class LinuxVncConfigurator implements VncConfigurator {
    public void configure(Configuration configuration) throws ConfigurationException {}
}
etc, etc.
You can also create an abstract configurator or configurator utils where the common logic is implemented.
Now create a factory that instantiates the "right" implementation of the configurator according to the platform, and you are done.
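A minimal sketch of such a factory, assuming the VncConfigurator hierarchy above; the os.name checks are illustrative rather than exhaustive:
public final class VncConfiguratorFactory {

    private VncConfiguratorFactory() {}

    // Picks an implementation based on the running platform.
    public static VncConfigurator forCurrentPlatform() {
        String os = System.getProperty("os.name").toLowerCase();
        if (os.contains("win")) {
            return new WindowsVncConfigurator();
        }
        if (os.contains("nux") || os.contains("nix")) {
            return new LinuxVncConfigurator();
        }
        throw new UnsupportedOperationException("Unsupported platform: " + os);
    }
}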
I believe that on Windows you will need some additional libraries, e.g. ones that provide access to the registry. If you need this, first check the following link: http://alexradzin.blogspot.com/2011/01/access-windows-registry-with-pure-java.html

Java (Eclipse) - Conditional compilation

I have a Java project that is referenced in a J2ME project and in an Android project.
In this project I would like to use conditional compilation.
Something like...
//#if android
...
//#endif
//#if j2me
...
//#endif
I have been reading about this but I have not found anything useful yet.
You could use Antenna (there is a plugin for Eclipse, and you can use it with the Ant build system).
I'm using it in my projects in a way you've described and it works perfectly :)
EDIT: here is an example related to @WhiteFang34's solution, which is the way to go:
In your core project:
// Base class Base.java
public abstract class Base {
    public static Base getInstance() {
        //#ifdef ANDROID
        return new AndroidBaseImpl();
        //#elif J2ME
        return new J2MEBaseImpl();
        //#endif
    }

    public abstract void doSomething();
}

// Android-specific implementation AndroidBaseImpl.java
//#ifdef ANDROID
public class AndroidBaseImpl extends Base {
    public void doSomething() {
        // Android code
    }
}
//#endif

// J2ME-specific implementation J2MEBaseImpl.java
//#ifdef J2ME
public class J2MEBaseImpl extends Base {
    public void doSomething() {
        // J2ME code
    }
}
//#endif
In your project that uses the core project:
public class App {
    public void something() {
        // Depends on the preprocessor symbol you used to build the project
        Base.getInstance().doSomething();
    }
}
Then if you want to build for Android, you just define the ANDROID preprocessor symbol, or J2ME if you want to do a build for the J2ME platform...
Anyway, I hope it helps :)
Perhaps you should consider creating interfaces around the logic that's specific to a profile (J2ME, Android or other in the future). Then create concrete implementations of your interface for each profile. Any common parts you could split out into an abstract base class for both implementations to extend. This way your logic for each profile is nicely separated for different concerns. For each profile just build the appropriate set of classes (you could separate them by package for example). It'll be easier to maintain, debug, test and understand in the long run.
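As a rough sketch of that suggestion, without any preprocessor, the profile-specific code could sit behind an interface and each build could include only its own implementation. The names ProfileService, AndroidProfileService and J2meProfileService are invented for illustration:
// Shared, profile-neutral contract.
public interface ProfileService {
    void doSomething();
}

// Included only in the Android build (e.g. its own source folder or package).
class AndroidProfileService implements ProfileService {
    public void doSomething() {
        // Android-specific code
    }
}

// Included only in the J2ME build.
class J2meProfileService implements ProfileService {
    public void doSomething() {
        // J2ME-specific code
    }
}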
The Eclipse MTJ project provides preprocessing support, as documented. This support was mainly targeted at tackling fragmentation problems in Java ME. I have not tested the preprocessing support together with the Android tooling, but it may just work.
