Best practice: Extending or overriding an Android library project class - java

We're using an Android Library Project to share core classes and resources across different builds (targets) of our Android application. The Android projects for each specific target reference the Core library project (behind the scenes, Eclipse creates and references a jar from the referenced library project).
Overriding resources such as images and XML layouts is easy. Resource files placed in the target project, such as the app icon or an XML layout, automatically override the core library's resources with the same name when the app is built. However, sometimes a class needs to be overridden to enable target-specific behavior. For example, the Amazon target preferences screen cannot contain a link to the Google Play app page, requiring a change in the Amazon project's preferences.xml and preferences Activity class.
The goal is to reduce the amount of duplicate code among target projects while removing as much target-specific code from the Core library as possible. We've come up with a couple of approaches to implement logic specific to different targets:
Write the target-specific functions within Core library classes and use if/switch blocks to select behavior based on product SKU. This approach is not very modular and bloats the Core library codebase.
Extend the particular Core class in a target project and override the base (Core) class functions as needed. Then keep a reference to the base-class object in the Core library and instantiate it with an extended class object (from How to override a class within an Android library project?)
Are there other strategies to override or extend an Android library project class? What are some of the best practices for sharing and extending common classes among Android app targets?

Library project is referenced as a raw project dependency (source-based mechanism), not as a compiled jar dependency (compiled-code based library mechanism).
@yorkw, this is not true for the latest versions of the ADT Plugin for Eclipse:
http://developer.android.com/sdk/eclipse-adt.html
From the version 17 changelog:
New build features
Added feature to automatically setup JAR dependencies. Any .jar files in the /libs folder are added to the build configuration (similar to how the Ant build system works). Also, .jar files needed by library projects are also automatically added to projects that depend on those library projects. (more info)
More info http://tools.android.com/recent/dealingwithdependenciesinandroidprojects
Before that, overriding an Activity from a library project was easy: just exclude the class. Now the library is included as a jar file, and there is no way to exclude a class file from a jar dependency.
EDIT:
My solution to overwrite/extend an Activity from a library jar:
I created a simple util class:
import android.content.Context;
import android.content.Intent;

public class ActivityUtil {

    private static Class<?> getActivityClass(Class<?> clazz) {
        // Check for an extended activity
        String extClassName = clazz.getName() + "Extended";
        try {
            Class<?> extClass = Class.forName(extClassName);
            return extClass;
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            // Extended class not found, return the base class
            return clazz;
        }
    }

    public static Intent createIntent(Context context, Class<?> clazz) {
        Class<?> activityClass = getActivityClass(clazz);
        return new Intent(context, activityClass);
    }
}
In order to overwrite a library's "SampleActivity" class in a project which depends on that library, create a new class named SampleActivityExtended in the same package in that project and add the new activity to your AndroidManifest.xml.
IMPORTANT: all intents referencing overwritten activities should be created through the util class in the following manner:
Intent intent = ActivityUtil.createIntent(MainActivity.this, SampleActivity.class);
...
startActivity(intent);
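A minimal sketch of what the extended class might look like in the dependent project (the package name below is a placeholder; it must match the package of the library's SampleActivity so that clazz.getName() + "Extended" resolves to it):

package com.example.corelib; // placeholder: must match the library class's package

import android.os.Bundle;

public class SampleActivityExtended extends SampleActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // target-specific behavior goes here
    }
}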

behind the scenes, Eclipse creates and references a jar from the referenced library project.
This is not quite accurate. A library project is referenced as a raw project dependency (a source-based mechanism), not as a compiled jar dependency (a compiled-code based library mechanism). Currently the Android SDK does not support exporting a library project to a self-contained JAR file. The library project must always be compiled/built indirectly, by referencing the library in the dependent application and building that application. When building the dependent project, the compiled source and the raw resources that need to be filtered/merged from the library project are copied and properly included in the final apk file. Note that the Android team started revamping the whole Library Project design (moving it from a source-based mechanism to a compiled-code based library mechanism) in r14, as mentioned in this earlier blog post.
What are some of the best practices for sharing and extending common classes among Android app targets?
The solution given by Android is Library Project.
The solution given by Java is Inheritance and Polymorphism.
Taken together, the best practice IMO is the second option you mentioned in the question:
2. Extend the particular Core class in a target project and override the base (Core) class functions as needed. Then keep a reference to the base-class object in the Core library and instantiate it with an extended class object (from Android library project - How to overwrite a class?)
From my personal experience, I always use an Android Library Project (sometimes together with a regular Java project, for implementing/building a common-lib.jar that contains only POJOs) to manage common code, for instance a SuperActivity or SuperService, and extend/implement the proper classes/interfaces in the dependent project for polymorphism.
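As a rough sketch of that pattern (class names here are illustrative, not taken from any particular project):

import android.app.Activity;
import android.os.Bundle;

// Core library: common behavior plus a hook for target-specific parts
public abstract class SuperActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setupStoreLink(); // each target decides what this does
    }

    protected abstract void setupStoreLink();
}

// Amazon target project: supplies the Amazon-specific behavior
public class AmazonPreferencesActivity extends SuperActivity {
    @Override
    protected void setupStoreLink() {
        // link to the Amazon Appstore page instead of Google Play
    }
}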

Solution based on PoisoneR's solution and Turbo's solution.
public static Class<?> getExtendedClass(Context context, String clsName) {
    // Check for an extended activity in the application's own package
    String pkgName = context.getPackageName();
    Logger.log("pkgName", pkgName);
    String extClassName = pkgName + "." + clsName + "Extended";
    Logger.log("extClassName", extClassName);
    try {
        Class<?> extClass = Class.forName(extClassName);
        return extClass;
    } catch (ClassNotFoundException e) {
        e.printStackTrace();
        // Extended class not found, return null
        return null;
    }
}
The benefits of this are that:
The extended class can be in the project's package, not the library's package. Thanks to Turbo for this part.
By taking a String as an argument instead of a Class object, the method can be used even with ProGuard. getName() is where the problem lies with ProGuard, as it returns something like "a" instead of the original class name, so the original solution would look for aExtended instead of ClassExtended, which does not exist.
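A short usage sketch (SampleActivity stands in for whichever library activity is being overridden, and context is whatever Context is at hand):

// Fall back to the library class when no *Extended class exists in the app's package
Class<?> target = getExtendedClass(context, "SampleActivity");
Intent intent = new Intent(context, target != null ? target : SampleActivity.class);
context.startActivity(intent);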

What about using a callback approach here? (Okay, "callback" is a little bit misleading, but I currently have no other word for it.)
You could declare an interface in every Activity which should/may be extended by the user. This interface would have methods like List<Preference> getPreferences(Activity activity) (pass whatever parameters you need here; I would use an Activity or at least a Context to be future-proof).
This approach could give you what you want, if I have understood it correctly. While I haven't done this before and don't know how other people handle this, I would give it a try and see if it works.
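A minimal sketch of what such an interface could look like (the names are illustrative, not an established API):

import java.util.List;
import android.app.Activity;
import android.preference.Preference;

// Each target project ships its own implementation, e.g. one that omits the
// Google Play link for the Amazon build; the Core library only calls the interface.
public interface PreferencesProvider {
    List<Preference> getPreferences(Activity activity);
}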

Could you please clarify what is different between the Kindle and regular Android?
I think they are the same.
What you need is different resources for the Kindle and other devices; then use the appropriate resource.
For example, I use two links to the store:
<string name="appStore"><a href=http://market.android.com/details?id=com.puzzle.jigsaw>Android Market</a> or <a href=http://www.amazon.com/gp/mas/dl/android?p=com.puzzle.jigsaw>Amazon Appstore</a> <br>http://market.android.com/details?id=com.puzzle.jigsaw <br>href=http://www.amazon.com/gp/mas/dl/android?p=com.puzzle.jigsaw</string>
<string name="appStore_amazon"><a href=http://www.amazon.com/gp/mas/dl/android?p=com.puzzle.jigsaw>Amazon Appstore</a> <br>href=http://www.amazon.com/gp/mas/dl/android?p=com.puzzle.jigsaw</string>
and use appStore for all non-Amazon products and appStore_amazon for the Kindle.
How to determine where you are at run time - that is another question, which has been answered here many times.
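One commonly used runtime check, as a sketch (it assumes the usual Amazon Appstore installer package name; treat that as an assumption, not part of this answer):

String installer = context.getPackageManager()
        .getInstallerPackageName(context.getPackageName());
// "com.amazon.venezia" is the Amazon Appstore client; anything else (or null) is treated as Google Play here
boolean fromAmazon = "com.amazon.venezia".equals(installer);
int storeTextId = fromAmazon ? R.string.appStore_amazon : R.string.appStore;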

I was inspired by PoisoneR's answer to create a utility class that does the same thing for Fragments - overriding a fragment in an Android library. The steps are similar to his answer, so I won't go into great detail, but here is the class:
package com.mysweetapp.utilities;

import android.support.v4.app.Fragment;

public class FragmentUtilities {

    private static Class<?> getFragmentClass(Class<?> clazz) {
        // Check for an extended fragment
        String extClassName = clazz.getName() + "Extended";
        try {
            Class<?> extClass = Class.forName(extClassName);
            return extClass;
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            // Extended class not found, return the base class
            return clazz;
        }
    }

    public static Fragment getFragment(Class<?> clazz) {
        Class<?> fragmentClass = getFragmentClass(clazz);
        Fragment toRet = null;
        try {
            toRet = (Fragment) fragmentClass.newInstance();
        } catch (InstantiationException e) {
            e.printStackTrace();
        } catch (IllegalAccessException e) {
            e.printStackTrace();
        }
        return toRet;
    }
}
Usage:
FragmentUtilities.getFragment(MySpecialFragment.class)
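The returned fragment can then be attached as usual from a FragmentActivity (the container id below is assumed):

Fragment fragment = FragmentUtilities.getFragment(MySpecialFragment.class);
getSupportFragmentManager().beginTransaction()
        .replace(R.id.fragment_container, fragment)
        .commit();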

You can also use an Activity factory if you need to provide extended activities for different build variants and have your library deal with the abstract factory alone. The concrete factory can be set in your build variant's Application class.
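A rough sketch of that idea, with illustrative names (not taken from any particular project):

import android.app.Activity;
import android.app.Application;

// Core library: the library only knows about the abstract factory
public abstract class ActivityFactory {
    private static ActivityFactory instance;

    public static void setInstance(ActivityFactory factory) { instance = factory; }
    public static ActivityFactory getInstance() { return instance; }

    public abstract Class<? extends Activity> preferencesActivity();
}

// A build variant's Application subclass registers its concrete factory
public class AmazonApplication extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        ActivityFactory.setInstance(new ActivityFactory() {
            @Override
            public Class<? extends Activity> preferencesActivity() {
                return AmazonPreferencesActivity.class;
            }
        });
    }
}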

Related

How to keep a jar file external but still use its classes in my Android project?

I need to have a jar file located in a main/assets directory within an Android project. It is important the jar file is located there.
From my main Android project, is there a way to reference this jar file in my code and use its classes?
To be clear I don't want to add the jar to the main project once compiled.
EDIT: I have tried the link below and it seems to load the Class file I've stated, but I'm struggling with how to define constructor arguments for the dynamically loaded Class.
android-custom-class-loading-sample
EDIT2
Nearly there. I've confirmed the class is loaded from my classes.jar. I'm stuck instantiating it though.
On the licenseValidatorClazz.getConstructor line I get the error below. I'm guessing I'm missing something from my Interface file?
java.lang.NoSuchMethodException: [interface com.google.android.vending.licensing.Policy, interface com.google.android.vending.licensing.DeviceLimiter, interface com.google.android.vending.licensing.LicenseCheckerCallback, int, class java.lang.String, class java.lang.String]
public Class licenseValidatorClazz = null;
public LicenseValidator validator;
...
// Initialize the class loader with the secondary dex file.
DexClassLoader cl = new DexClassLoader(dexInternalStoragePath.getAbsolutePath(),
        optimizedDexOutputPath.getAbsolutePath(),
        null,
        mContext.getClassLoader());
try {
    // Load the library class from the class loader.
    licenseValidatorClazz = cl.loadClass("com.google.android.vending.licensing.LicenseValidator");
    validator = (LicenseValidator) licenseValidatorClazz
            .getConstructor(Policy.class, DeviceLimiter.class, LicenseCheckerCallback.class,
                    int.class, String.class, String.class)
            .newInstance(ddd, new NullDeviceLimiter(), callback, generateNonce(), mPackageName, mVersionCode);
} catch (Exception exception) {
    // Handle exception gracefully here.
    exception.printStackTrace();
}
I have an Interface which contains the functions to pass to the loaded class.
public interface LicenseValidator {
    public LicenseCheckerCallback getCallback();
    public int getNonce();
    public String getPackageName();
    public void verify(PublicKey publicKey, int responseCode, String signedData, String signature);
    public void handleResponse(int response, ResponseData rawData);
    public void handleApplicationError(int code);
    public void handleInvalidResponse();
}
To use an external jar with your application at runtime, it needs to be in Dalvik format, since normal jars cannot run under the Dalvik VM.
Convert your files using the dx tool.
Using the aapt command, add the resulting classes.dex to your jar file.
Now this jar, which contains files in Dalvik format, can be loaded into your project.
Here is a post which explains the procedure. The steps are:
1. Make a copy of your JAR file in the private internal storage of your application.
2. Using the dx tool inside the Android SDK, generate a classes.dex file from the JAR file. The dx tool is located at /android-sdks/build-tools/19.0.1 (this file is needed by the Dalvik VM; a plain jar cannot be read by the Dalvik VM).
3. Using the aapt tool, which is in the same location, add the classes.dex to the JAR file.
4. This JAR file can now be loaded dynamically using DexClassLoader.
If you are making a JAR from one of your own libraries, you have to repeat steps 1-4 every time the library source code changes, so you can automate them with a shell script (on Mac/Linux/Ubuntu) or a batch script (on Windows). You can refer to this link to understand how to write shell scripts.
Note: one situation where this method is needed is when it is impossible to add the JAR files directly to the build path of the core project and they have to be loaded dynamically at run time. In normal cases the JAR files can simply be added to the build path.
Please check this link for the detailed code and implementation.
How to load a jar file at runtime
Android: How to dynamically load classes from a JAR file?
Hope this helps!!
You should try out the Services API - java.util.ServiceLoader
You define a service interface and its implementations in your jar.
package com.my.project;
public interface MyService { ... }
public class MyServiceBarImpl implements MyService { ... }
public class MyServiceFooImpl implements MyService { ... }
Then you define the services contained within the jar file in the META-INF/services/ directory. For instance, in the file 'META-INF/services/com.my.project.MyService', you list the provider classes.
# Known MyService providers.
com.my.project.MyServiceBarImpl # The original implementation for handling "bar"s.
com.my.project.MyServiceFooImpl # A later implementation for "foo"s.
Then, in your main codebase, you can instantiate a MyService instance with the ServiceLoader:
for (MyService service : ServiceLoader.load(MyService.class)) {
    // Perform some test to determine which is the right MyServiceImpl
    // and then do something with the MyService instance
}
These examples are taken more-or-less straight from the API, although I've changed the package names to make them slightly less annoying to read.

How to implement a shared interface pulled from a WAR

I have a web service we'll call service.war. It implements an interface we'll call ServicePluginInterface. During the startup of service.war, it reads in environment variables and uses them to search for a jar (MyPlugin.jar). When it finds that jar, it then uses a second environment variable to load the plugin within the jar. The class that it loads looks like this:
public class MyPlugin implements ServicePluginInterface {...}
The servlet attempts to load the plugin using code like:
try {
    if (pluginClass == null) {
        plugin = null;
    } else {
        ZipClassLoader zipLoader = new ZipClassLoader(Main.class.getClassLoader(), pluginJar);
        plugin = (ServicePluginInterface) zipLoader.loadClass(pluginClass).newInstance();
        plugin.getAccount(null, null);
    }
} catch (Exception e) {
    ...
}
The trick is that I don't have source or a jar for ServicePluginInterface. Not wanting to give up so easily, I pulled the class files out of the service.war file. By using those class files as dependencies, I was able to build MyPlugin without compiler warnings. However, when actually executed by Tomcat, the section of code above generates a runtime exception:
java.lang.ClassCastException: com.whatever.MyPlugin cannot be cast to com.whomever.ServicePluginInterface
As a second point of reference, I am also able to construct a synthetic class loader (a separate Java executable that uses the same class loading mechanism). Again, since I do not have the original source to ServicePluginInterface, I used the class files from the WAR. This second, synthetic loader, or faux-servlet if you will, CAN load MyPlugin just fine. So I would postulate that the Tomcat JVM is detecting some sort of difference between the classes found inside the WAR and the extracted class files. However, since all I did to extract the class files was to open the WAR as a zip and copy them out, it is hard to imagine what that difference might be.
Javier made a helpful suggestion about removing the definition of ServicePluginInterface; the problem with that solution is that the ZipClassLoader the servlet uses to load the plugin out of the jar overrides the ClassLoader findClass function to pull the class out of the JAR, like so:
protected Class<?> findClass(String name) throws ClassNotFoundException {
    ZipEntry entry = this.myFile.getEntry(name.replace('.', '/') + ".class");
    if (entry == null) {
        throw new ClassNotFoundException(name);
    }
    ...
}
The class ZipClassLoader then recursively loads all parent objects and interfaces from the jar. This means that if the plugin jar does not contain the definition for ServicePluginInterface, it will fail.
Classes defined by different class loaders are different:
At run time, several reference types with the same binary name may be
loaded simultaneously by different class loaders. These types may or
may not represent the same type declaration. Even if two such types do
represent the same type declaration, they are considered distinct. JLS
In that case zipLoader returns an instance of MyPlugin that implements the other ServicePluginInterface (is it loaded from the zip too?):
(ServicePluginInterface)zipLoader.loadClass(pluginClass).newInstance();
It seems that the application server already has a definition of ServicePluginInterface, so you don't need to redeploy it. It should be enough to add the required files (ServicePluginInterface, etc.) as non-deployed dependencies of your project.
Another approach is to live with the fact and access the methods of ServicePluginInterface via reflection (use the Class object returned by zipLoader instead of ServicePluginInterface.class).
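A sketch of the reflection route, reusing the names from the question's snippet (the getAccount parameter types are an assumption; use whatever the real interface declares):

import java.lang.reflect.Method;

try {
    Class<?> pluginClazz = zipLoader.loadClass(pluginClass);
    Object plugin = pluginClazz.newInstance();
    // Look the method up on the loaded class instead of casting to ServicePluginInterface
    Method getAccount = pluginClazz.getMethod("getAccount", String.class, String.class);
    Object account = getAccount.invoke(plugin, null, null);
} catch (Exception e) {
    // handle reflection failures here
}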

Dynamic ClassLoader

I have a large desktop Java application and I want to allow other developers to develop plugins for it. Plugins will be jars placed in a specified directory. They will not be on the classpath at startup; I will load and deploy them at runtime.
The complication is that some plugins will have dependencies on each other, as well as on the core application, so I cannot load each plugin/jar in its own URLClassLoader. Therefore I want to load all plugins into one URLClassLoader. Furthermore, some plugins may fail to initialise for various reasons, and at the end of the day I only want a ClassLoader that knows about the successfully loaded plugins. The reasons are quite bizarre and relate to some legacy code that uses reflection to instantiate classes. That reflection needs to fail for classes defined inside a plugin jar whose plugin failed to initialise.
Without this requirement, the solution would be:
Collect the jar URLs and build a ClassLoader based on them
Try to initialise a plugin class from each jar (defined in config in the manifest)
Now, the ClassLoader here would be passed to the legacy system for it to use for its reflection stuff. However, it's my understanding that it would still be able to instantiate classes from plugin jars whose plugin failed to initialise (since the jar will still be in the URL[] of the ClassLoader). Hence this breaks my requirement above.
The only solution I have come up with so far is to create a custom URLClassLoader as follows (simply to allow access to findClass()):
public class CustomURLClassLoader extends URLClassLoader {

    public CustomURLClassLoader(final URL[] urls, final ClassLoader parent) {
        super(urls, parent);
    }

    @Override
    protected Class<?> findClass(final String name) throws ClassNotFoundException {
        return super.findClass(name);
    }
}
And then I made another custom ClassLoader that essentially knows about multiple child ClassLoaders:
public class MultiURLClassLoader extends ClassLoader {

    private Set<CustomURLClassLoader> loaders = new HashSet<CustomURLClassLoader>();

    public MultiURLClassLoader(final ClassLoader parent) {
        super(parent);
    }

    @Override
    protected Class<?> findClass(final String name) throws ClassNotFoundException {
        Iterator<CustomURLClassLoader> loadersIter = loaders.iterator();
        boolean first = true;
        while (first || loadersIter.hasNext()) {
            try {
                if (first) {
                    return super.findClass(name);
                } else {
                    return loadersIter.next().findClass(name);
                }
            } catch (ClassNotFoundException e) {
                first = false;
            }
        }
        throw new ClassNotFoundException(name);
    }

    public void addClassLoader(final CustomURLClassLoader classLoader) {
        loaders.add(classLoader);
    }

    public void removeClassLoader(final CustomURLClassLoader classLoader) {
        loaders.remove(classLoader);
    }
}
Then my plugin loading algorithm will be something like:
MultiURLClassLoader multiURLClassLoader = new MultiURLClassLoader(ClassLoader.getSystemClassLoader());
for (File pluginJar : new File("plugindir").listFiles()) {
    CustomURLClassLoader classLoader = null;
    try {
        URL pluginURL = pluginJar.toURI().toURL();
        final URL[] pluginJarUrl = new URL[] { pluginURL };
        classLoader = new CustomURLClassLoader(pluginJarUrl, multiURLClassLoader);
        multiURLClassLoader.addClassLoader(classLoader);
        Class<?> clazz = Class.forName("some.PluginClass", false, multiURLClassLoader);
        Constructor<?> ctor = clazz.getConstructor();
        SomePluginInterface plugin = (SomePluginInterface) ctor.newInstance();
        plugin.initialise();
    } catch (SomePluginInitialiseException e) {
        multiURLClassLoader.removeClassLoader(classLoader);
    }
}
Then I can pass the multiURLClassLoader instance onto the legacy system and it will only be able to find classes (via reflection) whose plugin successfully loaded.
I've done some basic testing and it seems to work as I'd like so far, but I would very much like someone's opinion on whether this seems like a good idea or not. I have never played this much with ClassLoaders before and I want to avoid getting in too deep before it's too late.
Thanks!
The problem I see is that if you don't know in advance which plugin depends on which, it's very hard to do anything reasonable, to debug problems, to isolate non-functional or badly behaving plugins, etc.
Therefore I'd suggest another option: add another field to each plugin's manifest which says what other plugins it depends on, perhaps just a list of the other plugin JARs it needs to function. (The core application classes would always be available.) I believe this would make the design much more robust and simplify many things; a sketch of reading such a field follows the design options below.
Then, you could choose from different designs, for example:
For each plugin you could create a separate ClassLoader that would load just the JARs it needs. Probably the most robust solution. But I see a drawback: plugins that act as dependencies for many other ones will be loaded repeatedly in different class-loaders. It depends on circumstances (plugin count, JARs size, ...) if this could be a problem or not, it could even be an advantage.
You could have one big ClassLoader for all plugins, as you suggest, but you could ask it for plugin classes in the order of their dependencies. The ones that don't depend on anything first, then the ones that depend on those first ones etc. If some plugin class fails to load/initialize, you could immediately discard all plugins that depend on it.
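As a sketch of the manifest-field idea above (the attribute name Plugin-Dependencies is made up; any custom key would do):

import java.io.File;
import java.io.IOException;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.jar.JarFile;
import java.util.jar.Manifest;

// Reads a comma-separated list of plugin jars that this plugin needs
static List<String> readDependencies(File pluginJar) throws IOException {
    JarFile jar = new JarFile(pluginJar);
    try {
        Manifest mf = jar.getManifest();
        String deps = (mf == null) ? null : mf.getMainAttributes().getValue("Plugin-Dependencies");
        return (deps == null || deps.trim().isEmpty())
                ? Collections.<String>emptyList()
                : Arrays.asList(deps.trim().split("\\s*,\\s*"));
    } finally {
        jar.close();
    }
}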
Are you looking for something like the OSGi approach?
You could do something like Petr Pudlák has said; however, you should take into account the fact that one of those solutions can create cyclic dependencies...

How to access DLL methods in Java code using JNA?

By running System.loadLibrary("myAPI"), I verified that the DLL file "myAPI.dll" can be successfully loaded into my Eclipse Java project. Now I need to call methods specified inside this DLL file from my Java code. To do this, I added JNA to my Java project. Then I wrote the below-given code snippet that should be able to get instances of classes IProject and ProjectFactory (specified in the DLL file).
I still don't understand how to properly implement this with JNA. I checked different threads, e.g. this one, but the ones I checked don't provide an answer. Any help is highly appreciated. Thanks.
import com.sun.jna.Library;
import com.sun.jna.Native;

public class MyClass {

    public interface myAPI extends Library {
        //...
    }

    void LoadProj() {
        myAPI api = (myAPI) Native.loadLibrary("myAPI", myAPI.class);
        String fileName = "xxx.sp";
        IProject project; // this is wrong but shows what I am trying to do
        try {
            project = ProjectFactory.LoadProject(fileName);
        } catch (Exception ex) {
            MessageBox.Show(this, ex.Message, "Load failure");
        }
    }
}
Not sure what problem you are facing, but as a best practice your myAPI interface should declare all the native methods verbatim, with appropriate parameter mappings. I don't see any methods inside your interface.
Please check out this link as well as the link mentioned above by @Perception.
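For illustration, a sketch of how the interface could declare methods; the function names and signatures below are placeholders and must match the DLL's actual C-style exports (JNA can only bind plain C exports, not C++/.NET classes such as IProject or ProjectFactory):

import com.sun.jna.Library;
import com.sun.jna.Native;

public interface MyAPI extends Library {
    MyAPI INSTANCE = (MyAPI) Native.loadLibrary("myAPI", MyAPI.class);

    // hypothetical exports, for illustration only
    int openProject(String fileName);
    void closeProject(int handle);
}

// Usage: int handle = MyAPI.INSTANCE.openProject("xxx.sp");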
If there are no Java classes or Java source hidden inside this DLL (which would be ... strange), then it will never work this way. You can't instantiate C# classes or use C# interfaces from Java. MessageBox.Show(...) isn't Java either; it is Windows Forms code.

Java (Eclipse) - Conditional compilation

I have a Java project that is referenced in a J2ME project and in an Android project.
In this project I would like to use conditional compilation.
Something like...
//#if android
...
//#endif
//#if j2me
...
//#endif
I have been reading about this but I have not found anything useful yet.
You could use Antenna (there is a plugin for Eclipse, and you can use it with the Ant build system).
I'm using it in my projects in a way you've described and it works perfectly :)
EDIT: here is an example related to @WhiteFang34's solution, which is a good way to go:
In your core project:
//base class Base.java
public abstract class Base {

    public static Base getInstance() {
        //#ifdef ANDROID
        return new AndroidBaseImpl();
        //#elif J2ME
        return new J2MEBaseImpl();
        //#endif
    }

    public abstract void doSomething();
}

//Android specific implementation AndroidBaseImpl.java
//#ifdef ANDROID
public class AndroidBaseImpl extends Base {
    public void doSomething() {
        // Android code
    }
}
//#endif

//J2ME specific implementation J2MEBaseImpl.java
//#ifdef J2ME
public class J2MEBaseImpl extends Base {
    public void doSomething() {
        // J2ME code
    }
}
//#endif
In your project that uses the core project:
public class App {
    public void something() {
        // Depends on the preprocessor symbol you used to build the project
        Base.getInstance().doSomething();
    }
}
Then, if you want to build for Android, you just define the ANDROID preprocessor symbol, or J2ME if you want to build for the J2ME platform...
Anyway, I hope it helps :)
Perhaps you should consider creating interfaces around the logic that's specific to a profile (J2ME, Android or other in the future). Then create concrete implementations of your interface for each profile. Any common parts you could split out into an abstract base class for both implementations to extend. This way your logic for each profile is nicely separated for different concerns. For each profile just build the appropriate set of classes (you could separate them by package for example). It'll be easier to maintain, debug, test and understand in the long run.
The Eclipse MTJ project provides preprocessing support, as documented. This support was mainly aimed at tackling fragmentation problems on Java ME. I have not tested the preprocessing support together with the Android tooling, but it may just work.
