My program gets information from an external source (can be a file, a database, or anything else I might decide upon in the future).
I want to define an interface with all my data needs, and classes that implement it (e.g. a class to get the data from a file, another for DB, etc...).
I want the rest of my project not to care where the data comes from, and not to need to create any object to get the data; for example, it should just call "DataSource.getSomething();"
For that I need DataSource to contain a variable of the type of the interface and initialize it with one of the concrete implementations, and expose all of its methods (that come from the interface) as static methods.
So, lets say the interface name is K, and the concrete implementations are A,B,C.
The way I do it today is:
public class DataSource {
    private static K myVar = new B();

    // For **every** method in K I do something like this:
    public static String getSomething() {
        return myVar.getSomething();
    }
    ...
}
This is very bad, since I need to copy all of the interface's methods and make them static just to delegate them to myVar, among other obvious problems.
What is the correct way to do it? (maybe there is a design pattern for it?)
**Note - since this will be the backbone of many many other projects and I will use these calls from thousands (if not tens of thousands) code lines, I insist on keeping it simple like "DataSource.getSomething();", I do not want anything like "DataSource.getInstance().getSomething();" **
Edit :
It was suggested here that I use a DI framework like Guice. Does this mean I will need to add the DI code at every entry point (i.e. every "main" method) in all my projects, or is there a way to do it once for all projects?
The classes using your data source should access it via an interface, and the correct instance should be provided to the class at construction time.
So first of all make DataSource an interface:
public interface DataSource {
String getSomething();
}
Now a concrete implementation:
public class B implements DataSource {
    public String getSomething() {
        //read a file, call a database, whatever...
        return "..."; // placeholder result
    }
}
And then your calling class looks like this:
public class MyThingThatNeedsData {
    private DataSource ds;

    public MyThingThatNeedsData(DataSource ds) {
        this.ds = ds;
    }

    public void doSomethingRequiringData() {
        String something = ds.getSomething();
        //do whatever with the data
    }
}
Somewhere else in your code you can instantiate this class:
public class Program {
    public static void main(String[] args) {
        DataSource ds = new B(); //Here we've picked the concrete implementation
        MyThingThatNeedsData thing = new MyThingThatNeedsData(ds); //And we pass it in
        thing.doSomethingRequiringData();
    }
}
You can do the last step using a Dependency Injection framework like Spring or Guice if you want to get fancy.
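Regarding the edit about Guice: yes, the wiring happens once per entry point, but it is only a few lines. A minimal sketch, assuming the DataSource, B, and MyThingThatNeedsData classes above (the module name is made up, and MyThingThatNeedsData's constructor would need an @Inject annotation for this to work):

// DataSourceModule.java
import com.google.inject.AbstractModule;

// Binds the DataSource interface to the concrete implementation B.
public class DataSourceModule extends AbstractModule {
    @Override
    protected void configure() {
        bind(DataSource.class).to(B.class);
    }
}

// Program.java
import com.google.inject.Guice;
import com.google.inject.Injector;

public class Program {
    public static void main(String[] args) {
        // Done once per entry point; everything downstream gets its DataSource injected.
        Injector injector = Guice.createInjector(new DataSourceModule());
        MyThingThatNeedsData thing = injector.getInstance(MyThingThatNeedsData.class);
        thing.doSomethingRequiringData();
    }
}

Swapping B for a file-based or test implementation then means changing one bind(...) line rather than touching thousands of call sites.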
Bonus points: In your unit tests you can provide a mock/stub implementation of DataSource instead and your client class will be none the wiser!
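For example, a hand-rolled stub (the name and canned value are purely illustrative) could look like this:

// Used only in tests; returns canned data instead of touching a file or database.
public class StubDataSource implements DataSource {
    public String getSomething() {
        return "canned test data";
    }
}

// In a test:
MyThingThatNeedsData thing = new MyThingThatNeedsData(new StubDataSource());
thing.doSomethingRequiringData(); // exercises the logic without any real I/O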
In my answer I want to focus on one important aspect of your question; you wrote:
Note - I insist on keeping it simple like "DataSource.getSomething();", I do not want anything like "DataSource.getInstance().getSomething();"
The thing is: simplicity is not measured in the number of characters. Simplicity comes out of good design, and good design comes out of following best practices.
In other words: if you think that DataSource.getSomething() is "easier" than something that uses (for example) dependency injection to "magically" provide you with an object that implements a certain interface, then you are mistaken!
It is the other way round: these are separate concerns. On the one hand, you declare an interface that describes the functionality you need. On the other hand, you have client code that needs an object of that interface. That is all you should be focusing on. The step of "creating" that object and making it available to your code might look more complicated than just calling a static method, but I guarantee you: following the answer from Paolo will make your product better.
It is sometimes easy to do the wrong thing!
EDIT: one pattern that I am using:
interface SomeFunc {
    void foo();
}

class SomeFuncImpl implements SomeFunc {
    ...
}

enum SomeFuncProvider implements SomeFunc {
    INSTANCE;

    private final SomeFunc delegatee = new SomeFuncImpl();

    @Override
    public void foo() { delegatee.foo(); }
}
This pattern allows you to write client code like
class Client {
    private final SomeFunc func;

    Client() { this(SomeFuncProvider.INSTANCE); }

    Client(SomeFunc func) { this.func = func; }
}
Meaning:
There is a nice (correct, singleton-based) way of accessing an object that gives you the functionality
The impl class is completely unit-testable
Client code uses dependency injection, and is therefore also fully unit-testable
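To illustrate the last point, a test could pass its own implementation through the second constructor (FakeSomeFunc is a made-up name):

class FakeSomeFunc implements SomeFunc {
    public void foo() {
        // record the call, return canned data, etc.
    }
}

// Production code:
Client production = new Client();                  // uses SomeFuncProvider.INSTANCE
// Test code:
Client underTest = new Client(new FakeSomeFunc()); // no real implementation involved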
My program gets information from an external source (can be a file, a database, or anything else I might decide upon in the future).
This is the thinking behind patterns such as Data Access Object (DAO for short) and the Repository pattern. The difference between them is blurry; both are about abstracting a data source away behind a uniform interface. A common approach is to have one DAO/Repository class per business or database entity. It's up to you whether you want them all to behave similarly (e.g. CRUD methods) or to be specific, with special queries and such. In Java EE these patterns are most often implemented using the Java Persistence API (JPA for short).
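As a rough sketch of the one-DAO/Repository-per-entity idea (Customer is a hypothetical entity with a getId() method; the in-memory implementation just stands in for a JPA- or file-based one):

// CustomerRepository.java
import java.util.List;

// One repository per business entity; callers never know where the data lives.
public interface CustomerRepository {
    Customer findById(long id);
    List<Customer> findAll();
    void save(Customer customer);
}

// InMemoryCustomerRepository.java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// One implementation per data source; a JpaCustomerRepository or
// FileCustomerRepository would implement the same interface.
public class InMemoryCustomerRepository implements CustomerRepository {
    private final Map<Long, Customer> store = new HashMap<Long, Customer>();

    public Customer findById(long id) { return store.get(id); }
    public List<Customer> findAll() { return new ArrayList<Customer>(store.values()); }
    public void save(Customer customer) { store.put(customer.getId(), customer); }
}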
For that I need DataSource to contain a variable of the type of the
interface and initialize it with one of the concrete implementations,
For this initialization you don't want to know or define the type in the using classes. This is where Inversion of Control (IoC for short) comes into play. A simple way to achieve this is putting all dependencies into constructor parameters, but that way you only move the problem one level up. In the Java context you'll often hear the term Contexts and Dependency Injection (CDI for short), which is basically an implementation of the IoC idea. Specifically, Java EE provides the CDI package, which enables you to inject instances of classes based on their implemented interfaces. You basically do not call any constructors anymore when using CDI effectively; you only declare your class's dependencies using annotations.
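A minimal sketch of what that looks like with CDI, reusing the hypothetical CustomerRepository sketched above (the container chooses the implementation; this class never calls a constructor or a factory):

import javax.enterprise.context.ApplicationScoped;
import javax.inject.Inject;

@ApplicationScoped
public class CustomerService {

    // Injected by the CDI container based on the implemented interface.
    @Inject
    private CustomerRepository repository;

    public Customer loadCustomer(long id) {
        return repository.findById(id);
    }
}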
and expose all of its methods (that come from the interface)
This is a misconception. You want it to expose the interface-defined methods ONLY. All other public methods on the class are irrelevant and only meant for testing, or for rare cases where you want implementation-specific behavior.
as static methods.
Having stateful classes with only static methods is an antipattern. Since your data source classes must contain a reference to the underlying data source, they have state; that is, each class needs a private field. This makes usage through static methods a poor fit. Additionally, static classes are very hard to test and do not behave nicely in multi-threaded environments.
My problem is that I am trying to interop with a Java app whose jar file contains obfuscated byte code. The app releases updates every month or so, and when they do a release, most of the class and method names change.
Thus, the method proposed here:
http://rickyclarkson.blogspot.com/2006/07/duck-typing-in-java-and-no-reflection.html
or
Simulating duck typing in Java
won't work in my solution, because I would have to update the interfaces by hand each time.
What I do have, however, is an automatically generated (for the most part) mapping from deobfuscated class name <-> obfuscated class name, obtained by parsing the class files for debug logging calls of the form:
logger.log(severity, "ClassName", "MethodName() has some error")
What I generate is something like this:
public final static String MyRealName = "someObfuscatedName";
public final static String MyRealName_myCoolMethod = "someMethodName";
I have a fairly decent solution for interacting with objects of "MyRealName" via the reflection API and simple proxy objects that implement a subset of the functionality of the object being proxied. Somewhat like this:
class MyRealName {
    private Object backingObject;

    public MyRealName(Object o) { backingObject = o; }

    public Object myCoolMethod() {
        return getFieldValue(backingObject, DeobNames.MyRealName_myCoolMethod);
    }
}
However, the problem arises when I want to test my code without the obfuscated app running - startup and setup can take several minutes, whereas I want test verification to take a couple of seconds.
What I am looking for is some way of easily adapting my tests to accommodate the frequently changing class names that my code depends upon.
I was intrigued by the power of tools like JMockit, in that they are able to automatically generate mock objects for me. I'm hoping to have some thin layer that will still let the majority of my mocks be generated quite easily, versus having to manually rewrite everything with every update.
If you are running the code from Java, I don't think this is possible.
However if you are running the code with Groovy then you can use Groovy's methodMissing
See: http://groovy.codehaus.org/Using+methodMissing+and+propertyMissing
I've never written an annotation in Java.
I've got a simple Java class for performance measurement. I call it PerfLog. Here's an example of its use:
public class MyClassToTest {
public String MyMethod() {
PerfLog p = new PerfLog("MyClassToTest", "MyMethod");
try {
// All the code that I want to time.
return whatever;
} finally {
p.stop();
}
}
}
When p.stop() is called, a line will be written to the log file:
2010/10/29T14:30:00.00 MyClassToTest MyMethod elapsed time: 00:00:00.0105
Can PerfLog be rewritten as an Annotation so that I can write this instead?
public class MyClassToTest {
@PerfLog
public String MyMethod() {
// All the code I want to time.
return whatever;
}
}
It would seem to be a good candidate for annotating: It's easy to add or take away the annotation; a production build can leave out PerfLog entirely without having to remove the annotations from the source code; the annotation processor can get the class and method names.
Is this easy to do? Is there a recipe somewhere that I can follow?
It has to be Java 5 so I know I have to use apt somewhere.
There is no trivial way to do this using standard Java tools. The path of least resistance would almost certainly be to use an AOP-style library like Google Guice or Spring or AspectJ. Any home-grown attempt to solve this problem will essentially end up doing what AOP libraries would already do for you.
Consider using AOP - Spring AOP, for example, supports annotations to intercept method invocations and run custom code.
In case that sounds interesting, the Spring AOP docs are here : http://static.springsource.org/spring/docs/2.5.x/reference/aop.html
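A rough sketch of such an aspect, assuming a runtime-retained @PerfLog annotation in a package of your choosing (com.example here) and Spring AOP or AspectJ on the classpath; the timing/logging line is simplified compared to the PerfLog class in the question:

// PerfLog.java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// The marker annotation; RUNTIME retention so the aspect can see it.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface PerfLog {
}

// PerfLogAspect.java
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;

// Wraps every @PerfLog-annotated method in timing code.
@Aspect
public class PerfLogAspect {

    @Around("@annotation(com.example.PerfLog)")
    public Object time(ProceedingJoinPoint pjp) throws Throwable {
        long start = System.nanoTime();
        try {
            return pjp.proceed(); // runs the annotated method
        } finally {
            long elapsedNanos = System.nanoTime() - start;
            System.out.printf("%s %s elapsed time: %d ns%n",
                    pjp.getTarget().getClass().getSimpleName(),
                    pjp.getSignature().getName(), elapsedNanos);
        }
    }
}

A production build can simply leave the aspect out; the annotation by itself then has no effect, which matches the "leave it out of production" requirement in the question.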
You can definitely write the annotation parser and its runtime implementation on your own, but reinventing will only be error prone and inefficient when compared to industry solutions.
If you insist on implementing this on your own (without AOP, or AOP with Spring), here is what I can suggest (it may not be the best method):
Create your beans via a custom FactoryBean always.
In the custom FactoryBean implementation, query the class for methods and check if they are annotated.
If yes, instead of returning the instance of the class itself, return a proxy over the instance.
In the invoke of this proxy, wrap the call to the actual instance's method with new PerfLog() and p.stop()
It's effectively what AOP would (more powerfully) do for you. However, take note that final classes, static methods, classes with no interfaces, etc. will still be a problem in this case (a different ball game).
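For a rough idea of the do-it-yourself route (names are illustrative; this assumes the beans are used through interfaces and that a runtime-retained @PerfLog annotation already exists):

import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

public final class PerfLogProxy {

    // Wraps an interface-based object so calls to @PerfLog-annotated methods are timed.
    @SuppressWarnings("unchecked")
    public static <T> T wrap(final T target, Class<T> iface) {
        return (T) Proxy.newProxyInstance(
                iface.getClassLoader(),
                new Class<?>[] { iface },
                new InvocationHandler() {
                    public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
                        // The annotation usually sits on the implementation class's method.
                        Method impl = target.getClass()
                                .getMethod(method.getName(), method.getParameterTypes());
                        if (!impl.isAnnotationPresent(PerfLog.class)) {
                            return method.invoke(target, args);
                        }
                        long start = System.nanoTime();
                        try {
                            return method.invoke(target, args);
                        } finally {
                            System.out.printf("%s %s elapsed time: %d ns%n",
                                    target.getClass().getSimpleName(), method.getName(),
                                    System.nanoTime() - start);
                        }
                    }
                });
    }
}

In the FactoryBean approach described above, getObject() would then return PerfLogProxy.wrap(realInstance, TheInterface.class) instead of the raw instance.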
I want to make my program initialization a bit "smarter".
I have several classes which represent commands. All these classes are immutable (i.e. creating only one instance of each should be enough for the whole application). All these classes implement Command interface.
I think the fact that these classes are placed in the same jar as the class with the main method (maybe even in one predefined package), and that they all implement one known interface, should give enough information to automate the creation of their instances.
How can I implement this feature? (Obviously, it's something tightly connected with reflection, and maybe with Java class-loading mechanisms, but I'm not an expert in these fields.)
I want to have something like this:
public static void init() {
    ...
    Map<String, Command> commands = Maps.newHashMap();
    for (Class<? extends Command> clazz : findCommandImplementationsInThisJarFile()) {
        // some try/catch stuff is omitted
        Command command = clazz.newInstance();
        commands.put(command.getName(), command);
    }
    ...
}
How to implement the tricky method findCommandImplementationsInThisJarFile()?
You need extcos:
select(javaClasses()).from("your.package").returning(allAnnotatedWith(YourAnnotation.class))
Also supports thoseImplementing and thoseExtending etc...
See: http://sourceforge.net/projects/extcos/
Spring also contains similar code for its component scanning, which is easily adapted.
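A hedged sketch of that Spring-based variant, assuming the Command implementations sit under a known base package and have public no-arg constructors:

import java.util.HashMap;
import java.util.Map;
import java.util.Set;

import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.core.type.filter.AssignableTypeFilter;

public final class CommandScanner {

    // Finds every concrete Command implementation under basePackage and instantiates it.
    public static Map<String, Command> findCommandImplementations(String basePackage) throws Exception {
        ClassPathScanningCandidateComponentProvider scanner =
                new ClassPathScanningCandidateComponentProvider(false);
        scanner.addIncludeFilter(new AssignableTypeFilter(Command.class));

        Map<String, Command> commands = new HashMap<String, Command>();
        Set<BeanDefinition> candidates = scanner.findCandidateComponents(basePackage);
        for (BeanDefinition candidate : candidates) {
            Class<?> clazz = Class.forName(candidate.getBeanClassName());
            Command command = (Command) clazz.newInstance();
            commands.put(command.getName(), command);
        }
        return commands;
    }
}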
I'm writing a library that needs to run some code only if a particular other library is included. Since this code is scattered all around the project, it would be nice if users didn't have to comment/uncomment everything themselves.
In C, this would be easy enough with a #define in a header, and then code blocks surrounded with #ifdefs. Of course, Java doesn't have the C preprocessor...
To clarify - several external libraries will be distributed with mine. I do not want to have to include them all to minimize my executable size. If a developer does include a library, I need to be able to use it, and if not, then it can just be ignored.
What is the best way to do this in Java?
There's no way to do what you want from within Java. You could preprocess the Java source files, but that's outside the scope of Java.
Can you not abstract the differences and then vary the implementation?
Based on your clarification, it sounds like you might be able to create a factory method that will return either an object from one of the external libraries or a "stub" class whose functions will do what you would have done in the "not-available" conditional code.
As others have said, there is no such thing as #define/#ifdef in Java. But regarding your problem of having optional external libraries, which you would use if present and ignore if not, using proxy classes might be an option (if the library interfaces aren't too big).
I had to do this once for the Mac OS X specific extensions for AWT/Swing (found in com.apple.eawt.*). The classes are, of course, only on the class-path if the application is running on Mac OS. To be able to use them but still allow the same app to be used on other platforms, I wrote simple proxy classes, which just offered the same methods as the original EAWT classes. Internally, the proxies used some reflection to determine if the real classes were on the class-path and would pass through all method calls. By using the java.lang.reflect.Proxy class, you can even create and pass around objects of a type defined in the external library, without having it available at compile time.
For example, the proxy for com.apple.eawt.ApplicationListener looked like this:
public abstract class ApplicationListener {
private static Class<?> nativeClass;
static Class<?> getNativeClass() {
try {
if (ApplicationListener.nativeClass == null) {
ApplicationListener.nativeClass = Class.forName("com.apple.eawt.ApplicationListener");
}
return ApplicationListener.nativeClass;
} catch (ClassNotFoundException ex) {
throw new RuntimeException("This system does not support the Apple EAWT!", ex);
}
}
private Object nativeObject;
public ApplicationListener() {
Class<?> nativeClass = ApplicationListener.getNativeClass();
this.nativeObject = Proxy.newProxyInstance(nativeClass.getClassLoader(), new Class<?>[] {
nativeClass
}, new InvocationHandler() {
public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
String methodName = method.getName();
ApplicationEvent event = new ApplicationEvent(args[0]);
if (methodName.equals("handleReOpenApplication")) {
ApplicationListener.this.handleReOpenApplication(event);
} else if (methodName.equals("handleQuit")) {
ApplicationListener.this.handleQuit(event);
} else if (methodName.equals("handlePrintFile")) {
ApplicationListener.this.handlePrintFile(event);
} else if (methodName.equals("handlePreferences")) {
ApplicationListener.this.handlePreferences(event);
} else if (methodName.equals("handleOpenFile")) {
ApplicationListener.this.handleOpenFile(event);
} else if (methodName.equals("handleOpenApplication")) {
ApplicationListener.this.handleOpenApplication(event);
} else if (methodName.equals("handleAbout")) {
ApplicationListener.this.handleAbout(event);
}
return null;
}
});
}
Object getNativeObject() {
return this.nativeObject;
}
// followed by abstract definitions of all handle...(ApplicationEvent) methods
}
All this only makes sense, if you need just a few classes from an external library, because you have to do everything via reflection at runtime. For larger libraries, you probably would need some way to automate the generation of the proxies. But then, if you really are that dependent on a large external library, you should just require it at compile time.
Comment by Peter Lawrey (sorry to edit, it's very hard to put code into a comment):
The following example is generic by method, so you don't need to know all the methods involved. You could also make it generic by class, so you only need one InvocationHandler class coded to cover all cases.
public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
    String methodName = method.getName();
    ApplicationEvent event = new ApplicationEvent(args[0]);
    Method handler = ApplicationListener.class.getMethod(methodName, ApplicationEvent.class);
    return handler.invoke(ApplicationListener.this, event);
}
In Java one could use a variety of approaches to achieve the same result:
Dependency Injection
Annotations
Reflection
The Java way is to put behaviour that varies into a set of separate classes abstracted through an interface, then plug in the required class at run time. See also:
Factory pattern
Builder pattern
Strategy pattern
Well, Java syntax is close enough to C that you could simply use the C preprocessor, which is usually shipped as a separate executable.
But Java isn't really about doing things at compile time anyway. The way I've handled similar situations before is with reflection. In your case, since your calls to the possibly-non-present library are scattered throughout the code, I would make a wrapper class, replace all the calls to the library with calls to the wrapper class, and then use reflection inside the wrapper class to invoke on the library if it is present.
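A rough sketch of such a wrapper; the library class and method names here are made up purely for illustration:

import java.lang.reflect.Method;

// All calls to the optional library go through this class; the rest of the code
// never references the library's types directly.
public final class OptionalLibWrapper {

    private static final Class<?> LIB_CLASS = findLibClass();

    private static Class<?> findLibClass() {
        try {
            return Class.forName("com.thirdparty.CoolLibrary"); // made-up class name
        } catch (ClassNotFoundException e) {
            return null; // library not on the classpath
        }
    }

    public static boolean isAvailable() {
        return LIB_CLASS != null;
    }

    public static void doCoolThing(String arg) {
        if (LIB_CLASS == null) {
            return; // library absent: quietly do nothing
        }
        try {
            Method m = LIB_CLASS.getMethod("doCoolThing", String.class);
            m.invoke(null, arg); // assumes a static method in the library
        } catch (Exception e) {
            throw new RuntimeException("Call into optional library failed", e);
        }
    }
}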
Use a constant:
This week we create some constants that have all of the benefits of using the C preprocessor's facilities to define compile-time constants and conditionally compiled code.

Java has gotten rid of the entire notion of a textual preprocessor (if you take Java as a "descendent" of C/C++). We can, however, get the best benefits of at least some of the C preprocessor's features in Java: constants and conditional compilation.
I don't believe that there really is such a thing. Most true Java users will tell you that this is a Good Thing, and that relying on conditional compilation should be avoided at almost all costs.
I don't really agree with them...
You CAN use constants that are defined on the compile line, and that will have some of the effect, but not really all. (For example, you can't keep code that doesn't compile, but that you still want, the way you could inside #if 0... and no, comments don't always solve that problem, because nesting comments can be tricky...)
I think that most people will tell you to use some form of inheritance to do this, but that can be very ugly as well, with lots of repeated code...
That said, you CAN always just set up your IDE to run your Java source through the preprocessor before sending it to javac...
"to minimize my executable size"
What do you mean by "executable size"?
If you mean the amount of code loaded at runtime, then you can conditionally load classes through the classloader. So you distribute your alternative code no matter what, but it's only actually loaded if the library that it stands in for is missing. You can use an Adapter (or similar) to encapsulate the API, to make sure that almost all of your code is exactly the same either way, and one of two wrapper classes is loaded according to your case. The Java security SPI might give you some ideas how this can be structured and implemented.
If you mean the size of your .jar file, then you can do the above, but tell your developers how to strip the unnecessary classes out of the jar, in the case where they know they aren't going to be needed.
I have one more approach worth mentioning. What you need is a final constant:

public static final boolean LibraryIncluded = false; // or true - manually set this
Then inside the code write:

if (LibraryIncluded) {
    // do what you want to do if the library is included
} else {
    // whatever you want to do if the library is not included
}
This will work like #ifdef: only one of the blocks will be present in the executable code; the other will be eliminated at compile time.
Use properties to do this kind of thing.
Use things like Class.forName to identify the class.
Do not use if-statements when you can trivially translate a property directly to a class.
Depending on what you are doing (not quite enough information) you could do something like this:
interface Foo
{
void foo();
}
class FakeFoo
implements Foo
{
public void foo()
{
// do nothing
}
}
class RealFoo
implements Foo
{
public void foo()
{
// do something
}
}
and then provide a class to abstract the instantiation:
class FooFactory
{
    public static Foo makeFoo()
        throws Exception // exception handling ignored for brevity
    {
        final String name;
        final Class<?> fooClass;
        final Foo foo;

        name = System.getProperty("foo.class");
        fooClass = Class.forName(name);
        foo = (Foo) fooClass.newInstance();

        return (foo);
    }
}
Then run java with -Dfoo.class=RealFoo or -Dfoo.class=FakeFoo
I've ignored the exception handling in the makeFoo method, and you could do this in other ways... but the idea is the same.
That way you compile both versions of the Foo subclasses and let the developer choose at runtime which they wish to use.
I see you specifying two mutually exclusive problems here (or, more likely, you have chosen one and I'm just not understanding which choice you've made).
You have to make a choice: Are you shipping two versions of your source code (one if the library exists, and one if it does not), or are you shipping a single version and expecting it to work with the library if the library exists.
If you want a single version to detect the library's existence and use it if available, then you MUST have all the code to access it in your distributed code--you cannot trim it out. Since you are equating your problem with using a #define, I assumed this was not your goal--you want to ship 2 versions (The only way #define can work)
So, with 2 versions you can define a libraryInterface. This can either be an object that wraps your library and forwards all the calls to the library for you or an interface--in either case this object MUST exist at compile time for both modes.
public LibraryInterface getLibrary()
{
if(LIBRARY_EXISTS) // final boolean
{
// Instantiate your wrapper class or reflectively create an instance
return library;
}
return null;
}
Now, when you want to USE your library (cases where you would have had a #ifdef in C) you have this:
if(LIBRARY_EXISTS)
library.doFunc()
Library is an interface that exists in both cases. Since it's always protected by LIBRARY_EXISTS, it will compile out (should never even load into your class loader--but that's implementation dependent).
If your library is a pre-packaged library provided by a 3rd party, you may have to make Library a wrapper class that forwards its calls to your library. Since your library wrapper is never instantiated if LIBRARY_EXISTS is false, it shouldn't even be loaded at runtime (heck, it shouldn't even be compiled in if the compiler is smart enough, since it's always protected by a final constant), but remember that the wrapper MUST be available at compile time in both cases.
If it helps, have a look at J2ME Polish or "Using preprocessor directives in BlackBerry JDE plugin for eclipse?".
These are aimed at mobile apps, but the approach can be reused.