We have implemented a typical DAO/Abstract Factory pattern. The design is like this:
DAOFactory - returns an instance of either MySQLFactory or SQLiteFactory
MySQLFactory - returns DAOs that talk to the MySQL DB
SQLiteFactory - returns DAOs that talk to the SQLite DB
Things are fine. However, we need to create two executables: the one that is provided to customers uses the SQLiteFactory instance and the relevant DAOs. In that executable, we don't want to include any class related to MySQLFactory. If I delete those classes, we see a ClassNotFoundException at run time when the DAOFactory class is loaded by the class loader.
How can we implement our DAOFactory so that MySQLFactory is not required at runtime? The same problem also exists for certain other classes, i.e. classes that are required only for the in-house version of the app. What is a good way to implement this so that we can exclude classes from the software that is shipped to customers?
Thanks
Deep
You can use Class.forName to load classes at runtime. For example:
public class DAOFactoryLoader {

    public static DAOFactory loadMySQLFactory() throws ReflectiveOperationException {
        return (DAOFactory) Class.forName("your.package.MySQLFactory").newInstance();
    }

    public static DAOFactory loadSQLiteFactory() throws ReflectiveOperationException {
        return (DAOFactory) Class.forName("your.package.SQLiteFactory").newInstance();
    }
}
Then you can call the right method for the specific version of your software, i.e. loadMySQLFactory when you bundle the MySQL implementation and loadSQLiteFactory when you bundle the SQLite implementation.
You can even pass the class name as a JVM property, e.g. -DdaoFactory=your.package.MySQLFactory, and then use this to load the right class:
return (DAOFactory) Class.forName(System.getProperty("daoFactory")).newInstance();
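For completeness, here is a minimal sketch of how that property-driven variant might look end to end; the daoFactory property comes from the example above, while the Main class and the fallback to the SQLite implementation are illustrative assumptions:

public class Main {

    public static void main(String[] args) throws ReflectiveOperationException {
        // -DdaoFactory=your.package.MySQLFactory selects the factory at launch time;
        // defaulting to the SQLite implementation is an assumption for this sketch.
        String factoryClass = System.getProperty("daoFactory", "your.package.SQLiteFactory");

        // The MySQL classes are only loaded if the property names them,
        // so the customer build can ship without them.
        DAOFactory factory = (DAOFactory) Class.forName(factoryClass).newInstance();

        // ... obtain DAOs from the factory and run the application ...
    }
}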
I was looking at rt.jar for some reason, and there I saw some packages like java.sql.* among others.
In a typical JDBC program we write (for the Connection class, for example):
import java.sql.Connection;
As per the docs, java.sql.Connection is an interface, not a concrete implementation, and java.sql.Connection is in rt.jar.
When we write a JDBC program, we need JDBC drivers, and from what I read, JDBC drivers implement interfaces (e.g. java.sql.Connection).
So when we write in a typical Java program (and load the JDBC drivers):
import java.sql.Connection;
--> does java.sql.Connection come from rt.jar or from the driver classes?
From what I guess, in this case java.sql.Connection has to come from rt.jar (as an interface), and the actual implementation comes from the driver classes.
If my assumption is correct, in general do we need to include the jars which have the interface definitions in order to use the import?
For example, consider this situation:
package com.vipin.myinterface;

public interface Interface1 {
    public void print();
}
And we package the above interface as interface1.jar.
Suppose Concrete1.java implements this interface:
package com.vipin.concrete1;

import com.vipin.myinterface.Interface1;

public class Concrete1 implements Interface1 {
    public void print() {
        //code
    }
}
And this is packaged in a jar --> concrete1.jar.
Now, suppose I am writing an application which uses the print() method; do I need to include both of these jars?
The case of java.sql.Connection is that the driver provides the implementation classes for this and other interfaces like java.sql.Statement, java.sql.ResultSet, and so on. All the magic of binding the interface to the proper implementation class happens in the method DriverManager#getConnection, which calls an internal method, private static Connection getConnection(String url, java.util.Properties info, Class<?> caller) throws SQLException, that will initialize the proper instance of java.sql.Connection.
Of course, you can use a similar approach in your code, using reflection to:
Find the proper implementation of the interface.
If there is a proper implementation, create an instance of this class.
Return the instance of this class once it is initialized and running.
Throw proper exception(s) if the class cannot be found or if it has any initialization issue.
Please do not think that just creating a jar containing the interfaces and another containing the implementation classes of these interfaces will automatically wire things up on the fly for you; that doesn't happen.
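As an illustration, here is a minimal sketch of that reflection-based lookup; the Interface1Loader class and the way the implementation name is supplied are assumptions built on the example above, not anything java.sql itself does:

import com.vipin.myinterface.Interface1;

public class Interface1Loader {

    // Looks up the implementation class by name, much like DriverManager
    // resolves a java.sql.Connection implementation from the registered drivers.
    public static Interface1 load(String implClassName) {
        try {
            Class<?> clazz = Class.forName(implClassName);                     // find the implementation
            return (Interface1) clazz.getDeclaredConstructor().newInstance();  // create an instance
        } catch (ClassNotFoundException e) {
            throw new IllegalStateException("No implementation of Interface1 on the classpath", e);
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException("Implementation could not be initialized", e);
        }
    }
}

With something like Interface1 printer = Interface1Loader.load("com.vipin.concrete1.Concrete1"); only the interface jar is needed at compile time, while the jar with the chosen implementation has to be on the classpath at run time.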
I have created an OSGi service with declarative services to inject an object that implements an interface. If I inject the object in a class that is attached to the application model (handler, part, ...) it works fine. If I inject it in a class that is not attached to the application model, it always returns null.
Is it possible to use DI in classes that are not attached to the application model? I looked at the vogella tutorials but somehow I can't find a solution.
I know of three ways in which Eclipse 4 can inject objects into your classes:
1. During start-up, the Eclipse runtime looks for relevant annotations in the classes it instantiates.
2. Objects injected in 1. are tracked and will be re-injected if they change.
3. Manually triggering injection using ContextInjectionFactory and IEclipseContext.
What you want may be possible with the third option. Here is a code example:
ManipulateModelhandler man = new ManipulateModelhandler();
// inject the context into an object
// (IEclipseContext iEclipseContext was itself injected into this class)
ContextInjectionFactory.inject(man, iEclipseContext);
man.execute();
The problem is, however, that the IEclipseContext already needs to be injected into a class that can access the object that needs injection. Depending on the number of necessary injections, it might be more useful to use delegation instead (testability would be one argument):
@Inject
public void setFoo(Foo foo) {
    // Bar is not attached to the e4 application model
    bar.setFoo(foo);
}
Therefore, a better solution is probably to use the @Creatable annotation.
Simply annotate your class and give it a no-argument constructor.
@Creatable
public class Foo {
    public Foo() {}
}
Using @Inject on that type, as in the method above, will let Eclipse instantiate and inject it.
The disadvantage is that you cannot control the object creation anymore, as you would with ContextInjectionFactory.inject(..).
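For illustration, a minimal sketch of such an injection site; Bar here is a hypothetical consumer class that Eclipse itself instantiates:

import javax.inject.Inject;

public class Bar {

    // Because Foo is annotated with @Creatable and has a no-argument constructor,
    // Eclipse can create it on demand and inject it here.
    @Inject
    private Foo foo;
}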
I refactored out some part of e(fx)clipse in order to achieve that. Have a look at this. Sorry for the shameless plug...
The following code doesn't work (of course), because the marked line does not compile:
public class MyClass {

    // singleton stuff
    private static MyClass instance;

    private MyClass() {}

    public static MyClass getInstance() {
        if (instance == null) {
            instance = new MyClass();
        }
        return instance;
    }

    // method creating problems
    public NonGenericSuperClassOfGenericClass create(Class<?>... classes) {
        if (someCondition)
            return new GenericClass<classes[0], classes[1]>(); // DOES NOT COMPILE
        else
            return new OtherGenericClass<classes[0]>();
    }
}
Therefore, I actually don't know whether create will return
GenericClass<classes[0], classes[1]>
or
OtherGenericClass<classes[0]>
which have different numbers of type parameters.
This happens because I'm using Spring and I plan to use MongoDB, but in the future I may need to switch to something different (e.g. Hibernate).
The class GenericClass is something like:
GenericClass<PersistentType1, Long>
or
GenericClass<PersistentType2, Long>
where PersistentType1/2 are classes that I need to store in the DB, while GenericClass is a sort of proxy to access the Mongo APIs. In fact, it looks like:
public MongoTemplate getTemplate();
public void save(T toInsert);
public List<T> select(Query selectionQuery);
public T selectById(ID id);
public WriteResult update(Query selectionQuery, Update updatedAttributes);
public void delete(T toRemove);
public void delete(Query selectionQuery);
Now, what?
From the Controllers (or Entities, if you are picky) I need to instantiate the repository and invoke its methods. This couples the Controllers to MongoDB, i.e. they explicitly have to instantiate such a GenericClass, which is actually called MongoRepository and is strictly dependent on Mongo (in fact it is a generic with exactly two "degrees of freedom").
So, I decided to create MyClass, which is a further proxy that isolates the Controllers. In this way, a Controller can get the single instance of MyClass and let it create a new instance of the appropriate repository. In particular, when someCondition is true, it means that we want to use MongoRepository (when it is false, maybe we need to instantiate a Hibernate proxy, i.e. HibernateRepository). However, MongoRepository is generic, therefore it requires some form of instantiation, which I hoped to pass as a parameter.
Unfortunately, generics are resolved at compile time, so they don't work for me, I guess.
How can I fix that?
In order to decouple the underlying persistence store from your application logic, I would use the DAO approach.
Define the interface of your DAO with the required methods, e.g. save, update, etc., and then provide an implementation for each persistence provider you might need; e.g. UserAccess might be the interface, which you could implement as HibernateUserAccess and MongoUserAccess. In each implementation you inject the appropriate template (e.g. Mongo or Hibernate) and use that to complete the persistence operation.
The issue you might have is that your load operation would return an instance of User; this would need to vary across persistence providers, i.e. the JPA annotations would be different from the Spring Data annotations needed for MongoDB (a leaky abstraction).
I would probably solve that by creating a User interface to represent the result of the persistence operation and having an implementation for each persistence provider. Either that, or return a common model which you build from the results of a JPA or Mongo load.
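As a rough sketch of that split (the method names and the User type are assumptions; only UserAccess, MongoUserAccess and HibernateUserAccess come from the description above):

import javax.persistence.EntityManager;
import org.springframework.data.mongodb.core.MongoTemplate;

// The persistence-neutral contract the controllers depend on.
public interface UserAccess {
    void save(User user);
    User load(Long id);
}

// MongoDB-backed implementation, delegating to Spring Data's MongoTemplate.
class MongoUserAccess implements UserAccess {
    private final MongoTemplate template;

    MongoUserAccess(MongoTemplate template) { this.template = template; }

    public void save(User user) { template.save(user); }
    public User load(Long id) { return template.findById(id, User.class); }
}

// JPA/Hibernate-backed implementation, delegating to an EntityManager.
class HibernateUserAccess implements UserAccess {
    private final EntityManager em;

    HibernateUserAccess(EntityManager em) { this.em = em; }

    public void save(User user) { em.persist(user); }
    public User load(Long id) { return em.find(User.class, id); }
}

The controllers only ever see UserAccess, so switching providers means wiring in a different implementation (for example through Spring configuration) rather than touching the controllers.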
I'm just learning Java, and I've run into a problem.
Here we have a simple factory pattern:
public class SomeFactory {
    ...
    public static void registerProduct(String name, Class<? extends IProduct> f) { ... }
}

public class SomeProduct implements IProduct {
    static {
        SomeFactory.registerProduct("some product", SomeProduct.class);
    }
    ...
}
All products should register themselves with the factory.
But before this code can be used, all product classes have to be loaded.
I can put Class.forName() somewhere, for example in the main method.
But I want to avoid that sort of manual class loading. I just want to add new IProduct
implementations without updating other parts (such as SomeFactory, the main method, etc.).
So I wonder, is it possible to automatically load certain classes (marked with an annotation, for example)?
P.S. Note that no other classes will be added at run time; all IProduct implementations are known before compilation.
UPD#1
Thanks for your answers!
But is it possible to generate such a property file of IProduct implementations automatically?
I mean, is it possible to write some build-time script (for Maven, for example) that generates the property file or the loader code? Are there such solutions or frameworks?
UPD#2
I ended up using the Reflections library, which provides run-time information by scanning the classpath at startup.
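For reference, a minimal sketch of that kind of startup scan with the Reflections library; the ProductScanner class and the base package argument are assumptions built around the factory above:

import java.lang.reflect.Modifier;
import java.util.Set;
import org.reflections.Reflections;

public class ProductScanner {

    // Scans the given package for IProduct implementations and initializes each class,
    // which runs its static registration block.
    public static void registerAll(String basePackage) throws ClassNotFoundException {
        Reflections reflections = new Reflections(basePackage);
        Set<Class<? extends IProduct>> implementations = reflections.getSubTypesOf(IProduct.class);
        for (Class<? extends IProduct> impl : implementations) {
            if (!Modifier.isAbstract(impl.getModifiers())) {
                // Class.forName with initialize=true triggers the static initializer.
                Class.forName(impl.getName(), true, impl.getClassLoader());
            }
        }
    }
}

Calling ProductScanner.registerAll("com.example.products") once at startup is then enough; adding a new IProduct implementation requires no change to SomeFactory or main.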
This is possible, but not easy. It would need to scan all the classes on the classpath to see if they have an annotation or implement the IProduct interface. See "How do you find all subclasses of a given class in Java?" for answers to such a problem.
I would keep it simple and just have a list of classes to load, either in the factory itself or in an external file (a properties file, for example).
Have each product register itself, using a static block like this:
class MyProduct1 {
    static {
        SomeFactory.register(MyProduct1.class);
    }
    ..
    ..
}
An external properties file can keep track of all the products.
Your main method can parse this list of products and do a Class.forName("..") for each.
This way you wouldn't need to hard-code any specific product; just the properties file keeps changing. Ah, yes, adding some security to the registration would also be a plus point.
Note: I'm just proposing an idea, I haven't tried it myself :)
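A minimal sketch of what that properties-driven loading might look like; the file name products.properties and its one-class-name-per-entry format are assumptions:

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class ProductLoader {

    // Reads class names from a properties file on the classpath and loads each one,
    // which triggers the static registration blocks shown above.
    public static void loadRegisteredProducts() throws IOException, ClassNotFoundException {
        Properties products = new Properties();
        try (InputStream in = ProductLoader.class.getResourceAsStream("/products.properties")) {
            products.load(in);
        }
        for (String key : products.stringPropertyNames()) {
            Class.forName(products.getProperty(key)); // e.g. product1=com.example.MyProduct1
        }
    }
}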
I am looking for a way to do the following:
Project A:
Defines an abstract class that is called when some events happen (an event handler, if you will)
Defines the engine that will fire the events using the event handler above
Project B:
Defines the implementation of the abstract class
Runs the engine.
How can I register the implementation class and make sure that it is the one being called when the engine runs?
EDIT 1: By register I mean I must somehow define which implementation should be called for that given abstract object.
Sorry if the question isn't too clear; let me know if you need more details.
Something like this?
class A implements EventHandlerForB {
    ...
}

public class B {
    private EventHandlerForB eventHandler;

    public void registerEventHandler(EventHandlerForB eventHandler) {
        this.eventHandler = eventHandler;
    }
    ...
}

public interface EventHandlerForB {
    ...
}
At runtime, you can have the name of the implementation passed to your A project (with a properties file or a Java system property).
Then you find this class on the classpath with Class.forName() and instantiate it with newInstance().
But you might prefer using a framework like Guice or Spring, which will allow you to glue the pieces together in a clean way.
There are several "patterns" that try to address this issue. Using only JDK (6 or above) classes, you may want to take a look at java.util.ServiceLoader.
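For illustration, a minimal sketch of how ServiceLoader could wire the two projects together, reusing the B and EventHandlerForB types from the answer above; the package name in the provider-configuration file is a placeholder:

import java.util.ServiceLoader;

public class Engine {

    public static void main(String[] args) {
        // The B project ships a provider-configuration file named
        //   META-INF/services/your.package.EventHandlerForB
        // containing the fully qualified name of its implementation class.
        ServiceLoader<EventHandlerForB> loader = ServiceLoader.load(EventHandlerForB.class);

        B engine = new B();
        for (EventHandlerForB handler : loader) {
            // The A project's engine fires events against whatever implementation was found.
            engine.registerEventHandler(handler);
        }
    }
}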