Imagine a situation:
@javax.persistence.Inheritance(strategy=javax.persistence.InheritanceType.JOINED)
@javax.persistence.DiscriminatorColumn
@javax.persistence.Entity
@javax.persistence.Table(name="PARENT")
public abstract class Parent{
...
}
@javax.persistence.Entity
@javax.persistence.Table(name="A")
public class A extends Parent{
...
}
@javax.persistence.Entity
@javax.persistence.Table(name="B")
public class B extends Parent{
...
}
Parent p = new A();
Now we call this:
p instanceof A
always returns false!
It works fine on OpenJPA!
Should I file a bug? This is Hibernate 4.3.10.
This is most likely because Hibernate is returning a proxy.
Why does it do this? To implement lazy loading, the framework needs to intercept your method calls that return a lazily loaded object or list of objects, so it can first load the object from the DB and then allow your method to run. Hibernate does this by generating a proxy class. If you check the type in a debugger you should see that the actual type is a generated class which extends the declared type (Parent here), not the concrete subclass A, which is exactly why `p instanceof A` is false.
How to get around it? I had this problem once and successfully used the visitor pattern instead of using instanceof. It does add extra complication so it's not everyone's favorite pattern but IMHO it is a much cleaner approach than using instanceof.
If you use instanceof then you typically end up with if...else blocks checking for the different types. As you add more types you have to revisit each of these blocks. The advantage of the visitor pattern is that the conditional logic is built into your class hierarchy, so adding new types is less likely to require changes everywhere these classes are used.
I found this article useful when implementing the visitor pattern.
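For concreteness, here is a minimal, self-contained sketch of the visitor approach (plain Java, no Hibernate; the class names echo the question but this is not the original code):

```java
// Visitor over a Parent/A/B hierarchy: dispatch lives in the classes,
// so callers never need instanceof.
interface ParentVisitor<R> {
    R visitA(A a);
    R visitB(B b);
}

abstract class Parent {
    abstract <R> R accept(ParentVisitor<R> visitor);
}

class A extends Parent {
    @Override
    <R> R accept(ParentVisitor<R> visitor) { return visitor.visitA(this); }
}

class B extends Parent {
    @Override
    <R> R accept(ParentVisitor<R> visitor) { return visitor.visitB(this); }
}

class VisitorDemo {
    static String describe(Parent p) {
        return p.accept(new ParentVisitor<String>() {
            public String visitA(A a) { return "A"; }
            public String visitB(B b) { return "B"; }
        });
    }

    public static void main(String[] args) {
        System.out.println(describe(new A())); // prints A
        System.out.println(describe(new B())); // prints B
    }
}
```

Because a Hibernate proxy forwards method calls to the underlying entity, accept() should dispatch to the real subclass even when the caller only holds a reference typed as Parent.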
Not sure, but I think this will work.
public static boolean instanceOf(Object object, Class<?> superclass) {
return superclass.isAssignableFrom(Hibernate.getClass(object));
}
You can try to unproxy your object:
/**
 *
 * @param <T>
 * @param entity
 * @return
 */
@SuppressWarnings("unchecked")
public static <T> T initializeAndUnproxy(T entity) {
    if (entity == null) {
        // throw new NullPointerException("Entity passed for initialization is null");
        return null;
    }
    Hibernate.initialize(entity);
    if (entity instanceof HibernateProxy) {
        entity = (T) ((HibernateProxy) entity).getHibernateLazyInitializer().getImplementation();
    }
    return entity;
}
That is because Hibernate uses runtime proxies, while OpenJPA, although it supports the proxy approach, prefers compile-time or runtime bytecode enhancement.
See:
http://openjpa.apache.org/entity-enhancement.html
//Hibernate
Entity e = repository.load(entityId); // may return a proxy
//OpenJPA
Entity e = repository.load(entityId); // will return an (enhanced) actual instance of Entity
Hibernate returns a proxied object. Rather than implementing a Visitor pattern (as described here), you can use the isAssignableFrom() method on the class you want to test (https://docs.oracle.com/javase/8/docs/api/java/lang/Class.html#isAssignableFrom-java.lang.Class-).
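isAssignableFrom() is the Class-level analogue of instanceof: Hibernate.getClass() unwraps the proxy to the real entity class, and the supertype check then behaves as expected. The plain-Java semantics, independent of Hibernate:

```java
// isAssignableFrom(X) on class S answers: "is X the same as, or a subtype of, S?"
class IsAssignableDemo {
    public static void main(String[] args) {
        System.out.println(Number.class.isAssignableFrom(Integer.class)); // true
        System.out.println(Integer.class.isAssignableFrom(Number.class)); // false
    }
}
```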
I found this code on GitHub for a student management system that uses the DTO pattern:
public interface SuperController<T extends SuperDTO> {
    public boolean add(T dto) throws SQLException;
    public T getAll(String id) throws SQLException;
}
and SuperDTO is an empty class without any fields or methods:
public class SuperDTO {
}
My problem is that I can't understand how to call this SuperController using the following class:
public class ControllerFactory {
    private static ControllerFactory controllerFactory;
    private BatchDBControllerImpl ctrlBatch;

    public enum ControllerType {
        BATCH, CLASSES, LOGIN, REGISTER, STUDENT, ATTENDANCE, PAYMENT, EXAM, EXAM_DETAIL;
    }

    private ControllerFactory() {
    }

    public static ControllerFactory getInstance() {
        if (controllerFactory == null) {
            controllerFactory = new ControllerFactory();
        }
        return controllerFactory;
    }

    public SuperController getController(ControllerType type) {
        switch (type) {
            case BATCH:
                return new BatchDBControllerImpl();
            case CLASSES:
                return new ClassesDBControllerImpl();
            case LOGIN:
                return new LoginDBControllerImpl();
            case REGISTER:
                return new RegisterDBControllerImpl();
            case STUDENT:
                return new StudentDBControllerImpl();
            case ATTENDANCE:
                return new AttendanceDBControllerImpl();
            case PAYMENT:
                return new PaymentDBControllerImpl();
            case EXAM_DETAIL:
                return new ExamDetailDBControllerImpl();
            case EXAM:
                return new ExamDBControllerImpl();
            default:
                return null;
        }
    }
}
Please explain this method to me:
public SuperController getController(ControllerType type)
I'm unsure what you do and don't understand so please comment or edit your question if you need further clarification.
ControllerFactory is known as a factory object; this is a pattern of design in which instead of calling an object's constructor directly to instantiate it, we use a different class which is solely responsible for the task. In this case, the factory builds what seems to be different types of database controllers (each having different expected functionality) all of which implement the SuperController interface. The benefit here is that we can use this factory to create all of these objects with the same shared type. In addition, this factory is known as a singleton because there can only ever be one instance of the object created (note the static reference and private constructor).
So, once you have your factory you can use the getController method to get an instance of your desired object.
SuperController can be thought of as a contract detailing what basic functionality is expected from each of the potential objects that implement it. The exact details of implementation differ, but the overall pattern of behavior should be similar between them.
ControllerType is an enum that defines what valid types of controllers there are, and then based on which type is passed in a different kind of SuperController is constructed. Note that no matter which is returned it will always be of type SuperController since it implemented that interface.
If all you want to know is how to use it, then you need to get your factory instance, then use it to get your controller:
ControllerFactory controllerFactory = ControllerFactory.getInstance();
SuperController controller = controllerFactory.getController(ControllerFactory.ControllerType.DESIRED_TYPE_HERE);
I am working on a Spring 2.0.1.RELEASE application.
Brief of the application:
1. I have separate Transformer beans that transform my DTO to Domain
and vice versa.
2. I have separate Validator beans that validate the domain object being passed.
3. I have Service classes that take care of applying rules and calling the persistence layer.
Now, I want to build a workflow in my application,
where I will just call the start of the workflow; the steps below will be executed in order, with exception handling done per step:
1. First, Transformation - the transformToDomain() method will be called for that object type.
2. Second, Validator - the valid() method will be called for that object.
3. Third, Service - the save() method will be called for that object.
4. Fourth, Transformation - the transformToDTO() method will be called for that object type.
After this my workflow ends, and I will return the DTO object as the response of my REST API.
The exception handling part is one I also want to take care of: if a particular exception handler exists for a step, call it; otherwise call a global exception handler.
I designed a prototype of this, but I am looking for expert advice on how it can be achieved with a better design in Java.
An explanation with an example based on the above use case would be highly appreciated.
I'm not so sure that what you are describing is a workflow system in the true sense; perhaps a Chain of Responsibility is closer to what you are talking about?
Following what you described as a sequence of execution, here is a simplified example of how I would implement the chain:
Transformer.java
public interface Transformer<IN, OUT> {
    OUT transformToDomain(IN dto);
    IN transformToDTO(OUT domainObject);
}
Validator.java
public interface Validator<T> {
    boolean isValid(T object);
}
Service.java
public interface Service {
    void save(Object object);
}
And the implementation that binds everything:
ProcessChain.java
public class ProcessChain {
    private Transformer transformer;
    private Service service;
    private Validator validator;

    Object process(Object dto) throws MyValidationException {
        Object domainObject = transformer.transformToDomain(dto);
        boolean isValid = validator.isValid(domainObject);
        if (!isValid) {
            throw new MyValidationException("Validation message here");
        }
        service.save(domainObject);
        return transformer.transformToDTO(domainObject);
    }
}
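The question also asked about per-step exception handling, which the chain above does not cover. One way to sketch it (names are hypothetical, not from the question): keep a registry of handlers keyed by step name and fall back to a global handler when no step-specific one exists.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Consumer;

// Hypothetical per-step exception routing: a specific handler per step,
// falling back to a global handler when none is registered.
class ExceptionRouter {
    private final Map<String, Consumer<Exception>> handlers = new HashMap<>();
    private final Consumer<Exception> globalHandler;

    ExceptionRouter(Consumer<Exception> globalHandler) {
        this.globalHandler = globalHandler;
    }

    void register(String step, Consumer<Exception> handler) {
        handlers.put(step, handler);
    }

    // Returns which handler was used, purely for illustration.
    String handle(String step, Exception e) {
        Consumer<Exception> h = handlers.get(step);
        if (h != null) {
            h.accept(e);
            return "step:" + step;
        }
        globalHandler.accept(e);
        return "global";
    }
}
```

ProcessChain could then wrap each step in try/catch and delegate the caught exception to the router with that step's name.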
I haven't specified anything Spring-related here because your question seems to be a design question rather than a technology question.
Hope this helps
Here is a brief of what I implemented, without much hassle.
This is how I created the flow of handlers:
Stream.<Supplier<RequestHandler>>of(
        TransformToDomainRequestHandler::new,
        ValidateRequestHandler::new,
        PersistenceHandler::new,
        TransformToDTORequestHandler::new)
    .sequential()
    .map(Supplier::get) /* Create the handler instance */
    .reduce((processed, unProcessed) -> { /* chains all handlers together */
        RequestHandler previous = processed;
        RequestHandler target = previous.getNextRequestHandler();
        while (target != null) {
            previous = target;
            target = target.getNextRequestHandler();
        }
        previous.setNextRequestHandler(unProcessed);
        return processed;
    }).get();
This is my RequestHandler, which all the other handlers extend.
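The RequestHandler class itself is not shown above; a minimal base class consistent with the getNextRequestHandler/setNextRequestHandler calls used in the chaining snippet might look like this (a sketch, not the original code):

```java
// Sketch of a chain-of-responsibility base class matching the
// getNextRequestHandler/setNextRequestHandler calls used in the stream above.
abstract class RequestHandler {
    private RequestHandler next;

    RequestHandler getNextRequestHandler() { return next; }

    void setNextRequestHandler(RequestHandler next) { this.next = next; }

    // Each concrete handler performs its step, then delegates down the chain.
    Object handle(Object request) {
        Object result = process(request);
        return next != null ? next.handle(result) : result;
    }

    protected abstract Object process(Object request);
}

class RequestHandlerDemo {
    public static void main(String[] args) {
        RequestHandler transform = new RequestHandler() {
            protected Object process(Object request) { return request + " -> transformed"; }
        };
        RequestHandler save = new RequestHandler() {
            protected Object process(Object request) { return request + " -> saved"; }
        };
        transform.setNextRequestHandler(save);
        System.out.println(transform.handle("dto")); // dto -> transformed -> saved
    }
}
```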
I am developing a small application for my client, and I tried to apply the Factory Method design pattern. I am not sure if I have done it correctly.
Basically I have an abstract class Scheme that is extended by concrete Schemes (AccountScheme, ContactScheme, OrderScheme, etc.). Each class consists mainly of instance variables and a method responsible for transforming the Scheme into an actual system object (AccountScheme will eventually be used to create an Account, ContactScheme to create a Contact, and so on).
I also have a SchemeFactory class which has a static method createScheme taking two parameters: the type of system object the Scheme should be able to transform into, and a JSON String which will be parsed into the Scheme object itself.
And finally there is an ApiService class which handles REST requests and uses the SchemeFactory to create Schemes (using the request body). The Schemes are then processed, and at a certain point, if needed, the particular system object is created (using the Scheme) and inserted into the database.
I believe the UML diagram (it is my first one) would look something like this:
UML Diagram
The concept is correct.
Your UML does not show the abstract class. In your case, you can have something like this (as described in your UML):
class SchemaFactory
{
    public static Schema getSchema(String type, String json)
    {
        if ( type.equals("account") )
            return new AccountSchema(json);
        else if ( type.equals("contact") )
            return new ContactSchema(json);
        else if ( type.equals("order") )
            return new OrderSchema(json);
        throw new IllegalArgumentException();
    }
}
The interface:
interface Schema {
}
The implementation of AccountSchema:
class AccountSchema implements Schema {
    AccountSchema(String json) {
        //use json
    }
}
The abstract class is optional for the pattern. It's useful if you would like to force the Schemas to fill the constructor of the abstract class with json as a parameter, but a schema class can still get around that, like:
public class FakeSchema extends AbstractSchema {
    public FakeSchema() {
        super(null);
    }
}
Here's my use case:
I need to do some generic operation before and after each method of a given class, which is based on the parameter(s) of the method. For example:
void process(Processable object) {
    LOGGER.log(object.getDesc());
    object.process();
}
class BaseClass {
    String method1(Object o) { // o may or may not be Processable (add process logic only in the former case)
        if (o instanceof Processable) {
            Processable p = (Processable) o;
            LOGGER.log(p.getDesc());
            p.process();
        }
        //method logic
    }
}
My BaseClass has a lot of methods and I know for a fact that the same functionality will be added to several similar classes as well in future.
Is something like the following possible?
@MarkForProcessing
String method1(@Process Object o){
//method logic
}
PS: Can AspectJ/Guice be used? I also want to know how to implement this from scratch, for understanding.
Edit: I forgot to mention what I have tried (not complete or working):
public @interface MarkForProcessing {
    String getMetadata();
}
public final class Handler {
    public boolean process(Object instance) throws Exception {
        Class<?> clazz = instance.getClass();
        for (Method m : clazz.getDeclaredMethods()) {
            if (m.isAnnotationPresent(MarkForProcessing.class)) {
                MarkForProcessing annotation = m.getAnnotation(MarkForProcessing.class);
                Class<?> returnType = m.getReturnType();
                Class<?>[] inputParamTypes = m.getParameterTypes();
                Class<?> inputType = null;
                // We are interested in just the 1st param
                if (inputParamTypes.length != 0) {
                    inputType = inputParamTypes[0];
                }
                // But all I have access to here is the types; I need access to the method's argument values.
            }
        }
        return false;
    }
}
Yes, it can be done. Yes, you can use AspectJ. No, Guice would only be tangentially related to this problem.
The traditional aspect approach creates a proxy which is basically a subclass of the class you've given it (e.g. a subclass of BaseClass) but that subclass is created at runtime. The subclass delegates to the wrapped class for all methods. However, when creating this new subclass you can specify some extra behavior to add before or after (or both) the call to the wrapped class. In other words, if you have:
public class Foo {
    public void doFoo() {...}
}
Then the dynamic proxy would be a subclass of Foo created at runtime that looks something like:
public class Foo$Proxy extends Foo {
    public void doFoo() {
        //Custom pre-invocation code
        super.doFoo();
        //Custom post-invocation code
    }
}
Actually creating a dynamic proxy is a magical process known as bytecode manipulation. If you want to do that yourself you can use tools such as cglib or ASM, or you can use JDK dynamic proxies. The main downside to JDK proxies is that they can only wrap interfaces.
AOP tools like AspectJ provide an abstraction on top of raw bytecode manipulation for doing the above (bytecode manipulation lets you do far more, but adding behavior before and after methods is all aspects allow). Typically they define 'Aspect's, which are classes that have special methods called 'advice', along with a 'pointcut' that defines when to apply the advice. In other words you may have:
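A minimal JDK dynamic proxy sketch (hypothetical Greeter interface, not from the question): unlike inspecting java.lang.reflect.Method ahead of time, the InvocationHandler runs at call time and sees the actual argument values.

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

interface Greeter {
    String greet(String name);
}

class ProxyDemo {
    static Greeter wrap(Greeter target, StringBuilder log) {
        InvocationHandler handler = (proxy, method, args) -> {
            // Pre-invocation: the real argument values are available here.
            log.append("before:").append(args[0]).append(";");
            Object result = method.invoke(target, args);
            // Post-invocation:
            log.append("after;");
            return result;
        };
        return (Greeter) Proxy.newProxyInstance(
                Greeter.class.getClassLoader(),
                new Class<?>[] { Greeter.class },
                handler);
    }

    public static void main(String[] args) {
        StringBuilder log = new StringBuilder();
        Greeter g = wrap(name -> "hello " + name, log);
        System.out.println(g.greet("world")); // hello world
        System.out.println(log);              // before:world;after;
    }
}
```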
@Aspect
public class FooAspect {
    @Around("@annotation(MarkForProcessing)")
    public void doProcessing(final ProceedingJoinPoint joinPoint) throws Throwable
    {
        //Do some before processing
        joinPoint.proceed(); //Invokes the underlying method
        //Do some after processing
    }
}
The aspect is FooAspect, the advice is doProcessing, and the pointcut is "@annotation(MarkForProcessing)", which matches all methods annotated with @MarkForProcessing. It's worth pointing out that the ProceedingJoinPoint will have a reference to the actual parameter values (unlike java.lang.reflect.Method).
The last step is actually applying your aspect to an instance of your class. Typically this is either done with a container (e.g. Guice or Spring). Most containers have some way of knowing about a collection of aspects and when to apply them to classes constructed by that container. You can also do this programmatically. For example, with AspectJ you would do:
AspectJProxyFactory factory = new AspectJProxyFactory(baseClassInstance);
factory.addAspect(FooAspect.class);
BaseClass proxy = factory.getProxy();
Last, but not least, there are AOP implementations which use compile-time "weaving" which is a second compilation step run on the class files that applies the aspects. In other words, you don't have to do the above or use a container, the aspect will be injected into the class file itself.
Using Hibernate 3.6.8.Final and Spring 3.0.5.RELEASE, I'm trying to add some common DAO functionality for classes that have multiple implementations, overridden higher up to implement the specific classes. However, it doesn't work for DetachedCriteria.
Example:
In base class:
public interface ICat {
    public void setMeowSound(String meow);
    public String getMeowSound();
}
Then each inherited project would define the hibernate annotations.
e.g.
@Entity
@Table(name="SQUAWKY_CATS")
public class SquawkyMeowingCat implements ICat, Serializable {
    @Id
    @Column(name="SQUAWK_NAME")
    private String meow;

    public String getMeowSound() {
        return meow;
    }

    public void setMeowSound(String meow) {
        this.meow = meow;
    }
}
This means I can use:
Criteria criteria = session.createCriteria(ICat.class);
And Spring/Hibernate knows that it pulls the annotations for ICat from the concrete inheritance in the particular project.
However if I try to do:
DetachedCriteria subQuery = DetachedCriteria.forClass(ICat.class,"inner"); // etcetera
then I get an Unknown entity at runtime for ICat.
Now this makes sense: in the first case the criteria is created off the Session, so it has all the configuration it needs, whereas DetachedCriteria.forClass is a static method. However, it errors when trying to do the
criteria.list()
by which time it has picked up the Session and should know that ICat is actually a SquawkyMeowingCat, which has all the annotations.
So my questions are in two parts:
1) Is this known behaviour, and will it be like this forevermore?
2) Can anyone think of a simple way around it without using an interface and a concrete ClassHolder which hands back the instance of the class it needs to create?
I'm not sure about the case of the DetachedCriteria, but one way to avoid explicit dependence on the concrete class might be to query Hibernate's metadata using the interface:
public <T> Class<? extends T> findEntityClassForEntityInterface(
        SessionFactory sessionFactory,
        Class<T> entityInterface
) {
    for (ClassMetadata metadata : sessionFactory.getAllClassMetadata().values()) {
        Class<?> entityClass = metadata.getMappedClass(EntityMode.POJO);
        if (entityInterface.isAssignableFrom(entityClass)) {
            // asSubclass keeps the generic return type honest instead of an unchecked cast
            return entityClass.asSubclass(entityInterface);
        }
    }
    return null;
}
With the usual caveats about the robustness of illustrative code snippets.