pattern to transfer search model to dao - java

We have a DAO packaged as a project (jar file).
Clients use its interfaces and factories to operate on the database.
Alongside the standard CRUD operations, the DAO allows you to search for an entity by some search criteria.
What is the best way to represent these criteria?
Is the Transfer Object pattern appropriate in this situation?
How should the client create a SearchModel instance?
Please share your thoughts.
Regards.

I usually use a generic DAO:
package persistence;
import java.io.Serializable;
import java.util.List;
public interface GenericDao<T, K extends Serializable>
{
    T find(K id);
    List<T> find();
    List<T> find(T example);
    List<T> find(String queryName, String[] paramNames, Object[] bindValues);
    K save(T instance);
    void update(T instance);
    void delete(T instance);
}
This allows me to use named queries with bound parameters and query by example. I've found it to be flexible enough to satisfy most of my needs.
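For the original search-criteria question, query by example is often the lightest-weight answer: the client builds a partially populated instance of the entity itself and passes it to find(T example). A minimal in-memory sketch of that contract, assuming hypothetical User and InMemoryUserDao types that are not part of the original DAO jar:

```java
import java.util.*;
import java.util.stream.Collectors;

// Hypothetical entity for illustration only
class User {
    String name;
    String city;
    User(String name, String city) { this.name = name; this.city = city; }
}

// Minimal in-memory sketch of the query-by-example contract
class InMemoryUserDao {
    private final Map<Long, User> store = new HashMap<>();
    private long nextId = 1;

    Long save(User u) { store.put(nextId, u); return nextId++; }

    // Query by example: null fields in the example are wildcards,
    // non-null fields must match exactly
    List<User> find(User example) {
        return store.values().stream()
            .filter(u -> example.name == null || example.name.equals(u.name))
            .filter(u -> example.city == null || example.city.equals(u.city))
            .collect(Collectors.toList());
    }
}
```

The client never learns a query language; it only expresses criteria in terms of the entity it already knows.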


Generic Repository within DDD: How can I make this interface generic?

I'm developing a multi-module CMS application following Domain-Driven Design principles. I'm trying to figure out how to implement a Generic Repository, thus avoiding a lot of boilerplate code.
The idea is to have a "two-way" mapping strategy (model to entity and vice versa) and the Generic Repository implemented in the Persistence module. Further, an interface in the Domain module would act as a contract between Domain and Persistence, so I can use it for later injection in the other layers.
How can I make this interface generic?
To be specific, the problem here is the mapping. Since I'm using a "two-way" mapping strategy, the Domain module has no idea about DB-specific entities.
Is there a way to map generic type models between layers? Or to use some other mapping strategy while keeping the layers loosely coupled?
Here is a code example to clarify what I'm trying to achieve.
This would be the code example for Generic Repository:
@MappedSuperclass
public abstract class AbstractJpaMappedType {
    …
    String attribute;
}
@Entity
public class ConcreteJpaType extends AbstractJpaMappedType { … }

@NoRepositoryBean
public interface JpaMappedTypeRepository<T extends AbstractJpaMappedType>
        extends Repository<T, Long> {
    @Query("select t from #{#entityName} t where t.attribute = ?1")
    List<T> findAllByAttribute(String attribute);
}

public interface ConcreteRepository
        extends JpaMappedTypeRepository<ConcreteType> { … }
Further, I want to make my own Custom Repository that can map model to entity and vice versa, so I wouldn't have JPA-specific annotations in my domain classes, thus keeping them loosely coupled. I want this Custom Repository to implement an interface from the Domain module, allowing me to inject it later in the Services layer.
public class CustomRepositoryImpl implements CustomRepository {
    public final JpaMappedTypeRepository<T> repository;
    ...
}
How can I make this class and this interface generic so that I would be able to do mapping between model and entity, since Domain layer has no information about entity classes?
I figured it out eventually.
The problem, as stated in the question, was the mapping between layers. I created a mapping interface to declare the mapping methods, and used the @ObjectFactory annotation from MapStruct to deal with the generic mapping:
public interface EntityMapper<M, E> {
    M toModel(E entity);
    List<M> toModelList(List<E> entities);
    E toEntity(M model);
    List<E> toEntityList(List<M> models);
    // default object factory methods
}
Then I proceeded to create a mapper for each of the child classes, extending the EntityMapper interface with the concrete types that I want to map.
@Mapper(componentModel = "spring")
public interface ConcreteEntityMapper extends EntityMapper<ConcreteModelType, ConcreteJpaType> {
}
I created an abstract class where I injected the JPA repository and the mapper, and also implemented common methods.
abstract class CustomRepositoryImpl<M extends AbstractModelMappedType, E extends AbstractJpaMappedType> {
    private final JpaMappedTypeRepository<E> repository;
    private final EntityMapper<M, E> mapper;
    // ... common methods for mapping and querying repositories.
}
Then I extended ConcreteTypeRepositoryImpl from this abstract class and implemented a generic interface, which I can later use as a reference in other layers.
public interface CustomRepository<M> {
    M saveOrUpdate(M model);
    Optional<M> getById(Long id);
    List<M> getByName(String name);
    List<M> getAll();
    void delete(Long id);
}

@Component
public class ConcreteTypeRepositoryImpl extends CustomRepositoryImpl<ConcreteModelType, ConcreteJpaType>
        implements CustomRepository<ConcreteModelType> {
    public ConcreteTypeRepositoryImpl(JpaMappedTypeRepository<ConcreteJpaType> repository,
                                      EntityMapper<ConcreteModelType, ConcreteJpaType> mapper) {
        super(repository, mapper);
    }
}
And that would be it. Now I can inject CustomRepository into the other layers and reach the desired repository.
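Stripped of Spring and MapStruct, the saveOrUpdate flow above reduces to "map in, persist, map out". A self-contained sketch, where ModelType, JpaType, the two-method EntityMapper, and the in-memory store are simplified stand-ins for the real model, entity, mapper, and JPA repository:

```java
import java.util.*;

// Simplified stand-in for the MapStruct-generated mapper
interface EntityMapper<M, E> {
    E toEntity(M model);
    M toModel(E entity);
}

class ModelType { String name; ModelType(String n) { name = n; } }
class JpaType   { String name; JpaType(String n)   { name = n; } }

// Sketch of the abstract repository's saveOrUpdate flow:
// map model -> entity, persist, map entity -> model
abstract class CustomRepositoryImpl<M, E> {
    private final List<E> store = new ArrayList<>();  // stands in for the JPA repository
    private final EntityMapper<M, E> mapper;
    CustomRepositoryImpl(EntityMapper<M, E> mapper) { this.mapper = mapper; }

    public M saveOrUpdate(M model) {
        E entity = mapper.toEntity(model);
        store.add(entity);               // would be repository.save(entity)
        return mapper.toModel(entity);   // the domain layer only ever sees the model
    }
}

class ConcreteRepositoryImpl extends CustomRepositoryImpl<ModelType, JpaType> {
    ConcreteRepositoryImpl() {
        super(new EntityMapper<ModelType, JpaType>() {
            public JpaType toEntity(ModelType m) { return new JpaType(m.name); }
            public ModelType toModel(JpaType e)  { return new ModelType(e.name); }
        });
    }
}
```

The key property is that entity classes never cross the boundary: callers hand in a model and get a model back.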

Java generic return type for an interface getter (<T> T get();)

I am working on the design of an SDK that will provide basic definitions (interfaces), logging, and a transaction engine for different projects. Each project will be considered a platform and will be developed as a separate project using the SDK as its basis; they have similarities, but each implementation should be able to solve platform-specific behaviors, while the basic definitions from the core SDK should solve most of the problems. For example: HUUniversity or MITUniversity.
So far I have almost achieved all of this. For example, there is a StudentManager interface that provides the essential behavior for any implementation:
public interface StudentManager<T> extends Manager, Release {
    int getCurrentStudent();
    int getTotalStudent();
    TransactionManager getStudentManager();
    List<T> getStudent();
    void addStudent(T participant);
    T getStudent(String id);
    T removeStudent(String id);
}
That way, each platform implementation can provide its own definition, basically extending the basic definition provided within the SDK, but each implementation will be strongly typed and able to add new behaviors:
public interface HUStudentManager extends StudentManager<HUStudent>, ParticipantListener {
    List<HUStudentCommand> getCommands(String audioId);
    HUStudent getParticipant(ListType list, String id);
    HUStudent getParticipantByName(String name);
    List<HUStudent> getParticipants(StudentState state);
    List<HUStudent> getParticipantsOnList(ListType list);
    List<HUStudent> getParticipantsOnList(ListType list, Sort sort);
    void addParticipantOnList(HUStudent participant, ListType listType, long epoch);
    HUStudentCommand removeCommand(String id);
    HUStudentCommand removeParticipantByName(String name);
    void saveCommand(HUStudentCommand command);
}
Implementation: the HU platform has its own definition of a StudentManager (HUStudentManager) and extends the basis defined in the SDK. Since the SDK doesn't know about any HUStudent definition, I added a generic parameter so each platform can bind its own type:
public class HUStudentManagerImpl extends HU implements HUStudentManager {
    @Override
    public void addStudent(HUStudent student) {
        if (Utils.isNull(m_students.putIfAbsent(student.getId(), student))) {
            m_totalStudents.incrementAndGet();
            getLogger().log(Keywords.DEBUG, "{0}The instance: {1} with the specified key: {2} has been added to the ConcurrentMap<String, HUStudents>", getData(), student.getClass().toString(), student.getId());
        } else {
            getLogger().log(Keywords.WARNING, "{0}The instance: {1} with the specified key: {2} already exists in the ConcurrentMap<String, HUStudents>", getData(), student.getClass().toString(), student.getId());
        }
    }
}
The sample above works fine and solves the problem: I let each developer use his own definitions for the specific platform, which of course extend the basic definitions.
But I wasn't able to figure out how to let the developer use his own definition for a single type within the interface definition, i.e.:
public interface Student extends IManager, IRelease {
    UUID getUUID();
    String getId();
    <T> T getSchedule();
    <T> T getElapsedTime();
}
I intend for the basic interface to allow each developer to use his own definition, forcing them to implement the basic behavior or add new behavior by extending what already exists in the SDK:
public interface HUStudent extends Student {
    HUClass getClass();
}
How can I implement this in the final HUStudentImpl class without getting a compiler error, or having to suppress the types? Is that possible, or should I shadow the definitions in the superclass with the desired types?
public class HUStudentImpl extends HU implements HUStudent {
    // Type safety: The expression of type getSchedule() needs unchecked
    // conversion to conform to HUSchedule
    HUSchedule getSchedule();       // Def from Student interface at SDK
    HUElapsedTime getElapsedTime(); // Def from Student interface at SDK
}
I cannot use a type parameter on the Student interface, since each getter could be a different type.
Hope someone can enlighten me and point me in the right direction.
Thanks in advance, best regards.
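One direction worth considering (a sketch, not from the question): instead of per-getter type parameters, give Student one type parameter per varying getter, so each platform binds its concrete types once and no unchecked conversion is needed. HUSchedule and HUElapsedTime are hypothetical stand-ins here, and the SDK's Manager/Release supertypes are omitted for brevity:

```java
import java.util.UUID;

// SDK-level interface: one type parameter per getter whose type varies by platform
interface Student<S, T> {
    UUID getUUID();
    S getSchedule();
    T getElapsedTime();
}

// Hypothetical platform-specific types
class HUSchedule { }
class HUElapsedTime { }

// The platform interface binds the concrete types; callers get
// strongly typed getters with no casts or @SuppressWarnings
interface HUStudent extends Student<HUSchedule, HUElapsedTime> { }

class HUStudentImpl implements HUStudent {
    private final UUID id = UUID.randomUUID();
    public UUID getUUID() { return id; }
    public HUSchedule getSchedule() { return new HUSchedule(); }
    public HUElapsedTime getElapsedTime() { return new HUElapsedTime(); }
}
```

The trade-off is that every varying getter adds a parameter to the SDK interface, so this only scales to a handful of platform-specific types.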

Inheritance in HQL with Spring Data JPA

I have a set of classes which inherit from a single superclass:

        Super
          |
   +------+------+
  Aaaa   Bbbb   Cccc

Each of Aaaa, Bbbb, Cccc should then contain a findByTag method. The problem is that I can't manage to define it generically. The following example defines a specific findByTag for Aaaa.
public interface AaaaRepository extends SuperRepository<Aaaa> {
    @Query("select distinct a from Aaaa a " +
           "join a.tags t " +
           "join fetch a.locale where t = ?1")
    public List<Event> findByTag(Tag t);
}
Note that the superclass is a @MappedSuperclass and does not have its own table in the database.
I would like to use some kind of "Super" placeholder in the query which would be replaced in each subclass by its own name.
My second problem is that I don't know how to force an @ElementCollection to be eagerly fetched. I always have to say "join fetch" explicitly in the query. If it is not fetched, then once the transaction is finished I can't access the objects I did not explicitly fetch (LazyInitializationException...).
Thanks
Looking at the documentation's custom implementations section, what about this approach:
Create an interface that extends Repository and has your findByTag method, without annotations.
Create an implementation of that interface; in the method implementation you use the JPA Criteria API. You also need a class field to hold the actual class of the domain object, because generics are erased at compile time. You then use that field to build the criteria.
Read the documentation on using this implementation as a base class for the repository factory; Spring Data will then build implementations for the other repositories based on this custom one.
public interface MyRepository<T, ID> extends JpaRepository<T, ID> {
    public List<T> findByTag(Tag t);
}

public class MyRepositoryImpl<T, ID> implements MyRepository<T, ID> {
    private Class<T> actualClass; // initialized in the constructor

    public List<T> findByTag(Tag t) {
        // here you build the criteria using the actualClass field, and execute it.
    }
}

public interface AaaaRepository extends MyRepository<Aaaa, Integer> {
    // other methods...
}
Look at "Example 1.16. Custom repository factory bean" of the documentation to create the factory bean.
When Spring instantiates the implementation of AaaaRepository, it will use MyRepositoryImpl as base class.
Will this work for you?
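The "build the criteria using the actualClass field" step can be illustrated without a JPA provider: the Class<T> captured in the constructor supplies the entity name at runtime. QueryBuilder and Aaaa below are illustrative stand-ins, and the string built is the JPQL that em.createQuery(...) would receive:

```java
// Sketch: how a captured Class<T> lets a generic base class build an
// entity-specific query, even though generics are erased at compile time
class QueryBuilder<T> {
    private final Class<T> actualClass;

    QueryBuilder(Class<T> actualClass) { this.actualClass = actualClass; }

    // Builds the JPQL string; a real implementation would pass this
    // (or an equivalent CriteriaQuery) to the EntityManager
    String findByTagQuery() {
        return "select distinct e from " + actualClass.getSimpleName()
             + " e join e.tags t where t = :tag";
    }
}

// Stand-in for one of the concrete subclasses (Aaaa, Bbbb, Cccc)
class Aaaa { }
```

Each concrete repository gets the right entity name for free, which is exactly what the "Super" placeholder in the question was after.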
Instead of writing it this way, I would create a data access object that resembles the pseudo-Java code below:
class DAO<T> {
    private Class<T> clazz;

    DAO(Class<T> clazz) { this.clazz = clazz; }

    @PersistenceContext
    private EntityManager em;

    public List<T> findByTag(Tag t) {
        Query q = em.createQuery("select e from " + clazz.getSimpleName() + " e ...");
        ...
        return q.getResultList();
    }
}
Hope it helps!
In the end, I was so unhappy about the behaviour and inflexibility of Spring Data JPA that I wrote myself a small tool for building queries in a simple way. An example of its use is here:
https://github.com/knyttl/Maite/wiki/Maite-Persistence
There are two child classes and the parent class which defines the functionality. The trick is in the fluent interface for building the query.
It is just at the beginning, but it already works such that I have no duplication and correct inheritance.
Small example of the parent class - check the link above for detail:
@Autowired
EntityManager em;

protected abstract String getName();

protected Clause select() {
    return em
        .select("DISTINCT i")
        .from(this.getName(), "i")
        .joinFetch("i.locale lf");
}

public List<T> findByTag(Tag tag) {
    return (List<T>) this.select()
        .join("i.tags t")
        .where("t = ?", tag)
        .fetchAll();
}

Dao Registry refactoring

Using the generic DAO pattern, I define the generic interface:
public interface GenericDao<T extends DataObject, ID extends Serializable> {
    T save(T t);
    void delete(ID id);
    T findById(ID id);
    Class<T> getPersistentClass();
}
I then implemented a default GenericDaoImpl implementation to perform these functions, with the following constructor:
public GenericDaoImpl(Class<T> clazz) {
    this.persistentClass = clazz;
    DaoRegistry.getInstance().register(clazz, this);
}
The point of the DaoRegistry is to look up a DAO by the class associated with it. This allows me to extend GenericDaoImpl and override methods for objects that require special handling:
DaoRegistry.getInstance().getDao(someClass.getClass()).save(someClass);
While it works, there are a few things that I don't like about it:
DaoRegistry is a singleton
The logic of calling save is complicated
Is there a better way to do this?
Edit
I am not looking to debate whether Singleton is an anti-pattern or not.
First of all, what is your problem with DaoRegistry being a singleton?
Anyway, you could have an abstract base class for your entities that would implement save like this:
public T save() {
    return DaoRegistry.getInstance().getDao(this.getClass()).save(this);
}
Then you could simply call someEntity.save().
Or it may be more straightforward if the entity classes themselves implemented the whole GenericDao interface (the save, delete and find methods), so the contents of your GenericDaoImpl would live in the base class of your entities.
It could be better to use an instance of DaoRegistry instead of static methods; that would make it more manageable for test configurations. You could implement it as:
@Component("daoRegistry")
public class DaoRegistry {
    @Autowired
    private List<GenericDao> customDaos;
    private GenericDao defaultDao = new GenericDaoImpl();

    public <T extends DataObject> GenericDao<T, ?> getDao(Class<T> clazz) {
        // search customDaos for a matching clazz, return defaultDao otherwise
    }
}
Also, you could add the save method to it and rename it accordingly. All customised DAOs should be available as beans.
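The lookup itself is just a map keyed by class with a default fallback. A plain-Java sketch of that logic (no Spring wiring), with GenericDao reduced to a one-method stand-in for brevity:

```java
import java.util.HashMap;
import java.util.Map;

// Simplified stand-in for the real GenericDao interface
interface GenericDao {
    String name();
}

// Custom DAOs registered per class, with a shared default fallback
class DaoRegistry {
    private final Map<Class<?>, GenericDao> customDaos = new HashMap<>();
    private final GenericDao defaultDao = () -> "default";

    void register(Class<?> clazz, GenericDao dao) {
        customDaos.put(clazz, dao);
    }

    // Returns the class-specific DAO if one was registered,
    // otherwise the shared default implementation
    GenericDao getDao(Class<?> clazz) {
        return customDaos.getOrDefault(clazz, defaultDao);
    }
}
```

With Spring, the register calls disappear: the @Autowired List<GenericDao> collects every DAO bean, and the constructor can populate the map from it.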

when to use the abstract factory pattern?

I want to know when we need to use the abstract factory pattern.
Here is an example; I want to know if it is necessary.
The UML
The above is the abstract factory pattern; it is recommended by my classmate.
The following is my own implementation. I do not think it is necessary to use the pattern.
And here is some of the core code:
package net;

import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

public class Test {
    public static void main(String[] args) throws IOException, InstantiationException, IllegalAccessException, ClassNotFoundException {
        DaoRepository dr = new DaoRepository();
        AbstractDao dao = dr.findDao("sql");
        dao.insert();
    }
}

class DaoRepository {
    Map<String, AbstractDao> daoMap = new HashMap<String, AbstractDao>();

    public DaoRepository() throws IOException, InstantiationException, IllegalAccessException, ClassNotFoundException {
        Properties p = new Properties();
        p.load(DaoRepository.class.getResourceAsStream("Test.properties"));
        initDaos(p);
    }

    public void initDaos(Properties p) throws InstantiationException, IllegalAccessException, ClassNotFoundException {
        String[] daoarray = p.getProperty("dao").split(",");
        for (String dao : daoarray) {
            AbstractDao ad = (AbstractDao) Class.forName(dao).newInstance();
            daoMap.put(ad.getID(), ad);
        }
    }

    public AbstractDao findDao(String id) { return daoMap.get(id); }
}

abstract class AbstractDao {
    public abstract String getID();
    public abstract void insert();
    public abstract void update();
}

class SqlDao extends AbstractDao {
    public SqlDao() {}
    public String getID() { return "sql"; }
    public void insert() { System.out.println("sql insert"); }
    public void update() { System.out.println("sql update"); }
}

class AccessDao extends AbstractDao {
    public AccessDao() {}
    public String getID() { return "access"; }
    public void insert() { System.out.println("access insert"); }
    public void update() { System.out.println("access update"); }
}
And the content of Test.properties is just one line:
dao=net.SqlDao,net.AccessDao
So can anyone tell me if the pattern is necessary in this situation?
------------------- The following is added to explain the real situation --------------
I used the DAO example because it is common and everyone knows it.
In fact, what I am working on now is not related to the DAO. I am building a web service that contains some algorithms to change a file to another format, for example net.CreatePDF, net.CreateWord, etc. It exposes two interfaces to the client: getAlgorithms and doProcess.
The getAlgorithms call will return all the algorithms' ids; each id is related to the corresponding algorithm.
A user who calls the doProcess method also provides the id of the algorithm he wants.
All the algorithms extend AbstractAlgorithm, which defines a run() method.
I use an AlgorithmRepository to store all the algorithms (loaded from the properties file in which the web service admin configures the concrete Java classes of the algorithms). That is to say, the doProcess interface exposed by the web service is executed by the concrete algorithm.
I can give a simple example:
1) The user sends a getAlgorithms request:
http://host:port/ws?request=getAlgorithms
Then the user gets a list of algorithms embedded in an XML document:
<AlgorithmsList>
    <algorithm>pdf</algorithm>
    <algorithm>word</algorithm>
</AlgorithmsList>
2) The user sends a doProcess request to the server:
http://xxx/ws?request=doProcess&algorithm=pdf&file=http://xx/Test.word
When the server receives this type of request, it gets the concrete algorithm instance according to the "algorithm" parameter (pdf in this request) from the AlgorithmRepository and calls the method:
AbstractAlgorithm algo = AlgorithmRepository.getAlgo("pdf");
algo.run();
Then a PDF file is sent to the user.
BTW, in this example each algorithm is similar to SqlDao and AccessDao.
Here is the image:
The design image
Now, does the AlgorithmRepository need to use the Abstract Factory?
The main difference between the two approaches is that the top one uses different DAO factories to create DAOs, while the bottom one stores a set of DAOs and returns references to them from the repository.
The bottom approach has a problem if multiple threads need access to the same type of DAO concurrently, as JDBC connections are not synchronised.
This can be fixed by having each DAO implement a newInstance() method which simply creates and returns a new DAO.
abstract class AbstractDao {
    public abstract String getID();
    public abstract void insert();
    public abstract void update();
    public abstract AbstractDao newInstance();
}

class SqlDao extends AbstractDao {
    public SqlDao() {}
    public String getID() { return "sql"; }
    public void insert() { System.out.println("sql insert"); }
    public void update() { System.out.println("sql update"); }
    public AbstractDao newInstance() { return new SqlDao(); }
}
The repository can then use the DAOs it holds as factories for the DAOs it returns (in which case I would rename the Repository to Factory), like this:
public AbstractDao newDao(String id) {
    return daoMap.containsKey(id) ? daoMap.get(id).newInstance() : null;
}
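Put together, this prototype-style factory can be sketched as one compact runnable unit (simplified: no properties file, and only the prototype-related methods kept; DaoFactory is the renamed repository):

```java
import java.util.HashMap;
import java.util.Map;

// Each registered DAO doubles as a prototype for fresh instances
abstract class AbstractDao {
    public abstract String getID();
    public abstract AbstractDao newInstance();
}

class SqlDao extends AbstractDao {
    public String getID() { return "sql"; }
    public AbstractDao newInstance() { return new SqlDao(); }
}

class DaoFactory {
    private final Map<String, AbstractDao> daoMap = new HashMap<>();

    void register(AbstractDao prototype) {
        daoMap.put(prototype.getID(), prototype);
    }

    // Returns a fresh instance, so concurrent callers never share a DAO
    AbstractDao newDao(String id) {
        return daoMap.containsKey(id) ? daoMap.get(id).newInstance() : null;
    }
}
```

Two calls to newDao("sql") yield two distinct objects, which is precisely what makes this safe for concurrent web-service clients.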
Update
As for your question: should your web service implement a factory, or can it use the repository as you described? Again, the answer depends on the details:
For web services it is normal to expect multiple concurrent clients. The instances executing the process for two clients must therefore not influence each other, which means they must not have shared state. A factory delivers a fresh instance on every request, so no state is shared when you use a factory pattern.
If (and only if) the instances in your repository are stateless, your web service can also use the repository as you describe; they will probably then need to instantiate other objects to actually execute the process, based on the request parameters passed.
If you are asking to compare the two designs from the UML, the second API in the UML has the following disadvantage:
the caller needs to explicitly specify the type of DAO in the call to getDAO(). Instead, the caller shouldn't care about the type of DAO it works with, as long as the DAO complies with the interface. The first design lets the caller simply call createDAO() and get an interface to work with. This way, the control of which implementation to use is more flexible, and the caller doesn't have this responsibility, which improves the overall coherence of the design.
Abstract Factory is useful if you need to separate multiple dimensions of choices in creating something.
In the common example case of windowing systems, you want to make a family of widgets for assorted windowing systems, and you create a concrete factory per windowing system which creates widgets that work in that system.
In your case of building DAOs, it is likely useful if you need to make a family of DAOs for the assorted entities in your domain, and want to make a "sql" version and an "access" version of the entire family. This is I think the point your classmate is trying to make, and if that's what you're doing it's likely to be a good idea.
If you have only one thing varying, it's overkill.
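The "family" case can be made concrete: two products that must vary together per backend, with one concrete factory per backend. UserDao, OrderDao, and the factory names below are illustrative, not from the question:

```java
// Two products in the family; both must come from the same backend
interface UserDao  { String backend(); }
interface OrderDao { String backend(); }

// The abstract factory: one creation method per product in the family
interface DaoFamilyFactory {
    UserDao createUserDao();
    OrderDao createOrderDao();
}

// One concrete factory per backend guarantees a consistent family:
// a SqlDaoFactory can never hand out an Access-backed OrderDao
class SqlDaoFactory implements DaoFamilyFactory {
    public UserDao createUserDao()   { return () -> "sql"; }
    public OrderDao createOrderDao() { return () -> "sql"; }
}

class AccessDaoFactory implements DaoFamilyFactory {
    public UserDao createUserDao()   { return () -> "access"; }
    public OrderDao createOrderDao() { return () -> "access"; }
}
```

If there were only one product (as in the question's single AbstractDao hierarchy), the factory interface would have a single method, and the simpler registry would do the same job with less ceremony.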
