How to build a DAO factory with a Class parameter - Java

I want to create a factory which returns a DAO instance depending on the Class clazz.
Teammember, Scene and Equipment are my model classes.
My DAOs look like this:
public class JDBCTeammemberDAO implements JdbcDAO<Teammember>
my Factory looks like this:
public class DAOFactory {
    JdbcDAO createDAO(Class clazz) {
        if (clazz.equals(Teammember.class)) {
            return new JDBCTeammemberDAO();
        }
        if (clazz.equals(Scene.class)) {
            return new JDBCSceneDAO();
        }
        if (clazz.equals(Equipment.class)) {
            return new JDBCEquipmentDAO();
        }
        return null;
    }
}
I was thinking about switch and polymorphism, but I couldn't figure out how.
Basically, I want to find the implementation "SomeClass implements JdbcDAO".
My first approach was:
String name = clazz.getName().substring(6); // strips the "model." prefix: "model.Teammember" -> "Teammember"
Class<?> forName;
try {
    forName = Class.forName("dao.jdbc.JDBC" + name + "DAO");
    return (JdbcDAO) forName.newInstance();
} catch (ClassNotFoundException e) {
    e.printStackTrace();
} catch (InstantiationException e) {
    e.printStackTrace();
} catch (IllegalAccessException e) {
    e.printStackTrace();
}
but I don't feel good handling this with string manipulation. Besides, it doesn't work if the model and DAO names differ (e.g. JDBCMemberDAO instead of JDBCTeammemberDAO).

I was in a similar situation and decided to use a DAO registry to handle the issue. Using the generic DAO pattern @Perception mentioned:
public interface JdbcDAO<T> {
    T find(Long id);
    T create(T entity);
    T update(T entity);
    void delete(T entity);
}
public class JdbcDAOImpl<T> implements JdbcDAO<T> {
    private final Class<T> clazz;
    protected JdbcDAOImpl(Class<T> clazz) {
        this.clazz = clazz;
        DaoRegistry.register(clazz, this);
    }
}
Then you can have your
public class JDBCTeammemberDAO extends JdbcDAOImpl<Teammember> {
    public JDBCTeammemberDAO() {
        super(Teammember.class);
    }
}
DaoRegistry would look something like this (static, since the DAOs register themselves from their constructors):
public class DaoRegistry {
    private static final Map<Class<?>, JdbcDAO<?>> daoMap = new ConcurrentHashMap<>();
    public static synchronized void register(Class<?> type, JdbcDAO<?> dao) {
        if (!daoMap.containsKey(type)) {
            daoMap.put(type, dao);
        } else {
            logger.error("Something is really wrong because you are creating another dao for this class.");
        }
    }
    public static JdbcDAO<?> get(Class<?> type) {
        return daoMap.get(type);
    }
}
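To make the lookup concrete, a small usage sketch (assuming each concrete DAO is constructed once at startup so its constructor self-registers; the unchecked cast mirrors the wildcard-typed registry):
// Bootstrap once, e.g. at application start: constructing a DAO registers it.
new JDBCTeammemberDAO();
new JDBCSceneDAO();
new JDBCEquipmentDAO();
// Anywhere later: look the DAO up by model class.
@SuppressWarnings("unchecked")
JdbcDAO<Teammember> teammemberDao = (JdbcDAO<Teammember>) DaoRegistry.get(Teammember.class);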
This is just the gist of it; you will need to make sure it is thread-safe. Hope this helps.

Sormula works as you describe. It provides a method to get the "DAO" for a row/record class. See database.getTable(Inventory.class); in example 1. You don't need to write any DAOs.

If you don't mind a slight redesign, this problem is easy enough to solve with a little bit of Generics and Polymorphism:
public interface JdbcDAO<T> {
    T find(Long id);
    T create(T entity);
    T update(T entity);
    void delete(T entity);
    // Other common definitions
}
public class JdbcDAOImpl<T> implements JdbcDAO<T> {
    private Class<T> clazz;
    public JdbcDAOImpl() {
        super();
    }
    protected JdbcDAOImpl(Class<T> clazz) {
        super();
        this.clazz = clazz;
    }
    // Common implementation here
}
public class EquipmentDAO extends JdbcDAOImpl<Equipment> {
    public EquipmentDAO() {
        super(Equipment.class);
    }
    // Subclass specific implementation here
}
Rinse and repeat the specific impl for each of your models, and instantiate them directly (without the use of a factory).
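Usage is then direct instantiation; the compile-time type carries through with no casts:
JdbcDAO<Equipment> equipmentDao = new EquipmentDAO();
Equipment drill = equipmentDao.find(42L); // hypothetical id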

Related

Duplicate check via Unique constraint in DB not working

I have a Spring Boot service that should persist several entities of two types in one transaction to an Oracle DB. The table for the first entity type is huge (3 million entries/day, partitioned, ...) and I need to react to duplicates. There are some fields I use to calculate a hash, and I have a unique constraint on that field in the database. I thought it was a clever idea to just saveAndFlush entity by entity and react to the ConstraintViolationException. Based on the result of saving the list of first entities, I need to create the second entity and save it as well, but the exception rolls back everything.
My question now would be: is this approach generally wrong, or is it OK and there is just some small issue? If it is generally wrong, how should I do the duplicate check (a select upfront is not an option)?
Here is some pseudo-code to give a better idea:
@Entity
public class Foo {
    public String uniqueHash;
    // couple of other properties that will be used to calculate the hash
}
@Entity
public class Bar {
    private List<String> goodIds;
    private List<String> badIds;
    public Bar(List<String> goodIds, List<String> badIds) {
        this.goodIds = goodIds;
        this.badIds = badIds;
    }
}
@Repository
@Transactional(noRollbackFor = PersistenceException.class)
public interface FooRepository extends JpaRepository<Foo, String> {
    Foo saveAndFlush(Foo f) throws PersistenceException;
}
@Repository
@Transactional(noRollbackFor = PersistenceException.class)
public interface BarRepository extends JpaRepository<Bar, String> {
    Bar saveAndFlush(Bar b) throws PersistenceException;
}
SomeService
@Transactional(noRollbackFor = PersistenceException.class)
public void doSomething(List<Foo> foos) {
    List<String> goodIds = new ArrayList<>();
    List<String> badIds = new ArrayList<>();
    for (Foo foo : foos) {
        try {
            fooRepository.saveAndFlush(foo);
            goodIds.add(foo.getId());
        } catch (PersistenceException e) {
            if (e.getCause() instanceof ConstraintViolationException) {
                badIds.add(foo.getId());
            } else {
                throw e;
            }
        }
    }
    barRepository.saveAndFlush(new Bar(goodIds, badIds));
}
Finally, I found a way to achieve the expected behavior, and even better, I was able to get rid of the noRollbackFor attributes. I restructured the process: everything is saved in one transaction; if that fails, the exception is caught in the calling method, the input is "cleaned", and the transactional method is called again (recursively). These duplicates are rare (roughly one per 10k Foo instances), so from a performance perspective the extra transactions are fine. Here is the changed pseudo-code:
@Entity
public class Foo {
    public String uniqueHash;
    // couple of other properties that will be used to calculate the hash
}
@Entity
public class Bar {
    private List<String> goodIds;
    private List<String> badIds;
    public Bar(List<String> goodIds, List<String> badIds) {
        this.goodIds = goodIds;
        this.badIds = badIds;
    }
    public List<String> getGoodIds() {
        return goodIds;
    }
    public List<String> getBadIds() {
        return badIds;
    }
}
@Repository
public interface FooRepository extends JpaRepository<Foo, String> {
}
@Repository
public interface BarRepository extends JpaRepository<Bar, String> {
}
public class FooException extends RuntimeException {
    private final Foo foo;
    public FooException(String message, Foo foo) {
        super(message);
        this.foo = foo;
    }
    public Foo getFoo() {
        return foo;
    }
}
SomeService
public void doSomething(List<Foo> foos, Bar bar) {
    try {
        doSomethingTransactional(foos, bar);
    } catch (FooException e) {
        bar.getBadIds().add(e.getFoo().getId());
        foos.remove(e.getFoo());
        doSomething(foos, bar);
    }
}
}
@Transactional
public void doSomethingTransactional(List<Foo> foos, Bar bar) {
    for (Foo foo : foos) {
        try {
            fooRepository.saveAndFlush(foo);
            bar.getGoodIds().add(foo.getId());
        } catch (DataAccessException e) {
            if (e.getCause() instanceof ConstraintViolationException
                    && ((ConstraintViolationException) e.getCause()).getConstraintName().contains("Some DB Message")) {
                throw new FooException("Foo already exists", foo);
            } else {
                throw e;
            }
        }
    }
    barRepository.saveAndFlush(bar);
}
You might be able to use a custom @SQLInsert to make use of Oracle's MERGE statement for this purpose. Also see https://stackoverflow.com/a/64764412/412446
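For illustration, a rough sketch of that idea (table and column names are invented here; note that Hibernate binds the ? parameters in its own internal column order, so the statement has to be aligned with the insert Hibernate would otherwise generate):
// Sketch only: assumes a FOO table whose UNIQUE_HASH column carries the unique constraint.
@Entity
@SQLInsert(sql = "MERGE INTO foo t "
        + "USING (SELECT ? AS unique_hash FROM dual) s "
        + "ON (t.unique_hash = s.unique_hash) "
        + "WHEN NOT MATCHED THEN INSERT (unique_hash) VALUES (s.unique_hash)")
public class Foo {
    // properties as above
}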

Java How To Avoid Type Casting

I have faced this problem a few times in the past, but haven't really found a good solution/design for it.
The example code below generates a PDF document from an Entity (Company or Article):
public class Entity
{
    int id;
}
public class Company extends Entity
{
    private String HQ;
}
public class Article extends Entity
{
    private String title;
}
public interface EntityPDFGenerator
{
    void generate(Entity entity);
}
public class ArticlePDFGenerator implements EntityPDFGenerator
{
    public void generate(Entity entity)
    {
        Article article = (Article) entity;
        // create Article related PDF from entity
    }
}
public class CompanyPDFGenerator implements EntityPDFGenerator
{
    public void generate(Entity entity)
    {
        Company company = (Company) entity;
        // create Company related PDF
    }
}
Main class:
public class PDFGenerator
{
    public void generate(Entity entity)
    {
        EntityPDFGenerator pdfGenerator = getConcretePDFGenerator(entity);
        pdfGenerator.generate(entity);
    }
    // let's make the factory task simple for now
    EntityPDFGenerator getConcretePDFGenerator(Entity entity)
    {
        if (entity instanceof Article) {
            return new ArticlePDFGenerator();
        } else {
            return new CompanyPDFGenerator();
        }
    }
}
In the above approach the problem is casting the Entity to the concrete type (casting can be dangerous at a later stage of the code). I tried to make it work with generics, but then I get the warning
Unchecked call to 'generate(T)'
Can I improve this code?
Here you go, with the suggested changes:
public interface EntityPDFGenerator<T extends Entity> {
    void generate(T entity);
}
public class ArticlePDFGenerator implements EntityPDFGenerator<Article> {
    public void generate(Article entity)
    {
        // create Article related PDF from entity
    }
}
public class CompanyPDFGenerator implements EntityPDFGenerator<Company> {
    public void generate(Company entity)
    {
        // create Company related PDF
    }
}
Short answer
Generics is not the right tool here. You can make the casting explicit:
public class CompanyPDFGenerator implements EntityPDFGenerator
{
    public void generate(Entity entity)
    {
        if (!(entity instanceof Company)) {
            throw new IllegalArgumentException("CompanyPDFGenerator works with Company object. You provided "
                    + (entity == null ? "null" : entity.getClass().getName()));
        }
        Company company = (Company) entity;
        System.out.println(company);
        // create Company related PDF
    }
}
Or you can define some sort of data structure in the entity class and use only that in the printer:
public abstract class Entity
{
    int id;
    public abstract EntityPdfData getPdfData();
}
// ...
public class CompanyPDFGenerator implements EntityPDFGenerator
{
    public void generate(Entity entity)
    {
        EntityPdfData entityPdfData = entity.getPdfData();
        // create Company related PDF
    }
}
Long answer
Generics are useful if you know the types at compile time, i.e. if you can write the actual type into your program. For lists it looks simple:
// now you know at compile time that you need a list of integers
List<Integer> list = new ArrayList<>();
In your example you don't know that:
public void generate(Entity entity)
{
    // either an Article or a Company can come in: it's a general method
    EntityPDFGenerator pdfGenerator = getConcretePDFGenerator(entity);
    pdfGenerator.generate(entity);
}
Suppose you want to add a type parameter to EntityPDFGenerator, like this:
public static interface EntityPDFGenerator<T extends Entity>
{
    void generate(T entity);
}
public static class ArticlePDFGenerator implements EntityPDFGenerator<Article>
{
    public void generate(Article entity)
    {
        // create Article related PDF from entity, no cast needed
    }
}
public static class CompanyPDFGenerator implements EntityPDFGenerator<Company>
{
    public void generate(Company entity)
    {
        // create Company related PDF, no cast needed
    }
}
This looks nice. However, getting the right generator will be tricky. Java generics are invariant: even ArrayList<Integer> is not a subclass of ArrayList<Number>, so ArticlePDFGenerator is not a subtype of EntityPDFGenerator<Entity>. I.e. this will not compile:
<T extends Entity> EntityPDFGenerator<T> getConcretePDFGenerator(T entity, Class<T> classToken)
{
    if (entity instanceof Article) {
        return new ArticlePDFGenerator();
    } else {
        return new CompanyPDFGenerator();
    }
}
I would suggest moving the getGenerator() method into the Entity class and overriding it in the Company and Article classes, as sketched below.
Unless, of course, there is a good reason not to.
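One hedged way to read that suggestion (the method name generatePdf() is invented): let each subclass dispatch to its own typed generator, so the concrete type is available and no cast is needed:
public abstract class Entity
{
    int id;
    public abstract void generatePdf();
}
public class Article extends Entity
{
    private String title;
    @Override
    public void generatePdf()
    {
        // 'this' is an Article here, so the typed generator fits without a cast.
        new ArticlePDFGenerator().generate(this);
    }
}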

Weld: Generic factory for many service-interfaces extending common super-interface

How can I create a single common factory for hundreds of service-interfaces?
I have a common generic super-interface, which all my service-interfaces extend: BaseDao<T>
There are hundreds of (generated) interfaces sub-classing my BaseDao, e.g. CustomerDao extends BaseDao<Customer>. Of course, I do not want to implement a separate factory for every sub-class, especially because there is already a DaoFactory which I need to "glue" into my Weld environment.
Hence, I implemented this:
@ApplicationScoped
public class InjectingDaoFactory {
    @SuppressWarnings("rawtypes") // We *MUST* *NOT* declare a wild-card -- Weld does not accept it => omit the type argument completely.
    @Produces
    public BaseDao getDao(final InjectionPoint injectionPoint) {
        final Type type = injectionPoint.getType();
        // ... some checks and helpful exceptions ...
        final Class<?> c = (Class<?>) type;
        // ... more checks and helpful exceptions ...
        @SuppressWarnings("unchecked")
        final Class<BaseDao<?>> clazz = (Class<BaseDao<?>>) c;
        final BaseDao<?> dao = DaoFactory.getDao(clazz);
        return dao;
    }
}
In the code requiring such a DAO, I now tried this:
@Inject
private CustomerDao customerDao;
But I get the error org.jboss.weld.exceptions.DeploymentException: WELD-001408: Unsatisfied dependencies for type CustomerDao with qualifiers @Default -- Weld does not understand that my InjectingDaoFactory is capable of providing the correct sub-class to meet the dependency on CustomerDao.
Please note that I (of course) did not yet have the chance to debug the code of my factory. Maybe I need to use InjectionPoint.getMember() instead of InjectionPoint.getType() -- but that is not my problem now. My problem is that Weld does not understand at all that my factory is responsible for the sub-interfaces extending BaseDao.
So, what do I need to do to make Weld understand that one single factory can provide all the implementations of the many sub-interfaces of my BaseDao common DAO-interface?
According to this documentation, I created the following extension, which seems to work fine:
public class InjectingDaoExtension implements Extension {
    public InjectingDaoExtension() {
    }
    private final Set<Class<? extends BaseDao>> injectedDaoInterfaces = new HashSet<>();
    public <T> void processInjectionTarget(@Observes ProcessInjectionTarget<T> pit, BeanManager beanManager) {
        final InjectionTarget<T> it = pit.getInjectionTarget();
        for (InjectionPoint injectionPoint : it.getInjectionPoints()) {
            Field field = null;
            try {
                Member member = injectionPoint.getMember();
                field = member.getDeclaringClass().getDeclaredField(member.getName());
            } catch (Exception e) {
                // ignore
            }
            if (field != null) {
                Class<?> type = field.getType();
                if (BaseDao.class.isAssignableFrom(type)) {
                    if (!type.isInterface()) {
                        pit.addDefinitionError(new IllegalStateException(String.format("%s is not an interface! Cannot inject: %s", type, field)));
                    }
                    @SuppressWarnings("unchecked")
                    Class<? extends BaseDao> c = (Class<? extends BaseDao>) type;
                    injectedDaoInterfaces.add(c);
                } else {
                    field = null;
                }
            }
        }
    }
    public void afterBeanDiscovery(@Observes AfterBeanDiscovery abd, BeanManager beanManager) {
        for (Class<? extends BaseDao> daoInterface : injectedDaoInterfaces) {
            abd.addBean(createBean(daoInterface, beanManager));
        }
    }
    protected <D extends BaseDao> Bean<D> createBean(final Class<D> daoInterface, final BeanManager beanManager) {
        return new Bean<D>() {
            private InjectionTarget<D> injectionTarget;
            public synchronized InjectionTarget<D> getInjectionTargetOrNull() {
                return injectionTarget;
            }
            public synchronized InjectionTarget<D> getInjectionTarget() {
                if (injectionTarget == null) {
                    D handler = DaoFactory.getDao(daoInterface);
                    @SuppressWarnings("unchecked")
                    Class<D> handlerClass = (Class<D>) handler.getClass();
                    final AnnotatedType<D> at = beanManager.createAnnotatedType(handlerClass);
                    injectionTarget = beanManager.createInjectionTarget(at);
                }
                return injectionTarget;
            }
            @Override
            public Class<?> getBeanClass() {
                return daoInterface;
            }
            @Override
            public Set<InjectionPoint> getInjectionPoints() {
                // The underlying DaoFactory is not yet initialised when this method is first called!
                // Hence we do not use getInjectionTarget(), but getInjectionTargetOrNull(). Maybe this
                // causes problems with injections inside the DAOs, but so far, they don't use injection
                // and it does not matter. Additionally, they are RequestScoped and therefore the injection
                // later *may* work fine. Cannot and do not need to test this now. Marco :-)
                InjectionTarget<D> it = getInjectionTargetOrNull();
                return it == null ? Collections.emptySet() : it.getInjectionPoints();
            }
            @Override
            public String getName() {
                return getBeanClass().getSimpleName();
            }
            @Override
            public Set<Annotation> getQualifiers() {
                Set<Annotation> qualifiers = new HashSet<Annotation>();
                qualifiers.add(new AnnotationLiteral<Default>() {});
                qualifiers.add(new AnnotationLiteral<Any>() {});
                return qualifiers;
            }
            @Override
            public Class<? extends Annotation> getScope() {
                return RequestScoped.class;
            }
            @Override
            public Set<Class<? extends Annotation>> getStereotypes() {
                return Collections.emptySet();
            }
            @Override
            public Set<Type> getTypes() {
                Set<Type> types = new HashSet<>();
                types.add(daoInterface); // TODO add more types?!
                return types;
            }
            @Override
            public D create(CreationalContext<D> creationalContext) {
                D handler = DaoFactory.getDao(daoInterface);
                InjectionTarget<D> it = getInjectionTarget();
                it.inject(handler, creationalContext);
                it.postConstruct(handler);
                return handler;
            }
            @Override
            public void destroy(D instance, CreationalContext<D> creationalContext) {
                InjectionTarget<D> it = getInjectionTarget();
                it.preDestroy(instance);
                it.dispose(instance);
                creationalContext.release();
            }
            @Override
            public boolean isAlternative() {
                return false;
            }
            @Override
            public boolean isNullable() {
                return false;
            }
        };
    }
}
The idea is that it first collects all sub-interfaces of BaseDao that need to be injected. Then it registers a bean, backed by the factory, for each of them.
Important: as already stated in the comments, it is necessary to put this extension in a separate JAR which does not provide any services itself. As soon as I placed a class implementing Extension in the same JAR as a service implementation (e.g. published via @RequestScoped), the service was no longer found.
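One detail worth restating (standard CDI portable-extension plumbing, not specific to this code): the extension JAR must register the class as a service provider in a file named
META-INF/services/javax.enterprise.inject.spi.Extension
whose single line is the fully-qualified name of InjectingDaoExtension; otherwise Weld never discovers the extension at all.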

Avoid If-else code smell with creation of objects which depend upon specific conditions

Is there a better way to deal with the instantiation of an object (Product) which depends upon another object's type (Condition) than using if-else paired with instanceof, as the following code shows?
import java.util.ArrayList;
import java.util.List;
abstract class AbstractProduct {
    private AbstractCondition condition;
    public AbstractProduct(AbstractCondition condition) {
        this.condition = condition;
    }
    public abstract void doSomething();
}
class ProductA extends AbstractProduct {
    public ProductA(AbstractCondition condition) {
        super(condition);
    }
    @Override
    public void doSomething() {
        System.out.println("I'm Product A");
    }
}
class ProductB extends AbstractProduct {
    public ProductB(AbstractCondition condition) {
        super(condition);
    }
    @Override
    public void doSomething() {
        System.out.println("I'm Product B");
    }
}
class AbstractCondition { }
class ConditionA extends AbstractCondition { }
class ConditionB extends AbstractCondition { }
public class Try {
    public static void main(String[] args) {
        List<AbstractCondition> conditions = new ArrayList<AbstractCondition>();
        List<AbstractProduct> products = new ArrayList<AbstractProduct>();
        conditions.add(new ConditionA());
        conditions.add(new ConditionB());
        conditions.add(new ConditionB());
        conditions.add(new ConditionA());
        for (AbstractCondition c : conditions) {
            tryDoSomething(c);
        }
    }
    public static void tryDoSomething(AbstractCondition condition) {
        AbstractProduct product = null;
        if (condition instanceof ConditionA) {
            product = new ProductA(condition);
        } else if (condition instanceof ConditionB) {
            product = new ProductB(condition);
        }
        product.doSomething();
    }
}
The difference between the code above and my real code is: I have NO direct control over AbstractCondition and its subtypes (they live in a library), but the creation of a concrete subtype of AbstractProduct depends on the concrete condition.
My goal: avoid the if-else code smell in tryDoSomething().
I would also like to avoid reflection, because it feels like cheating and I don't think it's an elegant, clean and readable solution.
In other words, I would like to tackle the problem with good OOP principles (e.g. exploiting polymorphism) and perhaps some design patterns (which I apparently don't know for this specific case).
Since you can't edit the original objects, you need to create a static map from condition type to product type:
private static Map<Class<? extends AbstractCondition>,
        Class<? extends AbstractProduct>> conditionToProduct = new HashMap<>();
Fill it in static initialization with the Condition/Product pairs:
static {
    conditionToProduct.put(ConditionA.class, ProductA.class);
    ...
}
and at runtime just query the map (the products only have a constructor taking a condition, so reflect on that constructor rather than calling newInstance()):
Class<? extends AbstractProduct> productClass = conditionToProduct.get(condition.getClass());
AbstractProduct product = productClass.getConstructor(AbstractCondition.class).newInstance(condition);
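Since the question explicitly wants to avoid reflection, a hedged variant of the same registry maps condition types to constructor references instead of Class objects (Java 8+, using java.util.function.Function; the method name productFor is made up):
private static final Map<Class<? extends AbstractCondition>,
        Function<AbstractCondition, AbstractProduct>> factories = new HashMap<>();
static {
    factories.put(ConditionA.class, ProductA::new);
    factories.put(ConditionB.class, ProductB::new);
}
public static AbstractProduct productFor(AbstractCondition condition) {
    // Constructor references keep the creation type-safe and reflection-free.
    return factories.get(condition.getClass()).apply(condition);
}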
AbstractCondition needs to know either the type or how to construct a product. So add one of the following methods to AbstractCondition:
Class<? extends AbstractProduct> getProductClass()
or
AbstractProduct createProduct()
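A sketch of the createProduct() option (only workable where the condition hierarchy can actually be modified, which the question says is not the case for the library types):
abstract class AbstractCondition {
    abstract AbstractProduct createProduct();
}
class ConditionA extends AbstractCondition {
    @Override
    AbstractProduct createProduct() {
        return new ProductA(this); // each condition knows its matching product
    }
}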
You should create a Factory class to help you with that then.
interface IFactoryProduct {
    AbstractProduct getProduct(AbstractCondition condition) throws Exception;
}
This will be your interface; just implement it like this:
class FactoryProduct implements IFactoryProduct {
    public AbstractProduct getProduct(AbstractCondition condition) throws Exception {
        return (AbstractProduct) getClass().getMethod("getProduct", condition.getClass()).invoke(this, condition);
    }
    public ProductA getProduct(ConditionA condition) {
        return new ProductA(condition);
    }
    public ProductB getProduct(ConditionB condition) {
        return new ProductB(condition);
    }
}
Using reflection to dispatch to the correct overload does the trick. This can be extended for subclasses if you want.
EDIT:
Some examples:
FactoryProduct f = new FactoryProduct();
List<AbstractCondition> list = new ArrayList<AbstractCondition>();
list.add(new ConditionA());
list.add(new ConditionB());
for (AbstractCondition c : list) {
    try {
        System.out.println(f.getProduct(c));
    } catch (Exception ex) {
        Logger.getLogger(Main.class.getName()).log(Level.SEVERE, null, ex);
    }
}
labo.ProductA#c17164
labo.ProductB#1fb8ee3
A more complex reflection version, allowing a subclass to be received:
public AbstractProduct getProduct(AbstractCondition condition) throws Exception {
    Method m = getMethodFor(condition.getClass());
    if (m == null) {
        throw new Exception("No method for this condition " + condition.getClass().getSimpleName());
    } else {
        return (AbstractProduct) m.invoke(this, condition);
    }
}
private Method getMethodFor(Class<? extends AbstractCondition> clazz) throws Exception {
    try {
        return getClass().getMethod("getProduct", clazz);
    } catch (NoSuchMethodException ex) {
        if (clazz.getSuperclass() != AbstractCondition.class) {
            return getMethodFor((Class<? extends AbstractCondition>) clazz.getSuperclass());
        }
        return null;
    }
}
This allows me to send a ConditionC extending ConditionB and build the same product as ConditionB would. Interesting for complex inheritance.

Is it possible to extend enum in Java 8?

Just playing, I came up with a sweet way to add functionality to enums (see Java Enum toString() method) with this.
Some further tinkering allowed me to nearly also add a tidy (i.e. not throwing an exception) reverse look-up, but there's a problem. It's reporting:
error: valueOf(String) in X cannot implement valueOf(String) in HasValue
public enum X implements PoliteEnum, ReverseLookup {
overriding method is static
Is there a way?
The aim here is to silently add (via an interface implementation with a default method, like I added politeName in the linked answer) a lookup method that does the valueOf function without throwing an exception. Is it possible? It is clearly now possible to extend enums - one of my major problems with Java until now.
Here's my failed attempt:
public interface HasName {
    public String name();
}
public interface PoliteEnum extends HasName {
    default String politeName() {
        return name().replace("_", " ");
    }
}
public interface Lookup<P, Q> {
    public Q lookup(P p);
}
public interface HasValue {
    HasValue valueOf(String name);
}
public interface ReverseLookup extends HasValue, Lookup<String, HasValue> {
    @Override
    default HasValue lookup(String from) {
        try {
            return valueOf(from);
        } catch (IllegalArgumentException e) {
            return null;
        }
    }
}
public enum X implements PoliteEnum/* NOT ALLOWED :( , ReverseLookup*/ {
    A_For_Ism, B_For_Mutton, C_Forth_Highlanders;
}
public void test() {
    // Test the politeName
    for (X x : X.values()) {
        System.out.println(x.politeName());
    }
    // ToDo: Test lookup
}
You are over-complicating your design. If you are willing to accept that you can invoke a default method on an instance only, the entire code may look like this:
interface ReverseLookupSupport<E extends Enum<E>> {
    Class<E> getDeclaringClass();
    default E lookup(String name) {
        try {
            return Enum.valueOf(getDeclaringClass(), name);
        } catch (IllegalArgumentException ex) {
            return null;
        }
    }
}
enum Test implements ReverseLookupSupport<Test> {
    FOO, BAR
}
You can test it with:
Test foo = Test.FOO;
Test bar = foo.lookup("BAR"), baz = foo.lookup("BAZ");
System.out.println(bar + " " + baz);
A non-throwing/catching alternative would be:
interface ReverseLookupSupport<E extends Enum<E>> {
    Class<E> getDeclaringClass();
    default Optional<E> lookup(String name) {
        return Stream.of(getDeclaringClass().getEnumConstants())
                .filter(e -> e.name().equals(name)).findFirst();
    }
}
to use like:
Test foo = Test.FOO;
Test bar = foo.lookup("BAR").orElse(null), baz = foo.lookup("BAZ").orElse(null);
System.out.println(bar + " " + baz);
Here there are basically two points. Specifically, the reason it doesn't compile is JLS 8.4.8.1:
It is a compile-time error if an instance method overrides a static method.
In other words, an enum can't implement HasValue because of the name clash.
Then there's the more general issue that static methods just cannot be 'overridden'. Since valueOf is a static method inserted by the compiler on the Enum-derived class itself, there's no way to change it. Interfaces can't solve it either, since their static methods are not inherited by implementing classes.
In this specific case, it's a place where composition can make this kind of thing less repetitive, for example:
public class ValueOfHelper<E extends Enum<E>> {
    private final Map<String, E> map = new HashMap<String, E>();
    public ValueOfHelper(Class<E> cls) {
        for (E e : EnumSet.allOf(cls))
            map.put(e.name(), e);
    }
    public E valueOfOrNull(String name) {
        return map.get(name);
    }
}
public enum Composed {
    A, B, C;
    private static final ValueOfHelper<Composed> HELPER =
            new ValueOfHelper<Composed>(Composed.class);
    public static Composed valueOfOrNull(String name) {
        return HELPER.valueOfOrNull(name);
    }
}
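Usage is then a plain static call that returns null instead of throwing:
Composed b = Composed.valueOfOrNull("B"); // Composed.B
Composed d = Composed.valueOfOrNull("D"); // null, no exception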
(Plus, I'd recommend that over catching the exception anyway.)
I realize "you can't do it" is not really a desirable answer but I don't see a way around it due to the static aspect.
The case is the same as why you cannot create a default toString() in an interface: the enum already contains a signature for the static valueOf(String) method, therefore you cannot override it.
Enums are compile-time constants, and because of that it is really doubtful that they will ever be extensible.
If you want to get the constant via its name, you can use this:
public static <E extends Enum<E>> Optional<E> valueFor(Class<E> type, String name) {
    return Arrays.stream(type.getEnumConstants()).filter(x -> x.name().equals(name)).findFirst();
}
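For example, with the enum X from the question:
Optional<X> hit  = valueFor(X.class, "A_For_Ism"); // Optional[A_For_Ism]
Optional<X> miss = valueFor(X.class, "nope");      // Optional.empty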
I think I have an answer - it's hacky and uses reflection, but it seems to fit the brief - i.e. reverse lookup without methods in the enum and without throwing exceptions.
public interface HasName {
    public String name();
}
public interface PoliteEnum extends HasName {
    default String politeName() {
        return name().replace("_", " ");
    }
}
public interface Lookup<P, Q> {
    public Q lookup(P p);
}
public interface ReverseLookup<T extends Enum<T>> extends Lookup<String, T> {
    @Override
    default T lookup(String s) {
        return (T) useMap(this, s);
    }
}
// Probably do something better than this in the final version.
static final Map<String, Enum> theMap = new HashMap<>();
static Enum useMap(Object o, String s) {
    if (theMap.isEmpty()) {
        try {
            // Yukk!!
            Enum it = (Enum) o;
            Class c = it.getDeclaringClass();
            // Reflect to call the static method.
            Method method = c.getMethod("values");
            // Yukk!!
            Enum[] enums = (Enum[]) method.invoke(null);
            // Walk the enums.
            for (Enum e : enums) {
                theMap.put(e.name(), e);
            }
        } catch (Exception ex) {
            // Ewwww
        }
    }
    return theMap.get(s);
}
public enum X implements PoliteEnum, ReverseLookup<X> {
    A_For_Ism,
    B_For_Mutton,
    C_Forth_Highlanders;
}
public void test() {
    for (X x : X.values()) {
        System.out.println(x.politeName());
    }
    for (X x : X.values()) {
        System.out.println(x.lookup(x.name()));
    }
}
prints
A For Ism
B For Mutton
C Forth Highlanders
A_For_Ism
B_For_Mutton
C_Forth_Highlanders
Added
Inspired by @Holger - this is what I feel is most like what I was looking for:
public interface ReverseLookup<E extends Enum<E>> extends Lookup<String, E> {
    // Map of all classes that have lookups.
    Map<Class, Map<String, Enum>> lookups = new ConcurrentHashMap<>();
    // What I need from the Enum.
    Class<E> getDeclaringClass();
    @Override
    default E lookup(String name) {
        // What class.
        Class<E> c = getDeclaringClass();
        // Get the map.
        final Map<String, Enum> lookup = lookups.computeIfAbsent(c,
                k -> Stream.of(c.getEnumConstants())
                        // Roll each enum into the lookup.
                        .collect(Collectors.toMap(Enum::name, Function.identity())));
        // Look it up.
        return c.cast(lookup.get(name));
    }
}
// Use the above interfaces to add to the enum.
public enum X implements PoliteEnum, ReverseLookup<X> {
    A_For_Ism,
    B_For_Mutton,
    C_Forth_Highlanders;
}
