How to distinguish between instances of a class? - Java

I have a class Validator, which manages all validation criteria from files and a database. These criteria are loaded by a Loader like this:
Validator validator = Loader.load("clients"); //get all from clients.cfg file
What is the best approach to determine from another class, which criteria are currently loaded?
Importer importer;
Validator clientsValidator = Loader.load("clients");
Validator addressValidator = Loader.load("address"); ...
importer.validate(data, clientsValidator, addressValidator);
public class Importer {
    public void validate(Data data, Validator... validator) {
        ...
        validateClient(data, one of the validators);
        validateAddress(data, another of the validators);
        ...
    }
}
I need to know in the Importer class which Validator is for Clients and which is for Addresses... Any good approaches?

The best way would be for you to add a field and an accompanying method to Validator that returns the identifier (e.g. "clients") with which it was created.
Alternatively, if by using a different identifier when calling Loader.load() you get back instances of different classes implementing the Validator interface, then you can use the Object.getClass() method to tell those classes apart. If those classes are within a pretty small set you might even get away with using instanceof directly.
We would need more information, such as what Loader does exactly, what Validator is and how much you are allowed to change their code before being able to provide a more concrete answer.
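For illustration, here is a minimal sketch of that first suggestion, assuming Loader can be changed so that it passes the identifier it was given into the Validator it creates (the field and method names are just placeholders):

public class Validator {
    private final String identifier; // e.g. "clients" or "address", set by Loader

    public Validator(String identifier) {
        this.identifier = identifier;
    }

    public String getIdentifier() {
        return identifier;
    }

    // ... existing validation logic ...
}

Importer could then branch on getIdentifier() inside its varargs loop to pick the right validateXxx method.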
EDIT:
Quite honestly, perhaps you should consider redesigning your data model. As it stands, you can apparently mix clients and addresses without any checks. You should restructure your code so that you can rely on the type safety features of Java.
One way would be to have a generic class/interface Validator<T>, where T would be the class of the validated objects:
public interface Validator<T> {
    public boolean validate(T object);
}
You could then have specific Data subclasses for your data, such as Address or Client, and pass typed Validator objects to Importer through specific methods:
public class Importer {
    public void addAddressValidator(Validator<Address> validator) {
        ...
    }

    public void addClientValidator(Validator<Client> validator) {
        ...
    }
}
This is far safer than mixing all validator objects in a single variadic method call, and it is also the preferred approach of most common frameworks in the wild.

Why not have a getSource() in Validator which gets set when Loader loads the source?
Thinking more about the specific question below:
I need to know in Importer class, which Validator is for Clients,
which for Addresses... Any good approaches?
Actually a better way to do this is if Loader can return a ClientValidator (an implementation of Validator) for clients and an AddressValidator for addresses.
That way you can avoid the if-else conditions and directly call validate on the Validator class.
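As a rough sketch of that idea (ClientValidator and AddressValidator are the names suggested above; the abstract validate method and the change to Loader are assumptions, and each class would go in its own file):

public abstract class Validator {
    public abstract void validate(Data data);
}

public class ClientValidator extends Validator {
    public void validate(Data data) { /* client-specific checks */ }
}

public class AddressValidator extends Validator {
    public void validate(Data data) { /* address-specific checks */ }
}

public class Loader {
    public static Validator load(String name) {
        // pick the implementation based on the configuration name
        return "clients".equals(name) ? new ClientValidator() : new AddressValidator();
    }
}

Importer can then simply loop over the validators it receives and call validate(data) on each one, with no if-else checks.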

Pass the validators by position. You must also check whether the specific validator is null or not before you use it.
public void validate(Data data,
                     Validator clientsValidator,
                     Validator addressValidator) {
    ...
    if (clientsValidator != null) {
        validateClient(data, clientsValidator);
    }
    if (addressValidator != null) {
        validateAddress(data, addressValidator);
    }
    ...
}

Storing all classes that use an interface with reflection? [duplicate]

Can I do it with reflection or something like that?
I have been searching for a while and there seem to be different approaches; here is a summary:
The Reflections library is pretty popular if you don't mind adding the dependency. It would look like this:
Reflections reflections = new Reflections("firstdeveloper.examples.reflections");
Set<Class<? extends Pet>> classes = reflections.getSubTypesOf(Pet.class);
ServiceLoader (as per erickson's answer), which would look like this:
ServiceLoader<Pet> loader = ServiceLoader.load(Pet.class);
for (Pet implClass : loader) {
    System.out.println(implClass.getClass().getSimpleName()); // prints Dog, Cat
}
Note that for this to work you need to define Pet as a Service Provider Interface (SPI) and declare its implementations. You do that by creating a file in resources/META-INF/services with the name examples.reflections.Pet and declaring all implementations of Pet in it:
examples.reflections.Dog
examples.reflections.Cat
A package-level annotation. Here is an example:
Package[] packages = Package.getPackages();
for (Package p : packages) {
    MyPackageAnnotation annotation = p.getAnnotation(MyPackageAnnotation.class);
    if (annotation != null) {
        Class<?>[] implementations = annotation.implementationsOfPet();
        for (Class<?> impl : implementations) {
            System.out.println(impl.getSimpleName());
        }
    }
}
and the annotation definition:
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.PACKAGE)
public @interface MyPackageAnnotation {
    Class<?>[] implementationsOfPet() default {};
}
You must declare the package-level annotation in a file named package-info.java inside that package. Here are sample contents:
@MyPackageAnnotation(implementationsOfPet = {Dog.class, Cat.class})
package examples.reflections;
Note that only packages that are known to the ClassLoader at that time will be loaded by a call to Package.getPackages().
In addition, there are other approaches based on URLClassLoader that will always be limited to classes that have already been loaded, unless you do a directory-based search.
What erickson said, but if you still want to do it then take a look at Reflections. From their page:
Using Reflections you can query your metadata for:
get all subtypes of some type
get all types annotated with some annotation
get all types annotated with some annotation, including annotation parameters matching
get all methods annotated with some annotation
In general, it's expensive to do this. To use reflection, the class has to be loaded. If you want to load every class available on the classpath, that will take time and memory, and isn't recommended.
If you want to avoid this, you'd need to implement your own class file parser that operates more efficiently, instead of using reflection. A byte code engineering library may help with this approach.
The Service Provider mechanism is the conventional means to enumerate implementations of a pluggable service, and has become more established with the introduction of Project Jigsaw (modules) in Java 9. Use the ServiceLoader in Java 6, or implement your own in earlier versions. I provided an example in another answer.
Spring has a pretty simple way to achieve this:
public interface ITask {
    void doStuff();
}

@Component
public class MyTask implements ITask {
    public void doStuff() {}
}
Then you can autowire a list of type ITask and Spring will populate it with all implementations:
@Service
public class TaskService {
    @Autowired
    private List<ITask> tasks;
}
The most robust mechanism for listing all classes that implement a given interface is currently ClassGraph, because it handles the widest possible array of classpath specification mechanisms, including the new JPMS module system. (I am the author.)
try (ScanResult scanResult = new ClassGraph().whitelistPackages("x.y.z")
        .enableClassInfo().scan()) {
    for (ClassInfo ci : scanResult.getClassesImplementing("x.y.z.SomeInterface")) {
        foundImplementingClass(ci); // Do something with the ClassInfo object
    }
}
With ClassGraph it's pretty simple:
Groovy code to find implementations of my.package.MyInterface:
@Grab('io.github.classgraph:classgraph:4.6.18')
import io.github.classgraph.*

new ClassGraph().enableClassInfo().scan().withCloseable { scanResult ->
    scanResult.getClassesImplementing('my.package.MyInterface').findAll{!it.abstract}*.name
}
What erickson said is best. Here's a related question and answer thread - http://www.velocityreviews.com/forums/t137693-find-all-implementing-classes-in-classpath.html
The Apache BCEL library allows you to read classes without loading them. I believe it will be faster because you should be able to skip the verification step. The other problem with loading all classes using the classloader is that you will suffer a huge memory impact, as well as inadvertently run any static code blocks, which you probably do not want to do.
The Apache BCEL library link - http://jakarta.apache.org/bcel/
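For example, a minimal sketch using BCEL's ClassParser could check a .class file on disk without loading it into the JVM (the class name here is made up, and only directly declared interfaces are inspected, not inherited ones):

import java.io.IOException;

import org.apache.bcel.classfile.ClassParser;
import org.apache.bcel.classfile.JavaClass;

public class BcelInterfaceCheck {

    // Returns true if the .class file at the given path directly declares the interface.
    public static boolean declaresInterface(String classFilePath, String interfaceName)
            throws IOException {
        JavaClass parsedClass = new ClassParser(classFilePath).parse();
        for (String implemented : parsedClass.getInterfaceNames()) {
            if (implemented.equals(interfaceName)) {
                return true;
            }
        }
        return false;
    }
}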
Yes, the first step is to identify "all" the classes that you care about. If you already have this information, you can enumerate through each of them and use instanceof to validate the relationship. A related article is here: https://web.archive.org/web/20100226233915/www.javaworld.com/javaworld/javatips/jw-javatip113.html
Also, if you are writing an IDE plugin (where what you are trying to do is relatively common), then the IDE typically offers you more efficient ways to access the class hierarchy of the current state of the user code.
I ran into the same issue. My solution was to use reflection to examine all of the methods in an ObjectFactory class, eliminating those that were not createXXX() methods returning an instance of one of my bound POJOs. Each class so discovered is added to a Class[] array, which was then passed to the JAXBContext instantiation call. This performs well, needing only to load the ObjectFactory class, which was about to be needed anyway. I only need to maintain the ObjectFactory class, a task either performed by hand (in my case, because I started with POJOs and used schemagen), or can be generated as needed by xjc. Either way, it is performant, simple, and effective.
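A rough sketch of that approach (ObjectFactory stands for the JAXB-generated factory mentioned above; the filtering rule shown, no-argument createXXX() methods, is a simplification):

import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

import javax.xml.bind.JAXBContext;
import javax.xml.bind.JAXBException;

public class JaxbContextBuilder {

    public static JAXBContext buildContext() throws JAXBException {
        List<Class<?>> boundClasses = new ArrayList<Class<?>>();
        for (Method method : ObjectFactory.class.getMethods()) {
            // keep only no-arg createXXX() methods; their return types are the bound POJOs
            if (method.getName().startsWith("create") && method.getParameterTypes().length == 0) {
                boundClasses.add(method.getReturnType());
            }
        }
        return JAXBContext.newInstance(boundClasses.toArray(new Class<?>[0]));
    }
}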
A new version of @kaybee99's answer, but now returning what the user asks: the implementations...
Spring has a pretty simple way to achieve this:
public interface ITask {
    void doStuff();

    default ITask getImplementation() {
        return this;
    }
}

@Component
public class MyTask implements ITask {
    public void doStuff() {}
}
Then you can autowire a list of type ITask and Spring will populate it with all implementations:
@Service
public class TaskService {

    @Autowired(required = false)
    private List<ITask> tasks;

    public void runAll() {
        if (tasks != null) {
            for (ITask taskImpl : tasks) {
                taskImpl.doStuff();
            }
        }
    }
}

Force the usage of a JPA AttributeConverter for enums

We're trying to figure out a robust way of persisting enums using JPA. The common approach of using @Enumerated is not desirable, because it's too easy to break the mappings when refactoring. Each enum should have a separate database value that can be different from the enum name/order, so that you can safely change the name or internal ordering (e.g. the ordinal values) of the enum without breaking anything. E.g. this blog post has an example of how to achieve this, but we feel the suggested solution adds too much clutter to the code. We'd like to achieve a similar result by using the new AttributeConverter mechanism introduced in JPA 2.1.

We have an interface that each enum should implement that defines a method for getting the value that is used to store the enum in the database. Example:
public interface PersistableEnum {
    String getDatabaseValue();
}

...

public enum SomeEnum implements PersistableEnum {
    FOO("foo"), BAR("bar");

    private String databaseValue;

    private SomeEnum(String databaseValue) {
        this.databaseValue = databaseValue;
    }

    public String getDatabaseValue() {
        return databaseValue;
    }
}
We also have a base converter that has the logic for converting enums to Strings and vice versa, and separate concrete converter classes for each enum type (AFAIK, a fully generic enum converter is not possible to implement; this is also noted in this SO answer). The concrete converters then simply call the base class that does the conversion, like this:
public abstract class EnumConverter<E extends PersistableEnum> {
    protected String toDatabaseValue(E value) {
        // Do the conversion...
    }

    protected E toEntityAttribute(Class<E> enumClass, String value) {
        // Do the conversion...
    }
}

...

@Converter(autoApply = true)
public class SomeEnumConverter extends EnumConverter<SomeEnum>
        implements AttributeConverter<SomeEnum, String> {

    public String convertToDatabaseColumn(SomeEnum attribute) {
        return toDatabaseValue(attribute);
    }

    public SomeEnum convertToEntityAttribute(String dbData) {
        return toEntityAttribute(SomeEnum.class, dbData);
    }
}
However, while this approach works very nicely in a technical sense, there's still a pretty nasty pitfall: whenever someone creates a new enum class whose values need to be stored to the database, that person also needs to remember to make the new enum implement the PersistableEnum interface and write a converter class for it. Without this, the enum will get persisted without a problem, but the conversion will default to using @Enumerated(EnumType.ORDINAL), which is exactly what we want to avoid.

How could we prevent this? Is there a way to make JPA (in our case, Hibernate) NOT default to any mapping, but e.g. throw an exception if no @Enumerated is defined on a field and no converter can be found for the type? Or could we create a "catch all" converter that is called for all enums that don't have their own specific converter class and always throw an exception from there? Or do we just have to suck it up and try to remember the additional steps each time?
You want to ensure that all Enums are instances of PersistableEnum.
You need to set a Default Entity Listener (an entity listener whose callbacks apply to all entities in the persistence unit).
In the Default Entity Listener class, implement the @PrePersist method and make sure all the enums are instances of PersistableEnum.
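A minimal sketch of such a listener, assuming it is registered as a default entity listener (e.g. via <persistence-unit-defaults> in orm.xml) so the callback applies to every entity; the class name and exception message are illustrative:

import java.lang.reflect.Field;

import javax.persistence.PrePersist;

public class PersistableEnumCheckListener {

    @PrePersist
    public void verifyEnumFields(Object entity) {
        for (Field field : entity.getClass().getDeclaredFields()) {
            // fail fast if an enum-typed field does not implement PersistableEnum
            if (field.getType().isEnum()
                    && !PersistableEnum.class.isAssignableFrom(field.getType())) {
                throw new IllegalStateException("Enum field '" + field.getName() + "' of "
                        + entity.getClass().getName() + " does not implement PersistableEnum");
            }
        }
    }
}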

Java Inheritance and Wrapping

I have a generated object that I want to:
Preserve the existing functionality of, without injecting it into the constructor and rewriting every method to call injectedObject.sameMethod().
Add additional functionality to, without modifying the generated object.
For example:
public class GeneratedObject {
    public String getThis() { ... }
    public String getThat() { ... }
}

public interface ObjectWrapper {
    String doThisWithThat();
}

public class ObjectWrapperImpl extends GeneratedObject implements ObjectWrapper {
    public String doThisWithThat() { ... }
}
However, downcasting is not allowed. What is the proper implementation without rewriting a bunch of redundant code just to wrap the object?
I think the decorator pattern may help you: "The decorator pattern can be used to extend (decorate) the functionality of a certain object at run-time, independently of other instances of the same class."
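A minimal decorator sketch based on the classes from the question (the body of doThisWithThat() is illustrative); note that a hand-written decorator still forwards each existing method explicitly:

public class GeneratedObjectDecorator implements ObjectWrapper {

    private final GeneratedObject delegate;

    public GeneratedObjectDecorator(GeneratedObject delegate) {
        this.delegate = delegate;
    }

    // existing functionality is forwarded to the wrapped instance
    public String getThis() { return delegate.getThis(); }
    public String getThat() { return delegate.getThat(); }

    // new functionality added by the decorator
    public String doThisWithThat() {
        return delegate.getThis() + delegate.getThat();
    }
}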
Have you tried AspectJ? http://www.eclipse.org/aspectj/doc/next/progguide/semantics-declare.html It's a bit complicated, but so is your request.
If you can extract an interface from GeneratedObject, then it would be possible to do this using a dynamic proxy. You would make a proxy which implemented the extracted interface and ObjectWrapper, with an invocation handler which passed all calls to methods in the GeneratedObject interface through to the delegate, and sent the doThisWithThat() calls elsewhere.
Proxies aren't pretty, but the ugliness is at least well-localised.
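A sketch of that approach, assuming an interface (here called GeneratedApi, a made-up name) has been extracted to cover GeneratedObject's methods:

import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

public class WrapperFactory {

    public static ObjectWrapper wrap(final GeneratedObject delegate) {
        InvocationHandler handler = new InvocationHandler() {
            public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
                if (method.getDeclaringClass() == ObjectWrapper.class) {
                    // handle doThisWithThat() here; combining the delegate's values is illustrative
                    return delegate.getThis() + delegate.getThat();
                }
                // pass calls to the extracted interface's methods through to the delegate
                return delegate.getClass()
                        .getMethod(method.getName(), method.getParameterTypes())
                        .invoke(delegate, args);
            }
        };
        return (ObjectWrapper) Proxy.newProxyInstance(
                WrapperFactory.class.getClassLoader(),
                new Class<?>[] { GeneratedApi.class, ObjectWrapper.class },
                handler);
    }
}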

Spring + Mongo + Generics + Flexibility

The following code doesn't work (of course), because the marked line does not compile:
public class MyClass {
    // singleton stuff
    private static MyClass instance;

    private MyClass() {}

    public static MyClass getInstance() {
        if (instance == null) {
            instance = new MyClass();
        }
        return instance;
    }

    // method creating problems
    public NonGenericSuperClassOfGenericClass create(Class<?>... classes) {
        if (someCondition)
            return new GenericClass<classes[0], classes[1]>; // DOES NOT COMPILE
        else
            return new OtherGenericClass<classes[0]>;
    }
}
Therefore, I actually don't know whether "create" will return
GenericClass<classes[0],classes[1]>
or
OtherGenericClass<classes[0]>
which have different numbers of parameters.
This happens because I'm using Spring and I plan to use MongoDB, but in the future I may need to switch to something different (e.g. Hibernate).
The class GenericClass is something like:
GenericClass<PersistentType1, Long>
or
GenericClass<PersistentType2, Long>
where PersistentType1/2 are classes that I need to finally store in the DB, while GenericClass is a sort of proxy to access the Mongo APIs. In fact, it looks like:
public MongoTemplate getTemplate();
public void save(T toInsert);
public List<T> select(Query selectionQuery);
public T selectById(ID id);
public WriteResult update(Query selectionQuery, Update updatedAttributes);
public void delete(T toRemove);
public void delete(Query selectionQuery);
Now, what?
From Controllers (or Entity, if you are picky) I need to instantiate the repository and invoke any methods. This causes the Controllers to be coupled with MongoDB, i.e. they explicitly have to instantiate such GenericClass, which is actually called MongoRepository and is strictly dependent on Mongo (in fact it is a generic with exactly two "degrees of freedom").
So, I decided to create MyClass, which is a further proxy that isolates the Controllers. In this way, a Controller can get the single instance of MyClass and let it create a new instance of the appropriate repository. In particular, when "someCondition" is true, it means that we want to use MongoRepository (when it is false, we may need to instantiate a Hibernate proxy instead, i.e. HibernateRepository). However, MongoRepository is generic, therefore it requires some form of instantiation, which I hoped to pass as a parameter.
Unfortunately, generics are resolved at compile time, thus they don't work for me, I guess.
How can I fix that?
In order to decouple the underlying persistence store from your application logic I would use the DAO approach.
Define the interface of your DAO with the required methods, e.g. save, update, etc. Then provide an implementation for each persistence provider you might need, e.g. UserAccess might be the interface, which you could implement as HibernateUserAccess and MongoUserAccess. In each implementation you inject the appropriate Template, e.g. Mongo or Hibernate, and use that to complete the persistence operation.
The issue you might have is that your load operation would return an instance of User, this would need to vary across persistence providers i.e. JPA annotations would be different to the Spring Data annotations needed for MongoDB (leaky abstraction).
I would probably solve that by creating a User interface to represent the result of the persistence operation and having an implementation for each persistence provider. Either that or return a common model which you build from the results of a JPA or Mongo load.
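A rough sketch of that layout, using the names from this answer (UserAccess, MongoUserAccess) and an assumed User model class; each type would live in its own file, and the Hibernate variant would look analogous:

public interface UserAccess {
    void save(User user);
    User load(String id);
}

import org.springframework.data.mongodb.core.MongoTemplate;

public class MongoUserAccess implements UserAccess {

    private final MongoTemplate mongoTemplate; // injected by Spring

    public MongoUserAccess(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    public void save(User user) {
        mongoTemplate.save(user);
    }

    public User load(String id) {
        return mongoTemplate.findById(id, User.class);
    }
}

Controllers then depend only on UserAccess, and switching the persistence provider means wiring in a different implementation.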

How can I change annotations/Hibernate validation rules at runtime?

I have a Java class with some fields I want to validate using Hibernate Validator.
Now I want my users to be able to configure at runtime which validations take place.
For example:
public class MyPojo {
    ...
    @NotEmpty
    String getMyField() {
        ...
    }
    ...
}
Let's say I want to remove the NotEmpty check or replace it with Email or CreditCardNumber. How can I do it? Is it even possible? I guess it comes down to changing annotations at runtime...
You can't do it normally.
Here's what I've done to get more dynamic validations working via Hibernate Validator.
Extend the ClassValidator class.
Override the getInvalidValues(Object myObj) method. First, call super.getInvalidValues(myObj), then add the hook to your customized validation.
Instantiate your custom validator and call getInvalidValues to validate. Any hibernate annotated validations will kick off at this point, and your custom dynamic validations (anything not supported by annotations) will kick off as well.
Example:
public class MyObjectValidator extends ClassValidator<MyObject> {

    public MyObjectValidator() {
        super(MyObject.class);
    }

    public InvalidValue[] getInvalidValues(MyObject myObj) {
        List<InvalidValue> invalids = new ArrayList<InvalidValue>();
        invalids.addAll(Arrays.asList(super.getInvalidValues(myObj)));

        // add custom validations here
        invalids.addAll(validateDynamicStuff(myObj));

        InvalidValue[] results = new InvalidValue[invalids.size()];
        return invalids.toArray(results);
    }

    private List<InvalidValue> validateDynamicStuff(MyObject myObj) {
        // ... whatever validations you want ...
    }
}
So your custom validation code can contain logic like "do this validation if the user configured it, otherwise do that one", etc. You may or may not be able to leverage the same code that powers the Hibernate validations, but either way, what you are doing is more involved than the 'normal' use case for Hibernate Validator.
Actually it is possible in Hibernate Validator 4.1. Just read the documentation about programmatic constraint creation.
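A rough sketch of what that looks like with Hibernate Validator's programmatic API (the exact fluent API differs a little between versions; this follows the 5.x/6.x style, and MyPojo/myField come from the question above):

import java.lang.annotation.ElementType;

import javax.validation.Validation;
import javax.validation.Validator;

import org.hibernate.validator.HibernateValidator;
import org.hibernate.validator.HibernateValidatorConfiguration;
import org.hibernate.validator.cfg.ConstraintMapping;
import org.hibernate.validator.cfg.defs.EmailDef;
import org.hibernate.validator.cfg.defs.NotEmptyDef;

public class DynamicValidatorFactory {

    public static Validator buildValidator(boolean validateAsEmail) {
        HibernateValidatorConfiguration configuration = Validation
                .byProvider(HibernateValidator.class)
                .configure();

        // choose the constraint for the getter at runtime instead of via annotations
        ConstraintMapping mapping = configuration.createConstraintMapping();
        if (validateAsEmail) {
            mapping.type(MyPojo.class)
                    .property("myField", ElementType.METHOD)
                    .constraint(new EmailDef());
        } else {
            mapping.type(MyPojo.class)
                    .property("myField", ElementType.METHOD)
                    .constraint(new NotEmptyDef());
        }

        return configuration.addMapping(mapping)
                .buildValidatorFactory()
                .getValidator();
    }
}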
I don't think you'll be able to remove or change the annotation; it's part of the class definition. You can build a new class, which is possible at runtime but a little involved. Hibernate may support programmatic access to the validations and allow you to override the annotation; I don't know the API that well. Hibernate does a bit of runtime class building itself... that might be a good place to learn how to do it if you're interested.
