I'm implementing caching in my Spring Boot (v2.5.2) app using GemFire.
The class which needs to be cached -
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.gemfire.mapping.annotation.Region;
import com.auth0.jwk.Jwk;

@Region("jwks")
public class Key {

    @Id
    private String id;

    private Jwk jwk;

    @PersistenceConstructor
    public Key(String id, Jwk jwk) {
        this.id = id;
        this.jwk = jwk;
    }

    // Getters and setters for variables
}
I get the following error while fetching the stored entity from cache -
org.apache.geode.SerializationException : While deserializing query result with root cause
java.lang.NoSuchMethodException: com.auth0.jwk.Jwk.<init>()
at java.lang.Class.getConstructors0(Class.java:3082)
at java.lang.Class.getDeclaredConstructor(Class.java:2187)
...
How can I deserialize this object when Jwk doesn't have a no-arg constructor?
Edit -
build.gradle
implementation 'org.springframework.geode:spring-geode-starter'
implementation 'org.springframework.geode:spring-data-geode'
Also, I updated the constructor as per the suggested answer, which continues to give the same error.
Error-causing code -
Optional<Key> key = this.keyRepository.findById(id);
First, please be precise when you ask questions.
Your Key class is not even valid Java. As posted, the Key class would not even compile.
The constructors (JsonWebKey) in this case are 1) not named after the class (Key) and 2) you cannot have two no-arg constructors.
I am assuming the second constructor would be:
public Key(String id, Jwk jwk) {
this.id = id;
this.jwk = jwk;
}
I am guessing either the constructors are misnamed or perhaps are part of some [static] inner class??
Nevertheless, assuming you are using Spring Boot for Apache Geode (and VMware Tanzu (Pivotal) GemFire), a.k.a. SBDG (see here; if you are not using SBDG, you should be!), then, since SBDG enables GemFire/Geode PDX serialization by default, SBDG, with the help of Spring Data for Apache Geode (and VMware Tanzu GemFire), a.k.a. SDG (upon which SBDG is built, along with the core Spring Framework, Spring Data and Spring Boot), will handle these serialization concerns for you, with no extra effort on your part. See the corresponding SDG documentation.
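For instance, a minimal SBDG-style application class might look like the following. This is only a sketch: the KeyRepository type and the region/repository scanning configuration are assumptions on my part, not taken from your project.

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.data.gemfire.config.annotation.EnableEntityDefinedRegions;
import org.springframework.data.gemfire.repository.config.EnableGemfireRepositories;

// With spring-geode-starter on the classpath, SBDG auto-configures a ClientCache and
// enables PDX serialization (backed by SDG's MappingPdxSerializer) by default.
@SpringBootApplication
@EnableEntityDefinedRegions(basePackageClasses = Key.class) // creates the "jwks" Region from @Region("jwks")
@EnableGemfireRepositories(basePackageClasses = KeyRepository.class) // KeyRepository is assumed here
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}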
By way of example, I wrote this test class, which resides in this package, to demonstrate.
By default, the test class is using PDX serialization configured with Spring using the test configuration. The Spring-based configuration class is here. The other test configuration classes are only enabled with the appropriate test Spring profile configured.
The model class for this test is the CompositeValue class in the model sub-package.
As you can see, the class has 2 primary constructors. I also declared a commented-out, default, public no-arg constructor, which I will explain further below.
This CompositeValue model class was designed very deliberately. You will notice that it is (mostly) immutable.
I use a Spring Data CrudRepository (see here) to save (persist/store) an instance of CompositeValue in a GemFire/Geode Region ("Values") and then retrieve it (findBy..). The "Values" Region is necessarily a PARTITION Region (see here), since a PARTITION Region stores values in serialized form, and in our case, PDX serialized.
Using the Spring configuration, the test runs and passes successfully! Spring is doing all the heavy lifting!
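In sketch form, the repository side of that round trip looks something like this (the repository name and key type are illustrative, not the exact ones from the linked test; the CompositeValue entity itself is sketched further below, after the discussion of the persistence constructor):

import org.springframework.data.repository.CrudRepository;

// Spring Data repository over the "Values" PARTITION Region.
interface CompositeValueRepository extends CrudRepository<CompositeValue, String> {
}

// In the test, roughly:
// CompositeValue saved = repository.save(new CompositeValue("1", "test"));
// Optional<CompositeValue> loaded = repository.findById("1");
// assertThat(loaded).hasValue(saved);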
If you are NOT using Spring to its fullest extent, then you are (most likely) going to have problems unless you know what you are doing with GemFire/Geode.
Out-of-the-box, GemFire/Geode PDX serialization has certain limitations. See the documentation. Specifically, see here.
For instance, if you are using GemFire/Geode's ReflectionBasedAutoSerializer class (Javadoc, documentation; and I suspect you are) and not Spring, then it requires your application domain objects (model classes / entities) to have a default, public no-arg constructor.
This flies in the face of immutable, effectively immutable and mostly immutable classes, since it means you cannot properly initialize instances through their constructors, which is crucial in a highly concurrent, multi-threaded context, like GemFire/Geode.
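For reference, configuring the ReflectionBasedAutoSerializer with plain GemFire/Geode (no Spring) looks roughly like the following sketch; it is shown only to illustrate the limitation, and the '.*' class pattern corresponds to the "matched with '.*'" log message shown further below.

import org.apache.geode.cache.client.ClientCache;
import org.apache.geode.cache.client.ClientCacheFactory;
import org.apache.geode.pdx.ReflectionBasedAutoSerializer;

public class PlainGeodePdxConfiguration {

    // Registers GemFire/Geode's reflection-based PDX auto-serializer for all classes ('.*').
    // Deserializing with it requires each matched class to have a public no-arg constructor.
    public static ClientCache newClientCache() {
        return new ClientCacheFactory()
            .setPdxSerializer(new ReflectionBasedAutoSerializer(".*"))
            .create();
    }
}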
You can see the effects of trying to use GemFire/Geode's ReflectionBasedAutoSerializer by enabling the "gemfire" profile in the example test class I wrote, for which the configuration is here.
The test will NOT pass without the commented-out, default, public no-arg constructor.
When using Apache Geode 1.13.4, I get the following error:
2021-11-01 11:53:08,720 WARN ode.pdx.internal.AutoSerializableManager: 274 - Class
io.stackoverflow.questions.spring.geode.serialization.pdx.model.CompositeValue
matched with '.*' cannot be auto-serialized due to missing public no-arg constructor.
Will attempt using Java serialization.
However, even with Java serialization (the GemFire/Geode backup serialization strategy), the test results in a failure:
Caused by: java.io.NotSerializableException: io.stackoverflow.questions.spring.geode.serialization.pdx.model.CompositeValue
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
at org.apache.geode.internal.InternalDataSerializer.writeSerializableObject(InternalDataSerializer.java:2184)
at org.apache.geode.internal.InternalDataSerializer.basicWriteObject(InternalDataSerializer.java:2058)
at org.apache.geode.DataSerializer.writeObject(DataSerializer.java:2839)
at org.apache.geode.internal.cache.CachedDeserializableFactory.calcSerializedSize(CachedDeserializableFactory.java:245)
... 60 common frames omitted
Well, the java.io.NotSerializableException is thrown because the CompositeValue class (deliberately) does not implement the java.io.Serializable interface.
Why deliberately? Because you cannot implement java.io.Serializable on classes you do not own, which is true when using 3rd party libraries and their classes. Even though we own the CompositeValue class in my case, I am making a point: in your case, you don't own Jwk.
So, not only can we not use (mostly/effectively) immutable classes, we also cannot rely on default serialization mechanisms baked into GemFire/Geode.
Of course, we can handle this by implementing a custom PdxSerializer, the second strategy in the documentation (under "Procedure", Step 1, "Serializing your Domain Object with a PdxSerializer").
If we again change the active Spring profile to "gemfire-custom-pdxserializer" in the example test I wrote, then the test will pass.
But, it comes at a high price! See the necessary configuration to make this arrangement work.
In our case, we have only 1 such model / entity class to build a custom PdxSerializer for. However, imagine if we had hundreds of classes to handle.
To make matters worse, GemFire/Geode only allows a single PdxSerializer to be registered with the (Singleton) GemFire/Geode cache, which means you can only have one. You must then rely on the Composite software design pattern to compose the multiple PdxSerializers needed to handle all your application domain model types requiring serialization. While this is elegant, you must still build a custom PdxSerializer per application model / entity type. Of course, you could bake all the type handling into one PdxSerializer implementation, but that would get ugly rather quickly!
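A minimal sketch of that composite arrangement might look like this; the class name is made up for illustration, and it relies on the usual delegate convention of returning false/null for unhandled types.

import java.util.List;
import java.util.Objects;
import org.apache.geode.pdx.PdxReader;
import org.apache.geode.pdx.PdxSerializer;
import org.apache.geode.pdx.PdxWriter;

// Composite PdxSerializer: delegates to the first per-type PdxSerializer that handles the object.
public class CompositePdxSerializer implements PdxSerializer {

    private final List<PdxSerializer> delegates;

    public CompositePdxSerializer(List<PdxSerializer> delegates) {
        this.delegates = delegates;
    }

    @Override
    public boolean toData(Object obj, PdxWriter out) {
        // A delegate that does not handle the given type simply returns false without writing.
        return this.delegates.stream().anyMatch(serializer -> serializer.toData(obj, out));
    }

    @Override
    public Object fromData(Class<?> type, PdxReader in) {
        // A delegate that does not handle the given type returns null.
        return this.delegates.stream()
            .map(serializer -> serializer.fromData(type, in))
            .filter(Objects::nonNull)
            .findFirst()
            .orElse(null);
    }
}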
Finally, your application model / entity types could implement GemFire/Geode's PdxSerializable interface (Javadoc). This is no better than java.io.Serializable and (again) does not work for types you don't own. It also couples your application to GemFire/Geode and is why I did NOT demonstrate this approach, as it should be considered an anti-pattern.
With SDG's MappingPdxSerializer, which SBDG auto-configures for you (by default), you do not need to do any of the above. SBDG auto-configures PDX by default (making the SDG @EnablePdx annotation unnecessary) and there is no special requirement for your application domain object (model) / entity classes.
However, if you have more than 1 constructor in your entity class, then you will need to designate 1 constructor as the primary persistence constructor. In the CompositeValue class, this constructor was designated as the primary persistence constructor using Spring Data's @PersistenceConstructor annotation, which SDG's MappingPdxSerializer takes into account when deserializing and constructing your application domain object model types.
If you only have 1 constructor in your class, then you do not even need to declare the @PersistenceConstructor annotation on that constructor. That is, if the other constructor in CompositeValue did not exist, then the @PersistenceConstructor annotation on this constructor would not be necessary. SD[G] can figure it out.
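In sketch form, such an entity looks like this (the fields and constructors here are illustrative; they are not copied from the linked CompositeValue class):

import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.gemfire.mapping.annotation.Region;

@Region("Values")
public class CompositeValue {

    @Id
    private final String id;

    private final String value;

    // Convenience constructor for application code.
    public CompositeValue(String value) {
        this("0", value);
    }

    // SDG's MappingPdxSerializer uses this constructor when deserializing,
    // because it is designated as the primary persistence constructor.
    @PersistenceConstructor
    public CompositeValue(String id, String value) {
        this.id = id;
        this.value = value;
    }

    public String getId() {
        return this.id;
    }

    public String getValue() {
        return this.value;
    }
}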
Feel free to play around with my example test for your learning purposes.
I'm using JAX-RS for exposing REST endpoints.
To maintain a good package state I'd like to have my DTO classes (the one I return as Json and accept from Json) as package-private.
Does JAX-RS require those classes to be always public?
I'd like to apply the same thing to my custom ExceptionMapper (@Provider-annotated).
According to JAX-RS 2.0 specification (ch. 04, p. 27):
4.1.2 Constructors
Provider classes are instantiated by the JAX-RS runtime and MUST
have a public constructor for which the JAX-RS runtime can provide all
parameter values. Note that a zero argument constructor is permissible
under this rule.
Effectively, public constructors can only exist in public classes.
As discussed, this is a vendor-specific question, but any vendor that allows registering a package-private provider doesn't truly follow the specification.
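In practice, a provider such as an ExceptionMapper therefore ends up looking like this; a minimal sketch, where the exception type and response details are placeholders:

import javax.ws.rs.core.Response;
import javax.ws.rs.ext.ExceptionMapper;
import javax.ws.rs.ext.Provider;

// Public class with an (implicit) public no-arg constructor, so the JAX-RS runtime can instantiate it.
@Provider
public class IllegalArgumentExceptionMapper implements ExceptionMapper<IllegalArgumentException> {

    @Override
    public Response toResponse(IllegalArgumentException exception) {
        return Response.status(Response.Status.BAD_REQUEST)
            .entity(exception.getMessage())
            .build();
    }
}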
We recently migrated from a pretty old Spring version (3.2.16.RELEASE) to one of the latest (4.2.5.RELEASE). Because of this change we were able to remove in some .xml files more than 1k lines of code and replace them with just about 50-100 annotations in the Java classes. All in all a great change.
Yesterday I also started the process of removing all the interfaces that existed only for the @Service-annotated classes. So, before we had something like this:
//The interface:
public interface SomeInterfaceService extends DtoProcessingService<SimpleDto> {
}
//The class:
#Service("someClassService")
public class SomeClassService
extends SomeAbstractClassService<SimpleDto, Strategy>
implements SomeInterfaceService {
...
}
Which we've refactored to:
// The interface has been removed
//The class:
@Service
public class SomeClassService
extends SomeAbstractClassService<SimpleDto, Strategy> {
...
}
In the new Spring versions, @Service automatically uses the class name with a lowercase first letter as the bean name (so someClassService).
At first we received some errors, and a possible solution was a Spring XML setting to explicitly use proxy classes. So I read a bit about what forcing these proxies implies, and found the following things:
... According to the project page "CGLIB is used to extend Java classes and implements interfaces at runtime". So in this case the trick consists of creating a proxy that EXTENDS the original object and therefore can be used instead. (source)
1) final methods cannot be advised, as they cannot be overridden. (source)
2) You will need the CGLIB 2 binaries on your classpath, whereas dynamic proxies are available with the JDK. Spring will automatically warn you when it needs CGLIB and the CGLIB library classes are not found on the classpath. (source)
3) The constructor of your proxied object will be called twice. This is a natural consequence of the CGLIB proxy model whereby a subclass is generated for each proxied object. For each proxied instance, two objects are created: the actual proxied object and an instance of the subclass that implements the advice. This behavior is not exhibited when using JDK proxies. Usually, calling the constructor of the proxied type twice, is not an issue, as there are usually only assignments taking place and no real logic is implemented in the constructor. (source)
4) Your classes cannot be final. (source 1 & source 2)
None of our classes were final, none of their methods were final, and none of the classes had implicit constructors, so I added this setting to the Spring XML:
<tx:annotation-driven proxy-target-class="true"/>
And the error disappeared.
Now I have a few problems and concerns though:
Some classes / interfaces were a bit too difficult to refactor, so we left them as they were (for now). These are mostly class trees with a lot of generics and abstraction, and are therefore difficult to refactor easily. Also, there are some other classes I haven't even touched yet that still use interfaces.
Is it possible, with the setting above, to use both CGLIB proxies AND JDK (interface-based) proxies?
My guess is: not entirely. Why? The classes with interfaces still seem to be initialized, but the fields that used to be filled automatically by Spring / Tapestry aren't anymore for these classes (and I'm getting NullPointerExceptions everywhere). For example, the following used to work fine before the changes, but geometryMessageService is now null:
// Note that this is an unchanged class
@Transactional
@Service("geometryService")
public class DefaultGeometryService implements GeometryService {
    ...
    private GeometryMessageService geometryMessageService;

    public void setGeometryMessageService(final GeometryMessageService geometryMessageService) {
        this.geometryMessageService = geometryMessageService;
    }

    public void someMethod() {
        ...
        this.geometryMessageService.doSomething(); // <- NullPointerException
        ...
    }
}
The no-argument constructor is a
requirement (tools like Hibernate use
reflection on this constructor to
instantiate objects).
I got this hand-wavy answer, but could somebody explain it further? Thanks
Hibernate, and code in general that creates objects via reflection, uses Class<T>.newInstance() to create a new instance of your classes. This method requires a public no-arg constructor to be able to instantiate the object. For most use cases, providing a no-arg constructor is not a problem.
There are hacks based on serialization that can work around not having a no-arg constructor, since serialization uses JVM magic to create objects without invoking the constructor. But this is not available across all VMs. For example, XStream can create instances of objects that don't have a public no-arg constructor, but only by running in a so-called "enhanced" mode which is available only on certain VMs. (See the link for details.) Hibernate's designers surely chose to maintain compatibility with all VMs, so they avoid such tricks and use the officially supported reflection method Class<T>.newInstance(), which requires a no-arg constructor.
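For illustration, here is a simplified sketch of what such reflective instantiation looks like (Entity is a made-up class):

public class ReflectiveInstantiation {

    static class Entity {
        // Without this no-arg constructor, both calls below fail
        // (InstantiationException / NoSuchMethodException).
        Entity() {
        }
    }

    public static void main(String[] args) throws Exception {
        // The classic (now deprecated) form used by older frameworks:
        Entity viaClass = Entity.class.newInstance();

        // The modern equivalent; it still requires a no-arg constructor:
        Entity viaConstructor = Entity.class.getDeclaredConstructor().newInstance();

        System.out.println(viaClass + " / " + viaConstructor);
    }
}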
Erm, sorry everyone, but Hibernate does not require that your classes must have a parameterless constructor. The JPA 2.0 specification requires it, and this is very lame on behalf of JPA. Other frameworks like JAXB also require it, which is also very lame on behalf of those frameworks.
(Actually, JAXB supposedly allows entity factories, but it insists on instantiating these factories by itself, requiring them to have a --guess what-- parameterless constructor, which in my book is exactly as good as not allowing factories; how lame is that!)
But Hibernate does not require such a thing.
Hibernate supports an interception mechanism (see "Interceptor" in the documentation) which allows you to instantiate your objects with whatever constructor parameters they need.
Basically, what you do is that when you set up Hibernate you pass it an object implementing the org.hibernate.Interceptor interface, and Hibernate will then invoke the instantiate() method of that interface whenever it needs a new instance of one of your objects, so your implementation of that method can new up your objects in whatever way you like.
I have done it in a project and it works like a charm. In this project I do things via JPA whenever possible, and I only use Hibernate features like the interceptor when I have no other option.
Hibernate seems to be somewhat insecure about it, as during startup it issues an info message for each of my entity classes, telling me INFO: HHH000182: No default (no-argument) constructor for class and class must be instantiated by Interceptor, but then later on I do instantiate them by interceptor, and it is happy with that.
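A rough sketch of such an interceptor, written against the Hibernate 5.x Interceptor API (the exact instantiate() signature varies between Hibernate versions, and MyEntity is a placeholder):

import java.io.Serializable;
import org.hibernate.EmptyInterceptor;
import org.hibernate.EntityMode;

public class EntityFactoryInterceptor extends EmptyInterceptor {

    // Placeholder entity without a no-arg constructor, just to show the idea.
    public static class MyEntity {

        private final Serializable id;

        public MyEntity(Serializable id) {
            this.id = id;
        }

        public Serializable getId() {
            return this.id;
        }
    }

    @Override
    public Object instantiate(String entityName, EntityMode entityMode, Serializable id) {
        // Create our own entities with whatever constructor we like;
        // returning null tells Hibernate to fall back to its default instantiation.
        if (MyEntity.class.getName().equals(entityName)) {
            return new MyEntity(id);
        }
        return null;
    }
}

The interceptor is then registered when bootstrapping Hibernate (for example via Configuration.setInterceptor(..)); the exact wiring depends on how you build your SessionFactory.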
To answer the "why" part of the question for tools other than Hibernate, the answer is "for absolutely no good reason", and this is proven by the existence of the hibernate interceptor. There are many tools out there that could have been supporting some similar mechanism for client object instantiation, but they don't, so they create the objects by themselves, so they have to require parameterless constructors. I am tempted to believe that this is happening because the creators of these tools think of themselves as ninja systems programmers who create frameworks full of magic to be used by ignorant application programmers, who (so they think) would never in their wildest dreams have a need for such advanced constructs as the... Factory Pattern. (Okay, I am tempted to think so. I don't actually think so. I am joking.)
Hibernate instantiates your objects. So it needs to be able to instantiate them. If there isn't a no-arg constructor, Hibernate won't know how to instantiate it, i.e. what arguments to pass.
The hibernate documentation says:
4.1.1. Implement a no-argument constructor
All persistent classes must have a default constructor (which can be non-public) so that Hibernate can instantiate them using Constructor.newInstance(). It is recommended that you have a default constructor with at least package visibility for runtime proxy generation in Hibernate.
Hibernate is an ORM framework which supports field or property access strategies. However, it does not support constructor-based mapping - maybe what you would like? - because of some issues like:
1º What happens if your class contains a lot of constructors?
public class Person {
private String name;
private Integer age;
public Person(String name, Integer age) { ... }
public Person(String name) { ... }
public Person(Integer age) { ... }
}
As you can see, you deal with an issue of ambiguity, because Hibernate cannot know which constructor should be called. For instance, suppose you need to retrieve a stored Person object:
Person person = (Person) session.get(Person.class, <IDENTIFIER>);
Which constructor should Hibernate call to retrieve a Person object? Can you see the problem?
2º Finally, by using reflection, Hibernate can instantiate a class through its no-arg constructor. So when you call
Person person = (Person) session.get(Person.class, <IDENTIFIER>);
Hibernate will instantiate your Person object as follows
Person.class.newInstance();
which, according to the API documentation, means
The class is instantiated as if by a new expression with an empty argument list
Moral of the story
Person.class.newInstance();
is similar To
new Person();
Nothing else
Hibernate needs to create instances as a result of your queries (via reflection); Hibernate relies on the no-arg constructor of entities for that, so you need to provide a no-arg constructor. What is not clear?
Actually, you can instantiate classes which have no 0-arg constructor; you can get a list of a class's constructors, pick one, and invoke it with bogus parameters.
While this is possible, and I guess it would work and wouldn't be problematic, you'll have to agree that it is pretty weird.
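Something along these lines, for illustration (Person and its constructor are placeholders):

import java.lang.reflect.Constructor;

public class BogusConstructorInstantiation {

    static class Person {

        private final String name;
        private final Integer age;

        Person(String name, Integer age) {
            this.name = name;
            this.age = age;
        }
    }

    public static void main(String[] args) throws Exception {
        // Pick the first available constructor and invoke it with "bogus" (all-null) arguments.
        Constructor<?> constructor = Person.class.getDeclaredConstructors()[0];
        constructor.setAccessible(true);
        Object[] bogusArgs = new Object[constructor.getParameterCount()];
        Person person = (Person) constructor.newInstance(bogusArgs);
        System.out.println(person.name + " / " + person.age); // null / null
    }
}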
Constructing objects the way Hibernate does (I believe it invokes the 0-arg constructor and then probably modifies the instance's fields directly via reflection; perhaps it knows how to call setters) goes a little bit against how an object is supposed to be constructed in Java: invoke the constructor with the appropriate parameters so that the new object is the object you want. I believe that instantiating an object and then mutating it is somewhat "anti-Java" (or I would say, anti pure theoretical Java), and definitely, if you do this via direct field manipulation, it goes against encapsulation and all that fancy encapsulation stuff.
I think that the proper way to do this would be to define in the Hibernate mapping how an object should be instantiated from the info in the database row, using the proper constructor... but this would be more complex, meaning both Hibernate and the mapping would be even more complex... and all just to be more "pure"; and I don't think this would have an advantage over the current approach (other than feeling good about doing things "the proper way").
Having said that, and seeing that the Hibernate approach is not very "clean", the obligation to have a 0-arg constructor is not strictly necessary, but I can somewhat understand the requirement, although I believe they did it on purely "proper way" grounds, when they had strayed from the "proper way" (albeit for reasonable reasons) much before that.
It is much easier to create an object with a parameterless constructor through reflection, and then fill its properties with data through reflection, than to try and match data to the arbitrary parameters of a parameterized constructor, with changing names/naming conflicts, undefined logic inside the constructor, parameter sets not matching the properties of an object, et cetera.
Many ORMs and serializers require parameterless constructors, because parameterized constructors through reflection are very fragile, and parameterless constructors provide both stability to the application and control over the object behavior to the developer.
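A crude sketch of what that looks like under the hood; the Entity class and the simulated "row" are made up for illustration:

import java.lang.reflect.Field;
import java.util.Map;

public class ReflectivePopulation {

    static class Entity {

        private String name;
        private Integer age;

        Entity() {
        }

        @Override
        public String toString() {
            return name + " (" + age + ")";
        }
    }

    public static void main(String[] args) throws Exception {
        // Simulated database row, keyed by field name.
        Map<String, Object> row = Map.of("name", "Alice", "age", 42);

        // 1) Instantiate via the parameterless constructor...
        Entity entity = Entity.class.getDeclaredConstructor().newInstance();

        // 2) ...then fill each property directly via reflection.
        for (Map.Entry<String, Object> column : row.entrySet()) {
            Field field = Entity.class.getDeclaredField(column.getKey());
            field.setAccessible(true);
            field.set(entity, column.getValue());
        }

        System.out.println(entity); // Alice (42)
    }
}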
Hibernate uses proxies for lazy loading. If you do not define a constructor or make it private, a few things may still work - the ones that do not depend on the proxy mechanism. For example, loading the object (with no constructor) directly using the query API.
But, if you use the session.load() method you'll face an InstantiationException from the proxy generator lib due to the non-availability of a constructor.
This guy reported a similar situation:
http://kristian-domagala.blogspot.com/2008/10/proxy-instantiation-problem-from.html
Check out this section of the Java language spec that explains the difference between static and non-static inner classes: http://java.sun.com/docs/books/jls/third_edition/html/classes.html#8.1.3
A static inner class is conceptually no different than a regular general class declared in a .java file.
Since Hibernate needs to instantiate ProjectPK independently of the Project instance, ProjectPK either needs to be a static inner class, or declared in its own .java file.
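For example, something like the following sketch, assuming ProjectPK is a composite key mapped with standard JPA annotations (the field names are placeholders):

import java.io.Serializable;
import javax.persistence.Embeddable;
import javax.persistence.EmbeddedId;
import javax.persistence.Entity;

@Entity
public class Project {

    @EmbeddedId
    private ProjectPK id;

    protected Project() {
        // for Hibernate
    }

    // Static nested class: Hibernate can instantiate it without an enclosing Project instance.
    // A non-static inner class would require an outer instance, hence the instantiation error.
    @Embeddable
    public static class ProjectPK implements Serializable {

        private Long departmentId;
        private Long projectNumber;

        protected ProjectPK() {
            // for Hibernate
        }

        public ProjectPK(Long departmentId, Long projectNumber) {
            this.departmentId = departmentId;
            this.projectNumber = projectNumber;
        }

        // equals() and hashCode() should also be implemented for a composite key
    }
}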
Reference: org.hibernate.InstantiationException: No default constructor
In my case, I wanted to hide my no-arg constructor, but because of Hibernate I couldn't do it. So I solved the problem in another way.
/**
* @deprecated (Hibernate's exclusive constructor)
*/
public ObjectConstructor (){ }
Summarizing what is below: it matters whether you want to be JPA-compatible or strictly Hibernate-only.
Just look at official documentation: https://docs.jboss.org/hibernate/orm/5.6/userguide/html_single/Hibernate_User_Guide.html#entity-pojo
Section 2.1 The Entity Class of the JPA 2.1 specification defines its requirements for an entity class. Applications that wish to remain portable across JPA providers should adhere to these requirements:
One point says:
The entity class must have a public or protected no-argument
constructor. It may define additional constructors as well.
However, Hibernate is less strict about this:
Hibernate, however, is not as strict in its requirements. The differences from the list above include:
One point says:
The entity class must have a no-argument constructor, which may be
public, protected or package visibility. It may define additional
constructors as well.
More on that is right below:
https://docs.jboss.org/hibernate/orm/5.6/userguide/html_single/Hibernate_User_Guide.html#entity-pojo-constructor
JPA requires that this constructor be defined as public or protected. Hibernate, for the most part, does not care about the constructor visibility, as long as the system SecurityManager allows overriding the visibility setting. That said, the constructor should be defined with at least package visibility if you wish to leverage runtime proxy generation.
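So, a JPA-portable entity typically ends up looking something like this minimal sketch (the fields are placeholders):

import javax.persistence.Entity;
import javax.persistence.Id;

@Entity
public class Person {

    @Id
    private Long id;

    private String name;

    // Protected no-arg constructor: satisfies JPA (public or protected) and gives
    // Hibernate at least the package visibility it recommends for proxy generation.
    protected Person() {
    }

    public Person(Long id, String name) {
        this.id = id;
        this.name = name;
    }

    public Long getId() {
        return id;
    }

    public String getName() {
        return name;
    }
}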
Can I make my Spring service classes final? Is there any harm in doing that? Nobody is going to extend the class. Is there any issue?
public final class MyService {
// Depedencies go here.
}
Don't make them final. If you use any AOP (including transaction support) on concrete classes, Spring will use CGLIB to dynamically extend your class in order to make a proxy. And the requirement for CGLIB to work is that your classes are non-final; otherwise an exception will be thrown.
Spring will create a JDK dynamic proxy rather than a CGLIB proxy if the following are true:
aop:config has proxy-target-class set to false
Any other namespace configurations (e.g. tx:annotation-driven) also have proxy-target-class set to false
Your class implements an interface
If all three are true, then you can declare the class final. (You can even make the class package-private and the constructor private if you like, Spring will be able to instantiate it).
Otherwise, Spring will create a CGLIB proxy. You can still declare the class final (assuming it is not decorated with @Repository and you do not have a PersistenceExceptionTranslationPostProcessor declared) if there are no public methods in the bean. Once you have a single public method, you cannot declare the class final with CGLIB proxying. (Note: you must have at least a package-private, no-argument constructor when proxying via CGLIB.)
When is the above useful? Say you have a service interface (all services should generally have interfaces) with an implementation bean that is package private. The service is using JDK dynamic proxying. So, it can be final and non-visible outside the package, leaking fewer implementation details. Say the service needs a data access object. If no other service uses this DAO, why make it or any of its methods public? If all the methods on the DAO are package-private, the service implementation can still wire the DAO in and use its methods. From outside the package, callers only see the interface (a good thing) and any types that are used in the interface signature.
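For instance, a sketch of that arrangement (the names are illustrative):

// GreetingService.java - public contract, visible to callers outside the package.
public interface GreetingService {
    String greet(String name);
}

// DefaultGreetingService.java - final, package-private implementation in the same package.
// Spring can still instantiate and inject it, and (with interface-based JDK proxies) can still proxy it.
import org.springframework.stereotype.Service;

@Service
final class DefaultGreetingService implements GreetingService {

    @Override
    public String greet(String name) {
        return "Hello, " + name;
    }
}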
Finally (no pun intended), make a class final whenever you can. Concrete inheritance is often both confusing and abused (see the fragile base class problem). Final classes also allow some compiler optimizations.