Fellow SO-ers:
I've been puzzling over this one for a couple of days and don't have a solution yet ...
I'm building a Spring Boot web app and what I'd like to be able to do is to activate/deactivate encryption of data fields in my datastore (using the facilities provided by jasypt+spring+hibernate) via activating/deactivating configuration profiles. So that - for development work - I can have data fields stored as clear text, while for production, they would be encrypted.
Currently, I'm doing this via a rather inelegant approach. Specifically, I comment/uncomment code in my package-info.java file, where I define two @TypeDefs annotation blocks for the type used for the datastore field, one commented out and the other active. Thus, my current package-info.java file is written as follows:
//////////////////////////////////////////////////////////////////////////////////////////////////////////////
// Use this @TypeDefs annotation when the dataencrypt configuration profile is active
//@TypeDefs({ @TypeDef(name = com.evilcorp.evilproject.model.Ticket.ENCRYPTED_STRING_TYPENAME, typeClass = EncryptedStringType.class, parameters = {
//        @Parameter(name = "encryptorRegisteredName", value = com.evilcorp.evilproject.config.EncryptionConfig.REGISTERED_NAME) }) })
//////////////////////////////////////////////////////////////////////////////////////////////////////////////
//////////////////////////////////////////////////////////////////////////////////////////////////////////////
// Use this @TypeDefs annotation when the dataencrypt configuration profile is not active
@TypeDefs({
        @TypeDef(name = com.evilcorp.evilproject.model.Ticket.ENCRYPTED_STRING_TYPENAME, typeClass = String.class, parameters = {
                @Parameter(name = "encryptorRegisteredName", value = com.evilcorp.evilproject.config.EncryptionConfig.REGISTERED_NAME) }) })
//////////////////////////////////////////////////////////////////////////////////////////////////////////////
package com.evilcorp.evilproject.model;

import org.hibernate.annotations.Parameter;
import org.hibernate.annotations.TypeDef;
import org.hibernate.annotations.TypeDefs;
import org.jasypt.hibernate4.type.EncryptedStringType;
And my @Entity Ticket class contains the following:
@Entity
@EqualsAndHashCode(of = { "ticketId" })
@NoArgsConstructor(access = AccessLevel.PRIVATE, force = true)
public class Ticket implements Serializable {
    ...
    @Column(unique = true, nullable = false)
    @Type(type = ENCRYPTED_STRING_TYPENAME)
    private @Getter String ticketId;
    ...
}
I'm hoping that I can devise something that will allow me to reduce my package-info.java file to the following:
@TypeDefs({
        @TypeDef(name = com.evilcorp.evilproject.model.Ticket.ENCRYPTED_STRING_TYPENAME, typeClass = com.evilcorp.evilproject.config.MyTicketDataFieldString.class, parameters = {
                @Parameter(name = "encryptorRegisteredName", value = com.evilcorp.evilproject.config.EncryptionConfig.REGISTERED_NAME) }) })
package com.evilcorp.evilproject.model;

import org.hibernate.annotations.Parameter;
import org.hibernate.annotations.TypeDef;
import org.hibernate.annotations.TypeDefs;
And define two distinct versions of the MyTicketDataFieldString class based on the state of Spring Boot configuration profiles. E.g.,
@Configuration
@Profile("dataencrypt")
public class MyTicketDataFieldString extends EncryptedStringType {}
and
@Configuration
@Profile("!dataencrypt")
public class MyTicketDataFieldString implements CharSequence { ... }
where the CharSequence implementation behaves like a vanilla java.lang.String.
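For reference, a minimal pass-through CharSequence along those lines could look like the following (a hypothetical sketch; the class name and delegation details are mine, not from the actual project):

```java
// Hypothetical sketch: a CharSequence that simply wraps and delegates to a String.
final class PassThroughString implements CharSequence {

    private final String value;

    PassThroughString(String value) {
        this.value = value;
    }

    @Override
    public int length() {
        return value.length();
    }

    @Override
    public char charAt(int index) {
        return value.charAt(index);
    }

    @Override
    public CharSequence subSequence(int start, int end) {
        return value.subSequence(start, end);
    }

    @Override
    public String toString() {
        return value;
    }
}
```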
But this won't work, because I would have to define the same class twice in the same package.
Any ideas on how this can be done (or something equivalent) would be much appreciated.
I am using MapStruct to map a target bean from attributes coming from two source beans. This is something very common that MapStruct does easily by controlling nested mappings. My problem is that I need to "calculate/validate" a target attribute taking as input one of the attributes of a source bean.
I will explain with a few code snippets.
First, we have the beans, something like (simplified):
import lombok.Builder;
import lombok.Getter;
import java.util.List;
@Getter
@Builder
public class TargetBean {
    private final List<String> targetListTransformed;
    private final List<String> targetListDirectMapping;
    private final List<String> targetListTransformedWithAnotherMethod;
    private final String targetString;
    private final Integer targetInteger;
}
One of the source beans:
import lombok.Getter;
import lombok.RequiredArgsConstructor;
import java.util.List;
@Getter
@RequiredArgsConstructor
public class SourceBeanOne {
    private final List<String> sourceListToTransform;
    private final List<String> sourceListDirectMapping;
    private final String sourceString;
}
and the other source bean:
import lombok.Getter;
import lombok.RequiredArgsConstructor;
@Getter
@RequiredArgsConstructor
public class SourceBeanTwo {
    private final Integer sourceInteger;
}
Then the mapping attempt:
@Mapper(uses = SourceOneExtractor.class)
public abstract class TargetBeanMapper {

    @Mapping(target = "targetListTransformed", source = "source1")
    @Mapping(target = "targetListDirectMapping", source = "source1.sourceListDirectMapping")
    @Mapping(target = "targetListTransformedWithAnotherMethod", qualifiedByName = {"sourceOneTransformer", "transformerTwo"})
    @Mapping(target = "targetString", source = "source1.sourceString")
    @Mapping(target = "targetInteger", source = "source2.sourceInteger")
    abstract TargetBean map(SourceBeanOne source1, SourceBeanTwo source2);

    List<String> mapTargetList(SourceBeanOne source1) {
        final List<String> result = new ArrayList<>();
        // simulate a transformation
        result.add("one");
        result.add("two");
        return result;
    }
}
As you can see, I have tried Invoking custom mapping method; the snippet above shows a dummy implementation of what I need to do: use one or more attributes of SourceBeanOne to produce the List to be mapped into TargetBean.targetListTransformed.
That works perfectly. My problem came when I realized I needed to produce another List for another target attribute, doing different things with other attributes of SourceBeanOne. Another custom mapping method is not possible because MapStruct cannot disambiguate between them. That's how I ended up trying 5.9. Mapping method selection based on qualifiers. This is the resulting qualifier class, based on @Named (to avoid generating a bunch of @Qualifier annotations):
import org.mapstruct.Named;

import java.util.ArrayList;
import java.util.List;

@Named("sourceOneTransformer")
public class SourceOneExtractor {

    @Named("transformerOne")
    public List<String> getterMethodOne(SourceBeanOne s) {
        final List<String> result = new ArrayList<>();
        // simulate a transformation
        result.add("one");
        result.add("two");
        return result;
    }

    @Named("transformerTwo")
    public List<String> getterMethodTwo(SourceBeanOne s) {
        final List<String> result = new ArrayList<>();
        // simulate a transformation
        result.add("three");
        result.add("four");
        return result;
    }
}
Then I got this error:
No property named "targetListTransformedWithAnotherMethod" exists in source parameter(s). Please define the source explicitly.
At the beginning I had to deal with Lombok-MapStruct issues by reading some other posts on Stack Overflow and the MapStruct documentation (Can I use MapStruct together with Project Lombok?), but once that was solved everything worked, and Lombok annotations are correctly processed before MapStruct's. Just in case, I also tried hand-written code for the constructors, target builder and getters, with the same result, so Lombok is not causing this. I also made sure I am using the right @Named annotation (Why does @Named not work?).
My real use case (the code here is just a simplification for the sake of clarity) has no source attribute named "targetListTransformedWithAnotherMethod".
The last part of the error finally gave me the hint: "Please define the source explicitly."
Looking at the MapStruct source I saw that it was trying to find an attribute in SourceBeanOne named "targetListTransformedWithAnotherMethod". As there was none, the mapping was not possible and it failed saying "[...] define the source explicitly".
Although none of the examples in 5.9. Mapping method selection based on qualifiers indicate the source bean by means of the source parameter, I tried it, and now it works.
I leave here the final code of the mapper and a test, for reference. The code of the qualifier is the same as in the question above.
The mapper:
@Mapper(uses = SourceOneExtractor.class)
public interface TargetBeanMapper {

    @Mapping(target = "targetListTransformed", source = "source1", qualifiedByName = {"sourceOneTransformer", "transformerOne"})
    @Mapping(target = "targetListDirectMapping", source = "source1.sourceListDirectMapping")
    @Mapping(target = "targetListTransformedWithAnotherMethod", source = "source1", qualifiedByName = {"sourceOneTransformer", "transformerTwo"})
    @Mapping(target = "targetString", source = "source1.sourceString")
    @Mapping(target = "targetInteger", source = "source2.sourceInteger")
    TargetBean map(SourceBeanOne source1, SourceBeanTwo source2);
}
A sample test:
class TargetBeanMapperTest {

    private static TargetBeanMapper mapper;

    @BeforeAll
    static void instantiateMapper() {
        // instantiate mapper under test
        mapper = Mappers.getMapper(TargetBeanMapper.class);
    }

    @Test
    void map() {
        final SourceBeanOne s1 = new SourceBeanOne(List.of("this", "list", "will be transformed"),
                List.of("this", "list", "will be directly mapped"),
                "sourceOneString");
        final SourceBeanTwo s2 = new SourceBeanTwo(66);
        final TargetBean t = mapper.map(s1, s2);
        assertNotEquals(s1.getSourceListToTransform(),
                t.getTargetListTransformed()); // check that the source1 list has been transformed
        assertEquals(List.of("one", "two"), t.getTargetListTransformed());
        assertEquals(List.of("three", "four"),
                t.getTargetListTransformedWithAnotherMethod()); // check the second transformation
        assertEquals(s1.getSourceListDirectMapping(),
                t.getTargetListDirectMapping());
        assertEquals(s1.getSourceString(), t.getTargetString());
        assertEquals(s2.getSourceInteger(), t.getTargetInteger());
    }
}
I am trying to create an object model that represents a hierarchy of nested device locations. For example, a 'deck' contains a 'slide tray', which contains one or more 'slides'. I want to be able to read in a JSON file that contains the hierarchy/configuration of the system. I want to use Lombok builders in my classes so I can safely generate the JSON files in code when I need to. The more common use case is to read in the JSON file to create the POJOs on application startup. Generating the JSON files with the builder works great. However, I have not been able to deserialize the file back into POJOs.
Here is the error I am getting:
com.fasterxml.jackson.databind.exc.InvalidDefinitionException: Cannot construct instance of `my.org.Deck$DeckBuilder` (no Creators, like default construct, exist): cannot deserialize from Object value (no delegate- or property-based Creator)
at [Source: (String)"{"type":"Deck","locNumber":1,
The top level super-class is this:
package my.org;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.annotation.JsonSubTypes;
import com.fasterxml.jackson.annotation.JsonTypeInfo;
import com.fasterxml.jackson.databind.annotation.JsonDeserialize;
import lombok.Getter;
import lombok.Singular;
import lombok.experimental.Accessors;
import lombok.experimental.SuperBuilder;
import java.awt.geom.Point2D;
import java.util.List;
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, property = "type")
@JsonSubTypes({
        @JsonSubTypes.Type(value = Deck.class, name = "Deck"),
        @JsonSubTypes.Type(value = SlideTray.class, name = "SlideTray"),
        @JsonSubTypes.Type(value = Slide.class, name = "Slide"),
        @JsonSubTypes.Type(value = NullLoc.class, name = "null"),
})
@SuperBuilder
@Getter
@Accessors(fluent = true, chain = true)
@JsonDeserialize(builder = BaseLocationType.BaseLocationTypeBuilder.class)
public class BaseLocationType<T extends BaseLocationType> {

    @JsonProperty("locNumber")
    private int locNumber;

    @JsonProperty("posRelativeToParent")
    private Point2D.Double positionRelativeToParent;

    @Singular
    @JsonProperty("childLocs")
    private List<T> childLocs;
}
The Deck sub-class:
package my.org;
import com.fasterxml.jackson.databind.annotation.JsonDeserialize;
import lombok.EqualsAndHashCode;
import lombok.Getter;
import lombok.experimental.Accessors;
import lombok.experimental.SuperBuilder;
@SuperBuilder
@Getter
@EqualsAndHashCode(callSuper = true)
@Accessors(fluent = true, chain = true)
@JsonDeserialize(builder = Deck.DeckBuilder.class)
public class Deck extends BaseLocationType<SlideTray> {
    private String deckField1;
    private String deckField2;
}
The SlideTray sub-class:
package my.org;
import com.fasterxml.jackson.databind.annotation.JsonDeserialize;
import lombok.EqualsAndHashCode;
import lombok.Getter;
import lombok.experimental.Accessors;
import lombok.experimental.SuperBuilder;
@SuperBuilder
@Getter
@EqualsAndHashCode(callSuper = true)
@Accessors(fluent = true, chain = true)
@JsonDeserialize(builder = SlideTray.SlideTrayBuilder.class)
public class SlideTray extends BaseLocationType<Slide> {
    private String slideTrayField1;
}
The Slide sub-class:
package my.org;
import com.fasterxml.jackson.databind.annotation.JsonDeserialize;
import lombok.EqualsAndHashCode;
import lombok.Getter;
import lombok.experimental.Accessors;
import lombok.experimental.SuperBuilder;
@SuperBuilder
@Getter
@EqualsAndHashCode(callSuper = true)
@Accessors(fluent = true, chain = true)
@JsonDeserialize(builder = Slide.SlideBuilder.class)
public class Slide extends BaseLocationType<NullLoc> {
    private String slideField1;
}
NullLoc:
package my.org;
import lombok.experimental.SuperBuilder;
@SuperBuilder
public class NullLoc extends BaseLocationType<NullLoc> {
    // no fields or builder, etc.
}
Test Code - fails with the above exception on mapper.readValue():
// create 1 deck with 1 slideTray that has 2 slides
Deck.DeckBuilder<?, ?> deckBuilder = Deck.builder()
        .locNumber(1)
        .positionRelativeToParent(new Point2D.Double(1.0, 1.0))
        .deckField1("deck f1 data")
        .deckField2("deck f2 data")
        .childLoc(SlideTray.builder()
                .locNumber(2)
                .positionRelativeToParent(new Point2D.Double(2.0, 2.0))
                .slideTrayField1("slide tray f1 data")
                .childLoc(Slide.builder()
                        .locNumber(3)
                        .positionRelativeToParent(new Point2D.Double(3.0, 3.0))
                        .slideField1("child1-slide f1 data")
                        .build())
                .childLoc(Slide.builder()
                        .locNumber(4)
                        .positionRelativeToParent(new Point2D.Double(4.0, 4.0))
                        .slideField1("child2-slide f1 data")
                        .build())
                .build());
Deck deckPojo = deckBuilder.build();
// serialize the pojo's
String json = new ObjectMapper().writeValueAsString(deckPojo);
// de-serialize the json back into the pojo's
ObjectMapper mapper = new ObjectMapper();
Deck deckPojoDeserialized = mapper.readValue(json, Deck.class);
The json that is generated:
{
"type": "Deck",
"locNumber": 1,
"posRelativeToParent": {
"x": 1.0,
"y": 1.0
},
"childLocs": [
{
"type": "SlideTray",
"locNumber": 2,
"posRelativeToParent": {
"x": 2.0,
"y": 2.0
},
"childLocs": [
{
"type": "Slide",
"locNumber": 3,
"posRelativeToParent": {
"x": 3.0,
"y": 3.0
},
"childLocs": []
},
{
"type": "Slide",
"locNumber": 4,
"posRelativeToParent": {
"x": 4.0,
"y": 4.0
},
"childLocs": []
}
]
}
]
}
note: I'm not seeing an option here on Stack Overflow to upload the demo-project zip file... but I can figure out a way to share that if needed.
Thanks!
I think the root problem is related to the @JsonDeserialize builder values defined across the three primary sub-classes, because they appear to be abstract class references. That would also explain the error message you're receiving.
From the Lombok @SuperBuilder documentation:
To ensure type-safety, #SuperBuilder generates two inner builder classes for each annotated class, one abstract and one concrete class named FoobarBuilder and FoobarBuilderImpl (where Foobar is the name of the annotated class).
I believe updating the following @JsonDeserialize builder values will help resolve the issue:
In the Deck sub-class:
@JsonDeserialize(builder = Deck.DeckBuilderImpl.class)
In the SlideTray sub-class:
@JsonDeserialize(builder = SlideTray.SlideTrayBuilderImpl.class)
In the Slide sub-class:
@JsonDeserialize(builder = Slide.SlideBuilderImpl.class)
Additional note with respect to BuilderImpl manual updates:
The @SuperBuilder documentation includes the following supporting information relative to this topic:
Customizing the code generated by @SuperBuilder is limited to adding new methods or annotations to the builder classes, and providing custom implementations of the 'set', builder(), and build() methods. You have to make sure that the builder class declaration headers match those that would have been generated by lombok. Due to the heavy generics usage, we strongly advise to copy the builder class definition header from the uncustomized delomboked code.
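As a side note: since Lombok v1.18.14 there is also the @Jacksonized annotation, which configures the generated (Super)Builder for Jackson deserialization automatically, so the builder class no longer has to be named by hand in @JsonDeserialize. A sketch of what that could look like on the Deck class (the field names are taken from the question; verify against your Lombok version):

```java
import lombok.Getter;
import lombok.experimental.SuperBuilder;
import lombok.extern.jackson.Jacksonized;

// Sketch: @Jacksonized wires the @SuperBuilder builder into Jackson,
// replacing the manual @JsonDeserialize(builder = ...) declaration.
@Jacksonized
@SuperBuilder
@Getter
public class Deck extends BaseLocationType<SlideTray> {
    private String deckField1;
    private String deckField2;
}
```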
I have inherited a MASSIVE Java application and am having quite a few problems trying to find my way around it. I have a specific problem regarding Spring JpaRepository. Please note that I have just started in Spring and am not that sure footed yet.
I have a repository with the @RepositoryRestResource annotation.
@RepositoryRestResource
public interface GoodsReceiptRepository extends JpaRepository<GoodsReceipt, GoodsReceiptId> {
I have the @Entity classes as well:
@Entity
@Table(name = AvisoPosition.TABLE)
@IdClass(AvisoPositionId.class)
public class AvisoPosition implements Serializable {
and
@Entity
@Table(name = GoodsReceipt.TABLE)
@IdClass(GoodsReceiptId.class)
public class GoodsReceipt implements Serializable {
Any fields with the @Id annotation are not returned in the JSON response:
@Id @Column(name = "LGNTCC")
private String accountingArea;
How do I get these ID fields?
If I remove the #Id I get what I want but I do not dare do that as I cannot judge what effect that will have on the application.
Cheers
It seems that you're using Spring Data REST, and what you're seeing is its default behavior.
You can customize this behavior by doing the following:
import org.springframework.context.annotation.Configuration;
import org.springframework.data.rest.core.config.RepositoryRestConfiguration;
import org.springframework.data.rest.webmvc.config.RepositoryRestConfigurerAdapter;

@Configuration
public class RepositoryConfig extends RepositoryRestConfigurerAdapter {

    @Override
    public void configureRepositoryRestConfiguration(RepositoryRestConfiguration config) {
        config.exposeIdsFor(YourClassNameGoesHere.class);
    }
}
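Side note: RepositoryRestConfigurerAdapter is deprecated in recent Spring Data REST versions; the same customization is done there by composing a RepositoryRestConfigurer directly. A sketch, assuming a recent Spring Data REST on the classpath (the entity class names are taken from the question):

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.rest.webmvc.config.RepositoryRestConfigurer;

@Configuration
public class RepositoryConfig {

    // Expose entity ids for the listed classes; add every entity whose id you need.
    @Bean
    public RepositoryRestConfigurer repositoryRestConfigurer() {
        return RepositoryRestConfigurer.withConfig(config ->
                config.exposeIdsFor(GoodsReceipt.class, AvisoPosition.class));
    }
}
```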
Try creating a DTO for each entity, call avisoPositionDTO.setId(avisoPosition.getId());, and return the DTO of each entity instead.
I'm working on a presentation in which I would like to show the difference in the number of executed SQL queries between the deleteByPost() method with and without a custom query. I expect the method without a custom query to execute 10001 delete queries and, with it, just 2.
I'm aware of the Hibernate Statistics object and its methods. I was expecting one of them, getQueryExecutionCount(), to return the number of SQL queries executed against the DB, but what I get is always 0.
In case anyone wonders: Hibernate statistics are definitely enabled, because I'm getting correct numbers for other properties, like the count of deleted entities.
Below there is a complete example showing what I am trying to accomplish.
Is there a way to get the number of generated and executed queries using Statistics or any other mechanism? Currently I'm looking at logs (hibernate.show_sql) and counting printed queries but it just seems wrong to me.
package example5
import org.hibernate.SessionFactory
import org.junit.jupiter.api.AfterEach
import org.junit.jupiter.api.Assertions.assertEquals
import org.junit.jupiter.api.BeforeEach
import org.junit.jupiter.api.Test
import org.junit.jupiter.api.assertAll
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.boot.autoconfigure.SpringBootApplication
import org.springframework.data.jpa.repository.Modifying
import org.springframework.data.jpa.repository.Query
import org.springframework.data.jpa.repository.config.EnableJpaRepositories
import org.springframework.data.repository.PagingAndSortingRepository
import org.springframework.data.repository.query.Param
import org.springframework.stereotype.Repository
import org.springframework.stereotype.Service
import org.springframework.test.context.junit.jupiter.SpringJUnitJupiterConfig
import org.springframework.transaction.annotation.EnableTransactionManagement
import org.springframework.transaction.annotation.Transactional
import javax.persistence.*
// ENTITIES
@Entity
@Table(name = "posts")
class Post(
    @Id
    @Column(name = "id")
    @GeneratedValue(strategy = GenerationType.SEQUENCE)
    var id: Long? = null,

    @Version
    @Column(name = "version")
    var version: Long? = null
)

@Entity
@Table(name = "comments")
class Comment(
    @Id
    @Column(name = "id")
    @GeneratedValue(strategy = GenerationType.SEQUENCE)
    var id: Long? = null,

    @Version
    @Column(name = "version")
    var version: Long? = null,

    @JoinColumn(name = "post_id")
    @ManyToOne(fetch = FetchType.LAZY)
    var post: Post? = null
)
// REPOSITORIES
@Repository
interface PostRepository : PagingAndSortingRepository<Post, Long>

@Repository
interface CommentRepository : PagingAndSortingRepository<Comment, Long> {
    @Modifying
    @Query("delete from Comment c where c.post = :post")
    fun deleteByPost(@Param("post") post: Post)
}
// SERVICES
interface PostService {
    fun delete(post: Post)
}

@Service
open class PostServiceImpl(
    @Autowired
    val postRepository: PostRepository,
    @Autowired
    val commentRepository: CommentRepository
) : PostService {

    @Transactional
    override fun delete(post: Post) {
        commentRepository.deleteByPost(post)
        postRepository.delete(post)
    }
}
// CONFIGURATION
@EnableJpaRepositories(basePackages = ["example5"])
@EnableTransactionManagement
@SpringBootApplication(scanBasePackages = ["example5"])
open class FrameworkApplication
// TESTS
@SpringJUnitJupiterConfig(classes = [FrameworkApplication::class])
class Example5(
    @Autowired
    val postService: PostService,
    @Autowired
    val postRepository: PostRepository,
    @Autowired
    val commentRepository: CommentRepository,
    @Autowired
    val emFactory: EntityManagerFactory
) {

    @AfterEach
    fun cleanUp() {
        commentRepository.deleteAll()
        postRepository.deleteAll()
    }

    @Test
    fun testDelete() {
        // given
        var post = Post()
        post = postRepository.save(post)
        val comments = mutableListOf<Comment>()
        for (i in 1..10000) {
            val comment = Comment()
            comment.post = post
            comments.add(comment)
        }
        commentRepository.save(comments)
        val sessionFactory = emFactory.unwrap(SessionFactory::class.java)
        val statistics = sessionFactory.statistics

        // when
        statistics.clear()
        postService.delete(post)
        val executedQueryCount = statistics.queryExecutionCount

        // then
        assertAll(
            { assertEquals(0, postRepository.count()) },
            { assertEquals(0, commentRepository.count()) },
            { assertEquals(2, executedQueryCount) }
        )
    }
}
The library Spring Hibernate Query Utils (https://github.com/yannbriancon/spring-hibernate-query-utils) provides a query counter that you can use to check the number of queries generated.
If you prefer to do it yourself, Hibernate provides a class EmptyInterceptor that contains a hook named onPrepareStatement. You can extend this class and add logic in the onPrepareStatement hook to count the queries. Take a look at the library code to see how to configure the interceptor.
The method onPrepareStatement is now deprecated and has been removed in Hibernate 6. The new way to inspect SQL is to implement a StatementInspector.
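A minimal counting StatementInspector might look like the following (a sketch, assuming Hibernate 6 on the classpath; it would be registered via the hibernate.session_factory.statement_inspector setting, e.g. through spring.jpa.properties in a Spring Boot app):

```java
import java.util.concurrent.atomic.AtomicLong;
import org.hibernate.resource.jdbc.spi.StatementInspector;

// Sketch: counts every SQL statement Hibernate prepares, without altering it.
public class CountingStatementInspector implements StatementInspector {

    private final AtomicLong count = new AtomicLong();

    @Override
    public String inspect(String sql) {
        count.incrementAndGet();
        return sql; // return the statement unchanged
    }

    public long getCount() {
        return count.get();
    }

    public void reset() {
        count.set(0);
    }
}
```

A test could then reset() before the operation under measurement and assert on getCount() afterwards.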
I've written a little library (https://github.com/Lemick/hibernate-query-asserts) that can assert the count of SQL queries by type (SELECT, INSERT, ...) generated by Hibernate in your Spring tests. This way you can be warned whenever the SQL statements in your tests change, and prevent N+1 selects. You can take a look at the project to see how this is implemented.
A test example that demonstrates the purpose:
@Test
@Transactional
@AssertHibernateSQLCount(inserts = 3)
void create_two_blog_posts() {
    BlogPost post_1 = new BlogPost("Blog post 1");
    post_1.addComment(new PostComment("Good article"));
    post_1.addComment(new PostComment("Very interesting"));
    blogPostRepository.save(post_1);
}
Edit: I found a related question here, but the only 2 answers contradict each other, and there was not enough information to address my use case.
I am trying to use Spring Data Mongo to load records from a collection. One of the fields within those records is an Enum, defined as such:
@AllArgsConstructor
@Getter
@JsonFormat(shape = JsonFormat.Shape.OBJECT)
public enum Action {
    APPROVED("Approved"),
    SAVED("Saved"),
    CORRECTED("Corrected");

    private String name;

    @JsonCreator
    static Action findValue(@JsonProperty("name") String name) {
        return Arrays.stream(Action.values()).filter(v -> v.name.equals(name)).findFirst().get();
    }
}
This should define enums to be serialized and deserialized according to a JSON representation: {"name": "Saved"} for example.
Jackson seems to be working fine, since I threw an API call at it and told it to expect an Action type, and it read the enum without any issues.
public void save(@RequestBody @Valid Action action) {
    System.out.println(action.getName());
} // successfully prints the name of whatever Action I give it
However, when I try to read an object with an Action field using Spring Data Mongo, I get the following:
Expected to read Document Document{{name=Corrected}} into type class package.structure.for.some.proprietary.stuff.constants.Action but didn't find a PersistentEntity for the latter!
So I'm thinking Spring Data Mongo just can't make heads or tails of these enums for whatever reason. But I'm not sure how to help it register that as a PersistentEntity. The main class of my Spring Boot app is in package package.structure.for.some.proprietary.stuff and is annotated as such:
@ComponentScan("package.structure")
@EnableTransactionManagement
@EnableAutoConfiguration
@SpringBootApplication
The object in particular I'm trying to read is defined by this POJO:
import java.util.Date;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import lombok.Data;
import lombok.NonNull;
import package.structure.for.some.proprietary.stuff.constants.Action;
import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.annotation.JsonPropertyOrder;
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
        "timeStamp",
        "action",
})
@Data
@Document(collection = "sample_coll")
public class Sample {

    @Id
    @JsonIgnore
    private String id = null;

    @JsonProperty("timeStamp")
    @NonNull
    private Date timeStamp;

    @JsonProperty("action")
    @NonNull
    private Action action;
}
and is queried from the collection with a MongoRepository:
public interface SampleRepository extends MongoRepository<Sample, String> {
}
using SampleRepository.findAll();
So my big question is, how do I get Spring Data Mongo to recognize this enum Action as a PersistentEntity?
Try @Enumerated:
@Enumerated
@JsonProperty("action")
@NonNull
private Action action;
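If @Enumerated (a JPA annotation) doesn't help in a Spring Data MongoDB context, another commonly used option is a custom reading converter registered through MongoCustomConversions. A sketch, assuming the Action enum from the question (findValue would need to be accessible from this configuration class, or the lookup inlined):

```java
import java.util.List;
import org.bson.Document;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;

@Configuration
public class MongoConfig {

    // Reads {"name": "Saved"}-style documents back into the Action enum.
    @ReadingConverter
    static class ActionReadingConverter implements Converter<Document, Action> {
        @Override
        public Action convert(Document source) {
            return Action.findValue(source.getString("name"));
        }
    }

    @Bean
    public MongoCustomConversions mongoCustomConversions() {
        return new MongoCustomConversions(List.of(new ActionReadingConverter()));
    }
}
```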