Passing Dynamic collection name on Spring MongoRepository not working - java

I'm creating collections dynamically with mongoTemplate in the service layer. Up to that point everything works, but saving into the dynamically created collection fails. Details below.
Service Layer
public void createCollection(String collectionName) {
    mongoTemplate.createCollection(collectionName);
}

public Object updateLessonOrSurveyOrQuery(String courseID, int levelNo, CourseAsset courseAssetToUpdate) {
    .....
    courseAssetRepo.saveByCourseID(courseID, courseAssetToUpdate);
    .....
}
Repo Layer
@Repository
public interface CourseAssetRepo extends MongoRepository<CourseAsset, String> {
    ArrayList<CourseAsset> findAllByCourseID(String courseID);
    void saveByCourseID(String courseID, CourseAsset courseAsset);
}
findAllByCourseID works, but saveByCourseID does not.
POJO class
@Data
public class CourseAsset {
    private int level;
    private String title;
    private String courseID;
}
ERROR :
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'courseAssetRepo' defined in com.dotbw.learn.repo.CourseAssetRepo defined in @EnableMongoRepositories declared on MongoRepositoriesRegistrar.EnableMongoRepositoriesConfiguration: Could not create query for public abstract void com.dotbw.learn.repo.CourseAssetRepo.saveByCourseID(java.lang.String,com.dotbw.learn.model.CourseAsset); Reason: No property 'saveByCourseID' found for type 'CourseAsset'
I understand the repository expects the method name to refer to a property of the CourseAsset POJO. But then, when saving, how can I provide this value?
I have tried many things suggested by ChatGPT, but nothing worked.
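Note: Spring Data derives repository query methods from property names, so saveByCourseID is parsed as a query on a (non-existent) saveByCourseID property, which is why the repository fails to bootstrap; "save" is not a supported query-method prefix. One possible workaround is a minimal sketch like the one below, assuming each course gets its own collection named after its courseID (adjust to your actual naming scheme): skip the derived method and save through MongoTemplate, which has an overload taking an explicit collection name.

// Sketch only: saving into a collection chosen at runtime via MongoTemplate.
// The collection name here is assumed to be the courseID itself.
@Service
public class CourseAssetService {

    @Autowired
    private MongoTemplate mongoTemplate;

    public void createCollection(String collectionName) {
        mongoTemplate.createCollection(collectionName);
    }

    public CourseAsset saveByCourseID(String courseID, CourseAsset courseAsset) {
        // MongoTemplate.save(Object, String) writes the entity into the given collection
        return mongoTemplate.save(courseAsset, courseID);
    }
}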

Related

How can I use QueryDSL on multiple mongo repositories?

I am building a profile service with the typical REST endpoints for creating, reading, updating and deleting profiles. For this I am using the Spring Framework together with MongoDB. On top of that I would like to use QueryDSL to create some custom queries.
A full minimal working example of the current implementation can be found here: https://github.com/mirrom/profile-modules
I would like to have sub profile models that extend the base profile model, and sub sub models that extend the sub models. This gives me hierarchical profiles that inherit the fields of their parent profile. The idea is to store all profiles in the same collection and distinguish them via the automatically created _class field.
A simple example (with Lombok annotations):
@Data
@Document(collection = "profiles")
@Entity
public class Profile {
    @Id
    private ObjectId id;
    @Indexed
    private String title;
    @Indexed
    private String description;
    private LocalDateTime createdAt;
    private LocalDateTime modifiedAt;
}
@Data
@Entity
@EqualsAndHashCode(callSuper = true)
public class Sub1Profile extends Profile {
    private String sub1String;
    private int sub1Integer;
}
While all profiles can be accessed via the endpoint /api/v1/profiles, the sub1Profiles can be accessed via /api/v1/profiles/sub-1-profiles. Currently the sub1Profiles endpoint delivers all profiles, but it should only deliver the sub1Profiles and their children. For this I would like to use QueryDSL, but I can't add QuerydslPredicateExecutor<Profile> and QuerydslBinderCustomizer<QProfile> to more than one repository interface. This is what my profile repository looks like:
@Repository
public interface ProfileRepository extends MongoRepository<Profile, ObjectId>, QuerydslPredicateExecutor<Profile>,
        QuerydslBinderCustomizer<QProfile> {

    @Override
    default void customize(QuerydslBindings bindings, QProfile root) {
        bindings.bind(String.class)
                .first((SingleValueBinding<StringPath, String>) StringExpression::containsIgnoreCase);
    }
}
If I now try to do the same with Sub1ProfileRepository:
@Repository
public interface Sub1ProfileRepository
        extends MongoRepository<Sub1Profile, ObjectId>, QuerydslPredicateExecutor<Sub1Profile>,
        QuerydslBinderCustomizer<QSub1Profile> {

    default void customize(QuerydslBindings bindings, QProfile root) {
        bindings.bind(String.class)
                .first((SingleValueBinding<StringPath, String>) StringExpression::containsIgnoreCase);
    }
}
I get this error message:
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'sub1ProfileRepository' defined in com.example.profile.repository.sub1profile.Sub1ProfileRepository defined in @EnableMongoRepositories declared on MongoRepositoriesRegistrar.EnableMongoRepositoriesConfiguration: Invocation of init method failed; nested exception is org.springframework.data.mapping.PropertyReferenceException: No property customize found for type Sub1Profile!
What am I missing?
In Sub1ProfileRepository's customize method, you have used QProfile as the method argument. Can you use QSub1Profile instead and check whether it works?
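For reference, a minimal sketch of Sub1ProfileRepository with that change applied (QSub1Profile substituted for QProfile, otherwise unchanged from the question):

@Repository
public interface Sub1ProfileRepository
        extends MongoRepository<Sub1Profile, ObjectId>, QuerydslPredicateExecutor<Sub1Profile>,
        QuerydslBinderCustomizer<QSub1Profile> {

    @Override
    default void customize(QuerydslBindings bindings, QSub1Profile root) {
        bindings.bind(String.class)
                .first((SingleValueBinding<StringPath, String>) StringExpression::containsIgnoreCase);
    }
}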

Is there an equivalent of Jackson + Spring's `@JsonView` using Quarkus + JSONB?

I'm playing with Quarkus and trying to build a CRUD REST application; I'm trying to get 2 endpoints returning 2 different views of the same entities. Here is an example of how I would have done it in Spring + Jackson:
@Entity
public class Car {
    public String model;

    @ManyToOne(fetch = FetchType.LAZY, cascade = {CascadeType.ALL})
    public Owner owner;
    // [...]
}
@Entity
public class Owner {
    public String name;
    // [...]
}
Here is the important part: if I were using Jackson, I would have created a CarView class:
public class CarView {
    public static class Public {}
    public static class Private extends Public {}
}
With that, I would annotate Car.model with @JsonView(CarView.Public.class) and Car.owner with @JsonView(CarView.Private.class), and then annotate my methods in the REST controller with the same annotations to tell Jackson which view I want to use:
@RequestMapping("/car/{id}")
@JsonView(CarView.Public.class)
public Car getPublic(@PathVariable int id) { /*...*/ }

@RequestMapping("/car/private/{id}")
@JsonView(CarView.Private.class)
public Car getPrivate(@PathVariable int id) { /*...*/ }
Can I accomplish the same result using Quarkus & JSON-B?
Quarkus supports the use of JsonViews to manage serialization/deserialization of requests and responses.
(Just to let you know, it is sadly not supported (yet) by the smallrye-openapi implementation, so even though serialization works, you'll still see the full model in Swagger.)
An example of usage, taken from the official guide https://quarkus.io/guides/resteasy-reactive#jsonview-support:
JAX-RS methods can be annotated with @JsonView in order to customize the serialization of the returned POJO, on a per-method basis. This is best explained with an example.
A typical use of @JsonView is to hide certain fields on certain methods. In that vein, let's define two views:
public class Views {
    public static class Public {
    }

    public static class Private extends Public {
    }
}
Let’s assume we have the User POJO on which we want to hide some field during serialization. A simple example of this is:
public class User {
    @JsonView(Views.Private.class)
    public int id;

    @JsonView(Views.Public.class)
    public String name;
}
Depending on the JAX-RS method that returns this user, we might want to exclude the id field from serialization - for example you might want an insecure method to not expose this field. The way we can achieve that in RESTEasy Reactive is shown in the following example:
@JsonView(Views.Public.class)
@GET
@Path("/public")
public User userPublic() {
    return testUser();
}

@JsonView(Views.Private.class)
@GET
@Path("/private")
public User userPrivate() {
    return testUser();
}
When the result of the userPublic method is serialized, the id field will not be contained in the response, as the Public view does not include it. The result of userPrivate, however, will include the id as expected when serialized.
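For illustration, assuming testUser() returns a user with id 1 and name "alice" (values made up here), the /public endpoint would serialize to {"name":"alice"}, while /private would produce {"id":1,"name":"alice"}.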
Have you checked @JsonbVisibility or the "Jsonb adapter" part of https://javaee.github.io/jsonb-spec/users-guide.html? I'm afraid there may not be an equivalent of Jackson's @JsonView in JSON-B yet. A Jsonb adapter is configuration at the bean level (you choose the Jsonb instance when you (de)serialize), not at the view level.
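For what it's worth, here is a rough sketch of that bean-level adapter approach, using the standard JsonbAdapter interface (CarPublicView is a made-up DTO introduced purely for illustration):

import javax.json.bind.adapter.JsonbAdapter;

// Hypothetical "public" shape of Car, without the owner field
class CarPublicView {
    public String model;
}

public class CarPublicAdapter implements JsonbAdapter<Car, CarPublicView> {

    @Override
    public CarPublicView adaptToJson(Car car) {
        // Only the public fields survive serialization; owner is dropped
        CarPublicView view = new CarPublicView();
        view.model = car.model;
        return view;
    }

    @Override
    public Car adaptFromJson(CarPublicView view) {
        Car car = new Car();
        car.model = view.model;
        return car;
    }
}

The adapter is registered on a Jsonb instance, e.g. JsonbBuilder.create(new JsonbConfig().withAdapters(new CarPublicAdapter())), which is why this works per configuration rather than per endpoint like @JsonView.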

Spring Data JPA mapping nested entities

I'm a little bit confused about using projections in Spring Data JPA.
I wanted to optimize my queries by requesting only the needed columns (preferably in one query), and I thought using projections was a good idea. But it seems that a projection containing a nested projection becomes an open projection and requests all columns, and further nesting is impossible.
I've tried to find a solution with @Query (couldn't find how to map nested lists), @EntityGraph (couldn't find how to request only specific columns) and @SqlResultSetMapping (couldn't find how to map nested lists), but nothing has worked for me.
Is there any solution other than receiving List<Object[]> and mapping it manually?
I have the following entity classes (simplified for the question):
public class TestAttempt {
    private Long id;
    private User targetUser;
    private Test test;
}

public class Test {
    private Long id;
    private String name;
    private Set<Question> questions;
}

public class Question {
    private Long id;
    private String name;
    private Test test;
}
And I wanted to write something like this (it can be just TestAttempt with null in unused fields):
public interface TestAttemptList {
    Long getId();
    Test getTest();

    interface Test {
        String getName();
        List<Question> getQuestions();

        interface Question {
            String getName();
        }
    }
}

public interface TestAttemptRepository extends JpaRepository<TestAttempt, Long> {
    List<TestAttemptList> getAllByTargetUserId(Long targetUserId);
}
And as a result get something like this:
{
    id: 1,
    test: {
        name: test1,
        questions: [{
            name: quest1
        }, {
            name: quest2
        }]
    }
}
I've done something like this: you keep your repository interfaces, which extend CrudRepository et al. with the full objects (TestAttempt etc.), and you define your projections separately. The projection interfaces can contain other projection interfaces (TestAttemptSummary can contain a TestSummary). When a projection interface is used within the given repository, the defined methods are applied to the object type the repository is configured for. Something like this:
public interface TestAttemptSummary {
    Long getId();
    TestSummary getTest();
}

public interface TestSummary {
    String getName();
    List<QuestionSummary> getQuestions();
}

public interface QuestionSummary {
    String getName();
}

public interface TestAttemptRepository extends CrudRepository<TestAttempt, Long> {
    TestAttemptSummary getTestAttemptSummary();
}
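As a minimal sketch (assuming Spring Data's interface-based projections; the derived method name mirrors the one in the question), the repository would typically return the projection directly from a query method rather than via a no-argument getter:

public interface TestAttemptRepository extends CrudRepository<TestAttempt, Long> {
    // Spring Data backs each returned TestAttemptSummary with the matching TestAttempt
    List<TestAttemptSummary> getAllByTargetUserId(Long targetUserId);
}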

ModelMapper - Failed to instantiate instance of destination

I'm working with MongoDB, so I'm decoupling entities from the presentation layer by creating DTOs (with hibernate-validator annotations).
public abstract class UserDTO {
    private String id;

    @NotNull
    protected String firstName;

    @NotNull
    protected String lastName;

    protected UserType type;
    protected ContactInfoDTO contact;
    protected List<ResumeDTO> resumes;

    public UserDTO() {}
    //...
I'm trying to retrieve this concrete class from the DB:
public class UserType1DTO extends UserDTO {
    private CompanyDTO company;

    public UserType1DTO() {
        super();
    }

    public UserType1DTO(String firstName, String lastName, ContactInfoDTO contact, CompanyDTO company) {
        super(UserType.type1, firstName, lastName, contact);
        this.company = company;
    }
    //...
Like this:
return mapper.map((UserType1) entity, UserType1DTO.class);
And I get this error about not being able to instantiate ResumeDTO:
Failed to instantiate instance of destination *.dto.ResumeDTO. Ensure that *.dto.ResumeDTO has a non-private no-argument constructor.
ResumeDTO is similar to UserDTO: it is an abstract class and has concrete subclasses for each user type. They all have no-argument constructors.
What is the problem?
You are trying to map a concrete class to an abstract class; this will not work.
You cannot use an abstract class as the destination. Why? It cannot be instantiated, so you must use a concrete class.
Mapping with an abstract class as the destination definitely won't work:
mapper.map(entity, AbstractClass.class);
/*Error: java.lang.InstantiationException
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)*/
You must use a concrete class which extends the abstract class:
public class ConcreteClass extends AbstractClass {
//
}
And then map it to this concrete class:
mapper.map(entity, ConcreteClass.class);
More info:
Since it is not possible to instantiate an abstract class, it will not work for destination properties either.
There is a GitHub issue related to this: https://github.com/jhalterman/modelmapper/issues/130
This error also occurs when you have primitive data types or primitive return types in setters and getters, or a parameterized constructor.
So here you need to remove the following code:
public UserType1DTO(String firstName, String lastName, ContactInfoDTO contact,
        CompanyDTO company) {
    super(UserType.type1, firstName, lastName, contact);
    this.company = company;
}
After that it will work fine.
What solved my issue was using typeMap and updating the version of ModelMapper. Please refer to the link below:
Mapping Lists with ModelMapper
Using typeMap alone still gave me the same error; then I updated my ModelMapper version from 2.0.0 to 2.3.5, and the issue was resolved.
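For reference, a rough sketch of that typeMap / TypeToken approach (UserType1 and UserType1DTO come from the question; ResumeType1DTO and entity.getResumes() are made-up stand-ins for a concrete ResumeDTO subclass and its source collection):

ModelMapper mapper = new ModelMapper();

// typeMap registers an explicit mapping between two concrete types
mapper.typeMap(UserType1.class, UserType1DTO.class);
UserType1DTO dto = mapper.map(entity, UserType1DTO.class);

// For collections, a TypeToken preserves the element type; the destination
// element type must again be concrete, not the abstract ResumeDTO
List<ResumeType1DTO> resumes = mapper.map(entity.getResumes(),
        new TypeToken<List<ResumeType1DTO>>() {}.getType());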
I was also getting the same exception. The way I resolved it was by adding a private constructor to the destination class.
ViewDTO viewDto = (new ModelMapper()).map(object, ViewDTO.class);
The "object" is the domain object, and it is mapped to viewDto, a DTO to be sent to the client side.
I just added a private constructor to the ViewDTO class:
private ViewDTO() {}
and this solved my issue. The verified answer also helped a lot.
Hope this is helpful.

Spring Data MongoDB Repository with custom collection name

I am using Spring Data for MongoDB and I need to be able to configure the collection name at runtime.
My repository is defined as:
@Repository
public interface EventDataRepository extends MongoRepository<EventData, String> {
}
I tried this silly example:
#Document(collection = "${mongo.event.collection}")
public class EventData implements Serializable {
but mongo.event.collection did not resolve to a name as it does with a #Value annotation.
After a bit more debugging and searching, I tried the following:
@Document(collection = "#{${mongo.event.collection}}")
This produced an exception:
Caused by: org.springframework.expression.spel.SpelParseException: EL1041E:(pos 1): After parsing a valid expression, there is still more data in the expression: 'lcurly({)'
at org.springframework.expression.spel.standard.InternalSpelExpressionParser.doParseExpression(InternalSpelExpressionParser.java:129)
at org.springframework.expression.spel.standard.SpelExpressionParser.doParseExpression(SpelExpressionParser.java:60)
at org.springframework.expression.spel.standard.SpelExpressionParser.doParseExpression(SpelExpressionParser.java:32)
at org.springframework.expression.common.TemplateAwareExpressionParser.parseExpressions(TemplateAwareExpressionParser.java:154)
at org.springframework.expression.common.TemplateAwareExpressionParser.parseTemplate(TemplateAwareExpressionParser.java:85)
Perhaps I just don't know how to use SpEL to access values from Spring's property configurer.
When stepping through the code, I see that there is a way to specify the collection name or even expressions; however, I am not sure which annotation should be used for this purpose or how to do it.
Thanks.
-AP_
You can solve this problem by just using SpEL:
#Document(collection = "#{environment.getProperty('mongo.event.collection')}")
public class EventData implements Serializable {
...
}
Update Spring 5.x:
Since Spring 5.x or so you need an additional @ before environment:
@Document(collection = "#{@environment.getProperty('mongo.event.collection')}")
public class EventData implements Serializable {
    ...
}
Docs:
SpEL: 4.2 Expressions in Bean Definitions
SpEL: 4.3.12 Bean References
PropertyResolver::getProperty
So, in the end, here is a workaround that did the trick. I guess I really don't know how to access data from Spring's property configurer using SpEL expressions.
In my @Configuration class:
#Value("${mongo.event.collection}")
private String
mongoEventCollectionName;
#Bean
public String mongoEventCollectionName() {
return
mongoEventCollectionName;
}
On my Document:
#Document(collection = "#{mongoEventCollectionName}")
This appears to work and properly picks up the name configured in my .properties file; however, I am still not sure why I could not just access the value with $ as I do in the @Value annotation.
Define your entity class like this:
@Document(collection = "${EventDataRepository.getCollectionName()}")
public class EventData implements Serializable {
Define a custom repository interface with getter and setter methods for "collectionName":
public interface EventDataRepositoryCustom {
    String getCollectionName();
    void setCollectionName(String collectionName);
}
Provide an implementation class for the custom repository with the "collectionName" implementation:
public class EventDataRepositoryImpl implements EventDataRepositoryCustom {

    private static String collectionName = "myCollection";

    @Override
    public String getCollectionName() {
        return collectionName;
    }

    @Override
    public void setCollectionName(String collectionName) {
        this.collectionName = collectionName;
    }
}
Add EventDataRepositoryCustom to the extends list of your repository interface, so it would look like this:
@Repository
public interface EventDataRepository extends MongoRepository<EventData, String>, EventDataRepositoryCustom {
}
Now, in your service class where you are using the MongoRepository, set the collection name. It would look like this:
@Autowired
EventDataRepository repository;

repository.setCollectionName("collectionName");
Entity class:
@Document // remove the parameters from here
public class EscalationCase {
}
Configuration class:
public class MongoDBConfiguration {

    private final Logger logger = LoggerFactory.getLogger(MongoDBConfiguration.class);

    @Value("${sfdc.mongodb.collection}") // taking the collection name from the properties file
    private String collectionName;

    @Bean
    public MongoTemplate mongoTemplate(MongoDbFactory mongoDbFactory, MongoMappingContext context) {
        MappingMongoConverter converter = new MappingMongoConverter(new DefaultDbRefResolver(mongoDbFactory), context);
        converter.setTypeMapper(new DefaultMongoTypeMapper(null));
        MongoTemplate mongoTemplate = new MongoTemplate(mongoDbFactory, converter);
        if (!mongoTemplate.collectionExists(collectionName)) {
            mongoTemplate.createCollection(collectionName); // adding the collection name here
        }
        return mongoTemplate;
    }
}
