I'm using this awesome library, but I have a problem.
I'm implementing a DTO pattern, so I use another project to automatically convert an EJB to a DTO using naming conventions.
Then I want to query against the DTO and get the real result (an EJB query).
I implemented QueryDSL with JPAAnnotationProcessor on my ENTITIES, and the QuerydslAnnotationProcessor on my DTOs.
For example:
An entity User(Long Id, String username, Site site)
A DTO UserDto(Long id, String username, String siteName)
Converting objects works fine: "siteName" automatically matches "site.name".
So I write a QueryDSL query like: userDto.id.gt(20).and(userDto.username.like("a%")).and(userDto.siteName.like("%b"));
I'm looking for a way to build the corresponding entity query
The only idea I have is to:
Clone the Query
Change the path "userDto" to "user"
Verify each predicate to check that the property exists and that the types match
Is there any way to do that, or otherwise reach my goal?
Thanks
Since this is still relevant and undocumented functionality, and since Timo's answer, while helpful, is very cryptic, here's how to do it:
First, extend ReplaceVisitor:
private class CustomReplaceVisitor extends ReplaceVisitor<Void> {
@Override
public Expression<?> visit(Path<?> path, @Nullable Void context) {
// The map Timo mentioned to transform paths:
Map<Path<?>, Path<?>> map = Map.of(
QUser.user.id, QUserDto.userDto.id,
QUser.user.name, QUserDto.userDto.name
);
if (map.containsKey(path)) {
return map.get(path);
} else {
return super.visit(path, context);
}
}
}
Then use it like this:
CustomReplaceVisitor replaceVisitor = new CustomReplaceVisitor();
Predicate userPredicate = QUser.user.id.eq(2L).and(QUser.user.name.eq("Somename"));
Predicate userDtoPredicate = (Predicate) userPredicate.accept(replaceVisitor, null);
You will need to convert the expressions in general. With a custom ReplaceVisitor you can, for example, override visit(Path<?> expr, @Nullable Void context).
A generic way to do the path replacements would be to use a Map<Path<?>, Path<?>> to define the replacements:
if (map.containsKey(path)) {
return map.get(path);
} else {
return super.visit(path, context);
}
You can use your visitor like this:
Expression<?> transformedExpression = expr.accept(visitor, null);
I'm having trouble converting between java.sql.Timestamp and java.time.Instant using JOOQ converters.
Here's a simplified version of the code I'm working with.
public class User {
private static final Converter<Timestamp, Instant> MY_CONVERTER= Converter.of(
Timestamp.class,
Instant.class,
t -> t == null ? null : t.toInstant(),
i -> i == null ? null : Timestamp.from(i)
);
public static Table<?> table = DSL.table("user");
public static Field<String> name = DSL.field(DSL.name(table.getName(), "name"), String.class);
public static Field<Instant> created = DSL.field(DSL.name(table.getName(), "created"), SQLDataType.TIMESTAMP.asConvertedDataType(MY_CONVERTER));
}
public class UserDto {
private String name;
private Instant created;
// getters, setters, etc.
}
public class UserWriter {
// constructor with injected DefaultDSLContext etc..
public void create(UserDto user) {
dslContext.insertInto(User.table, User.name, User.created)
.values(user.getName(), user.getCreated())
.execute();
}
}
public class UserReader {
// constructor with injected DefaultDSLContext etc..
public Result<Record> getAll() {
return dslContext.select().from(User.table).fetch();
}
}
public class UserService {
// constructor with injected UserReader etc..
public Collection<UserDto> getAll() {
return userReader
.getAll()
.stream()
.map(Users::from)
.collect(Collectors.toList());
}
}
public class Users {
public static UserDto from(Record record) {
UserDto user = new UserDto();
user.setName(record.get(User.name));
user.setCreated(record.get(User.created));
return user;
}
}
When I create a new User the converter is called and the insertion works fine. However, when I select the Users the converter isn't called and the record.get(User.created) call in the Users::from method returns a Timestamp (and therefore fails as UserDto.setCreated expects an Instant).
Any ideas?
Thanks!
Why the converter isn't applied
From the way you phrased your question (you didn't post the exact SELECT statement that you've tried), I'm assuming you didn't pass all the column expressions explicitly. But then, how would jOOQ be able to find out what columns your table has? You declared some column expressions in some class, but that class isn't following any structure known to jOOQ. The only way to get jOOQ to fetch all known columns is to make them known to jOOQ, using code generation (see below).
You could, of course, let User extend the internal org.jooq.impl.TableImpl class and use the internal API to register the Field values. But why do that manually, if you can generate this code?
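For example, a minimal sketch using the fields from your User class: when the converted column is listed explicitly in the projection, jOOQ applies MY_CONVERTER on fetch:
Result<Record2<String, Instant>> result = dslContext
        .select(User.name, User.created)   // the Instant-converted field is part of the projection
        .from(User.table)
        .fetch();
Instant created = result.get(0).get(User.created); // already an Instant, no manual conversion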
Code generation
I'll repeat the main point of my previous question, which is: Please use the code generator. I've now written an entire article on why you should do this. Once jOOQ knows all of your meta data via code generation, you can just automatically select all columns like this:
UserRecord user = ctx
.selectFrom(USER)
.where(USER.ID.eq(...))
.fetchOne();
Not just that, you can also configure your data types as INSTANT using a <forcedType>, so you don't need to worry about data type conversion every time.
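A rough sketch of such a forced type in the code generator configuration (the column-matching expression and type filter are assumptions for your "created" column):
<forcedTypes>
    <forcedType>
        <!-- use jOOQ's built-in INSTANT data type instead of TIMESTAMP -->
        <name>INSTANT</name>
        <includeExpression>.*\.CREATED</includeExpression>
        <includeTypes>TIMESTAMP</includeTypes>
    </forcedType>
</forcedTypes>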
I cannot stress this enough, and I'm frequently surprised how many projects try to use jOOQ without code generation, which removes so much of jOOQ's power. The main reason to not use code generation is if your schema is dynamic, but since you have that User class, it obviously isn't dynamic.
Consider the following method on a Spring Data JPA interface:
@Query("select distinct :columnName from Item i")
List<Item> findByName(@Param("columnName") String columnName);
I would like to use such a method for performing queries dynamically using different column names on the same entity. How can this be done?
You can't. You'll have to implement such a method by yourself. And you won't be able to use parameters: you'll have to use String concatenation or the criteria API. What you'll pass won't be a column name but a field/property name. And it won't return a List<Item>, since you only select one field.
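For example, a minimal Criteria API sketch of such a method (the Item entity and the EntityManager access are assumptions; note the result is a List<Object>, not a List<Item>):
public List<Object> findDistinctValues(EntityManager em, String propertyName) {
    CriteriaBuilder cb = em.getCriteriaBuilder();
    CriteriaQuery<Object> query = cb.createQuery(Object.class);
    Root<Item> root = query.from(Item.class);
    // the property name becomes part of the query structure, not a bound parameter
    query.select(root.get(propertyName)).distinct(true);
    return em.createQuery(query).getResultList();
}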
You can use QueryDSL support built into Spring Data. See this tutorial to get started.
First of all you must implement a custom Spring Data repository by adding an interface:
public interface ItemCustomRepository {
List<Item> findBy(String columnName, String columnValue);
}
then you must extend your current Spring Data repository interface with the newly created one, i.e.:
public interface ItemRepository extends JpaRepository<Item, Long>, ItemCustomRepository, QueryDslPredicateExecutor {
}
and then you must implement your interface using the Query DSL dynamic expression feature (the name ItemRepositoryImpl is crucial - the Impl suffix lets Spring Data pick up your custom implementation alongside the generated one):
public class ItemRepositoryImpl implements ItemCustomRepository {
@Autowired
private ItemRepository itemRepository;
public List<Item> findBy(final String columnName, final String columnValue) {
Path<Item> item = Expressions.path(Item.class, "item");
Path<String> itemColumnName = Expressions.path(String.class, item, columnName);
Expression<String> itemValueExpression = Expressions.constant(columnValue);
BooleanExpression fieldEqualsExpression = Expressions.predicate(Ops.EQ, itemColumnName, itemValueExpression);
// QueryDslPredicateExecutor#findAll returns an Iterable, so copy it into a List
Iterable<Item> items = itemRepository.findAll(fieldEqualsExpression);
return StreamSupport.stream(items.spliterator(), false).collect(Collectors.toList());
}
}
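Usage would then look something like this (assuming Item has a name property; the value is arbitrary):
List<Item> items = itemRepository.findBy("name", "someValue");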
I have the following repository:
#Repository
public interface EntityRepository extends JpaRepository<Entity, Long> {
List<Entity> findAllByFirstId(Long firstId);
List<Entity> findAllBySecondId(Long secondId);
List<Entity> findAllByFirstIdAndSecondId(Long firstId, Long secondId);
}
The controller implementing an interface generated with io.swagger:swagger-codegen-maven-plugin uses Optional<Long> for the optional request parameters (the underlying service also uses the same parameters):
ResponseEntity<List<Entity>> entities(Optional<Long> firstId, Optional<Long> secondId);
I would like to filter the entities based on the parameters firstId and secondId, which are never null in the database but may or may not be passed to the controller (the search parameters are optional).
The problem comes with the query methods: when null is passed (since the parameter is optional), the JpaRepository uses the null as a criterion for searching in the database. That's what I don't want - I want to ignore the filtering based on this parameter as long as it is null.
My workaround solution based on Optional is:
public List<Entity> entities(Optional<Long> firstId, Optional<Long> secondId) {
return firstId
.or(() -> secondId)
.map(value -> {
if (firstId.isEmpty()) {
return entityRepository.findAllBySecondId(value);
}
if (secondId.isEmpty()) {
return entityRepository.findAllByFirstId(value);
}
return entityRepository.findAllByFirstIdAndSecondId(
firstId.get(), secondId.get());
})
.orElse(entityRepository.findAll())
.stream()
.map(...) // Mapping between DTO and entity. For the sake of brevity
// I used the same Entity object for both the controller and the repository,
// as it is not related to the question
.collect(Collectors.toList());
}
This issue has already been asked: Spring Data - ignore parameter if it has a null value, and a ticket was created: DATAJPA-209.
Since the question is almost 3 years old and the ticket dates back to 2012, I would like to ask whether there is a more comfortable and universal way to avoid the overhead of handling the Optionals and duplicating the repository methods. The solution for 2 such parameters looks acceptable, but I'd like to implement the very same filtering for 4-5 parameters.
You need a Specification utility class like this:
public class EntitySpecifications {
public static Specification<Entity> firstIdEquals(Optional<Long> firstId) {// or Long firstId. It is better to avoid Optional method parameters.
return (root, query, builder) ->
firstId.isPresent() ? // or firstId != null if you use Long method parameter
builder.equal(root.get("firstId"), firstId.get()) :
builder.conjunction(); // to ignore this clause
}
public static Specification<Entity> secondIdEquals(Optional<Long> secondId) {
return (root, query, builder) ->
secondId.isPresent() ?
builder.equal(root.get("secondId"), secondId.get()) :
builder.conjunction(); // to ignore this clause
}
}
Then your EntityRepository has to extend JpaSpecificationExecutor:
#Repository
public interface EntityRepository
extends JpaRepository<Entity, Long>, JpaSpecificationExecutor<Entity> {
}
Usage:
#Service
public class EntityService {
#Autowired
EntityRepository repository;
public List<Entity> getEntities(Optional<Long> firstId, Optional<Long> secondId) {
Specification<Entity> spec =
Specifications.where(EntitySpecifications.firstIdEquals(firstId)) //Spring Data JPA 2.0: use Specification.where
.and(EntitySpecifications.secondIdEquals(secondId));
return repository.findAll(spec);
}
}
The io.swagger:swagger-codegen-maven-plugin generates them as
Optional since I request them as not required (required: false by
default). I might generate them as boxed types, such as Long, …
It’s probably partly a matter of taste. If it were me and I could, I’d go for the version without Optional. I don’t think they contribute anything useful here.
public List<Entity> entities(Long firstId, Long secondId) {
List<Dto> dtos;
if (firstId == null) {
if (secondId == null) {
dtos = entityRepository.findAll();
} else {
dtos = entityRepository.findAllBySecondId(secondId);
}
} else {
if (secondId == null) {
dtos = entityRepository.findAllByFirstId(firstId);
} else {
dtos = entityRepository.findAllByFirstIdAndSecondId(firstId, secondId);
}
}
return dtos.stream()
.map(...)
.collect(Collectors.toList());
}
The Optional class was designed to be used for return values that may be absent, not really for anything else, so I have read. I think there are rare situations where I’d use them for something else, but this is not one of them.
I'd suggest you use specifications instead. See the documentation and examples here.
Briefly, the idea is the following: for each attribute you define a specification. Then check each attribute in your search criteria and, if it is not null, add the corresponding specification to the "concatenated" specification. Then you search using this "concatenated" specification, as sketched below.
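A minimal sketch of that idea with plain nullable parameters (the third parameter and the field names are illustrative assumptions):
public List<Entity> search(Long firstId, Long secondId, Long thirdId) {
    Specification<Entity> spec = (root, query, cb) -> cb.conjunction(); // always-true start
    if (firstId != null) {
        spec = spec.and((root, query, cb) -> cb.equal(root.get("firstId"), firstId));
    }
    if (secondId != null) {
        spec = spec.and((root, query, cb) -> cb.equal(root.get("secondId"), secondId));
    }
    if (thirdId != null) {
        spec = spec.and((root, query, cb) -> cb.equal(root.get("thirdId"), thirdId));
    }
    return entityRepository.findAll(spec);
}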
I am using validation-api-2.0.1.Final and hibernate-validator-6.0.13.Final. I would like to do validation for the below case,
I have created a custom validation to validate List<Map<String,Object>>
BookInfo.java
@Target({METHOD, FIELD, ANNOTATION_TYPE, CONSTRUCTOR, PARAMETER, TYPE_USE})
@Retention(RetentionPolicy.RUNTIME)
@Documented
@Constraint(
validatedBy = {BookInfoValidator.class}
)
public @interface BookInfo {
String message() default "Should not be empty";
Class<?>[] groups() default {};
Class<? extends Payload>[] payload() default {};
}
BookInfoValidator.java
public class BookInfoValidator implements ConstraintValidator<BookInfo, List<Map<String, Object>>> {
private final ContentRepositoryClient contentRepository;
public BookInfoValidator(ContentRepositoryClient contentRepository) {
this.contentRepository = contentRepository;
}
@Override
public void initialize(BookInfo constraintAnnotation) {
}
@Override
public boolean isValid(List<Map<String,Object>> map, ConstraintValidatorContext constraintValidatorContext) {
//In the list of Map the key will be "text, email, date etc."; based on the key I would like to
//validate with the proper validation constraints
//ex) for Email invoke javax.validation.constraints.Email.class from validation-api
//I am not sure how to manually invoke the validation annotations.
return false;
}
}
BookInfoView.java
class BookInfoView {
@BookInfo
private List<Map<String, Object>> bookInfos;
}
In the list of Map the key will be "text, email, date etc.". Based on the key I would like to validate with the proper validation constraints,
e.g. for "email" invoke javax.validation.constraints.Email from the validation-api. I am not sure how to manually invoke the validation annotations.
Any hint or help will be much appreciated.
I am not sure how to manually invoke the validation annotations.
I am answering the above quoted lines. Yes, it is possible to invoke validation programmatically, and in case of validation failures you will receive all failure messages in a set. Below are the steps to do it:
Build ValidatorFactory
Get hold of a Validator instance from ValidatorFactory
Perform the validation using validate() method
Process the validation result, e.g. constraintViolations.iterator().next().getMessage()
Below is a code snippet covering all four steps mentioned above:
ValidatorFactory factory = Validation.buildDefaultValidatorFactory();
Validator validator = factory.getValidator();
Set<ConstraintViolation<BookInfoView>> constraintViolations = validator.validate(bookInfoViewObj);
assertEquals( "Should not be empty", constraintViolations.iterator().next().getMessage() );
The Hibernate Validator framework provides various other capabilities to validate one or more entities and then process the result. It's better to have a look at the official documentation.
There's no 'nice' way of accessing the validator implementations for constraints (@Email, @NotNull etc.). While you could create instances of those validators and store them in your BookInfoValidator, you would need to do a lot of additional work, because for each validator you would have to call its ConstraintValidator#initialize() method. In the case of simple constraints like @NotNull there's actually nothing to initialize, and the same check can easily be performed without the validator; but in the case of more complex ones like @Email you would need to create your own proxy class for the annotation so that you could properly initialize the constraint validator.
With that said I would suggest to write a wrapper class for your Map, something like:
public class BookInfoWrapper {
private final Map<String, Object> data;
public BookInfoWrapper(Map<String, Object> data) {
this.data = data;
}
@NotNull
public Map<String, Object> getUser(){
return (Map<String, Object>) data.get( "user" );
}
@Email
public String getEmail(){
return Objects.toString(( getUser() ).get( "email" ));
}
// and any other constraints you need
}
And then convert your list of maps to these wrappers before validation.
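For instance, a rough sketch of that step, assuming bookInfos is the List<Map<String, Object>> from your view class:
Validator validator = Validation.buildDefaultValidatorFactory().getValidator();
Set<ConstraintViolation<BookInfoWrapper>> violations = bookInfos.stream()
        .map(BookInfoWrapper::new)
        .map(validator::validate)
        .flatMap(Set::stream)
        .collect(Collectors.toSet());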
I can also see that you have a repository in your validator, hence I think that you might want to derive rules "dynamically". In that case you might want to check out the programmatic API provided by Hibernate Validator. Using it you should be able to build the rules you need based on the data retrieved from the database. But you would still need to wrap the maps first.
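A hedged sketch of what such a programmatic definition might look like (the "email" getter is taken from the wrapper above; adapt the rules to whatever your repository returns):
HibernateValidatorConfiguration configuration = Validation
        .byProvider(HibernateValidator.class)
        .configure();
ConstraintMapping mapping = configuration.createConstraintMapping();
mapping.type(BookInfoWrapper.class)
        .property("email", ElementType.METHOD)
        .constraint(new EmailDef());
Validator validator = configuration.addMapping(mapping)
        .buildValidatorFactory()
        .getValidator();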
To summarize it all, sadly there's no nice and easy solution for your particular case yet. We are working on validation of free-form objects, but it will take us some time to release it. Hence I would suggest that you either:
write the validation checks on your own in your BookInfoValidator without using built-in constraints, or
use the wrapper approach described above.
Overview
Given
Spring Data JPA, Spring Data Rest, QueryDsl
a Meetup entity
with a Map<String,String> properties field
persisted in a MEETUP_PROPERTY table as an #ElementCollection
a MeetupRepository
that extends QueryDslPredicateExecutor<Meetup>
I'd expect
A web query of
GET /api/meetup?properties[aKey]=aValue
to return only Meetups with a property entry that has the specified key and value: aKey=aValue.
However, that's not working for me.
What am I missing?
Tried
Simple Fields
Simple fields work, like name and description:
GET /api/meetup?name=whatever
Collection fields work, like participants:
GET /api/meetup?participants.name=whatever
But not this Map field.
Customize QueryDsl bindings
I've tried customizing the binding by having the repository
extend QuerydslBinderCustomizer<QMeetup>
and overriding the
customize(QuerydslBindings bindings, QMeetup meetup)
method, but while the customize() method is being hit, the binding code inside the lambda is not.
EDIT: I learned that's because QuerydslBindings' way of evaluating the query parameter does not let it match up against the pathSpecs map it holds internally - which contains your custom bindings.
Some Specifics
Meetup.properties field
#ElementCollection(fetch = FetchType.EAGER)
#CollectionTable(name = "MEETUP_PROPERTY", joinColumns = #JoinColumn(name = "MEETUP_ID"))
#MapKeyColumn(name = "KEY")
#Column(name = "VALUE", length = 2048)
private Map<String, String> properties = new HashMap<>();
customized querydsl binding
EDIT: See above; turns out, this was doing nothing for my code.
public interface MeetupRepository extends PagingAndSortingRepository<Meetup, Long>,
QueryDslPredicateExecutor<Meetup>,
QuerydslBinderCustomizer<QMeetup> {
@Override
default void customize(QuerydslBindings bindings, QMeetup meetup) {
bindings.bind(meetup.properties).first((path, value) -> {
BooleanBuilder builder = new BooleanBuilder();
for (String key : value.keySet()) {
builder.and(path.containsKey(key).and(path.get(key).eq(value.get(key))));
}
return builder;
});
}
}
Additional Findings
QuerydslPredicateBuilder.getPredicate() asks QuerydslBindings.getPropertyPath() to try 2 ways to return a path, so it can make a predicate that QuerydslAwareRootResourceInformationHandlerMethodArgumentResolver.postProcess() can use.
One is to look in the customized bindings. I don't see any way to express a map query there.
The other is to default to Spring's bean paths. Same expression problem there - how do you express a map?
So it looks impossible to get QuerydslPredicateBuilder.getPredicate() to automatically create a predicate.
Fine - I can do it manually, if I can hook into QuerydslAwareRootResourceInformationHandlerMethodArgumentResolver.postProcess()
HOW can I override that class, or replace the bean? It's instantiated and returned as a bean in the RepositoryRestMvcConfiguration.repoRequestArgumentResolver() bean declaration.
I can override that bean by declaring my own repoRequestArgumentResolver bean, but it doesn't get used.
It gets overridden by RepositoryRestMvcConfiguration. I can't force it by setting it to @Primary or @Ordered(HIGHEST_PRECEDENCE).
I can force it by explicitly component-scanning RepositoryRestMvcConfiguration.class, but that also messes up Spring Boot's autoconfiguration because it causes
RepositoryRestMvcConfiguration's bean declarations to be processed
before any auto-configuration runs. Among other things, that results in responses that are serialized by Jackson in unwanted ways.
The Question
Well - looks like the support I expected just isn't there.
So the question becomes:
HOW do I correctly override the repoRequestArgumentResolver bean?
BTW - QuerydslAwareRootResourceInformationHandlerMethodArgumentResolver is awkwardly non-public. :/
Replace the Bean
Implement ApplicationContextAware
This is how I replaced the bean in the application context.
It feels a little hacky. I'd love to hear a better way to do this.
@Configuration
public class CustomQuerydslHandlerMethodArgumentResolverConfig implements ApplicationContextAware {
/**
* This class is originally the class that instantiated QuerydslAwareRootResourceInformationHandlerMethodArgumentResolver and placed it into the Spring Application Context
* as a {@link RootResourceInformationHandlerMethodArgumentResolver} by the name of 'repoRequestArgumentResolver'.<br/>
* By injecting this bean, we can let {@link #meetupApiRepoRequestArgumentResolver} delegate as much as possible to the original code in that bean.
*/
private final RepositoryRestMvcConfiguration repositoryRestMvcConfiguration;
@Autowired
public CustomQuerydslHandlerMethodArgumentResolverConfig(RepositoryRestMvcConfiguration repositoryRestMvcConfiguration) {
this.repositoryRestMvcConfiguration = repositoryRestMvcConfiguration;
}
@Override
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
DefaultListableBeanFactory beanFactory = (DefaultListableBeanFactory) ((GenericApplicationContext) applicationContext).getBeanFactory();
beanFactory.destroySingleton(REPO_REQUEST_ARGUMENT_RESOLVER_BEAN_NAME);
beanFactory.registerSingleton(REPO_REQUEST_ARGUMENT_RESOLVER_BEAN_NAME,
meetupApiRepoRequestArgumentResolver(applicationContext, repositoryRestMvcConfiguration));
}
/**
* This code is mostly copied from {@link RepositoryRestMvcConfiguration#repoRequestArgumentResolver()}, except the if clause checking if the QueryDsl library is
* present has been removed, since we're counting on it anyway.<br/>
* That means that if that code changes in the future, we're going to need to alter this code... :/
*/
@Bean
public RootResourceInformationHandlerMethodArgumentResolver meetupApiRepoRequestArgumentResolver(ApplicationContext applicationContext,
RepositoryRestMvcConfiguration repositoryRestMvcConfiguration) {
QuerydslBindingsFactory factory = applicationContext.getBean(QuerydslBindingsFactory.class);
QuerydslPredicateBuilder predicateBuilder = new QuerydslPredicateBuilder(repositoryRestMvcConfiguration.defaultConversionService(),
factory.getEntityPathResolver());
return new CustomQuerydslHandlerMethodArgumentResolver(repositoryRestMvcConfiguration.repositories(),
repositoryRestMvcConfiguration.repositoryInvokerFactory(repositoryRestMvcConfiguration.defaultConversionService()),
repositoryRestMvcConfiguration.resourceMetadataHandlerMethodArgumentResolver(),
predicateBuilder, factory);
}
}
Create a Map-searching predicate from http params
Extend RootResourceInformationHandlerMethodArgumentResolver
And these are the snippets of code that create my own Map-searching predicate based on the http query parameters.
Again - would love to know a better way.
The postProcess method calls:
predicate = addCustomMapPredicates(parameterMap, predicate, domainType).getValue();
just before the predicate reference is passed into the QuerydslRepositoryInvokerAdapter constructor and returned.
Here is that addCustomMapPredicates method:
private BooleanBuilder addCustomMapPredicates(MultiValueMap<String, String> parameters, Predicate predicate, Class<?> domainType) {
BooleanBuilder booleanBuilder = new BooleanBuilder();
parameters.keySet()
.stream()
.filter(s -> s.contains("[") && matches(s) && s.endsWith("]"))
.collect(Collectors.toList())
.forEach(paramKey -> {
String property = paramKey.substring(0, paramKey.indexOf("["));
if (ReflectionUtils.findField(domainType, property) == null) {
LOGGER.warn("Skipping predicate matching on [%s]. It is not a known field on domainType %s", property, domainType.getName());
return;
}
String key = paramKey.substring(paramKey.indexOf("[") + 1, paramKey.indexOf("]"));
parameters.get(paramKey).forEach(value -> {
if (!StringUtils.hasLength(value)) {
booleanBuilder.or(matchesProperty(key, null));
} else {
booleanBuilder.or(matchesProperty(key, value));
}
});
});
return booleanBuilder.and(predicate);
}
static boolean matches(String key) {
return PATTERN.matcher(key).matches();
}
And the pattern:
/**
* disallow a . or ] from preceding a [
*/
private static final Pattern PATTERN = Pattern.compile(".*[^.]\\[.*[^\\[]");
I spent a few days looking into how to do this. In the end I just went with manually adding to the predicate. This solution feels simple and elegant.
So you access the map via
GET /api/meetup?properties.aKey=aValue
On the controller I injected the request parameters and the predicate.
public Page<Meetup> getMeetupList(@QuerydslPredicate(root = Meetup.class) Predicate predicate,
                                  @RequestParam Map<String, String> allRequestParams,
                                  Pageable page) {
Predicate builder = createPredicateQuery(predicate, allRequestParams);
return meetupRepo.findAll(builder, page);
}
I then simply parsed the query parameters and added a contains predicate for each map entry:
private static final String PREFIX = "properties.";
private BooleanBuilder createPredicateQuery(Predicate predicate, Map<String, String> allRequestParams) {
BooleanBuilder builder = new BooleanBuilder();
builder.and(predicate);
allRequestParams.entrySet().stream()
.filter(e -> e.getKey().startsWith(PREFIX))
.forEach(e -> {
var key = e.getKey().substring(PREFIX.length());
builder.and(QMeetup.meetup.properties.contains(key, e.getValue()));
});
return builder;
}