QueryDsl web query on the key of a Map field - java

Overview
Given
Spring Data JPA, Spring Data Rest, QueryDsl
a Meetup entity
with a Map<String,String> properties field
persisted in a MEETUP_PROPERTY table as an @ElementCollection
a MeetupRepository
that extends QueryDslPredicateExecutor<Meetup>
I'd expect
A web query of
GET /api/meetup?properties[aKey]=aValue
to return only Meetups with a property entry that has the specified key and value: aKey=aValue.
However, that's not working for me.
What am I missing?
Tried
Simple Fields
Simple fields work, like name and description:
GET /api/meetup?name=whatever
Collection fields work, like participants:
GET /api/meetup?participants.name=whatever
But not this Map field.
Customize QueryDsl bindings
I've tried customizing the binding by having the repository
extend QuerydslBinderCustomizer<QMeetup>
and overriding the
customize(QuerydslBindings bindings, QMeetup meetup)
method, but while the customize() method is being hit, the binding code inside the lambda is not.
EDIT: Learned that's because the way QuerydslBindings evaluates the query parameter never lets it match up against the pathSpecs map it holds internally - which is where your custom bindings live.
Some Specifics
Meetup.properties field
@ElementCollection(fetch = FetchType.EAGER)
@CollectionTable(name = "MEETUP_PROPERTY", joinColumns = @JoinColumn(name = "MEETUP_ID"))
@MapKeyColumn(name = "KEY")
@Column(name = "VALUE", length = 2048)
private Map<String, String> properties = new HashMap<>();
customized querydsl binding
EDIT: See above; turns out, this was doing nothing for my code.
public interface MeetupRepository extends PagingAndSortingRepository<Meetup, Long>,
                                          QueryDslPredicateExecutor<Meetup>,
                                          QuerydslBinderCustomizer<QMeetup> {

    @Override
    default void customize(QuerydslBindings bindings, QMeetup meetup) {
        bindings.bind(meetup.properties).first((path, value) -> {
            BooleanBuilder builder = new BooleanBuilder();
            for (String key : value.keySet()) {
                builder.and(path.containsKey(key).and(path.get(key).eq(value.get(key))));
            }
            return builder;
        });
    }
}
Additional Findings
QuerydslPredicateBuilder.getPredicate() asks QuerydslBindings.getPropertyPath() to try 2 ways of returning a path from which it can build a predicate for QuerydslAwareRootResourceInformationHandlerMethodArgumentResolver.postProcess() to use.
1 is to look in the customized bindings. I don't see any way to express a map query there.
2 is to fall back to Spring's bean paths. Same expression problem there - how do you express a map?
So it looks impossible to get QuerydslPredicateBuilder.getPredicate() to create such a predicate automatically.
Fine - I can do it manually, if I can hook into QuerydslAwareRootResourceInformationHandlerMethodArgumentResolver.postProcess()
HOW can I override that class, or replace the bean? It's instantiated and returned as a bean in the RepositoryRestMvcConfiguration.repoRequestArgumentResolver() bean declaration.
I can override that bean by declaring my own repoRequestArgumentResolver bean, but it doesn't get used.
It gets overridden by RepositoryRestMvcConfiguration's. I can't force mine to win by marking it @Primary or @Ordered(HIGHEST_PRECEDENCE).
I can force it by explicitly component-scanning RepositoryRestMvcConfiguration.class, but that also messes up Spring Boot's autoconfiguration because it causes
RepositoryRestMvcConfiguration's bean declarations to be processed
before any auto-configuration runs. Among other things, that results in responses that are serialized by Jackson in unwanted ways.
The Question
Well - looks like the support I expected just isn't there.
So the question becomes:
HOW do I correctly override the repoRequestArgumentResolver bean?
BTW - QuerydslAwareRootResourceInformationHandlerMethodArgumentResolver is awkwardly non-public. :/

Replace the Bean
Implement ApplicationContextAware
This is how I replaced the bean in the application context.
It feels a little hacky. I'd love to hear a better way to do this.
@Configuration
public class CustomQuerydslHandlerMethodArgumentResolverConfig implements ApplicationContextAware {

    /**
     * This is the class that originally instantiated QuerydslAwareRootResourceInformationHandlerMethodArgumentResolver and placed it into the Spring application context
     * as a {@link RootResourceInformationHandlerMethodArgumentResolver} under the name 'repoRequestArgumentResolver'.<br/>
     * By injecting this bean, we can let {@link #meetupApiRepoRequestArgumentResolver} delegate as much as possible to the original code in that bean.
     */
    private final RepositoryRestMvcConfiguration repositoryRestMvcConfiguration;

    @Autowired
    public CustomQuerydslHandlerMethodArgumentResolverConfig(RepositoryRestMvcConfiguration repositoryRestMvcConfiguration) {
        this.repositoryRestMvcConfiguration = repositoryRestMvcConfiguration;
    }

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
        DefaultListableBeanFactory beanFactory = (DefaultListableBeanFactory) ((GenericApplicationContext) applicationContext).getBeanFactory();
        beanFactory.destroySingleton(REPO_REQUEST_ARGUMENT_RESOLVER_BEAN_NAME);
        beanFactory.registerSingleton(REPO_REQUEST_ARGUMENT_RESOLVER_BEAN_NAME,
                meetupApiRepoRequestArgumentResolver(applicationContext, repositoryRestMvcConfiguration));
    }

    /**
     * This code is mostly copied from {@link RepositoryRestMvcConfiguration#repoRequestArgumentResolver()}, except that the if clause checking whether the QueryDsl library is
     * present has been removed, since we're counting on it anyway.<br/>
     * That means that if that code changes in the future, we're going to need to alter this code... :/
     */
    @Bean
    public RootResourceInformationHandlerMethodArgumentResolver meetupApiRepoRequestArgumentResolver(ApplicationContext applicationContext,
                                                                                                     RepositoryRestMvcConfiguration repositoryRestMvcConfiguration) {
        QuerydslBindingsFactory factory = applicationContext.getBean(QuerydslBindingsFactory.class);
        QuerydslPredicateBuilder predicateBuilder = new QuerydslPredicateBuilder(repositoryRestMvcConfiguration.defaultConversionService(),
                factory.getEntityPathResolver());
        return new CustomQuerydslHandlerMethodArgumentResolver(repositoryRestMvcConfiguration.repositories(),
                repositoryRestMvcConfiguration.repositoryInvokerFactory(repositoryRestMvcConfiguration.defaultConversionService()),
                repositoryRestMvcConfiguration.resourceMetadataHandlerMethodArgumentResolver(),
                predicateBuilder, factory);
    }
}
Create a Map-searching predicate from http params
Extend RootResourceInformationHandlerMethodArgumentResolver
And these are the snippets of code that create my own Map-searching predicate based on the http query parameters.
Again - would love to know a better way.
The postProcess method calls:
predicate = addCustomMapPredicates(parameterMap, predicate, domainType).getValue();
just before the predicate reference is passed into the QuerydslRepositoryInvokerAdapter constructor and returned.
Here is that addCustomMapPredicates method:
private BooleanBuilder addCustomMapPredicates(MultiValueMap<String, String> parameters, Predicate predicate, Class<?> domainType) {
    BooleanBuilder booleanBuilder = new BooleanBuilder();
    parameters.keySet()
              .stream()
              .filter(s -> s.contains("[") && matches(s) && s.endsWith("]"))
              .collect(Collectors.toList())
              .forEach(paramKey -> {
                  String property = paramKey.substring(0, paramKey.indexOf("["));
                  if (ReflectionUtils.findField(domainType, property) == null) {
                      LOGGER.warn("Skipping predicate matching on [%s]. It is not a known field on domainType %s", property, domainType.getName());
                      return;
                  }
                  String key = paramKey.substring(paramKey.indexOf("[") + 1, paramKey.indexOf("]"));
                  parameters.get(paramKey).forEach(value -> {
                      if (!StringUtils.hasLength(value)) {
                          booleanBuilder.or(matchesProperty(key, null));
                      } else {
                          booleanBuilder.or(matchesProperty(key, value));
                      }
                  });
              });
    return booleanBuilder.and(predicate);
}

static boolean matches(String key) {
    return PATTERN.matcher(key).matches();
}
And the pattern:
/**
* disallow a . or ] from preceding a [
*/
private static final Pattern PATTERN = Pattern.compile(".*[^.]\\[.*[^\\[]");

I spent a few days looking into how to do this. In the end I just went with manually adding to the predicate. This solution feels simple and elegant.
So you access the map via
GET /api/meetup?properties.aKey=aValue
On the controller I injected the request parameters and the predicate.
public List<Meetup> getMeetupList(@QuerydslPredicate(root = Meetup.class) Predicate predicate,
                                  @RequestParam Map<String, String> allRequestParams,
                                  Pageable page) {
    Predicate builder = createPredicateQuery(predicate, allRequestParams);
    return meetupRepo.findAll(builder, page);
}
I then just simply parsed the query parameters and added contains
private static final String PREFIX = "properties.";

private BooleanBuilder createPredicateQuery(Predicate predicate, Map<String, String> allRequestParams) {
    BooleanBuilder builder = new BooleanBuilder();
    builder.and(predicate);
    allRequestParams.entrySet().stream()
            .filter(e -> e.getKey().startsWith(PREFIX))
            .forEach(e -> {
                var key = e.getKey().substring(PREFIX.length());
                builder.and(QMeetup.meetup.properties.contains(key, e.getValue()));
            });
    return builder;
}

Related

Spring Data JPA - Named query ignoring null parameters

I have the following repository:
@Repository
public interface EntityRepository extends JpaRepository<Entity, Long> {
    List<Entity> findAllByFirstId(Long firstId);
    List<Entity> findAllBySecondId(Long secondId);
    List<Entity> findAllByFirstIdAndSecondId(Long firstId, Long secondId);
}
The controller, implementing an interface generated with io.swagger:swagger-codegen-maven-plugin, uses Optional<Long> for the optional request parameters (the underlying service uses the same parameters):
ResponseEntity<List<Entity>> entities(Optional<Long> firstId, Optional<Long> secondId);
I would like to filter the entities by the parameters firstId and secondId, which are never null in the database but may or may not be passed in (the search parameters are optional).
The problem comes with the derived queries: when null is passed because a parameter was omitted, the JpaRepository uses that null as a search criterion against the database. That's what I don't want - I want to skip filtering on a parameter as long as it is null.
My workaround solution based on Optional is:
public List<Entity> entities(Optional<Long> firstId, Optional<Long> secondId) {
    return firstId
            .or(() -> secondId)
            .map(value -> {
                if (firstId.isEmpty()) {
                    return entityRepository.findAllBySecondId(value);
                }
                if (secondId.isEmpty()) {
                    return entityRepository.findAllByFirstId(value);
                }
                return entityRepository.findAllByFirstIdAndSecondId(
                        firstId.get(), secondId.get());
            })
            .orElse(entityRepository.findAll())
            .stream()
            .map(...) // Mapping between DTO and entity. For the sake of brevity
                      // I used the same object Entity for both controller and repository,
                      // as it is not related to the question.
            .collect(Collectors.toList());
}
This issue has already been asked: Spring Data - ignore parameter if it has a null value, and a ticket was created, DATAJPA-209.
Since the question is almost 3 years old and the ticket dates back to 2012, I would like to ask whether there is a more comfortable and universal way to avoid the overhead of handling the Optionals and duplicating the repository methods. The solution for 2 such parameters looks acceptable, but I'd like to implement the very same filtering for 4-5 parameters.
You need a Specification utility class like this:
public class EntitySpecifications {

    public static Specification<Entity> firstIdEquals(Optional<Long> firstId) { // or Long firstId; it is better to avoid Optional method parameters
        return (root, query, builder) ->
                firstId.isPresent() // or firstId != null if you use a Long method parameter
                        ? builder.equal(root.get("firstId"), firstId.get())
                        : builder.conjunction(); // to ignore this clause
    }

    public static Specification<Entity> secondIdEquals(Optional<Long> secondId) {
        return (root, query, builder) ->
                secondId.isPresent()
                        ? builder.equal(root.get("secondId"), secondId.get())
                        : builder.conjunction(); // to ignore this clause
    }
}
Then your EntityRepository has to extend JpaSpecificationExecutor:
@Repository
public interface EntityRepository
        extends JpaRepository<Entity, Long>, JpaSpecificationExecutor<Entity> {
}
Usage:
@Service
public class EntityService {

    @Autowired
    EntityRepository repository;

    public List<Entity> getEntities(Optional<Long> firstId, Optional<Long> secondId) {
        Specification<Entity> spec =
                Specifications.where(EntitySpecifications.firstIdEquals(firstId)) // Spring Data JPA 2.0: use Specification.where
                        .and(EntitySpecifications.secondIdEquals(secondId));
        return repository.findAll(spec);
    }
}
The io.swagger:swagger-codegen-maven-plugin generates them as
Optional since I request them as not required (required: false by
default). I might generate them as boxed types, such as Long, …
It’s probably partly a matter of taste. If it were me and I could, I’d go for the version without Optional. I don’t think they contribute anything useful here.
public List<Entity> entities(Long firstId, Long secondId) {
    List<Dto> dtos;
    if (firstId == null) {
        if (secondId == null) {
            dtos = entityRepository.findAll();
        } else {
            dtos = entityRepository.findAllBySecondId(secondId);
        }
    } else {
        if (secondId == null) {
            dtos = entityRepository.findAllByFirstId(firstId);
        } else {
            dtos = entityRepository.findAllByFirstIdAndSecondId(firstId, secondId);
        }
    }
    return dtos.stream()
            .map(...)
            .collect(Collectors.toList());
}
The Optional class was designed to be used for return values that may be absent, not really for anything else, so I have read. I think there are rare situations where I’d use them for something else, but this is not one of them.
I'd suggest you use specifications instead. See the documentation and examples here.
Briefly, the idea is as follows: for each attribute you define a specification. Then check each attribute in your search criteria and, if it is not null, add the corresponding specification to the "concatenated" specification. Then you search using this "concatenated" specification.
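A minimal sketch of that idea, assuming an Entity with firstId and secondId attributes and a repository that extends JpaSpecificationExecutor (method and field names here are illustrative, not from the answer):
// Sketch only: build one combined Specification, skipping any criterion that is null.
public List<Entity> search(Long firstId, Long secondId) {
    Specification<Entity> spec = Specification.where(null); // neutral starting point
    if (firstId != null) {
        spec = spec.and((root, query, cb) -> cb.equal(root.get("firstId"), firstId));
    }
    if (secondId != null) {
        spec = spec.and((root, query, cb) -> cb.equal(root.get("secondId"), secondId));
    }
    return entityRepository.findAll(spec);
}
This scales to 4-5 optional parameters by adding one if block per criterion instead of one repository method per parameter combination.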

How to trigger javax validations manually via reflection or any other way?

I am using validation-api-2.0.1.Final and hibernate-validator-6.0.13.Final. I would like to do validation for the below case,
I have created a custom validation to validate List<Map<String,Object>>
BookInfo.java
@Target({METHOD, FIELD, ANNOTATION_TYPE, CONSTRUCTOR, PARAMETER, TYPE_USE})
@Retention(RetentionPolicy.RUNTIME)
@Documented
@Constraint(validatedBy = {BookInfoValidator.class})
public @interface BookInfo {
    String message() default "Should not be empty";
    Class<?>[] groups() default {};
    Class<? extends Payload>[] payload() default {};
}
BookInfoValidator.java
public class BookInfoValidator implements ConstraintValidator<BookInfo, List<Map<String, Object>>> {

    private final ContentRepositoryClient contentRepository;

    public BookInfoValidator(ContentRepositoryClient contentRepository) {
        this.contentRepository = contentRepository;
    }

    @Override
    public void initialize(BookInfo constraintAnnotation) {
    }

    @Override
    public boolean isValid(List<Map<String, Object>> map, ConstraintValidatorContext constraintValidatorContext) {
        // In the list of maps the key will be "text", "email", "date", etc. Based on the key I would like to
        // validate the value with the proper validation constraint,
        // e.g. for an email invoke javax.validation.constraints.Email from the validation API.
        // I am not sure how to manually invoke the validation annotations.
        return false;
    }
}
BookInfoView.java
class BookInfoView {
    @BookInfo
    private List<Map<String, Object>> bookInfos;
}
In the list of maps the key will be "text", "email", "date", etc. Based on the key I would like to validate the value with the proper validation constraint,
e.g. for an email invoke javax.validation.constraints.Email from the validation API. I am not sure how to manually invoke the validation annotations.
Any hint or help will be much appreciated.
I am not sure how to manually invoke the validation annotations.
I am answering the quoted line above. Yes, it is possible to invoke validation programmatically, and in case of validation failures you will receive all failure messages in a set. Below are the steps:
Build ValidatorFactory
Get hold of a Validator instance from ValidatorFactory
Perform the validation using validate() method
Process the validation result constraintViolations.iterator().next().getMessage()
Here is the code snippet for all four steps mentioned above:
ValidatorFactory factory = Validation.buildDefaultValidatorFactory();
Validator validator = factory.getValidator();
Set<ConstraintViolation<BookInfoView>> constraintViolations = validator.validate(bookInfoViewObj);
assertEquals( "Should not be empty", constraintViolations.iterator().next().getMessage() );
The Hibernate Validator framework provides various other capabilities for validating one or more entities and processing the result. It is worth having a look at the official documentation.
There's no 'nice' way of accessing the validator implementations behind the built-in constraints (@Email, @NotNull, etc.). While you could create instances of those validators and store them in your BookInfoValidator, you would need to do a lot of additional work, because each validator's ConstraintValidator#initialize() method has to be called with the annotation instance. In the case of simple constraints like @NotNull there's actually nothing to initialize, and the same check can easily be performed without the validator. But in the case of more complex ones like @Email you would need to create your own proxy class for the annotation so you could properly initialize the constraint validator.
With that said, I would suggest writing a wrapper class for your Map, something like:
public class BookInfoWrapper {

    private final Map<String, Object> data;

    public BookInfoWrapper(Map<String, Object> data) {
        this.data = data;
    }

    @NotNull
    public Map<String, Object> getUser() {
        return (Map<String, Object>) data.get("user");
    }

    @Email
    public String getEmail() {
        return Objects.toString(getUser().get("email"));
    }

    // and any other constraints you need
}
And then convert your list of maps to these wrappers before validation.
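For illustration, a minimal sketch of that conversion and validation step, assuming the bookInfos list from the question and the standard javax.validation bootstrap (not code from the original answer):
// Sketch only: wrap each map and collect the violations reported for the wrapper's getters.
Validator validator = Validation.buildDefaultValidatorFactory().getValidator();

Set<ConstraintViolation<BookInfoWrapper>> violations = bookInfos.stream()
        .map(BookInfoWrapper::new)
        .flatMap(wrapper -> validator.validate(wrapper).stream())
        .collect(Collectors.toSet());

violations.forEach(v -> System.out.println(v.getPropertyPath() + ": " + v.getMessage()));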
I can also see that you have a repository in your validator, hence I think that you might want to derive rules "dynamically". In such case you might want to check out the programmatic API provided by Hibernate Validator. Using it you should be able to build the rules you need based on the data retrieved from the database. But still you would need to wrap the maps first.
To summarize it all, sadly there's no nice and easy solution for your particular case yet. We are working on validation of free-form objects, but it will take us some time to be able to release it. Hence I would suggest that you either
write the validation checks on your own in your BookInfoValidator, without using the built-in constraints, or
use the wrapper approach described above.
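For the first option, a rough sketch of what hand-rolled checks inside isValid could look like; the map keys and the email pattern below are illustrative assumptions, not part of the original answer:
// Sketch only: validate each map entry by key with hand-written checks.
private static final Pattern SIMPLE_EMAIL = Pattern.compile("^[^@\\s]+@[^@\\s]+\\.[^@\\s]+$");

@Override
public boolean isValid(List<Map<String, Object>> bookInfos, ConstraintValidatorContext context) {
    if (bookInfos == null || bookInfos.isEmpty()) {
        return false;
    }
    for (Map<String, Object> info : bookInfos) {
        Object email = info.get("email");   // assumed key
        if (email != null && !SIMPLE_EMAIL.matcher(email.toString()).matches()) {
            return false;
        }
        Object text = info.get("text");     // assumed key
        if (text == null || text.toString().isEmpty()) {
            return false;
        }
    }
    return true;
}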

Spring Cache Evict does not work

Hi, I have a problem with clearing the cache when a method is executed.
Here is my configuration and the cached methods:
@Configuration
@EnableCaching
@AutoConfigureAfter(value = {MetricsConfiguration.class, DatabaseConfiguration.class})
@Profile("!" + Constants.SPRING_PROFILE_FAST)
public class CacheConfiguration {

    private final Logger log = LoggerFactory.getLogger(CacheConfiguration.class);

    public static final String STOCK_DETAILS_BY_TICKER_CACHE = "stockDetailsByTickerCache";
    public static final String RSS_NEWS_BY_TYPE_CACHE = "rssNewsByTypeCache";

    @Bean
    public CacheManager cacheManager() {
        SimpleCacheManager cacheManager = new SimpleCacheManager();
        List<Cache> caches = new ArrayList<Cache>();
        caches.add(new ConcurrentMapCache(STOCK_DETAILS_BY_TICKER_CACHE));
        caches.add(new ConcurrentMapCache(RSS_NEWS_BY_TYPE_CACHE));
        cacheManager.setCaches(caches);
        return cacheManager;
    }
}
This is the method I want to cache:
@Cacheable(cacheNames = CacheConfiguration.RSS_NEWS_BY_TYPE_CACHE, key = "#type")
public ResponseEntity<List<NewsDetailsDTO>> getLatestNewsMessageByType(RssType type) {
    Pageable pageable = new PageRequest(0, 5, Sort.Direction.DESC, "createdDate");
    List<NewsMessage> latestNewsMessage = newsMessageRepository.findByType(type, pageable).getContent();
    return new ResponseEntity<List<NewsDetailsDTO>>(mapToDTO(latestNewsMessage), HttpStatus.OK);
}
On execution of this method I would like to evict the cache entry for that type:
@CacheEvict(cacheNames = {CacheConfiguration.RSS_NEWS_BY_TYPE_CACHE}, beforeInvocation = true, key = "#news.type")
public void save(NewsMessage news) {
    newsMessageRepository.save(news);
}
And NewsMessage object looks like:
@Entity
@Table(name = "NEWS_MESSAGE")
public class NewsMessage extends ChatMessage {

    <other fields>

    @NotNull
    @Enumerated(EnumType.STRING)
    private RssType type;
}
The caching itself works fine: the first call hits the database, and subsequent calls fetch the data from the cache. The problem is that when I update the data, the @CacheEvict does not clear the cache. I also tried clearing the whole cache using this annotation:
@CacheEvict(cacheNames = {CacheConfiguration.RSS_NEWS_BY_TYPE_CACHE}, allEntries = true)
But that does not work either. Could you help me?
From where do you call the save() method?
In your own answer it looks like you have moved the annotations to another class/interface so that the call goes through the proxy object of that class/interface (by the way, annotations should generally not be put on interfaces, because they often don't get picked up with the default configuration).
Hence my question: are you familiar with Spring AOP proxies? You have to call annotated methods from outside your MessageRepository class for the call to go through the proxy object.
General documentation for that is here: http://docs.spring.io/spring/docs/current/spring-framework-reference/htmlsingle/#aop-understanding-aop-proxies
or with examples here: http://spring.io/blog/2012/05/23/transactions-caching-and-aop-understanding-proxy-usage-in-spring
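To make the proxy point concrete, here is a small illustrative sketch (not the poster's actual classes): a call to save() from inside the same bean bypasses the proxy, so the @CacheEvict advice never runs, while a call from another bean goes through the proxy and evicts as expected.
@Service
public class NewsMessageService {

    @Autowired
    private NewsMessageRepository newsMessageRepository;

    // Self-invocation: this call does NOT go through the Spring proxy,
    // so the @CacheEvict advice on save() is silently skipped.
    public void importNews(NewsMessage news) {
        this.save(news);
    }

    @CacheEvict(cacheNames = CacheConfiguration.RSS_NEWS_BY_TYPE_CACHE,
                beforeInvocation = true, key = "#news.type")
    public void save(NewsMessage news) {
        newsMessageRepository.save(news);
    }
}

// Calling newsMessageService.save(news) from another bean (e.g. a controller)
// does go through the proxy, so the cache entry for news.getType() is evicted.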
I found a workaround for my problem. I had to move the annotation up to the Spring Data JPA interface.
public interface NewsMessageRepository extends JpaRepository<NewsMessage, Long> {

    @CacheEvict(cacheNames = {CacheConfiguration.RSS_NEWS_BY_TYPE_CACHE}, beforeInvocation = true, key = "#p0.type")
    NewsMessage save(NewsMessage news);
}
Now it is working as I expected, but I still have no idea why it did not work in my service. Maybe because my service implements two interfaces?
@Service
public class NewsMessageService implements RssObserver, NewsMessageServiceable {
}
You need a public RssType getType() method in your NewsMessage class. The key expression "#news.type" in your @CacheEvict annotation is expecting either a public field named "type" or a public getter method named "getType".
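For example, a getter along these lines (assuming the field shown in the question) makes the SpEL expression resolvable:
public class NewsMessage extends ChatMessage {
    // ...
    public RssType getType() {
        return type;
    }
}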

QueryDSL - Predicate conversion : change root path and check structure

I'm using this awesome library, but I have a problem.
I'm implementing a DTO pattern, so I use another project to automatically convert an EJB to a DTO using naming conventions.
Then, I want to query the DTO and get the real result (an EJB query).
I implemented QueryDSL with the JPAAnnotationProcessor on my entities, and the QuerydslAnnotationProcessor on my DTOs.
For example :
An entity User(Long Id, String username, Site site)
A DTO UserDto(Long id, String username, String siteName)
Converting objects works fine: "siteName" automatically matches "site.name".
So, I write a QueryDSL query like: userDto.id.gt(20).and(userDto.username.like("a%")).and(userDto.siteName.like("%b"));
I'm looking for a way to build the corresponding entity query.
The only idea I have is to:
Clone the query
Change the path "userDto" to "user"
Verify each predicate to check that the property exists and that the type matches
Any way to do that or to reach my goal?
Thanks
Since this is still relevant and undocumented functionality, and since Timo's answer, while helpful, is very cryptic, here's how to do it:
First, extend ReplaceVisitor:
private class CustomReplaceVisitor extends ReplaceVisitor<Void> {
    @Override
    public Expression<?> visit(Path<?> path, @Nullable Void context) {
        // The map Timo mentioned, used to transform paths:
        Map<Path<?>, Path<?>> map = Map.of(
                QUser.user.id, QUserDto.userDto.id,
                QUser.user.name, QUserDto.userDto.name
        );
        if (map.containsKey(path)) {
            return map.get(path);
        } else {
            return super.visit(path, context);
        }
    }
}
Then use it like this:
CustomReplaceVisitor replaceVisitor = new CustomReplaceVisitor();
Predicate userPredicate = QUser.user.id.eq(2L).and(QUser.user.name.eq("Somename"));
Predicate userDtoPredicate = (Predicate) userPredicate.accept(replaceVisitor, null);
You will need to convert the expressions in general. With a custom ReplaceVisitor you can, for example, override visit(Path expr, @Nullable Void context).
A generic way to do the path replacements would be to use a Map<Path<?>, Path<?>> map to define the replacements:
if (map.containsKey(path)) {
    return map.get(path);
} else {
    return super.visit(path, context);
}
You can use your visitor like this:
Expression transformedExpression = expr.accept(visitor, null);

Interface method that has different parameters in Java

Looking for some guidance on designing some code in Java.
Currently I have something like this....
@Service
class SomeService {

    @Autowired
    private FilterSoldOut filterSoldOut;

    @Autowired
    private FilterMinPriceThreshold filterMinPriceThreshold;

    public List<Product> getProducts() {
        List<Product> products = //...code to get some products

        // Returns list of in-stock products
        products = filterSoldOut.doFilter(products);

        // Returns list of products above min price
        products = filterMinPriceThreshold.doFilter(minPrice, products);

        return products;
    }
}
What I would like to be able to do is create a Filter interface with a doFilter method, and then in SomeService create a List<Filter> filters field that is autowired by Spring. Then in the getProducts method I can iterate over the filters list and invoke doFilter. That way, in the future I can create new classes that implement the Filter interface, add them to the list via Spring configuration, and have the new filter applied without having to change the code.
But, the problem is that the parameters to the doFilter method can be different. I've read about the Command Pattern, and the Visitor Pattern but they don't quite seem to fit the bill.
Can anyone suggest a good pattern to achieve what I've described?
Thanks.
There are many ways to do this. Some are complicated, some are simpler. The simplest one would be to use varargs or an array of Object elements. The problem here is that you have to cast each object to its proper type in order to use it, and that can be a little tricky if there are multiple types in an unknown order.
Another option is to use a Map<String,Object> (which you can wrap in a class of your own if required, something like FilterParams) that stores parameters by name; you can then obtain them and cast them accordingly.
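A rough sketch of that Map-based option, assuming a shared Filter interface; the parameter constant, MinPriceFilter and Product#getPrice() are illustrative names, not from the original answer:
// Sketch only: a filter contract whose parameters travel in a name/value map.
public interface Filter {
    List<Product> doFilter(Map<String, Object> params, List<Product> products);
}

public class MinPriceFilter implements Filter {

    public static final String PARAM_MIN_PRICE = "minPrice";

    @Override
    public List<Product> doFilter(Map<String, Object> params, List<Product> products) {
        float minPrice = (Float) params.getOrDefault(PARAM_MIN_PRICE, 0f); // cast by convention
        return products.stream()
                .filter(p -> p.getPrice() >= minPrice) // Product#getPrice() assumed
                .collect(Collectors.toList());
    }
}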
Edit
Considering that the parameters vary at runtime, you'll need someone "well informed" about the current configuration.
Not pattern-wise, but I'd rather keep it simple without using too many fancy names. What about introducing a FilterConfigurator that has a simple overloaded configure method which receives the particular filter and configures it based on its type? This configurator is the informed entity that knows the current values for those parameters.
The goal is to relieve the service of the responsibility of configuring a filter.
In addition, if you create your own Filter interface, you'll be able to implement a single doFilter that you can invoke without changes.
There's another idea... and it involves a FilterFactory that creates and initializes filters, thus having a filter 100% configured from scratch. This factory can rely on the very same FilterConfigurator or do the configuration itself.
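A minimal sketch of that configurator idea, assuming the existing filters expose setters for their parameters (the setter names and configuration values here are illustrative):
// Sketch only: the configurator knows the current parameter values and applies
// them to each filter, so the service never has to.
public class FilterConfigurator {

    private final float minPrice; // current configuration, however it is sourced

    public FilterConfigurator(float minPrice) {
        this.minPrice = minPrice;
    }

    // One overload per filter type.
    public void configure(FilterMinPriceThreshold filter) {
        filter.setMinPrice(minPrice); // setter assumed for illustration
    }

    public void configure(FilterSoldOut filter) {
        // nothing to configure for this filter
    }
}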
old:
I'd suggest setting the filter state at construction time, or at least before you call getProducts().
In your example with the two filters, one of them is (probably) checking a database for availability of the product, and the other one is comparing the product's price to some preset value. This value (minPrice) is known before the filter is applied. It can also be said that the filter depends on it, or that it's part of the filter's state. Therefore I'd recommend putting the minPrice inside the filter at construction time (or via a setter) and then only passing the list of products you want to filter.
Use the same pattern for your other filters.
new suggestion (came up with it after the comments):
You can create a single object (AllFiltersState) that holds all the values for all the filters. In your controller, set whatever criteria you need on this object (minPrice, color, etc.) and pass it to every filter along with the products - doFilter(allFiltersState, products).
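A small sketch of that suggestion, assuming a common Filter interface that takes the state object; AllFiltersState and its fields are illustrative:
// Sketch only: one state object carries every filter's criteria.
public class AllFiltersState {
    private Float minPrice;  // null means "don't filter on price"
    private String color;    // null means "don't filter on color"
    // getters and setters omitted
}

public interface Filter {
    List<Product> doFilter(AllFiltersState state, List<Product> products);
}

public class MinPriceFilter implements Filter {
    @Override
    public List<Product> doFilter(AllFiltersState state, List<Product> products) {
        if (state.getMinPrice() == null) {
            return products; // criterion not set, nothing to do
        }
        return products.stream()
                .filter(p -> p.getPrice() >= state.getMinPrice()) // Product#getPrice() assumed
                .collect(Collectors.toList());
    }
}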
As Cris says, you can use the following method definition:
public List<Product> doFilter(Object... args) {
    if (args.length != 2)
        throw new IllegalArgumentException();
    if (!(args[0] instanceof String))
        throw new IllegalArgumentException();
    if (!(args[1] instanceof Integer))
        throw new IllegalArgumentException();
    String stringArgument = (String) args[0];
    Integer integerArgument = (Integer) args[1];
    // your code here
    return ...;
}
or with command pattern:
public interface Command {
}

public class FirstCommand implements Command {
    private String string;
    // constructor, getters and setters
}

public class SecondCommand implements Command {
    private Integer integer;
    // constructor, getters and setters
}

// first service function
public List<Product> doFilter(Command command) {
    if (!(command instanceof FirstCommand))
        throw new IllegalArgumentException();
    FirstCommand firstCommand = (FirstCommand) command;
    return ...;
}

// second service function
public List<Product> doFilter(Command command) {
    if (!(command instanceof SecondCommand))
        throw new IllegalArgumentException();
    SecondCommand secondCommand = (SecondCommand) command;
    return ...;
}
EDIT:
OK, I understand your question. I think you can create various session-scoped filters.
@Service
class SomeService {

    @Autowired(required = false)
    private List<Filter> filters;

    public List<Product> getProducts() {
        List<Product> products = //...code to get some products
        if (filters != null) {
            for (Filter filter : filters)
                products = filter.doFilter(products);
        }
        return products;
    }
}
And then create filters with settings fields:
public class PriceFilter implements Filter {
    private Integer minPrice;
    private Integer maxPrice;
    // getters and setters

    public List<Product> doFilter(List<Product> products) {
        // implementation here
    }
}

public class ContentFilter implements Filter {
    private String regexp;
    // getters and setters

    public List<Product> doFilter(List<Product> products) {
        // implementation here
    }
}
Then the user can configure these filters for the session and use the service method getProducts to get the result.
Having a list of filters get autowired is not a very good approach to solving your problem.
Every filter depends on different types of parameters, which would need to be passed to the doFilter method. Needing to do so makes the approach highly inflexible. Yes, you could use varargs, but it would just create a mess. That's why it's probably easier to implement a builder that builds a chain of filters to be applied to the collection of products. Adding new filters to the builder then becomes a trivial task. The Builder Pattern is very useful when a lot of different parameters are at play.
Consider having this interface:
public interface CollectionFilter<T> {
public Collection<T> doFilter(Collection<T> collection);
}
A filter chaining class which applies all filters to the collection:
public class CollectionFilterChain<T> {

    private final List<CollectionFilter<T>> filters;

    public CollectionFilterChain(List<CollectionFilter<T>> filters) {
        this.filters = filters;
    }

    public Collection<T> doFilter(Collection<T> collection) {
        for (CollectionFilter<T> filter : filters) {
            collection = filter.doFilter(collection);
        }
        return collection;
    }
}
The two CollectionFilter<T> implementations:
public class InStockFilter<T> implements CollectionFilter<T> {
    public Collection<T> doFilter(Collection<T> collection) {
        // filter
    }
}

public class MinPriceFilter<T> implements CollectionFilter<T> {

    private final float minPrice;

    public MinPriceFilter(float minPrice) {
        this.minPrice = minPrice;
    }

    public Collection<T> doFilter(Collection<T> collection) {
        // filter
    }
}
And a builder to let you build the filter chain in an easy way:
public class CollectionFilterChainBuilder<T> {

    List<CollectionFilter<T>> filters;

    public CollectionFilterChainBuilder() {
        filters = new ArrayList<CollectionFilter<T>>();
    }

    public CollectionFilterChainBuilder<T> inStock() {
        filters.add(new InStockFilter<T>());
        return this;
    }

    public CollectionFilterChainBuilder<T> minPrice(float price) {
        filters.add(new MinPriceFilter<T>(price));
        return this;
    }

    public CollectionFilterChain<T> build() {
        return new CollectionFilterChain<T>(filters);
    }
}
With the builder it's easy to create a filter chain as follows:
CollectionFilterChainBuilder<Product> builder =
new CollectionFilterChainBuilder();
CollectionFilterChain<Product> filterChain =
builder.inStock().minPrice(2.0f).build();
Collection<Product> filteredProducts =
filterChain.doFilter(products);
In a more dynamic setting you could use the builder like this:
CollectionFilterChainBuilder<Product> builder = new CollectionFilterChainBuilder();
if (filterInStock) {
builder.inStock();
}
if (filterMinPrice) {
builder.minPrice(minPrice);
}
// build some more
