Currently I am using Querydsl in my Java EE project (with JPA). I receive a filter object from the UI as JSON with all the filters. My FilterObject looks like this:
public class FilterObject {
private String name;
private List<Status> status;
private String module;
private List<Source> source;
......
}
And in my service class I have something like this
public List<MyModel> findByFilter(FilterObject filterObject) {
BooleanBuilder builder = new BooleanBuilder();
QMyModel mymodel = QMyModel.myModel;
if (filterObject.getName() != null) {
builder.and(mymodel.name.contains(filterObject.getName()));
}
if (!CollectionUtils.isEmpty(filterObject.getStatus())) {
builder.and(mymodel.status.in(filterObject.getStatus()));
}
...............
...............
}
And finally I have this
JPAQuery<MyModel> query = new JPAQuery<>(getEntityManager());
List<MyModel> myModels = query.from(QMyModel.myModel).where(builder).fetch();
EDIT:
/**
* QMyModel is a Querydsl query type for MyModel
*/
@Generated("com.querydsl.codegen.EntitySerializer")
public class QMyModel extends EntityPathBase<MyModel> {
private static final long serialVersionUID = 1041638507L;
private static final PathInits INITS = PathInits.DIRECT2;
public static final QMyModel myModel = new QMyModel("myModel");
public final StringPath name = createString("name");
public final EnumPath<Status> status = createEnum("status", Status.class);
public final StringPath module = createString("module");
........
.......
}
All of this works, but my FilterObject keeps growing and now has more than 10 fields, so I have around 10 if blocks in my service method. Is there a better way to do this that avoids so many if blocks?
You can use lambdas or, even better in this case, method references:
public List<MyModel> findByFilter(FilterObject filterObject) {
BooleanBuilder builder = new BooleanBuilder();
QMyModel mymodel = QMyModel.myModel;
add(builder, filterObject.getName(), mymodel.name::contains);
add(builder, filterObject.getStatus(), mymodel.status::in);
...
}
private <T> void add(BooleanBuilder builder, T filterElement, Function<T, BooleanExpression> booleanExpressionFunction) {
if (valid(filterElement)) {
builder.and(booleanExpressionFunction.apply(filterElement));
}
}
private boolean valid(Object filterElement) {
if (filterElement == null) {
return false;
}
if (filterElement instanceof Collection) {
return !((Collection<?>) filterElement).isEmpty();
}
return true;
}
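The null/empty guard and the functional accumulation can also be sketched in isolation, without Querydsl. In this standalone illustration (all class and field names are hypothetical), `Predicate<Person>` stands in for `BooleanExpression`:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import java.util.function.Function;
import java.util.function.Predicate;

// Standalone sketch of the same pattern without Querydsl: a predicate is
// AND-ed in only when the corresponding filter value is present.
public class FilterSketch {
    static class Person {
        final String name;
        final String status;
        Person(String name, String status) { this.name = name; this.status = status; }
    }

    private final List<Predicate<Person>> predicates = new ArrayList<>();

    <T> void add(T filterValue, Function<T, Predicate<Person>> toPredicate) {
        if (valid(filterValue)) {
            predicates.add(toPredicate.apply(filterValue));
        }
    }

    private static boolean valid(Object value) {
        if (value == null) return false;
        if (value instanceof Collection) return !((Collection<?>) value).isEmpty();
        return true;
    }

    Predicate<Person> build() {
        // An empty builder matches everything, mirroring an empty BooleanBuilder
        return predicates.stream().reduce(p -> true, Predicate::and);
    }

    public static void main(String[] args) {
        FilterSketch filter = new FilterSketch();
        filter.add("li", name -> p -> p.name.contains(name));
        filter.add(Arrays.asList("ACTIVE"), statuses -> p -> statuses.contains(p.status));
        System.out.println(filter.build().test(new Person("Alice", "ACTIVE"))); // true
        System.out.println(filter.build().test(new Person("Bob", "ACTIVE")));   // false
    }
}
```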
I am trying to convert my POJO into 2 different CSV representations.
My POJO:
@NoArgsConstructor
@AllArgsConstructor
public static class Example {
@JsonView(View.Public.class)
private String a;
@JsonView(View.Public.class)
private String b;
@JsonView(View.Internal.class)
private String c;
@JsonView(View.Internal.class)
private String d;
public static final class View {
interface Public {}
interface Internal extends Public {}
}
}
The Public view exposes fields a and b, and the Internal view exposes all fields.
The problem is that if I construct the ObjectWriter with .writerWithSchemaFor(Example.class), all my fields are included in the schema. The ObjectWriter creates the schema from Example.class, and if I then apply .withView, it only blanks out the hidden fields' values; it does not remove their columns.
This means that I must construct the schema manually.
Tests:
@Test
public void testJson() throws JsonProcessingException {
final ObjectMapper mapper = new ObjectMapper();
final Example example = new Example("1", "2", "3", "4");
final String result = mapper.writerWithView(Example.View.Public.class).writeValueAsString(example);
System.out.println(result); // {"a":"1","b":"2"}
}
@Test
public void testCsv() throws JsonProcessingException {
final CsvMapper mapper = new CsvMapper();
final Example example = new Example("1", "2", "3", "4");
final String result = mapper.writerWithSchemaFor(Example.class).withView(Example.View.Public.class).writeValueAsString(example);
System.out.println(result); // 1,2,,
}
@Test
public void testCsvWithCustomSchema() throws JsonProcessingException {
final CsvMapper mapper = new CsvMapper();
CsvSchema schema = CsvSchema.builder()
.addColumn("a")
.addColumn("b")
.build();
final Example example = new Example("1", "2", "3", "4");
final String result = mapper.writer().with(schema).withView(Example.View.Public.class).writeValueAsString(example);
System.out.println(result); // 1,2
}
The testCsv output has 4 columns, 2 of which are left empty; the testCsvWithCustomSchema output has only the columns I want.
Is there a way to get CsvSchema that will match my #JsonView without having to construct it myself?
Here is a solution I wrote with reflection. I am not really happy with it, since it still builds the schema "manually".
This solution is also flawed in that it ignores mapper configuration such as MapperFeature.DEFAULT_VIEW_INCLUSION.
This seems like something that should already be available from the library.
@AllArgsConstructor
public class GenericPojoCsvSchemaBuilder {
public CsvSchema build(final Class<?> type) {
return build(type, null);
}
public CsvSchema build(final Class<?> type, final Class<?> view) {
return build(CsvSchema.builder(), type, view);
}
public CsvSchema build(final CsvSchema.Builder builder, final Class<?> type) {
return build(builder, type, null);
}
public CsvSchema build(final CsvSchema.Builder builder, final Class<?> type, final Class<?> view) {
final JsonPropertyOrder propertyOrder = type.getAnnotation(JsonPropertyOrder.class);
final List<Field> fieldsForView;
// DO NOT use Arrays.asList because it uses an internal fixed length implementation which cannot use .removeAll (throws UnsupportedOperationException)
final List<Field> unorderedFields = Arrays.stream(type.getDeclaredFields()).collect(Collectors.toList());
if (propertyOrder != null && propertyOrder.value().length > 0) {
final List<Field> orderedFields = Arrays.stream(propertyOrder.value()).map(s -> {
try {
return type.getDeclaredField(s);
} catch (final NoSuchFieldException e) {
throw new IllegalArgumentException(e);
}
}).collect(Collectors.toList());
if (propertyOrder.value().length < type.getDeclaredFields().length) {
unorderedFields.removeAll(orderedFields);
orderedFields.addAll(unorderedFields);
}
fieldsForView = getJsonViewFields(orderedFields, view);
} else {
fieldsForView = getJsonViewFields(unorderedFields, view);
}
final JsonIgnoreFieldFilter ignoreFieldFilter = new JsonIgnoreFieldFilter(type.getDeclaredAnnotation(JsonIgnoreProperties.class));
fieldsForView.forEach(field -> {
if (ignoreFieldFilter.matches(field)) {
builder.addColumn(field.getName());
}
});
return builder.build();
}
private List<Field> getJsonViewFields(final List<Field> fields, final Class<?> view) {
if (view == null) {
return fields;
}
return fields.stream()
.filter(field -> {
final JsonView jsonView = field.getAnnotation(JsonView.class);
return jsonView != null && Arrays.stream(jsonView.value()).anyMatch(candidate -> candidate.isAssignableFrom(view));
})
.collect(Collectors.toList());
}
private class JsonIgnoreFieldFilter implements ReflectionUtils.FieldFilter {
private final List<String> fieldNames;
public JsonIgnoreFieldFilter(final JsonIgnoreProperties jsonIgnoreProperties) {
if (jsonIgnoreProperties != null) {
fieldNames = Arrays.asList(jsonIgnoreProperties.value());
} else {
fieldNames = null;
}
}
@Override
public boolean matches(final Field field) {
if (fieldNames != null && fieldNames.contains(field.getName())) {
return false;
}
final JsonIgnore jsonIgnore = field.getDeclaredAnnotation(JsonIgnore.class);
return jsonIgnore == null || !jsonIgnore.value();
}
}
}
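The view-matching rule in getJsonViewFields (keep a field when any of its declared views is a supertype of the requested view) can be demonstrated in isolation with plain reflection. This sketch uses a hypothetical local @View annotation instead of Jackson's @JsonView, so it only illustrates the isAssignableFrom logic:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical stand-in for Jackson's @JsonView, used only to demonstrate
// the isAssignableFrom matching rule on its own.
public class ViewFilterDemo {
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.FIELD)
    @interface View {
        Class<?>[] value();
    }

    interface Public {}
    interface Internal extends Public {}

    static class Example {
        @View(Public.class) String a;
        @View(Public.class) String b;
        @View(Internal.class) String c;
    }

    // A field is kept when any of its declared views is a supertype of the
    // requested view (results sorted for deterministic output, since
    // getDeclaredFields does not guarantee an order).
    static List<String> fieldsForView(Class<?> type, Class<?> view) {
        return Arrays.stream(type.getDeclaredFields())
                .filter(f -> {
                    View v = f.getAnnotation(View.class);
                    return v != null
                            && Arrays.stream(v.value()).anyMatch(c -> c.isAssignableFrom(view));
                })
                .map(Field::getName)
                .sorted()
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(fieldsForView(Example.class, Public.class));   // [a, b]
        System.out.println(fieldsForView(Example.class, Internal.class)); // [a, b, c]
    }
}
```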
Somehow the requirement changed, and I have to use an aggregation query instead of a basic query in setQuery(). Is this even possible?
Please suggest how I can do that. My aggregation query is ready, but I am not sure how to use it in Spring Batch.
public ItemReader<MyCollection> searchMongoItemReader() throws Exception {
MongoItemReader<MyCollection> mongoItemReader = new MongoItemReader<>();
mongoItemReader.setTemplate(myMongoTemplate);
mongoItemReader.setCollection(myMongoCollection);
mongoItemReader.setQuery(" Some Simple Query - Basic");
mongoItemReader.setTargetType(MyCollection.class);
Map<String, Sort.Direction> sort = new HashMap<>();
sort.put("field4", Sort.Direction.ASC);
mongoItemReader.setSort(sort);
return mongoItemReader;
}
Extend MongoItemReader and provide your own implementation of the doPageRead() method. This way you keep full pagination support, and the reading of documents is part of a step.
public class CustomMongoItemReader<T, O> extends MongoItemReader<T> {
private MongoTemplate template;
private Class<? extends T> inputType;
private Class<O> outputType;
private MatchOperation match;
private ProjectionOperation projection;
private String collection;
@Override
protected Iterator<T> doPageRead() {
Pageable pageable = PageRequest.of(page, pageSize); // page and pageSize come from the superclass that MongoItemReader extends
Aggregation agg = newAggregation(match, projection, skip((long) pageable.getPageNumber() * pageable.getPageSize()), limit(pageable.getPageSize()));
return (Iterator<T>) template.aggregate(agg, collection, outputType).iterator();
}
}
Plus the getters and setters and the other methods; just have a look at the source code of MongoItemReader.
I also removed Query support from it. You can keep it in the same method as well; just copy-paste it from MongoItemReader. The same goes for Sort.
And in the class where you have a reader, you would do something like:
public MongoItemReader<T> reader() {
CustomMongoItemReader reader = new CustomMongoItemReader();
reader.setTemplate(mongoTemplate);
reader.setName("abc");
reader.setTargetType(input.class);
reader.setOutputType(output.class);
reader.setCollection(myMongoCollection);
reader.setMatch(Aggregation.match(new Criteria()....));
reader.setProjection(Aggregation.project("..", ".."));
return reader;
}
To be able to use aggregation in a job while making use of all the features that Spring Batch has, you have to create a custom ItemReader.
By extending AbstractPaginatedDataItemReader we can use all the elements of pageable operations.
Here's a simple version of that custom class:
public class CustomAggreagationPaginatedItemReader<T> extends AbstractPaginatedDataItemReader<T> implements InitializingBean {
private static final Pattern PLACEHOLDER = Pattern.compile("\\?(\\d+)");
private MongoOperations template;
private Class<? extends T> type;
private Sort sort;
private String collection;
public CustomAggreagationPaginatedItemReader() {
super();
setName(ClassUtils.getShortName(CustomAggreagationPaginatedItemReader.class));
}
public void setTemplate(MongoOperations template) {
this.template = template;
}
public void setTargetType(Class<? extends T> type) {
this.type = type;
}
public void setSort(Map<String, Sort.Direction> sorts) {
this.sort = convertToSort(sorts);
}
public void setCollection(String collection) {
this.collection = collection;
}
@Override
@SuppressWarnings("unchecked")
protected Iterator<T> doPageRead() {
Pageable pageRequest = new PageRequest(page, pageSize, sort);
BasicDBObject cursor = new BasicDBObject();
cursor.append("batchSize", 100);
SkipOperation skipOperation = skip(Long.valueOf(pageRequest.getPageNumber()) * Long.valueOf(pageRequest.getPageSize()));
Aggregation aggregation = newAggregation(
//Include here all your aggreationOperations,
skipOperation,
limit(pageRequest.getPageSize())
).withOptions(newAggregationOptions().cursor(cursor).build());
return (Iterator<T>) template.aggregate(aggregation, collection, type).iterator();
}
@Override
public void afterPropertiesSet() throws Exception {
Assert.state(template != null, "An implementation of MongoOperations is required.");
Assert.state(type != null, "A type to convert the input into is required.");
Assert.state(collection != null, "A collection is required.");
}
private String replacePlaceholders(String input, List<Object> values) {
Matcher matcher = PLACEHOLDER.matcher(input);
String result = input;
while (matcher.find()) {
String group = matcher.group();
int index = Integer.parseInt(matcher.group(1));
result = result.replace(group, getParameterWithIndex(values, index));
}
return result;
}
private String getParameterWithIndex(List<Object> values, int index) {
return JSON.serialize(values.get(index));
}
private Sort convertToSort(Map<String, Sort.Direction> sorts) {
List<Sort.Order> sortValues = new ArrayList<Sort.Order>();
for (Map.Entry<String, Sort.Direction> curSort : sorts.entrySet()) {
sortValues.add(new Sort.Order(curSort.getValue(), curSort.getKey()));
}
return new Sort(sortValues);
}
}
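The replacePlaceholders helper above can be exercised on its own with just the JDK. In this sketch, JSON.serialize is swapped for String.valueOf purely for illustration:

```java
import java.util.Arrays;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Standalone version of the replacePlaceholders helper; JSON.serialize is
// replaced by String.valueOf here purely for illustration.
public class PlaceholderDemo {
    private static final Pattern PLACEHOLDER = Pattern.compile("\\?(\\d+)");

    static String replacePlaceholders(String input, List<Object> values) {
        Matcher matcher = PLACEHOLDER.matcher(input);
        String result = input;
        while (matcher.find()) {
            String group = matcher.group();               // e.g. "?0"
            int index = Integer.parseInt(matcher.group(1));
            result = result.replace(group, String.valueOf(values.get(index)));
        }
        return result;
    }

    public static void main(String[] args) {
        String query = "{ field1: ?0, field2: ?1 }";
        System.out.println(replacePlaceholders(query, Arrays.asList(42, "abc")));
        // { field1: 42, field2: abc }
    }
}
```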
If you look closely, you can see it was created using Spring's MongoItemReader (org.springframework.batch.item.data.MongoItemReader) as a template; that is how you build a whole new class extending AbstractPaginatedDataItemReader. If you look at the doPageRead() method of MongoItemReader itself, you can see that it only uses the find operation of MongoTemplate, which makes it impossible to use aggregation operations with it.
Here is how you would use our custom reader:
@Bean
public ItemReader<YourDataClass> reader(MongoTemplate mongoTemplate) {
CustomAggreagationPaginatedItemReader<YourDataClass> customAggreagationPaginatedItemReader = new CustomAggreagationPaginatedItemReader<>();
Map<String, Direction> sort = new HashMap<String, Direction>();
sort.put("id", Direction.ASC);
customAggreagationPaginatedItemReader.setTemplate(mongoTemplate);
customAggreagationPaginatedItemReader.setCollection("collectionName");
customAggreagationPaginatedItemReader.setTargetType(YourDataClass.class);
customAggreagationPaginatedItemReader.setSort(sort);
return customAggreagationPaginatedItemReader;
}
As you may notice, you also need an instance of MongoTemplate; here's how that could look:
@Bean
public MongoTemplate mongoTemplate(MongoDbFactory mongoDbFactory) {
return new MongoTemplate(mongoDbFactory);
}
where MongoDbFactory is an object autowired by the Spring framework.
Hope that's enough to help you.
How can I set the super class "LookUp" properties id and name when parsing this JSON array
of countries:
[ {"ID":5, "CountryNameEN":"UK" }, {"ID":6, "CountryNameEN":"USA" } ]
For example, when I call the get_lookups_countries() API with Retrofit 2 and parse the response with Google's Gson library, I want to set the super class instance members id and name to the same values as in the derived class "Country":
@GET(Constants.LookUps.GET_COUNTRIES) Call<List<Country>> get_lookups_countries();
Gson gson = new GsonBuilder()
.setLenient()
.registerTypeAdapter(LookUp.class,new LookupsDeserializer())
.create();
HttpLoggingInterceptor logging = new HttpLoggingInterceptor();
logging.setLevel(HttpLoggingInterceptor.Level.BODY);
OkHttpClient.Builder okHttpClient = new OkHttpClient.Builder();
Retrofit retrofit = new Retrofit.Builder()
.baseUrl(BASE_URL)
.client(okHttpClient.build())
.addConverterFactory(GsonConverterFactory.create(gson))
.build();
return retrofit.create(APIEndpointVatTax.class);
public class LookUp {
int id;
String name;
}
public class Country extends LookUp {
@SerializedName("ID")
@Expose
private Integer iD;
@SerializedName("CountryNameEN")
@Expose
private String countryNameEN;
}
You seem to have some issues with your JSON mappings: you're trying to bind the super class fields to the sub class fields. However, an interface might be a better choice here, because your intention is just to ask the deserialized object for its id and name.
I would do it like this:
interface LookUp {
int getId();
String getName();
}
final class CountryByInterface
implements LookUp {
@SerializedName("ID")
private final Integer id = null;
@SerializedName("CountryNameEN")
private final String name = null;
@Override
public int getId() {
return id;
}
@Override
public String getName() {
return name;
}
}
So it could be used easily (Java 8 for the demo purposes only):
final Gson gson = new Gson();
final Type countryListType = new TypeToken<List<CountryByInterface>>() {
}.getType();
try ( final Reader reader = getPackageResourceReader(Q43247712.class, "countries.json") ) {
gson.<List<CountryByInterface>>fromJson(reader, countryListType)
.stream()
.map(c -> c.getId() + "=>" + c.getName())
.forEach(System.out::println);
}
If for some justified reason you really need the super class to hold such fields, you have to implement a post-processor (inspired by PostConstructAdapterFactory). Say,
abstract class AbstractLookUp {
int id;
String name;
abstract int getId();
abstract String getName();
final void postSetUp() {
id = getId();
name = getName();
}
}
final class CountryByClass
extends AbstractLookUp {
@SerializedName("ID")
private final Integer id = null;
@SerializedName("CountryNameEN")
private final String name = null;
@Override
int getId() {
return id;
}
@Override
String getName() {
return name;
}
}
final Gson gson = new GsonBuilder()
.registerTypeAdapterFactory(new TypeAdapterFactory() {
@Override
public <T> TypeAdapter<T> create(final Gson gson, final TypeToken<T> typeToken) {
// Check if it's a class we can handle: AbstractLookUp
if ( AbstractLookUp.class.isAssignableFrom(typeToken.getRawType()) ) {
// Get the downstream parser for the given type
final TypeAdapter<T> delegateTypeAdapter = gson.getDelegateAdapter(this, typeToken);
return new TypeAdapter<T>() {
@Override
public void write(final JsonWriter out, final T value)
throws IOException {
delegateTypeAdapter.write(out, value);
}
@Override
public T read(final JsonReader in)
throws IOException {
// Deserialize it as an AbstractLookUp instance
final AbstractLookUp abstractLookUp = (AbstractLookUp) delegateTypeAdapter.read(in);
// And set it up
abstractLookUp.postSetUp();
@SuppressWarnings("unchecked")
final T result = (T) abstractLookUp;
return result;
}
};
}
return null;
}
})
.create();
final Type countryListType = new TypeToken<List<CountryByClass>>() {
}.getType();
try ( final Reader reader = getPackageResourceReader(Q43247712.class, "countries.json") ) {
gson.<List<CountryByClass>>fromJson(reader, countryListType)
.stream()
.map(c -> ((AbstractLookUp) c).id + "=>" + ((AbstractLookUp) c).name)
.forEach(System.out::println);
}
Both examples produce
5=>UK
6=>USA
However, I find the first approach better designed and much easier to use, whereas the second one demonstrates how Gson can be configured to implement complex (de)serialization strategies.
I read in the custom converter documentation that I can pass a custom parameter to a custom converter on a field mapping.
This is not good enough for me, because that parameter is specified once, when building the mapper.
Is there any way to pass this parameter when doing the actual mapping?
mapper.map(sourceObject, Destination.class, "parameter");
My actual problem is that I want to map from one class containing multilingual properties to a destination that should only have the properties of the chosen language.
Source class
public class Source
{
// Fields in default language
private String prop1;
private String prop2;
// List containing all translations of properties
private List<SourceName> sourceNames;
}
public class SourceName
{
private int lang_id;
private String prop1;
private String prop2;
}
Destination class
public class Destination
{
// Fields translated in chosen language
private String prop1;
private String prop2;
}
My goal is to be able to do this:
Destination destination = mapper.map(source, Destination.class, 4); // To lang_id 4
Thanks
I have written this function (the FIELDMAP constant holds the string "fieldMap"):
public static <T> T mapWithParam(Object source, Class<T> destinationClass, String param) throws MappingException {
T toReturn = null;
DozerBeanMapper dbm = (DozerBeanMapper) MapperFactory.getMapper();
MappingMetadata mmdt = dbm.getMappingMetadata();
ClassMappingMetadata classMapping = mmdt.getClassMapping(source.getClass(), destinationClass);
List<FieldMappingMetadata> fieldMappingMetadata = classMapping.getFieldMappings();
List<OriginalFieldMap> originalValues = new ArrayList<OriginalFieldMap>();
for (FieldMappingMetadata fmmd : fieldMappingMetadata) {
if (fmmd.getCustomConverter() != null) {
try {
Class<?> cls = Class.forName(fmmd.getCustomConverter());
if (cls.newInstance() instanceof ConfigurableCustomConverter) {
FieldMap modifiedFieldMap = (FieldMap) ReflectionHelper.executeGetMethod(fmmd, FIELDMAP);
originalValues.add(new OriginalFieldMap(modifiedFieldMap, modifiedFieldMap.getCustomConverterParam()));
modifiedFieldMap.setCustomConverterParam(param);
ReflectionHelper.executeSetMethod(fmmd, FIELDMAP, modifiedFieldMap);
}
} catch (ReflectionException | ClassNotFoundException | InstantiationException | IllegalAccessException e) {
e.printStackTrace();
}
}
}
toReturn = dbm.map(source, destinationClass);
for (OriginalFieldMap ofp : originalValues) {
ofp.getFieldMap().setCustomConverterParam(ofp.getOriginalValue());
}
return toReturn;
}
And OriginalFieldMap class:
import org.dozer.fieldmap.FieldMap;
public class OriginalFieldMap {
FieldMap fieldMap;
String originalValue;
public OriginalFieldMap(FieldMap fieldMap, String originalValue) {
super();
this.fieldMap = fieldMap;
this.originalValue = originalValue;
}
public FieldMap getFieldMap() {
return fieldMap;
}
public String getOriginalValue() {
return originalValue;
}
}
I'm currently writing a Spring MVC-based webapp.
Rather than writing one test for every annotated method, I would like to benefit from the Parameterized JUnit runner.
Finally, I got it almost working, although I had to change all primitive arguments to their wrapper counterpart in my controller methods (and then manually do the sanity checks on null refs).
If it can help, here is the code (this also depends on Guava):
@RunWith(Parameterized.class)
public class MyControllerMappingTest {
private MockHttpServletRequest request;
private MockHttpServletResponse response;
private MyController mockedController;
private AnnotationMethodHandlerAdapter annotationHandlerAdapter;
private final String httpMethod;
private final String uri;
private final String controllerMethod;
private final Class<?>[] parameterTypes;
private final Object[] parameterValues;
@Before
public void setup() {
request = new MockHttpServletRequest();
response = new MockHttpServletResponse();
mockedController = mock(MyController.class);
annotationHandlerAdapter = new AnnotationMethodHandlerAdapter();
}
@Parameters
public static Collection<Object[]> requestMappings() {
return asList(new Object[][] {
{"GET", "/my/uri/0", "index", arguments(new MethodArgument(Integer.class, 0))}
});
}
private static List<MethodArgument> arguments(MethodArgument... arguments) {
return asList(arguments);
}
public MyControllerMappingTest(String httpMethod, String uri, String controllerMethod, List<MethodArgument> additionalParameters) {
this.httpMethod = httpMethod;
this.uri = uri;
this.controllerMethod = controllerMethod;
this.parameterTypes = new Class<?>[additionalParameters.size()];
initializeParameterTypes(additionalParameters);
this.parameterValues = newArrayList(transform(additionalParameters, valueExtractor())).toArray();
}
private void initializeParameterTypes(List<MethodArgument> additionalParameters) {
Iterable<Class<?>> classes = transform(additionalParameters, typeExtractor());
int i = 0;
for (Class<?> parameterClass : classes) {
parameterTypes[i++] = parameterClass;
}
}
@Test
public void when_matching_mapping_constraints_then_controller_method_automatically_called() throws Exception {
request.setMethod(httpMethod);
request.setRequestURI(uri);
annotationHandlerAdapter.handle(request, response, mockedController);
Method method = MyController.class.getMethod(controllerMethod, parameterTypes);
method.invoke(verify(mockedController), parameterValues);
}
}
with the custom class MethodArgument that follows:
public class MethodArgument {
private final Class<?> type;
private final Object value;
public MethodArgument(final Class<?> type, final Object value) {
this.type = type;
this.value = value;
}
public Object getValue() {
return value;
}
public Class<?> getType() {
return type;
}
public static Function<MethodArgument, Class<?>> typeExtractor() {
return new Function<MethodArgument, Class<?>>() {
@Override
public Class<?> apply(MethodArgument argument) {
return argument.getType();
}
};
}
public static Function<MethodArgument, Object> valueExtractor() {
return new Function<MethodArgument, Object>() {
@Override
public Object apply(MethodArgument argument) {
return argument.getValue();
}
};
}
}
So, I'm almost there. The only test case here works because of the Java Integer cache, so the Integer instance is the same throughout the call chain... This however doesn't work with custom objects; I always end up with an InvocationTargetException (cause: "Argument(s) are different!").
The types are correct, but the instances passed are not identical to the ones set in the @Parameters method.
Any idea how to work around this?
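For reference, the Integer cache mentioned here is easy to observe directly: autoboxing goes through Integer.valueOf, which the JLS only guarantees to return cached (hence identical) instances for values in -128..127. A custom class that does not override equals falls back to reference identity, which is likely why verification fails for those arguments. A minimal illustration:

```java
// Small demo of the Integer cache and of default identity-based equals.
public class IntegerCacheDemo {
    public static void main(String[] args) {
        Integer a = 127, b = 127;   // autoboxing -> Integer.valueOf, inside cached range
        Integer c = 1000, d = 1000; // outside the range the JLS guarantees to cache
        System.out.println(a == b);      // true: same cached instance
        System.out.println(c == d);      // usually false, though not guaranteed
        System.out.println(c.equals(d)); // true: value equality always holds
        System.out.println(new Object().equals(new Object())); // false: identity semantics
    }
}
```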
Hold your horses!
SpringSource is baking a spring-test-mvc module:
https://github.com/SpringSource/spring-test-mvc
It would be nice if instead of providing the example that works, you could provide the one that doesn't, and provide the stacktrace as well.
A quick Google search suggests that Mockito doesn't handle reflection on spy objects well.
If you really want to go down that road, there might be another way: provide the expected method call as part of your parameterized data, not as reflection data, but by actually calling the mock from there.
I'm writing that without any IDE at hand, so there might be compile errors, but you'll get the idea:
@RunWith(Parameterized.class)
public class MyControllerMappingTest {
public interface VerifyCall<T> {
void on(T controller);
}
@Parameters
public static Collection<Object[]> requestMappings() {
Object[][] testCases = {
{"GET", "/my/uri/0", new VerifyCall<MyController>() {
@Override
public void on(MyController controller) {
controller.index(0);
}
}}
};
return asList(testCases);
}
private MockHttpServletRequest request;
private MockHttpServletResponse response;
private MyController mockedController;
private AnnotationMethodHandlerAdapter annotationHandlerAdapter;
private final String httpMethod;
private final String uri;
private final VerifyCall<MyController> verifyCall;
public MyControllerMappingTest(String httpMethod, String uri, VerifyCall<MyController> verifyCall) {
this.httpMethod = httpMethod;
this.uri = uri;
this.verifyCall = verifyCall;
}
@Before
public void setup() {
request = new MockHttpServletRequest();
response = new MockHttpServletResponse();
mockedController = mock(MyController.class);
annotationHandlerAdapter = new AnnotationMethodHandlerAdapter();
}
@Test
public void when_matching_mapping_constraints_then_controller_method_automatically_called() throws Exception {
request.setMethod(httpMethod);
request.setRequestURI(uri);
annotationHandlerAdapter.handle(request, response, mockedController);
verifyCall.on(verify(mockedController));
}
}
Of course, having Java lambdas would help make this more readable.
You could also use FunkyJFunctional:
@RunWith(Parameterized.class)
public class MyControllerMappingTest {
@Parameters
public static Collection<Object[]> requestMappings() {
class IndexZero extends FF<MyController, Void> {{ in.index(0); }}
Object[][] testCases = { //
{"GET", "/my/uri/0", withF(IndexZero.class)}
};
return asList(testCases);
}
private MockHttpServletRequest request;
private MockHttpServletResponse response;
private MyController mockedController;
private AnnotationMethodHandlerAdapter annotationHandlerAdapter;
private final String httpMethod;
private final String uri;
private final Function<MyController, Void> verifyCall;
public MyControllerMappingTest(String httpMethod, String uri, Function<MyController, Void> verifyCall) {
this.httpMethod = httpMethod;
this.uri = uri;
this.verifyCall = verifyCall;
}
@Before
public void setup() {
request = new MockHttpServletRequest();
response = new MockHttpServletResponse();
mockedController = mock(MyController.class);
annotationHandlerAdapter = new AnnotationMethodHandlerAdapter();
}
@Test
public void when_matching_mapping_constraints_then_controller_method_automatically_called() throws Exception {
request.setMethod(httpMethod);
request.setRequestURI(uri);
annotationHandlerAdapter.handle(request, response, mockedController);
verifyCall.apply(verify(mockedController));
}
}
A few side notes:
For the sake of readability, it is good practice to put your static members first in your class. Instance methods (such as setup()) should also go after the constructor.
Array syntax:
Instead of this syntax:
return asList(new Object[][] {
{},
{}
});
I find this syntax to be more readable:
Object[][] testCases = {
{},
{}
};
return asList(testCases);