Spring MongoDB - Cannot write custom Converter for java.time.Period

I'm using Spring Cloud Brixton.SR4 with Spring Data MongoDB.
I have a very simple entity:
@Document
public class Foo {
    private Period period;
    // getter & setter
}
Because java.time.Period is not supported by the JSR-310 converters, I'm creating custom converters:
class Converters {

    @Component
    @WritingConverter
    static class PeriodToStringConverter implements Converter<Period, String> {
        @Override
        public String convert(Period period) {
            return period.toString();
        }
    }

    @Component
    @ReadingConverter
    static class StringToPeriodConverter implements Converter<String, Period> {
        @Override
        public Period convert(String s) {
            return Period.parse(s);
        }
    }
}
Now I register them in my configuration class extending AbstractMongoConfiguration:
@Bean
@Override
public MappingMongoConverter mappingMongoConverter() throws Exception {
    DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
    MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext());
    final CustomConversions conversions = customConversions();
    log.info("hasCustomWriteTarget(Period.class): " + conversions.hasCustomWriteTarget(Period.class));
    log.info("hasCustomWriteTarget(Period.class, String.class): " + conversions.hasCustomWriteTarget(Period.class, String.class));
    log.info("hasCustomReadTarget(String.class, Period.class): " + conversions.hasCustomReadTarget(String.class, Period.class));
    converter.setCustomConversions(conversions);
    converter.afterPropertiesSet(); // probably not needed, trying out of despair
    return converter;
}

@Bean
@Override
public CustomConversions customConversions() {
    List<Converter<?, ?>> converters = new ArrayList<>();
    converters.add(new Converters.PeriodToStringConverter());
    converters.add(new Converters.StringToPeriodConverter());
    return new CustomConversions(converters);
}
When I start my app I see in the logs:
hasCustomWriteTarget(Period.class): true
hasCustomWriteTarget(Period.class, String.class): true
hasCustomReadTarget(String.class, Period.class): true
Now I create a new Foo and save it to my repository:
Foo foo = new Foo();
foo.setPeriod(Period.of(2, 0, 1));
fooRepository.save(foo);
Now the weirdness happens. In MongoDB I see:
{
    "_id": ObjectId("xxxx"),
    "period": {
        "years": 0,
        "months": 2,
        "days": 1
    }
}
So something is already wrong there: it should have been saved as a String.
When I try to read the object in Java I get:
org.springframework.data.mapping.model.MappingException: No property null found on entity class java.time.Period to bind constructor parameter to!
I debugged the code in MappingMongoConverter:
if (conversions.hasCustomReadTarget(dbo.getClass(), rawType)) {
    return conversionService.convert(dbo, rawType);
}
Because my object was not stored as a String, the dbo variable is actually a BasicDBObject, and therefore there is no converter for it.
Any idea why my write converter is not being used to persist the Period?
I have jackson-datatype-jdk8 on my classpath; could that be the issue? Would Jackson be involved at all in persisting to MongoDB?
EDIT
It seems to be a registration issue. When I debug the code, the CustomConversions object used in MappingMongoConverter is different from the one I create, and it doesn't contain the custom converters I registered.

OK it was extremely stupid...
I was also creating my own MongoTemplate:
@Bean
public MongoTemplate mongoTemplate() throws Exception {
    return new MongoTemplate(mongoDbFactory());
}
This constructor ignores my custom converter: a MongoTemplate built from just a MongoDbFactory creates its own default MappingMongoConverter. To fix it:
@Bean
public MongoTemplate mongoTemplate() throws Exception {
    return new MongoTemplate(mongoDbFactory(), mappingMongoConverter());
}
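With that change the converter chain is actually consulted. A quick round-trip sketch with the Foo from the question (the stored value below is simply what Period.toString() yields for this example):

Foo foo = new Foo();
foo.setPeriod(Period.of(2, 0, 1));
fooRepository.save(foo);
// the document now looks like { "_id": ObjectId("..."), "period": "P2Y1D" }
// and reading it back goes through StringToPeriodConverter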

Related

Persisting Path objects with spring-data-mongodb repositories

In a project I use spring-boot-starter-data-mongodb:2.5.3 (and therefore spring-data-mongodb:3.2.3) and have an entity class that, simplified, looks like this:
@Document
public class Task {
    @Id
    private final String id;
    private final Path taskDir;
    ...
    // constructor, getters, setters
}
with a default Spring Data MongoDB repository that allows retrieving the task via its id.
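For reference, that repository is assumed to be a plain Spring Data interface along these lines (the name TaskRepository is mine):

public interface TaskRepository extends MongoRepository<Task, String> {
}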
The Mongo configuration looks like this:
@Configuration
@EnableMongoRepositories(basePackages = {
        "path.to.repository"
}, mongoTemplateRef = MongoConfig.MONGO_TEMPLATE_REF)
@EnableConfigurationProperties(MongoSettings.class)
public class MongoConfig extends MongoConfigurationSupport {

    private static final Logger LOG = LoggerFactory.getLogger(MethodHandles.lookup().lookupClass());

    public static final String MONGO_TEMPLATE_REF = "mongoAlTemplate";

    private final MongoSettings mongoSettings;

    @Autowired
    public MongoConfig(final MongoSettings mongoSettings) {
        this.mongoSettings = mongoSettings;
    }

    @Bean(name = "ourMongo", destroyMethod = "close")
    public MongoClient ourMongoClient() {
        MongoCredential credential =
                MongoCredential.createCredential(mongoSettings.getUser(),
                        mongoSettings.getDb(),
                        mongoSettings.getPassword());
        MongoClientSettings clientSettings = MongoClientSettings.builder()
                .readPreference(ReadPreference.primary())
                // enable optimistic locking for @Version and eTag usage
                .writeConcern(WriteConcern.ACKNOWLEDGED)
                .credential(credential)
                .applyToSocketSettings(
                        builder -> builder.connectTimeout(15, TimeUnit.SECONDS)
                                .readTimeout(1, TimeUnit.MINUTES))
                .applyToConnectionPoolSettings(
                        builder -> builder.maxConnectionIdleTime(10, TimeUnit.MINUTES)
                                .minSize(5).maxSize(20))
                // .applyToClusterSettings(
                //         builder -> builder.requiredClusterType(ClusterType.REPLICA_SET)
                //                 .hosts(Arrays.asList(new ServerAddress("host1", 27017),
                //                         new ServerAddress("host2", 27017)))
                //                 .build())
                .build();
        return MongoClients.create(clientSettings);
    }

    @Override
    @Nonnull
    protected String getDatabaseName() {
        return mongoSettings.getDb();
    }

    @Bean(name = MONGO_TEMPLATE_REF)
    public MongoTemplate ourMongoTemplate() throws Exception {
        return new MongoTemplate(ourMongoClient(), getDatabaseName());
    }
}
On attempting to save a task via taskRepository.save(task), Java ends up in a StackOverflowError:
java.lang.StackOverflowError
at java.lang.ThreadLocal.get(ThreadLocal.java:160)
at java.util.concurrent.locks.ReentrantReadWriteLock$Sync.tryReleaseShared(ReentrantReadWriteLock.java:423)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.releaseShared(AbstractQueuedSynchronizer.java:1341)
at java.util.concurrent.locks.ReentrantReadWriteLock$ReadLock.unlock(ReentrantReadWriteLock.java:881)
at org.springframework.data.mapping.context.AbstractMappingContext.getPersistentEntity(AbstractMappingContext.java:239)
at org.springframework.data.mapping.context.AbstractMappingContext.getPersistentEntity(AbstractMappingContext.java:201)
at org.springframework.data.mapping.context.AbstractMappingContext.getPersistentEntity(AbstractMappingContext.java:87)
at org.springframework.data.mapping.context.MappingContext.getRequiredPersistentEntity(MappingContext.java:73)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writePropertyInternal(MappingMongoConverter.java:740)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeProperties(MappingMongoConverter.java:657)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeInternal(MappingMongoConverter.java:633)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writePropertyInternal(MappingMongoConverter.java:746)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeProperties(MappingMongoConverter.java:657)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeInternal(MappingMongoConverter.java:633)
...
On annotating the taskDir field in the Task class with @Transient I'm able to persist the task, so the problem seems to be that Java/Spring/MongoDB cannot handle Path objects directly and the mapping layer ends up recursively introspecting the Path object graph.
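For reference, that workaround is just the following (note it must be Spring Data's org.springframework.data.annotation.Transient, not the JPA annotation; sketched here without the final modifier for brevity):

@Transient
private Path taskDir;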
My next attempt was to configure a custom converter inside the MongoConfig class to convert between Path and String representations:
@Override
protected void configureConverters(
        MongoCustomConversions.MongoConverterConfigurationAdapter converterConfigurationAdapter) {
    LOG.info("configuring converters");
    converterConfigurationAdapter.registerConverter(new Converter<Path, String>() {
        @Override
        public String convert(@Nonnull Path path) {
            return path.normalize().toAbsolutePath().toString();
        }
    });
    converterConfigurationAdapter.registerConverter(new Converter<String, Path>() {
        @Override
        public Path convert(@Nonnull String path) {
            return Paths.get(path);
        }
    });
}
though the error remained. I then defined a direct conversion between the Task object and a DBObject, as showcased in this guide:
@Override
protected void configureConverters(
        MongoCustomConversions.MongoConverterConfigurationAdapter converterConfigurationAdapter) {
    LOG.info("configuring converters");
    converterConfigurationAdapter.registerConverter(new Converter<Task, DBObject>() {
        @Override
        public DBObject convert(@Nonnull Task source) {
            DBObject dbObject = new BasicDBObject();
            if (source.getTaskDirectory() != null) {
                dbObject.put("taskDir", source.getTaskDirectory().normalize().toAbsolutePath().toString());
            }
            ...
            return dbObject;
        }
    });
}
and I still get a StackOverflowError in return. Through the log statement I added I can see that Spring calls into the configureConverters method and therefore should have registered the custom converters.
Why do I still get the StackOverflowError, though? How do I tell Spring to treat Path objects as Strings while persisting, and to convert the String value back into a Path object on read?
Update:
I've now followed the example given in the official documentation and refactored the converter into its own class:
import org.bson.Document;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.WritingConverter;

import javax.annotation.Nonnull;

@WritingConverter
public class TaskWriteConverter implements Converter<Task, Document> {
    @Override
    public Document convert(@Nonnull Task source) {
        Document document = new Document();
        document.put("_id", source.getId());
        if (source.getTaskDir() != null) {
            document.put("taskDir", source.getTaskDir().normalize().toAbsolutePath().toString());
        }
        return document;
    }
}
The configuration in the MongoConfig class now looks like this:
@Override
protected void configureConverters(
        MongoCustomConversions.MongoConverterConfigurationAdapter adapter) {
    LOG.info("configuring converters");
    adapter.registerConverter(new TaskWriteConverter());
    adapter.registerConverter(new TaskReadConverter());
    adapter.registerConverter(new Converter<Path, String>() {
        @Override
        public String convert(@Nonnull Path path) {
            return path.normalize().toAbsolutePath().toString();
        }
    });
    adapter.registerConverter(new Converter<String, Path>() {
        @Override
        public Path convert(@Nonnull String path) {
            return Paths.get(path);
        }
    });
}
After changing the logging level for org.springframework.data to debug, I can see in the logs that these converters also get picked up:
2021-09-23 14:09:20.469 [INFO ] [ main] MongoConfig configuring converters
2021-09-23 14:09:20.480 [DEBUG] [ main] CustomConversions Adding user defined converter from class com.acme.Task to class org.bson.Document as writing converter.
2021-09-23 14:09:20.480 [DEBUG] [ main] CustomConversions Adding user defined converter from class org.bson.Document to class com.acme.Task as reading converter.
2021-09-23 14:09:20.481 [DEBUG] [ main] CustomConversions Adding user defined converter from interface java.nio.file.Path to class java.lang.String as writing converter.
2021-09-23 14:09:20.481 [DEBUG] [ main] CustomConversions Adding user defined converter from class java.lang.String to interface java.nio.file.Path as reading converter.
However, I see that most of the converters are added multiple times: for instance, the line Adding converter from class java.lang.Character to class java.lang.String as writing converter. appears 4 times before the application hits the save method on the repository. My custom converters only show up the 3rd time all of these converters appear in the logs, so I have a feeling they are somehow overwritten, as the last "iteration" of the logs doesn't include them.
The test case that reproduces that issue is as follows:
@SpringBootTest
@AutoConfigureMockMvc
@PropertySource("classpath:application-test.properties")
public class SomeIT {

    @Autowired
    private TaskRepository taskRepository;
    ...

    @Test
    public void testTaskPersistence() throws Exception {
        Task task = new Task("1234", Paths.get("/home/roman"));
        taskRepository.save(task);
    }
    ...
}
The test method is only used to investigate the current persistence issue; under normal conditions it wouldn't exist at all, since the integration test actually covers the upload of a large file, its preprocessing, and so on. That integration test, however, fails because Spring seems unable to store entities that contain Path objects.
Note that for simple entities I have no issues persisting them with the outlined setup, and I also see them in the dockerized MongoDB.
I haven't had time yet to dig deeper into why Spring has such problems with Path objects, or why my custom converters suddenly disappear in the last iteration of the CustomConversions log output.
It turns out that the way the mongoTemplate was configured "overwrites" any specified custom converters, so Spring was not able to make use of them to convert Path to String and vice versa.
After changing the MongoConfig to the one below, I'm finally able to use my custom converters and thus persist entities as expected:
@Configuration
@EnableMongoRepositories(basePackages = {
        "path.to.repository"
}, mongoTemplateRef = MongoConfig.MONGO_TEMPLATE_REF)
@EnableConfigurationProperties(MongoSettings.class)
public class MongoConfig extends MongoConfigurationSupport {

    private static final Logger LOG = LoggerFactory.getLogger(MethodHandles.lookup().lookupClass());

    public static final String MONGO_TEMPLATE_REF = "mongoAlTemplate";

    private final MongoSettings mongoSettings;

    @Autowired
    public MongoConfig(final MongoSettings mongoSettings) {
        this.mongoSettings = mongoSettings;
    }

    @Bean(name = "ourMongo", destroyMethod = "close")
    public MongoClient ourMongoClient() {
        MongoCredential credential =
                MongoCredential.createCredential(mongoSettings.getUser(),
                        mongoSettings.getDb(),
                        mongoSettings.getPassword());
        MongoClientSettings clientSettings = MongoClientSettings.builder()
                .readPreference(ReadPreference.primary())
                // enable optimistic locking for @Version and eTag usage
                .writeConcern(WriteConcern.ACKNOWLEDGED)
                .credential(credential)
                .applyToSocketSettings(
                        builder -> builder.connectTimeout(15, TimeUnit.SECONDS)
                                .readTimeout(1, TimeUnit.MINUTES))
                .applyToConnectionPoolSettings(
                        builder -> builder.maxConnectionIdleTime(10, TimeUnit.MINUTES)
                                .minSize(5).maxSize(20))
                // .applyToClusterSettings(
                //         builder -> builder.requiredClusterType(ClusterType.REPLICA_SET)
                //                 .hosts(Arrays.asList(new ServerAddress("host1", 27017),
                //                         new ServerAddress("host2", 27017)))
                //                 .build())
                .build();
        LOG.info("Mongo client initialized. Connecting with user {} to DB {}",
                mongoSettings.getUser(), mongoSettings.getDb());
        return MongoClients.create(clientSettings);
    }

    @Override
    @Nonnull
    protected String getDatabaseName() {
        return mongoSettings.getDb();
    }

    @Bean
    public MongoDatabaseFactory ourMongoDBFactory() {
        return new SimpleMongoClientDatabaseFactory(ourMongoClient(), getDatabaseName());
    }

    @Bean(name = MONGO_TEMPLATE_REF)
    public MongoTemplate ourMongoTemplate() throws Exception {
        return new MongoTemplate(ourMongoDBFactory(), mappingMongoConverter());
    }

    @Bean
    public MappingMongoConverter mappingMongoConverter() throws Exception {
        DbRefResolver dbRefResolver = new DefaultDbRefResolver(ourMongoDBFactory());
        MongoCustomConversions customConversions = customConversions();
        MongoMappingContext context = mongoMappingContext(customConversions);
        MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, context);
        // this one is actually needed, otherwise the StackOverflowError re-appears!
        converter.setCustomConversions(customConversions);
        return converter;
    }

    @Bean
    @Override
    @Nonnull
    public MongoCustomConversions customConversions() {
        return new MongoCustomConversions(
                Arrays.asList(new PathWriteConverter(), new PathReadConverter())
        );
    }
}
So, instead of passing the MongoClient and the database name directly to the MongoTemplate, a MongoDatabaseFactory holding those values and a MappingMongoConverter are passed to the template.
Unfortunately, it is necessary to pass the customConversions object twice within the mappingMongoConverter() method: once into the mapping context, which presumably registers Path as a known simple type so it isn't recursively introspected, and once on the converter itself. If this is not done, the StackOverflowError reappears.
With the given configuration, conversions from Path to String and String to Path are now possible and thus no custom conversions from Task to Document and vice versa are currently needed.

Spring Data Couchbase uses a different object mapper for template and CrudRepository

I'm seeing some weird behavior using Spring Data Couchbase; maybe you can help me understand it.
Context
I'm using Spring-data-couchbase (v.3.1.9.RELEASE) with SpringBoot 2.
My application has an entity with a LocalDate field, like:
@Getter
@Setter
class MyEntity {
    private LocalDate fieldName;
    // ... other basic/primitive fields
}
I have configured the basic converters to deal with LocalDates in the CouchbaseConfig bean.
@Configuration
@EnableCouchbaseRepositories(basePackages = {"com.example.repository.model"})
public class CouchbaseConfig extends AbstractCouchbaseConfiguration {

    @Override
    public CustomConversions customConversions() {
        List<?> converters = Arrays.asList(
                LocalDateTimeToStringConverter.INSTANCE,
                StringToLocalDateTimeConverter.INSTANCE,
                LocalDateToStringConverter.INSTANCE,
                StringToLocalDateConverter.INSTANCE);
        return new CouchbaseCustomConversions(converters);
    }

    @WritingConverter
    public enum LocalDateToStringConverter implements Converter<LocalDate, String> {
        INSTANCE;

        @Override
        public String convert(LocalDate source) {
            return source.format(DateUtils.SHORT_DATE_FORMATTER);
        }
    }

    @ReadingConverter
    public enum StringToLocalDateConverter implements Converter<String, LocalDate> {
        INSTANCE;

        @Override
        public LocalDate convert(String source) {
            return LocalDate.parse(source, DateUtils.SHORT_DATE_FORMATTER);
        }
    }
}
The save and find operations work without problems using the CrudRepository, but now I have to run a dynamic query using the CouchbaseTemplate in a custom repository.
The problem
I need to use template.findByN1QLProjection(queryWithParameters, MyProjection.class) to retrieve only the required fields from MyEntity, but the findByN1QLProjection method uses translationService.decodeFragment(json.toString(), entityClass), which is implemented by JacksonTranslationService like this:
public <T> T decodeFragment(String source, Class<T> target) {
    try {
        return objectMapper.readValue(source, target);
    }
    catch (IOException e) {
        throw new RuntimeException("Cannot decode ad-hoc JSON", e);
    }
}
The problem is that the ObjectMapper used by decodeFragment is a new one created inside that class: it doesn't use the converters from the CouchbaseConfig class, and since no JavaTimeModule is configured on it, the exception com.fasterxml.jackson.databind.exc.InvalidDefinitionException: Cannot construct instance of java.time.LocalDate is thrown.
After some time I figured out the problem and configured a different translationService in the CouchbaseConfig with a custom object mapper, which solved the error.
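A sketch of what such an override can look like, assuming the translationService() bean method and JacksonTranslationService.setObjectMapper(...) available in spring-data-couchbase 3.x (this is my illustration, not the asker's exact code; a custom date module matching your format may be needed instead of the plain JavaTimeModule):

@Override
@Bean
public TranslationService translationService() {
    ObjectMapper mapper = new ObjectMapper();
    mapper.registerModule(new JavaTimeModule()); // teaches Jackson about LocalDate & co.
    JacksonTranslationService translationService = new JacksonTranslationService();
    translationService.setObjectMapper(mapper);
    translationService.afterPropertiesSet(); // initializes the service with the custom mapper
    return translationService;
}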
Question
Why are entities in Spring Data Couchbase handled differently when you use the template's find operations than when you use a CrudRepository?
Am I doing something wrong?
Thanks for your time.

Spring Boot using JSON as request parameters instead of an entity/model

Our company is planning to switch our microservice technology to Spring Boot. As an initiative I did some advance reading, noting down potential impacts and syntax equivalents, and I started porting the smallest service we have as a side project.
One issue that blocked my progress was trying to convert our JSON request/response exchange to Spring Boot.
Here's an example of the code (this is the Nutz framework, for those who don't recognize it):
@POST
@At // These two lines are equivalent to @PostMapping("/create")
@AdaptBy(type = JsonAdapter.class)
public Object create(@Param("param_1") String param1, @Param("param_2") int param2) {
    MyModel1 myModel1 = new MyModel1(param1);
    MyModel2 myModel2 = new MyModel2(param2);
    myRepository1.create(myModel1);
    myRepository2.create(myModel2);
    return new MyJsonResponse();
}
In Postman or any other REST client I simply POST:
{
    "param_1" : "test",
    "param_2" : 1
}
I got as far as doing this in Spring Boot:
@PostMapping("/create")
public Object create(@RequestParam("param_1") String param1, @RequestParam("param_2") int param2) {
    MyModel1 myModel1 = new MyModel1(param1);
    MyModel2 myModel2 = new MyModel2(param2);
    myRepository1.create(myModel1);
    myRepository2.create(myModel2);
    return new MyJsonResponse();
}
I am not sure how to do something similar to JsonAdapter here; Spring doesn't recognize the data I pass.
I tried the following, but based on the examples it expects the JSON parameters to be of an entity's form:
@RequestMapping(path = "/wallet", consumes = "application/json", produces = "application/json")
But I only got it to work by doing something like this:
public Object create(@RequestBody MyModel1 model1) {}
My issue with this is that MyModel1 may not necessarily contain the fields/parameters that my JSON data has.
The very useful thing about Nutz is that if I removed JsonAdapter, the endpoint behaved like a regular form request endpoint in Spring.
I couldn't find an answer here on Stack Overflow; possibly existing Spring devs call this by a different name.
Our bosses expect us (unrealistically) to implement these changes without forcing the front-end developers to adjust to them (autonomy and all that jazz). If this is unavoidable, what would be a sensible explanation for it?
In that case you can use the Map class to read the input JSON, like:
@PostMapping("/create")
public Object create(@RequestBody Map<String, ?> input) {
    System.out.println(input.get("param_1")); // cast to String, int, ...
    return input; // or build a proper response
}
I actually figured out a more straightforward solution.
Apparently this works:
#PostMapping("/endpoint")
public Object endpoint(#RequestBody MyWebRequestObject request) {
String value1 = request.getValue_1();
String value2 = request.getValue_2();
}
The JSON payload is this:
{
    "value_1" : "hello",
    "value_2" : "world"
}
This works if MyWebRequestObject is mapped like the JSON request object, like so:
public class MyWebRequestObject {
    String value_1;
    String value_2;
}
Unmapped values are ignored. Spring is smart like that.
I know this is right back where I started, but since we introduced a service layer for the REST controllers to interact with, it made sense to create our own request model objects (DTOs), separate from the persistence model.
You can use @RequestBody Map as a parameter for @PostMapping, @PutMapping and @PatchMapping. For @GetMapping and @DeleteMapping, you can write a class that implements Converter to convert from JSON-formed request parameters to a Map, and register it as a bean with the @Component annotation. Then you can bind your parameters to a @RequestParam Map.
Here is an example of such a Converter:
@Component
public class StringToMapConverter implements Converter<String, Map<String, Object>> {

    private final ObjectMapper objectMapper;

    @Autowired
    public StringToMapConverter(ObjectMapper objectMapper) {
        this.objectMapper = objectMapper;
    }

    @Override
    public Map<String, Object> convert(String source) {
        try {
            return objectMapper.readValue(source, new TypeReference<Map<String, Object>>() {});
        } catch (IOException e) {
            return new HashMap<>();
        }
    }
}
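For illustration, once such a converter is registered, a handler can bind a JSON-valued query parameter directly to a Map (endpoint and parameter names here are hypothetical):

// GET /endpoint?filter={"aaabbcc":"aaa"} (URL-encoded)
@GetMapping("/endpoint")
public Object endpoint(@RequestParam("filter") Map<String, Object> filter) {
    return filter.get("aaabbcc");
}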
If you want to exclude a specific field of your MyModel1 class, add the @JsonIgnore annotation to the field, like below (field types assumed for illustration):
class MyModel1 {
    private String field1;
    @JsonIgnore
    private String field2;
}
Then, I guess you can just use what you have done (I'm not sure):
public Object create(@RequestBody MyModel1 model1) {}
I think you can use a strategy that involves DTOs:
https://auth0.com/blog/automatically-mapping-dto-to-entity-on-spring-boot-apis/
You send JSON to your REST API that is mapped to a DTO object; afterwards you can map it to an entity or use it for whatever you need.
Try this: add a new annotation, JsonParam, and implement a HandlerMethodArgumentResolver for it that parses the JSON body into a map and pulls out the named value. For a request body like:
{
    "aaabbcc": "aaa"
}
@Target(ElementType.PARAMETER)
@Retention(RetentionPolicy.RUNTIME)
public @interface JsonParam {
    String value();
}
@Component
public class JsonParamMethodResolver implements HandlerMethodArgumentResolver {

    @Override
    public boolean supportsParameter(MethodParameter parameter) {
        return parameter.hasParameterAnnotation(JsonParam.class);
    }

    @Override
    public Object resolveArgument(MethodParameter parameter, ModelAndViewContainer mavContainer,
                                  NativeWebRequest webRequest, WebDataBinderFactory binderFactory) throws Exception {
        RepeatedlyRequestWrapper nativeRequest = webRequest.getNativeRequest(RepeatedlyRequestWrapper.class);
        if (nativeRequest == null) {
            return null;
        }
        Gson gson = new Gson();
        Map<String, Object> response = gson.fromJson(nativeRequest.getReader(),
                new TypeToken<Map<String, Object>>() {}.getType());
        if (response == null) {
            return null;
        }
        JsonParam parameterAnnotation = parameter.getParameterAnnotation(JsonParam.class);
        String value = parameterAnnotation.value();
        return response.get(value);
    }
}
@Configuration
public class JsonParamConfig extends WebMvcConfigurerAdapter {

    @Autowired
    JsonParamMethodResolver jsonParamMethodResolver;

    @Override
    public void addArgumentResolvers(List<HandlerMethodArgumentResolver> argumentResolvers) {
        argumentResolvers.add(jsonParamMethodResolver);
    }
}
@PostMapping("/methodName")
public void methodName(@JsonParam("aaabbcc") String ddeeff) {
    System.out.println(ddeeff);
}

(Aggregation) Can't find a codec for class org.springframework.data.mongodb.core.geo.GeoJsonPoint

Can you please help me with the following?
I get the error above in the repository given below:
@Repository("polygonQueryRepository")
@RequiredArgsConstructor
public class PolygonQueryRepositoryImpl extends PolygonRepository {

    @Autowired
    private MongoOperations operations;

    public List<DBObject> findPolygonsMatchingGivenPointAndInputAggregate(Double lat, Double lng, String band) {
        GeoJsonPoint point = new GeoJsonPoint(lat, lng);
        MatchOperation operation = match(Criteria.where("location").intersects(point).and("attributes.band").regex(band));
        Aggregation aggregation = newAggregation(Polygon.class,
                unwind("attributes.transponders"),
                operation);
        AggregationResults<DBObject> result =
                operations.aggregate(aggregation, "Polygon", DBObject.class);
        return result.getMappedResults();
    }
}
I've previously read that I need to register codecs when defining the MongoTemplate bean. So I did, and still no luck:
@Configuration
public class SpringMongoConfig {

    @Value("${mongo.db.name}")
    private String dbName;

    @Value("${mongo.db.host}")
    private String dbHost;

    @Bean
    public MongoDbFactory mongoDbFactory() throws Exception {
        MongoClientOptions options = MongoClientOptions.builder()
                .codecRegistry(MongoClient.getDefaultCodecRegistry())
                .build();
        return new SimpleMongoDbFactory(new MongoClient(dbHost, options), dbName);
    }

    @Bean
    public MongoTemplate mongoTemplate() throws Exception {
        return new MongoTemplate(mongoDbFactory());
    }
}
Do you have any suggestions?
You're close with this line:
MongoClientOptions options = MongoClientOptions
        .builder()
        .codecRegistry(MongoClient.getDefaultCodecRegistry())
        .build();
But to use the GeoJsonPoint class you need to implement the Codec interface and then register it in the following way:
CodecRegistry registry = CodecRegistries.fromRegistries(
        CodecRegistries.fromCodecs(new GeoJsonPointCodec()), // the codec you need to implement
        MongoClient.getDefaultCodecRegistry()
);
You can use this registry in the options builder.
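For completeness, a minimal sketch of what such a codec could look like, assuming the standard GeoJSON document shape { "type": "Point", "coordinates": [x, y] } (this is my illustration, not code from the answer):

import org.bson.BsonReader;
import org.bson.BsonWriter;
import org.bson.codecs.Codec;
import org.bson.codecs.DecoderContext;
import org.bson.codecs.EncoderContext;
import org.springframework.data.mongodb.core.geo.GeoJsonPoint;

public class GeoJsonPointCodec implements Codec<GeoJsonPoint> {

    @Override
    public void encode(BsonWriter writer, GeoJsonPoint value, EncoderContext context) {
        writer.writeStartDocument();
        writer.writeString("type", "Point");
        writer.writeStartArray("coordinates");
        writer.writeDouble(value.getX());
        writer.writeDouble(value.getY());
        writer.writeEndArray();
        writer.writeEndDocument();
    }

    @Override
    public GeoJsonPoint decode(BsonReader reader, DecoderContext context) {
        reader.readStartDocument();
        reader.readString("type"); // expected to be "Point"
        reader.readStartArray();
        double x = reader.readDouble();
        double y = reader.readDouble();
        reader.readEndArray();
        reader.readEndDocument();
        return new GeoJsonPoint(x, y);
    }

    @Override
    public Class<GeoJsonPoint> getEncoderClass() {
        return GeoJsonPoint.class;
    }
}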
cbartosiak, thank you so much for the answer! It helped me a lot, but the workaround was super unobvious! :)
In fact the registry was required, but I couldn't manage to register it until I changed the aggregation call to:
AggregationResults<AggregatedPolygon> result = template.aggregate((TypedAggregation<?>) aggregation, AggregatedPolygon.class);
So I just needed to cast the aggregation object to TypedAggregation and remove the collection name from the call.
That's really weird, but it worked for me this way.

Can you configure Spring controller-specific Jackson deserialization?

I need to add a custom Jackson deserializer for java.lang.String to my Spring 4.1.x MVC application. However, all answers (such as this one) refer to configuring the ObjectMapper for the complete web application, where the changes would apply to all Strings across all @RequestBody in all controllers.
I only want to apply the custom deserialization to @RequestBody arguments used within particular controllers. Note that I don't have the option of using @JsonDeserialize annotations on the specific String fields.
Can you configure custom deserialization for specific controllers only?
To have different deserialization configurations you must have different ObjectMapper instances, but out of the box Spring uses MappingJackson2HttpMessageConverter, which is designed to use only one instance.
I see at least two options here:
Move away from MessageConverter to an ArgumentResolver
Create a @CustomRequestBody annotation and an argument resolver:
public class CustomRequestBodyArgumentResolver implements HandlerMethodArgumentResolver {

    private final ObjectMapperResolver objectMapperResolver;

    public CustomRequestBodyArgumentResolver(ObjectMapperResolver objectMapperResolver) {
        this.objectMapperResolver = objectMapperResolver;
    }

    @Override
    public boolean supportsParameter(MethodParameter methodParameter) {
        return methodParameter.getParameterAnnotation(CustomRequestBody.class) != null;
    }

    @Override
    public Object resolveArgument(MethodParameter methodParameter, ModelAndViewContainer mavContainer,
                                  NativeWebRequest webRequest, WebDataBinderFactory binderFactory) throws Exception {
        if (this.supportsParameter(methodParameter)) {
            ObjectMapper objectMapper = objectMapperResolver.getObjectMapper();
            HttpServletRequest request = (HttpServletRequest) webRequest.getNativeRequest();
            return objectMapper.readValue(request.getInputStream(), methodParameter.getParameterType());
        } else {
            return WebArgumentResolver.UNRESOLVED;
        }
    }
}
The @CustomRequestBody annotation:
@Target(ElementType.PARAMETER)
@Retention(RetentionPolicy.RUNTIME)
@Documented
public @interface CustomRequestBody {
    boolean required() default true;
}
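For clarity, a hypothetical controller usage (ExampleDto is an assumed DTO, not part of the answer):

@PostMapping("/example")
public ExampleDto example(@CustomRequestBody ExampleDto dto) {
    // dto was deserialized by whichever ObjectMapper the ObjectMapperResolver picked
    return dto;
}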
ObjectMapperResolver is the interface we will use to resolve the actual ObjectMapper instance; I discuss it below. Of course, if you have only one use case where you need custom mapping, you can simply initialize your mapper here.
You can add custom argument resolver with this configuration:
@Configuration
public class WebConfiguration extends WebMvcConfigurerAdapter {

    @Bean
    public CustomRequestBodyArgumentResolver customBodyArgumentResolver(ObjectMapperResolver objectMapperResolver) {
        return new CustomRequestBodyArgumentResolver(objectMapperResolver);
    }

    @Override
    public void addArgumentResolvers(List<HandlerMethodArgumentResolver> argumentResolvers) {
        argumentResolvers.add(customBodyArgumentResolver(objectMapperResolver()));
    }
}
Note: do not combine @CustomRequestBody with @RequestBody; it will be ignored.
Wrap ObjectMapper in a proxy that hides multiple instances
MappingJackson2HttpMessageConverter is designed to work with only one instance of ObjectMapper, so we make that instance a delegating proxy. This makes working with multiple mappers transparent.
First of all, we need an interceptor that translates all method invocations to an underlying object:
public abstract class ObjectMapperInterceptor implements MethodInterceptor {

    @Override
    public Object invoke(MethodInvocation invocation) throws Throwable {
        return ReflectionUtils.invokeMethod(invocation.getMethod(), getObject(), invocation.getArguments());
    }

    protected abstract ObjectMapper getObject();
}
Now our ObjectMapper proxy bean will look like this:
@Bean
public ObjectMapper objectMapper(ObjectMapperResolver objectMapperResolver) {
    ProxyFactory factory = new ProxyFactory();
    factory.setTargetClass(ObjectMapper.class);
    factory.addAdvice(new ObjectMapperInterceptor() {
        @Override
        protected ObjectMapper getObject() {
            return objectMapperResolver.getObjectMapper();
        }
    });
    return (ObjectMapper) factory.getProxy();
}
Note: I had class-loading issues with this proxy on WildFly, due to its modular class loading, so I had to extend ObjectMapper (without changing anything) just so I could use the class from my module.
It is all tied together with this configuration:
@Configuration
public class WebConfiguration extends WebMvcConfigurerAdapter {

    @Bean
    public MappingJackson2HttpMessageConverter jackson2HttpMessageConverter() {
        return new MappingJackson2HttpMessageConverter(objectMapper(objectMapperResolver()));
    }

    @Override
    public void configureMessageConverters(List<HttpMessageConverter<?>> converters) {
        converters.add(jackson2HttpMessageConverter());
    }
}
ObjectMapperResolver implementations
The final piece is the logic that determines which mapper should be used; it is contained in the ObjectMapperResolver interface, which has a single look-up method:
public interface ObjectMapperResolver {
    ObjectMapper getObjectMapper();
}
If you do not have a lot of use cases with custom mappers, you can simply make a map of preconfigured instances with RequestMatchers as keys. Something like this:
public class RequestMatcherObjectMapperResolver implements ObjectMapperResolver {

    private final ObjectMapper defaultMapper;
    private final Map<RequestMatcher, ObjectMapper> mapping = new HashMap<>();

    public RequestMatcherObjectMapperResolver(ObjectMapper defaultMapper, Map<RequestMatcher, ObjectMapper> mapping) {
        this.defaultMapper = defaultMapper;
        this.mapping.putAll(mapping);
    }

    public RequestMatcherObjectMapperResolver(ObjectMapper defaultMapper) {
        this.defaultMapper = defaultMapper;
    }

    @Override
    public ObjectMapper getObjectMapper() {
        ServletRequestAttributes sra = (ServletRequestAttributes) RequestContextHolder.getRequestAttributes();
        HttpServletRequest request = sra.getRequest();
        for (Map.Entry<RequestMatcher, ObjectMapper> entry : mapping.entrySet()) {
            if (entry.getKey().matches(request)) {
                return entry.getValue();
            }
        }
        return defaultMapper;
    }
}
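A hypothetical wiring of this resolver, assuming Spring Security's AntPathRequestMatcher as the RequestMatcher implementation (the /api/legacy/** route is made up for illustration):

@Bean
public ObjectMapperResolver objectMapperResolver() {
    ObjectMapper defaultMapper = new ObjectMapper();
    // a differently configured mapper for one group of endpoints
    ObjectMapper legacyMapper = new ObjectMapper()
            .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
    Map<RequestMatcher, ObjectMapper> mapping = new HashMap<>();
    mapping.put(new AntPathRequestMatcher("/api/legacy/**"), legacyMapper);
    return new RequestMatcherObjectMapperResolver(defaultMapper, mapping);
}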
You can also use a request-scoped ObjectMapper and then configure it on a per-request basis, using this configuration:
@Bean
public ObjectMapperResolver objectMapperResolver() {
    return new ObjectMapperResolver() {
        @Override
        public ObjectMapper getObjectMapper() {
            return requestScopedObjectMapper();
        }
    };
}

@Bean
@Scope(value = WebApplicationContext.SCOPE_REQUEST, proxyMode = ScopedProxyMode.TARGET_CLASS)
public ObjectMapper requestScopedObjectMapper() {
    return new ObjectMapper();
}
This is best suited for custom response serialization, since you can configure the mapper right in the controller method. For custom deserialization you must also use a Filter/HandlerInterceptor/ControllerAdvice to configure the active mapper for the current request before the controller method is triggered.
You can create an interface similar to ObjectMapperResolver:
public interface ObjectMapperConfigurer {
    void configureObjectMapper(ObjectMapper objectMapper);
}
Then make a map of these instances with RequestMatchers as keys and apply it from a Filter/HandlerInterceptor/ControllerAdvice, similar to RequestMatcherObjectMapperResolver; a sketch of such an interceptor follows.
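A sketch of what that could look like as a HandlerInterceptor (names are mine; it assumes the request-scoped mapper proxy from above and the javax.servlet API of Spring 4.x):

public class ObjectMapperConfigurerInterceptor implements HandlerInterceptor {

    private final Map<RequestMatcher, ObjectMapperConfigurer> mapping;
    private final ObjectMapper requestScopedMapper; // the scoped proxy from above

    public ObjectMapperConfigurerInterceptor(Map<RequestMatcher, ObjectMapperConfigurer> mapping,
                                             ObjectMapper requestScopedMapper) {
        this.mapping = mapping;
        this.requestScopedMapper = requestScopedMapper;
    }

    @Override
    public boolean preHandle(HttpServletRequest request, HttpServletResponse response, Object handler) {
        // apply the first matching configurer to this request's mapper
        for (Map.Entry<RequestMatcher, ObjectMapperConfigurer> entry : mapping.entrySet()) {
            if (entry.getKey().matches(request)) {
                entry.getValue().configureObjectMapper(requestScopedMapper);
                break;
            }
        }
        return true; // continue with the handler
    }
}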
P.S. If you want to explore dynamic ObjectMapper configuration a bit further, I can suggest my old answer here. It describes how you can build dynamic @JsonFilters at run time. It also contains my older approach with an extended MappingJackson2HttpMessageConverter that I suggested in the comments.
Probably this would help, but it isn't pretty. It requires AOP, and I did not validate it.
Create a @CustomAnnotation.
Update your controller:
void someEndpoint(@RequestBody @CustomAnnotation SomeEntity someEntity);
Then implement the AOP part:
@Around("execution(* *(@CustomAnnotation (*)))")
public Object advice(ProceedingJoinPoint proceedingJoinPoint) throws Throwable {
    // Here you would apply the custom ObjectMapper; I don't know another way around it
    HttpServletRequest request = ((ServletRequestAttributes) RequestContextHolder.currentRequestAttributes()).getRequest();
    String body = request.getReader().lines().collect(Collectors.joining(System.lineSeparator()));
    SomeEntity someEntity = /* deserialize */;
    // This could be cleaner, because the method can accept multiple parameters
    return proceedingJoinPoint.proceed(new Object[] { someEntity });
}
You can create a custom deserializer for your String data.
Custom Deserializer
public class CustomStringDeserializer extends JsonDeserializer<String> {

    @Override
    public String deserialize(JsonParser p, DeserializationContext ctxt) throws IOException {
        String str = p.getText();
        // process str as needed, then return the processed String
        return str;
    }
}
Now suppose the String is present inside a POJO; put the @JsonDeserialize annotation on the field:
public class SamplePOJO {

    @JsonDeserialize(using = CustomStringDeserializer.class)
    private String str;

    // getter and setter
}
Now, whenever the POJO is read from a request body, the String will be deserialized the way you defined in CustomStringDeserializer.
Hope it helps.
You could try Message Converters.
They have context about the HTTP input request (for example, see the docs here on JSON converters, and here on customizing them). The idea is that you could check the HttpInputMessage against the special URIs used in your controllers and convert the String as you want.
You could create a special annotation for this, scan packages, and do it automatically.
Note
You likely don't need several ObjectMapper implementations. You can use the single default ObjectMapper to parse the String and then convert the values as you wish. In that case you would create the RequestBody once.
You can define a POJO for each different type of request parameter that you would like to deserialize. The following code will then pull the values from the JSON into the object you define, assuming the field names in your POJO match the field names in the JSON request.
ObjectMapper mapper = new ObjectMapper();
YourPojo requestParams = null;
try {
    requestParams = mapper.readValue(jsonBody, YourPojo.class);
} catch (IOException e) {
    throw new IOException(e);
}
