Persisting Path objects with spring-data-mongodb repositories - java

In a project I use spring-boot-starter-data-mongodb:2.5.3 (and therefore spring-data-mongodb:3.2.3) and have an entity class that, simplified, looks like this:
@Document
public class Task {

    @Id
    private final String id;
    private final Path taskDir;
    ...
    // constructor, getters, setters
}
with a default Spring Data MongoDB repository that allows retrieving a task by its id.
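The repository itself is just the default Spring Data interface, sketched here for completeness (the name TaskRepository is assumed; the original doesn't show it):
public interface TaskRepository extends MongoRepository<Task, String> {
}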
The Mongo configuration looks as such:
@Configuration
@EnableMongoRepositories(basePackages = {
        "path.to.repository"
}, mongoTemplateRef = MongoConfig.MONGO_TEMPLATE_REF)
@EnableConfigurationProperties(MongoSettings.class)
public class MongoConfig extends MongoConfigurationSupport {

    private static final Logger LOG = LoggerFactory.getLogger(MethodHandles.lookup().lookupClass());

    public static final String MONGO_TEMPLATE_REF = "mongoAlTemplate";

    private final MongoSettings mongoSettings;

    @Autowired
    public MongoConfig(final MongoSettings mongoSettings) {
        this.mongoSettings = mongoSettings;
    }

    @Bean(name = "ourMongo", destroyMethod = "close")
    public MongoClient ourMongoClient() {
        MongoCredential credential =
                MongoCredential.createCredential(mongoSettings.getUser(),
                        mongoSettings.getDb(),
                        mongoSettings.getPassword());
        MongoClientSettings clientSettings = MongoClientSettings.builder()
                .readPreference(ReadPreference.primary())
                // enable optimistic locking for @Version and eTag usage
                .writeConcern(WriteConcern.ACKNOWLEDGED)
                .credential(credential)
                .applyToSocketSettings(
                        builder -> builder.connectTimeout(15, TimeUnit.SECONDS)
                                .readTimeout(1, TimeUnit.MINUTES))
                .applyToConnectionPoolSettings(
                        builder -> builder.maxConnectionIdleTime(10, TimeUnit.MINUTES)
                                .minSize(5).maxSize(20))
                // .applyToClusterSettings(
                //         builder -> builder.requiredClusterType(ClusterType.REPLICA_SET)
                //                 .hosts(Arrays.asList(new ServerAddress("host1", 27017),
                //                         new ServerAddress("host2", 27017)))
                //                 .build())
                .build();
        return MongoClients.create(clientSettings);
    }

    @Override
    @Nonnull
    protected String getDatabaseName() {
        return mongoSettings.getDb();
    }

    @Bean(name = MONGO_TEMPLATE_REF)
    public MongoTemplate ourMongoTemplate() throws Exception {
        return new MongoTemplate(ourMongoClient(), getDatabaseName());
    }
}
On attempting to save a task via taskRepository.save(task), the application fails with a StackOverflowError:
java.lang.StackOverflowError
at java.lang.ThreadLocal.get(ThreadLocal.java:160)
at java.util.concurrent.locks.ReentrantReadWriteLock$Sync.tryReleaseShared(ReentrantReadWriteLock.java:423)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.releaseShared(AbstractQueuedSynchronizer.java:1341)
at java.util.concurrent.locks.ReentrantReadWriteLock$ReadLock.unlock(ReentrantReadWriteLock.java:881)
at org.springframework.data.mapping.context.AbstractMappingContext.getPersistentEntity(AbstractMappingContext.java:239)
at org.springframework.data.mapping.context.AbstractMappingContext.getPersistentEntity(AbstractMappingContext.java:201)
at org.springframework.data.mapping.context.AbstractMappingContext.getPersistentEntity(AbstractMappingContext.java:87)
at org.springframework.data.mapping.context.MappingContext.getRequiredPersistentEntity(MappingContext.java:73)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writePropertyInternal(MappingMongoConverter.java:740)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeProperties(MappingMongoConverter.java:657)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeInternal(MappingMongoConverter.java:633)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writePropertyInternal(MappingMongoConverter.java:746)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeProperties(MappingMongoConverter.java:657)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeInternal(MappingMongoConverter.java:633)
...
On annotating the taskDir field in the Task class with @Transient I'm able to persist the task, so the problem seems to be that Java/Spring/MongoDB can't handle Path objects directly.
My next attempt was to configure a custom converter inside the MongoConfig class to convert between Path and String representations:
@Override
protected void configureConverters(
        MongoCustomConversions.MongoConverterConfigurationAdapter converterConfigurationAdapter) {
    LOG.info("configuring converters");
    converterConfigurationAdapter.registerConverter(new Converter<Path, String>() {
        @Override
        public String convert(@Nonnull Path path) {
            return path.normalize().toAbsolutePath().toString();
        }
    });
    converterConfigurationAdapter.registerConverter(new Converter<String, Path>() {
        @Override
        public Path convert(@Nonnull String path) {
            return Paths.get(path);
        }
    });
}
though the error remained. I then defined a direct conversion between the Task object and a DBObject, as showcased in this guide:
@Override
protected void configureConverters(
        MongoCustomConversions.MongoConverterConfigurationAdapter converterConfigurationAdapter) {
    LOG.info("configuring converters");
    converterConfigurationAdapter.registerConverter(new Converter<Task, DBObject>() {
        @Override
        public DBObject convert(@Nonnull Task source) {
            DBObject dbObject = new BasicDBObject();
            if (source.getTaskDirectory() != null) {
                dbObject.put("taskDir", source.getTaskDirectory().normalize().toAbsolutePath().toString());
            }
            ...
            return dbObject;
        }
    });
}
and I still get a StackOverflowError in return. Through the log statement I added I can see that Spring calls into the configureConverters method, so the custom converters should have been registered.
Why do I still get the StackOverflowError? How do I tell Spring to persist Path objects as Strings and to convert the stored String values back to Path objects on read?
Update:
I've now followed the example given in the official documentation and refactored the converter into its own class:
import org.bson.Document;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.WritingConverter;

import javax.annotation.Nonnull;

@WritingConverter
public class TaskWriteConverter implements Converter<Task, Document> {

    @Override
    public Document convert(@Nonnull Task source) {
        Document document = new Document();
        document.put("_id", source.getId());
        if (source.getTaskDir() != null) {
            document.put("taskDir", source.getTaskDir().normalize().toAbsolutePath().toString());
        }
        return document;
    }
}
The configuration in the MongoConfig class now looks like this:
@Override
protected void configureConverters(
        MongoCustomConversions.MongoConverterConfigurationAdapter adapter) {
    LOG.info("configuring converters");
    adapter.registerConverter(new TaskWriteConverter());
    adapter.registerConverter(new TaskReadConverter());
    adapter.registerConverter(new Converter<Path, String>() {
        @Override
        public String convert(@Nonnull Path path) {
            return path.normalize().toAbsolutePath().toString();
        }
    });
    adapter.registerConverter(new Converter<String, Path>() {
        @Override
        public Path convert(@Nonnull String path) {
            return Paths.get(path);
        }
    });
}
After changing the logging level for org.springframework.data to debug I see in the logs that these converters also got picked up:
2021-09-23 14:09:20.469 [INFO ] [ main] MongoConfig configuring converters
2021-09-23 14:09:20.480 [DEBUG] [ main] CustomConversions Adding user defined converter from class com.acme.Task to class org.bson.Document as writing converter.
2021-09-23 14:09:20.480 [DEBUG] [ main] CustomConversions Adding user defined converter from class org.bson.Document to class com.acme.Task as reading converter.
2021-09-23 14:09:20.481 [DEBUG] [ main] CustomConversions Adding user defined converter from interface java.nio.file.Path to class java.lang.String as writing converter.
2021-09-23 14:09:20.481 [DEBUG] [ main] CustomConversions Adding user defined converter from class java.lang.String to interface java.nio.file.Path as reading converter.
However, I see that most of the converters are added multiple times: for example, the log line Adding converter from class java.lang.Character to class java.lang.String as writing converter. appears 4 times before the application hits the save method on the repository. My custom converters only appear in the 3rd of these "iterations"; the logs in the last iteration don't include them, so I have a feeling they are somehow overwritten.
The test case that reproduces that issue is as follows:
@SpringBootTest
@AutoConfigureMockMvc
@PropertySource("classpath:application-test.properties")
public class SomeIT {

    @Autowired
    private TaskRepository taskRepository;

    ...

    @Test
    public void testTaskPersistence() throws Exception {
        Task task = new Task("1234", Paths.get("/home/roman"));
        taskRepository.save(task);
    }

    ...
}
The test method exists only to investigate the current persistence issue and under normal conditions shouldn't be there at all, as the integration test covers the upload of a large file, its preprocessing and so on. That integration test, however, fails because Spring is, or at least seems to be, unable to store entities that contain Path objects.
Note that for simple entities I have no issues persisting them with the outlined setup, and I also see them in the dockerized MongoDB.
I haven't had time yet to dig deeper into why Spring has such problems with Path objects or why my custom converters suddenly disappear in the last iteration of the CustomConversions log output.

It turns out that the way the mongoTemplate was configured "overwrote" any specified custom converters, so Spring could not use them to convert Path to String and vice versa.
After changing the MongoConfig to the one below, I'm finally able to use my custom converters and thus persist entities as expected:
@Configuration
@EnableMongoRepositories(basePackages = {
        "path.to.repository"
}, mongoTemplateRef = MongoConfig.MONGO_TEMPLATE_REF)
@EnableConfigurationProperties(MongoSettings.class)
public class MongoConfig extends MongoConfigurationSupport {

    private static final Logger LOG = LoggerFactory.getLogger(MethodHandles.lookup().lookupClass());

    public static final String MONGO_TEMPLATE_REF = "mongoAlTemplate";

    private final MongoSettings mongoSettings;

    @Autowired
    public MongoConfig(final MongoSettings mongoSettings) {
        this.mongoSettings = mongoSettings;
    }

    @Bean(name = "ourMongo", destroyMethod = "close")
    public MongoClient ourMongoClient() {
        MongoCredential credential =
                MongoCredential.createCredential(mongoSettings.getUser(),
                        mongoSettings.getDb(),
                        mongoSettings.getPassword());
        MongoClientSettings clientSettings = MongoClientSettings.builder()
                .readPreference(ReadPreference.primary())
                // enable optimistic locking for @Version and eTag usage
                .writeConcern(WriteConcern.ACKNOWLEDGED)
                .credential(credential)
                .applyToSocketSettings(
                        builder -> builder.connectTimeout(15, TimeUnit.SECONDS)
                                .readTimeout(1, TimeUnit.MINUTES))
                .applyToConnectionPoolSettings(
                        builder -> builder.maxConnectionIdleTime(10, TimeUnit.MINUTES)
                                .minSize(5).maxSize(20))
                // .applyToClusterSettings(
                //         builder -> builder.requiredClusterType(ClusterType.REPLICA_SET)
                //                 .hosts(Arrays.asList(new ServerAddress("host1", 27017),
                //                         new ServerAddress("host2", 27017)))
                //                 .build())
                .build();
        LOG.info("Mongo client initialized. Connecting with user {} to DB {}",
                mongoSettings.getUser(), mongoSettings.getDb());
        return MongoClients.create(clientSettings);
    }

    @Override
    @Nonnull
    protected String getDatabaseName() {
        return mongoSettings.getDb();
    }

    @Bean
    public MongoDatabaseFactory ourMongoDBFactory() {
        return new SimpleMongoClientDatabaseFactory(ourMongoClient(), getDatabaseName());
    }

    @Bean(name = MONGO_TEMPLATE_REF)
    public MongoTemplate ourMongoTemplate() throws Exception {
        return new MongoTemplate(ourMongoDBFactory(), mappingMongoConverter());
    }

    @Bean
    public MappingMongoConverter mappingMongoConverter() throws Exception {
        DbRefResolver dbRefResolver = new DefaultDbRefResolver(ourMongoDBFactory());
        MongoCustomConversions customConversions = customConversions();
        MongoMappingContext context = mongoMappingContext(customConversions);
        MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, context);
        // this one is actually needed, otherwise the StackOverflowError re-appears!
        converter.setCustomConversions(customConversions);
        return converter;
    }

    @Bean
    @Override
    @Nonnull
    public MongoCustomConversions customConversions() {
        return new MongoCustomConversions(
                Arrays.asList(new PathWriteConverter(), new PathReadConverter())
        );
    }
}
So instead of passing the MongoClient and the database name to the mongoTemplate directly, a MongoDatabaseFactory object holding those values and a MappingMongoConverter object are passed to the template.
Unfortunately, the customConversions object has to be wired in twice within mappingMongoConverter(): once into the mapping context and once into the converter itself. If this isn't done, the StackOverflowError reappears.
With the given configuration, conversions from Path to String and String to Path now work, so no custom conversions from Task to Document and vice versa are currently needed.
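For completeness: the PathWriteConverter and PathReadConverter referenced in customConversions() are not shown above. A minimal sketch of what they look like, mirroring the anonymous converters from the question (one class per file in practice):
@WritingConverter
public class PathWriteConverter implements Converter<Path, String> {
    @Override
    public String convert(@Nonnull Path path) {
        // persist the normalized absolute path as a plain string
        return path.normalize().toAbsolutePath().toString();
    }
}

@ReadingConverter
public class PathReadConverter implements Converter<String, Path> {
    @Override
    public Path convert(@Nonnull String path) {
        // turn the stored string back into a Path on read
        return Paths.get(path);
    }
}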

Related

Error integrating with Redis: DefaultSerializer requires a Serializable payload but received an object of type [reactor.core.publisher.FluxIterable]

My colleague shared a working Java seed project with me (I'm a newcomer to Java) that caches the result of a Spring Boot service in a Redis in-memory store. The service, which returned only one entity result, was based on a class of type Mono; but for my use case I need it to return a list, so I've been using a Flux type instead.
The service aspect of the code (including the use of the Flux type) was working until I attempted to cache the result to Redis running on my localhost. Now I'm getting the following error and can't seem to understand how to fix it after a full day of Googling and troubleshooting.
There was an unexpected error (type=Internal Server Error, status=500).
Cannot serialize; nested exception is org.springframework.core.serializer.support.SerializationFailedException: Failed to serialize object using DefaultSerializer; nested exception is java.lang.IllegalArgumentException: DefaultSerializer requires a Serializable payload but received an object of type [reactor.core.publisher.FluxIterable]
org.springframework.data.redis.serializer.SerializationException: Cannot serialize; nested exception is org.springframework.core.serializer.support.SerializationFailedException: Failed to serialize object using DefaultSerializer; nested exception is java.lang.IllegalArgumentException: DefaultSerializer requires a Serializable payload but received an object of type [reactor.core.publisher.FluxIterable]
at org.springframework.data.redis.serializer.JdkSerializationRedisSerializer.serialize(JdkSerializationRedisSerializer.java:96)
Suppressed: The stacktrace has been enhanced by Reactor, refer to additional information below:
Error has been observed at the following site(s):
*__checkpoint ⇢ HTTP GET "/v1/walletcharges" [ExceptionHandlingWebHandler]
Original Stack Trace:
at org.springframework.data.redis.serializer.JdkSerializationRedisSerializer.serialize(JdkSerializationRedisSerializer.java:96)
...
Caused by: org.springframework.core.serializer.support.SerializationFailedException: Failed to serialize object using DefaultSerializer; nested exception is java.lang.IllegalArgumentException: DefaultSerializer requires a Serializable payload but received an object of type [reactor.core.publisher.FluxIterable]
at org.springframework.core.serializer.support.SerializingConverter.convert(SerializingConverter.java:64)
...
Caused by: java.lang.IllegalArgumentException: DefaultSerializer requires a Serializable payload but received an object of type [reactor.core.publisher.FluxIterable]
at org.springframework.core.serializer.DefaultSerializer.serialize(DefaultSerializer.java:43)
...
I understand the error well enough to guess (though I'm not certain) that the issue comes from the two places where I create a Flux from the iterable walletCharges via Flux.fromIterable(walletCharges). But I don't know what to replace it with, because converting to a Flux was the only way I found to get rid of the compilation errors caused by the mismatch of expected and returned types.
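One idea I had (untested, and I may be wrong): since WalletCharges implements Serializable but a Flux does not, perhaps @Cacheable belongs on a method that returns the materialized list rather than the publisher. A hypothetical variant:
// Hypothetical sketch, not my actual code: cache the Serializable List
// instead of the Flux, so the JDK serializer never sees the publisher.
@Cacheable(value = "allWalletChargesCache", key = "#root.methodName")
public List<WalletCharges> getAllWalletChargesList() {
    return _repository.findAll();
}
But I don't know whether that is the right direction.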
Following is my code. My apologies for the length! I've omitted the import statements for brevity, but because I don't know for sure where the issue is coming from, I felt nervous omitting anything else that might be relevant.
Please help me if you can!
Main Application CitReconServiceApplication.java
@EnableCaching
@SpringBootApplication
public class CitReconServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(CitReconServiceApplication.class, args);
    }
}
Controller WalletChargesController.java
@Cacheable(value = "allWalletChargesCache", key = "#root.methodName")
@CrossOrigin(origins = {"https://localhost:3000", "http://localhost:3000"})
@RestController
@RequestMapping("v1/walletcharges")
public class WalletChargesController {

    private WalletChargesService _walletChargesService;

    @Autowired
    public WalletChargesController(WalletChargesService _walletChargesService) {
        this._walletChargesService = _walletChargesService;
    }

    @GetMapping
    public Flux<WalletCharges> getWalletCharges() {
        return _walletChargesService.getWalletCharges();
    }
}
Entity Model WalletCharges.java
@Data
@Entity
public class WalletCharges implements Serializable {

    @Id
    @JsonProperty(TRANSACTION_ID)
    private Integer transactionId;

    @JsonProperty(TYPE)
    private String type;

    // Other columns removed for brevity
}
Redis Repository RedisWalletChargesRepository.java
@Repository
public class RedisWalletChargesRepository implements Serializable {

    private final ReactiveRedisOperations<String, String> operations;

    public RedisWalletChargesRepository(ReactiveRedisOperations<String, String> operations) {
        this.operations = operations;
    }

    public Flux<WalletCharges> save(String allWalletCharges, List<WalletCharges> walletCharges) {
        try {
            String walletChargesString = ObjectMapperUtils.convertToString(walletCharges);
            return operations.opsForValue()
                    .set(allWalletCharges, walletChargesString)
                    .flatMapMany(__ -> Flux.fromIterable(walletCharges));
        } catch (JsonProcessingException e) {
            e.printStackTrace();
            return null;
        }
    }

    public Flux<WalletCharges> findByKey(String key) {
        return operations.opsForValue()
                .get(key)
                .flatMapMany(result -> {
                    try {
                        List<WalletCharges> walletCharges = ObjectMapperUtils.convertToObject(result);
                        return Flux.fromIterable(walletCharges);
                    } catch (JsonProcessingException e) {
                        e.printStackTrace();
                        return null;
                    }
                });
    }
}
Data Access Object WalletChargesRepository.java
@Repository
public interface WalletChargesRepository extends JpaRepository<WalletCharges, Serializable> {

    @Query(value = GET_WALLET_CHARGES_SQL_LONGFORM, nativeQuery = true)
    List<WalletCharges> findAll();
}
Service Implementation WalletChargesServiceImpl.java
@Service
public class WalletChargesServiceImpl implements WalletChargesService {

    private WalletChargesRepository _repository;
    private RedisWalletChargesRepository _redisWalletChargesRepository;

    @Autowired
    public WalletChargesServiceImpl(WalletChargesRepository _repository, RedisWalletChargesRepository _redisWalletChargesRepository) {
        this._repository = _repository;
        this._redisWalletChargesRepository = _redisWalletChargesRepository;
    }

    @Override
    public Flux<WalletCharges> getWalletCharges() {
        return _redisWalletChargesRepository
                .findByKey("allWalletCharges")
                .switchIfEmpty(Flux.defer(() -> getWalletChargesFromDBAndCache()));
    }

    @Override
    public Flux<WalletCharges> getWalletChargesFromDBAndCache() {
        System.out.println("The getWalletChargesFromDBAndCache method was invoked");
        List<WalletCharges> walletCharges = _repository.findAll();
        _redisWalletChargesRepository.save("allWalletCharges", walletCharges);
        return Flux.fromIterable(walletCharges);
    }
}
Object Mapper/Serializer ObjectMapperUtils.java
@UtilityClass
public class ObjectMapperUtils {

    private final ObjectMapper mapper = createObjectMapper();

    private ObjectMapper createObjectMapper() {
        ObjectMapper objectMapper = new ObjectMapper();
        objectMapper.setSerializationInclusion(JsonInclude.Include.NON_EMPTY);
        return objectMapper;
    }

    public <T> T convertValue(Object obj, Class<T> contentClass) {
        return mapper.convertValue(obj, contentClass);
    }

    public String convertToString(Object obj) throws JsonProcessingException {
        return mapper.writeValueAsString(obj);
    }

    public List<WalletCharges> convertToObject(String str) throws JsonProcessingException {
        return mapper.readValue(str, mapper.getTypeFactory().constructCollectionType(List.class, WalletCharges.class));
    }
}

Spring Data Couchbase use different object mapper for template and CrudRepository

I'm seeing some weird behavior using Spring Data Couchbase; maybe you can help me understand it.
Context
I'm using Spring-data-couchbase (v.3.1.9.RELEASE) with SpringBoot 2.
My application has an entity with a LocalDate field, like this:
@Getter
@Setter
class MyEntity {
    private LocalDate fieldName;
    ... // other basic/primitive fields
}
I have configured the basic converters to deal with LocalDates in the CouchbaseConfig bean.
@Configuration
@EnableCouchbaseRepositories(basePackages = {"com.example.repository.model"})
public class CouchbaseConfig extends AbstractCouchbaseConfiguration {

    @Override
    public CustomConversions customConversions() {
        List<?> converters = Arrays.asList(
                LocalDateTimeToStringConverter.INSTANCE,
                StringToLocalDateTimeConverter.INSTANCE,
                LocalDateToStringConverter.INSTANCE,
                StringToLocalDateConverter.INSTANCE);
        return new CouchbaseCustomConversions(converters);
    }

    @WritingConverter
    public enum LocalDateToStringConverter implements Converter<LocalDate, String> {
        INSTANCE;

        @Override
        public String convert(LocalDate source) {
            return source.format(DateUtils.SHORT_DATE_FORMATTER);
        }
    }

    @ReadingConverter
    public enum StringToLocalDateConverter implements Converter<String, LocalDate> {
        INSTANCE;

        @Override
        public LocalDate convert(String source) {
            return LocalDate.parse(source, DateUtils.SHORT_DATE_FORMATTER);
        }
    }
}
The save and find operations work without problems using the CrudRepository, but now I have to run a dynamic query using the CouchbaseTemplate in a custom repository.
The problem
I need to use template.findByN1QLProjection(queryWithParameters, MyProjection.class) to retrieve only the required fields from MyEntity, but findByN1QLProjection uses translationService.decodeFragment(json.toString(), entityClass), which is implemented by JacksonTranslationService like this:
public <T> T decodeFragment(String source, Class<T> target) {
    try {
        return objectMapper.readValue(source, target);
    }
    catch (IOException e) {
        throw new RuntimeException("Cannot decode ad-hoc JSON", e);
    }
}
The problem is that the ObjectMapper used by decodeFragment is a new one created inside that class: it doesn't use the converters from the CouchbaseConfig class, and since no JavaTimeModule is configured, the exception com.fasterxml.jackson.databind.exc.InvalidDefinitionException: Cannot construct instance of java.time.LocalDate is thrown.
After some time I figured out the problem and configured a different translationService in the CouchbaseConfig with a custom object mapper, which solved the error.
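For reference, the workaround looked roughly like the sketch below. This is a reconstruction, not the verified fix: the translationService() override and the setObjectMapper(...) call are what I remember from the spring-data-couchbase 3.x sources, so treat the exact names as assumptions.
@Override
@Bean
public TranslationService translationService() {
    // Assumed API: give JacksonTranslationService a mapper that can construct JSR-310 types.
    ObjectMapper objectMapper = new ObjectMapper();
    objectMapper.registerModule(new JavaTimeModule());

    JacksonTranslationService translationService = new JacksonTranslationService();
    translationService.setObjectMapper(objectMapper);
    translationService.afterPropertiesSet();
    return translationService;
}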
Question
Why are entities in Spring Data Couchbase handled differently when you use the template's find operations than when you use the CrudRepository?
Am I doing something wrong?
Thanks for your time.

Using Jackson to dynamically serialize an entity as either its ID or its full representation at runtime

We're developing a RESTful API using Java EE 7 (RESTEasy / Hibernate / Jackson).
We want the API to serialize all child entities using their IDs, by default. We're doing this mostly to maintain consistency with our deserialization strategy, where we insist on receiving an ID.
However, we also want our users to be able to choose to get an expanded view of any of our child entities, either through a custom endpoint or a query parameter (undecided). For example:
// http://localhost:8080/rest/operator/1
// =====================================
{
    "operatorId": 1,
    "organization": 34,
    "endUser": 23
}

// http://localhost:8080/rest/operator/1?expand=organization
// =====================================
{
    "operatorId": 1,
    "organization": {
        "organizationId": 34,
        "organizationName": "name"
    },
    "endUser": 23
}

// http://localhost:8080/rest/operator/1?expand=enduser
// =====================================
{
    "operatorId": 1,
    "organization": 34,
    "endUser": {
        "endUserId": 23,
        "endUserName": "other name"
    }
}

// http://localhost:8080/rest/operator/1?expand=organization,enduser
// =====================================
{
    "operatorId": 1,
    "organization": {
        "organizationId": 34,
        "organizationName": "name"
    },
    "endUser": {
        "endUserId": 23,
        "endUserName": "other name"
    }
}
Is there a way to dynamically change the behavior of Jackson to determine whether a specified AbstractEntity field is serialized in full form or as its ID? How might it be done?
Additional Info
We know of a few ways to serialize our child entities using their IDs, including:
public class Operator extends AbstractEntity {
    ...
    @JsonIdentityInfo(generator = ObjectIdGenerators.PropertyGenerator.class, property = "organizationId")
    @JsonIdentityReference(alwaysAsId = true)
    public Organization getOrganization() { ... }
    ...
}
and
public class Operator extends AbstractEntity {
    ...
    @JsonSerialize(using = AbstractEntityIdSerializer.class)
    public Organization getOrganization() { ... }
    ...
}
where AbstractEntityIdSerializer serializes the entity using its ID.
The problem is that we don't know of a way for the user to override that default behavior and revert to standard Jackson object serialization. Ideally they'd also be able to choose which child properties to serialize in full form.
It would be awesome to dynamically toggle the alwaysAsId argument of @JsonIdentityReference for any property at runtime, if that's possible, or make the equivalent change to ObjectMapper/ObjectWriter.
Update: Working(?) Solution
We haven't had a chance to fully test this yet, but I've been working on a solution that leverages overriding Jackson's AnnotationIntrospector class. It seems to be working as intended.
public class CustomAnnotationIntrospector extends JacksonAnnotationIntrospector {

    private final Set<String> expandFieldNames_;

    public CustomAnnotationIntrospector(Set<String> expandFieldNames) {
        expandFieldNames_ = expandFieldNames;
    }

    @Override
    public ObjectIdInfo findObjectReferenceInfo(Annotated ann, ObjectIdInfo objectIdInfo) {
        JsonIdentityReference ref = _findAnnotation(ann, JsonIdentityReference.class);
        if (ref != null) {
            for (String expandFieldName : expandFieldNames_) {
                String expandFieldGetterName = "get" + expandFieldName;
                String propertyName = ann.getName();
                boolean fieldNameMatches = expandFieldName.equalsIgnoreCase(propertyName);
                boolean fieldGetterNameMatches = expandFieldGetterName.equalsIgnoreCase(propertyName);
                if (fieldNameMatches || fieldGetterNameMatches) {
                    return objectIdInfo.withAlwaysAsId(false);
                }
            }
            objectIdInfo = objectIdInfo.withAlwaysAsId(ref.alwaysAsId());
        }
        return objectIdInfo;
    }
}
At serialization time, we copy our ObjectMapper (so the AnnotationIntrospector runs again) and apply CustomAnnotationIntrospector as follows:
@Context
private HttpRequest httpRequest_;

@Override
writeTo(...) {
    // Get our application's ObjectMapper.
    ContextResolver<ObjectMapper> objectMapperResolver = provider_.getContextResolver(ObjectMapper.class,
            MediaType.WILDCARD_TYPE);
    ObjectMapper objectMapper = objectMapperResolver.getContext(Object.class);

    // Get Set of fields to be expanded (pre-parsed).
    Set<String> fieldNames = (Set<String>) httpRequest_.getAttribute("ExpandFields");

    if (!fieldNames.isEmpty()) {
        // Pass expand fields to AnnotationIntrospector.
        AnnotationIntrospector expansionAnnotationIntrospector = new CustomAnnotationIntrospector(fieldNames);

        // Replace ObjectMapper with copy of ObjectMapper and apply custom AnnotationIntrospector.
        objectMapper = objectMapper.copy();
        objectMapper.setAnnotationIntrospector(expansionAnnotationIntrospector);
    }

    ObjectWriter objectWriter = objectMapper.writer();
    objectWriter.writeValue(...);
}
Any glaring flaws in this approach? It seems relatively straightforward and is fully dynamic.
The answer is Jackson's mixin feature:
You create a simple Java class that has the exact same method signature as the annotated method of the entity. You annotate that method with the modified value; the body of the method is insignificant (it would not be called):
public class OperatorExpanded {
    ...
    @JsonIdentityInfo(generator = ObjectIdGenerators.PropertyGenerator.class, property = "organizationId")
    @JsonIdentityReference(alwaysAsId = false)
    public Organization getOrganization() { return null; }
    ...
}
You tie the mixin to the entity-to-be-serialized using Jackson's module system; this can be decided at run time:
ObjectMapper mapper = new ObjectMapper();
if ("organization".equals(request.getParameter("expand"))) {
    SimpleModule simpleModule = new SimpleModule();
    simpleModule.setMixInAnnotation(Operator.class, OperatorExpanded.class);
    mapper.registerModule(simpleModule);
}
Now the mapper will take the annotations from the mixin but invoke the method of the entity.
If you are looking for a generalized solution that needs to be extended to all of your resources, you may try the following approach. I tried the solution below using Jersey and Jackson; it should also work with RESTEasy.
Basically, you need to write a custom Jackson provider that sets a special serializer for expand fields. You also need to pass the expand fields to the serializer so that you can decide how to serialize them.
@Singleton
public class ExpandFieldJacksonProvider extends JacksonJaxbJsonProvider {

    @Inject
    private Provider<ContainerRequestContext> provider;

    @Override
    protected JsonEndpointConfig _configForWriting(final ObjectMapper mapper, final Annotation[] annotations, final Class<?> defaultView) {
        final AnnotationIntrospector customIntrospector = mapper.getSerializationConfig().getAnnotationIntrospector();
        // Set the custom (user) introspector to be the primary one.
        final ObjectMapper filteringMapper = mapper.setAnnotationIntrospector(AnnotationIntrospector.pair(customIntrospector, new JacksonAnnotationIntrospector() {
            @Override
            public Object findSerializer(Annotated a) {
                // All expand fields should be annotated with '@ExpandField'.
                ExpandField expField = a.getAnnotation(ExpandField.class);
                if (expField != null) {
                    // Use a custom serializer for expand field
                    return new ExpandSerializer(expField.fieldName(), expField.idProperty());
                }
                return super.findSerializer(a);
            }
        }));
        return super._configForWriting(filteringMapper, annotations, defaultView);
    }

    @Override
    public void writeTo(final Object value, final Class<?> type, final Type genericType, final Annotation[] annotations, final MediaType mediaType, final MultivaluedMap<String, Object> httpHeaders,
            final OutputStream entityStream) throws IOException {
        // Set the expand fields to java's ThreadLocal so that it can be accessed in 'ExpandSerializer' class.
        ExpandFieldThreadLocal.set(provider.get().getUriInfo().getQueryParameters().get("expand"));
        super.writeTo(value, type, genericType, annotations, mediaType, httpHeaders, entityStream);
        // Once the serialization is done, clear ThreadLocal
        ExpandFieldThreadLocal.remove();
    }
}
ExpandField.java
@Retention(RUNTIME)
public @interface ExpandField {
    // name of expand field
    String fieldName();

    // name of Id property in expand field. For eg: organizationId
    String idProperty();
}
ExpandFieldThreadLocal.java
public class ExpandFieldThreadLocal {

    private static final ThreadLocal<List<String>> _threadLocal = new ThreadLocal<>();

    public static List<String> get() {
        return _threadLocal.get();
    }

    public static void set(List<String> expandFields) {
        _threadLocal.set(expandFields);
    }

    public static void remove() {
        _threadLocal.remove();
    }
}
ExpandFieldSerializer.java
public static class ExpandSerializer extends JsonSerializer<Object> {

    private String fieldName;
    private String idProperty;

    public ExpandSerializer(String fieldName, String idProperty) {
        this.fieldName = fieldName;
        this.idProperty = idProperty;
    }

    @Override
    public void serialize(Object value, JsonGenerator gen, SerializerProvider serializers) throws IOException, JsonProcessingException {
        // Get expand fields in current request which is set in custom jackson provider.
        List<String> expandFields = ExpandFieldThreadLocal.get();
        if (expandFields == null || !expandFields.contains(fieldName)) {
            try {
                // If 'expand' is not present in query param OR if the 'expand' field does not contain this field, write only id.
                serializers.defaultSerializeValue(value.getClass().getMethod("get" + StringUtils.capitalize(idProperty)).invoke(value), gen);
            } catch (Exception e) {
                // Handle Exception here
            }
        } else {
            serializers.defaultSerializeValue(value, gen);
        }
    }
}
Operator.java
public class Operator extends AbstractEntity {
    ...
    @ExpandField(fieldName = "organization", idProperty = "organizationId")
    private Organization organization;
    ...
}
The final step is to register the new ExpandFieldJacksonProvider. In Jersey, we register it through an instance of javax.ws.rs.core.Application as shown below; I hope there is something similar in RESTEasy. By default, most JAX-RS libraries load the default JacksonJaxbJsonProvider through auto-discovery, so you have to make sure auto-discovery is disabled for Jackson and the new ExpandFieldJacksonProvider is registered.
public class JaxRsApplication extends Application {

    @Override
    public Set<Class<?>> getClasses() {
        Set<Class<?>> clazzes = new HashSet<>();
        clazzes.add(ExpandFieldJacksonProvider.class);
        return clazzes;
    }
}

Spring Mongodb - Cannot write custom Converter for java.time.Period

I'm using Spring Cloud Brixton.SR4 with Spring Data MongoDB.
I have a very simple entity:
@Document
public class Foo {
    private Period period;
    // getter & setter
}
Because java.time.Period is not supported by the JSR-310 converters, I'm creating custom ones:
class Converters {

    @Component
    @WritingConverter
    static class PeriodToStringConverter implements Converter<Period, String> {
        @Override
        public String convert(Period period) {
            return period.toString();
        }
    }

    @ReadingConverter
    @Component
    static class StringToPeriodConverter implements Converter<String, Period> {
        @Override
        public Period convert(String s) {
            return Period.parse(s);
        }
    }
}
Now I register them in my configuration class extending AbstractMongoConfiguration:
@Bean
@Override
public MappingMongoConverter mappingMongoConverter() throws Exception {
    DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
    MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext());
    final CustomConversions conversions = customConversions();
    log.info("hasCustomWriteTarget(Period.class): " + conversions.hasCustomWriteTarget(Period.class));
    log.info("hasCustomWriteTarget(Period.class, String.class): " + conversions.hasCustomWriteTarget(Period.class, String.class));
    log.info("hasCustomReadTarget(String.class, Period.class): " + conversions.hasCustomReadTarget(String.class, Period.class));
    converter.setCustomConversions(conversions);
    converter.afterPropertiesSet(); // probably not needed, trying out of despair
    return converter;
}

@Bean
@Override
public CustomConversions customConversions() {
    List<Converter> converters = new ArrayList<>();
    converters.add(new Converters.PeriodToStringConverter());
    converters.add(new Converters.StringToPeriodConverter());
    return new CustomConversions(converters);
}
When I start my app I see in the logs:
hasCustomWriteTarget(Period.class): true
hasCustomWriteTarget(Period.class, String.class): true
hasCustomReadTarget(String.class, Period.class): true
Now I create a new Foo and save it to my repository:
Foo foo = new Foo();
foo.setPeriod(Period.of(2, 0, 1));
fooRepository.save(foo);
Now the weirdness happens:
In MongoDB I see:
{
    "_id": ObjectId("xxxx"),
    "period": {
        "years": 0,
        "months": 2,
        "days": 1
    }
}
So that's already wrong: it should have been saved as a String.
When I try to read the object in Java I get:
org.springframework.data.mapping.model.MappingException: No property null found on entity class java.time.Period to bind constructor parameter to!
I debugged the code in MappingMongoConverter:
if (conversions.hasCustomReadTarget(dbo.getClass(), rawType)) {
    return conversionService.convert(dbo, rawType);
}
Because my object was not stored as a String, the dbo variable is actually a BasicDbObject, and therefore I don't have a converter for it.
Any idea why my write converter is not being used to persist the Period?
I have jackson-datatype-jdk8 on my classpath, could it be the issue? Would Jackson be involved at all in persisting to MongoDB?
EDIT
It seems to be a registration issue. When I debug the code, the CustomConversions object used in MappingMongoConverter is different from the one I create, and it doesn't contain the custom converters I registered.
OK it was extremely stupid...
I was also creating my own MongoTemplate:
@Bean
public MongoTemplate mongoTemplate() throws Exception {
    return new MongoTemplate(mongoDbFactory());
}
Which basically ignores my custom converter. To fix it:
@Bean
public MongoTemplate mongoTemplate() throws Exception {
    return new MongoTemplate(mongoDbFactory(), mappingMongoConverter());
}

Can you configure Spring controller specific Jackson deserialization?

I need to add a custom Jackson deserializer for java.lang.String to my Spring 4.1.x MVC application. However all answers (such as this) refer to configuring the ObjectMapper for the complete web application and the changes will apply to all Strings across all @RequestBody in all controllers.
I only want to apply the custom deserialization to @RequestBody arguments used within particular controllers. Note that I don't have the option of using @JsonDeserialize annotations for the specific String fields.
Can you configure custom deserialization for specific controllers only?
To have different deserialization configurations you must have different ObjectMapper instances, but out of the box Spring uses MappingJackson2HttpMessageConverter, which is designed to use only one instance.
I see at least two options here:
Move away from MessageConverter to an ArgumentResolver
Create a @CustomRequestBody annotation and an argument resolver:
public class CustomRequestBodyArgumentResolver implements HandlerMethodArgumentResolver {

    private final ObjectMapperResolver objectMapperResolver;

    public CustomRequestBodyArgumentResolver(ObjectMapperResolver objectMapperResolver) {
        this.objectMapperResolver = objectMapperResolver;
    }

    @Override
    public boolean supportsParameter(MethodParameter methodParameter) {
        return methodParameter.getParameterAnnotation(CustomRequestBody.class) != null;
    }

    @Override
    public Object resolveArgument(MethodParameter methodParameter, ModelAndViewContainer mavContainer, NativeWebRequest webRequest, WebDataBinderFactory binderFactory) throws Exception {
        if (this.supportsParameter(methodParameter)) {
            ObjectMapper objectMapper = objectMapperResolver.getObjectMapper();
            HttpServletRequest request = (HttpServletRequest) webRequest.getNativeRequest();
            return objectMapper.readValue(request.getInputStream(), methodParameter.getParameterType());
        } else {
            return WebArgumentResolver.UNRESOLVED;
        }
    }
}
@CustomRequestBody annotation:
@Target(ElementType.PARAMETER)
@Retention(RetentionPolicy.RUNTIME)
@Documented
public @interface CustomRequestBody {
    boolean required() default true;
}
ObjectMapperResolver is an interface we will be using to resolve the actual ObjectMapper instance to use; I will discuss it below. Of course, if you have only one use case where you need custom mapping, you can simply initialize your mapper here.
You can add custom argument resolver with this configuration:
@Configuration
public class WebConfiguration extends WebMvcConfigurerAdapter {

    @Bean
    public CustomRequestBodyArgumentResolver customBodyArgumentResolver(ObjectMapperResolver objectMapperResolver) {
        return new CustomRequestBodyArgumentResolver(objectMapperResolver);
    }

    @Override
    public void addArgumentResolvers(List<HandlerMethodArgumentResolver> argumentResolvers) {
        argumentResolvers.add(customBodyArgumentResolver(objectMapperResolver()));
    }
}
Note: Do not combine @CustomRequestBody with @RequestBody, it will be ignored.
Wrap ObjectMapper in a proxy that hides multiple instances
MappingJackson2HttpMessageConverter is designed to work with only one instance of ObjectMapper. We can make that instance a proxy delegate. This will make working with multiple mappers transparent.
First of all we need an interceptor that will translate all method invocations to an underlying object.
public abstract class ObjectMapperInterceptor implements MethodInterceptor {

    @Override
    public Object invoke(MethodInvocation invocation) throws Throwable {
        return ReflectionUtils.invokeMethod(invocation.getMethod(), getObject(), invocation.getArguments());
    }

    protected abstract ObjectMapper getObject();
}
Now our ObjectMapper proxy bean will look like this:
@Bean
public ObjectMapper objectMapper(ObjectMapperResolver objectMapperResolver) {
    ProxyFactory factory = new ProxyFactory();
    factory.setTargetClass(ObjectMapper.class);
    factory.addAdvice(new ObjectMapperInterceptor() {
        @Override
        protected ObjectMapper getObject() {
            return objectMapperResolver.getObjectMapper();
        }
    });
    return (ObjectMapper) factory.getProxy();
}
Note: I had class loading issues with this proxy on Wildfly, due to its modular class loading, so I had to extend ObjectMapper (without changing anything) just so I could use a class from my module.
It is all tied together using this configuration:
@Configuration
public class WebConfiguration extends WebMvcConfigurerAdapter {

    @Bean
    public MappingJackson2HttpMessageConverter jackson2HttpMessageConverter() {
        return new MappingJackson2HttpMessageConverter(objectMapper(objectMapperResolver()));
    }

    @Override
    public void configureMessageConverters(List<HttpMessageConverter<?>> converters) {
        converters.add(jackson2HttpMessageConverter());
    }
}
ObjectMapperResolver implementations
The final piece is the logic that determines which mapper should be used; it is contained in the ObjectMapperResolver interface, which has only one lookup method:
public interface ObjectMapperResolver {
    ObjectMapper getObjectMapper();
}
If you do not have a lot of use cases with custom mappers, you can simply make a map of preconfigured instances with RequestMatchers as keys. Something like this:
public class RequestMatcherObjectMapperResolver implements ObjectMapperResolver {

    private final ObjectMapper defaultMapper;
    private final Map<RequestMatcher, ObjectMapper> mapping = new HashMap<>();

    public RequestMatcherObjectMapperResolver(ObjectMapper defaultMapper, Map<RequestMatcher, ObjectMapper> mapping) {
        this.defaultMapper = defaultMapper;
        this.mapping.putAll(mapping);
    }

    public RequestMatcherObjectMapperResolver(ObjectMapper defaultMapper) {
        this.defaultMapper = defaultMapper;
    }

    @Override
    public ObjectMapper getObjectMapper() {
        ServletRequestAttributes sra = (ServletRequestAttributes) RequestContextHolder.getRequestAttributes();
        HttpServletRequest request = sra.getRequest();
        for (Map.Entry<RequestMatcher, ObjectMapper> entry : mapping.entrySet()) {
            if (entry.getKey().matches(request)) {
                return entry.getValue();
            }
        }
        return defaultMapper;
    }
}
You can also use a request scoped ObjectMapper and then configure it on a per-request basis. Use this configuration:
@Bean
public ObjectMapperResolver objectMapperResolver() {
    return new ObjectMapperResolver() {
        @Override
        public ObjectMapper getObjectMapper() {
            return requestScopedObjectMapper();
        }
    };
}

@Bean
@Scope(value = WebApplicationContext.SCOPE_REQUEST, proxyMode = ScopedProxyMode.TARGET_CLASS)
public ObjectMapper requestScopedObjectMapper() {
    return new ObjectMapper();
}
This is best suited for custom response serialization, since you can configure it right in the controller method. For custom deserialization you must also use a Filter/HandlerInterceptor/ControllerAdvice to configure the active mapper for the current request before the controller method is triggered.
You can create an interface similar to ObjectMapperResolver:
public interface ObjectMapperConfigurer {
    void configureObjectMapper(ObjectMapper objectMapper);
}
Then make a map of these instances with RequestMatchers as keys and put it in a Filter/HandlerInterceptor/ControllerAdvice similar to RequestMatcherObjectMapperResolver, as sketched below.
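A rough sketch of that last step (illustrative only; the interceptor class and its wiring are my assumptions, not tested code):
public class ObjectMapperConfiguringInterceptor extends HandlerInterceptorAdapter {

    private final Map<RequestMatcher, ObjectMapperConfigurer> mapping;
    private final ObjectMapper requestScopedObjectMapper;

    public ObjectMapperConfiguringInterceptor(Map<RequestMatcher, ObjectMapperConfigurer> mapping,
                                              ObjectMapper requestScopedObjectMapper) {
        this.mapping = mapping;
        this.requestScopedObjectMapper = requestScopedObjectMapper;
    }

    @Override
    public boolean preHandle(HttpServletRequest request, HttpServletResponse response, Object handler) {
        // Configure the request-scoped mapper before the controller method runs.
        for (Map.Entry<RequestMatcher, ObjectMapperConfigurer> entry : mapping.entrySet()) {
            if (entry.getKey().matches(request)) {
                entry.getValue().configureObjectMapper(requestScopedObjectMapper);
                break;
            }
        }
        return true;
    }
}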
P.S. If you want to explore dynamic ObjectMapper configuration a bit further I can suggest my old answer here. It describes how you can make dynamic @JsonFilters at run time. It also contains my older approach with extended MappingJackson2HttpMessageConverter that I suggested in comments.
Probably this would help, but it ain't pretty. It would require AOP. Also I did not validate it.
Create a @CustomAnnotation.
Update your controller:
void someEndpoint(@RequestBody @CustomAnnotation SomeEntity someEntity);
Then implement the AOP part:
@Around("execution(* *(@CustomAnnotation (*)))")
public void advice(ProceedingJoinPoint proceedingJoinPoint) {
    // Here you would add a custom ObjectMapper, I don't know another way around it
    HttpServletRequest request = ((ServletRequestAttributes) RequestContextHolder.currentRequestAttributes()).getRequest();
    String body = request.getReader().lines().collect(Collectors.joining(System.lineSeparator()));
    SomeEntity someEntity = /* deserialize */;
    // This could be cleaner, cause the method can accept multiple parameters
    proceedingJoinPoint.proceed(new Object[] {someEntity});
}
You can create a custom deserializer for your String data.
Custom Deserializer
public class CustomStringDeserializer extends JsonDeserializer<String> {

    @Override
    public String deserialize(JsonParser p, DeserializationContext ctxt) throws IOException {
        String str = p.getText();
        // process and return the String
        return str;
    }
}
Now suppose the String is a field of a POJO; use the @JsonDeserialize annotation on the variable:
public class SamplePOJO {

    @JsonDeserialize(using = CustomStringDeserializer.class)
    private String str;

    // getter and setter
}
Now when the String arrives in a request body it will be deserialized the way you defined in CustomStringDeserializer.
Hope it helps.
You could try Message Converters.
They have context about the HTTP input request (for example, docs see here, JSON). How to customize them you could see here.
The idea is that you could check the HttpInputMessage against the special URIs used in your controllers and convert the string as you want.
You could create a special annotation for this, scan packages, and do it automatically.
Note
Likely, you don't need multiple ObjectMapper implementations. You can use the simple default ObjectMapper to parse the String and then convert the string as you wish.
In that case you would create the RequestBody only once.
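To make the idea concrete, here is a rough sketch (the class name and the /special URI check are assumptions for illustration, not a tested implementation):
public class UriAwareJacksonConverter extends MappingJackson2HttpMessageConverter {

    private final ObjectMapper customMapper;

    public UriAwareJacksonConverter(ObjectMapper defaultMapper, ObjectMapper customMapper) {
        super(defaultMapper);
        this.customMapper = customMapper;
    }

    @Override
    public Object read(Type type, Class<?> contextClass, HttpInputMessage inputMessage)
            throws IOException, HttpMessageNotReadableException {
        // Pick the specially configured mapper only for matching controller URIs.
        ServletRequestAttributes attrs =
                (ServletRequestAttributes) RequestContextHolder.currentRequestAttributes();
        if (attrs.getRequest().getRequestURI().startsWith("/special")) {
            return customMapper.readValue(inputMessage.getBody(), getJavaType(type, contextClass));
        }
        return super.read(type, contextClass, inputMessage);
    }
}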
You can define a POJO for each different type of request parameter that you would like to deserialize. Then, the following code will pull in the values from the JSON into the object that you define, assuming that the names of the fields in your POJO match with the names of the field in the JSON request.
ObjectMapper mapper = new ObjectMapper();
YourPOJO requestParams = null;
try {
    requestParams = mapper.readValue(jsonBody, YourPOJO.class);
} catch (IOException e) {
    throw new IOException(e);
}
