I have an entity like this:
public class MyEntity {
    public Map map1;
    public Map map2;
}
Using XStream, I want to use the default map converter for map1 and my own map converter for map2. How can I do this?
Add the @XStreamConverter(MyOwnConverter.class) annotation on the field, as sketched below.
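A minimal sketch of the annotated entity, assuming MyOwnConverter is an XStream Converter implementation (the class and field names follow the question):

import java.util.Map;
import com.thoughtworks.xstream.annotations.XStreamConverter;

public class MyEntity {

    // map1 is serialized with XStream's default map converter
    public Map map1;

    // map2 is serialized with the custom converter
    @XStreamConverter(MyOwnConverter.class)
    public Map map2;
}

For the annotation to be picked up, annotation processing has to be enabled, e.g. with xstream.processAnnotations(MyEntity.class) or xstream.autodetectAnnotations(true).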
I would like MapStruct to map every property of my Object, except for one particular one for which I would like to provide a custom mapping.
So far, I implemented the whole mapper myself, but every time I add a new property to my Entity, I forget to update the mapper.
@Mapper(componentModel = "cdi")
public interface MyMapper {

    MyMapper INSTANCE = Mappers.getMapper(MyMapper.class);

    default MyDto toDTO(MyEntity myEntity) {
        MyDto dto = new MyDto();
        dto.field1 = myEntity.field1;
        // [...]
        dto.fieldN = myEntity.fieldN;
        // Custom mapping here resulting in a Map<> map
        dto.fieldRequiringCustomMapping = map;
        return dto;
    }
}
Is there a way to outsource the mapping for my field fieldRequiringCustomMapping and tell MapStruct to map all the other ones like usual? 🤔
MapStruct has a way to use custom mapping between some fields. So in your case you can do something like:
@Mapper(componentModel = "cdi")
public interface MyMapper {

    @Mapping(target = "fieldRequiringCustomMapping", qualifiedByName = "customFieldMapping")
    MyDto toDTO(MyEntity myEntity);

    // The @Named (org.mapstruct.Named) annotation is only needed if the return and target types are not unique
    @Named("customFieldMapping")
    default FieldRequiringCustomMapping customMapping(SourceForFieldRequiringCustomMapping source) {
        // Custom mapping here resulting in a Map<> map
        return map;
    }
}
I do not know what your fieldRequiringCustomMapping needs to be mapped from; the example assumes that you have such a field in MyEntity as well. If that is not the case, you'll need to add a source to the @Mapping, as sketched below.
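For example, if the value comes from a differently named property on the entity (the property name sourceForFieldRequiringCustomMapping here is hypothetical):

@Mapping(target = "fieldRequiringCustomMapping", source = "sourceForFieldRequiringCustomMapping", qualifiedByName = "customFieldMapping")
MyDto toDTO(MyEntity myEntity);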
Side note: when using a non-default componentModel, in your case cdi, it is not recommended to use the Mappers factory. It will not perform the injection of other mappers in case you use them in your mapper.
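With componentModel = "cdi" the generated implementation is itself a CDI bean, so a minimal sketch of obtaining the mapper would be injection instead of the factory (the surrounding MyService class is hypothetical):

import javax.inject.Inject;

public class MyService {

    @Inject
    MyMapper myMapper; // provided by CDI, no Mappers.getMapper(...) needed

    public MyDto convert(MyEntity entity) {
        return myMapper.toDTO(entity);
    }
}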
I'm using a Converter class to store my entity field (of type Map<String, Object>) in a MySQL DB as JSON text:
@Entity
@Table(name = "some_table")
public class SomeEntity {
    ...
    @Column(name = "meta_data", nullable = false)
    @Convert(converter = MetaDataConverter.class)
    Map<String, Object> metaData = null;
    ...
}
Here is the MetaDataConverter class:
@Converter
public class MetaDataConverter implements AttributeConverter<Map<String, Object>, String> {

    private final ObjectMapper objectMapper = new ObjectMapper();

    @Override
    public String convertToDatabaseColumn(Map<String, Object> metadata) {
        try {
            return objectMapper.writeValueAsString(metadata);
        } catch (Exception e) {
            ...
        }
    }

    @Override
    public Map<String, Object> convertToEntityAttribute(String dbData) {
        try {
            return objectMapper.readValue(dbData, Map.class);
        } catch (Exception e) {
            ...
        }
    }
}
Here is my service class:
@Service
@RequiredArgsConstructor
@Transactional
public class MetaDataService {

    private final JpaRepository<MetaDataEntity, String> metaDataRepository;

    public MetaDataEntity updateOnlyMetadata(String someParameter, Map<String, Object> newMetaData) {
        MetaDataEntity metaDataEntity = metaDataRepository.findBySomeParameter(someParameter);
        metaDataEntity.setMetaData(newMetaData);
        return metaDataRepository.save(metaDataEntity);
    }
}
It works fine for creation, but it doesn't work when updating the converted field. If I try to update only the metaData field, the corresponding column in the database is not updated. However, metaData is updated when it is changed together with other entity fields.
I've already seen similar questions (JPA not updating column with Converter class and Data lost because of JPA AttributeConverter?), but I have not found the answer. Is there something like a standard or best practice for such a case?
FYI: for the CRUD operations I'm using the Spring Data JpaRepository interface and its methods.
Implement proper equals and hashCode methods on your map value objects. JPA can then use those methods to identify that your Map is dirty.
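A minimal sketch of what that could look like, assuming the map values are instances of a hypothetical MetaValue class:

import java.util.Objects;

public class MetaValue {

    private String name;
    private int count;

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof MetaValue)) return false;
        MetaValue other = (MetaValue) o;
        return count == other.count && Objects.equals(name, other.name);
    }

    @Override
    public int hashCode() {
        return Objects.hash(name, count);
    }
}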
I had a similar problem. I figured out that Hibernate uses the equals method to determine whether an attribute is dirty and then performs an update.
You have two choices: implement the equals method correctly for the entity, including the converted attribute, or use HashMap instead of Map as the attribute type. HashMap already implements equals, and Hibernate will use it.
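A sketch of the second option applied to the entity from the question; only the declared field type changes (whether a converter declared for Map<String, Object> is still applied to a HashMap attribute depends on the JPA provider):

@Column(name = "meta_data", nullable = false)
@Convert(converter = MetaDataConverter.class)
HashMap<String, Object> metaData = null;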
I have a class whose only field is a HashMap.
Is it possible to define the @Id of the class to be one of the keys of the HashMap (which always exists there)?
Thanks!
If you have a class that contains only a HashMap, don't define a class, because that does not make any sense; instead, convert your query result directly into a Map, like:
Map<String, Object> dbCursor = mongoTemplate.getCollection("articles").find(query.getQueryObject(), Map.class).first();
I suggest you use a class to define your object; maybe you can use something like
public class Foo {

    @Id
    private String id;

    private Map<String, Object> data;

    private Map<String, Object> metadata;
}
in order to maintain flexibility.
Can someone suggest how to inject all dynamic keys and values from a property file and pass them as a Map to the DBConstants class using setter injection with a collection?
Keys are not known in advance and can vary.
// Example Property File that stores all db related details
// db.properties
db.username.admin=root
db.password.admin=password12
db.username.user=admin
db.password.user=password13
DBConstants contains the map dbConstants, into which all keys and values need to be injected.
Please provide a bean definition that injects all keys and values into the Map dbConstants.
public class DBConstants {

    private Map<String, String> dbConstants;

    public Map<String, String> getDbConstants() {
        return dbConstants;
    }

    public void setDbConstants(Map<String, String> dbConstants) {
        this.dbConstants = dbConstants;
    }
}
You can create a PropertiesFactoryBean with your properties file and then inject it with the @Resource annotation where you want to use it as a map.
@Bean(name = "myProperties")
public static PropertiesFactoryBean mapper() {
    PropertiesFactoryBean bean = new PropertiesFactoryBean();
    bean.setLocation(new ClassPathResource("prop_file_name.properties"));
    return bean;
}
Usage:
@Resource(name = "myProperties")
private Map<String, String> myProperties;
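With the db.properties file from the question, myProperties would then contain entries such as "db.username.admin" -> "root" and "db.password.user" -> "password13".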
You can use @Value.
Properties file:
dbConstants={key1:'value1',key2:'value2'}
Java code:
#Value("#{${dbConstants}}")
private Map<String,String> dbConstants;
You have to give spaces, like this:
hash.key = {indoor: 'reading', outdoor: 'fishing'}
Then read the map as I mentioned, like below:
#Value("#{${hash.key}}")
private Map<String, String> hobbies;
I'm using Dozer to map from one object to another. I know Dozer can do recursive mapping, but maybe I'm putting too much pressure on Dozer :p I want to map from class A to class B:
class A {
    private Map<String, List<ObjectA>> myMap;
    // getters and setters for myMap
}
class B {
    private Map<String, List<ObjectB>> myMap;
    // getters and setters for myMap
}
When I map from A to B I get an instance of B, but inside the Map I get a List of ObjectA. To be clear, I get an instance of this (imaginary) object:
class B {
    Map<String, List<ObjectA>> myMap;
}
How can I make Dozer perform this mapping correctly?
Note: ObjectA and ObjectB have the same properties (an int and a String).
You could specify hints in the XML mapping:
<mapping>
    <class-a>A</class-a>
    <class-b>B</class-b>
    <field>
        <a>myMap</a>
        <b>myMap</b>
        <a-hint>ObjectA</a-hint>
        <b-hint>ObjectB</b-hint>
    </field>
</mapping>
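If you prefer configuring the mapping in code, a rough sketch using Dozer's programmatic API (assuming Dozer 5.x's BeanMappingBuilder; the hintA/hintB options are meant to mirror the XML hints above):

import org.dozer.DozerBeanMapper;
import org.dozer.loader.api.BeanMappingBuilder;
import static org.dozer.loader.api.FieldsMappingOptions.hintA;
import static org.dozer.loader.api.FieldsMappingOptions.hintB;

DozerBeanMapper mapper = new DozerBeanMapper();
mapper.addMapping(new BeanMappingBuilder() {
    @Override
    protected void configure() {
        // Map myMap to myMap, hinting the element types of the nested lists
        mapping(A.class, B.class)
            .fields("myMap", "myMap", hintA(ObjectA.class), hintB(ObjectB.class));
    }
});
B b = mapper.map(a, B.class);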