I was given access to a Postgres table containing a JSONB column among many "standard" ones, and I'm creating an entity class for it without resorting to any ORM framework such as spring-data or hibernate. To represent that column I created the necessary POJOs, e.g.
@Entity
public class MyClass {
    @Id
    @Column(nullable = false)
    private Long id;

    @Convert(converter = SomeConverter.class)
    @Column(columnDefinition = "jsonb")
    private Data data; // my custom POJO

    // ... many other fields
}
Then I created a simple unit test in which I use Apache DBUtils to convert the query result set to a MyClass instance:
PGSimpleDataSource ds = // ...
ResultSetHandler<List<MyClass>> h = new BeanListHandler<>(
        MyClass.class, new BasicRowProcessor(new GenerousBeanProcessor()));
QueryRunner run = new QueryRunner(ds);
run.query("select * from mytable", h);
which results in the following error: "Cannot set data: incompatible types, cannot convert org.postgresql.util.PGobject", suggesting that it cannot handle the conversion between the JSONB column and MyClass. Is there a good way to tackle this? I managed to get around it in a "not so good way" by implementing my own BeanHandler:
public class MyClassHandler extends BeanHandler<MyClass> {

    public MyClassHandler(Class<? extends MyClass> type, RowProcessor convert) {
        super(type, convert);
    }

    @Override
    public MyClass handle(ResultSet rs) throws SQLException {
        MyClass myclass = new MyClass();
        myclass.setId(rs.getLong("id"));
        // ...
        try {
            myclass.setData(new ObjectMapper()
                    .readValue(rs.getObject("data").toString(), Data.class));
        } catch (IOException e) {
            throw new SQLException("Cannot deserialize data column", e);
        }
        return myclass;
    }
}
This works, but it makes DBUtils useless, since I end up doing all the work myself. That wouldn't be the case if I could, for example, invoke super.handle(rs) and handle only the data field as I did there, but I found no way to do it.
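A direction that might give exactly that (an untested sketch on my side: it assumes the driver hands jsonb columns back as PGobject, which the error message suggests, and that Jackson can bind the payload to the target property type) is to keep the stock BeanListHandler and instead subclass GenerousBeanProcessor, overriding processColumn so only jsonb values get special treatment:

```java
import java.io.IOException;
import java.sql.ResultSet;
import java.sql.SQLException;

import org.apache.commons.dbutils.GenerousBeanProcessor;
import org.postgresql.util.PGobject;

import com.fasterxml.jackson.databind.ObjectMapper;

// Untested sketch: delegate every column to the default logic except
// jsonb values, which arrive as PGobject and get deserialized by Jackson.
public class JsonbAwareBeanProcessor extends GenerousBeanProcessor {

    private final ObjectMapper mapper = new ObjectMapper();

    @Override
    protected Object processColumn(ResultSet rs, int index, Class<?> propType)
            throws SQLException {
        Object value = rs.getObject(index);
        if (value instanceof PGobject && "jsonb".equals(((PGobject) value).getType())) {
            try {
                return mapper.readValue(((PGobject) value).getValue(), propType);
            } catch (IOException e) {
                throw new SQLException("Cannot deserialize jsonb column " + index, e);
            }
        }
        return super.processColumn(rs, index, propType);
    }
}
```

It would then plug into the original setup unchanged: new BeanListHandler<>(MyClass.class, new BasicRowProcessor(new JsonbAwareBeanProcessor())).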
I have two sets, Set<Foo> fooSet and Set<BarDto> barSet. Foo is the entity object and BarDto is the DTO which I want to use for sending data to the UI. I want to copy all the properties from fooSet into barSet.
public class Foo {
    @Id
    @Column(name = "Category_Id")
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long categoryId;

    @OneToMany(mappedBy = "productCategory")
    private Set<ProductEntity> products;
}

public class BarDto {
    private Long categoryId;
    private Set<ProductDto> products;
}
I am trying to convert the entity into a DTO object like this:
public BarDto mapDomainToDto(Foo domain) {
    BarDto barDto = new BarDto();
    barDto.setCategoryId(domain.getCategoryId());
    // trying to do something like:
    barDto.setProducts(domain.getProducts());
    return barDto;
}
Is there any way to achieve this in Java 8?
Java 8 itself doesn't provide such a mapping feature, and with pure Java all that is left is manual calls of getters and setters, as you already do:
BarDto barDto = new BarDto();
barDto.setCategoryId(domain.getCategoryId());
barDto.setProducts(domain.getProducts());
...
This is not bad in itself as long as the number of such object mappings and parameters is low. For more complex object hierarchies I recommend MapStruct. ModelMapper is an alternative (IMHO, MapStruct is way easier to configure and use, though both do the same job).
@Mapper
public interface FooBarMapper {

    FooBarMapper INSTANCE = Mappers.getMapper(FooBarMapper.class);

    BarDto fooToBarDto(Foo domain);
}
BarDto barDto = FooBarMapper.INSTANCE.fooToBarDto(domain);
Back to Java 8: I bet you were referring to the Stream API, Optional, or something related to the functional paradigm. Again, there is no such feature in this or any newer version of Java. However, it might help you with mapping collections of objects using the library above:
FooBarMapper mapper = FooBarMapper.INSTANCE;
List<Foo> fooList = ...
List<BarDto> barDtoList = fooList.stream()
.map(mapper::fooToBarDto)
.collect(Collectors.toList());
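For collections, the Java 8 Stream API keeps the manual mapping short even without a library. A self-contained sketch (Foo and BarDto pared down to the categoryId field for illustration):

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import java.util.stream.Collectors;

// Pared-down stand-ins for the real entity and DTO (illustrative only).
class Foo {
    private final Long categoryId;
    Foo(Long categoryId) { this.categoryId = categoryId; }
    Long getCategoryId() { return categoryId; }
}

class BarDto {
    private Long categoryId;
    void setCategoryId(Long categoryId) { this.categoryId = categoryId; }
    Long getCategoryId() { return categoryId; }
}

public class SetMappingExample {

    // The same manual field-by-field mapping, factored out so it can be
    // used as a method reference in the stream pipeline.
    static BarDto mapDomainToDto(Foo domain) {
        BarDto barDto = new BarDto();
        barDto.setCategoryId(domain.getCategoryId());
        return barDto;
    }

    public static void main(String[] args) {
        Set<Foo> fooSet = new HashSet<>(Arrays.asList(new Foo(1L), new Foo(2L)));
        Set<BarDto> barSet = fooSet.stream()
                .map(SetMappingExample::mapDomainToDto)
                .collect(Collectors.toSet());
        System.out.println(barSet.size()); // 2
    }
}
```

The same pipeline works for Set and List alike; only the collector changes.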
Usually we write the model mapper ourselves, or you can use a library like
http://modelmapper.org/getting-started/
I am pretty new to Spring Data JPA and ORM in general. I have the following architectural doubt.
Let's consider this entity class:
@Entity // This tells Hibernate to make a table out of this class
public class Order {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private int id;

    @Column(name = "name")
    private String fullName;

    private String address;
    private String product;

    @Column(name = "order_date_time")
    private String orderDate;

    private Double quantity;

    // getters, setters
}
This class is mapped to my order database table.
In my application the data comes from an Excel document that I parse via Apache POI and then have to persist to the database.
My doubt is: can I directly use this entity class to map an Excel row using Apache POI in order to persist the Excel rows as order table records? Or is it better to use another DTO class to read the rows from Excel and then use this DTO to set the field values of my entity class?
And can an entity class contain a constructor with fields?
Can I directly use this entity class to map an Excel row using Apache POI in order to persist the Excel rows as order table records?
Yes you can.
Or is it better to use another DTO class to read the rows from Excel and then use this DTO to set the field values of my entity class?
It's certainly common to have a DTO layer between the two, but it's not required, so it's up to you.
Can an entity class contain a constructor with fields?
Yes, but at least Hibernate wants a non-private default constructor as well, so remember to create a protected Order() {} (or any visibility modifier besides private) in addition to your parameterized constructor.
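A minimal sketch of that shape (plain Java with the JPA annotations and other fields omitted so it stays dependency-free; the reflective call in main roughly mimics what a provider does when loading a row):

```java
public class Order {
    private String fullName;

    // JPA providers such as Hibernate instantiate entities reflectively,
    // so a no-arg constructor with any visibility except private must
    // exist alongside the parameterized one.
    protected Order() {
    }

    public Order(String fullName) {
        this.fullName = fullName;
    }

    public String getFullName() {
        return fullName;
    }

    public static void main(String[] args) throws Exception {
        Order order = new Order("John Smith");
        System.out.println(order.getFullName()); // John Smith

        // Roughly what a provider does behind the scenes:
        Order blank = Order.class.getDeclaredConstructor().newInstance();
        System.out.println(blank.getFullName()); // null
    }
}
```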
I'm not a heavy user of Apache POI, but I do know it's used to manipulate MS Office files.
So here are my two cents: in your use case you can just read the file and map it directly to the entity class, since it doesn't expose an API to the external world.
However, if you were building a REST/SOAP API, I'd recommend putting a DTO in between so you don't mistakenly expose things that shouldn't be exposed.
From an architectural point of view it is better to have a DTO class and encapsulate some logic there:
class CsvOrder {
    private String fullName;
    private String address;

    public CsvOrder(String[] record) {
        fullName = get(record, FULLNAME_INDEX);
        address = get(record, ADDRESS_INDEX);
    }

    public Order toOrder() {
        Order result = new Order();
        result.setFullName(fullName);
        return result;
    }
}
public static <T> T get(T[] arr, int index) {
    final T notFound = null;
    return index < size(arr) ? arr[index] : notFound;
}

public static <T> int size(T[] array) {
    return array == null ? 0 : array.length;
}
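These helpers can be exercised in isolation; a self-contained sketch (same code plus a tiny main; note that negative indices are still not guarded):

```java
public class SafeArrayAccess {

    // Null-safe, bounds-safe array access: returns null instead of throwing
    // when the record is shorter than expected or missing entirely.
    public static <T> T get(T[] arr, int index) {
        final T notFound = null;
        return index < size(arr) ? arr[index] : notFound;
    }

    public static <T> int size(T[] array) {
        return array == null ? 0 : array.length;
    }

    public static void main(String[] args) {
        String[] record = { "John Doe", "Main Street 1" };
        System.out.println(get(record, 1)); // Main Street 1
        System.out.println(get(record, 5)); // null: short row, no exception
        System.out.println(get(null, 0));   // null: missing row, no NPE
    }
}
```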
You can move the static toOrder() method into an OrderServiceMapper if you want to decouple the layers completely:
class OrderServiceMapper {
    public static Order toOrder(CsvOrder order) {
        Order result = new Order();
        result.setFullName(order.getFullName());
        return result;
    }
}
Also, use Integer instead of int for the id; better yet, use Long everywhere:
// This tells Spring to add this class to the Hibernate configuration during auto scan
@Entity
public class Order {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Integer id;
}
In my db entities I have byte[] fields:
import javax.persistence.*;

/**
 * Account
 */
@Entity
@Table(name = TABLE)
public class Account {
    public static final String TABLE = "Account";
    ...
    public final static String COLUMN_PASSWORD_HASH = "passwordHash";

    @Column(name = COLUMN_PASSWORD_HASH, nullable = false)
    public byte[] passwordHash;
    ...
I want to keep my db entities clear of any vendor dependency, so I use only JPA annotations and try to avoid any ORMLite or Hibernate annotations.
However, when trying to save such an entity with ORMLite, I get the following error:
java.sql.SQLException: ORMLite does not know how to store class [B for field 'passwordHash'. byte[] fields must specify dataType=DataType.BYTE_ARRAY or SERIALIZABLE
As far as I understand, ORMLite for some reason does not default to BYTE_ARRAY for byte[] and requires marking such fields with the com.j256.ormlite.field.DataType ORMLite type, which introduces an explicit dependency on the ormlite-core module, and that is what I want to avoid (I have a Hibernate DAO impl and an ORMLite DAO impl, and I don't want to mix everything).
My original intention was to configure ORMLite to prefer BYTE_ARRAY for byte[] fields. How can I do that? Should I introduce a custom persister? Any other suggestions?
I solved it by adding the following custom data persister (without adding a dependency on ormlite-core, as I wanted):
package name.antonsmirnov.zzz.dao.types;

import com.j256.ormlite.field.SqlType;
import com.j256.ormlite.field.types.ByteArrayType;

/**
 * ByteArray type that prefers storing byte[] as BYTE_ARRAY
 */
public class PreferByteArrayType extends ByteArrayType {

    private static final PreferByteArrayType singleTon = new PreferByteArrayType();

    public PreferByteArrayType() {
        super(SqlType.BYTE_ARRAY, new Class<?>[] { byte[].class });
    }

    public static PreferByteArrayType getSingleton() {
        return singleTon;
    }
}
Register it just like any other custom persister:
DataPersisterManager.registerDataPersisters(PreferByteArrayType.getSingleton());
Note that you can't use the default ByteArrayType, because it has an empty classes array; that causes it to become the persister for autogenerated fields, and it then throws an exception because byte array fields can't be id fields.
I've verified that it uses the BLOB field type for MySQL:
com.mysql.jdbc.Field#39a2bb97[catalog=test_db,tableName=account,originalTableName=account,columnName=passwordHash,originalColumnName=passwordHash,mysqlType=252(FIELD_TYPE_BLOB),flags= BINARY BLOB, charsetIndex=63, charsetName=ISO-8859-1]
I'm on the Spring Boot 1.4.x branch with Spring Data MongoDB.
I want to extend a POJO from HashMap to give it the ability to save new properties dynamically.
I know I could create a Map<String, Object> properties field in the Entry class and save my dynamic values inside it, but I don't want an inner structure. My goal is to have all fields at the root of the entry class, so that it serializes like this:
{
"id":"12334234234",
"dynamicField1": "dynamicValue1",
"dynamicField2": "dynamicValue2"
}
So I created this Entry class:
@Document
public class Entry extends HashMap<String, Object> {

    @Id
    private String id;

    public String getId() {
        return id;
    }

    public void setId(String id) {
        this.id = id;
    }
}
And the repository like this:
public interface EntryRepository extends MongoRepository<Entry, String> {
}
When I launch my app I get this error:
Error creating bean with name 'entryRepository': Invocation of init method failed; nested exception is org.springframework.data.mapping.model.MappingException: Could not lookup mapping metadata for domain class java.util.HashMap!
Any idea?
TL;DR
Do not use Java collection/map types as a base class for your entities.
Repositories are not the right tool for your requirement.
Use DBObject with MongoTemplate if you need dynamic top-level properties.
Explanation
Spring Data Repositories are repositories in the DDD sense acting as persistence gateway for your well-defined aggregates. They inspect domain classes to derive the appropriate queries. Spring Data excludes collection and map types from entity analysis, and that's why extending your entity from a Map fails.
Repository query methods for dynamic properties are possible, but it's not the primary use case. You would have to use SpEL queries to express your query:
public interface EntryRepository extends MongoRepository<Entry, String> {

    @Query("{ ?0 : ?1 }")
    Entry findByDynamicField(String field, Object value);
}
This method gives you no type safety regarding the predicate value and is only an ugly alias for a proper, individual query.
Rather use DBObject with MongoTemplate and its query methods directly:
List<DBObject> result = template.find(new Query(Criteria.where("your_dynamic_field")
.is(theQueryValue)), DBObject.class);
DBObject is a Map that gives you full access to properties without enforcing a pre-defined structure. You can create, read, update and delete DBObject instances via the Template API.
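The create and read sides of that can be sketched together (an untested sketch: the injected MongoTemplate bean and the "entries" collection name are assumptions for illustration):

```java
import java.util.List;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

// Untested sketch: "entries" is an assumed collection name.
public class DynamicEntryStore {

    private final MongoTemplate template;

    public DynamicEntryStore(MongoTemplate template) {
        this.template = template;
    }

    // Create: any top-level keys, no pre-defined structure required.
    public void save(String field, Object value) {
        BasicDBObject entry = new BasicDBObject();
        entry.put(field, value);
        template.save(entry, "entries");
    }

    // Read: query by a dynamic top-level property.
    public List<DBObject> findBy(String field, Object value) {
        return template.find(
                new Query(Criteria.where(field).is(value)),
                DBObject.class, "entries");
    }
}
```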
One last thing
You can declare dynamic properties on a nested level using a Map, if your aggregate root declares some static properties:
@Document
public class Data {
    @Id
    private String id;
    private Map<String, Object> details;
}
Here we can achieve this using JSONObject.
The entity will look like this:
@Document
public class Data {
    @Id
    private String id;
    private JSONObject details;
    // getters and setters
}
The POJO will be like this
public class DataDTO {
    private String id;
    private JSONObject details;
    // getters and setters
}
In the service:
Data formData = new Data();
JSONObject details = dataDTO.getDetails();
details.put("dynamicField1", "dynamicValue1");
details.put("dynamicField2", "dynamicValue2");
formData.setDetails(details);
mongoTemplate.save(formData);
I have done this as per my business requirements; refer to this code and adapt it to yours. I hope this is helpful.
I am a bit confused as to when I need to use an object mapper. I thought it should be used for mapping a result set from a DB query into objects, so I created an object mapper like this:
public class PersonMapper implements ResultSetMapper<Person> {

    public Person map(int index, ResultSet resultSet, StatementContext ctx) throws SQLException {
        Person person = new Person();
        person.setPersonId(resultSet.getShort("PersonId"));
        person.setPersonType((PersonType) resultSet.getObject("PersonType"));
        person.setPersonName(resultSet.getString("PersonName"));
        person.setPersonMobile(resultSet.getString("PersonMobile"));
        return person;
    }
}
Then I registered it with the specific DAO like this: @RegisterMapper(PersonMapper.class)
However, it seems that everything also works without the mapper even if I make a query like this: List<Person> list = list(namedQuery("Person.findAll")); which returns a proper list.
So when exactly should I use a mapper?
If you are talking about the ObjectMapper class (com.fasterxml.jackson.databind.ObjectMapper), then we use it to save an entire POJO as a JSON string in the database, in which case you need to declare the column's data type as json when creating the table.
So we use it like this:
public Response saveStateRules(@Context HttpServletRequest request, StatePojo statePojo)
        throws JsonProcessingException {
    ObjectMapper mapper = new ObjectMapper();
    String json = mapper.writeValueAsString(statePojo);
    State state = new State();
    state.setRulejson(json);
}
where,
public class StatePojo implements Serializable {
    private Integer stateid;
    private ArrayList<StateRuleCondition> fieldconditions;
    private ArrayList<Integer> stateids;
    private Boolean isallow;
    // contains all getters and setters
}
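For completeness, the reverse direction (reading the json column back into the POJO) is the mirror call on the same ObjectMapper. A sketch, assuming StatePojo as defined above and a getRulejson() getter on State (the getter is an assumption, not shown in the original code):

```java
import java.io.IOException;

import com.fasterxml.jackson.databind.ObjectMapper;

// Sketch: symmetric serialization helpers around the json column.
public class StateRuleCodec {

    private final ObjectMapper mapper = new ObjectMapper();

    // POJO -> JSON string, as in the snippet above
    public String toJson(StatePojo statePojo) throws IOException {
        return mapper.writeValueAsString(statePojo);
    }

    // JSON string -> POJO, e.g. when the row is loaded again:
    // StatePojo restored = codec.fromJson(state.getRulejson());
    public StatePojo fromJson(String ruleJson) throws IOException {
        return mapper.readValue(ruleJson, StatePojo.class);
    }
}
```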