How can I deserialize a row to a DTO? (Java)

I have created method where I can pass SQL string, arguments and DTO class.
public <T> List<T> selectObjectList(String sql, Object[] args, Class<T> dto) {
    return jdbcTemplate.query(
            sql,
            args,
            new BeanPropertyRowMapper<>(dto)
    );
}
So my DTO is
@Data
@NoArgsConstructor
public class SimpleDto {
    private Integer id;
    private Date createDate;
    ...
    private ArrayList<String> recipents;
    private ArrayList<Integer> objects;
    ...
}
Then I pass the SQL "select * from simple_dto n where n.create_date >= now()", no arguments, and
ArrayList<Notification> notifications =
(ArrayList<Notification>) comDao.selectObjectList(sql, args, SimpleDto.class);
And I get exception:
Caused by: org.springframework.beans.ConversionNotSupportedException:
Failed to convert property value of type 'org.postgresql.jdbc.PgArray'
to required type 'java.util.ArrayList' for property 'recipents';
nested exception is java.lang.IllegalStateException: Cannot convert
value of type 'org.postgresql.jdbc.PgArray' to required type
'java.util.ArrayList' for property 'recipents': no matching editors or
conversion strategy found
SQL script to create the simple_dto table:
create table notification
(
    id          SERIAL PRIMARY KEY,
    create_date timestamp not null,
    recipents   varchar ARRAY,
    objects     integer ARRAY
);
I know that the problem is with the arrays. But how can I solve it?
UPD:
The main purpose is to keep the code DRY. BeanPropertyRowMapper really can't deserialize SQL arrays into lists.
But here is a method that accepts either a BeanPropertyRowMapper(DTO.class) or a custom RowMapper:
@Override
public <T> List<T> selectObjectList(String sql, Object[] args, RowMapper<T> mapper) {
    return jdbcTemplate.query(
            sql,
            args,
            mapper
    );
}

BeanPropertyRowMapper cannot map java.sql.Array to ArrayList.
Create a RowMapper (implement the mapping of 'objects' the same way I did for 'recipents', and also 'create_date'):
public class NotificationMapper implements RowMapper<Notification> {
    @Override
    public Notification mapRow(ResultSet rs, int rowNum) throws SQLException {
        Notification notification = new Notification();
        notification.setId(rs.getInt("id"));
        String[] arr = (String[]) rs.getArray("recipents").getArray();
        Collections.addAll(notification.getRecipents(), arr);
        return notification;
    }
}
And use it to map the result set. (You can then remove the 'dto' parameter.)
jdbcTemplate.query(
    sql,
    args,
    new NotificationMapper()
);
Note that the lists in the POJO need to be initialized:
public class SimpleDto {
    private Integer id;
    private Date createDate;
    ...
    private ArrayList<String> recipents = new ArrayList<>();
    private ArrayList<Integer> objects = new ArrayList<>();
    ...
}
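The unwrap-and-copy step in the mapper can also be factored into a small null-safe helper. This is only a sketch (the name sqlArrayToStringList is mine, not Spring's); a java.lang.reflect.Proxy stub stands in for a real driver array (such as org.postgresql.jdbc.PgArray) so the example runs without a database:

```java
import java.lang.reflect.Proxy;
import java.sql.Array;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class SqlArrayDemo {

    // Null-safe conversion of a java.sql.Array column into a mutable list,
    // the same unwrapping NotificationMapper does inline.
    public static List<String> sqlArrayToStringList(Array sqlArray) throws SQLException {
        List<String> out = new ArrayList<>();
        if (sqlArray == null) {
            return out; // NULL column -> empty list keeps the DTO usable
        }
        for (Object element : (Object[]) sqlArray.getArray()) {
            out.add(element == null ? null : element.toString());
        }
        return out;
    }

    public static void main(String[] args) throws SQLException {
        // Stub java.sql.Array via a dynamic proxy; a real JDBC driver would
        // hand back its own implementation.
        Array stub = (Array) Proxy.newProxyInstance(
                Array.class.getClassLoader(),
                new Class<?>[] { Array.class },
                (proxy, method, margs) -> {
                    if (method.getName().equals("getArray") && margs == null) {
                        return new String[] { "a@example.com", "b@example.com" };
                    }
                    throw new UnsupportedOperationException(method.getName());
                });

        System.out.println(sqlArrayToStringList(stub)); // [a@example.com, b@example.com]
        System.out.println(sqlArrayToStringList(null)); // []
    }
}
```

Keeping the conversion in one place means each hand-written RowMapper only calls the helper instead of repeating the cast-and-copy dance.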

Related

How to duplicate the logic of both @Type and @Convert in Hibernate when mapping a PostgreSQL enum?

I have an existing PostgreSQL database with an entity called Bike, which has a field bike_gear_value whose type is an enum bike_gear with the values GEAR-ONE and GEAR-TWO. I cannot change the database table.
The problem I'm having is mapping a Java enum to that existing enum in the database. I am using Hibernate and querying/saving the Bike entity through Spring Boot's repository interface.
Here is my enum in Java (keeping in mind that Java enum constants can't contain dashes),
public enum BikeGear {
    GEARONE("GEAR-ONE"),
    GEARTWO("GEAR-TWO");
    //constructors, getters, setters
    public static BikeGear fromCode(String code) {
        for (BikeGear bike : BikeGear.values()) {
            if (bike.getValue().equals(code)) {
                return bike;
            }
        }
        throw new UnsupportedOperationException(
            "The code " + code + " is not supported!"
        );
    }
}
I have an AttributeConverter,
@Converter
public class BikeGearConverter implements AttributeConverter<BikeGear, String> {
    @Override
    public String convertToDatabaseColumn(BikeGear attribute) {
        if (attribute == null) {
            return null;
        }
        return attribute.getValue();
    }
    @Override
    public BikeGear convertToEntityAttribute(String dbData) {
        if (dbData == null) {
            return null;
        }
        return BikeGear.fromCode(dbData);
    }
}
I have a PostgreSQLEnumType class,
public class PostgreSQLEnumType extends org.hibernate.type.EnumType {
    public void nullSafeSet(
            PreparedStatement st,
            Object value,
            int index,
            SharedSessionContractImplementor session)
            throws HibernateException, SQLException {
        st.setObject(
            index,
            value != null ? ((Enum) value).name() : null,
            Types.OTHER
        );
    }
}
Lastly, here is my Bike class (simplified),
@Entity
@Table(name = "bike")
@TypeDef(
    name = "pgsql_enum",
    typeClass = PostgreSQLEnumType.class
)
public class Bike {
    @Column(name = "bike_gear_value")
    // @Type(type = "pgsql_enum")
    // @Enumerated(EnumType.STRING)
    // @Convert(converter = BikeGearConverter.class)
    BikeGear rating;
    //constructors, getters, setters
}
Here are the errors I get using the following combinations of @Type (with @TypeDef), @Enumerated, and @Convert:
@Type and @Convert
AttributeConverter and explicit Type cannot be applied to same attribute
@Convert and @Enumerated
// Saving a Bike
org.postgresql.util.PSQLException: ERROR: column "bike_gear_value" is of type bike_gear but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
// Querying a Bike
java.lang.IllegalArgumentException: No enum constant com.example.bikeapp.model.BikeGear.GEAR-ONE
@Convert
// Saving a Bike
org.postgresql.util.PSQLException: ERROR: column "bike_gear_value" is of type bike_gear but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
// Querying a Bike
Successful
@Type
// Saving a Bike
Successful
// Querying a Bike
java.lang.IllegalArgumentException: No enum constant com.example.bikeapp.model.BikeGear.GEAR-ONE
Ideally I would use @Type and @Convert so that both saving and querying succeed; however, that combination isn't allowed. Does anyone have a recommendation for a solution?
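The query-side failures above can be reproduced without Hibernate: when the mapping falls back to name-based resolution, the raw column value "GEAR-ONE" is fed to Enum.valueOf, which only knows the constant names, whereas fromCode exists precisely to bridge that gap. A stripped-down sketch (the constructor and getter are assumed boilerplate, filled in by me):

```java
public class EnumCodeDemo {

    public enum BikeGear {
        GEARONE("GEAR-ONE"),
        GEARTWO("GEAR-TWO");

        private final String value;

        BikeGear(String value) { this.value = value; }

        public String getValue() { return value; }

        public static BikeGear fromCode(String code) {
            for (BikeGear gear : values()) {
                if (gear.getValue().equals(code)) {
                    return gear;
                }
            }
            throw new UnsupportedOperationException(
                "The code " + code + " is not supported!");
        }
    }

    public static void main(String[] args) {
        // fromCode maps the database value back to a constant...
        System.out.println(BikeGear.fromCode("GEAR-ONE")); // prints GEARONE

        // ...while Enum.valueOf only accepts constant names, hence the
        // "No enum constant ...GEAR-ONE" error when the raw value is used.
        try {
            Enum.valueOf(BikeGear.class, "GEAR-ONE");
        } catch (IllegalArgumentException e) {
            System.out.println("valueOf failed: " + e.getMessage());
        }
    }
}
```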

Writing MyBatis3 TypeHandler for User-Defined Oracle Table Type

I am having difficulty writing a TypeHandler to convert an Oracle user-defined table type to a list of Java objects.
The database types and function spec are defined like this:
CREATE OR REPLACE TYPE MySchema.MY_TYPE IS OBJECT (
    first_name VARCHAR2(50),
    last_name  VARCHAR2(50)
);
CREATE OR REPLACE TYPE MySchema.MY_TYPE_TBL IS TABLE OF MY_TYPE;
FUNCTION GET_MY_STUFF(my_user_name IN VARCHAR2) RETURN MySchema.MY_TYPE_TBL;
I have a MyBatis mapper that has the following call in it:
<resultMap id="myResultMap" type="GetMyStuffResult" />
<select id="getMyStuff" statementType="CALLABLE" parameterType="map">
    {#{return_value, mode=OUT,
        typeHandler=MyStuffToList,
        jdbcTypeName=MySchema.MY_TYPE_TBL,
        jdbcType=ARRAY} =
    call MySchema.MyPackage.GET_MY_STUFF (
        my_user_name => #{userName, mode=IN, jdbcType=VARCHAR}
    )}
</select>
Finally, I am attempting to write a TypeHandler, but am failing miserably in the getResult override:
public class MyStuffToList implements TypeHandler<List<GetMyStuffResult>> {
    // Other Overrides Here
    @Override
    public List<GetMyStuffResult> getResult(CallableStatement cs, int columnIndex)
            throws SQLException {
        List<GetMyStuffResult> results = new ArrayList<GetMyStuffResult>();
        Array array = cs.getArray(columnIndex);
        // HOW DO I CONVERT THE Array TO List<GetMyStuffResult> ???
        return results;
    }
}
I cannot seem to get from the CallableStatement passed into the TypeHandler to the list that I want.
Here is what I did to get this to work.
The TypeHandler needs to create a type map that references the single Oracle object type (not the table type):
public class MyStuffToList implements TypeHandler<List<GetMyStuffResult>> {
    // Other Overrides Here
    @Override
    public List<GetMyStuffResult> getResult(CallableStatement cs, int columnIndex)
            throws SQLException {
        List<GetMyStuffResult> results = new ArrayList<GetMyStuffResult>();
        Array array = cs.getArray(columnIndex);
        // Add a TypeMap to map the Oracle object to a Java class
        Map<String, Class<?>> typeMap = new HashMap<String, Class<?>>();
        typeMap.put("MySchema.MY_TYPE", GetMyStuffResult.class);
        // Get an array of Java objects using that type map
        Object[] javaObjects = (Object[]) array.getArray(typeMap);
        // add each of these converted objects to the results list
        for (Object javaObject : javaObjects) {
            results.add((GetMyStuffResult) javaObject);
        }
        return results;
    }
}
Then, the DTO class itself needs to implement ORAData and ORADataFactory to glue everything together:
public class GetMyStuffResult implements ORAData, ORADataFactory {
    String first_name;
    String last_name;
    // Assume appropriate getters and setters for all properties here,
    // e.g. public String getFirstName() { ... }

    // Implement ORAData and ORADataFactory
    public ORAData create(Datum datum, int sqlType) throws SQLException {
        GetMyStuffResult result = new GetMyStuffResult();
        Struct javaStruct = (Struct) datum;
        Object[] attributes = javaStruct.getAttributes();
        // ORDER MATTERS HERE - must be in the order defined in Oracle
        result.setFirstName(attributes[0].toString());
        result.setLastName(attributes[1].toString());
        return result;
    }

    public Datum toDatum(Connection conn) throws SQLException {
        return null;
    }
}
Of course, you'll want to do proper null and data checks, etc.
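The positional Struct unpacking can be exercised without an Oracle driver, since java.sql.Struct is a plain JDBC interface. The stub below is purely illustrative (the helper name unpack and the sample values are mine); it mirrors the attribute-order logic in create(...):

```java
import java.sql.SQLException;
import java.sql.Struct;
import java.util.Map;

public class StructDemo {

    // Mirrors the create(...) logic: attributes come back positionally,
    // in the order the Oracle OBJECT type declares its fields.
    public static String[] unpack(Struct struct) throws SQLException {
        Object[] attributes = struct.getAttributes();
        return new String[] { attributes[0].toString(), attributes[1].toString() };
    }

    public static void main(String[] args) throws SQLException {
        // Hand-rolled stub; a real driver supplies its own Struct implementation.
        Struct stub = new Struct() {
            @Override public String getSQLTypeName() { return "MySchema.MY_TYPE"; }
            @Override public Object[] getAttributes() { return new Object[] { "Jane", "Doe" }; }
            @Override public Object[] getAttributes(Map<String, Class<?>> map) { return getAttributes(); }
        };
        String[] name = unpack(stub);
        System.out.println(name[0] + " " + name[1]); // Jane Doe
    }
}
```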

Type mismatch: cannot convert from List<Map<String,Object>> to List<Object>

We have
String sql = "SELECT * FROM user WHERE id = '" + userId + "'";
List<Object> userList = template.queryForList(sql);
The query should return the users with the given id.
template is an instance of the JdbcTemplate class.
Here the query returns a List<Map<String, Object>> while the left-hand side is a List<Object>. Is there any way to do the conversion? This is part of the Spring transaction management example.
You can always use only the values you are interested in. queryForList returns a List<Map<String, Object>>, so collect the values of each row's map:
List<Object> userList = new ArrayList<Object>();
for (Map<String, Object> row : template.queryForList(sql)) {
    userList.addAll(row.values());
}
This should give you the List<Object> you need. Why Map.values() cannot be cast directly to a List, and why we need to create a new ArrayList, is explained here: why HashMap Values are not cast in List
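For reference, the values-collecting loop above can also be written with the Stream API (plain JDK; hard-coded rows stand in for queryForList's result):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class FlattenDemo {

    // Flatten each row map's values into one List<Object>,
    // equivalent to the addAll loop over queryForList's result.
    public static List<Object> flatten(List<Map<String, Object>> rows) {
        return rows.stream()
                .flatMap(row -> row.values().stream())
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Map<String, Object>> rows = List.of(
                Map.of("id", 1, "name", "alice"),
                Map.of("id", 2, "name", "bob"));
        System.out.println(flatten(rows).size()); // 4
    }
}
```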
I think you should take another approach. Spring cannot build objects from the returned data on its own, but you can use a RowMapper to construct User objects. Additionally, you should not build the query yourself but use a PreparedStatement; otherwise you may be vulnerable to SQL injection.
Here is an example of how to do it:
template.query("SELECT * FROM user WHERE id = ?", new Object[] { userId }, new RowMapper<User>() {
    public User mapRow(ResultSet rs, int rowNum) {
        // Build a user from the current row and return it
    }
});
I think you have confused things a bit. The best way to do it is indeed what Steffen Kreutz suggested. Consider the following example:
The user table in the DB has the following fields:
1. UserID -> Type: Int
2. User_Name -> Type: Varchar
3. User_Contact -> Type: Varchar
...
Now, you can simply write a RowMapper to map all these fields to your custom POJO as follows:
POJO Class:
public class User {
    private int userId;
    private String userName;
    private String userContact;

    public int getUserId() {
        return this.userId;
    }

    public void setUserId(int userId) {
        this.userId = userId;
    }

    /* Other Getter-Setter(s) */
}
Query String:
private static final String SELECT_QUERY = "SELECT * FROM user WHERE id = ?";
JdbcTemplate Call:
Here you are passing userId for the ? placeholder; JdbcTemplate automatically takes care of the substitution.
List<User> users = (List<User>) jdbcTemplate.query(SELECT_QUERY, new Object[] { userId }, new UserRowMapper());
Finally the UserRowMapper Class:
class UserRowMapper implements RowMapper<User> {
    @Override
    public User mapRow(ResultSet resultSet, int row) throws SQLException {
        User user = new User();
        user.setUserId(resultSet.getInt("UserID"));
        user.setUserName(resultSet.getString("User_Name"));
        user.setUserContact(resultSet.getString("User_Contact"));
        return user;
    }
}
This is indeed the best and recommended way of using JdbcTemplate.

DynamoDB and Global Secondary Index and ObjectMapper

I want to use DynamoDBMapper to query a table that has a global secondary index, and I wish to query that index. I have a class corresponding to each item in the table, and the field that is the hash key of the global secondary index is annotated as follows:
@DynamoDBIndexHashKey(globalSecondaryIndexName = "Index-Name", attributeName = "EmailSent")
public String getEmailSent() {
    return emailSent;
}
And, I am querying using the mapper as shown below
public <T> List<T> queryGlobalIndex(final String tableName, final String indexName, final T inputObj) {
    final Class<T> clazz = (Class<T>) inputObj.getClass();
    DynamoDBQueryExpression<T> queryExpression = new DynamoDBQueryExpression<T>()
            .withIndexName(indexName)
            .withConsistentRead(false)
            .withHashKeyValues(inputObj);
    return mapper.query(clazz, queryExpression, new DynamoDBMapperConfig(
            new TableNameOverride(tableName)));
}
This is working; my question is that I want to remove the globalSecondaryIndexName field from the @DynamoDBIndexHashKey annotation on the getter. Any input on how to go about it?

Best technique to map JSON data into ContentValues

I'm developing an Android app which will make a number of API calls that return JSON data structures and then store the results in a content provider. Different API calls return different JSON structures, each mapping to a corresponding table schema in the content provider.
I'm looking for a simple, Java-esque way to map the properties of a JSONObject to a flat ContentValues object. I started with a simple HashMap, iterating over its entrySet to map key strings in the JSONObject to value strings for the ContentValues object, but I'd like to account for the fact that some JSON properties are integers or booleans. Also, in some cases I'd like a more complex mapping, such as turning a JSONArray into a comma-separated string. In C, I'd probably just do this with an array of structs holding name, value, type, and an optional callback for the more complex mappings.
UPDATE: Because of the hierarchical nature of the JSON data structure, and because it can actually have sub-tables at certain depths, I've taken the following approach:
private static interface MapJSON {
    public void mapData(JSONObject object, ContentValues values)
            throws JSONException;
}

private static abstract class AbstractMapJSON implements MapJSON {
    protected final String mJSONName;
    protected final String mContentName;

    public AbstractMapJSON(String jsonName, String contentName) {
        mJSONName = jsonName;
        mContentName = contentName;
    }

    public abstract void mapData(JSONObject object, ContentValues values)
            throws JSONException;
}

/* This is the basic template for each of the basic types */
private static class BooleanMapJSON extends AbstractMapJSON {
    public BooleanMapJSON(String jsonName, String contentName) {
        super(jsonName, contentName);
    }

    public void mapData(JSONObject object, ContentValues values)
            throws JSONException {
        values.put(mContentName, object.getBoolean(mJSONName));
    }
}

/* This class takes a nested JSON Object and flattens it into the same table */
private static class ObjectMapJSON implements MapJSON {
    protected final String mJSONName;
    protected final MapJSON[] mMap;

    public ObjectMapJSON(String jsonName, MapJSON[] map) {
        mJSONName = jsonName;
        mMap = map;
    }

    public void mapData(JSONObject object, ContentValues values)
            throws JSONException {
        JSONObject subObject = object.getJSONObject(mJSONName);
        for (MapJSON mapItem : mMap) {
            mapItem.mapData(subObject, values);
        }
    }
}
With that defined, I can create mappings like this:
private static final MapJSON[] mainSiteMap = new MapJSON[] {
    new StringMapJSON("name", StackPad.Sites.NAME),
    new LongMapJSON("creation_date", StackPad.Sites.CREATION_DATE),
    new StringMapJSON("description", StackPad.Sites.DESCRIPTION),
};
private static final MapJSON sitesMap = new ObjectMapJSON("main_site", mainSiteMap);
But it still seems like it needs a little work to mesh well.
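The same pattern can be sketched in plain JDK terms with a map from JSON key to a typed extractor function, one lambda per type instead of one subclass per type. Everything here is illustrative: Map<String, Object> stands in for both JSONObject (source) and ContentValues (target), and the helper names string/longVal are mine:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;
import java.util.function.BiConsumer;

public class JsonMapDemo {

    // One mapping entry: pull a value out of the source row and store it
    // under the target column name.
    public interface FieldMapper extends BiConsumer<Map<String, Object>, Map<String, Object>> {}

    public static FieldMapper string(String jsonName, String contentName) {
        return (src, dst) -> dst.put(contentName, String.valueOf(src.get(jsonName)));
    }

    public static FieldMapper longVal(String jsonName, String contentName) {
        return (src, dst) -> dst.put(contentName, ((Number) src.get(jsonName)).longValue());
    }

    public static void main(String[] args) {
        // Declarative mapping table, analogous to mainSiteMap above.
        List<FieldMapper> siteMap = List.of(
                string("name", "site_name"),
                longVal("creation_date", "creation_date"));

        Map<String, Object> json = Map.of("name", "example", "creation_date", 123L);
        Map<String, Object> values = new HashMap<>();
        siteMap.forEach(m -> m.accept(json, values));

        System.out.println(new TreeMap<>(values));
    }
}
```

The trade-off versus the subclass approach is less ceremony per field type, at the cost of losing a place to hang shared state such as mJSONName.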
Maybe you can build a class and use it in the HashMap. I don't know what your types are, but for example:
class Foo {
    String name;
    String value;
    String type;
    int opt;

    Foo(String name, String value, String type, int opt) {
        this.name = name;
        this.value = value;
        this.type = type;
        this.opt = opt;
    }
}
.....
HashMap<String, Foo> hm = new HashMap<String, Foo>();
Foo foo = new Foo("123", "123", "type", 1);
hm.put("100", foo);
.....
.....
You can try using Google's Gson: create a structure for your object, then map the JSON onto it. You can specify data types, and primitive types are supported as well.
