How to map a native SQL query to a DTO object in Hibernate?

I have the following native SQL query:
Select E.id AS ID, E.desc AS DESCRIPTION FROM TEMP E
And the dto class:
private int id;
private String description;
/* getters and setters */
How can I get a list of DTO objects from this query?

To maximize reuse, what I would probably do is write my own ResultTransformer that you instantiate before running your query and provide with the appropriate mapping information.
// construct the transformer and register mappings
MappedResultTransformer transformer = new MappedResultTransformer(DtoClass.class);
transformer.map( "ID", "id" );
transformer.map( "DESCRIPTION", "description" );
// apply the transformer to the native query
session.createSQLQuery( ... ).setResultTransformer( transformer ).list();
Here's an example of how the transformer might look.
public class MappedResultTransformer extends BasicTransformerAdapter {

    final Map<String, String> fieldMappings = new HashMap<>();
    final Class<?> clazz;

    public MappedResultTransformer(Class<?> clazz) {
        this.clazz = clazz;
    }

    public void map(String alias, String property) {
        fieldMappings.put( alias, property );
    }

    @Override
    public Object transformTuple(Object[] tuple, String[] aliases) {
        try {
            Object result = clazz.newInstance();
            for ( int i = 0; i < aliases.length; ++i ) {
                Object tupleValue = tuple[ i ];
                String alias = aliases[ i ];
                String propertyName = fieldMappings.get( alias );
                if ( propertyName != null ) {
                    // use reflection to set the value of 'propertyName' on 'result';
                    // plain field access is one possible approach:
                    java.lang.reflect.Field field = clazz.getDeclaredField( propertyName );
                    field.setAccessible( true );
                    field.set( result, tupleValue );
                }
            }
            return result;
        }
        catch ( ReflectiveOperationException e ) {
            throw new RuntimeException( e );
        }
    }
}
The beauty here is that this class is completely reusable: it isn't tied to any specific class or query. You could then extend it to add support for nested properties and so on.

Try this:
Query query = getSession().createSQLQuery("...")
    .addScalar("ID")
    .addScalar("DESCRIPTION")
    .setResultTransformer(Transformers.aliasToBean(DTO.class));
List<DTO> list = query.list();
The DTO class:
@Entity
@Table(name = "your database table")
public class DTO {
    @Id
    private int id;
    @Column(name = "description_name_on_table")
    private String description;
    // getters and setters
}
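One caveat with this approach: Transformers.aliasToBean matches column aliases to bean property names, so the aliases generally need to line up with the DTO's properties. A minimal sketch for the query in the question, with the scalar types assumed:
List<DTO> list = session.createSQLQuery(
        "SELECT E.id AS id, E.desc AS description FROM TEMP E")
    .addScalar("id", StandardBasicTypes.INTEGER)
    .addScalar("description", StandardBasicTypes.STRING)
    .setResultTransformer(Transformers.aliasToBean(DTO.class))
    .list();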

Related

How to get the name of an Attribute from an Entity

I have the following entity class:
public class Conversation {
    private String id;
    private String ownerId;
    private Long creationDate;

    public Conversation(String id, String ownerId, Long creationDate) {
        this.id = id;
        this.ownerId = ownerId;
        this.creationDate = creationDate;
    }
}
In another submodule, through an external service, I receive on each insertion a map of the following entities:
public class AttributeValue {
    private String s; // string attribute
    private String n; // number attribute

    public String getS() {
        return this.s;
    }

    public String getN() {
        return this.n;
    }

    public AttributeValue(String s, String n) {
        this.s = s;
        this.n = n;
    }
}
// Example: if I insert this conversation: new Conversation("1", "2", 1623221757971L)
// I receive this map:
Map<String, AttributeValue> insertStream = Map.ofEntries(
    entry("id", new AttributeValue("1", null)),
    entry("ownerId", new AttributeValue("2", null)),
    entry("creationDate", new AttributeValue(null, "1623221757971"))
);
To read the ownerId field from the map, I have to do this:
String ownerId = insertStream.get("ownerId").getS();
My question is: instead of having to write insertStream.get("ownerId"), is there any way, through reflection, to read the name of the field from the entity (Conversation.ownerId)?
This is because we want to keep the submodule in sync with the entity: if we change the entity, for example rename ownerId to ownerIdentifier, the submodule should either fail to compile or be updated automatically.
Is this what you want? Field#getName()
Example code:
Field[] conversationFields = Conversation.class.getDeclaredFields();
String field0Name = conversationFields[0].getName();
Depending on the JVM used, field0Name can be "id". You can also use Class#getFields(), which includes all fields accessible in this class (including the superclass's fields).
Another option (not using reflection) would be to refactor your code.
import java.util.HashMap;
import java.util.Map;

public class Conversation {

    public static String[] names = {
        "id", "ownerId", "creationDate"
    };

    private Map<String, Object> data = new HashMap<String, Object>();

    public Conversation(Object... values) {
        if (values.length != names.length)
            throw new IllegalArgumentException("You need to pass " + names.length + " arguments!");
        for (int i = 0; i < names.length; i++)
            data.put(names[i], values[i]);
    }

    public Map<String, Object> getData() { return data; }

    // You can pass "id"/"ownerId" or names[0]/names[1]
    public String getString(String key) {
        return (String) data.get(key);
    }

    // You can pass "creationDate" or names[2]
    public long getLong(String key) {
        return (long) data.get(key);
    }
}
You could then create Conversation Objects like before:
Conversation c = new Conversation("myId","myOwnerId",123456789L);
You could also add public static String fields like ID="id", but changing the value of a field will never change the field's name.
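If the reflection route is preferred, one hypothetical middle ground is to validate each hard-coded key against Conversation's declared fields before using it, so a rename in the entity fails fast at runtime (the attributeFor helper below is made up for illustration):
// Hypothetical helper: only look up the attribute if a field with that
// name still exists on the entity; a rename makes this throw immediately.
static AttributeValue attributeFor(Map<String, AttributeValue> stream, String fieldName)
        throws NoSuchFieldException {
    Conversation.class.getDeclaredField(fieldName); // throws NoSuchFieldException after a rename
    return stream.get(fieldName);
}

String ownerId = attributeFor(insertStream, "ownerId").getS();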

JpaSpecification using Generics. Works with string, but issues with Date and joined fields

I'm trying to expand on this Baeldung tutorial: https://www.baeldung.com/rest-api-search-language-spring-data-specifications
But I want the Specification to be generic, and I want to allow the client to search by values of embedded objects. Everything works for Strings and some numbers, but not for ids and other more complicated objects like Date.
My Model: (assume a person can only have 1 pet)
@Entity
public class Person {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Integer id;
    private String name;
    private Date dateOfBirth;
    private Integer age;
    private Pet pet;
    // Getters & Setters etc
}

@Entity
public class Pet {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Integer id;
    private String type;
    private String name;
    private Integer numOfLegs;
    // Getters & Setters etc
}
Person repository:
@Repository
public interface PersonRepository extends JpaRepository<Person, Integer>, JpaSpecificationExecutor<Person> {}
Search Criteria that will hold the key, operator and value that we can search by.
public class EntitySearchCriteria {

    private String key;
    private String operation;
    private Object value;

    public EntitySearchCriteria(final String key, final String operation, final Object value) {
        this.key = key;
        this.operation = operation;
        this.value = value;
    }

    // Getters and Setters etc
}
My generic Specification class (this is really where the action is: it builds the predicates that are used). It also allows the client to set a SearchCriteria on a value of a joined table, e.g. "Pet.name=Muffins".
public abstract class AbstractEntitySpecification<T, ID extends Serializable> implements Specification<T> {

    protected EntitySearchCriteria criteria;

    @Override
    public Predicate toPredicate(Root<T> root, CriteriaQuery<?> query, CriteriaBuilder criteriaBuilder) {
        if (criteria.getOperation().equalsIgnoreCase(">")) {
            return criteriaBuilder.greaterThanOrEqualTo(root.<String>get(criteria.getKey()), criteria.getValue().toString());
        } else if (criteria.getOperation().equalsIgnoreCase("<")) {
            return criteriaBuilder.lessThanOrEqualTo(root.<String>get(criteria.getKey()), criteria.getValue().toString());
        } else if (criteria.getOperation().equalsIgnoreCase(":")) {
            if (criteria.getKey().contains(".")) {
                String[] joinCriteriaArray = criteria.getKey().split("\\.");
                Class<?> joinedClass = root.get(joinCriteriaArray[0]).getClass();
                Join<T, ?> joinedRelationship = root.join(joinCriteriaArray[0]);
                return criteriaBuilder.equal(joinedRelationship.get(joinCriteriaArray[1]), criteria.getValue());
            }
            if (root.get(criteria.getKey()).getJavaType() == String.class) {
                return criteriaBuilder.like(root.<String>get(criteria.getKey()), "%" + criteria.getValue() + "%");
            } else {
                return criteriaBuilder.equal(root.get(criteria.getKey()), criteria.getValue());
            }
        }
        return null;
    }
}
Any entity that I want to allow this type of querying for then just needs a concrete implementation of AbstractEntitySpecification:
public class PersonSpecification extends AbstractEntitySpecification<Person, Integer> {

    public PersonSpecification(final EntitySearchCriteria entitySearchCriteria) {
        this.criteria = entitySearchCriteria;
    }
}
These are the tests I have run. Any search on an attribute of Person that is a String or int (i.e. Person.name, Person.age) works, but a search on dateOfBirth does not.
Any search on a String attribute of the pet works using the join, but searching on the id (Integer) does not, no matter whether I pass the id as an int or as a String. I have put the observed behaviour in a comment on each test.
public class PersonSpecificationMediumTest extends AbstractMediumTest {

    @Autowired
    private PersonRepository personRepository;

    @Autowired
    private PetRepository petRepository;

    Person person1;
    Person person2;

    @Before
    public void setUp() {
        Pet muffins = new Pet(1, "cat", "muffins", 4);
        Pet rex = new Pet(2, "dog", "rex", 4);
        petRepository.saveAll(Arrays.asList(muffins, rex));

        person1 = new Person();
        person1.setName("David");
        person1.setDateOfBirth(Date.parse("1979-03-01"));
        person1.setPet(muffins);
        person1 = personRepository.saveAndFlush(person1);

        person2 = new Person();
        person2.setName("Mary");
        person2.setDateOfBirth(Date.parse("1982-03-01"));
        person2.setPet(rex);
        person2 = personRepository.saveAndFlush(person2);
    }

    @Test // Works
    public void toPredicate_findByNameEquals_assertCorrectResult() {
        PersonSpecification spec
            = new PersonSpecification(new EntitySearchCriteria("name", ":", "David"));
        List<Person> results = personRepository.findAll(spec);
        Assert.assertEquals(person1, results.get(0));
    }

    @Test // Works
    public void toPredicate_findByPetNameEquals_assertCorrectResult() {
        PersonSpecification spec
            = new PersonSpecification(new EntitySearchCriteria("client.name", ":", "Rex"));
        List<Person> results = personRepository.findAll(spec);
        Assert.assertEquals(person2, results.get(0));
    }

    @Test // Returns an empty list. Cannot find the pet by id.
    public void toPredicate_findByPetIdEquals_assertCorrectResult() {
        PersonSpecification spec
            = new PersonSpecification(new EntitySearchCriteria("pet.id", ":", 2));
        List<Person> results = personRepository.findAll(spec);
        Assert.assertEquals(person2, results.get(0));
    }

    @Test // org.springframework.dao.InvalidDataAccessApiUsageException: Parameter value [2] did not match expected type [java.lang.Integer (n/a)]
    public void toPredicate_findByPetIdAsStringEquals_assertCorrectResult() {
        PersonSpecification spec
            = new PersonSpecification(new EntitySearchCriteria("pet.id", ":", "2"));
        List<Person> results = personRepository.findAll(spec);
        Assert.assertEquals(person2, results.get(0));
    }

    @Test // Fails with org.springframework.dao.InvalidDataAccessApiUsageException: Parameter value [2020-01-01] did not match expected type [java.util.Date (n/a)]
    public void toPredicate_findByDateOfBirthBetween_assertCorrectResult() {
        PersonSpecification spec1
            = new PersonSpecification(new EntitySearchCriteria("dateOfBirth", "<", "1990-01-01"));
        PersonSpecification spec2
            = new PersonSpecification(new EntitySearchCriteria("dateOfBirth", ">", "1970-01-01"));
        List<Person> results = personRepository.findAll(spec1.and(spec2));
        Assert.assertTrue(results.size() == 2);
    }
}
Any idea why Date is so problematic? I wanted to use the date in greaterThanOrEqualTo and lessThanOrEqualTo, but passing in criteria.getValue() (an Object) gives a compile error, so it forces me to use a String representation of the value. The error shown is org.springframework.dao.InvalidDataAccessApiUsageException: Parameter value [2020-01-01] did not match expected type [java.util.Date (n/a)], which indicates to me that it cannot compare a String to a Date. That makes sense, but why stop me from passing the Date object?
Also, why is the id such an issue on the joined table? Why can it not find id = 2? I would have thought it would be straightforward, especially since I can search by the number of legs of the pets successfully. It must have something to do with the id being Serializable.
Check out the Javadoc for Date.parse. The essential part is right in the declaration:
@Deprecated
public static long parse(String s)
As clearly stated, it returns a long value. To get a Date object you could use SimpleDateFormat, which inherits DateFormat.parse(String s), like:
SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd");
Date d1 = sdf.parse("1979-03-01");
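Applied to the test setup above, that would look something like this (a sketch, assuming Person.setDateOfBirth accepts a java.util.Date and that the checked ParseException is declared or handled):
SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd");
person1.setDateOfBirth(sdf.parse("1979-03-01"));
person2.setDateOfBirth(sdf.parse("1982-03-01"));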

How do I tell Hibernate to not create a table for this Entity?

I'm using SqlResultSetMapping and the Entity annotations (SqlResultSetMapping requires an Entity with an Id) to tell Hibernate how to populate instances of Foo with native query results data.
Non-persisted entity:
@SqlResultSetMapping(name = "fooMapping", entities = @EntityResult(entityClass = Foo.class))
@Entity
public class Foo {
    @Id
    public Long row_id;
    public String name;
}
Native query:
String sql = "SELECT id AS row_id, friendlyName AS name FROM SomeTable";
Query q = JPA.em().createNativeQuery(sql, "fooMapping");
List<Foo> fooList = q.getResultList();
The problem is, a table called "Foo" gets created automatically for me (using Play! Framework in dev mode), but Foo is not a model and should not be persisted.
How do I instruct hibernate not to create this table?
Using @ConstructorResult will work great once it's available for your persistence layer. Until then, there is a Hibernate-specific approach using an org.hibernate.SQLQuery and an org.hibernate.transform.ResultTransformer that does not depend on @SqlResultSetMapping. Because a POJO is populated, Hibernate finds no @Entity to automatically turn into a table.
Non-persisted POJO:
public class Foo {
    public Long row_id;
    public String name;
}
ResultTransformer:
public static class FooResultTransformer implements ResultTransformer {

    @Override
    public List transformList(List list) { return list; }

    @Override
    public Object transformTuple(Object[] tuple, String[] aliases) {
        List<String> aliasList = Arrays.asList(aliases);
        Foo foo = new Foo();
        foo.row_id = ((Number) getValue(tuple, aliasList, "row_id", 0L))
                .longValue();
        foo.name = (String) getValue(tuple, aliasList, "name", null);
        return foo;
    }

    private static Object getValue(Object[] tuple, List<String> aliases,
            String field, Object defaultValue)
    {
        // unchecked for brevity
        if (tuple[aliases.indexOf(field)] == null) {
            return defaultValue;
        }
        return tuple[aliases.indexOf(field)];
    }
}
Native SQLQuery:
String sql = "SELECT id AS row_id, friendlyName AS name FROM SomeTable";
Session session = JPA.em().unwrap(Session.class);
SQLQuery q = session.createSQLQuery(sql);
q.setResultTransformer( new FooResultTransformer() );
List<Foo> fooList = q.list();
Unfortunately this isn't easy...
If you are using JPA 2.1 support for @ConstructorResult (there seems to be support only from Hibernate 4.3.0.Beta2, so you might not be), you can use @ConstructorResult as follows:
@SqlResultSetMapping(name = "fooMapping",
    classes = {
        @ConstructorResult(targetClass = Foo.class, columns = {
            @ColumnResult(name = "row_id", type = Long.class),
            @ColumnResult(name = "name", type = String.class)
        })
    }
)
public class Foo {
    public Long row_id;
    public String name;

    public Foo(Long rowId, String name) {
        ...
    }
}
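For completeness, here is a sketch of how that mapping would be used with the native query from the question (names taken from the answer above; the unchecked assignment is left as is):
List<Foo> fooList = em.createNativeQuery(
        "SELECT id AS row_id, friendlyName AS name FROM SomeTable", "fooMapping")
    .getResultList();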

JDBCTemplate set nested POJO with BeanPropertyRowMapper

Given the following example POJO's: (Assume Getters and Setters for all properties)
class User {
    String user_name;
    String display_name;
}

class Message {
    String title;
    String question;
    User user;
}
One can easily query a database (postgres in my case) and populate a list of Message objects using a BeanPropertyRowMapper, where the db field matches the property in the POJO (assume the DB tables have fields corresponding to the POJO properties):
NamedParameterDatbase.query("SELECT * FROM message", new BeanPropertyRowMapper(Message.class));
I'm wondering: is there a convenient way to construct a single query and/or create a row mapper in such a way as to also populate the properties of the inner 'user' POJO within the message?
That is, some syntactic magic where each result row in the query:
SELECT * FROM message, user WHERE user_id = message_id
produces a list of Message objects with the associated User populated.
Use case:
Ultimately, the classes are passed back as a serialised object from a Spring Controller; the classes are nested so that the resulting JSON/XML has a decent structure.
At the moment, this situation is resolved by executing two queries and manually setting the user property of each message in a loop. Usable, but I imagine a more elegant way should be possible.
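For reference, that two-query workaround might look roughly like this (a sketch only; it assumes Message also exposes the user_id foreign key via getUserId(), which is not shown in the POJOs above):
List<Message> messages = jdbcTemplate.query(
        "SELECT * FROM message", new BeanPropertyRowMapper<>(Message.class));
for (Message message : messages) {
    // second query per message, then wire the nested object by hand
    User user = jdbcTemplate.queryForObject(
            "SELECT * FROM user WHERE user_id = ?",
            new BeanPropertyRowMapper<>(User.class), message.getUserId());
    message.setUser(user);
}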
Update: solution used.
Kudos to @Will Keeling for the inspiration for the answer with the custom row mapper; my solution adds bean property maps in order to automate the field assignments.
The caveat is structuring the query so that the relevant column names are prefixed with their table names (there is no standard convention for this, so the query is built programmatically):
SELECT title AS "message.title", question AS "message.question", user_name AS "user.user_name", display_name AS "user.display_name" FROM message, user WHERE user_id = message_id
The custom row mapper then creates several bean maps and sets their properties based on the prefix of each column, using the result set metadata to get the column names:
public Object mapRow(ResultSet rs, int i) throws SQLException {
    HashMap<String, BeanMap> beans_by_name = new HashMap<>();
    beans_by_name.put("message", BeanMap.create(new Message()));
    beans_by_name.put("user", BeanMap.create(new User()));

    ResultSetMetaData resultSetMetaData = rs.getMetaData();
    for (int colnum = 1; colnum <= resultSetMetaData.getColumnCount(); colnum++) {
        String table = resultSetMetaData.getColumnName(colnum).split("\\.")[0];
        String field = resultSetMetaData.getColumnName(colnum).split("\\.")[1];
        BeanMap beanMap = beans_by_name.get(table);
        if (rs.getObject(colnum) != null) {
            beanMap.put(field, rs.getObject(colnum));
        }
    }

    Message m = (Message) beans_by_name.get("message").getBean();
    m.setUser((User) beans_by_name.get("user").getBean());
    return m;
}
Again, this might seem like overkill for a two class join but the IRL use case involves multiple tables with tens of fields.
Perhaps you could pass in a custom RowMapper that could map each row of an aggregate join query (between message and user) to a Message and nested User. Something like this:
List<Message> messages = jdbcTemplate.query("SELECT * FROM message m, user u WHERE u.message_id = m.message_id", new RowMapper<Message>() {
    @Override
    public Message mapRow(ResultSet rs, int rowNum) throws SQLException {
        Message message = new Message();
        message.setTitle(rs.getString(1));
        message.setQuestion(rs.getString(2));
        User user = new User();
        user.setUserName(rs.getString(3));
        user.setDisplayName(rs.getString(4));
        message.setUser(user);
        return message;
    }
});
A bit late to the party, however I found this when I was googling the same question, and I found a different solution that may be favorable for others in the future.
Unfortunately there is no native way to achieve the nested scenario without making a custom RowMapper. However, I will share an easier way to make said custom RowMapper than some of the other solutions here.
Given your scenario, you can do the following:
class User {
    String user_name;
    String display_name;
}

class Message {
    String title;
    String question;
    User user;
}
public class MessageRowMapper implements RowMapper<Message> {

    @Override
    public Message mapRow(ResultSet rs, int rowNum) throws SQLException {
        User user = (new BeanPropertyRowMapper<>(User.class)).mapRow(rs, rowNum);
        Message message = (new BeanPropertyRowMapper<>(Message.class)).mapRow(rs, rowNum);
        message.setUser(user);
        return message;
    }
}
The key thing to remember with BeanPropertyRowMapper is that you have to follow the naming of your columns and the properties of your class members to the letter, with the following exceptions (see the Spring documentation):
column names are aliased exactly
column names with underscores are converted into "camel" case (i.e. MY_COLUMN_WITH_UNDERSCORES == myColumnWithUnderscores)
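A usage sketch for the mapper above, reusing the join query from the question (column names are assumed to line up with the bean properties as described):
List<Message> messages = jdbcTemplate.query(
        "SELECT * FROM message, user WHERE user_id = message_id",
        new MessageRowMapper());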
Spring introduced a new AutoGrowNestedPaths property on the BeanWrapper interface.
As long as the SQL query aliases the column names with a . separator (as before), the row mapper will automatically target inner objects.
With this, I created a new generic row mapper as follows:
QUERY:
SELECT title AS "message.title", question AS "message.question", user_name AS "user.user_name", display_name AS "user.display_name" FROM message, user WHERE user_id = message_id
ROW MAPPER:
package nested_row_mapper;
import org.springframework.beans.*;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.jdbc.support.JdbcUtils;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
public class NestedRowMapper<T> implements RowMapper<T> {

    private Class<T> mappedClass;

    public NestedRowMapper(Class<T> mappedClass) {
        this.mappedClass = mappedClass;
    }

    @Override
    public T mapRow(ResultSet rs, int rowNum) throws SQLException {
        T mappedObject = BeanUtils.instantiate(this.mappedClass);
        BeanWrapper bw = PropertyAccessorFactory.forBeanPropertyAccess(mappedObject);
        bw.setAutoGrowNestedPaths(true);

        ResultSetMetaData meta_data = rs.getMetaData();
        int columnCount = meta_data.getColumnCount();
        for (int index = 1; index <= columnCount; index++) {
            try {
                String column = JdbcUtils.lookupColumnName(meta_data, index);
                Object value = JdbcUtils.getResultSetValue(rs, index, Class.forName(meta_data.getColumnClassName(index)));
                bw.setPropertyValue(column, value);
            } catch (TypeMismatchException | NotWritablePropertyException | ClassNotFoundException e) {
                // Ignore
            }
        }
        return mappedObject;
    }
}
Update: 10/4/2015. I typically don't do any of this rowmapping anymore. You can accomplish selective JSON representation much more elegantly via annotations. See this gist.
I spent the better part of a full day trying to figure this out for my case of 3-layer nested objects and just finally nailed it. Here's my situation:
Accounts (i.e. users) --1tomany--> Roles --1tomany--> views (user is allowed to see)
(These POJO classes are pasted at the very bottom.)
And I wanted the controller to return an object like this:
[ {
  "id" : 3,
  "email" : "catchall@sdcl.org",
  "password" : "sdclpass",
  "org" : "Super-duper Candy Lab",
  "role" : {
    "id" : 2,
    "name" : "ADMIN",
    "views" : [ "viewPublicReports", "viewAllOrders", "viewProducts", "orderProducts", "viewOfferings", "viewMyData", "viewAllData", "home", "viewMyOrders", "manageUsers" ]
  }
}, {
  "id" : 5,
  "email" : "catchall@stereolab.com",
  "password" : "stereopass",
  "org" : "Stereolab",
  "role" : {
    "id" : 1,
    "name" : "USER",
    "views" : [ "viewPublicReports", "viewProducts", "orderProducts", "viewOfferings", "viewMyData", "home", "viewMyOrders" ]
  }
}, {
  "id" : 6,
  "email" : "catchall@ukmedschool.com",
  "password" : "ukmedpass",
  "org" : "University of Kentucky College of Medicine",
  "role" : {
    "id" : 2,
    "name" : "ADMIN",
    "views" : [ "viewPublicReports", "viewAllOrders", "viewProducts", "orderProducts", "viewOfferings", "viewMyData", "viewAllData", "home", "viewMyOrders", "manageUsers" ]
  }
} ]
A key point is to realize that Spring doesn't just do all this automatically for you. If you just ask it to return an Account item without doing the work of nested objects, you'll merely get:
{
  "id" : 6,
  "email" : "catchall@ukmedschool.com",
  "password" : "ukmedpass",
  "org" : "University of Kentucky College of Medicine",
  "role" : null
}
So, first, create your 3-table SQL JOIN query and make sure you're getting all the data you need. Here's mine, as it appears in my Controller:
@PreAuthorize("hasAuthority('ROLE_ADMIN')")
@RequestMapping("/accounts")
public List<Account> getAllAccounts3()
{
    List<Account> accounts = jdbcTemplate.query("SELECT Account.id, Account.password, Account.org, Account.email, Account.role_for_this_account, Role.id AS roleid, Role.name AS rolename, role_views.role_id, role_views.views FROM Account JOIN Role on Account.role_for_this_account=Role.id JOIN role_views on Role.id=role_views.role_id", new AccountExtractor() {});
    return accounts;
}
Note that I'm JOINing 3 tables. Now create a ResultSetExtractor class to put the nested objects together. The above examples show 2-layer nesting... this one goes a step further and does 3 levels. Note that I'm having to maintain the second-layer objects in a map as well.
public class AccountExtractor implements ResultSetExtractor<List<Account>> {

    @Override
    public List<Account> extractData(ResultSet rs) throws SQLException, DataAccessException {

        Map<Long, Account> accountmap = new HashMap<Long, Account>();
        Map<Long, Role> rolemap = new HashMap<Long, Role>();

        // loop through the JOINed resultset. If the account ID hasn't been seen before, create a new Account object.
        // In either case, add the role to the account. Also maintain a map of Roles and add view (strings) to them when encountered.
        Set<String> views = null;
        while (rs.next())
        {
            Long id = rs.getLong("id");
            Account account = accountmap.get(id);
            if (account == null)
            {
                account = new Account();
                account.setId(id);
                account.setPassword(rs.getString("password"));
                account.setEmail(rs.getString("email"));
                account.setOrg(rs.getString("org"));
                accountmap.put(id, account);
            }

            Long roleid = rs.getLong("roleid");
            Role role = rolemap.get(roleid);
            if (role == null)
            {
                role = new Role();
                role.setId(rs.getLong("roleid"));
                role.setName(rs.getString("rolename"));
                views = new HashSet<String>();
                rolemap.put(roleid, role);
            }
            else
            {
                views = role.getViews();
                views.add(rs.getString("views"));
            }
            views.add(rs.getString("views"));
            role.setViews(views);
            account.setRole(role);
        }
        return new ArrayList<Account>(accountmap.values());
    }
}
And this gives the desired output. POJOs below for reference. Note the #ElementCollection Set views in the Role class. This is what automatically generates the role_views table as referenced in the SQL query. Knowing that table exists, its name and its field names is crucial to getting the SQL query right. It feels wrong to have to know that... it seems like this should be more automagic -- isn't that what Spring is for?... but I couldn't figure out a better way. You've got to do the work manually in this case, as far as I can tell.
@Entity
public class Account implements Serializable {

    private static final long serialVersionUID = 1L;

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private long id;

    @Column(unique = true, nullable = false)
    private String email;

    @Column(nullable = false)
    private String password;

    @Column(nullable = false)
    private String org;

    private String phone;

    @ManyToOne(fetch = FetchType.EAGER, optional = false)
    @JoinColumn(name = "roleForThisAccount") // @JoinColumn means this side is the *owner* of the relationship. In general, the "many" side should be the owner, or so I read.
    private Role role;

    public Account() {}

    public Account(String email, String password, Role role, String org)
    {
        this.email = email;
        this.password = password;
        this.org = org;
        this.role = role;
    }

    // getters and setters omitted
}
@Entity
public class Role implements Serializable {

    private static final long serialVersionUID = 1L;

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private long id; // required

    @Column(nullable = false)
    @Pattern(regexp = "(ADMIN|USER)")
    private String name; // required

    @Column
    @ElementCollection(targetClass = String.class)
    private Set<String> views;

    @OneToMany(mappedBy = "role")
    private List<Account> accountsWithThisRole;

    public Role() {}

    // constructor with required fields
    public Role(String name)
    {
        this.name = name;
        views = new HashSet<String>();

        // both USER and ADMIN
        views.add("home");
        views.add("viewOfferings");
        views.add("viewPublicReports");
        views.add("viewProducts");
        views.add("orderProducts");
        views.add("viewMyOrders");
        views.add("viewMyData");

        // ADMIN ONLY
        if (name.equals("ADMIN"))
        {
            views.add("viewAllOrders");
            views.add("viewAllData");
            views.add("manageUsers");
        }
    }

    public long getId() { return this.id; }
    public void setId(long id) { this.id = id; }
    public String getName() { return this.name; }
    public void setName(String name) { this.name = name; }
    public Set<String> getViews() { return this.views; }
    public void setViews(Set<String> views) { this.views = views; }
}
I worked a lot on stuff like this and do not see an elegant way to achieve this without an OR mapper.
Any simple solution based on reflection would heavily rely on the 1:1 (or maybe N:1) relation. Further, the returned columns are not qualified by their type, so you cannot say which column matches which class.
You may get away with spring-data and QueryDSL. I did not dig into them, but I think you need some meta-data for the query that is later used to map back the columns from your database into a proper data structure.
You may also try the new PostgreSQL json support that looks promising.
NestedRowMapper worked for me; the important part is getting the SQL correct. The Message properties shouldn't have the class name in them, so the query should look like this:
QUERY:
SELECT title AS "title", question AS "question", user_name AS "user.user_name", display_name AS "user.display_name" FROM message, user WHERE user_id = message_id
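Putting it together, a usage sketch of the NestedRowMapper from the earlier answer with this corrected query (names as in the examples above):
List<Message> messages = jdbcTemplate.query(
        "SELECT title AS \"title\", question AS \"question\", "
                + "user_name AS \"user.user_name\", display_name AS \"user.display_name\" "
                + "FROM message, user WHERE user_id = message_id",
        new NestedRowMapper<>(Message.class));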

Java JPA "Error compiling the query" when it uses an enum

The following JPA query doesn't compile:
@NamedQuery(name = "PSA.findBySourceSystem",
    query = "SELECT p FROM PSA p WHERE p.sourceSystem.id = :sourceSystemId")
p.sourceSystem is the following enum:
public enum SourceSystem {
    FIRST(3, "ABC"), SECOND(9, "DEF"), THIRD(17, "GHI");
    private int id;
    private String code;
    ...
}
and is mapped in PSA's base class:
public class PsaBase implements Serializable {
    @Column(name = "sourceSystemId")
    @Enumerated(EnumType.ORDINAL)
    protected SourceSystem sourceSystem;
    ...
}
The query compiles and runs fine if I replace p.sourceSystem.id in the query with something more benign.
Thank you in advance for any help.
It shouldn't compile.
You have to resolve the required enum value manually before passing it as a query parameter:
@NamedQuery(name = "PSA.findBySourceSystem",
    query = "SELECT p FROM PSA p WHERE p.sourceSystem = :sourceSystem")
public enum SourceSystem {
    ...
    private static Map<Integer, SourceSystem> valuesById = new HashMap<Integer, SourceSystem>();

    static {
        for (SourceSystem s : values())
            valuesById.put(s.id, s);
    }

    public static SourceSystem findById(int id) {
        return valuesById.get(id);
    }
}
em.createNamedQuery("PSA.findBySourceSystem")
    .setParameter("sourceSystem", SourceSystem.findById(sourceSystemId));
EDIT:
Since sourceSystem is annotated as @Enumerated(EnumType.ORDINAL), it is stored in the database as the ordinal number of the corresponding enum value, therefore FIRST is stored as 0. JPA doesn't directly support using an arbitrary field of the enum value to identify it in the database. If your database schema assumes so, you can do the following trick to decouple the state of your object from the database schema:
public class PsaBase implements Serializable {

    protected SourceSystem sourceSystem;

    @Column(name = "sourceSystemId")
    public Integer getSourceSystemId() {
        return sourceSystem.getId();
    }

    public void setSourceSystemId(Integer id) {
        this.sourceSystem = SourceSystem.findById(id);
    }

    ... getter and setter of sourceSystem annotated with @Transient ...
}
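With that property-based mapping in place, the named query can target the raw column value directly; a hypothetical variant (renamed here for illustration):
@NamedQuery(name = "PSA.findBySourceSystemId",
    query = "SELECT p FROM PSA p WHERE p.sourceSystemId = :sourceSystemId")
The caller would then bind a plain Integer, e.g. setParameter("sourceSystemId", SourceSystem.FIRST.getId()).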
