I am stuck on the issue of identifying which constraint triggers a DataIntegrityViolationException. I have two unique constraints, username and email, but I have had no luck figuring it out.
I have tried to get the root-cause exception, but all I get is this message:
Unique index or primary key violation: "UK_6DOTKOTT2KJSP8VW4D0M25FB7_INDEX_4 ON PUBLIC.USERS(EMAIL) VALUES ('copeland#yahoo.com', 21)"; SQL statement:
insert into users (id, created_at, updated_at, country, email, last_name, name, password, phone, sex, username) values (null, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) [23505-193]
Reading the error, I know the email constraint triggered the violation, but I want to return something like this to the user:
{type: ERROR, message: "The email already exists"}
I have read in other posts that people handle this by looking for a constraint name in the exception (e.g., users_unique_username_idx) and displaying a proper message to the user. But I couldn't get that type of constraint name.
Maybe I am missing a configuration. I am using:
Spring Boot 1.5.1.RELEASE, JPA, Hibernate and H2
My application.properties
spring.jpa.generate-ddl=true
User.class:
@Entity(name = "users")
public class User extends BaseEntity {
private static final Logger LOGGER = LoggerFactory.getLogger(User.class);
public enum Sex { MALE, FEMALE }
@Id
@GeneratedValue
private Long id;
@Column(name = "name", length = 100)
@NotNull(message = "error.name.notnull")
private String name;
@Column(name = "lastName", length = 100)
@NotNull(message = "error.lastName.notnull")
private String lastName;
@Column(name = "email", unique = true, length = 100)
@NotNull(message = "error.email.notnull")
private String email;
@Column(name = "username", unique = true, length = 100)
@NotNull(message = "error.username.notnull")
private String username;
@Column(name = "password", length = 100)
@NotNull(message = "error.password.notnull")
@JsonProperty(access = JsonProperty.Access.WRITE_ONLY)
private String password;
@Enumerated(EnumType.STRING)
private Sex sex;
@Column(name = "phone", length = 50)
private String phone;
@Column(name = "country", length = 100)
@NotNull(message = "error.country.notnull")
private String country;
public User() {}
// Getters and setters
}
ControllerValidationHandler.class
@ControllerAdvice
public class ControllerValidationHandler {
private final Logger LOGGER = LoggerFactory.getLogger(ControllerValidationHandler.class);
@Autowired
private MessageSource msgSource;
private static Map<String, String> constraintCodeMap = new HashMap<String, String>() {
{
put("users_unique_username_idx", "exception.users.duplicate_username");
put("users_unique_email_idx", "exception.users.duplicate_email");
}
};
// I saw this solution in another Stack Overflow answer but it does not work
// for me. It is the closest solution to my problem that I have found
@ResponseStatus(value = HttpStatus.CONFLICT) // 409
@ExceptionHandler(DataIntegrityViolationException.class)
@ResponseBody
public ErrorInfo conflict(HttpServletRequest req, DataIntegrityViolationException e) {
String rootMsg = ValidationUtil.getRootCause(e).getMessage();
LOGGER.info("rootMessage" + rootMsg);
if (rootMsg != null) {
Optional<Map.Entry<String, String>> entry = constraintCodeMap.entrySet().stream()
.filter((it) -> rootMsg.contains(it.getKey()))
.findAny();
LOGGER.info("Has entries: " + entry.isPresent()); // false
if (entry.isPresent()) {
LOGGER.info("Value: " + entry.get().getValue());
e=new DataIntegrityViolationException(
msgSource.getMessage(entry.get().getValue(), null, LocaleContextHolder.getLocale()));
}
}
return new ErrorInfo(req, e);
}
The response at this moment is:
{"timestamp":1488063801557,"status":500,"error":"Internal Server Error","exception":"org.springframework.dao.DataIntegrityViolationException","message":"could not execute statement; SQL [n/a]; constraint [\"UK_6DOTKOTT2KJSP8VW4D0M25FB7_INDEX_4 ON PUBLIC.USERS(EMAIL) VALUES ('copeland#yahoo.com', 21)\"; SQL statement:\ninsert into users (id, created_at, updated_at, country, email, last_name, name, password, phone, sex, username) values (null, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) [23505-193]]; nested exception is org.hibernate.exception.ConstraintViolationException: could not execute statement","path":"/users"}
UPDATE
This is my service layer that handles my persistence operations.
MysqlUserService.class
@Service
@Qualifier("mysql")
class MysqlUserService implements UserService {
private UserRepository userRepository;
@Autowired
public MysqlUserService(UserRepository userRepository) {
this.userRepository = userRepository;
}
@Override
public List<User> findAll() {
return userRepository.findAll();
}
@Override
public Page<User> findAll(Pageable pageable) {
return userRepository.findAll(pageable);
}
@Override
public User findOne(Long id) {
return userRepository.findOne(id);
}
@Override
public User store(User user) {
return userRepository.save(user);
}
@Override
public User update(User usr) {
User user = this.validateUser(usr);
return userRepository.save(user);
}
@Override
public void destroy(Long id) {
this.validateUser(id);
userRepository.delete(id);
}
private User validateUser(User usr) {
return validateUser(usr.getId());
}
/**
* Validate that a user exists
*
* @param id the id of the user
* @return the existing User
*/
private User validateUser(Long id) {
User user = userRepository.findOne(id);
if (user == null) {
throw new UserNotFoundException();
}
return user;
}
}
Update #2
Repo to reproduce the issue: https://github.com/LTroya/boot-users. I commented out my handler in ValidationExceptionHandler.class in order to see the exception.
Send the JSON under "Json to test" in the Readme.md twice to POST /users/.
Rather than specifying the unique column requirement on the @Column annotation, you can define named constraints on the @Table annotation that JPA provides, which gives you further control over those constraints.
@Entity
@Table(uniqueConstraints = {
@UniqueConstraint(name = "UC_email", columnNames = { "email" } ),
@UniqueConstraint(name = "UC_username", columnNames = { "username" } )
})
There are now two ways for handling the exception:
In the controller
You could elect to place the parsing logic in your controller: simply catch the DataIntegrityViolationException that Spring throws and parse it there. Something like the following pseudo code:
public ResponseBody myFancyControllerMethod(...) {
try {
final User user = userService.myFactoryServiceMethod(...);
}
catch ( DataIntegrityViolationException e ) {
// handle exception parsing & setting the appropriate error here
}
}
The ultimate crux of this approach, for me, is that we've moved code that handles persistence problems up two layers rather than keeping it in the layer immediately above the persistence tier. This means that if we have multiple controllers which need to handle this scenario, we find ourselves doing one of the following:
Introduce some abstract base controller to place the logic.
Introduce some helper class with static methods we call for reuse.
Cut-n-paste the code - Yes this happens more than we think.
Placing the code in the presentation tier also introduces concerns when you need to share that service with other consumer types that may not actually be returning some type of HTML view.
This is why I recommend pushing the logic down one more level.
In the service
This is a cleaner approach because we push the validation of the constraint handling to the layer above the persistence layer, which is meant to ultimately be where we handle persistence failures. Not only that, our code actually documents the failure conditions and we can elect either to ignore or handle them based on context.
The caveat here is that I'd recommend you create specific exception classes that you throw from your service tier code in order to identify the unique constraint failures and throw those after you have parsed the ConstraintViolationException from Hibernate.
In your web controller, rest controller, or whatever other consumer that is calling into your service, you simply need to catch the appropriate exception class if necessary and branch accordingly. Here's some service pseudo code:
public User myFancyServiceMethod(...) {
try {
// do your stuff here
return userRepository.save( user );
}
catch( ConstraintViolationException e ) {
if ( isExceptionUniqueConstraintFor( e, "UC_email" ) ) {
throw new EmailAddressAlreadyExistsException();
}
else if ( isExceptionUniqueConstraintFor( e, "UC_username" ) ) {
throw new UserNameAlreadyExistsException();
}
}
}
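A minimal sketch of what that helper and the custom exceptions might look like (the method body and the use of Hibernate's getConstraintName() are illustrative assumptions, not code from the original answer):
private boolean isExceptionUniqueConstraintFor(ConstraintViolationException e, String constraintName) {
// Hibernate exposes the violated constraint's name; it can be null for some dialects,
// in which case you could fall back to inspecting e.getSQLException().getMessage()
return e.getConstraintName() != null
&& e.getConstraintName().toUpperCase().contains(constraintName.toUpperCase());
}
public class EmailAddressAlreadyExistsException extends RuntimeException { }
public class UserNameAlreadyExistsException extends RuntimeException { }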
You can specify the unique constraints separately, but you'd need to do that at the entity level, like this:
@Entity(name = "users")
@Table(name = "users", uniqueConstraints = {
@UniqueConstraint(name = "users_unique_username_idx", columnNames = "username"),
@UniqueConstraint(name = "users_unique_email_idx", columnNames = "email")
})
public class User extends BaseEntity { ... }
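With the constraints named like that in the generated DDL, the root-cause message of the DataIntegrityViolationException should contain the constraint name, so the map-based lookup from the question can match it. One hedged note: H2 reports identifiers in upper case (as the UK_... name in the original error suggests), so a case-insensitive comparison is safer. A small sketch using Spring's NestedExceptionUtils:
String rootMsg = NestedExceptionUtils.getMostSpecificCause(e).getMessage();
if (rootMsg != null && rootMsg.toUpperCase().contains("USERS_UNIQUE_EMAIL_IDX")) {
// resolve "exception.users.duplicate_email" via the MessageSource and return it
}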
Related
I am new to Hibernate and Spring Data JPA. I am trying to insert into my table, but the Hibernate query contains some columns that do not exist in my table, so it throws an error. At first, when I ran my code, Hibernate added these extra columns to my table; then I changed the spring.jpa.hibernate.ddl-auto value to none in application.properties. Now that I have deleted those extra columns from my table, when I try to insert a new record I still see those columns in the insert statement.
My Entity classes
@Entity
public class Content {
@Id
@NotNull
@GeneratedValue
Integer id;
//this can be null if it is a question
@Column(name = "content_id")
Integer content_id;
@NotBlank @NotNull
@Column(name = "body")
String body;
@Column(name = "creationDate")
Timestamp creationDate;
@NotNull
@Column(name = "user_id")
Integer user_id;
public Integer getId() {
return id;
}
public void setId(Integer id) {
this.id = id;
}
public Integer getContent_id() {
return content_id;
}
public void setContent_id(Integer content_id) {
this.content_id = content_id;
}
public String getBody() {
return body;
}
public void setBody(String body) {
this.body = body;
}
public Timestamp getCreationDate() {
return creationDate;
}
public void setCreationDate(Timestamp creationDate) {
this.creationDate = creationDate;
}
public int getUser_id() {
return user_id;
}
public void setUser_id(Integer user_id) {
this.user_id = user_id;
}
}
My Question class extends Content:
@Entity
public class Question extends Content {
@NotNull @NotBlank
@Column(name = "subject")
String subject;
@NotNull @NotBlank
@Column(name = "tags")
String tags;
@NotNull
@Column(name = "contentType")
final Integer contentType_id = 1;
@Column(name = "commentCount")
Integer commentCount;
public Question(@Valid @JsonProperty("subject") String subject,
@Valid @JsonProperty("tags") String tags,
@Valid @JsonProperty("body") String body) {
this.subject = subject;
this.tags = tags;
this.body = body;
}
public Integer getContentType_id() {
return contentType_id;
}
public String getSubject() {
return subject;
}
public void setSubject(String subject) {
this.subject = subject;
}
public String getTags() {
return tags;
}
public void setTags(String tags) {
this.tags = tags;
}
public Integer getCommentCount() {
return commentCount;
}
public void setCommentCount(Integer commentCount) {
this.commentCount = commentCount;
}
}
Service class
@Service
public class QuestionService {
@Autowired
QuestionRepository questionRepository;
public QuestionService(QuestionRepository questionRepository) {
this.questionRepository = questionRepository;
}
public Question postQuestion(Question question){
return questionRepository.save(question);
}
}
Controller
@RequestMapping("easy4lazy/questions")
@RestController
public class QuestionController {
private final QuestionService questionService;
private final int contentType = 1;
@Autowired
public QuestionController(QuestionService questionService) {
this.questionService = questionService;
}
@PostMapping(path = "/postQuestion" )
public Question postQuestion(@RequestBody Question q){
q.setContent_id(contentType);
return questionService.postQuestion(q);
}
}
Repository
import com.easy4lazy.proj.model.Question;
import org.springframework.data.repository.CrudRepository;
public interface QuestionRepository extends CrudRepository<Question, Integer> {
}
Error code
Hibernate: insert into content (body, content_id, creation_date, user_id, comment_count, content_type, subject, tags, dtype, id) values (?, ?, ?, ?, ?, ?, ?, ?,'Question', ?)
2019-10-10 18:11:36.513 WARN 11960 --- [nio-8080-exec-3] o.h.engine.jdbc.spi.SqlExceptionHelper : SQL Error: 1054, SQLState: 42S22
2019-10-10 18:11:36.515 ERROR 11960 --- [nio-8080-exec-3] o.h.engine.jdbc.spi.SqlExceptionHelper : Unknown column 'creation_date' in 'field list'
2019-10-10 18:11:36.520 ERROR 11960 --- [nio-8080-exec-3] o.h.i.ExceptionMapperStandardImpl : HHH000346: Error during managed flush [org.hibernate.exception.SQLGrammarException: could not execute statement]
2019-10-10 18:11:36.547 ERROR 11960 --- [nio-8080-exec-3] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is org.springframework.dao.InvalidDataAccessResourceUsageException: could not execute statement; SQL [n/a]; nested exception is org.hibernate.exception.SQLGrammarException: could not execute statement] with root cause
I don't have content_id, creation_date, comment_count, or dtype fields in my table, and I don't know why Hibernate adds them to the query.
Is there any way to change the query that Hibernate creates, or to fix this problem in any other way? How can I control or manage the queries created by Hibernate?
I should also mention that I use Postman to send data and test my code.
After a lot of searching and experimenting, I found that Hibernate's naming convention separates words in column names with an underscore, and that is why I saw those columns in the query Hibernate generated. So if you have a field in your class like creationDate, Hibernate converts it to creation_date; once I renamed all my columns to follow this convention, the problem was solved. Also, the dtype column is a special column that Hibernate creates when several classes use the same table for inserting data; it distinguishes which class inserted the record, and Hibernate fills it with the name of that class.
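If renaming the columns is not an option, another approach (a sketch, assuming Hibernate 5 with a recent Spring Boot; the property name may differ in your version) is to tell Hibernate to use the physical names exactly as written instead of applying the camelCase-to-snake_case conversion:
# application.properties: take @Column/@Table names verbatim
spring.jpa.hibernate.naming.physical-strategy=org.hibernate.boot.model.naming.PhysicalNamingStrategyStandardImpl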
But you do have content_id and creation_date in your Content entity which the Question entity extends from
After first starting the API, the first 3 user-creation POST requests fail (screenshot at bottom of post) with the error below (unique constraint violation).
Subsequent requests work, with the first created user having an id of 4, then 5, etc...
How can I make the user creation work on the first (3) tries?
I suspect this relates to the pre-seeding of my users, which I'm doing with the below script. Possibly the auto ID generation first tries 1,2,3 -- which are already in use?
INSERT INTO user
VALUES (1, 'user1', 'pass1', 'ADMIN');
INSERT INTO user
VALUES (2, 'user2', 'pass2', 'USER');
INSERT INTO user
VALUES (3, 'user3', 'pass3', 'ADMIN')
could not execute statement; SQL [n/a]; constraint [\"PRIMARY KEY ON
PUBLIC.USER(ID)\"; SQL statement:\ninsert into user (name, password,
role, id) values (?, ?, ?, ?) [23505-196]]; nested exception is
org.hibernate.exception.ConstraintViolationException: could not
execute statement",
@RestController
public class UserResource {
@Autowired
private UserRepository userRepository;
@GetMapping("/users")
public List<User> retrieveAllUsers() {
return userRepository.findAll();
}
@DeleteMapping("/users/{id}")
public void deleteUser(@PathVariable Long id) {
userRepository.deleteById(id);
}
@PostMapping("/users")
public ResponseEntity<Object> createUser(@RequestBody User user) {
User savedUser = userRepository.save(user);
URI location = ServletUriComponentsBuilder.fromCurrentRequest()
.path("/{id}")
.buildAndExpand(savedUser.getId())
.toUri();
return ResponseEntity.created(location).build();
}
}
-
@Entity
@Table(name = "user")
public class User {
@Id
@GeneratedValue
private Long id;
private String name;
private String password;
@Enumerated(EnumType.STRING)
private Role role;
public User() {
super();
}
public User(Long id, String name, String password, Role role) {
this.id = id;
this.name = name;
this.password = password;
this.role = role;
}
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getPassword() {
return password;
}
public void setPassword(String password) {
this.password = password;
}
public Role getRole() {
return role;
}
public void setRole(Role role) {
this.role = role;
}
}
edit - added role class
public enum Role {
USER, ADMIN
}
If you are using AUTO_INCREMENT in the column definition, then try changing strategy from GenerationType.AUTO to GenerationType.IDENTITY.
I noticed a similar behavior when I upgraded a project from Spring Boot 1.5 to 2.0.
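For reference, a minimal sketch of what that change looks like on the entity, assuming the id column is defined as AUTO_INCREMENT / IDENTITY in the database:
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY) // the database assigns ids, so the pre-seeded rows 1-3 are not reused
private Long id;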
Just an assumption: first you insert data with SQL, but in your code you create a new user and save it to the DB. This new creation assigns id 1, but your DB already has a user record whose primary key is 1. Please remove all the pre-seeded values from the DB and create your records through the REST controller instead.
In my opinion, use a sequence like this (don't forget to create the sequence in the DB):
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "user_generator")
@SequenceGenerator(name="user_generator", sequenceName = "user_seq", allocationSize=50)
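The matching DDL might look something like this (a sketch; the sequence name must match sequenceName above, and the increment should match allocationSize):
-- hypothetical sequence for user_generator; START WITH 4 skips the three pre-seeded ids
CREATE SEQUENCE user_seq START WITH 4 INCREMENT BY 50;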
Or read the options below and choose the one that fits your problem:
AUTO: Hibernate selects the generation strategy based on the database dialect used,
IDENTITY: Hibernate relies on an auto-incremented database column to generate the primary key,
SEQUENCE: Hibernate requests the primary key value from a database sequence,
TABLE: Hibernate uses a database table to simulate a sequence.
PS: IDENTITY is probably the most relevant here, but try the others if needed.
You should also provide the column names to make sure the values map to the right columns:
INSERT INTO user (id, name, password, role)
VALUES (1, 'user1', 'pass1', 'ADMIN');
INSERT INTO user (id, name, password, role)
VALUES (2, 'user2', 'pass2', 'USER');
INSERT INTO user (id, name, password, role)
VALUES (3, 'user3', 'pass3', 'ADMIN')
I have these entities that I want to relate bi-directionally.
Credential:
@Entity
@Access(AccessType.PROPERTY)
@Table(name = "credential")
public class Credential extends MetaInfo implements Serializable {
...
private Email email;
...
@OneToOne(cascade = CascadeType.ALL, optional = false, orphanRemoval = true)
@JoinColumn(name="email", referencedColumnName="email_address")
public Email getEmail() {
return email;
}
public void setEmail(Email email) {
this.email = email;
}
...
}
Email:
@Entity
@Access(AccessType.PROPERTY)
@Table(name = "email")
public class Email extends MetaInfo implements Serializable{
...
private Credential credential;
public Email() {
}
public Email(String emailAddress) {
this.emailAddress = emailAddress;
}
@Id
@Column(name="email_address")
public String getEmailAddress() {
return emailAddress;
}
public void setEmailAddress(String emailAddress) {
this.emailAddress = emailAddress;
}
@OneToOne(mappedBy = "email", optional=false)
public Credential getCredential() {
return credential;
}
public void setCredential(Credential credential) {
this.credential = credential;
}
}
In a CredentialRepository class I am testing whether the passed-in email
is not assigned to any user except for the user with the username passed-in as the second (optional) parameter:
@Override
public boolean emailIsAssigned(String... args) {
assert(args.length > 0);
if(InputValidators.isValidEmail.test(args[0])){
EntityManager em = entityManagerFactory.createEntityManager();
try {
TypedQuery<Long> count = em.createQuery("SELECT COUNT(e) "
+ "FROM Email e WHERE e.emailAddress "
+ "= :email AND e "
+ "IN (SELECT c.email FROM Credential c WHERE c.username "
+ "!= :username)", Long.TYPE).setParameter("email", args[0])
.setParameter("username", null);
if(InputValidators.stringNotNullNorEmpty.apply(args[1])){
//only if the username has been provided
count.setParameter("username", args[1]);
}
return count.getSingleResult() > 0;
} catch (Exception e) {
System.out.println(e.getMessage());
return false;
} finally {
em.close();
}
}else{
throw new NotAValidEmailException(args[0] + " is not a"
+ " valid email address.");
}
}
Thus above args[0] is the email under test and args[1] is the username under test.
And this is the test that is causing me problems (note that I have already successfully tested inserts, updates, and even the emailIsAssigned method, but without the c.email part, which seems to cause the issue):
@Test
public void emailAlreadyExistsTest(){
assertTrue(credentialRepo.emailIsAssigned("existing_email#yahoo.ca"));
}
And this is the error message that I have:
[EL Warning]: 2017-04-17 17:55:33.606--ServerSession(234430897)--Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.5.2.v20140319-9ad6abd): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Can't write; duplicate key in table '#sql-3e4_9a'
Error Code: 1022
Call: ALTER TABLE credential ADD CONSTRAINT FK_credential_email FOREIGN KEY (email) REFERENCES email (email_address)
Query: DataModifyQuery(sql="ALTER TABLE credential ADD CONSTRAINT FK_credential_email FOREIGN KEY (email) REFERENCES email (email_address)")
I would appreciate it if someone could give me a piece of advice. I could always just change the email into a String and mark it as "unique" in @Column, but I feel that there is no reason for the chosen approach not to work.
I am using MySQL as the DB vendor, and Eclipse-Link JPA implementation. I did try to "hard-change" the name of the FK constraint but to no avail. The DB and all tables have the same collation (utf8_unicode_ci).
Try deleting the primary key in the Email class, because it extends MetaInfo.
I want to save a User instance to an H2 DB.
And I get the following exception when saving the new user to the DB:
Caused by: org.h2.jdbc.JdbcSQLException: Column count does not match; SQL statement:
INSERT INTO Users (user_id, user_name, user_birthday, user_email, user_role, user_tickets)
VALUES (?, ?, ?, ?, ?, ) [21002-191]
Here is DAO snippet:
@Override
public Integer create(User entity) {
String sql = "INSERT INTO Users (user_id, user_name, user_birthday, user_email, user_role, user_tickets) " +
"VALUES (:id, :name, :birthday, :email, :role, :tickets)";
SqlParameterSource parameterSource =
new MapSqlParameterSource("id", entity.getId())
.addValue("name", entity.getName())
.addValue("birthday", entity.getBirthday())
.addValue("email", entity.getEmail())
.addValue("role", entity.getRole())
.addValue("tickets", entity.getBookedTickets());
Logger.info("Create user: " + entity);
return getNamedParameterJdbcTemplate().update(sql, parameterSource); <== It fails here
}
SQL script for creating DB looks as follows:
----------------------
-- Create Users table
----------------------
CREATE TABLE Users (
user_id INTEGER PRIMARY KEY NOT NULL,
user_name VARCHAR(30) NULL,
user_birthday DATETIME NULL,
user_email VARCHAR(30) NULL,
user_role VARCHAR(20) NULL,
user_tickets VARCHAR(100) NULL,
);
-----------------------
-- Create Tickets table
-----------------------
CREATE TABLE Tickets (
tick_id INTEGER PRIMARY KEY NOT NULL,
event_id VARCHAR(30),
tick_price DECIMAL(8,2),
user_id INTEGER,
);
Here is User POJO:
public class User {
private Integer id;
private String name;
private Calendar birthday;
private String email;
private String role;
private Set<Ticket> bookedTickets = new HashSet<>();
// getters / setters
I suppose that it can't write to Set<Ticket>, but I don't know how to resolve this issue.
UPDATE:
For performing DB access, I am using Spring JDBC, specifically NamedParameterJdbcTemplate:
<bean id="dataSource" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
<property name="driverClassName" value="${jdbc.driverClassName}"/>
<property name="url" value="${jdbc.url}"/>
<property name="username" value="${jdbc.username}"/>
<property name="password" value="${jdbc.password}"/>
</bean>
<bean class="net.lelyak.edu.dao.NamedParameterJdbcDaoImpl">
<property name="dataSource" ref="dataSource"/>
</bean>
public class NamedParameterJdbcDaoImpl extends NamedParameterJdbcDaoSupport {
@Autowired
private DataSource dataSource;
@PostConstruct
private void initialize() {
setDataSource(dataSource);
}
}
DAO implementation:
@Repository
public class UserDaoImpl extends NamedParameterJdbcDaoImpl implements IGenericDao<User, Integer> {
@Override
public Integer create(User entity) {
// todo move SQL queries to utility class
String sql = "INSERT INTO Users (user_id, user_name, user_birthday, user_email, user_role, user_tickets) " +
"VALUES (:id, :name, :birthday, :email, :role, :tickets)";
// see create() at above text
Any suggestion?
Column user_tickets is VARCHAR(100), but the value you assign to :tickets is a Set<Ticket>, so how is that supposed to work?
Spring doesn't know what you are doing, but it assumes that you're building an IN clause when using a multi-valued argument, e.g. x IN (:tickets), so it replaces the :tickets with the appropriate number of parameter markers. E.g. if your set had 3 values, it would become x IN (?,?,?).
Your Set is empty, so no markers are generated. Technically, I think it should have thrown an exception, because that wouldn't be valid even for an IN clause, but it doesn't.
So, what do you expect the value of column user_tickets to be if your Set<Ticket> had values? The string version of the Set, e.g. [Ticket1, Ticket2]? If so, then call toString().
.addValue("tickets", entity.getBookedTickets().toString());
Then cross your fingers and hope that won't exceed 100 characters.
Sorry, I had forgotten to mention this.
My assumption is that a user can have many tickets and one ticket belongs to only one user. You can't save a whole collection in a single database cell, so the solution is to change the relation and save the user id on the ticket. Below is everything. I created a service class which checks whether the user is already in the database and, if not, saves it.
@Entity
public class User {
@Id
@GeneratedValue(strategy=GenerationType.AUTO)
private Integer id;
private String name;
private Calendar birthday;
private String email;
private String role;
@Transient
private Set<Ticket> bookedTickets = new HashSet<>(); // I can't save a collection into the database
//getters and setters
}
@Entity
public class Ticket {
@Id
@GeneratedValue(strategy=GenerationType.AUTO)
private Integer id;
private String desc;
private int number;
/** the customer who owns this ticket */
@ManyToOne
private User user;
//getters and setters
}
in userDAO method to save:
public void save(User user){
String sql = "INSERT INTO User ( name, birthday, email, role) VALUES (:name, :birthday, :email, :role)";
SqlParameterSource parameterSource =
new MapSqlParameterSource("name", user.getName())
.addValue("birthday", user.getBirthday())
.addValue("email", user.getEmail())
.addValue("role", user.getRole());
namedParameterJdbcTemplate.update(sql, parameterSource);
sql="SELECT id FROM User WHERE name = :name AND birthday=:birthday AND email=:email AND role=:role";
Integer id = namedParameterJdbcTemplate.query(sql, parameterSource, new ResultSetExtractor<Integer>() {
@Override
public Integer extractData(ResultSet result) throws SQLException, DataAccessException {
// advance to the first row before reading the id
return result.next() ? result.getInt("id") : null;
}
});
user.setId(id);
}
in ticketDAO method for save:
public void save(Ticket ticket){
String sql = "INSERT INTO Ticket (desc , number, user_id) VALUES (:desc, :number, :userId)";
SqlParameterSource parameterSource =
new MapSqlParameterSource("desc", ticket.getDesc())
.addValue("number", ticket.getNumber())
.addValue("userId", ticket.getUser().getId());
namedParameterJdbcTemplate.update(sql, parameterSource);
}
and service for saveTickets:
public class UserService {
private TicketDAO ticketDAO;
private UserDAO userDAO;
public void saveTicketsForUser(User user){
if(user.getId()==null){
//if user is not saved in database
userDAO.save(user);
}else{
//if you have this client in database, you don't need to save client
}
for(Ticket ticket: user.getBookedTickets()){
ticket.setUser(user);
ticketDAO.save(ticket);
}
}
}
You can inject the DAO classes into the service using XML.
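For example, a hypothetical XML wiring (bean and package names are illustrative, and it assumes UserService exposes setters for the two DAOs):
<bean id="userDAO" class="net.lelyak.edu.dao.UserDAO"/>
<bean id="ticketDAO" class="net.lelyak.edu.dao.TicketDAO"/>
<bean id="userService" class="net.lelyak.edu.service.UserService">
<property name="userDAO" ref="userDAO"/>
<property name="ticketDAO" ref="ticketDAO"/>
</bean>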
The solution was to redesign the logic a little bit, moving the main saving logic to a parent abstract class:
public abstract class BaseDAO<ENTITY extends BaseEntity> extends NamedParameterJdbcDaoSupport implements IGenericDao<ENTITY> {
private final String tableName;
private final Class<ENTITY> entityClass;
private final List<String> fields;
private final String insertSQL;
private final String updateSQL;
public BaseDAO(Class<ENTITY> entityClass, String tableName, List<String> fields) {
this.entityClass = entityClass;
this.tableName = tableName;
this.fields = fields;
// init SQLs
StringBuilder sbInsertSQL = new StringBuilder();
StringBuilder sbUpdateSQL = new StringBuilder();
sbInsertSQL.append("INSERT INTO ").append(tableName).append(" (");
sbUpdateSQL.append("UPDATE ").append(tableName).append(" SET ");
for (int i = 0; i < fields.size(); i++) {
if (i > 0) {
sbInsertSQL.append(", ");
sbUpdateSQL.append(", ");
}
sbInsertSQL.append(fields.get(i));
sbUpdateSQL.append(fields.get(i)).append("=:").append(fields.get(i));
}
sbInsertSQL.append(") ").append("VALUES (");
for (int i = 0; i < fields.size(); i++) {
if (i > 0) {
sbInsertSQL.append(",");
}
sbInsertSQL.append(":").append(fields.get(i));
}
sbInsertSQL.append(")\n");
sbUpdateSQL.append(" WHERE id=:id\n");
this.insertSQL = sbInsertSQL.toString();
this.updateSQL = sbUpdateSQL.toString();
Logger.debug("BaseDAO(), insertSQL: [" + insertSQL + "]");
Logger.debug("BaseDAO(), updateSQL: [" + updateSQL + "]");
}
@Override
public Long save(ENTITY entity) {
long res;
if (entity.getId() == null) {
res = insert(entity);
} else {
update(entity);
res = entity.getId();
}
return res;
}
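The excerpt stops before the insert and update helpers; a hypothetical sketch of what they could look like, assuming the generated insertSQL/updateSQL above and a BeanPropertySqlParameterSource over the entity (so the entity's getters must match the field names used in the SQL):
protected Long insert(ENTITY entity) {
KeyHolder keyHolder = new GeneratedKeyHolder();
getNamedParameterJdbcTemplate().update(insertSQL, new BeanPropertySqlParameterSource(entity), keyHolder);
return keyHolder.getKey().longValue();
}
protected void update(ENTITY entity) {
getNamedParameterJdbcTemplate().update(updateSQL, new BeanPropertySqlParameterSource(entity));
}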
A child DAO will then look like the following:
public class UserDAO extends BaseDAO<User> {
private static final String USER_TABLE_NAME = "t_user";
private static final String userFields[] = {"name", "birthday", "email", "password", "role", "enabled"};
public UserDAO() {
super(User.class, USER_TABLE_NAME, Arrays.asList(userFields));
}
Also, make user table with auto-incrementing id:
----------------------
-- create t_user table
----------------------
CREATE TABLE t_user (
id INT GENERATED ALWAYS AS IDENTITY CONSTRAINT pk_user PRIMARY KEY,
name VARCHAR(60) NOT NULL,
birthday DATE,
email VARCHAR(60),
password VARCHAR(100),
role VARCHAR(300),
enabled SMALLINT(6)
);
Also, the model should be updated with parent logic:
public abstract class BaseEntity {
protected Long id = null;
protected String name;
public class User extends BaseEntity {
private Date birthday;
private String email;
private String password;
private String role;
private boolean enabled;
Given the following example POJOs (assume getters and setters for all properties):
class User {
String user_name;
String display_name;
}
class Message {
String title;
String question;
User user;
}
One can easily query a database (Postgres in my case) and populate a list of Message classes using a BeanPropertyRowMapper, where the DB fields match the properties in the POJO (assume the DB tables have fields corresponding to the POJO properties):
NamedParameterDatbase.query("SELECT * FROM message", new BeanPropertyRowMapper(Message.class));
I'm wondering - is there a convenient way to construct a single query and / or create a row mapper in such a way to also populate the properties of the inner 'user' POJO within the message.
That is, some syntactical magic where each result row in the query:
SELECT * FROM message, user WHERE user_id = message_id
produces a list of Messages with the associated User populated.
Use Case:
Ultimately, the classes are passed back as a serialised object from a Spring Controller, the classes are nested so that the resulting JSON / XML has a decent structure.
At the moment, this situation is resolved by executing two queries and manually setting the user property of each message in a loop. Useable, but I imagine a more elegant way should be possible.
Update : Solution Used -
Kudos to @Will Keeling for the inspiration for the answer using the custom row mapper. My solution adds bean property maps in order to automate the field assignments.
The caveat is structuring the query so that the relevant table names are prefixed (however, there is no standard convention for this, so the query is built programmatically):
SELECT title AS "message.title", question AS "message.question", user_name AS "user.user_name", display_name AS "user.display_name" FROM message, user WHERE user_id = message_id
The custom row mapper then creates several bean maps and sets their properties based on the prefix of the column: (using meta data to get the column name).
public Object mapRow(ResultSet rs, int i) throws SQLException {
HashMap<String, BeanMap> beans_by_name = new HashMap();
beans_by_name.put("message", BeanMap.create(new Message()));
beans_by_name.put("user", BeanMap.create(new User()));
ResultSetMetaData resultSetMetaData = rs.getMetaData();
for (int colnum = 1; colnum <= resultSetMetaData.getColumnCount(); colnum++) {
String table = resultSetMetaData.getColumnName(colnum).split("\\.")[0];
String field = resultSetMetaData.getColumnName(colnum).split("\\.")[1];
BeanMap beanMap = beans_by_name.get(table);
if (rs.getObject(colnum) != null) {
beanMap.put(field, rs.getObject(colnum));
}
}
Message m = (Message)beans_by_name.get("message").getBean();
m.setUser((User)beans_by_name.get("user").getBean());
return m;
}
Again, this might seem like overkill for a two class join but the IRL use case involves multiple tables with tens of fields.
Perhaps you could pass in a custom RowMapper that could map each row of an aggregate join query (between message and user) to a Message and nested User. Something like this:
List<Message> messages = jdbcTemplate.query("SELECT * FROM message m, user u WHERE u.message_id = m.message_id", new RowMapper<Message>() {
@Override
public Message mapRow(ResultSet rs, int rowNum) throws SQLException {
Message message = new Message();
message.setTitle(rs.getString(1));
message.setQuestion(rs.getString(2));
User user = new User();
user.setUserName(rs.getString(3));
user.setDisplayName(rs.getString(4));
message.setUser(user);
return message;
}
});
A bit late to the party however I found this when I was googling the same question and I found a different solution that may be favorable for others in the future.
Unfortunately there is no native way to achieve the nested scenario without writing a custom RowMapper. However, I will share an easier way to make said custom RowMapper than some of the other solutions here.
Given your scenario you can do the following:
class User {
String user_name;
String display_name;
}
class Message {
String title;
String question;
User user;
}
public class MessageRowMapper implements RowMapper<Message> {
@Override
public Message mapRow(ResultSet rs, int rowNum) throws SQLException {
User user = (new BeanPropertyRowMapper<>(User.class)).mapRow(rs,rowNum);
Message message = (new BeanPropertyRowMapper<>(Message.class)).mapRow(rs,rowNum);
message.setUser(user);
return message;
}
}
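Usage is then the same as for any other RowMapper (a sketch, assuming a plain JdbcTemplate and the joined query from the question):
List<Message> messages = jdbcTemplate.query(
"SELECT * FROM message, user WHERE user_id = message_id",
new MessageRowMapper());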
The key thing to remember with BeanPropertyRowMapper is that you have to follow the naming of your columns and the properties of your class members to the letter with the following exceptions (see Spring Documentation):
column names (or their aliases) are matched exactly
column names with underscores will be converted into "camel" case (i.e. MY_COLUMN_WITH_UNDERSCORES == myColumnWithUnderscores)
Spring introduced the AutoGrowNestedPaths property on the BeanWrapper interface.
As long as the SQL query aliases the column names with a . separator (as before), the row mapper will automatically target nested objects.
With this, I created a new generic row mapper as follows:
QUERY:
SELECT title AS "message.title", question AS "message.question", user_name AS "user.user_name", display_name AS "user.display_name" FROM message, user WHERE user_id = message_id
ROW MAPPER:
package nested_row_mapper;
import org.springframework.beans.*;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.jdbc.support.JdbcUtils;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
public class NestedRowMapper<T> implements RowMapper<T> {
private Class<T> mappedClass;
public NestedRowMapper(Class<T> mappedClass) {
this.mappedClass = mappedClass;
}
@Override
public T mapRow(ResultSet rs, int rowNum) throws SQLException {
T mappedObject = BeanUtils.instantiate(this.mappedClass);
BeanWrapper bw = PropertyAccessorFactory.forBeanPropertyAccess(mappedObject);
bw.setAutoGrowNestedPaths(true);
ResultSetMetaData meta_data = rs.getMetaData();
int columnCount = meta_data.getColumnCount();
for (int index = 1; index <= columnCount; index++) {
try {
String column = JdbcUtils.lookupColumnName(meta_data, index);
Object value = JdbcUtils.getResultSetValue(rs, index, Class.forName(meta_data.getColumnClassName(index)));
bw.setPropertyValue(column, value);
} catch (TypeMismatchException | NotWritablePropertyException | ClassNotFoundException e) {
// Ignore
}
}
return mappedObject;
}
}
Update: 10/4/2015. I typically don't do any of this rowmapping anymore. You can accomplish selective JSON representation much more elegantly via annotations. See this gist.
I spent the better part of a full day trying to figure this out for my case of 3-layer nested objects and just finally nailed it. Here's my situation:
Accounts (i.e. users) --1tomany--> Roles --1tomany--> views (user is allowed to see)
(These POJO classes are pasted at the very bottom.)
And I wanted the controller to return an object like this:
[ {
"id" : 3,
"email" : "catchall#sdcl.org",
"password" : "sdclpass",
"org" : "Super-duper Candy Lab",
"role" : {
"id" : 2,
"name" : "ADMIN",
"views" : [ "viewPublicReports", "viewAllOrders", "viewProducts", "orderProducts", "viewOfferings", "viewMyData", "viewAllData", "home", "viewMyOrders", "manageUsers" ]
}
}, {
"id" : 5,
"email" : "catchall#stereolab.com",
"password" : "stereopass",
"org" : "Stereolab",
"role" : {
"id" : 1,
"name" : "USER",
"views" : [ "viewPublicReports", "viewProducts", "orderProducts", "viewOfferings", "viewMyData", "home", "viewMyOrders" ]
}
}, {
"id" : 6,
"email" : "catchall#ukmedschool.com",
"password" : "ukmedpass",
"org" : "University of Kentucky College of Medicine",
"role" : {
"id" : 2,
"name" : "ADMIN",
"views" : [ "viewPublicReports", "viewAllOrders", "viewProducts", "orderProducts", "viewOfferings", "viewMyData", "viewAllData", "home", "viewMyOrders", "manageUsers" ]
}
} ]
A key point is to realize that Spring doesn't just do all this automatically for you. If you just ask it to return an Account item without doing the work of nested objects, you'll merely get:
{
"id" : 6,
"email" : "catchall#ukmedschool.com",
"password" : "ukmedpass",
"org" : "University of Kentucky College of Medicine",
"role" : null
}
So, first, create your 3-table SQL JOIN query and make sure you're getting all the data you need. Here's mine, as it appears in my Controller:
@PreAuthorize("hasAuthority('ROLE_ADMIN')")
@RequestMapping("/accounts")
public List<Account> getAllAccounts3()
{
List<Account> accounts = jdbcTemplate.query("SELECT Account.id, Account.password, Account.org, Account.email, Account.role_for_this_account, Role.id AS roleid, Role.name AS rolename, role_views.role_id, role_views.views FROM Account JOIN Role on Account.role_for_this_account=Role.id JOIN role_views on Role.id=role_views.role_id", new AccountExtractor() {});
return accounts;
}
Note that I'm JOINing 3 tables. Now create a ResultSetExtractor class to put the nested objects together. The above examples show 2-layer nesting... this one goes a step further and does 3 levels. Note that I'm having to maintain the second-layer object in a map as well.
public class AccountExtractor implements ResultSetExtractor<List<Account>>{
@Override
public List<Account> extractData(ResultSet rs) throws SQLException, DataAccessException {
Map<Long, Account> accountmap = new HashMap<Long, Account>();
Map<Long, Role> rolemap = new HashMap<Long, Role>();
// loop through the JOINed resultset. If the account ID hasn't been seen before, create a new Account object.
// In either case, add the role to the account. Also maintain a map of Roles and add view (strings) to them when encountered.
Set<String> views = null;
while (rs.next())
{
Long id = rs.getLong("id");
Account account = accountmap.get(id);
if(account == null)
{
account = new Account();
account.setId(id);
account.setPassword(rs.getString("password"));
account.setEmail(rs.getString("email"));
account.setOrg(rs.getString("org"));
accountmap.put(id, account);
}
Long roleid = rs.getLong("roleid");
Role role = rolemap.get(roleid);
if(role == null)
{
role = new Role();
role.setId(rs.getLong("roleid"));
role.setName(rs.getString("rolename"));
views = new HashSet<String>();
rolemap.put(roleid, role);
}
else
{
views = role.getViews();
views.add(rs.getString("views"));
}
views.add(rs.getString("views"));
role.setViews(views);
account.setRole(role);
}
return new ArrayList<Account>(accountmap.values());
}
}
And this gives the desired output. POJOs below for reference. Note the @ElementCollection Set views in the Role class. This is what automatically generates the role_views table referenced in the SQL query. Knowing that that table exists, along with its name and field names, is crucial to getting the SQL query right. It feels wrong to have to know that... it seems like this should be more automagic -- isn't that what Spring is for?... but I couldn't figure out a better way. You've got to do the work manually in this case, as far as I can tell.
@Entity
public class Account implements Serializable {
private static final long serialVersionUID = 1L;
@Id
@GeneratedValue(strategy=GenerationType.AUTO)
private long id;
@Column(unique=true, nullable=false)
private String email;
@Column(nullable = false)
private String password;
@Column(nullable = false)
private String org;
private String phone;
@ManyToOne(fetch = FetchType.EAGER, optional = false)
@JoinColumn(name = "roleForThisAccount") // @JoinColumn means this side is the *owner* of the relationship. In general, the "many" side should be the owner, or so I read.
private Role role;
public Account() {}
public Account(String email, String password, Role role, String org)
{
this.email = email;
this.password = password;
this.org = org;
this.role = role;
}
// getters and setters omitted
}
@Entity
public class Role implements Serializable {
private static final long serialVersionUID = 1L;
@Id
@GeneratedValue(strategy=GenerationType.AUTO)
private long id; // required
@Column(nullable = false)
@Pattern(regexp="(ADMIN|USER)")
private String name; // required
@Column
@ElementCollection(targetClass=String.class)
private Set<String> views;
@OneToMany(mappedBy="role")
private List<Account> accountsWithThisRole;
public Role() {}
// constructor with required fields
public Role(String name)
{
this.name = name;
views = new HashSet<String>();
// both USER and ADMIN
views.add("home");
views.add("viewOfferings");
views.add("viewPublicReports");
views.add("viewProducts");
views.add("orderProducts");
views.add("viewMyOrders");
views.add("viewMyData");
// ADMIN ONLY
if(name.equals("ADMIN"))
{
views.add("viewAllOrders");
views.add("viewAllData");
views.add("manageUsers");
}
}
public long getId() { return this.id;}
public void setId(long id) { this.id = id; };
public String getName() { return this.name; }
public void setName(String name) { this.name = name; }
public Set<String> getViews() { return this.views; }
public void setViews(Set<String> views) { this.views = views; };
}
I worked a lot on stuff like this and do not see an elegant way to achieve this without an OR mapper.
Any simple solution based on reflection would rely heavily on the 1:1 (or maybe N:1) relation. Furthermore, the columns returned are not qualified by their type, so you cannot say which column matches which class.
You may get away with spring-data and QueryDSL. I did not dig into them, but I think you need some meta-data for the query that is later used to map back the columns from your database into a proper data structure.
You may also try the new PostgreSQL json support that looks promising.
NestedRowMapper worked for me; the important part is getting the SQL right. The Message properties shouldn't have the class name in them, so the query should look like this:
QUERY:
SELECT title AS "title", question AS "question", user_name AS "user.user_name", display_name AS "user.display_name" FROM message, user WHERE user_id = message_id
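Putting the two together, usage might look like this (a sketch, assuming a plain JdbcTemplate; a NamedParameterJdbcTemplate works the same way):
List<Message> messages = jdbcTemplate.query(
"SELECT title AS \"title\", question AS \"question\", "
+ "user_name AS \"user.user_name\", display_name AS \"user.display_name\" "
+ "FROM message, user WHERE user_id = message_id",
new NestedRowMapper<>(Message.class));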