PreparedStatement of JDBCTemplate throws exception "Before start of result set" [duplicate] - java

This question already has answers here:
ResultSet exception - before start of result set
(6 answers)
Closed 5 years ago.
I am using Spring JdbcTemplate, and I have a query to get data by ID.
I have this table schema:
+---------------+--------------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+---------------+--------------+------+-----+---------+-------+
| id | varchar(150) | NO | PRI | NULL | |
| position_name | varchar(150) | NO | | NULL | |
| description | text | YES | | NULL | |
+---------------+--------------+------+-----+---------+-------+
And I run it using this template:
public Position fetchById(final String id) throws Exception {
    // TODO Auto-generated method stub
    String sql = "SELECT * FROM position WHERE id = ?";
    return jdbcTemplate.query(sql, new PreparedStatementSetter() {
        public void setValues(PreparedStatement ps) throws SQLException {
            // TODO Auto-generated method stub
            ps.setString(1, id);
        }
    }, new ResultSetExtractor<Position>() {
        public Position extractData(ResultSet rs) throws SQLException,
                DataAccessException {
            // TODO Auto-generated method stub
            Position p = new Position();
            p.setId(rs.getString("id"));
            p.setPositionName(rs.getString("position_name"));
            p.setDescription(rs.getString("description"));
            return p;
        }
    });
}
But when I run a unit test like this:
@Test
public void getPositionByIdTest() throws Exception {
    String id = "35910510-ef2f-11e5-9ce9-5e5517507c66";
    Position p = positionService.getPositionById(id);
    Assert.assertNotNull(p);
    Assert.assertEquals("Project Manager", p.getPositionName());
}
I get the following error:
org.springframework.dao.TransientDataAccessResourceException: PreparedStatementCallback; SQL [SELECT * FROM position WHERE id = ?]; Before start of result set; nested exception is java.sql.SQLException: Before start of result set
at org.springframework.jdbc.support.SQLStateSQLExceptionTranslator.doTranslate(SQLStateSQLExceptionTranslator.java:108)
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:73)
...
Caused by: java.sql.SQLException: Before start of result set
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:957)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:896)
...
How do I use a PreparedStatement in a SELECT query with JdbcTemplate?
Thank you.

You need to call ResultSet#next() to "move the cursor forward one row from its current position." As you are expecting a single row to be returned from your query, you can call this in an if statement as shown below:
public Position extractData(ResultSet rs) throws SQLException,
        DataAccessException {
    Position p = new Position();
    if (rs.next()) {
        p.setId(rs.getString("id"));
        p.setPositionName(rs.getString("position_name"));
        p.setDescription(rs.getString("description"));
    }
    return p;
}
If you were expecting to process multiple results and return a collection of some sort, you would do while(rs.next()) and process a row on each iteration of the loop.
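For example, a minimal sketch of a ResultSetExtractor<List<Position>> that collects every row into a list (the Position setters and column names are taken from the question; java.util imports are assumed):
public List<Position> extractData(ResultSet rs) throws SQLException,
        DataAccessException {
    List<Position> positions = new ArrayList<Position>();
    while (rs.next()) { // advance the cursor once per row
        Position p = new Position();
        p.setId(rs.getString("id"));
        p.setPositionName(rs.getString("position_name"));
        p.setDescription(rs.getString("description"));
        positions.add(p);
    }
    return positions;
}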
Also, as you are using JdbcTemplate you could consider using a RowMapper instead which may simplify your implementation slightly.

You have a simple use case, yet you use one of the more complex query methods; why? Next, you are using a ResultSetExtractor whereas you probably want a RowMapper instead. If you use a ResultSetExtractor you will have to iterate over the result set yourself. Replace your code with the following:
return jdbcTemplate.queryForObject(sql, new RowMapper<Position>() {
    public Position mapRow(ResultSet rs, int row) throws SQLException,
            DataAccessException {
        Position p = new Position();
        p.setId(rs.getString("id"));
        p.setPositionName(rs.getString("position_name"));
        p.setDescription(rs.getString("description"));
        return p;
    }
}, id);
}
So instead of using one of the more complex methods, use one that suits what you need (here, queryForObject with a RowMapper returns the single Position directly). The JdbcTemplate uses a PreparedStatement internally anyway.

If you use a ResultSetExtractor you must iterate through the result yourself using next() calls. This explains the error: the ResultSet is still positioned before the first row when you read its values.
For your use case - to select a record for a given id - there is a simpler solution using JdbcTemplate.queryForObject and a RowMapper lambda:
String sql = "SELECT * FROM position WHERE id = ?";
Position position = (Position) jdbcTemplate.queryForObject(
sql, new Object[] { id }, (ResultSet rs, int rowNum) -> {
Position p = new Position();
p.setId(rs.getString("id"));
p.setPositionName(rs.getString("position_name"));
p.setDescription(rs.getString("description"));
r eturn p;
});

Related

Insert a row into a table referencing a value from another table

I have two tables, subject and student, respectively:
subject:
 id | name
----+------
  1 | Math
student:
 id | student_name | score | subject_id
----+--------------+-------+------------
 11 | Mark         | 78.5  | 2
I have the Java code
public boolean insertStudent(String student_name,float score,String name) {
boolean res=false;
try(PreparedStatement statement=this.c.prepareStatement(INSERT_STU);){
statement.setString(1,student_name);
statement.setFloat(2,score);
statement.setString(3,name);
res=statement.execute();
}catch (SQLException e){
e.printStackTrace();
throw new RuntimeException(e);
}
return res;
}
I have the query string for INSERT_STU
private static final String INSERT_STU="INSERT INTO student(student_name,score,subject_id) VALUES(?,?,(SELECT id from subject where name=?))";
This works in Postgres, but is there any other way to do it?
*It has to be passed the subject name (String name) and not the subject id. Since I won't know the subject_id, I pass the name.
You can change your query to this, which is probably more readable:
INSERT INTO student (student_name,score,subject_id)
SELECT ?,?,id from subject where name = ?
But be careful here: if we can't find any match in the subject table for the given name, nothing will be inserted.
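For illustration, a sketch of how the insertStudent method from the question could use this form; the constant name INSERT_STU and the Connection field c are carried over from the question, and executeUpdate is used so the row count tells you whether anything was actually inserted:
private static final String INSERT_STU =
    "INSERT INTO student (student_name, score, subject_id) " +
    "SELECT ?, ?, id FROM subject WHERE name = ?";

public boolean insertStudent(String student_name, float score, String name) {
    try (PreparedStatement statement = this.c.prepareStatement(INSERT_STU)) {
        statement.setString(1, student_name);
        statement.setFloat(2, score);
        statement.setString(3, name);
        // executeUpdate returns the number of inserted rows;
        // 0 means no subject matched the given name and nothing was written
        return statement.executeUpdate() > 0;
    } catch (SQLException e) {
        throw new RuntimeException(e);
    }
}
Note that this also sidesteps the original execute() call, whose boolean return value indicates whether a result set was produced rather than whether the insert succeeded.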

Insert 1000s of records with relationship and ignore duplicates using JDBC & MySQL

I am refactoring some code that was horribly inefficient but am still seeing huge load on both my MySQL and Java servers. We have an endpoint that allows a user to upload a CSV file containing contacts with a first name, last name, phone number, and email address. The phone number and email address need to be unique for a location. The phone number, however, is stored in a separate table from the contact, as a contact can have more than one. The CSV only allows one, but they can update a contact manually to add more. Our users will likely upload files as big as 50,000 records.
This is my pertinent SQL structure:
Contact Table
+----+-----------+----------+------------------+------------+
| id | firstName | lastName | email | locationId |
+----+-----------+----------+------------------+------------+
| 1 | John | Doe | jdoe@noemail.com | 1 |
+----+-----------+----------+------------------+------------+
Contact Phone Table
+----+-----------+--------------+---------+
| id | contactId | number | primary |
+----+-----------+--------------+---------+
| 1 | 1 | +15555555555 | 1 |
+----+-----------+--------------+---------+
| 2 | 1 | +11231231234 | 0 |
+----+-----------+--------------+---------+
There are unique composite constraints on email & locationId in the contact table and contactId & number in the contact phone table.
The original programmer just looped through the CSV in Java, queried for the phone number and email (two separate queries), and inserted one row at a time if there wasn't a match. It was horrible and would just kill our server.
This is my latest attempt:
Stored Procedure:
DELIMITER $$
CREATE PROCEDURE save_bulk_contact(IN last_name VARCHAR(128), IN first_name VARCHAR(128), IN email VARCHAR(320), IN location_id BIGINT, IN phone_number VARCHAR(15))
BEGIN
DECLARE insert_id BIGINT;
INSERT INTO contact
(`lastName`, `firstName`, `primaryEmail`, `locationId`, `firstActiveDate`)
VALUE (last_name, first_name, email, location_id, UNIX_TIMESTAMP() * 1000);
SET insert_id = LAST_INSERT_ID();
INSERT INTO contact_phone
(`contactId`, `number`, `type`, `primary`)
VALUE (insert_id, phone_number, 'CELL', 1);
END$$
DELIMITER ;
Then in Java I query for all of the contacts with phone numbers for the location, loop through them, remove the duplicates, and then use a batch update to insert them all.
Service Layer:
private ContactUploadJSON uploadContacts(ContactUploadJSON contactUploadJSON) throws HandledDataAccessException {
List<ContactUploadData> returnList = new ArrayList<>();
if (contactUploadJSON.getContacts() != null) {
List<Contact> existingContacts = contactRepository.getContactsByLocationId(contactUploadJSON.getLocationId());
List<ContactUploadData> uploadedContacts = contactUploadJSON.getContacts();
Iterator<ContactUploadData> uploadedContactsIterator = uploadedContacts.iterator();
while (uploadedContactsIterator.hasNext()) {
ContactUploadData current = uploadedContactsIterator.next();
boolean anyMatch = existingContacts.stream().anyMatch(existingContact -> {
try {
boolean contactFound = contactEqualsContactUploadData(existingContact, current);
if(contactFound) {
contactUploadJSON.incrementExisted();
current.setError("Duplicate Contact: " + StringUtils.joinWith(" ", existingContact.getFirstName(), existingContact.getLastName()));
returnList.add(current);
}
return contactFound;
} catch (PhoneParsingException | PhoneNotValidException e) {
contactUploadJSON.incrementFailed();
current.setError("Failed with error: " + e.getMessage());
returnList.add(current);
return true;
}
});
if(anyMatch) {
uploadedContactsIterator.remove();
}
}
contactUploadJSON.setCreated(uploadedContacts.size());
if(!uploadedContacts.isEmpty()){
contactRepository.insertBulkContacts(uploadedContacts, contactUploadJSON.getLocationId());
}
}
contactUploadJSON.setContacts(returnList);
return contactUploadJSON;
}
private static boolean contactEqualsContactUploadData(Contact contact, ContactUploadData contactUploadData) throws PhoneParsingException, PhoneNotValidException {
if(contact == null || contactUploadData == null) {
return false;
}
String normalizedPhone = PhoneUtils.validatePhoneNumber(contactUploadData.getMobilePhone());
List<ContactPhone> contactPhones = contact.getPhoneNumbers();
if(contactPhones != null && contactPhones.stream().anyMatch(contactPhone -> StringUtils.equals(contactPhone.getNumber(), normalizedPhone))) {
return true;
}
return (StringUtils.isNotBlank(contactUploadData.getEmail()) &&
StringUtils.equals(contact.getPrimaryEmail(), contactUploadData.getEmail())) ||
(contact.getPrimaryPhoneNumber() != null &&
StringUtils.equals(contact.getPrimaryPhoneNumber().getNumber(), normalizedPhone));
}
Repository Code:
public void insertBulkContacts(List<ContactUploadData> contacts, long locationId) throws HandledDataAccessException {
String sql = "CALL save_bulk_contact(:last_name, :first_name, :email, :location_id, :phone_number)";
try {
List<Map<String, Object>> contactsList = new ArrayList<>();
contacts.forEach(contact -> {
Map<String, Object> contactMap = new HashMap<>();
contactMap.put("last_name", contact.getLastName());
contactMap.put("first_name", contact.getFirstName());
contactMap.put("email", contact.getEmail());
contactMap.put("location_id", locationId);
contactMap.put("phone_number", contact.getMobilePhone());
contactsList.add(contactMap);
});
Map<String, Object>[] paramList = contactsList.toArray(new Map[0]);
namedJdbcTemplate.batchUpdate(sql, paramList);
} catch (DataAccessException e) {
log.severe("Failed to insert contacts:\n" + ExceptionUtils.getStackTrace(e));
throw new HandledDataAccessException("Failed to insert contacts");
}
}
The returned ContactUploadJSON contains the contact list, the locationId, and metrics for added, already existing, and failed contacts.
This solution works, but I am wondering if there are better approaches. In the future we are going to want a mechanism for updating contacts, not just inserting new ones, so I have to plan accordingly. Is it possible to do this all in MySQL? Would it be more efficient? I think the one-to-many relationship with the compound unique constraint makes it more difficult.

While inserting String value (Java) in Enum field (DB), getting "Data truncated for column"

In my MySQL table there is one enum field, 'spe_gender'.
mysql> desc tbl_sswltdata_persons;
+-----------------+-----------------------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+-----------------+-----------------------+------+-----+---------+----------------+
| spe_id | bigint(20) unsigned | NO | PRI | NULL | auto_increment |
| spe_sen_id | bigint(20) unsigned | NO | MUL | NULL | |
| spe_gender | enum('male','female') | YES | | NULL | |
| spe_is_deceased | tinyint(1) | NO | | 0 | |
| spe_birth_place | varchar(255) | YES | | NULL | |
| spe_create_date | datetime | YES | | NULL | |
| spe_update_date | datetime | YES | | NULL | |
+-----------------+-----------------------+------+-----+---------+----------------+
7 rows in set (0.00 sec)
So I created one POJO class as:
public class SswltdataPersons implements Serializable {
private static final long serialVersionUID = 1L;
private long spe_id;
private long spe_sen_id;
private String spe_gender;
private String spe_is_deceased;
private String spe_birth_place;
private String spe_create_date;
private String spe_update_date;
// .........
public String getSpe_gender() {
return spe_gender;
}
public void setSpe_gender(String spe_gender) {
this.spe_gender = spe_gender;
}
// ......
}
When I try to write data into this table, I get an exception:
org.springframework.dao.DataIntegrityViolationException: PreparedStatementCallback; SQL
[INSERT INTO iwpro_imp.tbl_sswltdata_persons VALUES(?,?,?,?,?,?,?)];
Data truncated for column 'spe_gender' at row 1; nested exception is java.sql.BatchUpdateException: Data truncated for column 'spe_gender' at row 1
I think the issue is with inserting a String value (through Java) into an Enum field (in the DB). Here is the method where I am getting the exception:
@Transactional(value="transactionManager_iwpro_imp", rollbackFor = Exception.class)
public void saveAllPersons(final List<SswltdataPersons> list) {
String sql = "INSERT INTO iwpro_imp.tbl_sswltdata_persons VALUES(?,?,?,?,?,?,?)";
try{
jdbcTemplate.update("SET foreign_key_checks = 0");
List<List<SswltdataPersons>> batchLists = Lists.partition(list, batchSize);
for(final List<SswltdataPersons> batch : batchLists) {
BatchPreparedStatementSetter bpss = new BatchPreparedStatementSetter() {
@Override
public void setValues(PreparedStatement ps, int index) throws SQLException {
SswltdataPersons dataObject = batch.get(index);
ps.setLong(1, dataObject.getSpe_id());
ps.setLong(2, dataObject.getSpe_sen_id());
ps.setString(3, dataObject.getSpe_gender());
ps.setString(4, dataObject.getSpe_is_deceased());
ps.setString(5, dataObject.getSpe_birth_place());
ps.setString(6, dataObject.getSpe_create_date());
ps.setString(7, dataObject.getSpe_update_date());
}
@Override
public int getBatchSize() {
return batch.size();
}
};
jdbcTemplate.batchUpdate(sql, bpss);
}
jdbcTemplate.update("SET foreign_key_checks = 1");
}catch(Exception e){
TransactionAspectSupport.currentTransactionStatus().setRollbackOnly();
logger.error("\n\nUnexpected Exception:\n", e);
e.printStackTrace();
}
}
Can't I insert this enum value in DB?
In your Java code, declare spe_gender as an enum type:
private Gender spe_gender;
where Gender is an Enum class
public enum Gender {
MALE,
FEMALE
}
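If you go that route, one way to bind the value in the existing BatchPreparedStatementSetter is to map the enum onto the lowercase string the MySQL enum('male','female') column expects. A sketch, assuming getSpe_gender() is changed to return the Gender enum:
Gender gender = dataObject.getSpe_gender(); // assumes the getter now returns Gender
if (gender == null) {
    ps.setNull(3, java.sql.Types.VARCHAR); // store SQL NULL when gender is unknown
} else {
    // MySQL matches enum columns by string value, so bind the lowercase name
    ps.setString(3, gender.name().toLowerCase());
}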
To answer that properly, you would have to give an example of the data actually being inserted.
Anyway, regarding the exception you get: you're most likely not inserting "male" or "female". Since it says "Data truncated for column 'spe_gender' at row 1", the data you're inserting differs from, and is in fact longer (more characters) than, the allowed values.
Also, check if there's an actual method to insert enums rather than setString. EDIT: there is not.
Thanks Mick Mnemonic for the suggestion. It worked.
Replacing
ps.setString(3, dataObject.getSpe_gender());
with
ps.setString(3, dataObject.getSpe_gender().isEmpty() ? null : dataObject.getSpe_gender());
worked for me. Thanks all.

setAllowMultiQueries in MySQL Connector/J 6.0.4 [duplicate]

Hi, I was wondering if it is possible to execute something like this using JDBC, as it currently throws an exception even though it is possible in the MySQL Query Browser.
"SELECT FROM * TABLE;INSERT INTO TABLE;"
While I do realize that it is possible by splitting the SQL query string and executing the statements separately, I was wondering if there is a one-shot approach for this.
String url = "jdbc:mysql://localhost:3306/";
String dbName = "databaseinjection";
String driver = "com.mysql.jdbc.Driver";
String sqlUsername = "root";
String sqlPassword = "abc";
Class.forName(driver).newInstance();
connection = DriverManager.getConnection(url+dbName, sqlUsername, sqlPassword);
I was wondering if it is possible to execute something like this using JDBC.
"SELECT FROM * TABLE;INSERT INTO TABLE;"
Yes, it is possible. There are two ways, as far as I know:
1. By setting a database connection property to allow multiple queries, separated by a semicolon by default.
2. By calling a stored procedure that returns implicit cursors.
The following examples demonstrate the above two possibilities.
Example 1 (to allow multiple queries):
While sending a connection request, you need to append the connection property allowMultiQueries=true to the database URL. This is in addition to any connection properties that already exist, such as autoReconnect=true. Acceptable values for the allowMultiQueries property are true, false, yes, and no; any other value is rejected at runtime with an SQLException.
String dbUrl = "jdbc:mysql:///test?allowMultiQueries=true";
Unless this property is set, an SQLException is thrown.
You have to use execute( String sql ) or its other variants to fetch results of the query execution.
boolean hasMoreResultSets = stmt.execute( multiQuerySqlString );
To iterate through and process the results, you need the following steps:
READING_QUERY_RESULTS: // label
while ( hasMoreResultSets || stmt.getUpdateCount() != -1 ) {
if ( hasMoreResultSets ) {
ResultSet rs = stmt.getResultSet();
// handle your rs here
} // if has rs
else { // if ddl/dml/...
int queryResult = stmt.getUpdateCount();
if ( queryResult == -1 ) { // no more queries processed
break READING_QUERY_RESULTS;
} // no more queries processed
// handle success, failure, generated keys, etc here
} // if ddl/dml/...
// check to continue in the loop
hasMoreResultSets = stmt.getMoreResults();
} // while results
Example 2: Steps to follow:
Create a procedure with one or more SELECT and DML queries.
Call it from Java using CallableStatement.
You can capture multiple ResultSets executed in the procedure.
DML results can't be captured, but you can issue another SELECT to find how the rows in the table were affected.
Sample table and procedure:
mysql> create table tbl_mq( i int not null auto_increment, name varchar(10), primary key (i) );
Query OK, 0 rows affected (0.16 sec)
mysql> delimiter //
mysql> create procedure multi_query()
-> begin
-> select count(*) as name_count from tbl_mq;
-> insert into tbl_mq( name ) values ( 'ravi' );
-> select last_insert_id();
-> select * from tbl_mq;
-> end;
-> //
Query OK, 0 rows affected (0.02 sec)
mysql> delimiter ;
mysql> call multi_query();
+------------+
| name_count |
+------------+
| 0 |
+------------+
1 row in set (0.00 sec)
+------------------+
| last_insert_id() |
+------------------+
| 3 |
+------------------+
1 row in set (0.00 sec)
+---+------+
| i | name |
+---+------+
| 1 | ravi |
+---+------+
1 row in set (0.00 sec)
Query OK, 0 rows affected (0.00 sec)
Call Procedure from Java:
CallableStatement cstmt = con.prepareCall( "call multi_query()" );
boolean hasMoreResultSets = cstmt.execute();
while ( hasMoreResultSets ) {
ResultSet rs = cstmt.getResultSet();
// handle your rs here
hasMoreResultSets = cstmt.getMoreResults(); // move on to the next result set
} // while has more rs
You can use batch updates, but the queries must be action queries (i.e. INSERT, UPDATE, and DELETE):
Statement s = c.createStatement();
String s1 = "update emp set name='abc' where salary=984";
String s2 = "insert into emp values ('Osama',1420)";
s.addBatch(s1);
s.addBatch(s2);
s.executeBatch();
Hint: If you have more than one connection property then separate them with:
&
To give you something like:
url="jdbc:mysql://localhost/glyndwr?autoReconnect=true&allowMultiQueries=true"
I hope this helps someone.
Regards,
Glyn
Based on my testing, the correct flag is "allowMultiQueries=true"
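As a rough sketch (the URL and credentials are the placeholders from the question; java.util.Properties and java.sql imports assumed), the same flag can also be passed through a Properties object instead of being appended to the URL:
Properties props = new Properties();
props.setProperty("user", "root");
props.setProperty("password", "abc");
// Same effect as appending allowMultiQueries=true to the JDBC URL
props.setProperty("allowMultiQueries", "true");
Connection connection = DriverManager.getConnection(
        "jdbc:mysql://localhost:3306/databaseinjection", props);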
Why don't you try and write a Stored Procedure for this?
You can get the Result Set out and in the same Stored Procedure you can Insert what you want.
The only thing is you might not get the newly inserted rows in the Result Set if you Insert after the Select.
I think this is the easiest way for multiple selects/updates/inserts/deletes. You can run as many updates/inserts/deletes as you want after a select (you have to make a select first, a dummy one if needed) with executeUpdate(str) (just capture each count in its own int, e.g. count1, count2, ...), and if you need a new selection, close the 'statement' and 'connection' and make new ones for the next select. For example:
String str1 = "select * from users";
String str9 = "INSERT INTO `port`(device_id, potition, port_type, di_p_pt) VALUE ('"+value1+"', '"+value2+"', '"+value3+"', '"+value4+"')";
String str2 = "Select port_id from port where device_id = '"+value1+"' and potition = '"+value2+"' and port_type = '"+value3+"' ";
try{
Class.forName("com.mysql.jdbc.Driver").newInstance();
theConnection=(Connection) DriverManager.getConnection(dbURL,dbuser,dbpassword);
theStatement = theConnection.prepareStatement(str1);
ResultSet theResult = theStatement.executeQuery();
int count8 = theStatement.executeUpdate(str9);
theStatement.close();
theConnection.close();
theConnection=DriverManager.getConnection(dbURL,dbuser,dbpassword);
theStatement = theConnection.prepareStatement(str2);
theResult = theStatement.executeQuery();
ArrayList<Port> portList = new ArrayList<Port>();
while (theResult.next()) {
Port port = new Port();
port.setPort_id(theResult.getInt("port_id"));
portList.add(port);
}
} catch (Exception e) {
e.printStackTrace();
}
I hope it helps

Populate complete domain model with Spring JDBC

TL;DR: How do you use Spring JDBC to populate a complex domain model in the best way?
I've previously only used JPA to retrieve stuff from the database, but our db admins complained about how many queries the framework sent to the database and how inefficient they were, so on our new project we decided to try out Spring JDBC instead. I started to implement retrieval of our fairly complex domain model using a one-query-per-entity approach, but the logic to put results where they belong in the model became difficult to follow very quickly.
For example: Items can have many Actions affect them and an Action can affect many Items. When I fetch an Item, I want to see its Actions, and I also want to see their affected Items, excluding the Item that I fetched in the first place. So this data:
Item:
| id | name  |
| 1  | 'One' |
| 2  | 'Two' |
Action:
| id | actNo   | itemId |
| 1  | '001-1' | 1      |
| 1  | '001-1' | 2      |
| 2  | '002-2' | 2      |
Would produce this result when fetching "Two":
Item {id: 2, name: 'Two',
actionList: {
Action {id: 1, actNo: '001-1',
itemList: {
Item {id: 1, name: 'One'}
}
},
Action {id: 2, actNo: '002-2'}
}
}
This is the code I've got so far:
@Transactional
public List<Item> getItems(List<Integer> idList) {
initializeTempTable(idList);
return runQueries();
}
private void initializeTempTable(List<Integer> idList) {
String createSql = "create temporary table if not exists temp_table (id int) on commit delete rows";
jdbcTemplate.update(createSql, (SqlParameterSource) null);
String insertSql = "insert into temp_table (id) values (:value)";
List<MapSqlParameterSource> parameters = new ArrayList<MapSqlParameterSource>(idList.size());
for(Integer id : idList) {
parameters.add(new MapSqlParameterSource("value", id));
}
jdbcTemplate.batchUpdate(insertSql, parameters.toArray(new SqlParameterSource[parameters.size()]));
}
private List<Item> runQueries() {
List<Item> itemList = getItems();
addActions(itemList);
// Add the rest...
return itemList;
}
private List<Item> getItems() {
String sql = "select i.* from item i join temp_table t on i.id = t.id";
return jdbcTemplate.query(sql, (SqlParameterSource) null, new RowMapper<Item>() {
public Item mapRow(ResultSet rs, int rowNum) throws SQLException {
Item item = new Item();
item.setId(rs.getInt("id"));
item.setName(rs.getString("name"));
return item;
}
});
}
private void addActions(List<Item> itemList) {
String sql = "select a.* from action a " +
"join item i on a.itemId = i.id " +
"join temp_table t on i.id = t.id;
final Map<Integer, List<Action>> resultMap = new HashMap<Integer, List<Action>>();
jdbcTemplate.query(sql, (SqlParameterSource) null, new RowCallbackHandler() {
public void processRow(ResultSet rs) throws SQLException {
Action action = new Action();
action.setId(rs.getInt("id"));
action.setActNo(rs.getString("actNo"));
int itemId = rs.getInt("itemId");
if(resultMap.containsKey(itemId)) {
List<Action> actionList = resultMap.get(itemId);
actionList.add(action);
} else {
List<Action> actionList = new ArrayList<Action>(Arrays.asList(action));
resultMap.put(itemId, actionList);
}
}
});
for(Item item : itemList) {
List<Action> actionList = resultMap.get(item.getId());
item.setActionList(actionList);
}
addItemsToActions(resultMap);
}
private void addItemsToActions(final Map<Integer, List<Action>> resultMap) {
String sql = "select i2.*, a2.id as actionId, i.id as orgId from item i2 " +
"join action a2 on i2.id = a2.itemId " +
"join action a on a2.id = a.id " +
"join item i on a.itemId = i.id " +
"join temp_table t on i.id = t.id " +
"where i2.id != i.id";
jdbcTemplate.query(sql, (SqlParameterSource) null, new RowCallbackHandler() {
public void processRow(ResultSet rs) throws SQLException {
Item item = new Item();
item.setId(rs.getInt("id"));
item.setName(rs.getString("name"));
int orgItemId = rs.getInt("orgId");
if(resultMap.containsKey(orgItemId)) {
List<Action> actionList = resultMap.get(orgItemId);
int actionId = rs.getInt("actionId");
for(Action action : actionList) {
if(action.getId() == actionId) {
if(action.getItemList() == null) {
action.setItemList(new ArrayList<Item>());
}
action.getItemList().add(item);
break;
}
}
}
}
});
}
As you can see, for such a simple relation I get some non-obvious SQL and a lot of hard-to-follow mapping code. And the only way I can see to combat this is to do exactly what the JPA framework did: traverse the model depth-first and run a lot of small queries to populate each instance as you come to it, which will make the db admins unhappy again.
Is there a better way?
No, there is no better way, and if you want such queries, an ORM is definitely not the way to go (although some ORM fanboys will tell you it is).
You are much better off returning the result set as a dynamic flat structure, like a Map, and forgetting about trying to map it to domain objects, with all the parent-child nightmare that comes with that.
Spring has a queryForMap last time I checked, although I'm not sure if that returns a typed Map.
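As a minimal sketch of that flat approach, assuming a plain JdbcTemplate (queryForList returns each row as a Map<String, Object> keyed by column label; the table and column names are the ones from the question):
String sql = "select i.id, i.name, a.id as actionId, a.actNo " +
        "from item i join action a on a.itemId = i.id where i.id = ?";
// One Map per row of the flat result set
List<Map<String, Object>> rows = jdbcTemplate.queryForList(sql, 2);
for (Map<String, Object> row : rows) {
    System.out.println(row.get("name") + " -> " + row.get("actNo"));
}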
