How to get ResultSet from executeBatch? - java

I need to get the result set from an executed batch:
String[] queries = {
    "create volatile table testTable as (select * from orders) with data;",
    "select top 10 * from testTable;",
    "drop table testTable"
};
for (String query : queries) {
    statement.addBatch(query);
}
statement.executeBatch();
Once I execute the batch, how can I get the result set from the select query?

In short, you should not. Plain multiple execute() calls should be used instead.
According to the javadoc of executeBatch(), it is not meant to support the getResultSet()/getMoreResults() API.
Also, per the JDBC™ 4.0 Specification, §14.1.2:
Only DDL and DML commands that return a simple update count may be
executed as part of a batch. The method executeBatch throws a
BatchUpdateException if any of the commands in the batch fail to
execute properly or if a command attempts to return a result set.
But some JDBC drivers might support it anyway; try at your own risk.
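If you do go the per-statement route, a rough sketch might look like the following (reusing the queries array and statement from the question; execute() reports whether each command produced a result set or an update count):
for (String query : queries) {
    boolean hasResultSet = statement.execute(query);
    if (hasResultSet) {
        try (ResultSet rs = statement.getResultSet()) {
            while (rs.next()) {
                // read the SELECT output here, e.g. rs.getString(1)
            }
        }
    } else {
        int updateCount = statement.getUpdateCount(); // DDL/DML result
    }
}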

I can think of two options off the top of my head.
1) As the other answer said, "plain multiple execute() should be used"; this way you can get the result set.
2) You can run a separate query after you execute your batch and read the information back from the database.

According to the Java 7 API, the executeBatch method doesn't return a ResultSet but an array of integers. You can process those values as described in the API to see which commands in the batch were successful.
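For illustration, a minimal sketch of inspecting that array (the constants are defined on java.sql.Statement):
int[] counts = statement.executeBatch();
for (int i = 0; i < counts.length; i++) {
    if (counts[i] == Statement.SUCCESS_NO_INFO) {
        // command i executed, but the driver did not report a row count
    } else {
        // counts[i] is the number of rows affected by command i
    }
}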

Related

Update query in createNativeQuery

I want to run an Update query using createNativeQuery in entityManager. I am not able to run it.
My class structure :
class ABC_DAO
{
List<a> = entityManager.createNativeQuery(select.......); //sql1 : it is working fine
Sysout("some value"); // it is working
entityManager.createNativeQuery(update.......);// ***sql2 :it is not working***
Sysout("some value"); // it is working
}
Hibernate is not executing sql2, but it is executing sql1. We are using a Postgres DB. This query has to be in SQL. We are using Hibernate with JPA.
Let me try to help you on the basis of your erroneous code example and problem description.
1) You will only get a List as the result of a query if you call getResultList() on it; otherwise sql1 would not work (please post the complete code if you want help):
List<a> = entityManager.createNativeQuery("sql1", a.class).getResultList();
2) For update statements you have to call executeUpdate() instead of getResultList() (or getSingleResult()) to send the native SQL statement to the database:
int countUpdated = entityManager.createNativeQuery("sql2").executeUpdate();
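A minimal sketch of how the two calls could fit together, assuming an application-managed EntityManager with a resource-local transaction; a is the entity class from the question, and the abc table, its columns, and the parameter values are placeholders:
List<a> rows = entityManager.createNativeQuery("SELECT * FROM abc", a.class)
        .getResultList();

entityManager.getTransaction().begin();
int updated = entityManager.createNativeQuery("UPDATE abc SET status = ?1 WHERE id = ?2")
        .setParameter(1, "DONE")
        .setParameter(2, 42)
        .executeUpdate();
entityManager.getTransaction().commit();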

Does the mysql jdbc driver class know to execute multiple inserts in bulk?

I'm trying to move a large number of records from one MySQL instance to another inside RDS. They are on different VPCs and different AWS accounts, so I can't create a data pipeline that would do the copy for me.
I've written a quick java program that connects to both the import database and the export database and does the following:
query the import database for the highest value in table.primary_key with SELECT MAX(primary_key) FROM table
get a result set from the export table with SELECT * FROM table WHERE(primary_key > max_from_import) LIMIT 1000000
create a PreparedStatement object from the import connection and set the queryString to INSERT INTO table (col1....coln) VALUES (?....n?)
iterate over the result set and set the prepared statement columns to the ones from the result cursor (with some minor manipulations of the data), call execute on the PreparedStatement object, clear its parameters, then move to the next result.
With this method I'm able to see around 100,000 records being imported per hour, but I know from this question that one way to optimize inserts is not to issue a new query each time, but to append more data to each insert, i.e.
INSERT INTO table (col1...coln) VALUES (val1...valn), (val1...valn)....(val1...valn);
Does the jdbc driver know to do this, or is there some sort of optimization I can make on my end to improve insert run time?
UPDATE:
Both answers recommended using addBatch and executeBatch, as well as disabling auto-commit. Disabling auto-commit gave a slight improvement (10%); batching yielded a run time of less than 50% of the individual inserts.
You need to use batch inserts. Connector/J (the MySQL JDBC driver) can rewrite a batch of inserts into a single multi-value INSERT statement, but only when the connection option rewriteBatchedStatements=true is set.
(Note that by default Connector/J uses client-side prepared statements; you can add the option useServerPrepStmts=true to the JDBC URL to enable server-side prepared statements, but the batch rewrite is a client-side feature.)
The code looks like the following:
try (PreparedStatement stmt = connection.prepareStatement(sql)) {
    for (String value : valueList) {
        stmt.clearParameters();
        stmt.setString(1, value);
        stmt.addBatch();
    }
    stmt.executeBatch();
}
The code above will generate a multi-value insert:
INSERT INTO tablename (field) VALUES (value1), (value2), (value3), ...
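For reference, a connection URL with the rewrite option enabled might look like this (host, schema and credentials are placeholders):
// rewriteBatchedStatements=true lets Connector/J collapse the batched
// inserts into multi-value INSERT statements on the client side
Connection connection = DriverManager.getConnection(
        "jdbc:mysql://localhost:3306/importdb?rewriteBatchedStatements=true",
        "user", "password");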
First create a JDBC connection to the destination database and set its auto-commit property to false.
After that, do the following in a loop:
Read N (for example 1000) rows from the source database and write them to the destination database.
After some number of inserts, commit the destination database connection.
Sample code to illustrate the idea is given below:
Connection sourceCon = getSourceDbConnection();
Connection destCon = getDestinationDbConnection();
destCon.setAutoCommit(false);
Statement statement = destCon.createStatement();
int i = 0;
String query;
while ((query = getInsertQuery()) != null)
{
    statement.executeUpdate(query);
    i++;
    if (i % 10 == 0)
    {
        destCon.commit();
        i = 0;
    }
}
destCon.commit();
The getInsertQuery function should return a string in the INSERT INTO table (col1...coln) VALUES (val1...valn), (val1...valn)....(val1...valn); format.
It should also return null once everything has been processed.
If you are using prepared statements, you can use the addBatch and executeBatch methods instead: inside the loop, add each set of values with addBatch, and after some number of inserts call executeBatch, as shown in the sketch below.
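A rough sketch combining both suggestions; the column names, the sourceRs result set read from the source database, and the batch size of 1000 are placeholders:
String insertSql = "INSERT INTO dest_table (col1, col2) VALUES (?, ?)";
destCon.setAutoCommit(false);
try (PreparedStatement ps = destCon.prepareStatement(insertSql)) {
    int pending = 0;
    while (sourceRs.next()) {
        ps.setLong(1, sourceRs.getLong("col1"));
        ps.setString(2, sourceRs.getString("col2"));
        ps.addBatch();
        if (++pending % 1000 == 0) {
            ps.executeBatch();   // send the accumulated inserts in one round trip
            destCon.commit();
        }
    }
    ps.executeBatch();           // flush whatever is left
    destCon.commit();
}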

Teradata JDBC executeBatch errorhandling

I am inserting data into a Teradata table using the executeBatch method. Currently, if one insert in the batch fails, all the other inserts in the batch also fail and no records end up being inserted. How can I change this behaviour so that the other inserts in the batch succeed when one insert fails, with some ability to track the rejected records?
PS: I have ensured that TMODE is set to TERA and autocommit enabled.
UPDATE:
Target table definition:
CREATE SET TABLE mydb.mytable ,NO FALLBACK ,
NO BEFORE JOURNAL,
NO AFTER JOURNAL,
CHECKSUM = DEFAULT,
DEFAULT MERGEBLOCKRATIO
(
col1 INTEGER,
col2 VARCHAR(10) CHARACTER SET LATIN NOT CASESPECIFIC NOT NULL)
PRIMARY INDEX ( col1 );
Below is the sample Scala code. As you can see, this batch contains 5 insert statements. The first insert is set to fail because it is trying to insert null into a not-null field (col2). The other 4 inserts don't have any issues and should succeed. But as you can see from the result below, all 5 inserts in the batch failed. Is there any way we can make the other inserts succeed? As stated above, tmode is TERA and autocommit is enabled. If there is no way other than re-submitting all failed queries individually, then we would have to reduce the batch size and settle for lower throughput.
import java.sql.{BatchUpdateException, DriverManager, Types}

Class.forName("com.teradata.jdbc.TeraDriver")
val conn = DriverManager.getConnection("jdbc:teradata://teradata-server/mydb,tmode=TERA","username","password")
val insertSQL = "INSERT INTO mydb.mytable VALUES (?,?)"
val stmt = conn.prepareStatement(insertSQL)
stmt.setInt(1,1)
stmt.setNull(2,Types.VARCHAR) // Inserting Null here. This insert will fail
stmt.addBatch()
stmt.setInt(1,2)
stmt.setString(2,"XXX")
stmt.addBatch()
stmt.setInt(1,3)
stmt.setString(2,"YYY")
stmt.addBatch()
stmt.setInt(1,4)
stmt.setString(2,"ZZZ")
stmt.addBatch()
stmt.setInt(1,5)
stmt.setString(2,"ABC")
stmt.addBatch()
try {
val res = stmt.executeBatch()
println(res.mkString(","))
}
catch {
case th: BatchUpdateException => {
println(th.getUpdateCounts().mkString(","))
}
}
Result
-3,-3,-3,-3,-3
This is from Teradata's JDBC manual:
Beginning with Teradata Database 13.10 and Teradata JDBC Driver
13.00.00.16, PreparedStatement batch execution can return individual success and error conditions for each parameter set.
An application using the PreparedStatement executeBatch method must
have a catch-block for BatchUpdateException and the application must
examine the error code returned by the BatchUpdateException
getErrorCode method.
PreparedStatement BatchUpdateException Handling: this sample executes a multi-statement request as a PreparedStatement batch request and demonstrates the handling of the resulting PreparedStatement BatchUpdateException.
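A sketch of that pattern, in Java for brevity, assuming stmt is the PreparedStatement whose batch was filled earlier; the key step is reading getUpdateCounts() to see which parameter sets were rejected:
try {
    int[] counts = stmt.executeBatch();
    // all parameter sets succeeded
} catch (BatchUpdateException bue) {
    int[] counts = bue.getUpdateCounts();
    for (int i = 0; i < counts.length; i++) {
        if (counts[i] == Statement.EXECUTE_FAILED) {
            // parameter set i was rejected; log it or re-queue it for a retry
        }
    }
    // walk the chained exceptions for the per-row error details
    SQLException cause = bue.getNextException();
    while (cause != null) {
        System.err.println(cause.getErrorCode() + ": " + cause.getMessage());
        cause = cause.getNextException();
    }
}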

How to add SELECT statements in addBatch()?

Hi, I am trying to execute a couple of queries in a batch using statement.addBatch(sql). I have found that if a query is SELECT BLAH BLAH, it throws an exception saying BatchUpdateException. So how do I add a SELECT statement inside a batch? For example, the following does not work because the batch contains a SELECT statement:
st.addBatch("UPDATE")
st.addBatch("CREATE")
st.addBatch("SELECT")
st.executeBatch()
One workaround is to execute the SELECT statement with st.execute("SELECT") instead of st.addBatch("SELECT"). Are there any recommended ways or best practices for this use case? Please guide; thanks in advance.
Batch statements are basically for INSERT, UPDATE and DELETE statements (and other commands that return only an update count); they are not intended for SELECT statements. If you want to execute a SELECT statement, do it in a separate Statement or PreparedStatement outside the batch.
Even the javadoc says that this is typically a SQL INSERT or UPDATE statement and that it is not designed to be used for SELECT statements; refer to addBatch(String sql).
Moreover, the addBatch(java.lang.String) method cannot be called on a PreparedStatement or CallableStatement; a PreparedStatement uses the no-argument addBatch() instead.
If you want to execute them in one connection, one of the options is to add logic like this:
if (query.startsWith("SELECT")) {
// process SELECT query
} else {
// process UPDATE/INSERT/DELETE queries
}
but in this case all of your SELECT queries will be executed immediately, ahead of the statements queued in the batch; a small sketch of this approach follows below.
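Here is that branching fleshed out a little, assuming a List<String> queries and the Statement st from the question; SELECTs run right away while everything else is queued for the batch:
for (String query : queries) {
    if (query.trim().toUpperCase().startsWith("SELECT")) {
        try (ResultSet rs = st.executeQuery(query)) {
            while (rs.next()) {
                // consume the SELECT rows here
            }
        }
    } else {
        st.addBatch(query);   // INSERT / UPDATE / DELETE / CREATE go into the batch
    }
}
st.executeBatch();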

MyBatis Batch Insert/Update For Oracle

I've recently started learning to use MyBatis. I am now facing the following scenario: I need to constantly fetch a new list of objects through a web service, and then, for this list, I need to insert or update each object into an Oracle DB table through MyBatis.
The tricky part is that I cannot simply do a batch insert every time, because some of the objects might already exist in the DB; for those records I need to update their fields instead of inserting new rows.
My current solution might be very stupid: using Java, I build the list of objects from the web service, loop through each of them, do a MyBatis select, and if the result is not null (the record already exists in the DB) I do a MyBatis update; otherwise I do a MyBatis insert for the new object.
It works, but my technical lead says it is very inefficient, since looping in Java and inserting/updating one record at a time consumes a lot of system resources. He advised me to do a batch insert with MyBatis by passing a list of objects in.
Batch insertion in MyBatis is straightforward; however, since I am not purely inserting (for existing records I need to update), I don't think a plain batch insert is appropriate here. I've googled for a while and realized I may need to use "merge" instead of "insert" (for Oracle).
The examples I found for merge in MyBatis cover only a single object, not a batch. So I would like to ask whether anyone could offer some examples of how to do a batch merge in MyBatis (i.e. the correct way to write the mapper)?
The accepted answer is not the recommended way of handling batch operations. It does not show true batch statements, since the batch executor mode should be used when opening a session. See this post, in which a code contributor recommends that the proper way to batch update (or insert) is to open a session in batch mode and repeatedly call update (or insert) for a single record.
Here's what works for me:
public void updateRecords(final List<GisObject> objectsToUpdate) {
final SqlSession sqlSession = MyBatisUtils.getSqlSessionFactory().openSession(ExecutorType.BATCH);
try {
final GisObjectMapper mapper = sqlSession.getMapper(GisObjectMapper.class);
for (final GisObject gisObject : objectsToUpdate) {
mapper.updateRecord(gisObject);
}
sqlSession.commit();
} finally {
sqlSession.close();
}
}
Do not use foreach in your update/insert, and ensure that it only updates/inserts a single record. I was running into unsolvable Oracle errors by doing it according to the accepted answer (invalid character, statement not ended, etc.). As the linked post indicates, the update (or insert) shown in the accepted answer is actually just one giant SQL statement.
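If the underlying requirement is insert-or-update, the single-record statement that updateRecord maps to could be an Oracle MERGE. A hypothetical annotation-based mapper (the gis_objects table and its id/name columns are made up for illustration, and GisObject is assumed to expose matching properties) might look like this:
import org.apache.ibatis.annotations.Update;

public interface GisObjectMapper {

    // One row per call; the BATCH executor collects the repeated calls into a JDBC batch.
    @Update("MERGE INTO gis_objects t "
          + "USING (SELECT #{id} AS id, #{name} AS name FROM dual) s "
          + "ON (t.id = s.id) "
          + "WHEN MATCHED THEN UPDATE SET t.name = s.name "
          + "WHEN NOT MATCHED THEN INSERT (id, name) VALUES (s.id, s.name)")
    void updateRecord(GisObject gisObject);
}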
In my case there was the same scenario. I used a for loop to check whether each record exists in the database or not, and based on that I added the object to one of two ArrayLists, one for insert and one for update.
Then, after the loop, I used a batch insert and a batch update for those two lists.
Here is an example of an update with a different WHERE condition per row, followed by the insert.
1] This is for the update:
<foreach collection="attendingUsrList" item="model" separator=";">
UPDATE parties SET attending_user_count = #{model.attending_count}
WHERE fb_party_id = #{model.eid}
</foreach>
2] This is for the insert:
<insert id="insertAccountabilityUsers" parameterType="AccountabilityUsersModel" useGeneratedKeys="false">
INSERT INTO accountability_users
(
accountability_user_id, accountability_id, to_username,
record_status, created_by, created_at, updated_by, updated_at
)
VALUES
<foreach collection="usersList" item="model" separator=",">
(
#{model.accountabilityUserId}, #{model.accountabilityId}, #{model.toUsername},
'A', #{model.createdBy}, #{model.createdAt}, #{model.updatedBy}, #{model.updatedAt}
)
</foreach>
</insert>
In the DAO, declare the method as:
void insertAccountabilityUsers(@Param("usersList") List<AccountabilityUsersModel> usersList);
Update
Here is my batch session code
public static synchronized SqlSession getSqlBatchSession() {
ConnectionBuilderAction connection = new ConnectionBuilderAction();
sf = connection.getConnection();
SqlSession session = sf.openSession(ExecutorType.BATCH);
return session;
}
SqlSession session = ConnectionBuilderAction.getSqlBatchSession();
Actually, I have already given a full example here for this question.
In Oracle, if you want to execute multiple statements at one time, you have to enclose your statements in a "begin" ... "end" block. So try adding the open, close and separator attributes to foreach as below. This will definitely work.
<foreach collection="customerList" item="object" open="begin" close=";end;" separator=";">
UPDATE customer SET isActive = #{object.isactive}
WHERE customerId= #{object.customerId}
</foreach>
