Java bulk insertion: inserting in a loop takes time (code attached)? - java

Hi, I am new to Java. I am inserting rows into a database from an array in a loop, and it takes a long time. How can I insert the data into the DB as a bulk insertion? My code is here:
if (con != null)
{
    rs = dboperation.DBselectstatement(con, "select host_object_id from nagios_hosts where address='" + ip + "'");
    if (rs != null)
    {
        rs.next();
        String id = rs.getString(1);
        for (int i = 0; i < serviceArray.length; i++)
        {
            status.append(serviceArray[i] + "\n");
            dboperation.DbupdateStatement(DbAcess.getNagios_connection(), "insert into nagios_servicelist(service_name,host_object_id) values('" + serviceArray[i] + "','" + id + "')");
        }
    }
}
Please don't go into detail about this code. I get the id from the first query in the "rs" result set, and "serviceArray" holds the services that I want to insert into the DB, but the loop takes a long time. How can I do a bulk insertion of this array into the database?
Hoping to hear from you soon.
Thanks in advance.

You should use a JDBC batch insert for this purpose:
//Create a new statement
Statement st = con.createStatement();
//Add SQL statements to be executed
st.addBatch("insert into nagios_servicelist(service_name,host_object_id) values('"+serviceArray[0]+"','"+id+"')");
st.addBatch("insert into nagios_servicelist(service_name,host_object_id) values('"+serviceArray[1]+"','"+id+"')");
st.addBatch("insert into nagios_servicelist(service_name,host_object_id) values('"+serviceArray[2]+"','"+id+"')");
// Execute the statements in batch
st.executeBatch();
You can insert your own logic here, but this is an overview of how it is done.
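Applied to the code in the question, a minimal sketch using a PreparedStatement batch might look like this (reusing con, serviceArray and id from the question; binding parameters instead of concatenating strings also avoids SQL injection):
String sql = "insert into nagios_servicelist(service_name, host_object_id) values (?, ?)";
try (PreparedStatement ps = con.prepareStatement(sql)) {
    for (String service : serviceArray) {
        ps.setString(1, service);
        ps.setString(2, id);
        ps.addBatch(); // queue the row instead of executing it immediately
    }
    ps.executeBatch(); // one round trip for the whole array
}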

The following code avoids out-of-memory errors as well as SQL injection:
String sql = "insert into employee (name, city, phone) values (?, ?, ?)";
Connection connection = new getConnection();
PreparedStatement ps = connection.prepareStatement(sql);
final int batchSize = 1000;
int count = 0;
for (Employee employee: employees) {
ps.setString(1, employee.getName());
ps.setString(2, employee.getCity());
ps.setString(3, employee.getPhone());
ps.addBatch();
if(++count % batchSize == 0) {
ps.executeBatch();
}
}
ps.executeBatch(); // insert remaining records
ps.close();
connection.close();

Related

Return data when inserting a list with executeBatch()

I need to access the automatically generated data (id, created, last_modified, ...) when inserting lists of data. Because the lists may be large, I use statement.executeBatch() to insert everything in one batch. However, this way I lose the ability to take advantage of a RETURNING statement.
I am currently doing the following to get the data:
public boolean store(Connection connection, List<WorkPlace> list) throws SQLException {
    String query =
        "insert into work_places (merchant_id, name, description) values (?, ?, ?)";
    try (PreparedStatement statement = connection.prepareStatement(query, Statement.RETURN_GENERATED_KEYS)) {
        for (WorkPlace workPlace : list) {
            statement.setLong(1, workPlace.getMerchantId());
            statement.setString(2, workPlace.getName());
            statement.setString(3, workPlace.getDescription());
            statement.addBatch();
        }
        statement.executeBatch();
        try (ResultSet rs = statement.getGeneratedKeys()) {
            List<Long> ids = new ArrayList<>();
            while (rs.next()) {
                ids.add(rs.getLong(1));
            }
            query = "select * from work_places where id = any (?)";
            try (PreparedStatement statement1 = connection.prepareStatement(query)) {
                statement1.setArray(1, connection.createArrayOf("integer", ids.toArray()));
                try (ResultSet rs1 = statement1.executeQuery()) {
                    list.clear();
                    while (rs1.next()) {
                        list.add(getWorkPlace(rs1));
                    }
                }
            }
        }
    }
    return true;
}
Is there a better way to achieve what I need?
The generated keys implementation in the PostgreSQL JDBC driver uses RETURNING *, which will return all columns from the table. So, if you can retrieve the generated id this way after executing a batch, then you should also be able to retrieve the other columns from the same getGeneratedKeys result set.
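For example, the second select could then be dropped and the extra columns read directly from the generated-keys result set. A rough sketch (the setters on WorkPlace and the created/last_modified column names are assumptions based on the question, and the rows are assumed to come back in insertion order):
statement.executeBatch();

// With the PostgreSQL driver, RETURNING * should make every column of each
// inserted row available in the generated-keys result set.
try (ResultSet keys = statement.getGeneratedKeys()) {
    int i = 0;
    while (keys.next()) {
        WorkPlace workPlace = list.get(i++);
        workPlace.setId(keys.getLong("id"));                       // hypothetical setter
        workPlace.setCreated(keys.getTimestamp("created"));        // hypothetical column/setter
        workPlace.setLastModified(keys.getTimestamp("last_modified")); // hypothetical column/setter
    }
}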

Return id after sql insert java [duplicate]

Is there a way to retrieve the auto-generated key from a DB insert when using a Java prepared statement?
For example, I know AutoGeneratedKeys can work as follows.
stmt = conn.createStatement();
stmt.executeUpdate(sql, Statement.RETURN_GENERATED_KEYS);
if (returnLastInsertId) {
    ResultSet rs = stmt.getGeneratedKeys();
    rs.next();
    auto_id = rs.getInt(1);
}
However, what if I want to do the insert with a PreparedStatement?
String sql = "INSERT INTO table (column1, column2) values(?, ?)";
stmt = conn.prepareStatement(sql);
//this is an error
stmt.executeUpdate(Statement.RETURN_GENERATED_KEYS);
if(returnLastInsertId) {
//this is an error since the above is an error
ResultSet rs = stmt.getGeneratedKeys();
rs.next();
auto_id = rs.getInt(1);
}
Is there a way to do this that I don't know about? From the Javadoc it seems that PreparedStatement can't return the auto-generated ID.
Yes. See here, section 7.1.9. Change your code to:
String sql = "INSERT INTO table (column1, column2) values(?, ?)";
stmt = conn.prepareStatement(sql, Statement.RETURN_GENERATED_KEYS);
stmt.executeUpdate();
if(returnLastInsertId) {
ResultSet rs = stmt.getGeneratedKeys();
rs.next();
auto_id = rs.getInt(1);
}
There are a couple of ways, and different JDBC drivers handle this a bit differently, or not at all in some cases (some will only give you auto-generated primary keys, not other columns), but the basic forms are
stmt = conn.prepareStatement(sql, Statement.RETURN_GENERATED_KEYS);
Or use this form:
String autogenColumns[] = {"column1", "column2"};
stmt = conn.prepareStatement(sql, autogenColumns);
Yes, there is a way. I just found this hiding in the Javadoc.
The way is to pass the Statement.RETURN_GENERATED_KEYS flag as follows:
String sql = "INSERT INTO table (column1, column2) values(?, ?)";
stmt = conn.prepareStatement(sql, Statement.RETURN_GENERATED_KEYS);
I'm one of those who surfed through a few threads looking for a solution to this issue... and finally got it to work. FOR THOSE USING jdbc:oracle:thin: with ojdbc6.jar, PLEASE TAKE NOTE:
You can use either of these methods:
(Method 1)
try {
    String yourSQL = "insert into Table1(Id,Col2,Col3) values(SEQ.nextval,?,?)";
    myPrepStatement = <Connection>.prepareStatement(yourSQL, Statement.RETURN_GENERATED_KEYS);
    myPrepStatement.setInt(1, 123);
    myPrepStatement.setInt(2, 123);
    myPrepStatement.executeUpdate();
    ResultSet rs = myPrepStatement.getGeneratedKeys();
    if (rs.next()) {
        java.sql.RowId rid = rs.getRowId(1);
        // what you get is only a RowId reference; try to make use of it any way you can think of
        System.out.println(rid);
    }
} catch (SQLException e) {
    //
}
(Method 2)
try {
    String yourSQL = "insert into Table1(Id,Col2,Col3) values(SEQ.nextval,?,?)";
    // IMPORTANT: here's what other threads don't tell you: you need to list ALL columns
    // mentioned in your query in the array
    myPrepStatement = <Connection>.prepareStatement(yourSQL, new String[]{"Id", "Col2", "Col3"});
    myPrepStatement.setInt(1, 123);
    myPrepStatement.setInt(2, 123);
    myPrepStatement.executeUpdate();
    ResultSet rs = myPrepStatement.getGeneratedKeys();
    if (rs.next()) {
        // In this example, the auto-generated key value is in the 1st column
        long id = rs.getLong(1);
        // now this is the real value of column Id
        System.out.println(id);
    }
} catch (SQLException e) {
    //
}
Basically, don't use Method 1 if you just want the value of SEQ.nextval, because it only returns a RowId reference that you may crack your head finding a way to use, and it doesn't fit any data type you try to cast it to. This may work fine (return the actual value) in MySQL or DB2, but not in Oracle.
Also, close SQL Developer, Toad or any client that uses the same login session to do INSERTs while you're debugging. It may not affect every debugging call... until you find your app freezes, without any exception, for some time. Yes... it halts without an exception!
Connection connection = null; // obtain a real connection here
int generatedkey = 0;
PreparedStatement pstmt = connection.prepareStatement("Your insert query", Statement.RETURN_GENERATED_KEYS);
pstmt.executeUpdate();
ResultSet rs = pstmt.getGeneratedKeys();
if (rs.next()) {
    generatedkey = rs.getInt(1);
    System.out.println("Auto Generated Primary Key " + generatedkey);
}

Insert performance tuning

Currently we are selecting data from one database and inserting it into a backup database (SQL Server).
This data always contains more than 15K records in one select.
We are using an Enumeration to iterate over the selected data.
We are using a JDBC PreparedStatement to insert the data as:
Enumeration values = ht.elements(); // ht is a Hashtable containing the selected data
while (values.hasMoreElements())
{
    pstmt = conn.prepareStatement("insert query");
    pstmt.executeUpdate();
}
I am not sure whether this is the correct or most efficient way to do the insert.
Inserting 10K rows takes about 30 minutes or more.
Is there an efficient way to make it faster?
Note: we are not using any indexes on the table.
Use a batch insert, but commit after a few entries; don't try to send all 10K at once. Try to investigate the best batch size; it's a trade-off between memory and network round trips.
Connection connection = getConnection();
Statement statement = connection.createStatement();
int i = 0;
for (String query : queries) {
    statement.addBatch(query);
    if ((i++ % 500) == 0) {
        // Do an execute now and again; don't send too many at once
        statement.executeBatch();
    }
}
statement.executeBatch();
statement.close();
connection.close();
Also, from your code I'm not sure what you are doing, but use parameterised queries rather than sending 10K insert statements as text. Something like:
String q= "INSERT INTO data_table (id) values (?)";
Connection connection = new getConnection();
PreparedStatement ps = connection.prepareStatement(q);
for (Data d: data) {
ps.setString(1, d.getId());
ps.addBatch();
}
ps.executeBatch();
ps.close();
connection.close();
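To also commit after every few batches, as suggested above, auto-commit can be turned off first. A rough sketch (getConnection() and the ids collection are placeholders carried over from the earlier snippets):
Connection connection = getConnection();
connection.setAutoCommit(false); // take manual control of transaction boundaries
PreparedStatement ps = connection.prepareStatement("INSERT INTO data_table (id) values (?)");
int i = 0;
for (String id : ids) {
    ps.setString(1, id);
    ps.addBatch();
    if (++i % 500 == 0) {
        ps.executeBatch();
        connection.commit(); // commit every 500 rows instead of one huge transaction
    }
}
ps.executeBatch(); // flush and commit the remainder
connection.commit();
ps.close();
connection.close();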
You can insert all the values in one sql command:
INSERT INTO Table1 ( Column1, Column2 ) VALUES
( V1, V2 ), ( V3, V4 ), .......
You may also insert the values in bulks of 500 records, for example, if the query would otherwise become very big. It is not efficient at all to insert one row per statement remotely (over a connection). Another solution is to do the inserts using a stored procedure; you just pass the values to it as parameters.
Here is how you can do it using the INSERT command above:
Enumeration values = ht.elements(); // ht is a Hashtable containing the selected data
int i = 0;
StringBuilder sql = new StringBuilder();
while (values.hasMoreElements())
{
    if (sql.length() > 0)
        sql.append(", ");
    sql.append("(").append(values.nextElement()).append(")");
    i++;
    if (i % 500 == 0) {
        pstmt = conn.prepareStatement("insert query " + sql);
        pstmt.executeUpdate();
        sql.setLength(0);
    }
}
if (sql.length() > 0) {
    // insert any remaining rows
    pstmt = conn.prepareStatement("insert query " + sql);
    pstmt.executeUpdate();
}
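If you want the multi-row VALUES form without concatenating the values into the SQL text, the statement can be built with placeholders instead. A rough sketch, assuming the data is available as a List<String[]> named rows and conn is an open connection (both are assumptions, not part of the original code):
int chunkSize = 500;
for (int start = 0; start < rows.size(); start += chunkSize) {
    List<String[]> chunk = rows.subList(start, Math.min(start + chunkSize, rows.size()));

    // Build "INSERT INTO Table1 (Column1, Column2) VALUES (?, ?), (?, ?), ..."
    StringBuilder sql = new StringBuilder("INSERT INTO Table1 (Column1, Column2) VALUES ");
    for (int r = 0; r < chunk.size(); r++) {
        sql.append(r == 0 ? "(?, ?)" : ", (?, ?)");
    }

    try (PreparedStatement ps = conn.prepareStatement(sql.toString())) {
        int p = 1;
        for (String[] row : chunk) {
            ps.setString(p++, row[0]);
            ps.setString(p++, row[1]);
        }
        ps.executeUpdate(); // one statement inserts the whole chunk
    }
}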

When I try to insert data, executeQuery() isn't working?

Can't seem to fix this. I've been trying to get it to work for the past hour. Any help would be appreciated.
INFO: Server startup in 868 ms
java.sql.SQLException: Can not issue data manipulation statements with executeQuery().Event{id=0, name='dads', venue='dasd', startDate='11/11/11', endDate='12/11/11'}
I seem to be getting this error when I try to do an insert.
public void addEvent(Event event) throws DaoException {
    Connection con = null;
    PreparedStatement ps = null;
    ResultSet rs = null;
    try {
        con = this.getConnection();
        String query = "INSERT INTO TABLE EVENT VALUES(null, ?, ?, ?, ?)";
        ps = con.prepareStatement(query);
        ps.setString(1, event.getName());
        ps.setString(2, event.getVenue());
        ps.setString(3, event.getStartDate());
        ps.setString(4, event.getEndDate());
        rs = ps.executeQuery();
    } catch (SQLException e) {
        System.out.println(event.toString());
        e.printStackTrace();
    } finally {
        try {
            if (rs != null) {
                rs.close();
            }
            if (ps != null) {
                ps.close();
            }
            if (con != null) {
                freeConnection(con);
            }
        } catch (SQLException e) {
            throw new DaoException("Couldn't " + e.getMessage());
        }
    }
}
For inserting, updating or deleting you should use executeUpdate().
executeUpdate() returns an int value,
so replace the line rs = ps.executeQuery(); with
int result = ps.executeUpdate();
Note that you will get another error after making this change, because your SQL query is also wrong.
Use the following query
INSERT INTO EVENT VALUES(null, ?, ?, ?, ?)
It looks like incorrect INSERT syntax; update it like this:
INSERT INTO EVENT VALUES(null, ?, ?, ?, ?) // remove the TABLE keyword
// INSERT INTO table_name VALUES(...) is the correct syntax
Also correct this:
use executeUpdate() instead of executeQuery()
// for insert/update/delete queries, always use executeUpdate()
// for select queries, use executeQuery()
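Putting both corrections together, the relevant part of addEvent from the question would look roughly like this:
String query = "INSERT INTO EVENT VALUES(null, ?, ?, ?, ?)"; // no TABLE keyword
ps = con.prepareStatement(query);
ps.setString(1, event.getName());
ps.setString(2, event.getVenue());
ps.setString(3, event.getStartDate());
ps.setString(4, event.getEndDate());
int rowsInserted = ps.executeUpdate(); // executeUpdate() for INSERT, not executeQuery()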

class java.lang.OutOfMemoryError while saving data in Oracle database

I have an Excel sheet with about 25,000 rows. Each row in the Excel sheet will become a row in my table as well. I tried the following and it keeps giving me an out-of-memory error. I tried changing the batchSize from 25 to 50, 100, 500; none of them works. Can anyone tell me what I am doing wrong? Changing the heap size of the JVM is not an option for me.
public void saveForecast(List list) throws FinderException {
    final Session session = getCurrentSession();
    final int batchSize = 25;
    Connection con = null;
    PreparedStatement pstmt = null;
    Iterator iterator = list.iterator();
    int rowCount = list.size();
    String sqlStatement = "INSERT INTO DMD_VOL_UPLOAD (ORIGIN, DESTINATION, DAY_OF_WEEK, EFFECTIVE_DATE, DISCONTINUE_DATE, VOLUME)";
    sqlStatement += " VALUES(?, ?, ?, ?, ?, ?)";
    System.out.println(sqlStatement);
    System.out.println("Number of rows to be inserted: " + rowCount);
    System.out.println("Starting time: " + new Date().toString());
    try {
        con = session.connection();
        for (int i = 0; i < rowCount; i++) {
            ForecastBatch forecastBatch = (ForecastBatch) iterator.next();
            pstmt = con.prepareStatement(sqlStatement);
            pstmt.setString(1, forecastBatch.getOrigin());
            pstmt.setString(2, forecastBatch.getDestination());
            pstmt.setInt(3, forecastBatch.getDayOfWeek());
            java.util.Date effJavaDate = forecastBatch.getEffectiveDate();
            java.sql.Date effSqlDate = new java.sql.Date(effJavaDate.getTime());
            pstmt.setDate(4, effSqlDate);
            java.util.Date disJavaDate = forecastBatch.getDiscontinueDate();
            java.sql.Date disSqlDate = new java.sql.Date(disJavaDate.getTime());
            pstmt.setDate(5, disSqlDate);
            pstmt.setInt(6, forecastBatch.getVolumeSum());
            pstmt.addBatch();
            if (i % batchSize == 0) {
                pstmt.executeBatch();
                session.flush();
                session.clear();
            }
        }
        pstmt.executeBatch();
        pstmt.close();
        System.out.println("Ending Time: " + new Date().toString());
    } catch (SQLException e) {
        e.printStackTrace();
        throw new FinderException(e);
    } finally {
        HibernateUtil.closeSession();
    }
}
You are creating a new statement inside your loop but only closing the last statement after the loop ends. That means you're actually creating 25,000 statements and closing only a single one, leaving 24,999 statements open, so I'm not surprised you're running out of resources.
Furthermore, you're not using batch statements correctly: you'd have to create the statement once, then set the parameters, call addBatch, set more parameters, call addBatch again, and so on, then call executeBatch when you want to submit all the values in the batch.
EDIT:
You'll probably fix this by moving the prepareStatement call to just before the for loop; I don't think calling session flush/clear is necessary either.
Your main problem seems to be that you're re-preparing the statement for every single row, which likely consumes a huge amount of memory. You should prepare the statement once.
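A sketch of the corrected loop, with the statement prepared once outside the loop (keeping the variable names from the question):
pstmt = con.prepareStatement(sqlStatement); // prepare once, not once per row
for (int i = 0; i < rowCount; i++) {
    ForecastBatch forecastBatch = (ForecastBatch) iterator.next();
    pstmt.setString(1, forecastBatch.getOrigin());
    pstmt.setString(2, forecastBatch.getDestination());
    pstmt.setInt(3, forecastBatch.getDayOfWeek());
    pstmt.setDate(4, new java.sql.Date(forecastBatch.getEffectiveDate().getTime()));
    pstmt.setDate(5, new java.sql.Date(forecastBatch.getDiscontinueDate().getTime()));
    pstmt.setInt(6, forecastBatch.getVolumeSum());
    pstmt.addBatch();
    if (i % batchSize == 0) {
        pstmt.executeBatch(); // send each batch as it fills up
    }
}
pstmt.executeBatch(); // flush the remaining rows
pstmt.close();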
