So I am trying to insert all the keys from a HashMap into a database. My first approach was to insert the keys one by one. Note that the HashMap holds several million keys, so this process took a lot of time.
I did some research and stumbled upon the PreparedStatement interface, so I came up with this piece of code to build batches of 10000 elements and send each batch to the database in one go.
final int batchSize = 10000;
int count = 0;
Connection dbConnection = null;
try {
    dbConnection = getDBConnection();
    String SQL = "INSERT INTO masterdict (window) VALUES (?)";
    PreparedStatement ps = dbConnection.prepareStatement(SQL);
    for (String k : masterDict.keySet()) {
        ps.setString(1, k);
        ps.addBatch();
        if (++count % batchSize == 0) {
            System.out.println(count);
            ps.executeBatch();
        }
    }
    ps.executeBatch(); // flush the last, incomplete batch
    ps.close();
    dbConnection.close();
} catch (SQLException e) {
    System.out.println(e.getMessage());
}
For some reason, though, this approach takes exactly the same time to complete as the first one. Can anyone explain why this is the case?
After reading through the comments I ended up with this new version of the code, which works just fine. The key change is disabling auto-commit and committing once per batch, so the driver no longer commits every single insert.
final int batchSize = 10000;
int count = 0;
Connection dbConnection = null;
try {
    dbConnection = getDBConnection();
    String SQL = "INSERT INTO masterdict (window) VALUES (?)";
    PreparedStatement ps = dbConnection.prepareStatement(SQL);
    dbConnection.setAutoCommit(false); // commit manually, once per batch
    for (String k : masterDict.keySet()) {
        ps.setString(1, k);
        ps.addBatch();
        if (++count % batchSize == 0) {
            System.out.println(count);
            ps.executeBatch();
            dbConnection.commit();
        }
    }
    ps.executeBatch(); // flush the last, incomplete batch
    dbConnection.commit();
    ps.close();
} catch (SQLException e) {
    if (dbConnection != null) {
        dbConnection.rollback();
    }
    System.out.println(e.getMessage());
} finally {
    if (dbConnection != null) {
        dbConnection.close();
    }
}
try {
    Connection conn = getConnection();
    String strUpdateQuery = "update payment_table set CREDIT_CARD_NO = ? where PAYMENT_KEY = ?";
    PreparedStatement ps = conn.prepareStatement(strUpdateQuery);
    for (int i = 0; i < nodes.getLength(); i++) {
        ps.setString(1, "524364OQNBQQ4291");
        ps.setString(2, "20130215123757533280168");
        ps.executeUpdate();
        conn.commit();
    }
} catch (SQLException e) {
    e.printStackTrace();
}
It does not update even a single row, even though I checked that the primary key is correct.
Try with batch update:
void batchUpdate() {
String strUpdateQuery = "UPDATE payment_table " +
"SET CREDIT_CARD_NO = ? " +
"WHERE PAYMENT_KEY= ?";
try (Connection conn = getConnection();
PreparedStatement ps = conn.prepareStatement(strUpdateQuery)) {
for (int i = 0; i < nodes.getLength(); i++) {
ps.setString(1, "524364OQNBQQ4291");
ps.setString(2, "20130215123757533280168");
ps.addBatch();
}
int[] updated = ps.executeBatch();
        // the per-statement update counts can be logged from "updated"
        // conn.commit(); // needed only if auto-commit was disabled, e.g. via conn.setAutoCommit(false)
}
catch (SQLException e) {
e.printStackTrace();
}
}
ps.setXxx(): Check the order of your parameters. The sequence of the setXxx() calls must match the order of the ? placeholders in the SQL statement.
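For example, a minimal sketch against the update statement above (conn, cardNo, and paymentKey are made-up placeholders):
// Parameter indexes count the ? placeholders from left to right, starting at 1.
String sql = "UPDATE payment_table SET CREDIT_CARD_NO = ? WHERE PAYMENT_KEY = ?";
PreparedStatement ps = conn.prepareStatement(sql);
ps.setString(1, cardNo);     // first ?  -> CREDIT_CARD_NO
ps.setString(2, paymentKey); // second ? -> PAYMENT_KEY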
I have to add 1,000,000 rows to MySQL with Java, but the process is quite slow. I have this code:
for (int number = 45000000; number < 46000000; number++) {
    query = "insert into free(number) values ('" + number + "');";
    try {
        stmt.executeUpdate(query);
        System.out.println(number + " added");
    } catch (SQLException e) {
        e.printStackTrace();
    }
}
How can I speed up the process? (40 minutes for 100,000 rows.) Any ideas?
You could use a PreparedStatement and batching. Something like,
String query = "insert into free(number) values (?)";
PreparedStatement ps = null;
try {
ps = conn.prepareStatement(query);
for (int number = 45000000; number < 46000000; number++) {
ps.setInt(1, number);
ps.addBatch();
if ((number + 1) % 100 == 0) { // <-- this will add 100 rows at a time.
ps.executeBatch();
}
}
ps.executeBatch();
} catch (SQLException e) {
e.printStackTrace();
} finally {
if (ps != null) {
try {
ps.close();
            } catch (SQLException e) {
                // ignored: nothing useful can be done if close() fails
            }
}
}
I want to insert multiple rows into a MySQL table at once using Java. The number of rows is dynamic. In the past I was doing...
for (String[] element : array) {
myStatement.setString(1, element[0]);
myStatement.setString(2, element[1]);
myStatement.executeUpdate();
}
I'd like to optimize this to use the MySQL-supported syntax:
INSERT INTO table (col1, col2) VALUES ('val1', 'val2'), ('val1', 'val2')[, ...]
but with a PreparedStatement I don't know of any way to do this since I don't know beforehand how many elements array will contain. If it's not possible with a PreparedStatement, how else can I do it (and still escape the values in the array)?
You can create a batch with PreparedStatement#addBatch() and execute it with PreparedStatement#executeBatch().
Here's a kickoff example:
public void save(List<Entity> entities) throws SQLException {
try (
Connection connection = database.getConnection();
PreparedStatement statement = connection.prepareStatement(SQL_INSERT);
) {
int i = 0;
for (Entity entity : entities) {
statement.setString(1, entity.getSomeProperty());
// ...
statement.addBatch();
i++;
if (i % 1000 == 0 || i == entities.size()) {
statement.executeBatch(); // Execute every 1000 items.
}
}
}
}
It's executed every 1000 items because some JDBC drivers and/or DBs may have a limitation on batch length.
See also:
JDBC tutorial - Using PreparedStatement
JDBC tutorial - Using Statement Objects for Batch Updates
When the MySQL driver is used, you have to set the connection parameter rewriteBatchedStatements to true (jdbc:mysql://localhost:3306/TestDB?rewriteBatchedStatements=true).
With this parameter the batch is rewritten into a single bulk insert, so the table is locked only once and the indexes are updated only once. That makes it much faster.
Without this parameter, the only advantage of batching is cleaner source code.
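For example, a minimal sketch of opening such a connection (host, database name, and credentials are placeholders):
// rewriteBatchedStatements=true lets MySQL Connector/J rewrite the queued
// batch into a single multi-row INSERT on the wire.
String url = "jdbc:mysql://localhost:3306/TestDB?rewriteBatchedStatements=true";
Connection connection = DriverManager.getConnection(url, "user", "password");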
If you can build your SQL statement dynamically, you can use the following workaround:
String myArray[][] = { { "1-1", "1-2" }, { "2-1", "2-2" }, { "3-1", "3-2" } };
StringBuffer mySql = new StringBuffer("insert into MyTable (col1, col2) values (?, ?)");
for (int i = 0; i < myArray.length - 1; i++) {
mySql.append(", (?, ?)");
}
myStatement = myConnection.prepareStatement(mySql.toString());
for (int i = 0; i < myArray.length; i++) {
myStatement.setString(i, myArray[i][1]);
myStatement.setString(i, myArray[i][2]);
}
myStatement.executeUpdate();
In case you have auto-increment in the table and need to access the generated keys, you can use the following approach. Do test it before relying on it, because the behavior of getGeneratedKeys() with batches depends on the driver used. The code below was tested on MariaDB 10.0.12 and MariaDB JDBC driver 1.2.
Remember that increasing the batch size improves performance only to a certain extent; for my setup, increasing the batch size above 500 actually degraded performance.
public Connection getConnection(boolean autoCommit) throws SQLException {
Connection conn = dataSource.getConnection();
conn.setAutoCommit(autoCommit);
return conn;
}
private void testBatchInsert(int count, int maxBatchSize) {
String querySql = "insert into batch_test(keyword) values(?)";
try {
Connection connection = getConnection(false);
PreparedStatement pstmt = null;
ResultSet rs = null;
boolean success = true;
int[] executeResult = null;
try {
pstmt = connection.prepareStatement(querySql, Statement.RETURN_GENERATED_KEYS);
for (int i = 0; i < count; i++) {
pstmt.setString(1, UUID.randomUUID().toString());
pstmt.addBatch();
if ((i + 1) % maxBatchSize == 0 || (i + 1) == count) {
executeResult = pstmt.executeBatch();
}
}
ResultSet ids = pstmt.getGeneratedKeys();
for (int i = 0; i < executeResult.length; i++) {
ids.next();
if (executeResult[i] == 1) {
System.out.println("Execute Result: " + i + ", Update Count: " + executeResult[i] + ", id: "
+ ids.getLong(1));
}
}
} catch (Exception e) {
e.printStackTrace();
success = false;
} finally {
if (rs != null) {
rs.close();
}
if (pstmt != null) {
pstmt.close();
}
if (connection != null) {
if (success) {
connection.commit();
} else {
connection.rollback();
}
connection.close();
}
}
} catch (SQLException e) {
e.printStackTrace();
}
}
@Ali Shakiba, your code needs some modification. Error part:
for (int i = 0; i < myArray.length; i++) {
myStatement.setString(i, myArray[i][1]);
myStatement.setString(i, myArray[i][2]);
}
Updated code:
String myArray[][] = {
{"1-1", "1-2"},
{"2-1", "2-2"},
{"3-1", "3-2"}
};
StringBuffer mySql = new StringBuffer("insert into MyTable (col1, col2) values (?, ?)");
for (int i = 0; i < myArray.length - 1; i++) {
mySql.append(", (?, ?)");
}
mySql.append(";"); // also add the terminator at the end of the SQL statement
myStatement = myConnection.prepareStatement(mySql.toString());
for (int i = 0; i < myArray.length; i++) {
    myStatement.setString((2 * i) + 1, myArray[i][0]);
    myStatement.setString((2 * i) + 2, myArray[i][1]);
}
myStatement.executeUpdate();
This might be helpful when you need to pass an array to a PreparedStatement.
Store the required values in an array and pass it to a function that inserts them.
String sql = "INSERT INTO table (col1,col2) VALUES (?,?)";
String array[][] = new String[10][2];
for (int i = 0; i < array.length; i++) {
    // Assign the values in individual rows.
    array[i][0] = "sampleData1";
    array[i][1] = "sampleData2";
}
try {
    DBConnectionPrepared dbcp = new DBConnectionPrepared();
    if (dbcp.putBatchData(sql, array) != null) {
        System.out.println("Success");
    } else {
        System.out.println("Failed");
    }
} catch (Exception e) {
    e.printStackTrace();
}
putBatchData(sql,2D_Array)
public int[] putBatchData(String sql, String args[][]) {
    int[] status = null;
    try {
        PreparedStatement stmt = con.prepareStatement(sql);
        for (int i = 0; i < args.length; i++) {
            for (int j = 0; j < args[i].length; j++) {
                stmt.setString(j + 1, args[i][j]);
            }
            stmt.addBatch(); // queue this row's parameters
        }
        status = stmt.executeBatch(); // execute once, after all rows are queued
        stmt.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return status;
}
It is possible to submit multiple updates in JDBC.
You can use Statement, PreparedStatement, and CallableStatement objects for batch updates, with auto-commit disabled.
The addBatch() and executeBatch() methods are available on all statement objects for batch updates.
The addBatch() method adds a set of statements or parameters to the current batch.
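A minimal sketch of that pattern (url, the items table, and the names list are made-up placeholders):
try (Connection conn = DriverManager.getConnection(url);
     PreparedStatement ps = conn.prepareStatement("INSERT INTO items (name) VALUES (?)")) {
    conn.setAutoCommit(false);        // disable auto-commit for the batch
    for (String name : names) {
        ps.setString(1, name);        // bind this row's parameter
        ps.addBatch();                // queue the parameter set
    }
    int[] counts = ps.executeBatch(); // send the whole batch at once
    conn.commit();                    // make all rows visible together
}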
I want to process 500 records at a time, commit, and print that records 1 to 500 have been committed, then do the next 500 records and commit again, until I reach the maximum, which is over 20k records. I managed to get the first 500 records, but I am stuck on how to commit them and then continue with the next 500 records, and so on.
public static void selectRecordsIcore() throws SQLException {
Connection dbConnection = null;
PreparedStatement preparedStatement = null;
Statement statement = null;
String selectTableSQL = "SELECT profile_id, ingress_flag, egress_flag, ce_ingress_flag, ce_egress_flag from COS_PROFILE"
+ " WHERE profile_id >= ? AND profile_id <= ?;";
try {
dbConnection = getInformixConnection(); //connects to ICORE database
System.out.println("I am in the try");
//Gets the max profile_id record
statement = dbConnection.createStatement();
ResultSet r = statement.executeQuery("SELECT max(profile_id) AS rowcount FROM COS_PROFILE");
r.next();
int maxCount = r.getInt("rowcount");
System.out.println("COS_PROFILE table before update has " + maxCount + " row(s).");
preparedStatement = dbConnection.prepareStatement(selectTableSQL);
preparedStatement.setInt(1, 1);
preparedStatement.setInt(2, maxCount);
// execute select SQL statement
        rs = preparedStatement.executeQuery(); // rs is a class-level field, also read by updateRecordIntoBids()
updateRecordIntoBids();
} catch (SQLException e) {
System.out.println(e.getMessage());
} finally {
if (rs != null) {
rs.close();
}
if (statement != null) {
statement.close();
}
if (preparedStatement != null) {
preparedStatement.close();
}
if (dbConnection != null) {
dbConnection.close();
System.out.println("Database ICORE Connection is closed");
}
}
}
private static void updateRecordIntoBids() throws SQLException {
System.out.println("I am inside update method");
Connection dbConnection = null;
PreparedStatement preparedStatement = null;
dbConnection = getOracleConnection(); //connects to BIDS database
String updateTableSQL =
"UPDATE traffic_profile_temp SET pe_ingress_flag = ?, "
+ " pe_egress_flag = ?,"
+ " ce_ingress_flag = ?,"
+ " ce_egress_flag = ? "
+ " WHERE traffic_profile_id = ? ";
preparedStatement = dbConnection.prepareStatement(updateTableSQL);
try {
int rowCount = 0;
while (rs.next() && rowCount < 500) {
// System.out.println("inside the while loop");
String ingressflag = rs.getString("ingress_flag"); //BIDS column is pe_ingress_flag
String egressflag = rs.getString("egress_flag"); //BIDS column is pe_egress_flag
String ceingressflag = rs.getString("ce_ingress_flag"); //BIDS column is ce_ingress_flag
String ceegressflag = rs.getString("ce_egress_flag"); //BIDS column is ce_egress_flag
int profileid = rs.getInt("profile_id"); //BIDS column is traffic_profile_id
preparedStatement.setString(1, ingressflag);
preparedStatement.setString(2, egressflag);
preparedStatement.setString(3, ceingressflag);
preparedStatement.setString(4, ceegressflag);
preparedStatement.setInt(5, profileid);
// System.out.println(updateTableSQL);
System.out.println("Record " +profileid +" is updated to traffic_profile_temp table!");
// execute update SQL statement
preparedStatement.addBatch();
rowCount++;
System.out.println(rowCount);
}
preparedStatement.executeBatch();
} catch (SQLException e) {
System.out.println(e.getMessage());
} finally {
if (preparedStatement != null) {
preparedStatement.close();
}
if (dbConnection != null) {
dbConnection.close();
System.out.println("Database BIDS Connection is closed");
}
}
}
update this part
while (rs.next() && rowCount < 500) {
with
while (rs.next()) {
and
// execute update SQL statement
preparedStatement.addBatch();
rowCount++;
System.out.println(rowCount);
with
// execute update SQL statement
preparedStatement.addBatch();
rowCount++;
if(rowCount % 500 == 0){
preparedStatement.executeBatch();
}
System.out.println(rowCount);
This checks whether rowCount is divisible by 500 and, if so, executes the batch. Don't forget to call executeBatch() once more after the loop finishes, to flush the remaining statements that didn't fill a complete batch of 500; see the sketch below.
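Putting both changes together, the loop would look roughly like this (a sketch based on the code above):
while (rs.next()) {
    // ... set the five parameters as before ...
    preparedStatement.addBatch();
    rowCount++;
    if (rowCount % 500 == 0) {
        preparedStatement.executeBatch(); // flush a full chunk of 500
        // if auto-commit is disabled, also call dbConnection.commit() here
    }
}
preparedStatement.executeBatch(); // flush the remaining updates (fewer than 500)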