I want to insert multiple rows into a MySQL table at once using Java. The number of rows is dynamic. In the past I was doing...
for (String element : array) {
myStatement.setString(1, element[0]);
myStatement.setString(2, element[1]);
myStatement.executeUpdate();
}
I'd like to optimize this to use the MySQL-supported syntax:
INSERT INTO table (col1, col2) VALUES ('val1', 'val2'), ('val1', 'val2')[, ...]
but with a PreparedStatement I don't know of any way to do this since I don't know beforehand how many elements array will contain. If it's not possible with a PreparedStatement, how else can I do it (and still escape the values in the array)?
You can create a batch by PreparedStatement#addBatch() and execute it by PreparedStatement#executeBatch().
Here's a kickoff example:
public void save(List<Entity> entities) throws SQLException {
try (
Connection connection = database.getConnection();
PreparedStatement statement = connection.prepareStatement(SQL_INSERT);
) {
int i = 0;
for (Entity entity : entities) {
statement.setString(1, entity.getSomeProperty());
// ...
statement.addBatch();
i++;
if (i % 1000 == 0 || i == entities.size()) {
statement.executeBatch(); // Execute every 1000 items.
}
}
}
}
It's executed every 1000 items because some JDBC drivers and/or DBs may have a limitation on batch length.
See also:
JDBC tutorial - Using PreparedStatement
JDBC tutorial - Using Statement Objects for Batch Updates
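The chunk-boundary condition used above (`i % 1000 == 0 || i == entities.size()`) can be checked in isolation. Here is a minimal sketch, with a hypothetical flushPoints helper and no database involved, that records the item counts at which executeBatch() would fire:

```java
import java.util.ArrayList;
import java.util.List;

public class FlushPointsDemo {
    // Hypothetical helper: returns the item counts at which the loop above
    // would call executeBatch(), for a given total size and chunk size.
    static List<Integer> flushPoints(int total, int chunk) {
        List<Integer> points = new ArrayList<>();
        int i = 0;
        for (int n = 0; n < total; n++) {
            i++;
            if (i % chunk == 0 || i == total) {
                points.add(i);
            }
        }
        return points;
    }

    public static void main(String[] args) {
        // 2500 items with chunks of 1000: flush at 1000, 2000 and 2500,
        // so every row is sent exactly once, including the final partial chunk.
        System.out.println(flushPoints(2500, 1000));
    }
}
```

Note that when the total is an exact multiple of the chunk size, both halves of the condition are true on the last item, but the batch is still executed only once.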
When the MySQL driver is used, you have to set the connection parameter rewriteBatchedStatements to true (jdbc:mysql://localhost:3306/TestDB?rewriteBatchedStatements=true).
With this parameter the batch is rewritten into a single multi-row INSERT, so the table is locked only once and the indexes are updated only once, which makes it much faster.
Without this parameter, the only advantage of batching is cleaner source code.
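As a sketch of how the parameter can be supplied, it can go either in the JDBC URL itself or in the Properties object passed to DriverManager.getConnection. The host, database, and credentials below are placeholders, and withRewrite is a hypothetical helper:

```java
import java.util.Properties;

public class RewriteParamDemo {
    // Hypothetical helper: append the flag to a base JDBC URL, using
    // '?' or '&' depending on whether the URL already has parameters.
    static String withRewrite(String baseUrl) {
        String sep = baseUrl.contains("?") ? "&" : "?";
        return baseUrl + sep + "rewriteBatchedStatements=true";
    }

    public static void main(String[] args) {
        // Option 1: flag directly in the URL (placeholder host/db).
        String url = withRewrite("jdbc:mysql://localhost:3306/TestDB");

        // Option 2: the same flag via Properties, passed to
        // DriverManager.getConnection(url, props).
        Properties props = new Properties();
        props.setProperty("user", "app");        // placeholder credentials
        props.setProperty("password", "secret"); // placeholder credentials
        props.setProperty("rewriteBatchedStatements", "true");

        System.out.println(url);
    }
}
```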
If you can create your SQL statement dynamically, you can use the following workaround:
String myArray[][] = { { "1-1", "1-2" }, { "2-1", "2-2" }, { "3-1", "3-2" } };
StringBuffer mySql = new StringBuffer("insert into MyTable (col1, col2) values (?, ?)");
for (int i = 0; i < myArray.length - 1; i++) {
mySql.append(", (?, ?)");
}
myStatement = myConnection.prepareStatement(mySql.toString());
for (int i = 0; i < myArray.length; i++) {
myStatement.setString(i, myArray[i][1]);
myStatement.setString(i, myArray[i][2]);
}
myStatement.executeUpdate();
In case the table has an auto-increment column and you need to access the generated keys, you can use the following approach. Do test it before relying on it, because the behavior of getGeneratedKeys() after a batch depends on the driver used. The code below was tested on MariaDB 10.0.12 with the MariaDB JDBC driver 1.2.
Remember that increasing the batch size improves performance only up to a point; on my setup, raising it above 500 actually degraded performance.
public Connection getConnection(boolean autoCommit) throws SQLException {
Connection conn = dataSource.getConnection();
conn.setAutoCommit(autoCommit);
return conn;
}
private void testBatchInsert(int count, int maxBatchSize) {
String querySql = "insert into batch_test(keyword) values(?)";
try {
Connection connection = getConnection(false);
PreparedStatement pstmt = null;
ResultSet rs = null;
boolean success = true;
int[] executeResult = null;
try {
pstmt = connection.prepareStatement(querySql, Statement.RETURN_GENERATED_KEYS);
for (int i = 0; i < count; i++) {
pstmt.setString(1, UUID.randomUUID().toString());
pstmt.addBatch();
if ((i + 1) % maxBatchSize == 0 || (i + 1) == count) {
executeResult = pstmt.executeBatch();
}
}
ResultSet ids = pstmt.getGeneratedKeys(); // note: after several executeBatch() calls this may only hold keys for the last batch, depending on the driver
for (int i = 0; i < executeResult.length; i++) {
ids.next();
if (executeResult[i] == 1) {
System.out.println("Execute Result: " + i + ", Update Count: " + executeResult[i] + ", id: "
+ ids.getLong(1));
}
}
} catch (Exception e) {
e.printStackTrace();
success = false;
} finally {
if (rs != null) {
rs.close();
}
if (pstmt != null) {
pstmt.close();
}
if (connection != null) {
if (success) {
connection.commit();
} else {
connection.rollback();
}
connection.close();
}
}
} catch (SQLException e) {
e.printStackTrace();
}
}
@Ali Shakiba, your code needs some modification. Error part:
for (int i = 0; i < myArray.length; i++) {
myStatement.setString(i, myArray[i][1]);
myStatement.setString(i, myArray[i][2]);
}
Updated code:
String myArray[][] = {
{"1-1", "1-2"},
{"2-1", "2-2"},
{"3-1", "3-2"}
};
StringBuffer mySql = new StringBuffer("insert into MyTable (col1, col2) values (?, ?)");
for (int i = 0; i < myArray.length - 1; i++) {
mySql.append(", (?, ?)");
}
mySql.append(";"); // optional terminator; note the variable name is mySql, not mysql
myStatement = myConnection.prepareStatement(mySql.toString());
for (int i = 0; i < myArray.length; i++) {
myStatement.setString((2 * i) + 1, myArray[i][0]); // JDBC parameter indexes are 1-based
myStatement.setString((2 * i) + 2, myArray[i][1]); // row arrays are 0-based: [0] and [1]
}
myStatement.executeUpdate();
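The same idea can be wrapped in a small helper that builds the multi-row placeholder list for any row count; buildMultiRowInsert is a hypothetical name, not part of any API:

```java
public class MultiRowSqlDemo {
    // Hypothetical helper: builds
    // "insert into <table> (<cols>) values (?, ?), (?, ?), ..." for n rows.
    static String buildMultiRowInsert(String table, String[] cols, int rows) {
        StringBuilder sql = new StringBuilder("insert into ").append(table)
                .append(" (").append(String.join(", ", cols)).append(") values ");
        // One "(?, ?, ...)" group with as many markers as columns.
        String placeholder = "(" + "?, ".repeat(cols.length - 1) + "?)";
        for (int r = 0; r < rows; r++) {
            sql.append(r == 0 ? "" : ", ").append(placeholder);
        }
        return sql.toString();
    }

    public static void main(String[] args) {
        System.out.println(buildMultiRowInsert("MyTable", new String[]{"col1", "col2"}, 3));
    }
}
```

The parameter index for row i, column j (both 0-based) is then `i * cols.length + j + 1`, which is exactly the `(2 * i) + 1` / `(2 * i) + 2` arithmetic above for two columns.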
This might be helpful if you need to pass an array to a PreparedStatement.
Store the required values in an array and pass it to a function that inserts them.
String sql = "INSERT INTO table (col1,col2) VALUES (?,?)";
String array[][] = new String[10][2];
for (int i = 0; i < array.length; i++) {
//Assigning the values in individual rows (arrays use length, not size()).
array[i][0] = "sampleData1";
array[i][1] = "sampleData2";
}
try {
DBConnectionPrepared dbcp = new DBConnectionPrepared();
if (dbcp.putBatchData(sql, array) != null) {
System.out.println("Success");
} else {
System.out.println("Failed");
}
} catch (Exception e) {
e.printStackTrace();
}
putBatchData(sql,2D_Array)
public int[] putBatchData(String sql, String args[][]) {
int status[] = null;
try {
PreparedStatement stmt = con.prepareStatement(sql);
for (int i = 0; i < args.length; i++) {
for (int j = 0; j < args[i].length; j++) {
stmt.setString(j + 1, args[i][j]);
}
stmt.addBatch();
}
status = stmt.executeBatch(); // execute the whole batch once, after the loop
} catch (Exception e) {
e.printStackTrace();
}
return status;
}
It is possible to submit multiple updates in JDBC.
Statement, PreparedStatement, and CallableStatement objects can all be used for batch updates, typically with auto-commit disabled.
The addBatch() and executeBatch() methods are available on all three statement types.
Here, addBatch() adds a set of statements or parameters to the current batch.
Related
How does this source code create a new database record? I am reverse engineering a Java application, and I don't have much Java experience myself. I would expect to see something like INSERT INTO ShipBargePipe (columns) VALUES (values), but all I see is a dbTransfers object and nothing else. Does anyone have any idea how this works?
Thanks in advance.
int insertSBP(String direction, int selectedTransferTypeNumber, int selectedSBPNumber, Timestamp shouldStart, int barrels, String notes)
throws NumberFormatException, SQLException
{
int newTransferNumber = -1;
Connection conn = null;
try
{
int userNumber = GV.user.getUserNumber();
conn = GV.getConnection(false);
ResultSetCA selectedProducts = (ResultSetCA)this.jListProducts.getSelectedVectors();
ResultSetCA selectedSources = null;
ResultSetCA selectedDestinations = null;
if (direction.equals("Outbound")) {
selectedSources = (ResultSetCA)this.jListSourceTanks.getSelectedVectors();
} else {
selectedDestinations = (ResultSetCA)this.jListDestinationTanks
.getSelectedVectors();
}
this.dbTransfers.resetChanged();
this.dbTransfers.setInt(selectedTransferTypeNumber, "transferTypeNumber");
this.dbTransfers.setInt(userNumber, "userNumber");
this.dbTransfers.setTimeStamp(shouldStart, "ShouldStartStamp");
this.dbTransfers.setInt(selectedSBPNumber, "SBPNumber");
this.dbTransfers.setString(notes, "Notes");
this.dbTransfers.setInt(barrels, "BarrelsRequested");
this.dbTransfers.insert(conn);
newTransferNumber = DbObject.lastId(conn);
if (direction.equals("Outbound")) {
for (int counter = 0; counter < selectedSources.size(); counter++)
{
int tankNumber = selectedSources.getInt(counter, 0);
int customerNumber = selectedSources.getInt(counter, 2);
this.dbTransferTank.resetChanged();
this.dbTransferTank.setInt(newTransferNumber, "transferNumber");
this.dbTransferTank.setInt(tankNumber, "tankNumber");
this.dbTransferTank.setString("s", "sourceDest");
this.dbTransferTank.setInt(customerNumber, "customerNumber");
this.dbTransferTank.setNull("startBarrels");
this.dbTransferTank.insert(conn);
}
} else {
for (int counter = 0; counter < selectedDestinations.size(); counter++)
{
int tankNumber = selectedDestinations.getInt(counter, 0);
int customerNumber = selectedDestinations.getInt(counter, 2);
this.dbTransferTank.resetChanged();
this.dbTransferTank.setInt(newTransferNumber, "transferNumber");
this.dbTransferTank.setInt(tankNumber, "tankNumber");
this.dbTransferTank.setString("d", "sourceDest");
this.dbTransferTank.setInt(customerNumber, "customerNumber");
this.dbTransferTank.setNull("startBarrels");
this.dbTransferTank.insert(conn);
}
}
for (int counter = 0; counter < selectedProducts.size(); counter++)
{
int productNumber = selectedProducts.getInt(counter, 0);
this.dbProductTransfer.resetChanged();
this.dbProductTransfer.setInt(newTransferNumber, "transferNumber");
this.dbProductTransfer.setInt(productNumber, "productNumber");
this.dbProductTransfer.insert(conn);
}
conn.commit();
}
catch (SQLException e)
{
if (conn != null) {
conn.rollback();
}
throw e;
}
return newTransferNumber;
}
Edit: Here is the dbObject insert method and its dependent createInsert method.
It looks like it dynamically loops through the changed columns to build the SQL query string. Maybe there is a way to enable logging on the MySQL database so I can see exactly what query strings are run against it?
String createInsert()
throws SQLException
{
String fieldNames = "";
String values = "";
for (int i = 0; i < this.fields.length; i++) {
if (this.fields[i].isChanged()) {
if (fieldNames.equals(""))
{
fieldNames = this.fields[i].getName();
values = "?";
}
else
{
fieldNames = fieldNames + ", " + this.fields[i].getName();
values = values + ",?";
}
}
}
if (fieldNames.equals("")) { // the decompiled source read this.fields.equals(""), which can never be true
throw new SQLException("The table " + this.tableName +
" does not have anything to insert");
}
return createInsert(this.schema + "." + this.tableName, fieldNames, values);
}
public void insert(Connection conn)
throws SQLException
{
Statement stmt = null;
PreparedStatement ps = null;
int i = 0;
int j = 1;
try
{
String dml = createInsert();
printDebug(dml);
ps = conn.prepareStatement(dml);
for (i = 0; i < this.fields.length; i++) {
if (this.fields[i].isChanged())
{
this.fields[i].setInPreparedStatement(ps, j);
if (this.fields[i].getObject() == null) {
printDebug(j + ": " + this.fields[i].getObject());
} else {
printDebug(j + ": '" + this.fields[i].getObject() + "'");
}
j++;
}
}
int v = ps.executeUpdate();
if (v != 1) {
throw new SQLException("I can not insert the table " + this.tableName);
}
}
finally
{
resetChanged();
if (ps != null) {
ps.close();
}
if (stmt != null) {
stmt.close();
}
}
}
Edit: After further digging, I enabled logging on MySQL, and it shows exactly what is going on behind the scenes when new records are added, deleted, etc.
170426 15:39:16 4 Query SET autocommit=0
4 Prepare [5] SELECT transfertypenumber, transfertypename, abbreviation FROM rc.transfertypes WHERE transfertypename= ?
4 Execute [5] SELECT transfertypenumber, transfertypename, abbreviation FROM rc.transfertypes WHERE transfertypename= 'Tank'
4 Prepare [6] INSERT INTO rc.transfers(transfertypenumber, usernumber, shouldstartstamp, notes, barrelsrequested) VALUES (?,?,?,?,?)
4 Execute [6] INSERT INTO rc.transfers(transfertypenumber, usernumber, shouldstartstamp, notes, barrelsrequested) VALUES (5,49,'2017-04-26 15:39:05','ZACKSCRIVEN',999)
4 Prepare [7] SELECT last_insert_id()
4 Execute [7] SELECT last_insert_id()
4 Prepare [8] INSERT INTO rc.transfertank(transfernumber, tanknumber, sourcedest, customernumber) VALUES (?,?,?,?)
4 Execute [8] INSERT INTO rc.transfertank(transfernumber, tanknumber, sourcedest, customernumber) VALUES (76265,1,'s',18)
4 Prepare [9] INSERT INTO rc.transfertank(transfernumber, tanknumber, sourcedest, customernumber) VALUES (?,?,?,?)
4 Execute [9] INSERT INTO rc.transfertank(transfernumber, tanknumber, sourcedest, customernumber) VALUES (76265,9,'d',18)
4 Prepare [10] INSERT INTO rc.producttransfer(transfernumber, productnumber) VALUES (?,?)
4 Execute [10] INSERT INTO rc.producttransfer(transfernumber, productnumber) VALUES (76265,21)
4 Query commit
dbTransfers and dbTransferTank seem to represent some sort of ORM (object-relational mapping) object. The INSERT statement isn't called directly by this code; it is built and executed inside these objects' respective .insert methods.
I am trying to insert all the keys from a HashMap into a database. My first approach was to insert the keys one by one, but the HashMap holds several million keys, so this took a lot of time.
I did some research and stumbled upon the PreparedStatement interface, so I came up with this piece of code to create a batch of 10000 elements and then send them to the database together.
final int batchSize = 10000;
int count = 0;
Connection dbConnection = null;
try {
dbConnection = getDBConnection();
String SQL = "INSERT INTO masterdict (window) VALUES(?)";
PreparedStatement ps = dbConnection.prepareStatement(SQL);
for (String k : masterDict.keySet()) {
ps.setString(1,k);
ps.addBatch();
if(++count % batchSize == 0) {
System.out.println(count);
ps.executeBatch();
}
}
ps.executeBatch();
ps.close();
dbConnection.close();
For some reason, though, this approach takes exactly the same time to complete as the first one. Can anyone explain why that is the case?
After reading through the comments, I ended up with this new version of the code, which works just fine:
final int batchSize = 10000;
int count = 0;
Connection dbConnection = null;
try {
dbConnection = getDBConnection();
String SQL = "INSERT INTO masterdict (window) VALUES(?)";
PreparedStatement ps = dbConnection.prepareStatement(SQL);
dbConnection.setAutoCommit(false);
for (String k : masterDict.keySet()) {
ps.setString(1,k);
ps.addBatch();
if(++count % batchSize == 0) {
System.out.println(count);
ps.executeBatch();
dbConnection.commit();
}
}
ps.executeBatch();
dbConnection.commit();
ps.close();
} catch (SQLException e) {
if (dbConnection != null) {
dbConnection.rollback();
}
System.out.println(e.getMessage());
} finally {
dbConnection.close();
}
I have a method which retrieves values from a result set where some column values are 1.
Now I want to apply a condition: when beam_current = 101.20, beam_energy = 2500.063 and st1_prmt_status_p45 = 1, the values should be printed.
My code for displaying the values is:
public LinkedHashMap<String, Integer> beam_CurrentStatus() throws SQLException {
try
{
con = getConnection();
stmt = con.createStatement();
String sql = "SELECT TOP 1 c.logtime, a.BL1_data_SS_ST,a.BL2_data_SS_ST,a.BL3_data_SS_ST,a.BL4_data_SS_ST,a.BL5_data_SS_ST,a.BL6_data_SS_ST,a.BL7_data_SS_ST,a.BL8_data_SS_ST,a.BL9_data_SS_ST,a.BL10_data_SS_ST,a.BL11_data_SS_ST, a.BL12_data_SS_ST,a.BL13_data_SS_ST,a.BL14_data_SS_ST,a.BL15_data_SS_ST,a.BL16_data_SS_ST,a.BL17_data_SS_ST,a.BL18_data_SS_ST,a.BL19_data_SS_ST,a.BL20_data_SS_ST,a.BL21_data_SS_ST,a.BL22_data_SS_ST,a.BL23_data_SS_ST,a.BL24_data_SS_ST,a.BL25_data_SS_ST,a.BL26_data_SS_ST,a.BL27_data_SS_ST,b.st1_prmt_status_p45,c.beam_current,c.beam_energy from INDUS2_BLFE.dbo.main_BLFE_status a inner join INDUS2_MSIS.dbo.main_MSIS_status b on a.logtime=b.logtime inner join INDUS2_BDS.dbo.DCCT c on b.logtime=c.logtime ORDER BY c.logtime DESC ";
stmt.executeQuery(sql);
rs = stmt.getResultSet();
ResultSetMetaData rsmd = rs.getMetaData();
while (rs.next()) {
for (int j = 2; j < 29; j++) {
if (rs.getInt(j) == 1) {
String name = rsmd.getColumnLabel(j);
map.put(name, rs.getInt(j));
}
}
}
} catch (Exception e) {
System.out.println("\nException in Bean " + e.getMessage());
} finally {
closeConnection(stmt, rs, con);
}
return map;
}
I want to apply the condition like
if(rs.getInt(29)==1|| rs.getDouble(30)==101.20||rs.getDouble(30)==2500.063)
{
for (int j = 2; j < 29; j++)
{.......
But this if condition has no effect on the for loop. How do I apply this condition inside the while loop?
I think the condition below is wrong:
if(rs.getInt(29)==1|| rs.getDouble(30)==101.20||rs.getDouble(30)==2500.063)
Your requirement is:
st1_prmt_status_p45 = 1
beam_current = 101.20
beam_energy = 2500.063
The condition should be:
if(rs.getInt(29)==1|| rs.getDouble(30)==101.20||rs.getDouble(31)==2500.063)
That is, rs.getDouble(30) ===> rs.getDouble(31), because beam_energy is the 31st column in the SELECT list. (And if all three conditions must hold at the same time, use && instead of ||.)
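To avoid this kind of off-by-one mistake, you can compute the 1-based JDBC index from the SELECT list instead of counting by hand. A minimal sketch with a shortened, illustrative column list (jdbcIndex is a hypothetical helper; with a live ResultSet you could equally use the column-label overloads, e.g. rs.getDouble("beam_energy")):

```java
public class ColumnIndexDemo {
    // Hypothetical helper: 1-based position of a column in the SELECT list,
    // which is the index ResultSet.getDouble(int) expects.
    static int jdbcIndex(String[] selectList, String column) {
        for (int i = 0; i < selectList.length; i++) {
            if (selectList[i].equalsIgnoreCase(column)) {
                return i + 1;
            }
        }
        throw new IllegalArgumentException("not in select list: " + column);
    }

    public static void main(String[] args) {
        // Shortened select list, for illustration only.
        String[] cols = {"logtime", "st1_prmt_status_p45", "beam_current", "beam_energy"};
        // beam_current and beam_energy get distinct, consecutive indexes.
        System.out.println(jdbcIndex(cols, "beam_current"));
        System.out.println(jdbcIndex(cols, "beam_energy"));
    }
}
```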
I've written a method that performs the following commands: select, insert, update, delete. Originally I only wanted the first two columns of output after running SELECT * FROM SithLords. How can I get all of the rows and columns? Also, how can I let the user enter the select command across multiple lines, like this:
SELECT jedi_name
FROM SithLords
WHERE level = 'master';
As of now, it only executes with one line:
select jedi_name from SithLords where level = 'master';
Code:
public void actionPerformed(ActionEvent e) {
String query = queryStatements.getText();
try {
PreparedStatement stmt = connection.prepareStatement(query);
if(query.matches("(select|SELECT).*")) {
ResultSet result = stmt.executeQuery();
StringBuilder strResult = new StringBuilder();
while(result.next()) {
strResult.append(result.getString(1)).append(" ").append(result.getString(2));
strResult.append("\n");
}
queryResults.setText(strResult.toString());
} else if(query.matches("(insert|INSERT).*")) {
stmt.executeUpdate(); // executeUpdate(String) must not be called on a PreparedStatement
} else if(query.matches("(update|UPDATE).*")) {
stmt.executeUpdate();
} else if(query.matches("(delete|DELETE).*")) {
stmt.executeUpdate();
} else {
System.out.println("Not supported yet!");
}
} catch (SQLException error) {
error.printStackTrace();
}
}
If what you want is to retrieve all of the columns from the result set, you can ask it how many columns it has (n) and access them from 1 to n:
ResultSet result = stmt.executeQuery();
ResultSetMetaData md = result.getMetaData();
int nCols = md.getColumnCount();
StringBuilder strResult = new StringBuilder();
while (result.next()) {
    for (int c = 1; c <= nCols; c++)
        strResult.append(result.getString(c)).append(" ");
    strResult.append("\n");
}
Concerning your multiline commands, you can replace the end-of-line characters with spaces. Note that String is immutable, so the result must be assigned back:
query = query.replaceAll("(\\r|\\n)", " ");
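A minimal, self-contained sketch of that normalization (normalize is a hypothetical helper; the regex also collapses whitespace around the line breaks):

```java
public class NormalizeQueryDemo {
    // Collapse line breaks (and surrounding whitespace) into single spaces,
    // so a multiline SELECT becomes a one-line statement.
    static String normalize(String query) {
        return query.replaceAll("\\s*[\\r\\n]+\\s*", " ").trim();
    }

    public static void main(String[] args) {
        String multiline = "SELECT jedi_name\nFROM SithLords\nWHERE level = 'master';";
        System.out.println(normalize(multiline));
    }
}
```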