public static List<SPACE_CreateLicenseModel> SPACE_getDetails() throws ClassNotFoundException, FileNotFoundException, JSONException{
SPACE_CreateLicenseModel view = new SPACE_CreateLicenseModel();
Statement stmt = null;
Connection connect = null;
List<SPACE_CreateLicenseModel> allData = new ArrayList<SPACE_CreateLicenseModel>();
try {
connect = SPACE_DBController.SPACE_getConnection();
stmt = connect.createStatement();
JSONObject obj = SPACE_Parse.parse ("C:/Users/Rachana/workspace/SPACEOM/WebContent/Data/SPACE_Database.json");
String tablename = obj.getString("table_name");
String sql = "SELECT * FROM " + tablename + " WHERE (SPLD_LicenseActiveStatus <> 5 OR SPLD_LicenseActiveStatus IS NULL)";
ResultSet result = stmt.executeQuery(sql);
int i =0;
while (result.next()) {
view.setSPLD_DeviceID_Mfg(result.getString(1));
view.setSPLD_DeviceID_ModelNo(result.getString(2));
view.setSPLD_DeviceID_SrNo(result.getString(3));
view.setSPLD_DeviceID_Search_mode(result.getByte(4));
view.setSPLD_LicenseType(result.getByte(5));
view.setSPLD_LicenseTypeChangedDate(result.getDate(6));
view.setSPLD_LicenseActiveStatus(result.getByte(7));
view.setSPLD_LicenseActiveDate(result.getDate(8));
view.setSPLD_LicenseAccess(result.getByte(9));
view.setSPLD_LicenseAccessMaxNo(result.getInt(10));
view.setSPLD_LicenseAccessCounter(result.getInt(11));
view.setSPLD_LicenseStartDate(result.getDate(12));
view.setSPLD_LicenseExpiryDate(result.getDate(13));
view.setSPLD_LicenseeOrg(result.getString(14));
view.setSPLD_LicenseeAddress(result.getString(15));
view.setSPLD_LocationActive(result.getString(16));
view.setSPDL_Longitude(result.getDouble(17));
view.setSPDL_Latitude(result.getDouble(18));
view.setSPDL_LocationTolerance(result.getFloat(19));
view.setSPLD_FutureOption1(result.getString(20));
view.setSPLD_FutureOption2(result.getString(21));
view.setSPLD_FutureOption3(result.getString(22));
view.setSPLD_FutureOption4(result.getInt(23));
view.setSPLD_FutureOption5(result.getInt(24));
view.setSPLD_StatCounter1_FirstUseDate(result.getDate(25));
view.setSPLD_StatCounter2_MessageTotal(result.getInt(26));
view.setSPLD_StatCounter3_FailedAttempts(result.getInt(27));
view.setSPLD_StatCounter4_FirstFailedAttemptDate(result.getDate(28));
view.setSPLD_StatCounter5_LastFailedAttemptDate(result.getDate(29));
view.setSPLD_StatCounter6(result.getInt(30));
view.setSPLD_StatCounter7(result.getInt(31));
view.setSPLD_StatCounterOption1(result.getString(32));
view.setSPLD_StatCounterOption2(result.getString(33));
view.setSPLD_StatCounterOption3(result.getString(34));
view.setSPLD_StatCounterOption4(result.getInt(35));
view.setSPLD_StatCounterOption5(result.getInt(36));
view.setSPLD_MainContact1Name(result.getString(37));
view.setSPLD_MainContact2Name(result.getString(38));
view.setSPLD_MobileNo1(result.getString(39));
view.setSPLD_MobileNo2(result.getString(40));
view.setSPLD_EmailID1(result.getString(41));
view.setSPLD_EmailID2(result.getString(42));
view.setSPLD_CustomerDetailOption1(result.getString(43));
view.setSPLD_CustomerDetailOption2(result.getString(44));
view.setSPLD_BroadCastGEN1(result.getString(45));
view.setSPLD_BroadCastGEN2(result.getString(46));
view.setSPLD_BroadCastID1(result.getInt(47));
view.setSPLD_DevSpecGEN1(result.getString(48));
view.setSPLD_DevSpecGEN2(result.getString(49));
view.setSPLD_DevSpecGEN3(result.getString(50));
view.setSPLD_DevSpecID1(result.getInt(51));
view.setSPLD_DevSpecID2(result.getInt(52));
view.setSPLD_MessageStatus(result.getString(53).charAt(0));
allData.add(i,view);
i++;
}
} catch (SQLException e1) {
// TODO Auto-generated catch block
e1.printStackTrace();
} catch (ClassNotFoundException e1) {
// TODO Auto-generated catch block
e1.printStackTrace();
}finally{
//finally block used to close resources
try{
if(stmt!=null)
stmt.close();
}catch(SQLException se2){
}// nothing we can do
try{
if(connect!=null)
connect.close();
}catch(SQLException se){
se.printStackTrace();
}
}
return allData;
}
I am fetching all the rows from the database and storing them in a list, but when I display the list only the last row is printed. The list elements are being overwritten, i.e. allData.add(1, view), allData.add(2, view), allData.add(3, view), allData.add(4, view), etc. all contain the same values.
Because you are not creating a new object for each iteration of the loop, the same object is reused, so try:
Statement stmt = null;
Connection connect = null;
List<SPACE_CreateLicenseModel> allData = new ArrayList<SPACE_CreateLicenseModel>();
try {
    connect = SPACE_DBController.SPACE_getConnection();
    ....
    while (result.next()) {
        SPACE_CreateLicenseModel view = new SPACE_CreateLicenseModel();
Cause:
Currently the same object is updated for every row, so all the objects in your list hold the same values (those of the last row).
Resolution:
You need to initialize a new SPACE_CreateLicenseModel inside the loop for every row.
while (result.next()) {
    SPACE_CreateLicenseModel view = new SPACE_CreateLicenseModel();
    view.setSPLD_DeviceID_Mfg(result.getString(1));
    .
    .
    allData.add(i, view);
    i++;
}
Hope this helps
Create a new view object on every iteration of your while loop. Each pass through the loop overwrites the same view object in memory, so the final iteration leaves it holding the last row's values, which is what you see when you print your data.
while(yourCondition){
    view = new SPACE_CreateLicenseModel();
    //your code goes here....
}
Adding the line above inside your loop creates a fresh view object on each iteration, and that new object is what gets added to your allData list.
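For completeness, a minimal sketch of the corrected loop, assuming the same model class, column order and allData list as in the question; it also uses try-with-resources so the Statement and ResultSet are closed without the finally block:

try (Statement stmt = connect.createStatement();
     ResultSet result = stmt.executeQuery(sql)) {
    while (result.next()) {
        // a fresh model object per row, so earlier rows are not overwritten
        SPACE_CreateLicenseModel view = new SPACE_CreateLicenseModel();
        view.setSPLD_DeviceID_Mfg(result.getString(1));
        view.setSPLD_DeviceID_ModelNo(result.getString(2));
        // ... remaining setters exactly as in the original code ...
        allData.add(view); // a plain add() appends; no index variable needed
    }
}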
Related
I have an executable jar file compiled from my program, and it works perfectly fine when I run it on my PC from the command prompt using java -jar [nameofjar.jar]
However, when I tested it on another PC and ran the same jar file from the command prompt, it threw an error:
D:\QA06122018_2>java -jar Indexing.jar
java.lang.NullPointerException
at IndexDriver.processText(IndexDriver.java:81)
at IndexDriver.index(IndexDriver.java:140)
at Main.main(Main.java:44).....
Both PCs are using the same operating system and settings.
I even looked at the code around the error and there doesn't seem to be any problem with it; it ran fine in my IDE.
Is there anything I might have overlooked?
EDIT:
The code :
public PreparedStatement preparedStatement = null;
MysqlAccessIndex con = new MysqlAccessIndex();
public Connection con1 = con.connect();
String path1;
public void index() throws Exception {
// Connection con1 = con.connect();
try {
Statement statement = con1.createStatement();
ResultSet rs = statement.executeQuery("select * from filequeue where Status='Active' LIMIT 5");
while (rs.next()) {
// get the filepath of the PDF document
path1 = rs.getString(2);
int getNum = rs.getInt(1);
Statement test = con1.createStatement();
test.executeUpdate("update filequeue SET STATUS ='Processing' where UniqueID="+getNum);
try {
// call the index function
PDDocument document = PDDocument.load(new File(path1),MemoryUsageSetting.setupTempFileOnly());
if (!document.isEncrypted()) {
PDFTextStripper tStripper = new PDFTextStripper();
for(int p=1; p<=document.getNumberOfPages();++p) {
tStripper.setStartPage(p);
tStripper.setEndPage(p);
try {
String pdfFileInText = tStripper.getText(document);
processText(pdfFileInText);
System.out.println("Page "+p+" done");
}catch (Exception e){
e.printStackTrace();
Statement statement1 = con1.createStatement();
statement1.executeUpdate("update filequeue SET Error ='E0003' where UniqueID="+getNum);
statement1.executeUpdate("update filequeue SET Status ='Error' where UniqueID="+getNum);
con1.commit();
con1.close();
}
}
}
// After completing the process, update status: Complete
Statement pre= con1.createStatement();
pre.executeUpdate("update filequeue SET STATUS ='Complete' where UniqueID="+getNum);
// con1.commit();
preparedStatement.close();
document.close();
System.out.println("Successfully commited changes to the database!");
con1.commit();
// con1.close();
// updateComplete_DB(getNum);
} catch (Exception e) {
try {
System.err.println(e);
Statement statement1 = con1.createStatement();
statement1.executeUpdate("update filequeue SET STATUS ='Error' where UniqueID="+getNum);
statement1.executeUpdate("update filequeue SET Error ='E0002' where UniqueID="+getNum);
con1.commit();
// add rollback function
rollbackEntries();
}catch (Exception e1){
System.out.println("Could not rollback updates :" + e1.getMessage());
}
}
// con1.close();
}
}catch(Exception e){
e.printStackTrace();
//System.out.println("lalala");
}
//con1.commit();
con1.close();
}
The method being called:
public void processText(String text) throws SQLException {
String lines[] = text.split("\\r?\\n");
for (String line : lines) {
String[] words = line.split(" ");
String sql="insert IGNORE into test.indextable values (?,?);";
preparedStatement = con1.prepareStatement(sql);
int i=0;
for (String word : words) {
// check if one or more special characters at end of string then remove OR
// check special characters in beginning of the string then remove
// insert every word directly to table db
word=word.replaceAll("([\\W]+$)|(^[\\W]+)", "");
preparedStatement.setString(1, path1);
preparedStatement.setString(2, word);
preparedStatement.executeUpdate();
}
}
preparedStatement.close();
}
The root cause is that there were no lines to process.
You appear to only create prepared statements inside the for (String line : lines) { loop. But you only close the last statement you created (outside that loop).
When you don't have any lines, preparedStatement is null, because you never created one.
Even when you have lines to process, you are creating lots of prepared statements but only closing the last one.
You should probably create one prepared statement at the start of the method and reuse it for the whole method, closing it at the end.
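A minimal sketch of that approach, reusing the con1 and path1 fields from the question; the single PreparedStatement is created once, reused for every word, and closed by try-with-resources even when there are no lines (batching is an optional extra that saves one round trip per word):

public void processText(String text) throws SQLException {
    String sql = "insert IGNORE into test.indextable values (?,?)";
    // one statement for the whole method, closed automatically on exit
    try (PreparedStatement preparedStatement = con1.prepareStatement(sql)) {
        for (String line : text.split("\\r?\\n")) {
            for (String word : line.split(" ")) {
                // strip leading/trailing non-word characters, as in the original
                word = word.replaceAll("([\\W]+$)|(^[\\W]+)", "");
                preparedStatement.setString(1, path1);
                preparedStatement.setString(2, word);
                preparedStatement.addBatch();
            }
        }
        preparedStatement.executeBatch(); // a no-op when there were no lines
    }
}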
I am trying to execute a query using the PostgreSQL JDBC driver for Java.
I have an issue with memory buildup: my statement runs in a loop and then sleeps.
The problem is that when I look at the job in Task Manager I can see the memory climbing 00,004K at a time. I have read the documentation and I have closed all connections, statements and result sets, but this still happens.
Could you please tell me what in my code is causing this?
String sSqlString = new String("SELECT * FROM stk.comms_data_sent_recv " +
"WHERE msgtype ='RECEIVE' AND msgstat ='UNPRC' " +
"ORDER BY p6_id,msgoccid " +
"ASC; ");
ResultSet rs = null;
Class.forName("org.postgresql.Driver");
Connection connection = DriverManager.getConnection(
"jdbc:postgresql://p6tstc01:5432/DEVC_StockList?autoCloseUnclosedStatements=true&logUnclosedConnections=true&preparedStatementCacheQueries=0&ApplicationName=P6Shunter", "P6dev",
"admin123");
//Main Loop
while(true)
{
try{
Statement statement = connection.createStatement();
statement.executeQuery(sSqlString);
//rs.close();
statement.close();
//connection.close();
rs = null;
//connection = null;
statement =null;
}
finally {
//connection.close();
}
try {
Thread.sleep(loopTime);
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
Notice the commented-out code: I did close everything, but that did not seem to make a difference. What I did see is that statement.executeQuery(sSqlString); seems to be the cause; I think so because if I remove that statement there is no memory leak.
I could be wrong, but please assist me.
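For comparison, a minimal sketch of the original loop body using try-with-resources, assuming the same connection and query; here the ResultSet returned by executeQuery() is captured and closed, which the original loop never does:

// assumes the enclosing method declares "throws SQLException"
while (true) {
    try (Statement statement = connection.createStatement();
         ResultSet rs = statement.executeQuery(sSqlString)) { // ResultSet captured here...
        while (rs.next()) {
            // process the row
        }
    } // ...and closed here, together with the Statement
    try {
        Thread.sleep(loopTime);
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
        break;
    }
}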
UPDATE:
I have changed my code per your recommendations. I hope it is a bit better; please let me know if I need to change anything else.
My main loop :
public static void main(String[] args) throws Exception {
// TODO Auto-generated method stub
//Main Loop
while(true)
{
getAndProcessAllUnprcMessagesFromStockList();
try {
Thread.sleep(loopTime);
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
The function it calls to fetch the data:
public static void getAndProcessAllUnprcMessagesFromStockList() throws Exception
{
ResultSet rs = null;
Statement statement = null;
Connection connection =null;
String sSqlString = new String("SELECT * FROM stk.comms_data_sent_recv " +
"WHERE msgtype ='RECEIVE' AND msgstat ='UNPRC' " +
"ORDER BY p6_id,msgoccid " +
"ASC; ");
try{
Class.forName("org.postgresql.Driver");
connection = DriverManager.getConnection(
"jdbc:postgresql://p6tstc01:5432/DEVC_StockList?autoCloseUnclosedStatements=true&logUnclosedConnections=true&preparedStatementCacheQueries=0&ApplicationName=P6Shunter", "P6dev",
"admin123");
PreparedStatement s = connection.prepareStatement(sSqlString,
ResultSet.TYPE_SCROLL_INSENSITIVE,
ResultSet.CONCUR_READ_ONLY);
rs = s.executeQuery();
while (rs.next()) {
//Process records
UnprcMsg msg = new UnprcMsg();
msg.setP6Id(rs.getString(1));
msg.setMsgOccId(rs.getString(2));
msg.setWsc(rs.getString(3));
msg.setMsgId(rs.getString(4));
msg.setMsgType(rs.getString(5));
msg.setMsgStatus(rs.getString(6));
//JOptionPane.showMessageDialog(null,msg.getP6Id(), "InfoBox: " + "StockListShunter", JOptionPane.INFORMATION_MESSAGE);
//msg2 = null;
}
rs.close();
s.close();
}
catch(Exception e)
{
e.printStackTrace();
}
finally
{
connection.close();
}
}
I have closed my connections, statements and result sets.
I also downloaded Eclipse Memory Analyzer and ran the jar, which executes my main loop, for about an hour. Here is some of the data I got from Memory Analyzer.
Leak suspects: (screenshot)
Now, I know I can't go by the memory usage in Task Manager, but what is the difference? Why does Task Manager show the following (screenshot)?
Should I be concerned about the memory usage I see in Task Manager?
I am getting a null value when reading blob data from the database. What might be the issue? Can someone help me with this?
Connection con = null;
PreparedStatement psStmt = null;
ResultSet rs = null;
try {
try {
Class.forName("oracle.jdbc.driver.OracleDriver");
} catch (ClassNotFoundException e) {
e.printStackTrace();
}
con =
DriverManager.getConnection("jdbc:oracle:thin:#MyDatabase:1535:XE","password","password");
System.out.println("connection established"+con);
psStmt = con
.prepareStatement("Select Photo from Person where Firstname=?");
int i = 1;
psStmt.setLong(1, "Nani");
rs = null;
rs = psStmt.executeQuery();
InputStream inputStream = null;
while (rs.next()) {
inputStream = rs.getBinaryStream(1);
//Blob blob = rs.getBlob(1);
//Blob blob1 = (Blob)rs.getObject(1);
//System.out.println("blob length "+blob1);//rs.getString(1);
}
System.out.println("bytessssssss "+inputStream);//here i am getting null value.
} catch (SQLException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
I believe you didn't use the setString function to assign a value to firstname, which leads to the null.
for example:
ps.preparedStatement("Select photo from person where firstname = ?");
ps.setString(1,"kick"); <----- add this line
system.out.println("bytes "+rs.getBinaryStream(1));
Other suggestions:
There is no need to set rs = null; inside the try/catch block, because you already have rs = null; at the beginning of your code.
You can also get rid of InputStream inputStream = null; and declare inputStream inside the while loop where it is assigned (note that InputStream inputStream = new InputStream(); would not compile, since InputStream is abstract).
source you should take a look at
The most obvious error is using setLong instead of setString.
However, one practice here is fatal: declaring everything in advance. In some other languages that is good practice, but in Java one should declare variables as close as possible to where they are used.
That reduces scope, and it would have revealed the error: inputStream is used after rs.next() has returned false, outside the loop, possibly because no records were found.
Declaring as near as feasible also lets you use try-with-resources, which is used below to automatically close the statement and result set.
Connection con = null;
try {
Class.forName("oracle.jdbc.driver.OracleDriver");
con = DriverManager.getConnection(
"jdbc:oracle:thin:#MyDatabase:1535:XE","password","password");
System.out.println("connection established"+con);
try (PreparedStatement psStmt = con.prepareStatement(
"Select Photo from Person where Firstname=?")) {
int i = 1;
psStmt.setString(1, "Nani");
try (ResultSet rs = psStmt.executeQuery()) {
while (rs.next()) {
try (InputStream inputStream = rs.getBinaryStream(1)) {
//Blob blob = rs.getBlob(1);
//Blob blob1 = (Blob)rs.getObject(1);
//System.out.println("blob length "+blob1);//rs.getString(1);
Files.copy(inputStream, Paths.get("C:/photo-" + i + ".jpg"));
}
++i;
}
//ERROR System.out.println("bytessssssss "+inputStream);
} // Closes rs.
} // Closes psStmt.
}
1- When setting a parameter's value in the SQL query, be sure to use the data type that matches the field. Here you should use
psStmt.setString(1, "Nani");
instead of
psStmt.setLong(1, "Nani");
2- Make sure that the query is correct (Table name, field name).
3- Make sure that the table contains data.
Excuse any bad practices, as I am very new to threading. I have a program that calls my API and gets data back in JSON format; each request returns one row of data. Altogether I need to retrieve about 2,000,000 rows a day, which means 2,000,000 requests (I understand that this is bad design, but the system was not designed for this purpose; it is just what I need to do for the next couple of weeks).
When I ran it on a single thread I was processing about 200 requests a minute, which is much too slow. So I created 12 threads and got to about 5,500 rows a minute, which was a great improvement. The problem is that, on average, only about 90% of the rows end up in the database (I ran it a few times to make sure). Before each insert I printed the URL that was sent to a file, and I checked that each insert statement was successful (returned 1 when executed), and it all seems fine. Every run inserts roughly 90%, but the number varies and has never been consistent. Am I doing something wrong in my Java code?
Essentially, main creates 12 threads. Each thread's run method creates a new instance of MySQLPopulateHistData and passes it a start and end integer, which are used as the range in the insert statement. I have done a lot of System.out.println-style testing and can see that all the threads start and all 12 instances (one per thread) execute. Does anyone have any idea what it could be?
MAIN:
import java.io.IOException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
public class MainClass {
public static void main(String[] args) {
try {
//create a pool of threads
Thread[] threads = new Thread[12];
// submit jobs to be executing by the pool
for (int i = 0; i <12; i++) {
threads[i] = new Thread(new Runnable() {
public void run() {
try {
new MySQLPopulateHistData(RangeClass.IdStart, RangeClass.IdEnd);
} catch (Throwable e) {
//TODO Auto-generated catch block
e.printStackTrace();
}
}
});
threads[i].start();
Thread.sleep(1000);
RangeClass.IdStart = RangeClass.IdEnd + 1;
RangeClass.IdEnd = RangeClass.IdEnd + 170000;
}
} catch (Throwable e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
MyDataSourceFactory.class
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;
import javax.sql.DataSource;
import com.mysql.jdbc.jdbc2.optional.MysqlDataSource;
public class MyDataSourceFactory {
static String url = "jdbc:mysql://localhost:3306/my_schema";
static String userName = "root";
static String password = "password";
public synchronized static DataSource getMySQLDataSource() {
MysqlDataSource mysqlDS = null;
mysqlDS = new MysqlDataSource();
mysqlDS.setURL(url);
mysqlDS.setUser(userName);
mysqlDS.setPassword(password);
return mysqlDS;
}
}
MySQLPopulateHistData.class
public class MySQLPopulateHistData {
public MySQLPopulateHistData(int s, int e ) throws IOException, Throwable{
getHistory(s,e);
}
public synchronized void getHistory(int start, int end){
DataSource ds = MyDataSourceFactory.getMySQLDataSource();
Connection con = null;
Connection con2 = null;
Statement stmt = null;
Statement stmt2 = null;
ResultSet rs = null;
try {
con = ds.getConnection();
con2 = ds.getConnection();
stmt = con.createStatement();
stmt2 = con.createStatement();
rs = stmt.executeQuery("SELECT s FROM sp_t where s_id BETWEEN "+ start +" AND "+ end + " ORDER BY s;");
String s = "";
while(rs.next()){
s = rs.getString("s");
if( s == ""){
}
else{
try{
URL fullUrl = new URL(//My Url to my api with password with start and end range);
InputStream is = fullUrl.openStream();
String jsonStr = getStringFromInputStream(is);
JSONObject j = new JSONObject(jsonStr);
JSONArray arr = j.getJSONObject("query").getJSONObject("results").getJSONArray("quote");
for(int i=0; i<arr.length(); i++){
JSONObject obj = arr.getJSONObject(i);
String symbol = obj.getString("s");
stmt2.executeUpdate("INSERT into sp2_t(s) VALUES ('"+ s +"') BETWEEN "+start+" AND "+ end +";");
}
}
catch(Exception e){
}
}
s = "";
}
} catch (Exception e) {
e.printStackTrace();
}finally{
try {
if(rs != null) rs.close();
if(stmt != null) stmt.close();
if(con != null) con.close();
if(stmt2 != null) stmt.close();
if(con2 != null) con.close();
} catch (Exception e) {
e.printStackTrace();
}
}
}
}
UPDATE:
So I put:
(if s.equals("")){
System.out.println("EMPTY");
}
and it never printed EMPTY. After the JSON response is converted to the JSONArray I added:
if (arr.length() > 0) {
    StaticClassHolder.cntResponses++;
}
This is just a static variable in another class that gets incremented every time there is a valid JSON response. It ended up equal to exactly the amount it was supposed to be. So it seems the program receives all the responses properly and parses them properly, but is not INSERTING them all into the database, and I can't figure out why.
I also faced a similar issue while inserting records into Oracle. Since I didn't find any concrete solution, I tried it with a single thread and everything went fine.
There are several reasons why this does not work:
A normal computer can only handle about 4-8 threads in total per CPU. Since the system uses some of those threads, only a few of your threads can actually run at the same time; the computer handles this by pausing some threads and then running another.
If you try to send several queries through the socket to the MySQL server at the same time, chances are that some of the requests will not work and you will lose some of your data.
For now I do not have any solution for faster updates of the table.
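To illustrate the point about limiting the number of threads: a fixed-size pool caps how many range workers run at once while still covering all twelve ranges. This is only a sketch reusing the MySQLPopulateHistData and RangeClass names from the question:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public static void main(String[] args) throws InterruptedException {
    ExecutorService pool = Executors.newFixedThreadPool(4); // at most 4 workers run concurrently
    for (int i = 0; i < 12; i++) {
        final int start = RangeClass.IdStart;
        final int end = RangeClass.IdEnd;
        pool.submit(() -> {
            try {
                new MySQLPopulateHistData(start, end);
            } catch (Throwable e) {
                e.printStackTrace();
            }
        });
        RangeClass.IdStart = RangeClass.IdEnd + 1;
        RangeClass.IdEnd = RangeClass.IdEnd + 170000;
    }
    pool.shutdown();
    pool.awaitTermination(1, TimeUnit.DAYS); // block until every range worker has finished
}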
I am trying to use an SQL database with a Java program. I built a table that is 7 columns wide and 2.5 million rows (the next one I need to build will be about 200 million rows). I have two problems: building the SQL table is too slow (about 2,000 rows/minute), and searching the database is too slow (I need to find over 100 million rows in under a second if possible; it currently takes over a minute). I have tried creating a CSV file and importing it, but I can't get it to work.
I am using XAMPP and phpMyAdmin on my computer (i5 + 6 GB RAM). I have three methods I am testing: createTable(), writeSQL(), and searchSQL().
createTable:
public static void createTable() {
String driverName = "org.gjt.mm.mysql.Driver";
Connection connection = null;
try {
Class.forName(driverName);
} catch (ClassNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
String serverName = "localhost";
String mydatabase = "PokerRanks4";
String url = "jdbc:mysql://" + serverName + "/" + mydatabase;
String username = "root";
String password = "";
try {
connection = DriverManager.getConnection(url, username, password);
} catch (SQLException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
///////////////
String table = "CREATE TABLE ranks(deckForm bigint(10) NOT NULL,rank0 int(2) NOT NULL,rank1 int(2) NOT NULL,rank2 int(2) NOT NULL,rank3 int(2) NOT NULL,rank4 int(2) NOT NULL,rank5 int(2) NOT NULL,PRIMARY KEY (deckForm),UNIQUE id (deckForm),KEY id_2 (deckForm))";
try {
Statement st = connection.createStatement();
st.executeUpdate(table);
} catch (SQLException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
///////////////
try {
connection.close();
} catch (SQLException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
writeSQL():
public static void writeSQL() {
String driverName = "org.gjt.mm.mysql.Driver";
Connection connection = null;
try {
Class.forName(driverName);
} catch (ClassNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
String serverName = "localhost";
String mydatabase = "PokerRanks4";
String url = "jdbc:mysql://" + serverName + "/" + mydatabase;
String username = "root";
String password = "";
try {
connection = DriverManager.getConnection(url, username, password);
} catch (SQLException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
/////////////// Prepared Statement with Batch
PreparedStatement statement = null;
String sql = "INSERT INTO ranks VALUES (? ,0, 0, 0, 0, 0, 0)";
long start = System.currentTimeMillis();
try {
statement = connection.prepareStatement(sql);
for (int i = 0; i < 100; i++) {
for (int j = 0; j < 100; j++) {
statement.setLong(1, (i*100 + j));
statement.addBatch();
}
System.out.println(i);
statement.executeBatch();
}
} catch (Exception e) {
e.printStackTrace();
} finally {
if (statement != null) {
try {
statement.close();
} catch (SQLException e) {
} // nothing we can do
}
if (connection != null) {
try {
connection.close();
} catch (SQLException e) {
} // nothing we can do
}
}
System.out.println("Total Time: " + (System.currentTimeMillis() - start) / 1000 );
///////////////
}
searchSQL():
public static void searchSQL() {
String driverName = "org.gjt.mm.mysql.Driver";
Connection connection = null;
try {
Class.forName(driverName);
} catch (ClassNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
String serverName = "localhost";
String mydatabase = "PokerRanks2";
String url = "jdbc:mysql://" + serverName + "/" + mydatabase;
String username = "root";
String password = "";
try {
connection = DriverManager.getConnection(url, username, password);
} catch (SQLException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
/////////////// Option 1, Prepared Statement
ResultSet rs = null;
PreparedStatement pstmt = null;
String query = "SELECT rank0, rank1, rank2, rank3, rank4, rank5 FROM ranks WHERE deckForm = ?";
long start = System.currentTimeMillis();
try {
pstmt = connection.prepareStatement(query);
for (int i = 0; i < 100000; i++) {
pstmt.setLong(1, 1423354957);
rs = pstmt.executeQuery();
while (rs.next()) {
int[] arr = {rs.getInt(1), rs.getInt(2), rs.getInt(3), rs.getInt(4), rs.getInt(5), rs.getInt(6)};
}
}
} catch (SQLException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
System.out.println("Total Time: " + (System.currentTimeMillis() - start) / 1000 );
///////////////
/*
/////////////// Option 2
Statement st = null;
long start = System.currentTimeMillis();
try {
st = connection.createStatement();
ResultSet rs = null;
long deckForm = 1012213456;
for (int i = 0; i < 100000; i++) {
rs = st.executeQuery("SELECT rank0, rank1, rank2, rank3, rank4, rank5 FROM ranks WHERE deckForm = " + deckForm);
while (rs.next()) {
int[] arr = {rs.getInt(1), rs.getInt(2), rs.getInt(3), rs.getInt(4), rs.getInt(5), rs.getInt(6)};
}
}
} catch (SQLException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
System.out.println("Total Time: " + (System.currentTimeMillis() - start) / 1000 );
///////////////
*/
try {
connection.close();
} catch (SQLException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
Sorry that's so long. I've tried everything I can think of to make this faster but I can't figure it out. Any suggestions?
Well, there are a few improvements you could make:
You are creating a connection each time you want to search, write or create a table; you should use a pooled connection and a DataSource (a sketch follows below).
Optimize your queries by looking at their explain plans, and optimize your table relations and indexes.
You can use stored procedures and call them.
Well, that's all I can help with; there are certainly more tips.
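On the first point, a minimal sketch of a pooled DataSource, assuming the HikariCP library is on the classpath; the URL and credentials are the ones from the question:

import javax.sql.DataSource;
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

public class Db {
    private static final HikariDataSource ds;
    static {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl("jdbc:mysql://localhost/PokerRanks4");
        config.setUsername("root");
        config.setPassword("");
        config.setMaximumPoolSize(10); // reuse at most 10 physical connections
        ds = new HikariDataSource(config);
    }
    private Db() { }
    public static DataSource getDataSource() {
        return ds;
    }
}

Each of the three methods would then borrow a connection with Db.getDataSource().getConnection() (ideally in a try-with-resources block) instead of calling DriverManager.getConnection every time.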
As to the insert speed, you need to disable all the indexes prior to doing the insert and re-enable them after you're done. Please see Speed of Insert Statements for a lot of detailed information on improving bulk insert speed.
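On the insert side, batching combined with Connector/J's rewriteBatchedStatements connection property usually helps as well; a sketch, assuming the same ranks table and local server as in the question:

// assumes the enclosing method declares "throws SQLException"
String url = "jdbc:mysql://localhost/PokerRanks4?rewriteBatchedStatements=true";
try (Connection connection = DriverManager.getConnection(url, "root", "");
     PreparedStatement statement = connection.prepareStatement(
             "INSERT INTO ranks VALUES (?, 0, 0, 0, 0, 0, 0)")) {
    connection.setAutoCommit(false);      // commit per batch, not per row
    for (long deckForm = 1; deckForm <= 1_000_000; deckForm++) {
        statement.setLong(1, deckForm);
        statement.addBatch();
        if (deckForm % 10_000 == 0) {
            statement.executeBatch();     // the driver rewrites this into multi-row INSERTs
            connection.commit();
        }
    }
    statement.executeBatch();
    connection.commit();
}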
The query speed is probably limited by your CPU and disk speeds. You may have to throw much more hardware at the problem.
building the SQL table is too slow (about 2,000 rows/minute)
For inserting a great number of rows, a heap table is the right choice: it is the basic table type (also called a persistent page array), usually created by a plain CREATE TABLE. It is very efficient for inserting, because rows are simply added to the first free slot found or to the end of the table. On the other hand, searching it is very inefficient, because no ordering of the rows is guaranteed, which matches the slow searches you mention.
searching the database is too slow (I need to find over 100 million
rows in under a second if possible, it currently takes over a minute)
For this you should create a table whose physical organization makes searching efficient. If you were using Oracle, it offers many constructions for the physical implementation, for example index-organized tables, data clustering, and clustered tables (index, hash, sorted hash).
I am not sure about SQL Server, but it also has clustered tables; for MySQL I don't know exactly, and I don't want to tell you something wrong. I am not saying that MySQL is bad or worse than Oracle, just that it does not offer some of the physical-implementation techniques that Oracle does.
So it is hard to give concrete recommendations here, but you should seriously study the physical implementation of database systems, look at relational algebra to optimize your statements, and think about which types of tables to create. As @duffymo said, you can have the database explain your query execution plan (EXPLAIN PLAN FOR) and optimize based on the result. Also think about how you use indexes: an index is a powerful construct, but each index means extra work for every modification of the table, so think carefully about which attributes you index.
Via Google you will find many useful articles about data modeling, physical implementation, etc.
Regards, and best of luck.