java - Append row in SQL without replacing previous row

So I have a database with values like this.
I'm trying to append the data from this txt file using INSERT INTO, without replacing the existing rows,
but when I reload/refresh the database there is no new data appended.
Here is my code:
public static void importDatabase(String fileData){
    try{
        File database = new File(fileData);
        FileReader fileInput = new FileReader(database);
        BufferedReader in = new BufferedReader(fileInput);
        String line = in.readLine();
        line = in.readLine();
        String[] data;
        while (line != null){
            data = line.split(",");
            int ID = Integer.parseInt(data[0]);
            String Nama = data[1];
            int Gaji = Integer.parseInt(data[2]);
            int Absensi = Integer.parseInt(data[3]);
            int cuti = Integer.parseInt(data[4]);
            String Status = data[5];
            String query = "insert into list_karyawan values(?,?,?,?,?,?)";
            ps = getConn().prepareStatement(query);
            ps.setInt(1,ID);
            ps.setString(2,Nama);
            ps.setInt(3,Gaji);
            ps.setInt(4,Absensi);
            ps.setInt(5,cuti);
            ps.setString(6,Status);
            line = in.readLine();
        }
        ps.executeUpdate();
        ps.close();
        con.close();
        System.out.println("Database Updated");
        in.close();
    }catch (Exception e){
        System.out.println(e);
    }
}
When I run it, it shows no error, but the data never gets into the database. Where did I go wrong?

Auto-commit mode is enabled by default.
The JDBC driver throws a SQLException when a commit or rollback operation is performed on a connection that has auto-commit set to true.
Symptoms of the problem can be unexpected application behavior.
Try updating the JVM configuration for the ActiveMatrix BPM node to use the following Oracle connection property:
autoCommitSpecCompliant=false
Note: I am not able to post this as a comment, so I posted it as an answer.
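For reference, a minimal sketch of the import loop with the insert executed once per CSV row and an explicit commit at the end. It reuses the getConn() helper and the six-column list_karyawan table from the question; whether a commit is required depends on the auto-commit setting discussed above, so this sketch only commits when auto-commit is off.
// Sketch only: execute the insert for every CSV row, not just the last one.
// getConn() and the list_karyawan table are taken from the question;
// the usual java.io and java.sql imports are assumed.
public static void importDatabase(String fileData) {
    String query = "insert into list_karyawan values(?,?,?,?,?,?)";
    try (BufferedReader in = new BufferedReader(new FileReader(fileData));
         Connection conn = getConn();
         PreparedStatement ps = conn.prepareStatement(query)) {
        in.readLine();                                     // skip the header line
        String line;
        while ((line = in.readLine()) != null) {
            String[] data = line.split(",");
            ps.setInt(1, Integer.parseInt(data[0]));       // ID
            ps.setString(2, data[1]);                      // Nama
            ps.setInt(3, Integer.parseInt(data[2]));       // Gaji
            ps.setInt(4, Integer.parseInt(data[3]));       // Absensi
            ps.setInt(5, Integer.parseInt(data[4]));       // cuti
            ps.setString(6, data[5]);                      // Status
            ps.executeUpdate();                            // run the insert for this row
        }
        if (!conn.getAutoCommit()) {
            conn.commit();                                 // only needed when auto-commit is off
        }
        System.out.println("Database Updated");
    } catch (Exception e) {
        e.printStackTrace();
    }
}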

Related

java - Jar file can't be executed on another pc

I have an executable jar file compiled from my program, and I ran it on my PC. It works perfectly fine when I run it from the command prompt using java -jar [nameofjar.jar].
However, when I tried testing it on another PC, running the same jar file from the command prompt throws an error:
D:\QA06122018_2>java -jar Indexing.jar
java.lang.NullPointerException
at IndexDriver.processText(IndexDriver.java:81)
at IndexDriver.index(IndexDriver.java:140)
at Main.main(Main.java:44)
Both PCs are using the same operating system and settings.
I even looked at the code around the error and there doesn't seem to be any problem with it; it ran fine in my IDE.
Is there anything I might have overlooked?
EDIT:
The code :
public PreparedStatement preparedStatement = null;
MysqlAccessIndex con = new MysqlAccessIndex();
public Connection con1 = con.connect();
String path1;

public void index() throws Exception {
    // Connection con1 = con.connect();
    try {
        Statement statement = con1.createStatement();
        ResultSet rs = statement.executeQuery("select * from filequeue where Status='Active' LIMIT 5");
        while (rs.next()) {
            // get the filepath of the PDF document
            path1 = rs.getString(2);
            int getNum = rs.getInt(1);
            Statement test = con1.createStatement();
            test.executeUpdate("update filequeue SET STATUS ='Processing' where UniqueID="+getNum);
            try {
                // call the index function
                PDDocument document = PDDocument.load(new File(path1),MemoryUsageSetting.setupTempFileOnly());
                if (!document.isEncrypted()) {
                    PDFTextStripper tStripper = new PDFTextStripper();
                    for(int p=1; p<=document.getNumberOfPages();++p) {
                        tStripper.setStartPage(p);
                        tStripper.setEndPage(p);
                        try {
                            String pdfFileInText = tStripper.getText(document);
                            processText(pdfFileInText);
                            System.out.println("Page "+p+" done");
                        }catch (Exception e){
                            e.printStackTrace();
                            Statement statement1 = con1.createStatement();
                            statement1.executeUpdate("update filequeue SET Error ='E0003' where UniqueID="+getNum);
                            statement1.executeUpdate("update filequeue SET Status ='Error' where UniqueID="+getNum);
                            con1.commit();
                            con1.close();
                        }
                    }
                }
                // After completing the process, update status: Complete
                Statement pre= con1.createStatement();
                pre.executeUpdate("update filequeue SET STATUS ='Complete' where UniqueID="+getNum);
                // con1.commit();
                preparedStatement.close();
                document.close();
                System.out.println("Successfully commited changes to the database!");
                con1.commit();
                // con1.close();
                // updateComplete_DB(getNum);
            } catch (Exception e) {
                try {
                    System.err.println(e);
                    Statement statement1 = con1.createStatement();
                    statement1.executeUpdate("update filequeue SET STATUS ='Error' where UniqueID="+getNum);
                    statement1.executeUpdate("update filequeue SET Error ='E0002' where UniqueID="+getNum);
                    con1.commit();
                    // add rollback function
                    rollbackEntries();
                }catch (Exception e1){
                    System.out.println("Could not rollback updates :" + e1.getMessage());
                }
            }
            // con1.close();
        }
    }catch(Exception e){
        e.printStackTrace();
        //System.out.println("lalala");
    }
    //con1.commit();
    con1.close();
}
Calling the method:
public void processText(String text) throws SQLException {
    String lines[] = text.split("\\r?\\n");
    for (String line : lines) {
        String[] words = line.split(" ");
        String sql="insert IGNORE into test.indextable values (?,?);";
        preparedStatement = con1.prepareStatement(sql);
        int i=0;
        for (String word : words) {
            // check if one or more special characters at end of string then remove OR
            // check special characters in beginning of the string then remove
            // insert every word directly to table db
            word=word.replaceAll("([\\W]+$)|(^[\\W]+)", "");
            preparedStatement.setString(1, path1);
            preparedStatement.setString(2, word);
            preparedStatement.executeUpdate();
        }
    }
    preparedStatement.close();
}
The root cause is that there were no lines to process.
You only create prepared statements inside the for (String line : lines) loop, but you only close the last statement you created (outside that loop).
When you don't have any lines, preparedStatement is null, because you never created one.
Even when you have lines to process, you are creating lots of prepared statements but only closing the last one.
You should probably create one prepared statement at the start of the method and reuse it for the whole method, closing it at the end.
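A minimal sketch of that suggestion, reusing the con1 connection, the path1 field, and the test.indextable table from the posted code:
public void processText(String text) throws SQLException {
    String sql = "insert IGNORE into test.indextable values (?,?)";
    // One statement for the whole method, closed automatically at the end.
    try (PreparedStatement ps = con1.prepareStatement(sql)) {
        for (String line : text.split("\\r?\\n")) {
            for (String word : line.split(" ")) {
                // strip leading/trailing non-word characters, as in the original
                word = word.replaceAll("([\\W]+$)|(^[\\W]+)", "");
                ps.setString(1, path1);
                ps.setString(2, word);
                ps.executeUpdate();
            }
        }
    }
}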

Getting image from server and saving to MySQL DB

I'm getting an image from a server as an InputStream and then saving it to a MySQL database. It works when I use Thread.sleep(5000);, but if I don't use it, no picture is saved to the DB, or only one picture (or half of it or less). So I understand that the program needs time to write the image to the database, but how much time? That is the question: I would like to know exactly when it has finished writing an image to the database so I can start with the next one. Below is my code:
ResultSet rs = stmt.executeQuery(query);
while (rs.next()) {
    int ID = rs.getInt(1);
    String myName = rs.getString(2);
    try {
        String myCommand = "take picture and save /mydir/mydir2/mydir3" + myName + ".png";
        telnet.sendCommand(myCommand); // Here taking a picture via telnet
        // Thread.sleep(5000);// If I uncomment this line it works
        String sqlCommand = "UPDATE my_table SET Picture = ? WHERE ID ='" + ID +"';";
        PreparedStatement statement = conn.prepareStatement(sqlCommand);
        String ftpUrl = "ftp://"+server_IP+"/mydir/mydir2/mydir3" + myName + ".png;type=i";
        URL url = new URL(ftpUrl);
        URLConnection connUrl = url.openConnection();
        //Thread.sleep(5000); // If I uncomment this line, it works too.
        InputStream inputStreamTelnet = connUrl.getInputStream();
        statement.setBlob(1, inputStreamTelnet);
        int row = statement.executeUpdate();
        if (row > 0) {
            System.out.println("A picture was inserted into DB.");
            System.out.println("Value of row(s) : " + row);
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
} // End of while
I would expect to put the wait (sleep) after InputStream inputStreamTelnet = connUrl.getInputStream();, but it doesn't work when I put the sleep after that line; it only works when the sleep is before it. Could someone explain why? I would also like to avoid using Thread.sleep(5000); and instead wait the exact time needed, or not wait at all, which would also make the program faster. There might also be a case where saving the picture takes more than 5 seconds, or maybe it isn't saving the picture that takes time but opening the URL connection. There are two sleep lines in the code; when I uncomment either of them, the program works (saves the images to the MySQL DB successfully). I also verified on the server that the images exist, but in the end I don't see them in the MySQL DB.
UPDATE: I removed the try block and the telnet stuff, and now it works without waiting, but I really need the telnet stuff...
UPDATE 2: After inspecting my telnet class, I found out that I had forgotten to apply a change I made to a single line. Now it works without waiting!
Huh, I tested my code on JDK 1.7.0_67 / PostgreSQL 9.2 and it works well:
public class ImageLoader {
    private static final int START_IMAGE_ID = 1;
    private static final int END_IMAGE_ID = 1000;
    private static final String IMAGE_URL = "http://savepic.net/%d.jpg";

    public static void main(String[] args) throws SQLException, IOException {
        Connection connection = DriverManager.getConnection("jdbc:postgresql://localhost:5432/test", "username", "password");
        PreparedStatement imageStatement = connection.prepareStatement("INSERT INTO public.image VALUES(?, ?)");
        for (int i = START_IMAGE_ID; i <= END_IMAGE_ID; i++) {
            String imageUrl = String.format(IMAGE_URL, i);
            URL url = new URL(imageUrl);
            URLConnection urlConnection = url.openConnection();
            imageStatement.setLong(1, i);
            imageStatement.setBytes(2, read(urlConnection.getInputStream()));
            int count = imageStatement.executeUpdate();
            if (count != 1) {
                throw new IllegalStateException("Image with ID = " + i + " not inserted");
            } else {
                System.out.println("Image (" + imageUrl + ") saved to database");
            }
        }
        imageStatement.close();
        connection.close();
    }

    private static byte[] read(InputStream inputStream) throws IOException {
        ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream(1 << 15); // assume image average size ~ 32 Kbytes
        BufferedInputStream bufferedInputStream = new BufferedInputStream(inputStream);
        byte[] buffer = new byte[1 << 10];
        int read = -1;
        while ((read = bufferedInputStream.read(buffer)) != -1) {
            byteArrayOutputStream.write(buffer, 0, read);
        }
        return byteArrayOutputStream.toByteArray();
    }
}

Insert into Database using form in Netbeans 7.01

I'm doing an individual project in Java and I want to insert data into my database. The program runs without any errors, but when I insert data and submit it, it gives an error like this: java.sql.SQLException: Can not issue data manipulation statements with executeQuery(). This is my code.
What can I do to solve this problem?
private void jButton1ActionPerformed(java.awt.event.ActionEvent evt) {
    if (evt.getSource() == jButton1)
    {
        int x = 0;
        String s1 = jTextField1.getText().trim();
        String s2 = jTextField2.getText();
        char[] s3 = jPasswordField1.getPassword();
        char[] s4 = jPasswordField2.getPassword();
        String s8 = new String(s3);
        String s9 = new String(s4);
        String s5 = jTextField5.getText();
        String s6 = jTextField6.getText();
        String s7 = jTextField7.getText();
        if(s8.equals(s9))
        {
            try{
                File image = new File(filename);
                FileInputStream fis = new FileInputStream(image);
                ByteArrayOutputStream bos = new ByteArrayOutputStream();
                byte buf[] = new byte[1024];
                for (int readNum; (readNum = fis.read(buf)) != -1;) {
                    bos.write(buf, 0, readNum);
                }
                cat_image = bos.toByteArray();
                PreparedStatement ps = conn.prepareStatement("insert into reg values(?,?,?,?,?,?,?)");
                ps.setString(1,s1);
                ps.setString(2,s2);
                ps.setString(3,s8);
                ps.setString(4,s5);
                ps.setString(5,s6);
                ps.setString(6,s7);
                ps.setBytes(7,cat_image);
                rs = ps.executeQuery();
                if(rs.next())
                {
                    JOptionPane.showMessageDialog(null,"Data insert Succesfully");
                }else
                {
                    JOptionPane.showMessageDialog(null,"Your Password Dosn't match" ,"Acces dinied",JOptionPane.ERROR_MESSAGE);
                }
            }catch(Exception e)
            {
                System.out.println(e);
            }
        }
Use ps.executeUpdate() or ps.execute().
From executeUpdate
Executes the SQL statement in this PreparedStatement object, which must be an SQL Data Manipulation Language (DML) statement, such as
INSERT, UPDATE or DELETE; or an SQL statement that returns nothing,
such as a DDL statement.
From execute
Executes the SQL statement in this PreparedStatement object, which
may be any kind of SQL statement. Some prepared statements return
multiple results; the execute method handles these complex statements
as well as the simpler form of statements handled by the methods
executeQuery and executeUpdate.The execute method returns a boolean to
indicate the form of the first result. You must call either the method
getResultSet or getUpdateCount to retrieve the result; you must call
getMoreResults to move to any subsequent result(s).
Then modify your code accordingly:
int rowsAffected = ps.executeUpdate();
JOptionPane.showMessageDialog(null,"Data Rows Inserted "+ rowsAffected);
Also you have to close your streams and connections in a finally block.
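As a rough sketch of both points, here is the insert rewritten with executeUpdate() and with the stream and statement closed automatically via try-with-resources (which plays the same role as an explicit finally block). The conn field, filename, the text-field values and the seven-column reg table are all carried over from the question.
// Sketch only: executeUpdate() for the INSERT, and automatic closing of the
// stream and statement; conn and the reg table come from the question.
try (FileInputStream fis = new FileInputStream(new File(filename));
     ByteArrayOutputStream bos = new ByteArrayOutputStream();
     PreparedStatement ps = conn.prepareStatement("insert into reg values(?,?,?,?,?,?,?)")) {
    byte[] buf = new byte[1024];
    for (int readNum; (readNum = fis.read(buf)) != -1; ) {
        bos.write(buf, 0, readNum);
    }
    ps.setString(1, s1);
    ps.setString(2, s2);
    ps.setString(3, s8);
    ps.setString(4, s5);
    ps.setString(5, s6);
    ps.setString(6, s7);
    ps.setBytes(7, bos.toByteArray());
    int rowsAffected = ps.executeUpdate();   // DML goes through executeUpdate, not executeQuery
    JOptionPane.showMessageDialog(null, "Data rows inserted: " + rowsAffected);
} catch (Exception e) {
    e.printStackTrace();
}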
A SQLException can also be thrown because of a wrong SQL statement; you may have a syntax error when inserting string and integer values. Check your SQL statement: after VALUES, integer elements (such as 1 or 0) should not be quoted, and string elements should be wrapped in single quotes (such as 'some value').

Performance Tuning While Loading CSV

I have attached the code below.
Functionality
The code reads a CSV and inserts into the DB after replacing placeholders with WebMacro. It reads the CSV as follows: first the header line (NO, NAME), then the values row by row; each value is put into the WebMacro context (e.g. context.put("NO","1")), WebMacro replaces $(NO) ==> 1 and $(NAME) ==> RAJARAJAN in the merge query, and the resulting statement is added to a batch; once the batch reaches 1000 statements it is executed.
The code works as intended, but it takes 4 minutes to process 50,000 records. I need a performance improvement or a change of logic. Kindly let me know if anything is unclear.
Any change for drastically better performance is welcome.
Note: I use WebMacro to replace $(NO) in the merge query with the values read from the CSV.
Bala.csv
NO?NAME
1?RAJARAJAN
2?ARUN
3?ARUNKUMAR
Connection con=null;
Statement stmt=null;
Connection con1=null;
int counter=0;
try{
    WebMacro wm = new WM();
    Context context = wm.getContext();
    String strFilePath = "/home/vbalamurugan/3A/email-1822820895/Bala.csv";
    String msg="merge into temp2 A using
        (select '$(NO)' NO,'$(NAME)' NAME from dual)B on(A.NO=B.NO)
        when not matched then insert (NO,NAME)
        values(B.NO,B.NAME) when matched then
        update set A.NAME='Attai' where A.NO='$(NO)'";
    String[]rowsAsTokens;
    con=getOracleConnection("localhost","raymedi_hq","raymedi_hq","XE");
    con.setAutoCommit(false);
    stmt=con.createStatement();
    File file = new File(strFilePath);
    Scanner scanner = new Scanner(file);
    try {
        String headerField;
        String header[];
        headerField=scanner.nextLine();
        header=headerField.split("\\?");
        long start=System.currentTimeMillis();
        while(scanner.hasNext()) {
            String scan[]=scanner.nextLine().split("\\?");
            for(int i=0;i<scan.length;i++){
                context.put(header[i],scan[i]);
            }
            if(context.size()>0){
                String m=replacingWebMacroStatement(msg,wm,context);
                if(counter>1000){
                    stmt.executeBatch();
                    stmt.clearBatch();
                    counter=0;
                }else{
                    stmt.addBatch(m);
                    counter++;
                }
            }
        }
        long b=System.currentTimeMillis()-start;
        System.out.println("=======Total Time Taken"+b);
    }catch(Exception e){
        e.printStackTrace();
    }finally {
        scanner.close();
    }
    stmt.executeBatch();
    stmt.clearBatch();
    stmt.close();
}catch(Exception e){
    e.printStackTrace();
    con.rollback();
}finally{
    con.commit();
}
// Method for replacing webmacro variables ($...) in the query
public static String replacingWebMacroStatement(String Query, WebMacro wm,Context context) throws Exception {
    Template template = new StringTemplate(wm.getBroker(), Query);
    template.parse();
    String macro_replaced = template.evaluateAsString(context);
    return macro_replaced;
}

// for getting an Oracle connection
public static Connection getOracleConnection(String IPaddress,String username,String password,String Tns)throws SQLException{
    Connection connection = null;
    try{
        String baseconnectionurl ="jdbc:oracle:thin:@"+IPaddress+":1521:"+Tns;
        String driver = "oracle.jdbc.driver.OracleDriver";
        String user = username;
        String pass = password;
        Class.forName(driver);
        connection=DriverManager.getConnection(baseconnectionurl,user,pass);
    }catch(Exception e){
        e.printStackTrace();
    }
    return connection;
}
I can tell you that this code takes on average about 150ms on my machine:
StrTokenizer tokenizer = StrTokenizer.getCSVInstance();
for (int i=0;i<50000;i++) {
    tokenizer.reset("a,b,c,d,e,f,g,h,i,j,k,l,m,n,o,p,q,r,s,t,u,v,w,x,y,z");
    String toks[] = tokenizer.getTokenArray();
}
You'll find StrTokenizer in the Apache commons-lang package, but I doubt that String.split(), StringTokenizer or Scanner.nextLine() is your bottleneck in any case. I would assume it's your database insert times.
If that's the case, you can do one of two things:
Tune your batch size.
Multithread the inserts.
And as suggested, a profiler will help to determine where your time is spent.
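As a rough illustration of the first suggestion, here is a sketch of the batching loop with a configurable batch size. It also swaps the per-row WebMacro string building for a single MERGE statement with bind variables, which is my own assumption and not something the answer above spells out; the temp2 table, the con connection (with auto-commit off) and the scanner come from the question.
// Sketch only: one parsed MERGE statement, bound per row, executed every BATCH_SIZE rows.
// temp2 and its NO/NAME columns are taken from the question; BATCH_SIZE is a tuning knob.
final int BATCH_SIZE = 1000;
String merge =
    "merge into temp2 A using (select ? as NO, ? as NAME from dual) B on (A.NO = B.NO) " +
    "when not matched then insert (NO, NAME) values (B.NO, B.NAME) " +
    "when matched then update set A.NAME = 'Attai'";
try (PreparedStatement ps = con.prepareStatement(merge)) {
    scanner.nextLine();                 // skip the NO?NAME header line
    int counter = 0;
    while (scanner.hasNextLine()) {
        String[] scan = scanner.nextLine().split("\\?");
        ps.setString(1, scan[0]);       // NO
        ps.setString(2, scan[1]);       // NAME
        ps.addBatch();
        if (++counter % BATCH_SIZE == 0) {
            ps.executeBatch();          // flush a full batch
        }
    }
    ps.executeBatch();                  // flush the remainder
    con.commit();
}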

Insert into Star Schema efficiently using JDBC

I have a star schema model in which the Server Table contains information about server names, the Information Table contains the kinds of information I want for a specific server, and the Actual Data Table records which server contains which information.
Server Table
Information Table
Actual Data Table
Now the problem I am having: I am trying to insert data into the Actual Data Table using JDBC, but I am unsure how I should add the data in a star schema model. Should I connect to the database and insert separately for each piece of information, or is there a way to do it by communicating with the database only once? This is my code, where I gather all the information for each server; IndexData is the class that inserts values into the Oracle database.
public void fetchlog() {
    InputStream is = null;
    InputStream isUrl = null;
    FileOutputStream fos = null;
    try {
        is = HttpUtil.getFile(monitorUrl);
        if(monitorUrl.contains("stats.jsp") || monitorUrl.contains("monitor.jsp")) {
            trimUrl = monitorUrl.replaceAll("(?<=/)(stats|monitor).jsp$", "ping");
        }
        isUrl = HttpUtil.getFile(trimUrl);
        BufferedReader in = new BufferedReader (new InputStreamReader (is));
        String line;
        int i=0,j=0,k=0;
        while ((line = in.readLine()) != null) {
            if(line.contains("numDocs")) {
                docs = in.readLine().trim();
                //So should I keep on inserting into Database for each information, like this
                //IndexData id = new IndexData(timeStamp, ServerName, InformationName, docs);
            } else if(line.contains("indexSize")) {
                indexSize = in.readLine().trim();
                //For this information-- the same way?
                //IndexData id = new IndexData(timeStamp, ServerName, InformationName, indexSize);
            } else if(line.contains("cumulative_lookups")) {
                cacheHits= in.readLine().trim();
                //For this information too-- the same way?
                //IndexData id = new IndexData(timeStamp, ServerName, InformationName, cacheHits);
            } else if(line.contains("lastCommitTime")) {
                lastCommitTime = in.readLine().trim();
                //For this information too-- the same way?
                //IndexData id = new IndexData(timeStamp, ServerName, InformationName, lastCommitTime );
            }
            BufferedReader inUrl = new BufferedReader (new InputStreamReader (isUrl));
            String lineUrl;
            Pattern regex = Pattern.compile("<str name=\"status\">(.*?)</str>");
            while ((lineUrl = inUrl.readLine()) != null) {
                System.out.println(lineUrl);
                if(lineUrl.contains("str name=\"status\"")) {
                    Matcher regexMatcher = regex.matcher(lineUrl);
                    if (regexMatcher.find()) {
                        upDown= regexMatcher.group(1);
                        //For this information too-- the same way?
                        //IndexData id = new IndexData(timeStamp, ServerName, InformationName, upDown);
                    }
                    System.out.println("Status:- " + status);
                }
            }
        }
        //Or is there some other way we can insert directly into database by communicating with database only one time not multiple times for each information.
        //IndexData id = new IndexData(timeStamp, ServerName, InformationName, Value);
        fos = new FileOutputStream(buildTargetPath());
        IOUtils.copy(is, fos);
    } catch (FileNotFoundException e) {
        log.error("File Exception in fetching monitor logs :" + e);
    } catch (IOException e) {
        log.error("Exception in fetching monitor logs :" + e);
    }
}
I hope the question is clear. Any suggestions will be appreciated.
There are two things I would suggest you look at. First, use a batch insert to perform all of the associated inserts in one JDBC transaction. For more information:
JDBC Batch Insert Example
I would also strongly recommend that you use a JDBC connection pooling library. We use c3p0 with our Postgres database. You can find more information here:
c3p0 Project Page
The basic idea is to create a connection pool at startup time, then create JDBC batches for each set of related inserts.
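A minimal sketch of that idea: gather the values collected in fetchlog() and push them to the fact table in one batch over a single (pooled) connection. The ACTUAL_DATA table and its columns are placeholders here, since the real schema is not shown in the question; java.sql and java.util imports are assumed.
// Sketch only: one connection, one PreparedStatement, one batch per scrape.
// ACTUAL_DATA(TIME_STAMP, SERVER_NAME, INFORMATION_NAME, VALUE) is a placeholder schema.
void insertMetrics(Connection conn, Timestamp ts, String server,
                   Map<String, String> metrics) throws SQLException {
    String sql = "insert into ACTUAL_DATA (TIME_STAMP, SERVER_NAME, INFORMATION_NAME, VALUE) "
               + "values (?, ?, ?, ?)";
    try (PreparedStatement ps = conn.prepareStatement(sql)) {
        for (Map.Entry<String, String> m : metrics.entrySet()) {
            ps.setTimestamp(1, ts);
            ps.setString(2, server);
            ps.setString(3, m.getKey());     // e.g. "numDocs", "indexSize"
            ps.setString(4, m.getValue());
            ps.addBatch();
        }
        ps.executeBatch();                   // one round trip for all metrics
    }
}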
