Hi. How can I dump only the data for an instance of an H2 in-memory DB?
What I currently have
PreparedStatement preparedStatement = connection
        .prepareStatement("SCRIPT SIMPLE NOSETTINGS");
ResultSet resultSet = preparedStatement.executeQuery();
response.setContentType("text/plain");
ServletOutputStream out = response.getOutputStream();
while (resultSet.next()) {
    String columnValue = resultSet.getString(1); // one SQL statement per row
    out.print(columnValue);
    out.println();
}
This dumps the entire DB structure, however, not just the INSERT statements. Basically, what I want to do is back up the data I insert during development so that the next time the database is started I can script the data back in.
The table structure isn't a problem, as it is generated by JPA.
To keep only the INSERT statements, you could filter each line:
if (columnValue.startsWith("INSERT")) {
    out.println(columnValue);
}
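Putting the two together, here is a minimal sketch of a data-only dump (same servlet context as in the question; the restore filename below is hypothetical):
PreparedStatement preparedStatement = connection
        .prepareStatement("SCRIPT SIMPLE NOSETTINGS");
ResultSet resultSet = preparedStatement.executeQuery();
response.setContentType("text/plain");
ServletOutputStream out = response.getOutputStream();
while (resultSet.next()) {
    String line = resultSet.getString(1);
    if (line.startsWith("INSERT")) { // skip CREATE TABLE, ALTER, etc.
        out.println(line);
    }
}
out.flush();
Save the output to a file and you can replay it at startup with H2's RUNSCRIPT, e.g. RUNSCRIPT FROM 'data.sql'.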
I have a running Java process that inserts data into a SQL Server DB table every 3s, using PreparedStatement.executeBatch(). (Process A)
// Get connection here
con.setAutoCommit(false);
// Create PreparedStatement here
ps = con.prepareStatement(stmt);
// Add to batch with loop
ps.executeBatch();
con.commit();
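Fleshed out, Process A looks roughly like this (the INSERT statement and the values collection are hypothetical placeholders):
// Process A: insert one batch every 3s
try (Connection con = DriverManager.getConnection(url, user, password)) {
    con.setAutoCommit(false);
    try (PreparedStatement ps = con.prepareStatement(
            "INSERT INTO mytable (col1) VALUES (?)")) {
        for (String value : values) { // ~300 rows per cycle
            ps.setString(1, value);
            ps.addBatch();
        }
        ps.executeBatch();
        con.commit(); // the intent: all rows become visible together here
    }
}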
Also, every 5s another Java process reads the newly inserted data from that same table. (Process B)
// Get connection here
con.setTransactionIsolation(Connection.TRANSACTION_SERIALIZABLE);
// Create PreparedStatement here
ps = con.prepareStatement(stmt);
rs = ps.executeQuery();
rs.setFetchSize(10000);
// While ResultSet has data, add it and return it
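And a sketch of Process B (the query and lastSeenId are hypothetical placeholders):
// Process B: poll every 5s for rows inserted since the last read
try (Connection con = DriverManager.getConnection(url, user, password)) {
    con.setTransactionIsolation(Connection.TRANSACTION_SERIALIZABLE);
    try (PreparedStatement ps = con.prepareStatement(
            "SELECT col1 FROM mytable WHERE id > ?")) {
        ps.setFetchSize(10000); // hint the driver before executing the query
        ps.setLong(1, lastSeenId);
        try (ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                // collect rs.getString(1) into the result list
            }
        }
    }
}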
The main issue here is:
Process A is inserting about 300 rows
Process B sometimes reads the data before all the rows in the batch have been persisted (somewhere in the middle of the batch), so it returns, say, 80 rows, and another SELECT right after returns the whole 300 rows.
Any ideas as to what could be improved?
So, for a school project, I am building a Discord bot. One of the features I have built in is that it can retrieve GIF links from a MySQL database and send them in a message. Now, my issue is that I am only able to retrieve one record from my database, and no other records. If I put the query that I use into MySQL Workbench and run it, it retrieves those records.
This is the method for retrieving the GIFs:
public static ArrayList<Gif> GetGifsFromDB(String msg){
    ArrayList<Gif> gifs = new ArrayList<>();
    try(Connection conn = (Connection)DriverManager.getConnection(url, userName, password)){
        Class.forName("com.mysql.jdbc.Driver").newInstance();
        Statement stmnt = conn.createStatement();
        String sql = "Select * from gif WHERE Type = '" + msg + "'";
        stmnt.execute(sql);
        try(ResultSet rs = stmnt.getResultSet()){
            while(rs.next()){
                Gif g = new Gif();
                g.setID(rs.getInt("GifID"));
                g.setURL(rs.getString("GifURL"));
                System.out.println(g.getID() + g.getURL());
                gifs.add(g);
            }
            rs.close();
            conn.close();
        }
    }
    catch(SQLException ex){
        System.err.println(ex.getMessage());
    }
    catch(Exception ex){
        System.err.println(ex.getMessage());
    }
    return gifs;
}
The "Type" in the database it just a category. With the test data I have in there, the 3 types are no, surprised and lonely. Only no returns a gif.
Remove the lines that close the ResultSet and Connection:
rs.close();
conn.close();
You are already closing them with try-with-resources.
The issue ended up being MySQL not committing records to the database. Once Workbench was refreshed, the added records disappeared. Rather strange that even though the records weren't in the database, they could be retrieved.
Most likely your msg does not exactly match any of the values in the database's Type column.
Test by running
SELECT COUNT(*) FROM gif WHERE Type = '... put msg content here ...'
Do this manually directly on the database.
You can also try putting the following line of code at the end:
System.out.println("Number of Selected Gifs: "+gifs.size());
If either of those returns zero, then msg did not exactly match any Type value. Maybe an uppercase/lowercase issue?
Also, to avoid SQL injection among other issues, please strongly consider using bind variables with a PreparedStatement.
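For example, a sketch of the same lookup with a bind variable (reusing url, userName, password, msg, and the Gif class from the question):
String sql = "SELECT * FROM gif WHERE Type = ?";
try (Connection conn = DriverManager.getConnection(url, userName, password);
        PreparedStatement ps = conn.prepareStatement(sql)) {
    ps.setString(1, msg); // the driver handles quoting/escaping
    try (ResultSet rs = ps.executeQuery()) {
        while (rs.next()) {
            Gif g = new Gif();
            g.setID(rs.getInt("GifID"));
            g.setURL(rs.getString("GifURL"));
            gifs.add(g);
        }
    }
} catch (SQLException ex) {
    System.err.println(ex.getMessage());
}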
I wish to get user info just like what is provided by
SELECT SYS_CONTEXT ('USERENV', 'SESSION_USER') FROM DUAL;
and
SELECT SYS_CONTEXT ('USERENV', 'OS_USER') FROM DUAL;
inside a Java UDF for Oracle 11g, without making a JDBC connection and running these queries against DUAL.
I tried System.getProperty("user.name") to read the current OS user through the JVM, but I think we are not allowed to fetch information from outside the database environment.
More generally, the problem statement is: how can a Java UDF determine information about the user who has logged into the database?
I have found a solution to the above problem by using "jdbc:default:connection", an internal connection maintained by the database itself which is always available. Notice I did not call conn.close() at the end, because this is a shared connection which, once closed, is closed for all database clients.
public static String doSQL() throws SQLException {
    String result = "";
    String q1 = "SELECT SYS_CONTEXT('USERENV','SESSION_USER') FROM DUAL";
    // Server-side internal connection; intentionally never closed (see above)
    Connection conn =
        DriverManager.getConnection("jdbc:default:connection");
    PreparedStatement ps = conn.prepareStatement(q1);
    ResultSet rs = ps.executeQuery();
    while (rs.next())
        result = rs.getString(1);
    return "my udf says " + result;
}
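For completeness, publishing such a method as a SQL function is typically done with a Java call spec roughly like this (my_session_user and MyUdf are hypothetical names; adjust to wherever doSQL is loaded):
CREATE OR REPLACE FUNCTION my_session_user RETURN VARCHAR2
AS LANGUAGE JAVA NAME 'MyUdf.doSQL() return java.lang.String';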
In a DB2 database, I have the following table:
CREATE TABLE MyTestTable
(
MYPATH VARCHAR(512) NOT NULL,
MYDATA BLOB,
CONSTRAINT MYTESTTABLE_PK PRIMARY KEY (MYPATH)
);
Using Java, I wish to update an existing row in this table with new BLOB data. My preferred way is to obtain an OutputStream to the BLOB column and write my data to it.
Here is the test code I am using:
Connection connection = null;
PreparedStatement pStmnt = null;
ResultSet rSet = null;
try {
    connection = ... // get db connection
    String id = ... // set the MYPATH value
    String sql = "SELECT MYDATA FROM MyTestTable WHERE MYPATH='"+id+"' FOR UPDATE";
    pStmnt = connection.prepareStatement(sql);
    rSet = pStmnt.executeQuery();
    while (rSet.next()) {
        Blob blobData = rSet.getBlob("MYDATA"); // this is a java.sql.Blob
        OutputStream blobOutputStream = blobData.setBinaryStream(1);
        blobOutputStream.write(data);
        blobOutputStream.close();
        connection.commit();
    }
}
// close ResultSet/PreparedStatement/etc in the finally block
The above code works for the Oracle DB.
However, in DB2, calling setBinaryStream to get the OutputStream does not seem to work. The data does not get updated, and I do not get any error messages.
Questions: how can I get an OutputStream to the BLOB column of a DB2 table, and what needs to be changed in the above code?
You are probably getting the data written to the Blob object successfully, but you need to do more with the PreparedStatement and ResultSet in order to actually update the value in the database.
First, your PreparedStatement must be created using a version of Connection.prepareStatement() that takes resultSetType and resultSetConcurrency parameters, with the concurrency set to ResultSet.CONCUR_UPDATABLE. (I don't know whether the SQL SELECT actually needs to specify the FOR UPDATE clause - see the tutorial at the link at the end of this answer.)
Second, after you close blobOutputStream, you need to update the value in the ResultSet using updateBlob(int columnIndex, Blob x) or updateBlob(String columnLabel, Blob x), then invoke ResultSet.updateRow() before doing a Connection.commit().
I haven't updated Blob values this way myself, but it should work. If you run into any issues trying to reuse the Blob originally read from the ResultSet (which you probably don't need to do if you're not actually using the original data), you can use Connection.createBlob() to make an empty one to start with. You can learn more about updating ResultSets from this tutorial.
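Untested, but a minimal sketch of those two changes might look like this (connection, id, and data as in the question):
String sql = "SELECT MYDATA FROM MyTestTable WHERE MYPATH = ? FOR UPDATE";
PreparedStatement pStmnt = connection.prepareStatement(sql,
        ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_UPDATABLE);
pStmnt.setString(1, id);
ResultSet rSet = pStmnt.executeQuery();
while (rSet.next()) {
    Blob blobData = connection.createBlob();  // fresh Blob for the new value
    OutputStream blobOutputStream = blobData.setBinaryStream(1);
    blobOutputStream.write(data);
    blobOutputStream.close();
    rSet.updateBlob("MYDATA", blobData);      // stage the new value in the current row
    rSet.updateRow();                         // write the row back to the database
    connection.commit();
}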
I have 26 CSV files that I want to grab from the internet on a nightly basis and load into a PostgreSQL table. I have this working using Java, PreparedStatement, and batching. Even so, performance is painfully slow: it takes 30 minutes to grab the 6000 or so entries and put them into PostgreSQL. This is my first time doing something like this, so I don't exactly have a reference point for whether this is fast or slow.
To get the file, I am using this code.
URL grabberUrl = new URL(csvUrl);
URLConnection grabberConn = grabberUrl.openConnection();
BufferedReader grabberReader = new BufferedReader(new InputStreamReader(grabberConn.getInputStream()));
I am then using a PreparedStatement, taking values from the input stream and setting them:
con = DriverManager.getConnection(url, user, password);
pst = con.prepareStatement("insert into blah(name, year) values(?, ?)");
pst.setString(1, name);
pst.setString(2, year);
I am then batching up the inserts. I've tried batch sizes from 100 to 1000 with no meaningful change in performance.
pst.addBatch();
count++;
if (count == 100) {
    count = 0;
    pst.executeBatch();
}
// after the loop, call pst.executeBatch() once more for any leftover rows
Has anyone got any suggestions as to what I can do to make things faster?
If you can access the files from the PostgreSQL server, try using the COPY statement; see
http://www.postgresql.org/docs/9.3/static/sql-copy.html
Also, if you know the data quality, you can temporarily remove any table constraints and drop any indexes, then add the constraints and indexes back after loading the data.
Try the following:
// PGConnection and CopyManager come from the PostgreSQL JDBC driver (org.postgresql.*)
PGConnection con = (PGConnection) DriverManager.getConnection(...);
CopyManager copyManager = con.getCopyAPI();
copyManager.copyIn("copy mytable from stdin with (format csv)", grabberReader);
If mytable is heavily indexed, then drop the indexes, load, and recreate the indexes.
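For illustration, a self-contained sketch combining the download code from the question with the COPY approach (the blah table and its name/year columns are taken from the question, and the CSV is assumed to match that column order):
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;
import java.sql.Connection;
import java.sql.DriverManager;
import org.postgresql.PGConnection;
import org.postgresql.copy.CopyManager;

public class CsvLoader {
    public static void main(String[] args) throws Exception {
        String csvUrl = args[0];   // where to fetch the CSV
        String dbUrl = args[1];    // e.g. jdbc:postgresql://localhost/mydb
        String user = args[2];
        String password = args[3];

        URL grabberUrl = new URL(csvUrl);
        URLConnection grabberConn = grabberUrl.openConnection();
        try (BufferedReader grabberReader = new BufferedReader(
                     new InputStreamReader(grabberConn.getInputStream()));
             Connection con = DriverManager.getConnection(dbUrl, user, password)) {
            // One COPY streams the whole file server-side instead of thousands of INSERTs
            CopyManager copyManager = con.unwrap(PGConnection.class).getCopyAPI();
            long rows = copyManager.copyIn(
                    "COPY blah (name, year) FROM STDIN WITH (FORMAT csv)", grabberReader);
            System.out.println("Copied " + rows + " rows");
        }
    }
}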