Reset sequence in DBUnit? - java

I want to reset the database AND the sequences after each test in Java + DBUnit.
I've seen this question, but it doesn't have the code solution I'm struggling to get:
How to use Oracle Sequence Numbers in DBUnit?

I've found the answer; it is in the official documentation. It is as easy as adding a reset_sequences attribute, listing the sequences you want to reset, to the dataset you use to prepare the database:
<?xml version='1.0' encoding='UTF-8'?>
<dataset reset_sequences="emp_seq, dept_seq">
    <emp empno="1" ename="Scott" deptno="10" job="project manager" />
    ...
</dataset>
This solution does not work perfectly, though: it doesn't really reset the sequence, it only simulates the reset for the inserted rows. If you want to actually reset it, you have to execute some SQL commands. I've extended DatabaseOperation for that purpose with the following operations.
public static final DatabaseOperation SEQUENCE_RESETTER_POSTGRES = new DatabaseOperation() {
    @Override
    public void execute(IDatabaseConnection connection, IDataSet dataSet)
            throws DatabaseUnitException, SQLException {
        String[] tables = dataSet.getTableNames();
        Statement statement = connection.getConnection().createStatement();
        for (String table : tables) {
            int startWith = dataSet.getTable(table).getRowCount() + 1;
            statement.execute("alter sequence " + table + "_PK_SEQ RESTART WITH " + startWith);
        }
    }
};
public static final DatabaseOperation SEQUENCE_RESETTER_ORACLE = new DatabaseOperation() {
    @Override
    public void execute(IDatabaseConnection connection, IDataSet dataSet)
            throws DatabaseUnitException, SQLException {
        String[] tables = dataSet.getTableNames();
        Statement statement = connection.getConnection().createStatement();
        for (String table : tables) {
            int startWith = dataSet.getTable(table).getRowCount() + 1;
            // Oracle has no "drop sequence ... if exists", so ignore the error if the sequence is missing
            try {
                statement.execute("drop sequence " + table + "_PK_SEQ");
            } catch (SQLException ignore) {
                // sequence did not exist yet
            }
            statement.execute("create sequence " + table + "_PK_SEQ START WITH " + startWith);
        }
    }
};
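One way to wire these into a test (a minimal sketch, assuming stock dbUnit where org.dbunit.operation.CompositeOperation is available; the test-case class name is just illustrative) is to chain a resetter with the normal setup operation:
import org.dbunit.DataSourceBasedDBTestCase;
import org.dbunit.operation.CompositeOperation;
import org.dbunit.operation.DatabaseOperation;

public abstract class AbstractPostgresDbTestCase extends DataSourceBasedDBTestCase {

    @Override
    protected DatabaseOperation getSetUpOperation() throws Exception {
        // insert the dataset first, then restart every <table>_PK_SEQ to rowCount + 1
        return new CompositeOperation(DatabaseOperation.CLEAN_INSERT,
                SEQUENCE_RESETTER_POSTGRES);
    }
}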

I've tested the solution provided by @Chexpir, and here is an improved/cleaner way (PostgreSQL implementation). Also note that the sequence is reset to 1 instead of being derived from the row count.
public class ResetSequenceOperationDecorator extends DatabaseOperation {

    private DatabaseOperation decoree;

    public ResetSequenceOperationDecorator(DatabaseOperation decoree) {
        this.decoree = decoree;
    }

    @Override
    public void execute(IDatabaseConnection connection, IDataSet dataSet) throws DatabaseUnitException, SQLException {
        String[] tables = dataSet.getTableNames();
        Statement statement = connection.getConnection().createStatement();
        for (String table : tables) {
            try {
                statement.execute("ALTER SEQUENCE " + table + "_id_seq RESTART WITH 1");
            } catch (SQLException ex) {
                // Ignore silently: the sequence does not appear to exist
            }
        }
        decoree.execute(connection, dataSet);
    }
}
And in your DatabaseTestCase:
public abstract class AbstractDBTestCase extends DataSourceBasedDBTestCase {

    @Override
    protected DatabaseOperation getTearDownOperation() throws Exception {
        return new ResetSequenceOperationDecorator(DatabaseOperation.DELETE_ALL);
    }
}

Please check the link below in case it helps you:
How to revert the database back to the initial state using dbUnit?

Related

Problem with DAO save method when using dbunit

I have a class that tests adding a group to a database:
class GroupDAOTest extends TestCase {

    private IDatabaseTester databaseTester;
    private GroupDao groupDao;

    @BeforeEach
    protected void setUp() throws Exception {
        databaseTester = new JdbcDatabaseTester("org.postgresql.Driver",
                "jdbc:postgresql://localhost:5432/database_school", "principal", "school");
        String file = getClass().getClassLoader().getResource("preparedDataset.xml").getFile();
        IDataSet dataSet = new FlatXmlDataSetBuilder().build(new File(file));
        databaseTester.setDataSet(dataSet);
        databaseTester.setSetUpOperation(DatabaseOperation.CLEAN_INSERT);
        databaseTester.onSetup();
        groupDao = new GroupDao();
    }

    @Test
    void add() throws Exception {
        groupDao.save(new Group("NEW_GROUP"));
        IDataSet databaseDataSet = databaseTester.getConnection().createDataSet();
        ITable actualTable = databaseDataSet.getTable("groups");
        String file = getClass().getClassLoader().getResource("GroupDao/add.xml").getFile();
        IDataSet expectedDataSet = new FlatXmlDataSetBuilder().build(new File(file));
        ITable expectedTable = expectedDataSet.getTable("groups");
        Assertion.assertEquals(expectedTable, actualTable);
    }
}
The method groupDao.save(new Group("NEW_GROUP")) is supposed to add a group with id = 4 and name = "NEW_GROUP". The test passed once, but when I ran it again and again the group was still added, yet for some reason the id grew by one each time. After a few runs it already looked like this:
[screenshot of the groups table showing the unexpected id values]
I checked groupDao.save() - everything is fine - and tried changing databaseTester.setSetUpOperation(DatabaseOperation ***), but it didn't help.
Can you tell me where the problem is? Maybe I'm just not clearing something?
And just in case my dao method:
@Override
public void save(Group group) {
    try (Connection connection = connectionProvider.getConnection();
         PreparedStatement statement = connection.prepareStatement(SAVE_NEW_RECORD)) {
        statement.setString(1, group.getName());
        statement.executeUpdate();
    } catch (SQLException e) {
        e.printStackTrace();
    }
}
And table schema:
CREATE TABLE groups
(
    group_id   serial PRIMARY KEY,
    group_name VARCHAR(10) UNIQUE NOT NULL
);
Once the test passed, but when I ran it again and again, the group was added, but for some reason the id grew by one.
The issue is not with your code or configuration. PostgreSQL's serial type is auto-incrementing: it takes the next value from its backing sequence every time a row is inserted, and that sequence keeps counting up across test runs even when the rows themselves are deleted.
Use the dbUnit ValueComparer assertion instead, so the id column is compared with "greater than or equal to" rather than with the strict-equality assertion method you are currently using.
http://dbunit.sourceforge.net/datacomparisons/valuecomparer.html
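For example, a minimal sketch of that assertion inside the add() test above (this assumes dbUnit 2.6+, where Assertion.assertWithValueComparer and the ValueComparers constants exist; the column name comes from your schema):
import java.util.HashMap;
import java.util.Map;

import org.dbunit.Assertion;
import org.dbunit.assertion.comparer.value.ValueComparer;
import org.dbunit.assertion.comparer.value.ValueComparers;

// ...inside add(), replacing Assertion.assertEquals(expectedTable, actualTable):
Map<String, ValueComparer> columnComparers = new HashMap<>();
// group_id only has to be >= the expected value, since serial keeps counting up
columnComparers.put("group_id", ValueComparers.isActualGreaterThanOrEqualToExpected);
// every other column (group_name) must still match exactly
Assertion.assertWithValueComparer(expectedTable, actualTable,
        ValueComparers.isActualEqualToExpected, columnComparers);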

Using SQL Server's BULK INSERT with EntityManager's createNativeQuery results in TransactionRequiredException

So I've been trying to figure this one out.
SQL Server has a way to Bulk Copy using sql which they outline here:
https://learn.microsoft.com/en-us/sql/connect/jdbc/using-bulk-copy-with-the-jdbc-driver?view=sql-server-2017#single-bulk-copy-operations
They even use executeUpdate().
I've been trying to do this with my EntityManager, however I always get the following:
javax.persistence.TransactionRequiredException: Executing an update/delete query
Here's my code:
@Service
@Transactional
public class UpdateService {

    private EntityManager emPreStaging;

    @PersistenceContext(unitName = "prestagingEntityManagerFactory")
    public void setEmPreStaging(EntityManager emPreStaging) {
        this.emPreStaging = emPreStaging;
    }

    @Async
    public void insertData(String filePath) {
        _log.info("[Update] Inserting data...");
        try {
            emPreStaging.createNativeQuery("BULK INSERT [test_table] FROM '" + filePath + "' WITH ( FIRSTROW = 2,FORMATFILE = 'C:\\csv\\location\\test.csv'").executeUpdate();
        } catch (Exception e) {
            _log.error("[Update] - Insert Data FAILED - " + e.getMessage());
        } finally {
            emPreStaging.close();
        }
    }
}
I'm currently using org.springframework.transaction.annotation.Transactional, but I have also tried JPA's. I've tried annotating the method instead of the class, to no avail. What am I doing wrong?

Find and replace text in MS Access table rows not working

Given a directory, my application traverses and loads .mdb MS Access dbs using the Jackcess API. Inside of each database, there is a table named GCMT_CMT_PROPERTIES with a column named cmt_data containing some text. I also have a Mapper object (which essentially resembles a Map<String,String> but allows duplicate keys) which I use as a dictionary when replacing a certain word from a string.
So for example if mapper contains fox -> dog then the sentence: "The fox jumps" becomes "The dog jumps".
The design I'm going with for this program is as follows:
1. Given a directory, traverse all subdirectories and load all .mdb files into a File[].
2. For each db file in File[], create a Task<Void> called "TaskMdbUpdater" and pass it the db file.
3. Dispatch and run each task as it is created (see 2. above).
TaskMdbUpdater is responsible for locating the appropriate table and column in the db file it was given and iteratively running a "find & replace" routine on each row of the table to detect words from the dictionary and replace them (as shown in example above) and finally updating that row before closing the db. Each instance of TaskMdbUpdater is a background thread with a Jackcess API DatabaseBuilder assigned to it, so it is able to manipulate the db.
In the current state, the code runs without throwing any exceptions whatsoever; however, when I "manually" open the db through Access and inspect a given row, it appears not to have changed. I've tried to pin down the source of the issue without any luck and would appreciate any support. If you need to see more code, let me know and I'll update my question accordingly.
public class TaskDatabaseTaskDispatcher extends Task<Void> {

    private String parentDir;
    private String dbFileFormat;
    private Mapper mapper;

    public TaskDatabaseTaskDispatcher(String parent, String dbFileFormat, Mapper mapper) {
        this.parentDir = parent;
        this.dbFileFormat = dbFileFormat;
        this.mapper = mapper;
    }

    @Override
    protected Void call() throws Exception {
        File[] childDirs = getOnlyDirectories(getDirectoryChildFiles(new File(this.parentDir)));
        DatabaseBuilder[] dbs = loadDatabasesInParent(childDirs);
        Controller.dprint("TaskDatabaseTaskDispatcher", dbs.length + " databases were found in parent directory");
        TaskMdbUpdater[] tasks = new TaskMdbUpdater[dbs.length];
        Thread[] workers = new Thread[dbs.length];
        for (int i = 0; i < dbs.length; i++) {
            // for each db, dispatch a Task so a worker can update that db.
            tasks[i] = new TaskMdbUpdater(dbs[i], mapper);
            workers[i] = new Thread(tasks[i]);
            workers[i].setDaemon(true);
            workers[i].start();
        }
        return null;
    }

    private DatabaseBuilder[] loadDatabasesInParent(File[] childDirs) throws IOException {
        DatabaseBuilder[] dbs = new DatabaseBuilder[childDirs.length];
        // Traverse children and load dbs[]
        for (int i = 0; i < childDirs.length; i++) {
            File dbFile = FileUtils.getFileInDirectory(
                    childDirs[i].getCanonicalFile(),
                    childDirs[i].getName() + this.dbFileFormat);
            dbs[i] = new DatabaseBuilder(dbFile);
        }
        return dbs;
    }
}
// StringUtils class, utility methods
public class StringUtils {

    public static String findAndReplace(String str, Mapper mapper) {
        String updatedStr = str;
        for (int i = 0; i < mapper.getMappings().size(); i++) {
            updatedStr = updatedStr.replaceAll(mapper.getMappings().get(i).getKey(), mapper.getMappings().get(i).getValue());
        }
        return updatedStr;
    }
}
// FileUtils class, utility methods:
public class FileUtils {

    /**
     * Returns only directories in given File[].
     * @param list
     * @return
     */
    public static File[] getOnlyDirectories(File[] list) throws IOException, NullPointerException {
        List<File> filteredList = new ArrayList<>();
        for (int i = 0; i < list.length; i++) {
            if (list[i].isDirectory()) {
                filteredList.add(list[i]);
            }
        }
        File[] correctSizeFilteredList = new File[filteredList.size()];
        for (int i = 0; i < filteredList.size(); i++) {
            correctSizeFilteredList[i] = filteredList.get(i);
        }
        return correctSizeFilteredList;
    }

    /**
     * Returns a File[] containing all children under specified parent file.
     * @param parent
     * @return
     */
    public static File[] getDirectoryChildFiles(File parent) {
        return parent.listFiles();
    }
}
public class Mapper {

    private List<aMap> mappings;

    public Mapper(List<aMap> mappings) {
        this.mappings = mappings;
    }

    /**
     * Returns mapping dictionary, typically used for extracting individual mappings.
     * @return List of type aMap
     */
    public List<aMap> getMappings() {
        return mappings;
    }

    public void setMappings(List<aMap> mappings) {
        this.mappings = mappings;
    }
}

/**
 * Represents a single String based K -> V mapping.
 */
public class aMap {

    private String[] mapping; // [0] - key, [1] - value

    public aMap(String[] mapping) {
        this.mapping = mapping;
    }

    public String getKey() {
        return mapping[0];
    }

    public String getValue() {
        return mapping[1];
    }

    public String[] getMapping() {
        return mapping;
    }

    public void setMapping(String[] mapping) {
        this.mapping = mapping;
    }
}
Update 1:
To verify my custom StringUtils.findAndReplace logic, I've performed the following unit test (in JUnit) which is passing:
@Test
public void simpleReplacementTest() {
    // Construct a test mapper/dictionary
    List<aMap> aMaps = new ArrayList<aMap>();
    aMaps.add(new aMap(new String[] {"fox", "dog"})); // {K, V} = K -> V
    Mapper mapper = new Mapper(aMaps);
    // Perform replacement
    String corpus = "The fox jumps";
    String updatedCorpus = StringUtils.findAndReplace(corpus, mapper);
    assertEquals("The dog jumps", updatedCorpus);
}
I'm including my TaskMdbUpdater class here separately with some logging code included, as I suspect the point of failure lies somewhere in call():
/**
 * Updates a given .mdb database according to specifications defined internally.
 * @since 2.2
 */
public class TaskMdbUpdater extends Task<Void> {

    private final String TABLE_NAME = "GCMT_CMT_PROPERTIES";
    private final String COLUMN_NAME = "cmt_data";
    private DatabaseBuilder dbPackage;
    private Mapper mapper;

    public TaskMdbUpdater(DatabaseBuilder dbPack, Mapper mapper) {
        super();
        this.dbPackage = dbPack;
        this.mapper = mapper;
    }

    @Override
    protected Void call() {
        try {
            // Controller.dprint("TaskMdbUpdater", "Worker: " + Thread.currentThread().getName() + " running");
            // Open db and extract Table
            Database db = this.dbPackage.open();
            Logger.debug("Opened database: {}", db.getFile().getName());
            Table table = db.getTable(TABLE_NAME);
            Logger.debug("Opening table: {}", table.getName());
            Iterator<Row> tableRows = table.iterator();
            // Controller.dprint("TaskMdbUpdater", "Updating database: " + db.getFile().getName());
            int i = 0;
            try {
                while (tableRows.hasNext()) {
                    // Row is basically a Map<Column_Name, Value>
                    Row cRow = tableRows.next();
                    Logger.trace("Current row: {}", cRow);
                    // Controller.dprint(Thread.currentThread().getName(), "Database name: " + db.getFile().getName());
                    // Controller.dprint("TaskMdbUpdater", "existing row: " + cRow.toString());
                    String str = cRow.getString(COLUMN_NAME);
                    Logger.trace("Row {} column field contents (before find/replace): {}", i, str);
                    String newStr = performFindAndReplaceOnString(str);
                    Logger.trace("Row {} column field contents (after find/replace): {}", i, newStr);
                    cRow.put(COLUMN_NAME, newStr);
                    Logger.debug("Updating field in row {}", i);
                    Row newRow = table.updateRow(cRow); // updateRow returns the new, updated row. Ignoring this.
                    Logger.debug("Calling updateRow on table with modified row");
                    // Controller.dprint("TaskMdbUpdater", "new row: " + newRow.toString());
                    i++;
                    Logger.trace("i = {}", i);
                }
            } catch (NoSuchElementException e) {
                // e.printStackTrace();
                Logger.error("Thread has iterated past number of rows in table", e);
            }
            Logger.info("Iterated through {} rows in table {}", i, table.getName());
            db.close();
            Logger.debug("Closing database: {}", db.getFile().getName());
        } catch (Exception e) {
            // e.printStackTrace();
            Logger.error("An error occurred while attempting to update row value", e);
        }
        return null;
    }

    /**
     * @see javafx.concurrent.Task#failed()
     */
    @Override
    protected void failed() {
        super.failed();
        Logger.error("Task failed");
    }

    @Override
    protected void succeeded() {
        Logger.debug("Task succeeded");
    }

    private String performFindAndReplaceOnString(String str) {
        // Logger.trace("OLD: [" + str + "]");
        String updatedStr = null;
        for (int i = 0; i < mapper.getMappings().size(); i++) {
            // loop through all parameter names in mapper to search for in str.
            updatedStr = findAndReplace(str, this.mapper);
        }
        // Logger.trace("NEW: [" + updatedStr + "]");
        return updatedStr;
    }
}
Here's a small excerpt from my log. As you can see, it doesn't seem to do anything after opening the table, which has left me a bit perplexed:
INFO (16-02-2017 17:27:59) [Thread-9] NAMEMAP.logic.TaskDatabaseTaskDispatcher.call(): Located the following directories under specified MOIS parent which contains an .mdb file:
[01_Parent_All_Safe_Test[ RV_DMS_0041RV_DMS_0001RV_DMS_0003RV_DMS_0005RV_DMS_0007RV_DMS_0012RV_DMS_0013RV_DMS_0014RV_DMS_0016RV_DMS_0017RV_DMS_0018RV_DMS_0020RV_DMS_0023RV_DMS_0025RV_DMS_0028RV_DMS_0029RV_DMS_0031RV_DMS_0033RV_DMS_0034RV_DMS_0035RV_DMS_0036RV_DMS_0038RV_DMS_0039RV_DMS_0040 ]]
...
DEBUG (16-02-2017 17:27:59) [Thread-9] NAMEMAP.logic.TaskDatabaseTaskDispatcher.call(): Created new task: NAMEMAP.logic.TaskMdbUpdater#4cfe46fe
DEBUG (16-02-2017 17:27:59) [Thread-9] NAMEMAP.logic.TaskDatabaseTaskDispatcher.call(): Created new worker: Thread[Thread-22,5,main]
DEBUG (16-02-2017 17:27:59) [Thread-9] NAMEMAP.logic.TaskDatabaseTaskDispatcher.call(): Set worker Thread[Thread-22,5,main] as daemon
DEBUG (16-02-2017 17:27:59) [Thread-9] NAMEMAP.logic.TaskDatabaseTaskDispatcher.call(): Dispatching worker: Thread[Thread-22,5,main]
...
DEBUG (16-02-2017 17:28:00) [Thread-22] NAMEMAP.logic.TaskMdbUpdater.call(): Opened database: RV_DMS_0023.mdb
DEBUG (16-02-2017 17:28:00) [Thread-22] NAMEMAP.logic.TaskMdbUpdater.call(): Opening table: GCMT_CMT_PROPERTIES
After this point there aren't any more entries in the log, and the processor spikes to 100% load, remaining that way until I force-kill the application. This could mean the program is stuck in an infinite while loop; but if that were the case, shouldn't there be more log entries in the file?
Update 2
Okay, I've further narrowed down the problem by printing TRACE logging to stdout. It seems that my performFindAndReplaceOnString is super inefficient, and it never gets past the first row of these dbs because it's just grinding away at the long string. Any suggestions on how I can perform the string replacement efficiently for this use case?

How can I write string to container to be used after a loop?

I have an application which creates a number of queries (update or insert) and then executes each query.
The whole code works fine, but I've seen that my server's IO latency is too high during this process.
The code executes a loop which takes around 1 minute.
What I wanted to do is write each query in memory instead of executing it and then, once I have the whole list of queries to execute, use "LOAD DATA LOCAL INFILE" from MySQL, which will take less time.
My question is: how can I write all my queries (String objects) into a "File" or "any other container" in Java to use them after the loop?
@user3283548 This is my example code:
Class1:
import java.util.ArrayList;

public class Class1 {

    public static void main(String[] args) throws Exception {
        // TODO Auto-generated method stub
        ArrayList<String> Staff = new ArrayList<String>();
        Staff.add("tom");
        Staff.add("Laura");
        Staff.add("Patricia");
        for (int x = 0; x < Staff.size(); x++) {
            System.out.println(Staff.get(x));
            Class2 user = new Class2(Staff.get(x));
            user.checkUser();
        }
    }
}
Class2:
public class Class2 {

    private String user;

    public Class2(String user) {
        this.user = user;
    }

    public void checkUser() throws Exception {
        if (user.equals("tom")) {
            String queryUser = "update UsersT set userStatus='2' where UserName='" + user + "';";
            Class3 updateUser = new Class3(queryUser);
            updateUser.UpdateQuery();
        } else {
            String queryUser = "Insert into UsersT (UserName,userStatus)Values('" + user + "','1');";
            Class3 updateUser = new Class3(queryUser);
            updateUser.InsertQuery();
            System.out.println(user + " is not tom, doing new insert");
        }
    }
}
Class3:
public class Class3 {

    public String Query;

    public Class3(String Query) {
        this.Query = Query;
    }

    public void UpdateQuery() throws Exception {
        /*// Accessing Driver From Jar File
        Class.forName("com.mysql.jdbc.Driver");
        // DB Connection
        Connection con = DriverManager.getConnection("jdbc:mysql://localhost:3306/default","root","1234567");
        String sql = Query;
        PreparedStatement pst = con.prepareStatement(sql);*/
        System.out.println(Query); // Just to test
        //pst.execute();
    }

    public void InsertQuery() throws Exception {
        /*// Accessing Driver From Jar File
        Class.forName("com.mysql.jdbc.Driver");
        // DB Connection
        Connection con = DriverManager.getConnection("jdbc:mysql://localhost:3306/default","root","1234567");
        String sql = Query;
        PreparedStatement pst = con.prepareStatement(sql);*/
        System.out.println(Query); // Just to test
        //pst.execute();
    }
}
Then, what I wanted to do is create an ArrayList in Class1 and use it in Class3 to collect all the queries which have to be executed.
The idea is to execute the list of queries in one go once the main process is finished, instead of doing it for each element within the loop of Class1. I want to do it this way because I think it will take less IO from the server's disk.
Your loop is probably too slow because you're building up Strings using String concatenation.
I'd hazard a guess you're doing things like
String query = "SELECT * FROM " + variablea + " WHERE " + variableb + " = " ...
If you're doing a lot of string concatenation then use StringBuilder, as every time you change a String it is actually re-created, which is expensive. Simply changing your code to use StringBuilder instead of String will probably cut your loop execution time to a couple of ms. Call the .toString() method of the StringBuilder object to get the resulting String.
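For instance, a minimal sketch of the query above built with StringBuilder (variablea and variableb are just illustrative names, not from your code):
StringBuilder sb = new StringBuilder();
sb.append("SELECT * FROM ").append(variablea)
  .append(" WHERE ").append(variableb).append(" = ?");
String query = sb.toString(); // one String is created at the end, not one per concatenation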
Storing objects
If you want to store anything for later use you should store it in a Collection. If you want a key-value relationship then use a Map (HashMap would suit you fine). If you just want the values, use a List (ArrayList is the most popular).
So for example if I wanted to store query strings for later use I would...
Construct the string using StringBuilder.
Put the string (by calling .toString()) into a HashMap
Get the query string from the HashMap...
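Put together, a rough sketch of that flow (the class name and query text are only examples, not your real code):
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class QueryCollector {

    public static void main(String[] args) {
        // build each query once, then collect it for later execution
        List<String> queries = new ArrayList<>();
        Map<String, String> queriesByUser = new HashMap<>();

        String query = new StringBuilder()
                .append("update UsersT set userStatus='2' where UserName='tom'")
                .toString();
        queries.add(query);              // keeps insertion order
        queriesByUser.put("tom", query); // or fetch it back by key later

        // after the loop has finished, run everything in one pass
        for (String sql : queries) {
            System.out.println(sql); // prepare and execute sql here instead
        }
    }
}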
You should never store things on disk if you don't need them to be persistent over application restarts and even then I'd store them in a database not in a file.
Hope this helps.
Thanks
David
EDIT: UPDATE BASED ON YOU POSTING YOUR CODE:
OK this needs some major re-factoring!
I've kept it really simple because I don't have a lot of time to re-write comprehensively.
I've commented where I have made corrections.
Your major issue here is creating objects in loops. You should just create the object once as creating objects is expensive.
I've also corrected other coding issues and replaced the for loop, as you shouldn't be writing it like that. I've also renamed the classes to something useful.
I've not tested this so you may need to do some work to get it to work. But this should be a lot faster.
OLD CLASS 1
import java.util.ArrayList;
import java.util.List;

public class StaffChecker {

    public static void main(String[] args) throws Exception {
        // Creating objects is expensive, you should do this as little as possible
        StaffCheckBO staffCheckBO = new StaffCheckBO();
        // Variables should be camel cased and describe what they hold.
        // Never declare the variable as ArrayList; specify the List interface on the left side.
        List<String> staffList = new ArrayList<String>();
        staffList.add("tom");
        staffList.add("Laura");
        staffList.add("Patricia");
        // Use a for-each loop, not (int x = 0 ...); this is the preferred style.
        for (String staffMember : staffList) {
            // You no longer need .get(); you can access the current element via staffMember
            System.out.println(staffMember);
            // Do the work
            staffCheckBO.checkUser(staffMember);
        }
    }
}
OLD CLASS 2
/**
* Probably not really any need for this class but I'll assume further business logic may follow.
*/
public class StaffCheckBO {

    // Again, only create our DAO once... CREATING OBJECTS IS EXPENSIVE.
    private StaffDAO staffDAO = new StaffDAO();

    public void checkUser(String staffMember) throws Exception {
        boolean staffExists = staffDAO.checkStaffExists(staffMember);
        if (staffExists) {
            System.out.println(staffMember + " has been found in the database, updating user.");
            staffDAO.updateStaff(staffMember);
        } else {
            System.out.println(staffMember + " is not in the database, doing new insert.");
            staffDAO.insertStaff(staffMember);
        }
    }
}
OLD CLASS 3
import java.sql.*;

/**
 * You will need to do some work to get this class to work fully and this is obviously basic, but it's to give you an idea.
 */
public class StaffDAO {

    public boolean checkStaffExists(String staffName) {
        boolean staffExists = false;
        try {
            String query = "SELECT * FROM STAFF_TABLE WHERE STAFF_NAME = ?";
            PreparedStatement preparedStatement = getDBConnection().prepareStatement(query);
            // Load your variables into the statement in order to be safe against injection attacks.
            preparedStatement.setString(1, staffName);
            ResultSet resultSet = preparedStatement.executeQuery();
            // If a record has been found the staff member is in the database. This obviously doesn't account for multiple staff members.
            if (resultSet.next()) {
                staffExists = true;
            }
        } catch (SQLException e) {
            System.out.println("SQL Exception in checkStaffExists: " + e.getMessage());
        }
        return staffExists;
    }

    // Method names should be camel cased
    public void updateStaff(String staffName) throws Exception {
        try {
            String query = "YOUR QUERY";
            PreparedStatement preparedStatement = getDBConnection().prepareStatement(query);
            // Load your variables into the statement in order to be safe against injection attacks.
            preparedStatement.setString(1, staffName);
            // Use executeUpdate (not executeQuery) for statements that modify data.
            preparedStatement.executeUpdate();
        } catch (SQLException e) {
            System.out.println("SQL Exception in updateStaff: " + e.getMessage());
        }
    }

    public void insertStaff(String staffName) throws Exception {
        try {
            String query = "YOUR QUERY";
            PreparedStatement preparedStatement = getDBConnection().prepareStatement(query);
            // Load your variables into the statement in order to be safe against injection attacks.
            preparedStatement.setString(1, staffName);
            // Use executeUpdate (not executeQuery) for statements that modify data.
            preparedStatement.executeUpdate();
        } catch (SQLException e) {
            System.out.println("SQL Exception in insertStaff: " + e.getMessage());
        }
    }

    /**
     * You need to abstract the connection logic away so you avoid duplicating it.
     *
     * @return
     */
    private Connection getDBConnection() {
        Connection connection = null;
        try {
            Class.forName("com.mysql.jdbc.Driver");
            connection = DriverManager.getConnection("jdbc:mysql://localhost:3306/default", "root", "1234567");
        } catch (ClassNotFoundException e) {
            System.out.println("Could not find class. DB Connection could not be created: " + e.getMessage());
        } catch (SQLException e) {
            System.out.println("SQL Exception. " + e.getMessage());
        }
        return connection;
    }
}

H2 Trigger, Notifier not called in Automatic Mixed Mode (AUTO_SERVER=TRUE) on remote client

I'm trying to use H2 Trigger facility to let clients connected to a H2 database in automatic mixed mode (AUTO_SERVER=TRUE) receive notification when something changes in the database table
test(id INTEGER NOT NULL AUTO_INCREMENT, message varchar(1024))
So far only the H2 server process receives the TRIGGER notification, while the clients receive nothing, so their only way to check for changes to the database is to poll the table with queries; but that makes the TRIGGER itself useless, since I could just have all clients and the server poll the database for changes.
Is there some way to let a trigger notify all connected clients, or call a method inside each client, so that they realize the table has been modified by an insertion (the delete and update cases don't bother me)?
I post my code below which is based on this answer by Thomas Mueller (H2 database creator):
import java.sql.*;
import java.util.concurrent.atomic.AtomicLong;
import org.h2.api.Trigger;
public class TestSimpleDb
{
public static void main(String[] args) throws Exception
{
final String url = "jdbc:h2:test;create=true;AUTO_SERVER=TRUE;multi_threaded=true";
boolean isSender = false;
for (String arg : args)
{
if (arg.contains("receiver"))
{
System.out.println("receiver starting");
isSender = false;
}
else if (arg.contains("sender"))
{
System.out.println("sender starting");
isSender = true;
}
}
if (isSender)
{
Connection conn = DriverManager.getConnection(url);
Statement stat = conn.createStatement();
stat.execute("create table test(id INTEGER NOT NULL AUTO_INCREMENT, message varchar(1024))");
stat.execute("create trigger notifier "
+ "before insert, update, delete, rollback "
+ "on test FOR EACH ROW call \""
+ TestSimpleDb.Notifier.class.getName() + "\"");
Thread.sleep(500);
for (int i = 0; i < 10; i++) {
System.out.println("Sender: I change something...");
stat.execute("insert into test(message) values('my message')");
Thread.sleep(1000);
}
conn.close();
}
else
{
new Thread() {
public void run() {
try {
Connection conn = DriverManager.getConnection(url);
while (true) {
;
//this loop is just to keep the thread alive..
}
}
catch (Exception e)
{
e.printStackTrace();
}
}
}.start();
}
}
public static class Notifier implements Trigger
{
@Override
public void init(Connection cnctn, String string, String string1, String string2, boolean bln, int i) throws SQLException {
// Initializing trigger
}
@Override
public void fire(Connection conn, Object[] oldRow, Object[] newRow) throws SQLException {
if (newRow != null) {
System.out.println("Received: " + (String) newRow[1]);
}
}
@Override
public void close() {
// ignore
}
@Override
public void remove() {
// ignore
}
}
}
Like all triggers, this trigger is called on the server, that is, when using the automatic mixed mode (like you do) in the process that opened the database first. Therefore, if I first start the "sender", then I get the following output there:
sender starting
Sender: I change something...
Received: my message
Sender: I change something...
Received: my message
and if I then start the "receiver", I get the following messages there:
Receiver: event received
Receiver: event received
Receiver: event received
If you want that the "receiver" can display what rows were changed, you would need a different architecture. For example, you could add a timestamp column to the table (and an index for this column), and then, on the receiver side, query for the rows where the timestamp is new. This will only work for added and changed rows; for removed rows, you might need to add a new table that contains the removed rows since time x. This table would need to be garbage collected from time to time so that it doesn't grow forever.
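A rough sketch of that receiver-side polling (this assumes the test table gains a "modified" timestamp column as described; the column and variable names are illustrative, not part of the original schema). It would replace the empty while loop inside the receiver thread's try block, where java.sql.* is already imported:
// Assumed one-time schema change (not in the original code):
//   alter table test add column modified timestamp default current_timestamp;
//   create index idx_test_modified on test(modified);
PreparedStatement ps = conn.prepareStatement(
        "select id, message, modified from test where modified > ? order by modified");
Timestamp lastSeen = new Timestamp(0);
while (true) {
    ps.setTimestamp(1, lastSeen);
    try (ResultSet rs = ps.executeQuery()) {
        while (rs.next()) {
            System.out.println("Receiver: new/changed row " + rs.getInt("id")
                    + " -> " + rs.getString("message"));
            lastSeen = rs.getTimestamp("modified");
        }
    }
    Thread.sleep(1000); // poll at a fixed interval instead of busy-waiting
}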
