one base executeQuery method or one for every query - java

I've started creating a toDoList and I'd like to create a "DataMapper" to fire queries at my database.
I created this DataMapper to handle things for me, but I don't know if my way of thinking is correct in this case. In my DataMapper I have created only one method that executes the queries and several methods that know which query to fire (to minimize the open and close calls).
For example I have this:
public Object insertItem(String value) {
this.value = value;
String insertQuery = "INSERT INTO toDoList(item,datum) " + "VALUES ('" + value + "', CURDATE())";
return this.executeQuery(insertQuery);
}
public Object removeItem(int id) {
this.itemId = id;
String deleteQuery = "DELETE FROM test WHERE id ='" + itemId + "'";
return this.executeQuery(deleteQuery);
}
private ResultSet executeQuery(String query) {
this.query = query;
Connection con = null;
Statement st = null;
ResultSet rs = null;
try {
con = db.connectToAndQueryDatabase(database, user, password);
st = con.createStatement();
st.executeUpdate(query);
}
catch (SQLException e1) {
e1.printStackTrace();
}
finally {
if (rs != null) {
try {
rs.close();
} catch (SQLException e2) { /* ignored */}
}
if (st != null) {
try {
st.close();
} catch (SQLException e2) { /* ignored */}
}
if (con != null) {
try {
con.close();
} catch (SQLException e2) { /* ignored */}
}
System.out.println("connection closed");
}
return rs;
}
So now I don't know if it's correct to return a ResultSet like this. I thought of doing something like
public ArrayList<ToDoListModel> getModel() {
return null;
}
to insert every record returned into an ArrayList. But I feel a little stuck. Can someone point me in the right direction with an example or something?

It depends on the way the application works. If you have a lot of database hits in a short time, it would be better to bundle them and use the same database connection for all queries to reduce the overhead of connection establishment and cleanup.
If you only have single queries at larger intervals, you could do it this way.
You should also consider whether you want to separate the database layer from the user interface (if one exists).
In this case you should not pass the ResultSet up to the user interface, but wrap the data in an independent container and pass that through your application.
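A minimal sketch of such a container for the to-do list, assuming a simple ToDoListModel with an id, an item text and a date (the class and column names are illustrative, not taken from your schema):
public class ToDoListModel {
    private final int id;
    private final String item;
    private final java.sql.Date datum;

    public ToDoListModel(int id, String item, java.sql.Date datum) {
        this.id = id;
        this.item = item;
        this.datum = datum;
    }

    public int getId() { return id; }
    public String getItem() { return item; }
    public java.sql.Date getDatum() { return datum; }
}

// Inside the DataMapper: read and close the ResultSet here, so only plain
// objects ever leave the database layer.
public List<ToDoListModel> getAllItems() throws SQLException {
    List<ToDoListModel> items = new ArrayList<>();
    try (Connection con = db.connectToAndQueryDatabase(database, user, password);
         Statement st = con.createStatement();
         ResultSet rs = st.executeQuery("SELECT id, item, datum FROM toDoList")) {
        while (rs.next()) {
            items.add(new ToDoListModel(rs.getInt("id"), rs.getString("item"), rs.getDate("datum")));
        }
    }
    return items;
}
The user interface then only works with List<ToDoListModel> and never sees any JDBC types.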

If I understand your problem correctly, you need to pass a list of ToDoListModel objects
to insert into the DB using the insertItem method.
How you pass your objects to insert items does not actually matter, but what you need to consider is how concurrently this DataMapper is used. If it can be accessed by multiple threads at a time, you will end up creating multiple DB connections, which is a little expensive. Your code actually works without any issue under sequential access.
So you can add a synchronized block around connection creation and make the DataMapper class a singleton.
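A rough sketch of that idea, reusing the db/database/user/password fields from your class (treat it as an outline, not a drop-in replacement):
public class DataMapper {
    private static final DataMapper INSTANCE = new DataMapper();

    private Connection connection;

    private DataMapper() { }                    // nobody else can instantiate it

    public static DataMapper getInstance() {    // everyone shares this one mapper
        return INSTANCE;
    }

    // Only one thread at a time may create (or re-create) the shared connection.
    private synchronized Connection getConnection() throws SQLException {
        if (connection == null || connection.isClosed()) {
            connection = db.connectToAndQueryDatabase(database, user, password);
        }
        return connection;
    }
}
Every insertItem/removeItem call would then go through getConnection() instead of opening its own connection.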

OK, in that case what you can do is create an ArrayList of HashMaps first, where each map's keys and values are the column names and column values. After that you can create your model.
public List<Map<String, Object>> convertResultSetToArrayList(ResultSet rs) throws SQLException {
    ResultSetMetaData mdata = rs.getMetaData();
    int columns = mdata.getColumnCount();
    List<Map<String, Object>> list = new ArrayList<>();
    while (rs.next()) {
        Map<String, Object> row = new HashMap<>(columns);
        for (int i = 1; i <= columns; ++i) {
            row.put(mdata.getColumnName(i), rs.getObject(i));
        }
        list.add(row);
    }
    return list;
}
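After that, turning those generic rows into your model could look roughly like this (assuming a ToDoListModel with an int id, a String item and a java.sql.Date datum; adapt the column names to your real table):
public List<ToDoListModel> convertRowsToModels(List<Map<String, Object>> rows) {
    List<ToDoListModel> models = new ArrayList<>();
    for (Map<String, Object> row : rows) {
        // The keys are whatever getColumnName() reported; "id", "item" and "datum" are guesses here.
        int id = ((Number) row.get("id")).intValue();
        String item = (String) row.get("item");
        java.sql.Date datum = (java.sql.Date) row.get("datum");
        models.add(new ToDoListModel(id, item, datum));
    }
    return models;
}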

Related

Java try-with-resource on SQL statement will these close properly?

I had a very sophisticated class that performed DB queries; the problem is it wasn't using try-with-resources statements, so I had to .close() manually. To be safer, I tried to re-design it with try-with-resources. My question is whether these resources will close properly given how I'm referencing them outside the objects containing those resources. For example, this class DBQuery I use to create queries and resources related to those queries:
public class DBQuery {
private String _query;
private PreparedStatement _stmt;
private ResultSet _rs;
// constructor
public DBQuery (String query) {
_query = query;
}
public PreparedStatement execPreparedStatement() throws SQLException {
_stmt = DB.getCon().prepareStatement(_query, Statement.RETURN_GENERATED_KEYS);
return _stmt;
}
public ResultSet getRecordSet() throws SQLException {
_rs = _stmt.executeQuery();
return _rs;
}
public void setInt(int paramNum, int setVal) throws SQLException {
_stmt.setInt(paramNum, setVal);
}
public void setString(int paramNum, String setVal) throws SQLException {
_stmt.setString(paramNum, setVal);
}
}
Then this would be example usage of the class. loadActiveCompany, given a companyId, retrieves the company from the database and creates some objects. My question is two-fold:
Will the resources close properly when loadActiveCompany completes?
Is there any problem with how I'm using the try-catch blocks?
Thank you
// loads the active company into the view
public void loadActiveCompany(int companyId) {
boolean loadFailed = false;
// we are passed the company id
_activeCompany.setCompanyId(companyId);
DBQuery qComps = new DBQuery("SELECT comp_name FROM comps WHERE id=?");
try ( PreparedStatement stmtComps = qComps.execPreparedStatement() ) {
DB.getCon().rollback();
qComps.setInt(1, companyId);
try ( ResultSet rsComps = qComps.getRecordSet() ) {
rsComps.next();
String companyName = rsComps.getString("comp_name");
_activeCompany.setCompanyName(companyName);
}
} catch (SQLException e) {
System.out.println("Couldn't find company!");
loadFailed = true;
} finally {
if (loadFailed)
return;
}
DBQuery qGroups = new DBQuery("SELECT id, group_name FROM comps_groups WHERE comp_id=? ORDER BY sort_order ASC");
try ( PreparedStatement stmtGroups = qGroups.execPreparedStatement() ) {
qGroups.setInt(1, companyId);
try ( ResultSet rsGroups = qGroups.getRecordSet() ) {
while (rsGroups.next()) {
int groupId = rsGroups.getInt("id");
Group thisGroup = new Group();
thisGroup.setGroupId(groupId);
thisGroup.setGroupName(rsGroups.getString("group_name"));
DBQuery qAnchors = new DBQuery("SELECT id, anchor_name, anchor_type FROM comps_groups_anchors WHERE group_id=? ORDER BY sort_order ASC");
try ( PreparedStatement stmtAnchors = qAnchors.execPreparedStatement() ) {
qAnchors.setInt(1, groupId);
try ( ResultSet rsAnchors = qAnchors.getRecordSet() ) {
while (rsAnchors.next()) {
int anchorId = rsAnchors.getInt("id");
Anchor thisAnchor = new Anchor();
thisAnchor.setAnchorId(anchorId);
thisAnchor.setAnchorName(rsAnchors.getString("anchor_name"));
thisAnchor.setAnchorType(rsAnchors.getInt("anchor_type"));
thisGroup.addGroupAnchor(thisAnchor);
}
}
}
_activeCompany.getCompanyGroups().add(thisGroup);
}
}
DB.getCon().commit();
} catch (SQLException e) {
System.out.println("Could not load company!");
loadFailed = true;
} finally {
if (loadFailed)
return;
}
// print out the active company
_activeCompany.printStatus();
}
Short answers:
Yes, the PreparedStatements are all closed.
No direct problems, but there are easier ways.
Here are my BUTs:
The name execPreparedStatement is totally misleading, as it is not (in DB terms) executing anything, just creating the PreparedStatement. A better name would be createPreparedStatement or - lol - preparePreparedStatement.
Why do you call DB.getCon().rollback();? I do not think this will lead anywhere good...
The way you use DBQuery at the moment, it will only bring pain. Basically this is just a container that saves additional info (_stmt + _rs), which makes it SEVERELY state and sequence dependent, prone to hit you with lots of NPEs.
so the actions you call on DBQuery you could simply also call on the PreparedStatement, reducing complexity and taking away a few pitfalls
so either completely remove DBQuery
or remodel the DBQuery (a sketch follows at the end of this answer)
to be Closeable/AutoCloseable,
add some checks to the other functions,
create the PreparedStatement right away (CTOR, query string as CTOR parameter),
keep it private, do not expose it
use it inside getRecordSet,
do not store any other references unless you REALLY need them
and in the close method close the PreparedStatement.
Your loadFailed = true; and if (loadFailed) return; seem overly convoluted and error-prone. Why not directly call return; right where you currently have the loadFailed = true; lines?
I would - personal preference - put those 2 whole try-catch blocks into their own methods, signaling failure with a boolean or something => more methods, each with less code and better variable scope (for example no re-use of loadFailed, but better re-usability of the two methods)
You actually do NOT need the inner try-with-resources on the ResultSets, but it's good if you (can) keep them. Just be careful there, as closing a ResultSet might have an impact on its creator PreparedStatement. So if you test it (in a situation where you re-use the PreparedStatement, which is what it's actually made for) and get a 'closed' exception when reusing the PreparedStatement, then remove the try-with-resources blocks around the ResultSets.
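A hedged sketch of that remodeled DBQuery (it assumes the DB.getCon() helper from your code; names and details are illustrative):
public class DBQuery implements AutoCloseable {
    private final PreparedStatement stmt;   // created right away, kept private, never exposed

    public DBQuery(String query) throws SQLException {
        stmt = DB.getCon().prepareStatement(query, Statement.RETURN_GENERATED_KEYS);
    }

    public void setInt(int paramNum, int val) throws SQLException {
        stmt.setInt(paramNum, val);
    }

    public void setString(int paramNum, String val) throws SQLException {
        stmt.setString(paramNum, val);
    }

    public ResultSet getRecordSet() throws SQLException {
        return stmt.executeQuery();
    }

    @Override
    public void close() throws SQLException {
        stmt.close();                       // closing the statement also closes its ResultSets
    }
}
Used like this, the whole query lives in one try-with-resources block:
try (DBQuery qComps = new DBQuery("SELECT comp_name FROM comps WHERE id=?")) {
    qComps.setInt(1, companyId);
    try (ResultSet rsComps = qComps.getRecordSet()) {
        if (rsComps.next()) {
            _activeCompany.setCompanyName(rsComps.getString("comp_name"));
        }
    }
}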

JDBC optimize MySql request on Multithread

I'm building a webcrawler and I'm looking for the best way to handle the requests and connections between my threads and the database (MySQL).
I have 2 types of threads:
Fetchers: They crawl websites. They produce URLs and add them into 2 tables: table_url and table_file. They select from table_url to continue the crawl, and update table_url to set visited=1 when they have read a URL, or visited=-1 when they are reading it. They can delete rows.
Downloaders: They download files. They select from table_file. They update table_file to change the Downloaded column. They never insert anything.
Right now I'm working with this:
I have a connection pool based on c3p0.
Every target (website) has these variables:
private Connection connection_downloader;
private Connection connection_fetcher;
I create both connections only once when I instantiate a website. Then every thread will use those connections based on its target.
Every thread has these variables:
private Statement statement;
private ResultSet resultSet;
Before every query I open a Statement:
public static Statement openSqlStatement(Connection connection){
try {
return connection.createStatement();
} catch (SQLException e) {
e.printStackTrace();
}
return null;
}
And after every query I close the Statement and ResultSet with:
public static void closeSqlStatement(ResultSet resultSet, Statement statement){
if (resultSet != null) try { resultSet.close(); } catch (SQLException e) {e.printStackTrace();}
if (statement != null) try { statement.close(); } catch (SQLException e) {e.printStackTrace();}
}
Right now my select queries only work with one select (I never have to select more than one for now, but this will change soon) and are defined like this:
public static String sqlSelect(String Query, Connection connection, Statement statement, ResultSet resultSet){
String result = null;
try {
resultSet = statement.executeQuery(Query);
resultSet.next();
result = resultSet.toString();
} catch (SQLException e) {
e.printStackTrace();
}
closeSqlStatement(resultSet, statement);
return result;
}
And Insert, Delete and Update queries use this function :
public static int sqlExec(String Query, Connection connection, Statement statement){
int ResultSet = -1;
try {
ResultSet = statement.executeUpdate(Query);
} catch (SQLException e) {
e.printStackTrace();
}
closeSqlStatement(null, statement);
return ResultSet;
}
My question is simple: can this be made faster? I'm also concerned about mutual exclusion, to prevent one thread from updating a link while another is doing so.
I believe your design is flawed. Having one connection assigned full-time to one website will severely limit your overall throughput.
As you already have set up a connection pool, it's perfectly okay to fetch a connection right before you use it (and return it afterwards).
Just the same, try-with-resources for closing all your ResultSets and Statements will make the code more readable - and using PreparedStatement instead of Statement would not hurt either.
One Example (using a static dataSource() call to access your pool):
public static String sqlSelect(String id) throws SQLException {
    try (Connection con = dataSource().getConnection();
         PreparedStatement ps = con.prepareStatement("SELECT row FROM table WHERE key = ?")) {
        ps.setString(1, id);
        try (ResultSet resultSet = ps.executeQuery()) {
            if (resultSet.next()) {
                return resultSet.getString(1);
            } else {
                throw new SQLException("Nothing found");
            }
        }
    } catch (SQLException e) {
        e.printStackTrace();
        throw e;
    }
}
Following the same pattern I suggest you create methods for all the different Insert/Update/Selects your application uses as well - all using the connection only for the short time inside the DB logic.
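For the write side, a sketch in the same style might look like this (table_url and the visited flag come from your description; the url column name is a guess):
public static int markVisited(String url, int visited) throws SQLException {
    try (Connection con = dataSource().getConnection();
         PreparedStatement ps = con.prepareStatement(
                 "UPDATE table_url SET visited = ? WHERE url = ?")) {
        ps.setInt(1, visited);
        ps.setString(2, url);
        return ps.executeUpdate();   // number of rows actually changed
    }
}
Regarding the mutual-exclusion worry: you can make such an update conditional (for example adding AND visited = 0 to the WHERE clause) and check the returned row count, so only one thread "wins" a given link.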
I cannot see a real advantage in having all the database stuff in your webcrawler threads.
Why don't you use a static class with the sqlSelect and sqlExec methods, but without the Connection and ResultSet parameters? Both connection objects are static as well. Make sure the connection objects are valid before using them.
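A rough sketch of that suggestion, shown for the fetcher connection only (pool() stands in for your c3p0 DataSource; purely illustrative):
public final class Db {
    private static Connection fetcherConnection;

    private Db() { }

    // Lazily (re)create the shared connection and make sure it is still usable.
    private static synchronized Connection connection() throws SQLException {
        if (fetcherConnection == null || !fetcherConnection.isValid(2)) {
            fetcherConnection = pool().getConnection();
        }
        return fetcherConnection;
    }

    public static String sqlSelect(String query) throws SQLException {
        try (Statement st = connection().createStatement();
             ResultSet rs = st.executeQuery(query)) {
            return rs.next() ? rs.getString(1) : null;
        }
    }

    public static int sqlExec(String query) throws SQLException {
        try (Statement st = connection().createStatement()) {
            return st.executeUpdate(query);
        }
    }
}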

ResultSet is Closed

Following is my Table Definition:
create Table alarms(
alarmId int primary key identity(1,1),
alarmDate varchar(50) not null,
alarmText varchar(50) not null,
alarmStatus varchar(10) Check (alarmStatus in(-1, 0, 1)) Default 0
);
Secondly, here are some of the methods I'm using:
public void restartDatabase(){
try{
Class.forName(Settings.getDatabaseDriver());
connection = DriverManager.getConnection( Settings.getJdbcUrl() );
statement = connection.createStatement();
}
catch(Exception e){
e.printStackTrace();
}
}
public ResultSet executeQuery(String query){
ResultSet result = null;
try {
result = statement.executeQuery(query);
} catch (SQLException e) {
e.printStackTrace();
}
return result;
}
public void closeDatabase() {
try {
if ((statement != null) && (connection != null)) {
statement.close();
connection.close();
}
} catch (Exception e) {
e.printStackTrace();
}
}
What I want to do is get all the alarmIds from the table where the date equals the given date, and then for each alarmId update its status to the given status:
public static void updateAlarmStatus(int status) {
ResultSet rs = null;
database.restartDatabase();
try {
rs = database
.executeQuery("Select alarmId from alarms where alarmDate = '"
+ Alarm.getFormattedDateTime(DateFormat.FULL,
DateFormat.SHORT) + "'");
while (rs.next()) {
database.executeUpdate("update alarms set alarmStatus = '"+status+"' where alarmId = '"+rs.getString("alarmId")+"'");
}
} catch (Exception e) {
e.printStackTrace();
} finally {
database.closeDatabase();
}
}
But it generates the error that the ResultSet is closed.
I Googled it and came to know that a ResultSet automatically closes when we try to execute another query inside it,
and it needs to restart the connection.
I tried calling the restartDatabase() method that creates a new connection, but I'm still getting the same error.
I'm guessing executeUpdate uses the same instance variable for its Statement as the query uses. When you create a new Statement and assign it to the variable, nothing is referring to the old one, so it gets cut loose and becomes subject to garbage-collection. During garbage collection the statement's finalizer is invoked, closing it. Closing the statement makes the ResultSet it created close as well.
You shouldn't be sharing these Statement variables between different queries and updates. The statement should be a local variable and not a member of an object instance.
Also, ResultSets should always be local variables; they shouldn't be passed outside the method where they're created. The ResultSet is a reference to a cursor, it doesn't actually hold any data. Always have your code read from the ResultSet and populate some data structure with the results, then return the data structure.
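Applied to updateAlarmStatus, a hedged rewrite along those lines could look like this (it keeps every Statement local, collects the ids into a plain List first, and reuses the Settings/Alarm helpers from your code):
public static void updateAlarmStatus(int status) {
    String selectSql = "SELECT alarmId FROM alarms WHERE alarmDate = ?";
    String updateSql = "UPDATE alarms SET alarmStatus = ? WHERE alarmId = ?";
    try (Connection con = DriverManager.getConnection(Settings.getJdbcUrl());
         PreparedStatement select = con.prepareStatement(selectSql);
         PreparedStatement update = con.prepareStatement(updateSql)) {
        select.setString(1, Alarm.getFormattedDateTime(DateFormat.FULL, DateFormat.SHORT));
        // Read every id into a plain list before touching the database again.
        List<Integer> ids = new ArrayList<>();
        try (ResultSet rs = select.executeQuery()) {
            while (rs.next()) {
                ids.add(rs.getInt("alarmId"));
            }
        }
        for (int id : ids) {
            update.setString(1, String.valueOf(status));  // alarmStatus is a varchar in the schema
            update.setInt(2, id);
            update.addBatch();
        }
        update.executeBatch();   // one round trip for all the updates
    } catch (SQLException e) {
        e.printStackTrace();
    }
}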
You can also select and change all alarmIds at once:
rs = database.
executeQuery("Select group_concat(distinct alarmId) as alarmIds from alarms group by alarmDate having alarmDate = '"
+ Alarm.getFormattedDateTime(DateFormat.FULL,
DateFormat.SHORT) + "'");
while (rs.next()) { // there will be only one result
database.executeUpdate("update alarms set alarmStatus = '"+status+"' where alarmId in ("+rs.getString("alarmIds")+")");
}

jdbc performance

There are three tables inside my database. One is employee, the second is employee_Project, and the third is employee_Reporting. Each table has a common employee_Number as its primary key, and there is a one-to-many relationship among them such that an employee has many projects and reporting dates.
I have run select * from employee, select * from employee_project, select * from employee_reporting in three data holder classes which have methods fillResultSet(ResultSet) and List<T> getData(). This is based on a SqlDbEngine class with a runQuery(PreparedStatement, DataHolder) method, and the implementation has been completed.
Now I have to design a getAllEmployee() method, along with the project and reporting details, with optimal code in Java using JDBC. I have used an iterator but this solution is not acceptable; now I have to use a foreach loop.
This is what I have done:
public List<Employee> getAllEmployees() {
EmployeeDataHolderImpl empdataholder = new EmployeeDataHolderImpl();
List<Employee> list_Employee_Add = null;
try {
Connection connection = mySqlDbConnection.getConnection();
PreparedStatement preparedStatement = connection
.prepareStatement(GET_ALL_EMPLOYEE_DETAILS);
mySqlDBEngineImpl.runQuery(preparedStatement, empdataholder);
} catch (SQLException e) {
e.printStackTrace();
}
for (Employee employee : empdataholder.getData()) {
new EmployeeDAOImpl().getProject(employee);
new EmployeeDAOImpl().getReport(employee);
}
list_Employee_Add = empdataholder.getData();
return list_Employee_Add;
}
and make another method
public void getProject(Employee emp) {
EmployeeProjectDataHolderImpl employeeProjectHolder = new EmployeeProjectDataHolderImpl();
try {
Connection connection = mySqlDbConnection.getConnection();
PreparedStatement preparedStatement = connection
.prepareStatement(GET_ALL_PROJECT_DETAILS);
mySqlDBEngineImpl
.runQuery(preparedStatement, employeeProjectHolder);
} catch (SQLException e) {
e.printStackTrace();
}
for (EmployeeProject employee_Project : employeeProjectHolder.getData()) {
if (employee_Project.getEmployeeNumber() == emp.getEmpNumber()) {
emp.getProjects().add(employee_Project);
}
}
}
public void getReport(Employee emp) {
EmployeeReportDataHolderImpl employeeReportHolder = new EmployeeReportDataHolderImpl();
try {
Connection connection = mySqlDbConnection.getConnection();
PreparedStatement preparedStatement = connection
.prepareStatement(GET_ALL_REPORT_DETAILS);
mySqlDBEngineImpl
.runQuery(preparedStatement, employeeReportHolder);
} catch (SQLException e) {
e.printStackTrace();
}
for (EmployeeReport employee_Report : employeeReportHolder.getData()) {
if (employee_Report.getEmployeeNumber() == emp.getEmpNumber()) {
emp.getReports().add(employee_Report);
}
}
}
}
And the same goes for employee reporting, but doing it this way, performance is going to decrease. (Nobody needs to worry about closing the connections, I will do it.)
Please tell me how I could improve my solution.
There are some issues with your code.
1. You are initializing EmployeeDAOImpl every time; instead you can keep one instance and call the operations on it:
new EmployeeDAOImpl().getProject(employee);
new EmployeeDAOImpl().getReport(employee);
2. I don't see where you close your connection after performing an SQL operation.
You should have:
try {
    // code statements
} catch (SQLException e) {
    e.printStackTrace();
} finally {
    // close your connection and preparedStatement
}
Closing database connections is vital.
If you keep your current code, you will have three performance impacts:
You're opening a connection to get the employee's data.
For every employee, you open (and close) a new connection to get his projects.
For every employee, you open (and close) a new connection to get his reports.
Note that opening a new connection is a performance hit on your application. It doesn't matter if you use an enhanced for-loop or an Iterator, there would be many hits that can slow down your application.
Two ways to solve this problem:
Open a single connection where you run all your select statements. This will be better than opening/closing lots of connections.
Create a single SQL statement to retrieve the employees and the data you need for every employee. It will have better performance for several reasons:
A single connection to the database.
A single query instead of lots of queries to the database (a single I/O operation).
If your RDBMS allows it, the query will be optimized for future requests (a single query instead of multiple queries).
I would prefer to go with the second option. For this, I tend to use a method that executes any SQL select statement and returns a ResultSet. I'll post a basic example (note, the provided code can be improved depending on your needs); this method could be in your SqlDbEngine class:
public static ResultSet executeSQL(Connection con, String sql, List<Object> arguments) {
PreparedStatement pstmt = null;
ResultSet rs = null;
try {
pstmt = con.prepareStatement(sql);
if (arguments != null) {
int i = 1;
for(Object o : arguments) {
pstmt.setObject(i++, o);
}
}
//for insert, update, delete statements you would use executeUpdate() in a similar method
rs = pstmt.executeQuery();
} catch(SQLException e) {
//handle the error...
}
return rs;
}
And this other method handles the whole query operation:
public List<Employee> getAllEmployee() {
Connection con = null;
ResultSet rs = null;
List<Employee> lstEmployee = new ArrayList<Employee>();
try {
con = mySqlDbConnection.getConnection();
//write the sql to retrieve all the data
//I'm assuming these can be your columns, it's up to you
//this can be written using JOINs...
String sql = "SELECT E.EMPLOYEE_ID, E.EMPLOYEE_NAME, P.PROJECT_NAME, R.REPORT_NAME FROM EMPLOYEE E, PROJECT P, REPORT R WHERE E.EMPLOYEE_ID = P.EMPLOYEE_ID AND E.EMPLOYEE_ID = R.EMPLOYEE_ID";
//I guess you don't need parameters for this...
rs = SqlDbEngine.executeSQL(con, sql, null);
if (rs != null) {
Employee e = null;
int employeeId = -1, lastEmployeeId = -1;
while (rs.next()) {
//you need to make sure to create a new employee only when
//reading a new employee id
employeeId = rs.getInt("EMPLOYEE_ID");
if (lastEmployeeId != employeeId) {
e = new Employee();
lastEmployeeId = employeeId;
lstEmployee.add(e);
}
Project p = new Project();
Report r = new Report();
//fill values of p...
//fill values of r...
//you can fill the values taking advantage of the column name in the resultset
//at last, link the project and report to the employee
e.getProjects().add(p);
e.getReports().add(r);
}
}
} catch (Exception e) {
//handle the error...
} finally {
try {
if (rs != null) {
Statement stmt = rs.getStatement();
rs.close();
stmt.close();
}
if (con != null) {
con.close();
}
} catch (SQLException e) {
//handle the error...
}
}
return lstEmployee;
}
Note that the second way can be harder to code, but it will give you the best performance. It's up to you to improve the provided methods; some advice:
Create a class that receives a ResultSet and builds a Project instance using the column names of the ResultSet (similar for Report and Employee) - a small sketch follows after this list.
Create a method that handles closing the ResultSet and its Statement.
As a best practice, never use select * from mytable, it's preferable to write the needed columns.
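For the first piece of advice, a tiny mapper sketch could look like this (assuming Project has a setProjectName setter; the column name comes from the example query above):
// Builds a Project from the current row of the ResultSet.
public class ProjectMapper {
    public Project map(ResultSet rs) throws SQLException {
        Project p = new Project();
        p.setProjectName(rs.getString("PROJECT_NAME"));
        return p;
    }
}
Inside the while (rs.next()) loop you would then just call projectMapper.map(rs) instead of filling the fields by hand.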
If I understand correctly, your code first loads all EmployeeReport rows and then filters them according to getEmployeeNumber(). You can let your database do this by modifying your SQL query.
Since you didn't show your SQL queries (I assume they're in GET_ALL_REPORT_DETAILS), I'll just make a guess... Try executing SQL like:
select *
from employee_reporting
where employeeNumber = ?
If you put this in a PreparedStatement, and then set the parameter value, your database will only return the data you need. For example:
PreparedStatement pstmt = con.prepareStatement(GET_ALL_REPORT_DETAILS);
pstmt.setInt(1, employee.getEmployeeNumber());
That should return only the EmployeeReport records having the desired employeeNumber. In case performance is still an issue, you could consider adding an index to your EmployeeReport table, but that's a different story...
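Put together, the per-employee lookup could look roughly like this (EmployeeReport's setters and the exact column name are assumptions):
public List<EmployeeReport> getReportsFor(Employee employee) throws SQLException {
    String sql = "SELECT * FROM employee_reporting WHERE employeeNumber = ?";
    List<EmployeeReport> reports = new ArrayList<>();
    try (Connection con = mySqlDbConnection.getConnection();
         PreparedStatement pstmt = con.prepareStatement(sql)) {
        pstmt.setInt(1, employee.getEmpNumber());
        try (ResultSet rs = pstmt.executeQuery()) {
            while (rs.next()) {
                EmployeeReport report = new EmployeeReport();
                report.setEmployeeNumber(rs.getInt("employeeNumber"));
                // set the remaining report fields from the row as needed
                reports.add(report);
            }
        }
    }
    return reports;
}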

How do I abstract my business logic and object definitions away from my database access code?

So I have a database with 2 tables - Workflows and WorkflowSteps. I want to use the rows stored there to create objects in Java, BUT the catch is that I want to have my database code separated from my application code. From one point onwards - when the Workflow/WorkflowStep objects are created - the rest of the application will not have to worry about DB access. So here is what I have:
public Workflow getPendingWorkflowId() {
int workflowid = -1;
Statement statement = null;
ResultSet rs = null;
try {
statement = con.createStatement();
rs = statement.executeQuery("SELECT id FROM xxx.workflows WHERE status = 'NOT-YET-STARTED' LIMIT 1");
while (rs.next()) {
workflowid = rs.getInt("id");
}
statement.close();
rs.close();
} catch (SQLException ex) {
Logger.getLogger(DBAccessor.class.getName()).log(Level.SEVERE, null, ex);
System.out.println("Error fetching workflows id");
}
return new Workflow(workflowid);
}
Each workflow object has a List to store the steps that pertain to a particular Workflow and then each WorkflowStep has a Map which is used to store data taken from a 3rd table:
public List<WorkflowStep> getUnworkedStepsByWFId(int id) {
//can be changed
ArrayList<WorkflowStep> steps = new ArrayList<WorkflowStep>();
Statement statement = null;
ResultSet rs = null;
try {
statement = con.createStatement();
rs = statement.executeQuery("SELECT * FROM `workflow_steps` WHERE `workflow_id` =" + id + " AND status = 'NOT-YET-STARTED'");
while (rs.next()) {
steps.add(new WorkflowStep(rs.getInt(1), rs.getInt(3), rs.getInt(4)));
}
statement.close();
rs.close();
} catch (SQLException ex) {
Logger.getLogger(DBAccessor.class.getName()).log(Level.SEVERE, null, ex);
System.out.println("Error fetching workflows id");
}
return steps;
}
And here is the query for the 3rd table:
public Map getParametersForStep(int workflowId, int workstepPos) {
Statement statement = null;
ResultSet rs = null;
Map<String, String> hMap = new HashMap<String, String>();
try {
statement = con.createStatement();
//MIGHT BE WRONG
rs = statement.executeQuery("SELECT wf.id AS workflowID, ws_steps.id AS workflowStepsID, name, param_value, pathname FROM workflows AS wf INNER JOIN workflow_steps AS ws_steps ON wf.id = ws_steps.workflow_id INNER JOIN ws_parameters ON ws_parameters.ws_id = ws_steps.id INNER JOIN submodule_params ON submodule_params.id = ws_parameters.sp_id AND wf.id =" + workflowId + " AND ws_steps.workflow_position =" + workstepPos);
String paramName = null;
String paramValue = null;
while (rs.next()) {
paramName = rs.getString("name");
if (rs.getString("param_value") == null) {
paramValue = rs.getString("pathname");
} else {
paramValue = rs.getString("param_value");
}
hMap.put(paramName, paramValue);
}
statement.close();
rs.close();
return hMap;
} catch (SQLException ex) {
Logger.getLogger(DBAccessor.class.getName()).log(Level.SEVERE, null, ex);
System.out.println("Error fetching workflow step parameters names");
}
return Collections.emptyMap();
}
Having this code in mind I end up with the following "procedure" to initialize a Workflow with all its WorkflowSteps and their Parameters:
Workflow wf = db.getPendingWorkflowId();
wf.initSteps(db.getUnworkedStepsByWFId(wf.getId()));
Iterator<WorkflowStep> it = wf.getSteps();
while(it.hasNext()) {
WorkflowStep step = it.next();
step.setParameters(db.getParametersForStep(wf.getId(), step.getPosInWorkflow()));
}
I think I have a good level of decoupling, but I wonder if this can be refactored somehow - for example, I could probably move step.setParameters into a method of the WorkflowStep class, but then I would have to pass a reference to the database connection (db) to a WorkflowStep object, and in my view that would break the decoupling. So how would you refactor this code?
It seems that you are rolling your own ORM. My suggestion would be to use one of the existing ones, like Hibernate.
This is the function of an Object Relational Mapper. It serves to abstract your DB access away from your business model. In fact, used properly, an ORM library allows you to write no database code at all.
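To give a feel for what that buys you, here is a minimal JPA-style mapping sketch (the table and column names are taken from your queries; everything else is illustrative, not a finished mapping):
import java.util.ArrayList;
import java.util.List;
import javax.persistence.*;

@Entity
@Table(name = "workflows")
public class Workflow {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private int id;

    private String status;

    // One workflow has many steps; the ORM loads and maps them, no hand-written SQL.
    @OneToMany(mappedBy = "workflow")
    private List<WorkflowStep> steps = new ArrayList<>();

    // getters and setters omitted
}

// in its own file:
@Entity
@Table(name = "workflow_steps")
public class WorkflowStep {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private int id;

    private String status;

    @ManyToOne
    @JoinColumn(name = "workflow_id")
    private Workflow workflow;
}
With a mapping like this, getPendingWorkflowId and getUnworkedStepsByWFId become short JPQL or criteria queries, and the rest of the application only ever sees Workflow and WorkflowStep objects.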
