Do I have to create another RowMapper? - java

I'm developing this application to fetch data from a single table in an existing Oracle database.
Here is the entity:
public class OrdemDeServicoCount {
private Long ordensInternas;
private Long ordensAtrasadas;
// assume getters and setters
}
The mapper:
public class OrdemMapper implements RowMapper<OrdemDeServicoCount> {
@Override
public OrdemDeServicoCount mapRow(ResultSet rs, int rowNum) throws SQLException {
OrdemDeServicoCount ordens = new OrdemDeServicoCount();
ordens.setOrdensInternas(rs.getLong("ordensInternas"));
// ordens.setOrdensAtrasadas(rs.getLong("ordensAtrasadas"));
return ordens;
}
}
And finally, the DAO:
public class OrdemDAO {
private JdbcTemplate jdbcTemplate;
public OrdemDAO(JdbcTemplate jdbcTemplate) {
super();
this.jdbcTemplate = jdbcTemplate;
}
public List<OrdemDeServicoCount> countOrdensInternasSemEncerrar() {
String sql = "SELECT COUNT(a.nr_sequencia) AS ordensInternas FROM MAN_ORDEM_SERVICO a "
+ "WHERE a.IE_STATUS_ORDEM IN (1,2) AND a.NR_GRUPO_PLANEJ IN (21)";
List<OrdemDeServicoCount> ordens = jdbcTemplate.query(sql, new OrdemMapper());
return ordens;
}
}
By the way, note that if I uncomment the line ordens.setOrdensAtrasadas(rs.getLong("ordensAtrasadas")); in the mapper, I get an error, because the query in my DAO does not select that column.
But what if I need to create another method whose query returns only the ordensAtrasadas column? Then again, I'd get an error, because the mapper reads ordensInternas.
So my question is: if I need to use the ordensAtrasadas field from the entity, will I have to create another class just to implement another mapper? Or is there a way to do some kind of conditional in my current OrdemMapper class?

Just put your assignments in individual try-catch statements.
public class OrdemMapper implements RowMapper<OrdemDeServicoCount> {
@Override
public OrdemDeServicoCount mapRow(ResultSet rs, int rowNum) throws SQLException {
OrdemDeServicoCount ordens = new OrdemDeServicoCount();
try {
ordens.setOrdensInternas(rs.getLong("ordensInternas"));
} catch (SQLException ex) {
// This will happen if the columnIndex is invalid among other things
}
try {
ordens.setOrdensAtrasadas(rs.getLong("ordensAtrasadas"));
} catch (SQLException ex) {
// This will happen if the columnIndex is invalid among other things
}
return ordens;
}
}
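Alternatively, if you would rather not rely on exceptions for control flow, you can check which columns the ResultSet actually contains via its metadata and only populate the matching fields, so a single mapper works for both queries. A minimal sketch, assuming the column aliases match the ones used in the queries above:
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import org.springframework.jdbc.core.RowMapper;

public class OrdemMapper implements RowMapper<OrdemDeServicoCount> {
    @Override
    public OrdemDeServicoCount mapRow(ResultSet rs, int rowNum) throws SQLException {
        OrdemDeServicoCount ordens = new OrdemDeServicoCount();
        ResultSetMetaData meta = rs.getMetaData();
        // Populate only the fields whose columns are present in this particular query
        for (int i = 1; i <= meta.getColumnCount(); i++) {
            String label = meta.getColumnLabel(i);
            if ("ordensInternas".equalsIgnoreCase(label)) {
                ordens.setOrdensInternas(rs.getLong(i));
            } else if ("ordensAtrasadas".equalsIgnoreCase(label)) {
                ordens.setOrdensAtrasadas(rs.getLong(i));
            }
        }
        return ordens;
    }
}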

Related

RowMapper vs ResultSetExtractor in Spring

I'm facing a weird issue with Spring JDBC's RowMapper.
Here is my code:
public void test(){
String sql=" Query will fetch records where dateColumn<='2021-08-17' Limit 1";
jdbcTemplate.query(sql, new ModelRowMapper());
}
public class ModelRowMapper implements RowMapper<ModelRowMapper> {
@Override
public ModelRowMapper mapRow(ResultSet rs, int rowNum) throws SQLException {
ModelRowMapper model= new ModelRowMapper();
System.out.println(rs.getString("value"));
return model;
}
}
Example:
DB records:
2021-08-21
2021-08-15
2021-08-13
The output I'm expecting is 2021-08-15.
In the ModelRowMapper class, I observed that the ResultSet prints two values (the first one is valid: 2021-08-15), then it prints an invalid value, and the response also contains the invalid value.
But the above query works properly when I use a ResultSetExtractor:
jdbcTemplate.query(sql, new ResultSetExtractor<String>() {
@Override
public String extractData(ResultSet rs) throws SQLException, DataAccessException {
while (rs.next()) {
System.err.println(rs.getString("value"));
}
//prints only one value and returns the same value
return "";
}
});
What could be the issue with the RowMapper?
Any suggestions would be helpful.
You have somehow misunderstood how the RowMapper should be used. Use the following syntax; it should give you the desired result.
public void test(){
String sql=" Query will fetch records where dateColumn<='2021-08-17' Limit 1";
jdbcTemplate.query(sql, new RowMapper<ModelRowMapper>(){
@Override
public ModelRowMapper mapRow(ResultSet rs, int rowNum) throws
SQLException {
ModelRowMapper model= new ModelRowMapper();
System.out.println(rs.getString("value"));
return model;
}
});
}
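As a side note, a RowMapper is normally parameterized on the model type you want back, not on the mapper class itself, and each mapRow call must return the mapped row. A minimal sketch, assuming the column really is called value and a plain String per row is enough:
// Hypothetical sketch: map each row's "value" column to a String
List<String> values = jdbcTemplate.query(sql,
        (rs, rowNum) -> rs.getString("value"));
System.out.println(values); // one entry per row returned by the query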

Jdbi and Inheritance: Conditional Mapping?

I have a single table called Tags that stores a "Tag" as a row, regardless of what specific subclass they represent. Some rows represent modbus tags, some snmp, some other protocols. All classes inheriting from Tag store their data in this one table, and unused columns simply contain null values.
At the moment, I have DAO methods like, getAllModBusTags() which contains an instruction mapToBean(ModBusTag.class). Eventually all of the subclasses of Tag are fetched from the database (one fetch per protocol) and then added to an ArrayList of the supertype Tag.
My question is, is there a simple means with Jdbi to perform conditional mapping of rows so that if a row contains a specific value, it is mapped to ModBusTag.class but if a row contains a different value it is mapped to SNMPTag.class, and so on and so forth?
My end goal is to have a single select statement that fetches every tag from the database, automaps to the correct bean on a row by row basis and then stores all of these subclass beans in a List of the supertype Tag.
Example Method for Single Type:
@Override
public List<SNMPTag> getSNMPTags(){
try(Handle handle = daoFactory.getDataSourceController().open()) {
return handle.createQuery("SELECT * FROM dbo.Tags WHERE Active = 1 AND Protocol = 'SNMP'")
.mapToBean(SNMPTag.class)
.list();
}
catch(Exception e){
if(sysconfig.getVerbose()){ e.printStackTrace(); }
}
return null;
}
Some bad pseudocode to indicate what I want to do:
@Override
public List<Tag> getAllTags(){
try(Handle handle = daoFactory.getDataSourceController().open()) {
return handle.createQuery("SELECT * FROM dbo.Tags WHERE Active = 1")
.mapRows(row -> row.Protocol.equals("SNMP").mapToBean(SNMPTag.class)
.mapRows(row -> row.Protocol.equals("ModBus").mapToBean(ModBusTag.class)
//etc
.list();
}
catch(Exception e){
if(sysconfig.getVerbose()){ e.printStackTrace(); }
}
return null;
}
You can use a RowMapper with a small amount of custom code to achieve what you need; we successfully use this approach in our project. Here is a simplified, general example of the technique:
public class PolymorphicRowMapper implements RowMapper<Parent> {
@Override
public Parent map(ResultSet rs, StatementContext ctx) throws SQLException {
Type type = Type.valueOf(rs.getString("type"));
if (type == Type.A) {
return mapTo(rs, ctx, ChildA.class);
} else if (type == Type.B) {
return mapTo(rs, ctx, ChildB.class);
}
throw new IllegalStateException("Could not resolve mapping strategy for object");
}
private static <T extends Parent> T mapTo(
ResultSet rs,
StatementContext ctx,
Class<T> targetClass
) throws SQLException {
return ctx.getConfig().get(Mappers.class)
.findFor(targetClass)
.orElseThrow(() ->
new NoSuchMapperException(String.format("No mapper registered for %s class", targetClass))
)
.map(rs, ctx);
}
}
public static void main(String[] args) {
var jdbi = Jdbi.create("...")
.registerRowMapper(BeanMapper.factory(ChildA.class))
.registerRowMapper(BeanMapper.factory(ChildB.class));
try (Handle handle = jdbi.open()) {
handle.createQuery("SELECT * FROM table")
.map(new PolymorphicRowMapper());
}
}
public enum Type {
A, B
}
public abstract class Parent {
final Type type;
protected Parent(final Type type) {
this.type = type;
}
}
public class ChildA extends Parent {
public ChildA() {
super(Type.A);
}
}
public class ChildB extends Parent {
public ChildB() {
super(Type.B);
}
}
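With the bean mappers registered as above, a single query can then produce the mixed list the question asks for. A short usage sketch, reusing the hypothetical table name from the answer:
try (Handle handle = jdbi.open()) {
    List<Parent> all = handle.createQuery("SELECT * FROM table")
            .map(new PolymorphicRowMapper())
            .list();
}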

Java SwingWorker load data from database to List

I have a problem with my MVC application that displays data in a JTable. Everything worked fine, but I decided to add a SwingWorker to retrieve data from the database.
My controller calls the model, which loads the data from the database. It looks like this:
Model.java
public class Model {
private List<Category> people = new Vector<Category>();
public List<Category> getPeople() {
return new ArrayList<Category>(people);
}
public void load() throws Exception {
people.clear();
DAOFactory factory = DAOFactory.getFactory(DAOFactory.MYSQL);
CategoryDAO personDAO = factory.getCategoryDAO();
people.addAll(personDAO.getCategory());
}
}
I added a SwingWorker to the class containing getCategory:
MySQLCategoryDAO.java
public class MySQLCategoryDAO extends SwingWorker<Void, Vector<Object>> implements CategoryDAO{
private Job job;
private List<Category> cat;
public MySQLCategoryDAO(Job job){
this.job = job;
}
@Override
protected Void doInBackground() throws Exception {
// TODO Auto-generated method stub
if(job == Job.SELECT){
getCategory();
System.out.println("Table selected");
}
return null;
}
@Override
public void done(){
}
public List<Category> getCategory() throws SQLException
{
cat = new ArrayList<Category>();
Connection conn = Database.getInstance().getConnection();
System.out.println(conn);
String sql = "select id, name from kategorie";
Statement selectStatement = conn.createStatement();
ResultSet results = selectStatement.executeQuery(sql);
while(results.next())
{
int id = results.getInt("id");
String name = results.getString("name");
Category category = new Category(id, name);
cat.add(category);
}
results.close();
selectStatement.close();
return cat;
}
}
View just retrieves the data from the model:
people = model.getPeople();
for (Category person : people) {
tablemodel
.addRow(new Object[] { person.getId(), person.getName() });
}
The problem comes when the SwingWorker is called in Model.java:
public void load() throws Exception {
people.clear();
DAOFactory factory = DAOFactory.getFactory(DAOFactory.MYSQL);
CategoryDAO personDAO = factory.getCategoryDAO();
people.addAll(new MySQLCategoryDAO(Job.SELECT).execute()); - ERROR
}
Error:
The method addAll(Collection<? extends Category>) in the type List<Category> is not applicable for the
arguments (void)
I know execute() returns nothing, which is why there is an error. I suppose I should write the code in the done() method, but I have no idea how to solve it.
execute does not have a return value so it can't be used in the way you are trying to use it. The idea of SwingWorker is that the task should be executed asynchronously so you need to rework your design.
The SwingWorker bears a result (the List<Category>) and you either need to:
put the result somewhere from inside the SwingWorker (such as with the publish mechanism)
or call get from the outside to wait for the task to finish and return.
Here is the tutorial for review: http://docs.oracle.com/javase/tutorial/uiswing/concurrency/worker.html
Quick example:
class MySQLCategoryDAO extends SwingWorker<Void, Category> {
// ...
private List<Category> list; // do not modify inside doInBackground
MySQLCategoryDAO(Job job, List<Category> list) {
this.list = list;
// ...
}
@Override
protected Void doInBackground() {
// ...
while(results.next()) {
int id = results.getInt("id");
String name = results.getString("name");
publish(new Category(id, name)); // publish results to the EDT
}
// ...
return null;
}
@Override
protected void process(List<Category> chunks) {
list.addAll(chunks); // add results to the list on the EDT
// add to the JTable (?)
}
}
public void load() throws Exception {
people.clear();
DAOFactory factory = DAOFactory.getFactory(DAOFactory.MYSQL);
CategoryDAO personDAO = factory.getCategoryDAO();
// just execute
new MySQLCategoryDAO(Job.SELECT, people).execute();
}
If you want to populate the entire table at once, you can also publish a List after the loop instead of one Category at a time; process would then receive a List<List<Category>> with a single element.
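If you prefer the get() approach mentioned above instead of publish/process, the worker can carry the whole list as its result type and hand it over in done(), which runs on the EDT. A minimal sketch, assuming the same Category and Job types:
class MySQLCategoryDAO extends SwingWorker<List<Category>, Void> {
    @Override
    protected List<Category> doInBackground() throws Exception {
        List<Category> cat = new ArrayList<>();
        // ... run the query as in getCategory() and fill cat ...
        return cat;
    }

    @Override
    protected void done() {
        try {
            List<Category> result = get(); // safe here: doInBackground has already finished
            // hand result to the model / table model on the EDT
        } catch (InterruptedException | ExecutionException ex) {
            ex.printStackTrace();
        }
    }
}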
Sorry, my mistake.
The view does get to model.getPeople(), but nothing is returned. I did a test, and no data comes back:
public class Model {
private List<Category> people = new Vector<Category>();
public List<Category> getPeople() {
for (Category person : people) {
System.out.println(person.getName()); //no data
}
return new ArrayList<Category>(people);
}
public void load() throws Exception {
people.clear();
DAOFactory factory = DAOFactory.getFactory(DAOFactory.MYSQL);
new MySQLCategoryDAO(Job.SELECT,people).execute();
}
}

JdbcDaoSupport with a SQL SELECT FROM INSERT

I am trying to create a "select from insert" within my Spring JdbcDaoSupport class and am having trouble figuring out how to get the data from the select statement and return it.
My EventJdbcTemplate (my DaoImpl):
#Service
public class EventJdbcTemplate extends JdbcDaoSupport implements EventDao {
private static final Logger LOGGER = Logger.getLogger(EventJdbcTemplate.class);
private static final String SQL_INSERT_EVENT = "SELECT EVENT_ID FROM FINAL TABLE " +
"(INSERT INTO EBT10DBB.SB0401T0 (EVENT_NAME, HOST_NAME, USER_ID) " +
"VALUES(?, ?, \'EMP0321\'))";
#Autowired
public EventJdbcTemplate(DataSource pDataSource) {
super.setDataSource(pDataSource);
}
@Override
public Integer createEvent(EventBean pEventBean) { //(Integer id, String eventName)
if (LOGGER.isTraceEnabled()) {
LOGGER.trace("Entering create(Event event) of EventJDBCTemplate.");
}
// This SQL works, but is for an INSERT only.
/*this.getJdbcTemplate().query(SQL_INSERT_EVENT, new Object[]{
pEventBean.getEventName(),
pEventBean.getHostName()
});*/
final List eventList = this.getJdbcTemplate().query(SQL_INSERT_EVENT, new Object[]{
pEventBean.getEventName(),
pEventBean.getHostName()
}, new EventRowMapper()
);
Event event = null;
for (int i = 0; i < eventList.size(); i++) {
event = (Event)eventList.get(i);
}
if (LOGGER.isTraceEnabled()) {
LOGGER.trace("Exiting create(Event event) of EventJDBCTemplate.");
}
//return statement -- should return either the entire "pEventBean", or
//just the unique key, "EVENT_ID".
return event.getId();
}
EventRowMapper class (not sure if I'll need this for the select or not):
public class EventRowMapper implements RowMapper<Event> {
@Override
public Event mapRow(ResultSet rs, int rowNum) throws SQLException {
final EventBuilder event = new EventImpl.EventBuilder();
event.setId(rs.getInt("EVENT_ID"));
event.setEventName("EVENT_NAME");
event.setHostName("HOST_NAME");
return event.build();
}
}
So my goal is to return an Integer value that would be the unique key (EVENT_ID) that is created from the INSERT SQL.
You can use SimpleJdbcInsert, provided by Spring, to get back generated keys. See the Spring documentation, section 13.5.2, "Retrieving auto-generated keys using SimpleJdbcInsert".
Here is the link.
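For reference, a minimal sketch of what that could look like here, reusing the table and column names from the question and the getDataSource() inherited from JdbcDaoSupport; treat the exact wiring as an assumption:
import java.util.HashMap;
import java.util.Map;
import org.springframework.jdbc.core.simple.SimpleJdbcInsert;

public Integer createEvent(EventBean pEventBean) {
    SimpleJdbcInsert insert = new SimpleJdbcInsert(getDataSource())
            .withTableName("EBT10DBB.SB0401T0")
            .usingGeneratedKeyColumns("EVENT_ID");

    Map<String, Object> params = new HashMap<>();
    params.put("EVENT_NAME", pEventBean.getEventName());
    params.put("HOST_NAME", pEventBean.getHostName());
    params.put("USER_ID", "EMP0321");

    // executeAndReturnKey runs the insert and returns the generated EVENT_ID
    return insert.executeAndReturnKey(params).intValue();
}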

Java Data-Entity model: Constructing general types

I have had some trouble using a generic type in a static method.
All comments on the source code are welcome, especially ones that significantly improve the code. I am also currently not planning on using any external framework apart from JDBC, to keep things simple, so please do not put too much emphasis on that.
My view on not using external frameworks is also supported by the fact that the operations I will be using on the database are very minimal:
Inserting data
Updating data
Retrieving all fields. (And simply by putting in a different SQL query you could already select which fields to retrieve.)
I do not plan on making a full framework, so I know that it will not support everything. The speed of retrieving all fields is not a real issue either, as this will pretty much only be done on server bootup, and if it is used at any other time it will run in a background task where I do not really care when it finishes.
Entity.java:
abstract public class Entity<KeyType, DataType> {
protected KeyType key;
protected List<Object> data;
public Entity() {
data = new ArrayList<>();
}
//abstract public static Map<KeyType, DataType> getAll();
protected List<Object> createData(final DataAction dataAction) {
List<Object> list = new ArrayList<>();
if (dataAction == DataAction.INSERT) {
list.add(key);
}
list.addAll(data);
if (dataAction == DataAction.UPDATE) {
list.add(key);
}
return list;
}
abstract public void insert();
abstract public void update();
protected static <KeyType, DataType> Map<KeyType, DataType> getData(final Class<DataType> dataTypeClass, final String query) {
Map<KeyType, DataType> map = new HashMap<>();
try {
PreparedStatement preparedStatement = DatabaseConnection.getConnection().prepareStatement(query);
ResultSet resultSet = preparedStatement.executeQuery();
while (resultSet.next()) {
KeyType key = (KeyType)resultSet.getObject(1);
int index = 2;
List<Object> dataList = new ArrayList<>();
while (resultSet.getObject(index) != null) {
dataList.add(resultSet.getObject(index));
index++;
}
DataType dataObject = null;
try {
dataObject = dataTypeClass.getConstructor(List.class).newInstance(dataList);
} catch (InstantiationException | IllegalAccessException | IllegalArgumentException | InvocationTargetException | NoSuchMethodException | SecurityException ex) {
Logger.getLogger(Entity.class.getName()).log(Level.SEVERE, null, ex);
}
map.put(key, dataObject);
}
} catch (SQLException ex) {
Logger.getLogger(Entity.class.getName()).log(Level.SEVERE, null, ex);
}
return map;
}
protected void executeQuery(final String query, final List<Object> data) {
try {
PreparedStatement preparedStatement = DatabaseConnection.getConnection().prepareStatement(query);
int dataIndex = 1;
for (Object dataObject : data) {
preparedStatement.setObject(dataIndex, dataObject);
dataIndex++;
}
preparedStatement.execute();
preparedStatement.close();
} catch (SQLException ex) {
Logger.getLogger(Entity.class.getName()).log(Level.SEVERE, null, ex);
}
}
}
A concrete implementation, Account.java:
public class Account extends Entity<String, Account> {
private final static String SELECT_ALL_QUERY = "SELECT * FROM accounts";
private final static String INSERT_QUERY = "INSERT INTO accounts (username, password) VALUES(?, ?)";
private final static String UPDATE_QUERY = "UPDATE accounts SET password=? WHERE username=?";
private String username;
private String password;
public Account(final String username, final String password) {
this.username = username;
this.password = password;
key = username;
data.add(password);
}
public Account(final List<Object> data) {
this((String)data.get(0), (String)data.get(1));
}
public String getUsername() {
return username;
}
public void setUsername(final String username) {
this.username = username;
}
public String getPassword() {
return password;
}
public void setPassword(final String password) {
this.password = password;
}
public static Map<String, Account> selectAll() {
return getData(Account.class, SELECT_ALL_QUERY);
}
@Override
public void insert() {
executeQuery(INSERT_QUERY, createData(DataAction.INSERT));
}
@Override
public void update() {
executeQuery(UPDATE_QUERY, createData(DataAction.UPDATE));
}
}
I am generally happy with the concrete implementation; it seems like I have managed to bring it down to a bare minimum, except that public Account(final List<Object> data) does not seem that nice, but I can live with it.
However, as you may have guessed, getData() in Entity is definitely not nice, and I would like to improve it if possible.
What I would like to use is something like DataType dataObject = new DataType(dataList), but it seems like Generic Type Arguments cannot be instantiated.
So are there any ways of optimizing my current code in my current view? And is it possible to decouple the concrete classes and abstract classes even more?
EDIT:
Added a relevant question (I don't think I should make an entirely new question for this, right?):
Is there a way to move the static Strings (SQL Queries) and the insert() and update() out of the Account class, into the Entity class?
To avoid the use of reflection in your getData method, you should accept a factory that, given a ResultSet, creates instances of the specific type. Your selectAll method would then be something like:
public static Map<String, Account> selectAll()
{
return getData(
new EntityFactory<Account>()
{
public Account newInstance(ResultSet resultSet) throws SQLException
{
return new Account(resultSet.getString(1), resultSet.getString(2));
}
},
SELECT_ALL_QUERY
);
}
The getData method then ends up something like:
protected static <K, T extends Entity<K>> Map<K, T> getData(EntityFactory<T> entityFactory, String query) throws SQLException
{
Connection connection = null;
PreparedStatement preparedStatement = null;
ResultSet resultSet = null;
try
{
connection = dataSource.getConnection();
preparedStatement = connection.prepareStatement(query);
resultSet = preparedStatement.executeQuery();
Map<K, T> entities = new HashMap<>();
while (resultSet.next())
{
T entity = entityFactory.newInstance(resultSet);
entities.put(entity.getKey(), entity);
}
return entities;
}
finally
{
closeQuietly(resultSet);
closeQuietly(preparedStatement);
closeQuietly(connection);
}
}
And assumes the Entity looks like:
public interface Entity<K>
{
public K getKey();
}
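The EntityFactory used above isn't shown in the answer; a minimal sketch of what it might look like:
public interface EntityFactory<T>
{
    T newInstance(ResultSet resultSet) throws SQLException;
}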
This allows you to remove the reflection and keeps the code that understands the database structure in one place. You should also use a similar template pattern to map from the domain object to the prepared statement when doing inserts and updates.
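For the insert/update direction, that template could be a small binder that maps a domain object onto a PreparedStatement. The names here are hypothetical:
public interface EntityBinder<T>
{
    void bind(T entity, PreparedStatement statement) throws SQLException;
}

public class AccountInsertBinder implements EntityBinder<Account>
{
    @Override
    public void bind(Account account, PreparedStatement statement) throws SQLException
    {
        // Parameter order matches INSERT_QUERY: (username, password)
        statement.setString(1, account.getUsername());
        statement.setString(2, account.getPassword());
    }
}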
Now you've asked for comments on the code in general.
First off, code like this violates the Single Responsibility Principle and Separation of Concerns. A domain class should be a domain class and not contain persistence logic. Look at patterns like the Data Access Object (DAO) for how this should be done.
Second, while I'm all for keeping it simple, Hibernate solved this problem a long time ago and JPA standardized it - you need a very good reason not to use one or both of these APIs.
Finally, your use of database resources: if you are going to use JDBC directly, you have to clean up properly. Database connections are expensive resources and should be handled as such. The basic template for any JDBC call should be:
Connection connection = null;
PreparedStatement preparedStatement = null;
ResultSet resultSet = null;
try
{
connection = //get connection from pool or single instance.
preparedStatement = connection.prepareStatement("SELECT * FROM table WHERE column = ?");
preparedStatement.setString(1, "some string");
resultSet = preparedStatement.executeQuery();
while (resultSet.next())
{
//logic goes here.
}
}
catch (SQLException e)
{
//Handle exceptions.
}
finally
{
closeQuietly(resultSet);
closeQuietly(preparedStatement);
closeQuietly(connection);
}
The closeQuietly method has to be overloaded but should take the general form:
try
{
if (resultSet != null)
{
resultSet.close();
}
}
catch (SQLException e)
{
//Log exceptions but don't re-throw.
}
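Since Connection, Statement and ResultSet all implement AutoCloseable (as of Java 7), one way to avoid three nearly identical overloads is a single helper; a small sketch:
private static void closeQuietly(AutoCloseable resource)
{
    try
    {
        if (resource != null)
        {
            resource.close();
        }
    }
    catch (Exception e)
    {
        //Log exceptions but don't re-throw.
    }
}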
Well, as Darwind and Nick Holt told you, in a normal situation you should use JPA, which is the Java standard specification for object-relational mapping. You can use Hibernate, EclipseLink or any other implementation behind it; they are designed to manage connections and transactions for you. In addition, using standards rather than exotic frameworks means you can get help from the community more easily.
Another option is using Spring JDBC, which is quite light and facilitates many things.
Anyway, I suppose you did this for learning purposes, so let's try to go further.
First, I think you should separate the classes in charge of retrieving the data (call them managers or Data Access Objects, DAOs) from the entities representing the data themselves.
For me, using the class to get all the data as you did isn't a problem in itself. The problem is that the position of the key is hardcoded. It should not be determined generically (I mean, the same way for every Entity implementation): that makes queries prone to bugs when the first field is not the key (are you sure a SELECT * FROM ... will ALWAYS return the key in the first position?) or when there is a composite key.
I think a better solution is to create a Mapper interface and implement it for each entity.
public interface RecordMapper<KeyType, DataType> {
public void appendToMap(ResultSet resultSet, Map<KeyType, DataType> map) throws SQLException;
}
The implementation of the mapper should be in charge of instantiating your entity, retrieving the key from the ResultSet, populating your entity and putting it in the map you expect.
public class AccountMapper implements RecordMapper<String, Account>{
public void appendToMap(ResultSet resultSet, Map<String, Account> accounts) throws SQLException {
String user = resultSet.getString("username");
String pwd = resultSet.getString("password");
Account account = new Account(user, pwd);
accounts.put(user, account);
}
}
As I said, you should move your data access methods into a DAO:
public class DAO{
public <KeyType, DataType> Map<KeyType, DataType> getData(final RecordMapper<KeyType, DataType> mapper, final String query) {
Map<KeyType, DataType> map = new HashMap<>();
PreparedStatement preparedStatement = null;
ResultSet resultSet = null;
try {
preparedStatement = DatabaseConnection.getConnection().prepareStatement(query);
resultSet = preparedStatement.executeQuery();
while (resultSet.next()) {
mapper.appendToMap(resultSet, map);
}
} catch (SQLException ex) {
Logger.getLogger(DAO.class.getName()).log(Level.SEVERE, null, ex);
} finally {
if(resultSet != null){
try{resultSet.close();} catch (Exception e){}
}
if(preparedStatement!= null){
try{preparedStatement.close();} catch (Exception e){}
}
}
return map;
}
public void executeQuery(final String query, final List<Object> data) {
PreparedStatement preparedStatement = null;
try {
preparedStatement = DatabaseConnection.getConnection().prepareStatement(query);
int dataIndex = 1;
for (Object dataObject : data) {
preparedStatement.setObject(dataIndex, dataObject);
dataIndex++;
}
preparedStatement.execute();
} catch (SQLException ex) {
Logger.getLogger(DAO.class.getName()).log(Level.SEVERE, null, ex);
} finally {
if(preparedStatement!= null){
try{preparedStatement.close();} catch (Exception e){}
}
}
}
}
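Putting the pieces together, usage would then look something like this (a sketch; because the query names its columns explicitly, the column order no longer matters):
DAO dao = new DAO();
Map<String, Account> accounts =
        dao.getData(new AccountMapper(), "SELECT username, password FROM accounts");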
To answer your second question, I think that putting your query strings in the abstract parent instead of the concrete class is certainly not a good idea. Each time you create a new entity, you would have to add a new query to the parent. Weird... unless I haven't understood your question properly.
Personally, I think the queries should be built dynamically and you should use reflection and annotations, but that answer would get a bit long. Once again, you can take a look at JPA to see what creating an entity looks like. By the way, it would be even better if the entities didn't have to extend a parent Entity class.
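For comparison, a JPA entity for the same accounts table would be a plain annotated class that doesn't extend anything. A minimal sketch using javax.persistence annotations:
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;

@Entity
@Table(name = "accounts")
public class Account {
    @Id
    private String username;

    private String password;

    protected Account() {
        // JPA requires a no-arg constructor; getters and setters omitted
    }
}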
