Spring - ehcache doesn't work properly - java

I have a problem with ehcache in my application. I want to cache two methods that run two different queries against the DB. The problem is that the data returned by the second method ends up stored with the data of the first method, and when the user makes multiple requests the data is duplicated every time.
For example:
First call ->
method 1 returns 0 items
method 2 returns 2 items
Second call -> The methods are cached and should just return the stored data, but...
method 1 returns 2 items ¿?
method 2 returns 2 items
Third call ->
method 1 returns 4 items ¿?
method 2 returns 2 items
Dao class:
public class DataDAOImpl extends JdbcDaoSupport implements DataDAO {

    @Autowired
    private JdbcTemplate jdbcTemplate1;

    @Autowired
    private JdbcTemplate jdbcTemplate2;

    @PostConstruct
    private void initialize() {
        setJdbcTemplate(jdbcTemplate1);
    }

    @Autowired
    private Environment env;

    @Cacheable("data_1")
    public List<Data> getData1(String data, String start_date, String end_date) {
        List<Data> list_data_1 = (List<Data>) jdbcTemplate1.query(
                env.getProperty("sql_data_1"),
                new BeanPropertyRowMapper<>(Data.class),
                data, start_date, end_date);
        return list_data_1;
    }

    @Cacheable("data_2")
    public List<Data> getData2(String data, String start_date, String end_date) {
        List<Data> list_data_2 = (List<Data>) jdbcTemplate2.query(
                env.getProperty("sql_data_2"),
                new BeanPropertyRowMapper<>(Data.class),
                data, start_date, end_date);
        return list_data_2;
    }
}
Main class:
List<Data> arrayData = new ArrayList<Data>();
arrayData = dataDAO.getData1(data, start_date, end_date);
arrayData.addAll(dataDAO.getData2(data, start_date, end_date));
Thank you so much!

The caching works fine; the problem is what you do with the returned result.
List<Data> arrayData = new ArrayList<Data>();
arrayData = dataDAO.getData1(data, start_date, end_date);
arrayData.addAll(dataDAO.getData2(data, start_date, end_date));
The code above updates the collection, without doing any defensive copy. Since you are most likely caching on heap, you are effectively modifying the content of what is cached.
So either you do the defensive copy before merging the collections:
List<Data> arrayData = new ArrayList<Data>(dataDAO.getData1(data, start_date, end_date));
arrayData.addAll(dataDAO.getData2(data, start_date, end_date));
or use Ehcache's configuration options so that the cache makes a copy for you each time something is read from it - see the documentation for version 2.x and the documentation for version 3.x.
Note that the code above is not null safe.
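For completeness, a null-safe variant of that merge might look like the following. This is only a sketch: it assumes you want to treat a null return from the DAO as an empty result (the posted methods normally return whatever jdbcTemplate.query gives back, which is an empty list rather than null).
List<Data> first = dataDAO.getData1(data, start_date, end_date);
List<Data> second = dataDAO.getData2(data, start_date, end_date);
// Defensive copy plus null checks, so neither the cached lists nor a null return can break the merge.
List<Data> arrayData = new ArrayList<>(first != null ? first : Collections.<Data>emptyList());
if (second != null) {
    arrayData.addAll(second);
}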

Related

JdbcTemplate does not return a proper resultset. Why?

I am making a discussion board with Spring.
I am using JdbcTemplate to populate the articles of the users from the database, but the JdbcTemplate's query method does not return the proper ResultSet. Interestingly, when I copy and paste the SQL query from the code to SQL Developer, it returns the proper results.
[Screenshot omitted: the same SQL query returns rows in SQL Developer.]
JdbcTemplate code:
public class ForumDao {

    private JdbcTemplate template;

    public ForumDao(DataSource dataSource) {
        template = new JdbcTemplate(dataSource);
    }

    public Collection<ForumArticle> getArticleList() {
        Collection<ForumArticle> list = template.query(
                "SELECT ARTICLE_ID, TITLE, NAME, VIEW_NUM, CREATED_DATE FROM MEMBER, FORUM WHERE MEMBER.ID = FORUM.MEMBER_ID",
                new RowMapper<ForumArticle>() {
                    @Override
                    public ForumArticle mapRow(ResultSet rs, int rowNum) throws SQLException {
                        ForumArticle article = new ForumArticle();
                        System.out.println("completeeeee--------------------------------------------------------------------");
                        article.setArticleID(rs.getInt("ARTICLE_ID"));
                        article.setTitle(rs.getString("TITLE"));
                        article.setName(rs.getString("NAME"));
                        article.setViewNum(rs.getLong("VIEW_NAME"));
                        article.setCreatedDate(rs.getTimestamp("CREATED_DATE").toLocalDateTime());
                        return article;
                    }
                });
        System.out.println("-dddddddddddddddddddddddddddddddddddd " + list.size());
        return list;
    }
}
All the configuration set-up is done properly and I am using Oracle DB. I have another DAO class for user data and its JdbcTemplate works perfectly.
When I run my code, the list.size() returns 0 instead of 4. It does not throw any exception.
What can be the possible solution for this issue?
The following line looks wrong:
article.setViewNum(rs.getLong("VIEW_NAME"));
VIEW_NAME should be VIEW_NUM, no?
What's probably happening is that when the above line executes, the code throws a SQLException due to an unknown column in the result set, which terminates the processing and gives you an empty result.
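Assuming the SELECT list stays as posted, the corrected mapper line would simply read the column that actually exists in the result set; nothing else in the mapper needs to change:
// Use the column name from the SELECT list (VIEW_NUM), not VIEW_NAME, which is not in the result set.
article.setViewNum(rs.getLong("VIEW_NUM"));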

Spring Batch executing stored proc conditionally

In my Spring Batch application, I am reading, processing and then trying to write to the database with an ItemWriter that calls a stored procedure.
Below is what my CSV file looks like; this is what I want to read, process and write:
Cob Date;Customer Code;Identifier1;Identifier2;Price
20180123;ABC LTD;BFSTACK;1231.CZ;102.00
My ItemWriter:
@Slf4j
public class MyDBWriter implements ItemWriter<Entity> {

    private final EntityDAO scpDao;

    public MyDBWriter(EntityDAO scpDao) {
        this.scpDao = scpDao;
    }

    @Override
    public void write(List<? extends Entity> items) {
        items.forEach(scpDao::insertData);
    }
}
My DAO implementation:
@Repository
public class EntityDAOImpl implements EntityDAO {

    @Autowired
    private JdbcTemplate jdbcTemplate;

    private SimpleJdbcCall simpleJdbcCall = null;

    @PostConstruct
    private void prepareStoredProcedure() {
        simpleJdbcCall = new SimpleJdbcCall(jdbcTemplate).withProcedureName("loadPrice");
        // declare params
    }

    @Override
    public void insertData(Entity scp) {
        Map<String, Object> inParams = new HashMap<>();
        inParams.put("Identifier1", scp.getIdentifier1());
        inParams.put("Identifier2", scp.getIdentifier2());
        inParams.put("ClosingPrice", scp.getClosingPrice());
        inParams.put("DownloadDate", scp.getDownloadDate());
        simpleJdbcCall.execute(inParams);
    }
}
My Stored procedure used to update is as follows:
ALTER PROCEDURE [dbo].[loadPrice]
    @Identifier1 VARCHAR(50),
    @Identifier2 VARCHAR(50),
    @ClosingPrice DECIMAL(28,4),
    @DownloadDate DATETIME
AS
SET NOCOUNT ON;

UPDATE p
SET ClosingPrice = @ClosingPrice
FROM Prices p
JOIN Instrument s ON s.SecurityID = p.SecurityID
WHERE CONVERT(date, @DownloadDate) = CONVERT(date, DownloadDate)
  AND s.Identifier1 = @Identifier1

IF @@ROWCOUNT = 0
    INSERT INTO dbo.Prices
    (
        SecurityID
        , ClosingPrice
        , DownloadDate
    )
    SELECT sec.SecurityID
        , @ClosingPrice
        , LEFT(CONVERT(VARCHAR, @DownloadDate, 112), 8)
    FROM dbo.Instrument sec
    WHERE sec.Identifier1 = @Identifier1
Given this setup, one of my requirements is that if I am unable to update/insert into the database using @Identifier1 (i.e. there is no SecurityID that matches Identifier1), I then need to update/insert using Identifier2 - a second-level match, if you like.
How can I do this in my DAO's insertData()? It is business logic, and I would prefer it in Java code instead of the stored proc, but I am keen to see examples of how this can be achieved.
How can I get back a result indicating whether a row was updated/inserted, and then decide whether or not to update/insert with the second identifier?
For the update I would change the WHERE clause to
WHERE CONVERT(date, @DownloadDate) = CONVERT(date, DownloadDate)
  AND (s.Identifier1 = @Identifier1 OR s.Identifier2 = @Identifier2)
and for the insert
WHERE sec.Identifier1 = @Identifier1 OR sec.Identifier2 = @Identifier2
That should work, even though I haven't verified it myself. I am assuming that the given values for Identifier1 and Identifier2 cannot match two different rows in the Instrument table.
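If you prefer to keep the decision in Java, as the question asks, one option is to issue the update/insert from the DAO with plain JdbcTemplate calls, since JdbcTemplate.update returns the number of affected rows. The sketch below is not the poster's actual code: the SQL strings, the simplified date handling, and the helper methods updatePrice/insertPrice are assumptions about the schema described above.
@Override
public void insertData(Entity scp) {
    // Try to update by Identifier1 first; the affected-row count drives the fallback.
    int updated = updatePrice("s.Identifier1 = ?", scp.getIdentifier1(), scp);
    if (updated == 0) {
        // Second-level match: retry the update with Identifier2.
        updated = updatePrice("s.Identifier2 = ?", scp.getIdentifier2(), scp);
    }
    if (updated == 0) {
        // Nothing matched for an update, so insert: Identifier1 first, then Identifier2.
        int inserted = insertPrice("sec.Identifier1 = ?", scp.getIdentifier1(), scp);
        if (inserted == 0) {
            insertPrice("sec.Identifier2 = ?", scp.getIdentifier2(), scp);
        }
    }
}

private int updatePrice(String matchClause, String identifier, Entity scp) {
    String sql = "UPDATE p SET ClosingPrice = ? "
            + "FROM Prices p JOIN Instrument s ON s.SecurityID = p.SecurityID "
            + "WHERE CONVERT(date, ?) = CONVERT(date, p.DownloadDate) AND " + matchClause;
    return jdbcTemplate.update(sql, scp.getClosingPrice(), scp.getDownloadDate(), identifier);
}

private int insertPrice(String matchClause, String identifier, Entity scp) {
    String sql = "INSERT INTO dbo.Prices (SecurityID, ClosingPrice, DownloadDate) "
            + "SELECT sec.SecurityID, ?, ? FROM dbo.Instrument sec WHERE " + matchClause;
    return jdbcTemplate.update(sql, scp.getClosingPrice(), scp.getDownloadDate(), identifier);
}
Alternatively, the existing procedure could be given an OUTPUT parameter set from @@ROWCOUNT and read from the map returned by simpleJdbcCall.execute(inParams), but that keeps the decision inside SQL rather than in Java.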

How to define a keyspace dynamically in an accessor

I am attempting to create an accessor to run slightly more complex queries in Cassandra with Java. I have no problem with the syntax, and I can get it to work, but my question is this: is there a way to dynamically declare a keyspace in an accessor?
For example, if you create a table mapping for the MappingManager, you declare the @Table annotation and give it the keyspace and table name like so:
@Table(keyspace = "mykeyspace", name = "orders")
public class Orders {
    @PartitionKey
    public UUID id;
    // blah blah blah, rest of code
}
Now creating an accessor for that specific table is easy enough:
@Accessor
public interface OrdersAccessor {
    @Query("SELECT * FROM orders WHERE status = :status")
    Result pending(@Param("status") Integer status);
}
Simple. The problem is that it demands a keyspace, and I am a huge fan of never hard-coding anything. I realize that I am "hard-coding" the keyspace in the @Table definition on the mapped class, but if need be I only change it there and it updates everything that has to do with that table. If I hard-code the keyspace in every single @Query definition inside the accessor, I will potentially have to change a bunch of different places if the keyspace gets updated, instead of changing it in one place in the @Table definition.
I have been searching Google for hours and I can't find a single instance of someone dynamically declaring a keyspace with an accessor, only thousands of examples of accessors where the keyspace is hard-coded into the @Query like so:
@Accessor
public interface OrdersAccessor {
    @Query("SELECT * FROM keyspace.orders WHERE status = :status")
    Result pending(@Param("status") Integer status);
}
I realize the query I wrote isn't really a reason to use an accessor; I was just simplifying it for the sake of the example. So I am coming to the community asking for help, since I can't find any examples of this anywhere. I can't imagine that I am the first person to ever want to do this, I just can't find any examples of anyone else tackling this problem. Thank you in advance for any help you can give; I can really use it.
@Sudhir Here is the solution I came up with. I am sure there are better ways to handle the connections, but I am still pretty new to Cassandra and Java, and this is working well for my needs. I hope this helps...
public class DbInterface {

    private Cluster cluster;
    private Map<String, Session> dbMap;
    // Holds both Mapper and Accessor instances per keyspace, hence the Object values.
    private Map<String, Map<String, Object>> mappers = new ConcurrentHashMap<>();

    public DbInterface(String host) {
        Map<String, Session> connections = createConnection(host);
        Session crSession = connections.get("crSession");
        Session hppSession = connections.get("hppSession");
        MappingManager crManager = new MappingManager(crSession);
        MappingManager hppManager = new MappingManager(hppSession);
        mappers.put("mykeyspace", new ConcurrentHashMap<>());
        mappers.put("mykeyspace2", new ConcurrentHashMap<>());
        Map<String, Object> cr = mappers.get("mykeyspace");
        Map<String, Object> hpp = mappers.get("mykeyspace2");
        cr.put("status", crManager.mapper(OrderStatus.class));
        hpp.put("status", hppManager.mapper(OrderStatus.class));
        cr.put("status_accessor", crManager.createAccessor(OrderStatusAccessor.class));
        hpp.put("status_accessor", hppManager.createAccessor(OrderStatusAccessor.class));
        cr.put("users", crManager.mapper(Users.class));
        hpp.put("users", hppManager.mapper(Users.class));
        cr.put("orders", crManager.mapper(Orders.class));
        hpp.put("orders", hppManager.mapper(Orders.class));
        cr.put("order_detail", crManager.mapper(OrderDetail.class));
        hpp.put("order_detail", hppManager.mapper(OrderDetail.class));
        cr.put("chal_orders", crManager.mapper(ChalOrder.class));
        hpp.put("chal_orders", hppManager.mapper(ChalOrder.class));
        cr.put("chal_order_detail", crManager.mapper(ChalOrderDetail.class));
        hpp.put("chal_order_detail", hppManager.mapper(ChalOrderDetail.class));
        cr.put("detail_accessor", crManager.createAccessor(OrderDetailAccessor.class));
        hpp.put("detail_accessor", hppManager.createAccessor(OrderDetailAccessor.class));
        cr.put("tracking_labels", crManager.mapper(TrackingLabels.class));
        hpp.put("tracking_labels", hppManager.mapper(TrackingLabels.class));
    }

    public Session getConnection(String type) {
        if (dbMap.containsKey(type)) {
            return dbMap.get(type);
        }
        if (dbMap.containsKey(type.toLowerCase() + "Session")) {
            return dbMap.get(type.toLowerCase() + "Session");
        }
        return dbMap.get("crSession");
    }

    public Map<String, Session> createConnection(String host) {
        dbMap = new HashMap<>();
        cluster = Cluster.builder().addContactPoint(host).build();
        Session crSession = cluster.connect("mykeyspace");
        Session hppSession = cluster.connect("hpp");
        dbMap.put("crSession", crSession);
        dbMap.put("hppSession", hppSession);
        return dbMap;
    }

    public Map<String, Object> getDBMap(String client) {
        if (mappers.containsKey(client)) {
            return mappers.get(client);
        }
        throw new RuntimeException("Unknown Client: " + client);
    }
}
One of the things I was thinking of doing is moving the session creation and Map creation to separate functions, then only connect and build the map for the session that is needed. Like instead of defaulting to connecting to both sessions when the DbInterface() is called, only connect to the session that is requested via the "host" param.
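As a rough sketch of that idea (not part of the original answer; the method and field names are hypothetical), the sessions could be created lazily per keyspace so that only the requested keyspace gets connected:
// Lazily create one Session per keyspace on first use.
private final Map<String, Session> sessions = new ConcurrentHashMap<>();

public Session getSession(String keyspace) {
    // computeIfAbsent connects only the first time a given keyspace is requested.
    return sessions.computeIfAbsent(keyspace, ks -> cluster.connect(ks));
}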
Anywho, I hope this helps you out. If you need it, here is an example of my other library that uses this...
public class MyRestController {

    private final DbInterface db = new DbInterface(IPADDRESS);

    @CrossOrigin
    @RequestMapping("/status")
    public String getStatus() {
        Map managerMap = db.getDBMap("mykeyspace");
        OrderStatusAccessor statuses = (OrderStatusAccessor) managerMap.get("status_accessor");
        Result<OrderStatus> allStatuses = statuses.getAll();
        // rest of the code here
    }
}

External object linked through foreign key in Hibernate and MySQL

I'm using Spring Data with Hibernate and MySQL and I have a doubt.
My entity is:
@Entity
@Table(name = "car", catalog = "DEMO")
public class Car implements java.io.Serializable {

    /**
     *
     */
    private static final long serialVersionUID = 1L;

    private Integer idCar;
    @JsonBackReference
    private CarType carType;
    @JsonBackReference
    private Fleet fleet;
    private String id;
    private int initialKm;
    private String carChassis;
    private String note;
    @JsonManagedReference
    private Set<Acquisition> acquisitions = new HashSet<Acquisition>(0);
with getter and setter methods.
Sometimes I need the linked object, such as carType, which is another entity.
If I use this web service
@Override
@RequestMapping(value = { "/cars/{idFleet}" }, method = RequestMethod.GET)
public String getCars(@PathVariable int idFleet, Model model) {
    try {
        model.addAttribute("carsList", fleetAndCarService.findCarsByIdFleet(idFleet));
        // Modal parameter
        model.addAttribute("carTypeList", fleetAndCarService.getCarsType());
        model.addAttribute("fleetApplication", fleetAndCarService.getFleetById(idFleet));
        model.addAttribute("carForm", new CarForm());
        model.addAttribute("error", false);
    } catch (Exception e) {
        LOG.error("Threw exception in FleetAndCarControllerImpl::getCars : " + ErrorExceptionBuilder.buildErrorResponse(e));
        model.addAttribute("error", true);
    }
    return "cars";
}
then from my HTML page I can retrieve carType.idCarType, but if I use this
@Override
@RequestMapping(value = { "/cars/{idFleet}" }, method = RequestMethod.GET)
public @ResponseBody TableUI getCars(@PathVariable int idFleet) {
    TableUI ajaxCall = new TableUI();
    try {
        ajaxCall.setData(fleetAndCarService.findCarsByIdFleet(idFleet));
        return ajaxCall;
    } catch (QueryException e) {
        ErrorResponse errorResponse = ErrorResponseBuilder.buildErrorResponse(e);
        LOG.error("Threw exception in FleetAndCarControllerImpl::addCar :" + errorResponse.getStacktrace());
        return ajaxCall;
    }
}
where TableUI has only a data field into which I put the result for use with DataTables, I don't get carType and fleet. Why? Do I have to use Hibernate.initialize, and if so, how, since it is a list? Thanks, regards.
Also this update doesn't work:
@Override
@Transactional
public List<Car> findByFleetIdFleet(int idFleet) {
    List<Car> carList = carRepository.findByFleetIdFleet(idFleet);
    for (Car car : carList) {
        Hibernate.initialize(car.getCarType());
    }
    return carList;
}
You could call Hibernate.initialize on each element
Collection<Car> cars = fleetAndCarService.findCarsByIdFleet(idFleet);
for (Car car : cars) {
    Hibernate.initialize(car.getCarType());
    Hibernate.initialize(car.getFleet());
}
ajaxCall.setData(cars);
return ajaxCall;
This would be a good starting point and would allow you to move forward. At high scale, however, this could become a performance bottleneck, as it performs a query with each call to initialize, so you will issue 2*n queries to the database.
For maximum performance you have several other options:
Iterate through the cars and build up a list of IDs, then query for the car types by ID in a single query with that list of IDs. Do the same for the fleets. Then call Hibernate.initialize. The first two queries populate the persistence context, and the calls to initialize will not need to go to the database.
Create a special query for this call which fetch-joins the properties you will need (see the sketch after this answer).
Set up batch fetching, which will fetch the cars and fleets in batches instead of one car/fleet per query.
Use a second-level cache so the initialization causes Hibernate to pull from the cache instead of the database.
Describing these options in details is beyond the scope of a single question but a good place to start would be Hibernate's documentation on performance.
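As a sketch of the fetch-join option mentioned above (the repository method and its name are assumptions, not the poster's code), a single JPQL query can load the cars together with their carType and fleet in one round trip:
public interface CarRepository extends JpaRepository<Car, Integer> {

    // Hypothetical fetch-join query: carType and fleet are loaded eagerly,
    // so the returned Car instances can be serialized without extra initialization.
    @Query("select distinct c from Car c "
         + "join fetch c.carType "
         + "join fetch c.fleet f "
         + "where f.idFleet = :idFleet")
    List<Car> findByFleetIdFleetFetchingAssociations(@Param("idFleet") int idFleet);
}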

JPA: Fetch data from DB instead of Persistence Context

I have a simple User Account application in which the user is able to change his details.
Updating the Database
The Managed Bean's method which takes the form parameters and calls the Service method:
public String changeDetails() {
    Date date = DateUtil.getDate(birthDate);
    Integer id = getAuthUser().getId();
    UserDetail newDetails = new UserDetail(id, occupation, date, originCity, residenceCity, description);
    EntityTransaction transaction = getTransaction();
    userService.updateDetail(newDetails);
    transaction.commit();
    return null;
}
The Service Method:
public boolean updateDetail(UserDetail newDetails) {
    boolean ok = true;
    if (newDetails != null) {
        UserDetail user = readDetail(newDetails.getId());
        user.setOccupation(newDetails.getOccupation());
        user.setOriginCity(newDetails.getOriginCity());
        user.setResidenceCity(newDetails.getResidenceCity());
        user.setBirth(newDetails.getBirth());
        user.setDescription(newDetails.getDescription());
    }
    return ok;
}
Fetching data from DB
@PostConstruct
public void init() {
    userService = new UserService();
    sessionController.setAuthUser(userService.read(getAuthUser().getId()));
    originCity = getAuthUser().getUserDetail().getOriginCity();
    residenceCity = getAuthUser().getUserDetail().getResidenceCity();
    occupation = getAuthUser().getUserDetail().getOccupation();
    birthDate = DateUtil.getStringDate(getAuthUser().getUserDetail().getBirth());
    description = getAuthUser().getUserDetail().getDescription();
}
The problem is that the behavior of this code is inconsistent. Sometimes I obtain the desired result: once I submit the new details and the @PostConstruct init() runs, the new details are printed. Other times the old details are printed, even though the DB entry is updated.
Conclusion: sometimes JPA brings me a different result from what is in the DB. I guess that this result consists of data from the Persistence Context, data which isn't updated. Is there a way in which I can make sure that JPA always brings the data directly from the DB? Or is there something I'm missing?
If you are using JPA 2, then @Cacheable(false) on your entity definition should make it read from the DB every time.
You mean, is there a way to turn the cache off or empty it before an operation?
emf.getCache().evictAll();
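For reference, a minimal sketch of the first suggestion (disabling shared caching for the entity; the entity name is taken from the question and the rest of the mapping is omitted):
import javax.persistence.Cacheable;
import javax.persistence.Entity;

// With shared caching disabled, a JPA 2 provider will not serve this entity
// from the second-level cache.
@Entity
@Cacheable(false)
public class UserDetail {
    // fields and mappings as in the question
}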
