I'm developing an application using Spring Boot and Spring Data JPA repositories.
The application logic is simple:
Get records from an external service
Check whether the record exists in my MySQL database
If it exists, do an update; if not, do an insert
This works, but performance degrades as more records are processed.
I enabled the MySQL query log to watch incoming queries, and I saw a lot of repeated updates.
The log entries follow this pattern:
(1) a lot of updates
(2) other statements
(3) the same updates as in (1), plus more updates
(4) other statements
(5) the same updates as in (3), plus more updates
I'm using a repository that extends CrudRepository, and I do the updates via the repository.save() method.
I'm saving objects of a subclass of Event (using a discriminator); those objects hold a HashMap that maps to a JSON column in MySQL.
I also observed that if I only change values in the HashMap (without invoking repository.save()), the database updates are still issued.
This is the code invoked each time I get a record from the external service:
else if (event.getType().equals(EventType.PAYMENT)) {
    PaymentEvent paymentEvent = (PaymentEvent) event;
    PaymentEvent oldPaymentEvent = eventRepository.getPaymentEvent(paymentEvent.getPaymentId(), paymentEvent.getTimestamp());
    if (oldPaymentEvent == null) {
        // save new event
        eventRepository.save(paymentEvent);
    } else {
        // this method updates the internal HashMap that maps to the JSON column in MySQL
        oldPaymentEvent.updateCustomPropsFrom(paymentEvent);
        // update the old event; if I comment this line out, the database updates are still
        // triggered (modifying the internal HashMap in the line above triggers the updates)
        eventRepository.save(oldPaymentEvent);
    }
}
This is the repository definition:
public interface EventRepository extends CrudRepository<Event, Long> {
}
And finally, this is the HashMap merge:
public void updateCustomPropsFrom(PaymentEvent event) {
    if (event.getCouponAmount() != null) {
        this.customProps.put(KEY_COUPON_AMOUNT, event.getCouponAmount());
    }
    if (event.getCurrencyId() != null) {
        this.customProps.put(KEY_CURRENCY_ID, event.getCurrencyId());
    }
    if (event.getPaymentId() != null) {
        this.customProps.put(KEY_PAYMENT_ID, event.getPaymentId());
    }
    if (event.getReceivedAmount() != null) {
        this.customProps.put(KEY_RECEIVED_AMOUNT, event.getReceivedAmount());
    }
    if (event.getPaymentType() != null) {
        this.customProps.put(KEY_PAYMENT_TYPE, event.getPaymentType());
    }
    if (event.getPaymentReason() != null) {
        this.customProps.put(KEY_PAYMENT_REASON, event.getPaymentReason());
    }
    if (event.getTimestamp() != null) {
        this.setTimestamp(event.getTimestamp());
    }
    if (event.getStoreId() != null) {
        this.setStoreId(event.getStoreId());
    }
    if (event.getUserId() != null) {
        this.setUserId(event.getUserId());
    }
}
I've never run into this problem before; can anyone help me?
Thanks in advance. Regards
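The symptom described above (UPDATE statements appearing even without an explicit save()) matches JPA/Hibernate dirty checking: inside a transaction, the persistence context snapshots each managed entity on load and, at flush time, issues an UPDATE for any entity whose current state differs from the snapshot, including changes made through a mutable HashMap field. A minimal plain-Java sketch of that snapshot-comparison idea (an illustration, not Hibernate's actual code):

```java
import java.util.HashMap;
import java.util.Map;

// Plain-Java sketch of the dirty-checking idea: the persistence context keeps
// a snapshot of each managed entity's state and, at flush time, issues an
// UPDATE for every entity whose current state differs from the snapshot --
// no explicit save() call is needed for the UPDATE to happen.
public class DirtyCheckSketch {
    public static void main(String[] args) {
        Map<String, Object> loadedState = new HashMap<>();
        loadedState.put("couponAmount", 10);

        // Snapshot taken when the entity was loaded.
        Map<String, Object> snapshot = new HashMap<>(loadedState);

        // Mutating the map (like updateCustomPropsFrom does) is enough...
        loadedState.put("couponAmount", 20);

        // ...because flush compares the current state against the snapshot.
        boolean dirty = !loadedState.equals(snapshot);
        System.out.println(dirty ? "UPDATE issued" : "no UPDATE");
    }
}
```

If the entity should not be flushed, common options include working with a detached copy or a read-only transaction, depending on the semantics you want.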
I've got a query that returns data from several tables. In the application, these tables map to nested classes: a Client has several Orders, each Order has several OrderDetails, each OrderDetail has several Products, and so on. But I can't figure out a proper way to build the entire object graph in the app, since the query returns one row for (let's say) each product, so the same client is repeated over and over for every product it has bought.
So far I've tried this terribly inefficient code. It works, but it takes too long for the app to process all of this information when several clients are retrieved.
boolean orderFound = false;
for (Order order1 : orders) {
    if (order1.getId() == order.getId()) {
        orderFound = true;
        if (od.getId() != 0) {
            boolean odFound = false;
            for (OrderDetail orderDetail : order1.getOrderDetail()) {
                if (orderDetail.getId() == od.getId()) {
                    odFound = true;
                    if (prod.getId() != 0) {
                        boolean prodFound = false;
                        for (Product product : orderDetail.getProducts()) {
                            if (product.getId() == prod.getId()) {
                                prodFound = true;
                            }
                        }
                        if (!prodFound) {
                            orderDetail.getProducts().add(prod);
                        }
                    }
                }
            }
            if (!odFound) {
                order1.getOrderDetail().add(od);
            }
        }
    }
}
if (!orderFound && order.getId() != 0) {
    orders.add(order);
}
This works, but there's got to be a better way and I haven't found it. I've been told this can be solved using HashSets, but I still don't know how to use them here. Any help will be appreciated.
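One common alternative to the nested loops above is to index each level by its id in a Map while iterating over the rows, so every row either finds its existing parent in O(1) or creates it. A minimal sketch with hypothetical Client/Order stand-ins (real code would carry the OrderDetail and Product levels the same way):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch: rebuild a Client -> Orders hierarchy from flat join rows by
// indexing each level in a Map, so lookups are O(1) instead of nested loops.
// Classes here are hypothetical stand-ins for the question's entities.
public class RowAssembler {
    static class Order {
        final long id;
        Order(long id) { this.id = id; }
    }
    static class Client {
        final long id;
        final List<Order> orders = new ArrayList<>();
        Client(long id) { this.id = id; }
    }

    public static void main(String[] args) {
        // Flat rows as returned by the join: {clientId, orderId}.
        long[][] rows = { {1, 10}, {1, 11}, {1, 10}, {2, 20} };

        Map<Long, Client> clients = new LinkedHashMap<>();
        Map<Long, Order> orders = new LinkedHashMap<>();
        for (long[] row : rows) {
            Client c = clients.computeIfAbsent(row[0], Client::new);
            // computeIfAbsent creates and links each order only once per id,
            // so the duplicated rows collapse automatically.
            orders.computeIfAbsent(row[1], id -> {
                Order o = new Order(id);
                c.orders.add(o);
                return o;
            });
        }
        System.out.println(clients.get(1L).orders.size()); // 2 distinct orders
        System.out.println(clients.size());                // 2 clients
    }
}
```

A HashSet works similarly (membership by equals/hashCode), but a Map keyed by id also gives you back the existing parent object to attach children to.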
If you are open to using third-party libraries, I think this is what you are looking for:
How to use hibernate to query for an object with a nested object that has a nested collection of objects
I am using the Room architecture component from Android Jetpack in my app. I have implemented a repository class where I manage my data sources, the server and the Room database. I use LiveData to get a list of all the objects present in my database, and I have attached an observer in my activity class. Everything works except one thing: before making a call to my server, I want to check whether the data is already present in Room. If it is, I don't want to call the server, to save resources. But when I try to get the data from the local database in the repository class, it always returns null. I have also tried attaching an observer to it, but to no avail.
public LiveData<List<AllbrandsdataClass>> getAllBrands() {
    brandsDao.getallbrands().observeForever(new Observer<List<AllbrandsdataClass>>() {
        @Override
        public void onChanged(@Nullable final List<AllbrandsdataClass> allbrandsdataClasses) {
            final List<AllbrandsdataClass> listofbrandsobjectfromdb = allbrandsdataClasses;
            if (listofbrandsobjectfromdb == null) {
                Log.d(TAG, "Repository getallbrands number of brands in the DB is: 0");
            } else {
                // perform the logic to check and then fetch from the server
            }
        }
    });
    return brandsDao.getallbrands();
}
Here is my getallbrands() method in the interface annotated as the DAO:
@Query("SELECT * FROM AllbrandsdataClass order by timeStamp desc")
LiveData<List<AllbrandsdataClass>> getallbrands();
What I want is to check the local database in the repository class before fetching the data from the server, but I am unable to do it when using LiveData as shown above.
Below I use two LiveData streams (income and expense) of type SumOfRowsFromDB (yours can be any type, depending on your business logic) in the repository class to produce a single LiveData, remainingIncome, of type Long.
First, I add both input LiveData objects as sources to the output LiveData remainingIncome, and in the lambda I set the value of the output to the result of the method defined below. Now, whenever either input LiveData changes, combinedResult(income, expense) is called, and I can update the output value according to my business logic.
public LiveData<Long> getRemainingIncome() {
    MediatorLiveData<Long> remainingIncome = new MediatorLiveData<>();
    LiveData<SumOfRowsFromDB> income = mainDashBoardDao.getTansWiseSum(Constants.TRANS_TYPES.get(2));
    LiveData<SumOfRowsFromDB> expense = mainDashBoardDao.getTansWiseSum(Constants.TRANS_TYPES.get(1));
    remainingIncome.addSource(income, value -> {
        remainingIncome.setValue(combinedResult(income, expense));
    });
    remainingIncome.addSource(expense, value -> {
        remainingIncome.setValue(combinedResult(income, expense));
    });
    return remainingIncome;
}

private Long combinedResult(LiveData<SumOfRowsFromDB> income, LiveData<SumOfRowsFromDB> expense) {
    if (income.getValue() != null && expense.getValue() != null) {
        return income.getValue().getSumOfRow() - expense.getValue().getSumOfRow();
    } else {
        return 0L;
    }
}
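Conceptually, MediatorLiveData is just a value holder that subscribes to its sources and recomputes a derived value whenever any of them changes. A plain-Java analogy of the pattern above, using a hypothetical Holder class rather than the Android API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Plain-Java analogy (not the Android API) of what MediatorLiveData does:
// a holder notifies observers on change, and a "mediator" subscribes to two
// holders and recomputes a derived value whenever either one changes.
public class MediatorSketch {
    static class Holder<T> {
        private T value;
        private final List<Consumer<T>> observers = new ArrayList<>();
        void observe(Consumer<T> o) { observers.add(o); }
        void setValue(T v) { value = v; observers.forEach(o -> o.accept(v)); }
        T getValue() { return value; }
    }

    public static void main(String[] args) {
        Holder<Long> income = new Holder<>();
        Holder<Long> expense = new Holder<>();
        Holder<Long> remaining = new Holder<>();

        // addSource equivalent: recompute on either input change,
        // falling back to 0 while one input is still missing.
        Runnable recompute = () -> {
            if (income.getValue() != null && expense.getValue() != null) {
                remaining.setValue(income.getValue() - expense.getValue());
            } else {
                remaining.setValue(0L);
            }
        };
        income.observe(v -> recompute.run());
        expense.observe(v -> recompute.run());

        income.setValue(100L);  // expense still null -> remaining = 0
        expense.setValue(30L);  // both set -> remaining = 70
        System.out.println(remaining.getValue());
    }
}
```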
I'm trying to save with greenDAO an entity called Hotel. Each hotel has a one-to-many relation with some agreements, and each agreement has got... well, a picture is worth a thousand words.
Now, what I do is the following:
daoSession.runInTx(new Runnable() {
    @Override
    public void run() {
        ArrayList<Hotel> listOfHotels = getData().getAvailability();
        for (Hotel h : listOfHotels) {
            List<HotelAgreement> hotelAgreements = h.getAgreements();
            for (HotelAgreement ha : hotelAgreements) {
                ha.setHotel_id(h.getHotel_id());
                HotelAgreementDeadline hotelAgreementDeadline = ha.getDeadline();
                List<HotelRemark> hr = hotelAgreementDeadline.getRemarks();
                List<HotelAgreementDeadlinePolicies> hadp = hotelAgreementDeadline.getPolicies();
                daoSession.getHotelReportDao().insertOrReplaceInTx(h.getReports());
                daoSession.getHotelPictureDao().insertOrReplaceInTx(h.getPictures());
                daoSession.getHotelRemarkDao().insertOrReplaceInTx(hr);
                daoSession.getHotelAgreementDeadlinePoliciesDao().insertOrReplaceInTx(hadp);
                daoSession.getHotelAgreementDeadlineDao().insertOrReplace(hotelAgreementDeadline);
                daoSession.getHotelAgreementDao().insertOrReplace(ha);
            }
            // daoSession.getHotelReportsDao().insertOrReplace( getData().getReports() );
        }
        daoSession.getHotelDao().insertOrReplaceInTx(listOfHotels);
    }
});
This, of course, does not work. I get an "Entity is detached from DAO context" error on the following line:
HotelAgreementDeadline hotelAgreementDeadline = ha.getDeadline();
I understand this happens because I try to get the Agreements from a Hotel entity that does not come from the database but from another source (a web service, in this case). But why does this happen with ha.getDeadline() and not with h.getAgreements()?
Now, I have the Hotel object and it includes pretty much all the data: agreements, deadline, policies, remarks, pictures, report. I'd just like to tell greenDAO: save it! And if I can't, and I have to cycle through the tree (which is what I'm trying to do with the code above), how am I supposed to do it?
Here I read that I have to "store/load the object first using a Dao". Pretty awesome, but... how does that work? I read the greenDAO documentation about relations but couldn't find anything.
Thank you to everybody who's willing to help :-)
At some point, when you get the response from the web service, you create new entity objects and fill them with the info. Try inserting each new object into the DB right after that.
If you want, you can insert, for example, all n Agreements for a Hotel using insertOrReplaceInTx, but you shouldn't use any relation before all the involved objects are in the DB.
I think the greenDAO team should add the following check to the generated getToOneField() method, like the one already present in getToManyList():
if (property == null) {
    // code already generated by the greenDAO plugin
}
return property;
So in your case, in the HotelAgreements class:
@Keep
public DeadLine getDeadLine() {
    if (deadLine == null) {
        long __key = this.deadLineId;
        if (deadLine__resolvedKey == null || !deadLine__resolvedKey.equals(__key)) {
            final DaoSession daoSession = this.daoSession;
            if (daoSession == null) {
                throw new DaoException("Entity is detached from DAO context");
            }
            DeadLineDao targetDao = daoSession.getDeadLineDao();
            DeadLine deadLineNew = targetDao.load(__key);
            synchronized (this) {
                deadLine = deadLineNew;
                deadLine__resolvedKey = __key;
            }
        }
    }
    return deadLine;
}
Adding the check
if (deadLine == null) {
    ...
}
means that when you receive data from a REST/JSON response, the object is already populated, so the getter returns the field from the object rather than from the database, just as the generated code already does for lists. You can then insert or replace it. Later, when you load the object (or load it deeply) from the DB, the field is null and greenDAO fetches it from the database.
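The pattern proposed above (return the in-memory field if it is already set; otherwise resolve it from the store once and cache it) can be sketched in plain Java, with hypothetical names and no greenDAO involved:

```java
import java.util.HashMap;
import java.util.Map;

// Plain-Java sketch of the lazy to-one resolution the answer proposes:
// if the field was already populated (e.g. from parsed JSON), return it;
// otherwise load it from the backing store once and cache it.
public class LazyToOne {
    static Map<Long, String> store = new HashMap<>(); // stands in for the DB

    Long deadLineId;
    String deadLine; // stands in for the related entity

    String getDeadLine() {
        if (deadLine == null) {               // the suggested guard
            deadLine = store.get(deadLineId); // resolve from the "DB"
        }
        return deadLine;
    }

    public static void main(String[] args) {
        store.put(1L, "from-db");

        LazyToOne fromJson = new LazyToOne();
        fromJson.deadLineId = 1L;
        fromJson.deadLine = "from-json";            // populated by the parser
        System.out.println(fromJson.getDeadLine()); // field wins

        LazyToOne fromDb = new LazyToOne();
        fromDb.deadLineId = 1L;                     // loaded entity: field null
        System.out.println(fromDb.getDeadLine());   // resolved from the store
    }
}
```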
My multiplayer game, using the Firebase SDK for Java/Android, has game rooms where 4 players can play together.
To implement the matchmaking, I have a Firebase reference for every room that players can join. Each of these references has 4 slots (as child nodes) containing the UUIDs of the players taking part, or an empty string if the slot is still available.
To prevent two (or more) players from claiming the same slot simultaneously, I'm using transactions. Is the following code correct for that purpose?
private int mUserSlot;
firebaseReference.runTransaction(new Transaction.Handler() {
    @Override
    public Result doTransaction(MutableData currentData) {
        for (int s = 0; s <= 3; s++) { // loop through all slots
            final String slotValue = currentData.child(slotFromLocalID(s)).getValue(String.class);
            if (slotValue == null || slotValue.equals("") || slotValue.equals(mUserUUID)) { // slot still available
                currentData.child(slotFromLocalID(s)).setValue(mUserUUID);
                mUserSlot = s;
                return Transaction.success(currentData);
            }
        }
        return Transaction.abort();
    }

    @Override
    public void onComplete(FirebaseError error, boolean committed, DataSnapshot currentData) {
        if (error == null) {
            if (committed) {
                System.out.println("User is now in slot " + mUserSlot);
            } else {
                System.out.println("User could not join, all slots occupied");
            }
        } else {
            System.out.println("Error: " + error.getMessage());
        }
    }
});
However, for debugging I'm throwing an exception in the error != null branch of onComplete() above, and after some test runs I saw the following error message (from error.getMessage()) in my debug logs:
The transaction was overridden by a subsequent set
What does that mean, exactly? I thought transactions were meant to prevent concurrent writes to the same fields from overwriting each other. Could it mean that some other part of the application is writing to that field without a transaction?
In that case, can I just handle it the same way as !committed? That is, in both cases the value I wanted to write is not there after the transaction has completed; is that correct?
Yes, that message means that one of your clients is writing to the same location using set() instead of a transaction, which aborts the transaction. I would highly recommend not using set() on a location where transactions are being used.
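A Firebase transaction behaves like an optimistic compare-and-set: the handler's result is committed only if the location hasn't changed underneath it, while a plain set() overwrites unconditionally, which is why it can abort a concurrent transaction. A rough plain-Java analogy using AtomicReference (not the Firebase API):

```java
import java.util.concurrent.atomic.AtomicReference;

// Rough analogy (not the Firebase API): a transaction is an optimistic
// read-modify-write that only commits if nobody changed the value meanwhile,
// while a plain set() overwrites unconditionally.
public class CasSketch {
    public static void main(String[] args) {
        AtomicReference<String> slot = new AtomicReference<>("");

        // "Transaction": read the current value and compute the update.
        String seen = slot.get();
        String proposed = "player-uuid";

        // A concurrent plain set() sneaks in between read and commit...
        slot.set("someone-else");

        // ...so the compare-and-set fails, like an aborted transaction.
        boolean committed = slot.compareAndSet(seen, proposed);
        System.out.println(committed ? "committed" : "overridden by a set");
    }
}
```

In the real SDK the failed handler would normally be retried with the new value; an unexpected set() on a transactional location shows up as the "overridden by a subsequent set" error instead.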
I'm trying to update multiple records via an ATG class extending GenericService.
However, I'm running into a roadblock.
How do I do a multiple-insert query where I can keep adding all the items/rows to the cached object and then do a single sync with the table using item.add()?
Sample code:
The first part clears out the rows in the table before insertion happens (it would be mighty helpful if anyone knows of a way to clear all rows in a table without having to loop through and delete them one by one).
MutableRepository repo = (MutableRepository) feedRepository;
RepositoryView view = null;
try {
    view = getFeedRepository().getView(getFeedRepositoryFeedDataDescriptorName());
    RepositoryItem[] items = null;
    if (view != null) {
        QueryBuilder qb = view.getQueryBuilder();
        Query getFeedsQuery = qb.createUnconstrainedQuery();
        items = view.executeQuery(getFeedsQuery);
    }
    if (items != null && items.length > 0) {
        // remove all items in the repository
        for (RepositoryItem item : items) {
            repo.removeItem(item.getRepositoryId(), getFeedRepositoryFeedDataDescriptorName());
        }
    }
    for (RSSFeedObject rfo : feedEntries) {
        MutableRepositoryItem feedItem = repo.createItem(getFeedRepositoryFeedDataDescriptorName());
        feedItem.setPropertyValue(DB_COL_AUTHOR, rfo.getAuthor());
        feedItem.setPropertyValue(DB_COL_FEEDURL, rfo.getFeedUrl());
        feedItem.setPropertyValue(DB_COL_TITLE, rfo.getTitle());
        feedItem.setPropertyValue(DB_COL_FEEDURL, rfo.getPublishedDate());
        RepositoryItem item = repo.addItem(feedItem);
    }
} catch (RepositoryException e) {
    // handle/log the exception
}
The way I interpret your question is that you want to add multiple repository items to your repository, but do it efficiently at the database level. I suggest you make use of the Java Transaction API, as recommended in the ATG documentation, like so:
TransactionManager tm = ...
TransactionDemarcation td = new TransactionDemarcation();
try {
    try {
        td.begin(tm);
        ... do repository item work ...
    } finally {
        td.end();
    }
} catch (TransactionDemarcationException exc) {
    ... handle the exception ...
}
Assuming you are using a SQL repository in your example, the SQL INSERT statements will be issued after each call to addItem, but will not be committed until (and unless) the transaction completes successfully.
ATG does not provide support for deleting multiple records in a single SQL statement. You can use transactions, as @chrisjleu suggests, but there is no way to do the equivalent of DELETE WHERE ID IN ("1", "2", ...). Your code looks correct.
It is possible to invoke stored procedures or execute custom SQL through an ATG Repository, but that isn't generally recommended for portability/maintenance reasons. If you did that, you would also need to flush the appropriate portions of the item/query caches manually.