I have been using JSF, JPA and MySQL with EclipseLink for 5 years. I found that I want to shift to ObjectDB as it is very fast, especially with very large datasets. During migration, I ran into this error.
In JPA with EclipseLink, I passed objects as parameters. But in ObjectDB, I need to pass the ids of the objects to get the results. I have to change this in several places. Can anyone help me overcome this issue?
This code worked fine with EclipseLink and MySQL. Here I pass the object "salesRep" as the parameter.
String j = "select b from "
+ " Bill b "
+ " where b.billCategory=:cat "
+ " and b.billType=:type "
+ " and b.salesRep=:rep ";
Map<String, Object> m = new HashMap<>();
m.put("cat", BillCategory.Loading);
m.put("type", BillType.Billed_Bill);
m.put("rep", getWebUserController().getLoggedUser());
I have to change it like this to make it work in ObjectDB. Here I have to pass the id (type long) of the object "salesRep" as the parameter.
String j = "select b from "
+ " Bill b "
+ " where b.billCategory=:cat "
+ " and b.billType=:type "
+ " and b.salesRep.id=:rep ";
Map<String, Object> m = new HashMap<>();
m.put("cat", BillCategory.Loading);
m.put("type", BillType.Billed_Bill);
m.put("rep", getWebUserController().getLoggedUser().getId());
There is a difference between EclipseLink and ObjectDB in handling detached entity objects. The default behaviour of ObjectDB is to follow the JPA specification and stop loading referenced objects by field access (transparent navigation) once an object becomes detached. EclipseLink does not treat detached objects this way.
This could make a difference in situations such as in a JSF application, where an object becomes detached before loading all necessary referenced data.
One solution (the JPA portable way) is to make sure that all the required data is loaded before objects become detached.
Another possible solution is to enable loading referenced objects by access (transparent navigation) for detached objects, by setting the objectdb.temp.no-detach system property. See #3 in this forum thread.
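As a sketch of the portable approach applied to the query above, the referenced salesRep can be fetched eagerly in JPQL so it is already loaded before the Bill objects detach:

```sql
select b from Bill b
  join fetch b.salesRep
 where b.billCategory = :cat
   and b.billType = :type
```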
I have a simple Spring web application, which is connected to a Postgres db. My question is: I have a method in a dao, which is annotated with @Cacheable. Is there a way to log whether the method goes to the db or its result is loaded from the cache? For example, I'd like to see the following log:
The value is retrieved from db....
The value is retrieved from cache
You can enable trace logs for CacheAspectSupport. This is probably going to give you too much information though.
In case of a cache hit you'll get
Cache entry for key '" + key + "' found in cache '" + cache.getName() + "'"
And a cache miss
"No cache entry for key '" + key + "' in cache(s) " + context.getCacheNames()
There is no hook point to configure caching so that it calls you back when those things happen. You may want to look at your cache library to see if it offers such a hook point.
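For example, in a Spring Boot application those trace logs could be switched on with a logging-level entry (a sketch; assumes Spring Boot's application.properties logging support):

```properties
# Enable trace logging for Spring's cache infrastructure,
# which includes CacheAspectSupport's hit/miss messages
logging.level.org.springframework.cache=trace
```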
In my (java) Controller in a Play2 project I'm saving some data to an object.
So entity here is an instance of a Model subclass.
I do stuff like this
log.debug("Saving title=" + title + ", tags=" + tags);
entity.title = title;
entity.tags = tags;
entity.save();
// verify:
ModelClass m = ModelClass.find.byId(entity.id);
log.debug("Saved title=" + m.title + ", tags=" + m.tags);
Where title is a String and tags is a List<String>. The debug log says
Saving title=foo, tags=[bar, quux]
Saved title=foo, tags=null
So data is coming in, I'm not getting any warnings, but the list of strings is just lost somewhere along the way. I'm just using an in-memory h2 db, maybe it works when I'm really persisting it, but... what's up with this?
Edit: The generated SQL create syntax doesn't contain "tags" at all. So there's obviously something wrong with that.
Edit: see How to persist a property of type List<String> in JPA?
In JPA you must declare a List as @ElementCollection for it to be persisted. It seems that EBean does not support this feature.
One way to do it would be to declare your List tags as @Transient (i.e. not persisted) and add methods to manage it while keeping up to date a simple String that contains your tags comma-separated. It is that String that gets persisted in a single column.
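A minimal sketch of that workaround in plain Java. In the real entity the List accessor would carry @Transient and tagsCsv would be the mapped column; the names tagsCsv, getTags and setTags are illustrative, not from the question:

```java
import java.util.Arrays;
import java.util.List;

public class TaggedEntity {
    public String title;
    public String tagsCsv;   // the persisted column, e.g. "bar,quux"

    // Derived view of the tags; would be marked @Transient in the entity
    public List<String> getTags() {
        if (tagsCsv == null || tagsCsv.isEmpty()) {
            return Arrays.asList();
        }
        return Arrays.asList(tagsCsv.split(","));
    }

    // Keeps the persisted String in sync with the list
    public void setTags(List<String> tags) {
        tagsCsv = String.join(",", tags);
    }
}
```

Calling setTags(Arrays.asList("bar", "quux")) stores "bar,quux" in the single column, and getTags() rebuilds the list after a reload.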
I'm making an online game. I'm testing the game with 300 players now and I have a problem. I have to update about 300 rows in the database every second, but the update takes too long: about 11143 ms (11 s), which is far too much for a task that must finish in under 1 s. I'm making those updates to the database from Java. I already tried with PHP but it's the same. The update SQL query is very simple...
String query5 = "UPDATE naselje SET zelezo = " + zelezo + ", zlato = " + zlato + ", les = " + les + ", hrana = " + hrana + " WHERE ID =" + ID;
So does anyone know how to make updates to the database every second with better performance, or any other solution for updating the game resources (gold, wood, food, ...)?
My configuration:
Intel Core i5 M520 2.40GHz
6 GB RAM
You are probably updating each row separately; you need to use a batch update.
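A minimal sketch of that idea with JDBC, reusing one PreparedStatement and sending all rows as a single batch inside one transaction. The table and columns are from the question; the Connection setup and the assumed row layout are placeholders:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;

public class ResourceUpdater {
    // rows[i] = {zelezo, zlato, les, hrana, id} -- an assumed layout
    public static void updateAll(Connection conn, int[][] rows) throws Exception {
        conn.setAutoCommit(false);
        String sql = "UPDATE naselje SET zelezo = ?, zlato = ?, les = ?, hrana = ? WHERE ID = ?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            for (int[] r : rows) {
                for (int i = 0; i < r.length; i++) {
                    ps.setInt(i + 1, r[i]);  // bind the four resources, then the id
                }
                ps.addBatch();
            }
            ps.executeBatch();               // one round trip instead of 300
        }
        conn.commit();
    }
}
```

With MySQL's Connector/J, adding rewriteBatchedStatements=true to the JDBC URL additionally lets the driver collapse the batch into far fewer network packets.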
Switch to PDO if you are not already on it, and use transactions. Also, restructure your tables to use InnoDB vs MyISAM.
InnoDB works better with larger tables which are frequently read/written.
This is one of the things that it was designed to handle. Multiple SELECT/UPDATE/INSERT statements which are very similar in style.
It is also good coding practice to use transactions when handling multiple consecutive calls of the above types.
Search for PHP PDO and MySQL transactions to learn more.
Example:
With Transactions
$pdo = new PDO(...);
$pdo->beginTransaction();
// Prepare once, then execute many times with bound values
$stmt = $pdo->prepare("UPDATE table SET column = :val WHERE ID = :id");
for ($i = 0; $i < 1001; $i++) {
    $stmt->execute(array(':val' => $var, ':id' => $i));
}
$pdo->commit();
I have the result of a db query in a java.sql.ResultSet that needs to be converted to a hierarchical data structure. It looks a bit like this:
name|version|pname|code|count
n1|1.1|p1|c1|3
n1|1.1|p1|c2|2
n1|1.1|p2|c1|1
n1|1.2|p1|c1|0
n2|1.0|p1|c1|5
I need that converted into a hierarchical data structure:
N1
+ 1.1
+ p1
+ c1(3)
+ c2(2)
+ p2
+ c1(1)
+ 1.2
+ p1
+ c1(0)
N2
+ 1.0
+ p1
+ c1(5)
So my data structure can look something like this:
class Name {
    String name;
    List<Version> versions;
}

class Version {
    String version;
    List<PName> pnames;
}

class PName {
    String pName;
    List<CodeCount> codeCounts;
}

class CodeCount {
    String code;
    Integer count;
}
Anyone have suggestions/code snippets on the best way to do this?
There are a few ways, and how you do it depends on how robust your solution needs to be.
One would be to just write a couple of objects that had the attributes in the database. Then you could get the result set, and iterate over it, creating a new object each time the key field (for example, "name") changed, and adding it to a list of that object. Then you'd set the attributes appropriately. That is the "quick and dirty" solution.
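A minimal sketch of that quick-and-dirty pass, using a List<String[]> in place of the ResultSet and the class names from the question. It assumes the rows arrive sorted by name, version and pname, as in the sample:

```java
import java.util.ArrayList;
import java.util.List;

class CodeCount { String code; Integer count; }
class PName { String pName; List<CodeCount> codeCounts = new ArrayList<>(); }
class Version { String version; List<PName> pnames = new ArrayList<>(); }
class Name { String name; List<Version> versions = new ArrayList<>(); }

class HierarchyBuilder {
    // Each row is {name, version, pname, code, count}; a new node is opened
    // whenever the corresponding key column changes.
    static List<Name> build(List<String[]> rows) {
        List<Name> names = new ArrayList<>();
        Name n = null; Version v = null; PName p = null;
        for (String[] r : rows) {
            if (n == null || !n.name.equals(r[0])) {
                n = new Name(); n.name = r[0]; names.add(n); v = null; p = null;
            }
            if (v == null || !v.version.equals(r[1])) {
                v = new Version(); v.version = r[1]; n.versions.add(v); p = null;
            }
            if (p == null || !p.pName.equals(r[2])) {
                p = new PName(); p.pName = r[2]; v.pnames.add(p);
            }
            CodeCount c = new CodeCount();
            c.code = r[3]; c.count = Integer.parseInt(r[4]);
            p.codeCounts.add(c);
        }
        return names;
    }
}
```

The same loop works directly on the ResultSet by replacing r[0]..r[4] with the corresponding rs.getString / rs.getInt calls.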
A slightly more robust way would be to use something like Hibernate to do the mapping.
If you do decide to do that, I would also suggest redoing your tables so that they accurately reflect your object structure. It may not be needed if you just want a fast solution. But if you are seeking a robust solution for commercial or enterprise software, it's probably a good idea.
I have a db column whose datatype is Number(15), and the corresponding field in my Java classes is a long. The question is how I would map it using java.sql.Types.
Would Types.BIGINT work?
Or should I use something else?
P.S.:
I can't afford to change the datatype in the Java class or in the DB.
From this link, it says that java.sql.Types.BIGINT should be used for a Java long mapped to a SQL (Oracle) Number.
A good place to find reliable size mappings between Java and Oracle types is the Hibernate ORM tool. As documented in its code, Hibernate uses an Oracle NUMBER(19,0) to represent a java.sql.Types.BIGINT, which should map to a long primitive.
I always use wrapper types, because wrapper types can express null values.
In this case I would use the Long wrapper type.
I had a similar problem where I couldn't modify the Java Type or the Database Type. In my situation I needed to execute a native SQL query (to be able to utilize Oracle's Recursive query abilities) and map the result set to a non-managed entity (essentially a simple pojo class).
I found a combination of addScalar and setResultTransformer worked wonders.
hibernateSes.createSQLQuery("SELECT \n"
+ " c.notify_state_id as \"notifyStateId\", \n"
+ " c.parent_id as \"parentId\",\n"
+ " c.source_table as \"sourceTbl\", \n"
+ " c.source_id as \"sourceId\", \n"
+ " c.msg_type as \"msgType\", \n"
+ " c.last_updt_dtm as \"lastUpdatedDateAndTime\"\n"
+ " FROM my_state c\n"
+ "LEFT JOIN my_state p ON p.notify_state_id = c.parent_id\n"
+ "START WITH c.notify_state_id = :stateId\n"
+ "CONNECT BY PRIOR c.notify_state_id = c.parent_id")
.addScalar("notifyStateId", Hibernate.LONG)
.addScalar("parentId", Hibernate.LONG)
.addScalar("sourceTbl",Hibernate.STRING)
.addScalar("sourceId",Hibernate.STRING)
.addScalar("msgType",Hibernate.STRING)
.addScalar("lastUpdatedDateAndTime", Hibernate.DATE)
.setParameter("stateId", notifyStateId)
.setResultTransformer(Transformers.aliasToBean(MyState.class))
.list();
Where notifyStateId, parentId, sourceTbl, sourceId, msgType, and lastUpdatedDateAndTime are all properties of MyState.
Without the addScalar calls, I would get a java.lang.IllegalArgumentException: argument type mismatch, because Hibernate was turning Oracle's Number type into a BigDecimal, while notifyStateId and parentId are Long types on MyState.