Java SimpleJPA for AWS SimpleDB Select Query

I'm having trouble getting objects back out of SimpleDB using the SimpleJPA persistence API. I have successfully installed all the JARs and can persist objects with no problem. However, I cannot seem to retrieve objects using select queries, though oddly I can get results using count queries. There are no errors or exceptions; the queries simply don't return any results. When I debug I can see the actual AWS query that SimpleJPA generates in the background, and when I run that query against the domain directly it returns the expected results without a problem.
I've included my Java code below; it should return a list of all the users in my database.
Query query = em.createQuery("SELECT u FROM User u");
List<User> results = (List<User>)query.getResultList();
As I said, I can persist objects and count them, so there isn't anything wrong with my entity manager or factory; it's just returning empty lists. If you need any more information, just ask.
Thanks in advance!

I never got to the bottom of this problem. In the end I started a new AWS project in Eclipse and re-added the JAR files, solving the issue.

Related

Picketlink: finding users with given role

I configured a JPA store and see users and roles getting added correctly to the db when I call the related PicketLink (2.7.1) APIs.
My question is this: how does one get a list of all users that have a given role?
I tried doing this using the following RelationshipQuery:
RelationshipQuery<Grant> rq = relationshipManager.createRelationshipQuery(Grant.class);
rq.setParameter(Grant.ROLE, role);
List<Grant> grants = rq.getResultList();
But the resulting grant list contains a single assignment grant, that refers to the last user in the database that has that role.
I checked the example queries in the documentation and tests but found nothing that does what I want. I know the project is no longer active but am hoping to find a solution to this.
Found out that role data wasn't imported correctly from the old db. Once I fixed that the above code worked as expected.
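For reference, here is a minimal sketch of turning the returned grants into the list of users, assuming the basic identity model where Grant.getAssignee() returns the identity the role was granted to (the cast to User assumes the grants were created against User accounts):
import java.util.ArrayList;
import java.util.List;
import org.picketlink.idm.RelationshipManager;
import org.picketlink.idm.model.basic.Grant;
import org.picketlink.idm.model.basic.Role;
import org.picketlink.idm.model.basic.User;
import org.picketlink.idm.query.RelationshipQuery;

public class UsersByRole {

    // Collect every user that has been granted the given role.
    public static List<User> usersWithRole(RelationshipManager relationshipManager, Role role) {
        RelationshipQuery<Grant> rq = relationshipManager.createRelationshipQuery(Grant.class);
        rq.setParameter(Grant.ROLE, role);

        List<User> users = new ArrayList<>();
        for (Grant grant : rq.getResultList()) {
            // getAssignee() is the IdentityType the role was granted to.
            if (grant.getAssignee() instanceof User) {
                users.add((User) grant.getAssignee());
            }
        }
        return users;
    }
}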

Appengine can't retrieve Entities from datastore

I got App Engine all set up, and it works on my localhost. I call
Entity greeting3 = new Entity(KeyFactory.createKey("World", "world3"));
greeting3.setProperty("raw", "2.2 # # # .");
datastore.put(greeting3);
to add my entities, and I can see them added in the console. I then use
Query q = new Query("World");
PreparedQuery pq = datastore.prepare(q);
for (Entity result : pq.asIterable()) {
resp.getWriter().println(result);
}
to retrieve the entities. It works perfectly on localhost, but on the server I just can't retrieve the list of all added entities, even though they do get added by the first snippet. I also use
e = datastore.get(KeyFactory.createKey("World", req.getParameter(k)));
resp.getWriter().println(e.getProperty("raw"));
and on localhost it works as well, but on the deployed app it throws an 'entity not found' exception. I've tried manually adding datastore indexes, but it did not help.
I've been working on this all day and it's really getting to me now :( I'm also quite sure it worked yesterday.
Please help, thanks.
If your query runs very soon after you put the entity in the datastore, the issue could be that the App Engine datastore is not "immediately consistent"; it is "eventually consistent".
You can test this by waiting a few seconds (maybe 10?) before running your query.
If this turns out to be the cause, check out the App Engine datastore overview in the docs.
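One common way to get strongly consistent reads for this pattern is to group the entities under a parent key and use an ancestor query; a minimal sketch, where the "WorldGroup" parent is a made-up name for illustration:
import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.Key;
import com.google.appengine.api.datastore.KeyFactory;
import com.google.appengine.api.datastore.PreparedQuery;
import com.google.appengine.api.datastore.Query;

public class AncestorQueryExample {

    private static final DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();

    // All "World" entities share a hypothetical parent, so they live in one entity group.
    private static final Key PARENT = KeyFactory.createKey("WorldGroup", "default");

    public static void writeAndRead() {
        Entity greeting3 = new Entity("World", "world3", PARENT);
        greeting3.setProperty("raw", "2.2 # # # .");
        datastore.put(greeting3);

        // Ancestor queries are strongly consistent, so the new entity shows up immediately.
        Query q = new Query("World").setAncestor(PARENT);
        PreparedQuery pq = datastore.prepare(q);
        for (Entity result : pq.asIterable()) {
            System.out.println(result);
        }
    }
}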
I experienced this also. If it's not that req.getParameter(k) is returning something other than what you expected, then you just have to run the query again. The datastore is not really fast; that is why there is Memcache.
If you want your application to serve data immediately, then you have to cache it using the Memcache API.
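A minimal sketch of that caching idea with the low-level Memcache API, writing through on save and falling back to a get by key on a cache miss (the "World:" key prefix is just an illustration, and the entity is assumed to use a named key like "world3"):
import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.EntityNotFoundException;
import com.google.appengine.api.datastore.KeyFactory;
import com.google.appengine.api.memcache.MemcacheService;
import com.google.appengine.api.memcache.MemcacheServiceFactory;

public class WorldCache {

    private static final DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();
    private static final MemcacheService cache = MemcacheServiceFactory.getMemcacheService();

    // Store the entity and cache it so reads can be served immediately.
    public static void save(Entity world) {
        datastore.put(world);
        cache.put("World:" + world.getKey().getName(), world);
    }

    // Check Memcache first; fall back to a (strongly consistent) get by key.
    public static Entity load(String name) throws EntityNotFoundException {
        Entity cached = (Entity) cache.get("World:" + name);
        if (cached != null) {
            return cached;
        }
        Entity fetched = datastore.get(KeyFactory.createKey("World", name));
        cache.put("World:" + name, fetched);
        return fetched;
    }
}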

Update all objects in JPA entity

I'm trying to update all 4000 objects in ProfileEntity, but I am getting the following exception:
javax.persistence.QueryTimeoutException: The datastore operation timed out, or the data was temporarily unavailable.
this is my code:
public synchronized static void setX4all()
{
    em = EMF.get().createEntityManager();
    Query query = em.createQuery("SELECT p FROM ProfileEntity p");
    List<ProfileEntity> usersList = query.getResultList();
    int a, b, x;
    for (ProfileEntity profileEntity : usersList)
    {
        a = profileEntity.getA();
        b = profileEntity.getB();
        x = func(a, b);
        profileEntity.setX(x);
        em.getTransaction().begin();
        em.persist(profileEntity);
        em.getTransaction().commit();
    }
    em.close();
}
I'm guessing that querying all of the records from ProfileEntity takes too long.
How should I do it?
I'm using Google App Engine so no UPDATE queries are possible.
Edited 18/10
Over these two days I have tried:
using Backends as Thanos Makris suggested, but got to a dead end. You can see my question here.
reading the DataNucleus suggestion on Map-Reduce, but really got lost.
I'm looking for a different direction. Since I'm only going to do this update once, maybe I can update manually every 200 objects or so.
Is it possible to query for the first 200 objects, then the second 200 objects, and so on?
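For what it's worth, JPA can page a query like that with setFirstResult/setMaxResults. A minimal sketch of the batched idea, reusing the ProfileEntity and func names from the question (the batch size of 200 is an assumption, the per-entity transaction mirrors the original code, and in practice each batch would probably run in its own request or task to stay under the deadline):
import java.util.List;
import javax.persistence.EntityManager;
import javax.persistence.Query;

public class BatchUpdater {

    private static final int BATCH_SIZE = 200;

    // Walk the ProfileEntity table in pages instead of loading all 4000 rows at once.
    public static void setXInBatches(EntityManager em) {
        int offset = 0;
        while (true) {
            Query query = em.createQuery("SELECT p FROM ProfileEntity p");
            query.setFirstResult(offset);
            query.setMaxResults(BATCH_SIZE);

            @SuppressWarnings("unchecked")
            List<ProfileEntity> batch = query.getResultList();
            if (batch.isEmpty()) {
                break;
            }

            for (ProfileEntity profileEntity : batch) {
                // Same per-entity transaction as in the question, so each write
                // stays within its own entity group on App Engine.
                em.getTransaction().begin();
                profileEntity.setX(func(profileEntity.getA(), profileEntity.getB()));
                em.persist(profileEntity);
                em.getTransaction().commit();
            }

            em.clear(); // detach the processed batch so the persistence context stays small
            offset += BATCH_SIZE;
        }
    }

    // Placeholder standing in for the func(a, b) calculation from the question.
    private static int func(int a, int b) {
        return a + b;
    }
}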
Given your scenario, I would advise running a native update query:
Query query = em.createNativeQuery("update ProfileEntity pe set pe.X = 'x'");
query.executeUpdate();
Please note: here the query string is SQL, i.e. update table_name set ...
This will work better.
Change the update process to use something like Map-Reduce. This means all the work is done in the datastore. The only problem is that appengine-mapreduce is not fully released yet (though you can easily build the JAR yourself and use it in your GAE app; many others have done so).
If you want to set X for all objects, it is better to use an update statement (i.e. native SQL) through the JPA entity manager instead of fetching all the objects and updating them one by one.
Maybe you should consider using the Task Queue API, which enables you to execute tasks of up to 10 minutes. If you need to update so many entities that Task Queues do not fit, you could also consider the use of Backends.
Put the transaction outside of the loop:
em.getTransaction().begin();
for (ProfileEntity profileEntity : usersList) {
...
}
em.getTransaction().commit();
Your class does not behave very well - JPA is not suitable for bulk updates done this way - you are just starting a lot of transactions in rapid sequence and producing a lot of load on the database. A better solution for your use case would be a scalar update query that sets all the objects without loading them into the JVM first (depending on your object structure and lazy loading, you would otherwise load much more data than you think).
See hibernate reference:
http://docs.jboss.org/hibernate/orm/3.3/reference/en/html/batch.html#batch-direct
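A minimal sketch of that batch-direct style as a JPQL bulk update (the p.x property name is assumed from the setX accessor in the question; note the question states UPDATE queries are not available on Google App Engine, so this applies to providers and databases that do support them):
import javax.persistence.EntityManager;

public class BulkSetX {

    // Bulk JPQL update: the database sets x for every row without loading entities into the JVM.
    public static int setXForAll(EntityManager em, int x) {
        em.getTransaction().begin();
        int updated = em.createQuery("UPDATE ProfileEntity p SET p.x = :x")
                        .setParameter("x", x)
                        .executeUpdate();
        em.getTransaction().commit();
        return updated;
    }
}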

EclipseLink can't retrieve entities inserted manually

I'm having some trouble with EclipseLink. My program has to interact with a database (representing a building). I've written a little input test mode where I can manually insert things through the console.
My problem: a normal getByID operation works just fine if I try to retrieve an entity I previously inserted through EclipseLink itself (via commit()), but it throws a NoResultException when trying to select a row manually inserted via an SQL script (building -> lots of rooms -> script).
This (oversimplified) works fine:
main() {
    MyRoom r = new MyRoom();
    r.setID("floor1-roomnr4");
    em.getTransaction().begin();   // entity manager
    em.persist(r);
    em.getTransaction().commit();
    DAO.getRoomByID("floor1-roomnr4"); // works
}
and the combination of the generation script + a simple getRoomByID() throws an exception.
If I try it in SQL Developer I get the result I want for the exact select statement that just threw a NoResultException. I also only get this problem in the input mode; otherwise, selecting the generated rows works fine as well.
Does EclipseLink have some cache-mechanism I'm unaware of which is causing some problem?
Are you sure EclipseLink and SQL Developer are connected to the same Database? Please verify the connection information for both. Is the generation-script committing the changes with the "commit" command?
If EclipseLink works similarly to Hibernate, then yes, there is a cache. The "first level cache" guarantees that you get the exact same instance within one transaction, which makes sense. If you know EclipseLink/transactions, then try to evict all loaded instances or commit the transaction, and then try your DAO again. This would force EclipseLink to fetch the data from the database again.
See Answer to similar question
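A minimal sketch of that eviction idea using the standard JPA cache APIs (which EclipseLink implements); it clears both the persistence context and the shared cache before retrying the lookup, and assumes MyRoom is keyed by its String ID:
import javax.persistence.EntityManager;

public class CacheReset {

    // Detach everything in this EntityManager and drop the shared (second-level) cache,
    // then look the room up again so EclipseLink has to go back to the database.
    public static MyRoom reloadRoom(EntityManager em, String id) {
        em.clear();
        em.getEntityManagerFactory().getCache().evictAll();
        return em.find(MyRoom.class, id);
    }
}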

Not able to load entity after insertion in TopLink

I am using TopLink as my ORM tool and I am facing one peculiar problem. I insert an entity into the session, and then in the next line, if I try to load the same entity, I am unable to get it; instead it returns null. But if I try the same thing using Hibernate, it works properly. Can anyone please help?
Address address = new Address();
address.setAddressId("1");
address.setPincode(1);
uow2.registerNewObject(address);
ExpressionBuilder builder = new ExpressionBuilder();
Expression expr = builder.get("addressId").equal("1");
Address address1 = (Address)uow2.readObject(Address.class, expr);
At the end, address1 is null. I don't understand, as I am inserting the object with that key and then trying to retrieve it with the same key. Please help.
This is native TopLink/EclipseLink code. You are only 'registering' the Address with the UnitOfWork, which does not write out until it is committed.
There are a couple of ways to get uncommitted results from a UnitOfWork. In the scenario above you can call uow.setShouldNewObjectsBeCached(true) before registering the new object; then the readObject call will find it.
You can also change the readObject call to a ReadObjectQuery and set conformResultsInUnitOfWork on the query.
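A minimal sketch of that second approach using the native query API (EclipseLink package names shown; the equivalent TopLink classes live under the oracle.toplink packages, and the exact method names should be treated as an assumption):
import org.eclipse.persistence.expressions.Expression;
import org.eclipse.persistence.expressions.ExpressionBuilder;
import org.eclipse.persistence.queries.ReadObjectQuery;
import org.eclipse.persistence.sessions.UnitOfWork;

public class ConformExample {

    // Find the Address registered in the UnitOfWork, even though it is not committed yet.
    public static Address findUncommitted(UnitOfWork uow2) {
        ExpressionBuilder builder = new ExpressionBuilder();
        Expression expr = builder.get("addressId").equal("1");

        ReadObjectQuery query = new ReadObjectQuery(Address.class);
        query.setSelectionCriteria(expr);
        query.conformResultsInUnitOfWork(); // conform results against uncommitted changes

        return (Address) uow2.executeQuery(query);
    }
}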
If you are just starting out with EclipseLink/TopLink then I recommend using the JPA APIs. You will be able to find many resources on JPA. Then once you begin to optimize your code or begin to tackle complicated scenarios you can use the EclipseLink mailing lists and forums to get EclipseLink specific assistance.
