I am using TopLink as my ORM tool and I am facing a peculiar problem. I insert an entity into the session and then, on the next line, try to load the same entity, but instead of the object I get null. The same scenario works properly with Hibernate. Can anyone please help?
Address address = new Address();
address.setAddressId("1");
address.setPincode(1);
uow2.registerNewObject(address); // uow2 is a TopLink UnitOfWork

ExpressionBuilder builder = new ExpressionBuilder();
Expression expr = builder.get("addressId").equal("1");
Address address1 = (Address) uow2.readObject(Address.class, expr);
At the end, address1 is null. I don't understand: I am inserting the object with that key and then retrieving it by the same key. Please help.
This is native TopLink/EclipseLink code. You are only 'registering' the Address with the UnitOfWork, which does not write anything out until it is committed.
There are a couple of ways to get uncommitted results from a UnitOfWork. In the scenario above you can call uow.setShouldNewObjectsBeCached(true) before registering the new object; the readObject call will then find it.
You can also change the readObject call to a ReadObjectQuery and set conformResultsInUnitOfWork on the query.
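For illustration, both options might look roughly like this (a sketch against the native API; uow2, address, and expr are the UnitOfWork, entity, and Expression from the question):

// Option 1: cache new objects so readObject() can see them before commit
uow2.setShouldNewObjectsBeCached(true);
uow2.registerNewObject(address);
Address found = (Address) uow2.readObject(Address.class, expr);

// Option 2: conform the query's results in the UnitOfWork
ReadObjectQuery query = new ReadObjectQuery(Address.class);
query.setSelectionCriteria(expr);
query.conformResultsInUnitOfWork();
Address found2 = (Address) uow2.executeQuery(query);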
If you are just starting out with EclipseLink/TopLink then I recommend using the JPA APIs. You will be able to find many resources on JPA. Then once you begin to optimize your code or begin to tackle complicated scenarios you can use the EclipseLink mailing lists and forums to get EclipseLink specific assistance.
I'm working on a small project using Morphia for MongoDB. I want to know the best way to update a document without knowing beforehand which fields will change. Say, for example, I have a form; after saving it to the database I might want to come back and change some fields, but I haven't decided yet which ones.
My current solution updates all fields, whether they changed or not, and of course it throws an exception: Morphia complains about updating with a null value.
My code looks like this:
Query<Project> q = datastore.find(Project.class, "_id", projectToUpdate.getId());
// every field is set unconditionally, so any null value makes set() throw
UpdateOperations<Project> update = datastore.createUpdateOperations(Project.class)
        .set("name", updatedProject.getName())
        .set("deadline", updatedProject.getDeadline())
        .set("priority", updatedProject.getPriority())
        .set("completion", updatedProject.getCompletion())
        .set("description", updatedProject.getDescription())
        .set("projectManager", updatedProject.getPM())
        .set("collaborators", updatedProject.getAllCollaborators())
        .set("teams", updatedProject.getAllTeams())
        .set("userStories", updatedProject.getUserStories())
        .set("log", updatedProject.getLog());
datastore.findAndModify(q, update);
Exception
Exception in thread "main" org.mongodb.morphia.query.QueryException: Value cannot be null.
at org.mongodb.morphia.query.UpdateOpsImpl.set(UpdateOpsImpl.java:220)
at controllers.QueryProjects.updateProject(QueryProjects.java:78)
at controllers.DBConnection.TestMongo(DBConnection.java:152)
at penelope.Main.main(Main.java:12)
I was thinking about using a delegate/event handler to update each field individually, but I'm afraid that might degrade performance.
You should just use datastore.save(). See an example at http://mongodb.github.io/morphia/1.3/getting-started/quick-tour/
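A sketch of that approach (assuming Project is a mapped @Entity whose @Id is already set, so save() performs an upsert):

// one call replaces the whole per-field UpdateOperations chain;
// Morphia does not store null fields by default, so no null checks are needed
datastore.save(updatedProject);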
I have App Engine all set up, and it works on my localhost. I call
Entity greeting3 = new Entity(KeyFactory.createKey("World", "world3"));
greeting3.setProperty("raw", "2.2 # # # .");
datastore.put(greeting3);
to add my entities, and I can see them added in the console. I then use
Query q = new Query("World");
PreparedQuery pq = datastore.prepare(q);
for (Entity result : pq.asIterable()) {
resp.getWriter().println(result);
}
to retrieve the entities. It works perfectly on localhost, but on the server I just can't get the list of added entities back, even though the first snippet does add them. I also use
e = datastore.get(KeyFactory.createKey("World", req.getParameter(k)));
resp.getWriter().println(e.getProperty("raw"));
and on localhost it works as well, but on the deployed server it throws a 'no entity found' exception. I've tried manually adding datastore indexes, but it did not help.
I've been working on this all day and it really upsets me now. I'm also quite sure it worked yesterday... Please help, and thanks in advance.
If your query runs very soon after you put the entity in the datastore, the issue could be that the App Engine datastore is not "immediately consistent"; it is "eventually consistent".
You can test this by waiting a few seconds (maybe 10?) before running your query.
If that turns out to be the cause, you should check out the App Engine datastore overview in the docs.
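Another option, if reads must see writes right away, is to put the entities in a single entity group and use an ancestor query, which the datastore executes with strong consistency (a sketch; the parent kind and key name here are made up):

Key parent = KeyFactory.createKey("WorldRoot", "root"); // hypothetical parent key
Entity greeting3 = new Entity("World", "world3", parent); // child of the parent
greeting3.setProperty("raw", "2.2 # # # .");
datastore.put(greeting3);

Query q = new Query("World").setAncestor(parent); // ancestor queries are strongly consistent
for (Entity result : datastore.prepare(q).asIterable()) {
    resp.getWriter().println(result);
}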
I experienced this too. If it's not that req.getParameter(k) is something other than what you expected, then you just have to run the query again. The datastore is not really fast; that is why there is Memcache.
If you want your application to serve data immediately, then you have to "cache" it using the Memcache API.
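A minimal sketch of that pattern with the low-level Memcache API (the cache key here is just an example):

MemcacheService memcache = MemcacheServiceFactory.getMemcacheService();

// right after datastore.put(greeting3), cache the value under a key of your choosing
memcache.put("World:world3", greeting3.getProperty("raw"));

// on read, try the cache first; fall back to the datastore only on a miss
String raw = (String) memcache.get("World:world3");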
I'm trying to update all 4000 objects in ProfileEntity, but I am getting the following exception:
javax.persistence.QueryTimeoutException: The datastore operation timed out, or the data was temporarily unavailable.
this is my code:
public synchronized static void setX4all()
{
    em = EMF.get().createEntityManager();
    Query query = em.createQuery("SELECT p FROM ProfileEntity p");
    List<ProfileEntity> usersList = query.getResultList();
    int a, b, x;
    for (ProfileEntity profileEntity : usersList)
    {
        a = profileEntity.getA();
        b = profileEntity.getB();
        x = func(a, b);
        profileEntity.setX(x);
        em.getTransaction().begin();
        em.persist(profileEntity);
        em.getTransaction().commit();
    }
    em.close();
}
I'm guessing that querying all of the records from ProfileEntity takes too long. How should I do this instead?
I'm using Google App Engine so no UPDATE queries are possible.
Edited 18/10
In these two days I have tried:
using Backends, as Thanos Makris suggested, but I hit a dead end. You can see my question here.
reading the DataNucleus suggestion on Map-Reduce, but I really got lost.
I'm looking for a different direction. Since I am only going to run this update once, maybe I can update manually, 200 objects or so at a time.
Is it possible to query for the first 200 objects, then the next 200 objects, and so on?
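Something like this, maybe (standard JPA paging, just a sketch; I don't know how efficiently the GAE provider handles offsets):

int pageSize = 200;
for (int offset = 0; ; offset += pageSize) {
    Query query = em.createQuery("SELECT p FROM ProfileEntity p");
    query.setFirstResult(offset);   // skip the pages already processed
    query.setMaxResults(pageSize);  // fetch at most 200 entities
    List<ProfileEntity> page = query.getResultList();
    if (page.isEmpty()) break;
    // ... update and commit this batch of at most 200 entities ...
}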
Given your scenario, I would advise running a native update query:
Query query = em.createNativeQuery("update ProfileEntity pe set pe.X = 'x'");
query.executeUpdate();
Please note: here the query string is SQL, i.e. update table_name set ....
This will perform better.
Change the update process to use something like Map-Reduce, so that all the work is done in the datastore. The only problem is that appengine-mapreduce is not fully released yet (though you can easily build the JAR yourself and use it in your GAE app; many others have done so).
If you want to set X for all objects, it is better to use an update statement (i.e. native SQL) via the JPA entity manager instead of fetching all the objects and updating them one by one.
Maybe you should consider using the Task Queue API, which lets you execute tasks for up to 10 minutes. If you want to update more entities than Task Queues can handle, you could also consider the use of Backends.
Put the transaction outside of the loop:
em.getTransaction().begin();
for (ProfileEntity profileEntity : usersList) {
...
}
em.getTransaction().commit();
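Applied to the code in the question, that looks roughly like this (a sketch; note that em.persist() is unnecessary for entities that are already managed):

em.getTransaction().begin();
for (ProfileEntity profileEntity : usersList) {
    profileEntity.setX(func(profileEntity.getA(), profileEntity.getB()));
    // no em.persist() needed: entities loaded by the query are already managed,
    // and their changes are flushed when the transaction commits
}
em.getTransaction().commit();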
Your class does not behave very well: JPA is not suitable for bulk updates done this way. You are just starting a lot of transactions in rapid sequence and producing a lot of load on the database. A better solution for your use case would be a scalar update query that sets all the objects without loading them into the JVM first (depending on your object structure and laziness, you would load much more data than you think).
See hibernate reference:
http://docs.jboss.org/hibernate/orm/3.3/reference/en/html/batch.html#batch-direct
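On providers that support it, such a scalar/bulk update is a single JPQL statement (a sketch; it assumes func(a, b) can be expressed in JPQL, and, as the question notes, the App Engine datastore does not support UPDATE queries):

em.getTransaction().begin();
// one bulk statement, no entities loaded into the JVM;
// "p.a + p.b" stands in for func(a, b) and is purely illustrative
int updated = em.createQuery("UPDATE ProfileEntity p SET p.x = p.a + p.b").executeUpdate();
em.getTransaction().commit();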
I'm having trouble getting objects back out of SimpleDB using the simpleJPA persistence API. I have successfully installed all the JARs and can persist objects, no problem. However, I cannot seem to retrieve objects using select queries, though weirdly I can get results using count queries. There are no errors or exceptions; the queries simply don't return any results. When I debug I can view the actual AWS query that simpleJPA generates in the background, and when I run this query against a domain it returns the expected results, no problem.
I've included my Java code below; it should return a list of all the users in my database.
Query query = em.createQuery("SELECT u FROM User u");
List<User> results = (List<User>)query.getResultList();
As I said, I can persist objects and count them, so there isn't anything wrong with my entity manager or factory; it's just returning empty lists. If you need any more information, just ask.
Thanks in advance!
I never got to the bottom of this problem. In the end I started a new AWS project in Eclipse and re-added the JAR files, which solved the issue.
I'm having some trouble with EclipseLink. My program has to interact with a database (representing a building). I've written a little input test mode where I can manually insert things through the console.
My problem: a normal get-by-ID operation works just fine if I retrieve an entity I previously inserted through EclipseLink itself (by commit()), but it throws a NoResultException when I try to select a row that was inserted manually via an SQL script (building -> lots of rooms -> script).
This (oversimplified) works fine:
main() {
    MyRoom r = new MyRoom();
    r.setID("floor1-roomnr4");
    em.getTransaction().begin(); // em is the entity manager
    em.persist(r);
    em.getTransaction().commit();
    DAO.getRoomByID("floor1-roomnr4"); // works
}
whereas the combination of the generation script + a simple getRoomByID() throws the exception.
If I run the exact select statement that just threw the NoResultException in SQL Developer, I get the result I want. I also only get this problem in the input mode; otherwise, selecting the generated rows works fine too.
Does EclipseLink have some cache mechanism I'm unaware of that is causing this problem?
Are you sure EclipseLink and SQL Developer are connected to the same database? Please verify the connection information for both. Is the generation script committing the changes with the "commit" command?
If EclipseLink works similarly to Hibernate, then yes, there is a cache. The "first-level cache" guarantees that you get the exact same instance within one transaction, which makes sense. If you know EclipseLink/transactions, try evicting all loaded instances or committing the transaction, and then try your DAO again. This would force EclipseLink to fetch the data from the database again.
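For example, with the standard JPA 2.0 cache API (a sketch using the ID from the question; clear() empties the persistence context, evictAll() empties EclipseLink's shared cache):

em.clear();                                         // detach everything in the persistence context (first-level cache)
em.getEntityManagerFactory().getCache().evictAll(); // empty the shared (second-level) cache
DAO.getRoomByID("floor1-roomnr4");                  // this read now has to go to the database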
See Answer to similar question