MongoDB Duplicate Keys with versioned document - Java

I have been getting E11000 duplicate key error collection: xxxxx index: _id_ dup key: ever since I decided to add versioning to an object with the @Version annotation:
@Version
private long version;
When the object is modified, I use mongoTemplate.save(obj); to overwrite the document, and the error pops up at that point.
Everything worked fine before the document was versioned.
Do you have any idea? Because I don't.
The objects are stored with MongoDB 3.6.
Thank you

If you add a version to your document, you have to change it when you update the document: the version and the id together form a tuple key.

The problem was that I tried to save a new copy of the object; without versioning that was not a problem.
I solved it by saving the loaded object directly.
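For illustration, here is a minimal sketch of that pattern with Spring Data MongoDB (class and field names are hypothetical): load the existing document, modify it, and pass that same instance to save(), so the version held by the entity matches the one stored in the database. Saving a freshly constructed copy instead makes Spring Data treat it as a new document and attempt an insert with the same _id, which is what produces the E11000 error.

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Version;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
class Article {                      // hypothetical document class
    @Id
    private String id;

    @Version
    private long version;            // managed by Spring Data; never set it yourself

    private String title;

    void setTitle(String title) { this.title = title; }
}

class ArticleUpdater {
    private final MongoTemplate mongoTemplate;

    ArticleUpdater(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    // Load the stored document, change it, and save the same instance back.
    // Saving a freshly built copy (version 0) collides with the existing _id
    // and triggers the E11000 duplicate key error described above.
    void rename(String id, String newTitle) {
        Article article = mongoTemplate.findById(id, Article.class);
        article.setTitle(newTitle);
        mongoTemplate.save(article); // version is checked and incremented here
    }
}
```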

Related

ISIS: Problems with Blob/Clob field serialization

Edit: Solution: Upgrading to ISIS 1.17.0 and setting the property isis.persistor.datanucleus.standaloneCollection.bulkLoad=false solved the first two problems.
I am using Apache ISIS 1.16.2 and I am trying to store Blob/Clob content in a MariaDB database (v10.1.35). For this, I use the DB connector org.mariadb.jdbc.mariadb-java-client (v2.3.0) and, in the code, the @Persistent annotation as shown in many examples and in the ISIS documentation.
Using the code below, I just get one single column named content_name (in which the Blob object is serialized in binary form) instead of the three columns content_name, content_mimetype and content_bytes.
This is the Document class with the Blob field content:
@PersistenceCapable(identityType = IdentityType.DATASTORE)
@DatastoreIdentity(strategy = IdGeneratorStrategy.IDENTITY, column = "id")
@DomainObject(editing = Editing.DISABLED, autoCompleteRepository = DocumentRepository.class, objectType = "Document")
@Getter
// ...
public class Document implements Comparable<Document> {

    @Persistent(
            defaultFetchGroup = "false",
            columns = {
                    @Column(name = "content_name"),
                    @Column(name = "content_mimetype"),
                    @Column(name = "content_bytes",
                            jdbcType = "BLOB",
                            sqlType = "LONGVARBINARY")
            })
    @Nonnull
    @Column(allowsNull = "false")
    @Property(optionality = Optionality.MANDATORY)
    private Blob content;

    // ...

    @Column(allowsNull = "false")
    @Property
    private Date created = new Date();

    public Date defaultCreated() {
        return new Date();
    }

    @Column(allowsNull = "true")
    @Property
    @Setter
    private String owner;

    // ...
}
This creates the following schema for the DomainObject class Document with just one column for the Blob field:
CREATE TABLE `document` (
`id` bigint(20) NOT NULL AUTO_INCREMENT,
`content_name` mediumblob,
`created` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
`owner` varchar(255) CHARACTER SET latin1 COLLATE latin1_bin DEFAULT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
Normally, the class org.apache.isis.objectstore.jdo.datanucleus.valuetypes.IsisBlobMapping of the ISIS framework should do the mapping. But it seems that this Mapper is somehow not involved...
1. Question: How do I get the Blob field split up into the three columns (as described above and in many demo projects)? Even when I switch to HSQLDB, I still get only one column, so this is probably not a MariaDB issue.
2. Question: If I use a Blob/Clob field in a class that inherits from another DomainObject class, I often get an org.datanucleus.exceptions.NucleusException (stack trace below) and I cannot make head or tail of it. What are potential pitfalls when dealing with inheritance? Why am I getting this exception?
3. Question: I need to store documents belonging to domain objects (as you might have guessed). The proper way of doing so would be to store the documents in a file system tree instead of the database (which also has by default some size limitations for object data) and reference the files from the object. In the DataNucleus documentation I found the extension serializeToFileLocation that should do exactly that. I tried it by adding the line @Extension(vendorName = "datanucleus", key = "serializeToFileLocation", value = "document-repository") to the Blob field, but nothing happened. So my question is: Is this DataNucleus extension compatible with Apache ISIS?
If this extension conflicts with ISIS, would it be possible to have a javax.jdo.listener.StoreLifecycleListener or org.apache.isis.applib.AbstractSubscriber that stores the Blob on the file system before persisting the domain object to the database and restores it before loading? Are there better solutions available?
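Purely as an illustration of that idea, here is a minimal, hypothetical sketch of a javax.jdo.listener.StoreLifecycleListener that writes the Blob bytes to a file before the entity is stored. The Document accessor (getContent(), generated by @Getter above), the target directory, and the file naming scheme are assumptions, and restoring on load is not shown:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

import javax.jdo.listener.InstanceLifecycleEvent;
import javax.jdo.listener.StoreLifecycleListener;

import org.apache.isis.applib.value.Blob;

public class BlobToFileStoreListener implements StoreLifecycleListener {

    // Hypothetical target directory for the serialized Blob content.
    private final Path repository = Paths.get("document-repository");

    @Override
    public void preStore(InstanceLifecycleEvent event) {
        Object persistentInstance = event.getPersistentInstance();
        if (persistentInstance instanceof Document) {
            Blob content = ((Document) persistentInstance).getContent();
            if (content != null) {
                try {
                    Files.createDirectories(repository);
                    // One file per Blob; naming by content name is an assumption.
                    Files.write(repository.resolve(content.getName()), content.getBytes());
                } catch (IOException e) {
                    throw new RuntimeException("Could not write Blob to file system", e);
                }
            }
        }
    }

    @Override
    public void postStore(InstanceLifecycleEvent event) {
        // no-op in this sketch; restoring the content on load would live elsewhere
    }
}
```

The listener would still need to be registered with the JDO PersistenceManagerFactory (e.g. via addInstanceLifecycleListener), and whether ISIS exposes a clean hook for that is part of the question.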
That's it for now. Thank you in advance! ;-)
The stack trace to question 2:
... (other Wicket related stack trace)
Caused by: org.datanucleus.exceptions.NucleusException: Creation of SQLExpression for mapping "org.datanucleus.store.rdbms.mapping.java.SerialisedMapping" caused error
at org.datanucleus.store.rdbms.sql.expression.SQLExpressionFactory.newExpression(SQLExpressionFactory.java:199)
at org.datanucleus.store.rdbms.sql.expression.SQLExpressionFactory.newExpression(SQLExpressionFactory.java:155)
at org.datanucleus.store.rdbms.request.LocateBulkRequest.getStatement(LocateBulkRequest.java:158)
at org.datanucleus.store.rdbms.request.LocateBulkRequest.execute(LocateBulkRequest.java:283)
at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.locateObjects(RDBMSPersistenceHandler.java:564)
at org.datanucleus.ExecutionContextImpl.findObjects(ExecutionContextImpl.java:3313)
at org.datanucleus.api.jdo.JDOPersistenceManager.getObjectsById(JDOPersistenceManager.java:1850)
at org.apache.isis.core.runtime.system.persistence.PersistenceSession.loadPersistentPojos(PersistenceSession.java:1010)
at org.apache.isis.core.runtime.system.persistence.PersistenceSession.adaptersFor(PersistenceSession.java:1603)
at org.apache.isis.core.runtime.system.persistence.PersistenceSession.adaptersFor(PersistenceSession.java:1573)
at org.apache.isis.viewer.wicket.model.models.EntityCollectionModel$Type$1.loadInBulk(EntityCollectionModel.java:107)
at org.apache.isis.viewer.wicket.model.models.EntityCollectionModel$Type$1.load(EntityCollectionModel.java:93)
at org.apache.isis.viewer.wicket.model.models.EntityCollectionModel.load(EntityCollectionModel.java:454)
at org.apache.isis.viewer.wicket.model.models.EntityCollectionModel.load(EntityCollectionModel.java:70)
at org.apache.wicket.model.LoadableDetachableModel.getObject(LoadableDetachableModel.java:135)
at org.apache.isis.viewer.wicket.ui.components.collectioncontents.ajaxtable.CollectionContentsSortableDataProvider.size(CollectionContentsSortableDataProvider.java:68)
at org.apache.wicket.markup.repeater.data.DataViewBase.internalGetItemCount(DataViewBase.java:142)
at org.apache.wicket.markup.repeater.AbstractPageableView.getItemCount(AbstractPageableView.java:235)
at org.apache.wicket.markup.repeater.AbstractPageableView.getRowCount(AbstractPageableView.java:216)
at org.apache.wicket.markup.repeater.AbstractPageableView.getViewSize(AbstractPageableView.java:314)
at org.apache.wicket.markup.repeater.AbstractPageableView.getItemModels(AbstractPageableView.java:99)
at org.apache.wicket.markup.repeater.RefreshingView.onPopulate(RefreshingView.java:93)
at org.apache.wicket.markup.repeater.AbstractRepeater.onBeforeRender(AbstractRepeater.java:124)
at org.apache.wicket.markup.repeater.AbstractPageableView.onBeforeRender(AbstractPageableView.java:115)
at org.apache.wicket.Component.internalBeforeRender(Component.java:950)
at org.apache.wicket.Component.beforeRender(Component.java:1018)
at org.apache.wicket.MarkupContainer.onBeforeRenderChildren(MarkupContainer.java:1825)
... 81 more
Caused by: org.datanucleus.exceptions.NucleusException: Unable to create SQLExpression for mapping of type "org.datanucleus.store.rdbms.mapping.java.SerialisedMapping" since not supported
at org.datanucleus.store.rdbms.sql.expression.SQLExpressionFactory.newExpression(SQLExpressionFactory.java:189)
at org.datanucleus.store.rdbms.sql.expression.SQLExpressionFactory.newExpression(SQLExpressionFactory.java:155)
at org.datanucleus.store.rdbms.request.LocateBulkRequest.getStatement(LocateBulkRequest.java:158)
at org.datanucleus.store.rdbms.request.LocateBulkRequest.execute(LocateBulkRequest.java:283)
at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.locateObjects(RDBMSPersistenceHandler.java:564)
at org.datanucleus.ExecutionContextImpl.findObjects(ExecutionContextImpl.java:3313)
at org.datanucleus.api.jdo.JDOPersistenceManager.getObjectsById(JDOPersistenceManager.java:1850)
at org.apache.isis.core.runtime.system.persistence.PersistenceSession.loadPersistentPojos(PersistenceSession.java:1010)
at org.apache.isis.core.runtime.system.persistence.PersistenceSession.adaptersFor(PersistenceSession.java:1603)
at org.apache.isis.core.runtime.system.persistence.PersistenceSession.adaptersFor(PersistenceSession.java:1573)
at org.apache.isis.viewer.wicket.model.models.EntityCollectionModel$Type$1.loadInBulk(EntityCollectionModel.java:107)
at org.apache.isis.viewer.wicket.model.models.EntityCollectionModel$Type$1.load(EntityCollectionModel.java:93)
at org.apache.isis.viewer.wicket.model.models.EntityCollectionModel.load(EntityCollectionModel.java:454)
at org.apache.isis.viewer.wicket.model.models.EntityCollectionModel.load(EntityCollectionModel.java:70)
at org.apache.wicket.model.LoadableDetachableModel.getObject(LoadableDetachableModel.java:135)
at org.apache.isis.viewer.wicket.ui.components.collectioncontents.ajaxtable.CollectionContentsSortableDataProvider.size(CollectionContentsSortableDataProvider.java:68)
at org.apache.wicket.markup.repeater.data.DataViewBase.internalGetItemCount(DataViewBase.java:142)
at org.apache.wicket.markup.repeater.AbstractPageableView.getItemCount(AbstractPageableView.java:235)
at org.apache.wicket.markup.repeater.AbstractPageableView.getRowCount(AbstractPageableView.java:216)
at org.apache.wicket.markup.repeater.AbstractPageableView.getViewSize(AbstractPageableView.java:314)
at org.apache.wicket.markup.repeater.AbstractPageableView.getItemModels(AbstractPageableView.java:99)
at org.apache.wicket.markup.repeater.RefreshingView.onPopulate(RefreshingView.java:93)
at org.apache.wicket.markup.repeater.AbstractRepeater.onBeforeRender(AbstractRepeater.java:124)
at org.apache.wicket.markup.repeater.AbstractPageableView.onBeforeRender(AbstractPageableView.java:115)
// <-- 8 times the following lines
at org.apache.wicket.Component.internalBeforeRender(Component.java:950)
at org.apache.wicket.Component.beforeRender(Component.java:1018)
at org.apache.wicket.MarkupContainer.onBeforeRenderChildren(MarkupContainer.java:1825)
at org.apache.wicket.Component.onBeforeRender(Component.java:3916)
// -->
at org.apache.wicket.Page.onBeforeRender(Page.java:801)
at org.apache.wicket.Component.internalBeforeRender(Component.java:950)
at org.apache.wicket.Component.beforeRender(Component.java:1018)
at org.apache.wicket.Component.internalPrepareForRender(Component.java:2236)
at org.apache.wicket.Page.internalPrepareForRender(Page.java:242)
at org.apache.wicket.Component.render(Component.java:2325)
at org.apache.wicket.Page.renderPage(Page.java:1018)
at org.apache.wicket.request.handler.render.WebPageRenderer.renderPage(WebPageRenderer.java:124)
at org.apache.wicket.request.handler.render.WebPageRenderer.respond(WebPageRenderer.java:195)
at org.apache.wicket.core.request.handler.RenderPageRequestHandler.respond(RenderPageRequestHandler.java:175)
at org.apache.wicket.request.cycle.RequestCycle$HandlerExecutor.respond(RequestCycle.java:895)
at org.apache.wicket.request.RequestHandlerStack.execute(RequestHandlerStack.java:64)
at org.apache.wicket.request.cycle.RequestCycle.execute(RequestCycle.java:265)
at org.apache.wicket.request.cycle.RequestCycle.processRequest(RequestCycle.java:222)
at org.apache.wicket.request.cycle.RequestCycle.processRequestAndDetach(RequestCycle.java:293)
at org.apache.wicket.protocol.http.WicketFilter.processRequestCycle(WicketFilter.java:261)
at org.apache.wicket.protocol.http.WicketFilter.processRequest(WicketFilter.java:203)
at org.apache.wicket.protocol.http.WicketFilter.doFilter(WicketFilter.java:284)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1668)
at org.apache.isis.core.webapp.diagnostics.IsisLogOnExceptionFilter.doFilter(IsisLogOnExceptionFilter.java:52)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1668)
at org.apache.shiro.web.servlet.AbstractShiroFilter.executeChain(AbstractShiroFilter.java:449)
at org.apache.shiro.web.servlet.AbstractShiroFilter$1.call(AbstractShiroFilter.java:365)
at org.apache.shiro.subject.support.SubjectCallable.doCall(SubjectCallable.java:90)
at org.apache.shiro.subject.support.SubjectCallable.call(SubjectCallable.java:83)
at org.apache.shiro.subject.support.DelegatingSubject.execute(DelegatingSubject.java:383)
at org.apache.shiro.web.servlet.AbstractShiroFilter.doFilterInternal(AbstractShiroFilter.java:362)
at org.apache.shiro.web.servlet.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:125)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1668)
// ... some Jetty stuff
at java.lang.Thread.run(Thread.java:748)
After some research, I think the problems in questions 1 and 2 are related to this ISIS bug report #1902.
In short: The datanucleus extension plugin resolving mechanism does not seem to find the ISIS value type adapters and therefore cannot know how to serialize ISIS's Blob/Clob types.
According to the aforementioned ISIS bug report, this problem is fixed in 1.17.0, so I am trying to upgrade from 1.16.2 to this version (which introduced many other problems, but that is a separate topic).
For question 3, I found Minio, which basically addresses my problem but is a bit oversized for my needs. I will keep looking for other ways to store Blobs/Clobs on the local file system and will keep Minio in mind...
UPDATE:
I upgraded my project to ISIS version 1.17.0. It solved my problem in question 1 (now I get three columns for a Blob/Clob object).
The problem in question 2 (NucleusException) is not solved by the upgrade. I figured out that it is only thrown when returning a list of DomainObjects with Blob/Clob field(s), i.e. when they are rendered as a standalone table. If I go directly to an object's entity view, no exception is thrown and I can see/modify/download the Blob/Clob content.
In the meantime, I wrote my own DataNucleus plugin that stores the Blobs/Clobs as files on the file system.
UPDATE 2
I found a way to circumvent the org.datanucleus.exceptions.NucleusException: Unable to create SQLExpression for mapping of type "org.apache.isis.objectstore.jdo.datanucleus.valuetypes.IsisClobMapping" since not supported. It seems to be a problem with bulk-loading (but I do not know any details).
Deactivating bulk-loading via the property isis.persistor.datanucleus.standaloneCollection.bulkLoad=false (which is initially set to true by the ISIS archetypes) solved the problem.

Problems with queryBuilder greenDAO

I am using the latest version of greenDAO... I am missing something about how to use the data from the DB.
I need to prevent the creation of records that have the same PROFILE_NUMBER. Currently, during testing, I have inserted one record with a PROFILE_NUMBER of 1.
I need someone to show me an example of how to obtain the actual value of that field from the DB.
I am using this:
SvecPoleDao svecPoleDao = daoSession.getSvecPoleDao();
List<SvecPole> poles = svecPoleDao.queryBuilder().where(SvecPoleDao.Properties.Profile_number.eq(1)).list();
and it returns something... this:
[com.example.bobby.poleattachmenttest2_workingdatabase.db.SvecPole#bfe830c3.2]
Is this serialized? The actual value I am looking for here is 1.
Here is the solution. You'll need to use listLazy() instead of list().
List<SvecPole> poles = svecPoleDao.queryBuilder().where(SvecPoleDao.Properties.Profile_number.eq(1)).listLazy();
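For completeness, a minimal sketch of reading the actual field value from the query result; the bracketed output above is just the default toString() of the generated entity objects, so you read the property through the generated getter instead (getProfile_number() and its long return type are assumptions based on the Profile_number property):

```java
// Read the property from each returned entity instead of printing the list itself.
LazyList<SvecPole> poles = svecPoleDao.queryBuilder()
        .where(SvecPoleDao.Properties.Profile_number.eq(1))
        .listLazy();
try {
    for (SvecPole pole : poles) {
        long profileNumber = pole.getProfile_number(); // assumed generated getter and type
        System.out.println("profile number = " + profileNumber);
    }
} finally {
    poles.close(); // listLazy() keeps a cursor open until the list is closed
}
```

Checking whether this result is empty before inserting would then be one way to avoid creating a second record with the same PROFILE_NUMBER.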

Failed to make bulk upsert using mongo

I'm trying to do an upsert using the MongoDB driver; here is the code:
BulkWriteOperation builder = coll.initializeUnorderedBulkOperation();
DBObject toDBObject;
for (T entity : entities) {
    toDBObject = morphia.toDBObject(entity);
    builder.find(toDBObject).upsert().replaceOne(toDBObject);
}
BulkWriteResult result = builder.execute();
where "entity" is morphia object. When I'm running the code first time (there are no entities in the DB, so all of the queries should be insert) it works fine and I see the entities in the database with generated _id field. Second run I'm changing some fields and trying to save changed entities and then I receive the folowing error from mongo:
E11000 duplicate key error collection: statistics.counters index: _id_ dup key: { : ObjectId('56adfbf43d801b870e63be29') }
What did I forget to configure in my example?
I don't know the structure of dbObject, but that bulk upsert needs a valid query in order to work.
Let's say, for example, that you have a unique (_id) property called "id". A valid query would look like:
builder.find({id: toDBObject.id}).upsert().replaceOne(toDBObject);
This way, the engine can (a) find an object to update and then (b) update it (or insert it if the object wasn't found). Of course, you need the Java syntax for find, but the same rule applies: make sure your .find will find something, then do an update.
I believe (just a guess) that the way it's written now will find "all" docs and try to update the first one ... but the behavior you are describing suggests it's finding "no doc" and attempting an insert.
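Translated into the Java driver's bulk API, a minimal sketch of that advice could look like the following (it assumes Morphia has already populated the _id field in the converted DBObject, as described in the question):

```java
// Match each replacement against its _id only, so the existing document is
// replaced instead of a second insert with the same _id being attempted.
BulkWriteOperation builder = coll.initializeUnorderedBulkOperation();
for (T entity : entities) {
    DBObject dbObject = morphia.toDBObject(entity);
    DBObject query = new BasicDBObject("_id", dbObject.get("_id"));
    builder.find(query).upsert().replaceOne(dbObject);
}
BulkWriteResult result = builder.execute();
```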

Using $addToSet with Java Morphia aggregation

I have a MongoDB aggregation query and it works perfectly in the shell.
How can I rewrite this query to use it with Morphia?
org.mongodb.morphia.aggregation.Group.addToSet(String field) accepts only a single field name, but I need to add an object to the set.
Query:
......aggregate([
{$group:
{"_id":"$subjectHash",
"authors":{$addToSet:"$fromAddress.address"},
---->> "messageDataSet":{$addToSet:{"sentDate":"$sentDate","messageId":"$_id"}},
"messageCount":{$sum:1}}},
{$sort:{....}},
{$limit:10},
{$skip:0}
])
Java code:
AggregationPipeline aggregationPipeline = myDatastore.createAggregation(Message.class)
.group("subjectHash",
grouping("authors", addToSet("fromAddress.address")),
--------??????------>> grouping("messageDataSet", ???????),
grouping("messageCount", new Accumulator("$sum", 1))
).sort(...)).limit(...).skip(...);
That's currently not supported, but if you file an issue I'd be happy to include it in an upcoming release.
Thanks for your answer; I had guessed as much from the source code. :(
I don't want to use spring-data or the java-driver directly (for this project), so I changed my document representation.
I added a messageDataSet object that contains sentDate and messageId (and some other nested objects); these values now become duplicated within a document, which is a bad design.
The aggregation becomes: "messageDataSet":{$addToSet:"$messageDataSet"},
and the Java code is: grouping("messageDataSet", addToSet("messageDataSet")),
This works with Morphia. Thanks.
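Putting that workaround back into the pipeline from the question, the grouping stage would then look roughly like this (field and class names as above; the sort/limit/skip stages are left out):

```java
AggregationPipeline aggregationPipeline = myDatastore.createAggregation(Message.class)
        .group("subjectHash",
                grouping("authors", addToSet("fromAddress.address")),
                // the pre-built nested messageDataSet field is accumulated as a whole
                grouping("messageDataSet", addToSet("messageDataSet")),
                grouping("messageCount", new Accumulator("$sum", 1)));
```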

Morphia 0.110 does not find references with string id

We have been working with mongodb/morphia since the early versions of Morphia, so we have always used a String as the id field in our persistent objects:
@Id
private String id;

public PersistentEntity() {
    id = new ObjectId().toHexString();
}
Up to version 0.108 of Morphia that was fine. Since Morphia 0.110, however, references to objects with an id as outlined above are no longer found. The error shown is:
Mrz 10, 2015 9:49:35 AM org.mongodb.morphia.mapping.ReferenceMapper$1
eval WARNUNG: Null reference found when retrieving value for
<classname>
Currently our systems run with mongodb 2.6.8, java-driver 2.13.0 and morphia 0.108.
So how do we migrate the existing data on the customer systems? There is no way to change the _id field from String to ObjectId; MongoDB does not allow that. Are we stuck with Morphia 0.108?
Has anyone solved similar problems?
Are you using multiple datastores in the same app? There was a change in that area and we've had similar issues with the same version.
There will be a fix in 1.0-rc, which should be the next release.
