I have two entities that are related to each other via a OneToMany relation.
Entity 1 is named "Change" and looks like the following:
public class Change {
String attribute1;
@Column(name="\"ATTRIBUTE1\"")
public void getAttribute1() {
return this.attribute1;
}
public void setAttribute1(String attribute1) {
this.attribute1 = attribute1;
}
// and 7 more of these....
List<ChangeTask> relatedChangeTasks = new ArrayList<ChangeTask>();
@OneToMany(cascade={PERSIST, MERGE, REFRESH})
@JoinTable(name="CHANGE_CHANGETASK", joinColumns={@JoinColumn(name="CHANGE_ID")}, inverseJoinColumns={@JoinColumn(name="CHANGETASK_ID")})
@JoinColumn(name="\"relatedChangeTask_ID\"")
public List<ChangeTask> getRelatedChangeTasks() {
return this.relatedChangeTasks;
}
public void setRelatedChangeTasks(List<ChangeTask> relatedChangeTasks) {
this.relatedChangeTasks = relatedChangeTasks;
}
}
Entity 2 is named ChangeTask and extends Change.
public class ChangeTask extends Change {
// some additional attributes...
}
Persisting a new or existing Change record with one ChangeTask added to the "relatedChangeTasks" list works just fine.
Now I have to change the annotation of the 8 attributes from the default to @Lob, so Change now looks like this:
public class Change {
String attribute1;
@Lob
@Column(name="\"ATTRIBUTE1\"")
@Basic(fetch=EAGER)
public String getAttribute1() {
if(fieldHandler != null) {
return (java.lang.String) fieldHandler.readObject(this, "attribute1", attribute1);
}
return attribute1;
}
public void setAttribute1(String attribute1) {
if(fieldHandler != null) {
this.attribute1= (java.lang.String) fieldHandler.writeObject(this, "attribute1", this.attribute1, attribute1);
return;
}
this.attribute1= attribute1;
}
// and 7 more of these....
List<ChangeTask> relatedChangeTasks = new ArrayList<ChangeTask>();
@OneToMany(cascade={PERSIST, MERGE, REFRESH})
@JoinTable(name="CHANGE_CHANGETASK", joinColumns={@JoinColumn(name="CHANGE_ID")}, inverseJoinColumns={@JoinColumn(name="CHANGETASK_ID")})
@JoinColumn(name="\"relatedChangeTask_ID\"")
public List<ChangeTask> getRelatedChangeTasks() {
return this.relatedChangeTasks;
}
public void setRelatedChangeTasks(List<ChangeTask> relatedChangeTasks) {
this.relatedChangeTasks = relatedChangeTasks;
}
}
Now, when I try to add a given ChangeTask to a Change, the persist operation does not fail. But at the end of the transaction the relation has not been persisted, meaning the relation table "CHANGE_CHANGETASK" remains empty. When I debug through the whole process, I can see that the list contains one entry before the "entityManager.merge()" operation and still contains one entry after the merge. But it never arrives at the database.
Does anybody have an idea what I'm doing wrong here? As strange as it may sound, it must be something related to the @Lob annotations. As soon as I remove those from the entity again, everything works fine.
Thanks in advance.
You wrote
public void getAttribute1() {
That can't be right. I think you mean
public String getAttribute1() {
Additionally you have annotated the setter:
@Column(name="\"ATTRIBUTE1\"")
public void setAttribute1(String attribute1) {
this.attribute1 = attribute1;
}
You have to annotate either the field or the getter.
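For reference, a consistent getter-based (property access) mapping would look roughly like this. This is only a sketch assembled from the snippets above, not the poster's actual class:
public class Change {
    private String attribute1;
    // All mapping annotations sit on the getter; the setter stays unannotated.
    @Lob
    @Basic(fetch = FetchType.EAGER)
    @Column(name = "\"ATTRIBUTE1\"")
    public String getAttribute1() {
        return this.attribute1;
    }
    public void setAttribute1(String attribute1) {
        this.attribute1 = attribute1;
    }
}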
In a Mongock script I want to update the value of an attribute and save the change; in the project I currently only have a ReactiveMongoRepository.
I'm sure I'm missing something stupid, but I can't figure it out.
Here are my simplified classes.
Repository:
@Repository
public interface CalendarMongoRepository extends ReactiveMongoRepository<CalendarMongo, CalendarMongoId> {
//Some methods not used here
}
POJO:
@Document("calendars")
public class CalendarMongo {
@Id
private CalendarMongoId id;
private String calendar;
public String getCalendar() {
return calendar;
}
public CalendarMongo setCalendar(String calendar) {
this.calendar = calendar;
return this;
}
//Other getters setters and other attributes
}
POJO ID:
public class CalendarMongoId {
@Indexed(name = "branchCode_index")
private BranchEnum branchCode;
//Getters setters and other attributes
}
Mongock script:
@Slf4j
@ChangeUnit(id = "SetupAttribute", order = "7", author = "mongock")
public class DatabaseChangeV007SetupAttribute {
private final CalendarMongoRepository calendarMongoRepository;
public DatabaseChangeV007SetupAttribute(CalendarMongoRepository calendarMongoRepository) {
this.calendarMongoRepository = calendarMongoRepository;
}
@Execution
public void migrationMethod() {
calendarMongoRepository.findAll()
.map(calendarMongo -> calendarMongo.setCalendar("AA"))
.map(calendarMongoRepository::save);
}
@RollbackExecution
public void rollback() {
log.error("Rollback invoked");
}
}
When I launch my project, this script just does nothing; I do not see the change in my database. I'm sure the problem is not the structure of the script, based on other tests.
I tried to use doOnNext, but that doesn't seem to be the right approach.
I tried to use flatMap, but I didn't manage to make it work either.
Do you have an idea?
Thank you in advance!
I am currently working on an application connected to a MongoDB instance. I am having trouble where the 'id' field of my object is not coming back to the application; it is returned as null.
The schema has an 'entity' as defined below:
{
"entity_id": String,
"parent": String,
"relevance": boolean
}
I'm querying the collection using the Java Sync Driver (4.4.1) like so:
try {
Entity testDoc = collection.find(eq("entity_id", entity_id)).first();
if (testDoc != null) {
//add entity to a list
}
} catch (Exception e) {
LOGGER.log(Level.SEVERE, "Failed to get Entity", e);
}
For some reason this will give me every field in the object when I query EXCEPT the entity_id. I keep getting this returned as:
entity_id= null
Two things stick out to me. The first is that every other field is also a String (originally the Id was a UUID object, but I simplified it while troubleshooting), and those other fields still come back fine. The second is that there is a whitespace before this null value, as if it's being formatted; other null values return as field=null instead of field= null.
I looked to see whether there is some security setting preventing fields named *_id or *id from being returned, but I have found no such thing.
Edit: Here is the Entity Pojo for clarity
public class Entity {
@BsonProperty(value = "entity_id")
private String entityID;
@BsonProperty(value = "parent")
private String parent;
@Deprecated
@BsonProperty(value = "relevance")
private boolean relevance;
public Entity() {}
public Entity(String entityID, String parent, Boolean relevance) {
this.entityID = entityID;
this.parent = parent;
this.relevance = relevance;
}
public String getEntityID() {
return entityID;
}
public void setEntityID(String entityID) {
this.entityID = entityID;
}
public String getParent() {
return parent;
}
public void setParent(String parent) {
this.parent = parent;
}
public boolean isRelevant() {
return relevance;
}
public void relevance(boolean relevance) {
this.relevance = relevance;
}
}
So, an update for anyone watching this: it appears to have been an issue with my Eclipse IDE.
I reimported the project into IntelliJ Community Edition, rebuilt the Maven project, etc. After doing so, the test cases passed and my entityID is returned by the query. Hopefully anyone else who runs into this issue can do something similar.
I need something that doesn't seem all that specific, but I was still unable to come up with a nice, clean solution.
Say I have a very simple Hibernate/JPA entity:
@Entity(name="entity")
public class Type {
@Id
@GeneratedValue(strategy=GenerationType.AUTO)
private Long id;
@Column(unique = true, nullable = false)
private String name;
@Column(unique = false, nullable = false)
private boolean defaultType;
}
What I need is to somehow annotate the defaultType field so that only (and exactly) one persisted entity has this value set to true. When a new entity gets persisted with defaultType set to true, the old entity (the one with defaultType=true) has to be altered and its defaultType changed to false. The same rule should apply if an existing entity is changed so that its defaultType becomes true.
As far as I know this can be achieved in business logic (e.g. in the DAO layer), with a DB trigger, or with Hibernate's interceptors or events (if there is another way, please let me know). I tried the DAO solution, but it's a poor fit because it can be bypassed and it is really clumsy for such a simple operation. DB triggers cannot be added with Hibernate/JPA annotations (if I am not mistaken), and I am not sure how to implement this functionality with Hibernate interceptors/events.
So, what is the best solution for this problem?
You need to use a JPA callback method, for example @PreUpdate or @PostUpdate, for instance:
@Entity
@EntityListeners(com.acme.AlertMonitor.class) // set callback method in another class
public class Account {
Long accountId;
Integer balance;
boolean preferred;
@Id
public Long getAccountId() { ... }
...
public Integer getBalance() { ... }
...
@Transient
public boolean isPreferred() { ... }
...
public void deposit(Integer amount) { ... }
public Integer withdraw(Integer amount) throws NSFException {... }
@PreUpdate // callback method in some class
protected void validateCreate() {
if (getBalance() < MIN_REQUIRED_BALANCE)
throw new AccountException("Insufficient balance to open an account");
}
@PostUpdate // callback method in some class
protected void adjustPreferredStatus() {
preferred =
(getBalance() >= AccountManager.getPreferredStatusLevel());
}
}
// callback method in another class
public class AlertMonitor {
@PreUpdate // callback method in another class
public void updateAccountAlert(Account acct) {
Alerts.sendMarketingInfo(acct.getAccountId(), acct.getBalance());
}
}
Update: Regarding your question, if I understand what you want, this code may help you:
@Entity(name="entity")
@EntityListeners(com.yourpackage.TypeListener.class)
public class Type {
...
@Column(unique = false, nullable = false)
private boolean defaultType;
}
public class TypeListener {
private static Type objectWithTrue = null;
public void init() { // call this method when application is started
List<Type> results = entityManager
.createQuery("from Type", Type.class)
.getResultList();
for(Type type: results) {
if(type.getDefaultType()) {
objectWithTrue = type;
}
}
}
private void changeDefaultType(Type changed) {
if(changed.getDefaultType()) {
if(changed != objectWithTrue && objectWithTrue != null) {
objectWithTrue.setDefaultType(false);
}
objectWithTrue = changed;
}
}
@PostPersist
public void newType(Type changed) {
changeDefaultType(changed);
}
@PostUpdate
public void updateType(Type changed) {
changeDefaultType(changed);
}
@PreRemove
public void removeType(Type changed) {
if(changed.getDefaultType() && objectWithTrue == changed) {
objectWithTrue = null;
}
}
}
OR
You can use the @PreUpdate and @PrePersist listener callbacks and each time overwrite all Type objects without storing any variable (this is not as good for performance as the first example, but it is more reliable):
@PreUpdate
void updateType(Type changed) {
if (changed.getDefaultType()) {
List<Type> results = entityManager
.createQuery("from Type", Type.class)
.getResultList();
for(Type type: results) {
if(changed != type && type.getDefaultType()) {
type.setDefaultType(false);
}
}
}
}
I am using @CascadeSave to save child objects in a separate collection.
My document classes are:
public class FbUserProfile{
@Id
private long id;
@DBRef(lazy=true)
@CascadeSave()
private Set<FacebookFriend> friends;
@DBRef(lazy=true)
@CascadeSave()
private Set<FacebookFriendList> customFriendList;
}
public class FacebookFriend{
@Id
private long id;
private String name;
}
public class FacebookFriendList{
@Id
private long id;
private String name;
private String list_type;
}
I add some objects to both friends and customFriendList,
and try to update the fbUserProfile object using:
mongoTemplate.save(fbUserProfile);
Note: fbUserProfile already exists in the DB; I am now updating it.
Error Message: Cannot perform cascade save on child object without id set
If I remove @CascadeSave it works fine for me. How can I cascade-save Set objects?
I am also using @CascadeSave with other objects and it works fine, but those are not Set objects.
I found the same tutorial in other places: Baeldung's and JavaCodeGeeks' (the latter is the one I've followed).
I had the same problem, and I was able to solve it.
It happens when you try to persist a collection. It doesn't matter that the collection's items have the @Id, because the collection itself won't have one. I edited the code in the EventListener's onBeforeConvert to check whether the field you're trying to cascade-save is a collection (in my case a List). If it is a list, you just cycle through it, checking each individual item for an @Id and saving them.
If it's not a collection, you still have to persist it the same way as before:
@Override
public void onBeforeConvert(Object source) {
ReflectionUtils.doWithFields(source.getClass(), new ReflectionUtils.FieldCallback() {
@Override
public void doWith(Field field)
throws IllegalArgumentException, IllegalAccessException {
ReflectionUtils.makeAccessible(field);
if (field.isAnnotationPresent(DBRef.class) && field.isAnnotationPresent(CascadeSave.class)){
final Object fieldValue = field.get(source);
if(fieldValue instanceof List<?>){
for (Object item : (List<?>)fieldValue){
checkNSave(item);
}
}else{
checkNSave(fieldValue);
}
}
}
});
}
private void checkNSave(Object fieldValue){
DbRefFieldCallback callback = new DbRefFieldCallback();
ReflectionUtils.doWithFields(fieldValue.getClass(), callback);
if (!callback.isIdFound()){
throw new MappingException("Oops, something went wrong. Child doesn't have @Id?");
}
mongoOperations.save(fieldValue);
}
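The DbRefFieldCallback used in checkNSave() isn't shown above. A minimal sketch of it, assuming it only needs to detect an @Id-annotated field on the child (using Spring's ReflectionUtils.FieldCallback and Spring Data's @Id), could look like this:
private static class DbRefFieldCallback implements ReflectionUtils.FieldCallback {
    private boolean idFound;
    @Override
    public void doWith(Field field) throws IllegalArgumentException, IllegalAccessException {
        ReflectionUtils.makeAccessible(field);
        // Remember that the child carries an @Id field, so it is safe to save.
        if (field.isAnnotationPresent(Id.class)) {
            idFound = true;
        }
    }
    public boolean isIdFound() {
        return idFound;
    }
}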
The best way to set an ID on the dependent child object is to write a listener class by extending the AbstractMongoEventListener class and overriding the onBeforeConvert() method.
public class CustomMongoEventListener extends
AbstractMongoEventListener<Object> {
@Autowired
private MongoOperations mongoOperations;
@Override
public void onBeforeConvert(final Object entity) {
// Note: this assumes your entities expose an accessible String id field.
if (entity.id == null || entity.id.isEmpty()) {
entity.id = generateGuid(); // generate a random sequence ID
}
}
public static String generateGuid() {
SecureRandom randomGen = new SecureRandom();
byte[] byteArray = new byte[16];
randomGen.nextBytes(byteArray);
return new Base32().encodeToString(byteArray).substring(0,26);
}
}
Finally, register your custom listener in your configuration file. For the annotation approach, use the following code to register it:
@Bean
public CustomMongoEventListener cascadingMongoEventListener() {
return new CustomMongoEventListener();
}
The above solution works fine in case you have a list, but we can avoid firing a save query for each element of the list, since that hurts performance. Here is the solution I came up with based on the above code (note the documents list, which collects the children so they can be inserted in a single batch):
private final List<Object> documents = new ArrayList<>();
@Override
public void onBeforeConvert(BeforeConvertEvent<Object> event) {
Object source = event.getSource();
ReflectionUtils.doWithFields(source.getClass(), new ReflectionUtils.FieldCallback() {
@Override
public void doWith(Field field)
throws IllegalArgumentException, IllegalAccessException {
ReflectionUtils.makeAccessible(field);
if (field.isAnnotationPresent(DBRef.class) && field.isAnnotationPresent(CascadeSave.class)){
final Object fieldValue = field.get(source);
if(fieldValue instanceof List<?>){
for (Object item : (List<?>)fieldValue){
checkNAdd(item);
}
}else{
checkNAdd(fieldValue);
}
mongoOperations.insertAll(documents);
}
}
});
}
private void checkNAdd(Object fieldValue){
DbRefFieldCallback callback = new DbRefFieldCallback();
ReflectionUtils.doWithFields(fieldValue.getClass(), callback);
if (!callback.isIdFound()){
throw new MappingException("Oops, something went wrong. Child doesn't have @Id?");
}
documents.add(fieldValue);
}
OK, I extended the class so that it checks whether the document exists: if it exists, it updates the document; otherwise, it inserts it:
@Component
class GenericCascadeMongo(
private val mongoTemplate: MongoTemplate
) : AbstractMongoEventListener<Any>() {
override fun onBeforeConvert(event: BeforeConvertEvent<Any?>) {
val source = event.source
?: return
ReflectionUtils.doWithFields(source.javaClass) { field ->
ReflectionUtils.makeAccessible(field)
if (field.isAnnotationPresent(DBRef::class.java) && field.isAnnotationPresent(CascadeSave::class.java)) {
val fieldValue = field[source]
?: return@doWithFields
if (fieldValue is List<*>) {
fieldValue.filterNotNull().forEach {
checkAndSave(it)
}
} else {
checkAndSave(fieldValue)
}
}
}
}
private fun checkAndSave(fieldValue: Any) {
try {
val callback = DbRefFieldCallback(fieldValue)
ReflectionUtils.doWithFields(fieldValue.javaClass, callback)
if (!callback.isIdFound && callback.id == null) {
mongoTemplate.insert(fieldValue)
}
if (callback.id != null) {
val findById = mongoTemplate.exists(Query(Criteria.where(MConst.MONGO_ID).`is`(callback.id)), fieldValue.javaClass)
if (findById) {
mongoTemplate.save(fieldValue)
} else {
mongoTemplate.insert(fieldValue)
}
}
} catch (e: Exception) {
e.printStackTrace()
}
}
private class DbRefFieldCallback(val fieldValue: Any?) : FieldCallback {
var isIdFound = false
private set
var id: String? = null
private set
@Throws(IllegalArgumentException::class, IllegalAccessException::class)
override fun doWith(field: Field) {
ReflectionUtils.makeAccessible(field)
if (field.isAnnotationPresent(Id::class.java)) {
isIdFound = true
id = ReflectionUtils.getField(field, fieldValue)?.toString()
}
}
}
}
Could you please help me find where I made a mistake?
I switched from SimpleBeanEditorDriver to RequestFactoryEditorDriver, and my code no longer saves the full graph even though the with() method is called. It does, however, correctly load the full graph in the constructor.
Could it be caused by the circular reference between OrganizationProxy and PersonProxy? I don't know what else to think :( It worked with SimpleBeanEditorDriver, though.
Below is my client code. Let me know if you want me to add the sources of the proxies to this question (or you can see them here).
public class NewOrderView extends Composite
{
interface Binder extends UiBinder<Widget, NewOrderView> {}
private static Binder uiBinder = GWT.create(Binder.class);
interface Driver extends RequestFactoryEditorDriver<OrganizationProxy, OrganizationEditor> {}
Driver driver = GWT.create(Driver.class);
@UiField
Button save;
@UiField
OrganizationEditor orgEditor;
AdminRequestFactory requestFactory;
AdminRequestFactory.OrderRequestContext requestContext;
OrganizationProxy organization;
public NewOrderView()
{
initWidget(uiBinder.createAndBindUi(this));
requestFactory = createFactory();
requestContext = requestFactory.contextOrder();
driver.initialize(requestFactory, orgEditor);
String[] paths = driver.getPaths();
createFactory().contextOrder().findOrganizationById(1).with(paths).fire(new Receiver<OrganizationProxy>()
{
@Override
public void onSuccess(OrganizationProxy response)
{
if (response == null)
{
organization = requestContext.create(OrganizationProxy.class);
organization.setContactPerson(requestContext.create(PersonProxy.class));
} else
organization = requestContext.edit(response);
driver.edit(organization, requestContext);
}
@Override
public void onFailure(ServerFailure error)
{
createConfirmationDialogBox(error.getMessage()).center();
}
});
}
private static AdminRequestFactory createFactory()
{
AdminRequestFactory factory = GWT.create(AdminRequestFactory.class);
factory.initialize(new SimpleEventBus());
return factory;
}
@UiHandler("save")
void buttonClick(ClickEvent e)
{
e.stopPropagation();
save.setEnabled(false);
try
{
AdminRequestFactory.OrderRequestContext ctx = (AdminRequestFactory.OrderRequestContext) driver.flush();
if (!driver.hasErrors())
{
// Link to each other
PersonProxy contactPerson = organization.getContactPerson();
contactPerson.setOrganization(organization);
String[] paths = driver.getPaths();
ctx.saveOrganization(organization).with(paths).fire(new Receiver<Void>()
{
@Override
public void onSuccess(Void arg0)
{
createConfirmationDialogBox("Saved!").center();
}
@Override
public void onFailure(ServerFailure error)
{
createConfirmationDialogBox(error.getMessage()).center();
}
});
}
} finally
{
save.setEnabled(true);
}
}
}
with() is only used for retrieval of information, so using with() on a call with a void return type is useless (but harmless).
Whether a full graph is persisted is entirely up to your server-side code, which is intimately bound to your persistence API (JPA, JDO, etc.).
First, check that the Organization object you receive in your save() method on the server side is correctly populated. If it isn't, check your Locators (and/or static findXxx methods); otherwise, check your save() method's code.
Judging from the code above, I can't see a reason why it wouldn't work.
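If the server side does turn out to be the issue, a minimal JPA-style sketch of a saveOrganization() that also persists the linked person could look like the following; the entity names and the emf field are assumptions for illustration, not code from the question:
// Hypothetical server-side service method (names assumed).
// The key point: the contact person reachable from the organization must be
// persisted/merged too, either explicitly or through a cascade on the mapping.
public static Organization saveOrganization(Organization organization) {
    EntityManager em = emf.createEntityManager(); // emf: your EntityManagerFactory
    em.getTransaction().begin();
    try {
        Organization managed = em.merge(organization); // cascades to the Person if mapped that way
        em.getTransaction().commit();
        return managed;
    } catch (RuntimeException e) {
        em.getTransaction().rollback();
        throw e;
    } finally {
        em.close();
    }
}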
It took me some time to realize that the problem was the composite id of the Person entity.
Below is the code snippet of the PojoLocator that is used by my proxy entities.
public class PojoLocator extends Locator<DatastoreObject, Long>
{
@Override
public DatastoreObject find(Class<? extends DatastoreObject> clazz, Long id)
{
}
@Override
public Long getId(DatastoreObject domainObject)
{
}
}
In order to fetch a child entity from the Datastore you need the id of its parent class. To achieve that, I switched the Locator<>'s "ID class" to String, which holds the textual form of Objectify's Key<> class.
Here is how it looks now:
public class PojoLocator extends Locator<DatastoreObject, String>
{
@Override
public DatastoreObject find(Class<? extends DatastoreObject> clazz, String id)
{
Key<DatastoreObject> key = Key.create(id);
return ofy.load(key);
}
@Override
public String getId(DatastoreObject domainObject)
{
if (domainObject.getId() != null)
{
Key<DatastoreObject> key = ofy.fact().getKey(domainObject);
return key.getString();
} else
return null;
}
}
Please note that your implementation may differ slightly, because I'm using Objectify 4.
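For context, the locator only gets used because the proxies point at it. A proxy declaration wired to the String-keyed locator looks roughly like this; the domain class name and the exact accessors are assumptions, since the real proxy sources are not shown here:
// Hypothetical proxy wiring (only getContactPerson/setContactPerson appear in the question).
@ProxyFor(value = Organization.class, locator = PojoLocator.class)
public interface OrganizationProxy extends EntityProxy {
    PersonProxy getContactPerson();
    void setContactPerson(PersonProxy contactPerson);
}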