I want the first one (aId) to be generated:
@Id
@Column(name = "PRODUCT_ID", unique = true, nullable = false, precision = 12, scale = 0)
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "PROD_GEN")
@BusinessKey
public Long getAId() {
    return this.aId;
}
I want bId to initially be exactly the same as aId. One approach is to insert the entity, then fetch the aId generated by the DB (2nd query), and then update the entity, setting bId equal to aId (3rd query). Is there a way to have bId get the same generated value as aId?
Note that afterwards, I want to be able to update bId from my GUI.
If the solution is JPA, even better.
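For context, bId would be mapped as an ordinary, updatable property; a minimal sketch (the column name is an assumption, not taken from the original code):
@Column(name = "B_ID")
public Long getBId() {
    return this.bId;
}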
Choose your poison:
Option #1
You could annotate bId with org.hibernate.annotations.Generated and use a database trigger on insert (I'm assuming NEXTVAL has already been assigned to AID, so we assign CURRVAL to BID):
CREATE OR REPLACE TRIGGER "MY_TRIGGER"
before insert on "MYENTITY"
for each row
begin
select "MYENTITY_SEQ".curval into :NEW.BID from dual;
end;
I'm not a big fan of triggers and things that happen behind the scenes, but this seems to be the easiest option (not the best one for portability, though).
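To make Hibernate re-read the trigger-assigned value after the INSERT, the bId mapping could look roughly like the sketch below (assuming Hibernate 5's org.hibernate.annotations.Generated and GenerationTime; in Hibernate 6 the annotation takes an event argument instead). Hibernate requires an insert-generated property to be non-insertable, while it can stay updatable for the later GUI edits:
import org.hibernate.annotations.Generated;
import org.hibernate.annotations.GenerationTime;

// Hibernate issues a SELECT after the INSERT to pick up the value set by the trigger
@Generated(GenerationTime.INSERT)
@Column(name = "BID", insertable = false)
public Long getBId() {
    return this.bId;
}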
Option #2
Create a new entity, persist it, flush the entity manager so the id gets assigned, copy aId into bId, then merge the entity.
em.getTransaction().begin();
MyEntity e = new MyEntity();
...
em.persist(e);
em.flush();
e.setBId(e.getAId());
em.merge(e);
...
em.getTransaction().commit();
Ugly, but it works.
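As a side note, since the instance is already managed after persist(), the merge() call is arguably redundant; the change to bId is flushed automatically when the transaction commits. A trimmed sketch:
em.getTransaction().begin();
MyEntity e = new MyEntity();
em.persist(e);
em.flush();               // forces the INSERT; aId is definitely assigned at this point
e.setBId(e.getAId());     // e is managed, so this change is written as an UPDATE at commit
em.getTransaction().commit();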
Option #3
Use callback annotations to set the bId in-memory (until it gets written to the database):
@PostPersist
@PostLoad
public void initializeBId() {
    if (this.bId == null) {
        this.bId = aId;
    }
}
This should work if you don't need the id to be written on insert (but in that case, see Option #4).
Option #4
You could actually add some logic in the getter of bId instead of using callbacks:
public Long getBId() {
if (this.bId == null) {
return this.aId;
}
return this.bId;
}
Again, this will work if you don't need the id to be persisted in the database on insert.
If you use JPA, after inserting the new A the id should be set to the generated value, I thought (it may depend on which JPA provider you use), so no 2nd query is needed. Then set bId to the aId value in your DAO?
The idea is basically to extend some Repositories with custom functionality. So I got this setup, which DOES work!
@MappedSuperclass
abstract class MyBaseEntity {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    var id: Int = 0

    var eid: Int = 0
}
interface MyRepository<T : MyBaseEntity> {
    @Transactional
    fun saveInsert(entity: T): Optional<T>
}
open class MyRepositoryImpl<T : MyBaseEntity> : MyRepository<T> {
@Autowired
private lateinit var entityManager: EntityManager
@Transactional
override fun saveInsert(entity: T): Optional<T> {
// lock table
entityManager.createNativeQuery("LOCK TABLE myTable WRITE").executeUpdate()
// get current max EID
val result = entityManager.createNativeQuery("SELECT MAX(eid) FROM myTable LIMIT 1").singleResult as? Int ?: 0
// set entities EID with incremented result
entity.eid = result + 1
// test if table is locked. sending manually 2-3 POST requests to REST
Thread.sleep(5000)
// save
entityManager.persist(entity)
// unlock
entityManager.createNativeQuery("UNLOCK TABLES").executeUpdate()
return Optional.of(entity)
}
}
How would I do this in a more Spring-like way?
At first, I thought @Transactional would do the LOCK and UNLOCK stuff. I tried a couple of additional parameters and @Lock. I did go through the docs and some tutorials, but the abstract technical English is often not easy to understand. In the end, I did not get a working solution, so I manually added the table locking, which works fine. I would still prefer a more Spring-like way to do it.
1) There might be a problem with your current design as well: persist() does not instantly INSERT a row into the database. That happens on transaction commit, when the method returns.
So you unlock the table before the actual insert:
// save
entityManager.persist(entity) // -> There is no INSERT at this point.
// unlock
entityManager.createNativeQuery("UNLOCK TABLES").executeUpdate()
2) Going back to how to do this with JPA only, without native queries (it still requires a bit of a workaround, as table locking is not supported out of the box):
// lock table by loading one existing entity and setting the LockModeType
Entity lockedEntity = entityManager.find(Entity.class, 1, LockModeType.PESSIMISTIC_WRITE);
// get current max EID, TRY NOT TO USE NATIVE QUERY HERE
// set entities EID with incremented result
// save
entityManager.persist(entity)
entityManager.flush() // -> Force an actual INSERT
// unlock by passing the previous entity
entityManager.lock(lockedEntity, LockModeType.NONE)
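For a more Spring-like variant, the pessimistic lock can also be declared on a Spring Data query method with @Lock, so no native LOCK/UNLOCK statements are needed and the lock is released automatically when the transaction ends. A sketch in Java (the repository, method and entity names are assumptions):
import java.util.Optional;
import javax.persistence.LockModeType;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Lock;
import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.query.Param;

public interface MyEntityRepository extends JpaRepository<MyEntity, Integer> {

    // Acquires a row-level pessimistic write lock that is held until the
    // surrounding transaction commits or rolls back.
    @Lock(LockModeType.PESSIMISTIC_WRITE)
    @Query("select e from MyEntity e where e.id = :id")
    Optional<MyEntity> findByIdForUpdate(@Param("id") Integer id);
}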
I am having trouble using Hibernate with MSSQL Server 2012. No matter what I do, when I try to insert a row into a certain table using Hibernate, the generated id I get back is 0.
Here is the model.
@Entity
@Table(name = "tbl_ClientInfo")
public class ClientInfo {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    @Column(name = "auto_Client_ID", unique = true, nullable = false)
    private int auto_Client_ID;
...
Here is the write.
public boolean addNewClient(Client client) {
// there is a class that wraps SessionFactory as singleton
Session session = getSessionFactory().openSession();
Transaction tx = null;
Integer clientFamId; //client family info id
Integer clientId; // actual client id
try {
// create fam info first with some data - need id for ClientInfo
tx = session.beginTransaction();
ClientFam clientFam = new ClientFam();
clientFamId = (Integer) session.save(clientFam);
clientFamId = (Integer) session.getIdentifier(clientFam); // this returns the right id
session.flush();
ClientInfo clientInfo = new ClientInfo();
clientInfo.setABunchOfFields(withStuff); //multiple methods
session.save(clientInfo);
clientId = (Integer) session.getIdentifier(clientInfo); // this is always 0
session.flush();
tx.commit();
} catch (HibernateException e) {
if (tx!=null) tx.rollback();
e.printStackTrace();
return false;
} finally {
session.close();
}
return true;
}
In the database the PK auto_Client_ID is clustered and set to IDENTITY(1,1). Both the ClientInfo and ClientFam records are created in the db, but Hibernate returns 0. I also tried catching the value returned from save, but it's also 0.
I don't want to commit in between the separate inserts: the transaction should commit only when all inserts succeed (there are more after this, but I can't get to them because of this id issue yet).
The model for ClientFam is almost the same: the id field is #GeneratedValue(strategy=GenerationType.IDENTITY) as well.
I also tried specifying this for ClientInfo
@GeneratedValue(generator = "increment", strategy = GenerationType.IDENTITY)
@GenericGenerator(name = "increment", strategy = "increment")
The first time I ran it, it returned the correct value. However, the second time I ran it, I got an error:
Cannot insert explicit value for identity column in table 'Report' when IDENTITY_INSERT is set to OFF
And that was the end of trying that. Everywhere I looked, the recommendation is to use GenerationType.IDENTITY for an auto-incremented field in the db. That's supposed to return the right values. What might I be doing wrong?
I also tried getting the id from the ClientInfo object itself (I thought it should get written into it) after the write, but it was also 0. Makes me think something is wrong with my ClientInfo model and/or the annotations in it.
I found the problem with my situation; it has nothing to do with Hibernate. There is an INSTEAD OF INSERT trigger on the table that wasn't returning the id, and that was messing up what save() returns.
This is just an educated guess, but you might want to remove the unique = true clause from the @Column definition. Hibernate may be handling the column as a unique constraint as opposed to a primary key.
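For completeness: with GenerationType.IDENTITY, Hibernate performs the INSERT eagerly during save() and populates the id field immediately, so once the trigger issue is resolved the id can also be read straight from the entity. A sketch (the getter name is an assumption):
session.save(clientInfo);                          // the INSERT happens here with IDENTITY
int clientId = clientInfo.getAuto_Client_ID();     // populated by Hibernate right after the INSERT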
Entity with an id autogenerated from an Oracle trigger and sequence.
@Entity
@Table(name = "REPORT", schema = "WEBPORTAL")
public class Report {
private Integer id;
....
@Id
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "report_sequence")
@SequenceGenerator(name = "report_sequence", sequenceName = "report_id_seq")
@Column(name = "REPORT_ID", unique = true, nullable = false)
public Integer getId() {
return id;
}
....
}
Service
#Service("reportService")
public class ReportServiceImpl implements ReportService {
....
@Transactional(readOnly = false)
public void saveOrUpdate(Report report) {
reportDAO.saveOrUpdate(report);
}
}
DAO
@Repository
public class ReportDAOImpl implements ReportDAO {
....
@Override
public Report save(Report report) {
try {
Session session = sessionFactory.getCurrentSession();
session.save(report);
} catch (Exception e) {
logger.error("error", e);
}
return report;
}
}
When I call the service's saveOrUpdate and then try to read the id of the entity, I get a different value than the one persisted in the database. The autogenerated values in the database are all fine. Any suggestions?
reportService.saveOrUpdate(report);
System.out.println(report.getId());
prints: 4150
but saved id in database is: 84
NOTE: The reason I need the id is that I wanted to save the children with cascade. But the foreign key on the children was different in the database (it held the id values I get with getId()).
Also, the id generated in the database is incremented by 2, e.g. 80, 82, 84.
UPDATE:
Oracle trigger for sequence generation
CREATE OR REPLACE TRIGGER REPORT_ID_TRIG
BEFORE INSERT ON WEBPORTAL.REPORT
FOR EACH ROW
BEGIN
SELECT report_id_seq.NEXTVAL
INTO :new.report_id
FROM dual;
END;
ANSWER: The trigger should check whether the id is null
CREATE OR REPLACE TRIGGER REPORT_ID_TRIG
BEFORE INSERT ON WEBPORTAL.REPORT
FOR EACH ROW
WHEN (new.report_id is null)
BEGIN
SELECT report_id_seq.NEXTVAL
INTO :new.report_id
FROM dual;
END;
DESCRIPTION:
@GeneratedValue is not just a plain sequence generator; it uses a HiLo-style allocation. When Hibernate first requests an id from the database, it reserves a block based on the allocation size (50 by default; it can differ), and the next 50 new entities are given ids from that block before the database is asked again. This reduces the number of round trips to the database.
The numbers I got in Java were the ones that should have been saved on the report.
Without the null check in the trigger, Hibernate first requested an id from the database, calling the sequence's NEXTVAL. When Hibernate then persisted the row (completing the transaction), the trigger called the sequence a second time and stored that value in the database. So the ReportDetails rows held the id value Hibernate had assigned to the Report, while the Report row itself got the id set by the trigger.
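Keeping the null check in the trigger is one fix; another common adjustment (a sketch, not from the original post) is to set allocationSize = 1 on the generator so Hibernate's ids match the sequence values one-to-one, at the cost of one sequence call per insert:
@Id
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "report_sequence")
@SequenceGenerator(name = "report_sequence", sequenceName = "report_id_seq", allocationSize = 1)
@Column(name = "REPORT_ID", unique = true, nullable = false)
public Integer getId() {
    return id;
}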
The problem is that two separate mechanisms are in place to generate the key:
one at the Hibernate level, which calls a sequence, uses the value to populate the id column, and sends it to the database as the insert key,
and another mechanism at the database level that Hibernate does not know about: the column is populated by a trigger.
Hibernate thinks that the insert was made with the value of the sequence, but in the database something else occurred. The simplest solution would probably be to remove the trigger mechanism, and let Hibernate populate the key based on the sequence only.
Another solution:
Check that your trigger definition is in the following format; the WHEN (new.report_id is null) clause is the important part.
CREATE OR REPLACE TRIGGER TRIGGER_NAME
BEFORE INSERT ON TABLE_NAME
FOR EACH ROW
WHEN (new.id is null)
BEGIN
SELECT SEQUENCE_NAME.NEXTVAL
INTO :new.id
FROM dual;
END;
I need to save data into 2 tables (an entity and an association table).
I simply save my entity with the save() method from my entity repository.
Then, for performance reasons, I need to insert rows into an association table using native SQL. The rows have a reference to the entity I saved before.
Here is the issue: I get an integrity constraint exception on a foreign key. The entity saved first isn't visible to this second query.
Here is my code :
The repo :
public interface DistributionRepository extends JpaRepository<Distribution, Long>, QueryDslPredicateExecutor<Distribution> {
@Modifying
@Query(value = "INSERT INTO DISTRIBUTION_PERIMETER(DISTRIBUTION_ID, SERVICE_ID) SELECT :distId, p.id FROM PERIMETER p "
+ "WHERE p.id in (:serviceIds) AND p.discriminator = 'SRV' ", nativeQuery = true)
void insertDistributionPerimeter(@Param(value = "distId") Long distributionId, @Param(value = "serviceIds") Set<Long> servicesIds);
}
The service :
@Service
public class DistributionServiceImpl implements IDistributionService {
@Inject
private DistributionRepository distributionRepository;
@Override
@Transactional
public DistributionResource distribute(final DistributionResource distribution) {
// 1. Entity creation and saving
Distribution created = new Distribution();
final Date distributionDate = new Date();
created.setStatus(EnumDistributionStatus.distributing);
created.setDistributionDate(distributionDate);
created.setDistributor(agentRepository.findOne(distribution.getDistributor().getMatricule()));
created.setDocument(documentRepository.findOne(distribution.getDocument().getTechId()));
created.setEntity(entityRepository.findOne(distribution.getEntity().getTechId()));
created = distributionRepository.save(created);
// 2. Association table
final Set<Long> serviceIds = new HashSet<Long>();
for (final ServiceResource sr : distribution.getServices()) {
serviceIds.add(sr.getTechId());
}
// EXCEPTION HERE
distributionRepository.insertDistributionPerimeter(created.getId(), serviceIds);
}
}
The 2 queries seem to be in different transactions even though I set the @Transactional annotation. I also tried to execute my second query with entityManager.createNativeQuery() and got the same result...
Invoke entityManager.flush() before you execute your native queries or use saveAndFlush instead.
In your specific case I would recommend using
created = distributionRepository.saveAndFlush(created);
Important: your "native" queries must use the same transaction! (Or you need a different transaction isolation level.)
you also wrote:
I don't really understand why the flush action is not done by default
Flushing is handled by Hibernate (it can be configured; the default is "auto"). This means that Hibernate may flush the data at any point in time, but always before you commit the transaction or execute another SQL statement VIA HIBERNATE. So normally this is no problem, but in your case you bypass Hibernate with your native query, so Hibernate does not know about this statement and therefore does not flush its data.
See also this answer of mine: https://stackoverflow.com/a/17889017/280244 about this topic
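Applied to the service from the question, the suggested change would look roughly like this (a sketch):
// 1. Entity creation and saving
Distribution created = new Distribution();
// ... set fields as before ...
created = distributionRepository.saveAndFlush(created);   // forces the INSERT now, inside the same transaction

// 2. Association table: the row for "created" is already visible to this statement
distributionRepository.insertDistributionPerimeter(created.getId(), serviceIds);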
I'm trying to manually delete every entity that's in a collection on an entity. The problem is, the entities don't get deleted from the database, even though they get removed from the collection on the task.
Below is the code I'm using to achieve this:
public int removeExistingCosts(final DataStoreTask task) {
int removedAccumulator = 0;
Query query = entityManager.createNamedQuery(DataStoreCost.GET_COSTS_FOR_TASK);
query.setParameter(DataStoreCost.TASK_VARIABLE_NAME, task);
try {
List costsForTask = query.getResultList();
for(Object cost : costsForTask) {
task.getCosts().remove(cost);
removedAccumulator++;
}
} catch (NoResultException e) {
logger.debug("Couldn't costs for task: {}", task.getId());
}
entityManager.flush();
entityManager.persist(task);
return removedAccumulator;
}
Any ideas?
P.S. the collection is mapped as:
@OneToMany(targetEntity = DataStoreCost.class, mappedBy = "task", cascade = CascadeType.ALL)
private Collection<DataStoreCost> costs;
Cheers.
I think you need to explicitly remove the Cost entity via the entityManager. When you remove the Cost from the Task's cost list, you actually only remove the reference to that instance. It does not know that that particular Cost will not be used anywhere else.
It's not deleting the entity, because it doesn't know if something else is referring to it.
You need to enable orphan removal. In JPA 2, use the orphanRemoval attribute on the mapping. If you're using Hibernate's own annotations, use the DELETE_ORPHAN cascade type.
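A minimal sketch of the JPA 2 variant, applied to the mapping from the question:
// orphanRemoval = true tells JPA to DELETE a DataStoreCost row once it has been
// removed from the task's collection and is no longer referenced by it
@OneToMany(targetEntity = DataStoreCost.class, mappedBy = "task", cascade = CascadeType.ALL, orphanRemoval = true)
private Collection<DataStoreCost> costs;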