@Transactional annotation slows down performance after JPA migration - Java

I am upgrading my application: it currently uses Hibernate 3 and is moving to JPA with Hibernate 4.
We have a utility class, BaseDAO, that is used for all database-related operations such as executeQuery and executeUpdate. The session is also managed there.
Currently, where we use get methods to show data, or a single update/create/delete operation, we do not create transactions ourselves; BaseDAO handles it.
Since we are migrating to JPA, persist calls now need a transaction, so I added @Transactional on BaseDAO at class level. With it the application works fine, but the features that interact heavily with the database have slowed down. One report that used to take 1 minute now takes 1.4 minutes.
Hibernate template method:
public class BaseDAO extends HibernateDaoSupport {
    public List executeQueryPaging(final String hql, final Object[] params, final Integer[] pagingParam) {
        List results = null;
        results = getHibernateTemplate().executeFind(new HibernateCallback() {
            public Object doInHibernate(Session session) {
                Query query = session.createQuery(hql);
                if (params != null) {
                    for (int i = 0; i < params.length; i++) {
                        query.setParameter(i, params[i]);
                    }
                }
                query.setFirstResult(pagingParam[0]);
                query.setMaxResults(pagingParam[1]);
                return query.list();
            }
        });
        return results;
    }
}
JPA method:
@Transactional
public class BaseDAO {
    public List executeQueryPaging(final String hql, final Object[] params, final Integer[] pagingParam) {
        Query query = entityManager.createQuery(hql);
        List list = null;
        if (params != null) {
            for (int i = 0; i < params.length; i++) {
                query.setParameter(i + 1, params[i]);
            }
        }
        if (pagingParam != null) {
            query.setFirstResult(pagingParam[0]);
            query.setMaxResults(pagingParam[1]);
        }
        list = (List) query.getResultList();
        return list;
    }
}
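For comparison, a commonly suggested variant (a sketch only, assuming Spring manages the transactions and injects the EntityManager) keeps @Transactional at class level but marks pure read paths as read-only, so the persistence provider can skip dirty checking and flushing for them:
@Transactional
public class BaseDAO {

    @PersistenceContext
    private EntityManager entityManager;

    // Read-only transactions avoid dirty-checking/flush overhead on queries.
    @Transactional(readOnly = true)
    public List executeQueryPaging(final String hql, final Object[] params, final Integer[] pagingParam) {
        // ... same query code as above ...
    }
}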

Related

What is the best way to handle LazyInitializationException

I have been struggling with Hibernate lately. I recently ran into a problem which I would appreciate some help with. I have two entities:
1. User:
@Entity
public class User {
    @Id
    private Long id;
    @OneToMany(mappedBy = "user")
    private Set<Activity> activities;
    ...
}
2. Activity:
@Entity
public class Activity {
    @Id
    private Long id;
    @ManyToOne
    private User user;
    ...
}
So here, since I didn't set the activities' fetchType to EAGER, when I fetch a User entity from the database its activities will be fetched lazily.
What I did in UserRepository was:
@Override
public User getUserByUserName(String userName) {
    EntityManager entityManager = entityManagerFactory.createEntityManager();
    Query query = entityManager.createQuery(
            "from User u where u.userName = :user_name", User.class);
    query.setParameter("user_name", userName);
    try {
        return (User) query.getSingleResult();
    } catch (NoResultException e) {
        return null;
    } finally {
        entityManager.close();
    }
}
Doing this, I get the LazyInitializationException when I want to use the fetched User's activities. So I removed the finally block from the code:
@Override
public User getUserByUserName(String userName) {
    EntityManager entityManager = entityManagerFactory.createEntityManager();
    Query query = entityManager.createQuery(
            "from User u where u.userName = :user_name", User.class);
    query.setParameter("user_name", userName);
    try {
        return (User) query.getSingleResult();
    } catch (NoResultException e) {
        return null;
    }
}
This resolved the exception. I want to know: is this the right way to do it, or should I just change the activities' fetchType to EAGER?
I guess the problem you're having comes from the entityManager.close() call.
As User's reference to Activity is LAZY, JPA will load it only once you try to access it in code, i.e. anything accessing the variable User.activities in any way.
This makes JPA load the referenced Activity entities.
What I think happens there:
When you create (read from the database) those Users in your getUserByUserName() method, JPA keeps (has those created Users keep) a reference to the EntityManager they were created with.
So if you later try to access User.activities, JPA will try to load those activities using that EntityManager.
If you have already closed that EntityManager (as you did in your finally block), the loading fails with the LazyInitializationException you are getting.
Solution:
As I am not using Hibernate, nor checking its codebase, I do not know whether entityManagerFactory.createEntityManager() actually creates separate instances if one is already present, or how all the EntityManager instances are managed.
But it's probably best to simply never close any EntityManager yourself; instead, let the JPA implementation (Hibernate) take care of that. With all the possibilities of dependency-injecting stuff into your classes, I bet Hibernate has some pretty good mechanisms in place for that kind of resource management.
As an alternative, let Hibernate inject the EntityManager, so you don't even have to take care of creating it in the first place.
Finally, you can still stick to LAZY loading if that improves initial and long-term performance for you. Just don't close those EntityManagers.
If you need deeper insights into the matter,
check Hibernate's source code / preprocessors / code injection / bytecode weaving mechanisms,
or use some JVM memory analysis tool to see whether the number of instantiated EntityManagers increases linearly with calls to your getUserByUserName() method.
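If you do want to keep closing the EntityManager, another common approach (a sketch based on the entities above, not tested against your setup) is to initialize the association inside the still-open EntityManager by fetching it in the same query with JOIN FETCH:
@Override
public User getUserByUserName(String userName) {
    EntityManager entityManager = entityManagerFactory.createEntityManager();
    try {
        // JOIN FETCH initializes the lazy collection while the
        // EntityManager is still open, so no LazyInitializationException.
        return entityManager.createQuery(
                "select u from User u left join fetch u.activities"
                + " where u.userName = :user_name", User.class)
            .setParameter("user_name", userName)
            .getSingleResult();
    } catch (NoResultException e) {
        return null;
    } finally {
        entityManager.close(); // safe now: activities are already loaded
    }
}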
Update: Showing a completely different alternative
I personally use Payara, and use @EJB private UserCRUD mUserCRUD; to inject Data Access Objects, which I called CRUDs (for Create Retrieve Update Delete) a long time ago, and I still stick to them.
The basic principle is these 3 steps:
Step 1: I have a universal base class in a library, injecting the @PersistenceContext protected EntityManager mEM;
@Stateless
public abstract class TemplateCRUD_Simple<T> implements TemplateCRUD_Simple_Interface<T> {
    @PersistenceContext protected EntityManager mEM;
    protected final Class<T> mType;

    public TemplateCRUD_Simple(final Class<T> pClass) {
        mType = pClass;
    }

    /*
     * INTERNALS
     */
    @Override public String getTableName() {
        return mType.getSimpleName();
    }
    @Override public String getNativeTableName() {
        final Table table = mType.getAnnotation(Table.class);
        if (table != null) return table.name();
        return getTableName();
    }
    @Override public EntityManager getEntityManager() {
        return mEM;
    }

    /*
     * CREATE
     */
    @Override public T create(final T t) {
        return createAny(t);
    }
    @Override public <U> U createAny(final U u) {
        mEM.persist(u);
        mEM.flush();
        mEM.refresh(u);
        return u;
    }

    /*
     * RETRIEVE
     */
    @Override public T find(final long pID) throws JcXEntityNotFoundException {
        final T ret = find(pID, null);
        if (ret == null) throw new JcXEntityNotFoundException(getTableName() + " with ID " + pID + " cannot be found!");
        return ret;
    }
    @Override public T find(final long pID, final T pDefault) {
        final T ret = mEM.find(mType, Long.valueOf(pID));
        if (ret == null) return pDefault;
        return ret;
    }
    @Override public T find(final Long pID, final T pDefault) {
        if (pID == null) return pDefault;
        return find(pID.longValue(), pDefault);
    }
    @Override public T findCreate(final long pID) throws InstantiationException, IllegalAccessException {
        final T item = find(pID, null);
        if (item != null) return item;
        final T item2 = mType.newInstance();
        return create(item2);
    }
    // a lot more methods here
}
And its interface definition, also in the library:
public interface TemplateCRUD_Simple_Interface<T> {
    EntityManager getEntityManager();
    String getTableName();
    String getNativeTableName();
    // create
    T create(T t);
    <U> U createAny(U t);
    // retrieve
    T find(long pID) throws JcXEntityNotFoundException;
    T find(long pID, T pDefault);
    T find(Long pID, T pDefault);
    T findCreate(long pID) throws InstantiationException, IllegalAccessException;
    List<T> findAll(String pColName, Object pValue, final boolean pCaseSensitive);
    List<T> findAll(String pColName, Object pValue);
    List<T> findAll(String pColName, String pValue, final boolean pCaseSensitive);
    List<T> findAll(String pColName, String pValue);
    List<T> findAll(String pColName, long pValue);
    List<T> findAllByFieldName(String pFieldName, Object pValue, final boolean pCaseSensitive);
    List<T> findAllByFieldName(String pFieldName, Object pValue);
    List<T> findWhereContains(final String pColName, final String pValue, final boolean pCaseSensitive);
    List<T> findWhereContains(final String pColName, final String pValue);
    List<T> getAll();
    List<Long> getAllIds();
    List<T> getByIds(final Collection<Long> pIds);
    // update
    T update(T t);
    void updateProperties(T pItem, Map<String, String[]> pMatches);
    T updateItem(Map<String, String[]> pMatches, long pID) throws InstantiationException, IllegalAccessException;
    ArrayList<T> updateItems(String pEntityParamName, Map<String, String[]> pMap) throws InstantiationException, IllegalAccessException;
    // delete
    T delete(long pId);
    // misc
    long countAll();
    Object getID(T pItem);
    long getIDLong(T pItem);
    boolean contains(T pItem);
    void detach(final T pItem);
    @Deprecated String getFieldNameInDb(final String pFieldName, final String pDefault);
    // private List<T> getOverdueForXIn(final int pDays, final String pVarName);
    List<T> getOverdueForDeletion(final boolean pAddAlreadyWarned);
    List<T> getOverdueForUpdate(final boolean pAddAlreadyWarned);
}
Step 2: For each custom class I have a CRUD (in this example, for the User class):
CrudBase_BaseEntity is basically derived from TemplateCRUD_Simple, with just a few more steps in between for more flexibility.
UserCRUD extends TemplateCRUD_Simple, so I can easily use those general methods.
If I need specialized methods, I simply add them to the UserCRUD's code.
This is an example for handling the User entity:
@Entity
@Table(name = "PT_User")
public class User extends _BaseEntity<User> {...}
And this is its CRUD/DAO:
@Stateless
public class UserCRUD extends CrudBase_BaseEntity<User> {
    public UserCRUD() {
        super(User.class);
    }
    public long getRegisteredUsersCount() {
        final String sql = "SELECT COUNT(d) FROM " + getTableName() + " d";
        final Query q = mEM.createQuery(sql);
        final Long count = (Long) q.getSingleResult();
        if (count == null) return 0;
        return count.longValue();
    }
    public User findUserByUsernameOrEmail(final String pUid, final String pMail) {
        // match on either the username or the email address, case-insensitively
        final TypedQuery<User> query = mEM.createQuery("SELECT i FROM " + getTableName() + " i WHERE lower(i.username)=lower(:userid) OR lower(i.email)=lower(:usermail)", mType);
        query.setParameter("userid", pUid);
        query.setParameter("usermail", pMail);
        final List<User> list = query.getResultList();
        if (list == null || list.size() < 1) return null;
        return list.get(0);
    }
    public List<User> getAllAdmins() {
        final TypedQuery<User> query = mEM.createQuery("SELECT i FROM " + getTableName() + " i WHERE i.isAdmin = TRUE", mType);
        final List<User> list = query.getResultList();
        return list;
    }
    public List<User> getInvalidUsers() {
        final TypedQuery<User> query = mEM.createQuery("SELECT i FROM " + getTableName() + " i "
                + "WHERE (i.username IS NULL"
                + " OR i.fullname IS NULL)"
                + " AND i.email IS NULL", mType);
        final List<User> list = query.getResultList();
        return list;
    }
}
Step 3: In servlets, I simply inject it via the @EJB annotation and then use it like this:
@WebServlet("/dyn/user/getAll")
@WebServletParams({})
@WebServletDescription()
public class GetAll extends BaseServlet {
    private static final long serialVersionUID = -4567235617944396165L;
    @EJB private UserCRUD mCRUD;

    @Override protected void doGet_(final HttpServletRequest pReq, final HttpServletResponse pResp) throws IOException {
        USessionManager.ensureUserLoggedInAndAdmin(pReq);
        final List<User> items = mCRUD.getAll();
        items.sort((final User pO1, final User pO2) -> JcUString.compareTo(pO1.getFullname(), pO2.getFullname(), false));
        JSON.send(items, pResp);
    }
}
This is the way I implemented and use it.
At first it's a lot of overhead,
but it's really easy to set up new classes and CRUDs,
it's safe to use them,
and I use this in my JEE library, so I do not have to duplicate code, and any patch or addition I make becomes available to all projects that use it.
If need be, I can always access the (again: injected) EntityManager instance inside the TemplateCRUD.
I believe Spring uses a quite similar system and calls it something like '...Repository'. If you want, you can also check that out, take its source code, and adapt it. Usually the @EJB, @PersistenceContext and @Inject annotations do not transfer well between GlassFish/Payara, Spring or Hibernate, because not all of them support all of those annotations in every context. As I said, this is quite Payara-specific, and I've never tested it in other containers, but the approach should work everywhere.
To recap: this all depends heavily on dependency injection, with me letting the container do all the work.
The rule is:
entities should not escape transaction bounds.
Lazy loading is a killer feature of Hibernate; one should not be afraid of it.
So the repository/DAO should return the entity (or entities).
The service should be transactional(*), manage the entities, but return DTOs to the outside.
Consider using any Java bean mapper for that purpose to avoid monkey work.
That also allows you to not load unnecessary properties when they are not needed.
(*) if there are no long-living operations such as HTTP calls.
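A minimal sketch of that layering (Spring is assumed; UserDto, UserRepository.findByUserName and the field names are made up for illustration). The service owns the transaction, touches lazy state while the session is open, and only hands a DTO to the outside:
@Service
public class UserService {

    @Autowired
    private UserRepository userRepository;

    @Transactional(readOnly = true)
    public UserDto getUser(String userName) {
        User user = userRepository.findByUserName(userName);
        // The lazy collection is initialized here, inside the transaction.
        int activityCount = user.getActivities().size();
        // Only the plain, fully populated DTO escapes the transaction.
        return new UserDto(user.getId(), user.getUserName(), activityCount);
    }
}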

Is there something like an Iterator, but with functions like Streams?

So basically what I am trying to do is the following:
Load Batch of Data from the Database
Map that data (Object[] query result) to a class representing the data in a readable format
Write to File
Repeat until query gets no more results
I listed the structures I am familiar with that seem to fit the need, and why they don't fit my needs:
Iterator → has no option to map and filter without calling next().
I need to define the map function in a subclass without actually having the data (similar to a stream), so that I can pass the "stream" up to a calling class and only call next() there, which then applies all the map functions.
Stream → all data needs to be available before mapping and filtering is possible.
Observable → sends data as soon as it becomes available, but I need to process it synchronously.
To give a better feeling for what I am trying to do, I made a small example:
// Disclaimer: "Something" is the structure I am not sure of now.
// Could be an Iterator or something else that fits (that's the question)
public class Orchestrator {
    @Inject
    private DataGetter dataGetter;

    public void doWork() {
        FileWriter writer = new FileWriter("filename");
        // Write the formatted data to the file
        dataGetter.getData()
                  .forEach(data -> writer.writeToFile(data));
    }
}

public class FileWriter {
    public void writeToFile(List<Thing> data) {
        // Write to file
    }
}

public class DataGetter {
    @Inject
    private ThingDao thingDao;

    public Something<List<Thing>> getData() {
        // Map data to the correct format and return that
        return thingDao.getThings()
                       .map(partialResult -> /* map to object */);
    }
}

public class ThingDao {
    public Something<List<Object[]>> getThings() {
        Query q = ...;
        // Don't know what to return
    }
}
What I have got so far:
I tried to start from an Iterator, because it's the only one that really fulfills my memory requirements. Then I added some methods to map and loop over the data. It's not really a robust design, though, and it's going to be harder than I thought, so I wanted to know if there is anything out there that already does what I need.
public class QIterator<E> implements Iterator<List<E>> {
    public static String QUERY_OFFSET = "queryOffset";
    public static String QUERY_LIMIT = "queryLimit";

    private Query query;
    private long lastResultIndex = 0;
    private long batchSize;
    private Function<List<Object>, List<E>> mapper;

    public QIterator(Query query, long batchSize) {
        this.query = query;
        this.batchSize = batchSize;
    }

    public QIterator(Query query, long batchSize, Function<List<Object>, List<E>> mapper) {
        this(query, batchSize);
        this.mapper = mapper;
    }

    @Override
    public boolean hasNext() {
        // Holds only while every batch comes back full; a short batch ends the iteration.
        return lastResultIndex % batchSize == 0;
    }

    @Override
    public List<E> next() {
        query.setParameter(QIterator.QUERY_OFFSET, lastResultIndex);
        query.setParameter(QIterator.QUERY_LIMIT, batchSize);
        List<Object> result = (List<Object>) query.getResultList(); // unchecked
        lastResultIndex += result.size();
        List<E> mappedResult;
        if (mapper != null) {
            mappedResult = mapper.apply(result);
        } else {
            mappedResult = (List<E>) result; // unchecked
        }
        return mappedResult;
    }

    public <R> QIterator<R> map(Function<List<E>, List<R>> appendingMapper) {
        return new QIterator<>(query, batchSize, (data) -> {
            if (this.mapper != null) {
                return appendingMapper.apply(this.mapper.apply(data));
            } else {
                return appendingMapper.apply((List<E>) data);
            }
        });
    }

    public void forEach(BiConsumer<List<E>, Integer> consumer) {
        for (int i = 0; this.hasNext(); i++) {
            consumer.accept(this.next(), i);
        }
    }
}
This works so far, but it has some unchecked assignments which I do not really like. Also, I would like the ability to "append" one QIterator to another, which is not hard by itself, but the appended iterator should also take the maps that follow the append.
Assume you have a DAO that provides data in a paginated manner, e.g. by applying the LIMIT and OFFSET clauses to the underlying SQL. Such a DAO class would have a method that takes those values as arguments, i.e. the method would conform to the following functional interface:
@FunctionalInterface
public interface PagedDao<T> {
    List<T> getData(int offset, int limit);
}
E.g. calling getData(0, 20) would return the first 20 rows (page 1), and calling getData(60, 20) would return the 20 rows on page 4. If the method returns fewer than 20 rows, it means we got the last page. Asking for data after the last row returns an empty list.
For the demo below, we can mock such a DAO class:
public class MockDao {
    private final int rowCount;

    public MockDao(int rowCount) {
        this.rowCount = rowCount;
    }

    public List<SimpleRow> getSimpleRows(int offset, int limit) {
        System.out.println("DEBUG: getData(" + offset + ", " + limit + ")");
        if (offset < 0 || limit <= 0)
            throw new IllegalArgumentException();
        List<SimpleRow> data = new ArrayList<>();
        for (int i = 0, rowNo = offset + 1; i < limit && rowNo <= this.rowCount; i++, rowNo++)
            data.add(new SimpleRow("Row #" + rowNo));
        System.out.println("DEBUG: data = " + data);
        return data;
    }
}

public class SimpleRow {
    private final String data;

    public SimpleRow(String data) {
        this.data = data;
    }

    @Override
    public String toString() {
        return "Row[data=" + this.data + "]";
    }
}
If you then want to generate a Stream of rows from that method, streaming all rows in blocks of a certain size, you need a Spliterator, so that you can use StreamSupport.stream(Spliterator<T> spliterator, boolean parallel) to create the stream.
Here is an implementation of such a Spliterator:
public class PagedDaoSpliterator<T> implements Spliterator<T> {
    private final PagedDao<T> dao;
    private final int blockSize;
    private int nextOffset;
    private List<T> data;
    private int dataIdx;

    public PagedDaoSpliterator(PagedDao<T> dao, int blockSize) {
        if (blockSize <= 0)
            throw new IllegalArgumentException();
        this.dao = Objects.requireNonNull(dao);
        this.blockSize = blockSize;
    }

    @Override
    public boolean tryAdvance(Consumer<? super T> action) {
        if (this.data == null) {
            if (this.nextOffset == -1/*At end*/)
                return false; // Already at end
            this.data = this.dao.getData(this.nextOffset, this.blockSize);
            this.dataIdx = 0;
            if (this.data.size() < this.blockSize)
                this.nextOffset = -1/*At end, after this data*/;
            else
                this.nextOffset += data.size();
            if (this.data.isEmpty()) {
                this.data = null;
                return false; // At end
            }
        }
        action.accept(this.data.get(this.dataIdx++));
        if (this.dataIdx == this.data.size())
            this.data = null;
        return true;
    }

    @Override
    public Spliterator<T> trySplit() {
        return null; // Parallel processing not supported
    }

    @Override
    public long estimateSize() {
        return Long.MAX_VALUE; // Unknown
    }

    @Override
    public int characteristics() {
        return ORDERED | NONNULL;
    }
}
We can now test that using the mock DAO above:
MockDao dao = new MockDao(13);
Stream<SimpleRow> stream = StreamSupport.stream(
        new PagedDaoSpliterator<>(dao::getSimpleRows, 5), /*parallel*/false);
stream.forEach(System.out::println);
Output
DEBUG: getData(0, 5)
DEBUG: data = [Row[data=Row #1], Row[data=Row #2], Row[data=Row #3], Row[data=Row #4], Row[data=Row #5]]
Row[data=Row #1]
Row[data=Row #2]
Row[data=Row #3]
Row[data=Row #4]
Row[data=Row #5]
DEBUG: getData(5, 5)
DEBUG: data = [Row[data=Row #6], Row[data=Row #7], Row[data=Row #8], Row[data=Row #9], Row[data=Row #10]]
Row[data=Row #6]
Row[data=Row #7]
Row[data=Row #8]
Row[data=Row #9]
Row[data=Row #10]
DEBUG: getData(10, 5)
DEBUG: data = [Row[data=Row #11], Row[data=Row #12], Row[data=Row #13]]
Row[data=Row #11]
Row[data=Row #12]
Row[data=Row #13]
As can be seen, we get 13 rows of data, retrieved from the database in blocks of 5 rows.
The data is not retrieved from the database until it is needed, which keeps the memory footprint low, depending on the block size and on the stream operations not caching the data.
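Since the result is an ordinary Stream, intermediate operations compose as usual. A short sketch reusing the mock DAO above (the filter predicate is arbitrary, just for illustration):
// Blocks are still fetched lazily, only as the mapped/filtered
// stream pulls more elements.
MockDao dao = new MockDao(13);
List<String> rows = StreamSupport.stream(
        new PagedDaoSpliterator<>(dao::getSimpleRows, 5), false)
    .map(SimpleRow::toString)          // map each row as it streams by
    .filter(s -> !s.endsWith("#13]"))  // drop the last mock row
    .collect(Collectors.toList());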
You can do it in one line as follows:
Statement stmt = con.createStatement();
ResultSet rs = stmt.executeQuery(queryThatReturnsAllRowsOrdered);
// Note: rs.next() throws the checked SQLException, so in real code the
// supplier needs the same try/catch wrapping as map(rs) below.
Stream.generate(() -> rs.next() ? map(rs) : null)
        .takeWhile(Objects::nonNull)
        .filter(<some predicate>)
        .forEach(<some operation>);
This starts processing when the first row is returned from the query and continues in parallel with the database until all rows have been read. (takeWhile requires Java 9 or later.)
This approach holds only one row in memory at a time, and minimises the load on the database by running just one query.
Mapping from a ResultSet is far easier and more natural than mapping from Object[], because you can access columns by name and with properly typed values, e.g.:
MyDao map(ResultSet rs) {
    try {
        String someStr = rs.getString("COLUMN_X");
        int someInt = rs.getInt("COLUMN_Y");
        return new MyDao(someStr, someInt);
    } catch (SQLException e) {
        throw new RuntimeException(e);
    }
}

How to use Specification in JPA

I am implementing search functionality in my application. I am using a Specification in findAll() and it works perfectly. But whenever I try to achieve the same in other methods, like findByFirstName(), it does not work.
Here is what I have done so far.
AircraftSpecification.java
public class AircraftSpecification {
    private AircraftSpecification() {}

    public static Specification<Aircraft> textInAllColumns(String text) {
        if (!text.contains("%")) {
            text = "%" + text + "%";
        }
        final String finalText = text;
        return new Specification<Aircraft>() {
            private static final long serialVersionUID = 1L;

            @Override
            public Predicate toPredicate(Root<Aircraft> root, CriteriaQuery<?> cq, CriteriaBuilder builder) {
                List<SingularAttribute<Aircraft, ?>> tempAttributes = new ArrayList<>();
                for (SingularAttribute<Aircraft, ?> attribute : root.getModel().getDeclaredSingularAttributes()) {
                    if (attribute.getJavaType().getSimpleName().equalsIgnoreCase("string")) {
                        tempAttributes.add(attribute);
                    }
                }
                final Predicate[] predicates = new Predicate[tempAttributes.size()];
                for (int i = 0; i < tempAttributes.size(); i++) {
                    predicates[i] = builder.like(builder.lower(root.get(tempAttributes.get(i).getName())), finalText.toLowerCase());
                }
                return builder.or(predicates);
            }
        };
    }
}
When I call
aircraftRepository.findAll(Specification.where(AircraftSpecification.textInAllColumns(searchText)));
it returns the proper data.
But when I call
aircraftRepository.findAllByName(name, Specification.where(AircraftSpecification.textInAllColumns(searchText)));
it throws an exception.
The exception is:
org.springframework.dao.InvalidDataAccessApiUsageException: At least 2 parameter(s) provided but only 1 parameter(s) present in query.; nested exception is java.lang.IllegalArgumentException: At least 2 parameter(s) provided but only 1 parameter(s) present in query.
Can anyone help me with how to use a Specification with methods other than findAll?
You can't combine derived queries, where Spring Data derives the query to execute from the method name, with Specifications.
Just make the name part of the query a Specification as well, and combine the two with and.
The resulting call could look like this or similar:
aircraftRepository.findAll(
    byName("Alfred")
        .and(textInAllColumns(searchText))
);
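For completeness, byName could look roughly like this (a sketch; the attribute name "name" is an assumption about the Aircraft entity):
// Specification is a functional interface, so a lambda works here.
public static Specification<Aircraft> byName(String name) {
    return (root, query, builder) -> builder.equal(root.get("name"), name);
}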

getLinks method returns deleted Entities, how to prevent it?

Below is my code to get the list of linked entities. It works, but the problem is that even a deleted entity is returned, although that entity has already been emptied out and its id is the only property still set. Is there a way to not return the deleted entities at all, or a way to filter them out?
EntityId idOfEntity = txn.toEntityId(entityId);
Entity txnEntity = txn.getEntity(idOfEntity);
EntityIterable result = txnEntity.getLinks(Arrays.asList(new String[] {linkName}));
for (Entity entity : result) {
    // process linked entities
}
When you delete an entity, it's your responsibility to check whether there are incoming links to the deleted entity; otherwise, so-called "phantom links" can appear. You can set -Dexodus.entityStore.debug.searchForIncomingLinksOnDelete=true (PersistentEntityStoreConfig#setDebugSearchForIncomingLinksOnDelete(true)) to debug deletion in your application. With this setting, Xodus searches for incoming links to each deleted entity and throws an EntityStoreException if it finds any. The setting should not be used in a production environment, as it significantly slows down entity deletion.
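A sketch of enabling that setting programmatically when the store is created (assuming the PersistentEntityStores factory; adjust to however your store is actually constructed):
// Enable the debug check for incoming links on delete (debugging only).
PersistentEntityStoreConfig config = new PersistentEntityStoreConfig()
        .setDebugSearchForIncomingLinksOnDelete(true);
PersistentEntityStore store =
        PersistentEntityStores.newInstance(config, environment, "myStore");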
Here's the complete code that I have come up with:
@Override
public boolean deleteEntities(String instance, String namespace, final String entityType) {
    final boolean[] success = {false};
    final PersistentEntityStore entityStore = manager.getPersistentEntityStore(xodusRoot, instance);
    try {
        entityStore.executeInTransaction(
            new StoreTransactionalExecutable() {
                @Override
                public void execute(@NotNull final StoreTransaction txn) {
                    EntityIterable result = null;
                    if (namespace != null && !namespace.isEmpty()) {
                        result =
                            txn.findWithProp(entityType, namespaceProperty)
                                .intersect(txn.find(entityType, namespaceProperty, namespace));
                    } else {
                        result =
                            txn.getAll(entityType).minus(txn.findWithProp(entityType, namespaceProperty));
                    }
                    final boolean[] hasError = {false};
                    for (Entity entity : result) {
                        // Delete outgoing links of the entity.
                        entity.getLinkNames().forEach(linkName -> {
                            Entity linked = entity.getLink(linkName);
                            entity.deleteLink(linkName, linked);
                        });
                        // Delete incoming links from all entity types.
                        // TODO: This is a performance issue
                        final List<String> allLinkNames = ((PersistentEntityStoreImpl) entityStore).getAllLinkNames((PersistentStoreTransaction) entityStore.getCurrentTransaction());
                        for (final String type : txn.getEntityTypes()) {
                            for (final String linkName : allLinkNames) {
                                for (final Entity referrer : txn.findLinks(type, entity, linkName)) {
                                    referrer.deleteLink(linkName, entity);
                                }
                            }
                        }
                        if (!entity.delete()) {
                            hasError[0] = true;
                        }
                    }
                    success[0] = !hasError[0];
                }
            });
    } finally {
        // entityStore.close();
    }
    return success[0];
}

How can I do a batch insert using iBatis annotations?

I cannot find a tutorial on this, and I find the documentation scant. How can I do a batch insert using iBatis annotations?
public interface MyTableMapper {
    @Insert("insert into MyTable(col1) values (#{valueOfCol1})")
    void insert(MyRecordClass obj);
}

public class MyTransactionalClass {
    @Transactional
    public void insert(MyRecordClass obj) {
        myTableMapperInst.insert(obj);
    }
}
I did this naive implementation (surprisingly without success :-):
public class MyTransactionalClass {
    @Transactional(executorType = ExecutorType.BATCH)
    public void insert(MyRecordClass obj) {
        myTableMapperInst.insert(obj);
    }
}
This is without annotations; according to the docs, your way seems correct.
try {
    sqlMap.startTransaction();
    List list = sqlMap.queryForList("getFiredEmployees", null);
    sqlMap.startBatch();
    for (int i = 0, n = list.size(); i < n; i++) {
        sqlMap.delete("deleteEmployee", list.get(i));
    }
    sqlMap.executeBatch();
    sqlMap.commitTransaction();
} finally {
    sqlMap.endTransaction();
}
With iBatis annotations you can do a bulk insert like this:
@Insert({"<script>",
    "insert into user_master (first_name,last_name) values ",
    "<foreach collection='userList' item='user' index='index' open='(' separator = '),(' close=')' >#{user.first_name},#{user.last_name}</foreach>",
    "</script>"})
int insertUserList(@Param("userList") List<UserNew> userList);
It works for me; I inserted bulk records into my PostgreSQL database using the single insert above.
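If you are on MyBatis 3 (the successor of iBATIS), another common option (a sketch, assuming a configured SqlSessionFactory and the mapper above) is to open the session with the BATCH executor and reuse the plain single-row mapper:
try (SqlSession session = sqlSessionFactory.openSession(ExecutorType.BATCH)) {
    MyTableMapper mapper = session.getMapper(MyTableMapper.class);
    for (MyRecordClass rec : records) {
        mapper.insert(rec); // statements are buffered, not executed yet
    }
    session.flushStatements(); // send the buffered batch to the JDBC driver
    session.commit();
}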
