I am using Spring Data and MapStruct, and I don't want Hibernate to blindly load all lazy associations while mapping an entity to a DTO.
Example:
public class VacancyEntity {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
Integer id;
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "job_category_id", nullable = false)
JobCategoryEntity jobCategory;
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "company_id", nullable = false)
CompanyEntity company;
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "employer_created_by", nullable = false)
EmployerProfileEntity employerCreatedBy;
@Column(nullable = false)
String title;
.... }
DTO:
public class VacancyDto {
Integer id;
String title;
CompanyDto company;
EmployerProfileDto employerCreatedBy;
JobCategoryDto jobCategory;
...}
So I have two methods, findByIdWithCompanyAndCity and findByIdWithJobAndCityAndEmployer, in VacancyRepository so that each lookup is performed with a single SQL query.
And two @Transactional methods in my VacancyService: findWithCompanyAndCity and findWithCompanyAndCityAndEmployer.
Best practice is to return a DTO from the service layer, so we need to map the entity to a DTO in the service.
And I really don't want to leave the whole mapping inside the @Transactional method (i.e. inside the session), because if I add some field deep inside my entity, MapStruct will simply trigger the N+1 problem.
The best I came up with is to pass each nested entity into the mapping method and manually check that MapStruct doesn't add some new methods (it is faster than checking names).
Ex:
#Mapping(target = "id", source = "entity.id")
#Mapping(target = "description", source = "entity.description")
#Mapping(target = "jobCategory", source = "jobCategoryDto")
#Mapping(target = "employerCreatedBy", source = "employerProfileDto")
#Mapping(target = "city", source = "cityDto")
#Mapping(target = "company", ignore = true)
VacancyDto toDto(VacancyEntity entity,
JobCategoryDto jobCategoryDto,
EmployerProfileDto employerProfileDto,
CityDto cityDto);
....
But this doesn't fix the real issue: the session is still open while mapping, so it can still lead to the N+1 problem.
So I came up with several solutions:
Use a dedicated method in the service to run the @Transactional query and then map to the DTO outside the session scope. But it seems really ugly to double every method in the service.
Return the entity from the service (which is bad practice) and map it to a DTO there.
I know that I'll get a LazyInitializationException in both cases, but that seems more robust and scalable to me than unpredictable SELECTs.
How do I perform the mapping from entity to DTO in the service layer but outside the Hibernate session in an elegant way?
You didn't ask a question but it seems the question is supposed to be:
How do I perform the mapping from entity to DTO in the service layer but outside the Hibernate session in an elegant way.
I'd recommend the TransactionTemplate for this.
Usage looks like this:
@Autowired
VacancyRepository repo;

@Autowired
TransactionTemplate tx;

VacancyDto someMethod(String company, String city) {
    // run the query inside a programmatic transaction; the session closes when execute() returns
    VacancyEntity vac = tx.execute(__ -> repo.findWithCompanyAndCity(company, city));
    // map outside the transaction, so MapStruct can no longer trigger lazy loading
    return mapToDto(vac);
}
That said, I think you are using the wrong approach to solve the underlying problem.
I suggest you take a look at having a test to verify the number of SQL statements executed.
See https://vladmihalcea.com/how-to-detect-the-n-plus-one-query-problem-during-testing/ for a way to do that.
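As a rough illustration (not necessarily the approach from the linked article), such a test can assert on Hibernate's Statistics API; this assumes Hibernate is the JPA provider, statistics are enabled via spring.jpa.properties.hibernate.generate_statistics=true, and the service call and its arguments are placeholders:

Statistics stats = entityManagerFactory.unwrap(SessionFactory.class).getStatistics();
stats.clear();
vacancyService.findWithCompanyAndCity(vacancyId); // placeholder call under test
// fails if lazy loading sneaks additional queries into the call
assertEquals(1, stats.getPrepareStatementCount());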
To avoid the N + 1 problem you still need to use an entity graph, although I think this is a perfect use case for Blaze-Persistence Entity Views.
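For reference, a hedged sketch of what such an entity graph could look like on the repository; the method name and fetched paths are illustrative, not taken from the original code:

@EntityGraph(attributePaths = {"company", "jobCategory", "employerCreatedBy"})
Optional<VacancyEntity> findWithAssociationsById(Integer id);

With that, the listed associations are fetched in the same query instead of being loaded lazily afterwards.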
I created the library to allow easy mapping between JPA models and custom interface or abstract class defined models, something like Spring Data Projections on steroids. The idea is that you define your target structure (domain model) the way you like and map attributes (getters) via JPQL expressions to the entity model.
A DTO model for your use case could look like the following with Blaze-Persistence Entity-Views:
@EntityView(VacancyEntity.class)
public interface VacancyDto {
@IdMapping
Integer getId();
String getTitle();
CompanyDto getCompany();
EmployerProfileDto getEmployerCreatedBy();
JobCategoryDto getJobCategory();
@EntityView(CompanyEntity.class)
interface CompanyDto {
@IdMapping
Integer getId();
String getName();
}
@EntityView(EmployerProfileEntity.class)
interface EmployerProfileDto {
@IdMapping
Integer getId();
String getName();
}
@EntityView(JobCategoryEntity.class)
interface JobCategoryDto {
@IdMapping
Integer getId();
String getName();
}
}
Querying is a matter of applying the entity view to a query, the simplest being just a query by id.
VacancyDto a = entityViewManager.find(entityManager, VacancyDto.class, id);
The Spring Data integration allows you to use it almost like Spring Data Projections: https://persistence.blazebit.com/documentation/entity-view/manual/en_US/index.html#spring-data-features
Page<VacancyDto> findAll(Pageable pageable);
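Declared on a Spring Data repository, that could look roughly like this (the repository name is illustrative and it assumes the Blaze-Persistence Spring Data integration is configured):

public interface VacancyDtoRepository extends Repository<VacancyEntity, Integer> {
    Page<VacancyDto> findAll(Pageable pageable);
}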
The best part is, it will only fetch the state that is actually necessary!
I am building a REST API using Spring Boot and JPA.
I am trying to lazily fetch an entity in a one-to-many relationship: Teacher to Courses.
I can see the SQL statements issued by JPA because I have SQL logging enabled.
In the controller, when calling a path, everything works, but I can see that JPA executes two queries: one for the teacher and another one for its courses. As far as I know, lazy loading does not query until the data is required, and I am not requiring it.
I have checked and confirmed that in the controller, when I retrieve the teacher data, JPA does not query for the courses, but AFTER the return statement of the controller, somewhere, the courses are required and everything is loaded when I request the teacher info from Postman with a GET call.
It seems as if the lazy loading is working correctly, but after the controller JPA loads the course list. If I use EAGER fetching, everything is loaded before the return statement.
I am not writing any code, as I guess the question is more theoretical than practical.
Does anyone know how this works?
Thank you so much!!!!
EDIT:
Teacher table
@Entity
@Table(name="profesores")
public class Profesor implements Serializable {
@Id
@GeneratedValue(strategy=GenerationType.IDENTITY)
private Long id;
@Column(name="nombre")
private String nombre;
@OneToMany(cascade = CascadeType.ALL, fetch = FetchType.LAZY)
@JoinColumn(name = "profesor_id", referencedColumnName = "id")
private List<Curso> cursos = new ArrayList<>();
}
Course Table
@Entity
@Table(name = "curso")
public class Curso implements Serializable {
@Id
@GeneratedValue(strategy=GenerationType.IDENTITY)
private Long curso_id;
private String nombre;
@Column(name="profesor_id")
private Long profesorId;
}
Controller
@GetMapping("/profesor/{id}")
public ResponseEntity<?> getProfesor(@PathVariable(value = "id") Long id){
Profesor p = profesorService.findById(id);
if(p!=null) {
ResponseEntity<?> re = new ResponseEntity<>(p, HttpStatus.OK);
//Just one query executed. I don't know the courses yet
return re;
}
else {
return new ResponseEntity<Void>(HttpStatus.NOT_FOUND);
}
}
After the return re; statement, somewhere, the courses are retrieved and JPA queries for them. I don't know what the controller calls afterwards, as I call it directly from Postman.
After the controller returns, the Profesor entity is serialized for the response. When the serializer tries to access the courses field, JPA loads the courses as well. To solve this issue, you can create a response class (without the courses field):
public class ProfesorResponse {
    private Long id;
    private String nombre;
    // ...constructor
}
Then map your entity into the response object and return it:
Profesor p = profesorService.findById(id);
ProfesorResponse response = new ProfesorResponse(p.getId(), p.getNombre());
return new ResponseEntity<>(response, HttpStatus.OK);
I'm migrating my Spring Boot REST API from 1.5.4 to 2.0.3.
These are my two entities, a repository for one of them and a controller for accessing them:
Parent.java
@Entity
@Table(name = "PARENT")
public class Parent implements Serializable {
@Id
@GeneratedValue
@Column(name = "ID")
private Long id;
@OneToMany(mappedBy = "parent", fetch = FetchType.LAZY)
private List<Child> children;
}
Child.java
@Entity
@Table(name = "CHILD")
public class Child implements Serializable {
@Id
@GeneratedValue
@Column(name = "ID")
private Long id;
@Column(name = "PARENT_ID")
private Long parentId;
@JsonIgnore
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "PARENT_ID")
private Parent parent;
@Column(name = "name")
private String name;
}
ParentRepository.java
public interface ParentRepository extends JpaRepository<Parent, Long> {
}
ParentController.java
@RestController
@RequestMapping("/parents")
public class ParentController {
@Autowired
private ParentRepository parentRepository;
@RequestMapping(method = RequestMethod.GET)
public List<Parent> getParents() {
return parentRepository.findAll();
}
}
It appears that there is no longer an active session in the @RestController classes since
parentRepository.findAll().get(0).getChildren().get(0).getName();
now throws a
LazyInitializationException: failed to lazily initialize a collection of role: com.mycompany.myapplication.entity.Parent.children, could not initialize proxy - no Session
This can be fixed by setting a @Transactional annotation on either the controller method or the controller class.
However, the problem I have regards the lazily loaded children.
If I run the example code above, even with the @Transactional annotation, I get the same exception but with a nested
com.fasterxml.jackson.databind.JsonMappingException
This is because the serialization to JSON happens outside of the controller, hence outside the active session.
There is an ugly fix for this, by reading some data from each child before exiting the method:
@RequestMapping(method = RequestMethod.GET)
public List<Parent> getParents() {
List<Parent> parents = parentRepository.findAll();
parents.stream()
.flatMap(p -> p.getChildren().stream())
.forEach(Child::getName);
return parents;
}
This works, but is terribly ugly and adds a lot of boilerplate.
Another solution would be to map all entities to DTOs before returning them to the client. But this solution adds another layer to my application which I don't want.
Is there a way to make sure that there is an active session during the automagical serialization of the entities?
Soo yeaah...
During migration I had previously set
spring.jpa.open-in-view = false
because I saw a new warning about it in the log. This setting removes exactly the active session I was asking for help with...
Removing this setting and using the default (true) fixed my problem entirely.
I have the following simple application
Users Entity
@Entity
public class Users implements Serializable {
@Id
@GeneratedValue
private long id;
private String name;
@OneToMany(mappedBy = "user", fetch = FetchType.EAGER, cascade = {CascadeType.ALL})
private Set<UserRoleUser> userRoleUser;
// GETTERS AND SETTERS
}
UserRole Entity
@Entity
public class UserRole implements Serializable {
@Id
@GeneratedValue
private long id;
private String roleName;
@OneToMany(mappedBy = "userrole", fetch = FetchType.LAZY, cascade = CascadeType.ALL)
private Set<UserRoleUser> userRoleUser;
// GETTERS AND SETTERS
}
UserRoleUser Many to many resolver class
@Entity
public class UserRoleUser implements Serializable {
@Id
@GeneratedValue
private long id;
@ManyToOne
@JoinColumn(name = "fk_userId")
private Users user;
@ManyToOne
@JoinColumn(name = "fk_userroleId")
private UserRole userrole;
// GETTERS AND SETTERS
}
UserRoleUserRepository
@Repository
@Transactional
public interface UserRoleUserRepository extends JpaRepository<UserRoleUser, Long>, QueryDslPredicateExecutor<UserRoleUser>{
}
Main Application class
@SpringBootApplication
@Configuration
public class Application {
public static void main(String[] args) {
ConfigurableApplicationContext context = SpringApplication.run(Application.class, args);
UserRoleUserRepository userRoleUserRepository = context.getBean(UserRoleUserRepository.class);
Iterable<UserRoleUser> findAll = userRoleUserRepository.findAll(QUserRoleUser.userRoleUser.id.gt(0));
for (UserRoleUser userRoleUser : findAll) {
userRoleUserRepository.delete(userRoleUser);
}
}
}
On running the main application, the database records in the UserRoleUser table are not being deleted. What could be the issue? I am using Spring Data and QueryDsl.
I have also tried putting the delete functionality in a controller, but it still doesn't work.
@RestController
@RequestMapping("/api")
public class DeleteController {
@Autowired
UserRoleUserRepository userRoleUserRepository;
@GetMapping("/delete")
public String delete() {
Iterable<UserRoleUser> findAll = userRoleUserRepository.findAll(QUserRoleUser.userRoleUser.id.gt(0));
for (UserRoleUser userRoleUser : findAll) {
userRoleUserRepository.delete(userRoleUser);
}
return new Date().toString();
}
}
If you want to keep using the methods provided by the repository interface, use JpaRepository.deleteInBatch() instead of CrudRepository.delete(). This solves the problem.
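A minimal sketch of that, reusing the query from the question (the repository already extends JpaRepository, so deleteInBatch is available):

Iterable<UserRoleUser> findAll = userRoleUserRepository.findAll(QUserRoleUser.userRoleUser.id.gt(0));
userRoleUserRepository.deleteInBatch(findAll); // issues a single bulk DELETE statement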
The problem is the entities are still attached and will not be deleted until they become detached. If you delete by their id instead of the entity itself, it will delete them.
One thing I noticed is you are deleting the users one at a time which could lead to a database performance hit as the query will be recreated each time. The easiest thing to do is to add all the ids to a set then delete the set of ids. Something like this:
Set<Long> idList = new HashSet<>();
for (UserRoleUser userRoleUser : findAll) {
    idList.add(userRoleUser.getId());
}
if (!idList.isEmpty()) {
    userRoleUserRepository.delete(idList);
}
then in your repository add the delete method
@Modifying
@Query("DELETE FROM UserRoleUser uru WHERE uru.id IN ?1")
@Transactional
void delete(Set<Long> ids);
The reason why the child objects (UserRoleUser) are not being deleted on the userRoleUserRepository.delete(userRoleUser) call is that each UserRoleUser points to a Users which in turn holds a @OneToMany reference Set<UserRoleUser> userRoleUser.
As described in this StackOverflow answer, what your JPA implementation (e.g. Hibernate) effectively does is:
The cache takes note of the requested child exclusion
The cache however does not verify any changes in Set<UserRoleUser>
As the parent @OneToMany field has not been updated, no changes are made
A solution would be to first remove the child element from Set<UserRoleUser> and then proceed with userRoleUserRepository.delete(userRoleUser) or userRepository.save(user), as sketched below.
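A hedged sketch of that approach, using the entities from the question (it assumes the usual getters are present):

for (UserRoleUser userRoleUser : findAll) {
    Users user = userRoleUser.getUser();
    user.getUserRoleUser().remove(userRoleUser); // keep the parent's collection in sync first
    userRoleUserRepository.delete(userRoleUser); // the delete is no longer undone via the parent
}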
In order to avoid this complication two answers have been provided:
Remove the element by id, by calling userRoleUserRepository.deleteById(userRoleUser.getId()): in this case the entity structure (and therefore the parent) is not checked before deletion. In the analogous case of deleteAll, something more convoluted such as userRoleUserRepository.deleteByIdIn(userRoleUserList.stream().map(UserRoleUser::getId).collect(Collectors.toList())) would have to be employed
Convert your CrudRepository to a JpaRepository and use its deleteInBatch(userRoleUserList) method. As explained in this article and this StackOverflow answer, the deleteInBatch method tries to delete all records at once, possibly generating a StackOverflowError in case the number of records is too large. As repo.deleteAll() removes one record at a time, it minimizes this risk (unless the call is itself inside a @Transactional method)
According to this StackOverflow answer, extra care should be taken when resorting to deleteInBatch as it:
Does not cascade to other entities
Does not update the persistence context, requiring it to be cleared (the method bypasses the cache)
Finally, as far as I know, there is no way this could be done by simply calling userRoleUserRepository.delete(userRoleUser) without first updating the parent object. Any updates on this (whether by allowing such behaviour through annotations, configuration or any other means) would be a welcome addition to the answer.
I have a spring 4 app where I'm trying to delete an instance of an entity from my database. I have the following entity:
@Entity
public class Token implements Serializable {
@Id
@SequenceGenerator(name = "seqToken", sequenceName = "SEQ_TOKEN", initialValue = 500, allocationSize = 1)
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "seqToken")
@Column(name = "TOKEN_ID", nullable = false, precision = 19, scale = 0)
private Long id;
@NotNull
@Column(name = "VALUE", unique = true)
private String value;
@ManyToOne(fetch = FetchType.EAGER)
@JoinColumn(name = "USER_ACCOUNT_ID", nullable = false)
private UserAccount userAccount;
@Temporal(TemporalType.TIMESTAMP)
@Column(name = "EXPIRES", length = 11)
private Date expires;
...
// getters and setters omitted to keep it simple
}
I have a JpaRepository interface defined:
public interface TokenRepository extends JpaRepository<Token, Long> {
Token findByValue(@Param("value") String value);
}
I have a unit test setup that works with an in memory database (H2) and I am pre-filling the database with two tokens:
@Test
public void testDeleteToken() {
assertThat(tokenRepository.findAll().size(), is(2));
Token deleted = tokenRepository.findOne(1L);
tokenRepository.delete(deleted);
tokenRepository.flush();
assertThat(tokenRepository.findAll().size(), is(1));
}
The first assertion passes, the second fails. I tried another test that changes the token value and saves that to the database and it does indeed work, so I'm not sure why delete isn't working. It doesn't throw any exceptions either, just doesn't persist it to the database. It doesn't work against my oracle database either.
Edit
Still having this issue. I was able to get the delete to persist to the database by adding this to my TokenRepository interface:
@Modifying
@Query("delete from Token t where t.id = ?1")
void delete(Long entityId);
However this is not an ideal solution. Any ideas as to what I need to do to get it working without this extra method?
Most probably such behaviour occurs when you have a bidirectional relationship and you're not synchronizing both sides WHILE having both parent and child persisted (attached to the current session).
This is tricky and I'm gonna explain this with the following example.
@Entity
public class Parent {
@Id
@GeneratedValue(strategy = IDENTITY)
@Column(name = "id", unique = true, nullable = false)
private Long id;
@OneToMany(cascade = CascadeType.PERSIST, mappedBy = "parent")
private Set<Child> children = new HashSet<>(0);
public void setChildren(Set<Child> children) {
this.children = children;
this.children.forEach(child -> child.setParent(this));
}
}
@Entity
public class Child {
@Id
@GeneratedValue(strategy = IDENTITY)
@Column(name = "id", unique = true, nullable = false)
private Long id;
@ManyToOne
@JoinColumn(name = "parent_id")
private Parent parent;
public void setParent(Parent parent) {
this.parent = parent;
}
}
Let's write a test (a transactional one btw)
public class ParentTest extends IntegrationTestSpec {
@Autowired
private ParentRepository parentRepository;
@Autowired
private ChildRepository childRepository;
@Autowired
private ParentFixture parentFixture;
@Test
public void test() {
Parent parent = new Parent();
Child child = new Child();
parent.setChildren(Set.of(child));
parentRepository.save(parent);
Child fetchedChild = childRepository.findAll().get(0);
childRepository.delete(fetchedChild);
assertEquals(1, parentRepository.count());
assertEquals(0, childRepository.count()); // FAILS!!! childRepostitory.counts() returns 1
}
}
Pretty simple test, right? We're creating a parent and a child, saving them to the database, then fetching the child from the database, removing it, and finally making sure everything works as expected. And it doesn't.
The delete here didn't work because we didn't synchronize the other side of the relationship, which is PERSISTED IN THE CURRENT SESSION. If the Parent wasn't associated with the current session, our test would pass, i.e.
@Component
public class ParentFixture {
...
@Transactional(propagation = Propagation.REQUIRES_NEW)
public void thereIsParentWithChildren() {
Parent parent = new Parent();
Child child = new Child();
parent.setChildren(Set.of(child));
parentRepository.save(parent);
}
}
and
@Test
public void test() {
parentFixture.thereIsParentWithChildren(); // we're saving Child and Parent in seperate transaction
Child fetchedChild = childRepository.findAll().get(0);
childRepository.delete(fetchedChild);
assertEquals(1, parentRepository.count());
assertEquals(0, childRepository.count()); // WORKS!
}
Of course it only proves my point and explains the behaviour the OP faced. The proper way to go is obviously keeping both sides of the relationship in sync, which means:
class Parent {
...
public void dismissChild(Child child) {
this.children.remove(child);
}
public void dismissChildren() {
this.children.forEach(child -> child.dismissParent()); // SYNCHRONIZING THE OTHER SIDE OF RELATIONSHIP
this.children.clear();
}
}
class Child {
...
public void dismissParent() {
this.parent.dismissChild(this); //SYNCHRONIZING THE OTHER SIDE OF RELATIONSHIP
this.parent = null;
}
}
Obviously @PreRemove could be used here.
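For completeness, a rough sketch of what that @PreRemove variant could look like on Child in this example (my own illustration, not code from the original answer):

@PreRemove
private void detachFromParent() {
    if (parent != null) {
        parent.dismissChild(this); // keep the parent's collection in sync before removal
        parent = null;
    }
}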
I had the same problem.
Perhaps your UserAccount entity has a @OneToMany with a cascade on some attribute.
I just removed the cascade, and then it could persist the deletion...
You need to add a @PreRemove function in the class where you have a collection of the other entity as an attribute, e.g. in the Education class which has a relation with UserProfile.
Education.java
private Set<UserProfile> userProfiles = new HashSet<UserProfile>(0);

@ManyToMany(fetch = FetchType.EAGER, mappedBy = "educations")
public Set<UserProfile> getUserProfiles() {
    return this.userProfiles;
}

@PreRemove
private void removeEducationFromUserProfiles() {
    for (UserProfile u : userProfiles) {
        u.getEducations().remove(this);
    }
}
One way is to use cascade = CascadeType.ALL like this in your UserAccount entity:
@OneToMany(cascade = CascadeType.ALL)
private List<Token> tokens;
Then do something like the following (or similar logic)
@Transactional
public void deleteUserToken(Token token) {
    UserAccount userAccount = token.getUserAccount();
    userAccount.getTokens().remove(token);
}
Notice the @Transactional annotation. This will allow Spring (Hibernate) to know whether you want to persist, merge, or whatever it is you are doing in the method. AFAIK the example above should work as if you had no CascadeType set and called JpaRepository.delete(token).
This is for anyone coming from Google on why their delete method is not working in Spring Boot/Hibernate, whether it's used from the JpaRepository/CrudRepository's delete or from a custom repository calling session.delete(entity) or entityManager.remove(entity).
I was upgrading from Spring Boot 1.5 to version 2.2.6 (and Hibernate 5.4.13) and had been using a custom configuration for transactionManager, something like this:
@Bean
public HibernateTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
return new HibernateTransactionManager(entityManagerFactory.unwrap(SessionFactory.class));
}
And I managed to solve it by using @EnableTransactionManagement and deleting the custom transactionManager bean definition above.
If you still have to use a custom transaction manager of sorts, changing the bean definition to the code below may also work:
@Bean
public PlatformTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
return new JpaTransactionManager(entityManagerFactory);
}
As a final note, remember to enable Spring Boot's auto-configuration so the entityManagerFactory bean can be created automatically, and also remove any sessionFactory bean if you're upgrading to entityManager (otherwise Spring Boot won't do the auto-configuration properly). And lastly, ensure that your methods are @Transactional if you're not dealing with transactions manually.
I was facing a similar issue.
Solution 1:
The reason why the records are not being deleted could be that the entities are still attached. So we have to detach them first and then try to delete them.
Here is my code example:
User Entity:
@Entity
public class User {
@OneToMany(cascade = CascadeType.ALL, fetch = FetchType.LAZY, mappedBy = "user")
private List<Contact> contacts = new ArrayList<>();
}
Contact Entity:
@Entity
public class Contact {
@Id
private int cId;
@ManyToOne
private User user;
}
Delete Code:
user.getContacts().removeIf(c -> c.getcId() == contact.getcId());
this.userRepository.save(user);
this.contactRepository.delete(contact);
Here we are first removing the Contact object (which we want to delete) from the User's contacts ArrayList, and then we are using the delete() method.
Solution 2:
Here we are using the orphanRemoval attribute, which is used to delete orphaned entities from the database. An entity that is no longer attached to its parent is known as an orphaned entity.
Code example:
User Entity:
@Entity
public class User {
@OneToMany(cascade = CascadeType.ALL, fetch = FetchType.LAZY, mappedBy = "user", orphanRemoval = true)
private List<Contact> contacts = new ArrayList<>();
}
Contact Entity:
@Entity
public class Contact {
@Id
private int cId;
@ManyToOne
private User user;
}
Delete Code:
user.getContacts().removeIf(c -> c.getcId() == contact.getcId());
this.userRepository.save(user);
Here, as the Contact entity is no longer attached to its parent, it is an orphaned entity and will be deleted from the database.
I just went through this too. In my case, I had to make the child table have a nullable foreign key column, remove the parent from the relationship by setting it to null, and then call save, delete, and flush.
I didn't see a delete in the log or any exception prior to doing this.
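A minimal sketch of that sequence (entity, setter and repository names are illustrative, not from the original post):

child.setParent(null);   // break the association first; the FK column must be nullable
childRepository.save(child);
childRepository.delete(child);
childRepository.flush(); // force the DELETE to be flushed to the database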
If you use a newer version of Spring Data, you could use the deleteBy syntax...so you are able to remove one of your annotations :P
The next thing is that the behaviour is already tracked by a Jira ticket:
https://jira.spring.io/browse/DATAJPA-727
@Transactional
int deleteAuthorByName(String name);
You should add @Transactional to your repository that extends JpaRepository.
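Applied to the repository from the question, that suggestion would look roughly like this:

@Transactional
public interface TokenRepository extends JpaRepository<Token, Long> {
    Token findByValue(@Param("value") String value);
}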
Your initial value for the id is 500. That means your ids start at 500.
@SequenceGenerator(name = "seqToken", sequenceName = "SEQ_TOKEN",
initialValue = 500, allocationSize = 1)
And you select one item with id 1 here
Token deleted = tokenRepository.findOne(1L);
So check your database to clarify that
I have the same problem: the test is OK, but the row isn't deleted from the DB.
Have you added the @Transactional annotation to the method? For me this change made it work.
In my case it was CascadeType.PERSIST; I changed it to CascadeType.ALL and made the change through the cascade (by changing the parent object).
CascadeType.PERSIST and orphanRemoval=true don't work together.
Try calling deleteById instead of delete on the repository. I also noticed that you are providing an Optional entity to the delete (since findOne returns an Optional entity). It is actually strange that you are not getting any compilation errors because of this. Anyways, my thinking is that the repository is not finding the entity to delete.
Try this instead:
@Test
public void testDeleteToken() {
    assertThat(tokenRepository.findAll().size(), is(2));
    Optional<Token> toDelete = tokenRepository.findById(1L);
    toDelete.ifPresent(token -> tokenRepository.deleteById(token.getId()));
    tokenRepository.flush();
    assertThat(tokenRepository.findAll().size(), is(1));
}
By doing the above, you can avoid having to add the @Modifying query to your repository (since what you are implementing in that @Modifying query is essentially the same as calling deleteById, which already exists on the JpaRepository interface).