Hello good people, I came across some weird behaviour in my tests. I'm using JPA/Hibernate annotations with Spring.
Let's say I have a class MyObject whose email property is marked with:
@Column(name="EMAIL", length=100, unique=true)
private String email;
I prepare the data I need to be in the database in the setup of the test class MyObjectDAOImplTest:
@Autowired
MyObject1 ob1;
@Autowired
MyObject1 ob2;
@Before
public void setUp() {
dao = manager.createthedao();
....
ob1.setEmail("some@email.com");
....
....
ob2.setEmail("someother@email.com");
....
dao.save(ob1);
dao.save(ob2);
}
Apart from the first test method, all the rest are failing. The failures are about duplicate values in the email column, but I have hbm2ddl.auto=create and I even used create-drop, but still. I just don't get it. I've used this setup in so many projects (without the unique constraint, of course) and I expected the database to be dropped each time a test method is run. Is there anything about unique I should be aware of? Thanks for reading; give me your suggestions. Did I leave something out or fail to do something?
You're missing an @After method, which is why you're seeing this behaviour. When running JUnit 4.x tests, the whole suite is run in a single thread, one test after another, which means that you have to clear the state yourself or unspecified behaviour occurs; usually resources keep hanging around and cause side effects in other unit tests.
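For example, a cleanup along these lines (a sketch only; dao.delete(...) stands in for whatever removal method your DAO actually exposes) clears the rows between test methods:

@After
public void tearDown() {
    // remove the rows created in setUp() so the next test method starts with a clean table
    dao.delete(ob1);
    dao.delete(ob2);
}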
Shouldn't you have some code to drop/remove the unit-test database after (or preferably before) each test? Are you sure that you are actually creating the database at all? What database engine are you using?
If you are using some memory-based database, are you initializing it in the right place (every time a test is executed)?
Are you calling SessionFactory.close() somewhere? If you are using hibernate.hbm2ddl.auto=create-drop, that should handle the dropping of the database.
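For instance, if the test code (rather than Spring) owns the SessionFactory, closing it in a class-level teardown is what actually triggers the drop; this is just a sketch, adapt it to however the factory is built in your setup:

@AfterClass
public static void closeSessionFactory() {
    // with hibernate.hbm2ddl.auto=create-drop the schema is only dropped
    // when the SessionFactory itself is closed
    sessionFactory.close();
}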
What is the correct way to initialize some relatively big data, and share it (read-only, so thread-safe) across all JUnit 5 tests?
I've looked at this answer and others that are similar, but I always seem to have one or two more levels of assembly/instantiation than they deal with.
My testing setup is this:
I have a custom Repository data structure that needs to be initialized just once, read from multiple sources and assembled (about 100-200 MB), and then shared with all the tests.
Each test class instantiates an Engine in @BeforeAll that needs the repository above, and then goes on to execute its tests in series, calling engine.reset() between tests. Each test has its own unique setup. Engine is semi-heavy, and it is impractical to have one per test.
@TestInstance(TestInstance.Lifecycle.PER_CLASS) is used so we get only one instance per test class (and one engine per class).
Multithreaded/parallel testing is used: test classes run in parallel, and methods within a class run in sequence. This means:
systemProperty("junit.jupiter.execution.parallel.enabled", true)
systemProperty("junit.jupiter.execution.parallel.mode.default", "same_thread")
systemProperty("junit.jupiter.execution.parallel.mode.classes.default", "concurrent")
systemProperty("junit.jupiter.execution.parallel.config.strategy","dynamic")
systemProperty("junit.jupiter.execution.parallel.config.dynamic.factor",1) // could be 2!
Since there is nothing before @BeforeAll, I had to improvise:
I ended up declaring the repository at the top level of a Kotlin test file, outside the class, and initializing it like this (large irrelevant chunks are omitted for clarity):
TestSetAlpha.kt:
import org.junit.jupiter.api.*
val database: Repository = Repository().also {
    it.setupData(Config(...))
    it.someOtherInit()
    // ... more initialization ...
}
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
class `Engine Test Set ALPHA` {

    var eng = Engine()

    @BeforeAll
    fun initAll() {
        // configure Engine
        println("Configuring Engine ALPHA")
        eng.setDatabase(database)
        eng.configure {
            ....
            ....
        }
    }

    @BeforeEach
    fun init() {
        // reset the engine
        eng.reset()
    }

    @Test
    fun `A simple test`() {
        eng.add(...)
        eng.add(...)
        eng.execute()
        // interrogate the resulting state
        assert(eng.property == ...)
        ...
    }
}
In subsequent test class files, I can reuse the same database Repository, and it really only initializes once at the project level (verified!). There are no changes and no mutability on the repository after it loads, and that is guaranteed by its API. This means that on a 16-thread CPU, I can reuse the database and run roughly 16 test classes in parallel.
I'm not sure about the loading and instantiation semantics of that global val. With a lot of data, JUnit 5 waits for the also closure to complete before continuing with any tests, probably because it can't proceed with the classes in those files? I've never gotten an error, but I feel this will probably break with a future update or on another platform, because it's not clean and looks like a hack.
I would like to specify, and have a guarantee, that the repository is instantiated and shared properly across all classes and files before the threads start. How do you go about doing that, though? There isn't any kind of top-level, global @BeforeBeforeAll, although that would be exactly what I require. Any feedback and refactoring is welcome. I can't run the tests without parallelism, of course.
Far simpler than I thought it would be!
At top-level scope, or in another file, use a singleton object!
object DatabaseProvider {
    val database: Repository by lazy(LazyThreadSafetyMode.SYNCHRONIZED) {
        val r = Repository()
        r.setupData(Config(...))
        // Load and add everything into the database
        return@lazy r
    }
}
and then in each test class, you plug in the database as part of initialization:
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
class `Engine Test Set ALPHA` {

    var eng = Engine()

    @BeforeAll
    fun initAll() {
        // configure Engine
        eng.setupRepo(DatabaseProvider.database)
        eng.configure = ....
        println("Configuration of Engine 1 DONE!")
    }
}
Note the lazy initialization mode set to SYNCHRONIZED.
The @BeforeAll methods may start before the database repository has been loaded, but each test class will block until the initialization of the repository is done, and then continue.
My unit tests are seeing org.hibernate.LazyInitializationException: could not initialize proxy [org.openapitools.entity.MenuItem#5] - no Session. I'm not sure why they expect a session in a unit test. I'm trying to write to an in-memory H2 database for the unit tests of my Controller classes that implement the RESTful APIs. I'm not using any mock objects for the tests, because I want to test the actual database transactions. This worked fine when I was using Spring Boot version 1.x, but broke when I moved to version 2. (I'm not sure that's what caused the tests to break, since I made lots of other changes. My point is that my code has passed these tests before.)
My repositories extend JpaRepository, so I'm using a standard Hibernate interface.
There are many answers to this question on Stack Overflow, but very few describe a solution that I could use with Spring Data.
Addendum: Here's a look at the unit test:
@Test
public void testDeleteOption() throws ResponseException {
    MenuItemDto menuItemDto = createPizzaMenuItem();
    ResponseEntity<CreatedResponse> responseEntity
            = adminApiController.addMenuItem(menuItemDto);
    final CreatedResponse body = responseEntity.getBody();
    assertNotNull(body);
    Integer id = body.getId();
    MenuItem item = menuItemApiController.getMenuItemTestOnly(id);
    // Hibernate.initialize(item); // attempted fix blows up
    List<String> nameList = new LinkedList<>();
    for (MenuItemOption option : item.getAllowedOptions()) { // blows up here
        nameList.add(option.getName());
    }
    assertThat(nameList, hasItems("pepperoni", "olives", "onions"));
    // ... (more code)
}
My test application.properties has these settings
spring.datasource.url=jdbc:h2:mem:pizzaChallenge;DB_CLOSE_ON_EXIT=FALSE
spring.datasource.username=pizza
spring.datasource.password=pizza
spring.jpa.show-sql=true
This is not standard Hibernate but Spring Data. You have to understand that Hibernate uses lazy loading to avoid loading the whole object graph from the database. If you close the session or the connection to the database, e.g. by ending a transaction, Hibernate can't lazy-load anymore, and apparently your code tries to access state that needs lazy loading.
You can use @EntityGraph on your repository to specify that an association should be fetched, or you can avoid accessing state that isn't initialized outside of a transaction. Maybe you just need to enlarge the transaction scope by putting @Transactional on the method that calls the repository and accesses the state, so that lazy loading works.
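For example, something along these lines might already be enough (a rough sketch; the repository name MenuItemRepository, the derived query findWithOptionsById and the service wrapper are assumptions based on your snippets):

import java.util.Optional;

import org.springframework.data.jpa.repository.EntityGraph;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

// Option 1: ask Spring Data to fetch the association together with the query
public interface MenuItemRepository extends JpaRepository<MenuItem, Integer> {

    @EntityGraph(attributePaths = "allowedOptions")
    Optional<MenuItem> findWithOptionsById(Integer id);
}

// Option 2: widen the transaction so the session is still open for lazy loading
@Service
class MenuItemService {

    private final MenuItemRepository repository;

    MenuItemService(MenuItemRepository repository) {
        this.repository = repository;
    }

    @Transactional(readOnly = true)
    public MenuItem getMenuItemWithOptions(Integer id) {
        MenuItem item = repository.findById(id)
                .orElseThrow(IllegalArgumentException::new);
        item.getAllowedOptions().size(); // touch the collection while the session is open
        return item;
    }
}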
I found a way around this. I'm not sure if this is the best approach, so if anyone has any better ideas, I'd appreciate hearing from them.
Here's what I did. First of all, before reading a value from the lazy-loaded entity, I call Hibernate.initialize(item).
By itself this still throws the same exception, but now I can add a property to the test version of application.properties:
spring.jpa.properties.hibernate.enable_lazy_load_no_trans=true
Now the initialize method will work.
P.S. I haven't been able to find a good reference for Spring properties like this one. If anyone knows where I can see the available properties, I'd love to hear about it. The folks at Spring don't do a very good job of documenting these properties. Even when they mention a specific property, they don't provide a link that might explain it more thoroughly.
The question has already been asked on Stack Overflow, but I haven't found the answer. I can't understand how to unit test my DAO and service layers (you will find the methods I would like to test below). There are two opposite notions regarding testing DAOs: one is not to test them at all, the opposite one is to test them only with an in-memory DB. As for the service layer, there is the opinion that a method should be tested only if it contains business logic. Frankly, I can't even imagine what to do, because I'm not sure which way is correct. In my last pet project I tested the DAO using JUnit only (for example, for saveEntity()): I explicitly created an entity, populated it, saved it using the DAO method, then retrieved it, asserted the result, and explicitly deleted the object from the DB. I'm sure that is not the proper way to test it. So please take a look at the code below and give me advice on how to test these layers' methods properly:
DAO
@Override
public void saveEntity(Artist entity) throws ConstraintViolationException {
    sessionFactory.getCurrentSession().save(entity);
}
Service
@Transactional
@Override
public void saveEntity(Artist entity) throws ConstraintViolationException {
    artistDAO.saveEntity(entity);
}
I wanted to do it with Mockito, but none of the examples I found are similar to my case.
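Roughly, the kind of test I imagine for the service layer would look like the sketch below (I'm sketching the DAO interface as ArtistDAO and the service as ArtistServiceImpl with constructor injection just to illustrate the idea):

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

import org.junit.Before;
import org.junit.Test;

public class ArtistServiceTest {

    private ArtistDAO artistDAO;           // mocked, so no real database is involved
    private ArtistServiceImpl artistService;

    @Before
    public void setUp() {
        artistDAO = mock(ArtistDAO.class);
        artistService = new ArtistServiceImpl(artistDAO); // constructor injection assumed
    }

    @Test
    public void saveEntityDelegatesToDao() throws Exception {
        Artist artist = new Artist();
        artistService.saveEntity(artist);
        // the service method has no logic of its own, so the only thing
        // worth asserting is that it delegates to the DAO
        verify(artistDAO).saveEntity(artist);
    }
}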
Thank you for any ideas how to do that.
I learnt that Flyway migrations in Java work with a JDBC connection, and there is also Spring support through JdbcTemplate, but Flyway doesn't work with DAOs.
For tables/entities with more relationships, it makes life much easier to do migrations with DAOs rather than SQL.
Is there a solution or workaround to deal with this?
First, Flyway has its own transaction management and does not use Spring transaction handling.
If your DAOs extend JdbcDaoSupport, you could instantiate your DAO manually and then inject the provided JdbcTemplate into it:
public class MyJdbcMigration implements SpringJdbcMigration {
    public void migrate(JdbcTemplate jdbcTemplate) {
        MyJdbcDao dao = new MyJdbcDao();
        dao.setJdbcTemplate(jdbcTemplate);
        dao.updateDate();
    }
}
I know this comes very late, but for future visitors with the same problem this might be helpful.
In my opinion, the creator of Flyway is actually wrong on this subject. It's perfectly fine to migrate data with business logic, and there is no chicken-and-egg problem, as long as you do not change the structure of the database in your update script.
One example: you have a field "password" in your database and it is clear text. Because of security concerns you now want to use a special hash function and hash all passwords in the database (it should be a secure one and the database does not have a function to do that). The hash function is declared in your UserDAO and called when the user is created or when they change their password. Although that's not a perfect example, there are many possible scenarios where accessing a DAO for the migration makes sense.
Fortunately a work colleague of mine found a solution to the problem, and it only requires around 5 lines of code. You also need to add Apache DeltaSpike to your dependencies, if it isn't there already.
In your DAO, add an import for BeanProvider:
import org.apache.deltaspike.core.api.provider.BeanProvider;
Then we simply make the DAO a singleton:
public static UserDao getInstance() {
    return BeanProvider.getContextualReference(UserDao.class, false, new DaoLiteral());
}
That's pretty much it. In your Flyway script you can now access the DAO:
@Override
public void migrate(Connection cnctn) throws Exception {
    UserDao userdao = UserDao.getInstance();
    List<User> userList = userdao.getAllUsers();
    ...
}
Explanation: the migration class (VX_yourflywaymigrationscript) is not managed by the CDI container, so it's not possible to inject the DAO. BeanProvider is made for exactly that: it can load a bean and give you a reference to it, even if you are not in a CDI context.
I hope that helps.
Your DAOs rely on the very structure Flyway was designed to change. We therefore have a chicken-and-egg problem here. The way to solve this is to run Flyway before the rest of your application (including the DAOs) gets initialized.
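For example, with the plain Flyway API that ordering looks roughly like this (a sketch; recent Flyway versions use the fluent configure() API shown here, and the connection settings are placeholders):

import org.flywaydb.core.Flyway;

public class Bootstrap {

    public static void main(String[] args) {
        // 1. Run the migrations first, against the raw JDBC settings
        Flyway flyway = Flyway.configure()
                .dataSource("jdbc:h2:mem:app", "sa", "")
                .load();
        flyway.migrate();

        // 2. Only now bootstrap the rest of the application (Spring context,
        //    DAOs, ...), so it sees the schema Flyway has just created/updated.
    }
}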
So I have an interesting problem that I will need some help with. I know a bunch of questions have been asked about rollbacks in transactions using JUnit, but I believe my problem is slightly different. To give people a better understanding of the problem, let me start from the beginning.
I have implemented a UserManagementService with its respective DAO for a user management system. There is a general method called createUser(User obj) that is used to create a unique user. Now, there is a constraint that email addresses are unique, so if we invoke this method with an email address that has already been used, we throw a custom exception called UserManagementException with its respective error message. All this works fine; however, the problem I am having is with the unit test. Oh, before I forget, let me mention the software stack I am using: Java, Spring, Hibernate.
My unit test class has the @Transactional annotation on each method that actually hits the DB. These methods also have the @Rollback annotation so that all inserts, updates and deletions are rolled back at the end of each test invocation. The problem I am facing is that I would like to test the unique-user constraint scenario: by calling createUser(obj) a second time with a user object with the same email address, I want to ensure that the UserManagementException is thrown. However, since the test is transactional, whenever the exception is thrown the transaction is rolled back before the unit test completes, and hence the test fails. Below is the test case.
@Test
@Rollback
@Transactional
public void testUniqueCreateConsoleUser() {
    boolean success;
    ConsoleUser newUser;
    // first one
    userManagementDao.createConsoleUser(user);
    // second one. This should throw a UserManagementException
    try {
        // now try and insert a new user with the same email
        newUser = new ConsoleUser("Queen", "Kong", "king.kong@blah.com", "kingkong",
                "Universal Studios", "America/Los_Angeles", false, null);
        userManagementDao.createConsoleUser(newUser);
        // if this passed, this is a problem. Console users should have unique email addresses
        success = false;
    } catch (UserManagementException e) {
        success = true;
    }
    Assert.assertTrue(success);
}
The weird thing is that when I run it through the debugger, the Assert.assertTrue() method is invoked correctly, but the test ultimately fails.
Another thing I tried was to add a property to the @Transactional annotation: I added @Transactional(noRollbackFor = UserManagementException.class) in the hope that if the exception was thrown, the rollback wouldn't be invoked right away but only at the end of the test. I may be approaching this the wrong way, so any ideas or best practices around this sort of testing would be greatly appreciated.
Note: below is a snippet from the stack trace:
org.springframework.transaction.UnexpectedRollbackException: Transaction rolled back because it has been marked as rollback-only
at org.springframework.transaction.support.AbstractPlatformTransactionManager.commit(AbstractPlatformTransactionManager.java:695)
at org.springframework.transaction.interceptor.TransactionAspectSupport.commitTransactionAfterReturning(TransactionAspectSupport.java:321)
at org.springframework.transaction.aspectj.AbstractTransactionAspect.ajc$afterReturning$org_springframework_transaction_aspectj_AbstractTransactionAspect
It's hard to tell from your example, but you seem to be testing against your actual DAO implementation. Rather than have unit-test data hit your actual database, mock your DAO with either a hand-rolled mock implementation or a mocking framework. You can then manipulate the returned data programmatically and contort it into whatever validation scenario you want.
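For example, with Mockito you could simulate the duplicate-email case without touching the database at all (a sketch; the constructor wiring of the service and the UserManagementException constructor are assumptions about your code):

import static org.mockito.Mockito.doThrow;
import static org.mockito.Mockito.mock;

import org.junit.Test;

public class UserManagementServiceTest {

    @Test(expected = UserManagementException.class)
    public void duplicateEmailIsRejected() {
        UserManagementDao userManagementDao = mock(UserManagementDao.class);
        // assumed: the service takes its DAO as a constructor argument
        UserManagementService service = new UserManagementService(userManagementDao);

        ConsoleUser duplicate = new ConsoleUser("Queen", "Kong", "king.kong@blah.com",
                "kingkong", "Universal Studios", "America/Los_Angeles", false, null);

        // make the mocked DAO behave as if the unique constraint had been violated
        doThrow(new UserManagementException("duplicate email"))
                .when(userManagementDao).createConsoleUser(duplicate);

        // should propagate the UserManagementException and satisfy the expectation above
        service.createUser(duplicate);
    }
}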
If you can confirm that an extra rollback is triggered (for example, when Spring does the insert and sees that it fails, does it already roll the transaction back?), then you should catch that rollback, or configure Spring not to roll the transaction back.
That is, clearly, the rollback which Spring performs is conflicting with the expected rollback in your unit test. This rollback then confuses the rollback annotation, causing an unexpected exception somewhere in the unit-test/Spring ether.
THE SIMPLE SOLUTION: don't enable the automated rollbacks for this test. Tests don't always have to be perfectly elegant.
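For example (a sketch only; deleteConsoleUser stands in for whatever cleanup method your DAO offers):

// no @Transactional / @Rollback on the test method: the expected failure
// can no longer mark a shared test transaction as rollback-only
@Test
public void testUniqueCreateConsoleUser() {
    userManagementDao.createConsoleUser(user);
    try {
        ConsoleUser newUser = new ConsoleUser("Queen", "Kong", "king.kong@blah.com",
                "kingkong", "Universal Studios", "America/Los_Angeles", false, null);
        userManagementDao.createConsoleUser(newUser);
        Assert.fail("expected a UserManagementException for the duplicate email");
    } catch (UserManagementException expected) {
        // the duplicate was rejected, which is what we want
    } finally {
        // nothing is rolled back automatically any more, so clean up by hand
        userManagementDao.deleteConsoleUser(user);
    }
}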
Rather than inserting a user and then inserting another user with the same email address, I suggest first loading an existing user from the database and then attempting to insert another one with the same email address as the one that was retrieved. Then you simply need to do:
@Test(expected = UserManagementException.class)
public void insert_duplicate_user() throws Exception {
    // Read user from database
    final ConsoleUser user = dao.load(...);

    // Create new user with the same email address.
    final ConsoleUser newUser = new ConsoleUser(...);
    newUser.setEmail(user.getEmail());

    // Write
    dao.createConsoleUser(newUser);

    /*
     * If you get here, there is a problem with your DAO logic
     * and a new user (with the same email) was created.
     * So, we need to clean that up.
     */

    // Delete new user
    dao.deleteUser(newUser);
}
This test will fail unless a UserManagementException is thrown.