My unit tests are seeing org.hibernate.LazyInitializationException: could not initialize proxy [org.openapitools.entity.MenuItem#5] - no Session. I'm not sure why they expect a session in a unit test. I'm trying to write to an in-memory h2 database for the unit tests of my Controller classes that implement the RESTful APIs. I'm not using any mock objects for the test, because I want to test the actual database transactions. This worked fine when I was using Spring-Boot version 1.x, but broke when I moved to version 2. (I'm not sure if that's what caused the tests to break, since I made lots of other changes. My point is that my code has passed these tests already.)
My repositories extend JpaRepository, so I'm using a standard Hibernate interface.
There are many answers to this question on StackOverflow, but very few describe a solution that I could use with Spring-Data.
Addendum: Here's a look at the unit test:
@Test
public void testDeleteOption() throws ResponseException {
    MenuItemDto menuItemDto = createPizzaMenuItem();
    ResponseEntity<CreatedResponse> responseEntity
            = adminApiController.addMenuItem(menuItemDto);
    final CreatedResponse body = responseEntity.getBody();
    assertNotNull(body);
    Integer id = body.getId();
    MenuItem item = menuItemApiController.getMenuItemTestOnly(id);
    // Hibernate.initialize(item); // attempted fix blows up
    List<String> nameList = new LinkedList<>();
    for (MenuItemOption option : item.getAllowedOptions()) { // blows up here
        nameList.add(option.getName());
    }
    assertThat(nameList, hasItems("pepperoni", "olives", "onions"));
    // ... (more code)
}
My test application.properties has these settings:
spring.datasource.url=jdbc:h2:mem:pizzaChallenge;DB_CLOSE_ON_EXIT=FALSE
spring.datasource.username=pizza
spring.datasource.password=pizza
spring.jpa.show-sql=true
This is not standard Hibernate but Spring Data. You have to understand that Hibernate uses lazy loading to avoid loading the whole object graph from the database. If you close the session or connection to the database, e.g. by ending a transaction, Hibernate can't lazy load anymore, and apparently your code tries to access state that needs lazy loading.
You can use @EntityGraph on your repository to specify that an association should be fetched, or you can avoid accessing state that isn't initialized outside of a transaction. Maybe you just need to enlarge the transaction scope by putting @Transactional on the method that calls the repository and accesses the state, so that lazy loading works.
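For instance, a sketch of the @EntityGraph approach; the repository name and the `allowedOptions` attribute path are assumptions based on the test code above:

```java
import java.util.Optional;

import org.springframework.data.jpa.repository.EntityGraph;
import org.springframework.data.jpa.repository.JpaRepository;

// Hypothetical repository: the entity graph tells Spring Data JPA to fetch
// the allowedOptions association together with the MenuItem, so it is
// already initialized when the session closes.
public interface MenuItemRepository extends JpaRepository<MenuItem, Integer> {

    @EntityGraph(attributePaths = "allowedOptions")
    Optional<MenuItem> findById(Integer id);
}
```

With this, the loop over `item.getAllowedOptions()` no longer needs an open session.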
I found a way around this. I'm not sure if this is the best approach, so if anyone has any better ideas, I'd appreciate hearing from them.
Here's what I did. First of all, before reading a value from the lazy-loaded entity, I call Hibernate.initialize(item);
This throws the same exception. But now I can add a property to the test version of application.properties that says
spring.jpa.properties.hibernate.enable_lazy_load_no_trans=true
Now the initialize method will work.
P.S. I haven't been able to find a good reference for Spring properties like this one. If anyone knows where I can see the available properties, I'd love to hear about it. The folks at Spring don't do a very good job of documenting these properties. Even when they mention a specific property, they don't provide a link that might explain it more thoroughly.
Related
In my project, I want methods A and B to use database1, and all other methods to use database2.
Now I write it like this in the controller:
DataSourceHolder.putDataSource("db1");
String code = methodA();   // reads something from db1
DataSourceHolder.putDataSource("db2");
methodC(code);
...
DataSourceHolder.putDataSource("db1");
methodB(code);             // writes something to db1
DataSourceHolder.putDataSource("db2");
In most cases it runs normally, but when many people visit my website it may use the wrong dataSource, and some data cannot be saved or updated.
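That symptom usually means the holder keeps its key in shared mutable state. A common fix (a minimal sketch, not your actual DataSourceHolder) is to store the key in a ThreadLocal, so each request thread carries its own choice; Spring's AbstractRoutingDataSource would then return getDataSource() from its determineCurrentLookupKey():

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Hypothetical thread-safe DataSourceHolder: the current datasource key is
// stored per thread, so concurrent requests cannot overwrite each other.
public class DataSourceHolder {
    private static final ThreadLocal<String> KEY = ThreadLocal.withInitial(() -> "db2");

    public static void putDataSource(String key) { KEY.set(key); }
    public static String getDataSource() { return KEY.get(); }
    public static void clear() { KEY.remove(); }

    public static void main(String[] args) throws Exception {
        // Simulate two concurrent "requests" picking different datasources.
        ExecutorService pool = Executors.newFixedThreadPool(2);
        Future<String> a = pool.submit(() -> {
            putDataSource("db1");
            Thread.sleep(50);          // the other request runs meanwhile
            return getDataSource();    // still "db1" on this thread
        });
        Future<String> b = pool.submit(() -> {
            putDataSource("db2");
            Thread.sleep(50);
            return getDataSource();
        });
        System.out.println(a.get() + " " + b.get()); // prints "db1 db2"
        pool.shutdown();
    }
}
```

Calling clear() at the end of each request (e.g. in a filter's finally block) avoids leaking the key into pooled threads.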
The question has already been asked on Stack Overflow, but I haven't found the answer. I can't understand how to unit test my DAO and service layers (the methods I would like to test are below). There are two opposite notions regarding testing of DAOs: one is not to test them at all, the opposite one is to test them only with an in-memory DB. As for the service layer, some say a method should be tested only if it has business logic. Frankly, I can't even imagine what to do, because I'm not sure which way is correct. In my last pet project I tested the DAO using JUnit only (for example, for saveEntity()): I explicitly created an entity, populated it, saved it with the DAO method, then retrieved it, asserted the result, and explicitly deleted the object from the DB. I'm sure that is not the proper way to test it. So please take a look at the code below and give me advice on how to test these layers' methods properly:
DAO
@Override
public void saveEntity(Artist entity) throws ConstraintViolationException {
    sessionFactory.getCurrentSession().save(entity);
}
Service
@Transactional
@Override
public void saveEntity(Artist entity) throws ConstraintViolationException {
    artistDAO.saveEntity(entity);
}
I wanted to do it with Mockito, but none of the examples I found are similar to my case.
Thank you for any ideas how to do that.
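With Mockito the service test would stub the DAO and use `verify(artistDAO).saveEntity(entity)`; the same idea can be shown dependency-free with a hand-rolled stub. All class names here are hypothetical stand-ins for yours:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: the service test only checks that saveEntity delegates to the
// DAO; the DAO itself is tested separately against a real (in-memory) DB.
public class ArtistServiceDelegationTest {

    static class Artist {
        String name;
        Artist(String name) { this.name = name; }
    }

    interface ArtistDAO {
        void saveEntity(Artist entity);
    }

    static class ArtistService {
        private final ArtistDAO artistDAO;
        ArtistService(ArtistDAO artistDAO) { this.artistDAO = artistDAO; }
        void saveEntity(Artist entity) { artistDAO.saveEntity(entity); }
    }

    public static void main(String[] args) {
        List<Artist> saved = new ArrayList<>();
        // The stub DAO just records every entity it is asked to save.
        ArtistService service = new ArtistService(saved::add);
        service.saveEntity(new Artist("Miles Davis"));
        System.out.println(saved.size() + " " + saved.get(0).name); // prints "1 Miles Davis"
    }
}
```

This keeps the service test fast and free of database setup, while the in-memory-DB tests stay focused on the DAO layer.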
I learnt that Flyway migration with Java works with a JDBC connection, and that there is Spring support through JdbcTemplate, but Flyway doesn't work with DAOs.
For tables/entities with more relationships, it makes life much easier to do the migration with DAOs rather than SQL.
Is there a solution or work-around to deal with this?
First, Flyway has its own transaction management and does not use Spring transaction handling.
If your DAOs extend JdbcDaoSupport, you could instantiate your DAO manually and then manually inject the provided JdbcTemplate into the DAO:
public class MyJdbcMigration implements SpringJdbcMigration {
    public void migrate(JdbcTemplate jdbcTemplate) {
        MyJdbcDao dao = new MyJdbcDao();
        dao.setJdbcTemplate(jdbcTemplate);
        dao.updateDate();
    }
}
I know this comes very late, but for future visitors with the same problem this might be helpful.
In my opinion, the creator of Flyway is actually wrong in this subject. It's perfectly fine to migrate data with business logic and there is no chicken and egg problem, as long as you do not change the structure of the database in your update script.
One example: you have a field "password" in your database and it is clear text. Because of security concerns you now want to use a special hash function and hash all passwords in the database (it should be a secure one and the database does not have a function to do that). The hash function is declared in your UserDAO and called when the user is created or when they change their password. Although that's not a perfect example, there are many possible scenarios where accessing a DAO for the migration makes sense.
Fortunately a work colleague of mine found a solution to the problem, and it only requires around 5 lines of code. You also need to add Apache DeltaSpike to your dependencies, if it isn't there already.
In your DAO, add an import for BeanProvider:
import org.apache.deltaspike.core.api.provider.BeanProvider;
Then we simply make the DAO a singleton:
public static UserDao getInstance() {
    return BeanProvider.getContextualReference(UserDao.class, false, new DaoLiteral());
}
That's pretty much it. In your Flyway script you can now access the DAO:
@Override
public void migrate(Connection cnctn) throws Exception {
    UserDao userdao = UserDao.getInstance();
    List<User> userList = userdao.getAllUsers();
    ...
}
Explanation: the class (VX_yourflywaymigrationscript) is not managed by the CDI container, so it's not possible to inject the DAO. BeanProvider is made for exactly that: it can load a bean and give you the reference, even if you are not in a CDI context.
I hope that helps.
Your DAOs rely on the very structure Flyway was designed to change. We therefore have a chicken and egg problem here. The way to solve this is to run Flyway before the rest of your application (including the DAOs) gets initialized.
I am trying to speed up the Integration tests in our environment. All our classes are autowired. In our applicationContext.xml file we have defined the following:
<context:annotation-config/>
<context:component-scan base-package="com.mycompany.framework"/>
<context:component-scan base-package="com.mycompany.service"/>
...additional directories
I have noticed that Spring scans all the directories indicated above and then iterates over each bean and caches the properties of each one (I went over the DEBUG messages from Spring).
As a result, the following test takes about 14 seconds to run:
public class MyTest extends BaseSpringTest {
    @Test
    void myTest() {
        println "test"
    }
}
Is there any way to lazy load the configuration? I tried adding default-lazy-init="true" but that didn't work.
Ideally, only the beans required for the test are instantiated.
thanks in advance.
Update: I should have stated this before, I do not want to have a context file for each test. I also do not think one context file for just the tests would work. (This test context file would end up including everything)
If you really want to speed up your application context, disable your <context:component-scan> and perform the following routine before running any test:
Resource resource = new ClassPathResource(<PUT_XML_PATH_RIGHT_HERE>); // source.xml, for instance
InputStream in = resource.getInputStream();
Document document = new SAXReader().read(in);
Element root = document.getRootElement();

// remove component-scanning
for (Iterator i = root.elementIterator(); i.hasNext(); ) {
    Element element = (Element) i.next();
    if (element.getNamespacePrefix().equals("context") && element.getName().equals("component-scan")) {
        root.remove(element);
    }
}
in.close();

ClassPathScanningCandidateComponentProvider scanner = new ClassPathScanningCandidateComponentProvider(true);
for (String source : new String[] {"com.mycompany.framework", "com.mycompany.service"}) {
    for (BeanDefinition bd : scanner.findCandidateComponents(source)) {
        root.addElement("bean")
            .addAttribute("class", bd.getBeanClassName());
    }
}

// add the attribute default-lazy-init="true"
root.addAttribute("default-lazy-init", "true");

// create a new XML file which will be used for testing
XMLWriter output = new XMLWriter(new FileWriter(<SET_UP_DESTINATION_RIGHT_HERE>));
output.write(document);
output.close();
Besides that, enable <context:annotation-config/>
As you need to perform the routine above before running any test, you can create an abstract class where you run the following.
Set up a Java system property for the testing environment as follows:
-Doptimized-application-context=false
And
public abstract class Initializer {
    @BeforeClass
    public static void setUpOptimizedApplicationContextFile() {
        if (System.getProperty("optimized-application-context").equals("false")) {
            // do as shown above
            // and
            System.setProperty("optimized-application-context", "true");
        }
    }
}
Now, for each test class, just extend Initializer.
One approach is to skip the auto detection completely and either load up a separate context (with the components required for the test) or redefine your beans at runtime (prior to the test running).
This thread discusses redefinition of beans and a custom test class for doing this:
Spring beans redefinition in unit test environment
This is the price you pay for auto-detection of components - it's slower. Even though your test only requires certain beans, your <context:component-scan> is much broader, and Spring will instantiate and initialise every bean it finds.
I suggest that you use a different beans file for your tests, one which only defines the beans necessary for the test itself, i.e. not using <context:component-scan>.
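For example, a hypothetical test-context.xml that lists only the beans a given test needs (the bean name and class here are placeholders); tests then point at it with @ContextConfiguration("classpath:test-context.xml"):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- test-context.xml: no <context:component-scan>, only the beans the test uses -->
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd"
       default-lazy-init="true">

    <bean id="myService" class="com.mycompany.service.MyService"/>
</beans>
```

Because nothing is scanned, startup cost is proportional to the handful of beans actually declared.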
Probably what you need is to refactor your config to use less autowiring. My approach is almost always to wire the beans by name, trying to be explicit with the design but, at the same time, not being too verbose either, using autowiring only when it is clearly just hiding minor details.
Addendum:
If that is not enough and you are using JUnit, you may want to use a utility from the JUnit-addons project. The class DirectorySuiteBuilder dynamically builds a test suite from a directory structure, so you can write something like:
DirectorySuiteBuilder builder = new DirectorySuiteBuilder();
Test suite = builder.suite("project/tests");
Initializing the Spring context before this code, you can run all the tests at once. However, if each test assumes a "clean" Spring context, then you are probably lost.
In this kind of situation, you will need to find a balance.
On one hand, you would rightly want to run the tests in the shortest possible time to get results quickly. This is especially important when working in a team environment with continuous integration.
On the other hand, you would also rightly want to keep the configuration of tests as simple as possible so the maintenance of test suite would not become too cumbersome to be useful.
But at the end of the day, you will need to find your own balance and make a decision.
I would recommend creating a few context configuration files for testing, grouping tests so that such a simple test does not take a long time simply being configured by Spring, while keeping the number of configuration files to the minimum you can manage.
Convention bean factory is designed to solve this problem and speeds up the whole process significantly, 3x or more.
Since none of the answers here solved this problem for me, I add my own experience.
My problem was that Spring, Hibernate and EhCache teamed up in an attempt to drown my console in verbose DEBUG messages, resulting in an unreadable log and, far worse, unbearably low performance.
Configuring their log levels fixed it all:
Logger.getLogger("org.hibernate").setLevel(Level.INFO);
Logger.getLogger("net.sf.ehcache").setLevel(Level.INFO);
Logger.getLogger("org.springframework").setLevel(Level.INFO);
Hello good people, I came across a weird behaviour in my test. I'm using JPA/Hibernate annotations with Spring.
Let's say I have a class MyObject whose email property is marked
@Column(name = "EMAIL", length = 100, unique = true)
private String email;
In the setup of the test class MyObjectDAOImplTest, I prepare what I need to be in the database:
@Autowired
MyObject1 ob1;

@Autowired
MyObject1 ob2;

@Before
public void setUp() {
    dao = manager.createthedao();
    ....
    ob1.setEmail("some@email.com");
    ....
    ....
    ob2.setEmail("someother@email.com");
    ....
    dao.save(ob1);
    dao.save(ob2);
}
So apart from the first test method, all the rest are failing. It's about duplicate values in the email column, but my hbm2ddl.auto=create, and I even used create-drop, but still. I just don't get it. I've used this in so many projects (without the unique, of course), and I expect the database to be dropped each time a test method is run. Is there anything about unique I should be aware of? Thanks for reading. Give me your suggestions. Did I leave something out or fail to do something?
You're missing an @After method, which is why you're seeing this behaviour. When running JUnit 4.x tests, the whole suite is run in a single thread, one test after another, which means you have to clear the state yourself or unspecified behaviour occurs; usually resources keep hanging around and cause side effects in other unit tests.
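The effect is easy to reproduce without a database. A runnable sketch (all names hypothetical) where two "tests" share one store with a unique EMAIL column, and only clearing state between tests, as an @After method would, keeps them independent:

```java
import java.util.HashMap;
import java.util.Map;

// Why the second test fails without teardown: both "tests" insert the same
// setUp data into one shared store with a unique EMAIL column.
public class TeardownDemo {
    static Map<String, String> emailTable = new HashMap<>(); // stands in for the DB

    static void saveUnique(String email) {
        if (emailTable.putIfAbsent(email, email) != null) {
            throw new IllegalStateException("duplicate value on EMAIL: " + email);
        }
    }

    static void runTest(String name) {
        try {
            saveUnique("some@email.com");   // same setUp data every test
            System.out.println(name + ": passed");
        } catch (IllegalStateException e) {
            System.out.println(name + ": failed (" + e.getMessage() + ")");
        }
    }

    public static void main(String[] args) {
        runTest("test1");        // no cleanup -> leftover row
        runTest("test2");        // fails: duplicate EMAIL
        emailTable.clear();      // what an @After method would do
        runTest("test3");        // with cleanup, passes again
    }
}
```

In the real test class, the clear step is the @After method dropping or truncating the test data.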
Shouldn't you have some code to drop/remove the unit-test database after (or preferably before) each test? Are you sure that you are actually creating the database at all? Which database engine are you using?
If you are using some memory based database, are you initializing it in the right place (every time a test is executed)?
Are you calling SessionFactory.close() somewhere? If you are using hibernate.hbm2ddl.auto=create-drop, that should handle the dropping of the database.