I'm trying to test my DAO layer (which is built on JPA) in isolation. In the unit test, I'm using DbUnit to populate the database and Spring Test to get an instance of ApplicationContext.
When I tried to use the SpringJUnit4ClassRunner, the ApplicationContext got injected, but DbUnit's getDataSet() method never got called.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = "/testdao.xml")
public class SimpleJPATest extends DBTestCase implements ApplicationContextAware {
...
Then I tried removing the @RunWith annotation, which fixed the problem with the getDataSet() method. But now the ApplicationContext instance no longer gets injected. I tried using the @TestExecutionListeners annotation, which is supposed to configure the DependencyInjectionTestExecutionListener by default, but the ApplicationContext still doesn't get injected.
@TestExecutionListeners
@ContextConfiguration(locations = "/testdao.xml")
public class SimpleJPATest extends DBTestCase implements ApplicationContextAware {
...
Does anyone have any ideas? Is it generally a bad idea to combine these two frameworks?
EDIT: here is the rest of the source for the test class:
@TestExecutionListeners
@ContextConfiguration(locations = "/testdao.xml")
public class SimpleJPATest extends DBTestCase implements ApplicationContextAware {

    static final String TEST_DB_PROPS_FILE = "testDb.properties";
    static final String DATASET_FILE = "testDataSet.xml";
    static Logger logger = Logger.getLogger(SimpleJPATest.class);

    private ApplicationContext ctx;

    public SimpleJPATest() throws Exception {
        super();
        setDBUnitSystemProperties(loadDBProperties());
    }

    @Test
    public void testSimple() {
        EntityManagerFactory emf = ctx.getBean("entityManagerFactory", EntityManagerFactory.class);
        EntityManager em = emf.createEntityManager();
        GenericDAO<Club> clubDAO = new JpaGenericDAO<Club>(ClubEntity.class, "ClubEntity", em);
        em.getTransaction().begin();
        Collection<Club> allClubs = clubDAO.findAll();
        em.getTransaction().commit();
        assertEquals(1, allClubs.size());
    }

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
        this.ctx = applicationContext;
    }

    private void setDBUnitSystemProperties(Properties props) {
        System.setProperty(PropertiesBasedJdbcDatabaseTester.DBUNIT_DRIVER_CLASS,
                props.getProperty("db.driver"));
        System.setProperty(PropertiesBasedJdbcDatabaseTester.DBUNIT_CONNECTION_URL,
                props.getProperty("db.url"));
        System.setProperty(PropertiesBasedJdbcDatabaseTester.DBUNIT_USERNAME,
                props.getProperty("db.username"));
        System.setProperty(PropertiesBasedJdbcDatabaseTester.DBUNIT_PASSWORD,
                props.getProperty("db.password"));
    }

    private Properties loadDBProperties() throws Exception {
        URL propsFile = ClassLoader.getSystemResource(TEST_DB_PROPS_FILE);
        assert (propsFile != null);
        Properties props = new Properties();
        props.load(propsFile.openStream());
        return props;
    }

    @Override
    protected void setUpDatabaseConfig(DatabaseConfig config) {
        config.setProperty(DatabaseConfig.PROPERTY_DATATYPE_FACTORY,
                new HsqldbDataTypeFactory());
    }

    @Override
    protected DatabaseOperation getSetUpOperation() throws Exception {
        return DatabaseOperation.CLEAN_INSERT;
    }

    @Override
    protected DatabaseOperation getTearDownOperation() throws Exception {
        return DatabaseOperation.DELETE_ALL;
    }

    @Override
    protected IDataSet getDataSet() throws Exception {
        logger.debug("in getDataSet");
        URL dataSet = ClassLoader.getSystemResource(DATASET_FILE);
        assert (dataSet != null);
        FlatXmlDataSet result = new FlatXmlDataSetBuilder().build(dataSet);
        return result;
    }
}
I've used these two frameworks together without any issues. I've had to do a few things a little differently from the standard setup, though, to get it to work:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = { "classpath:applicationContext.xml" })
@TestExecutionListeners({ DependencyInjectionTestExecutionListener.class,
        DirtiesContextTestExecutionListener.class })
public class MyDaoTest extends DBTestCase {

    @Autowired
    private MyDao myDao;

    /**
     * This is the underlying BasicDataSource used by the Dao. If the Dao is using a
     * support class from Spring (i.e. HibernateDaoSupport), this is the
     * BasicDataSource that is used by Spring.
     */
    @Autowired
    private BasicDataSource dataSource;

    /**
     * DbUnit-specific object that provides the configuration needed to bring the
     * underlying database to a known state.
     */
    private IDatabaseTester databaseTester;

    /**
     * Prepare the test instance by handling the Spring annotations and updating
     * the database to the stale state.
     *
     * @throws java.lang.Exception
     */
    @Before
    public void setUp() throws Exception {
        databaseTester = new DataSourceDatabaseTester(dataSource);
        databaseTester.setDataSet(this.getDataSet());
        databaseTester.setSetUpOperation(this.getSetUpOperation());
        databaseTester.onSetup();
    }

    /**
     * Perform any required database clean-up after the test runs to ensure the
     * stale state has not been dirtied for the next test.
     *
     * @throws java.lang.Exception
     */
    @After
    public void tearDown() throws Exception {
        databaseTester.setTearDownOperation(this.getTearDownOperation());
        databaseTester.onTearDown();
    }

    /**
     * Retrieve the DataSet to be used from an XML file. This XML file should be
     * located on the classpath.
     */
    @Override
    protected IDataSet getDataSet() throws Exception {
        final FlatXmlDataSetBuilder builder = new FlatXmlDataSetBuilder();
        builder.setColumnSensing(true);
        return builder.build(this.getClass().getClassLoader()
                .getResourceAsStream("data.xml"));
    }

    /**
     * On setUp(), refresh the database, updating the data to the stale state.
     * Cannot currently use CLEAN_INSERT due to foreign key constraints.
     */
    @Override
    protected DatabaseOperation getSetUpOperation() {
        return DatabaseOperation.CLEAN_INSERT;
    }

    /**
     * On tearDown(), truncate the table, bringing it back to the state it was in
     * before the tests started.
     */
    @Override
    protected DatabaseOperation getTearDownOperation() {
        return DatabaseOperation.TRUNCATE_TABLE;
    }

    /**
     * Overridden to disable the closing of the connection after every test.
     */
    @Override
    protected void closeConnection(IDatabaseConnection conn) {
        // Empty body on purpose.
    }

    // Continue the test class here with test methods.
}
I've had to do things a little more manually than I would like, but the same scenario applies if you try to use the JUnit Parameterized runner with Spring (in that case you have to start the TestContext manually). The most important thing to note is that I override the closeConnection() method and leave it blank. This overrides the default behaviour of closing the dataSource connection after each test, which can add unnecessary time since the connection would have to be reopened before every test.
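For reference, "starting the TestContext manually" with the Parameterized runner boils down to driving Spring's TestContextManager yourself. Here is a minimal sketch (class and bean names are illustrative, and this only performs dependency injection, not transaction management):
@RunWith(Parameterized.class)
@ContextConfiguration(locations = { "classpath:applicationContext.xml" })
public class MyParameterizedDaoTest {

    @Autowired
    private BasicDataSource dataSource; // populated once prepareTestInstance() has run

    private final int param;

    public MyParameterizedDaoTest(int param) {
        this.param = param;
    }

    @Parameters
    public static Collection<Object[]> params() {
        return Arrays.asList(new Object[][] { { 1 }, { 2 } });
    }

    @Before
    public void injectSpringDependencies() throws Exception {
        // The Parameterized runner knows nothing about Spring, so we ask the
        // TestContext framework to process the @Autowired fields ourselves.
        new TestContextManager(getClass()).prepareTestInstance(this);
    }
}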
I have a service with a persistence setup using JPA, Hibernate and Guice (in case it's relevant: I'm not using Spring). This is the first, working version of my code:
public class BookDao {

    @Inject
    protected Provider<EntityManager> entityManagerProvider;

    protected EntityManager getEntityManager() {
        return entityManagerProvider.get();
    }

    @Transactional
    public void persist(Book book) {
        getEntityManager().persist(book);
    }
}
public class MyAppModule extends AbstractModule {

    @Override
    protected void configure() {
        initializePersistence();
    }

    private void initializePersistence() {
        final JpaPersistModule jpaPersistModule = new JpaPersistModule("prod");
        jpaPersistModule.properties(new Properties());
        install(jpaPersistModule);
    }
}
But now I need to configure multiple persistence units. I'm following the advice in this mailing list, and according to them, I should move my module logic to a private module. I did as suggested and created a second version of the same code; the changes are commented below:
@BindingAnnotation
@Retention(RetentionPolicy.RUNTIME)
@Target({ FIELD, PARAMETER, METHOD })
public @interface ProductionDataSource {} // defined this new annotation

public class BookDao {

    @Inject
    @ProductionDataSource // added the annotation here
    protected Provider<EntityManager> entityManagerProvider;

    protected EntityManager getEntityManager() {
        return entityManagerProvider.get();
    }

    @Transactional
    public void persist(Book book) throws Exception {
        getEntityManager().persist(book);
    }
}

public class MyAppModule extends PrivateModule { // module is now private

    @Override
    protected void configure() {
        initializePersistence();
        // expose the annotated entity manager
        Provider<EntityManager> entityManagerProvider = binder().getProvider(EntityManager.class);
        bind(EntityManager.class).annotatedWith(ProductionDataSource.class).toProvider(entityManagerProvider);
        expose(EntityManager.class).annotatedWith(ProductionDataSource.class);
    }

    private void initializePersistence() {
        JpaPersistModule jpaPersistModule = new JpaPersistModule("prod");
        jpaPersistModule.properties(new Properties());
        install(jpaPersistModule);
    }
}
The newly annotated EntityManager is being correctly injected by Guice and is non-null, but here's the fun part: some of my unit tests started failing, for example:
class BookDaoTest {

    private Injector injector;
    private BookDao testee;

    @BeforeEach
    public void setup() {
        injector = Guice.createInjector(new MyAppModule());
        injector.injectMembers(this);
        testee = injector.getInstance(BookDao.class);
    }

    @Test
    public void testPersistBook() throws Exception {
        // given
        Book newBook = new Book();
        assertNull(newBook.getId());
        // when
        testee.persist(newBook);
        // then
        assertNotNull(newBook.getId()); // works in the first version, fails in the second
    }
}
In the first version of my code the last line above just works: the entity is persisted and gets a new id. However, in the second version (using a PrivateModule and exposing an annotated EntityManager from it) the persist() operation no longer works and the entity ends up without an id. What could be the problem? I didn't make any other configuration changes in my environment, and I don't see any error messages in the logs. Let me know if you need more details.
It turns out that the problem was the @Transactional annotation. In the first version of my code, Guice automatically adds interceptors for managing the transaction. While debugging, I found out that before executing my persist(Book book) method, Guice calls the following method of the com.google.inject.internal.InterceptorStackCallback class:
public Object intercept(Object proxy, Method method, Object[] arguments, MethodProxy methodProxy)
In the second version of my code, when I exposed the persistence unit from a private module, the above interceptor was no longer called, leaving my persist operation without transaction handling. This is a known issue and is by design.
As a workaround I had to implement transactions by hand, making my code more verbose. I also had to change the way the entity manager is injected. This solution worked for me:
public class BookDao {

    @Inject
    @Named(PROD_PERSISTENCE_UNIT_NAME)
    private EntityManagerFactory entityManagerFactory;

    private EntityManager getEntityManager() {
        return entityManagerFactory.createEntityManager();
    }

    public void persist(Book book) throws Exception {
        EntityManager em = getEntityManager();
        try {
            em.getTransaction().begin();
            em.persist(book);
            em.getTransaction().commit();
        } catch (Exception e) {
            em.getTransaction().rollback();
            throw e;
        } finally {
            em.close();
        }
    }
}
public class MyAppModule extends PrivateModule {

    public static final String PROD_PERSISTENCE_UNIT_NAME = "prod";

    @Override
    protected void configure() {
        initializePersistence();
    }

    private void initializePersistence() {
        // persistence unit set to the prod DB
        final JpaPersistModule jpaPersistModule = new JpaPersistModule(PROD_PERSISTENCE_UNIT_NAME);
        // connection properties set to suitable prod values
        jpaPersistModule.properties(new Properties());
        install(jpaPersistModule);
        // expose bindings to the entity manager annotated as "prod"
        bind(JPAInitializer.class).asEagerSingleton();
        bind(PersistService.class).annotatedWith(named(PROD_PERSISTENCE_UNIT_NAME)).to(PersistService.class).asEagerSingleton();
        expose(PersistService.class).annotatedWith(named(PROD_PERSISTENCE_UNIT_NAME));
        bind(EntityManagerFactory.class).annotatedWith(named(PROD_PERSISTENCE_UNIT_NAME)).toProvider(binder().getProvider(EntityManagerFactory.class));
        expose(EntityManagerFactory.class).annotatedWith(named(PROD_PERSISTENCE_UNIT_NAME));
        bind(EntityManager.class).annotatedWith(named(PROD_PERSISTENCE_UNIT_NAME)).toProvider(binder().getProvider(EntityManager.class));
        expose(EntityManager.class).annotatedWith(named(PROD_PERSISTENCE_UNIT_NAME));
        bind(UnitOfWork.class).annotatedWith(named(PROD_PERSISTENCE_UNIT_NAME)).toProvider(binder().getProvider(UnitOfWork.class));
        expose(UnitOfWork.class).annotatedWith(named(PROD_PERSISTENCE_UNIT_NAME));
    }
}
As a lesson, be very watchful around annotations and other such "magic" that modifies your code under the hood; when something goes wrong, finding the bug becomes quite difficult.
This might have been coded wrongly, but any ideas on how it should be done are appreciated.
I have this class TestClass which needs many service classes injected. Since I can't use @BeforeClass with @Autowired objects, I resorted to using AbstractTestExecutionListener. Everything was working as expected, but inside the @Test blocks all the objects evaluate to null.
Any idea how to solve this?
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = { ProjectConfig.class })
@TestExecutionListeners({ TestClass.class })
public class TestClass extends AbstractTestExecutionListener {

    @Autowired private FirstService firstService;
    // ... other services

    // objects that need to be initialised on beforeTestClass and afterTestClass
    private First first;
    // ...

    // objects that need to be initialised on beforeTestMethod and afterTestMethod
    private Third third;
    // ...

    @Override public void beforeTestClass(TestContext testContext) throws Exception {
        testContext.getApplicationContext().getAutowireCapableBeanFactory().autowireBean(this);
        first = firstService.setUp();
    }

    @Override public void beforeTestMethod(TestContext testContext) throws Exception {
        third = thirdService.setup();
    }

    @Test public void testOne() {
        first = someLogicHelper.recompute(first);
        // ...
    }

    // other tests

    @Override public void afterTestMethod(TestContext testContext) throws Exception {
        thirdService.tearDown(third);
    }

    @Override public void afterTestClass(TestContext testContext) throws Exception {
        firstService.tearDown(first);
    }
}

@Service
public class FirstService {
    // logic
}
For starters, having your test class extend AbstractTestExecutionListener is not a good idea. A TestExecutionListener should be implemented in a stand-alone class. So you might want to rethink that approach.
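For illustration, a stand-alone listener might look roughly like this (just a sketch; the listener name is made up, and it gets registered on the test class via @TestExecutionListeners instead of the test class extending it):
public class ServiceSetupListener extends AbstractTestExecutionListener {

    @Override
    public void beforeTestClass(TestContext testContext) throws Exception {
        // Fetch the beans you need straight from the test ApplicationContext.
        FirstService firstService = testContext.getApplicationContext().getBean(FirstService.class);
        // Stash state on the TestContext so afterTestClass() can see it.
        testContext.setAttribute("first", firstService.setUp());
    }

    @Override
    public void afterTestClass(TestContext testContext) throws Exception {
        FirstService firstService = testContext.getApplicationContext().getBean(FirstService.class);
        firstService.tearDown((First) testContext.getAttribute("first"));
    }
}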
In any case, your current configuration is broken: you disabled all default TestExecutionListener implementations.
To include the defaults, try the following configuration instead.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = ProjectConfig.class)
@TestExecutionListeners(listeners = TestClass.class, mergeMode = MERGE_WITH_DEFAULTS)
public class TestClass extends AbstractTestExecutionListener {
    // ...
}
Regards,
Sam (author of the Spring TestContext Framework)
I'm having trouble getting this to work. Basically, when a test finishes running, I want the database to be in the exact same state it was in before. This happens when using Spring/Hibernate managed sessions and connections, but not for DbUnit. I've tried lots of things, and at this point I'm doing something like wrapping the shared datasource in a TransactionAwareDataSourceProxy and executing the dataset load manually instead of using @DatabaseSetup.
this.databaseConnection = new DatabaseConnection(dataSource.getConnection());
IDataSet xmlFileDataSet = new FlatXmlDataSetBuilder().build(getClass().getResourceAsStream("/dataset.xml"));
DatabaseOperation.REFRESH.execute(databaseConnection, xmlFileDataSet);
And it simply doesn't work. I've checked the dataSource object and it's a TransactionAwareDataSourceProxy instance, so everything should be in place. All the data in the dataset is committed and persisted, while the data added/modified inside the Spring-managed session is not.
Does anyone have a clue what I might be missing, or has anyone done this before and run into the same trouble?
Current code (tried with and without TransactionAwareDataSourceProxy).
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = {
        RecipeManagementITConfig.class,
        SpringJpaTestConfig.class,
        DatabaseITConfig.class
},
        initializers = {PleaseWork.class})
@TestExecutionListeners({
        DependencyInjectionTestExecutionListener.class,
        DirtiesContextTestExecutionListener.class,
        TransactionalTestExecutionListener.class,
        DbUnitTestExecutionListener.class
})
@TransactionConfiguration(defaultRollback = true)
@Transactional
public class DefaultIngredientServiceIT {

    @Autowired
    protected DataSource dataSource;

    private IDatabaseTester dbTester;

    @Before
    public void init() throws Exception {
        System.out.println("> " + dataSource); // org.springframework.jdbc.datasource.TransactionAwareDataSourceProxy@d4ce346
        dbTester = new DataSourceDatabaseTester(dataSource);
        dbTester.setDataSet(getDataSet());
        dbTester.setSetUpOperation(DatabaseOperation.REFRESH);
        dbTester.onSetup();
    }

    @After
    public void destroy() throws Exception {
        dbTester.onTearDown();
    }

    private IDataSet getDataSet() throws Exception {
        return new FlatXmlDataSetBuilder().build(getClass().getResourceAsStream("/dataset.xml"));
    }

    @Transactional
    @Test
    public void testDeletionOfIngredients() {
        (...)
    }
}
Try using a DataSourceDatabaseTester like this:
public class MyTestCase {

    @Autowired
    private DataSource dataSource;

    private IDatabaseTester dbTester;

    @Before
    public void init() throws Exception {
        dbTester = new DataSourceDatabaseTester(dataSource);
        dbTester.setDataSet(getDataSet());
        dbTester.onSetup();
    }

    @After
    public void destroy() throws Exception {
        dbTester.onTearDown();
    }

    private IDataSet getDataSet() throws Exception {
        return new FlatXmlDataSet(new FileInputStream("dataset.xml"));
    }
}
I'm using Spring to inject the path to a directory into my unit tests. Inside this directory are a number of files that should be used to generate test data for parameterized test cases using the Parameterized test runner. Unfortunately, the test runner requires that the method that provides the parameters be static. This doesn't work for my situation because the directory can only be injected into a non-static field. Any ideas how I can get around this?
You can use a TestContextManager from Spring. In this example, I'm using Theories instead of Parameterized.
@RunWith(Theories.class)
@ContextConfiguration(locations = "classpath:/spring-context.xml")
public class SeleniumCase {

    @DataPoints
    public static WebDriver[] drivers() {
        return new WebDriver[] { firefoxDriver, internetExplorerDriver };
    }

    private TestContextManager testContextManager;

    @Autowired
    SomethingDao dao;

    private static FirefoxDriver firefoxDriver = new FirefoxDriver();
    private static InternetExplorerDriver internetExplorerDriver = new InternetExplorerDriver();

    @AfterClass
    public static void tearDown() {
        firefoxDriver.close();
        internetExplorerDriver.close();
    }

    @Before
    public void setUpStringContext() throws Exception {
        testContextManager = new TestContextManager(getClass());
        testContextManager.prepareTestInstance(this);
    }

    @Theory
    public void testWork(WebDriver driver) {
        assertNotNull(driver);
        assertNotNull(dao);
    }
}
I found this solution here : How to do Parameterized/Theories tests with Spring
I assume you are using JUnit 4.x since you mentioned the Parameterized test runner. This implies you aren't using @RunWith(SpringJUnit4ClassRunner). Not a problem, just listing my assumptions.
The following uses Spring to get the test files directory from the XML file. It doesn't inject it, but the data is still available to your test. And in a static method no less.
The only disadvantage I see is that it may mean your Spring config is getting parsed/configured multiple times. You could load just a smaller file with test specific info if need be.
@RunWith(Parameterized.class)
public class MyTest {

    @Parameters
    public static Collection<Object[]> data() {
        ApplicationContext ctx = new ClassPathXmlApplicationContext("/jeanne/jeanne.xml");
        String dir = ctx.getBean("testFilesDirectory", String.class);
        // write Java code to link the files in the test directory to the array
        return Arrays.asList(new Object[][] { { 1 } });
    }

    // rest of test class
}
For someone reading this in late 2015 or later: Spring 4.2 has, in addition to SpringJUnit4ClassRunner, added SpringClassRule and SpringMethodRule, which leverage the Spring TestContext Framework.
This means first-class support for any runner, like MockitoJUnitRunner or Parameterized:
@RunWith(Parameterized.class)
public class FibonacciTest {

    @ClassRule public static final SpringClassRule SCR = new SpringClassRule();
    @Rule public final SpringMethodRule springMethodRule = new SpringMethodRule();

    long input;
    long output;

    public FibonacciTest(long input, long output) { this.input = input; ... }

    @Test
    public void testFibonacci() {
        Assert.assertEquals(output, fibonacci(input));
    }

    @Parameters
    public static List<Long[]> params() {
        return Arrays.asList(new Long[][] { { 0L, 0L }, { 1L, 1L } });
    }
}
It's enough to annotate the test class with @RunWith(Parameterized.class) and @ContextConfiguration, use @Autowired for dependency injection, and use a TestContextManager in the constructor for initialization, e.g.:
@RunWith(Parameterized.class)
@ContextConfiguration(classes = TestConfig.class)
public class MyTest {

    @Autowired
    private DataSource dataSource;

    private final int param;

    @Parameterized.Parameters
    public static List<Object[]> params() {
        return Arrays.asList(new Object[][]{
                {1},
                {2},
        });
    }

    public MyTest(int p) throws Exception {
        this.param = p;
        new TestContextManager(getClass()).prepareTestInstance(this);
    }

    @Test
    public void testSomething() {
        …
    }
}
Here is a first solution without the JUnit 4.12 parameterized runner factory; below it is an improved solution that uses one.
Static context without transactional support
Let Spring do all the configuration parsing and autowiring through the TestContextManager class.
The trick is to use a fake test instance to obtain the autowired fields and pass them to the parameterized tests that will actually run.
But keep in mind that prepareTestInstance() does the autowiring but doesn't manage test transactions and the other nice things handled by beforeTestMethod() and afterTestMethod().
@RunWith(Parameterized.class)
@ContextConfiguration(locations = {"/test-context.xml", "/mvc-context.xml"})
@WebAppConfiguration
@ActiveProfiles("test-profile")
public class MyTest {

    @Parameters
    public static Collection<Object[]> params() throws Exception {
        final MyTest fakeInstance = new MyTest();
        final TestContextManager contextManager = new TestContextManager(MyTest.class);
        contextManager.prepareTestInstance(fakeInstance);
        final WebApplicationContext context = fakeInstance.context;

        // Do what you need with the Spring context; you can even access web resources.
        final Resource[] files = context.getResources("path/files");
        final List<Object[]> params = new ArrayList<>();
        for (Resource file : files) {
            params.add(new Object[] {file, context});
        }
        return params;
    }

    @Parameter
    public Resource file;

    @Autowired
    @Parameter(1)
    public WebApplicationContext context;
}
However, a drawback appears if you have a lot of autowired fields, because you have to pass them all manually through the parameter arrays.
Parameterized factory with full Spring support
JUnit 4.12 introduces ParametersRunnerFactory, which allows combining parameterized tests with Spring injection.
public class SpringParametersRunnerFactory implements ParametersRunnerFactory {

    @Override
    public Runner createRunnerForTestWithParameters(TestWithParameters test) throws InitializationError {
        final BlockJUnit4ClassRunnerWithParameters runnerWithParameters = new BlockJUnit4ClassRunnerWithParameters(test);
        return new SpringJUnit4ClassRunner(test.getTestClass().getJavaClass()) {
            @Override
            protected Object createTest() throws Exception {
                final Object testInstance = runnerWithParameters.createTest();
                getTestContextManager().prepareTestInstance(testInstance);
                return testInstance;
            }
        };
    }
}
The factory can be added to the previous test class to give full Spring support: test transactions, reinitializing a dirty context, servlet tests, and so on. And of course there is no longer any need to pass autowired fields from the fake test instance to the parameterized test.
@UseParametersRunnerFactory(SpringParametersRunnerFactory.class)
@RunWith(Parameterized.class)
@ContextConfiguration(locations = {"/test-context.xml", "/mvc-context.xml"})
@WebAppConfiguration
@Transactional
@TransactionConfiguration
public class MyTransactionalTest {

    @Parameters
    public static Collection<Object[]> params() throws Exception {
        final MyTransactionalTest fakeInstance = new MyTransactionalTest();
        final TestContextManager contextManager = new TestContextManager(MyTransactionalTest.class);
        contextManager.prepareTestInstance(fakeInstance);
        final WebApplicationContext context = fakeInstance.context;

        // Do what you need with the Spring context; you can even access web resources.
        final Resource[] files = context.getResources("path/files");
        final List<Object[]> params = new ArrayList<>();
        for (Resource file : files) {
            params.add(new Object[] {file});
        }
        return params;
    }

    @Parameter
    public Resource file;

    @Autowired
    private WebApplicationContext context;
}
I use the following solution with Parameterized.class without any problems:
http://bmocanu.ro/coding/320/combining-junit-theoriesparameterized-tests-with-spring/
@ContextConfiguration(value = "classpath:test-context.xml")
public abstract class AbstractJunitTest extends AbstractJUnit4SpringContextTests {

    private static TestContextManager testContextManager = null;
    private static DAOFactory daoFactory = null;

    @Before
    public void initApplicationContext() throws Exception {
        if (testContextManager == null) {
            testContextManager = new TestContextManager(getClass());
            testContextManager.prepareTestInstance(this);
            daoFactory = (DAOFactory) applicationContext.getBean("daoFactory");
        }
    }

    protected DAOFactory getDaoFactory() throws Exception {
        return daoFactory;
    }
}

@RunWith(Parameterized.class)
public class SomeTestClass extends AbstractJunitTest {
    ...
}
Remember that Spring injects using @Autowired, but also via setters. So instead of using @Autowired, use a setter:
private static String directory;
public void setDirectory(String directory) {
this.directory = directory;
}
public static String getDirectory() {
return directory;
}
I'm looking to create a sample project while learning Guice which uses JDBC to read/write to a SQL database. However, after years of using Spring and letting it abstract away connection handling and transactions, I'm struggling to work it out conceptually.
I'd like to have a service which starts and stops a transaction and calls numerous repositories which reuse the same connection and participate in the same transaction. My questions are:
Where do I create my Datasource?
How do I give the repositories access to the connection? (ThreadLocal?)
Best way to manage the transaction (Creating an Interceptor for an annotation?)
The code below shows how I would do this in Spring. The JdbcOperations injected into each repository would have access to the connection associated with the active transaction.
I haven't been able to find many tutorials which cover this, beyond ones which show creating interceptors for transactions.
I am happy to continue using Spring, as it is working very well in my projects, but I'd like to know how to do this in pure Guice and JDBC (no JPA/Hibernate/Warp/reusing Spring).
@Service
public class MyService implements MyInterface {

    @Autowired
    private RepositoryA repositoryA;
    @Autowired
    private RepositoryB repositoryB;
    @Autowired
    private RepositoryC repositoryC;

    @Override
    @Transactional
    public void doSomeWork() {
        this.repositoryA.someInsert();
        this.repositoryB.someUpdate();
        this.repositoryC.someSelect();
    }
}

@Repository
public class MyRepositoryA implements RepositoryA {

    @Autowired
    private JdbcOperations jdbcOperations;

    @Override
    public void someInsert() {
        // use jdbcOperations to perform an insert
    }
}

@Repository
public class MyRepositoryB implements RepositoryB {

    @Autowired
    private JdbcOperations jdbcOperations;

    @Override
    public void someUpdate() {
        // use jdbcOperations to perform an update
    }
}

@Repository
public class MyRepositoryC implements RepositoryC {

    @Autowired
    private JdbcOperations jdbcOperations;

    @Override
    public String someSelect() {
        // use jdbcOperations to perform a select and use a RowMapper to produce results
        return "select result";
    }
}
If your database changes infrequently, you could use the data source that comes with the database's JDBC driver and isolate the calls to the third-party library in a provider (my example uses the one provided by the H2 database, but all JDBC drivers should have one). If you change to a different implementation of DataSource (e.g. c3p0, Apache DBCP, or one provided by an app server container), you can simply write a new Provider implementation to get the datasource from the appropriate place. Here I've used singleton scope to allow the DataSource instance to be shared amongst the classes that depend on it (necessary for pooling).
public class DataSourceModule extends AbstractModule {

    @Override
    protected void configure() {
        Names.bindProperties(binder(), loadProperties());
        bind(DataSource.class).toProvider(H2DataSourceProvider.class).in(Scopes.SINGLETON);
        bind(MyService.class);
    }

    static class H2DataSourceProvider implements Provider<DataSource> {

        private final String url;
        private final String username;
        private final String password;

        @Inject // lets Guice construct the provider with the bound properties
        public H2DataSourceProvider(@Named("url") final String url,
                                    @Named("username") final String username,
                                    @Named("password") final String password) {
            this.url = url;
            this.username = username;
            this.password = password;
        }

        @Override
        public DataSource get() {
            final JdbcDataSource dataSource = new JdbcDataSource();
            dataSource.setURL(url);
            dataSource.setUser(username);
            dataSource.setPassword(password);
            return dataSource;
        }
    }

    static class MyService {

        private final DataSource dataSource;

        @Inject
        public MyService(final DataSource dataSource) {
            this.dataSource = dataSource;
        }

        public void singleUnitOfWork() {
            Connection cn = null;
            try {
                cn = dataSource.getConnection();
                // Use the connection
            } catch (final SQLException e) {
                throw new RuntimeException(e);
            } finally {
                try {
                    cn.close();
                } catch (Exception e) {}
            }
        }
    }

    private Properties loadProperties() {
        // Load properties from the appropriate place...
        // should contain definitions for:
        // url=...
        // username=...
        // password=...
        return new Properties();
    }
}
To handle transactions, a transaction-aware data source should be used. I wouldn't recommend implementing this manually; use something like warp-persist or container-supplied transaction management instead. However, it would look something like this:
public class TxModule extends AbstractModule {

    @Override
    protected void configure() {
        Names.bindProperties(binder(), loadProperties());
        final TransactionManager tm = getTransactionManager();
        bind(DataSource.class).annotatedWith(Real.class).toProvider(H2DataSourceProvider.class).in(Scopes.SINGLETON);
        bind(DataSource.class).annotatedWith(TxAware.class).to(TxAwareDataSource.class).in(Scopes.SINGLETON);
        bind(TransactionManager.class).toInstance(tm);
        bindInterceptor(Matchers.any(), Matchers.annotatedWith(Transactional.class), new TxMethodInterceptor(tm));
        bind(MyService.class);
    }

    private TransactionManager getTransactionManager() {
        // Get the transaction manager
        return null;
    }

    static class TxMethodInterceptor implements MethodInterceptor {

        private final TransactionManager tm;

        public TxMethodInterceptor(final TransactionManager tm) {
            this.tm = tm;
        }

        @Override
        public Object invoke(final MethodInvocation invocation) throws Throwable {
            // Start tx if necessary
            return invocation.proceed();
            // Commit tx if started here.
        }
    }

    static class TxAwareDataSource implements DataSource {

        static ThreadLocal<Connection> txConnection = new ThreadLocal<Connection>();
        private final DataSource ds;
        private final TransactionManager tm;

        @Inject
        public TxAwareDataSource(@Real final DataSource ds, final TransactionManager tm) {
            this.ds = ds;
            this.tm = tm;
        }

        public Connection getConnection() throws SQLException {
            try {
                final Transaction transaction = tm.getTransaction();
                if (transaction != null && transaction.getStatus() == Status.STATUS_ACTIVE) {
                    Connection cn = txConnection.get();
                    if (cn == null) {
                        cn = new TxAwareConnection(ds.getConnection());
                        txConnection.set(cn);
                    }
                    return cn;
                } else {
                    return ds.getConnection();
                }
            } catch (final SystemException e) {
                throw new SQLException(e);
            }
        }

        // Omitted delegate methods.
    }

    static class TxAwareConnection implements Connection {

        private final Connection cn;

        public TxAwareConnection(final Connection cn) {
            this.cn = cn;
        }

        public void close() throws SQLException {
            try {
                cn.close();
            } finally {
                TxAwareDataSource.txConnection.set(null);
            }
        }

        // Omitted delegate methods.
    }

    static class MyService {

        private final DataSource dataSource;

        @Inject
        public MyService(@TxAware final DataSource dataSource) {
            this.dataSource = dataSource;
        }

        @Transactional
        public void singleUnitOfWork() {
            Connection cn = null;
            try {
                cn = dataSource.getConnection();
                // Use the connection
            } catch (final SQLException e) {
                throw new RuntimeException(e);
            } finally {
                try {
                    cn.close();
                } catch (final Exception e) {}
            }
        }
    }
}
I would use something like c3p0 to create data sources directly. If you use ComboPooledDataSource you only need one instance (pooling is done under the covers), which you can bind directly or through a provider.
Then I'd create an interceptor on top of that, one that e.g. picks up @Transactional and manages a connection and commit/rollback. You could make Connection injectable as well, but you need to make sure you close the connections somewhere so they can be checked back into the pool.
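A minimal sketch of that binding, assuming c3p0 is on the classpath (the driver class and connection values below are placeholders, and the @Transactional interceptor itself is left out):
public class C3p0Module extends AbstractModule {

    @Override
    protected void configure() {
        // The @Transactional interceptor binding (e.g. bindInterceptor(...)) would go here.
    }

    @Provides
    @Singleton // one pooled instance, shared by everything that injects DataSource
    DataSource provideDataSource() {
        ComboPooledDataSource ds = new ComboPooledDataSource();
        try {
            ds.setDriverClass("org.h2.Driver"); // placeholder driver class
        } catch (PropertyVetoException e) {
            throw new RuntimeException(e);
        }
        ds.setJdbcUrl("jdbc:h2:mem:test");      // placeholder URL
        ds.setUser("sa");                        // placeholder credentials
        ds.setPassword("");
        return ds;
    }
}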
To inject a data source, you probably don't need to be bound to a single data source instance, since the database you are connecting to features in the URL. Using Guice, it is possible to force programmers to provide a binding to a DataSource implementation (link). This data source can be injected into a ConnectionProvider to return a data source.
The connection has to be in a thread-local scope. You can even implement your own thread-local scope, but all thread-local connections must be closed and removed from the ThreadLocal object after commit or rollback operations to prevent memory leaks. After hacking around, I found that you need a hook to the Injector object to remove the ThreadLocal elements. An injector can easily be injected into your Guice AOP interceptor, something like this:
protected void visitThreadLocalScope(Injector injector,
        DefaultBindingScopingVisitor<Void> visitor) {
    if (injector == null) {
        return;
    }
    for (Map.Entry<Key<?>, Binding<?>> entry : injector.getBindings().entrySet()) {
        final Binding<?> binding = entry.getValue();
        // Not interested in the return value as yet.
        binding.acceptScopingVisitor(visitor);
    }
}

/**
 * Default implementation that exits the thread-local scope. This is
 * essential to clean up and prevent any memory leakage.
 *
 * The scope is only visited iff the scope is a subclass of or is an
 * instance of {@link ThreadLocalScope}.
 */
private static final class ExitingThreadLocalScopeVisitor
        extends DefaultBindingScopingVisitor<Void> {

    @Override
    public Void visitScope(Scope scope) {
        // ThreadLocalScope is the custom scope.
        if (ThreadLocalScope.class.isAssignableFrom(scope.getClass())) {
            ThreadLocalScope threadLocalScope = (ThreadLocalScope) scope;
            threadLocalScope.exit();
        }
        return null;
    }
}
Make sure you call this after the method has been invoked and the connection has been closed. Try this to see if it works.
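For illustration only, here is a rough sketch of where that call could sit in a transaction interceptor (the interceptor name is made up, and it assumes the visitThreadLocalScope() helper and the visitor above are accessible to it, with the interceptor's fields injected via requestInjection()):
class TransactionInterceptor implements MethodInterceptor {

    @Inject
    private Injector injector;

    @Override
    public Object invoke(MethodInvocation invocation) throws Throwable {
        try {
            // Begin/commit/rollback handling of the transaction would wrap this call.
            return invocation.proceed();
        } finally {
            // Exit the thread-local scope once the method has run and its connection
            // is closed, so the ThreadLocal entries are not leaked.
            visitThreadLocalScope(injector, new ExitingThreadLocalScopeVisitor());
        }
    }
}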
Please check the solution I provided: Transactions with Guice and JDBC - Solution discussion
It is just a very basic and simple approach, but it works just fine for handling transactions with Guice and JDBC.