I'm working with PostgreSQL and Spring 4 and want my app to create the database automatically when it runs.
My Entity Class is:
@Entity
@Table(name = "user", schema = "public")
public class User extends BaseEntity {
private Integer id;
private String name;
private Integer contractId;
public User() {
}
public User(Integer id) {
super(id);
}
@Id
@Column(name = "usr_id", nullable = false)
@GeneratedValue(strategy = GenerationType.IDENTITY)
public Integer getId() {
return id;
}
public void setId(Integer id) {
this.id = id;
}
@Basic
@Column(name = "usr_name", nullable = true, length = -1)
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
@Basic
@Column(name = "usr_contract_id", nullable = true)
public Integer getContractId() {
return contractId;
}
public void setContractId(Integer contractId) {
this.contractId = contractId;
}
}
HibernateConfig.java
@Configuration
@EnableTransactionManagement(proxyTargetClass = true)
@PropertySources({
@PropertySource(value = "classpath:application.properties")})
@ConfigurationProperties(prefix = "spring.datasource")
public class HibernateConfig {
@Autowired
private Environment environment;
@Autowired
private DataSource dataSource;
@Autowired
private MultiTenantConnectionProvider multiTenantConnectionProvider;
@Autowired
private CurrentTenantIdentifierResolver currentTenantIdentifierResolver;
public HibernateConfig() {}
@Bean
public LocalSessionFactoryBean sessionFactory() throws Exception {
LocalSessionFactoryBean sessionFactory = new LocalSessionFactoryBean();
sessionFactory.setDataSource(dataSource);
sessionFactory.setHibernateProperties(hibernateProperties());
sessionFactory.setPackagesToScan(new String[] {
"com.xxx.xxx.model",
});
return sessionFactory;
}
private Properties hibernateProperties() {
Properties properties = new Properties();
properties.put(DIALECT, environment.getRequiredProperty(DIALECT));
properties.put(SHOW_SQL, environment.getRequiredProperty(SHOW_SQL));
properties.put(FORMAT_SQL, environment.getRequiredProperty(FORMAT_SQL));
properties.put(HBM2DDL_AUTO, environment.getRequiredProperty(HBM2DDL_AUTO));
return properties;
}
@Bean
@Primary
@Autowired
public HibernateTransactionManager transactionManager(SessionFactory s) {
HibernateTransactionManager txManager = new HibernateTransactionManager();
txManager.setSessionFactory(s);
return txManager;
}
@Bean
@Autowired
public HibernateTemplate hibernateTemplate(SessionFactory s) {
HibernateTemplate hibernateTemplate = new HibernateTemplate(s);
return hibernateTemplate;
}
}
application.properties
# Database connection settings:
jdbc.driverClassName=org.postgresql.Driver
jdbc.url=jdbc:postgresql://localhost:5432/database
jdbc.username=postgres
jdbc.password=111111
hibernate.dialect=org.hibernate.dialect.PostgreSQLDialect
hibernate.show_sql=false
hibernate.format_sql=false
hibernate.hbm2ddl.auto=update
spring.datasource.initialSize=50
spring.datasource.maxActive=200
spring.datasource.maxIdle=200
spring.datasource.minIdle=50
But when I run SQL against the user table, I get the error: Table 'User' does not exist.
How can I make Hibernate auto-create the database?
Postgres, unlike MySQL, does not support CREATE DATABASE IF NOT EXISTS.
Thus changing hibernate.hbm2ddl.auto=create and changing the URL to jdbc.url=jdbc:postgresql://localhost/database?createDatabaseIfNotExist=true
won't work for you.
However, you can try simulating the behavior as in the questions below:
Create Postgres database on the fly, if it doesn't exists using Hibernate
Simulate CREATE DATABASE IF NOT EXISTS for PostgreSQL?
Try this way
spring.jpa.hibernate.ddl-auto=update
spring.jpa.generate-ddl=true
spring.jpa.database-platform=org.hibernate.dialect.PostgreSQL94Dialect
spring.datasource.driverClassName=org.postgresql.Driver
spring.datasource.url=jdbc:postgresql://localhost:5432/postgres
spring.datasource.username=postgres
spring.datasource.password=123
spring.jpa.show-sql=true
spring.session.store-type=none
This works for me.
From the Automatic schema generation section of the Hibernate User Guide:
javax.persistence.schema-generation.database.action
Setting to perform SchemaManagementTool actions automatically as part of the SessionFactory lifecycle. Valid options are defined by the externalJpaName value of the Action enum:
none - No action will be performed.
create - Database creation will be generated.
drop - Database dropping will be generated.
drop-and-create - Database dropping will be generated followed by database creation.
Here spring.jpa.hibernate.ddl-auto=update corresponds to the update action; you can change it according to your scenario.
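If you prefer the standard JPA property quoted above, Spring Boot passes anything under spring.jpa.properties straight through to the provider, so a rough equivalent would be (a sketch; drop-and-create is only an example value):
# passed to Hibernate as a JPA property; drops and recreates the schema when the EntityManagerFactory starts
spring.jpa.properties.javax.persistence.schema-generation.database.action=drop-and-create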
The property hibernate.hbm2ddl.auto will do the trick for you. It automatically validates or exports schema DDL to the database when the SessionFactory is created. With create-drop, the database schema will be dropped when the SessionFactory is closed explicitly.
Hibernate can accept these options for the above property.
validate: validate the schema, makes no changes to the database.
update: update the schema.
create: creates the schema, destroying previous data.
create-drop: drop the schema at the end of the session.
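In a Java-based configuration like the one in the question, this value goes into the hibernateProperties() method; hard-coded it would look roughly like this (a sketch, not the asker's exact property constants):
private Properties hibernateProperties() {
    Properties properties = new Properties();
    // "update" adds missing tables and columns without dropping existing data
    properties.put("hibernate.hbm2ddl.auto", "update");
    properties.put("hibernate.dialect", "org.hibernate.dialect.PostgreSQLDialect");
    return properties;
}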
Spring Boot
Postgres does not support createDatabaseIfNotExist=true
So you can try a similar approach; it worked for me.
@SpringBootApplication
public class SpringSecurityJwtApplication{
public static void main(String[] args) {
Logger logger = LoggerFactory.getLogger(SpringSecurityJwtApplication.class);
Connection connection = null;
Statement statement = null;
try {
logger.debug("Creating database if not exist...");
connection = DriverManager.getConnection("jdbc:postgresql://localhost:5432/", "postgres", "postgres");
statement = connection.createStatement();
statement.executeQuery("SELECT count(*) FROM pg_database WHERE datname = 'database_name'");
ResultSet resultSet = statement.getResultSet();
resultSet.next();
int count = resultSet.getInt(1);
if (count <= 0) {
statement.executeUpdate("CREATE DATABASE database_name");
logger.debug("Database created.");
} else {
logger.debug("Database already exist.");
}
} catch (SQLException e) {
logger.error(e.toString());
} finally {
try {
if (statement != null) {
statement.close();
logger.debug("Closed Statement.");
}
if (connection != null) {
logger.debug("Closed Connection.");
connection.close();
}
} catch (SQLException e) {
logger.error(e.toString());
}
}
SpringApplication.run(SpringSecurityJwtApplication.class, args);
}
}
The problem is the Hibernate dialect. You are using an old one; use a newer one like this:
spring.jpa.database-platform=org.hibernate.dialect.PostgreSQL95Dialect
Just change
from:
@Table(name = "user") or @Entity(name = "user")
to:
@Table(name = "users") or @Entity(name = "users")
because user is a reserved word in PostgreSQL.
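If you want to keep the table name user, another option is to quote the identifier so PostgreSQL stops treating it as the reserved keyword (a sketch; only the escaped quotes change compared to the question):
@Entity
@Table(name = "\"user\"", schema = "public") // quoted identifier: Hibernate emits "user" in its SQL
public class User extends BaseEntity {
    // ... fields and accessors exactly as in the question
}
Setting hibernate.globally_quoted_identifiers=true has a similar effect for every table and column name.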
You can have a schema.sql script containing "CREATE SCHEMA IF NOT EXISTS x;" together with
spring.jpa.properties.hibernate.hbm2ddl.auto=update
It should work.
Related
During a unit test I'm trying to delete an entry from my database via Spring's CrudRepository, but it seems like nothing is happening.
The entity:
@Entity @Table(name = "FACTION")
public class Faction implements Serializable
{
private static final long serialVersionUID = 1L;
@Id @GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(name = "ID", columnDefinition = "int", nullable = false, unique = true)
private Integer id;
public Integer getId()
{
return this.id;
}
}
The repository:
public interface FactionDao extends CrudRepository<Faction, Integer>
{
}
My test class:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = Config.class)
@Sql({ "/delete-testData.sql", "/insert-testData.sql" })
public class TestFactionDao
{
@Autowired
private FactionDao dao;
@Test
public void testDelete()
{
System.out.println(this.dao.findOne(1));
this.dao.delete(1);
System.out.println(this.dao.findOne(1));
}
}
Spring configuration:
@Configuration
@EnableJpaRepositories(basePackageClasses = Config.class)
@EnableTransactionManagement
public class Config
{
@Bean
public DataSource dataSource()
{
return new EmbeddedDatabaseBuilder().setType(EmbeddedDatabaseType.H2).build();
}
@Bean
public PlatformTransactionManager transactionManager(DataSource pDataSource)
{
return new DataSourceTransactionManager(pDataSource);
}
@Bean
LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource pDataSource)
{
HibernateJpaVendorAdapter tJpaVendorAdapter = new HibernateJpaVendorAdapter();
tJpaVendorAdapter.setDatabase(Database.H2);
tJpaVendorAdapter.setGenerateDdl(true);
tJpaVendorAdapter.setShowSql(true);
LocalContainerEntityManagerFactoryBean tEntityManagerFactory = new LocalContainerEntityManagerFactoryBean();
tEntityManagerFactory.setJpaVendorAdapter(tJpaVendorAdapter);
tEntityManagerFactory.setDataSource(pDataSource);
tEntityManagerFactory.setPackagesToScan(Config.class.getPackage().getName());
return tEntityManagerFactory;
}
}
The sql scripts:
-- delete-testData.sql
delete from FACTION;
-- insert-testData.sql
insert into FACTION (ID) values
(1);
Below is the console output during the test case. As you can see, no delete operation is executed and I can still read the entity I just deleted:
Hibernate: create table FACTION (ID int generated by default as identity, primary key (ID))
Hibernate: select faction0_.ID as ID1_3_0_ from FACTION faction0_ where faction0_.ID=?
de.iavra.data.Faction#1b97f47
Hibernate: select faction0_.ID as ID1_3_0_ from FACTION faction0_ where faction0_.ID=?
Hibernate: select faction0_.ID as ID1_3_0_ from FACTION faction0_ where faction0_.ID=?
de.iavra.data.Faction#17b8fa4
I tried annotating my test method with @Rollback(false), but it doesn't seem to make any difference. Calling flush() on the DAO throws an exception saying there are no pending updates.
I tested your code, and the solution is to change the transaction manager type from DataSourceTransactionManager to JpaTransactionManager. (I think the DataSource-based transaction manager is not aware of the JPA EntityManager, so the delete is never flushed.)
So the code is:
Config.java
@Bean
public JpaTransactionManager transactionManager(EntityManagerFactory emf) {
return new JpaTransactionManager(emf);
}
@Bean
LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource pDataSource) {
HibernateJpaVendorAdapter tJpaVendorAdapter = new HibernateJpaVendorAdapter();
...
You must add a transaction for update and delete operations.
public class FactionService {
@Autowired
private FactionDao dao;
@Transactional
public void delete(int id){
dao.delete(id);
}
}
I have just started working with the Java Spring framework. I'm trying to populate a simple table with columns id and name, but I'm getting:
Unknown entity: org.hibernate.MappingException
I understand it is a commonly encountered exception, but I couldn't fix it. You can find the entity, DAO, and Hibernate config I'm using below.
HibernateConfig.java
@Getter @Setter
@Configuration @ConfigurationProperties(prefix = "databaseConfiguration")
public class HibernateConfig {
#Value("${driverClass}")
private String driverClass;
#Value("${url}")
private String url;
#Value("username")
private String username;
#Value("password")
private String password;
#Value("${hibernateDialect}")
private String hibernateDialect;
#Value("${hbm2ddlAuto}")
private String hbm2ddlAuto;
private Integer minSize;
private Integer maxSize;
#Bean
public DataSource dataSource() {
DriverManagerDataSource dataSource = new DriverManagerDataSource();
dataSource.setDriverClassName(driverClass);
dataSource.setUrl(url);
dataSource.setUsername(username);
dataSource.setPassword(password);
return dataSource;
}
@Bean
public Properties hibernateProperties() {
Properties properties = new Properties();
properties.put("hibernate.hbm2ddl.auto", hbm2ddlAuto);
properties.put("hibernate.dialect", hibernateDialect);
properties.put("hibernate.c3p0.min_size", minSize);
properties.put("hibernate.c3p0.max_size", maxSize);
return properties;
}
@Bean
public LocalSessionFactoryBean sessionFactory(DataSource dataSource) {
LocalSessionFactoryBean sessionFactory = new LocalSessionFactoryBean();
sessionFactory.setDataSource(dataSource);
sessionFactory.setHibernateProperties(hibernateProperties());
return sessionFactory;
}
@Bean
public ITestDao testDao() {
ITestDao testDao = new TestDao();
return testDao;
}
}
All the properties are taken from the .yml file. ITestDao is the interface with an abstract add() method in it.
Entity class
@Getter
@Setter
@Entity
@Table(name = "test")
public class Test {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(name = "id", nullable = false, unique = true)
private Long id;
#Column(name = "dump", nullable = false)
private String dump;
}
Dao class
@Repository
@Transactional
@Getter
@Setter
public class TestDao implements ITestDao {
@Autowired
private LocalSessionFactoryBean sessionFactoryBean;
public Test add(Test test) {
try {
sessionFactoryBean.getObject().getCurrentSession().getTransaction().begin();
sessionFactoryBean.getObject().getCurrentSession().persist(test);
} finally {
sessionFactoryBean.getObject().getCurrentSession().getTransaction().commit();
}
return test;
}
}
A service method will call this DAO with @Transactional annotated above it, but while calling this add() DAO method I'm getting Unknown entity:
org.hibernate.MappingException
Try this way:
@Bean
public LocalSessionFactoryBean sessionFactory(DataSource dataSource) {
LocalSessionFactoryBean sessionFactory = new LocalSessionFactoryBean();
sessionFactory.setDataSource(dataSource);
sessionFactory.setPackagesToScan(new String[] { "my.package.model" }); // adapt "my.package.model" to the package that contains your entities
sessionFactory.setHibernateProperties(hibernateProperties());
return sessionFactory;
}
Good luck
You might be missing the annotation below.
@EntityScan("some.known.persistence")
The @EntityScan annotation only identifies which classes should be used by a specific persistence context.
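For reference, @EntityScan sits on a Spring Boot configuration class, for example (a sketch; Application and the package name are placeholders):
@SpringBootApplication
@EntityScan("some.known.persistence") // the package that contains the @Entity classes
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}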
I am having some trouble with Spring Boot, Spring Data, and entities that live in an external jar. Any help would be greatly appreciated!
My Spring Data repository looks like this:
@Repository
public interface MyFileRepository extends PagingAndSortingRepository<MyFile, Long> {
@Modifying
@Transactional
@Query("Delete from MyFile f where f.created < ?1")
long deleteOldEntities(Date cutoffDate);
}
My entity, which is in another jar entirely looks like this:
@Entity
@SequenceGenerator(
name = "SequenceIdGenerator",
sequenceName = "SEQ_ID_MY_FILE",
allocationSize = 20
)
@Table(
name = "MYFILE_TABLE"
)
public class MyFile extends BaseEntity {
private long id;
private byte[] data;
[...]
public MyFile() {}
@Id
@Column(
name = "id",
nullable = false
)
@GeneratedValue(
generator = "SequenceIdGenerator"
)
public long getId() {
return this.id;
}
public void setId(long id) {
this.id = id;
}
[...]
}
And the BaseEntity looks like this:
@MappedSuperclass
public abstract class BaseEntity implements Serializable {
private static final long serialVersionUID = 1L;
private static final Charset UTF_8 = Charset.forName("UTF-8");
private Date created = null;
private Date updated = null;
public BaseEntity() {}
@Column(
name = "created"
)
@Temporal(TemporalType.TIMESTAMP)
public Date getCreated() {
return this.created == null?null:new Date(this.created.getTime());
}
public void setCreated(Date created) {
if(created != null) {
this.created = new Date(created.getTime());
}
}
So, when I try to run this code I get a long stacktrace which basically ends with:
Caused by: org.hibernate.hql.internal.ast.QuerySyntaxException: MyFile is not mapped [Delete from MyFile f where f.created < ?1]
I believe that this may have something to do with the Spring Boot configuration. The external jar does not have an @SpringBootApplication anywhere; it is basically just a jar with all my entities.
My application jar however has this:
@SpringBootApplication
@EntityScan("myapp.service.dao.entity") // this is the package where all my entities are located
public class CommonApplication {
}
What is my error?
To scan entities residing in a jar, you have to set the packagesToScan field of LocalSessionFactoryBean.
@Bean
public LocalSessionFactoryBean sessionFactory(DataSource dataSource) {
LocalSessionFactoryBean localSessionFactory = new LocalSessionFactoryBean();
localSessionFactory.setDataSource(dataSource);
localSessionFactory
.setPackagesToScan(new String[]{"myapp.service.dao.entity", "com.application.entity"});
return localSessionFactory;
}
I got this working by using the following bean to set the packages to scan:
@Bean
public EntityManagerFactory entityManagerFactory() {
HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
vendorAdapter.setGenerateDdl(false);
vendorAdapter.setShowSql(false);
vendorAdapter.setDatabase(Database.MYSQL);
LocalContainerEntityManagerFactoryBean factory = new LocalContainerEntityManagerFactoryBean();
factory.setJpaVendorAdapter(vendorAdapter);
factory.setPackagesToScan("add packages here");
factory.afterPropertiesSet(); // needed when returning the EntityManagerFactory directly, otherwise getObject() is null
return factory.getObject();
}
I have two entities, Media and Keyword, with a one-to-many relationship. Although I marked the fetch type as LAZY, Media is still fetching all the keywords.
I am using Spring 3.2 and Hibernate 4.3.4.
Media Entity:
@Entity
public class Media {
@Id
@NotNull
@Column(name = "mediaid")
private String mediaId;
@NotNull
@OneToMany(cascade = CascadeType.ALL, mappedBy = "media", fetch = FetchType.LAZY)
private List<Keyword> keywords;
public List<Keyword> getKeywords() {
return keywords;
}
public void setKeywords(List<Keyword> keywords) {
if ( this.keywords!=null && this.keywords.size()>0 )
this.keywords.addAll(keywords);
else
this.keywords = keywords;
for (Keyword keyword : keywords) {
keyword.setMedia(this);
}
}
}
Keyword Entity:
#Entity(name = "keywords")
public class Keyword {
#Id
#NotNull
#Column(unique = true)
private String idKey;
#ManyToOne(fetch = FetchType.LAZY)
#JoinColumn(name = "idmedia")
private Media media;
public Media getMedia() {
return media;
}
public void setMedia(Media media) {
this.media = media;
if (idKey == null || idKey.isEmpty())
setIdKey(this.media.getMediaId());
else
setIdKey(this.media.getMediaId() + "_" + idKey);
if (!media.containsKeyword(this))
media.getKeywords().add(this);
}
}
I can see the entities being fetched in the show_sql output, and this test also fails:
@WebAppConfiguration
@ContextConfiguration(classes = {PersistenceConfig.class})
@Transactional
@TransactionConfiguration(defaultRollback = true)
public class MediasRepositoryTest extends AbstractTransactionalTestNGSpringContextTests {
@Autowired
MediasRepository mediasRepository;
@PersistenceContext
EntityManager manager;
public void thatMediaLazyLoadsKeywords() throws Exception {
PersistenceUnitUtil unitUtil = manager.getEntityManagerFactory().getPersistenceUnitUtil();
Media media = DomainFixtures.createMedia();
Media savedMedia = mediasRepository.save(media);
Media retrievedMedia = mediasRepository.findOne(savedMedia.getMediaId());
assertNotNull(retrievedMedia);
assertFalse(unitUtil.isLoaded(retrievedMedia, "keywords"));
}
}
Configuration:
@EnableTransactionManagement
public class PersistenceConfig {
static Logger logger = LoggerFactory.getLogger(PersistenceConfig.class);
@Autowired
private Environment env;
// @Value("${init-db:false}")
private String initDatabase = "false";
@Bean
public DataSource dataSource()
{
logger.info("Starting dataSource");
BasicDataSource dataSource = new BasicDataSource();
logger.info("jdbc.driverClassName"+ env.getProperty("jdbc.driverClassName"));
dataSource.setDriverClassName(env.getProperty("jdbc.driverClassName"));
dataSource.setUrl(env.getProperty("jdbc.url"));
dataSource.setUsername(env.getProperty("jdbc.username"));
dataSource.setPassword(env.getProperty("jdbc.password"));
logger.info("End dataSource");
return dataSource;
}
@Bean
public EntityManagerFactory entityManagerFactory() throws SQLException
{
logger.info("Starting entityManagerFactory");
HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
vendorAdapter.setGenerateDdl(Boolean.TRUE);
vendorAdapter.setShowSql(Boolean.parseBoolean(env.getProperty("hibernate.show_sql")));
LocalContainerEntityManagerFactoryBean factory = new LocalContainerEntityManagerFactoryBean();
factory.setJpaVendorAdapter(vendorAdapter);
factory.setPackagesToScan("....");
factory.setDataSource(dataSource());
Properties jpaProperties = new Properties();
jpaProperties.put("hibernate.hbm2ddl.auto", env.getProperty("hibernate.hbm2ddl.auto"));
jpaProperties.put("hibernate.enable_lazy_load_no_trans", true);
jpaProperties.put("hibernate.dialect", "org.hibernate.dialect.PostgreSQLDialect");
factory.setJpaProperties(jpaProperties);
factory.afterPropertiesSet();
logger.info("End entityManagerFactory");
return factory.getObject();
}
.
.
.
}
Your test is running in the context of a single transaction.
The Keywords are not being fetched; they're still in the persistence context from when you created them.
Try adding a manager.clear() after your save method call, as in the sketch below.
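Applied to the test above, that would look roughly like this (a sketch; the flush() before clear() is my assumption, to make sure the pending INSERTs reach the database first):
@Test
public void thatMediaLazyLoadsKeywords() throws Exception {
    PersistenceUnitUtil unitUtil = manager.getEntityManagerFactory().getPersistenceUnitUtil();
    Media media = DomainFixtures.createMedia();
    Media savedMedia = mediasRepository.save(media);
    manager.flush();  // push the pending INSERTs to the database
    manager.clear();  // detach everything so findOne() really hits the database
    Media retrievedMedia = mediasRepository.findOne(savedMedia.getMediaId());
    assertNotNull(retrievedMedia);
    assertFalse(unitUtil.isLoaded(retrievedMedia, "keywords"));
}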
FetchType.LAZY is just a hint to the persistence provider, it is not a setting you can rely on. The persistence provider is free to load the attributes whenever it sees fit.
Thus you cannot reliably test whether lazy loading works as expected.
Spring configuration is done in code using annotations instead of an XML file. I am trying to query some data and insert new records into an Oracle database through Hibernate. My problem is that Hibernate only generates SELECT queries; when I use sessionFactory.getCurrentSession().save(), Hibernate doesn't generate INSERT queries. I think this could be a transaction issue but couldn't find where it went wrong. I'll put the code below, and any help will be appreciated.
That's the main configuration class:
@Configuration
@ComponentScan
@EnableAutoConfiguration
@EnableTransactionManagement
@PropertySource({ "file:src/main/resources/Config/database.properties" })
public class QCWordImportExportTool {
@Autowired
private Environment env;
@Autowired
private WietHibernateInterceptor wietHibernateInterceptor;
/**
* main, says it all! :)
*
* @param args
*/
public static void main(String[] args) {
SpringApplication.run(QCWordImportExportTool.class, args);
}
@Bean
MultipartConfigElement multipartConfigElement() {
MultiPartConfigFactory factory = new MultiPartConfigFactory();
factory.setMaxFileSize("10MB");
factory.setMaxRequestSize("1024KB");
return factory.createMultipartConfig();
}
@Bean
public LocalSessionFactoryBean sessionFactory() {
LocalSessionFactoryBean sessionFactory = new LocalSessionFactoryBean();
sessionFactory.setDataSource(restDataSource());
sessionFactory.setPackagesToScan(new String[] { "com.ciena.prism.almtools.wiet" });
sessionFactory.setHibernateProperties(hibernateProperties());
sessionFactory.setEntityInterceptor(this.wietHibernateInterceptor);
return sessionFactory;
}
@Bean
public DataSource restDataSource() {
BasicDataSource dataSource = new BasicDataSource();
dataSource.setDriverClassName(env.getProperty("jdbc.driverClassName"));
dataSource.setUrl(env.getProperty("jdbc.url"));
dataSource.setUsername(env.getProperty("jdbc.username"));
dataSource.setPassword(env.getProperty("jdbc.password"));
return dataSource;
}
@Bean
public HibernateTransactionManager transactionManager() {
HibernateTransactionManager transactionManager = new HibernateTransactionManager();
transactionManager.setSessionFactory(sessionFactory().getObject());
return transactionManager;
}
@Bean
public PersistenceExceptionTranslationPostProcessor exceptionTranslation() {
return new PersistenceExceptionTranslationPostProcessor();
}
Properties hibernateProperties() {
return new Properties() {
private static final long serialVersionUID = 1L;
{
setProperty("hibernate.dialect", env.getProperty("hibernate.dialect"));
setProperty("hibernate.hbm2ddl.auto", env.getProperty("hibernate.hbm2ddl.auto"));
setProperty("hibernate.globally_quoted_identifiers",
env.getProperty("hibernate.globally_quoted_identifiers"));
setProperty("hibernate.show_sql", env.getProperty("hibernate.show_sql"));
setProperty("hibernate.format_sql", env.getProperty("hibernate.format_sql"));
}
};
}
}
The Service class is an interface; I'll post the ServiceImpl class with the main import method:
@Service
@Transactional(readOnly = true)
public class ImportExportManagerImpl implements ImportExportManager {
private TestFacade testFacade;
private TestFolderFacade testFolderFacade;
private UserManager userManager;
@Autowired
SessionFactory sessionFactory;
@Autowired
RequirementCoverageDAO requirementCoverageDao;
@Autowired
RequirementDAO requirementDao;
@Autowired
WietHibernateInterceptor wietHibernateInterceptor;
@Autowired
public ImportExportManagerImpl(TestFacade testFacade, TestFolderFacade testFolderFacade,
UserManager userManager) {
this.testFacade = testFacade;
this.testFolderFacade = testFolderFacade;
this.userManager = userManager;
}
/*
* (non-Javadoc)
*
* @see com.ciena.prism.almtools.wiet.managers.ImportExportManager#importTestCases(java.lang.String,
* java.lang.String, java.util.List)
*/
@Override
@Transactional(readOnly = false)
public void importTestCases(String domain, String project, List<TestCase> testCases)
throws RequestFailureException, RESTAPIException, InvalidDataException {
System.out.println("Start to import...");
setDBSchema(domain, project);
for (TestCase testCase : testCases) {
TestFolder testFolder = retrieveTestFolderFromPath(domain, project, testCase.getFolderPath());
Test test = new Test(testCase, testFolder);
ALMEntity almEntity = new ALMEntity(test);
Test existingTest = getExistingTest(domain, project, test);
if (existingTest == null) {
existingTest = new Test(testFacade.createEntity(domain, project, almEntity));
} else {
testFacade.updateEntity(domain, project, existingTest.getId(), almEntity);
}
System.out.println(existingTest.getName());
/* Create Requirement_Coverage using test and doors_object_ids */
List<String> doors_object_ids = testCase.getDoors_object_ids();
for (String doors_object_id : doors_object_ids) {
List<Requirement> requirementList = requirementDao.findAllFromDoorsobjectid(doors_object_id);
if (requirementList != null && !requirementList.isEmpty()) {
System.out.println("*************Requirement:" + doors_object_id + " not null");
/* check if the coverage already exist */
Requirement requirement = requirementList.get(0);
List<RequirementCoverage> requirementCoverageList = requirementCoverageDao
.findAllFromTestIdReqId(Integer.parseInt(existingTest.getId()),
requirement.getReqId());
if (requirementCoverageList == null || requirementCoverageList.isEmpty()) {
System.out.println("**************Creating new requirement coverage");
/* create a new Requirement Coverage Object */
RequirementCoverage requirementCoverage = new RequirementCoverage();
requirementCoverage.setRequirement(requirement);
requirementCoverage.setEntityId(Integer.parseInt(existingTest.getId()));
requirementCoverage.setEntityType("TEST");
requirementCoverageDao.create(requirementCoverage);
System.out.println("*********assigned DB id: " + requirementCoverage.getId());
}
} else {
throw new InvalidDataException("Requirement Management Tool Id : " + doors_object_id
+ " doesn't exist in QC");
}
}
}
}
}
And here's the DAO impl class:
@Repository
public class RequirementCoverageDAOImpl implements RequirementCoverageDAO {
@Autowired
private SessionFactory sessionFactory;
@Override
public Integer create(RequirementCoverage requirementCoverage) {
return (Integer) sessionFactory.getCurrentSession().save(requirementCoverage);
}
}
Then Entity Class:
@Entity
@Table(schema = "wiet", name = "REQ_COVER")
public class RequirementCoverage {
@SuppressWarnings("unused")
private static final long serialVersionUID = 1L;
@Id
@SequenceGenerator(name = "req_cover_id_gen", sequenceName = "wiet.REQ_COVER_SEQ", allocationSize = 1)
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "req_cover_id_gen")
@Column(name = "RC_ITEM_ID", unique = true, nullable = false)
private Integer id;
@OneToOne
@JoinColumn(name = "RC_REQ_ID", nullable = false)
private Requirement requirement;
@Column(name = "RC_ENTITY_ID", nullable = false)
private Integer entityId;
@Column(name = "RC_ENTITY_TYPE", nullable = false)
private String entityType;
....setters and gettters...
}
I hope I have made this clear; thanks for reading.
WietHibernateInterceptor is used to change schema dynamically:
@Component
public class WietHibernateInterceptor extends EmptyInterceptor {
private static final long serialVersionUID = 1L;
private String schema;
@Override
public String onPrepareStatement(String sql) {
String prepedStatement = super.onPrepareStatement(sql);
if (prepedStatement.toLowerCase().contains("wiet.".toLowerCase())) {
/* As @SequenceGenerator ignores the schema, the sequence query is manually corrected here */
prepedStatement = prepedStatement.replaceAll("`", "\"");
prepedStatement = prepedStatement.replaceAll("wiet.", this.schema + "\".\"");
}
/* Change schema dynamically */
prepedStatement = prepedStatement.replaceAll("wiet", this.schema);
return prepedStatement;
}
public String getSchema() {
return schema;
}
public void setSchema(String schema) {
this.schema = schema;
}
}
Hibernate will only generate the INSERT statements when it flushes the session. It flushes the session in the following scenarios (see the sketch after this list):
When the @Transactional method ends,
When it reaches its batch limit (if configured for batch inserts), or
When you call a query that it detects needs the session to be flushed, e.g. a count() method. Hibernate flushes the pending changes so that count() returns an accurate number of records.
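If you want to see the INSERT immediately instead of waiting for the commit, you can also flush explicitly (a sketch based on the RequirementCoverageDAOImpl above; the explicit flush is optional once the surrounding transaction commits correctly):
@Override
public Integer create(RequirementCoverage requirementCoverage) {
    Session session = sessionFactory.getCurrentSession();
    Integer id = (Integer) session.save(requirementCoverage);
    session.flush(); // forces Hibernate to issue the INSERT now rather than at commit time
    return id;
}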
My only idea is that an exception is being thrown that isn't being caught (but I would have expected it to show in your logs or your console), which is causing your transaction to roll back, or that for some reason the @Transactional session isn't being created or maintained.
The first thing I'd try: remove the Hibernate dialect from your Hibernate properties. Hibernate does a fantastic job of determining which dialect to use based on the driver you give it, and (especially with Oracle drivers) forcing it to use a different dialect can produce strange errors.
The second thing I'd try is to determine whether any SQLExceptions are being thrown by Hibernate. I assume you have already turned on Hibernate logging using Log4j or whatever logging provider you are using; see if you can find an SQLException being thrown.
Your @Transactional annotation says readOnly = true, which means only reads are allowed in that transaction. Remove readOnly = true.
Also look at Using @Transactional (section 9.5.6) in the Spring transaction management documentation.
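For reference, the relevant annotations would then look like this (a sketch; the point is that a method-level @Transactional overrides the class-level readOnly default):
@Service
@Transactional(readOnly = true) // class-level default: read-only
public class ImportExportManagerImpl implements ImportExportManager {

    @Override
    @Transactional // read-write: overrides the class-level default so the INSERTs are flushed at commit
    public void importTestCases(String domain, String project, List<TestCase> testCases) {
        // ... import logic as in the question
    }
}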