ISIS: Problems with Blob/Clob field serialization - java

Edit: Solution: Upgrading to ISIS 1.17.0 and setting the property isis.persistor.datanucleus.standaloneCollection.bulkLoad=false solved the first two problems.
I am using Apache ISIS 1.16.2 and I am trying to store Blob/Clob content in a MariaDB database (v10.1.35). For this, I use the DB connector org.mariadb.jdbc.mariadb-java-client (v2.3.0) and, in the code, the @Persistent annotation as shown in many examples and in the ISIS documentation.
Using the code below, I get just a single column named content_name (in which the Blob object is serialized in binary form) instead of the three columns content_name, content_mimetype and content_bytes.
This is the Document class with the Blob field content:
@PersistenceCapable(identityType = IdentityType.DATASTORE)
@DatastoreIdentity(strategy = IdGeneratorStrategy.IDENTITY, column = "id")
@DomainObject(editing = Editing.DISABLED, autoCompleteRepository = DocumentRepository.class, objectType = "Document")
@Getter
// ...
public class Document implements Comparable<Document> {

    @Persistent(
            defaultFetchGroup = "false",
            columns = {
                    @Column(name = "content_name"),
                    @Column(name = "content_mimetype"),
                    @Column(name = "content_bytes",
                            jdbcType = "BLOB",
                            sqlType = "LONGVARBINARY")
            })
    @Nonnull
    @Column(allowsNull = "false")
    @Property(optionality = Optionality.MANDATORY)
    private Blob content;

    // ...

    @Column(allowsNull = "false")
    @Property
    private Date created = new Date();

    public Date defaultCreated() {
        return new Date();
    }

    @Column(allowsNull = "true")
    @Property
    @Setter
    private String owner;

    // ...
}
This creates the following schema for the DomainObject class Document with just one column for the Blob field:
CREATE TABLE `document` (
`id` bigint(20) NOT NULL AUTO_INCREMENT,
`content_name` mediumblob,
`created` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
`owner` varchar(255) CHARACTER SET latin1 COLLATE latin1_bin DEFAULT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
Normally, the class org.apache.isis.objectstore.jdo.datanucleus.valuetypes.IsisBlobMapping of the ISIS framework should do the mapping, but it seems that this mapper is somehow not involved...
1. Question: How do I get the Blob field split up into the three columns (as described above and shown in many demo projects)? Even if I switch to HSQLDB, I still get only one column, so this is probably not an issue with MariaDB.
2. Question: If I use a Blob/Clob field in a class that inherits from another DomainObject class, I often get an org.datanucleus.exceptions.NucleusException (stack trace below) and I cannot make head or tail of it. What are potential pitfalls when dealing with inheritance? Why am I getting this exception?
3. Question: I need to store documents belonging to domain objects (as you might have guessed). The proper way of doing so would be to store the documents in a file system tree instead of the database (which also has by default some size limitations for object data) and reference the files in the object. In the DataNucleus documentation I found the extension serializeToFileLocation that should do exactly that. I tried it by adding the line @Extension(vendorName="datanucleus", key="serializeToFileLocation", value="document-repository") to the Blob field, but nothing happened. So my question is: is this DataNucleus extension compatible with Apache ISIS?
If this extension conflicts with ISIS, would it be possible to have a javax.jdo.listener.StoreLifecycleListener or org.apache.isis.applib.AbstractSubscriber that stores the Blob on a file system before persisting the domain object to the database and restores it before loading? Are there better solutions available?
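To illustrate the listener idea, here is a minimal sketch of what I have in mind (the class name and target directory are placeholders; registering it via pm.addInstanceLifecycleListener(listener, Document.class) and the restore-on-load half are omitted):
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import javax.jdo.listener.InstanceLifecycleEvent;
import javax.jdo.listener.StoreLifecycleListener;

public class BlobToFileSystemListener implements StoreLifecycleListener {

    // assumption: documents end up under this directory
    private final Path baseDir = Paths.get("document-repository");

    @Override
    public void preStore(InstanceLifecycleEvent event) {
        Object pojo = event.getPersistentInstance();
        if (!(pojo instanceof Document)) {
            return;
        }
        org.apache.isis.applib.value.Blob blob = ((Document) pojo).getContent();
        if (blob == null) {
            return;
        }
        try {
            Files.createDirectories(baseDir);
            // write the payload to disk; the entity would then only keep a reference
            Files.write(baseDir.resolve(blob.getName()), blob.getBytes());
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    @Override
    public void postStore(InstanceLifecycleEvent event) {
        // nothing to do after the store in this sketch
    }
}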
That's it for now. Thank you in advance! ;-)
The stack trace for question 2:
... (other Wicket related stack trace)
Caused by: org.datanucleus.exceptions.NucleusException: Creation of SQLExpression for mapping "org.datanucleus.store.rdbms.mapping.java.SerialisedMapping" caused error
at org.datanucleus.store.rdbms.sql.expression.SQLExpressionFactory.newExpression(SQLExpressionFactory.java:199)
at org.datanucleus.store.rdbms.sql.expression.SQLExpressionFactory.newExpression(SQLExpressionFactory.java:155)
at org.datanucleus.store.rdbms.request.LocateBulkRequest.getStatement(LocateBulkRequest.java:158)
at org.datanucleus.store.rdbms.request.LocateBulkRequest.execute(LocateBulkRequest.java:283)
at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.locateObjects(RDBMSPersistenceHandler.java:564)
at org.datanucleus.ExecutionContextImpl.findObjects(ExecutionContextImpl.java:3313)
at org.datanucleus.api.jdo.JDOPersistenceManager.getObjectsById(JDOPersistenceManager.java:1850)
at org.apache.isis.core.runtime.system.persistence.PersistenceSession.loadPersistentPojos(PersistenceSession.java:1010)
at org.apache.isis.core.runtime.system.persistence.PersistenceSession.adaptersFor(PersistenceSession.java:1603)
at org.apache.isis.core.runtime.system.persistence.PersistenceSession.adaptersFor(PersistenceSession.java:1573)
at org.apache.isis.viewer.wicket.model.models.EntityCollectionModel$Type$1.loadInBulk(EntityCollectionModel.java:107)
at org.apache.isis.viewer.wicket.model.models.EntityCollectionModel$Type$1.load(EntityCollectionModel.java:93)
at org.apache.isis.viewer.wicket.model.models.EntityCollectionModel.load(EntityCollectionModel.java:454)
at org.apache.isis.viewer.wicket.model.models.EntityCollectionModel.load(EntityCollectionModel.java:70)
at org.apache.wicket.model.LoadableDetachableModel.getObject(LoadableDetachableModel.java:135)
at org.apache.isis.viewer.wicket.ui.components.collectioncontents.ajaxtable.CollectionContentsSortableDataProvider.size(CollectionContentsSortableDataProvider.java:68)
at org.apache.wicket.markup.repeater.data.DataViewBase.internalGetItemCount(DataViewBase.java:142)
at org.apache.wicket.markup.repeater.AbstractPageableView.getItemCount(AbstractPageableView.java:235)
at org.apache.wicket.markup.repeater.AbstractPageableView.getRowCount(AbstractPageableView.java:216)
at org.apache.wicket.markup.repeater.AbstractPageableView.getViewSize(AbstractPageableView.java:314)
at org.apache.wicket.markup.repeater.AbstractPageableView.getItemModels(AbstractPageableView.java:99)
at org.apache.wicket.markup.repeater.RefreshingView.onPopulate(RefreshingView.java:93)
at org.apache.wicket.markup.repeater.AbstractRepeater.onBeforeRender(AbstractRepeater.java:124)
at org.apache.wicket.markup.repeater.AbstractPageableView.onBeforeRender(AbstractPageableView.java:115)
at org.apache.wicket.Component.internalBeforeRender(Component.java:950)
at org.apache.wicket.Component.beforeRender(Component.java:1018)
at org.apache.wicket.MarkupContainer.onBeforeRenderChildren(MarkupContainer.java:1825)
... 81 more
Caused by: org.datanucleus.exceptions.NucleusException: Unable to create SQLExpression for mapping of type "org.datanucleus.store.rdbms.mapping.java.SerialisedMapping" since not supported
at org.datanucleus.store.rdbms.sql.expression.SQLExpressionFactory.newExpression(SQLExpressionFactory.java:189)
at org.datanucleus.store.rdbms.sql.expression.SQLExpressionFactory.newExpression(SQLExpressionFactory.java:155)
at org.datanucleus.store.rdbms.request.LocateBulkRequest.getStatement(LocateBulkRequest.java:158)
at org.datanucleus.store.rdbms.request.LocateBulkRequest.execute(LocateBulkRequest.java:283)
at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.locateObjects(RDBMSPersistenceHandler.java:564)
at org.datanucleus.ExecutionContextImpl.findObjects(ExecutionContextImpl.java:3313)
at org.datanucleus.api.jdo.JDOPersistenceManager.getObjectsById(JDOPersistenceManager.java:1850)
at org.apache.isis.core.runtime.system.persistence.PersistenceSession.loadPersistentPojos(PersistenceSession.java:1010)
at org.apache.isis.core.runtime.system.persistence.PersistenceSession.adaptersFor(PersistenceSession.java:1603)
at org.apache.isis.core.runtime.system.persistence.PersistenceSession.adaptersFor(PersistenceSession.java:1573)
at org.apache.isis.viewer.wicket.model.models.EntityCollectionModel$Type$1.loadInBulk(EntityCollectionModel.java:107)
at org.apache.isis.viewer.wicket.model.models.EntityCollectionModel$Type$1.load(EntityCollectionModel.java:93)
at org.apache.isis.viewer.wicket.model.models.EntityCollectionModel.load(EntityCollectionModel.java:454)
at org.apache.isis.viewer.wicket.model.models.EntityCollectionModel.load(EntityCollectionModel.java:70)
at org.apache.wicket.model.LoadableDetachableModel.getObject(LoadableDetachableModel.java:135)
at org.apache.isis.viewer.wicket.ui.components.collectioncontents.ajaxtable.CollectionContentsSortableDataProvider.size(CollectionContentsSortableDataProvider.java:68)
at org.apache.wicket.markup.repeater.data.DataViewBase.internalGetItemCount(DataViewBase.java:142)
at org.apache.wicket.markup.repeater.AbstractPageableView.getItemCount(AbstractPageableView.java:235)
at org.apache.wicket.markup.repeater.AbstractPageableView.getRowCount(AbstractPageableView.java:216)
at org.apache.wicket.markup.repeater.AbstractPageableView.getViewSize(AbstractPageableView.java:314)
at org.apache.wicket.markup.repeater.AbstractPageableView.getItemModels(AbstractPageableView.java:99)
at org.apache.wicket.markup.repeater.RefreshingView.onPopulate(RefreshingView.java:93)
at org.apache.wicket.markup.repeater.AbstractRepeater.onBeforeRender(AbstractRepeater.java:124)
at org.apache.wicket.markup.repeater.AbstractPageableView.onBeforeRender(AbstractPageableView.java:115)
// <-- 8 times the following lines
at org.apache.wicket.Component.internalBeforeRender(Component.java:950)
at org.apache.wicket.Component.beforeRender(Component.java:1018)
at org.apache.wicket.MarkupContainer.onBeforeRenderChildren(MarkupContainer.java:1825)
at org.apache.wicket.Component.onBeforeRender(Component.java:3916)
// -->
at org.apache.wicket.Page.onBeforeRender(Page.java:801)
at org.apache.wicket.Component.internalBeforeRender(Component.java:950)
at org.apache.wicket.Component.beforeRender(Component.java:1018)
at org.apache.wicket.Component.internalPrepareForRender(Component.java:2236)
at org.apache.wicket.Page.internalPrepareForRender(Page.java:242)
at org.apache.wicket.Component.render(Component.java:2325)
at org.apache.wicket.Page.renderPage(Page.java:1018)
at org.apache.wicket.request.handler.render.WebPageRenderer.renderPage(WebPageRenderer.java:124)
at org.apache.wicket.request.handler.render.WebPageRenderer.respond(WebPageRenderer.java:195)
at org.apache.wicket.core.request.handler.RenderPageRequestHandler.respond(RenderPageRequestHandler.java:175)
at org.apache.wicket.request.cycle.RequestCycle$HandlerExecutor.respond(RequestCycle.java:895)
at org.apache.wicket.request.RequestHandlerStack.execute(RequestHandlerStack.java:64)
at org.apache.wicket.request.cycle.RequestCycle.execute(RequestCycle.java:265)
at org.apache.wicket.request.cycle.RequestCycle.processRequest(RequestCycle.java:222)
at org.apache.wicket.request.cycle.RequestCycle.processRequestAndDetach(RequestCycle.java:293)
at org.apache.wicket.protocol.http.WicketFilter.processRequestCycle(WicketFilter.java:261)
at org.apache.wicket.protocol.http.WicketFilter.processRequest(WicketFilter.java:203)
at org.apache.wicket.protocol.http.WicketFilter.doFilter(WicketFilter.java:284)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1668)
at org.apache.isis.core.webapp.diagnostics.IsisLogOnExceptionFilter.doFilter(IsisLogOnExceptionFilter.java:52)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1668)
at org.apache.shiro.web.servlet.AbstractShiroFilter.executeChain(AbstractShiroFilter.java:449)
at org.apache.shiro.web.servlet.AbstractShiroFilter$1.call(AbstractShiroFilter.java:365)
at org.apache.shiro.subject.support.SubjectCallable.doCall(SubjectCallable.java:90)
at org.apache.shiro.subject.support.SubjectCallable.call(SubjectCallable.java:83)
at org.apache.shiro.subject.support.DelegatingSubject.execute(DelegatingSubject.java:383)
at org.apache.shiro.web.servlet.AbstractShiroFilter.doFilterInternal(AbstractShiroFilter.java:362)
at org.apache.shiro.web.servlet.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:125)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1668)
// ... some Jetty stuff
at java.lang.Thread.run(Thread.java:748)

After some research, I think the problems in questions 1 and 2 are related to ISIS bug report #1902.
In short: the DataNucleus extension plugin resolution mechanism does not seem to find the ISIS value type adapters and therefore cannot know how to serialize ISIS's Blob/Clob types.
According to the aforementioned bug report, this problem is fixed in 1.17.0, so I am trying to upgrade from 1.16.2 to that version (which introduced many other problems, but that will be a separate topic).
For question 3 I have found Minio, which basically addresses my problem but is a bit oversized for my needs. I will keep looking for other ways to store Blobs/Clobs on the local file system and will keep Minio in mind...
UPDATE:
I upgraded my project to ISIS version 1.17.0. This solved the problem in question 1 (I now get three columns for a Blob/Clob object).
The problem in question 2 (NucleusException) is not solved by the upgrade. I figured out that it is only thrown when returning a list of DomainObjects with Blob/Clob field(s), i.e. when they are rendered as a standalone table. If I go directly to an object's entity view, no exception is thrown and I can see/modify/download the Blob/Clob content.
In the meantime, I wrote my own DataNucleus plugin that stores the Blobs/Clobs as files on a file system.
UPDATE 2
I found a solution for circumventing the org.datanucleus.exceptions.NucleusException: Unable to create SQLExpression for mapping of type "org.apache.isis.objectstore.jdo.datanucleus.valuetypes.IsisClobMapping" since not supported. It seems to be a problem with bulk loading (but I do not know any details).
Deactivating bulk loading via the property isis.persistor.datanucleus.standaloneCollection.bulkLoad=false (which is initially set to true by the ISIS archetypes) solved the problem.
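For reference, the line as it ends up in the application's isis.properties (location may vary by archetype):
# disable bulk loading of standalone collections
isis.persistor.datanucleus.standaloneCollection.bulkLoad=false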

Related

How to retrieve entity with column of database type text[] and not get NoSuchMethodError

I have an entity defined as per below. When I try to retrieve it from the database, it throws an intermittent java.lang.NoSuchMethodError. The error normally occurs when the endpoint is called multiple times before the previous call finishes. From my experience and internet searches, this error usually occurs when the compiled Java version does not match the runtime Java version.
I am using Java 8 for compiling and running the application. I am also using EclipseLink 2.6.2, and the Postgres driver version is 42.3.3.
Entity:
@Struct(name = "myArrayColumn")
@Entity
@Table(name = "myTable")
public class MyTableEntity {
    ...
    @Column(name = "myArrayColumn", columnDefinition = "text[]")
    @Array(databaseType = "varchar")
    private List<String> myArrayColumn;
    ...
}
Table:
CREATE TABLE myTable
(
...
myArrayColumn text[],
...
)
Stacktrace:
Caused by: java.lang.NoSuchMethodError: java.nio.ByteBuffer.position(I)Ljava/nio/ByteBuffer;
at org.postgresql.jdbc.ArrayDecoding.readBinaryArray(ArrayDecoding.java:529)
at org.postgresql.jdbc.PgArray.readBinaryArray(PgArray.java:175)
at org.postgresql.jdbc.PgArray.getArrayImpl(PgArray.java:150)
at org.postgresql.jdbc.PgArray.getArray(PgArray.java:111)
You are approaching this the wrong way: the database does not support storing this kind of array object. You can either store multiple records and use a column to associate them, or store the values in one column separated by a special delimiter and split them with String.split after reading them back (sketch below).
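A minimal sketch of the delimiter approach in Java (the column then becomes plain varchar/text; the delimiter is an arbitrary choice and must not occur in the stored values):
import java.util.Arrays;
import java.util.List;

// join the values into one plain-text column before persisting
List<String> values = Arrays.asList("red", "green", "blue");
String columnValue = String.join(";", values);        // "red;green;blue"

// split the column back into a list after reading the row
List<String> restored = Arrays.asList(columnValue.split(";"));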

MongoDb Duplicate Keys with versionned document

I have a problem with E11000 duplicate key error collection: xxxxx index: _id_ dup key: since I decided to add a version with the @Version annotation to an object.
@Version
private long version;
When the object is modified, I use mongoTemplate.save(obj); to overwrite the document, and the error pops up at that time.
All worked fine before the versioned doc.
Do you have an idea? Because I don't.
The objects are stored with MongoDB 3.6.
Thank you
If you add a version to your document, you have to change it:
the version and the id together form a compound key.
The problem was that I tried to save a new copy of the object. Without versioning there was no problem.
I solved it by saving the loaded object directly (sketch below).
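For anyone hitting the same thing, a sketch of the difference (assuming Spring Data's MongoTemplate and a hypothetical MyDoc entity with getters/setters):
// wrong: a freshly constructed copy carries version 0, so save() attempts an
// insert with the existing _id and triggers the E11000 duplicate key error
// MyDoc copy = new MyDoc(existingId);
// mongoTemplate.save(copy);

// right: load the stored document, modify that same instance, and save it;
// the @Version field is then checked and incremented automatically
MyDoc doc = mongoTemplate.findById(existingId, MyDoc.class);
doc.setName("new value");   // hypothetical setter
mongoTemplate.save(doc);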

Flyway partial migration of legacy application

We have an application with a custom database migrator which we want to replace with Flyway.
These migrations are split into some categories like "account" for user management and "catalog" for the product catalog.
Files are named $category.migration.$version.sql, where $category is one of the above categories and $version is an integer version starting from 0, e.g. account.migration.23.sql.
Although one could argue that each category should be a separate database, in fact it isn't and a major refactoring would be required to change that.
Also I could use one schema per category, but again this would require rewriting all SQL queries.
So I did the following:
Move $category.migration.$version.sql to /sql/$category/V$version__$category.sql (e.g. account.migration.1.sql becomes /sql/account/V1__account.sql)
Use a metadata table per category
Set the baseline version to zero
In code that would be
String[] _categories = new String[] { "catalog", "account" };
for (String _category : _categories) {
    Flyway _flyway = new Flyway();
    _flyway.setDataSource(databaseUrl.getUrl(), databaseUrl.getUser(), databaseUrl.getPassword());
    _flyway.setBaselineVersion(MigrationVersion.fromVersion("0"));
    _flyway.setLocations("classpath:/sql/" + _category);
    _flyway.setTarget(MigrationVersion.fromVersion(_version + "")); // _version: latest version for this category
    _flyway.setTable(_category + "_schema_version");
    _flyway.setBaselineOnMigrate(true); // (1)
    _flyway.migrate();
}
So there would be the metadata tables catalog_schema_version and account_schema_version.
Now the issue is as follows:
Starting with an empty database I would like to apply all pre-existing migrations per category, as done above.
If I remove _flyway.setBaselineOnMigrate(true); (1), then the catalog migration (the first one) succeeds, but Flyway complains for account that the schema public is not empty.
Likewise, setting _flyway.setBaselineOnMigrate(true); causes the following behavior:
the migration of catalog succeeds, but V0__account.sql is ignored and Flyway starts with V1__account.sql, maybe because it somehow still thinks the database was already baselined?
Does anyone have a suggestion for resolving the problem?
Your easiest solution is to keep the schema_version tables each in a separate schema. I've answered a very similar question here.
Regarding your observation on baselining, those are expected traits. The migration of account starts at V1 because, with the combination of baseline=0, baselineOnMigrate=true and a non-empty target schema (because catalog has populated it), Flyway determines that this is a pre-existing database equal to the baseline and thus starts at V1.
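Sketched against the same pre-5.x Flyway API as the question (the schema name is an assumption): Flyway creates its metadata table in the first schema passed to setSchemas, and it also makes that schema the default during migration, so unqualified SQL may need adjusting.
for (String _category : _categories) {
    Flyway _flyway = new Flyway();
    _flyway.setDataSource(databaseUrl.getUrl(), databaseUrl.getUser(), databaseUrl.getPassword());
    // the first schema listed hosts the metadata table (and becomes the default schema)
    _flyway.setSchemas("flyway_" + _category, "public");
    _flyway.setTable(_category + "_schema_version");
    _flyway.setLocations("classpath:/sql/" + _category);
    _flyway.migrate();
}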

Derby "A truncation error was encountered trying to shrink CLOB '<stream-value>' to length 255"

I have a column definition in my JPA entity.
@Lob
String message;
I get the following error when I try to persist my entity with some message:
Caused by: java.sql.SQLException: Java exception: 'A truncation error was encountered trying to shrink CLOB '' to length 255.: org.apache.derby.iapi.services.io.DerbyIOException'.
Caused by: org.apache.derby.iapi.services.io.DerbyIOException: A truncation error was encountered trying to shrink CLOB '' to length 255.
I'm using Hibernate 4.2.15.Final with Apache Derby 10.10.2 and schema generation during testing. It looks like the length of the column is limited to 255. How do I overcome this error?
The Derby dialect in Hibernate generates a column definition with a limited length:
message clob(255)
You may want to override this behaviour and set your own length, or fall back to the default length. You can use
@Column(columnDefinition="clob")
@Lob
String message;
Remember that in Derby
A CLOB without a specified length is defaulted to two gigabytes (2,147,483,647)
This is regarded as a bug in Hibernate (https://hibernate.atlassian.net/browse/HHH-7264 and https://hibernate.atlassian.net/browse/HHH-1501), but it seems it is not going to be fixed:
Hibernate does not take into account the length/scale/precision nor
the dialect when determining the Hibernate Type to use for a property
which omitted specifying one. What it does is to basically choose a
default based solely on the Java type
The base class of all Derby Dialects (DB2Dialect) does this:
registerColumnType( Types.CLOB, "clob($l)" );
You can hook in a Hibernate DialectResolver (or simply set hibernate.dialect) to supply a custom dialect which does something like:
public class DerbyTenSevenDialectWithClob extends DerbyTenSevenDialect {
    public DerbyTenSevenDialectWithClob() {
        super();
        // register CLOB without a length; Derby then defaults it to 2 GB
        registerColumnType( Types.CLOB, "clob" );
    }
}
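To make Hibernate use such a subclass, one option is to point hibernate.dialect at it (the package name below is a placeholder):
hibernate.dialect=com.example.DerbyTenSevenDialectWithClob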
I just verified this using Hibernate 5.3.7 and Derby 10.12.1.1.
I had the same problem, and here is my solution for hibernate-core-4.2.7 and Derby:
If you create string columns, they will become VARCHAR(255):
@Column()
private String userName;
BUT you can increase the length, for example to 1000 characters, by doing:
@Column(length = 1000)
private String description;
PS: I had to drop my old table to force creation of a new one with the correct size. Also have a look inside the @Column interface; there is some useful stuff in there, like unique, nullable and updatable.

DbUnit: NoSuchColumnException and case sensitivity

Before posting this I googled a bit, looked in the dbunit-user archives and also a bit in the DbUnit bug list, but I have not found what I am looking for.
Unfortunately, the answers here did not help me either.
I'm using DbUnit 2.4.8 with MySQL 5.1.x to populate some JForum tables in setUp.
The issue first appears on the jforum_users table, created by this script:
CREATE TABLE `jforum_users` (
`user_id` INT(11) NOT NULL AUTO_INCREMENT,
`user_active` TINYINT(1) NULL DEFAULT NULL,
`username` VARCHAR(50) NOT NULL DEFAULT '',
`user_password` VARCHAR(32) NOT NULL DEFAULT '',
[...]
PRIMARY KEY (`user_id`)
)
COLLATE='utf8_general_ci'
ENGINE=InnoDB
ROW_FORMAT=DEFAULT
AUTO_INCREMENT=14
Executing REFRESH as the database setup operation, the following exception is raised:
org.dbunit.dataset.NoSuchColumnException: jforum_users.USER_ID -
(Non-uppercase input column: USER_ID) in ColumnNameToIndexes cache
map. Note that the map's column names are NOT case sensitive.
at org.dbunit.dataset.AbstractTableMetaData.getColumnIndex(AbstractTableMetaData.java:117)
at org.dbunit.operation.AbstractOperation.getOperationMetaData(AbstractOperation.java:89)
at org.dbunit.operation.RefreshOperation.execute(RefreshOperation.java:98)
at org.dbunit.AbstractDatabaseTester.executeOperation(AbstractDatabaseTester.java:190)
at org.dbunit.AbstractDatabaseTester.onSetup(AbstractDatabaseTester.java:103)
at net.jforum.dao.generic.AbstractDaoTest.setUpDatabase(AbstractDaoTest.java:43)
I looked at the AbstractTableMetaData.java sources and nothing seems (statically) wrong.
The method
private Map createColumnIndexesMap(Column[] columns)
uses
columns[i].getColumnName().toUpperCase()
in writing map keys.
And then the method
public int getColumnIndex(String columnName)
uses
String columnNameUpperCase = columnName.toUpperCase();
Integer colIndex = (Integer) this._columnsToIndexes.get(columnNameUpperCase);
to read object from the map.
I really can't understand what's going on...
Can anybody help me, please?
Edit after the last @limc answer
I'm using a PropertiesBasedJdbcDatabaseTester to configure my DbUnit environment, as follows:
Properties dbProperties = new Properties();
dbProperties.load(new FileInputStream(testConfDir+"/db.properties"));
System.setProperty(PropertiesBasedJdbcDatabaseTester.DBUNIT_DRIVER_CLASS,
dbProperties.getProperty(PropertiesBasedJdbcDatabaseTester.DBUNIT_DRIVER_CLASS));
System.setProperty(PropertiesBasedJdbcDatabaseTester.DBUNIT_CONNECTION_URL,
dbProperties.getProperty(PropertiesBasedJdbcDatabaseTester.DBUNIT_CONNECTION_URL));
System.setProperty(PropertiesBasedJdbcDatabaseTester.DBUNIT_USERNAME,
dbProperties.getProperty(PropertiesBasedJdbcDatabaseTester.DBUNIT_USERNAME));
System.setProperty(PropertiesBasedJdbcDatabaseTester.DBUNIT_PASSWORD,
dbProperties.getProperty(PropertiesBasedJdbcDatabaseTester.DBUNIT_PASSWORD));
System.setProperty(PropertiesBasedJdbcDatabaseTester.DBUNIT_SCHEMA,
dbProperties.getProperty(PropertiesBasedJdbcDatabaseTester.DBUNIT_SCHEMA));
databaseTester = new PropertiesBasedJdbcDatabaseTester();
databaseTester.setSetUpOperation(getSetUpOperation());
databaseTester.setTearDownOperation(getTearDownOperation());
IDataSet dataSet = getDataSet();
databaseTester.setDataSet(dataSet);
databaseTester.onSetup();
I had a similar problem today (using the IDatabaseTester interface added in v2.2 against MySQL) and spent several hours tearing my hair out over it. The OP is using a PropertiesBasedJdbcDatabaseTester, whilst I was using its 'parent', JdbcDatabaseTester.
DbUnit has a FAQ answer related to this NoSuchColumnException (specific to MySQL), but it looks like an oversight to me that it neglects to mention that each connection drawn from the interface's getConnection() method has its own separate config. In fact, I'd go so far as to call it a bug, given the wording of the various bits of documentation I looked at today and the names of the classes involved (e.g. DatabaseConfig, yet per Connection?).
Anyway, in sections of code like setup/teardown (example below) you don't even provide the Connection object, so there was no way I could see to set the config there:
dbTester.setDataSet(beforeData);
dbTester.onSetup();
In the end I just extended JdbcDatabaseTester to @Override the getConnection() method and inject the MySQL-specific configuration each time:
class MySQLJdbcDatabaseTester extends org.dbunit.JdbcDatabaseTester {

    public MySQLJdbcDatabaseTester(String driverClass, String connectionUrl, String username, String password,
            String schema) throws ClassNotFoundException {
        super(driverClass, connectionUrl, username, password, schema);
    }

    @Override
    public IDatabaseConnection getConnection() throws Exception {
        IDatabaseConnection connection = super.getConnection();
        DatabaseConfig dbConfig = connection.getConfig();
        dbConfig.setProperty(DatabaseConfig.PROPERTY_DATATYPE_FACTORY, new MySqlDataTypeFactory());
        dbConfig.setProperty(DatabaseConfig.PROPERTY_METADATA_HANDLER, new MySqlMetadataHandler());
        return connection;
    }
}
And finally all the errors went away.
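For completeness, a usage sketch of the subclass (driver, URL and credentials are placeholders; IDatabaseTester is org.dbunit.IDatabaseTester and DatabaseOperation is org.dbunit.operation.DatabaseOperation):
IDatabaseTester dbTester = new MySQLJdbcDatabaseTester(
        "com.mysql.jdbc.Driver", "jdbc:mysql://localhost:3306/test",
        "user", "password", "test");
dbTester.setSetUpOperation(DatabaseOperation.CLEAN_INSERT);
dbTester.setDataSet(beforeData);
dbTester.onSetup();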
I have reason to believe the problem stems from the user_id column being the record ID. I had a similar problem in the past where the row ID was generated natively by SQL Server. I'm not at my work desk now, but try this solution to see if it helps: http://old.nabble.com/case-sensitivity-on-tearDown--td22964025.html
UPDATE - 02-03-11
I have a working solution here. Here's my test code:-
MySQL Script
CREATE TABLE `jforum_users` (
`user_id` INT(11) NOT NULL AUTO_INCREMENT,
`user_active` TINYINT(1) NULL DEFAULT NULL,
`username` VARCHAR(50) NOT NULL DEFAULT '',
PRIMARY KEY (`user_id`)
)
COLLATE='utf8_general_ci'
ENGINE=InnoDB
ROW_FORMAT=DEFAULT
AUTO_INCREMENT=14
dbunit-test.xml Test File
<?xml version='1.0' encoding='UTF-8'?>
<dataset>
<jforum_users user_id="100" username="First User" />
</dataset>
Java Code
Class.forName("com.mysql.jdbc.Driver");
Connection jdbcConnection = DriverManager.getConnection("jdbc:mysql://localhost:8889/test", "", "");
IDatabaseConnection con = new DatabaseConnection(jdbcConnection);
InputStream is = getClass().getClassLoader().getResourceAsStream("dbunit-test.xml");
IDataSet dataSet = new FlatXmlDataSetBuilder().build(is);
DatabaseOperation.CLEAN_INSERT.execute(con, dataSet);
con.close();
I didn't get any errors, and the row was added into the database.
Just FYI, I did try a REFRESH and that works fine without errors too:-
DatabaseOperation.REFRESH.execute(con, dataSet);
I'm using DBUnit 2.4.8 and MySQL 5.1.44.
Hope this helps.
I came here looking for an answer to this problem. For me, the problem was the Hibernate naming strategy. I realised this because show-sql was true in Spring's application.properties:
spring.jpa.show-sql=true
I could see the generated table SQL, and the field name was 'FACT_NUMBER' instead of the 'factNumber' I had in my DbUnit XML.
This was solved by forcing the default naming strategy (ironically, the default seems to be org.hibernate.cfg.ImprovedNamingStrategy, which puts in the '_'):
spring.jpa.hibernate.naming-strategy=org.hibernate.cfg.DefaultNamingStrategy
When I got this error, it was because my schema had a not-null constraint on a column, and this column was missing from my data file.
For example, my table had
<table name="mytable">
<column>id</column>
<column>entity_type</column>
<column>deleted</column>
</table>
<dataset>
<mytable id="100" entity_type="2"/>
</dataset>
I have a not-null constraint on the deleted column, and when I run the test I get the NoSuchColumnException.
When I change the dataset to
<mytable id="100" entity_type="2" deleted="0"/>
I get past the Exception.
I faced this problem because the DTD of my dataset file described a table differently from the one where I wanted to insert data.
So check that the table where you want to insert data has the same columns as your DTD file.
When I deleted from the DTD file the column that was not in the table, the problem disappeared.
Well, in my case it was a CSV file encoded in UTF-8 with a BOM character at the beginning. I was using Notepad to create the CSV files; use Notepad++ to avoid saving the BOM character.
I had the same problem, then figured out I had used a different column name in my DB than in my XML file.
I'm sure your problem is user_id vs. USER_ID.
I've just stumbled over this error message myself.
I had to extend an old piece of code: I needed to add a new column to several tables, and in one of my entities I had forgotten to create a setter for that column. So you might check whether your entities are "complete".
Sometimes it is as simple as that.
OK, I faced the same trouble and found the solution. The way we were creating the test data was wrong for the kind of data set we were using: we were using an XmlDataSet, for which the following format is correct; if you are using a FlatXmlDataSet, there is a different format (see the comparison after the example, and the link below for more explanation). The XML should be in the following format:
<?xml version="1.0" encoding="UTF-8"?>
<dataset>
<table>
<column>id</column>
<column>name</column>
<column>department</column>
<column>startDate</column>
<column>endDate</column>
<row>
<value>999</value>
<value>TEMP</value>
<value>TEMP DEPT</value>
<value>2113-10-13</value>
<value>2123-10-13</value>
</row>
</table>
</dataset>
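For comparison, the same row in the FlatXmlDataSet format, where the element name is the table name (a placeholder here) and each column becomes an attribute:
<?xml version="1.0" encoding="UTF-8"?>
<dataset>
<mytable id="999" name="TEMP" department="TEMP DEPT" startDate="2113-10-13" endDate="2123-10-13"/>
</dataset>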
If you wish to know more, see this link: http://dbunit.sourceforge.net/components.html
