I have a problem extracting data of an Oracle custom type from the database.
One of the columns in the table is of an Oracle custom type (which is actually a VARRAY).
Here is the create statement for this type:
create or replace TYPE CAlarmMessList AS VARRAY(15) OF CAlarmMess;
Where CAlarmMess is the following:
create or replace TYPE CAlarmMess AS OBJECT (
    EreignisTypId NUMBER(9),
    EreignisKlasseId NUMBER(9),
    AlarmZeit DATE,
    ParamWert1 VARCHAR2(65 CHAR),
    ParamWert2 VARCHAR2(65 CHAR),
    ParamWert3 VARCHAR2(65 CHAR)
    [....]
The field I need has the name ALARME and the type CALARMMESSLIST.
I generated an Entity for the table (WSENSORSTATE) using Oracle tools for Eclipse; this field was generated as an Object field.
I manually created a Java type CAlarmMess with the fields:
private BigDecimal ereignisTypId;
private BigDecimal ereignisKlasseId;
private Date alarmZeit;
private String paramWert1;
private String paramWert2;
private String paramWert3;
I then changed the type of the field in my Entity to List<CAlarmMess>.
Then I added a customizer for my entity:
@Customizer(com.companyname.entities.WsensorstateCustomizer.class)
where the customize method is the following:
public void customize(ClassDescriptor descriptor) throws Exception {
    ObjectArrayMapping arrayMapping = new ObjectArrayMapping();
    arrayMapping.setReferenceClass(Wsensorstate.class);
    arrayMapping.setAttributeName("alarme");
    arrayMapping.setFieldName("CAlarmMess");
    arrayMapping.setStructureName("CAlarmMessList");
    descriptor.addMapping(arrayMapping);
}
Now it fails with the following error:
Exception [EclipseLink-197] (Eclipse Persistence Services - 2.0.1.v20100213-r6600): org.eclipse.persistence.exceptions.DescriptorException
Exception Description: The mapping [alarme] is not the appropriate type for this descriptor
Mapping: org.eclipse.persistence.mappings.structures.ObjectArrayMapping[alarme]
Descriptor: RelationalDescriptor(com.companyname.entities.Wsensorstate --> [DatabaseTable(WSENSORSTATE)])
I tried to add a @StructConverter for the CAlarmMess, but that didn't work either.
Any suggestions will be appreciated; I have already spent too much time working on this. :)
You have a WsensorstateCustomizer that I assume is set on the Wsensorstate entity, but the mapping you are adding is set to reference Wsensorstate.class. This tells EclipseLink that you are trying to build an array/collection of Wsensorstate, which isn't allowed, as Wsensorstate is an entity with a RelationalDescriptor. Instead, you will have to reference the Java class you created for the CAlarmMess structure, so that you get a collection of CAlarmMess objects in your Wsensorstate entity. Alternatively, you can map alarme directly to a VARRAY and deal with the Oracle objects yourself, but I'd go with the previous approach.
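For illustration, a corrected customize method might look like the following sketch (the upper-case column name ALARME and VARRAY type name CALARMMESSLIST are assumptions based on the question; adjust them to your actual schema):

public void customize(ClassDescriptor descriptor) throws Exception {
    ObjectArrayMapping arrayMapping = new ObjectArrayMapping();
    arrayMapping.setAttributeName("alarme");
    // Reference the element class of the VARRAY, not the owning entity.
    arrayMapping.setReferenceClass(CAlarmMess.class);
    // The database column that holds the VARRAY.
    arrayMapping.setFieldName("ALARME");
    // The Oracle VARRAY type name.
    arrayMapping.setStructureName("CALARMMESSLIST");
    descriptor.addMapping(arrayMapping);
}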
You will also need to build an ObjectRelationalDataTypeDescriptor for the CAlarmMess Java class, much like you are building the ObjectArrayMapping, as I do not know if this is exposed through the JPA interface. A simple example from the EclipseLink automated tests is:
ObjectRelationalDataTypeDescriptor descriptor = new ObjectRelationalDataTypeDescriptor();
// SECTION: DESCRIPTOR
descriptor.setJavaClass(Phone.class);
Vector vector = new Vector();
vector.addElement("PolicyHolders");
descriptor.setTableNames(vector);
// SECTION: PROPERTIES
descriptor.descriptorIsAggregate();
descriptor.setStructureName("PHONE_TYPE");
descriptor.addFieldOrdering("PHONETYPE");
descriptor.addFieldOrdering("AREACODE");
descriptor.addFieldOrdering("PHONENUMBER");
descriptor.addDirectMapping("type", "getType", "setType", "PHONETYPE");
descriptor.addDirectMapping("areaCode", "getAreaCode", "setAreaCode", "AREACODE");
descriptor.addDirectMapping("number", "getNumber", "setNumber", "PHONENUMBER");
This descriptor will then need to be added to the session in a session customizer before it can be used by the ObjectArrayMapping.
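For example, a session customizer along these lines could register a descriptor for CAlarmMess (a sketch only; the structure name CALARMMESS and the upper-case field names are assumptions derived from the type definition in the question):

import org.eclipse.persistence.config.SessionCustomizer;
import org.eclipse.persistence.mappings.structures.ObjectRelationalDataTypeDescriptor;
import org.eclipse.persistence.sessions.Session;

public class CAlarmMessSessionCustomizer implements SessionCustomizer {

    @Override
    public void customize(Session session) throws Exception {
        ObjectRelationalDataTypeDescriptor descriptor = new ObjectRelationalDataTypeDescriptor();
        descriptor.setJavaClass(CAlarmMess.class);
        descriptor.descriptorIsAggregate();
        // The Oracle OBJECT type that the VARRAY elements consist of.
        descriptor.setStructureName("CALARMMESS");
        descriptor.addFieldOrdering("EREIGNISTYPID");
        descriptor.addFieldOrdering("EREIGNISKLASSEID");
        descriptor.addFieldOrdering("ALARMZEIT");
        descriptor.addDirectMapping("ereignisTypId", "EREIGNISTYPID");
        descriptor.addDirectMapping("ereignisKlasseId", "EREIGNISKLASSEID");
        descriptor.addDirectMapping("alarmZeit", "ALARMZEIT");
        // ... remaining ParamWert fields mapped the same way.
        session.getProject().addDescriptor(descriptor);
    }
}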
Normally, when using Spring Data repositories, an object in which the result data can be stored is needed, like the Customer in this example: https://spring.io/guides/gs/accessing-data-mongodb/.
In my case I'm trying to use an object which is declared in another project that I import using Maven; let's call it MyDoc. The object has an attribute Long id, while the document in MongoDB has an additional field _id of type ObjectId. This is the case because MongoDB serves as an archive and the actual id from MyDoc would not be unique.
In a service class I then use the MongoTemplate to make database queries like this:
List<MyDoc> list = template.findAll(MyDoc.class, "DOCS");
This fails with the following exception:
org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type [org.bson.types.ObjectId] to type [java.lang.Long]
How can I convert the ObjectId to a Long?
Alternatively, I could of course use the Java MongoDB driver directly, but I wanted to reduce the number of dependencies to maintain, since the MongoDB driver already comes with the spring-boot-starter-data-mongodb dependency, and I also hoped for a more intuitive way to interact with the database, as with the Spring Data repositories.
The first thing is that the Long id from MyDoc is not unique, so it cannot act as the _id in MongoDB.
You therefore need one more _id field in your class. By default, Spring Data MongoDB maps a field named id to _id in the database.
Essentially, what you can do is create a wrapper class around MyDoc by extending it, add a new field of type ObjectId, and annotate it with @Id. That way you will have a unique index, and MongoTemplate will not try to convert the _id of the database into the Long id.
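A minimal sketch of such a wrapper (the class name ArchivedMyDoc and the collection name DOCS are assumptions):

import org.bson.types.ObjectId;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

// Hypothetical wrapper around the imported MyDoc class.
@Document(collection = "DOCS")
public class ArchivedMyDoc extends MyDoc {

    // Maps to the MongoDB _id field, so the inherited Long id stays untouched.
    @Id
    private ObjectId mongoId;

    public ObjectId getMongoId() {
        return mongoId;
    }

    public void setMongoId(ObjectId mongoId) {
        this.mongoId = mongoId;
    }
}

The query then becomes template.findAll(ArchivedMyDoc.class, "DOCS"), and the ObjectId is no longer forced into the Long id.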
I'm using Apache Ignite with class annotations as described in "Query Configuration by Annotations".
How should we handle class changes? For example, what happens if, between v1 and v2 of my application, I add a new property?
Are previous values deserialized? Can I specify a default value?
I cannot find any documentation on this topic. I tried a simple use case, and it seems that new properties are null. How can I handle this?
UPDATE
Following suggestions from @dmagda, I tried to add a property to my class, adding it to the table using ALTER TABLE MYTABLE ADD COLUMN myNewProperty varchar; and then changing its value using UPDATE MYTABLE SET myNewProperty='myDefaultValue'.
But unfortunately, running the above UPDATE I get the exception: Error: class org.apache.ignite.binary.BinaryObjectException: Failed to unmarshal object with optimized marshaller (state=50000,code=0)
Is it possible to update existing records, setting the new fields, using SQL? How?
UPDATE 2
Solved my problem. It was caused by the fact that my class was written in Scala with some Scala-specific types (Map, ...). My app connects to Ignite in client mode, so when executing the UPDATE from the sqlline utility, Ignite was unable to deserialize those types.
I switched my class to a plain POJO, and now I'm able to update both the schema and the data.
Just update your Java class by adding the new field, and it will be stored and can be read back without any issue. You might see null as the value of the new field for two reasons:
It was not set to any specific value by your application
You're reading back from Ignite an old object that was stored before you updated your class, and thus the new field wasn't present there.
If you need to access the new field using SQL, then use the ALTER TABLE command to add the field to the SQL schema.
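For instance, a sketch of doing this programmatically through the Ignite SQL API (the cache name MYCACHE is an assumption; the table and column names are taken from the question):

import java.util.List;

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.query.SqlFieldsQuery;

public class AddFieldToSchema {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            IgniteCache<?, ?> cache = ignite.cache("MYCACHE");

            // Expose the new POJO field to the SQL schema.
            cache.query(new SqlFieldsQuery(
                    "ALTER TABLE MYTABLE ADD COLUMN myNewProperty varchar")).getAll();

            // Backfill a value for records stored before the field existed.
            List<List<?>> result = cache.query(new SqlFieldsQuery(
                    "UPDATE MYTABLE SET myNewProperty = 'myDefaultValue'")).getAll();
            System.out.println("Update result: " + result);
        }
    }
}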
I am trying to map a MySQL JSON column to a Java entity class, and I am looking for the cleanest way of doing this.
Upon doing some research, I found 3 possible ways:
Extending AbstractSingleColumnStandardBasicType
Create a custom UserType
Use an attribute Converter
I used an attribute converter to convert the JSON column from String (as the MySQL driver maps it to a String) to my required type; this works with both Hibernate v4.3.10 and v5.2.10.
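A sketch of such an attribute converter (Jackson is assumed to be on the classpath, and MyJsonPayload is a hypothetical target type):

import com.fasterxml.jackson.databind.ObjectMapper;

import javax.persistence.AttributeConverter;
import javax.persistence.Converter;

@Converter
public class JsonPayloadConverter implements AttributeConverter<MyJsonPayload, String> {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    @Override
    public String convertToDatabaseColumn(MyJsonPayload attribute) {
        try {
            // The MySQL driver exchanges the JSON column as a String.
            return attribute == null ? null : MAPPER.writeValueAsString(attribute);
        } catch (Exception e) {
            throw new IllegalArgumentException("Could not serialize JSON attribute", e);
        }
    }

    @Override
    public MyJsonPayload convertToEntityAttribute(String dbData) {
        try {
            return dbData == null ? null : MAPPER.readValue(dbData, MyJsonPayload.class);
        } catch (Exception e) {
            throw new IllegalArgumentException("Could not deserialize JSON column", e);
        }
    }
}

The entity field is then annotated with @Convert(converter = JsonPayloadConverter.class) and @Column(name = "json_type_column").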
I tried to find out whether JSON is natively supported in Hibernate and found the PR https://github.com/hibernate/hibernate-orm/pull/1395. Based on the PR, it looks like it adds a JSON mapping to the MySQL dialect, hence letting Hibernate know about the JSON column.
Does this mean I can use something like this to map to the JSON column in the DB?
@Column(name="json_type_column")
private Object correspondingJsonAttribute;
If I cannot use it like this and need to use one of the above 3 methods, is there a reason I would need to upgrade to get the registerColumnType( Types.JAVA_OBJECT, "json" ) call that is part of the PR and is present in Hibernate v5.2.10? Do I get any more features from v5.2.10 that support JSON columns?
I also looked into the corresponding test case to understand how the JSON column mapping is being done: https://github.com/hibernate/hibernate-orm/blob/master/hibernate-core/src/test/java/org/hibernate/test/bytecode/enhancement/access/MixedAccessTestTask.java. It uses the @Access annotation with property access, and it looks like it sets the corresponding JSON column variable in the entity to a Map after converting it from String.
Any help is much appreciated.
Thanks!
Upon doing some research, I found 3 possible ways:
Extending AbstractSingleColumnStandardBasicType
Create a custom UserType
Use an attribute Converter
AttributeConverter won't help you with this, but you can still use a custom UserType or Hibernate type descriptors.
Does this mean I can use something like this to map to the JSON column in the DB?
@Column(name="json_type_column")
private Object correspondingJsonAttribute;
No. The json type is just for JDBC so that Hibernate knows how to handle that JDBC object when setting a parameter on a PreparedStatement or when fetching a ResultSet.
Do I get any more features from v5.2.10 that support JSON columns?
No, but you just need to supply your own JSON type.
You can just use the hibernate-types project, which is available on Maven Central:
<dependency>
    <groupId>com.vladmihalcea</groupId>
    <artifactId>hibernate-types-52</artifactId>
    <version>${hibernate-types.version}</version>
</dependency>
Then use the provided JsonType from Hibernate Types, as it works on MySQL, PostgreSQL, Oracle, SQL Server, or H2 without any modifications.
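A sketch of how the mapping might then look (the entity name MyEntity and the Map payload are assumptions; the column name is taken from the question):

import java.util.Map;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;

import org.hibernate.annotations.Type;
import org.hibernate.annotations.TypeDef;

import com.vladmihalcea.hibernate.type.json.JsonType;

@Entity
@TypeDef(name = "json", typeClass = JsonType.class)
public class MyEntity {

    @Id
    private Long id;

    // Persisted into the MySQL JSON column and materialized as a Map by JsonType.
    @Type(type = "json")
    @Column(name = "json_type_column", columnDefinition = "json")
    private Map<String, Object> correspondingJsonAttribute;
}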
When I use IntelliJ to generate a persistence mapping from an existing database schema, it puts a catalog value in the @Table annotation. Unfortunately, the names of the database instances contain the names of the dev/test/prod environments, and while I can override the connection string with a map passed to the EntityManagerFactory, I still get Invalid object name 'BAR_DEV.dbo.FOO' when executing a query against the BAR_TEST instance.
Can I dynamically override the catalog value at runtime without doing a global search-and-replace to remove it manually after entity generation?
@Entity
@Table(name = "FOO", schema = "dbo", catalog = "BAR_DEV")
public class Foo{ /* ... */ }
No, it is not possible directly with standard JPA.
However, a solution I used in my project was to define multiple persistence units, one for each environment. You may override any database mapping in an orm.xml file, or even set the default catalog or schema for all entities. The next step is to dynamically retrieve the proper EntityManager: if you are using Java EE, I recommend injecting it with @Inject and creating a producer that returns the particular EM for the specified environment.
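A minimal sketch of such a producer (the persistence unit names barDev and barTest and the app.environment property are assumptions):

import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Produces;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

@ApplicationScoped
public class EntityManagerProducer {

    // Each persistence unit points at a different catalog (e.g. via its orm.xml defaults).
    @PersistenceContext(unitName = "barDev")
    private EntityManager devEm;

    @PersistenceContext(unitName = "barTest")
    private EntityManager testEm;

    @Produces
    public EntityManager entityManager() {
        // Choose the EM for the current environment; the property name is illustrative.
        return "test".equals(System.getProperty("app.environment")) ? testEm : devEm;
    }
}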
A non-portable, EclipseLink-only alternative: org.eclipse.persistence.dynamic.DynamicHelper.SessionCustomizer can replace many defaults at runtime.
EDIT: I don't have ready-made code for you. I use this approach:
public void customize(Session session) throws SQLException {
    // ...
    for (ClassDescriptor descriptor : session.getDescriptors().values()) {
        if (!descriptor.getTables().isEmpty()
                && descriptor.getAlias().equalsIgnoreCase(descriptor.getTableName())) {
            // Derive the runtime table name from the mapped class and a configured prefix.
            String tableName = TABLE_PREFIX + descriptor.getJavaClass().getSimpleName();
            descriptor.setTableName(tableName);
        }
    }
}
I was generating entity classes for tables using JPA Tools in Eclipse Mars. It generates this:
#Column(name="someColumn")
private String someColumn;
for a data member in the entity class whose column is of type varying(130) in the database. I have to manually edit every data member like this:
#Column(name="someColumn",length=130)
private String someColumn;
by adding the length to the @Column annotation. How can I configure JPA Tools to generate data members with the length automatically?