How can I execute a stored procedure with JPA & Spring Data? - java

I am trying to call the Terminal_GetTicket stored procedure in my database but keep getting the following exception:
PropertyReferenceException: No property getTicket found for type TicketInfo
I have cross validated my configuration with a very simple test entity and everything seems to work fine, however for the actual case, something is wrong.
Here is my domain entity (TicketInfo):
@Entity
@NamedStoredProcedureQuery(name = "TicketInfo.getTicket", procedureName = "Terminal_GetTicket", resultClasses = TicketInfo.class, parameters = {
    @StoredProcedureParameter(mode = ParameterMode.IN, name = "sys_id_game", type = Integer.class)})
public class TicketInfo {
    @Id @GeneratedValue
    private Long id;
    private String idTicket;
    private Integer externalTicketCode;
    private Short sequenseAlert;
    private Integer dlTimeStamp;
All the instance variables have their getters and setters properly defined and the stored procedure has a total of 5 output parameters matching the attributes of TicketInfo.
Furthermore, here is my repository interface:
public interface TicketInfoRepository extends CrudRepository<TicketInfo, Long> {
    @Transactional(timeout = 5)
    @Procedure
    TicketInfo getTicket(Integer sys_id_game);
}
Also, here is my context.xml file (for Spring):
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:jpa="http://www.springframework.org/schema/data/jpa"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:repository="http://www.springframework.org/schema/data/repository"
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-4.0.xsd
http://www.springframework.org/schema/context
http://www.springframework.org/schema/context/spring-context-4.1.xsd
http://www.springframework.org/schema/data/jpa
http://www.springframework.org/schema/data/jpa/spring-jpa-1.8.xsd
http://www.springframework.org/schema/data/repository
http://www.springframework.org/schema/data/repository/spring-repository-1.5.xsd">
<context:component-scan base-package="ar.com.boldt.godzilla" />
<jpa:repositories base-package="xx.xxx.xxx.godzilla.business.dao" />
<bean id="jpaVendorAdapter"
class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
<property name="showSql" value="${dataSource.show.sql}" />
<property name="generateDdl" value="false" />
<property name="database" value="SQL_SERVER" />
</bean>
<bean id="entityManagerFactory"
class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="jpaVendorAdapter" ref="jpaVendorAdapter" />
<!-- spring based scanning for entity classes -->
<property name="packagesToScan" value="xx.xxx.xxx.godzilla.business.dao" />
</bean>
<bean id="transactionManager" class="org.springframework.orm.jpa.JpaTransactionManager" />
<bean id="cacheManager" class="org.springframework.cache.ehcache.EhCacheCacheManager">
<property name="cacheManager" ref="ehcache" />
</bean>
<bean id="ehcache"
class="org.springframework.cache.ehcache.EhCacheManagerFactoryBean">
<property name="configLocation" value="classpath:ehcache.xml" />
</bean>
</beans>
And finally a watered-down version of the stored procedure itself:
ALTER PROCEDURE [Terminal_GetTicket](
    @arg int
    ,@res int output
    ,@res2 int output
)
as
Declare @error int
select 0, 1, 2
RETURN @error
Now, whenever I try setting the @Autowired annotation, I get the exception mentioned above.

I remember struggling with MS SQL stored procedures and spring-data-jpa myself. This is how I was able to run it successfully:
Model:
@NamedNativeQueries({
    @NamedNativeQuery(
        name = "yourInternalName",
        query = "EXEC [procedure_name] :param1, :param2",
        resultClass = Foo.class
    )
})
@Entity
public class Foo {
    /* Fields, getters, setters */
}
That's pretty straightforward. Note that this approach is different, though: you are not declaring the stored procedure directly (which is also why it may stop working if you decide to switch to a different RDBMS).
Then you have to extend your repository:
public interface FooRepositoryCustom {
    Foo fancyMethodName(String arg1, String arg2); // parameter types here are just an example
}
And directly implement it:
public class FooRepositoryImpl implements FooRepositoryCustom {

    @PersistenceContext
    EntityManager entityManager;

    @Override
    public Foo fancyMethodName(String arg1, String arg2) {
        Query query = entityManager.createNamedQuery("yourInternalName");
        query.setParameter("param1", arg1);
        query.setParameter("param2", arg2);
        return (Foo) query.getSingleResult();
    }
}
Let's put it all together:
public interface FooRepository extends CrudRepository<Foo, Long>, FooRepositoryCustom {
}
Note that if you decide to return, for example, a List of Foo objects, you only need to change the return type in your custom repository.
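For illustration, here is a minimal sketch of that List-returning variant, reusing the named query and the (hypothetical) method name from above:
import java.util.List;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.persistence.Query;

public interface FooRepositoryCustom {
    List<Foo> fancyMethodName(String arg1, String arg2);
}

public class FooRepositoryImpl implements FooRepositoryCustom {

    @PersistenceContext
    EntityManager entityManager;

    @Override
    @SuppressWarnings("unchecked")
    public List<Foo> fancyMethodName(String arg1, String arg2) {
        Query query = entityManager.createNamedQuery("yourInternalName");
        query.setParameter("param1", arg1);
        query.setParameter("param2", arg2);
        return query.getResultList(); // raw List from JPA, hence the unchecked conversion
    }
}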

I followed SirKometa's advice but could not get it to work, so I came up with something that worked for me and that I think is better from a syntax point of view. First create your entity class like below.
@NamedStoredProcedureQueries({
    @NamedStoredProcedureQuery(
        name = "MySP",
        procedureName = "my_sp",
        parameters = {
            @StoredProcedureParameter(mode = ParameterMode.IN, name = "arg", type = String.class)},
        resultClasses = Foo.class)
})
@Entity
public class Foo {
Then the implementation class of the repository would be:
@Component
public class FooRepositoryImpl implements FooCustomRepository {

    @PersistenceContext
    EntityManager entityManager;

    @Override
    public List<Foo> foo(String arg) {
        Query query = entityManager.createNamedStoredProcedureQuery("MySP");
        query.setParameter("arg", arg);
        return query.getResultList();
    }
}
The rest of the implementation is like SirKometa's answer above. Keep in mind that you also need an EntityManager bean configured in your application for this to work.
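For reference, a minimal Java-config sketch of such a setup (the package to scan is taken from the question; this is just the annotation-based equivalent of the context.xml shown above, not the only way to do it):
import javax.persistence.EntityManagerFactory;
import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;

@Configuration
@EnableJpaRepositories(basePackages = "xx.xxx.xxx.godzilla.business.dao")
public class JpaConfig {

    // Provides the EntityManagerFactory backing the @PersistenceContext EntityManager.
    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource dataSource) {
        LocalContainerEntityManagerFactoryBean emf = new LocalContainerEntityManagerFactoryBean();
        emf.setDataSource(dataSource);
        emf.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
        emf.setPackagesToScan("xx.xxx.xxx.godzilla.business.dao");
        return emf;
    }

    @Bean
    public JpaTransactionManager transactionManager(EntityManagerFactory emf) {
        return new JpaTransactionManager(emf);
    }
}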

Related

Spring MongoDB inserting unwanted objects

I haven't been able to find a single web page or other post about this issue, hence I'm posting here.
Within the documents I am storing to my mongodb, I have these things showing up:
"itemModifiers" : [
{
"val$implicitModifierString" : "16% increased Spell Damage",
"modifierName" : "16% increased Spell Damage"
}
]
The val$implicitModifierString is actually a variable from within my Java code, which was never set on the ItemModifier instance. Basically, when I set a field on one of the classes I am storing to MongoDB, any variable or object that I use to set that field is also getting stored to the database (or at least that is what it looks like to me!).
Here is some sample code of what the process looks like (if you hate maps, sorry; not really relevant here.):
public ItemModifier deriveModifier(final String modifier) {
    for (Pattern outerKey : tierMap.keySet()) {
        if (outerKey.matcher(modifier).matches()) {
            HashMap<Pattern, ItemModifierTier> innerMap = tierMap.get(outerKey);
            for (Pattern innerKey : innerMap.keySet()) {
                if (innerKey.matcher(modifier).matches()) {
                    Matcher innerMatcher = innerKey.matcher(modifier);
                    Double[] tierValues = new Double[innerMatcher.groupCount()];
                    innerMatcher.find();
                    for (int i = 1; i <= innerMatcher.groupCount(); i++) {
                        tierValues[i - 1] = Double.valueOf(innerMatcher.group(i));
                    }
                    return new ItemModifier() {{
                        setModifierName(modifier);
                        setModifierTerm(termMap.get(outerKey.pattern()));
                        setModifierTier(innerMap.get(innerKey));
                        setModifierType(itemModifierType);
                        setModifierValues(tierValues);
                    }};
                }
            }
        }
    }
    return null;
}
And here is the ItemModifier class (intentionally indexed every field because they are all queryable via a service; I have not yet created composite indexes but plan to once the issue at hand is sorted):
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
    "modifierName",
    "modifierTerm",
    "modifierType",
    "modifierTier",
    "modifierValues",
    "modifierAverage"
})
public class ItemModifier {

    @Indexed
    @JsonProperty("modifierName")
    private String modifierName;

    @Indexed
    @JsonProperty("modifierTerm")
    private String modifierTerm;

    @Indexed
    @JsonProperty("modifierType")
    private ItemModifierType modifierType;

    @Indexed
    @JsonProperty("modifierTier")
    private ItemModifierTier modifierTier;

    @Indexed
    @JsonProperty("modifierValues")
    private Double[] modifierValues;

    @Indexed
    @JsonProperty("modifierAverage")
    private Double modifierAverage;

    public ItemModifier() {
    }

    public String getModifierName() {
        return modifierName;
    }

    public void setModifierName(String modifierName) {
        this.modifierName = modifierName;
    }

    //... the other getters/setters
}
This ItemModifiers.class is held within an ItemDocument.class and is stored to the mongo database simply by invoking mongoOperations.insert(itemDocumentInstance);.
In case it matters, this is my mongoConfig.xml:
<beans xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xmlns="http://www.springframework.org/schema/beans"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo
http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd
http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-3.0.xsd">
<mongo:mongo host="127.0.0.1" port="27017"/>
<mongo:db-factory dbname="public-stash-api"/>
<bean id="mappingContext"
class="org.springframework.data.mongodb.core.mapping.MongoMappingContext"/>
<bean id="defaultMongoTypeMapper"
class="org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper">
<constructor-arg name="typeKey">
<null/>
</constructor-arg>
</bean>
<bean id="mappingMongoConverter"
class="org.springframework.data.mongodb.core.convert.MappingMongoConverter">
<constructor-arg name="mongoDbFactory" ref="mongoDbFactory"/>
<constructor-arg name="mappingContext" ref="mappingContext"/>
<property name="typeMapper" ref="defaultMongoTypeMapper"/>
</bean>
<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg name="mongoDbFactory" ref="mongoDbFactory"/>
<constructor-arg name="mongoConverter" ref="mappingMongoConverter"/>
</bean>
</beans>
Thank you in advance for your help!
You are mixing Jackson with MongoDB. The fact that MongoDB uses documents does not mean it is using Jackson. MongoDB stores documents in BSON (Binary JSON) format, but you can't customize the way your documents are stored by using Jackson annotations.
There are Spring Data MongoDB annotations (like org.springframework.data.mongodb.core.mapping.Field) for that very purpose. You actually used one of them in your code (org.springframework.data.mongodb.core.index.Indexed).
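For illustration, a short sketch of how those Spring Data annotations (not Jackson) control what ends up in the document; the class and field names here are just examples:
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.index.Indexed;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;

@Document(collection = "itemModifiers")   // Spring Data MongoDB mapping, not Jackson
public class ItemModifierDocument {

    @Id
    private String id;

    @Indexed
    @Field("modifierName")                // key name used inside the stored BSON document
    private String modifierName;

    // getters and setters omitted
}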
As it turns out, this is the culprit:
return new ItemModifier() {{
    setModifierName(modifier);
    setModifierTerm(termMap.get(outerKey.pattern()));
    setModifierTier(innerMap.get(innerKey));
    setModifierType(itemModifierType);
    setModifierValues(tierValues);
}};
The double-brace initialization is what caused the weird val$... fields to be persisted into the Mongo documents via Spring: it creates an anonymous inner class, and the local variables it captures show up as synthetic val$ fields on that class.
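A minimal sketch of the kind of fix this implies (build the object with plain setter calls instead of the anonymous-subclass trick), reusing the setters from the question:
ItemModifier result = new ItemModifier();   // plain instance: no anonymous subclass, no captured val$ fields
result.setModifierName(modifier);
result.setModifierTerm(termMap.get(outerKey.pattern()));
result.setModifierTier(innerMap.get(innerKey));
result.setModifierType(itemModifierType);
result.setModifierValues(tierValues);
return result;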

MyBatis nullPointerException when inserting null

I'm getting a MyBatis NPE when I try to insert a record with a null primary key and then get the key value back (an Oracle trigger sets the key).
FooMapper interface:
...
public void insertFooObject (final Foo foo);
...
FooMapper.xml:
...
<insert id="insertFooObject" useGeneratedKeys="true" keyColumn="foo_id" keyProperty="fooId">
insert into foos (foo_id, gcor_id, registration_date)
values (#{fooId, jdbcType=NUMERIC}, #{gcorId, jdbcType=NUMERIC}, #{registrationDate, jdbcType=DATE})
</insert>
...
Here's the model:
public class Foo {
private final Integer fooId;
private final Integer gcorId;
private final Date registrationDate;
public Foo(final Integer fooId, final Integer gcorId, final Date registrationDate) {
this.fooId = fooId;
this.gcorId = gcorId;
this.registrationDate = registrationDate;
}
...
And here's the call:
...
Foo foo = new Foo(null, 229, null);
fooMapper.insertFooObject(foo);
...
fooMapper is injected by Spring and can be used successfully for other SQL statements. When I pass in a number for fooId and registrationDate is null, everything works. When fooId is null (as shown), I get the error:
org.mybatis.spring.MyBatisSystemException: nested exception is org.apache.ibatis.exceptions.PersistenceException:
Error updating database. Cause: java.lang.NullPointerException
The error may involve defaultParameterMap
The error occurred while setting parameters
SQL: insert into foos (foo_id, gcor_id, registration_date) values (?, ?, ?)
Cause: java.lang.NullPointerException
Here is my config file:
<?xml version="1.0" encoding="UTF-8"?>
<beans
xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:util="http://www.springframework.org/schema/util"
xsi:schemaLocation="
http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/util
http://www.springframework.org/schema/util/spring-util-3.0.xsd
">
<bean id="datasource" class="org.apache.ibatis.datasource.pooled.PooledDataSource">
<property name="driver" value="oracle.jdbc.driver.OracleDriver"/>
<property name="url" value="jdbc:oracle:thin:#......"/>
<property name="username" value="..."/>
<property name="password" value="..."/>
</bean>
<bean id="sqlSessionFactory" class="org.mybatis.spring.SqlSessionFactoryBean">
<property name="dataSource" ref="datasource"/>
</bean>
<bean id="fooMapper" class="org.mybatis.spring.mapper.MapperFactoryBean">
<property name="mapperInterface" value="com.foo.FooMapper" />
<property name="sqlSessionFactory" ref="sqlSessionFactory" />
</bean>
<bean id="transactionManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
<property name="dataSource" ref="datasource" />
</bean>
</beans>
Any ideas how to correct this? I didn't have luck googling and thought I'd be okay by specifying jdbcType. Thanks!
I believe you are using useGeneratedKeys with Oracle, which Oracle does not support. Generated keys are supported by MySQL and PostgreSQL, I believe. Instead, use selectKey to fetch the next value, or, since you are okay with passing a null fooId, remove useGeneratedKeys (it defaults to false). You can get the next value from the sequence:
<selectKey keyProperty="fooId" resultType="int" order="BEFORE">
    SELECT foo_id_seq.NEXTVAL FROM DUAL
</selectKey>
I found my problem in the model class (Foo.java):
public int getFooId() {
return fooId;
}
fooId is an Integer and not an int. By changing the getter return type to Integer, I was able to insert a null primary key and get the trigger-created sequence id back.
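In other words, the accessor's return type has to match the nullable field; a sketch of the corrected getter:
public Integer getFooId() {
    // Integer, not int: returning a null fooId through an int getter auto-unboxes and throws the NPE
    return fooId;
}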

SpringBatch Jaxb2Marshaller: different name of class and xml attribute

I am trying to read an XML file as input for Spring Batch:
Java Class:
package de.example.schema.processes.standardprocess;

@XmlAccessorType(XmlAccessType.FIELD)
@XmlType(name = "Process", namespace = "http://schema.example.de/processes/process", propOrder = {
    "input"
})
public class Process implements Serializable {

    @XmlElement(namespace = "http://schema.example.de/processes/process")
    protected ProcessInput input;

    public ProcessInput getInput() {
        return input;
    }

    public void setInput(ProcessInput value) {
        this.input = value;
    }
}
SpringBatch dev-job.xml:
<bean id="exampleReader" class="org.springframework.batch.item.xml.StaxEventItemReader" scope="step">
<property name="fragmentRootElementName" value="input" />
<property name="resource"
value="file:#{jobParameters['dateiname']}" />
<property name="unmarshaller" ref="jaxb2Marshaller" />
</bean>
<bean id="jaxb2Marshaller" class="org.springframework.oxm.jaxb.Jaxb2Marshaller">
<property name="classesToBeBound">
<list>
<value>de.example.schema.processes.standardprocess.Process</value>
<value>de.example.schema.processes.standardprocess.ProcessInput</value>
...
</list>
</property>
</bean>
Input file:
<?xml version="1.0" encoding="UTF-8"?>
<process:process xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:process="http://schema.example.de/processes/process">
<process:input>
...
</process:input>
</process:process>
It fires the following exception:
[javax.xml.bind.UnmarshalException: unexpected element (uri:"http://schema.example.de/processes/process", local:"input"). Expected elements are <<{http://schema.example.de/processes/process}processInput>]
at org.springframework.oxm.jaxb.JaxbUtils.convertJaxbException(JaxbUtils.java:92)
at org.springframework.oxm.jaxb.AbstractJaxbMarshaller.convertJaxbException(AbstractJaxbMarshaller.java:143)
at org.springframework.oxm.jaxb.Jaxb2Marshaller.unmarshal(Jaxb2Marshaller.java:428)
If I change <process:input> to <process:processInput> in the xml it works fine. Unfortunately I can change neither the xml nor the java class.
Is there a possibility to make Jaxb2Marshaller map the element 'input' to the class 'ProcessInput'?
I don't believe JAXB allows this. JAXB is a binding API, so it doesn't provide much in the way of customization. That being said, you can use XStream and provide aliases for what you need, allowing you to customize the mapping of XML to object however you want. You can see an XStream example here: https://github.com/spring-projects/spring-batch/blob/master/spring-batch-samples/src/main/resources/jobs/iosample/xml.xml
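For illustration, a hedged sketch of what an alias-based XStream configuration could look like in place of the jaxb2Marshaller bean above (the alias names are assumptions based on the input file):
<bean id="xstreamMarshaller" class="org.springframework.oxm.xstream.XStreamMarshaller">
    <property name="aliases">
        <map>
            <!-- map XML element names onto the existing classes -->
            <entry key="process" value="de.example.schema.processes.standardprocess.Process"/>
            <entry key="input" value="de.example.schema.processes.standardprocess.ProcessInput"/>
        </map>
    </property>
</bean>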

DynamoDB and TableNameOverride with prefix

I am testing DynamoDB tables and want to set up different table names for prod and dev environment using the prefix "dev_" for development.
I made this test to print the table name:
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig.TableNameOverride;
TableNameOverride tbl = new TableNameOverride("test").withTableNamePrefix("dev_");
System.out.println("name=" + tbl.getTableName() + " prefix=" + tbl.getTableNamePrefix());
This prints: name=null prefix=dev_
How come the name here is null ?
TableNameOverride tbl = new TableNameOverride("test");//.withTableNamePrefix("dev_");
System.out.println("name=" + tbl.getTableName() + " prefix=" + tbl.getTableNamePrefix());
This prints: name=test prefix=null
How can I get the table name to be "dev_test"?
I want to use this later to get a "dev_" prefix for all tables in development mode like this:
DynamoDBTable annotation = (DynamoDBTable) myclass.getClass().getAnnotation(DynamoDBTable.class);
TableNameOverride tbl = new TableNameOverride(annotation.tableName()).withTableNamePrefix("dev_");
Or is there another solution to separate between dev and prod tables?
I first thought of putting them in separate regions but not sure about this.
Could also use this:
mapper.save(ck, new DynamoDBMapperConfig(new TableNameOverride((isDev ? "dev_" : "") + annotation.tableName())));
withTableNamePrefix is a static method. So this line is creating a new instance of TableNameOverride with the String "test", and then throwing that instance away by using it to call the static withTableNamePrefix method:
TableNameOverride tbl = new TableNameOverride("test").withTableNamePrefix("dev_");
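A minimal sketch of the prefix-based usage this implies; the table name "test" comes from the question, while the client variable and mapper wiring are assumptions:
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig.TableNameOverride;

// Call the static factory directly; nothing from new TableNameOverride("test") is kept.
TableNameOverride override = TableNameOverride.withTableNamePrefix("dev_");

// A mapper built with this config resolves the @DynamoDBTable name "test" to "dev_test".
DynamoDBMapperConfig config = new DynamoDBMapperConfig(override);
DynamoDBMapper mapper = new DynamoDBMapper(amazonDynamoDBClient, config);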
To answer the deeper question of separating test from prod, I would recommend having 2 separate AWS Accounts entirely, one for dev and one for prod. This is the only way you can:
See billing separately
Ensure you never leak data between prod and test systems
Ensure high scaling on a dev table never prevents you from scaling a prod table higher
I've faced the same situation and struggled for a couple of days to get this working.
Just in case you're using DynamoDB + Spring, here is what worked for me:
POJO class:
@DynamoDBTable(tableName = "APP-ACCESSKEY")
public class AccessKey {

    @NotBlank
    @Size(min = 1, max = 36)
    private String accessToken;

    @NotNull
    @Size(min = 3, max = 15)
    private String userName;

    private Date dateInsertion;

    public AccessKey() {
    }

    // ... all the POJO stuff
}
Spring configuration:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-3.1.xsd">
<!-- Amazon Credentials -->
<bean id="tableNameOverride" class="org.springframework.beans.factory.config.MethodInvokingFactoryBean">
<property name="staticMethod" value="com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix"/>
<property name="arguments" value="DES-" />
</bean>
<bean id="dynamoDBMapperConfig" class="com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig">
<constructor-arg index="0" ref="tableNameOverride" />
</bean>
<bean id="BasicAWSCredentials" class="com.amazonaws.auth.BasicAWSCredentials">
<constructor-arg index="0" value="${amazon.accessKey}" />
<constructor-arg index="1" value="${amazon.secretKey}" />
</bean>
<bean id="amazonDynamoDBClient" class="com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient">
<constructor-arg index="0" ref="BasicAWSCredentials" />
<property name="endpoint" value="http://dynamodb.us-west-2.amazonaws.com" />
</bean>
<bean id="dynamoDBMapper" class="com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper">
<constructor-arg index="0" ref="amazonDynamoDBClient" />
<constructor-arg index="1" ref="dynamoDBMapperConfig" />
</bean>
</beans>
Explanation:
Given that my AccessKey object points to the APP-ACCESSKEY table on AWS DynamoDB, after running this your application will start to point to DES-APP-ACCESSKEY.
Hope it helps someone facing a similar situation.
Cheers
Same as Paolo Almeida's solution, but with Spring Boot annotations.
Just wanted to share it and maybe save someone some time:
I have DynamoDB tables for each namespace, e.g. myApp-dev-UserTable and myApp-prod-UserTable, and I am using the EKS_NAMESPACE env variable, which in my case gets injected into the pods by Kubernetes.
import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.auth.WebIdentityTokenCredentialsProvider;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

// @EnableDynamoDBRepositories is provided by the (community) spring-data-dynamodb project.
@Configuration
@EnableDynamoDBRepositories(basePackages = "de.dynamodb")
public class DynamoDBConfig {

    @Value("${EKS_NAMESPACE}")
    String eksNamespace;

    @Bean
    public AmazonDynamoDB amazonDynamoDB() {
        return AmazonDynamoDBClientBuilder.standard()
                .withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(
                        "dynamodb.eu-central-1.amazonaws.com", "eu-central-1"))
                .withCredentials(awsCredentials())
                .build();
    }

    @Bean
    public AWSCredentialsProvider awsCredentials() {
        return WebIdentityTokenCredentialsProvider.builder().build();
    }

    // Table name override:
    @Bean
    public DynamoDBMapperConfig.TableNameOverride tableNameOverride() {
        return DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix("myApp-" + eksNamespace + "-");
    }

    @Bean
    public DynamoDBMapperConfig dynamoDBMapperConfig() {
        return DynamoDBMapperConfig.builder().withTableNameOverride(tableNameOverride()).build();
    }

    // Marked as primary bean to override the default mapper bean.
    @Bean
    @Primary
    public DynamoDBMapper dynamoDBMapper() {
        return new DynamoDBMapper(amazonDynamoDB(), dynamoDBMapperConfig());
    }
}
With a table like this:
@Data
@DynamoDBTable(tableName = "UserTable")
public class User {

    @DynamoDBHashKey
    private String userId;

    @DynamoDBAttribute
    private String foo;

    @DynamoDBAttribute
    private String bar;
}
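For illustration, a short usage sketch under the assumption that EKS_NAMESPACE is "dev" and the beans above are wired; the service method itself is hypothetical:
// dynamoDBMapper is the @Primary bean defined in DynamoDBConfig above.
public void saveUser(DynamoDBMapper dynamoDBMapper) {
    // The override turns @DynamoDBTable("UserTable") into the physical table "myApp-dev-UserTable".
    User user = new User();
    user.setUserId("42");       // setter generated by Lombok's @Data
    dynamoDBMapper.save(user);
}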

No endpoint mapping found for..., using SpringWS, JaxB Marshaller

I get this error: No endpoint mapping found for [SaajSoapMessage {http://mycompany/coolservice/specs}ChangePerson]
Following is my ws config file:
<bean class="org.springframework.ws.server.endpoint.mapping.PayloadRootAnnotationMethodEndpointMapping">
<description>An endpoint mapping strategy that looks for #Endpoint and #PayloadRoot annotations.</description>
</bean>
<bean class="org.springframework.ws.server.endpoint.adapter.MarshallingMethodEndpointAdapter">
<description>Enables the MessageDispatchServlet to invoke methods requiring OXM marshalling.</description>
<constructor-arg ref="marshaller"/>
</bean>
<bean id="marshaller" class="org.springframework.oxm.jaxb.Jaxb2Marshaller">
<property name="contextPaths">
<list>
<value>org.company.xml.persons</value>
<value>org.company.xml.person_allextensions</value>
<value>generated</value>
</list>
</property>
</bean>
<bean id="persons" class="com.easy95.springws.wsdl.wsdl11.MultiPrefixWSDL11Definition">
<property name="schemaCollection" ref="schemaCollection"/>
<property name="portTypeName" value="persons"/>
<property name="locationUri" value="/ws/personnelService/"/>
<property name="targetNamespace" value="http://mycompany/coolservice/specs/definitions"/>
</bean>
<bean id="schemaCollection" class="org.springframework.xml.xsd.commons.CommonsXsdSchemaCollection">
<property name="xsds">
<list>
<value>/DataContract/Person-AllExtensions.xsd</value>
<value>/DataContract/Person.xsd</value>
</list>
</property>
<property name="inline" value="true"/>
</bean>
I then have the following files:
public interface MarshallingPersonService {
public final static String NAMESPACE = "http://mycompany/coolservice/specs";
public final static String CHANGE_PERSON = "ChangePerson";
public RespondPersonType changePerson(ChangePersonType request);
}
and
@Endpoint
public class PersonEndPoint implements MarshallingPersonService {

    @PayloadRoot(localPart = CHANGE_PERSON, namespace = NAMESPACE)
    public RespondPersonType changePerson(ChangePersonType request) {
        System.out.println("Received a request, is request null? " + (request == null ? "yes" : "no"));
        return null;
    }
}
I am pretty much new to web services and not very comfortable with annotations. I am following a tutorial on setting up the JAXB marshaller in Spring-WS. I would rather use XML mappings than annotations, but either way I am currently getting the error message above.
EDIT: ChangePersonType
@XmlAccessorType(XmlAccessType.FIELD)
@XmlType(name = "ChangePersonType", propOrder = {
    "applicationArea",
    "dataArea"
})
public class ChangePersonType {

    @XmlElement(name = "ApplicationArea", namespace = "http://mycompany/coolservice/specs", required = true)
    protected TransApplicationAreaType applicationArea;

    @XmlElement(name = "DataArea", namespace = "http://mycompany/coolservice/specs", required = true)
    protected DataArea dataArea;

    @XmlAttribute(required = true)
    @XmlJavaTypeAdapter(NormalizedStringAdapter.class)
    protected String releaseID;

    @XmlAttribute
    @XmlJavaTypeAdapter(NormalizedStringAdapter.class)
    protected String versionID;

-- The rest are getters and setters.
I solved it. The parameter of the endpoint method and the return value had to be wrapped in a JAXBElement, e.g. JAXBElement<ChangePersonType>.
The reason is:
The classes generated by JAXB2 from your schema come in two flavors: those that have an @XmlRootElement annotation, which can be used directly as either parameter or response, and those that haven't. The classes without this annotation need to be wrapped in a JAXBElement.
Besides the generated classes from your schema, JAXB2 also generates an ObjectFactory class, which clarifies the use of JAXBElement. There are some factory methods there which illustrate how you can use the various schema types.
Arjen Poutsma
http://forum.springsource.org/showthread.php?t=49817
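A hedged sketch of what the wrapped endpoint signature looks like in this situation; the createRespondPerson factory method name and the no-arg RespondPersonType constructor are assumptions, since the real names come from the JAXB2-generated ObjectFactory:
import javax.xml.bind.JAXBElement;
import org.springframework.ws.server.endpoint.annotation.Endpoint;
import org.springframework.ws.server.endpoint.annotation.PayloadRoot;

@Endpoint
public class PersonEndPoint {

    // Generated by JAXB2 alongside the schema classes.
    private final ObjectFactory factory = new ObjectFactory();

    @PayloadRoot(localPart = MarshallingPersonService.CHANGE_PERSON,
                 namespace = MarshallingPersonService.NAMESPACE)
    public JAXBElement<RespondPersonType> changePerson(JAXBElement<ChangePersonType> request) {
        ChangePersonType payload = request.getValue();     // unwrap the incoming element
        RespondPersonType response = new RespondPersonType();
        // ... populate the response from the payload ...
        return factory.createRespondPerson(response);      // hypothetical ObjectFactory method
    }
}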
