Dealing with shared namespaces across multiple WSDLs (XMLBeans) - java

I have five WSDLs, some of which share namespaces. I generate client code from them (databinding with XMLBeans). Separately they compile fine, and I create a JAR file from each set of generated client code.
Once I try to use all the JAR files within one project, I get naming/compile conflicts.
I want to reuse as much as possible. Is there a smart way to deal with this (other than giving each client its own node in the package structure)?

The XMLBeans (2.x) FAQ notes the limitations of xsdconfig namespace mapping:
Note: XMLBeans doesn’t support using two or more sets of java classes (in different packages) mapped to schema types/elements that have the same names and target namespaces, using all in the same class loader. Depending on the direction you are using for the java classes to schema types mapping, some features might not work correctly. This is because even though the package names for the java classes are different, the schema location for the schema metadata (.xsb files) is the same and contains the corresponding implementing java class, so the JVM will always pick up the first on the classpath. This can be avoided if multiple class loaders are used.
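The class-loader point from the FAQ can be illustrated with a small stdlib sketch. The temp directories below stand in for two hypothetical generated client jars that both contain a metadata file at the same path (as XMLBeans .xsb files do): with a single loader over both, the first classpath entry always wins; with separate loaders, each client sees its own copy.

```java
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;

public class XsbIsolationDemo {
    static String read(ClassLoader cl, String name) throws Exception {
        try (var in = cl.getResourceAsStream(name)) {
            return new String(in.readAllBytes());
        }
    }

    public static void main(String[] args) throws Exception {
        // Simulate two generated client jars that both contain a metadata
        // file at the same path (hypothetical stand-in for .xsb files).
        Path a = Files.createTempDirectory("client-a");
        Path b = Files.createTempDirectory("client-b");
        Files.writeString(a.resolve("type.xsb"), "from client A");
        Files.writeString(b.resolve("type.xsb"), "from client B");

        // One loader over both roots: the first entry always wins,
        // which is exactly the collision the FAQ describes.
        ClassLoader combined = new URLClassLoader(
                new URL[]{a.toUri().toURL(), b.toUri().toURL()}, null);
        System.out.println(read(combined, "type.xsb"));

        // A separate loader per client: each sees its own metadata.
        ClassLoader onlyB = new URLClassLoader(
                new URL[]{b.toUri().toURL()}, null);
        System.out.println(read(onlyB, "type.xsb"));
    }
}
```

This is why the FAQ's "multiple class loaders" escape hatch works: the .xsb lookup is a resource lookup, and each loader resolves it against its own jar.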

Related

Generating AVRO classes in specific package

I have two .avsc files with matching namespaces and field names. When generating, the classes generated from the first schema are being overwritten by classes from the second schema. Is there a way to generate classes in a specific directory, but only for one of the .avsc files?
If I change the namespace in the Avro schema everything is great, but Kafka messages aren't being read and I get the following error:
Could not find class com.package.TestClaim
Obviously, because avro's namespace after change is com.package.test_claim.TestClaim
This is what's generated when I added *.test_claim to the namespace of one of the schemas.
Regarding "in a specific directory, but only for one of the .avsc files?": that's what the namespace controls. It isn't overridable elsewhere, so yes, two files with the same namespace and record names will conflict if compiled separately.
It's unclear how you are compiling the Java classes, but if you use Avro IDL rather than AVSC, you can declare/combine many record types under a single namespace in the protocol (which will write the classes to a single folder).
And if you need different/nested packaging, that is available too:
#namespace("org.apache.avro.firstNamespace")
protocol MyProto {
  #namespace("org.apache.avro.someOtherNamespace")
  record Foo {}
}

Load external java function through XQuery with Saxon

I can access Java classes and methods when executing XQuery with Saxon, as long as they are declared correctly (through a namespace pointing to the package and class). But I wonder if there is a way to create a kind of "dynamic" classpath at each run: load an external jar file and search for classes in it, instead of in the current project/program classpath (as I cannot add every possible class to it).
So for instance I have something like:
declare namespace dpr="java:com.*****.atm.dpr.common.util.DPRConfigurationLoader";
declare variable $rules as node()* := doc(dpr:getApplicationProperty('Common','RulesFileLocation'))//category;
I could replace the real class with an emulated version, but that means I must create each possible class on my side (not really a good approach, as it means a "patch" for each new Java call...).
So if I provide a jar containing the classes I need, is there a way to load it so that the namespace points to it?
I know I can load .class files if they are on the classpath, but can I load 3 jar files in their entirety?
Thanks.
Technically, Saxon doesn't require external classes to be on the classpath - it requires them to be accessible using the appropriate ClassLoader. If you understand ClassLoaders and are prepared to write your own or configure third-party offerings, then you can load classes from anywhere. All the hooks are there in Saxon if you want to do such things; but there's nothing packaged with the product.
Some of the things you could try include:
With Configuration.setDynamicLoader() you can change the way Saxon does dynamic loading of external classes, including the classes used for Java extension functions.
With Configuration.getDynamicLoader().setClassLoader() you could provide a different ClassLoader for loading classes, for example a URLClassLoader.
With ProfessionalConfiguration.setExtensionBinder("java", XXX) you could register a customized JavaExtensionLibrary, typically as a subclass of the standard one, allowing you to change the way URIs are mapped to Java classes and the way methods are selected (for example)
This is all very low-level system programming and is not for the faint-hearted.
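As a sketch of the ClassLoader route: build a URLClassLoader over the extension jar and hand it to Saxon. The jar path below is hypothetical, and the Saxon wiring appears only as a comment; the runnable part uses the stdlib alone and shows that requests the jar cannot satisfy are delegated to the parent loader.

```java
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

public class ExtensionLoaderDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical jar containing the extension function classes.
        // Building the URL does not require the file to exist.
        URL extJar = new File("lib/dpr-extensions.jar").toURI().toURL();

        try (URLClassLoader loader = new URLClassLoader(
                new URL[]{extJar},
                ExtensionLoaderDemo.class.getClassLoader())) {
            // Classes not found in the jar are delegated to the parent,
            // so ordinary JDK classes still resolve through this loader.
            Class<?> c = loader.loadClass("java.util.ArrayList");
            System.out.println(c.getName());

            // With Saxon you would then do something like:
            //   configuration.getDynamicLoader().setClassLoader(loader);
            // before compiling the query, so that java:... namespaces
            // resolve against the jar. (Shown as a comment only; consult
            // the Saxon documentation for your release.)
        }
    }
}
```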

How to manage JARs with third-party implementations in configuration classes?

Assume that
I like to manage the configuration of a Java application in the form of one or more classes referencing each other, which I de/serialize to/from XML, because I like how that saves a lot of work.
I have a Java project with the interfaces and the application packaged in different JARs, where the interfaces are designed so that third parties can implement them and the user can load the implementations at runtime through a fancy GUI. Configuration classes exist in the form of interfaces and thus can occur in the serialized XML.
I would like to have only one configuration file controlling the paths of all resources (I'll probably have to give that up, but I'm curious about your answers). It is searched for in a default location, created with default values if it does not exist, or can be specified on the command line.
How would I go about getting the information about the location of third-party implementations before loading the configuration and still keep one clean configuration file only?
I parse the XML once using XStream, with XStream.omitField for the field which contains instances of configuration classes that need to be loaded from a referenced JAR.
Then I read the JAR locations from the partial configuration and load the classes.
Then I overwrite the partially parsed configuration with a completely deserialized one (using a new instance of XStream).
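The two-pass idea above can be sketched without XStream, using the JDK's DOM parser as a stand-in (the element names, jar path, and plugin class name are all made up for illustration): the first pass reads only the jar location while skipping the plugin-typed section, and the second pass reads the full configuration once the plugin classes would be loadable.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import java.io.ByteArrayInputStream;

public class TwoPassConfigDemo {
    // Hypothetical config: a plugin jar location plus a section whose
    // type lives inside that jar.
    static final String XML =
        "<config>"
      + "<pluginJar>plugins/extra.jar</pluginJar>"
      + "<pluginSettings type='com.example.ExtraSettings'>42</pluginSettings>"
      + "</config>";

    public static void main(String[] args) throws Exception {
        var doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(XML.getBytes()));

        // Pass 1: read only the jar location; the plugin section is
        // ignored (XStream.omitField plays this role in the answer above).
        String jar = doc.getElementsByTagName("pluginJar")
                .item(0).getTextContent();
        System.out.println("would load: " + jar);

        // Pass 2: after creating a URLClassLoader over the jar, the full
        // configuration can be deserialized, since the plugin classes
        // are now resolvable. Here we simply read the remaining section.
        String pluginValue = doc.getElementsByTagName("pluginSettings")
                .item(0).getTextContent();
        System.out.println("plugin value: " + pluginValue);
    }
}
```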

What's the correct or proper way to specify XSD schemaLocation across projects?

Say I have two projects, A and B. Java projects, in case that's important.
Project A contains a bunch of XSD files that represent core types and elements. They are all placed in a package called, say, "definition". This gets built into project-a.jar.
Project B represents an extension project, and it's allowed to define its own types and elements. I created a new schema and placed it in a "definition.extension" package. This gets built into project-b.jar.
Now, for the XSDs in Project B, what exactly should I put as the schemaLocation for an include?
schemaLocation="../core-types.xsd" didn't quite work (I know, it needs a URI), but what exactly is the proper or standard approach to this? Google found me more people asking this question than clear-cut, standard approaches on what really is the correct way to handle this.
It can't be that I have to programmatically adjust the schemaLocation at runtime... or that I'd need a build step/script that dynamically replaces the schemaLocation during compilation... right?
I'm not looking for answers like "put them in a shared location". I'm looking for something more along the lines of a dev environment that uses relative references instead of hardcoded references.
FYI, I'm using IntelliJ IDEA, in case there's an IDE-specific approach.
If you just want IntelliJ to stop showing your includes in red, you can use some custom URI in your include. You then go to Project Settings -> Schemas and DTDs, where you can map this URI onto a local file.
If you need to do schema validation at run time, that's a different story. You probably need to use an XML Catalog. If you're using JAXB, you should have a look at this question: jaxb - how to map xsd files to URL to find them
You should use XML Catalogs. This link gives a thorough introduction to XML catalogs - and how to use them in Java for instance - by XML expert Norman Walsh. Quote:
These catalog files can be used to map public and system identifiers and other URIs to local files (or just other URIs).
The aforementioned identifiers are typically the schema locations or namespaces you use in schema imports.
When using such catalogs, in order to avoid confusion and a bug in XJC, I strongly recommend you remove all schemaLocations from the schema imports in your XML schemas and keep only the namespace (if you have a choice, of course). For example:
<import namespace="http://www.w3.org/1999/xlink" />
Then specify the mappings for each namespace to the actual schema location in the catalog. For example, using the OASIS XML catalog format:
<?xml version="1.0" encoding="UTF-8"?>
<catalog xmlns="urn:oasis:names:tc:entity:xmlns:xml:catalog">
<uri name="http://www.w3.org/1999/xlink" uri="w3c/1999/xlink.xsd" />
</catalog>
At compilation time, if you are generating JAXB-annotated classes from the schemas, I recommend you use Episodes to achieve separate schema compilation, a.k.a. modular schema compilation. This is supported by maven-jaxb2-plugin for instance, which also has advanced support for catalogs.
For runtime, depending on your XML use case, you should try to use a library with native support for XML catalogs, such as most Java web service frameworks (JAX-WS RI/Metro, Apache CXF...) if you are developing web services for example. If you can't or if you want finer control over the XML catalog (e.g. being able to load schemas from the classpath), I invite you to look at the XML Entity and URI Resolvers page mentioned earlier, especially the sections Using Catalogs with Popular Applications and Adding Catalog Support to Your Applications. Basically, you play with org.apache.xml.resolver.tools.CatalogResolver and optionally (for finer control) org.apache.xml.resolver.CatalogManager classes. For concrete examples of custom CatalogResolver/CatalogManager, you may look at code sources from Apache commons-configuration, AuthzForce, CXF, etc.
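As a runnable sketch of catalog resolution, here is the same kind of mapping done with the JDK's built-in javax.xml.catalog API (available since Java 9) rather than Apache xml-resolver; the file names are made up. A catalog like the one shown above is written to a temp directory, and the resolver maps the XLink namespace URI to the local file.

```java
import javax.xml.catalog.CatalogFeatures;
import javax.xml.catalog.CatalogManager;
import javax.xml.catalog.CatalogResolver;
import javax.xml.transform.Source;
import java.nio.file.Files;
import java.nio.file.Path;

public class CatalogDemo {
    public static void main(String[] args) throws Exception {
        // Write a minimal OASIS catalog mapping the XLink namespace
        // to a local schema file (placeholder content).
        Path dir = Files.createTempDirectory("catalog-demo");
        Files.writeString(dir.resolve("xlink.xsd"), "<schema/>");
        Path catalog = dir.resolve("catalog.xml");
        Files.writeString(catalog,
            "<catalog xmlns=\"urn:oasis:names:tc:entity:xmlns:xml:catalog\">\n"
          + "  <uri name=\"http://www.w3.org/1999/xlink\" uri=\"xlink.xsd\"/>\n"
          + "</catalog>");

        // Resolve the namespace URI through the catalog; the relative
        // uri attribute is resolved against the catalog's own location.
        CatalogResolver resolver = CatalogManager.catalogResolver(
                CatalogFeatures.defaults(), catalog.toUri());
        Source resolved = resolver.resolve("http://www.w3.org/1999/xlink", null);
        System.out.println(resolved.getSystemId());
    }
}
```

If you are stuck on Java 8 or need the extra hooks described above, the Apache org.apache.xml.resolver.tools.CatalogResolver route works the same way conceptually.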

Options for JAXB 2.1 Bindings Customization

I am working on generating Java objects from an XSD file using JAXB 2.1. The XSD file has several elements in it representing business model entities, with common names like Account, etc. The system which is using the generated files to unmarshal the XML has several conflicting class names in its domain model. Though we can use different package names to get around class name conflicts, I think it will be more readable/maintainable to have objects of different names.
Because of this, I'd like to alter the XJC compilation so that it produces classes like DataTransferAccount.java instead of Account.java. Super, I'll use one of the two options JAXB provides when binding a schema (http://java.sun.com/webservices/docs/2.0/tutorial/doc/JAXBUsing4.html):
Inline Customizations - Annotate the XSD itself using the jaxb namespace to specify class names
External Bindings File - Provide an extra file to the XJC which has rules on how to map schema elements to java classes
Is there a good argument for using option 1, aside from the ease of use? Naively, I'm tempted to use it because it is easy, but down the road I can see maintenance headaches if we decide to move away from JAXB XML unmarshalling.
Your instincts are good - the only situation in which I'd consider adding inline annotations to the schema is if you or your developers were the ones responsible for maintaining the schema itself.
If the schema is someone else's, and there's any danger of it changing in future, then resist the temptation - use an external binding customization. Yes, it's a bit awkward to use, but it's worth the effort.
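For example, a minimal external bindings file for the Account case mentioned above might look like this (the schema file name is an assumption; jaxb:class renames the generated class without touching the XSD):

```xml
<jaxb:bindings version="2.1"
               xmlns:jaxb="http://java.sun.com/xml/ns/jaxb"
               xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <!-- "business-model.xsd" is a placeholder for your actual schema file -->
  <jaxb:bindings schemaLocation="business-model.xsd"
                 node="//xs:complexType[@name='Account']">
    <jaxb:class name="DataTransferAccount"/>
  </jaxb:bindings>
</jaxb:bindings>
```

You pass the file to XJC with the -b option, e.g. xjc -b bindings.xjb business-model.xsd.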
As for your original issue of name clashes: an XML Schema is not allowed to re-use the same name either. The only way you should be getting name clashes in the generated Java is if you're compiling schemas from multiple namespaces into the same Java package. If you have multiple namespaces, I strongly suggest you put each namespace into its own package; it tends to make things clearer and also avoids these name clashes.
