How do you organize WSDL-based web services in Java at the project structure level?
Personally, I like to store the WSDL in a distinct Maven module or project, which is built into a JAR. The resulting JAR contains the WSDL, the SEI, the client, and the JAXB classes. Of course, the Java class generation is automated by the build. Then I can just add the artifact as a dependency to any dependent project. Now, if I need to change the service, I only have to edit the WSDL and build the project. The regenerated Java classes are then propagated to all service consumers by the dependency management.
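To make that concrete, a consumer then only needs the dependency and calls the service through the generated SEI. A rough sketch (OrderService and OrderPortType are invented names standing in for classes that wsimport or the CXF code generator would produce from the packaged WSDL):

// OrderService and OrderPortType are invented names for classes generated from the
// WSDL that ships inside the artifact; the consumer never touches the WSDL directly.
OrderService service = new OrderService();           // generated javax.xml.ws.Service subclass
OrderPortType port = service.getOrderPort();         // generated SEI proxy
String status = port.getOrderStatus("4711");         // typed call, JAXB handles the XML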
Is this the best way to organize WSDLs? Are there drawbacks to this approach in certain situations I haven't thought of, or would some other approach provide additional value?
Is it possible to have a multi-module Maven project using JHipster?
At first sight it seems not, but I want to know if there's a way to share common classes, like domain classes or repository classes, among different web modules in my project using Maven.
Suppose I have a web module with an HTML GUI made with Thymeleaf (no React/Angular), a classic back office.
Then I want another web module that exposes a REST API and needs the same domain classes and the existing repository layer.
At first it seems that I'd have to duplicate these classes and code into another JHipster application, but obviously that's not the best solution.
Without JHipster, I would create a multi-module Maven project with two web modules (back office + API) and a third module with the common classes, packaged in a shared JAR and included as a dependency in the first two modules.
How (if it is possible) can I achieve this with JHipster?
Thanks
JHipster won't be able to generate what you want; it's up to you to manually refactor the generated project to suit your needs. It's not difficult, because JHipster puts entity classes in the domain package and repositories in the repository package. You will then have to decide how you want to execute the Liquibase migrations.
You can generate only the backend code using the --skip-client option; see the command line options in the documentation.
An alternative (if you are motivated) would be to write a JHipster blueprint to generate a project with the structure you want.
I have created a Selenium Java framework with a proper folder structure. Basically, my framework consists of a few common utilities (page objects, reporting configurations, driver initialization settings, etc.). This framework was developed to automate and validate web applications. We have a bunch of web applications in our organization that are common in nature and behavior. The Java framework that I have developed has some generic methods and page objects that can be utilized in all the web applications.
Now, I have pushed my framework to GitHub, and I want other teams in my organization to use my framework as well. In my organization, we create a new repo for each project. Therefore, I wanted to know if my framework can be accessed by other teams of my organization in their projects.
I don't want anyone to clone my framework repo, add their tests, and push it back, as each project in my organization will have its own repo. I simply want them to add my framework as a dependency in their project repo, and when they clone their repo and do a Maven build, they should be able to access the utilities of my framework. Please let me know if this is possible. Thanks in advance.
You have multiple options.
Option 1:
Give read-only access to users outside your project, so that they can extract and re-use items from the framework without impacting your code. Other teams can tailor the framework to their needs (you can consider this an advantage or a disadvantage).
Option 2:
Package your framework as a JAR and share it with other teams. Ask them to use the JAR; no edits to the framework are possible.
We are doing exactly the same thing (we use option 2, as described below). Other teams need to use it as a dependency in their pom.xml. There are two ways to use the dependency in Maven:
1. If your company has a Maven artifact repository manager (such as Artifactory or Nexus), you can publish your framework JAR to it and ask the other teams to use it as a dependency in their pom.xml directly.
Else,
2. You need to prepare a JAR file, and the other teams need to use it as a dependency with system scope, as below:
<!-- Framework -->
<dependency>
    <groupId>com.test.group</groupId>
    <artifactId>automation-framework</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <scope>system</scope>
    <systemPath>${jar.location}</systemPath>
</dependency>
groupId, artifactId, and version are the details of your framework project.
Other teams can create a folder called "libs" in their project and store your framework's JAR there. That location goes into ${jar.location}.
Every time you change your framework and build a new JAR, they need to update the JAR file under the "libs" folder.
In this way, they can use all your utilities, but can't modify or publish any tests into your project.
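For example, a consuming team's test could then look roughly like this (DriverFactory is an invented name standing in for one of your framework's utilities):

import org.openqa.selenium.WebDriver;
import org.testng.annotations.Test;
import com.test.group.framework.DriverFactory;   // utility class from the framework JAR

public class LoginPageTest {

    @Test
    public void homePageOpens() {
        // The framework handles browser/driver setup and configuration.
        WebDriver driver = DriverFactory.createDriver("chrome");
        driver.get("https://example.org/login");
        // ... assertions using the framework's page objects ...
        driver.quit();
    }
}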
You need to keep a utils package, create all the utility classes in that package, and use them from there.
In every project you should keep a separate package for the utility classes.
We maintain multiple projects that communicate via XML.
The interfaces are defined in XML schemas (.xsd files).
We use JAXB to generate classes from those schemas that are then used in the projects.
We also use the .xsd files to validate input or output.
Sometimes, we need to update the schemas to create a new version that may or may not be backwards compatible.
How can we effectively manage these schemas? Projects should be able to select which version(s) of the schemas they want to work with. It would be nice if every project's build didn't have to integrate and maintain the class generation step again. Are there any good practices for this?
I'm currently thinking about two options:
Package the generated classes as an artifact and deploy them to a Maven repository from where projects can pull them in. Projects don't have to deal with the class generation, but access to the .xsd file itself becomes more complicated (see the sketch below).
Pull the schemas into the projects as Git submodules. This gives simple access to the schema file but each project's build has to bother with generating the classes.
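For context, this is roughly how we use JAXB plus the schema for validation (Order and the schema path are placeholder names); under option 1, the .xsd would be read from the classpath of the artifact rather than from a checked-out file:

import java.io.File;
import javax.xml.XMLConstants;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Unmarshaller;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;

public class OrderReader {

    public static Order read(File xml) throws Exception {
        // The schema is packaged in the same JAR as the generated classes.
        SchemaFactory sf = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = sf.newSchema(OrderReader.class.getResource("/xsd/orders-v2.xsd"));

        Unmarshaller u = JAXBContext.newInstance(Order.class).createUnmarshaller();
        u.setSchema(schema);   // validate against the .xsd while unmarshalling
        return (Order) u.unmarshal(new StreamSource(xml));
    }
}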
Basically, JAXB (and XML data binding generally) is a bad idea unless the schema is very stable. You may be using the wrong technology. Working with multiple versions of the schema means you are working with multiple versions of compiled Java code, and that's always going to be a maintenance nightmare.
It may not be a helpful suggestion, but my advice is, don't start from here. If multiple versions of the schema need to coexist, then you want a technology where the application doesn't need to be recompiled every time there is a schema change; look at either a generic low-level API such as JDOM2 or XOM, or a declarative XML-oriented language such as XSLT, XQuery, or LINQ.
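To illustrate the generic-API route, here is a minimal JDOM2 sketch (element names invented). Because nothing is compiled from the schema, a new schema version that adds optional content does not force this code to be regenerated:

import java.io.File;
import org.jdom2.Document;
import org.jdom2.Element;
import org.jdom2.input.SAXBuilder;

public class GenericOrderReader {

    public static String customerName(File xml) throws Exception {
        Document doc = new SAXBuilder().build(xml);   // no generated classes involved
        Element root = doc.getRootElement();
        return root.getChildText("customerName");     // unknown elements are simply ignored
    }
}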
I have two Java projects that are fairly independent, besides the fact that they share a common MySQL database.
I want to refactor these projects and extract everything regarding the common data layer. I am using jOOQ, so most of this layer is auto-generated by my build. Besides that, I have a few common entity classes that are used in both projects.
What would be the best practice to separate this, so that any change can be made in one place and still propagate to both projects? Create a third, simple Java project with the common code? What would you do?
I work on a distributed system, and multiple daemons need access to the same Postgres database via jOOQ. Since each daemon is its own Java project, I am in the same boat as you basically.
The solution I've been using is to create a third Java project as a Java library. If you're using NetBeans, you can just include it as a subproject dependency, and any changes to the library project can be recompiled into the individual application projects.
One thing of note: you'll need to specify the jOOQ library JARs in all three projects. In NetBeans it's easy to specify a project's library directory and have multiple projects share these dependencies. NetBeans will copy the dependencies at deployment time.
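As a rough sketch, the shared library project might expose something like this (class and table names are invented), and both daemons simply depend on it:

import java.sql.Connection;
import org.jooq.DSLContext;
import org.jooq.Record;
import org.jooq.Result;
import org.jooq.SQLDialect;
import org.jooq.impl.DSL;

public class SharedUserRepository {

    private final DSLContext ctx;

    public SharedUserRepository(Connection connection) {
        // Each daemon supplies its own connection; the query logic lives here once.
        this.ctx = DSL.using(connection, SQLDialect.POSTGRES);
    }

    public Result<Record> findActiveUsers() {
        return ctx.fetch("select * from app_user where active = true");
    }
}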
Edit:
The steps are basically:
1. Create a master layout for the system, i.e.:
/master-project/
/master-project/library
/master-project/software
/master-project/software/daemon1
/master-project/software/daemon2
/master-project/common
/master-project/common/utility1
/master-project/common/utility2
2. Create third-party "library" bundles of {jar, src, docs} under /master-project/library.
3. Create "application" projects under /master-project/software, making sure to tell NetBeans to use only the third-party libraries under /master-project/library.
4. Create "library" projects under /master-project/common, making sure to tell NetBeans to use only the third-party libraries under /master-project/library.
5. Create a "library" for the jOOQ code to be shared, as in step 4.
Each project is responsible for its own compile script (including generating jOOQ code, if desired) and for correctly specifying its dependencies out of /master-project/library and /master-project/common.
I have a Java project that expects external modules to be registered with it. These modules:
Implement a particular interface in the main project
Are packaged into a single JAR (along with any dependencies)
Contain some human-readable meta-information (like the module name).
My main project needs to be able to load at runtime (e.g. using its own classloader) any of these external modules. My question is: what's the best way of registering these modules with the main project (I'd prefer to keep this vanilla Java, and not use any third-party frameworks/libraries for this isolated issue)?
My current solution is to keep a single .properties file in the main project, with key = module name and value = class |delimiter| human-readable-name (or to coordinate two .properties files in order to avoid the delimiter parsing). At runtime, the main project loads the .properties file and uses any entries it finds to drive the classloader.
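The loading code is roughly like this (simplified; the file name and value format are just the scheme described above):

import java.net.URL;
import java.net.URLClassLoader;
import java.util.Properties;

public class ModuleLoader {

    public void loadModules(URL moduleJar) throws Exception {
        Properties p = new Properties();
        p.load(ModuleLoader.class.getResourceAsStream("/modules.properties"));

        ClassLoader loader = new URLClassLoader(new URL[] { moduleJar },
                getClass().getClassLoader());
        for (String key : p.stringPropertyNames()) {
            // value format: fully.qualified.ClassName|Human Readable Name
            String[] parts = p.getProperty(key).split("\\|");
            Class<?> type = Class.forName(parts[0].trim(), true, loader);
            Object module = type.getDeclaredConstructor().newInstance();
            // ... cast to the shared interface and register it ...
        }
    }
}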
This feels hokey to me. Is there a better way to do this?
The standard approach in Java is to define a Service Provider.
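A minimal sketch of that mechanism, using JDK 6's ServiceLoader (ModulePlugin is an invented name for the interface your modules implement):

package com.example;

import java.util.ServiceLoader;

// Each module JAR implements this interface and ships a text file
// META-INF/services/com.example.ModulePlugin containing the fully qualified
// name of its implementation class.
public interface ModulePlugin {
    String displayName();
    void register();
}

class ModuleRegistry {
    static void registerAll() {
        // ServiceLoader (JDK 6+) discovers every registered implementation on the classpath.
        for (ModulePlugin plugin : ServiceLoader.load(ModulePlugin.class)) {
            System.out.println("Found module: " + plugin.displayName());
            plugin.register();
        }
    }
}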
Let all modules express their metadata via a standard XML file; call it "my-module-data.xml".
On startup, your main container looks for classpath*:my-module-data.xml resources (each of which can declare a FrontController class) and delegates to each module's FrontController class to do whatever it wants :)
Also, google for Spring OSGi; their documentation can be helpful here.
Expanding on #ZZ Coder...
The Service Provider pattern mentioned, and used internally within the JDK, is now a little more formalized in JDK 6 with ServiceLoader. The concept is further expanded upon by the NetBeans Lookup API.
The underlying infrastructure is identical. That is, both APIs use the same artifacts in the same way. The NetBeans version is just a more flexible and robust API (allowing alternative lookup services, for example, as well as the default one).
Of course, it would be remiss to not mention the dominant, more "heavyweight" standards of EJB, Spring, and OSGi.