Is this project structure valid? - java

I have a dilemma: at university we learn to create modular software (in Java), but this modularity is explained using a single project with packages (a package for business logic, another one for DAOs, another one for the model, and a last package for the frontend).
But at my work we use the following structure, which I will try to explain:
First we create a Java library project where the model (the entity classes) is kept in a package.
Next we create an EJB module named DAOS; using the NetBeans wizard we store the DAO interfaces in the library project in another package, and these interfaces are implemented in the DAOS bean.
The next part is the business logic: we create a business EJB for each group of functions, and again using the wizard we store its interface in the Java library project in another package; it is then implemented in the business bean.
The final part (for the backend) is a bean that I have suggested: a Facade bean that gathers every method of the business beans into a single bean. It has an interface too, which is created in our library project and implemented in the bean.
The last step is to call the facade module from the web project.
But I don't know how valid or viable this is; maybe I'm doing everything wrong and don't even know it! So I want to ask your opinion about this.

Storing the interfaces of DAOS, BusinessBeans and Facade in the model project is not a good choice.
Arguments against:
Data structure (entity classes), business logic (BusinessRemote, FacadeRemote) and technical logic (DAOSRemote) are mixed in one project. They are in different packages but will be delivered in one jar.
Reusing the data structure in other applications will lead to classes in your project that are not necessary for the new application.
Changes in one of the interfaces will force a rebuild of your projects. If the facade interface needs to be changed because of new requirements or features, you must change the interface class in your model project. This will lead to a new jar of your model project that needs to be distributed.
Suggestions:
Separating implementation and interfaces is a good approach, but go all the way: create separate interface projects (Business, Facade, DAOS). If you are using a dependency management tool (Maven, Ivy, ...) you will only add dependencies on these interface projects. For your WAR you will include all the jars separately.
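As a rough illustration of such an interface project, here is a minimal sketch; all package and class names below are hypothetical, not taken from your application. The remote interface lives in its own interface-only jar, and the EJB project depends on that jar.

// business-interfaces project (interface-only jar)
package com.example.business.api;

import javax.ejb.Remote;

@Remote
public interface OrderBusinessRemote {
    void placeOrder(long customerId, long productId);
}

// business-ejb project, which depends on the interface jar above
package com.example.business.impl;

import javax.ejb.Stateless;

import com.example.business.api.OrderBusinessRemote;

@Stateless
public class OrderBusinessBean implements OrderBusinessRemote {
    @Override
    public void placeOrder(long customerId, long productId) {
        // business logic goes here
    }
}

The WAR and the other modules then only need the interface jar on their compile classpath; the implementation jar can be rebuilt and redeployed independently.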
Good point:
Adding an additional interface as an application facade is good. We use this approach to offer different application facades for different user roles, without mixing that role logic into our business logic.
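As a sketch of that idea (again with hypothetical names, reusing the OrderBusinessRemote interface from the sketch above), a role-specific facade only exposes the operations that role needs and simply delegates to the shared business beans:

import javax.ejb.EJB;
import javax.ejb.Remote;
import javax.ejb.Stateless;

@Remote
interface ClerkFacadeRemote {
    void placeOrder(long customerId, long productId);
}

@Stateless
public class ClerkFacadeBean implements ClerkFacadeRemote {

    @EJB
    private OrderBusinessRemote orderBusiness; // shared business bean, injected by the container

    @Override
    public void placeOrder(long customerId, long productId) {
        // the facade only delegates; a different facade (e.g. for admins) could expose another subset
        orderBusiness.placeOrder(customerId, productId);
    }
}

Like the business interfaces, the facade interfaces would live in their own interface project rather than in the model jar.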

How to share Repository and Service classes between 2 projects

I am working on 2 projects, one web app (Spring MVC) and one standalone backend service application (Spring Boot), that heavily interact together. I am using Hibernate for both and they are both coded using the NetBeans IDE.
My "issue" is that I end up with duplicate code in both projects, mainly in the Repository and Service layers. My entities are obviously also duplicated since both projects use the same database.
Is there a way to make some sort of class library (a third project maybe?) and put all the common code in there? If that is indeed possible, how do you then change each project so they can still access this code as if it were part of them? I was thinking of putting all my Repositories, Services and entities in there to avoid code duplication and greatly reduce the risk of error.
Thank you!
Separate those Repository and Service classes into a submodule.
The structure looks like:
-- your app
   -- api (dependent on `common` module)
   -- webapp (dependent on `common` module)
   -- common
Then the problem is how to initialize the beans inside the common module. AFAIK, you have two options:
In the @Configuration class of the api or webapp module, add the base packages of the common module to the component-scan packages.
In the api or webapp resources folder, add a Spring configuration factory file:
/src/main/resources/META-INF/spring.factories
org.springframework.boot.autoconfigure.EnableAutoConfiguration=your.path.AutoConfiguration
Then define the service/repository @Bean methods inside the AutoConfiguration class.
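A minimal sketch of both options follows; the package name com.example.common and the UserService class are hypothetical stand-ins for your own common-module code.

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;

// hypothetical shared service class living in the common module
public class UserService {
}

// Option 1: in the api or webapp module, scan the common module's base package
@Configuration
@ComponentScan(basePackages = "com.example.common")
public class CommonModuleConfig {
}

// Option 2: the class referenced from spring.factories, declaring the shared beans explicitly
@Configuration
public class AutoConfiguration {

    @Bean
    public UserService userService() {
        return new UserService();
    }
}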
I am assuming in this answer that your projects are connected to each other.
You can set up multiple profiles within one Spring project, storing your database connection parameters etc. with the help of multiple property files.
For example:
application-web.properties
application-backend.properties
You can use these in your project by activating the needed properties file per application. The profile names will be `web` and `backend` in these cases.
When using maven, this is the command line I am using:
mvn spring-boot:run -Drun.profiles=<<profile>>
Now, back to your java code.
If there are classes that only one of your applications uses, you can restrict them to a profile. Example:
@Controller
@Profile({ "web" })
public class WebEndpoint {
}
This way you can make the shared code available for both applications, without duplicating most of the code.

Java SE - Clever way to implement "plug and play" for different library modules

I'm trying to do something clever. I am creating a weather application in which we can replace the weather API with another weather API without affecting the code base. So I started with a Maven project with multiple modules.
I have a Base module that contains the Interface class and the Base class. The Interface class contains the calls to the APIs (all calls are similar, if not exact) and the Base class contains the properties to the APIs (again, all properties are similar, if not exact).
I have a module for each of the two weather APIs we are testing with plans to create more modules for new weather APIs as we grow the application.
Finally, I have created a Core module (includes main) to implement the specific module class for the weather API I want to test.
Now, I know the simplest way to do this would be to use a switch statement and enumeration. But I want to know if there is a more clever way to do this. Maybe using a Pattern? Any suggestions?
Here is a picture of the structure I have just described:
Here is the UML representation:
This is a learning process for me. I want to discover how a real Java Guru would implement the appropriate module and class based on a specified configuration.
Thank you for your suggestions.
I'm trying to do something clever. I am creating a weather application in which we can replace the weather API with another weather API without affecting the code base.
Without reading further down, this first statement makes me think of a plugin architecture design. But in the process of software design, decisions must not be rushed: the more you delay them, the more information you have, and the better informed your decision can be. For now it is just an idea to keep in mind.
I have a Base module that contains the Interface class and the Base class. The Interface class contains the calls to the APIs (all calls are similar, if not exact) and the Base class contains the properties to the APIs (again, all properties are similar, if not exact).
When different modules share behaviour/state, it is a good idea to refactor them and produce base abstract classes and interfaces, so you are on the right track. But if there are differences, those shouldn't be refactored into the base module. The reason is simple: maintainability. If you start adding if clauses or switches to deal with these differences, you have just introduced coupling between modules, and you will always be making changes in the base module whenever you add or modify other modules, which is not desirable at all.
This is reflected by the Open/Closed principle from the SOLID principles, which states that a class should be open for extension but closed for modification.
So after you have refactored the common behaviour into the base module, each new API should extend the base module, as you did.
Finally, I have created a Core module (includes main) to implement the specific module class for the weather API I want to test. Now, I know the simplest way to do this would be to use a switch statement and enumeration. But I want to know if there is a more clever way to do this. Maybe using a Pattern? Any suggestions?
Indeed, making use of a switch makes it work, but it is not a clean design at all, for the same reason as before: adding, modifying or removing modules would require modifying this module as well, and this code can potentially break.
One possible solution would be to delegate this responsibility to a new component and make use of a creational design pattern like the Abstract Factory, which provides an interface for instantiating components without specifying their concrete classes.
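A minimal sketch of that idea, with hypothetical names: the core module works only against the abstractions, and each weather module ships its own factory.

// base module: abstractions only
public interface IWeather {
    double currentTemperatureFor(String city);
}

public interface WeatherFactory {
    IWeather createWeatherService();
}

// forecast-io module: one concrete implementation plus its factory
public class ForecastIOWeather implements IWeather {
    @Override
    public double currentTemperatureFor(String city) {
        return 0.0; // would call the Forecast IO API here
    }
}

public class ForecastIOFactory implements WeatherFactory {
    @Override
    public IWeather createWeatherService() {
        return new ForecastIOWeather();
    }
}

// core module: depends only on the abstractions, never on a concrete weather module
public class WeatherApp {
    private final IWeather weather;

    public WeatherApp(WeatherFactory factory) {
        this.weather = factory.createWeatherService();
    }

    public void printTemperature(String city) {
        System.out.println(city + ": " + weather.currentTemperatureFor(city));
    }
}

Which factory gets handed to WeatherApp can then be decided by configuration or, as the following answer shows, discovered on the classpath via the ServiceLoader mechanism.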
As for the architecture, so far the plugin architecture still makes sense. But what if the different modules extend the base contract, adding more features? One option is to use the Facade pattern to adapt the module calls and provide an output that implements the interface clients expect.
Then again, with the details provided this is the solution I would suggest, but the scenario should be studied carefully and in greater detail to be sure that these are the right tools for the job before committing to them.
In addition to Salvador Juan Martinez's answer...
To implement a plugin architecture, Java's Jar File Specification provides support for service provider interfaces (SPI) and how they are looked up.
As of Java 1.6 you can use the ServiceLoader to look up service providers. For Java 1.5 and earlier you must do it on your own or use a library, e.g. commons-discovery.
The usage is quite simple. In your case, put a META-INF/services/com.a2i.weatherbase.IWeather file in each plugin module.
In the Weather Forecast IO module the file should contain only one line
com.a2i.weatherforecastio.ForecastIO
The line must be the fully qualified name of an IWeather implementation class.
Do the same for the other module and you can load the implementations via ServiceLoader.
ServiceLoader<IWeather> weatherServicesLoader = ServiceLoader.load(IWeather.class);
Iterator<IWeather> weatherServices = weatherServicesLoader.iterator();
How many services are found now depends on your runtime classpath. Try adding and removing module jar archives from the classpath and run your application.
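A short usage sketch of the loaded services (only the class name is printed here, since the methods on IWeather depend on your own interface):

import java.util.ServiceLoader;

import com.a2i.weatherbase.IWeather;

public class WeatherMain {
    public static void main(String[] args) {
        // every plugin jar on the classpath with a META-INF/services entry contributes one provider
        ServiceLoader<IWeather> loader = ServiceLoader.load(IWeather.class);
        for (IWeather weather : loader) {
            System.out.println("Found weather provider: " + weather.getClass().getName());
        }
    }
}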
EDIT
I wrote a blog post about a pluggable architecture with standard Java. See http://www.link-intersystems.com/blog/2016/01/02/a-plug-in-architecture-implemented-with-java/
Source code is also available at https://github.com/link-intersystems/blog/tree/master/java-plugin-architecture
One solution is to define a common interface with all the identified common operations. The extensions/plugins need to implement that interface and provide the implementation of those common operations.
You can use an abstract factory design pattern to hook up the exact implementation at runtime based on the input parameters.
Interfaces and abstract classes are always good in such scenarios. Thanks.

Little confused about the directory structure

I am creating one web application using Spring and hibernate.
I am a little confused about the approach for the directory structure.
Approach 1:
Create a separate folder/package for each module.
For example, if I have to create login and uploadfile modules and my base package is com.abc, then I will create the package com.abc.login and inside that I will create controller, service, form and dao folders, and the same for the uploadfile module.
Approach 2:
Under the same project, create controller, service, form and dao folders, and then add all controller classes for all modules under com.abc.controller, all services for all modules under one service folder, and likewise for forms and daos.
Which approach should I follow?
The packages are just a way to group together classes that make sense to go together and to avoid name clashes with other classes. They have absolutely zero impact on performance. Do whatever you find best. Both approaches are common (technical-based separation first vs. functional-based separation first).
I prefer your first approach (functional-based separation first), but YMMV.
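For illustration, with hypothetical class names, the first approach groups classes like this:

com.abc.login.controller.LoginController
com.abc.login.service.LoginService
com.abc.login.dao.LoginDao
com.abc.uploadfile.controller.UploadFileController
com.abc.uploadfile.service.UploadFileService
com.abc.uploadfile.dao.UploadFileDao

while the second approach would place LoginController and UploadFileController together under com.abc.controller, and likewise for the other layers.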

How to move hibernate related code to its own 'project' so I can share it?

If I want to be able to re-use my hibernate related code with multiple IntelliJ solutions, what should I do?
Should I move my models (with annotations), DAOs and service classes to their own module?
How would I then be able to re-use this module/project with other intellij solutions?
I guess they would have to compile down to a separate .jar, right?
It is possible to configure an IDEA project to point to a module in an external location. So you could configure multiple IDEA projects to point to the same hibernate module. This is a solution for a one-man show, primarily (although see here about using a variable to make this location configurable).
In order to make this distributable and sharable among multiple developers, you are looking at building a jar out of one module or, if it has no particular meaning to any specific project, making a new project that holds the code and produces the jar, which other projects then include as a library.
You can use Spring or Guice for dependency injection. Refactor your DAOs/services to use generics, so if your child modules don't share the same POJOs you can still reuse all your Hibernate code (for DAOs and services) without any duplication (although you might want to make them abstract in this case).
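A rough sketch of such a generic, reusable Hibernate DAO (the class and method names are hypothetical):

import java.io.Serializable;

import org.hibernate.Session;
import org.hibernate.SessionFactory;

public abstract class GenericHibernateDao<T, ID extends Serializable> {

    private final SessionFactory sessionFactory;
    private final Class<T> entityClass;

    protected GenericHibernateDao(SessionFactory sessionFactory, Class<T> entityClass) {
        this.sessionFactory = sessionFactory;
        this.entityClass = entityClass;
    }

    protected Session session() {
        return sessionFactory.getCurrentSession();
    }

    public T findById(ID id) {
        // the cast keeps this compatible with older Hibernate versions where get() returns Object
        return entityClass.cast(session().get(entityClass, id));
    }

    public void save(T entity) {
        session().saveOrUpdate(entity);
    }
}

Each project then only needs thin subclasses, e.g. a UserDao extends GenericHibernateDao<User, Long>, while the shared module holds all the common persistence code.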

Best Practice For Referencing an External Module In a Java Project

I have a Java project that expects external modules to be registered with it. These modules:
Implement a particular interface in the main project
Are packaged into a uni-jar (along with any dependencies)
Contain some human-readable meta-information (like the module name).
My main project needs to be able to load at runtime (e.g. using its own classloader) any of these external modules. My question is: what's the best way of registering these modules with the main project (I'd prefer to keep this vanilla Java, and not use any third-party frameworks/libraries for this isolated issue)?
My current solution is to keep a single .properties file in the main project with key=name, value=class |delimiter| human-readable-name (or coordinate two .properties files in order to avoid the delimiter parsing). At runtime, the main project loads in the .properties file and uses any entries it finds to drive the classloader.
This feels hokey to me. Is there a better way to do this?
The standard approach in Java is to define a Service Provider.
Let all modules express their metadata via a standard xml file. Call it "my-module-data.xml".
On your main container startup, it looks for classpath*:my-module-data.xml (which can name a FrontController class) and delegates to the individual module's FrontController class to do whatever it wants :)
Also google for Spring-OSGi; their documentation can be helpful here.
Expanding on @ZZ Coder's answer...
The Service Provider pattern mentioned, and used internally within the JDK, is now a little more formalized in JDK 6 with ServiceLoader. The concept is further expanded upon by the NetBeans Lookup API.
The underlying infrastructure is identical. That is, both API use the same artifacts, the same way. The NetBeans version is just a more flexible and robust API (allowing alternative lookup services, for example, as well as the default one).
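For a vanilla-Java registration mechanism, a ServiceLoader-based sketch could look like this; the ExternalModule interface and its methods are hypothetical stand-ins for your own plugin contract.

import java.util.ServiceLoader;

// defined in the main project; each external module implements it
public interface ExternalModule {
    String displayName(); // human-readable meta-information, e.g. the module name
    void start();
}

// in the main project: discover and register every module jar present on the classpath
public class ModuleRegistry {
    public static void loadAll() {
        for (ExternalModule module : ServiceLoader.load(ExternalModule.class)) {
            System.out.println("Registering module: " + module.displayName());
            module.start();
        }
    }
}

Each module jar then ships a file under META-INF/services named after the interface's fully qualified name, listing its implementation class, which replaces the hand-maintained .properties file.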
Of course, it would be remiss to not mention the dominant, more "heavyweight" standards of EJB, Spring, and OSGi.
