I'm working on a server engine, and I am not sure in what form to distribute it. It is quite modular and it uses interfaces/abstract classes.
Should it be:
a library (no entry-point, write your own main() and call new Server().setSomeHandler(myHandler).run())
a binary (executable entrypoint with config file where you can inject JAR with handlers)
something else?
Basically, the developer should be able to completely extend or change the way the server works. I don't like the idea of making it a library because it should be a platform by itself, a whole server system.
Providing a programmatic way is more versatile than just an executable. And both ways aren't mutually exclusive. After all, even if a developer provides a handler, internally you probably still need to call something like the setter in your example to actually use that handler. Exposing that API shouldn't be too difficult. You could still provide a small launcher application that loads some config file and wraps it in some API calls, if that is needed.
The more important question would be, are there predefined extension points where developers can plug in their own implementations or is everything completely modular and exchangeable?
For a simple way to provide implementations of predefined interfaces, you can use the ServiceLoader/SPI mechanism. You can build a basic plug-in system with it.
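As a minimal sketch of that mechanism (the `RequestHandler` interface and all names here are invented placeholders, not part of any real server API):

```java
import java.util.ServiceLoader;

public class ServerMain {
    // Hypothetical extension point; implementors ship their own classes.
    public interface RequestHandler {
        void handle(String request);
    }

    // Implementations are discovered from any JAR on the classpath that ships a
    // META-INF/services/ServerMain$RequestHandler file listing the class names.
    static int countHandlers() {
        int found = 0;
        for (RequestHandler handler : ServiceLoader.load(RequestHandler.class)) {
            System.out.println("Found handler: " + handler.getClass().getName());
            found++;
        }
        return found;
    }

    public static void main(String[] args) {
        // In this self-contained demo no provider JAR is present, so none are found.
        System.out.println("Discovered " + countHandlers() + " handler(s)");
    }
}
```

A launcher binary could do exactly this at startup and fall back to a default handler when nothing is discovered, giving you both the library and the executable distribution from one code base.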
If you want to create a platform, something like OSGi seems more appropriate. Here you could define APIs/SPIs for fine grained components/services. Developers could then provide their own modules that extend the server or even replace your default modules.
I have 2 Maven applications that should communicate via a server socket and a client socket. If possible, I want to send the data as a Java object, but for that both projects need to include the class of the object.
If possible, I don't want to make a third maven project with the class and add it to the server and client project as a dependency. Is there any other way to do that?
Thanks for your answers!
You could make your server project a sub-project of your client project, meaning that your server has access to all the classes the client needs, plus some extras.
Alternatively, you can create a JAR containing shared classes, and install this to your local (or remote if you have access) maven repository, using mvn install (docs here: https://maven.apache.org/guides/mini/guide-3rd-party-jars-local.html)
For the actual transfer of data, you could serialize your objects using the Serializable interface, however there are many issues with this approach:
Fragile to class changes - if you change your class, old objects will likely break unless you manually manage this
Java-only - you will not be able to, for example, write your client in C++ and your server in Java, should you ever decide to do something like that.
Framework incompatibility - many popular frameworks work primarily with other formats, and cannot guarantee compatibility.
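For illustration, the Serializable round trip under discussion can be sketched like this (the `Message` class is made up; a real client/server would wrap socket streams instead of byte arrays):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerializationDemo {
    // Hypothetical shared class; client and server both need it on the classpath.
    public static class Message implements Serializable {
        private static final long serialVersionUID = 1L; // pinned to manage class changes
        final String text;
        Message(String text) { this.text = text; }
    }

    // Serialize and deserialize a message, as the two sides would over a socket.
    static Message roundTrip(Message m) throws Exception {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(m);
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            return (Message) in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip(new Message("hello")).text); // hello
    }
}
```

Note the explicit serialVersionUID: without it, recompiling the class can make previously serialized objects unreadable, which is exactly the fragility listed above.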
Instead, you can use:
JSON - using the Jackson Databind or Google Gson libraries; flexible, powerful and standardised
XML - Similar to JSONs, with some subtle differences
Google Protobuf - Also has some limitations but very underrated for resource-constrained environments.
Custom String Format - implement your own toDataString() and fromDataString() methods. This is only really feasible for small classes, as there are many issues with Unicode, escape characters, encodings etc that most libraries hide from you. This is more risky than other methods.
In general, I would recommend JSON unless you have a good reason to do otherwise. I personally use Jackson; here is a link to a tutorial: http://tutorials.jenkov.com/java-json/jackson-objectmapper.html
Are there ways to automatically apply changes to an API or library that were made while, in parallel, the consuming code was being developed in another branch? In other words, just as a rename or signature refactoring in common IDEs is applied automatically to all consumers.
What are the usual strategies to handle these cases in as automated a way as possible when no fully automated tool is available?
I'm mainly working with the IntelliJ platform (Pycharm, IDEA), so any possibility directly inside it would be preferred.
So, as I understand it: a git checkout changes some signatures in the API, and you want the IDE to automatically detect the signature change and refactor all its clients in the project? If so, this is not possible in the current version.
The best way to handle this, I believe, is to change clients along with the API. This can be achieved by having all the clients in the same project or monitoring API usages via some service.
For jstl tags, an API javax.servlet.jsp.jstl-api-1.2.1.jar & implementation javax.servlet.jsp.jstl-1.2.1.jar are provided.
For servlets, an API in servlet-api.jar & an implementation JAR from Tomcat or GlassFish are provided.
For collections, API like java.util.List & corresponding implementations like java.util.ArrayList & java.util.LinkedList are provided.
The NetBeans IDE is another example.
An implementation JAR includes both the API (mostly interfaces) and its implementation. What are the advantages of providing a solution as an API JAR plus a separate implementation JAR for programmers/developers to use?
For developing enterprise applications using Java, is providing an API the standard approach for a stable contract between developers?
I have advocated separating the API from its implementation in my Practical API Design book. At that time I valued the simple library as well as modular library approaches. We used them both successfully when designing NetBeans Platform APIs. I usually felt uneasy when it came to vendor library style - a common approach used in Java EE world.
Later I realized the optimal solution depends on the desired degree of proximity (see my detailed explanation). In short, it depends how closely the API author is related to the one who implements the API. Is it the same person? Is it a group that sat down and agreed on a specification? Are they different, but we expect far more users of the library than implementors? Or do we expect almost every user to implement (something in) the library? The answer to this question then leads to:
None to many
One to many
Few to many
Many to many
These are the proximity classes, and each of them can be handy in some situations. However, my all-time favorite is the many-to-many approach with a full-featured modular library design.
1) If you put the API in a different jar, then you can let it be used by clients that can't access the implementation. For example:
You can exclude the implementation from the compile-time classpath of clients, to ensure that clients of the API don't require any particular implementation.
You can exclude the implementation from the run-time classpath of API clients (either via ClassLoaders like servlets do or separate JVMs), so that clients can't depend on any particular implementation, and so that they can use libraries that would conflict with the ones that the implementation uses.
2) Not really individual developers, but it's common to use a strategy like that to avoid conflicts and unwanted dependencies between different development teams.
This is because only the API is standardised, multiple implementations are possible, and the API is an incomplete specification. In the case of servlets, in addition to the servlet API, which your web app uses, there is the web application server (Tomcat or GlassFish). The application server is a large program with many other features and APIs. For a web application, your servlets are not "the program"; the application server is. It is not that your servlets delegate to the server; the server delegates to your code (in a WAR).
In programming using an implementation, you might need the API specification (interfaces, abstract classes, etc.) too.
Interface obj = new ClassImplementingInterface();
which could also be done as
ClassImplementingInterface obj = new ClassImplementingInterface();
If your program used only the latter, you might get away with having just the implementation JAR without the API package. As far as possible, one should use the former, for better maintainability.

Now, the question of why the API package can't just be bundled into the implementation JAR - one API, one JAR. That might sound simple, but it may not be desirable. You might prefer to use the javax.servlet.jsp.jstl-api package obtained from the authentic source, not bundled in com.newbie.servlet-0.0.1.jar. There can be legal aspects that prevent such bundling. Further, an implementation need not provide functionality for the complete specification: your imports for an API could come from two different implementations as different parts, and those could target different release levels of the specification. In that case, perhaps rare, each bundling a different release level of the API may cause trouble, because JAR search within a directory is not completely defined. So, bundling the API and the implementation into separate JARs is cleaner.
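The "program against the API type" point can be illustrated with the collections example from the question (the `sum` helper is invented for the demo):

```java
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

public class ApiVsImpl {
    // Code against the API type (List), so the implementation can be swapped
    // without touching this method.
    static int sum(List<Integer> numbers) {
        int total = 0;
        for (int n : numbers) total += n;
        return total;
    }

    public static void main(String[] args) {
        List<Integer> arrayBacked = new ArrayList<>(List.of(1, 2, 3));
        List<Integer> linkedBacked = new LinkedList<>(List.of(1, 2, 3));
        // Both implementations satisfy the same contract.
        System.out.println(sum(arrayBacked));  // 6
        System.out.println(sum(linkedBacked)); // 6
    }
}
```

Had `sum` been declared to take an `ArrayList`, the `LinkedList` call would not even compile; that is precisely the coupling a separate API JAR discourages.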
What is the proper way to make Java components reusable across various applications? For example, I'm writing an application that has its own user interface, database, and so on. I want to make this app reusable as a component of other applications; for example, one feature of my first app may be needed by another app. How do I make it possible for other apps to use this feature without changing my original code? What are the proper ways to achieve this reusability?
Write something simple which does what it does very well. Document it and unit test it and make it open source.
Keep the reusable components in another project (e.g. "common") and package them as a .jar. Then include that JAR in the projects where it's needed.
Extracting a separate project might be tricky though. You should observe the following:
the common components should not depend on anything at a higher level of abstraction (i.e. your services must not have any UI-related dependencies)
the internals of the components must not be visible to the applications using them, i.e. your JAR should expose a minimal API.
You have a couple of options for the mechanics of packaging:
simple IDE-dependent packaging - declare an inter-project dependency; on build, export the JAR and put it on the classpath of the client application
Maven/Ivy - install the dependency in a repository (local or remote) and use the dependency resolution mechanisms of maven/ivy
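With the Maven route, the consuming project then just declares the shared artifact; a sketch, with all coordinates hypothetical:

```xml
<!-- pom.xml of a consuming project; groupId/artifactId/version are placeholders -->
<dependency>
    <groupId>com.example</groupId>
    <artifactId>common</artifactId>
    <version>1.0.0</version>
</dependency>
```

Maven resolves this from the local repository (populated by mvn install on the common project) or from a remote repository if one is configured.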
This is a rather broad question. As such, I am offering broad suggestions:
Know your OO basics. Inheritance, encapsulation, polymorphism. It gets crazier from there on out.
Learn about design patterns, start observing them in applications you already use.
Look at popular open libraries to see how they implement patterns and modules.
Try things in sandbox projects. Grow your knowledge in clean environments.
Since you mention Java, check out the Spring Framework.
Hope that helps.
You need to code in such a way that your components are loosely coupled; then reusability is very high. Take a look at this and this.
Sun Microsystems, the creators of the Java language, have at last recognized this need, and have released the Java Beans Component Architecture. Java Beans are, quite simply, reusable controls written in Java, for Java application development.
Beans are "capsules" of code, each designed for a specific purpose. The advantage of Java Beans over standard programming controls is that Beans are independent. They are not specific to operating systems or development environments. A Bean created in one development environment can be easily copied and modified by another. This allows Java Beans greater flexibility in enterprise computing, as components are easily shared between developers.
I intend to develop a system that is entirely based on modules. The system base should support finding out about plugins, starting them up, and providing ways for those modules to communicate. Ideally, one should be able to put in new modules and yank out unused modules at will, and modules should be able to use each other's functionality if it is available.
This system should be used as a basis for simulation systems where a lot of stuff happens in different modules, and other modules might want to do something based on that.
The system I intend to develop is going to be in Java. The way I see it, I intend to have a folder with a subfolder for each module, each containing an XML file that describes the module with information such as its name, perhaps which events it might raise, and so on. I suppose I might need to write a custom ClassLoader to work this out.
The thing is, I don't know if my idea actually holds any water and, of course, I intend on building a working prototype. However, I never worked on a truly modular system before, and I'm not really sure what is the best way to take on this problem.
Where should I start? Are there common problems and pitfalls that are found while developing this kind of system? How do I make the modules talk with each other while maintaining isolation (i.e, you remove a module and another module that was using it stays sane)? Are there any guides, specifications or articles I can read that could give me some ideas on where to start? It would be better if they were based on Java, but this is not a requirement, as what I'm looking for right now are ideas, not code.
Any feedback is appreciated.
You should definitely look at OSGi. It aims to be the component/plugin mechanism for Java. It allows you to modularize your code (into so-called bundles) and update bundles at runtime. You can also completely hide implementation packages from unwanted access by other bundles, e.g. expose only the API.
Eclipse was the first major open-source project to implement and use OSGi, but they didn't fully leverage it (no plugin installations/updates without restarts). If you start from scratch though, it will give you a very good framework for a plugin system.
Apache Felix is a complete open-source implementation (and there are others, such as Eclipse Equinox).
Without getting into great detail, you should be looking at Spring, and familiarity with OSGi or the Eclipse RCP framework will also give you some fundamental concepts you will need to keep in mind.
Another option is the ServiceLoader added in Java 1.6.
There are many ways to do it, but something simple can be done using Reflection. You write the name of the file (which is really a class name) in your XML file. You can then check what type it is and instantiate it with reflection. The class could implement a common interface that lets you verify that the external file/class really is one of your modules. Here is some information about Reflection.
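A minimal sketch of that check (the `Module` interface, the `EchoModule` stand-in, and all names here are invented for the example):

```java
public class PluginLoader {
    // Hypothetical marker interface that real plugin classes would implement.
    public interface Module {
        String name();
    }

    // A sample "plugin" standing in for a class named in the XML descriptor.
    public static class EchoModule implements Module {
        public String name() { return "echo"; }
    }

    // Instantiate a class by the name read from the descriptor, verifying
    // that it really is one of our modules before using it.
    public static Module load(String className) throws Exception {
        Class<?> cls = Class.forName(className);
        if (!Module.class.isAssignableFrom(cls)) {
            throw new IllegalArgumentException(className + " is not a Module");
        }
        return (Module) cls.getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) throws Exception {
        Module m = load("PluginLoader$EchoModule");
        System.out.println(m.name()); // echo
    }
}
```

In a real system the class would come from a module's own JAR via a per-module ClassLoader rather than the application classpath, but the isAssignableFrom guard stays the same.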
You can also use an existing framework, like this one on SourceForge, that will give you a good first step toward creating modules/plugins.