I'm working on a project in Java that will eventually run on Linux and Windows machines, and maybe Mac. My program installs and configures a VNC server, so I'm looking for suggestions on how to implement this part of the project. Should I just use a modular design, or would it be possible to create a platform-independent architecture for this problem?
I think that if VNC configuration differs across platforms, you should just create an interface and a hierarchy of classes that implement it, i.e.
public interface VncConfigurator {
    void configure(Configuration configuration) throws ConfigurationException;
}

public class WindowsVncConfigurator implements VncConfigurator {
    public void configure(Configuration configuration) throws ConfigurationException {}
}

public class LinuxVncConfigurator implements VncConfigurator {
    public void configure(Configuration configuration) throws ConfigurationException {}
}
etc, etc.
You can also create an abstract configurator, or configurator utils, where the common logic is implemented.
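For example, a sketch of such a base class (the validation helper is purely illustrative):

public abstract class AbstractVncConfigurator implements VncConfigurator {
    // Common logic shared by all platform-specific configurators.
    protected void validate(Configuration configuration) throws ConfigurationException {
        if (configuration == null) {
            throw new ConfigurationException("configuration must not be null");
        }
    }
}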
Now create a factory that instantiates the "right" implementation of the configurator according to the platform, and you are done.
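A minimal sketch of such a factory, keyed off the os.name system property (the class names match the hierarchy above; the Mac branch is hypothetical):

public class VncConfiguratorFactory {
    public static VncConfigurator create() {
        String os = System.getProperty("os.name").toLowerCase();
        if (os.contains("win")) {
            return new WindowsVncConfigurator();
        }
        // Hypothetical: only if you add a Mac implementation.
        // if (os.contains("mac")) return new MacVncConfigurator();
        return new LinuxVncConfigurator();
    }
}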
I believe that on Windows you will need some additional libraries, e.g. those that provide access to the registry. But if you need this, first check the following link: http://alexradzin.blogspot.com/2011/01/access-windows-registry-with-pure-java.html
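If you only need to read a value, one library-free option is to shell out to reg.exe (a sketch; assumes Windows, and the key and value names are just placeholders):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class RegistryQuery {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Runs: reg query "HKCU\Software\ExampleVnc" /v Port
        Process p = new ProcessBuilder(
                "reg", "query", "HKCU\\Software\\ExampleVnc", "/v", "Port")
                .redirectErrorStream(true)
                .start();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line); // raw reg.exe output; parse as needed
            }
        }
        p.waitFor();
    }
}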
I want to create a Java IPC server module (with Jigsaw) which has two packages:
com.example.ipc.backend
com.example.ipc.backend.api
In my module-info.java I have the following export:
module ipclib {
    exports com.example.ipc.backend.api;
}
I can import this module in my JavaFX GUI module without problems, but I'm having a problem creating a class inside the api package.
I have one interface in this package which a class in the GUI module should implement to register itself for changes in the IPC module (e.g. when a client connects to the IPC server):
public interface IpcCallback {
    void clientConnected(Client client);
}
The class which implements this interface should then receive a Client with the information about the connected client. This Client class also internally holds a reference to the thread which holds the socket to the client.
public class Client {
    private IpcConnection connection; // IpcConnection is from com.example.ipc.backend

    public Client(IpcConnection connection) {
        this.connection = connection;
    }
}
My problem is now that I want to create an instance of Client in the non-exported backend package whenever a new connection is created. How can I make the constructor accessible only within the module, without making it public?
If I make the constructor public, IntelliJ offers the constructor for creating a new Client object in the GUI module, but that module does not have access to the IpcConnection class. IntelliJ offers a quick fix to export the backend package, but that is not what I want when exporting the api.
Therefore, I wonder whether this is still “allowed”: the Java compiler compiles it without any warnings or problems, so is it just an IntelliJ issue? Or shouldn't it be done like this at all?
But if this is not the allowed way to do it, I wonder why the module system allows exporting only some packages, as there will always be a boundary between exported and non-exported packages (except when the non-exported classes are only called from within the exported classes, and not the other way round).
Minimal project
Module 'backend'
// backend/src/main/java/module-info.java
module ipclib {
    exports com.example.ipc.backend.api;
}

// backend/src/main/java/com/example/ipc/backend/IpcConnection.java
package com.example.ipc.backend;

public class IpcConnection {
}

// backend/src/main/java/com/example/ipc/backend/api/Client.java
package com.example.ipc.backend.api;

import com.example.ipc.backend.IpcConnection;

public class Client {
    private IpcConnection connection;

    public Client(IpcConnection connection) {
        this.connection = connection;
    }

    public String hello() {
        return "Hello";
    }
}

// backend/src/main/java/com/example/ipc/backend/api/IpcCallback.java
package com.example.ipc.backend.api;

public interface IpcCallback {
    void clientConnected(Client client);
}
Module 'gui'
// gui/src/main/java/module-info.java
module gui {
    requires ipclib;
}

// gui/src/main/java/com/example/ipc/gui/App.java
package com.example.ipc.gui;

import com.example.ipc.backend.api.Client;
import com.example.ipc.backend.api.IpcCallback;

public class App implements IpcCallback {
    public static void main(String[] args) {
    }

    @Override
    public void clientConnected(Client client) {
        System.out.println(client.hello());
    }
}
I think this is not possible in Java. You cannot declare an interface that is "visible in namespace A and all its sub-namespaces".
In contrast to .NET, Java does not really know the "internal" concept.
If you just omit the word public on an interface or class definition, the element is only visible inside its current package.
But this is not recursive, as it is in .NET (where an internal type is visible everywhere inside the current project, no matter which subfolder or namespace).
In Java, the type is only visible inside its current package (i.e. the very same namespace). Sub-packages are not included, and neither are parent packages.
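For example (hypothetical packages):

// In com/example/a/Hidden.java — package-private, visible only within com.example.a:
package com.example.a;

class Hidden {
}

// In com/example/a/b/Uses.java — does NOT compile, even though
// com.example.a.b looks like a "sub-namespace" of com.example.a:
// package com.example.a.b;
// class Uses { com.example.a.Hidden h; } // error: Hidden is not public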
If I understood correctly, you have packages A and B, and only A is exported. A has a class C whose constructor should be called from B but should not be made public.
I think your best option is to make class C (Client) an implementation of a public interface. The interface can live in the exported package while the actual implementation stays in the non-exported package.
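A sketch of that idea using the names from the question (ClientImpl is a hypothetical name):

// In the exported package com.example.ipc.backend.api:
package com.example.ipc.backend.api;

public interface Client {
    String hello();
}

// In the non-exported package com.example.ipc.backend:
package com.example.ipc.backend;

import com.example.ipc.backend.api.Client;

public class ClientImpl implements Client {
    private final IpcConnection connection;

    // May be public: this package is not exported, so the GUI module
    // can never reach the constructor anyway.
    public ClientImpl(IpcConnection connection) {
        this.connection = connection;
    }

    @Override
    public String hello() {
        return "Hello";
    }
}

The GUI module then only ever sees the Client interface; ClientImpl and its constructor stay out of reach because com.example.ipc.backend is never exported.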
I thought I would use the new ResourceBundleControlProvider framework in Java 8 to fix something which Oracle themselves will never fix - the default encoding used when reading resource bundles.
So I made a control:
package com.acme.resources;
import java.io.IOException;
import java.util.Locale;
import java.util.ResourceBundle;
public class AcmeResourceBundleControl extends ResourceBundle.Control
{
    @Override
    public ResourceBundle newBundle(String baseName, Locale locale, String format,
                                    ClassLoader loader, boolean reload)
        throws IllegalAccessException, InstantiationException, IOException
    {
        throw new UnsupportedOperationException("TODO");
    }
}
Then I made a provider:
package com.acme.resources;
import java.util.ResourceBundle;
import java.util.spi.ResourceBundleControlProvider;
public class AcmeResourceBundleControlProvider implements ResourceBundleControlProvider
{
    private static final ResourceBundle.Control CONTROL = new AcmeResourceBundleControl();

    @Override
    public ResourceBundle.Control getControl(String baseName)
    {
        if (baseName.startsWith("com.acme."))
        {
            return CONTROL;
        }
        else
        {
            return null;
        }
    }
}
Then in META-INF/services/java.util.spi.ResourceBundleControlProvider:
com.acme.resources.AcmeResourceBundleControlProvider
Then I just tried to run our application from IDEA and found that it never loads my provider (otherwise the exception would be raised).
I have checked the names and they all seem to match up. I have checked the compiler output directory IDEA is using and it does contain the service file. I wrote a simple test program which just tries to look up the service:
public static void main(String[] args)
{
    for (ResourceBundleControlProvider provider :
             ServiceLoader.load(ResourceBundleControlProvider.class))
    {
        System.out.println(provider.getClass());
    }
}
This does print out one entry which is the name of my implementation class. So the issue is not in the service file.
If I set a breakpoint inside ResourceBundle, I seem to be able to access the custom provider class, yet initial forays into the debugger show that ServiceLoader isn't finding any implementations, and I can't figure out why. I'm sure there is some dodgy class loader magic going on which results in my class not being loaded. :(
Some scary wording in the Javadoc makes it sound like the provider might have to be installed as a global extension. If that really is the case, it's a bit of a shame, because this seemed like a useful way to override the default (and in my opinion broken) behaviour. But I also read the tutorial on the matter, and it didn't seem to describe anything like that (unless the good behaviour was pulled out of Java 8 at the very last minute and the docs are out of date!)
The tutorial does state that the JAR containing the ResourceBundleControlProvider must be in the JVM's system extension directory. Section 6 of the tutorial describes the requirement:
java -Djava.ext.dirs=lib -cp build RBCPTest
When you install a Java extension, you typically put the JAR file of the extension in the lib/ext directory of your JRE. However, this command specifies the directory that contains Java extensions with the system property java.ext.dirs.
The JavaDoc for ServiceLoader.loadInstalled() also states that providers on the application's class path are ignored.
Your problem is that the java.util.ResourceBundle that comes with the JVM does a ServiceLoader.loadInstalled(ResourceBundleControlProvider.class) to obtain a list of providers in the static initializer, and uses the thus obtained list ever after.
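A small sketch that makes the difference visible (assuming the provider above is on the class path rather than installed as an extension): ServiceLoader.load() searches the class path, while ServiceLoader.loadInstalled() deliberately ignores it.

import java.util.ServiceLoader;
import java.util.spi.ResourceBundleControlProvider;

public class LoadInstalledCheck {
    public static void main(String[] args) {
        System.out.println("load() finds:");
        for (ResourceBundleControlProvider p :
                 ServiceLoader.load(ResourceBundleControlProvider.class)) {
            System.out.println("  " + p.getClass()); // found via the class path
        }
        System.out.println("loadInstalled() finds:");
        for (ResourceBundleControlProvider p :
                 ServiceLoader.loadInstalled(ResourceBundleControlProvider.class)) {
            System.out.println("  " + p.getClass()); // empty unless installed as an extension
        }
    }
}

This mirrors what ResourceBundle does internally, which is why the test program sees the provider but ResourceBundle never picks it up.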
Is there any good tutorial where I can learn how to run a Java application with multiple classes (one class after the other, in a package) in NetBeans IDE 7.4?
Thank you!
I don't think that's possible, and I don't even know why you'd want multiple mains, but one way to do this manually is simply to write another main that calls all the classes you want to run:
public class MultiMain {
    public static void main(String[] args) throws IOException {
        System.out.println(InsertionSort.class);
        InsertionSort.main(args);
        System.out.println(ConsoleTest.class);
        ConsoleTest.main(args);
        System.out.println(Main.class);
        Main.main(args);
    }
}
Also, maybe something like unit tests is what you are looking for?
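For example, a minimal JUnit 4 sketch (assumes JUnit is on the class path; the tested classes are the hypothetical ones from the MultiMain example above):

import org.junit.Test;

public class AllMainsTest {
    @Test
    public void runsInsertionSort() throws Exception {
        InsertionSort.main(new String[0]);
    }

    @Test
    public void runsConsoleTest() throws Exception {
        ConsoleTest.main(new String[0]);
    }
}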
I'm developing a Java application that has 20 plugins. Each plugin has some similar GUI menu items and menu events, but they store data in different tables in a database. Currently I'm creating separate GUI, event and model classes for each plugin by copying and pasting into different class files.
Is it wise to develop separate GUI, event and model classes for each plugin and duplicate similar methods across plugins?
I need your advice on how to create generic GUI, event and model interfaces for all the plugins without making my application hard to maintain.
Thank you.
We have a plug-in system in our product and ran into the same issue. Many of the plug-ins share a lot of code. We ultimately decided on the following:
Define clean interfaces (PluginABC implements MyProductInterface). These interfaces don't require a specific implementation, but...
We provided an AbstractPlugin that plugins can extend. This AbstractPlugin provides a ton of standard functionality that is useful for most plug-ins.
Example:
public interface MyProductInterface {
    void doIt();
}

public class MyPlugin extends AbstractPlugin implements MyProductInterface {
    public void doIt() {
        // ...
        usefulMethodForMostPlugins();
        // ...
    }
}

public abstract class AbstractPlugin {
    public void usefulMethodForMostPlugins() {
        // standard functionality shared by most plug-ins
    }
}
I have a Java project that is referenced in a J2ME project and in an Android project.
In this project I would like to use conditional compilation.
Something like...
//#if android
...
//#endif
//#if j2me
...
//#endif
I have been reading about this, but I have not found anything useful yet.
You could use Antenna (there is a plugin for Eclipse, and you can use it with the Ant build system).
I'm using it in my projects in the way you've described, and it works perfectly :)
EDIT: here is an example related to @WhiteFang34's solution, which is the way to go:
In your core project:
// base class Base.java
public abstract class Base {
    public static Base getInstance()
    {
        //#ifdef ANDROID
        return new AndroidBaseImpl();
        //#elif J2ME
        return new J2MEBaseImpl();
        //#endif
    }

    public abstract void doSomething();
}
// Android-specific implementation AndroidBaseImpl.java
//#ifdef ANDROID
public class AndroidBaseImpl extends Base {
    public void doSomething() {
        // Android code
    }
}
//#endif
// J2ME-specific implementation J2MEBaseImpl.java
//#ifdef J2ME
public class J2MEBaseImpl extends Base {
    public void doSomething() {
        // J2ME code
    }
}
//#endif
In your project that uses the core project:
public class App {
    public void something() {
        // Depends on the preprocessor symbol you used to build the project
        Base.getInstance().doSomething();
    }
}
Then, if you want to build for Android, you just define the ANDROID preprocessor symbol, or J2ME if you want to build for the J2ME platform...
Anyway, I hope it helps :)
Perhaps you should consider creating interfaces around the logic that's specific to a profile (J2ME, Android or other in the future). Then create concrete implementations of your interface for each profile. Any common parts you could split out into an abstract base class for both implementations to extend. This way your logic for each profile is nicely separated for different concerns. For each profile just build the appropriate set of classes (you could separate them by package for example). It'll be easier to maintain, debug, test and understand in the long run.
The Eclipse MTJ project provides preprocessing support, as documented. This support was mainly targeted at tackling fragmentation problems in Java ME. I have not tested the preprocessing support together with the Android tooling, but it may just work.