I'm writing a tool in Java and I need to provide some parameters that the user can set.
I thought it would be good to have the ability to save all parameters in a file (and just run the .jar) and to override the saved parameters on the command line.
So I need to somehow handle parameters from two sources (priority, validity, etc.). Currently I use Apache Commons CLI to read the CLI-provided parameters and java.util.Properties for the file-provided properties, and then I combine these properties together (adding some defaults if needed). But I don't like the result; it seems over-complicated to me.
So the code is something like this:
Properties fromFile = new Properties();
fromFile.load(new FileInputStream("settings.properties"));

Options cliOptions = new Options();                  // org.apache.commons.cli.Options
cliOptions.addOption(shortName, longName, hasArg, description);
//add more options
CommandLineParser parser = new DefaultParser();
CommandLine fromCli = parser.parse(cliOptions, args);
//at this point I have two different objects with properties I need,
//and I need to get every property from fromCli, check it's not empty,
// if it is, get it from fromFile, etc
So the question is: is there any library to handle properties from different sources (cli, file, defaults)? I tried googling, but did not succeed. Sorry if my googling skills are just not enough.
I'd like the code to be something like this:
import org.supertools.allPropsLib;
allPropsLib.PropsHandler handler = new allPropsLib.PropsHandler();
handler.addOptions(name, shortName, hasArg, description, defaultsTo);
handler.addSource(allPropsLib.Sources.CLI);
handler.addSource(allPropsLib.Sources.FILE);
handler.addSource(allPropsLib.Sources.DEFAULTS);
handler.setFileSource("filename");
allPropsLib.PropsContainer properties = handler.readAllProps();
// and at this point container should contain properties combined
// maybe there should be some handler function to tell the priorities,
// but I don't need to decide from where each properties should be taken
After you define the properties, load them into a java.util.Properties container regardless of the source. Then call the logic and pass it the container as a parameter.
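For example, a minimal sketch of that merge (the option names and default values here are placeholders; CLI values win over file values, which win over hard-coded defaults):

Properties defaults = new Properties();
defaults.setProperty("mode", "fast");               // hard-coded fallback values (placeholder key)

Properties merged = new Properties(defaults);       // defaults have the lowest priority
merged.putAll(fromFile);                            // file values override defaults
for (Option o : fromCli.getOptions()) {             // CLI values override everything
    if (o.getValue() != null) {
        String key = o.getLongOpt() != null ? o.getLongOpt() : o.getOpt();
        merged.setProperty(key, o.getValue());
    }
}
// pass 'merged' to the rest of the application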
picoCLI's @-file mechanism is almost what I need, but not exactly. The reason is that I want to control the exact location of additional files parsed -- depending on previous option values.
Example: When called with the options
srcfolder=/a/b optionfile=of.txt, my program should see the additional options read from /a/b/of.txt, but when called with srcfolder=../c optionfile=of.txt, it should see those from ../c/of.txt.
The @-file mechanism can't do that, because it expands ALL the option files (always relative to the current folder, if they're relative) prior to processing ANY option values.
So I'd like to have picoCLI...
process options "from left to right",
recursively parse an option file when it's mentioned in an optionfile option,
and after that continue with the following options.
I might be able to solve this by recursively starting to parse from within the annotated setter method:
...
Config cfg = new Config();
CommandLine cmd = new CommandLine(cfg);
cmd.parseArgs(a);
...
public class Config {
    private String srcfolder;

    @Option(names = "srcfolder")
    public void setSrcfolder(String path) {
        this.srcfolder = path;
    }

    @Option(names = "optionfile")
    public void parseOptionFile(String pathAndName) {
        // validate path, do some other housekeeping...
        CommandLine cmd = new CommandLine(this /* same Config instance! */);
        cmd.parseArgs(new String[] { "@" + this.srcfolder + pathAndName });
    }
...
This way several CommandLine instances would call setter methods on the same Config instance, recursively "interrupting" each other. Now comes the actual question: Is that a problem?
Of course my Config class has state. But do CommandLine instances also have state that might get messed up if other CommandLine instances also modify cfg "in between options"?
Thanks for any insights!
Edited to add: I tried, and I'm getting an UnmatchedArgumentException on the @-file option:
Exception in thread "main" picocli.CommandLine$UnmatchedArgumentException: Unmatched argument at index 0: '@/path/to/configfile'
at picocli.CommandLine$Interpreter.validateConstraints(CommandLine.java:13490)
...
So first I have to get around this: obviously picoCLI doesn't expand the @-file option unless it comes directly from the command line.
I did get it to work: several CommandLine instances can indeed work on the same instance of an annotated class, without interfering with each other.
There are some catches and I had to work around a strange picoCLI quirk, but that's not exactly part of an answer to this question, so I explained them in another question.
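One way around the @-file limitation, sketched here only as a possibility (not necessarily the workaround I used), is to read the option file manually inside the setter and feed its contents to a nested parseArgs call. This assumes one argument per line and uses Files/Paths from java.nio.file and List from java.util:

@Option(names = "optionfile")
public void parseOptionFile(String pathAndName) throws IOException {
    // read the option file ourselves, since @-file expansion only applies
    // to arguments coming directly from the command line
    List<String> fileArgs = Files.readAllLines(Paths.get(this.srcfolder, pathAndName));
    // parse the file's arguments against the same Config instance
    new CommandLine(this).parseArgs(fileArgs.toArray(new String[0]));
}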
I need to create a BroadcastStream so that I can change properties in the database and see those changes applied to the application in real time.
I have two problems:
1) When I read the database I need all the rows at the same time (via a ResultSet, a HashMap, or anything else with a key-value structure), because some properties depend on other properties, so I cannot process them individually.
The structure of my MapStateDescriptor will be:
//String = topic name
//TopicProperties = object containing all the topic properties
MapStateDescriptor<String, TopicProperties> propertiesStateDescriptor =
        new MapStateDescriptor<>("properties",
                BasicTypeInfo.STRING_TYPE_INFO,
                TypeInformation.of(new TypeHint<TopicProperties>() {}));

BroadcastStream<Row> propertiesBroadcastStream = env.createInput(JDBCInputFormat)
        .map(new TopicPropertiesDbMapper())
        .broadcast(propertiesStateDescriptor);
TopicPropertiesDbMapper converts what JDBCInputFormat returns into the String / TopicProperties structure described above.
The problem is that it is processed one row at a time, but I need to process them all together, as mentioned above.
2) Repeat the reading of the properties and update the BroadcastStream once an hour.
To clarify, I have already built a version of the above that reads the properties from a file, through:
readFile(FileInputFormat, filePath, FileProcessingMode, re-read interval in milliseconds)
It is working, and in the file-based version I solved the two problems listed above with:
1) setting the "unsplittable" flag of the FileInputFormat class to "true";
2) FileProcessingMode.PROCESS_CONTINUOUSLY.
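For the database case, one possible direction (only a sketch, not necessarily the idiomatic Flink solution; the JDBC URL, the query, the "topic" column and the mapRow helper are placeholders, and imports from java.sql, java.util and the Flink streaming API are omitted) is a custom source that re-reads the whole table periodically and emits all rows together as a single Map, which can then be broadcast:

public class HourlyTopicPropertiesSource extends RichSourceFunction<Map<String, TopicProperties>> {
    private volatile boolean running = true;

    @Override
    public void run(SourceContext<Map<String, TopicProperties>> ctx) throws Exception {
        while (running) {
            Map<String, TopicProperties> all = new HashMap<>();
            try (Connection conn = DriverManager.getConnection("jdbc:...");       // placeholder URL
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT ...")) {                // placeholder query
                while (rs.next()) {
                    all.put(rs.getString("topic"), mapRow(rs));                   // mapRow: ResultSet row -> TopicProperties (placeholder)
                }
            }
            ctx.collect(all);                  // problem 1: one element containing the whole key-value structure
            Thread.sleep(60 * 60 * 1000L);     // problem 2: re-read once an hour
        }
    }

    @Override
    public void cancel() {
        running = false;
    }
}

The stream produced by env.addSource(new HourlyTopicPropertiesSource()) could then be broadcast with a MapStateDescriptor along the lines of the one above.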
I'm trying to define a Pentaho Kettle (ktr) transformation via code. I would like to add a Text File Input step to the transformation: http://wiki.pentaho.com/display/EAI/Text+File+Input.
I don't know how to do this (note that I want to achieve the result in a custom Java application, not using the standard Spoon GUI). I think I should use the TextFileInputMeta class, but when I try to define the filename the transformation doesn't work anymore (it appears empty in Spoon).
This is the code I'm using; I think something is wrong in the third line:
PluginRegistry registry = PluginRegistry.getInstance();
TextFileInputMeta fileInMeta = new TextFileInputMeta();
fileInMeta.setFileName(new String[] {myFileName});
String fileInPluginId = registry.getPluginId(StepPluginType.class, fileInMeta);
StepMeta fileInStepMeta = new StepMeta(fileInPluginId, myStepName, fileInMeta);
fileInStepMeta.setDraw(true);
fileInStepMeta.setLocation(100, 200);
transAWMMeta.addStep(fileInStepMeta);
To run a transformation programmatically, you should do the following:
Initialise Kettle
Prepare a TransMeta object
Prepare your steps
Don't forget about Meta and Data objects!
Add them to TransMeta
Create Trans and run it
By default, each transformation spawns a thread per step, so use trans.waitUntilFinished() to force your thread to wait until execution completes
Pick up the execution's results if necessary
Use this test as example: https://github.com/pentaho/pentaho-kettle/blob/master/test/org/pentaho/di/trans/steps/textfileinput/TextFileInputTests.java
Also, I would recommend creating the transformation manually and loading it from a file, if that is acceptable in your circumstances. This will help you avoid lots of boilerplate code. It is quite easy to run transformations in this case; see an example here: https://github.com/pentaho/pentaho-kettle/blob/master/test/org/pentaho/di/TestUtilities.java#L346
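Putting those steps together, a rough sketch of the programmatic flow (the second step and its meta object are hypothetical here, step configuration and hop details are elided, and exception handling is omitted):

KettleEnvironment.init();                          // initialise Kettle once per JVM

TransMeta transMeta = new TransMeta();
transMeta.setName("generated_transformation");

// prepare your steps (as in your snippet) and register them
transMeta.addStep(fileInStepMeta);
transMeta.addStep(outputStepMeta);                 // hypothetical second step
transMeta.addTransHop(new TransHopMeta(fileInStepMeta, outputStepMeta));

Trans trans = new Trans(transMeta);
trans.execute(null);                               // spawns one thread per step
trans.waitUntilFinished();                         // wait for all step threads to complete

if (trans.getErrors() > 0) {
    // inspect the logs / react to the failure
}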
I need to create a log file in the application directory. Currently I'm using a hard-coded absolute path, like
public class Transformer
{
    static final String LOG_DIR = "ABSOLUTE_PATH_TO_THE_APPLICATION_DIRECTORY";
    File log = new File(LOG_DIR, "log");
}
If someone else has my code, they will have to go into the source code, change LOG_DIR and then recompile. I only know a little about GNU make. My question is: how can I create an "installer" that works like:
./config
make install
and what's the standard/better way of achieving this?
Currently I have two options:
1) String path = Transformer.class.getProtectionDomain().getCodeSource().getLocation().getPath();
2) Best way to load application settings
With option 2, I think I can use a shell script to generate a .properties file.
Update
Sorry about the confusing "log file": I actually mean a regular file; it just happens to contain some sort of log information.
You could define the "app.logs.dir" property on application start and refer to it in the log4j configuration:
public static void main(String[] args) {
    ...
    final String appRootDir = /* Detect app location. */;
    System.setProperty("app.logs.dir", appRootDir);
    // Now we can use the logger.
    ...
}
And in "log4j.properties":
log4j.appender.filelog.file=${app.logs.dir}/myapp.log
UPDATE
Option #1, the getProtectionDomain() one, might not work depending on the JVM's SecurityManager settings, so the best way is probably option #2: your installer script should store the application installation location in the application configuration file.
And to keep the number of component references low you can read the configuration file into System properties just like I showed above.
File log = new File(System.getProperty("app.logs.dir"), "log");
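For example, a minimal sketch of reading such an installer-written file into System properties at startup (the app.properties file name and its keys are placeholders):

Properties config = new Properties();
try (FileInputStream in = new FileInputStream("app.properties")) {   // written by the installer
    config.load(in);
}
for (String name : config.stringPropertyNames()) {
    System.setProperty(name, config.getProperty(name));              // e.g. app.logs.dir
}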
Background:
I have a requirement that messages displayed to the user must vary both by language and by company division. Thus, I can't use out of the box resource bundles, so I'm essentially writing my own version of resource bundles using PropertiesConfiguration files.
In addition, I have a requirement that messages must be modifiable dynamically in production w/o doing restarts.
I'm loading up three different iterations of property files:
-basename_division.properties
-basename_2CharLanguageCode.properties
-basename.properties
These files exist in the classpath. This code is going into a tag library to be used by multiple portlets in a Portal.
I construct the possible .properties file names, and then try to load each of them via the following:
PropertiesConfiguration configurationProperties;
try {
configurationProperties = new PropertiesConfiguration(propertyFileName);
configurationProperties.setReloadingStrategy(new FileChangedReloadingStrategy());
} catch (ConfigurationException e) {
/* This is ok -- it just means that the specific configuration file doesn't
exist right now, which will often be true. */
return(null);
}
If it did successfully locate a file, it saves the created PropertiesConfiguration into a hashmap for reuse, and then tries to find the key. (Unlike regular resource bundles, if it doesn't find the key, it then tries to find the more general file to see if the key exists in that file -- so that only override exceptions need to be put into language/division specific property files.)
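To illustrate the lookup order, here is a minimal sketch of that fallback chain (the cache map, the loadOrNull helper wrapping the try/catch above, and the exact candidate file names are simplified placeholders):

private final Map<String, PropertiesConfiguration> configCache = new HashMap<>();

public String getMessage(String key, String division, String language) {
    // most specific file first, plain basename last
    String[] candidates = {
        "basename_" + division + ".properties",
        "basename_" + language + ".properties",
        "basename.properties"
    };
    for (String fileName : candidates) {
        PropertiesConfiguration config = configCache.get(fileName);
        if (config == null) {
            config = loadOrNull(fileName);          // the try/catch shown above
            if (config != null) {
                configCache.put(fileName, config);  // cache successfully loaded files for reuse
            }
        }
        if (config != null && config.containsKey(key)) {
            return config.getString(key);
        }
    }
    return null;                                    // key not found in any file
}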
The Problem:
If a file did not exist the first time it was checked, it throws the expected exception. However, if a file is later dropped into the classpath and this code is re-run, the exception is still thrown. Restarting the portal obviously clears the problem, but that's not useful to me -- I need to be able to let them drop new messages in place for language/companyDivision overrides w/o a restart. And I'm not that interested in creating blank files for all possible divisions, since there are quite a few divisions.
I'm assuming this is a classLoader issue, in that it determines that the file did not exist in the classpath the first time, and caches that result when trying to reload the same file. I'm not interested in doing anything too fancy w/ the classLoader. (I'd be the only one who would be able to understand/maintain that code.) The specific environment is WebSphere Portal.
Any ways around this or am I stuck?
My guess is that Apache's FileChangedReloadingStrategy does not report ENTRY_CREATE events for a file system directory.
If you're using Java 7, I propose trying the following: simply implement a new ReloadingStrategy using the Java 7 WatchService. That way, every time a file in one of your target directories changes, or a new property file is placed there, you poll for the event and are able to add the properties to your application.
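A minimal sketch of the WatchService part (the directory path is a placeholder, exception handling is omitted, and wiring this into a custom ReloadingStrategy is left out):

Path dir = Paths.get("/path/to/properties/dir");    // placeholder: where the .properties files live
WatchService watcher = FileSystems.getDefault().newWatchService();
dir.register(watcher, StandardWatchEventKinds.ENTRY_CREATE, StandardWatchEventKinds.ENTRY_MODIFY);

while (true) {
    WatchKey key = watcher.take();                   // blocks until an event arrives
    for (WatchEvent<?> event : key.pollEvents()) {
        Path changed = dir.resolve((Path) event.context());
        if (changed.toString().endsWith(".properties")) {
            // (re)load the PropertiesConfiguration for this file and refresh your cache
        }
    }
    key.reset();
}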
If you're not on Java 7, a library such as JNotify might be a better way to get notified of a new entry in a directory. But again, you need to implement the ReloadingStrategy.
UPDATE for Java 6:
PropertiesConfiguration configurationProperties;
try {
configurationProperties = new PropertiesConfiguration(propertyFileName);
configurationProperties.setReloadingStrategy(new FileChangedReloadingStrategy());
} catch (ConfigurationException e) {
JNotify.addWatch(propertyFileDirectory, JNotify.FILE_CREATED, false, new FileCreatedListener());
}
where
class FileCreatedListener implements JNotifyListener {
// other methods
public void fileCreated(int watchId, String rootPath, String fileName) {
configurationProperties = new PropertiesConfiguration(rootPath + "/" + fileName);
configurationProperties.setReloadingStrategy(new FileChangedReloadingStrategy());
// or any other business with configurationProperties
}
}