Java code to use a variable value depending on a build flag

I have a makefile that compiles all of my Java files.

The way I have done this multiple times in the past is to generate a Java file depending on the flag. If you are using Ant, this code generation is very simple. Otherwise, you can use a template file with placeholders and do some shell scripting or similar to generate the file.
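As an illustration of the template approach, here is a minimal sketch; the class name, placeholder token, and sed command are made up for the example:

// BuildConfig.java.template -- the @BUILD_TYPE@ token is substituted at build time,
// e.g. with: sed 's/@BUILD_TYPE@/DEBUG/' BuildConfig.java.template > BuildConfig.java
public final class BuildConfig {
    public static final String BUILD_TYPE = "@BUILD_TYPE@";
    public static final boolean DEBUG = "DEBUG".equals(BUILD_TYPE);

    private BuildConfig() {}  // constants only, not instantiable
}

The rest of the code base then refers to BuildConfig.DEBUG, and only the generated file changes between builds.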

In Ant you can use the replace task to modify files as part of your build.
We do this in our builds, but we use it to modify a Java .properties file which the application reads for its configurable behavior.
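On the application side, reading such a .properties file at runtime could look like the sketch below; the file name and key are hypothetical:

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public final class AppConfig {
    private static final Properties PROPS = new Properties();

    static {
        // the values in this file were filled in by the Ant replace task at build time
        try (InputStream in = AppConfig.class.getResourceAsStream("/app.properties")) {
            if (in == null) throw new IOException("app.properties not found on classpath");
            PROPS.load(in);
        } catch (IOException e) {
            throw new ExceptionInInitializerError(e);
        }
    }

    public static boolean featureEnabled() {
        return Boolean.parseBoolean(PROPS.getProperty("feature.enabled", "false"));
    }
}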

I've written rather pleasant flag-controlled systems using a combination of Google Guice and Apache Commons CLI to inject flag-controlled variables into constructors.
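A minimal sketch of that pattern, assuming Google Guice and Apache Commons CLI are on the classpath; the flag name and the injected Worker class are made up:

import com.google.inject.AbstractModule;
import com.google.inject.Guice;
import com.google.inject.Inject;
import com.google.inject.Injector;
import com.google.inject.name.Named;
import com.google.inject.name.Names;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.DefaultParser;
import org.apache.commons.cli.Options;

public class FlagDemo {
    static class Worker {
        final boolean verbose;
        @Inject Worker(@Named("verbose") boolean verbose) { this.verbose = verbose; }
    }

    public static void main(String[] args) throws Exception {
        // parse the command line with Commons CLI
        Options options = new Options();
        options.addOption("v", "verbose", false, "enable verbose output");
        CommandLine cmd = new DefaultParser().parse(options, args);

        // bind the parsed flag value so Guice can inject it into constructors
        Injector injector = Guice.createInjector(new AbstractModule() {
            @Override protected void configure() {
                bindConstant().annotatedWith(Names.named("verbose")).to(cmd.hasOption("v"));
            }
        });

        System.out.println("verbose = " + injector.getInstance(Worker.class).verbose);
    }
}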

How do I configure AWS Amplify without the command-line tool?

I'm trying to write an application that uses the AWS API from an Android app written in Java. It seems that the recommended way to do it is using a special set of libraries called "Amplify." I was able to import the appropriate Amplify Java classes into my code, but I see that not all the parameters I want to supply (such as the S3 bucket or the API access key) can be given as method arguments.
All the advice I see online suggests installing and running a command-line configuration tool via npm install aws-amplify. But I'd prefer not to use a command-line tool that asks me questions: I'd rather configure everything in code. And I don't want to install npm or mess around with it (full disclosure: I tried installing it and got some hassles).
Is there a way to supply the Amplify configuration without using the command-line tool, perhaps via a configuration file or some additional arguments to the methods I'm calling within Java?
I figured it out!
Amplify.configure() has a not-well-documented overload where you can specify a config file in the form of an Android "resource."
So instead of using
Amplify.configure(getApplicationContext());
as directed in the tutorials, I use
Amplify.configure(
        AmplifyConfiguration.fromConfigFile(getApplicationContext(), R.raw.amplifyconfiguration),
        getApplicationContext());
The config file needs to be located in the app/src/main/res/raw/ path of the project, named amplifyconfiguration.json. The development environment automatically generates the definition of the value R.raw.amplifyconfiguration, which is a number identifying the file.
That solves the problem of loading the configuration from an explicit file, without using the amplify CLI. The next hurdle is figuring out what keys can be specified in the file...

Compiling a Single NiFi Standard Processor

I have an issue with the NiFi InvokeHTTP processor which requires me to make modifications to it. I am not trying to replace it but to create a fork which I can use alongside the original.
The easiest way I have found to do this is to clone the code, checkout the 1.10 tag and run mvn clean install in the nifi/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors directory.
However, the result of this is a JAR file named "nifi-standard-processors-1.10.0.jar". This contains ALL of the standard processors. Instead of this, I am looking to output each processor individually so I can upload only the modified InvokeHTTP processor to NiFi.
The only thing I can think of is to delete the source for the other processors individually which seems a little long-winded. I have had a look in pom.xml and cannot see anything obvious which would allow me to do this either.
Does anyone know how I can achieve this? Apologies if this is an easy question; I haven't used Java in over a decade and this is my first time using Maven.
Thank you in advance.
Is the code change one you can make by extending the processor rather than changing it at the source? If so, I'd recommend creating a custom processor which extends InvokeHTTP in its own Maven bundle (e.g. nifi-harry-bundle) that depends on nifi-standard-processors. This allows you to reuse the functionality already provided, modify what you need, compile and build only the new code, and then copy that NAR (NiFi Archive) directly into the NiFi lib/ directory to use it.
See building custom processors and this presentation for more details.
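A sketch of what such an extension might look like, assuming the behavior you need to change sits in an overridable method; the package name and description are invented:

package org.example.processors;

import org.apache.nifi.annotation.documentation.CapabilityDescription;
import org.apache.nifi.annotation.documentation.Tags;
import org.apache.nifi.processor.ProcessContext;
import org.apache.nifi.processor.ProcessSession;
import org.apache.nifi.processor.exception.ProcessException;
import org.apache.nifi.processors.standard.InvokeHTTP;

// Deployed under its own name so it can run alongside the original processor.
@Tags({"http", "https", "custom"})
@CapabilityDescription("InvokeHTTP variant with project-specific modifications")
public class CustomInvokeHTTP extends InvokeHTTP {

    @Override
    public void onTrigger(ProcessContext context, ProcessSession session) throws ProcessException {
        // apply the modified behavior here, delegating to the original where possible
        super.onTrigger(context, session);
    }
}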

Execute a Kermeta file from a Java program

I developed a transformation that generates a Kermeta XMI file.
My problem is that I would like to run this transformation (.kmt file) in the background, that is to say, from a Java program.
I have tried a lot of code, but always without result.
Can someone help me?
Thank you in advance for your help.
Running a Kermeta program depends on the version of Kermeta and on its execution context.
With Kermeta 2, the .kmt code is compiled into Scala. This Scala code can be called directly from Java thanks to Scala's compatibility with Java.
You first need to declare at least one main operation in order to generate the appropriate initialization operation.
Let's say you have a method "transformFooToBar" in a class "org::example::FooToBar" in your kermeta code.
Declare it as a main entry point using the tags
#mainClass "org::example::FooToBar"
#mainOperation "transformFooToBar"
This main operation must have only string parameters.
This will generate a utility Scala class org.example.MainRunner which contains useful initialization methods and a main operation that you can call from the command line.
Case 1/ Kermeta 2 code called from an Eclipse plugin:
This is the simplest case: call the method org.example.MainRunner.init4eclipse()
before any other call. Then call your main method org.example.KerRichFactory.createFooToBar.transformFooToBar()
You can call any other classes or methods (i.e. even with parameters that aren't Strings).
This is quite convenient for building Eclipse tools based on Kermeta transformations.
Case 2/ Kermeta 2 code called from a standard Java application (i.e. not running in an Eclipse plugin):
The initialization method is then org.example.MainRunner.init()
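For the FooToBar example above, a standalone call sequence might look like the following sketch; the argument values are made up, and the exact syntax of the generated factory method, when called from Java, may differ slightly:

public class RunFooToBar {
    public static void main(String[] args) {
        // one-time initialization when running outside Eclipse (case 2)
        org.example.MainRunner.init();

        // instantiate the transformation and call its main operation
        // (recall that a main operation may only take String parameters)
        org.example.FooToBar transfo = org.example.KerRichFactory.createFooToBar();
        transfo.transformFooToBar("input.xmi", "output.xmi");
    }
}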
Common trap for case 2: many transformations that run standalone still need to manipulate models using Eclipse URI schemes in their internal referencing system, i.e. platform:/plugin/..., platform:/resource/..., pathmap:/..., or even more complex URI mappings (typically using custom protocols). You can check this easily by looking at the XMI files as text.
In that case, since the Eclipse platform isn't running to provide the lookup mechanism, you need to manually provide an equivalent URI mapping that maps these URIs to your local system URIs (i.e. to file:/... or jar:file:/... URIs).
One possibility is to use a urimap.properties file that provides such a mapping. By default, when running a Kermeta program in Eclipse, a urimap.properties file is generated for the current Eclipse configuration.
When deploying a Kermeta program to another computer, or using a custom deployment packaging, you will have to provide or compute an equivalent file for the target computer.
For convenience, you can set the location of this urimap.properties file via the system property "urimap.file.location".
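A standalone launcher would then set the property before initialization; the paths below are examples, and note that ':' has to be escaped in .properties files:

public class StandaloneLauncher {
    public static void main(String[] args) {
        // point the Kermeta runtime at a hand-written mapping file before init();
        // an example line in /opt/myapp/urimap.properties could be:
        //   platform\:/resource/myproject/=file\:/opt/myapp/models/
        System.setProperty("urimap.file.location", "/opt/myapp/urimap.properties");
        org.example.MainRunner.init();
    }
}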

Use and configure Maven from a Java program, using the Maven Java API

I'm using the Maven Java API to configure Maven in a custom Java project.
In particular, I need to configure some Maven settings, among them proxy settings.
How can I do this? I googled a lot, but I found no examples of how to use Maven from Java.
Can you give me an example, a guide, or a snippet of code, whatever clarifies how to use (and configure) Maven via the Java API, i.e. from Java code?
I found this maven reference, but what do I specifically need?
Thanks in advance.
I've already seen this question, but unfortunately there is no mention of how to edit settings.xml from the Maven API. I suppose it is possible, but I'm not sure of it, so I asked a new question, wider than that one: how can I manage Maven from Java? Settings, runs, properties, whatever... is it possible?
For example, regarding settings management, I found this maven-settings API; could it be useful? Is it a "read-only" API? I guess it isn't, but I've found no way to "write" modifications back to the file, and there are no examples of how to use it.
Well, yes, you are a bit crazy. You can take a look at some plug-ins which modify pom.xml files, for example the versions:set goal shown here:
http://www.mojohaus.org/versions-maven-plugin/set-mojo.html
The source code for that plug-in will show you how to modify pom.xml files, but you also want to modify the settings.xml file.
All of these files are XML. Basically, you want to obtain a DOM for the .xml file. So, you can use generic XML tools to (1) read the file, (2) modify the document model, (3) write the data back to disk.
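A minimal sketch of that read/modify/write cycle using only the JDK's XML tools, adding a <proxy> entry to settings.xml; the host, port, and id are made up, and it assumes the file already contains a <proxies> element:

import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class AddProxyToSettings {
    public static void main(String[] args) throws Exception {
        File settingsFile = new File(System.getProperty("user.home"), ".m2/settings.xml");

        // (1) read the file into a DOM
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(settingsFile);

        // (2) modify the document model: append a <proxy> under <proxies>
        Element proxy = doc.createElement("proxy");
        append(doc, proxy, "id", "corp-proxy");
        append(doc, proxy, "active", "true");
        append(doc, proxy, "protocol", "http");
        append(doc, proxy, "host", "proxy.example.com");
        append(doc, proxy, "port", "8080");
        ((Element) doc.getElementsByTagName("proxies").item(0)).appendChild(proxy);

        // (3) write the data back to disk
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.transform(new DOMSource(doc), new StreamResult(settingsFile));
    }

    private static void append(Document doc, Element parent, String name, String text) {
        Element child = doc.createElement(name);
        child.setTextContent(text);
        parent.appendChild(child);
    }
}

As for the maven-settings API from the question: as far as I know it is not read-only; it ships SettingsXpp3Reader and SettingsXpp3Writer, which read and write the same file as typed Settings objects, so that would be a less fragile alternative to raw DOM.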
Note well: Maven caches the .xml files. You have to stop the Maven executable and restart it to force it to re-read the .xml files. It sounds like you'll probably be doing this as a matter of course. :-)

Java preprocess phase

I'm writing a Java application that needs a lot of static data that is stored in many enum types. I would like a user-friendly way to customize this data using, for example, XML or JSON files, but since I'm not allowed to do that directly with enums, I was looking for an elegant way to do it.
Maybe a good solution would be to have a separate Java program that reads the XML files and produces the Java sources, which are then compiled with the rest of the sources. My doubt is how to automate this process in a standalone way (e.g. Ant?) and how to integrate it seamlessly with Eclipse so that it is done automatically while I'm working on the project. Does anything similar to what I'm looking for already exist? Any suggestions to solve my problem?
Thanks!
If the items and the overall structure are somehow fixed (and what varies most is the values of the attributes), you could consider defining the enum with one entry for each of your items and letting the enum populate its own constants with data read from an external source (XML / JSON), either at load time or on demand.
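A sketch of that idea, using a .properties file for brevity instead of XML or JSON; the enum constants, file name, and key scheme are invented:

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public enum Item {
    SWORD, SHIELD, POTION;

    private int cost;  // filled in from the external file at class-load time

    public int getCost() { return cost; }

    static {
        Properties data = new Properties();
        try (InputStream in = Item.class.getResourceAsStream("/items.properties")) {
            if (in == null) throw new IOException("items.properties not on classpath");
            data.load(in);
        } catch (IOException e) {
            throw new ExceptionInInitializerError(e);
        }
        // items.properties contains lines such as:  SWORD.cost=120
        for (Item item : values()) {
            item.cost = Integer.parseInt(data.getProperty(item.name() + ".cost", "0"));
        }
    }
}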
Create a project whose sole job is to generate java from your sources.
Make sure that the generation phase is done by Ant.
Now, wrap this project into Eclipse and use a custom Ant builder that calls the target in your already existing build.xml.
This is a standard part of our dev infrastructure, so this definitely works.
You can write a Maven plugin that generates the code. There are some plugins that do that. It won't run automatically, but you can bind it to the standard Maven lifecycle so it gets executed just before the compile phase.
I just did something like that recently.
You can have Ant integrate seamlessly with Eclipse to achieve that:
In Eclipse, open the project properties, go to "Builders", click "New...", select "Ant Builder", select a build file, go to the "Targets" tab and click "Set Targets..." for "Auto Build". Select the desired target and you are done. The target will run every time you save a source file (if "Build Automatically" is selected).
Have you considered including the XML files in your jar, and loading them on startup into maps that use the enum as a key?
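That variant keeps the enum itself free of loading logic; a sketch, with the XML parsing elided and the names invented:

import java.util.EnumMap;
import java.util.Map;

public class ItemRepository {
    private final Map<Item, Integer> cost = new EnumMap<>(Item.class);

    // called once on startup while parsing items.xml from the jar
    void put(Item item, int value) { cost.put(item, value); }

    public int costOf(Item item) { return cost.get(item); }
}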
