Set a variable value within a jar using a gradle task - java

I have a client and server architecture.
The client is a runnable .jar file built using the following Gradle task:
jar {
    archiveName = "app.jar"
    from sourceSets.main.output.classesDir
    include '**/*.class'
    manifest {
        attributes 'Main-Class': 'com.bobbyrne01.app.Main'
    }
    exclude 'META-INF/*.RSA', 'META-INF/*.SF', 'META-INF/*.DSA'
}
I'm making a change to how this client will authenticate.
The problem is, I'd like to specify which method of authentication the client should use at build time,
so that when the user downloads the .jar, they can just run it and whichever authentication method I activated during the build will be active.
What would be the best way to set this authenticationType variable/flag?
For the server, there is an interface for authentication and a number of classes which implement different authentication methods. While building the server, I specify a gradle property which gets set as an environment variable on the docker image.
So at runtime, the server uses reflection to determine which authentication class to instantiate.
But I'm unsure how I can set a similar value within the jar.
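The server-side pattern just described (an interface plus reflection over a configured class name) can be sketched as follows; the interface, class names, and environment variable are all illustrative, not taken from the actual code:

```java
// Illustrative stand-in for the server's authentication interface.
interface Authenticator {
    boolean authenticate(String user, String secret);
}

// Sample implementation, for illustration only.
class NoopAuthenticator implements Authenticator {
    public boolean authenticate(String user, String secret) {
        return true;
    }
}

public class AuthenticatorLoader {
    // Instantiate whichever implementation the environment names,
    // e.g. load(System.getenv("AUTHENTICATION_TYPE")).
    static Authenticator load(String className) {
        try {
            return (Authenticator) Class.forName(className)
                    .getDeclaredConstructor()
                    .newInstance();
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException("Cannot instantiate " + className, e);
        }
    }
}
```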

Use a properties file containing a value that lets you decide, at runtime, which authentication mechanism you want to use.
Define it as
authentication=${authentication}
or
authentication=##authentication##
in the source properties file, and use expand or filter in the standard processResources Gradle task (of type Copy) to replace the placeholder with the actual value you want to use.
See https://docs.gradle.org/current/dsl/org.gradle.api.tasks.Copy.html#org.gradle.api.tasks.Copy:expand(java.util.Map) and https://docs.gradle.org/current/dsl/org.gradle.api.tasks.Copy.html#org.gradle.api.tasks.Copy:filter(java.util.Map,%20java.lang.Class).
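For example (a sketch; the resource name auth.properties, the key authentication, and the Gradle property name are all illustrative), the build-script side could be `processResources { expand(authentication: project.property('authenticationType')) }`, and the client then only has to read the expanded file back at runtime:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.UncheckedIOException;
import java.util.Properties;

public class AuthConfig {

    // Parse the authentication type out of an already-opened properties stream.
    static String authenticationType(InputStream in) {
        try {
            Properties props = new Properties();
            props.load(in);
            return props.getProperty("authentication");
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Load the expanded file from the jar's classpath at runtime.
    public static String authenticationType() {
        InputStream in = AuthConfig.class.getResourceAsStream("/auth.properties");
        if (in == null) {
            throw new IllegalStateException("auth.properties not found on the classpath");
        }
        try (InputStream stream = in) {
            return authenticationType(stream);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

The client can then switch on the returned value to activate the corresponding authentication implementation, or use reflection the same way the server already does.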

Related

jni.h - No such file or directory - Correctly setting JAVA_HOME environment variable

I'm currently trying to use jni.h in Windows Subsystem for Linux.
Any time I try to set it up or run it, I get this error message.
I just want to know how to correctly set my JAVA_HOME environment variable so that the build can find jni.h.
Any help would be greatly appreciated; I don't really know where to start, as I'm not familiar with setting paths yet.
So far my two build.gradle files include this:
This is the build.gradle inside my c subproject
plugins {
    // We're actually using C++, but we can essentially pretend that it's C.
    // The cpp-library plugin compiles our C/C++ code and generates a library file.
    id 'cpp-library'
}
tasks.withType(CppCompile).configureEach {
    // The actual 'jni.h' file lives in the 'include' directory, but there are
    // also a series of other, platform-specific header files in 'include/linux'
    // and/or 'include/win32'. Your actual JDK may only have one of these
    // platform-specific directories.
    includes "$System.env.JAVA_HOME/include"
    includes "$System.env.JAVA_HOME/include/linux"
    includes "$System.env.JAVA_HOME/include/win32"
    includes "$System.env.JAVA_HOME/include/win64"
}
library {
    // Define the library's name. (The file produced will be 'lib<baseName>.so' on Linux or
    // '<baseName>.dll' on Windows.)
    baseName = 'example_c_library'
    // Create a 'shared' library only (not a 'static' library).
    linkage = [Linkage.SHARED]
    // What is the target platform?
    targetMachines = [
        machines.linux.x86_64,
        machines.windows.x86_64,
        machines.macOS.x86_64
    ]
}
This is the build.gradle inside my java subproject
plugins {
    id 'java'
    id 'application'
}
// We need Gradle to finish configuring the other sub-project first, because we need to
// refer to two of its tasks below.
evaluationDependsOn ':c_library'
def libTasks = project(':c_library').tasks
def debugLibTask = libTasks.matching{ it.name.startsWith('linkDebug') }.first()
def releaseLibTask = libTasks.matching{ it.name.startsWith('linkRelease') }.first()
dependencies {
    // Make this subproject ('java_app') depend on the file produced by the linking task in the
    // other subproject.
    runtimeOnly files(releaseLibTask.linkedFile)
    // This declaration is more convoluted than you might expect. We can't simply depend on the
    // other subproject as a whole, because that makes the Java plugin complain that it isn't
    // another Java project. There's no automated logic for tying a C library into a Java
    // application.
    // Instead, this dependency simply causes our C library file to be included as part of the Java
    // application's distributable .zip file. We then have to do some setting up, in 'run{}' and
    // 'startScripts{}', to ensure that Java will be able to load the library.
}
application {
    mainClassName = 'org.example.ExampleJavaApp'
}
run {
    // Make 'gradlew run' set the library path correctly. There is a Java "system property" for
    // this, which needs to be set to the *directory* containing the shared library.
    // We first depend on the 'linkDebug' task that creates the debug version of the library, to
    // ensure that task runs before the 'run' task. Then we make a few more calls to extract the
    // actual directory, and set the library path.
    // Debug vs release? Gradle builds two versions of our C code with different compiler options,
    // one intended for debugging (which is what we're theoretically doing when we execute 'run'),
    // and one for release (which is what the final .zip file is for).
    dependsOn debugLibTask
    systemProperty 'java.library.path', debugLibTask.linkedFile.get().asFile.parentFile
}
startScripts {
    // Make the start-up scripts (both UNIX and Windows) set the library path correctly, so that
    // our application is properly distributable.
    // When our application is distributed, the native library will live inside the same 'lib/'
    // directory that contains the rest of our code. So the library path needs to be the 'lib/'
    // directory. However, we can't hardcode the location of this directory, because we can't know
    // in advance where the user has installed the application. Instead, we have to get the
    // start-up script (both the UNIX and Windows version of it) to figure it out.
    // How we do *that* is a bit hacky though. We first tell the script generator how to set the
    // path, but at the "last minute" we go in and tweak the result, because the actual path
    // depends on a variable in the script that we can't access in the first instance.
    defaultJvmOpts = ['-Djava.library.path=APP_HOME_PLACEHOLDER/lib']
    doLast {
        unixScript.text = unixScript.text.replace('APP_HOME_PLACEHOLDER', '$APP_HOME')
        windowsScript.text = windowsScript.text.replace('APP_HOME_PLACEHOLDER', '%APP_HOME%')
    }
}

Corda RPC client external dependencies

Suppose I have a small RPC client application for Corda, which should be uploaded to a node and called to do some utility work on the node.
In the build.gradle file of my utility RPC client I have just the following dependency, cordaCompile "$corda_release_group:corda-rpc:$corda_release_version", and my jar task looks like this:
jar {
    version = ''
    baseName = 'rpc_utility'
    duplicatesStrategy = DuplicatesStrategy.EXCLUDE
    manifest {
        attributes 'Main-Class': 'com.example.rpcclient.MainKt'
    }
    zip64 = true
    from { (configurations.compile).collect { it.isDirectory() ? it : zipTree(it) } }
}
But when I create my jar, it results in a 58 MB jar file with all the dependencies of Corda in it, which are already there inside the node, packed in the corda.jar file. And CorDapps can use these libraries without having them inside their jar files.
Now the question is: how should I configure my jar task, and what should I include inside it, to tell Java that all the dependencies it needs are right there, in the corda.jar file in the same folder?
P.S. I also tried to create a fat jar like I do now and then minify it with ProGuard, but even after a long proguard-rules list I still have errors, as ProGuard seems to remove a lot of files that Corda needs, so this does not seem to be a good solution. And even if I succeeded, I would get a ~20 MB file for just the few lines of code that I really have...
The client jar is about the right size. It not only includes the dependencies for Corda but also your actual CorDapp jars (contracts and workflows). The reason is that:
The RPC client needs to talk to the Corda node, so it needs the Corda dependencies.
It also needs to understand your CorDapps, as your controller will invoke the flow directly.

How to load external file into classpath Play Framework 2.3

I have a need to load an external file into the classpath using Play Framework 2.3 (Java)
Stipulations:
The external file cannot live inside my Play app (i.e. the /conf and /lib directories, etc. are not an option)
The file needs to be a .properties or a .conf file so I can specify property values
Here's my scenario:
I have a custom JAR that has some code which is looking for a specific file (let's call it myproperties.properties) in the classpath when being used. The way that I'm attempting to find myproperties.properties is by doing this inside a class that resides inside that custom JAR:
ClassLoader classLoader = com.my.package.MyCustomJavaClass.class.getClassLoader();
InputStream inputStream = classLoader.getResourceAsStream("/path/to/myproperties.properties");
I have access to change the properties file name and the path to it inside the JAR.
My Play Framework app (using Java) has this custom JAR in its /lib folder, so it gets automatically added to the classpath (this is tested and works correctly). My Play app calls MyCustomJavaClass when it first loads the / (index) route, so the class loader and input stream code above gets kicked off when I hit my Play app in the browser.
Problem:
I have not been successful in my attempts to load /path/to/myproperties.properties into the classpath when starting the Play App in a way that my code in the custom JAR can see it.
I've been attempting to start play with the classpath command like so in an attempt to feed the JVM the external file:
activator start -J-classpath "-J-classpath:/path/to/myproperties.properties"
I'm adding -J-classpath: to the beginning of the path in an attempt to copy everything that's currently on the classpath and then just add my single, external file. However, doing this doesn't seem to be working (i.e. my inputStream is null).
Questions:
Am I doing the activator start -J-classpath command correctly when starting the play app? Other variations in an attempt to copy the existing classpath first were not allowing the play app to start, but this command at least starts my app.
Reference (Specifying additional JVM arguments): https://www.playframework.com/documentation/2.3.x/ProductionConfiguration
What are some other ways that I could possibly get this done? I've explored overriding the application.conf file using activator start -Dconfig.file=/path/to/application-override.conf and putting my properties inside the new application-override.conf file. However, it doesn't seem to put that file into the classpath for MyCustomJavaClass to find using the Class Loader. Maybe I'm doing this command incorrectly as well?
Is it possible that somehow the Play Framework classpath is separate from the classpath that my custom JAR is seeing? I've been under the assumption that it's all in one JVM and classpath.
Here's the solution I came up with; hopefully it helps someone else out there:
In my "upper environments" (AWS servers) where my Play app is deployed, I put an application-override.conf file in the conf folder in the Play framework app directory.
The application-override.conf is the exact same as my application.conf, but I have some custom properties in both whose values are different in each environment the Play app lives on.
My Play framework app is in a git repo, which is cloned on each upper environment, so I added application-override.conf to the .gitignore (I don't want it checked in to the repo, so it only lives on the servers).
When starting the Play app, I now use activator start "-Dconfig.trace=loads -Dconfig.file=conf/application-override.conf". This will override application.conf with application-override.conf, and application-override.conf will be on the JVM classpath that Play uses to run the app (since it's in the conf directory). -Dconfig.trace=loads spits out more logging to let you know whether the .conf file was loaded properly; it's not a necessary flag if everything is working.
On the Java side, in my custom JAR, I can now do the following:
Properties properties;
InputStream stream;
ClassLoader classLoader = com.my.package.MyCustomJavaClass.class.getClassLoader();
// First, look for application-override.conf on the classpath (upper environments).
stream = classLoader.getResourceAsStream("application-override.conf");
// If null, check for application.conf (local environment).
if (stream == null) {
    stream = classLoader.getResourceAsStream("application.conf");
}
properties = new Properties();
properties.load(stream);
stream.close();
Other notes:
I thought about using a symlink in the conf directory and putting the actual application-override.conf file somewhere else on the environment, but prior to Play 2.4 you can't have symlinks in the conf directory, so I just put the actual application-override.conf file in the conf folder.
The application-override.conf file has different property values for each "upper environment"; otherwise I would have just delivered a single override file to the git repo. And in the custom JAR, I didn't want logic that looked for varying file names like dev-override.conf, pre-prod-override.conf, prod-override.conf, etc. I wanted a single upper-environments override file.
I didn't have success with the -classpath=/path/to/myproperties.properties or -J-classpath=/path/to/myproperties.properties commands in conjunction with activator start, nor with attempting to append to the classpath, e.g. activator start -J-classpath=-J-classpath:/path/to/myproperties.properties or other similar combinations.
Going the route of putting properties in an application-override.conf file actually killed two birds with one stone for me, because I've been wanting to make environment-specific changes by having overriding .conf files on each of my environments, as well as custom properties.
The HOCON format of the .conf files required me to put double quotes around my property values, due to the nature of the values. When reading those properties in Java, the quotes were still there, so I had to do a str.replace("\"", "") when reading them in.
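The quote-stripping just mentioned can be isolated in a small helper. This sketch (the class and method names are made up here) removes only the surrounding pair of quotes, which is slightly safer than a blanket str.replace since it leaves quotes inside the value intact:

```java
public class ConfValues {

    // HOCON-style values read back through java.util.Properties keep their
    // surrounding double quotes, so strip one outer pair if present.
    static String unquote(String value) {
        if (value != null && value.length() >= 2
                && value.startsWith("\"") && value.endsWith("\"")) {
            return value.substring(1, value.length() - 1);
        }
        return value;
    }
}
```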

Getting the current working resource directory in java maven project

I am currently working on a JUnit test that checks functionality responsible for loading/saving a process configuration from/to a file. Given that a particular configuration file is present in resources, the functionality loads parameters from the file; otherwise it attempts to create a new configuration file and persist a default configuration coded in the class. Right now I am using the .class.getResource() method to check whether the configuration file exists and to retrieve the necessary information. This approach works fine from both Maven's "test-classes" and "classes" directories. However, I am having problems when attempting to save the default configuration when the file does not exist: .class.getResource() returns null, as the resource does not yet exist. This stops me from building the target resource directory (which is context-dependent) where the file should be saved.
Is there a way to code my functionality to evaluate whether a particular object is being executed in a test or in production? More precisely, how can I build a relative path to my resource files that points to either production resources (../classes/...) or test resources (../test-classes/...) depending on the execution mode the project is currently in?
My question is somewhat similar to the following How should I discover test-resource files in a Maven-managed Java project? but I think it is different enough to warrant new thread.
If I understand you right, your issue is essentially this: you have a Maven project which (normally, and during unit tests) reads a particular file that determines the application's behaviour. If that file doesn't exist, your application creates it.
The problem with ClassLoader.getSystemResource(...) is that it's not actually scanning a single directory. Instead, it's looking at Java's classpath to determine the location of that particular resource. If there are multiple directories on the classpath, there will be a number of areas the file could potentially be located in.
In a sense then, .getSystemResource(...) is one-way. You're able to look up the location of a file, but not get the appropriate location to place it.
*So what about when you need to put the file in the correct location?*
You have two options essentially:
Hard-code the location of the file: No one likes doing that.
The locations that are scanned on the classpath are passed into the classloader. You could use, for example, the first one and create the file there.
The second option isn't actually a bad one; have a look at this sample code.
final Enumeration<URL> urls = ClassLoader.getSystemClassLoader().getResources("");
if (!urls.hasMoreElements()) {
    LOG.error("No entries exist on the class path!");
    System.exit(1);
}
final File configFile = new File(urls.nextElement().getFile(), "config.xml");
configFile.createNewFile();
LOG.info("Created a new configuration file: " + configFile.getPath());
System.exit(0);
This resolved the configuration file to be within my target folder: ..\target\classes\config.xml
Up to you what you do; happy to provide more tips & advice if you feel more is required.
It sounds like you want to do the following:
When your code runs, it tries to load the configuration file. When the configuration file is not found you want to create the configuration file. The twist is that
if you are executing the code in "production mode" (I presume using something like the exec-maven-plugin or jetty-maven-plugin depending on the nature of your code) you want the configuration file to be placed in ${project.build.outputDirectory}
if you are executing the code in "test mode" (e.g. via surefire or failsafe) you want the configuration file to be placed in ${project.build.testOutputDirectory}
What I would do is use the ServiceLoader pattern.
You create a ConfigFileStore interface that is responsible for storing your configuration.
The ConfigFileStoreFactory enumerates all the services implementing that interface (using the ServiceLoader API, by getting all the /META-INF/services/com.yourpackage.ConfigFileStore resources and extracting the class names from those). If there are no implementations registered, it will instantiate a default implementation that stores the file in a path based on getClass() (i.e. working backwards to get to ${project.build.outputDirectory}; note that it should handle the case where the classes get bundled up into a JAR, and I would presume in such a case the config file might get stored adjacent to the JAR).
Note: The default implementation will not be registered in /META-INF/services
Then in src/test/java you extend the default implementation and register that extended implementation in src/test/resources/META-INF/services/com.yourpackage.ConfigFileStore
Now when running code that has the test code on the classpath, the test version will be found, that will pick up the getClass() for a class from ${project.build.testOutputDirectory} because it is from the test classpath's /META-INF/services.
When running code that does not have the test code on the classpath, the default implementation will pick up the getClass() for a class from ${project.build.outputDirectory}
Should do what you want.
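The lookup-with-fallback part of this pattern can be sketched as follows; the ConfigFileStore interface and the fallback directory resolution are simplified stand-ins for what the answer describes, not its exact code:

```java
import java.net.URI;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.security.CodeSource;
import java.util.ServiceLoader;

// Simplified stand-in for the answer's ConfigFileStore interface.
interface ConfigFileStore {
    Path storeDir();
}

public class ConfigFileStoreFactory {

    public static ConfigFileStore get() {
        // Prefer any implementation registered under
        // META-INF/services/ConfigFileStore on the current classpath; the
        // test classpath would register its own store here.
        for (ConfigFileStore store : ServiceLoader.load(ConfigFileStore.class)) {
            return store;
        }
        // Default, unregistered implementation: work backwards from where
        // this class was loaded (e.g. target/classes in a Maven build).
        return ConfigFileStoreFactory::defaultDir;
    }

    private static Path defaultDir() {
        CodeSource src = ConfigFileStoreFactory.class
                .getProtectionDomain().getCodeSource();
        if (src == null) {
            return Paths.get(System.getProperty("user.dir"));
        }
        return Paths.get(URI.create(src.getLocation().toString()));
    }
}
```

Because the test jar registers its own service entry, tests transparently get the test store while production code falls through to the default.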

Android - use ant to create build configurations that change configuration values

What I want is a way to have settings that are dependent on build configuration. To give a specific example, my android application connects to a web service. In development, I want the service url to be pulled in from a configurable value. In Test, I want a different value pulled in. In production, yet another value.
So, in code I have something like this:
public class HttpRequestHelper
{
    private static String GetServiceUrl(ServiceAction action)
    {
        return serviceUrl + action.toString();
    }
}
By default (when debugging/running through eclipse), I want that url to be http://localhost:1234
In Test I want https://test.mydomain.com
In Production I want https://mydomain.com
I am new to Eclipse and Ant, and it has been a long time since I used Java. How do I go about setting this up? What should the build.xml look like? I understand that when I want to build the test/prod versions I will need to use the command line; that's okay. But I don't know how to get this serviceUrl auto-set depending on the build. I'm not even sure of the best place to put this information (a resource, a properties file?). I really want to avoid setting it, building, setting it, building, etc.
As the answers above mention, you have to place the URLs in property files like dev.properties, test.properties, prod.properties, etc.
Now the only thing you need to do is make your build intelligent enough to choose a property file depending on the environment.
That can be done by passing a parameter to ANT, something like:
$ ant -file MyBuild.xml -DcurrentEnv=dev (For Development environment)
$ ant -file MyBuild.xml -DcurrentEnv=test (For Test)
$ ant -file MyBuild.xml -DcurrentEnv=prod (For Production)
Inside your build script, this is how you can include your property file:
<target name="jarMe">
    <jar destfile="sample.jar" basedir="src" includes="${currentEnv}.properties"/>
</target>
With this in place, whatever name you supply at the time of build, property file with that name will be picked up.
You could have the following property in your build.properties file:
service.url=*
And you could have http://localhost:1234 or https://test.mydomain.com in local.properties for your development and integration testing, and it could be set to https://mydomain.com in default.properties.
By doing this, you will get a different value for service.url in each build environment. You could use that value to generate a config file and parse it in your code, or set it as an env variable, or just put it into a resource file and Android will read it for you:
<?xml version="1.0" encoding="utf-8"?>
<resources>
    <string name="service-url">##toben_to_be_replaced_during_build_time##</string>
</resources>
I would start by placing the URLs into a properties file that you can then place onto the classpath. Make a test and a production properties file. Then, depending on the build, place the correct file onto the classpath and pull the properties at runtime.
Found a tutorial which goes through all the details of using ant to automate a build system, to create and use build configurations, as well as to build the release project with one command. Here it is: http://www.androidengineer.com/2010/06/using-ant-to-automate-building-android.html
Seems a little long, but it goes through all the steps and details involved.
