Configuration depending on launch mode - java

Play can be launched in dev mode (via run), in production mode (via start) or in test mode. Is there a way to provide a different config file (conf/application.conf) depending on which mode it is launched in?

I usually have a base configuration (application.conf) and three extra configs, one per environment. In Play Framework 2.4 it can be done by extending GuiceApplicationLoader and merging the base conf with your environment-specific conf. You can go one step further and provide different Guice modules per environment.
Scala version:
class CustomApplicationLoader extends GuiceApplicationLoader {
  override protected def builder(context: Context): GuiceApplicationBuilder = {
    val builder = initialBuilder.in(context.environment).overrides(overrides(context): _*)
    context.environment.mode match {
      case Prod =>
        // start mode
        val prodConf = Configuration(ConfigFactory.load("prod.conf"))
        builder.loadConfig(prodConf ++ context.initialConfiguration).bindings(new ProdModule())
      case Dev =>
        // run mode
        val devConf = Configuration(ConfigFactory.load("dev.conf"))
        builder.loadConfig(devConf ++ context.initialConfiguration).bindings(new DevModule())
      case Test =>
        // test mode
        val testConf = Configuration(ConfigFactory.load("test.conf"))
        builder.loadConfig(testConf ++ context.initialConfiguration).bindings(new TestModule())
    }
  }
}
Java version:
public class CustomApplicationLoader extends GuiceApplicationLoader {
    @Override
    public GuiceApplicationBuilder builder(ApplicationLoader.Context context) {
        final Environment environment = context.environment();
        GuiceApplicationBuilder builder = initialBuilder.in(environment);
        Configuration config = context.initialConfiguration();
        if (environment.isTest()) {
            config = merge("test.conf", config);
            builder = builder.bindings(new TestModule());
        } else if (environment.isDev()) {
            config = merge("dev.conf", config);
            builder = builder.bindings(new DevModule());
        } else if (environment.isProd()) {
            config = merge("prod.conf", config);
            builder = builder.bindings(new ProdModule());
        } else {
            throw new IllegalStateException("No such mode.");
        }
        return builder.in(environment).loadConfig(config);
    }

    private Configuration merge(String configName, Configuration currentConfig) {
        return new Configuration(currentConfig.getWrappedConfiguration().$plus$plus(new play.api.Configuration(ConfigFactory.load(configName))));
    }
}
Do not forget to include play.application.loader = "modules.CustomApplicationLoader" in your application.conf.
In lower versions of Play something similar can be achieved by using the GlobalSettings class and overriding onLoadConfig. Note that GlobalSettings is deprecated in Play 2.4.
If you don't like including test.conf and the test mocks from TestModule in your production build, you can filter the files with sbt.
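For reference, here is a minimal sketch of what one of these per-environment Guice modules could look like, assuming the modules passed to bindings(...) are plain Guice modules; the MailService and MockMailService types are hypothetical, purely for illustration:
import com.google.inject.AbstractModule;

// Hypothetical service used only to illustrate a per-environment binding.
interface MailService {
    void send(String to, String body);
}

class MockMailService implements MailService {
    @Override
    public void send(String to, String body) {
        // no-op in dev/test
    }
}

public class DevModule extends AbstractModule {
    @Override
    protected void configure() {
        // Dev mode gets the mock; ProdModule would bind a real implementation instead.
        bind(MailService.class).to(MockMailService.class);
    }
}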

You can set a different config file using one of the three ways Play gives you:
1 - Using -Dconfig.resource
It will search for an alternative configuration file in the application classpath (you usually provide these alternative configuration files in your application's conf/ directory before packaging). Play will look in conf/, so you don't have to prefix the name with conf/.
$ /path/to/bin/ -Dconfig.resource=prod.conf
2 - Using -Dconfig.file
You can also specify another local configuration file not packaged into the application artifacts:
$ /path/to/bin/ -Dconfig.file=/opt/conf/prod.conf
3 - Using -Dconfig.url
You can also specify a configuration file to be loaded from any URL:
$ /path/to/bin/ -Dconfig.url=http://conf.mycompany.com/conf/prod.conf
Check out more at:
https://www.playframework.com/documentation/2.3.x/ProductionConfiguration
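All three flags are read by the underlying Typesafe Config library, so you can reproduce the selection in plain Java to verify which file gets picked up. A rough sketch, assuming a prod.conf is on the classpath:
import com.typesafe.config.Config;
import com.typesafe.config.ConfigFactory;

public class ConfigResourceCheck {
    public static void main(String[] args) {
        // Equivalent to passing -Dconfig.resource=prod.conf on the command line;
        // ConfigFactory.load() honours config.resource, config.file and config.url.
        System.setProperty("config.resource", "prod.conf");
        ConfigFactory.invalidateCaches(); // make sure the property is picked up
        Config config = ConfigFactory.load();
        // Print where the configuration was actually loaded from.
        System.out.println(config.origin().description());
    }
}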

This can be done by loading config files based on the environment, which can be supplied via -Dmode=staging/dev/prod. For loading the files I override onLoadConfig of GlobalSettings in Global.java.
Java snippet:
@Override
public Configuration onLoadConfig(Configuration config, File file, ClassLoader classLoader) {
    Configuration updatedConfig = config;
    String mode = config.getString("mode");
    if (StringUtils.isNotEmpty(mode)) {
        try {
            File modeFolder = FileUtils.getFile(file, "conf/" + mode);
            if (modeFolder.exists()) {
                play.api.Configuration modeConfig = config.getWrappedConfiguration();
                IOFileFilter fileFilter = new WildcardFileFilter("*.conf");
                Collection<File> fileList = FileUtils.listFiles(modeFolder, fileFilter, null);
                for (File confFile : fileList) {
                    modeConfig = modeConfig
                            .$plus$plus(new play.api.Configuration(ConfigFactory.parseFile(confFile)));
                }
                updatedConfig = new Configuration(modeConfig);
            }
        } catch (Exception e) {
            Logger.error("Exception while loading configuration for mode : " + mode, e);
        }
    } else {
        Logger.error("Please provide the mode in which the play application has to start (e.g. play -Dmode=<mode>)");
    }
    return updatedConfig;
}
For each mode, create a folder (named after the environment) under conf/ and keep the environment-specific config files in that folder.

Related

Constant path replacement for different environments using gradle

This is an Apache Storm-based project. I have a Constants file which looks something like this:
public class Constant {
    public static final String CONTEXT_PATH = "<some path to a context.xml file>";
    public static final String APP_PROPERTIES_PATH = "<path to the properties file>";
    // ...more static properties
}
This CONTEXT_PATH variable is different for different environments (dev, test, prod).
I have a Gradle task which generates the JAR file for deployment:
task stormJar(type: Jar) {
    baseName = 'diagnostic'
    from { configurations.compile.collect { it.isDirectory() ? it : zipTree(it) } }
    with jar
}
I was looking to dynamically change or reference the CONTEXT_PATH variable so that we can create builds for different environments without making any changes to this file.
I believe there are solutions for this in the Android territory (BuildConfig), but I am not able to find one for a plain Java project.
The solution should work from the IDE (IntelliJ and Eclipse) as well as create environment-specific builds.
Doing something like the following should get the work done:
gradlew build -PEnvironment=prod
I don't have much experience with Gradle. Please point me in the right direction.
Note there is a working example here.
One technique is to generate Constants.java as part of the build. Consider this template (stored as a resource, not as Java code):
package __PACKAGE;

public class Constants {
    public static final String CONTEXT_PATH = "__CONTEXT_PATH";
    public static final String APP_PROPERTIES_PATH = "__APP_PROPERTIES_PATH";
}
and the following generates Constants.java early in the compileJava task:
compileJava.doFirst {
    def newConstantsFile = new File("${projectDir}/src/main/java/net/codetojoy/util/Constants.java")
    def templateConstantsFile = new File("${projectDir}/resources/TemplateConstants.java")
    newConstantsFile.withWriter { def writer ->
        templateConstantsFile.eachLine { def line ->
            def newLine = line.replace("__PACKAGE", "net.codetojoy.util")
                              .replace("__CONTEXT_PATH", getContextPath())
                              .replace("__APP_PROPERTIES_PATH", getAppPropertiesPath())
            writer.write(newLine + "\n");
        }
    }
}
and then the crucial env-specific predicates:
def getContextPath = { ->
    def result = "default"
    if (project.Environment == "prod") {
        result = "PROD context path here"
    } else if (project.Environment == "uat") {
        result = "UAT context path here"
    } else if (project.Environment == "dev") {
        result = "DEV context path here"
    }
    return result
}

def getAppPropertiesPath = { ->
    def result = "default"
    if (project.Environment == "prod") {
        result = "PROD app properties path here"
    } else if (project.Environment == "uat") {
        result = "UAT app properties path here"
    } else if (project.Environment == "dev") {
        result = "DEV app properties path here"
    }
    return result
}
Note that the ENV-specific values could easily be abstracted into config files, ENV vars, etc.
Also note that the example addresses the Java package used, the project version, and the build timestamp as well. I write something like this for most projects.
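For illustration, after a build with -PEnvironment=prod the generated Constants.java would look roughly like this, with the placeholder values from the closures above substituted in:
package net.codetojoy.util;

// Generated at build time from resources/TemplateConstants.java; do not edit by hand.
public class Constants {
    public static final String CONTEXT_PATH = "PROD context path here";
    public static final String APP_PROPERTIES_PATH = "PROD app properties path here";
}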

How to print the full path to script files in the @Sql annotation in a Spring Boot test

In a multi-module project I want to be sure that Spring's @Sql annotation uses the correct resources. Is there a way to log the full path of those files to the console somehow?
Spring does log the script file name before execution, but in tests for different modules those file names are sometimes the same.
SqlScriptsTestExecutionListener is responsible for processing @Sql. As a first step you can switch the related logging to debug by adding the property logging.level.org.springframework.test.context.jdbc=debug, but the debug message is not complete. If that is not enough, you can create your own TestExecutionListener and declare it on the test class with @TestExecutionListeners(listeners = SqlScriptsCustomTestExecutionListener.class).
For example:
public class SqlScriptsCustomTestExecutionListener extends AbstractTestExecutionListener {
    @Override
    public void beforeTestMethod(TestContext testContext) {
        List<Resource> scriptResources = new ArrayList<>();
        Set<Sql> sqlAnnotations = AnnotatedElementUtils.getMergedRepeatableAnnotations(testContext.getTestMethod(), Sql.class);
        for (Sql sqlAnnotation : sqlAnnotations) {
            String[] scripts = sqlAnnotation.scripts();
            scripts = TestContextResourceUtils.convertToClasspathResourcePaths(testContext.getTestClass(), scripts);
            scriptResources.addAll(TestContextResourceUtils.convertToResourceList(testContext.getApplicationContext(), scripts));
        }
        if (!scriptResources.isEmpty()) {
            String debugString = scriptResources.stream().map(r -> {
                try {
                    return r.getFile().getAbsolutePath();
                } catch (IOException e) {
                    System.out.println("Unable to find file resource");
                }
                return null;
            }).collect(Collectors.joining(","));
            System.out.println(String.format("Execute sql script :[%s]", debugString));
        }
    }
}
It is just a quick example, and it works. Most of the source code I copied from SqlScriptsTestExecutionListener, just for explanation. It only handles the @Sql annotation at the method level, not the class level.
I hope it helps you.
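For completeness, a minimal sketch of how the listener could be attached to a test class (the test class name and the SQL script path are hypothetical):
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.TestExecutionListeners;
import org.springframework.test.context.jdbc.Sql;
import org.springframework.test.context.junit4.SpringRunner;

import static org.springframework.test.context.TestExecutionListeners.MergeMode.MERGE_WITH_DEFAULTS;

@RunWith(SpringRunner.class)
@SpringBootTest
// Merge with the default listeners so the standard SqlScriptsTestExecutionListener still runs.
@TestExecutionListeners(listeners = SqlScriptsCustomTestExecutionListener.class,
        mergeMode = MERGE_WITH_DEFAULTS)
public class UserRepositoryIntegrationTest {

    @Test
    @Sql(scripts = "/sql/insert-users.sql") // hypothetical script; its absolute path gets printed
    public void loadsSeedData() {
        // assertions against the seeded data would go here
    }
}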

jOOQ Gradle plugin does not update generated files

For some reason I have to manually delete the generated folder and run the Gradle task to get updated POJOs. Is this my setup, expected behavior, or a bug? My setup is as follows:
jooq {
    library(sourceSets.main) {
        jdbc {
            driver = 'com.mysql.jdbc.Driver'
            url = 'jdbc:mysql://localhost:3306/library'
            user = 'library'
            password = '123'
            schema = 'library'
        }
        generator {
            name = 'org.jooq.util.DefaultGenerator'
            strategy {
                name = 'org.jooq.util.DefaultGeneratorStrategy'
            }
            database {
                name = 'org.jooq.util.mysql.MySQLDatabase'
                inputSchema = 'library'
            }
            generate {
                daos = true
            }
            target {
                packageName = 'com.example.library.db'
                directory = 'src/main/java'
            }
        }
    }
}
Currently the generated files are added under the src/main/java folder. This is not a good idea, since it mixes handwritten sources with generated files. It's much better to use a separate folder, src/main/generated, and modify build.gradle in the following way:
def generatedDir = 'src/main/generated'

sourceSets {
    main {
        java {
            srcDirs += [generatedDir]
        }
    }
}

clean.doLast {
    project.file(generatedDir).deleteDir()
}
and change:
target {
    packageName = 'com.example.library.db'
    directory = generatedDir
}
This way you can easily manage the generated classes. All the classes will be removed automatically when the clean task is run.
You also need to define a dependency between compileJava and the generator task. It can be done in the following way:
compileJava.dependsOn YOUR_GENERATOR_TASK_NAME
jOOQ will not delete the files automatically.

Tomcat 8 new Resource implementation to map Jar files in a separate directory

Previously I extended the WebAppClassLoader and added some jar files from a shared location to the classloader path using the addRepository() method. In Tomcat 8 addRepository() has been removed and a new resource implementation has been introduced. I'm still able to use the addURL() method to add jar files, but I would like to move to the new resource-based implementation.
I've tried with
DirResourceSet dirResourceSet = new DirResourceSet(getContext().getResources(), "/WEB-INF/lib", "/home/thusitha/lib/runtimes/cxf", "/");
WebResourceRoot webResourceRoot = getContext().getResources();
webResourceRoot.getContext().getResources().addPreResources(dirResourceSet);
But this is not working and it still throws a ClassNotFoundException.
Can someone tell me how to map a directory which contains jars to a particular webapp using Tomcat's new resource implementation?
A solution to this problem is to register your resources by overriding the ContextConfig class (org.apache.catalina.startup.ContextConfig). Catalina enters a starting state immediately after it scans your document path for resources. Most of the processing of those resources, such as annotations, is handled by the ContextConfig LifecycleListener. To ensure the resources are added before the context configuration takes place, override the ContextConfig.
final Context currentContext = ctx;
ContextConfig ctxCfg = new ContextConfig() {
    @Override
    public void lifecycleEvent(LifecycleEvent event) {
        if (event.getType().equals(Lifecycle.CONFIGURE_START_EVENT)) {
            WebResourceRoot webResourcesRoot = currentContext.getResources();
            String baseDir = Platform.getBaseDir(); // Server base directory
            File libDir = new File(baseDir + File.separator + "lib");
            DirResourceSet dirResourceSet = null;
            try {
                dirResourceSet = new DirResourceSet(webResourcesRoot, "/WEB-INF/lib", libDir.getCanonicalPath(), "/");
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
            webResourcesRoot.addPostResources(dirResourceSet);
            String[] possibleJars = dirResourceSet.list("/WEB-INF/lib");
            for (String libfile : possibleJars) {
                WebResource possibleJar = dirResourceSet.getResource("/WEB-INF/lib/" + libfile);
                System.err.println(String.format("Loading possible jar %s", possibleJar.getCanonicalPath())); // Just checking...
                if (possibleJar.isFile() && possibleJar.getName().endsWith(".jar")) {
                    WebResourceSet resourceSet = new JarResourceSet(webResourcesRoot, "/WEB-INF/classes", possibleJar.getCanonicalPath(), "/");
                    webResourcesRoot.addPostResources(resourceSet);
                }
            }
        }
        super.lifecycleEvent(event);
    }
};
ctx.addLifecycleListener(ctxCfg);
This is an undocumented solution that works on Tomcat 8.0.23. Considering the complexity and difficulty of this I can't say it is a better solution than adding jars directly to ClassLoaders.
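The snippet above assumes you already have the Context (ctx) in hand; in an embedded Tomcat setup it could be obtained roughly like this (the port and webapp path are hypothetical):
import java.io.File;

import org.apache.catalina.Context;
import org.apache.catalina.startup.Tomcat;

public class EmbeddedTomcatLauncher {
    public static void main(String[] args) throws Exception {
        Tomcat tomcat = new Tomcat();
        tomcat.setPort(8080);
        // Hypothetical webapp location; adjust to your layout.
        Context ctx = tomcat.addWebapp("/app", new File("src/main/webapp").getAbsolutePath());
        // ... register the overridden ContextConfig from the snippet above on ctx here ...
        tomcat.start();
        tomcat.getServer().await();
    }
}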

Strange ClassNotFoundException with ClusterActorRefProvider

I am seeing strange behaviour in my little Akka cluster project:
I have a very simple application.conf:
akka {
  # specify logger
  loggers = ["akka.event.slf4j.Slf4jLogger"]
  loglevel = "DEBUG"
  stdout-loglevel = "DEBUG"

  # configure remote connection
  actor {
    provider = "akka.cluster.ClusterActorRefProvider"
  }
  remote {
    enabled-transports = ["akka.remote.netty.tcp"]
    netty.tcp {
      hostname = "127.0.0.1"
      port = 3100
    }
  }
  cluster {
    seed-nodes = [
      "akka.tcp://mycluster@127.0.0.1:3100"
    ]
  }
}
And a very simple main program:
public class MediatorNodeStartup {
    public static void main(String[] args) {
        String port = System.getProperty("config.port") == null ? "3100" : System.getProperty("config.port");
        Config config = ConfigFactory.parseString("akka.remote.netty.tcp.port=" + port)
                .withFallback(ConfigFactory.load());
        ActorSystem system = ActorSystem.create("mycluster", config);
    }
}
Akka, Akka-Remote and Akka-Cluster are all included via Maven and visible on the classpath.
Now when I execute this, it just fails with a ClassNotFoundException: akka.cluster.ClusterActorRefProvider,
although the akka.cluster.* package is definitely on my classpath.
Strangely enough, on another machine this code just works.
So I suppose it has something to do with my Eclipse or runtime configuration... but sadly I have no idea where to start searching for the error.
Any ideas? I will provide further information if necessary.
