TL;DR: How do you run jar files as deltas using Spring Flyway?
Some background on the problem I encountered.
When running a Flyway Java delta, for example the following:
public class V3__0001 implements SpringJdbcMigration {

    @Override
    public void migrate(JdbcTemplate jdbcTemplate) throws Exception {
        List<NewCustomerEntity> newCustomerEntityList = new ArrayList<>();
        for (CustomerEntity customer : findAll(jdbcTemplate)) {
            NewCustomerEntity c = new NewCustomerEntity();
            c.setDate(SomeUtilityTool.transformDate(customer.getDate()));
            newCustomerEntityList.add(c);
        }
        insertAll(jdbcTemplate, newCustomerEntityList);
    }

    private void insertAll(JdbcTemplate jdbcTemplate, List<NewCustomerEntity> newEntities) {
        // INSERT STATEMENT
    }

    private List<CustomerEntity> findAll(JdbcTemplate jdbcTemplate) {
        String sql = "SELECT * FROM CUSTOMER";
        return jdbcTemplate.query(sql, new BeanPropertyRowMapper<>(CustomerEntity.class));
    }
}
It takes a list of customers and transforms them using a utility tool available in the Java environment, for example a Maven or Gradle dependency.
You build your App-1.1 and run it.
Later, when you write the next database migration in Java, you want to upgrade SomeUtilityTool from 1.X to 2.X; let's say this changes the output of the transformDate function in the example.
Now someone who upgrades from App-1.0 straight to App-1.2 does not run the same delta as someone who upgraded from App-1.0 to App-1.1.
My conclusion is that each delta's dependencies should probably live in their own module that produces a jar artifact. By encapsulating all its dependencies, this jar ensures that a later change in a dependency will not affect how a previous delta was run.
The question: is it possible to get Spring Flyway to pick up and run the jar file?
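One approach worth sketching: Flyway discovers compiled Java migrations on the application classpath under its configured locations, so each delta could be built as its own shaded jar (keeping its SomeUtilityTool 1.x copy bundled and relocated inside) and added as a runtime dependency. The coordinates below are hypothetical; the Gradle Shadow plugin is one way to produce such jars:

```groovy
// build.gradle of the application -- coordinates are made up for illustration
dependencies {
    // each delta jar is shaded, so it carries its own copy of SomeUtilityTool
    runtimeOnly 'com.example.migrations:delta-v3-all:1.0.0'   // built against SomeUtilityTool 1.x
    runtimeOnly 'com.example.migrations:delta-v4-all:1.0.0'   // built against SomeUtilityTool 2.x
}
```

With spring.flyway.locations pointing at the package the migration classes live in (e.g. classpath:db/migration), Flyway should treat these like any other Java migration; how well this isolates the deltas depends on relocating the shaded dependency's packages so the 1.x and 2.x copies cannot clash.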
I'm writing a custom Gradle plugin in which I want to have a bunch of tasks common to several of my projects, and a sort of 'main' task to control which of these tasks to turn on.
Regular tasks in the plugin are e.g.:
CopyDockerResourcesTask
CopyContainerFilesTask
PerformAnalysisTask
and the 'main' task is:
BaseProjectTask
so then in the project in build.gradle I'd like to be able to do this:
BaseProjectTask {
    copyDockerResources = true
    copyContainerFiles = true
    performAnalysis = true
}
I want the plugin's default behaviour to be to do nothing, and only add certain tasks if they are turned on in BaseProjectTask.
I wanted to achieve this by adding a task dependency in the @TaskAction method of BaseProjectTask:
class BaseProjectTask extends DefaultTask {

    private final BaseProjectExtension extension
    private final Project project

    @Optional
    @Input
    Boolean copyContainerFiles = false

    ...

    @Inject
    BaseProjectTask(Project project, BaseProjectExtension extension) {
        this.project = project
        this.extension = extension
    }

    @TaskAction
    def execute() {
        if (copyContainerFiles) {
            project.tasks.assemble.dependsOn(project.tasks.copyContainerFiles)
        }
        ...
    }
}
Creating the task dependency with this line:
project.tasks.assemble.dependsOn(project.tasks.copyContainerFiles)
doesn't work.
Edit:
My current findings are that defining a task dependency in @TaskAction is too late, as this is the execution phase. I could do it in the constructor (this way it works), but that's too early, as the copyContainerFiles property isn't set yet.
Does anyone know a way of adding code in the task class that would run in the configuration phase? I think this is what I'm missing.
You need to configure task dependencies during the build configuration phase, as you surmised.
It's not possible to do it in the @TaskAction method. It's fundamental to the way Gradle works that it needs to know how tasks depend on each other before it starts executing the build. That allows Gradle to do some useful things, such as only executing the tasks that are not up to date, or working out what tasks will execute without actually executing them.
In general, tasks should not be aware of one another1.
When you are trying to do this in a plugin using values in a project extension, you must wait until after the project has evaluated so that the build script code executes first. You can do this with project.afterEvaluate()2.
So you can do the following (using Kotlin DSL3):
project.afterEvaluate {
    tasks.register("baseTask") {
        if (baseProjectExtension.copyDockerResources)
            dependsOn(tasks.getByName("copyDockerResources"))
        if (baseProjectExtension.copyContainerFiles)
            dependsOn(tasks.getByName("copyContainerFiles"))
        if (baseProjectExtension.performAnalysis)
            dependsOn(tasks.getByName("performAnalysis"))
    }
}
1See How to declare dependencies of a Gradle custom task?
2See https://docs.gradle.org/current/userguide/build_lifecycle.html#sec:project_evaluation
3What I am familiar with. Hopefully not too much trouble to convert to Groovy.
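A Groovy DSL version of the same idea, closer to what the asker's plugin uses (a sketch, assuming the extension object and the three tasks are already registered under these names; dependsOn accepts task names as strings, which Gradle resolves lazily):

```groovy
project.afterEvaluate {
    project.tasks.register('baseTask') {
        if (extension.copyDockerResources) {
            dependsOn 'copyDockerResources'
        }
        if (extension.copyContainerFiles) {
            dependsOn 'copyContainerFiles'
        }
        if (extension.performAnalysis) {
            dependsOn 'performAnalysis'
        }
    }
}
```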
I use Elasticsearch 7.11.2 with the following dependencies to write unit tests:
implementation 'org.codelibs.elasticsearch.module:lang-painless:7.10.2'
implementation 'org.codelibs.elasticsearch.module:analysis-common:7.10.2'
Spinning up a node looks like this:
public EmbeddedElastic() throws NodeValidationException, IOException {
    tempDir = Files.createTempDirectory("elastic_search_temp").toAbsolutePath().toString();
    int port = getAvailableBasePort();
    Map<String, String> settings = new HashMap<>();
    settings.put("path.home", tempDir);
    settings.put("http.port", Integer.toString(port));
    settings.put("transport.tcp.port", Integer.toString(port + 1));
    settings.put("transport.type", "netty4");
    settings.put("http.cors.enabled", Boolean.FALSE.toString());
    new PluginNode(settings).start();
    client = new RestHighLevelClient(RestClient.builder(new HttpHost("localhost", port, "http")));
    System.out.println("Client: " + client.getClass().getName());
    Runtime.getRuntime().addShutdownHook(new Thread() {
        @Override
        public void run() {
            // file deletion
        }
    });
}
private static class PluginNode extends Node {
    public PluginNode(Map<String, String> preparedSettings) {
        super(InternalSettingsPreparer.prepareEnvironment(Settings.EMPTY, preparedSettings,
                        null, () -> "node-test"),
                Lists.newArrayList(Netty4Plugin.class, ParentJoinPlugin.class, CommonAnalysisPlugin.class,
                        PainlessPlugin.class),
                false);
        System.out.println("Started local elastic with PainlessPlugin loaded.");
    }
}
Now I am upgrading to ES 7.16.2 and the same code doesn't work. I presume it's because of the lang-painless and analysis-common libraries; their latest version is only 7.10.
The exception I receive after upgrading the Elasticsearch client to 7.16 and spring-data-elasticsearch to 4.3.0:
Failed to instantiate [org.springframework.data.elasticsearch.core.ElasticsearchOperations]: Factory method 'elasticsearchTemplate' threw exception; nested exception is java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: org/elasticsearch/common/xcontent/ToXContentObject
Before I dive deep into fixing it, I am wondering if anyone has actually made unit tests work with the 7.16 ES version? I have seen that Elasticsearch Testcontainers is the recommended way to go.
EDIT:
Thanks to Augusto's tip on checking classes in Maven Central, I found out that lang-painless depends on certain libraries from the Elasticsearch client which are not available in the higher versions of the ES client. So it's not possible to write unit tests with the same libs.
Elasticsearch moved this class in 7.16 from the org.elasticsearch.common.xcontent package to the org.elasticsearch.xcontent package, a breaking change between 7.15 and 7.16.
Spring Data Elasticsearch 4.3 is built against 7.15.x and so won't work with Elasticsearch 7.16 libraries.
You either need to use Elasticsearch 7.15 or wait for Spring Data Elasticsearch 4.4, which is built against 7.16, to be released (the first milestone for that should be published next week).
And yes, you had better use Testcontainers.
You are probably missing the dependency that contains this class. It could be that it previously came transitively with the ES client.
Try adding org.elasticsearch:elasticsearch-x-content:7.15.2 to your build tool. A word of caution, you might be missing other dependencies too.
Just as a tip, if you ever get a NoClassDefFoundError, one of the easiest ways to find where that class is defined is to search in maven central. Search for fc:<fully_qualified_class_name>, and it will show all the jars where the class is present.
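Alongside the Maven Central fc: search, you can ask the running JVM where it actually loaded a class from, which helps confirm which jar won on the classpath. A small self-contained sketch:

```java
import java.security.CodeSource;

public class WhereIs {
    // Returns the classpath location a class was loaded from, or a note when
    // the bootstrap loader provided it (JDK classes have no CodeSource).
    static String locate(Class<?> cls) {
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        return src == null ? "bootstrap loader (JDK class)" : src.getLocation().toString();
    }

    public static void main(String[] args) {
        System.out.println(locate(String.class));  // a JDK class: no jar location
        System.out.println(locate(WhereIs.class)); // the directory or jar this class came from
    }
}
```

Running this with the failing application's classpath and the suspect class (here, the ToXContentObject from the exception) shows immediately whether the class is present at all and, if so, from which artifact it came.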
I have a multi-module project which is using
Java 1.8
JUnit 4.12
Gradle
When compiling a single module, its unit tests fail on a null assertion with Gradle 5.1, but the same tests pass on Gradle 1.12 and the module compiles successfully. Nothing was changed except what is deprecated in 5.1. I can't understand why the same framework fails on a recent Gradle version.
One test fails on JUnit Assert.assertNotNull(), which checks whether a string is null.
A second test fails on JUnit Assert.assertTrue().
build.gradle is the same in both except for configuration name changes, and I have confirmed all dependencies are downloaded and compiling.
I can't share the build script, but if you don't understand something I'll try to make a pseudo script.
I thought assertion errors were more related to the language version than the tooling?
public class Test {

    private String property;

    @Before
    public void setUp() {
        property = Singleton.getInstance().getProperty();
    }

    // test failure 1
    @Test
    public void shouldAbleToGetProperty() {
        assertNotNull(property);
    }

    // test failure 2
    @Test
    public void shouldReturnTrueIfPropertyIsTrue() {
        Assert.assertTrue(Singleton.getInstance().isTrue());
    }
}
Singleton class is a normal singleton which reads property files in resources folder.
NOT ACTUAL CODE
class Singleton {

    private Map<String, Properties> properties;

    public static Singleton getInstance() {
        // return singleton as its meant to be ...
        // read property file from project and hold it in map.
    }
}
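One thing worth ruling out: if the Singleton resolves its property files via a path relative to the working directory, behaviour can differ between build tools and versions that launch the test JVM differently, which would make assertNotNull fail without any code change. A self-contained sketch of a classloader-based lookup that avoids that dependence (the file and property names are made up; the temp directory stands in for src/test/resources):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

public class ResourceDemo {
    // Load app.properties through a classloader rooted at 'dir' instead of
    // a working-directory-relative file path.
    static String readGreeting(Path dir) throws IOException {
        try (URLClassLoader cl = new URLClassLoader(new URL[]{dir.toUri().toURL()})) {
            Properties p = new Properties();
            try (InputStream in = cl.getResourceAsStream("app.properties")) {
                if (in == null) throw new IOException("app.properties not visible to the classloader");
                p.load(in);
            }
            return p.getProperty("greeting");
        }
    }

    public static void main(String[] args) throws IOException {
        // simulate a resources directory containing one properties file
        Path dir = Files.createTempDirectory("res");
        Files.write(dir.resolve("app.properties"), "greeting=hello".getBytes());
        System.out.println(readGreeting(dir)); // hello, regardless of user.dir
    }
}
```

In the real test, Singleton.class.getResourceAsStream("/app.properties") on the test classpath plays the role of the URLClassLoader here.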
Say we are running mvn test.
I am wondering if there is a way to configure Maven to run some code before executing tests. In my case, I want to configure a library, but I don't want to have to configure it for every entrypoint in my app/tests. I am just looking to configure the lib for every Maven lifecycle hook that invokes a runtime.
Something like this:
@MavenRuntimeLifecycle
public class Whatever {

    public void runtimeBegin() {
        // right when the java process starts up
        Mylib.configure("foo");
    }

    public void runtimeEnd() {
        // right before the process shuts down
    }
}
I assume this would be a Maven specific thing - not that it has to be in the same Java process as my server or tests etc.
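There is no Maven-level annotation like the hypothetical @MavenRuntimeLifecycle above, but if the tests run under Surefire with JUnit 4, a run listener gets close: Surefire can register a JUnit RunListener for the test JVM, and its testRunStarted/testRunFinished callbacks map onto the runtimeBegin/runtimeEnd idea. A sketch (the listener class name is made up; it would extend org.junit.runner.notification.RunListener):

```xml
<!-- pom.xml: register a JUnit RunListener with maven-surefire-plugin -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <properties>
      <property>
        <name>listener</name>
        <value>com.example.MyLibLifecycleListener</value>
      </property>
    </properties>
  </configuration>
</plugin>
```

This only covers the test phase, not every lifecycle hook, but it keeps the MyLib.configure("foo") call in one place instead of in every test entrypoint.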
Note that using Node.js, I would simply do it like so:
export class MyLib {

    static isConfigLoaded = false;

    static loadConfig(config) {
        // ...
    }

    static run() {
        if (!this.isConfigLoaded) {
            MyLib.loadConfig(require('../some/path/to/.mylib.config.js'));
            this.isConfigLoaded = true;
        }
        this.doTheThing();
    }
}
I could do the same thing with Java or Maven project, and just store a .java file in the resources directory. It's more manual, but it could be done.
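The Node sketch above translates fairly directly to Java with a guarded static initializer, with no Maven involvement needed as long as every entrypoint goes through run(). A sketch with made-up names:

```java
// Java counterpart of the lazy Node-style config above (names are made up)
public class MyLib {
    private static volatile boolean configLoaded = false;

    static synchronized void loadConfigIfNeeded() {
        if (!configLoaded) {
            // a real version would read the config file from the classpath here
            configLoaded = true;
        }
    }

    public static String run() {
        loadConfigIfNeeded(); // configuration happens on first use, whichever entrypoint that is
        return "did the thing";
    }
}
```

The volatile flag plus the synchronized loader keeps the one-time initialization safe even when tests run in parallel JVM threads.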
We have a multi-module Maven project and intend to run tests on it. Because our tests are very homogeneous, instead of writing the same test over and over we wrote a parameterised test, which fetches all the files to be tested and runs its tests against them. Now we want this as a Maven plugin, so you could just run something like mvn xquerytestrunner:test.
I created a separate project, created a Java file in there and annotated it with:
@Mojo(name = "xquerytester")
public class XQueryTestRunner extends AbstractMojo {

    @Parameter(property = "xquerytester.querytotest", defaultValue = "")
    private static String queryToTest;

    public void execute() throws MojoExecutionException, MojoFailureException {
        JUnitCore junit = new JUnitCore();
        Result result = junit.run(ParameterizedGenericXQueryTest.class);
    }
}
Now my question. Will this run? And does it make sense?
My other option was to just have the test in the src/test/java folder of the main module and run it with mvn -Dtest=TestCircle test, but the problem is that we use a plugin by Oracle (oracle-soa-plugin) that messes up everything around the project, and we have to use it.
Our main pom.xml has <packaging>pom</packaging>, which is why running the above test goal doesn't work: it just doesn't build or test anything. If I change it to jar, the plugin throws errors during the build, and I cannot skip the build phase because the plugin just does its stuff anyway.
My goal is just to have a one-liner for the console that runs my parameterized tests. It just seems Oracle didn't read the how-tos on writing a Maven plugin, and now I have to work with it.
Update 1:
We now went with a Maven plugin. This can be run independently of the oracle-soa-plugin, and also on a Maven project that has a "pom" packaging type.
The Mojo Class:
@Mojo(name = "xsl")
public class XslTestRunner extends AbstractMojo {

    @Parameter(property = "xsl.name", defaultValue = "")
    private static String name;

    public void execute() throws MojoExecutionException, MojoFailureException {
        JUnitCore junit = new JUnitCore();
        Result result = junit.run(ParameterizedGenericXslTest.class);
        PrintHelper.printResults(result, TransformationTestType.XQUERY);
    }
}
The Maven Plugin Pom:
In general I followed these instructions: Your First Plugin
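For completeness, the kind of plugin POM that guide produces looks roughly like this (a sketch; the coordinates are made up and the versions will differ, but the essentials are the maven-plugin packaging, the plugin API dependencies, and JUnit on the compile classpath so the mojo can invoke the parameterised test):

```xml
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>xsl-test-runner-maven-plugin</artifactId>
  <version>1.0.0</version>
  <packaging>maven-plugin</packaging>

  <dependencies>
    <dependency>
      <groupId>org.apache.maven</groupId>
      <artifactId>maven-plugin-api</artifactId>
      <version>3.8.1</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.maven.plugin-tools</groupId>
      <artifactId>maven-plugin-annotations</artifactId>
      <version>3.6.4</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.13.2</version>
    </dependency>
  </dependencies>
</project>
```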