How to run Spring Shell scripts in a JUnit test - java

I have a Spring Shell-based application and a couple of scripts. Is there an easy way to run the scripts in a JUnit test such that a test fails, if some exception/error occurs during the execution of the script?
The purpose of the tests is to make sure that all correct scripts run without errors.
Update 1:
Here's a little helper class for running scripts in JUnit:
import org.apache.commons.io.FileUtils;
import org.springframework.shell.Bootstrap;
import org.springframework.shell.core.CommandResult;
import org.springframework.shell.core.JLineShellComponent;

import java.io.File;
import java.io.IOException;
import java.util.List;

import static org.fest.assertions.api.Assertions.*;

public class ScriptRunner {
    public void runScript(final File file) throws IOException {
        final Bootstrap bootstrap = new Bootstrap();
        final JLineShellComponent shell = bootstrap.getJLineShellComponent();

        final List<String> lines = FileUtils.readLines(file);
        for (final String line : lines) {
            execVerify(line, shell);
        }
    }

    private void execVerify(final String command, final JLineShellComponent shell) {
        final CommandResult result = shell.executeCommand(command);
        assertThat(result.isSuccess()).isTrue();
    }
}
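A minimal JUnit test using this helper might then look like the sketch below (the script path is only an example):

import org.junit.Test;

import java.io.File;
import java.io.IOException;

public class ScriptsTest {

    private final ScriptRunner scriptRunner = new ScriptRunner();

    @Test
    public void allScriptsRunWithoutErrors() throws IOException {
        // Example path; point this at the scripts you ship with your application.
        scriptRunner.runScript(new File("src/test/resources/scripts/setup.script"));
    }
}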

You can create an instance of Bootstrap, get the shell out of it, and then call executeCommand() on it (including the script command).
You may be interested in what is done in Spring XD for this: https://github.com/spring-projects/spring-xd/blob/master/spring-xd-shell/src/test/java/org/springframework/xd/shell/AbstractShellIntegrationTest.java (although there are a lot of XD specific details)

Related

Load neo4j dump in test container during integration test run

I am trying to write an integration test for Neo4j using Spring Boot and Testcontainers. Can anyone help me load a database dump file into the test container?
Here's one way to do this (using current Neo4j 5.3). I have a dump file called neo4j.dump created from a local instance, and I use withCopyFileToContainer to copy it to the container before it starts.
As I use the community edition in this example, there is no online backup/restore, only dump/load. Therefore I have to change the startup command: the load needs to happen before Neo4j starts, and I can't stop Neo4j inside the container because that would stop the container.
Therefore I create a small shell script that executes the load command and then delegates to the original entry point.
This script is transferred with file mode 0100555, corresponding to r-xr-xr-x, so that it is executable.
Finally, the container is started with the above script as command.
import org.junit.jupiter.api.AfterAll;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.TestInstance;
import org.neo4j.driver.AuthTokens;
import org.neo4j.driver.Driver;
import org.neo4j.driver.GraphDatabase;
import org.testcontainers.containers.Neo4jContainer;
import org.testcontainers.images.builder.Transferable;
import org.testcontainers.utility.MountableFile;

@TestInstance(TestInstance.Lifecycle.PER_CLASS)
public class LoadDumpTest {

    Neo4jContainer<?> neo4j;
    Driver driver;

    @BeforeAll
    void initNeo4j() {
        neo4j = new Neo4jContainer<>("neo4j:5.3.0")
                .withCopyFileToContainer(MountableFile.forClasspathResource("neo4j.dump"),
                        "/var/lib/neo4j/data/dumps/neo4j.dump")
                .withCopyToContainer(Transferable.of("""
                        #!/bin/bash -eu
                        /var/lib/neo4j/bin/neo4j-admin database load neo4j
                        /startup/docker-entrypoint.sh neo4j
                        """, 0100555), "/startup/load-dump-and-start.sh")
                .withCommand("/startup/load-dump-and-start.sh")
                .withLogConsumer(f -> System.out.print(f.getUtf8String()));
        neo4j.start();

        driver = GraphDatabase.driver(neo4j.getBoltUrl(), AuthTokens.basic("neo4j", neo4j.getAdminPassword()));
    }

    @Test
    void dataShouldHaveBeenLoaded() {
        try (var session = driver.session()) {
            var numNodes = session.run("MATCH (n) RETURN count(n)").single().get(0).asLong();
            Assertions.assertTrue(numNodes > 0);
        }
    }

    @AfterAll
    void stopNeo4j() {
        neo4j.stop();
    }
}
Edit:
If you are on a recent enterprise edition, Christophe suggested the following solution on Twitter, which I personally think is superior, as it is less hacky.
https://github.com/ikwattro/neo4j-5-testcontainers-restore-backup/blob/main/src/test/java/dev/ikwattro/Neo4j5RestoreBackupExampleTest.java
It makes use of seed URIs for databases. Copying or binding the resource in his example works the same as in mine.
If you're using Neo4j 5, I suggest you make backups instead of dumps. Using backups you can take advantage of the seedUri option when creating a database, meaning you can create a database from a URI pointing to a backup on disk (or in the Neo4j container). https://neo4j.com/docs/operations-manual/current/clustering/databases/#cluster-seed-uri
Here is an example using Testcontainers:
package dev.ikwattro;

import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.TestInstance;
import org.neo4j.driver.AuthTokens;
import org.neo4j.driver.Driver;
import org.neo4j.driver.GraphDatabase;
import org.neo4j.driver.Session;
import org.neo4j.driver.SessionConfig;
import org.testcontainers.containers.BindMode;
import org.testcontainers.containers.Neo4jContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;

import static org.assertj.core.api.Assertions.assertThat;

@TestInstance(TestInstance.Lifecycle.PER_CLASS)
@Testcontainers(disabledWithoutDocker = true)
public class Neo4j5RestoreBackupExampleTest {

    @Container
    private Neo4jContainer<?> neo4j = new Neo4jContainer<>("neo4j:5.3.0-enterprise")
            .withAdminPassword("password")
            .withEnv("NEO4J_dbms_memory_heap_max__size", "256M")
            .withEnv("NEO4J_dbms_databases_seed__from__uri__providers", "URLConnectionSeedProvider")
            .withClasspathResourceMapping("backups", "/backups", BindMode.READ_ONLY)
            .withEnv("NEO4J_ACCEPT_LICENSE_AGREEMENT", "yes");

    @BeforeAll
    void beforeAll() {
        neo4j.start();
        createDbFromBackup();
    }

    @Test
    void testCreatingDbFromBackup() {
        try (Driver driver = GraphDatabase.driver(neo4j.getBoltUrl(), AuthTokens.basic("neo4j", "password"))) {
            try (Session session = driver.session(SessionConfig.forDatabase("worldcup22"))) {
                var result = session.run("MATCH (n) RETURN count(n) AS c").single().get("c").asLong();
                assertThat(result).isPositive();
            }
        }
    }

    private void createDbFromBackup() {
        try (Driver driver = GraphDatabase.driver(neo4j.getBoltUrl(), AuthTokens.basic("neo4j", "password"))) {
            try (Session session = driver.session(SessionConfig.forDatabase("system"))) {
                session.run("""
                        CREATE DATABASE worldcup22 OPTIONS { existingData: "use", seedUri: "file:///backups/world-cup-2022-neo4j.backup"}
                        """);
            }
        }
    }
}
You can find a working Maven project here: https://github.com/ikwattro/neo4j-5-testcontainers-restore-backup

Caused by: java.lang.NoClassDefFoundError: software/constructs/Construct

I'm a beginner with Terraform CDK. I've created simple code in Terraform CDK to create an EC2 instance, but instead of running cdktf deploy in the terminal I'm calling it via a Java ProcessBuilder inside my main method.
Everything is good until now: my code compiles successfully and the JAR builds. But when I run the JAR with java -jar I get the error below.
└─[$] java -jar target/irm-1.0-SNAPSHOT.jar [0:24:43]
Error: Unable to initialize main class com.example.test.Main
Caused by: java.lang.NoClassDefFoundError: software/constructs/Construct
Here is my file structure:
Here is the Main.java
package com.example.test;

import com.hashicorp.cdktf.App;
import com.hashicorp.cdktf.TerraformStack;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

public class Main {
    public static void main(String[] args) throws IOException {
        final App app = new App();
        TerraformStack stack = new MainStack(app, "aws_instance");
        // new RemoteBackend(stack, RemoteBackendProps.builder()
        //         .hostname("app.terraform.io")
        //         .organization("<YOUR_ORG>")
        //         .workspaces(new NamedRemoteWorkspace("learn-cdktf"))
        //         .build());
        app.synth();

        // calling cdktf deploy
        List<String> list = new ArrayList<String>();
        list.add("/usr/local/bin/cdktf");
        list.add("deploy");

        // create the process
        ProcessBuilder build = new ProcessBuilder(list);
        // starting the process
        Process process = build.start();

        // for reading the output from stream
        BufferedReader stdInput = new BufferedReader(new InputStreamReader(process.getInputStream()));
        String s = null;
        while ((s = stdInput.readLine()) != null) {
            System.out.println(s);
        }
    }
}
Here is the MainStack.java
package com.example.test;

import software.constructs.Construct;

import com.hashicorp.cdktf.TerraformStack;
import com.hashicorp.cdktf.TerraformOutput;
import com.hashicorp.cdktf.providers.aws.AwsProvider;
import com.hashicorp.cdktf.providers.aws.ec2.Instance;

public class MainStack extends TerraformStack {
    public MainStack(final Construct scope, final String id) {
        super(scope, id);

        AwsProvider.Builder.create(this, "AWS")
                .region("ap-south-1")
                .build();

        Instance instance = Instance.Builder.create(this, "compute")
                .ami("ami-0e18b1d379af4e263")
                .instanceType("t3a.micro")
                .build();

        TerraformOutput.Builder.create(this, "public_ip")
                .value(instance.getPublicIp())
                .build();
    }
}
There are two problems I can spot here:
1. The error you are getting hints towards the constructs package not being installed in this project. I'd recommend using the "normal" workflow of running your cdktf program via cdktf synth or cdktf deploy in the CLI. You can also compile your program (if it's a standard program like we initialize it) by running mvn -e -q compile.
2. It seems like you are trying to execute deploy from within your cdktf program. This won't work; it will create an infinite loop, since this program is run by the synth operation, which itself runs before the deploy executes. If you want to start cdktf programmatically, it has to be from another program that is independent of your CDKTF application, as sketched below.
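For illustration, a separate runner program (hypothetical class name DeployRunner) that shells out to the CLI from outside the CDKTF application could look roughly like this, assuming cdktf is on the PATH and the program is run from the directory containing cdktf.json:

import java.io.IOException;

public class DeployRunner {

    public static void main(String[] args) throws IOException, InterruptedException {
        // Run "cdktf deploy" as an external process, outside the CDKTF app itself.
        ProcessBuilder builder = new ProcessBuilder("cdktf", "deploy", "--auto-approve");
        builder.inheritIO(); // stream cdktf output directly to this process's console

        Process process = builder.start();
        int exitCode = process.waitFor();
        if (exitCode != 0) {
            throw new IllegalStateException("cdktf deploy failed with exit code " + exitCode);
        }
    }
}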

How to run a JMeter JMX file in Java code?

I am trying to run a JMeter MS SQL database test plan from Java code through my Spring Boot app but it's showing the following errors:
I have loaded the plugin manager in JMeter and put that in jmeter/lib/ext folder and installed all required plugins.
Java code to run JMeter test case:
package com.example.demofin;

import java.io.File;

import org.apache.jmeter.engine.StandardJMeterEngine;
import org.apache.jmeter.reporters.ResultCollector;
import org.apache.jmeter.reporters.Summariser;
import org.apache.jmeter.save.SaveService;
import org.apache.jmeter.util.JMeterUtils;
import org.apache.jorphan.collections.HashTree;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.CommandLineRunner;
import org.springframework.stereotype.Component;

import com.example.demofin.config.Properties;

@Component
public class Demofin implements CommandLineRunner {

    @Autowired
    private Properties properties;

    private Summariser result;

    @Override
    public void run(String... args) throws Exception {
        for (String i : args)
            System.out.println(i);

        // String j = properties.getPropertyByKey("JMETER_HOME");
        String j = "D:\\apache-jmeter-4.0";
        // System.out.println(j);
        // System.out.println("Jmeter home path: " + properties.getPropertyByKey("JMETER_HOME"));

        StandardJMeterEngine jmeter = new StandardJMeterEngine();

        // Initialize Properties, logging, locale, etc.
        JMeterUtils.setJMeterHome(j);
        JMeterUtils.loadJMeterProperties(j + "/bin/jmeter.properties");

        // you can comment this line out to see extra log messages of i.e. DEBUG level
        JMeterUtils.initLogging();
        JMeterUtils.initLocale();

        // Initialize JMeter SaveService
        SaveService.loadProperties();

        // Load existing .jmx Test Plan
        HashTree testPlanTree = SaveService.loadTree(new File("D:\\apache-jmeter-4.0\\" + "bin\\JDBC Connection Configuration.jmx"));

        Summariser summer = null;
        String summariserName = JMeterUtils.getPropDefault("summariser.name", "summary");
        if (summariserName.length() > 0) {
            summer = new Summariser(summariserName);
        }

        ResultCollector logger = new ResultCollector(summer);
        testPlanTree.add(testPlanTree.getArray()[0], logger);

        // Run JMeter Test
        jmeter.configure(testPlanTree);
        jmeter.run();
        result = summer;
    }
}
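If you also want the run written to a .jtl results file (the code above only uses the console summariser), ResultCollector accepts a file name; the path here is just an example:

// Optional: persist results to a JTL file in addition to the summariser output.
ResultCollector logger = new ResultCollector(summer);
logger.setFilename("D:\\apache-jmeter-4.0\\bin\\result.jtl"); // example path
testPlanTree.add(testPlanTree.getArray()[0], logger);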
Usually, we compile Java code into a JAR, push it to jmeter/lib/ext, and use it in JMeter.
If you want to use JMeter from Java instead, I suggest downloading the JMeter source code and merging it into your project.

How to declare second database for testing in Play Framework?

I want to run unit tests on a database other than the default one. Here is my application.conf:
application.secret="[cut]"
application.langs="en"
db.default.driver=com.mysql.jdbc.Driver
db.default.url="jdbc:mysql://localhost:3306/city_game?characterEncoding=UTF-8"
db.default.user=root
db.default.password=""
db.test.driver=com.mysql.jdbc.Driver
db.test.url="jdbc:mysql://localhost:3306/play_test?characterEncoding=UTF-8"
db.test.user=root
db.test.password=""
ebean.default="models.*"
ebean.test="models.*"
logger.root=ERROR
logger.play=INFO
logger.application=DEBUG
BaseModelTest.java:
package models;

import com.avaje.ebean.Ebean;
import com.avaje.ebean.EbeanServer;
import com.avaje.ebean.config.ServerConfig;
import com.avaje.ebeaninternal.server.ddl.DdlGenerator;
import com.avaje.ebean.config.dbplatform.MySqlPlatform;
import com.avaje.ebeaninternal.api.SpiEbeanServer;
import org.junit.AfterClass;
import org.junit.Before;
import org.junit.BeforeClass;
import play.test.FakeApplication;
import play.test.Helpers;

import java.io.IOException;

public class BaseModelTest {

    public static FakeApplication app;
    public static DdlGenerator ddl;

    @BeforeClass
    public static void startApp() throws IOException {
        app = Helpers.fakeApplication();
        Helpers.start(app);

        String serverName = "test";
        EbeanServer server = Ebean.getServer(serverName);
        ServerConfig config = new ServerConfig();

        ddl = new DdlGenerator();
        ddl.setup((SpiEbeanServer) server, new MySqlPlatform(), config);
    }

    @AfterClass
    public static void stopApp() {
        Helpers.stop(app);
    }

    @Before
    public void dropCreateDb() throws IOException {
        // Drop
        ddl.runScript(false, ddl.generateDropDdl());

        // Create
        ddl.runScript(false, ddl.generateCreateDdl());
    }
}
However, I get results saved in the default database, and the test one has its tables created but left empty. What I expect is the results written to the test db and the default one untouched.
I somehow ended up with a different approach.
I still created a separate real test database instance (because of stored procedures), but instead I used a Play 1-like approach.
I have separate configuration files beneath my main configuration (e.g. test configuration, prod-specific stuff, stage-specific stuff, etc.).
I load it via Global.scala as shown below (please note the example provided below works in the Play for Java version as well):
object Global extends GlobalSettings {

  override def onLoadConfig(config: Configuration, path: File, cl: ClassLoader, mode: Mode.Mode): Configuration = {
    val modeFile: String = s"application.${mode.toString.toLowerCase}.conf"
    Logger.error(s"Loading {${path.toURI}conf/application.conf}")
    Logger.error(s"Appending mode specific configuration {${path.toURI}conf/$modeFile}")
    val modeConfig = config ++ Configuration(ConfigFactory.load(modeFile))
    super.onLoadConfig(modeConfig, path, cl, mode)
  }
}
And the application.test.conf config file is as follows:
# test database
db.default.logStatements=false
db.default.jndiName=DefaultDS
db.default.url="jdbc:postgresql://127.0.0.1:5432/db-test"
db.default.user=user
db.default.password="password!##$"
db.default.driver=org.postgresql.Driver
This way I get the following benefits:
I still write my tests the usual way
Play evolutions gets tested on CI / jenkins as well
I have to write my tests in a way that I can safely rerun them on an existing db instance with minimal assumptions about data and user base. That way I'm 90% certain I will be able to run them against a staging / prod environment with much less friction. (Controversial point)
I think you should separate your code as follows:
@BeforeClass
public static void startApp() throws IOException {
    app = Helpers.fakeApplication();
    Helpers.start(app);
}

@Before
public void dropCreateDb() throws IOException {
    String serverName = "test";
    EbeanServer server = Ebean.getServer(serverName);
    ServerConfig config = new ServerConfig();
    DdlGenerator ddl = new DdlGenerator((SpiEbeanServer) server, new MySqlPlatform(), config);

    // Drop
    ddl.runScript(false, ddl.generateDropDdl());

    // Create
    ddl.runScript(false, ddl.generateCreateDdl());
}

Running command line tools inside of a Java program

Hello StackOverflow Community,
I have JUnit tests that need to run a server with the command mvn exec:java, and I need to delete the contents of a directory before the tests are executed. Otherwise, the JUnit tests will fail. Is there any way I can include these steps in my source code?
Ejay
You should use JUnit's @BeforeClass annotation, which will be called before the first test starts, to clean up the target directory. You should also use the commons-io library to avoid unnecessary coding.
import java.io.File;
import java.io.IOException;

import org.apache.commons.io.FileUtils;
import org.junit.BeforeClass;
import org.junit.Test;

public class DeleteDirectoryTest {

    private static final String DIRECTORY_PATH = "C:/TEMP";

    @BeforeClass
    public static void cleanUp() throws IOException {
        FileUtils.deleteDirectory(new File(DIRECTORY_PATH));
    }

    @Test
    public void doSomeTest() {
        // Test code goes here
    }
}
You can place a recursive delete for your directory in your JUnit @BeforeClass init method.
// Deletes everything inside dir, but keeps dir itself.
public static boolean emptyDir(File dir) {
    if (dir.isDirectory()) {
        String[] children = dir.list();
        for (int i = 0; i < children.length; i++) {
            boolean success = deleteDir(new File(dir, children[i]));
            if (!success) {
                return false;
            }
        }
    }
    return true;
}

// Helper used above: recursively deletes a file or directory.
private static boolean deleteDir(File file) {
    if (file.isDirectory()) {
        for (String child : file.list()) {
            if (!deleteDir(new File(file, child))) {
                return false;
            }
        }
    }
    return file.delete();
}
You can use ProcessBuilder to execute commands from Java applications; a minimal sketch is shown below.
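For example, a minimal sketch of a JUnit 4 base class that starts the server with mvn exec:java before the tests and stops it afterwards, assuming mvn is on the PATH; the fixed sleep is only a placeholder for a real readiness check:

import org.junit.AfterClass;
import org.junit.BeforeClass;

public class ServerBackedTestBase {

    private static Process server;

    @BeforeClass
    public static void startServer() throws Exception {
        // Start the server with "mvn exec:java" before any test runs.
        server = new ProcessBuilder("mvn", "exec:java")
                .inheritIO()
                .start();
        // Placeholder: in practice, poll the server's port until it accepts connections.
        Thread.sleep(5000);
    }

    @AfterClass
    public static void stopServer() {
        if (server != null) {
            server.destroy();
        }
    }
}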
