How to run Cucumber examples in parallel using TestNG [duplicate] - java

I have the feature files below (separate feature files) in src/test/resources/feature/ and I would like to run them in parallel: one feature file should execute in Chrome and another in Firefox, as indicated by the tag names.
Feature: Refund item

  @chrome
  Scenario: Jeff returns a faulty microwave
    Given Jeff has bought a microwave for $100
    And he has a receipt
    When he returns the microwave
    Then Jeff should be refunded $100

Feature: Refund Money

  @firefox
  Scenario: Jeff returns the money
    Given Jeff has bought a microwave for $100
    And he has a receipt
    When he returns the microwave
    Then Jeff should be refunded $100
Can somebody assist me in achieving this? I'm using cucumber-java version 1.2.2, with AbstractTestNGCucumberTests as the runner. Also, let me know how I can create a Test Runner dynamically from the feature files and make them run in parallel.

Update: version 4.0.0 is available at the Maven central repository, with a bunch of changes; for more details go here.
Update: version 2.2.0 is available at the Maven central repository.
You can use the open-source plugin cucumber-jvm-parallel-plugin, which has many advantages over existing solutions. It is available at the Maven repository:
<dependency>
    <groupId>com.github.temyers</groupId>
    <artifactId>cucumber-jvm-parallel-plugin</artifactId>
    <version>2.1.0</version>
</dependency>
First, you need to add this plugin with the required configuration to your project's pom file:
<plugin>
    <groupId>com.github.temyers</groupId>
    <artifactId>cucumber-jvm-parallel-plugin</artifactId>
    <version>2.1.0</version>
    <executions>
        <execution>
            <id>generateRunners</id>
            <phase>generate-test-sources</phase>
            <goals>
                <goal>generateRunners</goal>
            </goals>
            <configuration>
                <!-- Mandatory -->
                <!-- comma separated list of package names to scan for glue code -->
                <glue>foo, bar</glue>
                <outputDirectory>${project.build.directory}/generated-test-sources/cucumber</outputDirectory>
                <!-- The directory, which must be in the root of the runtime classpath, containing your feature files. -->
                <featuresDirectory>src/test/resources/features/</featuresDirectory>
                <!-- Directory where the cucumber report files shall be written -->
                <cucumberOutputDir>target/cucumber-parallel</cucumberOutputDir>
                <!-- comma separated list of output formats json,html,rerun.txt -->
                <format>json</format>
                <!-- CucumberOptions.strict property -->
                <strict>true</strict>
                <!-- CucumberOptions.monochrome property -->
                <monochrome>true</monochrome>
                <!-- The tags to run, maps to CucumberOptions.tags property. You can pass ANDed tags like "@tag1","@tag2" and ORed tags like "@tag1,@tag2,@tag3" -->
                <tags></tags>
                <!-- If set to true, only feature files containing the required tags shall be generated. -->
                <filterFeaturesByTags>false</filterFeaturesByTags>
                <!-- Generate TestNG runners instead of default JUnit ones. -->
                <useTestNG>false</useTestNG>
                <!-- The naming scheme to use for the generated test classes. One of 'simple' or 'feature-title' -->
                <namingScheme>simple</namingScheme>
                <!-- The class naming pattern to use. Only required/used if naming scheme is 'pattern'. -->
                <namingPattern>Parallel{c}IT</namingPattern>
                <!-- One of [SCENARIO, FEATURE]. SCENARIO generates one runner per scenario. FEATURE generates a runner per feature. -->
                <parallelScheme>SCENARIO</parallelScheme>
                <!-- This is optional, required only if you want to specify a custom template for the generated sources (this is a relative path) -->
                <customVmTemplate>src/test/resources/cucumber-custom-runner.vm</customVmTemplate>
            </configuration>
        </execution>
    </executions>
</plugin>
Now add the plugin below, just after the one above; it will invoke the runner classes generated by the first plugin:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.19</version>
    <configuration>
        <forkCount>5</forkCount>
        <reuseForks>true</reuseForks>
        <includes>
            <include>**/*IT.class</include>
        </includes>
    </configuration>
</plugin>
The above two plugins do the magic of running Cucumber tests in parallel (provided your machine also has the hardware to support it).
Choose <forkCount>n</forkCount> carefully: here 'n' is directly proportional to 1) your hardware capacity and 2) the available nodes, i.e. the browser instances registered to the HUB.
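With both plugins in place, a regular Maven build is enough to regenerate the runners and execute them in parallel, for example:
mvn clean test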
One major and most important change is that your WebDriver class must be SHARED, and you should not call driver.quit(); closing is taken care of by a shutdown hook.
import cucumber.api.Scenario;
import cucumber.api.java.After;
import cucumber.api.java.Before;
import org.openqa.selenium.OutputType;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebDriverException;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.support.events.EventFiringWebDriver;

public class SharedDriver extends EventFiringWebDriver {
    private static WebDriver REAL_DRIVER = null;

    private static final Thread CLOSE_THREAD = new Thread() {
        @Override
        public void run() {
            REAL_DRIVER.close();
        }
    };

    static {
        Runtime.getRuntime().addShutdownHook(CLOSE_THREAD);
    }

    public SharedDriver() {
        super(createDriver());
    }

    // Lazily creates the single shared driver instance.
    public static WebDriver createDriver() {
        if (REAL_DRIVER == null) {
            setWebDriver(new FirefoxDriver());
        }
        return REAL_DRIVER;
    }

    public static void setWebDriver(WebDriver webDriver) {
        REAL_DRIVER = webDriver;
    }

    public static WebDriver getWebDriver() {
        return REAL_DRIVER;
    }

    @Override
    public void close() {
        if (Thread.currentThread() != CLOSE_THREAD) {
            throw new UnsupportedOperationException("You shouldn't close this WebDriver. It's shared and will close when the JVM exits.");
        }
        super.close();
    }

    @Before
    public void deleteAllCookies() {
        manage().deleteAllCookies();
    }

    @After
    public void embedScreenshot(Scenario scenario) {
        try {
            byte[] screenshot = getScreenshotAs(OutputType.BYTES);
            scenario.embed(screenshot, "image/png");
        } catch (WebDriverException somePlatformsDontSupportScreenshots) {
            System.err.println(somePlatformsDontSupportScreenshots.getMessage());
        }
    }
}
If you want to execute more than 50 threads, i.e. the same number of browser instances registered to the HUB, the hub will die if it doesn't get enough memory. To avoid this critical situation you should start the hub with -DPOOL_MAX=512 (or larger), as stated in the Grid2 documentation:
Really large (>50 node) Hub installations may need to increase the jetty threads by setting -DPOOL_MAX=512 (or larger) on the java command line.
java -jar selenium-server-standalone-<version>.jar -role hub -DPOOL_MAX=512
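Each node then registers itself to the hub; for a Grid 2 setup the usual node-side command looks like this (adjust the hub URL to your environment):
java -jar selenium-server-standalone-<version>.jar -role node -hub http://localhost:4444/grid/register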

Cucumber does not support parallel execution out of the box.
I've tried, but it is not friendly.
We have to use Maven's capability to invoke it in parallel (refer to the link).
There is also a GitHub project which uses a custom plugin to execute in parallel.
Refer to cucumber-jvm-parallel-plugin.

If all you are expecting is to be able to run multiple features in parallel, then you can try the following:
Duplicate the class AbstractTestNGCucumberTests in your test project and set the attribute parallel=true on the @DataProvider annotated method.
Since the default dataprovider-thread-count in TestNG is 10, and you have now instructed TestNG to run features in parallel, you should start seeing your feature files executed in parallel.
But be aware that Cucumber reporting is inherently not thread safe, so your reports may appear garbled.

I achieved Cucumber parallelism using courgette-jvm. It worked out of the box and runs parallel tests at scenario level.
Simply include a runner class similar to the one below. My tests additionally use RemoteWebDriver to open multiple instances on a Selenium grid. Make sure the grid is up and running and the node is registered to the grid.
import courgette.api.CourgetteOptions;
import courgette.api.CourgetteRunLevel;
import courgette.api.CucumberOptions;
import courgette.api.testng.TestNGCourgette;
import io.cucumber.testng.AbstractTestNGCucumberTests;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

@Test
@CourgetteOptions(
    threads = 10,
    runLevel = CourgetteRunLevel.SCENARIO,
    rerunFailedScenarios = true,
    rerunAttempts = 1,
    showTestOutput = true,
    reportTitle = "Courgette-JVM Example",
    reportTargetDir = "build",
    environmentInfo = "browser=chrome; git_branch=master",
    cucumberOptions = @CucumberOptions(
        features = "src/test/resources/com/test/",
        glue = "com.test.stepdefs",
        publish = true,
        plugin = {
            "pretty",
            "json:target/cucumber-report/cucumber.json",
            "html:target/cucumber-report/cucumber.html"}
    ))
class AcceptanceIT extends TestNGCourgette {
}
The RemoteWebDriver config is:
import java.net.MalformedURLException;
import java.net.URL;
import org.openqa.selenium.Platform;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.firefox.FirefoxOptions;
import org.openqa.selenium.firefox.FirefoxProfile;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.openqa.selenium.remote.RemoteWebDriver;

// One driver per thread, since scenarios run in parallel.
private static final ThreadLocal<RemoteWebDriver> driver = new ThreadLocal<>();

protected RemoteWebDriver createDriver() throws MalformedURLException {
    String hubURL = "http://192.168.1.7:65299/wd/hub";
    System.setProperty("webdriver.gecko.driver", "/Users/amit/Desktop/amit/projects/misc/geckodriver");
    FirefoxProfile profile = new FirefoxProfile();
    DesiredCapabilities capabilities = DesiredCapabilities.firefox();
    capabilities.setCapability(FirefoxDriver.PROFILE, profile);
    capabilities.setPlatform(Platform.ANY);
    FirefoxOptions options = new FirefoxOptions();
    options.merge(capabilities);
    driver.set(new RemoteWebDriver(new URL(hubURL), options));
    return driver.get();
}

To take maximum advantage of TestNG you can use TestNG's third-party extension, the QAF framework. It supports multiple BDD syntaxes, including Gherkin via GherkinFactory.
While using BDD with QAF, you can take advantage of every TestNG feature, including data providers, parallel execution configuration at different levels (groups/tests/methods), and TestNG listeners.
QAF treats each scenario as a TestNG test and a Scenario Outline as a TestNG data-driven test. As QAF provides driver management and resource management built in, you don't need to write a single line of code for driver or resource management. All you need to do is create a TestNG XML configuration file as per your requirement, either to run parallel methods (scenarios) or groups or XML tests on one or more browsers.
It enables different possible configuration combinations. Below is the XML configuration to address this question, which will run scenarios in two browsers in parallel. You can configure the number of threads for each browser as in standard TestNG XML configuration as well.
<suite name="AUT Test Automation" verbose="0" parallel="tests">
    <test name="Test-on-chrome">
        <parameter name="scenario.file.loc" value="resources/features" />
        <parameter name="driver.name" value="chromeDriver" />
        <classes>
            <class name="com.qmetry.qaf.automation.step.client.gherkin.GherkinScenarioFactory" />
        </classes>
    </test>
    <test name="Test-on-FF">
        <parameter name="scenario.file.loc" value="resources/features" />
        <parameter name="driver.name" value="firefoxDriver" />
        <classes>
            <class name="com.qmetry.qaf.automation.step.client.gherkin.GherkinScenarioFactory" />
        </classes>
    </test>
</suite>
Moreover, the latest BDDTestFactory2 supports syntax that is derived from QAF BDD, JBehave, and Gherkin. It supports meta-data from QAF BDD as tags and examples from Gherkin. You can take advantage of the built-in data providers to provide test data in XML/JSON/CSV/Excel/DB using meta-data in BDD.
EDIT:
Cucumber 5 users can take advantage of the qaf-cucumber support library.

Since v4 of Cucumber you can do this without extensions or plugins.
https://github.com/cucumber/cucumber-jvm/tree/main/cucumber-testng#parallel-execution
Cucumber TestNG supports parallel execution of scenarios. Override the scenarios method to enable parallel execution.
public class RunCucumberTest extends AbstractTestNGCucumberTests {

    @Override
    @DataProvider(parallel = true)
    public Object[][] scenarios() {
        return super.scenarios();
    }
}
Maven Surefire plugin configuration for parallel execution
<plugins>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <configuration>
            <properties>
                <property>
                    <name>dataproviderthreadcount</name>
                    <value>${threadcount}</value>
                </property>
            </properties>
        </configuration>
    </plugin>
</plugins>
Where dataproviderthreadcount is the default number of threads to use for data providers when running tests in parallel.
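For example, with the ${threadcount} property wired up as above, the thread count can be chosen per run:
mvn test -Dthreadcount=4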

Related

TestNG. Change testng-results to not include configuration methods

I'm trying to reconfigure my test results to not show skip results for configuration methods, as it skews my data.
I'm running tests through TestNG where each method has a @BeforeMethod and @AfterMethod configuration method. In the beforeMethod I check whether a @Test method should run or not, and if not I throw a SkipException to skip it.
In my current situation I have 2 test methods I run. One is made to fail and the other is designed to be skipped. So I expect to get a result of 1 fail and 1 skip. In the IDE console this is the result I get, but when I run it through Maven I get 1 fail and 3 skips. Here is my emailable-result.html. The failed test case has no @BeforeMethod or @AfterMethod showing.
I found out about the IConfigurationListener but I am not sure how to use it to remove a configuration method from the report. I am also using Maven Surefire.
This is what I have so far:
public class MyConfigurationListenerAdapter implements IConfigurationListener {

    @Override
    public void onConfigurationSkip(ITestResult itr) {
        String configName = itr.getName();
        if (configName.equals("beforeMethod") || configName.equals("afterMethod")) {
            //TODO remove itr from test result
        }
    }
}
pom.xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.14.1</version>
    <configuration>
        <properties>
            <property>
                <name>listener</name>
                <value>java_accelerator.testng.classes.MyConfigurationListenerAdapter</value>
            </property>
        </properties>
        <!-- Suite testng xml file to consider for test execution -->
        <suiteXmlFiles>
            <suiteXmlFile>src/test/java/testclasses/tests/testng.xml</suiteXmlFile>
        </suiteXmlFiles>
    </configuration>
</plugin>
Anyone able to help me from here?
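For illustration, one hedged way the TODO above might be filled in (untested; it assumes your TestNG version exposes ITestResult.getTestContext() and that IResultMap.removeResult(ITestResult) behaves as documented):
@Override
public void onConfigurationSkip(ITestResult itr) {
    String configName = itr.getName();
    if (configName.equals("beforeMethod") || configName.equals("afterMethod")) {
        // Assumption: dropping the skipped configuration result from the
        // context's result map hides it from reporters downstream.
        itr.getTestContext().getSkippedConfigurations().removeResult(itr);
    }
}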

System properties and parallel test execution in JUnit 5

One of our classes uses a system property to load a configuration file. In a particular test case, we set this property to an invalid value to check the behavior of the class under test (CUT) in this situation.
The same class is also used transitively in various integration tests. Since we are using JUnit Jupiter's parallel test execution capabilities, we are witnessing race conditions in those integration tests that are using the CUT under the hood. They are failing because the system property is sometimes still invalid.
The parallelism itself is configured globally via the Maven Surefire plugin:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.22.0</version>
    <configuration>
        <properties>
            <configurationParameters>
                junit.jupiter.execution.parallel.enabled=true
                junit.jupiter.execution.parallel.config.dynamic.factor=1
            </configurationParameters>
        </properties>
    </configuration>
</plugin>
Consequently, all tests run in parallel per default. The CUT looks like this:
class AttributesProviderTest {

    @Test
    @ResourceLock(value = SYSTEM_PROPERTIES, mode = READ_WRITE)
    void invalid_attributes_file_should_yield_UncheckedIOException() throws Exception {
        final Properties backup = new Properties();
        backup.putAll(System.getProperties());

        final String attributesFile = "foo";
        System.setProperty(AttributesProvider.ATTRIBUTES_FILE_PROPERTY, attributesFile);

        assertThatThrownBy(AttributesProvider::getTestInstance)
            .isExactlyInstanceOf(UncheckedIOException.class)
            .hasMessage("Cannot read attributes file '" + attributesFile + "'.");

        System.setProperties(backup);
    }

    // Some other tests...
}
As can be seen, we are trying to synchronize the system property access. However, if I have understood correctly, this only works for other @ResourceLock-annotated tests, and does not magically synchronize system property access in general?
Is there a way to fix the race condition (without annotating all other tests)? Some ideas:
Ensure the CUT is executed sequentially at the beginning (or some other sort of synchronization).
Refactor the CUT and invoke corresponding file-reading method with a parameter directly.
Throw away the test case.
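A minimal sketch of the first idea, assuming JUnit Jupiter 5.7+ where @Isolated is available: annotating the class makes its tests run exclusively, so no concurrently running test can observe the temporarily invalid system property.
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.parallel.Isolated;

// Runs with nothing else in parallel, so the temporary
// system-property mutation cannot leak into other tests.
@Isolated
class AttributesProviderTest {

    @Test
    void invalid_attributes_file_should_yield_UncheckedIOException() {
        // ... test body as above ...
    }
}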

JUnit 4 - How to ignore all package with the tests?

I run my JUnit and Mockito tests in a big project. I use them for testing my server-side components that connect to a web service. All these connections require some time, and it is not necessary for them to be executed during the build.
I would like my tests to be ignored during the build.
I have about 10 classes with tests. So the obvious way is to annotate all the classes with @Ignore. However, I would have to do this every time I commit my code to the project and then re-annotate all tests afterwards. Not the best solution, I think.
So is it possible to simply ignore a whole package (say com.example.tests) with the tests?
Or what might be a simple way to ignore tests during the build?
You can specify in your build.gradle which packages to exclude from tests:
test {
    exclude '**/*IntegrationTest*'
}
The same goes for Maven, but you must consider this notation:
By default, the Surefire Plugin will automatically include all test classes with the following wildcard patterns:
"**/Test*.java" - includes all of its subdirectories and all Java filenames that start with "Test".
"**/*Test.java" - includes all of its subdirectories and all Java filenames that end with "Test".
"**/*Tests.java" - includes all of its subdirectories and all Java filenames that end with "Tests".
"**/*TestCase.java" - includes all of its subdirectories and all Java filenames that end with "TestCase".
<project>
    [...]
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.20</version>
                <configuration>
                    <excludes>
                        <exclude>*com.example.tests*/*Test.java</exclude>
                    </excludes>
                </configuration>
            </plugin>
        </plugins>
    </build>
    [...]
</project>
Another option is the old:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.19.1</version>
    <configuration>
        <skipTests>true</skipTests>
    </configuration>
</plugin>
or even skipping them when invoking Maven:
mvn install -DskipTests
Using Categories seems to be an option that can come in handy.
This is how you may add them to your Gradle script:
test {
    useJUnit {
        includeCategories 'org.gradle.junit.CategoryA'
        excludeCategories 'org.gradle.junit.CategoryB'
    }
}
A sample can be found here, adding it for a quick reference.
public interface FastTests {
    /* category marker */
}

public interface SlowTests {
    /* category marker */
}

public class A {
    @Category(SlowTests.class)
    @Test
    public void a() {
    }
}

@Category(FastTests.class)
public class B {
    @Test
    public void b() {
    }
}

@RunWith(Categories.class)
@IncludeCategory(SlowTests.class)
@ExcludeCategory(FastTests.class)
@SuiteClasses({ A.class, B.class })
public class SlowTestSuite {
}
I have found the solution for my case.
To disable all the tests during the build, or in any other context you want, the Spring annotation @IfProfileValue can be used. All tests with this annotation will be executed only in the wanted context.
The example is this:
@IfProfileValue(name = "enableTests", value = "true")
public class DemoApplicationTests {

    @Test
    public void contextLoads() {
        ...
    }
}
In my IDE I can edit the run configuration and set the variable with:
-DenableTests=true
This annotation can be used at the level of a class or at the level of an individual test.
All classes or tests annotated with @IfProfileValue will be executed only in my environment and will be ignored during the build.
This approach is the best for me because it is not convenient in my project to change the main pom.xml for my own test needs.
In addition, in Spring or Spring Boot you should add a Runner.
For example, in Spring:
@RunWith(SpringJUnit4ClassRunner.class)
@IfProfileValue(name = "enableTests", value = "true")
@ContextConfiguration(classes = { YourClassConfig.class })
YourClassConfig might be empty:
@Configuration
public class YourClassConfig {
}

Easier DynamoDB local testing

I'm using DynamoDB local for unit testing. It's not bad, but has some drawbacks. Specifically:
You have to somehow start the server before your tests run
The server isn't started and stopped before each test so tests become inter-dependent unless you add code to delete all tables, etc. after each test
All developers need to have it installed
What I want to do is something like put the DynamoDB local jar, and the other jars upon which it depends, in my test/resources directory (I'm writing in Java). Then before each test I'd start it up, running with -inMemory, and after the test I'd stop it. That way anyone pulling down the git repo gets a copy of everything they need to run the tests and each test is independent of the others.
I have found a way to make this work, but it's ugly, so I'm looking for alternatives. The solution I have is to put a .zip file of the DynamoDB local stuff in test/resources, then in the @Before method, I'd extract it to some temporary directory and start a new java process to execute it. That works, but it's ugly and has some drawbacks:
Everyone needs the java executable on their $PATH
I have to unpack a zip to the local disk. Using local disk is often dicey for testing, especially with continuous builds and such.
I have to spawn a process and wait for it to start for each unit test, and then kill that process after each test. Besides being slow, the potential for left-over processes seems ugly.
It seems like there should be an easier way. DynamoDB Local is, after all, just Java code. Can't I somehow ask the JVM to fork itself and look inside the resources to build a classpath? Or, even better, can't I just call the main method of DynamoDB Local from some other thread so this all happens in a single process? Any ideas?
PS: I am aware of Alternator, but it appears to have other drawbacks so I'm inclined to stick with Amazon's supported solution if I can make it work.
In order to use DynamoDBLocal you need to follow these steps.
Get Direct DynamoDBLocal Dependency
Get Native SQLite4Java dependencies
Set sqlite4java.library.path to show native libraries
1. Get Direct DynamoDBLocal Dependency
This one is the easy one. You need this repository as explained here.
<!--Dependency:-->
<dependencies>
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>DynamoDBLocal</artifactId>
        <version>1.11.0.1</version>
        <scope></scope>
    </dependency>
</dependencies>
<!--Custom repository:-->
<repositories>
    <repository>
        <id>dynamodb-local</id>
        <name>DynamoDB Local Release Repository</name>
        <url>https://s3-us-west-2.amazonaws.com/dynamodb-local/release</url>
    </repository>
</repositories>
2. Get Native SQLite4Java dependencies
If you do not add these dependencies, your tests will fail with a 500 internal error.
First, add these dependencies:
<dependency>
    <groupId>com.almworks.sqlite4java</groupId>
    <artifactId>sqlite4java</artifactId>
    <version>1.0.392</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>com.almworks.sqlite4java</groupId>
    <artifactId>sqlite4java-win32-x86</artifactId>
    <version>1.0.392</version>
    <type>dll</type>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>com.almworks.sqlite4java</groupId>
    <artifactId>sqlite4java-win32-x64</artifactId>
    <version>1.0.392</version>
    <type>dll</type>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>com.almworks.sqlite4java</groupId>
    <artifactId>libsqlite4java-osx</artifactId>
    <version>1.0.392</version>
    <type>dylib</type>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>com.almworks.sqlite4java</groupId>
    <artifactId>libsqlite4java-linux-i386</artifactId>
    <version>1.0.392</version>
    <type>so</type>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>com.almworks.sqlite4java</groupId>
    <artifactId>libsqlite4java-linux-amd64</artifactId>
    <version>1.0.392</version>
    <type>so</type>
    <scope>test</scope>
</dependency>
Then, add this plugin to get native dependencies to specific folder:
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-dependency-plugin</artifactId>
            <version>2.10</version>
            <executions>
                <execution>
                    <id>copy</id>
                    <phase>test-compile</phase>
                    <goals>
                        <goal>copy-dependencies</goal>
                    </goals>
                    <configuration>
                        <includeScope>test</includeScope>
                        <includeTypes>so,dll,dylib</includeTypes>
                        <outputDirectory>${project.basedir}/native-libs</outputDirectory>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
3. Set sqlite4java.library.path to show native libraries
As the last step, you need to set the sqlite4java.library.path system property to the native-libs directory. It is OK to do that just before creating your local server.
System.setProperty("sqlite4java.library.path", "native-libs");
After these steps you can use DynamoDBLocal as you want. Here is a JUnit rule that creates a local server for that.
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient;
import com.amazonaws.services.dynamodbv2.local.main.ServerRunner;
import com.amazonaws.services.dynamodbv2.local.server.DynamoDBProxyServer;
import org.junit.rules.ExternalResource;

import java.io.IOException;
import java.net.ServerSocket;

/**
 * Creates a local DynamoDB instance for testing.
 */
public class LocalDynamoDBCreationRule extends ExternalResource {

    private DynamoDBProxyServer server;
    private AmazonDynamoDB amazonDynamoDB;

    public LocalDynamoDBCreationRule() {
        // This one should be copied during test-compile time. If project's basedir does not contain a folder
        // named 'native-libs' please try '$ mvn clean install' from command line first
        System.setProperty("sqlite4java.library.path", "native-libs");
    }

    @Override
    protected void before() throws Throwable {
        try {
            final String port = getAvailablePort();
            this.server = ServerRunner.createServerFromCommandLineArgs(new String[]{"-inMemory", "-port", port});
            server.start();
            amazonDynamoDB = new AmazonDynamoDBClient(new BasicAWSCredentials("access", "secret"));
            amazonDynamoDB.setEndpoint("http://localhost:" + port);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    @Override
    protected void after() {
        if (server == null) {
            return;
        }
        try {
            server.stop();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public AmazonDynamoDB getAmazonDynamoDB() {
        return amazonDynamoDB;
    }

    private String getAvailablePort() {
        try (final ServerSocket serverSocket = new ServerSocket(0)) {
            return String.valueOf(serverSocket.getLocalPort());
        } catch (IOException e) {
            throw new RuntimeException("Available port was not found", e);
        }
    }
}
You can use this rule like this:
@RunWith(JUnit4.class)
public class UserDAOImplTest {

    @ClassRule
    public static final LocalDynamoDBCreationRule dynamoDB = new LocalDynamoDBCreationRule();
}
In August 2018 Amazon announced a new Docker image with Amazon DynamoDB Local onboard. It does not require downloading and running any JARs, or using third-party OS-specific binaries (I'm talking about sqlite4java).
It is as simple as starting a Docker container before the tests:
docker run -p 8000:8000 amazon/dynamodb-local
You can do that manually for local development, as described above, or use it in your CI pipeline. Many CI services provide an ability to start additional containers during the pipeline that can provide dependencies for your tests. Here is an example for Gitlab CI/CD:
test:
  stage: test
  image: openjdk:8-alpine
  services:
    - name: amazon/dynamodb-local
      alias: dynamodb-local
  script:
    - DYNAMODB_LOCAL_URL=http://dynamodb-local:8000 ./gradlew clean test
Or Bitbucket Pipelines:
definitions:
  services:
    dynamodb-local:
      image: amazon/dynamodb-local
…
step:
  name: test
  image:
    name: openjdk:8-alpine
  services:
    - dynamodb-local
  script:
    - DYNAMODB_LOCAL_URL=http://localhost:8000 ./gradlew clean test
And so on. The idea is to move all the configuration you can see in other answers out of your build tool and provide the dependency externally. Think of it as of dependency injection / IoC but for the whole service, not just a single bean.
After you've started the container you can create a client pointing to it:
private AmazonDynamoDB createAmazonDynamoDB(final DynamoDBLocal configuration) {
    return AmazonDynamoDBClientBuilder
        .standard()
        .withEndpointConfiguration(
            new AwsClientBuilder.EndpointConfiguration(
                "http://localhost:8000",
                Regions.US_EAST_1.getName()
            )
        )
        .withCredentials(
            new AWSStaticCredentialsProvider(
                // DynamoDB Local works with any non-null credentials
                new BasicAWSCredentials("", "")
            )
        )
        .build();
}
Now to the original questions:
You have to somehow start the server before your tests run
You can just start it manually, or prepare a developers' script for it. IDEs usually provide a way to run arbitrary commands before executing a task, so you can make the IDE start the container for you. I think that running something locally should not be a top priority in this case; instead you should focus on configuring CI and let the developers start the container in whatever way is comfortable to them.
The server isn't started and stopped before each test so tests become inter-dependent unless you add code to delete all tables, etc. after each test
That's true, but… you should not start and stop such heavyweight things
and recreate tables before / after each test. DB tests are almost always inter-dependent, and that's OK for them. Just use unique values for each test case (e.g. set the item's hash key to the ticket id / specific test case id you're working on). As for the seed data, I'd recommend moving it out of the build tool and test code as well. Either make your own image with all the data you need or use the AWS CLI to create tables and insert data. Follow the single responsibility and dependency injection principles: your test code must not do anything but tests. All the environment (tables and data in this case) should be provided for them. Creating a table in a test is wrong, because in real life that table already exists (unless you're testing a method that actually creates a table, of course).
All developers need to have it installed
Docker should be a must for every developer in 2018, so that's not a problem.
And if you're using JUnit 5, it can be a good idea to use a DynamoDB Local extension that will inject the client into your tests (yes, I'm doing a self-promotion):
Add a dependency on me.madhead.aws-junit5:dynamo-v1
pom.xml:
<dependency>
    <groupId>me.madhead.aws-junit5</groupId>
    <artifactId>dynamo-v1</artifactId>
    <version>6.0.1</version>
    <scope>test</scope>
</dependency>
build.gradle
dependencies {
    testImplementation("me.madhead.aws-junit5:dynamo-v1:6.0.1")
}
Use the extension in your tests:
@ExtendWith(DynamoDBLocalExtension.class)
class MultipleInjectionsTest {

    @DynamoDBLocal(url = "http://dynamodb-local-1:8000")
    private AmazonDynamoDB first;

    @DynamoDBLocal(urlEnvironmentVariable = "DYNAMODB_LOCAL_URL")
    private AmazonDynamoDB second;

    @Test
    void test() {
        first.listTables();
        second.listTables();
    }
}
This is a restating of bhdrkn's answer for Gradle users (his is based on Maven). It's still the same three steps:
Get Direct DynamoDBLocal Dependency
Get Native SQLite4Java dependencies
Set sqlite4java.library.path to show native libraries
1. Get Direct DynamoDBLocal Dependency
Add to the dependencies section of your build.gradle file...
dependencies {
    testCompile "com.amazonaws:DynamoDBLocal:1.+"
}
2. Get Native SQLite4Java dependencies
The sqlite4java libraries will already be downloaded as a dependency of DynamoDBLocal, but the library files need to be copied to the right place. Add to your build.gradle file...
task copyNativeDeps(type: Copy) {
    from(configurations.compile + configurations.testCompile) {
        include '*.dll'
        include '*.dylib'
        include '*.so'
    }
    into 'build/libs'
}
3. Set sqlite4java.library.path to show native libraries
We need to tell Gradle to run copyNativeDeps for testing and tell sqlite4java where to find the files. Add to your build.gradle file...
test {
    dependsOn copyNativeDeps
    systemProperty "java.library.path", 'build/libs'
}
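With those three pieces in place, an ordinary test run should copy the native libraries into build/libs and point the JVM at them:
./gradlew test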
You can use DynamoDB Local as a Maven test dependency in your test code, as shown in this announcement. You can run it over HTTP:
import com.amazonaws.services.dynamodbv2.local.main.ServerRunner;
import com.amazonaws.services.dynamodbv2.local.server.DynamoDBProxyServer;
final String[] localArgs = { "-inMemory" };
DynamoDBProxyServer server = ServerRunner.createServerFromCommandLineArgs(localArgs);
server.start();
AmazonDynamoDB dynamodb = new AmazonDynamoDBClient();
dynamodb.setEndpoint("http://localhost:8000");
dynamodb.listTables();
server.stop();
You can also run in embedded mode:
import com.amazonaws.services.dynamodbv2.local.embedded.DynamoDBEmbedded;
AmazonDynamoDB dynamodb = DynamoDBEmbedded.create();
dynamodb.listTables();
I have wrapped the answers above into two JUnit rules that do not require changes to the build script, as the rules handle the native library stuff. I did this because I found that IDEA did not like the Gradle/Maven solutions, as it just went off and did its own thing anyhow.
This means the steps are:
Get the AssortmentOfJUnitRules version 1.5.32 or above dependency
Get the Direct DynamoDBLocal dependency
Add the LocalDynamoDbRule or HttpDynamoDbRule to your JUnit test.
Maven:
<!--Dependency:-->
<dependencies>
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>DynamoDBLocal</artifactId>
        <version>1.11.0.1</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>com.github.mlk</groupId>
        <artifactId>assortmentofjunitrules</artifactId>
        <version>1.5.36</version>
        <scope>test</scope>
    </dependency>
</dependencies>
<!--Custom repository:-->
<repositories>
    <repository>
        <id>dynamodb-local</id>
        <name>DynamoDB Local Release Repository</name>
        <url>https://s3-us-west-2.amazonaws.com/dynamodb-local/release</url>
    </repository>
</repositories>
Gradle:
repositories {
    mavenCentral()
    maven {
        url = "https://s3-us-west-2.amazonaws.com/dynamodb-local/release"
    }
}

dependencies {
    testCompile "com.github.mlk:assortmentofjunitrules:1.5.36"
    testCompile "com.amazonaws:DynamoDBLocal:1.+"
}
Code:
public class LocalDynamoDbRuleTest {

    @Rule
    public LocalDynamoDbRule ddb = new LocalDynamoDbRule();

    @Test
    public void test() {
        doDynamoStuff(ddb.getClient());
    }
}
Try out tempest-testing! It ships a JUnit4 Rule and a JUnit5 Extension. It also supports both AWS SDK v1 and SDK v2.
Tempest provides a library for testing DynamoDB clients using DynamoDBLocal. It comes with two implementations:
JVM: This is the preferred option, running a DynamoDBProxyServer backed by sqlite4java, which is available on most platforms.
Docker: This runs dynamodb-local in a Docker container.
Feature matrix:

Feature         tempest-testing-jvm          tempest-testing-docker
Start up time   ~1s                          ~10s
Memory usage    Less                         More
Dependency      sqlite4java native library   Docker
To use tempest-testing, first add this library as a test dependency:
For AWS SDK 1.x:
dependencies {
    testImplementation "app.cash.tempest:tempest-testing-jvm:1.5.2"
    testImplementation "app.cash.tempest:tempest-testing-junit5:1.5.2"
}
// Or
dependencies {
    testImplementation "app.cash.tempest:tempest-testing-docker:1.5.2"
    testImplementation "app.cash.tempest:tempest-testing-junit5:1.5.2"
}
For AWS SDK 2.x:
dependencies {
    testImplementation "app.cash.tempest:tempest2-testing-jvm:1.5.2"
    testImplementation "app.cash.tempest:tempest2-testing-junit5:1.5.2"
}
// Or
dependencies {
    testImplementation "app.cash.tempest:tempest2-testing-docker:1.5.2"
    testImplementation "app.cash.tempest:tempest2-testing-junit5:1.5.2"
}
Then in tests annotated with @org.junit.jupiter.api.Test, you may add TestDynamoDb as a test extension. This extension spins up a DynamoDB server. It shares the server across tests and keeps it running until the process exits. It also manages test tables for you, recreating them before each test.
class MyTest {

    @RegisterExtension
    TestDynamoDb db = new TestDynamoDb.Builder(JvmDynamoDbServer.Factory.INSTANCE) // or DockerDynamoDbServer
        // `MusicItem` is annotated with `@DynamoDBTable`. Tempest recreates this table before each test.
        .addTable(TestTable.create(MusicItem.TABLE_NAME, MusicItem.class))
        .build();

    @Test
    public void test() {
        PutItemRequest request = // ...;
        // Talk to the local DynamoDB.
        db.dynamoDb().putItem(request);
    }
}
It seems like there should be an easier way. DynamoDB Local is, after all, just Java code. Can't I somehow ask the JVM to fork itself and look inside the resources to build a classpath?
You can do something along these lines, but much simpler: programmatically search the classpath for the location of the native libraries, then set the sqlite4java.library.path property before starting DynamoDB. This is the approach implemented in tempest-testing, as well as in this answer (code here), which is why they just work as a pure library/classpath dependency and nothing more.
In my case I needed access to DynamoDB outside of a JUnit extension, but I still wanted something self-contained in library code, so I extracted the approach it takes:
import com.amazonaws.services.dynamodbv2.local.embedded.DynamoDBEmbedded;
import com.amazonaws.services.dynamodbv2.local.shared.access.AmazonDynamoDBLocal;
import com.google.common.collect.MoreCollectors;
import java.io.File;
import java.util.Arrays;
import java.util.stream.Stream;
import org.junit.jupiter.api.condition.OS;
...
public AmazonDynamoDBLocal embeddedDynamoDb() {
    final OS os = Stream.of(OS.values()).filter(OS::isCurrentOs)
        .collect(MoreCollectors.onlyElement());
    final String prefix;
    switch (os) {
        case LINUX:
            prefix = "libsqlite4java-linux-amd64-";
            break;
        case MAC:
            prefix = "libsqlite4java-osx-";
            break;
        case WINDOWS:
            prefix = "sqlite4java-win32-x64-";
            break;
        default:
            throw new UnsupportedOperationException(os.toString());
    }
    System.setProperty("sqlite4java.library.path",
        Arrays.asList(System.getProperty("java.class.path").split(File.pathSeparator))
            .stream()
            .map(File::new)
            .filter(file -> file.getName().startsWith(prefix))
            .collect(MoreCollectors.onlyElement())
            .getParent());
    return DynamoDBEmbedded.create();
}
I have not had a chance to test on a lot of platforms, and the error handling could likely be improved.
It's a pity AWS haven't taken the time to make the library more friendly, as this could easily be done in the library code itself.
For unit testing at work I use Mockito and just mock the AmazonDynamoDBClient, then mock out the returns using when(), like the following:
when(mockAmazonDynamoDBClient.getItem(isA(GetItemRequest.class))).thenAnswer(new Answer<GetItemResult>() {
    @Override
    public GetItemResult answer(InvocationOnMock invocation) throws Throwable {
        GetItemResult result = new GetItemResult();
        result.setItem(testResultItem);
        return result;
    }
});
Not sure if that is what you're looking for, but that's how we do it.
Shortest solution, with a fix for the sqlite4java.SQLiteException UnsatisfiedLinkError, if it is a Java/Kotlin project built with Gradle (a changed $PATH is not needed):
repositories {
    // ... other dependencies
    maven { url 'https://s3-us-west-2.amazonaws.com/dynamodb-local/release' }
}

dependencies {
    testImplementation("com.amazonaws:DynamoDBLocal:1.13.6")
}
import org.gradle.internal.os.OperatingSystem

test {
    doFirst {
        // Fix for: UnsatisfiedLinkError -> provide a valid native lib path
        String nativePrefix = OperatingSystem.current().nativePrefix
        File nativeLib = sourceSets.test.runtimeClasspath.files.find { it.name.startsWith("libsqlite4java") && it.name.contains(nativePrefix) } as File
        systemProperty "sqlite4java.library.path", nativeLib.parent
    }
}
Straightforward usage in test classes (src/test):
private lateinit var db: AmazonDynamoDBLocal

@BeforeAll
fun runDb() { db = DynamoDBEmbedded.create() }

@AfterAll
fun shutdownDb() { db.shutdown() }
There are a couple of Node.js wrappers for DynamoDB Local. These allow you to easily execute unit tests in combination with task runners like gulp or grunt. Try dynamodb-localhost or dynamodb-local.
I have found that the Amazon repo has no index file, so it does not seem to function in a way that allows you to bring it in like this:
maven {
    url = "https://s3-us-west-2.amazonaws.com/dynamodb-local/release"
}
The only way I could get the dependencies to load is by downloading DynamoDBLocal as a jar and bringing it into my build script like this:
dependencies {
    ...
    runtime files('libs/DynamoDBLocal.jar')
    ...
}
Of course this means that all the SQLite and Jetty dependencies need to be brought in by hand - I'm still trying to get this right. If anyone knows of a reliable repo for DynamoDbLocal, I would really love to know.
You could also use this lightweight test container, 'Dynalite':
https://www.testcontainers.org/modules/databases/dynalite/
From testcontainers:
Dynalite is a clone of DynamoDB, enabling local testing. It's light and quick to run.
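A minimal JUnit 4 sketch based on the Testcontainers Dynalite module documentation (DynaliteContainer and its getEndpointConfiguration()/getCredentials() helpers come from that module; verify them against your Testcontainers version):
import org.junit.Rule;
import org.junit.Test;
import org.testcontainers.dynamodb.DynaliteContainer;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;

public class DynaliteTest {

    // Starts a throwaway Dynalite container for the duration of each test.
    @Rule
    public DynaliteContainer dynamoDB = new DynaliteContainer();

    @Test
    public void test() {
        // Endpoint and dummy credentials are supplied by the container.
        AmazonDynamoDB client = AmazonDynamoDBClientBuilder.standard()
                .withEndpointConfiguration(dynamoDB.getEndpointConfiguration())
                .withCredentials(dynamoDB.getCredentials())
                .build();

        client.listTables();
    }
}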
The DynamoDB Gradle dependency already includes the SQLite libraries. You can pretty easily instruct the Java runtime to use them in your Gradle build script. Here's my build.gradle.kts as an example:
import org.apache.tools.ant.taskdefs.condition.Os

plugins {
    application
}

repositories {
    mavenCentral()
    maven {
        url = uri("https://s3-us-west-2.amazonaws.com/dynamodb-local/release")
    }
}

dependencies {
    implementation("com.amazonaws:DynamoDBLocal:[1.12,2.0)")
}

fun getSqlitePath(): String? {
    val dirName = when {
        Os.isFamily(Os.FAMILY_MAC) -> "libsqlite4java-osx"
        Os.isFamily(Os.FAMILY_UNIX) -> "libsqlite4java-linux-amd64"
        Os.isFamily(Os.FAMILY_WINDOWS) -> "sqlite4java-win32-x64"
        else -> throw kotlin.Exception("DynamoDB emulator cannot run on this platform")
    }
    return project.configurations.runtimeClasspath.get().find { it.name.contains(dirName) }?.parent
}

application {
    mainClass.set("com.amazonaws.services.dynamodbv2.local.main.ServerRunner")
    applicationDefaultJvmArgs = listOf("-Djava.library.path=${getSqlitePath()}")
}

tasks.named<JavaExec>("run") {
    args("-inMemory")
}

Using a compile-time environment variable to configure RestApplicationPath

As stated in the documentation of rest-dispatch, the rest application path must be configured in the GIN module via a constant, here "/api/v1":
public class DispatchModule extends AbstractGinModule {

    @Override
    protected void configure() {
        RestDispatchAsyncModule.Builder dispatchBuilder = new RestDispatchAsyncModule.Builder();
        install(dispatchBuilder.build());
        bindConstant().annotatedWith(RestApplicationPath.class).to("/api/v1");
    }
}
I would like the "/api/v1" constant to be resolved at compile time, based on an environment variable set by the build system depending on the target environment (prod, dev, etc...) and on other criteria (the build artifact's major version...).
The problem is I cannot manage to rely on a compile-time variable.
Neither TextResource/CssResource nor GWT's deferred binding will help here, since GWT.create() cannot be used in a GIN module. Another option I considered is using a custom Generator, but this seems too complex for this very simple need.
How do you solve this problem?
If you use Maven as your build system, you could leverage the templating-maven-plugin to generate a Java class containing static variables defined in your POM file. That generated class will then be used by your GWT code.
For example, you would want to populate a BuildConstants class template
public class BuildConstants {
    // will be replaced by Maven
    public static final String API_VERSION = "${myapi.version}";
}
and using a Maven property:
<myapi.version>v1</myapi.version>
that will be compiled to
public class BuildConstants {
    // will be replaced by Maven
    public static final String API_VERSION = "v1";
}
and you could reference those constants from within your DispatchModule:
bindConstant().annotatedWith(RestApplicationPath.class).to("/api/" + BuildConstants.API_VERSION);
Here's a sample config of the templating-maven-plugin that I use in a project:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>templating-maven-plugin</artifactId>
    <version>1.0-alpha-3</version>
    <executions>
        <execution>
            <id>filter-src</id>
            <goals>
                <goal>filter-sources</goal>
            </goals>
            <configuration>
                <sourceDirectory>${basedir}/src/main/java-templates</sourceDirectory>
                <outputDirectory>${project.build.directory}/generated-sources/java-templates</outputDirectory>
            </configuration>
        </execution>
    </executions>
</plugin>
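If I recall correctly, the filter-sources goal binds to the generate-sources phase by default, so the constants class materializes during any normal build; to produce it on its own you can run:
mvn generate-sources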
There's no reason you couldn't replace the bindConstant() with a @Provides method (or another bind().toProvider()), which would let you use a TextResource and/or deferred binding, or whatever.
As an example (untested though), the code below uses JSNI to read the value from the host page, which makes it runtime-dependent (rather than compile-time):
@Provides @RestApplicationPath
native String provideRestApplicationPath() /*-{
    return $wnd.restApplicationPath;
}-*/;
Following Thomas Broyer's and Simon-Pierre's suggestions, you could even bind different root .gwt.xml files depending on your Maven profile. Then you choose the appropriate Gin module class where your constants are bound.
That is what we do inside the CarStore companion project of GWTP to do form factors, for example.
