I'm having trouble resolving the following runtime error: "Multiple HTTP implementations were found on the classpath. To avoid non-deterministic loading implementations, please explicitly provide an HTTP client via the client builders, set the software.amazon.awssdk.http.service.impl system property with the FQCN of the HTTP service to use as the default, or remove all but one HTTP implementation from the classpath"
I have the following two dependencies in my build.gradle:
implementation 'software.amazon.lambda:powertools-parameters:1.12.3'
implementation 'software.amazon.awssdk:sns:2.15.0'
They both seem to pull in a default HTTP client, and at runtime the SDK cannot determine which one to use. Below are the declarations and uses in code:
private static SsmClient client = SsmClient.builder().region(Region.of((region == null) ? Regions.US_EAST_1.getName() : region)).build();
private static SSMProvider ssmProvider = ParamManager.getSsmProvider(client);
static SnsClient sns = SnsClient.builder().credentialsProvider(DefaultCredentialsProvider.builder().build())
.region((region == null) ? Region.US_EAST_1 : Region.of(region)).build();
I cannot remove one from the classpath since I need both for my application, and I have not successfully been able to define an AWS SDK client via the builders.
I tried this but still got the same runtime error:
client = SsmClient.builder().httpClient(new SdkHttpClient() {
    @Override
    public void close() {
    }

    @Override
    public ExecutableHttpRequest prepareRequest(HttpExecuteRequest request) {
        return null;
    }
}).build();
Looks like you will have to exclude one of the versions of the HTTP client in your build.gradle:
implementation('<dependency>') {
exclude group: '<group>', module: '<module>'
}
where dependency is one of:
'software.amazon.lambda:powertools-parameters:1.12.3'
'software.amazon.awssdk:sns:2.15.0'
You can run gradle dependencies to see the dependency tree and figure out which module you want to exclude.
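Alternatively, you can keep both dependencies and explicitly provide an HTTP client via the builders, which is the other fix the error message suggests. A minimal sketch, assuming you also add the software.amazon.awssdk:apache-client dependency:
import software.amazon.awssdk.http.apache.ApacheHttpClient;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.ssm.SsmClient;

// Explicitly name an HTTP implementation so the SDK does not have to
// choose among the several it finds on the classpath.
SsmClient client = SsmClient.builder()
        .region(Region.US_EAST_1)
        .httpClientBuilder(ApacheHttpClient.builder())
        .build();
Do the same on the SnsClient builder; once every client names its implementation explicitly, the ambiguity goes away.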
I have a Gradle/Spring Boot project. I'm trying to run some JUnit tests that use 3rd party annotations to inject classes. The injected classes create a text file (call it 'foo.txt') under $(PROJECT_DIR)/app/build/resources/test. However, the same injected class then uses classLoader.getResource("foo.txt") to try to find the file and process it. The classLoader looks like it is searching for foo.txt under $(PROJECT_DIR)/build/resources/test, and since it can't find it there, the 3rd party class throws an exception.
So I somehow need to either make classLoader.getResource(), called from the 3rd party class, search in $(PROJECT_DIR)/app/build/resources/test for foo.txt, or make the 3rd party class create foo.txt in $(PROJECT_DIR)/build/resources/test.
I was thinking the way to resolve this might be to add $(PROJECT_DIR)/app/build/resources/test to the classpath of the test task in Gradle. But I don't know how to add a new path to the classpath through Gradle; I searched all over Google but don't see any examples.
Also, if there is a better solution I'm open to that too, but I have limitations in how I can resolve this, i.e. assume I can't change the 3rd party app for now, and can't refactor the project structure of the project I am creating tests in.
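Something like this untested sketch is what I'm imagining (the path is my best guess at where the injected class writes the file):
// build.gradle — add the directory the injected class writes to onto the
// test runtime classpath so classLoader.getResource("foo.txt") can see it.
test {
    classpath += files("$projectDir/app/build/resources/test")
}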
Update:
In case it is helpful, here is what the tests look like:
@ExtendWith({3rdPartyClassResolver.class})
public class 3rdPartyTest {
//the actual failing test
@Test
@3rdPartyClass(
storageLocation = "https://storage-location.us/",
application = "app1",
version = "latest",
requestor = "app2")
public void testing3rdPartyClass(Map<3rdPartyClass, List<Event>> map) {
System.out.println(); //I don't make it to this point
}
//this is what the 3rd party class is trying to do.
//I'm able to reproduce it using this test. it fails the same way.
@Test
public void writeToFile() throws Exception {
Path pathnio = Paths.get("build/resources/test/foo.txt");
Files.write(pathnio, Collections.singleton("some text"));
ClassLoader classLoader = Thread.currentThread().getContextClassLoader();
URL resource = classLoader.getResource("foo.txt");
assertNotNull(resource); //this assert fails
}
}
TL;DR: How do you run jar files as deltas using Spring Flyway?
Some background on the problem I encountered.
Take a Flyway Java delta such as the following:
public class V3__0001 implements SpringJdbcMigration {
    @Override
    public void migrate(JdbcTemplate jdbcTemplate) throws Exception {
        List<NewCustomerEntity> newCustomerEntityList = new ArrayList<>();
        for (CustomerEntity customer : findAll(jdbcTemplate)) {
            NewCustomerEntity c = new NewCustomerEntity();
            c.setDate(SomeUtilityTool.transformDate(customer.getDate()));
            newCustomerEntityList.add(c);
        }
        insertAll(jdbcTemplate, newCustomerEntityList);
    }

    private void insertAll(JdbcTemplate jdbcTemplate, List<NewCustomerEntity> newEntities) {
        // INSERT STATEMENT
    }

    private List<CustomerEntity> findAll(JdbcTemplate jdbcTemplate) {
        String sql = "SELECT * FROM CUSTOMER";
        return jdbcTemplate.query(sql, new BeanPropertyRowMapper<>(CustomerEntity.class));
    }
}
It takes a list of customers and does something with them using a utility tool available in the Java environment, for example as a Maven or Gradle dependency.
You create your App-1.1 and run it.
Later, when you write the next database migration in Java, you want to upgrade SomeUtilityTool from 1.X to 2.X; let's say this changes the output of the transformDate function in the example.
Now someone who upgrades from App-1.0 to App-1.2 does not run the same delta as someone who upgraded from App-1.0 to App-1.1.
My conclusion is that each delta should probably have its own module that creates a jar artifact. By encapsulating all of its dependencies, such a jar artifact ensures that a later change in a dependency will not affect how a previous delta was run.
The question: is it possible to get Spring Flyway to pick up and run the jar file?
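What I'm imagining is something like the following untested sketch: each delta ships as its own jar (the file name here is made up) bundling the migration class together with the exact SomeUtilityTool version it was written against, and the jar is put on the runtime classpath so Flyway's classpath scanning can find the compiled migration. For true isolation the bundled dependencies would also need to be shaded/relocated inside the jar.
// build.gradle — hypothetical delta artifact containing V3__0001 and SomeUtilityTool 1.X
dependencies {
    runtimeOnly files('libs/delta-v3-1.0.jar')
}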
I have an Enum inside a jar that I have produced myself. This jar is a dependency of a second jar, which uses the enum values.
Now, the second jar is a logging framework, whereas the first jar in this case is the model classes of the logging framework.
I am trying to implement this logging framework in a web application that I have made. Long story short, it still needs some work, but I am stuck on a single problem. An error in the framework's configuration initialization is caught as an exception, and it calls a method. This method has an Enum value as one of its parameters. However, I get a java.lang.NoSuchFieldError on this enum.
The Enum value was ERROR, so I figured it could be a coincidence. But when I changed it to BABYLOVE, the error message changed as well.
I've checked for redundancies and/or possible overlappings in class/enum names, but there are none that I can find.
Sequence of events:
Web App calls for initialization of logging-framework (direct dependency)
logging-framework has issues loading its own configuration, and throws an exception
Exception is handled, and a method is called to register the error
The method is called with several parameters, one which is an enum value from logging-framework-model.jar, which is a transitive dependency of the web app
The web-app throws an exception
java.lang.NoSuchFieldError: BABYLOVE
at logging.framework.Constants.<clinit>(Constants.java:52)
at logging.framework.Logger.<init>(Logger.java:60)
at logging.framework.LogContext.getLoggerFromContext(LogContext.java:95)
at logging.framework.LogContext.getCurrent(LogContext.java:48)
at action.navigation.CalendarElementEditorAction.execute(CalendarElementEditorAction.java:39)
Truncated. see log file for complete stacktrace
Constants, line 51-52:
public static final Event ConfigValidationFailed =
    EventLogHelper.getEvent(EventLogSource.LoggingFramework, EventLogEntryType.BABYLOVE);
EventLogEntryType:
@XmlType(name = "EventLogEntryType")
@XmlEnum
public enum EventLogEntryType {
    // For test purposes, should be removed. This value is given a name that cannot be
    // confused with standard names in error messages, the way Error and Warning can.
    @XmlEnumValue("BabyLove")
    BABYLOVE("BabyLove"),
    @XmlEnumValue("Error")
    ERROR("Error"),
    @XmlEnumValue("Warning")
    WARNING("Warning"),
    @XmlEnumValue("Information")
    INFORMATION("Information"),
    @XmlEnumValue("SuccessAudit")
    SUCCESSAUDIT("SuccessAudit"),
    @XmlEnumValue("FailureAudit")
    FAILUREAUDIT("FailureAudit");

    private final String value;

    EventLogEntryType(String v) {
        value = v;
    }

    public String value() {
        return value;
    }

    public static EventLogEntryType fromValue(String v) {
        for (EventLogEntryType c : EventLogEntryType.values()) {
            if (c.value.equals(v)) {
                return c;
            }
        }
        throw new IllegalArgumentException(v);
    }
}
I don't know if it matters, but I am using Maven 2 to deal with my dependencies.
I was told to check if the versions of my dependencies had mismatches, and after checking the war's content, I found that to be the problem.
My webapp is one of two very similar ones that both have a dependency on a jar containing some base model and business logic classes. I had previously added the logging framework (version 1) to that project's pom.xml. So logging framework 1.0 was a transitive dependency of the web app, while logging framework 2.0 was a direct dependency of the web app. I am guessing that direct dependencies take precedence over transitive dependencies, so 2.0 was the one that was packaged into my war. However, since the logging framework is composed of a framework (direct dependency) and a set of model classes (transitive dependency), the war was packaged with logging framework model version 1.0.
After I unpacked the war and found this, it was a pretty easy process to find out where it was wrongly imported, and I ended up with only logging framework version 2.0 for the complete set.
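For anyone hitting the same thing: the Maven dependency plugin can show where each version comes from, and -Dverbose also prints the conflicting versions Maven omitted. The coordinates below are placeholders for the actual framework artifacts:
mvn dependency:tree -Dverbose -Dincludes=com.example:logging-framework-model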
I'm using DynamoDB local for unit testing. It's not bad, but has some drawbacks. Specifically:
You have to somehow start the server before your tests run
The server isn't started and stopped before each test so tests become inter-dependent unless you add code to delete all tables, etc. after each test
All developers need to have it installed
What I want to do is something like put the DynamoDB local jar, and the other jars upon which it depends, in my test/resources directory (I'm writing in Java). Then before each test I'd start it up, running with -inMemory, and after the test I'd stop it. That way anyone pulling down the git repo gets a copy of everything they need to run the tests and each test is independent of the others.
I have found a way to make this work, but it's ugly, so I'm looking for alternatives. The solution I have is to put a .zip file of the DynamoDB Local stuff in test/resources, then in the @Before method extract it to some temporary directory and start a new Java process to execute it. That works, but it's ugly and has some drawbacks:
Everyone needs the java executable on their $PATH
I have to unpack a zip to the local disk. Using local disk is often dicey for testing, especially with continuous builds and such.
I have to spawn a process and wait for it to start for each unit test, and then kill that process after each test. Besides being slow, the potential for left-over processes seems ugly.
It seems like there should be an easier way. DynamoDB Local is, after all, just Java code. Can't I somehow ask the JVM to fork itself and look inside the resources to build a classpath? Or, even better, can't I just call the main method of DynamoDB Local from some other thread so this all happens in a single process? Any ideas?
PS: I am aware of Alternator, but it appears to have other drawbacks so I'm inclined to stick with Amazon's supported solution if I can make it work.
In order to use DynamoDBLocal you need to follow these steps.
Get Direct DynamoDBLocal Dependency
Get Native SQLite4Java dependencies
Set sqlite4java.library.path to show native libraries
1. Get Direct DynamoDBLocal Dependency
This one is the easy one. You need the custom repository shown below.
<!--Dependency:-->
<dependencies>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>DynamoDBLocal</artifactId>
<version>1.11.0.1</version>
<scope>test</scope>
</dependency>
</dependencies>
<!--Custom repository:-->
<repositories>
<repository>
<id>dynamodb-local</id>
<name>DynamoDB Local Release Repository</name>
<url>https://s3-us-west-2.amazonaws.com/dynamodb-local/release</url>
</repository>
</repositories>
2. Get Native SQLite4Java dependencies
If you do not add these dependencies, your tests will fail with a 500 internal error.
First, add these dependencies:
<dependency>
<groupId>com.almworks.sqlite4java</groupId>
<artifactId>sqlite4java</artifactId>
<version>1.0.392</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.almworks.sqlite4java</groupId>
<artifactId>sqlite4java-win32-x86</artifactId>
<version>1.0.392</version>
<type>dll</type>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.almworks.sqlite4java</groupId>
<artifactId>sqlite4java-win32-x64</artifactId>
<version>1.0.392</version>
<type>dll</type>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.almworks.sqlite4java</groupId>
<artifactId>libsqlite4java-osx</artifactId>
<version>1.0.392</version>
<type>dylib</type>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.almworks.sqlite4java</groupId>
<artifactId>libsqlite4java-linux-i386</artifactId>
<version>1.0.392</version>
<type>so</type>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.almworks.sqlite4java</groupId>
<artifactId>libsqlite4java-linux-amd64</artifactId>
<version>1.0.392</version>
<type>so</type>
<scope>test</scope>
</dependency>
Then, add this plugin to copy the native dependencies to a specific folder:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.10</version>
<executions>
<execution>
<id>copy</id>
<phase>test-compile</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<includeScope>test</includeScope>
<includeTypes>so,dll,dylib</includeTypes>
<outputDirectory>${project.basedir}/native-libs</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
3. Set sqlite4java.library.path to show native libraries
As a last step, you need to set the sqlite4java.library.path system property to the native-libs directory. It is OK to do that just before creating your local server.
System.setProperty("sqlite4java.library.path", "native-libs");
After these steps you can use DynamoDBLocal as you want. Here is a JUnit rule that creates a local server for that.
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient;
import com.amazonaws.services.dynamodbv2.local.main.ServerRunner;
import com.amazonaws.services.dynamodbv2.local.server.DynamoDBProxyServer;
import org.junit.rules.ExternalResource;
import java.io.IOException;
import java.net.ServerSocket;
/**
* Creates a local DynamoDB instance for testing.
*/
public class LocalDynamoDBCreationRule extends ExternalResource {
private DynamoDBProxyServer server;
private AmazonDynamoDB amazonDynamoDB;
public LocalDynamoDBCreationRule() {
// This one should be copied during test-compile time. If the project's basedir does not contain a folder
// named 'native-libs', please try '$ mvn clean install' from the command line first
System.setProperty("sqlite4java.library.path", "native-libs");
}
@Override
protected void before() throws Throwable {
try {
final String port = getAvailablePort();
this.server = ServerRunner.createServerFromCommandLineArgs(new String[]{"-inMemory", "-port", port});
server.start();
amazonDynamoDB = new AmazonDynamoDBClient(new BasicAWSCredentials("access", "secret"));
amazonDynamoDB.setEndpoint("http://localhost:" + port);
} catch (Exception e) {
throw new RuntimeException(e);
}
}
@Override
protected void after() {
if (server == null) {
return;
}
try {
server.stop();
} catch (Exception e) {
throw new RuntimeException(e);
}
}
public AmazonDynamoDB getAmazonDynamoDB() {
return amazonDynamoDB;
}
private String getAvailablePort() {
try (final ServerSocket serverSocket = new ServerSocket(0)) {
return String.valueOf(serverSocket.getLocalPort());
} catch (IOException e) {
throw new RuntimeException("Available port was not found", e);
}
}
}
You can use this rule like this:
@RunWith(JUnit4.class)
public class UserDAOImplTest {
@ClassRule
public static final LocalDynamoDBCreationRule dynamoDB = new LocalDynamoDBCreationRule();
}
In August 2018 Amazon announced a new Docker image with Amazon DynamoDB Local onboard. It does not require downloading and running any JARs, or adding third-party OS-specific binaries (I'm talking about sqlite4java).
It is as simple as starting a Docker container before the tests:
docker run -p 8000:8000 amazon/dynamodb-local
You can do that manually for local development, as described above, or use it in your CI pipeline. Many CI services provide the ability to start additional containers during the pipeline that can provide dependencies for your tests. Here is an example for GitLab CI/CD:
test:
stage: test
image: openjdk:8-alpine
services:
- name: amazon/dynamodb-local
alias: dynamodb-local
script:
- DYNAMODB_LOCAL_URL=http://dynamodb-local:8000 ./gradlew clean test
Or Bitbucket Pipelines:
definitions:
services:
dynamodb-local:
image: amazon/dynamodb-local
…
step:
name: test
image:
name: openjdk:8-alpine
services:
- dynamodb-local
script:
- DYNAMODB_LOCAL_URL=http://localhost:8000 ./gradlew clean test
And so on. The idea is to move all the configuration you can see in other answers out of your build tool and provide the dependency externally. Think of it as dependency injection / IoC, but for the whole service, not just a single bean.
After you've started the container you can create a client pointing to it:
private AmazonDynamoDB createAmazonDynamoDB() {
return AmazonDynamoDBClientBuilder
.standard()
.withEndpointConfiguration(
new AwsClientBuilder.EndpointConfiguration(
"http://localhost:8000",
Regions.US_EAST_1.getName()
)
)
.withCredentials(
new AWSStaticCredentialsProvider(
// DynamoDB Local works with any non-null credentials
new BasicAWSCredentials("", "")
)
)
.build();
}
Now to the original questions:
You have to somehow start the server before your tests run
You can just start it manually, or prepare a developers' script for it. IDEs usually provide a way to run arbitrary commands before executing a task, so you can make the IDE start the container for you. I think that running something locally should not be a top priority in this case; instead you should focus on configuring CI and let the developers start the container in whatever way is comfortable for them.
The server isn't started and stopped before each test so tests become inter-dependent unless you add code to delete all tables, etc. after each test
That's true, but you should not start and stop such heavyweight things, or recreate tables, before / after each test. DB tests are almost always inter-dependent, and that's OK for them. Just use unique values for each test case (e.g. set the item's hash key to the ticket id / specific test case id you're working on). As for the seed data, I'd recommend moving it out of the build tool and test code as well. Either make your own image with all the data you need, or use the AWS CLI to create tables and insert data. Follow the single responsibility and dependency injection principles: your test code must not do anything but testing; all the environment (tables and data, in this case) should be provided for it. Creating a table in a test is wrong, because in real life that table already exists (unless you're testing a method that actually creates a table, of course).
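For example, seeding can be a one-off script that uses the plain AWS CLI against the local endpoint (the table name and schema here are made up):
aws dynamodb create-table \
    --endpoint-url http://localhost:8000 \
    --table-name users \
    --attribute-definitions AttributeName=id,AttributeType=S \
    --key-schema AttributeName=id,KeyType=HASH \
    --billing-mode PAY_PER_REQUEST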
All developers need to have it installed
Docker should be a must for every developer in 2018, so that's not a problem.
And if you're using JUnit 5, it can be a good idea to use a DynamoDB Local extension that will inject the client into your tests (yes, this is self-promotion):
Add a dependency on me.madhead.aws-junit5:dynamo-v1
pom.xml:
<dependency>
<groupId>me.madhead.aws-junit5</groupId>
<artifactId>dynamo-v1</artifactId>
<version>6.0.1</version>
<scope>test</scope>
</dependency>
build.gradle
dependencies {
testImplementation("me.madhead.aws-junit5:dynamo-v1:6.0.1")
}
Use the extension in your tests:
@ExtendWith(DynamoDBLocalExtension.class)
class MultipleInjectionsTest {
    @DynamoDBLocal(
        url = "http://dynamodb-local-1:8000"
    )
    private AmazonDynamoDB first;

    @DynamoDBLocal(
        urlEnvironmentVariable = "DYNAMODB_LOCAL_URL"
    )
    private AmazonDynamoDB second;

    @Test
    void test() {
        first.listTables();
        second.listTables();
    }
}
This is a restating of bhdrkn's answer for Gradle users (his is based on Maven). It's still the same three steps:
Get Direct DynamoDBLocal Dependency
Get Native SQLite4Java dependencies
Set sqlite4java.library.path to show native libraries
1. Get Direct DynamoDBLocal Dependency
Add to the dependencies section of your build.gradle file...
dependencies {
testCompile "com.amazonaws:DynamoDBLocal:1.+"
}
2. Get Native SQLite4Java dependencies
The sqlite4java libraries will already be downloaded as a dependency of DynamoDBLocal, but the library files need to be copied to the right place. Add to your build.gradle file...
task copyNativeDeps(type: Copy) {
from(configurations.compile + configurations.testCompile) {
include '*.dll'
include '*.dylib'
include '*.so'
}
into 'build/libs'
}
3. Set sqlite4java.library.path to show native libraries
We need to tell Gradle to run copyNativeDeps for testing and to tell sqlite4java where to find the files. The snippet sets java.library.path, which sqlite4java falls back to when sqlite4java.library.path is not set. Add to your build.gradle file...
test {
dependsOn copyNativeDeps
systemProperty "java.library.path", 'build/libs'
}
You can use DynamoDB Local as a Maven test dependency in your test code, as shown in this announcement. You can run it over HTTP:
import com.amazonaws.services.dynamodbv2.local.main.ServerRunner;
import com.amazonaws.services.dynamodbv2.local.server.DynamoDBProxyServer;
final String[] localArgs = { "-inMemory" };
DynamoDBProxyServer server = ServerRunner.createServerFromCommandLineArgs(localArgs);
server.start();
AmazonDynamoDB dynamodb = new AmazonDynamoDBClient();
dynamodb.setEndpoint("http://localhost:8000");
dynamodb.listTables();
server.stop();
You can also run in embedded mode:
import com.amazonaws.services.dynamodbv2.local.embedded.DynamoDBEmbedded;
AmazonDynamoDB dynamodb = DynamoDBEmbedded.create();
dynamodb.listTables();
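If you want per-test isolation with the embedded mode, here is a minimal JUnit 4 sketch (the test class and assertion are illustrative, following the snippet above):
import static org.junit.Assert.assertTrue;

import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.local.embedded.DynamoDBEmbedded;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class EmbeddedDynamoDbTest {
    private AmazonDynamoDB dynamodb;

    @Before
    public void setUp() {
        // A fresh in-memory instance per test keeps tests independent.
        dynamodb = DynamoDBEmbedded.create();
    }

    @After
    public void tearDown() {
        dynamodb.shutdown();
    }

    @Test
    public void startsEmpty() {
        assertTrue(dynamodb.listTables().getTableNames().isEmpty());
    }
}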
I have wrapped the answers above into two JUnit rules that do not require changes to the build script, as the rules handle the native library stuff. I did this because I found that IntelliJ IDEA did not like the Gradle/Maven solutions; it just went off and did its own thing anyhow.
This means the steps are:
Get the AssortmentOfJUnitRules version 1.5.32 or above dependency
Get the Direct DynamoDBLocal dependency
Add the LocalDynamoDbRule or HttpDynamoDbRule to your JUnit test.
Maven:
<!--Dependency:-->
<dependencies>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>DynamoDBLocal</artifactId>
<version>1.11.0.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.github.mlk</groupId>
<artifactId>assortmentofjunitrules</artifactId>
<version>1.5.36</version>
<scope>test</scope>
</dependency>
</dependencies>
<!--Custom repository:-->
<repositories>
<repository>
<id>dynamodb-local</id>
<name>DynamoDB Local Release Repository</name>
<url>https://s3-us-west-2.amazonaws.com/dynamodb-local/release</url>
</repository>
</repositories>
Gradle:
repositories {
mavenCentral()
maven {
url = "https://s3-us-west-2.amazonaws.com/dynamodb-local/release"
}
}
dependencies {
testCompile "com.github.mlk:assortmentofjunitrules:1.5.36"
testCompile "com.amazonaws:DynamoDBLocal:1.+"
}
Code:
public class LocalDynamoDbRuleTest {
    @Rule
    public LocalDynamoDbRule ddb = new LocalDynamoDbRule();

    @Test
    public void test() {
        doDynamoStuff(ddb.getClient());
    }
}
Try out tempest-testing! It ships a JUnit4 Rule and a JUnit5 Extension. It also supports both AWS SDK v1 and SDK v2.
Tempest provides a library for testing DynamoDB clients using DynamoDBLocal. It comes with two implementations:
JVM: This is the preferred option, running a DynamoDBProxyServer backed by sqlite4java, which is available on most platforms.
Docker: This runs dynamodb-local in a Docker container.
Feature matrix:
Feature         | tempest-testing-jvm        | tempest-testing-docker
Start up time   | ~1s                        | ~10s
Memory usage    | Less                       | More
Dependency      | sqlite4java native library | Docker
To use tempest-testing, first add this library as a test dependency:
For AWS SDK 1.x:
dependencies {
testImplementation "app.cash.tempest:tempest-testing-jvm:1.5.2"
testImplementation "app.cash.tempest:tempest-testing-junit5:1.5.2"
}
// Or
dependencies {
testImplementation "app.cash.tempest:tempest-testing-docker:1.5.2"
testImplementation "app.cash.tempest:tempest-testing-junit5:1.5.2"
}
For AWS SDK 2.x:
dependencies {
testImplementation "app.cash.tempest:tempest2-testing-jvm:1.5.2"
testImplementation "app.cash.tempest:tempest2-testing-junit5:1.5.2"
}
// Or
dependencies {
testImplementation "app.cash.tempest:tempest2-testing-docker:1.5.2"
testImplementation "app.cash.tempest:tempest2-testing-junit5:1.5.2"
}
Then in tests annotated with @org.junit.jupiter.api.Test, you may add TestDynamoDb as a test extension. This extension spins up a DynamoDB server. It shares the server across tests and keeps it running until the process exits. It also manages test tables for you, recreating them before each test.
class MyTest {
    @RegisterExtension
    TestDynamoDb db = new TestDynamoDb.Builder(JvmDynamoDbServer.Factory.INSTANCE) // or DockerDynamoDbServer
        // `MusicItem` is annotated with `@DynamoDBTable`. Tempest recreates this table before each test.
        .addTable(TestTable.create(MusicItem.TABLE_NAME, MusicItem.class))
        .build();

    @Test
    public void test() {
        PutItemRequest request = // ...;
        // Talk to the local DynamoDB.
        db.dynamoDb().putItem(request);
    }
}
It seems like there should be an easier way. DynamoDB Local is, after all, just Java code. Can't I somehow ask the JVM to fork itself and look inside the resources to build a classpath?
You can do something along these lines, but much simpler: programmatically search the classpath for the location of the native libraries, then set the sqlite4java.library.path property before starting DynamoDB. This is the approach implemented in tempest-testing, as well as in this answer (code here), which is why they just work as a pure library/classpath dependency and nothing more.
In my case I needed access to DynamoDB outside of a JUnit extension, but I still wanted something self-contained in library code, so I extracted the approach it takes:
import com.amazonaws.services.dynamodbv2.local.embedded.DynamoDBEmbedded;
import com.amazonaws.services.dynamodbv2.local.shared.access.AmazonDynamoDBLocal;
import com.google.common.collect.MoreCollectors;
import java.io.File;
import java.util.Arrays;
import java.util.stream.Stream;
import org.junit.jupiter.api.condition.OS;
...
public AmazonDynamoDBLocal embeddedDynamoDb() {
final OS os = Stream.of(OS.values()).filter(OS::isCurrentOs)
.collect(MoreCollectors.onlyElement());
final String prefix;
switch (os) {
case LINUX:
prefix = "libsqlite4java-linux-amd64-";
break;
case MAC:
prefix = "libsqlite4java-osx-";
break;
case WINDOWS:
prefix = "sqlite4java-win32-x64-";
break;
default:
throw new UnsupportedOperationException(os.toString());
}
System.setProperty("sqlite4java.library.path",
Arrays.asList(System.getProperty("java.class.path").split(File.pathSeparator))
.stream()
.map(File::new)
.filter(file -> file.getName().startsWith(prefix))
.collect(MoreCollectors.onlyElement())
.getParent());
return DynamoDBEmbedded.create();
}
I haven't had a chance to test it on a lot of platforms, and the error handling could likely be improved.
It's a pity AWS haven't taken the time to make the library more friendly, as this could easily be done in the library code itself.
For unit testing at work I use Mockito and just mock the AmazonDynamoDBClient, then mock out the returns using when(), like the following:
when(mockAmazonDynamoDBClient.getItem(isA(GetItemRequest.class))).thenAnswer(new Answer<GetItemResult>() {
    @Override
    public GetItemResult answer(InvocationOnMock invocation) throws Throwable {
        GetItemResult result = new GetItemResult();
        result.setItem(testResultItem);
        return result;
    }
});
Not sure if that is what you're looking for, but that's how we do it.
Shortest solution, with a fix for the sqlite4java UnsatisfiedLinkError, if it is a Java/Kotlin project built with Gradle (no $PATH change needed).
repositories {
// ... other repositories
maven { url 'https://s3-us-west-2.amazonaws.com/dynamodb-local/release' }
}
dependencies {
testImplementation("com.amazonaws:DynamoDBLocal:1.13.6")
}
import org.gradle.internal.os.OperatingSystem
test {
doFirst {
// Fix for: UnsatisfiedLinkError -> provide a valid native lib path
String nativePrefix = OperatingSystem.current().nativePrefix
File nativeLib = sourceSets.test.runtimeClasspath.files.find {it.name.startsWith("libsqlite4java") && it.name.contains(nativePrefix) } as File
systemProperty "sqlite4java.library.path", nativeLib.parent
}
}
Straightforward usage in test classes (src/test):
private lateinit var db: AmazonDynamoDBLocal
#BeforeAll
fun runDb() { db = DynamoDBEmbedded.create() }
#AfterAll
fun shutdownDb() { db.shutdown() }
There are a couple of Node.js wrappers for DynamoDB Local. These allow you to easily run unit tests in combination with task runners like gulp or grunt. Try dynamodb-localhost or dynamodb-local.
I have found that the Amazon repo has no index file, so it does not seem to function in a way that allows you to bring it in like this:
maven {
url = "https://s3-us-west-2.amazonaws.com/dynamodb-local/release"
}
The only way I could get the dependencies to load is by downloading DynamoDbLocal as a jar and bringing it into my build script like this:
dependencies {
...
runtime files('libs/DynamoDBLocal.jar')
...
}
Of course this means that all the SQLite and Jetty dependencies need to be brought in by hand - I'm still trying to get this right. If anyone knows of a reliable repo for DynamoDbLocal, I would really love to know.
You could also use this lightweight test container 'Dynalite'
https://www.testcontainers.org/modules/databases/dynalite/
From testcontainers:
Dynalite is a clone of DynamoDB, enabling local testing. It's light
and quick to run.
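A minimal JUnit 4 sketch of how that is wired up, based on the Testcontainers Dynalite module (org.testcontainers:dynalite; treat the class and method names as assumptions for your Testcontainers version):
import org.junit.Rule;
import org.junit.Test;
import org.testcontainers.dynamodb.DynaliteContainer;

public class DynaliteTest {
    // Starts a Dynalite container before each test and stops it afterwards.
    @Rule
    public DynaliteContainer dynamoDB = new DynaliteContainer();

    @Test
    public void test() {
        // getClient() returns an AmazonDynamoDB client wired to the container's endpoint.
        dynamoDB.getClient().listTables();
    }
}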
The DynamoDB Gradle dependency already includes the SQLite libraries. You can pretty easily instruct the Java runtime to use them in your Gradle build script. Here's my build.gradle.kts as an example:
import org.apache.tools.ant.taskdefs.condition.Os
plugins {
application
}
repositories {
mavenCentral()
maven {
url = uri("https://s3-us-west-2.amazonaws.com/dynamodb-local/release")
}
}
dependencies {
implementation("com.amazonaws:DynamoDBLocal:[1.12,2.0)")
}
fun getSqlitePath(): String? {
val dirName = when {
Os.isFamily(Os.FAMILY_MAC) -> "libsqlite4java-osx"
Os.isFamily(Os.FAMILY_UNIX) -> "libsqlite4java-linux-amd64"
Os.isFamily(Os.FAMILY_WINDOWS) -> "sqlite4java-win32-x64"
else -> throw kotlin.Exception("DynamoDB emulator cannot run on this platform")
}
return project.configurations.runtimeClasspath.get().find { it.name.contains(dirName) }?.parent
}
application {
mainClass.set("com.amazonaws.services.dynamodbv2.local.main.ServerRunner")
applicationDefaultJvmArgs = listOf("-Djava.library.path=${getSqlitePath()}")
}
tasks.named<JavaExec>("run") {
args("-inMemory")
}