For an artifact on Maven Central (or any other given Nexus repository), I want to make a list of all its direct dependencies.
At first, I thought about just reading the pom.xml and gathering all entries from the dependencies section. But I noticed that these entries might have no version (supplied by dependency management) or might come from parent POMs.
My second idea was to build some kind of artificial Maven project and gather the dependencies with mvn dependency:tree. This might become complicated.
What would be the most direct (but also reliable) way?
Outside of a Maven plugin, you can do this programmatically using Aether. There is a method readArtifactDescriptor that does exactly this:
Gets information about an artifact like its direct dependencies and potential relocations.
First, add the Aether dependencies to your POM:
<dependencies>
<dependency>
<groupId>org.eclipse.aether</groupId>
<artifactId>aether-impl</artifactId>
<version>${aetherVersion}</version>
</dependency>
<dependency>
<groupId>org.eclipse.aether</groupId>
<artifactId>aether-connector-basic</artifactId>
<version>${aetherVersion}</version>
</dependency>
<dependency>
<groupId>org.eclipse.aether</groupId>
<artifactId>aether-transport-file</artifactId>
<version>${aetherVersion}</version>
</dependency>
<dependency>
<groupId>org.eclipse.aether</groupId>
<artifactId>aether-transport-http</artifactId>
<version>${aetherVersion}</version>
</dependency>
<dependency>
<groupId>org.apache.maven</groupId>
<artifactId>maven-aether-provider</artifactId>
<version>${mavenVersion}</version>
</dependency>
</dependencies>
<properties>
<aetherVersion>1.1.0</aetherVersion>
<mavenVersion>3.3.9</mavenVersion>
</properties>
Then you can have:
public static void main(final String[] args) throws Exception {
DefaultServiceLocator locator = MavenRepositorySystemUtils.newServiceLocator();
RepositorySystem system = newRepositorySystem(locator);
RepositorySystemSession session = newSession(system);
RemoteRepository central = new RemoteRepository.Builder("central", "default", "https://repo1.maven.org/maven2/").build();
Artifact artifact = new DefaultArtifact("groupId:artifactId:version");
ArtifactDescriptorRequest request = new ArtifactDescriptorRequest(artifact, Arrays.asList(central), null);
ArtifactDescriptorResult result = system.readArtifactDescriptor(session, request);
for (Dependency dependency : result.getDependencies()) {
System.out.println(dependency);
}
}
private static RepositorySystem newRepositorySystem(DefaultServiceLocator locator) {
locator.addService(RepositoryConnectorFactory.class, BasicRepositoryConnectorFactory.class);
locator.addService(TransporterFactory.class, FileTransporterFactory.class);
locator.addService(TransporterFactory.class, HttpTransporterFactory.class);
return locator.getService(RepositorySystem.class);
}
private static RepositorySystemSession newSession(RepositorySystem system) {
DefaultRepositorySystemSession session = MavenRepositorySystemUtils.newSession();
LocalRepository localRepo = new LocalRepository("target/local-repo");
session.setLocalRepositoryManager(system.newLocalRepositoryManager(session, localRepo));
// set possible proxies and mirrors
session.setProxySelector(new DefaultProxySelector().add(new Proxy(Proxy.TYPE_HTTP, "host", 3625), Arrays.asList("localhost", "127.0.0.1")));
session.setMirrorSelector(new DefaultMirrorSelector().add("my-mirror", "http://mirror", "default", false, "external:*", null));
return session;
}
This creates an Aether repository system and tells it to read the artifact descriptor of a given artifact. The artifact is constructed with new DefaultArtifact("..."), passing it the desired coordinates.
A request object is created with this artifact and the list of repositories to fetch it from. In the above sample, only Maven Central was added, but you could add more RemoteRepository instances by creating them with the RemoteRepository.Builder class. After calling readArtifactDescriptor, the result contains the list of direct dependencies, which can be retrieved with getDependencies().
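For illustration, adding a second, authenticated repository next to Central might look like the following sketch (the repository URL and credentials are placeholders, not part of the original answer; AuthenticationBuilder lives in the aether-util module):

// Hypothetical private Nexus repository added alongside Central.
RemoteRepository nexus = new RemoteRepository.Builder("my-nexus", "default",
        "https://nexus.example.com/repository/maven-public/")
        .setAuthentication(new AuthenticationBuilder() // org.eclipse.aether.util.repository.AuthenticationBuilder
                .addUsername("user")
                .addPassword("secret")
                .build())
        .build();

// Ask for the descriptor using both repositories.
ArtifactDescriptorRequest request =
        new ArtifactDescriptorRequest(artifact, Arrays.asList(central, nexus), null);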
Proxies and mirrors can be configured with the help of the DefaultProxySelector and DefaultMirrorSelector respectively. DefaultProxySelector.add takes a Proxy as an argument, which can be created with its constructor by passing it its type (like "http"), host, port and optionally an Authentication (take a look at the AuthenticationBuilder class to create authentication objects), and a list of non-proxied hosts. In the same way, DefaultMirrorSelector.add takes its id, URL, type (which is "default" for Maven, but you could have P2 for example), whether it is a repository manager, the actual repository ids mirrored (according to the mirror specification), and the repository type not mirrored.
The following workflow seems to be ok:
Download the POM from Maven Central, put it in a separate directory and name it pom.xml.
Run mvn dependency:list -DexcludeTransitive and grab the console output.
Filter the list of dependencies from the console output.
I am getting an error for ResponseLoggingFilter in the Rest Assured framework while running the code with the Rest Assured Java libraries. I am using Rest Assured version 4.5.1 in the POM file. Has anybody faced a similar issue? Need inputs.
**Error:**
java.lang.NoSuchMethodError: 'org.hamcrest.Matcher org.hamcrest.core.IsInstanceOf.any(java.lang.Class)'
at org.hamcrest.Matchers.any(Matchers.java:349)
at io.restassured.filter.log.ResponseLoggingFilter.<init>(ResponseLoggingFilter.java:73)
at io.restassured.filter.log.ResponseLoggingFilter.logResponseTo(ResponseLoggingFilter.java:182)
**Framework Utility:**
public RequestSpecification setBaseURI() throws IOException {
if (requestSpec == null) {
baseURL = System.getProperty("baseURL", config.getConfig().getProperty("BASE_URL"));
log.info(baseURL);
config.setProperty("APIBase.properties");
RestAssured.urlEncodingEnabled = false;
PrintStream log = new PrintStream(new FileOutputStream("log.txt"));
requestSpec = new RequestSpecBuilder().
setBaseUri(config.getConfig().getProperty("BASE_URL"))
.addFilter(RequestLoggingFilter.logRequestTo(log))
.addFilter(ResponseLoggingFilter.logResponseTo(log))
.build();
}
return requestSpec;
}
**POM.xml**
<dependency>
<groupId>io.rest-assured</groupId>
<artifactId>rest-assured</artifactId>
<version>4.5.1</version>
</dependency>
<dependency>
<groupId>io.rest-assured</groupId>
<artifactId>rest-assured-common</artifactId>
<version>4.5.1</version>
</dependency>
According to the stacktrace, missing method has this signature:
org.hamcrest.Matcher org.hamcrest.core.IsInstanceOf.any(java.lang.Class)
According to the POM file, rest-assured 4.5.1 has hamcrest 2.1 as a dependency.
In Hamcrest 2.1, the any method is declared as:
public static <T> Matcher<T> any(Class<T> type) {
return (Matcher<T>) new IsInstanceOf(type);
}
The erased type signature will be org.hamcrest.Matcher any(java.lang.Class) ... which is exactly what the JVM is looking for.
Stranger, and stranger ...
Somehow the JVM is loading a version of Hamcrest that doesn't have that particular any method. But according to the Git history ... all official versions of Hamcrest have an any method with that signature, all the way back to Hamcrest 1.0. Indeed, the method hasn't changed between 1.0 and 2.2 ... which is the latest official release.
Are you sure that you are getting the Rest-Assured and Hamcrest JARs from a reliable place?
The next thing to do would be to examine the JARs that you are actually using, check their versions and where they came from, unpack them, and examine the respective ".class" files forensically with javap.
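If you want to confirm which JAR a class is actually loaded from at runtime, a quick diagnostic (plain JDK API, shown here as an illustrative snippet) is to print the code source of the class in question:

public class WhereIsHamcrest {
    public static void main(String[] args) {
        // Prints the JAR (or directory) org.hamcrest.Matchers was actually loaded from.
        // Note: getCodeSource() can be null for classes on the boot classpath.
        System.out.println(org.hamcrest.Matchers.class
                .getProtectionDomain().getCodeSource().getLocation());
    }
}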
My plugin mojo test class leverages maven-plugin-test-harness to build the complete maven environment with all pom config, plexus container context and repo access.
The following should all actually work:
the test will refer to a test pom.xml in the plugin project's test resources directory
the mojo will be populated with defaults from the mojo annotations
all specified configuration in the test pom is accepted
the maven project object is initialised
all dependencies from the repo are available
the tests should pass in IntelliJ IDE as well as Maven on CI server
Because of the lack of concrete working examples I've been trying many different fixes using what I've collected from SO, and other blogs or articles online.
I am currently struggling to get Maven to resolve the artifacts. While I've got the dependency list from the Maven project object, the artifact list is empty.
This is what I've built up by dissecting AbstractMojoTestCase.
I can't use MojoRule because JUnit 5 doesn't use @Rules anymore.
Plus, some of the Maven API calls are deprecated, but I couldn't find a new implementation. I think it won't come until Maven 4. See the quote below.
@BeforeEach
protected void setUp() throws Exception {
super.setUp();
cleanUp();
ClassLoader classLoader = getClass().getClassLoader();
URL url = classLoader.getResource(TEST_POM);
if (url == null) {
throw new MojoExecutionException(String.format(
"Cannot locate %s", TEST_POM));
}
File pom = new File(url.getFile());
//noinspection deprecation - wait on maven-plugin-testing-harness update
MavenSettingsBuilder mavenSettingsBuilder = (MavenSettingsBuilder)
getContainer().lookup(MavenSettingsBuilder.ROLE);
Settings settings = mavenSettingsBuilder.buildSettings();
MavenExecutionRequest request = new DefaultMavenExecutionRequest();
request.setPom(pom);
request.setLocalRepositoryPath(settings.getLocalRepository());
MavenExecutionRequestPopulator populator =
getContainer().lookup(MavenExecutionRequestPopulator.class);
populator.populateDefaults(request);
DefaultMaven maven = (DefaultMaven) getContainer().lookup(Maven.class);
DefaultRepositorySystemSession repoSession =
(DefaultRepositorySystemSession)
maven.newRepositorySession(request);
LocalRepository localRepository = new LocalRepository(
request.getLocalRepository().getBasedir());
SimpleLocalRepositoryManagerFactory factory =
new SimpleLocalRepositoryManagerFactory();
LocalRepositoryManager localRepositoryManager =
factory.newInstance(repoSession, localRepository);
repoSession.setLocalRepositoryManager(localRepositoryManager);
ProjectBuildingRequest buildingRequest =
request.getProjectBuildingRequest()
.setRepositorySession(repoSession)
.setResolveDependencies(true);
ProjectBuilder projectBuilder = lookup(ProjectBuilder.class);
MavenProject project =
projectBuilder.build(pom, buildingRequest).getProject();
//noinspection deprecation - wait on maven-plugin-testing-harness update
MavenSession session = new MavenSession(getContainer(), repoSession,
request, new DefaultMavenExecutionResult());
session.setCurrentProject(project);
session.setProjects(Collections.singletonList(project));
request.setSystemProperties(System.getProperties());
testMojo = (GenerateConfig) lookupConfiguredMojo(session,
newMojoExecution("configure"));
copyTestProjectResourcesToTarget(getContainer(), project, session);
}
[UPDATE 2017-07-27]: actually this now solves most of my problems.
Only a couple of minor issues now:
the code to grab the settings.xml is marked as @Deprecated, so I assume there is a better way of doing it (using MavenSettingsBuilder.buildSettings())
probably quite a lot of the setup code duplicates the process that occurs anyway when running in native Maven, but it is required to run with JUnit in IntelliJ.
[UPDATE 2017-08-01]: test now needs to access some property files which would be on the classpath in a live environment in the target/classes dir.
Logically they are test resources in my maven-plugin project, so I have included them under the same directory as the test pom.xml in src/test/resources/my-test-project dir.
That didn't work, so I also tried src/test/resources/my-test-project/src/main/resources but that's also not good.
I am having a hard time establishing what exactly is on the plugin's classpath during the test, or working out how to cause it to be constructed correctly.
[UPDATE 2017-08-02]: although I've answered my own question (as opposed to extending this question), this whole thing isn't finished yet so I'm not marking this as answered quite yet.
And just for the record, these are the dependencies required:
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-api</artifactId>
<version>5.0.0-M4</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.junit.vintage</groupId>
<artifactId>junit-vintage-engine</artifactId>
<version>4.12.0-M4</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.maven</groupId>
<artifactId>maven-plugin-api</artifactId>
<version>3.5.0</version>
</dependency>
<dependency>
<groupId>org.apache.maven.plugin-tools</groupId>
<artifactId>maven-plugin-annotations</artifactId>
<version>3.5</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.maven.plugin-testing</groupId>
<artifactId>maven-plugin-testing-harness</artifactId>
<version>3.3.0</version>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>org.codehaus.plexus</groupId>
<artifactId>plexus-container-default</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.maven</groupId>
<artifactId>maven-core</artifactId>
<version>3.5.0</version>
</dependency>
<dependency>
<groupId>org.apache.maven</groupId>
<artifactId>maven-compat</artifactId>
<version>3.5.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.twdata.maven</groupId>
<artifactId>mojo-executor</artifactId>
<version>2.3.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-resources-plugin</artifactId>
<version>3.0.2</version>
<scope>test</scope>
</dependency>
[UPDATE 2017-08-09]:
I had to add some more functionality and discovered that the test was fine if the dependency it wanted to unpack was already in the local repo, but if not, it won't fetch it.
I now need to determine how to instruct maven to fetch the dependency from the remote repo.
I tried launching the dependency plugin and invoking resolve in the test setup, but it dies badly; I think there must be a simpler way.
No need for so much complexity.
I can't use MojoRule because JUnit 5 doesn't use @Rules anymore.
Since version 3.2, released 7 years ago, you don't need to use @MojoRule. Just follow the three steps below:
Your test class should extend AbstractMojoTestCase
Before your tests, call super.setUp()
Perform a lookup for your mojo:
MyMojo myMojo = (MyMojo) super.lookupMojo("myGoal", "src/test/resources/its/my-test-mojo.pom.xml");
With that, you can work with JUnit 5, Mockito, etc., with no overhead.
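A minimal sketch of how that can look with JUnit 5 (MyMojo, the goal name and the POM path are the placeholders from the lookup call above):

import org.apache.maven.plugin.testing.AbstractMojoTestCase;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertNotNull;

class MyMojoTest extends AbstractMojoTestCase {

    @BeforeEach
    void before() throws Exception {
        // Initialises the Plexus container used by the test harness.
        super.setUp();
    }

    @Test
    void myGoalIsConfigured() throws Exception {
        MyMojo myMojo = (MyMojo) super.lookupMojo("myGoal",
                "src/test/resources/its/my-test-mojo.pom.xml");
        assertNotNull(myMojo);
        myMojo.execute();
    }
}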
Some comments in the Maven source code for MavenProject say:
With changes during 3.2.2 release MavenProject is closer to being immutable after construction with the removal of all components from this class, and the upfront construction taken care of entirely by the {@link ProjectBuilder}. There is still the issue of having to run the lifecycle in order to find all the compile source roots and resource directories but I hope to take care of this during the Maven 4.0 release (jvz).
I figure this whole maven plugin integration test thing is not going to work until then... and so looking around, I found a great blog entry on invoking plugins. So I invoked the maven-resources-plugin directly to get it to copy across what it was meant to. That's what the copyTestProjectResourcesToTarget() call does.
private void copyTestProjectResourcesToTarget(PlexusContainer container,
MavenProject project,
MavenSession session)
throws ComponentLookupException, MojoExecutionException {
logger.fine("generateConfig dependencies: ");
project.getDependencies().forEach(d -> logger.fine(d.getArtifactId()));
Optional<Dependency> resourcesPluginDepOpt =
project.getDependencies().stream()
.filter(d -> Objects.equals(d.getArtifactId(),
MAVEN_RESOURCES_ARTIFACT_ID))
.findFirst();
// don't want to define the version here so we read it from what we have
if (!resourcesPluginDepOpt.isPresent()) {
throw new MojoExecutionException("Require " +
MAVEN_RESOURCES_ARTIFACT_ID);
}
Plugin resourcePlugin = MojoExecutor.plugin(
MojoExecutor.groupId(MAVEN_RESOURCES_GROUP_ID),
MojoExecutor.artifactId(MAVEN_RESOURCES_ARTIFACT_ID),
MojoExecutor.version(resourcesPluginDepOpt.get().getVersion()));
MojoExecutor.executeMojo(resourcePlugin,
MojoExecutor.goal("resources"),
MojoExecutor.configuration(),
MojoExecutor.executionEnvironment(
project, session,
container.lookup(BuildPluginManager.class)));
}
I found the solution to fetching dependencies from the remote repository.
Working with the Maven internals like this, and judging from the deprecated classes and the amount of duplicated functionality, I get the strong impression that Maven 4 will make this redundant.
One glitch with this setup routine is that it creates a local repository directory tree in the maven project directory. This is obviously not desirable but will need some more tweaking to solve.
@BeforeEach
public void setUp() throws Exception {
super.setUp();
ClassLoader classLoader = getClass().getClassLoader();
URL url = classLoader.getResource(TEST_POM);
if (url == null) {
throw new MojoExecutionException(String.format(
"Cannot locate %s", TEST_POM));
}
File pom = new File(url.getFile());
Settings settings = getMavenSettings();
if (settings.getLocalRepository() == null) {
settings.setLocalRepository(
org.apache.maven.repository.RepositorySystem
.defaultUserLocalRepository.getAbsolutePath());
}
MavenExecutionRequest request = new DefaultMavenExecutionRequest();
request.setPom(pom);
ArtifactRepository artifactRepository =
new org.apache.maven.artifact.repository.
DefaultArtifactRepository(
"id", settings.getLocalRepository(),
new DefaultRepositoryLayout());
request.setLocalRepository(artifactRepository);
MavenExecutionRequestPopulator populator =
getContainer().lookup(MavenExecutionRequestPopulator.class);
populator.populateFromSettings(request, settings);
DefaultMaven maven = (DefaultMaven)
getContainer().lookup(Maven.class);
DefaultRepositorySystemSession repositorySystemSession =
(DefaultRepositorySystemSession)
maven.newRepositorySession(request);
SimpleLocalRepositoryManagerFactory factory =
new SimpleLocalRepositoryManagerFactory();
LocalRepositoryManager localRepositoryManager =
factory.newInstance(repositorySystemSession,
new LocalRepository(settings.getLocalRepository()));
repositorySystemSession.setLocalRepositoryManager(
localRepositoryManager);
ProjectBuildingRequest buildingRequest =
request.getProjectBuildingRequest()
.setRepositorySession(repositorySystemSession)
.setResolveDependencies(true);
ProjectBuilder projectBuilder = lookup(ProjectBuilder.class);
ProjectBuildingResult projectBuildingResult =
projectBuilder.build(pom, buildingRequest);
MavenProject project = projectBuildingResult.getProject();
MavenSession session = new MavenSession(getContainer(),
repositorySystemSession, request,
new DefaultMavenExecutionResult());
session.setCurrentProject(project);
session.setProjects(Collections.singletonList(project));
request.setSystemProperties(System.getProperties());
testMojo = (GenerateConfig) lookupConfiguredMojo(session,
newMojoExecution("configure"));
testMojo.getLog().debug(String.format("localRepo = %s",
request.getLocalRepository()));
copyTestProjectResourcesToTarget(getContainer(), project, session);
resolveConfigurationFromRepo(repositorySystemSession, project);
}
private Settings getMavenSettings()
throws ComponentLookupException,
IOException,
XmlPullParserException {
org.apache.maven.settings.MavenSettingsBuilder mavenSettingsBuilder
= (org.apache.maven.settings.MavenSettingsBuilder)
getContainer().lookup(
org.apache.maven.settings.MavenSettingsBuilder.ROLE);
return mavenSettingsBuilder.buildSettings();
}
/**
* This is ugly but there seems to be no other way to accomplish it. The
* artifact that the mojo finds on its own will not resolve to a jar file
* on its own in the test harness. So we use aether to resolve it, by
* cloning the maven default artifact into an aether artifact and feeding
* an artifact request to the repo system obtained by the aether service
* locator.
*/
private void resolveConfigurationFromRepo(
DefaultRepositorySystemSession repositorySystemSession,
MavenProject project)
throws ArtifactResolutionException, MojoExecutionException {
org.apache.maven.artifact.Artifact defaultArtifact =
testMojo.getConfigArtifact();
Artifact artifact = new DefaultArtifact(
defaultArtifact.getGroupId(),
defaultArtifact.getArtifactId(),
null,
defaultArtifact.getType(),
defaultArtifact.getVersion());
List<RemoteRepository> remoteArtifactRepositories =
project.getRemoteProjectRepositories();
DefaultServiceLocator locator =
MavenRepositorySystemUtils.newServiceLocator();
locator.addService(RepositoryConnectorFactory.class,
BasicRepositoryConnectorFactory.class);
locator.addService(TransporterFactory.class,
FileTransporterFactory.class);
locator.addService(TransporterFactory.class,
HttpTransporterFactory.class);
RepositorySystem repositorySystem = locator.getService(
RepositorySystem.class);
ArtifactRequest artifactRequest = new ArtifactRequest();
artifactRequest.setArtifact(artifact);
artifactRequest.setRepositories(remoteArtifactRepositories);
ArtifactResult result = repositorySystem.resolveArtifact(
repositorySystemSession, artifactRequest);
defaultArtifact.setFile(result.getArtifact().getFile());
testMojo.getLog().debug( "Resolved artifact " + artifact + " to " +
result.getArtifact().getFile() + " from "
+ result.getRepository() );
}
/**
* Need manual copy of resources because only parts of the maven lifecycle
* happen automatically with this test harness.
*/
private void copyTestProjectResourcesToTarget(PlexusContainer container,
MavenProject project,
MavenSession session)
throws ComponentLookupException, MojoExecutionException {
Optional<Dependency> resourcesPluginDepOpt =
project.getDependencies().stream()
.filter(d -> Objects.equals(d.getArtifactId(),
MAVEN_RESOURCES_ARTIFACT_ID))
.findFirst();
// don't want to define the version here so we read it from what we have
if (!resourcesPluginDepOpt.isPresent()) {
throw new MojoExecutionException("Require " +
MAVEN_RESOURCES_ARTIFACT_ID);
}
Plugin resourcePlugin = MojoExecutor.plugin(
MojoExecutor.groupId(MAVEN_PLUGINS_GROUP_ID),
MojoExecutor.artifactId(MAVEN_RESOURCES_ARTIFACT_ID),
MojoExecutor.version(resourcesPluginDepOpt.get().getVersion()));
MojoExecutor.executeMojo(resourcePlugin,
MojoExecutor.goal("resources"),
MojoExecutor.configuration(),
MojoExecutor.executionEnvironment(
project, session,
container.lookup(BuildPluginManager.class)));
}
And here are the packages used; it is quite important to use the classes from the right packages, which are easily confused:
import org.apache.maven.DefaultMaven;
import org.apache.maven.Maven;
import org.apache.maven.artifact.repository.ArtifactRepository;
import org.apache.maven.artifact.repository.layout.DefaultRepositoryLayout;
import org.apache.maven.execution.DefaultMavenExecutionRequest;
import org.apache.maven.execution.DefaultMavenExecutionResult;
import org.apache.maven.execution.MavenExecutionRequest;
import org.apache.maven.execution.MavenExecutionRequestPopulator;
import org.apache.maven.execution.MavenSession;
import org.apache.maven.model.Dependency;
import org.apache.maven.model.Plugin;
import org.apache.maven.plugin.BuildPluginManager;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugin.MojoFailureException;
import org.apache.maven.plugin.testing.AbstractMojoTestCase;
import org.apache.maven.project.MavenProject;
import org.apache.maven.project.ProjectBuilder;
import org.apache.maven.project.ProjectBuildingRequest;
import org.apache.maven.project.ProjectBuildingResult;
import org.apache.maven.repository.internal.MavenRepositorySystemUtils;
import org.apache.maven.settings.Settings;
import org.codehaus.plexus.PlexusContainer;
import org.codehaus.plexus.component.repository.exception.ComponentLookupException;
import org.codehaus.plexus.util.xml.pull.XmlPullParserException;
import org.eclipse.aether.DefaultRepositorySystemSession;
import org.eclipse.aether.RepositorySystem;
import org.eclipse.aether.artifact.Artifact;
import org.eclipse.aether.artifact.DefaultArtifact;
import org.eclipse.aether.connector.basic.BasicRepositoryConnectorFactory;
import org.eclipse.aether.impl.DefaultServiceLocator;
import org.eclipse.aether.internal.impl.SimpleLocalRepositoryManagerFactory;
import org.eclipse.aether.repository.LocalRepository;
import org.eclipse.aether.repository.LocalRepositoryManager;
import org.eclipse.aether.repository.RemoteRepository;
import org.eclipse.aether.resolution.ArtifactRequest;
import org.eclipse.aether.resolution.ArtifactResolutionException;
import org.eclipse.aether.resolution.ArtifactResult;
import org.eclipse.aether.spi.connector.RepositoryConnectorFactory;
import org.eclipse.aether.spi.connector.transport.TransporterFactory;
import org.eclipse.aether.transport.file.FileTransporterFactory;
import org.eclipse.aether.transport.http.HttpTransporterFactory;
I am trying to run Hibernate Validator in an OSGi container.
<dependency>
<groupId>javax.el</groupId>
<artifactId>javax.el-api</artifactId>
<version>2.2.4</version>
</dependency>
<dependency>
<groupId>org.glassfish.web</groupId>
<artifactId>javax.el</artifactId>
<version>2.2.4</version>
</dependency>
<dependency>
<groupId>org.apache.servicemix.bundles</groupId>
<artifactId>org.apache.servicemix.bundles.hibernate-validator</artifactId>
<version>5.0.2.Final_1</version>
</dependency>
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>2.7</version>
</dependency>
<dependency>
<groupId>org.jsoup</groupId>
<artifactId>jsoup</artifactId>
<version>1.8.1</version>
</dependency>
public class HibernateValidationProviderResolver implements ValidationProviderResolver {
@Override
public List<ValidationProvider<?>> getValidationProviders() {
List<ValidationProvider<?>> list = new ArrayList<>(1);
list.add(new HibernateValidator());
return list;
}
}
Configuration<?> configuration = Validation.byDefaultProvider().providerResolver(
new HibernateValidationProviderResolver()
).configure();
ValidatorFactory validatorFactory = configuration.buildValidatorFactory();
Validator validator = validatorFactory.getValidator();
Set<ConstraintViolation<Group>> constraintViolations = validator.validate(group);
public class Group {
@NotNull
@Size(min=2)
private String title;
}
When I try to run it, the Equinox console looks okay:
10 RESOLVED org.glassfish.web.javax.el_2.2.4
39 RESOLVED org.apache.servicemix.bundles.hibernate-validator_5.0.2.Final_1
47 RESOLVED javax.validation.api_1.1.0.Final
49 RESOLVED javax.el-api_2.2.4
If I pass a Group instance with title = null, then validation is okay and constraintViolations contains one violation, "not null".
If I pass a Group instance with title = "A" (one character, against the minimum length of 2), then it throws an exception:
Caused by: javax.el.ELException: Provider com.sun.el.ExpressionFactoryImpl not found
Caused by: java.lang.ClassNotFoundException: com.sun.el.ExpressionFactoryImpl
It is 100% caused by OSGi, but how should I set up hibernate-validator in OSGi? All articles I could find describe creating a HibernateValidationProviderResolver and that's all.
UPDATE 1
Maven: javax.el:javax.el-api:2.2.4
Export-Package: javax.el;version="2.2.4"
Import-Package: javax.el;version="2.2.4"
Maven: org.glassfish.web:javax.el:2.2.4 MANIFEST.MF
Export-Package: com.sun.el;uses:="javax.el";version="2.2.4"
Private-Package: com.sun.el.lang;version="2.2.4",com.sun.el.parser;version="2.2.4",com.sun.el.util;version="2.2.4"
Import-Package: com.sun.el;version="2.2.4",javax.el;version="2.2"
Maven: org.apache.servicemix.bundles:org.apache.servicemix.bundles.hibernate-validator:5.0.2.Final_1
Implementation-Version: 5.0.2.Final
Import-Package: javax.el,javax.persistence;resolution:=optional, ...
Export-Package: org.hibernate.validator.internal.engine.messageinterpola
tion.el;uses:="javax.el,javax.validation,org.hibernate.validator.intern
al.engine.messageinterpolation";version="5.0.2.Final",org.hibernate.val
idator.internal.engine.messageinterpolation;uses:="javax.validation.met
adata,org.hibernate.validator.internal.engine.messageinterpolation.el,j
avax.el,javax.validation,org.hibernate.validator.internal.util.logging"
;version="5.0.2.Final", ...
The hibernate bundle imports javax.el with any version, the el-api and el-impl bundles export it as 2.2.4, and el-impl imports el-api as 2.2, not 2.2.4. All bundles are resolved.
Update 2
Decision 1
my implementation of @hwellmann's idea. @hwellmann, is it correct?
public void createGroup(Group group) {
ClassLoader prevClassLoader = Thread.currentThread().getContextClassLoader();
try {
ClassLoader[] classLoaders = new ClassLoader[] {
prevClassLoader,
ExpressionFactoryImpl.class.getClassLoader()
};
// build composite classloader
Thread.currentThread().setContextClassLoader(compositeClassLoader);
Set<ConstraintViolation<Group>> constraintViolations = validator.validate(group);
} finally {
Thread.currentThread().setContextClassLoader(prevClassLoader);
}
}
It works but looks strange. And changing the TCCL on each validation call looks like some overhead.
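For completeness, the composite class loader the snippet above assumes (but does not show) could be sketched like this; this is my own illustration, not the OP's actual implementation:

// Sketch: tries each delegate in turn, for both classes and resources.
public class CompositeClassLoader extends ClassLoader {
    private final ClassLoader[] delegates;

    public CompositeClassLoader(ClassLoader... delegates) {
        super(null); // no single parent; the delegates are searched in order
        this.delegates = delegates;
    }

    @Override
    protected Class<?> findClass(String name) throws ClassNotFoundException {
        for (ClassLoader delegate : delegates) {
            try {
                return delegate.loadClass(name);
            } catch (ClassNotFoundException ignored) {
                // fall through to the next delegate
            }
        }
        throw new ClassNotFoundException(name);
    }

    @Override
    public java.net.URL getResource(String name) {
        // Also needed, because the EL factory lookup goes through META-INF/services resources.
        for (ClassLoader delegate : delegates) {
            java.net.URL url = delegate.getResource(name);
            if (url != null) {
                return url;
            }
        }
        return null;
    }
}

It would then be instantiated as new CompositeClassLoader(classLoaders) right before the setContextClassLoader call.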
Decision 2
The error has gone when I add my own message attribute to each validation annotation, for example for Group:
public class Group {
@NotNull
@Size(min=2, message="field.too_short")
private String title;
}
It seems that in this case the Hibernate interpolator is not started, so ExpressionFactoryImpl is not retrieved from the TCCL (previously we read the min=2 value, now we don't). If that is okay for us, this decision is the simplest.
I will investigate this area further and share my observations here.
There is a related issue in JBoss Fuse:
https://access.redhat.com/solutions/1479723
They recommend a downgrade to 4.3.1.Final:
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId>
<version>4.3.1.Final</version>
</dependency>
The full stack trace might give some more insight than just the exception messages.
My guess is that you're seeing the results of a failed service lookup via META-INF/services/javax.el.ExpressionFactory. The bundle that's doing the lookup apparently can't see com.sun.el.
Importing this package into your application bundle and setting the thread context classloader to your bundle classloader might help.
I had a similar issue with Hibernate 5.2.5 and Jetty 9.x. Basically, it looked like Hibernate didn't check if there was already an EL implementation on the classpath and tried to use javax.el-api instead of Jetty 9's apache-el. An upgrade to Hibernate 5.2.10 solved the issue for me. No conflicting el-api on the classpath any more.
Background
I use Hibernate Validator version 5.4.3.Final.
Starting from 5.3.1.Final, it is mentioned that ExpressionFactoryImpl is a hard requirement.
There is an OSGi issue in 5.3.1.Final: ClassLoader issues in modularized environments. It has been fixed in 5.3.3.Final; the ticket number is HV-1155.
Fix
Based on the above information,
there is no need to use Update 2 -> Decision 1.
I found out that you could simply change the code posted by OP from
Configuration<?> configuration = Validation.byDefaultProvider().providerResolver(
new HibernateValidationProviderResolver()
).configure();
to
Configuration<?> configuration = Validation.byProvider(HibernateValidator.class).providerResolver(
new HibernateValidationProviderResolver()
).externalClassLoader(getClass().getClassLoader()).configure();
11.15. Customizing class-loading mentions this trick.
In addition, if you are using Hibernate Validator version 6.0.0.CR3 or above, you don't need to apply the trick above; HV-1426 covers it.
I have the following program, which executes a parser. This is developed in grappa (a fork of parboiled)
package com.test;
import org.parboiled.Parboiled;
import org.parboiled.parserunners.BasicParseRunner;
import org.parboiled.parserunners.ParseRunner;
import org.parboiled.support.ParsingResult;
public final class SampleRun
{
public static void main(final String... args)
{
// 1. create a parser
final TestGrammar parser = Parboiled.createParser(TestGrammar.class);
// 2. create a runner
final ParseRunner<String> runner
= new BasicParseRunner<String>(parser.oneLine());
// 3. collect the result
@SuppressWarnings("deprecation")
final ParsingResult<String> result
= runner.run("sno101 snamegowtham");
// 4. success or not?
System.out.println(result.isSuccess());
}
}
TestGrammar
package com.test;
import com.github.parboiled1.grappa.parsers.EventBusParser;
import org.parboiled.Rule;
import org.parboiled.support.Var;
import java.util.HashMap;
import java.util.Map;
public class TestGrammar
extends EventBusParser<String>
{
protected final Map<String, String> collectedValues
= new HashMap<String, String>();
protected final Var<String> var = new Var<String>();
Rule key()
{
return sequence(
firstOf(ignoreCase("sno"), ignoreCase("sname")),
var.set(match().toLowerCase()),
!collectedValues.containsKey(var.get())
);
}
Rule separator()
{
return optional(anyOf(":-*_ "));
}
Rule value()
{
return sequence(
oneOrMore(testNot(wsp()), ANY),
collectedValues.put(var.get(), match()) == null
);
}
Rule oneLine()
{
return join(sequence(key(), separator(), value()))
.using(oneOrMore(wsp()))
.min(2);
}
}
But, I am getting the following error when I try to execute the above program.
Exception in thread "main" java.lang.NoSuchMethodError: org.objectweb.asm.tree.ClassNode.<init>(I)V
at org.parboiled.transform.ParserClassNode.<init>(ParserClassNode.java:50)
at org.parboiled.transform.ParserTransformer.extendParserClass(ParserTransformer.java:93)
at org.parboiled.transform.ParserTransformer.transformParser(ParserTransformer.java:63)
at org.parboiled.Parboiled.createParser(Parboiled.java:64)
at com.test.SampleRun.main(SampleRun.java:15)
I have the following maven dependencies
grappa-1.0.4.jar
asm-debug-all-5.0.3.jar
guava-18.0.jar
jitescript-0.4.0.jar
Here is my pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.test</groupId>
<artifactId>parboiledprogram</artifactId>
<version>0.0.1-SNAPSHOT</version>
<build>
<sourceDirectory>src</sourceDirectory>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>com.github.parboiled1</groupId>
<artifactId>grappa</artifactId>
<version>1.0.4</version>
</dependency>
</dependencies>
</project>
Note: I am using Eclipse Juno Service Release 2
Some attempts that didn't work
I have noticed the following icon for asm-debug-all-5.0.3.jar; I am not sure what this icon means in Eclipse Juno.
Also, in the pom.xml of the dependency jitescript-0.4.0.jar I noticed a relocation of the org.objectweb.asm package. However, the classes in it also contain the ClassNode(int) constructor. jitescript was last updated on 16-Apr-2014, whereas asm-debug-all-5.0.3 was released on 24-May-2014.
I tried removing jitescript.jar, updating the Maven project, and cleaning and rebuilding it, but to no avail.
I have also tested this in Eclipse Kepler without using Maven, by manually including all the dependencies listed above. But I am still getting the same error, which means that the problem was not with Maven but something else.
The error is due to the presence of dcevm.jar in my jre\lib\ext folder. The JDK which I used for the project contains the dcevm.jar file, and this jar contains the package org.objectweb.asm.tree, whose ClassNode has no constructor like ClassNode(int).
The JVM uses this jar instead of the jar file in the Maven dependencies, i.e. if classes are present in any of the jar files in the JRE system library used by the project, they override the classes present in the Maven jars.
Therefore, I deleted dcevm.jar from jre\lib\ext, and then in my Eclipse preferences I changed the execution environment to a new one (pointing to my JDK path, jdk1.7.0_13) and used it as the default.
Summary
Such errors clearly occur because different versions of jar files contain the class. To identify them:
First, search for the class (e.g. ClassNode) in the Eclipse Open Type dialog (Ctrl+Shift+T) and you'll find different files with the same class name and package. You can also see where they are located.
Open each one, check which one lacks the method, and eliminate that one.
Note: If the class is present in the JRE system library, it is likely to override the one in the Maven dependency.
I'm using DynamoDB local for unit testing. It's not bad, but has some drawbacks. Specifically:
You have to somehow start the server before your tests run
The server isn't started and stopped before each test so tests become inter-dependent unless you add code to delete all tables, etc. after each test
All developers need to have it installed
What I want to do is something like put the DynamoDB local jar, and the other jars upon which it depends, in my test/resources directory (I'm writing in Java). Then before each test I'd start it up, running with -inMemory, and after the test I'd stop it. That way anyone pulling down the git repo gets a copy of everything they need to run the tests and each test is independent of the others.
I have found a way to make this work, but it's ugly, so I'm looking for alternatives. The solution I have is to put a .zip file of the DynamoDB local stuff in test/resources, then in the #Before method, I'd extract it to some temporary directory and start a new java process to execute it. That works, but it's ugly and has some drawbacks:
Everyone needs the java executable on their $PATH
I have to unpack a zip to the local disk. Using local disk is often dicey for testing, especially with continuous builds and such.
I have to spawn a process and wait for it to start for each unit test, and then kill that process after each test. Besides being slow, the potential for left-over processes seems ugly.
It seems like there should be an easier way. DynamoDB Local is, after all, just Java code. Can't I somehow ask the JVM to fork itself and look inside the resources to build a classpath? Or, even better, can't I just call the main method of DynamoDB Local from some other thread so this all happens in a single process? Any ideas?
PS: I am aware of Alternator, but it appears to have other drawbacks so I'm inclined to stick with Amazon's supported solution if I can make it work.
In order to use DynamoDBLocal you need to follow these steps.
Get Direct DynamoDBLocal Dependency
Get Native SQLite4Java dependencies
Set sqlite4java.library.path to point to the native libraries
1. Get Direct DynamoDBLocal Dependency
This one is the easy one. You need this repository as explained here.
<!--Dependency:-->
<dependencies>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>DynamoDBLocal</artifactId>
<version>1.11.0.1</version>
<scope></scope>
</dependency>
</dependencies>
<!--Custom repository:-->
<repositories>
<repository>
<id>dynamodb-local</id>
<name>DynamoDB Local Release Repository</name>
<url>https://s3-us-west-2.amazonaws.com/dynamodb-local/release</url>
</repository>
</repositories>
2. Get Native SQLite4Java dependencies
If you do not add these dependencies, your tests will fail with 500 internal error.
First, add these dependencies:
<dependency>
<groupId>com.almworks.sqlite4java</groupId>
<artifactId>sqlite4java</artifactId>
<version>1.0.392</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.almworks.sqlite4java</groupId>
<artifactId>sqlite4java-win32-x86</artifactId>
<version>1.0.392</version>
<type>dll</type>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.almworks.sqlite4java</groupId>
<artifactId>sqlite4java-win32-x64</artifactId>
<version>1.0.392</version>
<type>dll</type>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.almworks.sqlite4java</groupId>
<artifactId>libsqlite4java-osx</artifactId>
<version>1.0.392</version>
<type>dylib</type>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.almworks.sqlite4java</groupId>
<artifactId>libsqlite4java-linux-i386</artifactId>
<version>1.0.392</version>
<type>so</type>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.almworks.sqlite4java</groupId>
<artifactId>libsqlite4java-linux-amd64</artifactId>
<version>1.0.392</version>
<type>so</type>
<scope>test</scope>
</dependency>
Then, add this plugin to copy the native dependencies to a specific folder:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.10</version>
<executions>
<execution>
<id>copy</id>
<phase>test-compile</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<includeScope>test</includeScope>
<includeTypes>so,dll,dylib</includeTypes>
<outputDirectory>${project.basedir}/native-libs</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
3. Set sqlite4java.library.path to point to the native libraries
As the last step, you need to set the sqlite4java.library.path system property to the native-libs directory. It is OK to do that just before creating your local server.
System.setProperty("sqlite4java.library.path", "native-libs");
After these steps you can use DynamoDBLocal as you want. Here is a JUnit rule that creates a local server for that.
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient;
import com.amazonaws.services.dynamodbv2.local.main.ServerRunner;
import com.amazonaws.services.dynamodbv2.local.server.DynamoDBProxyServer;
import org.junit.rules.ExternalResource;
import java.io.IOException;
import java.net.ServerSocket;
/**
* Creates a local DynamoDB instance for testing.
*/
public class LocalDynamoDBCreationRule extends ExternalResource {
private DynamoDBProxyServer server;
private AmazonDynamoDB amazonDynamoDB;
public LocalDynamoDBCreationRule() {
// This one should be copied during test-compile time. If the project's basedir does not contain a folder
// named 'native-libs', please try '$ mvn clean install' from the command line first
System.setProperty("sqlite4java.library.path", "native-libs");
}
@Override
protected void before() throws Throwable {
try {
final String port = getAvailablePort();
this.server = ServerRunner.createServerFromCommandLineArgs(new String[]{"-inMemory", "-port", port});
server.start();
amazonDynamoDB = new AmazonDynamoDBClient(new BasicAWSCredentials("access", "secret"));
amazonDynamoDB.setEndpoint("http://localhost:" + port);
} catch (Exception e) {
throw new RuntimeException(e);
}
}
@Override
protected void after() {
if (server == null) {
return;
}
try {
server.stop();
} catch (Exception e) {
throw new RuntimeException(e);
}
}
public AmazonDynamoDB getAmazonDynamoDB() {
return amazonDynamoDB;
}
private String getAvailablePort() {
try (final ServerSocket serverSocket = new ServerSocket(0)) {
return String.valueOf(serverSocket.getLocalPort());
} catch (IOException e) {
throw new RuntimeException("Available port was not found", e);
}
}
}
You can use this rule like this
@RunWith(JUnit4.class)
public class UserDAOImplTest {
@ClassRule
public static final LocalDynamoDBCreationRule dynamoDB = new LocalDynamoDBCreationRule();
}
In August 2018 Amazon announced a new Docker image with Amazon DynamoDB Local onboard. It does not require downloading and running any JARs or using third-party OS-specific binaries (I'm talking about sqlite4java).
It is as simple as starting a Docker container before the tests:
docker run -p 8000:8000 amazon/dynamodb-local
You can do that manually for local development, as described above, or use it in your CI pipeline. Many CI services provide the ability to start additional containers during the pipeline that can provide dependencies for your tests. Here is an example for GitLab CI/CD:
test:
stage: test
image: openjdk:8-alpine
services:
- name: amazon/dynamodb-local
alias: dynamodb-local
script:
- DYNAMODB_LOCAL_URL=http://dynamodb-local:8000 ./gradlew clean test
Or Bitbucket Pipelines:
definitions:
services:
dynamodb-local:
image: amazon/dynamodb-local
…
step:
name: test
image:
name: openjdk:8-alpine
services:
- dynamodb-local
script:
- DYNAMODB_LOCAL_URL=http://localhost:8000 ./gradlew clean test
And so on. The idea is to move all the configuration you can see in other answers out of your build tool and provide the dependency externally. Think of it as dependency injection / IoC, but for the whole service, not just a single bean.
After you've started the container you can create a client pointing to it:
private AmazonDynamoDB createAmazonDynamoDB(final DynamoDBLocal configuration) {
return AmazonDynamoDBClientBuilder
.standard()
.withEndpointConfiguration(
new AwsClientBuilder.EndpointConfiguration(
"http://localhost:8000",
Regions.US_EAST_1.getName()
)
)
.withCredentials(
new AWSStaticCredentialsProvider(
// DynamoDB Local works with any non-null credentials
new BasicAWSCredentials("", "")
)
)
.build();
}
Now to the original questions:
You have to somehow start the server before your tests run
You can just start it manually, or prepare a developers' script for it. IDEs usually provide a way to run arbitrary commands before executing a task, so you can make the IDE start the container for you. I think that running something locally should not be a top priority in this case; instead you should focus on configuring CI and let the developers start the container in whatever way is comfortable for them.
The server isn't started and stopped before each test so tests become inter-dependent unless you add code to delete all tables, etc. after each test
That's true, but… You should not start and stop such heavyweight things and recreate tables before / after each test. DB tests are almost always inter-dependent and that's OK for them. Just use unique values for each test case (e.g. set the item's hash key to the ticket id / specific test case id you're working on). As for the seed data, I'd recommend moving it out of the build tool and test code as well. Either make your own image with all the data you need or use the AWS CLI to create tables and insert data. Follow the single responsibility and dependency injection principles: your test code must not do anything but tests. All of the environment (tables and data, in this case) should be provided for it. Creating a table in a test is wrong, because in real life that table already exists (unless you're testing a method that actually creates a table, of course).
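A minimal illustration of the unique-key-per-test idea, using the AWS SDK v1 client from above (the table and attribute names here are made up for the example):

import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.model.AttributeValue;
import com.amazonaws.services.dynamodbv2.model.PutItemRequest;
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

class TestDataHelper {
    // Isolate test data with a unique per-test key instead of recreating tables.
    // "my-table", "pk" and "payload" are hypothetical names.
    static void putIsolatedTestItem(AmazonDynamoDB dynamoDb) {
        String testId = UUID.randomUUID().toString();
        Map<String, AttributeValue> item = new HashMap<>();
        item.put("pk", new AttributeValue("test-" + testId));
        item.put("payload", new AttributeValue("some value"));
        dynamoDb.putItem(new PutItemRequest()
                .withTableName("my-table")
                .withItem(item));
    }
}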
All developers need to have it installed
Docker should be a must for every developer in 2018, so that's not a problem.
And if you're using JUnit 5, it can be a good idea to use a DynamoDB Local extension that will inject the client in your tests (yes, I'm doing a self-promotion):
Add a dependency on me.madhead.aws-junit5:dynamo-v1
pom.xml:
<dependency>
<groupId>me.madhead.aws-junit5</groupId>
<artifactId>dynamo-v1</artifactId>
<version>6.0.1</version>
<scope>test</scope>
</dependency>
build.gradle
dependencies {
testImplementation("me.madhead.aws-junit5:dynamo-v1:6.0.1")
}
Use the extension in your tests:
@ExtendWith(DynamoDBLocalExtension.class)
class MultipleInjectionsTest {
@DynamoDBLocal(
url = "http://dynamodb-local-1:8000"
)
private AmazonDynamoDB first;
@DynamoDBLocal(
urlEnvironmentVariable = "DYNAMODB_LOCAL_URL"
)
private AmazonDynamoDB second;
@Test
void test() {
first.listTables();
second.listTables();
}
}
This is a restating of bhdrkn's answer for Gradle users (his is based on Maven). It's still the same three steps:
Get Direct DynamoDBLocal Dependency
Get Native SQLite4Java dependencies
Set sqlite4java.library.path to point to the native libraries
1. Get Direct DynamoDBLocal Dependency
Add to the dependencies section of your build.gradle file...
dependencies {
testCompile "com.amazonaws:DynamoDBLocal:1.+"
}
2. Get Native SQLite4Java dependencies
The sqlite4java libraries will already be downloaded as a dependency of DynamoDBLocal, but the library files need to be copied to the right place. Add to your build.gradle file...
task copyNativeDeps(type: Copy) {
from(configurations.compile + configurations.testCompile) {
include '*.dll'
include '*.dylib'
include '*.so'
}
into 'build/libs'
}
3. Set sqlite4java.library.path to point to the native libraries
We need to tell Gradle to run copyNativeDeps for testing and tell sqlite4java where to find the files. Add to your build.gradle file...
test {
dependsOn copyNativeDeps
systemProperty "java.library.path", 'build/libs'
}
You can use DynamoDB Local as a Maven test dependency in your test code, as is shown in this announcement. You can run over HTTP:
import com.amazonaws.services.dynamodbv2.local.main.ServerRunner;
import com.amazonaws.services.dynamodbv2.local.server.DynamoDBProxyServer;
final String[] localArgs = { "-inMemory" };
DynamoDBProxyServer server = ServerRunner.createServerFromCommandLineArgs(localArgs);
server.start();
AmazonDynamoDB dynamodb = new AmazonDynamoDBClient();
dynamodb.setEndpoint("http://localhost:8000");
dynamodb.listTables();
server.stop();
You can also run in embedded mode:
import com.amazonaws.services.dynamodbv2.local.embedded.DynamoDBEmbedded;
AmazonDynamoDB dynamodb = DynamoDBEmbedded.create();
dynamodb.listTables();
I have wrapped the answers above into two JUnit rules that do not require changes to the build script, as the rules handle the native library stuff. I did this because I found that IntelliJ IDEA did not like the Gradle/Maven solutions; it just went off and did its own thing anyhow.
This means the steps are:
Get the AssortmentOfJUnitRules version 1.5.32 or above dependency
Get the Direct DynamoDBLocal dependency
Add the LocalDynamoDbRule or HttpDynamoDbRule to your JUnit test.
Maven:
<!--Dependency:-->
<dependencies>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>DynamoDBLocal</artifactId>
<version>1.11.0.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.github.mlk</groupId>
<artifactId>assortmentofjunitrules</artifactId>
<version>1.5.36</version>
<scope>test</scope>
</dependency>
</dependencies>
<!--Custom repository:-->
<repositories>
<repository>
<id>dynamodb-local</id>
<name>DynamoDB Local Release Repository</name>
<url>https://s3-us-west-2.amazonaws.com/dynamodb-local/release</url>
</repository>
</repositories>
Gradle:
repositories {
mavenCentral()
maven {
url = "https://s3-us-west-2.amazonaws.com/dynamodb-local/release"
}
}
dependencies {
testCompile "com.github.mlk:assortmentofjunitrules:1.5.36"
testCompile "com.amazonaws:DynamoDBLocal:1.+"
}
Code:
public class LocalDynamoDbRuleTest {
@Rule
public LocalDynamoDbRule ddb = new LocalDynamoDbRule();
@Test
public void test() {
doDynamoStuff(ddb.getClient());
}
}
Try out tempest-testing! It ships a JUnit4 Rule and a JUnit5 Extension. It also supports both AWS SDK v1 and SDK v2.
Tempest provides a library for testing DynamoDB clients using DynamoDBLocal. It comes with two implementations:
JVM: This is the preferred option, running a DynamoDBProxyServer backed by sqlite4java, which is available on most platforms.
Docker: This runs dynamodb-local in a Docker container.
Feature matrix:

| Feature | tempest-testing-jvm | tempest-testing-docker |
| --- | --- | --- |
| Start up time | ~1s | ~10s |
| Memory usage | Less | More |
| Dependency | sqlite4java native library | Docker |
To use tempest-testing, first add this library as a test dependency:
For AWS SDK 1.x:
dependencies {
testImplementation "app.cash.tempest:tempest-testing-jvm:1.5.2"
testImplementation "app.cash.tempest:tempest-testing-junit5:1.5.2"
}
// Or
dependencies {
testImplementation "app.cash.tempest:tempest-testing-docker:1.5.2"
testImplementation "app.cash.tempest:tempest-testing-junit5:1.5.2"
}
For AWS SDK 2.x:
dependencies {
testImplementation "app.cash.tempest:tempest2-testing-jvm:1.5.2"
testImplementation "app.cash.tempest:tempest2-testing-junit5:1.5.2"
}
// Or
dependencies {
testImplementation "app.cash.tempest:tempest2-testing-docker:1.5.2"
testImplementation "app.cash.tempest:tempest2-testing-junit5:1.5.2"
}
Then in tests annotated with @org.junit.jupiter.api.Test, you may add TestDynamoDb as a test extension. This extension spins up a DynamoDB server. It shares the server across tests and keeps it running until the process exits. It also manages test tables for you, recreating them before each test.
class MyTest {
@RegisterExtension
TestDynamoDb db = new TestDynamoDb.Builder(JvmDynamoDbServer.Factory.INSTANCE) // or DockerDynamoDbServer
// `MusicItem` is annotated with `@DynamoDBTable`. Tempest recreates this table before each test.
.addTable(TestTable.create(MusicItem.TABLE_NAME, MusicItem.class))
.build();
@Test
public void test() {
PutItemRequest request = // ...;
// Talk to the local DynamoDB.
db.dynamoDb().putItem(request);
}
}
It seems like there should be an easier way. DynamoDB Local is, after all, just Java code. Can't I somehow ask the JVM to fork itself and look inside the resources to build a classpath?
You can do something along these lines, but much simpler: programmatically search the classpath for the location of the native libraries, then set the sqlite4java.library.path property before starting DynamoDB. This is the approach implemented in tempest-testing, as well as in this answer (code here), which is why they just work as a pure library/classpath dependency and nothing more.
In my case I needed access to DynamoDB outside of a JUnit extension, but I still wanted something self-contained in library code, so I extracted the approach it takes:
import com.amazonaws.services.dynamodbv2.local.embedded.DynamoDBEmbedded;
import com.amazonaws.services.dynamodbv2.local.shared.access.AmazonDynamoDBLocal;
import com.google.common.collect.MoreCollectors;
import java.io.File;
import java.util.Arrays;
import java.util.stream.Stream;
import org.junit.jupiter.api.condition.OS;
...
public AmazonDynamoDBLocal embeddedDynamoDb() {
final OS os = Stream.of(OS.values()).filter(OS::isCurrentOs)
.collect(MoreCollectors.onlyElement());
final String prefix;
switch (os) {
case LINUX:
prefix = "libsqlite4java-linux-amd64-";
break;
case MAC:
prefix = "libsqlite4java-osx-";
break;
case WINDOWS:
prefix = "sqlite4java-win32-x64-";
break;
default:
throw new UnsupportedOperationException(os.toString());
}
System.setProperty("sqlite4java.library.path",
Arrays.asList(System.getProperty("java.class.path").split(File.pathSeparator))
.stream()
.map(File::new)
.filter(file -> file.getName().startsWith(prefix))
.collect(MoreCollectors.onlyElement())
.getParent());
return DynamoDBEmbedded.create();
}
I haven't had a chance to test on a lot of platforms, and the error handling could likely be improved.
It's a pity AWS haven't taken the time to make the library more friendly, as this could easily be done in the library code itself.
For unit testing at work I use Mockito and just mock the AmazonDynamoDBClient, then mock out the returns using when(), like the following:
when(mockAmazonDynamoDBClient.getItem(isA(GetItemRequest.class))).thenAnswer(new Answer<GetItemResult>() {
@Override
public GetItemResult answer(InvocationOnMock invocation) throws Throwable {
GetItemResult result = new GetItemResult();
result.setItem( testResultItem );
return result;
}
});
Not sure if that is what you're looking for, but that's how we do it.
Shortest solution, with a fix for the sqlite4java.SQLiteException / UnsatisfiedLinkError, for a Java/Kotlin project built with Gradle (a changed $PATH is not needed).
repositories {
// ... other dependencies
maven { url 'https://s3-us-west-2.amazonaws.com/dynamodb-local/release' }
}
dependencies {
testImplementation("com.amazonaws:DynamoDBLocal:1.13.6")
}
import org.gradle.internal.os.OperatingSystem
test {
doFirst {
// Fix for: UnsatisfiedLinkError -> provide a valid native lib path
String nativePrefix = OperatingSystem.current().nativePrefix
File nativeLib = sourceSets.test.runtimeClasspath.files.find {it.name.startsWith("libsqlite4java") && it.name.contains(nativePrefix) } as File
systemProperty "sqlite4java.library.path", nativeLib.parent
}
}
Straightforward usage in test classes (src/test):
private lateinit var db: AmazonDynamoDBLocal
@BeforeAll
fun runDb() { db = DynamoDBEmbedded.create() }
@AfterAll
fun shutdownDb() { db.shutdown() }
There are a couple of Node.js wrappers for DynamoDB Local. These allow you to easily execute unit tests in combination with task runners like gulp or grunt. Try dynamodb-localhost or dynamodb-local.
I have found that the Amazon repo has no index file, so it does not seem to function in a way that allows you to bring it in like this:
maven {
url = "https://s3-us-west-2.amazonaws.com/dynamodb-local/release"
}
The only way I could get the dependencies to load is by downloading DynamoDbLocal as a jar and bringing it into my build script like this:
dependencies {
...
runtime files('libs/DynamoDBLocal.jar')
...
}
Of course this means that all the SQLite and Jetty dependencies need to be brought in by hand - I'm still trying to get this right. If anyone knows of a reliable repo for DynamoDbLocal, I would really love to know.
You could also use this lightweight test container 'Dynalite'
https://www.testcontainers.org/modules/databases/dynalite/
From testcontainers:
Dynalite is a clone of DynamoDB, enabling local testing. It's light
and quick to run.
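A usage sketch, roughly following the Testcontainers documentation for the Dynalite module (the exact API may differ between Testcontainers versions):

import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import org.junit.Rule;
import org.junit.Test;
import org.testcontainers.dynamodb.DynaliteContainer;

public class DynaliteExampleTest {

    // Starts a fresh Dynalite container for each test.
    @Rule
    public DynaliteContainer dynamoDB = new DynaliteContainer();

    @Test
    public void listsTables() {
        AmazonDynamoDB client = AmazonDynamoDBClientBuilder.standard()
                .withEndpointConfiguration(dynamoDB.getEndpointConfiguration())
                .withCredentials(dynamoDB.getCredentials())
                .build();
        client.listTables();
    }
}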
The DynamoDB Gradle dependency already includes the SQLite libraries. You can pretty easily instruct the Java runtime to use it in your Gradle build script. Here's my build.gradle.kts as an example:
import org.apache.tools.ant.taskdefs.condition.Os
plugins {
application
}
repositories {
mavenCentral()
maven {
url = uri("https://s3-us-west-2.amazonaws.com/dynamodb-local/release")
}
}
dependencies {
implementation("com.amazonaws:DynamoDBLocal:[1.12,2.0)")
}
fun getSqlitePath(): String? {
val dirName = when {
Os.isFamily(Os.FAMILY_MAC) -> "libsqlite4java-osx"
Os.isFamily(Os.FAMILY_UNIX) -> "libsqlite4java-linux-amd64"
Os.isFamily(Os.FAMILY_WINDOWS) -> "sqlite4java-win32-x64"
else -> throw kotlin.Exception("DynamoDB emulator cannot run on this platform")
}
return project.configurations.runtimeClasspath.get().find { it.name.contains(dirName) }?.parent
}
application {
mainClass.set("com.amazonaws.services.dynamodbv2.local.main.ServerRunner")
applicationDefaultJvmArgs = listOf("-Djava.library.path=${getSqlitePath()}")
}
tasks.named<JavaExec>("run") {
args("-inMemory")
}