When trying to connect to Secrets Manager, my code throws this exception. I am trying to create a Secrets Manager client.
AWSSecretsManager client =
AWSSecretsManagerClientBuilder.standard()
.withRegion(region)
.build();
In pom.xml I have added the following dependencies:
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-secretsmanager</artifactId>
<version>1.11.965</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk</artifactId>
<version>1.11.965</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-core</artifactId>
<version>1.11.965</version>
</dependency>
The solution provided by @smac2020 was not helpful for me, as I was not dealing with Secrets Manager but with AWS Cognito.
Here is what I found out; it might be helpful for someone in the future.
I got this error after updating to the latest version of the Cognito IDP client, 1.12.167:
implementation group: 'io.awspring.cloud', name: 'spring-cloud-starter-aws-parameter-store-config', version: 1.12.167
This transitively pulls in an OLD VERSION of the aws-java-sdk-core and jmespath-java dependencies.
EnhancedJsonErrorUnmarshaller is a new class in the latest aws-java-sdk-core; it does not exist in the older version that the Cognito dependency pulled in by default.
Solution: update aws-java-sdk-core manually to match the version of the Cognito client.
implementation group: 'com.amazonaws', name: 'aws-java-sdk-core', version: '1.12.167'
This has the EnhancedJsonErrorUnmarshaller in it.
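If you are unsure which aws-java-sdk-core actually ends up on your classpath, a quick runtime check like the minimal sketch below can help. The fully qualified class name used here is an assumption based on the usual NoClassDefFoundError message; use the exact name from your own error if it differs.
// Minimal sketch: check whether EnhancedJsonErrorUnmarshaller is visible at runtime
// and where it was loaded from. The package name below is an assumption.
public class SdkCoreCheck {
    public static void main(String[] args) {
        String className = "com.amazonaws.transform.EnhancedJsonErrorUnmarshaller";
        try {
            Class<?> clazz = Class.forName(className);
            System.out.println("Found " + clazz.getName());
            System.out.println("Loaded from: "
                    + clazz.getProtectionDomain().getCodeSource().getLocation());
        } catch (ClassNotFoundException e) {
            System.out.println(className + " is not on the classpath - "
                    + "aws-java-sdk-core is probably older than the service client expects.");
        }
    }
}
Running this on the same classpath as your application shows which jar the class (or its absence) comes from, which makes version conflicts easy to spot.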
Amazon suggests moving to the AWS SDK for Java V2. You can find Secrets Manager V2 code on GitHub here.
The POM file that contains the dependencies is located on GitHub in the SecretManager folder.
The V2 code has been tested many times, and this code works:
package com.example.secrets;
//snippet-start:[secretsmanager.java2.create_secret.import]
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.secretsmanager.SecretsManagerClient;
import software.amazon.awssdk.services.secretsmanager.model.CreateSecretRequest;
import software.amazon.awssdk.services.secretsmanager.model.CreateSecretResponse;
import software.amazon.awssdk.services.secretsmanager.model.SecretsManagerException;
//snippet-end:[secretsmanager.java2.create_secret.import]
/**
* To run this AWS code example, ensure that you have set up your development environment, including your AWS credentials.
*
* For information, see this documentation topic:
*
* https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/get-started.html
*/
public class CreateSecret {
public static void main(String[] args) {
final String USAGE = "\n" +
"Usage:\n" +
" CreateSecret <secretName> <secretValue> \n\n" +
"Where:\n" +
" secretName - the name of the secret (for example, tutorials/MyFirstSecret). \n"+
" secretValue - the secret value. \n";
if (args.length != 2) {
System.out.println(USAGE);
System.exit(1);
}
String secretName = args[0];
String secretValue= args[1];
Region region = Region.US_EAST_1;
SecretsManagerClient secretsClient = SecretsManagerClient.builder()
.region(region)
.build();
String secretARN = createNewSecret(secretsClient, secretName, secretValue);
System.out.println("The secret ARN is "+ secretARN);
secretsClient.close();
}
//snippet-start:[secretsmanager.java2.create_secret.main]
public static String createNewSecret( SecretsManagerClient secretsClient, String secretName, String secretValue) {
try {
CreateSecretRequest secretRequest = CreateSecretRequest.builder()
.name(secretName)
.description("This secret was created by the AWS Secret Manager Java API")
.secretString(secretValue)
.build();
CreateSecretResponse secretResponse = secretsClient.createSecret(secretRequest);
return secretResponse.arn();
} catch (SecretsManagerException e) {
System.err.println(e.awsErrorDetails().errorMessage());
System.exit(1);
}
return "";
}
//snippet-end:[secretsmanager.java2.create_secret.main]
}
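Since the original question was about connecting to Secrets Manager in the first place, here is a minimal hedged sketch of reading a secret back with the same V2 client. The secret name and region are assumptions for illustration; treat it as a sketch rather than the official example.
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.secretsmanager.SecretsManagerClient;
import software.amazon.awssdk.services.secretsmanager.model.GetSecretValueRequest;
import software.amazon.awssdk.services.secretsmanager.model.GetSecretValueResponse;

public class GetSecret {
    public static void main(String[] args) {
        // Assumes a secret named "tutorials/MyFirstSecret" already exists in us-east-1.
        String secretName = "tutorials/MyFirstSecret";
        SecretsManagerClient secretsClient = SecretsManagerClient.builder()
                .region(Region.US_EAST_1)
                .build();

        GetSecretValueRequest request = GetSecretValueRequest.builder()
                .secretId(secretName)
                .build();
        GetSecretValueResponse response = secretsClient.getSecretValue(request);

        // secretString() is populated for plain-text/JSON secrets; binary secrets use secretBinary().
        System.out.println("Secret value: " + response.secretString());
        secretsClient.close();
    }
}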
Related
I am new to GCP and trying to develop a Spring Boot REST API that connects to GCP Bigtable. Is there a quickstart guide that helps with development?
Yup, there is. Check out this link to get started. You need the Java client library.
https://cloud.google.com/bigtable/docs/reference/libraries#client-libraries-install-java
for Maven:
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.google.cloud</groupId>
<artifactId>libraries-bom</artifactId>
<version>26.1.4</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>com.google.cloud</groupId>
<artifactId>google-cloud-bigtable</artifactId>
</dependency>
</dependencies>
Then check this for usage:
https://cloud.google.com/bigtable/docs/reference/libraries#client-libraries-usage-java
import com.google.api.gax.rpc.NotFoundException;
import com.google.cloud.bigtable.data.v2.BigtableDataClient;
import com.google.cloud.bigtable.data.v2.BigtableDataSettings;
import com.google.cloud.bigtable.data.v2.models.Row;
import com.google.cloud.bigtable.data.v2.models.RowCell;
public class Quickstart {
public static void main(String... args) {
String projectId = args[0]; // my-gcp-project-id
String instanceId = args[1]; // my-bigtable-instance-id
String tableId = args[2]; // my-bigtable-table-id
quickstart(projectId, instanceId, tableId);
}
public static void quickstart(String projectId, String instanceId, String tableId) {
BigtableDataSettings settings =
BigtableDataSettings.newBuilder().setProjectId(projectId).setInstanceId(instanceId).build();
// Initialize client that will be used to send requests. This client only needs to be created
// once, and can be reused for multiple requests. After completing all of your requests, call
// the "close" method on the client to safely clean up any remaining background resources.
try (BigtableDataClient dataClient = BigtableDataClient.create(settings)) {
System.out.println("\nReading a single row by row key");
Row row = dataClient.readRow(tableId, "r1");
System.out.println("Row: " + row.getKey().toStringUtf8());
for (RowCell cell : row.getCells()) {
System.out.printf(
"Family: %s Qualifier: %s Value: %s%n",
cell.getFamily(), cell.getQualifier().toStringUtf8(), cell.getValue().toStringUtf8());
}
} catch (NotFoundException e) {
System.err.println("Failed to read from a non-existent table: " + e.getMessage());
} catch (Exception e) {
System.out.println("Error during quickstart: \n" + e.toString());
}
}
}
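Since the question was about a Spring Boot REST API, here is a minimal hedged sketch of wrapping the Bigtable client in a controller. The project/instance/table ids and the /rows/{key} endpoint are placeholders I made up for illustration; they are not part of the official quickstart.
import com.google.cloud.bigtable.data.v2.BigtableDataClient;
import com.google.cloud.bigtable.data.v2.models.Row;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

import java.io.IOException;

@RestController
public class BigtableController {

    // One client per application; it is thread-safe and should be reused across requests.
    private final BigtableDataClient dataClient;
    private final String tableId = "my-bigtable-table-id"; // assumed table id

    public BigtableController() throws IOException {
        // Assumed project/instance ids - replace with your own values or inject them
        // from application properties.
        this.dataClient = BigtableDataClient.create("my-gcp-project-id", "my-bigtable-instance-id");
    }

    @GetMapping("/rows/{key}")
    public String readRow(@PathVariable String key) {
        Row row = dataClient.readRow(tableId, key);
        return row == null ? "not found" : row.getKey().toStringUtf8();
    }
}
In a real service you would typically create the client in a @Configuration class and close it on shutdown rather than constructing it inside the controller.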
I am developing a Java Azure Function that needs to download a file from Azure Data Lake Gen2.
When the function tries to read the file, it freezes; no exception is thrown and nothing is written to the console.
I am using the azure-storage-file-datalake SDK for Java dependency and this is my code:
import java.io.File;
import java.io.FileOutputStream;
import java.io.OutputStream;
import com.azure.storage.common.StorageSharedKeyCredential;
import com.azure.storage.file.datalake.DataLakeDirectoryClient;
import com.azure.storage.file.datalake.DataLakeFileClient;
import com.azure.storage.file.datalake.DataLakeFileSystemClient;
import com.azure.storage.file.datalake.DataLakeServiceClient;
import com.azure.storage.file.datalake.DataLakeServiceClientBuilder;
public DataLakeServiceClient GetDataLakeServiceClient(String accountName, String accountKey)
{
StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(accountName, accountKey);
DataLakeServiceClientBuilder builder = new DataLakeServiceClientBuilder();
builder.endpoint("https://" + accountName + ".dfs.core.windows.net");
builder.credential(sharedKeyCredential);
return builder.buildClient();
}
public void DownloadFile(DataLakeFileSystemClient fileSystemClient, String fileName) throws Exception{
DataLakeDirectoryClient directoryClient = fileSystemClient.getDirectoryClient("DIR");
DataLakeDirectoryClient subdirClient= directoryClient.getSubdirectoryClient("SUBDIR");
DataLakeFileClient fileClient = subdirClient.getFileClient(fileName);
File file = new File("downloadedFile.txt");
OutputStream targetStream = new FileOutputStream(file);
fileClient.read(targetStream);
targetStream.close();
}
#FunctionName("func")
public HttpResponseMessage run(
@HttpTrigger(name = "req", methods = {HttpMethod.GET}, authLevel = AuthorizationLevel.ANONYMOUS)
HttpRequestMessage<Optional<String>> request,
final ExecutionContext context
)
{
String fileName= request.getQueryParameters().get("file");
DataLakeServiceClient datalakeClient= GetDataLakeServiceClient("datalake", "<the shared key>");
DataLakeFileSystemClient datalakeFsClient= datalakeClient.getFileSystemClient("fs");
DownloadFile(datalakeFsClient, fileName);
}
The app freezes when it hits fileClient.read(targetStream);
I've tried with really small files, I've checked the credentials, the file paths, and the access rights to the data lake, and I've switched to a SAS token - the result is the same: no error at all, but the app freezes.
I am using these Maven dependencies:
<dependency>
<groupId>com.microsoft.azure.functions</groupId>
<artifactId>azure-functions-java-library</artifactId>
</dependency>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-storage-file-datalake</artifactId>
<version>12.2.0</version>
</dependency>
I was facing the same problem. Then I came across this:
https://github.com/Azure/azure-functions-java-library/issues/113
This worked for me on Java 8, Azure Functions v3.
Set FUNCTIONS_WORKER_JAVA_LOAD_APP_LIBS to True in the function app's Application settings, then save and restart the function app. It will work.
Please check and do update if it worked for you as well.
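Separately from the worker setting, it can also be simpler to let the SDK write straight to a local file instead of managing the OutputStream yourself. This is only a hedged sketch: readToFile is available on DataLakeFileClient in recent azure-storage-file-datalake versions, so check that your SDK version has it before relying on it.
import com.azure.storage.file.datalake.DataLakeFileClient;
import com.azure.storage.file.datalake.DataLakeFileSystemClient;

public class DownloadHelper {
    // Assumes the same DIR/SUBDIR layout as in the question.
    public static void downloadFile(DataLakeFileSystemClient fileSystemClient, String fileName) {
        DataLakeFileClient fileClient = fileSystemClient
                .getDirectoryClient("DIR")
                .getSubdirectoryClient("SUBDIR")
                .getFileClient(fileName);

        // Writes the file contents straight to a local path. Note: this fails if the
        // local file already exists, so delete it first (or use an overwrite overload
        // if your SDK version provides one).
        fileClient.readToFile("downloadedFile.txt");
    }
}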
I want to use Java to access DynamoDB on an EC2 instance.
This EC2 instance has been granted an IAM role, with which I can directly access DynamoDB using the AWS CLI: aws dynamodb list-tables.
Now I am trying to access DynamoDB via Java. The Java code should be able to assume the role, but it didn't work.
public static void main(String[] args) throws Exception {
String ROLE_ARN = "arn:aws:iam::....";
AWSSecurityTokenServiceClient stsClient = new AWSSecurityTokenServiceClient();
AssumeRoleRequest assumeRequest = new AssumeRoleRequest()
.withRoleArn(ROLE_ARN)
.withDurationSeconds(3600)
.withRoleSessionName("demo");
AssumeRoleResult assumeResult = stsClient.assumeRole(assumeRequest);
BasicSessionCredentials temporaryCredentials = new BasicSessionCredentials(
assumeResult.getCredentials().getAccessKeyId(),
assumeResult.getCredentials().getSecretAccessKey(),
assumeResult.getCredentials().getSessionToken());
AmazonDynamoDBClient client = new AmazonDynamoDBClient(temporaryCredentials);
DynamoDB dynamoDB = new DynamoDB(client);
TableCollection<ListTablesResult> tables = dynamoDB.listTables();
Iterator<Table> iterator_t = tables.iterator();
System.out.println("Listing table names");
while (iterator_t.hasNext()) {
Table table = iterator_t.next();
System.out.println(table.getTableName());
}
}
When I ran the code on the EC2 instance, I got:
Exception in thread "main" com.amazonaws.services.securitytoken.model.AWSSecurityTokenServiceException: Not authorized to perform sts:AssumeRole (Service: AWSSecurityTokenService; Status Code: 403; Error Code: AccessDenied; Request ID: 60313562-d462-11e6-a116-5bf8bb6a59ce)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1586)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1254)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1035)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:747)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:721)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:704)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:672)
at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:654)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:518)
at com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClient.doInvoke(AWSSecurityTokenServiceClient.java:1188)
at com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClient.invoke(AWSSecurityTokenServiceClient.java:1164)
at com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClient.assumeRole(AWSSecurityTokenServiceClient.java:419)
at com.spokeo.dynamo_elas.AccessAwsD.main(AccessAwsD.java)  // at the stsClient.assumeRole(assumeRequest) call
Does anybody know how to solve this problem?
Thanks.
After a long time exploring, I finally figured out the following solution.
AWSCredentialsProvider provider = new InstanceProfileCredentialsProvider();
AWSCredentials credential = provider.getCredentials();
AmazonDynamoDBClient client = new AmazonDynamoDBClient(credential);
client.setRegion(Region.getRegion(Regions.US_WEST_2));
DynamoDB dynamoDB = new DynamoDB(client);
TableCollection<ListTablesResult> tables = dynamoDB.listTables();
Also, the dependencies in pom.xml need to be configured correctly to avoid conflicts, for example:
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk</artifactId>
<version>1.11.72</version>
</dependency>
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
<version>4.5.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-databind -->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.8.5</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.dataformat/jackson-dataformat-cbor -->
<dependency>
<groupId>com.fasterxml.jackson.dataformat</groupId>
<artifactId>jackson-dataformat-cbor</artifactId>
<version>2.8.5</version>
</dependency>
When I've done this I've never had to do anything specifically with the role - indeed, I have no idea what role I'm using. I use something like:
AWSCredentialsProviderChain credentialsProvider;
try {
credentialsProvider = new DefaultAWSCredentialsProviderChain();
}
catch (Exception e) {
throw new RuntimeException("Error loading credentials", e);
}
AmazonDynamoDBClient client = new AmazonDynamoDBClient(credentialsProvider);
The advantage of using the default provider chain is that if I'm developing locally with a ~/.aws/credentials file, it is used, and if I'm on the EC2 instance with IAM role credentials, those are used.
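Putting that together, a minimal sketch using the regular client builder (which goes through the default credentials provider chain, so the EC2 instance role or a local credentials file is picked up automatically) could look like this; the region is an assumption.
import com.amazonaws.regions.Regions;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.document.DynamoDB;
import com.amazonaws.services.dynamodbv2.document.Table;

public class ListTables {
    public static void main(String[] args) {
        // The builder uses the default credentials provider chain, so no explicit
        // AssumeRole or credentials handling is needed on the instance.
        AmazonDynamoDB client = AmazonDynamoDBClientBuilder.standard()
                .withRegion(Regions.US_WEST_2) // assumed region
                .build();

        DynamoDB dynamoDB = new DynamoDB(client);
        System.out.println("Listing table names");
        for (Table table : dynamoDB.listTables()) {
            System.out.println(table.getTableName());
        }
    }
}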
I'm trying to build SIP application using JAIN SIP 1.2 and the NIST implementation on android.
I have rebuilt jain-sip-api-1.2.jar and jain-sip-ri-1.2.1111.jar from source, and renamed javax -> jain_javax and gov.nist.javax -> jain_gov.nist.jain_javax. I tested the jar files with the TextClient example on standard Java without a problem. However, when I run it on Android I still get the error:
"The Peer SIP Stack: jain_gov.nist.jain_javax.sip.SipstackImpl could not be instantiated. Ensure the Path Name has been set".
Did I miss anything here?
It is not sufficient to rename the packages. JAIN-SIP has internal references to some classes by their original package name "gov.nist". You should also double-check all your code and rename any "gov.nist" references, such as the prefix for the stack classes.
Android has a built-in older version of JAIN-SIP which takes over some of the existing references to those "gov.nist" classes. It's not an exported API, so this is not obvious. That's why it may behave differently on desktop machines. Post your code and full error messages/debug logs if you need more help.
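One way to narrow this down is a small debugging sketch like the one below, which checks what the Android classloader actually resolves before you call createSipStack. It assumes your renamed prefix is jain_gov.nist, as in the code later in this question; the class name is built the same way the factory's createStack() method builds it.
public class StackClassCheck {
    public static void main(String[] args) {
        // Mirrors createStack(): pathName + ".jain_javax.sip.SipStackImpl"
        String pathName = "jain_gov.nist"; // assumed renamed prefix
        String implName = pathName + ".jain_javax.sip.SipStackImpl";
        try {
            Class<?> impl = Class.forName(implName);
            System.out.println("Resolved " + impl.getName());
        } catch (Throwable t) {
            // On Android the real cause is often a nested NoClassDefFoundError
            // (e.g. log4j or a leftover gov.nist reference), so print the whole chain.
            t.printStackTrace();
        }
    }
}
The PeerUnavailableException wraps the original cause, so printing the full chain usually points at the actual missing class.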
Solved. JAIN SIP uses log4j-1.2.x.jar, which does not work properly on Android. There is a lot of discussion on the Internet about how to make log4j work on Android, but none of it worked for me. I removed all log4j-related code from the JAIN SIP source and now the SIP stack works properly on Android.
I am using JAIN-SIP-1-2-164. Here is the app code:
import java.io.UnsupportedEncodingException;
import java.net.InetAddress;
import java.text.ParseException;
import java.util.*;
import android.os.Handler;
import jain_javax.sip.*;
import jain_javax.sip.address.*;
import jain_javax.sip.header.*;
import jain_javax.sip.message.*;
public class SipLayer implements SipListener {
private SipStack sipStack;
private SipFactory sipFactory;
private Properties properties;
private String local_ip;
int listen_port;
/** Here we initialize the SIP stack. */
public SipLayer(int listen_port) {
try {
setUsername(username);
this.local_ip = InetAddress.getLocalHost().getHostAddress();
this.listen_port = listen_port;
// Create the SIP factory and set the path name.
this.sipFactory = SipFactory.getInstance();
this.sipFactory.setPathName("jain_gov.nist");
// Create and set the SIP stack properties.
this.properties = new Properties();
this.properties.setProperty("jain_javax.sip.STACK_NAME", "stack");
this.properties.setProperty("jain_javax.sip.IP_ADDRESS", local_ip);
if(proxy != null)
this.properties.setProperty("jain_javax.sip.OUTBOUND_PROXY", proxy + ':' + server_port + '/' + protocol);
//DEBUGGING: Information will go to files textclient.log and textclientdebug.log
this.properties.setProperty("jain_gov.nist.javax.sip.TRACE_LEVEL", "32");
// this.properties.setProperty("jain_gov.nist.javax.sip.SERVER_LOG", "textclient.txt");
// this.properties.setProperty("jain_gov.nist.javax.sip.DEBUG_LOG", "textclientdebug.log");
// Create the SIP stack.
this.sipStack = this.sipFactory.createSipStack(properties);
}
catch (Exception e) {
msgProc.processError("SipLayer failed: " + e.getMessage() + "\n");
}
}
}
The same code runs OK on Java on a Windows machine, but on the Android emulator I get the above-mentioned error message.
I found that it fails in the following JAIN SIP 1.2 routine, at "SipStack sipStack = (SipStack) sipStackConstructor.newInstance(conArgs);":
private SipStack createStack(Properties properties)
throws PeerUnavailableException {
try {
// create parameters argument to identify constructor
Class[] paramTypes = new Class[1];
paramTypes[0] = Class.forName("java.util.Properties");
// get constructor of SipStack in order to instantiate
Constructor sipStackConstructor = Class.forName(
getPathName() + ".jain_javax.sip.SipStackImpl").getConstructor(
paramTypes);
// Wrap properties object in order to pass to constructor of
// SipStack
Object[] conArgs = new Object[1];
conArgs[0] = properties;
// Creates a new instance of SipStack Class with the supplied
// properties.
SipStack sipStack = (SipStack) sipStackConstructor.newInstance(conArgs);
sipStackList.add(sipStack);
String name = properties.getProperty("jain_javax.sip.STACK_NAME");
this.sipStackByName.put(name, sipStack);
return sipStack;
} catch (Exception e) {
String errmsg = "The Peer SIP Stack: "
+ getPathName()
+ ".jain_javax.sip.SipStackImpl"
+ " could not be instantiated. Ensure the Path Name has been set.";
throw new PeerUnavailableException(errmsg, e);
}
}
Any suggestion or how to debug further?
Maybe this is going to be a larger task than I had originally thought, but regardless, I'm trying to load a MavenProject from a file and then resolve its dependencies. I've got the code for both bits but I'm missing some object references that I need; specifically I need to get instances of RepositorySystemSession and RepositorySystem. Any tips?
Note: I have tagged this question with maven-plugin, but this is not a Maven plugin. I am happy to mandate Maven 3 (think I already have anyway..)
Here's the code I have so far:
Constructing the MavenProject:
public static MavenProject loadProject(File pomFile) throws Exception
{
MavenProject ret = null;
MavenXpp3Reader mavenReader = new MavenXpp3Reader();
if (pomFile != null && pomFile.exists())
{
FileReader reader = null;
try
{
reader = new FileReader(pomFile);
Model model = mavenReader.read(reader);
model.setPomFile(pomFile);
ret = new MavenProject(model);
}
finally
{
// Close reader
}
}
return ret;
}
Resolving dependencies:
public static List<Dependency> getArtifactsDependencies(MavenProject project, String dependencyType, String scope) throws Exception
{
DefaultArtifact pomArtifact = new DefaultArtifact(project.getId());
RepositorySystemSession repoSession = null; // TODO
RepositorySystem repoSystem = null; // TODO
List<RemoteRepository> remoteRepos = project.getRemoteProjectRepositories();
List<Dependency> ret = new ArrayList<Dependency>();
Dependency dependency = new Dependency(pomArtifact, scope);
CollectRequest collectRequest = new CollectRequest();
collectRequest.setRoot(dependency);
collectRequest.setRepositories(remoteRepos);
DependencyNode node = repoSystem.collectDependencies(repoSession, collectRequest).getRoot();
DependencyRequest projectDependencyRequest = new DependencyRequest(node, null);
repoSystem.resolveDependencies(repoSession, projectDependencyRequest);
PreorderNodeListGenerator nlg = new PreorderNodeListGenerator();
node.accept(nlg);
ret.addAll(nlg.getDependencies(true));
return ret;
}
I realise this might be an unusual request, maybe I should just scrap what I was trying to do and wrap it as a plugin...but I kind of just want to finish what I started now! Thanks in advance.
Try jcabi-aether, which is a wrapper around Sonatype's Aether library:
final File repo = this.session.getLocalRepository().getBasedir();
final Collection<Artifact> deps = new Aether(this.getProject(), repo).resolve(
new DefaultArtifact("junit", "junit-dep", "", "jar", "4.10"),
JavaScopes.RUNTIME
);
If you are outside of a Maven plugin:
final File repo = new File(System.getProperty("java.io.tmpdir"), "my-repo");
final MavenProject project = new MavenProject();
project.setRemoteArtifactRepositories(
Arrays.asList(
new RemoteRepository(
"maven-central",
"default",
"http://repo1.maven.org/maven2/"
)
)
);
final Collection<Artifact> deps = new Aether(project, repo).resolve(
new DefaultArtifact("junit", "junit-dep", "", "jar", "4.10"),
JavaScopes.RUNTIME
);
I would recommend reading the information about the Aether library, which exists exactly for this kind of purpose.
Note: Aether was previously developed at Sonatype, but has since moved to Eclipse.
I just whipped up a solution to both your and my problem:
/*******************************************************************************
* Copyright (c) 2013 TerraFrame, Inc. All rights reserved.
*
* This file is part of Runway SDK(tm).
*
* Runway SDK(tm) is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as
* published by the Free Software Foundation, either version 3 of the
* License, or (at your option) any later version.
*
* Runway SDK(tm) is distributed in the hope that it will be useful, but
* WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public
* License along with Runway SDK(tm). If not, see <http://www.gnu.org/licenses/>.
******************************************************************************/
package com.test.mavenaether;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.util.Arrays;
import java.util.Collection;
import java.util.Iterator;
import java.util.List;
import org.apache.maven.artifact.repository.ArtifactRepository;
import org.apache.maven.artifact.repository.ArtifactRepositoryPolicy;
import org.apache.maven.artifact.repository.MavenArtifactRepository;
import org.apache.maven.artifact.repository.layout.DefaultRepositoryLayout;
import org.apache.maven.model.Model;
import org.apache.maven.model.io.xpp3.MavenXpp3Reader;
import org.apache.maven.project.MavenProject;
import org.codehaus.plexus.util.xml.pull.XmlPullParserException;
import org.sonatype.aether.artifact.Artifact;
import org.sonatype.aether.resolution.DependencyResolutionException;
import org.sonatype.aether.util.artifact.DefaultArtifact;
import org.sonatype.aether.util.artifact.JavaScopes;
import com.jcabi.aether.Aether;
public class MavenAether
{
public static void main(String[] args) throws Exception
{
String classpath = getClasspathFromMavenProject(new File("/users/terraframe/documents/workspace/MavenSandbox/pom.xml"), new File("/users/terraframe/.m2/repository"));
System.out.println("classpath = " + classpath);
}
public static String getClasspathFromMavenProject(File projectPom, File localRepoFolder) throws DependencyResolutionException, IOException, XmlPullParserException
{
MavenProject proj = loadProject(projectPom);
proj.setRemoteArtifactRepositories(
Arrays.asList(
(ArtifactRepository) new MavenArtifactRepository(
"maven-central", "http://repo1.maven.org/maven2/", new DefaultRepositoryLayout(),
new ArtifactRepositoryPolicy(), new ArtifactRepositoryPolicy()
)
)
);
String classpath = "";
Aether aether = new Aether(proj, localRepoFolder);
List<org.apache.maven.model.Dependency> dependencies = proj.getDependencies();
Iterator<org.apache.maven.model.Dependency> it = dependencies.iterator();
while (it.hasNext()) {
org.apache.maven.model.Dependency depend = it.next();
final Collection<Artifact> deps = aether.resolve(
new DefaultArtifact(depend.getGroupId(), depend.getArtifactId(), depend.getClassifier(), depend.getType(), depend.getVersion()),
JavaScopes.RUNTIME
);
Iterator<Artifact> artIt = deps.iterator();
while (artIt.hasNext()) {
Artifact art = artIt.next();
classpath = classpath + " " + art.getFile().getAbsolutePath();
}
}
return classpath;
}
public static MavenProject loadProject(File pomFile) throws IOException, XmlPullParserException
{
MavenProject ret = null;
MavenXpp3Reader mavenReader = new MavenXpp3Reader();
if (pomFile != null && pomFile.exists())
{
FileReader reader = null;
try
{
reader = new FileReader(pomFile);
Model model = mavenReader.read(reader);
model.setPomFile(pomFile);
ret = new MavenProject(model);
}
finally
{
reader.close();
}
}
return ret;
}
}
pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.test</groupId>
<artifactId>MavenSandbox</artifactId>
<version>0.0.1-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>com.jcabi</groupId>
<artifactId>jcabi-aether</artifactId>
<version>0.7.19</version>
</dependency>
<dependency>
<groupId>org.apache.maven</groupId>
<artifactId>maven-core</artifactId>
<version>3.0.3</version>
</dependency>
</dependencies>
</project>
The code first loads the Maven project (using your function provided in the original question), then uses jcabi-aether to find the artifact in your local repository. You will need to change the two parameters in the main function: the location for the pom.xml of your project, and the location of your local repository.
Enjoy! :)
Try this (as can be seen in the aether-demo):
...
LocalRepository localRepository = new LocalRepository("/path/to/local-repo");
RepositorySystem system = getRepositorySystemInstance();
RepositorySystemSession session = getRepositorySystemSessionInstance(system, localRepository);
....
public static RepositorySystem getRepositorySystemInstance()
{
/**
* Aether's components implement org.sonatype.aether.spi.locator.Service to ease manual wiring and using the
* prepopulated DefaultServiceLocator, we only need to register the repository connector factories.
*/
MavenServiceLocator locator = new MavenServiceLocator();
locator.addService(RepositoryConnectorFactory.class, FileRepositoryConnectorFactory.class);
locator.addService(RepositoryConnectorFactory.class, WagonRepositoryConnectorFactory.class);
locator.setServices(WagonProvider.class, new ManualWagonProvider());
return locator.getService(RepositorySystem.class);
}
private static RepositorySystemSession getRepositorySystemSessionInstance(RepositorySystem system,
LocalRepository localRepo)
{
MavenRepositorySystemSession session = new MavenRepositorySystemSession();
session.setLocalRepositoryManager(system.newLocalRepositoryManager(localRepo));
session.setTransferListener(new ConsoleTransferListener());
session.setRepositoryListener(new ConsoleRepositoryListener());
// Set this in order to generate dirty trees
session.setDependencyGraphTransformer(null);
return session;
}
There is a nice set of standalone examples for Eclipse's Aether API, which is used in the latest Maven (3.1.1); it can be found here.
Note: Maven 3.1.X still uses Aether 0.9.0.M2 (while the latest version, which is used in the examples, is 0.9.0.M3). So to run these examples inside a Maven plugin, version M2 is required, whereas a standalone application can use the latest M3 version.
This is covered in "Aether/Setting Aether Up" for the RepositorySystem and in "Aether/Creating a Repository System Session" in the Eclipse wiki.
There is not a ton of documentation, but you can create a RepositorySystem as follows:
// verbatim copy from the Setting Aether Up link
private static RepositorySystem newRepositorySystem()
{
DefaultServiceLocator locator = MavenRepositorySystemUtils.newServiceLocator();
locator.addService( RepositoryConnectorFactory.class, BasicRepositoryConnectorFactory.class );
locator.addService( TransporterFactory.class, FileTransporterFactory.class );
locator.addService( TransporterFactory.class, HttpTransporterFactory.class );
return locator.getService( RepositorySystem.class );
}
Do note that this requires the dependencies detailed in "Getting Aether", most notably maven-aether-provider.
When you have your repository system the tutorial goes on to create a RepositorySystemSession with the following factory method:
// copied verbatim from "Creating a Repository System Session"
private static RepositorySystemSession newSession( RepositorySystem system )
{
DefaultRepositorySystemSession session = MavenRepositorySystemUtils.newSession();
LocalRepository localRepo = new LocalRepository( "target/local-repo" );
session.setLocalRepositoryManager( system.newLocalRepositoryManager( session, localRepo ) );
return session;
}
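To tie this back to the original question, here is a short hedged sketch of using those two factory methods to resolve a single artifact with the Eclipse Aether API. The coordinates and repository URL are just examples, and the sketch assumes newRepositorySystem() and newSession() from above live in the same class.
import org.eclipse.aether.RepositorySystem;
import org.eclipse.aether.RepositorySystemSession;
import org.eclipse.aether.artifact.DefaultArtifact;
import org.eclipse.aether.repository.RemoteRepository;
import org.eclipse.aether.resolution.ArtifactRequest;
import org.eclipse.aether.resolution.ArtifactResult;

public class ResolveOneArtifact {
    public static void main(String[] args) throws Exception {
        // Assumes the newRepositorySystem() and newSession() methods shown above
        // are defined in this class.
        RepositorySystem system = newRepositorySystem();
        RepositorySystemSession session = newSession(system);

        ArtifactRequest request = new ArtifactRequest();
        request.setArtifact(new DefaultArtifact("junit:junit:4.11")); // example coordinates
        request.addRepository(new RemoteRepository.Builder(
                "central", "default", "https://repo1.maven.org/maven2/").build());

        ArtifactResult result = system.resolveArtifact(session, request);
        System.out.println("Resolved to " + result.getArtifact().getFile());
    }
}
For a whole dependency tree rather than a single artifact, the same system and session objects can be passed to collectDependencies/resolveDependencies as in the code from the original question.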