I am creating a simple application where I want to upload a file to my AWS S3 bucket. Here is my code:
import java.io.File;
import java.io.IOException;
import com.amazonaws.AmazonServiceException;
import com.amazonaws.SdkClientException;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.auth.profile.ProfileCredentialsProvider;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.PutObjectRequest;
import com.fasterxml.jackson.*;
public class UploadFileInBucket {
public static void main(String[] args) throws IOException {
String clientRegion = "<myRegion>";
String bucketName = "<myBucketName>";
String stringObjKeyName = "testobject";
String fileObjKeyName = "testfileobject";
String fileName = "D:\\Attachments\\LICENSE";
try {
BasicAWSCredentials awsCreds = new BasicAWSCredentials("<myAccessKey>", "<mySecretKey>");
AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
.withRegion(clientRegion)
.withCredentials(new AWSStaticCredentialsProvider(awsCreds))
.build();
// Upload a text string as a new object.
s3Client.putObject(bucketName, stringObjKeyName, "Uploaded String Object");
// Upload a file as a new object with ContentType and title specified.
PutObjectRequest request = new PutObjectRequest(bucketName, fileObjKeyName, new File(fileName));
ObjectMetadata metadata = new ObjectMetadata();
metadata.setContentType("text/plain");
metadata.addUserMetadata("x-amz-meta-title", "someTitle");
request.setMetadata(metadata);
s3Client.putObject(request);
}
catch(AmazonServiceException e) {
// The call was transmitted successfully, but Amazon S3 couldn't process
// it, so it returned an error response.
e.printStackTrace();
}
catch(SdkClientException e) {
// Amazon S3 couldn't be contacted for a response, or the client
// couldn't parse the response from Amazon S3.
e.printStackTrace();
}
}
}
I am unable to upload the file and am getting the following error:
Exception in thread "main" java.lang.NoSuchFieldError:
ALLOW_FINAL_FIELDS_AS_MUTATORS
at com.amazonaws.partitions.PartitionsLoader.<clinit>(PartitionsLoader.java:52)
at com.amazonaws.regions.RegionMetadataFactory.create(RegionMetadataFactory.java:30)
at com.amazonaws.regions.RegionUtils.initialize(RegionUtils.java:64)
at com.amazonaws.regions.RegionUtils.getRegionMetadata(RegionUtils.java:52)
at com.amazonaws.regions.RegionUtils.getRegion(RegionUtils.java:105)
at com.amazonaws.client.builder.AwsClientBuilder.getRegionObject(AwsClientBuilder.java:249)
at com.amazonaws.client.builder.AwsClientBuilder.withRegion(AwsClientBuilder.java:238)
at UploadFileInBucket.main(UploadFileInBucket.java:28)
I have added the required AWS credentials, permissions, and dependencies to execute this code.
What changes should I make in the code to get my file uploaded to the desired bucket?
It looks as though you either have the wrong version of the Jackson libraries or are somehow linking with multiple versions of them.
The AWS SDK for Java distribution contains a third-party/lib directory with the correct versions of all the libraries that version of the SDK should be built with. Depending on which features of the SDK you are using you may not need all of them, but those are the specific third-party libraries you should be using.
You need a matching version of Jackson on your classpath; the version currently being loaded is missing the field the SDK expects.
I don't know exactly which version you need, but you can download the Jackson libraries from their GitHub page: https://github.com/FasterXML/jackson/
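If the project is built with Maven, one way to keep the SDK and its Jackson dependencies consistent is to import the SDK's BOM under dependencyManagement and avoid pinning Jackson versions yourself, so the SDK pulls in the Jackson versions it was built against. A sketch only; the version number here is an example, use the SDK version you actually build with:
&lt;dependencyManagement&gt;
  &lt;dependencies&gt;
    &lt;dependency&gt;
      &lt;groupId&gt;com.amazonaws&lt;/groupId&gt;
      &lt;artifactId&gt;aws-java-sdk-bom&lt;/artifactId&gt;
      &lt;version&gt;1.11.1000&lt;/version&gt;
      &lt;type&gt;pom&lt;/type&gt;
      &lt;scope&gt;import&lt;/scope&gt;
    &lt;/dependency&gt;
  &lt;/dependencies&gt;
&lt;/dependencyManagement&gt;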
Related
I am able to call AWS Textract to read an image from my local path. How can I integrate this Textract code with the S3 code below so that it reads an image uploaded to the S3 bucket I created?
Working Textract code that extracts text from images on a local path
package aws.cloud.work;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileWriter;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.io.InputStream;
import org.json.simple.JSONArray;
import org.json.simple.JSONObject;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.textract.AmazonTextract;
import com.amazonaws.services.textract.AmazonTextractClientBuilder;
import com.amazonaws.services.textract.model.DetectDocumentTextRequest;
import com.amazonaws.services.textract.model.DetectDocumentTextResult;
import com.amazonaws.services.textract.model.Document;
import com.amazonaws.util.IOUtils;
public class TextractDemo {
static AmazonTextractClientBuilder clientBuilder = AmazonTextractClientBuilder.standard()
.withRegion(Regions.US_EAST_1);
private static FileWriter file;
public static void main(String[] args) throws IOException {
//AWS Credentials to access AWS Textract services
clientBuilder.setCredentials(new AWSStaticCredentialsProvider(
new BasicAWSCredentials("Access Key", "Secret key")));
//Set the path of the image to be processed by Textract. Can be configured to read from S3
String document="C:\\Users\\image-local-path\\sampleTT.jpg";
ByteBuffer imageBytes;
//Code to use AWS Textract services
try (InputStream inputStream = new FileInputStream(new File(document))) {
imageBytes = ByteBuffer.wrap(IOUtils.toByteArray(inputStream));
}
AmazonTextract client = clientBuilder.build();
DetectDocumentTextRequest request = new DetectDocumentTextRequest()
.withDocument(new Document().withBytes(imageBytes));
/*
* DetectDocumentTextResult result = client.detectDocumentText(request);
* System.out.println(result); result.getBlocks().forEach(block ->{
* if(block.getBlockType().equals("LINE")) System.out.println("text is "+
* block.getText() + " confidence is "+ block.getConfidence());
*/
//
DetectDocumentTextResult result = client.detectDocumentText(request);
System.out.println(result);
JSONObject obj = new JSONObject();
result.getBlocks().forEach(block -> {
if (block.getBlockType().equals("LINE"))
System.out.println("text is " + block.getText() + " confidence is " + block.getConfidence());
JSONArray fields = new JSONArray();
fields.add(block.getText() + " , " + block.getConfidence());
obj.put(block.getText(), fields);
});
//Write the results to a JSON file; sample.txt mirrors the console output
try {
file = new FileWriter("/Users/output-path/sample.txt");
file.write(obj.toJSONString());
} catch (IOException e) {
e.printStackTrace();
} finally {
try {
file.flush();
file.close();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
}
This is an example of the console output, where the "text" and the corresponding "confidence scores" are returned.
This is the S3 bucket integration code I managed to find in the docs:
String document = "sampleTT.jpg";
String bucket = "textract-images";
AmazonS3 s3client = AmazonS3ClientBuilder.standard()
.withEndpointConfiguration(
new EndpointConfiguration("https://s3.amazonaws.com","us-east-1"))
.build();
// Get the document from S3
com.amazonaws.services.s3.model.S3Object s3object = s3client.getObject(bucket, document);
S3ObjectInputStream inputStream = s3object.getObjectContent();
BufferedImage image = ImageIO.read(inputStream);
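One way to join the two snippets, assuming the image is already in the bucket and the AmazonTextract client from the first listing is available, is to skip the download entirely and reference the S3 object in the Document. A sketch using the v1 Textract model classes and the bucket/document variables defined above:
// Point Textract at the S3 object directly instead of passing raw bytes.
// Reuses the `bucket` and `document` variables and the AmazonTextract `client` built above.
DetectDocumentTextRequest s3Request = new DetectDocumentTextRequest()
        .withDocument(new Document()
                .withS3Object(new com.amazonaws.services.textract.model.S3Object()
                        .withBucket(bucket)
                        .withName(document)));
DetectDocumentTextResult s3Result = client.detectDocumentText(s3Request);
s3Result.getBlocks().forEach(block -> {
    if (block.getBlockType().equals("LINE"))
        System.out.println("text is " + block.getText() + " confidence is " + block.getConfidence());
});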
(Edited) Thanks #smac2020. I currently have working Rekognition code that reads from my S3 bucket and runs the Rekognition service I am referring to. However, I am unable to modify it and merge it with the Textract source code:
package com.amazonaws.samples;
import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.rekognition.AmazonRekognition;
import com.amazonaws.services.rekognition.AmazonRekognitionClientBuilder;
import com.amazonaws.services.rekognition.model.AmazonRekognitionException;
import com.amazonaws.services.rekognition.model.DetectLabelsRequest;
import com.amazonaws.services.rekognition.model.DetectLabelsResult;
import com.amazonaws.services.rekognition.model.Image;
import com.amazonaws.services.rekognition.model.Label;
import com.amazonaws.services.rekognition.model.S3Object;
import java.util.List;
public class DetectLabels {
public static void main(String[] args) throws Exception {
String photo = "sampleTT.jpg";
String bucket = "Textract-bucket";
// AmazonRekognition rekognitionClient = AmazonRekognitionClientBuilder.standard().withRegion("ap-southeast-1").build();
AWSCredentialsProvider credentialsProvider = new AWSStaticCredentialsProvider (new BasicAWSCredentials("Access Key", "Secret Key"));
AmazonRekognition rekognitionClient = AmazonRekognitionClientBuilder.standard().withCredentials(credentialsProvider).withRegion("ap-southeast-1").build();
DetectLabelsRequest request = new DetectLabelsRequest()
.withImage(new Image()
.withS3Object(new S3Object()
.withName(photo).withBucket(bucket)))
.withMaxLabels(10)
.withMinConfidence(75F);
try {
DetectLabelsResult result = rekognitionClient.detectLabels(request);
List <Label> labels = result.getLabels();
System.out.println("Detected labels for " + photo);
for (Label label: labels) {
System.out.println(label.getName() + ": " + label.getConfidence().toString());
}
} catch(AmazonRekognitionException e) {
e.printStackTrace();
}
}
}
Looks like you are trying to read an Amazon S3 object from a Spring Boot app and then pass that byte array to DetectDocumentTextRequest.
There is a tutorial that shows a very similar use case, where a Spring Boot app reads the bytes from an Amazon S3 object and passes them to the Amazon Rekognition service (instead of Textract).
The Java code is:
// Get the byte[] from this AWS S3 object.
public byte[] getObjectBytes (String bucketName, String keyName) {
s3 = getClient();
try {
GetObjectRequest objectRequest = GetObjectRequest
.builder()
.key(keyName)
.bucket(bucketName)
.build();
ResponseBytes<GetObjectResponse> objectBytes = s3.getObjectAsBytes(objectRequest);
byte[] data = objectBytes.asByteArray();
return data;
} catch (S3Exception e) {
System.err.println(e.awsErrorDetails().errorMessage());
System.exit(1);
}
return null;
}
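From there, the returned byte array can be handed to Textract in much the same way the tutorial hands it to Rekognition. A sketch only, using the v2 TextractClient; the region and variable names are assumptions:
// Assumes the byte[] returned by getObjectBytes above.
// Uses software.amazon.awssdk.services.textract.TextractClient and its model classes,
// plus software.amazon.awssdk.core.SdkBytes and software.amazon.awssdk.regions.Region.
byte[] data = getObjectBytes(bucketName, keyName);

TextractClient textractClient = TextractClient.builder()
        .region(Region.US_EAST_1)   // assumption: use your bucket's region
        .build();

DetectDocumentTextRequest textractRequest = DetectDocumentTextRequest.builder()
        .document(Document.builder()
                .bytes(SdkBytes.fromByteArray(data))
                .build())
        .build();

DetectDocumentTextResponse textractResponse = textractClient.detectDocumentText(textractRequest);
textractResponse.blocks().forEach(block -> {
    if (block.blockType() == BlockType.LINE)
        System.out.println(block.text() + " : " + block.confidence());
});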
See this AWS development article to learn how to build a Spring Boot app with this functionality:
Creating an example AWS photo analyzer application using the AWS SDK for Java
This example uses the AWS SDK for Java V2. If you are not familiar with working with the latest SDK version, I recommend that you start here:
Get started with the AWS SDK for Java 2.x
I'm using the Google Cloud Speech-to-Text API in Java.
I'm getting 0 results when I call speechClient.recognize.
pom.xml:
<dependency>
<groupId>com.google.cloud</groupId>
<artifactId>google-cloud-speech</artifactId>
<version>0.80.0-beta</version>
</dependency>
Java code:
import java.io.FileInputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import com.google.api.gax.core.FixedCredentialsProvider;
import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.speech.v1.RecognitionAudio;
import com.google.cloud.speech.v1.RecognitionConfig;
import com.google.cloud.speech.v1.RecognitionConfig.AudioEncoding;
import com.google.cloud.speech.v1.RecognizeResponse;
import com.google.cloud.speech.v1.SpeechClient;
import com.google.cloud.speech.v1.SpeechRecognitionAlternative;
import com.google.cloud.speech.v1.SpeechRecognitionResult;
import com.google.cloud.speech.v1.SpeechSettings;
import com.google.protobuf.ByteString;
public class SpeechToText {
public static void main(String[] args) {
// Instantiates a client
try {
String jsonFilePath = System.getProperty("user.dir") + "/serviceaccount.json";
FileInputStream credentialsStream = new FileInputStream(jsonFilePath);
GoogleCredentials credentials = GoogleCredentials.fromStream(credentialsStream);
FixedCredentialsProvider credentialsProvider = FixedCredentialsProvider.create(credentials);
SpeechSettings speechSettings =
SpeechSettings.newBuilder()
.setCredentialsProvider(credentialsProvider)
.build();
SpeechClient speechClient = SpeechClient.create(speechSettings);
//SpeechClient speechClient = SpeechClient.create();
// The path to the audio file to transcribe
String fileName = System.getProperty("user.dir") + "/call-recording-790.opus";
// Reads the audio file into memory
Path path = Paths.get(fileName);
byte[] data = Files.readAllBytes(path);
ByteString audioBytes = ByteString.copyFrom(data);
System.out.println(path.toAbsolutePath());
// Builds the sync recognize request
RecognitionConfig config = RecognitionConfig.newBuilder().setEncoding(AudioEncoding.LINEAR16)
.setSampleRateHertz(8000).setLanguageCode("en-US").build();
RecognitionAudio audio = RecognitionAudio.newBuilder().setContent(audioBytes).build();
System.out.println("recognize builder");
// Performs speech recognition on the audio file
RecognizeResponse response = speechClient.recognize(config, audio);
List<SpeechRecognitionResult> results = response.getResultsList();
System.out.println(results.size()); // ***** HERE 0
for (SpeechRecognitionResult result : results) {
// There can be several alternative transcripts for a given chunk of speech.
// Just use the
// first (most likely) one here.
SpeechRecognitionAlternative alternative = result.getAlternativesList().get(0);
System.out.printf("Transcription: %s%n", alternative.getTranscript());
}
} catch (Exception e) {
System.out.println(e);
}
}
}
In the code above, results.size() is 0 (the line marked HERE). When I upload the same Opus file to the demo at https://cloud.google.com/speech-to-text/, it transcribes the text correctly.
So why is the recognize call giving zero results?
There could be 3 reasons for Speech-to-Text to return an empty response:
Audio is not clear.
Audio is not intelligible.
Audio is not using the proper encoding.
From what I can see, reason 3 is the most likely cause of your issue. To resolve this, check this page to learn how to verify the encoding of your audio file, which must match the parameters you sent in InitialRecognizeRequest.
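For an .opus file, for example, the encoding in RecognitionConfig would need to be OGG_OPUS rather than LINEAR16, with the sample rate matching the recording. A small sketch; the 8000 Hz rate is carried over from the question and is an assumption about the actual file:
RecognitionConfig config = RecognitionConfig.newBuilder()
        .setEncoding(AudioEncoding.OGG_OPUS)   // match the actual container/codec of the file
        .setSampleRateHertz(8000)              // must match the rate the audio was recorded at
        .setLanguageCode("en-US")
        .build();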
I want to move a file inside an S3 folder to another folder in the same S3 bucket. I tried the code below:
CopyObjectRequest copyObjRequest = new CopyObjectRequest(bucketName,
srcFolder+"/"+Filename, bucketName,
targetFolder+"/"+Filename);
s3Client.copyObject(copyObjRequest);
DeleteObjectRequest deleteObjRequest = new DeleteObjectRequest(bucketName,
srcFolder+"/"+Filename);
s3Client.deleteObject(deleteObjRequest);
The folder may contain multiple files, and I want to move only the selected file. The above code does not throw any error, but nothing happens. Can anyone please suggest the right solution?
It would be a good initial step just to run the following code and check what the output is, without any deletions.
It is also worth checking the ACL and bucket policy on the object.
This is the format the CopyObjectRequest constructor expects:
CopyObjectRequest(java.lang.String sourceBucketName, java.lang.String sourceKey, java.lang.String destinationBucketName, java.lang.String destinationKey)
If you want a copy of the object in the same bucket
CopyObjectRequest copyObjRequest = new CopyObjectRequest("myBucket", "myObject.txt", "myBucket", "myNewObject.txt");
s3Client.copyObject(copyObjRequest);
If you want a copy of the object in a different bucket
CopyObjectRequest copyObjRequest = new CopyObjectRequest("myBucket", "myObject.txt", "myOtherBucket", "myNewObject.txt");
s3Client.copyObject(copyObjRequest);
Sample code for testing
import java.io.IOException;
import com.amazonaws.AmazonServiceException;
import com.amazonaws.SdkClientException;
import com.amazonaws.auth.profile.ProfileCredentialsProvider;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.CopyObjectRequest;
public class CopyObjectSingleOperation {
public static void main(String[] args) throws IOException {
String clientRegion = "*** Client region ***";
String bucketName = "*** Bucket name ***";
String sourceKey = "*** Source object key *** ";
String destinationKey = "*** Destination object key ***";
try {
AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
.withCredentials(new ProfileCredentialsProvider())
.withRegion(clientRegion)
.build();
// Copy the object into a new object in the same bucket.
CopyObjectRequest copyObjRequest = new CopyObjectRequest(bucketName, sourceKey, bucketName, destinationKey);
s3Client.copyObject(copyObjRequest);
}
catch(AmazonServiceException e) {
// The call was transmitted successfully, but Amazon S3 couldn't process
// it, so it returned an error response.
e.printStackTrace();
}
catch(SdkClientException e) {
// Amazon S3 couldn't be contacted for a response, or the client
// couldn't parse the response from Amazon S3.
e.printStackTrace();
}
}
}
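Once the copy works, the "move" is simply the copy followed by a delete of the source key. A sketch along the lines of the question's code (variable names are taken from the question), with a check that the source key actually exists first, since a missing or misspelled key is a common reason for "nothing happens":
String sourceKey = srcFolder + "/" + Filename;
String destinationKey = targetFolder + "/" + Filename;

if (!s3Client.doesObjectExist(bucketName, sourceKey)) {
    System.out.println("Source key not found: " + sourceKey);
} else {
    // Copy to the target folder, then remove the original to complete the move.
    s3Client.copyObject(new CopyObjectRequest(bucketName, sourceKey, bucketName, destinationKey));
    s3Client.deleteObject(new DeleteObjectRequest(bucketName, sourceKey));
}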
I am trying to develop an AWS Lambda function that is triggered when a file shows up in a specific S3 bucket. I am trying to follow the examples from the AWS Lambda documentation, using aws-java-sdk-lambda 1.11.192 and aws-java-sdk-s3 1.11.192. But unfortunately these examples use RequestHandler, which is deprecated in the latest version of the jar.
My code is similar to this example
package example;
import java.net.URLDecoder;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.event.S3EventNotification.S3EventNotificationRecord;
public class S3GetTextBody implements RequestHandler<S3Event, String> {
public String handleRequest(S3Event s3event, Context context) {
try {
S3EventNotificationRecord record = s3event.getRecords().get(0);
// Retrieve the bucket & key for the uploaded S3 object that
// caused this Lambda function to be triggered
String bkt = record.getS3().getBucket().getName();
String key = record.getS3().getObject().getKey().replace('+', ' ');
key = URLDecoder.decode(key, "UTF-8");
// Read the source file as text
AmazonS3 s3Client = new AmazonS3Client();
String body = s3Client.getObjectAsString(bkt, key);
System.out.println("Body: " + body);
return "ok";
} catch (Exception e) {
System.err.println("Exception: " + e);
return "error";
}
}
}
The current version of the AWS SDK for Lambda doesn't contain:
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
What are my alternatives? How can I achieve similar functionality using the newer versions of their SDK?
You aren't required to implement the RequestHandler interface provided in their helper library. Any method will work provided the input and output parameters can be serialized properly.
See this article for more detail.
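As a minimal sketch of that approach, a handler can simply be a public method whose parameters Lambda can deserialize; the handler is then referenced as package.ClassName::methodName in the function configuration. The class and method names here are illustrative only:
package example;

import java.util.Map;

public class S3EventLogger {
    // Configure the Lambda handler as: example.S3EventLogger::onS3Event
    // The S3 notification JSON is deserialized into a Map without any helper library.
    public String onS3Event(Map<String, Object> event) {
        System.out.println("Event: " + event);
        return "ok";
    }
}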
If you want to use their helper library, use the following dependency coordinates:
<groupId>com.amazonaws</groupId>
<artifactId>aws-lambda-java-core</artifactId>
<version>1.1.0</version>
And for the S3 event helper:
<groupId>com.amazonaws</groupId>
<artifactId>aws-lambda-java-events</artifactId>
<version>1.3.0</version>
It's not located within their main aws-java-sdk but instead has its own repository.
I have uploaded a small file to an Amazon S3 bucket easily in Java. But when I upload a large file of about 50 MB it takes too long; I don't even get an exception, but the file is not uploaded. My code is simple:
s3client.putObject(new PutObjectRequest("dev.rivet.media.web", "all.wav",new File(file path)));
Can anyone please suggest how to overcome this problem?
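One common way to speed up and stabilise large uploads with the AWS SDK itself is TransferManager, which splits the file into parts and uploads them in parallel. A minimal sketch, assuming the existing s3client and keeping the bucket and key from the question:
import java.io.File;
import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.TransferManagerBuilder;
import com.amazonaws.services.s3.transfer.Upload;

// s3client is the same AmazonS3 client already used for the small uploads.
TransferManager tm = TransferManagerBuilder.standard()
        .withS3Client(s3client)
        .build();
Upload upload = tm.upload("dev.rivet.media.web", "all.wav", new File("file path"));
// waitForCompletion() blocks until the multipart upload finishes;
// it throws InterruptedException, so handle or declare it.
upload.waitForCompletion();
tm.shutdownNow(false);   // shut down the TransferManager but keep the underlying client open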
Alternatively, you can take a look at https://github.com/minio/minio-java
The Minio Java library provides simpler APIs to access S3-compatible storage providers.
In this library, putObject manages the file upload automatically by doing a multipart upload internally, and it resumes from where it left off as well.
Here is an example program.
import io.minio.MinioClient;
import io.minio.errors.ClientException;
import org.xmlpull.v1.XmlPullParserException;
import java.io.FileInputStream;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
public class PutObject {
public static void main(String[] args) throws IOException, XmlPullParserException, ClientException {
System.out.println("PutObject app");
// Set s3 endpoint, region is calculated automatically
MinioClient s3Client = new MinioClient("https://s3.amazonaws.com", "YOUR-ACCESSKEYID", "YOUR-SECRETACCESSKEY");
File file = new File("C:/java/hello");
InputStream stream = new FileInputStream(file);
// create object
s3Client.putObject("bucketName", "objectName", "application/octet-stream",
file.length(), stream);
}
}
Hope this helps.