I'm planning to deploy a Java 17 function to AWS Lambda, but according to the documentation, AWS doesn't provide a base image for Java 17:
https://docs.aws.amazon.com/lambda/latest/dg/lambda-java.html
So my problem is: what value should I use in the Runtime field of my CloudFormation template?
"AMIIDLookup": {
"Type": "AWS::Lambda::Function",
"Properties": {
"Handler": "index.handler",
"Role": {
"Fn::GetAtt": [
"LambdaExecutionRole",
"Arn"
]
},
"Code": {
"S3Bucket": "lambda-functions",
"S3Key": "amilookup.zip"
},
"Runtime": "Java11", # what is the alternative for this
"Timeout": 25,
"TracingConfig": {
"Mode": "Active"
}
}
}
There is no official Java 17 runtime for Lambda yet; you would have to create a custom runtime on your own.
Robert is right: either create a custom runtime or use a Docker container image to run your AWS Lambda function: https://cloud.netapp.com/blog/aws-cvo-blg-aws-lambda-images-how-to-use-container-images-to-deploy-lambda
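For the container-image route, a minimal sketch of how the same CloudFormation resource might look is below. This is an assumption-laden example, not an official template: the ImageUri value is a placeholder for an image you would build yourself (e.g. from a Java 17 base of your choice) and push to ECR, and with PackageType set to Image the Runtime and Handler properties are omitted.
"AMIIDLookup": {
    "Type": "AWS::Lambda::Function",
    "Properties": {
        "PackageType": "Image",
        "Code": {
            "ImageUri": "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-java17-fn:latest"
        },
        "Role": {
            "Fn::GetAtt": [
                "LambdaExecutionRole",
                "Arn"
            ]
        },
        "Timeout": 25
    }
}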
I have local DynamoDB running in my Java integration tests:
AmazonDynamoDBLocal embeddedDynamo = DynamoDBEmbedded.create();
AmazonDynamoDB client = embeddedDynamo.amazonDynamoDB();
DynamoDB dynamoDB = new DynamoDB(client);
Now I need to create tables based on the definitions in the SAM template.yaml file. I could copy that file onto the classpath using Gradle, parse it with some Java/Groovy tool, and then feed the parsed map into the DynamoDB API to create those tables.
But I am wondering if there is any library or tool that does it for me?
Thank you,
Lukas
You will need to create them yourself. You can do this with a JSON file and the AWS CLI's dynamodb command, for example:
my-table.json
{
    "TableName": "MyTable",
    "KeySchema": [
        { "AttributeName": "Id", "KeyType": "HASH" }
    ],
    "AttributeDefinitions": [
        { "AttributeName": "Id", "AttributeType": "S" }
    ],
    "ProvisionedThroughput": {
        "ReadCapacityUnits": 1,
        "WriteCapacityUnits": 1
    }
}
AWS CLI dynamodb create-table command example:
aws dynamodb create-table --cli-input-json file://my-table.json --endpoint-url http://localhost:8000
Notes
You can also do this without the JSON file.
More info in the docs here: https://docs.aws.amazon.com/cli/latest/reference/dynamodb/create-table.html
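If you'd rather create the table from inside the Java integration test instead of shelling out to the CLI, here is a minimal sketch against the embedded client from the question (assuming the AWS SDK v1 com.amazonaws.services.dynamodbv2 model classes are on the test classpath; table and key names mirror the JSON above):
import com.amazonaws.services.dynamodbv2.model.AttributeDefinition;
import com.amazonaws.services.dynamodbv2.model.CreateTableRequest;
import com.amazonaws.services.dynamodbv2.model.KeySchemaElement;
import com.amazonaws.services.dynamodbv2.model.KeyType;
import com.amazonaws.services.dynamodbv2.model.ProvisionedThroughput;
import com.amazonaws.services.dynamodbv2.model.ScalarAttributeType;

// "client" is the AmazonDynamoDB obtained from DynamoDBEmbedded.create() in the question
CreateTableRequest request = new CreateTableRequest()
        .withTableName("MyTable")
        .withKeySchema(new KeySchemaElement("Id", KeyType.HASH))
        .withAttributeDefinitions(new AttributeDefinition("Id", ScalarAttributeType.S))
        .withProvisionedThroughput(new ProvisionedThroughput(1L, 1L));
client.createTable(request);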
I'm currently having an issue with the Avro JsonDecoder. Avro is used in version 1.8.2. The .avsc file is defined like this:
{
    "type": "record",
    "namespace": "my.namespace",
    "name": "recordName",
    "fields": [
        {
            "name": "Code",
            "type": "string"
        },
        {
            "name": "CodeNumber",
            "type": "string",
            "default": ""
        }
    ]
}
When I now run my test cases, I get an org.apache.avro.AvroTypeException: Expected string. Got END_OBJECT. The class throwing the error is JsonDecoder.
To me it looks like the default value handling on my side might not be correct, using just "" as the default value. The error occurs only if the field is not present at all, but in my understanding that is exactly the case in which the default value should be used. If I set the value in the JSON as "CodeNumber": "", the decoder has no issues.
Any hints or ideas?
Found this:
Turns out the issue is that the default values are just ignored by the java implementations. I've added a workaround which will catch the exception and then look for a default value. Will be in release 1.9.0
Source: https://github.com/sksamuel/avro4s/issues/106
If possible, upgrade your Avro dependency to version 1.9.0.
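For reference, a minimal sketch of the decode path that triggers the exception on 1.8.2 (and that 1.9.0 should handle by falling back to the default). The file name recordName.avsc is a placeholder for wherever the schema above lives:
import java.io.File;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.JsonDecoder;

Schema schema = new Schema.Parser().parse(new File("recordName.avsc"));
// "CodeNumber" is missing entirely, which is when the default should kick in
String json = "{\"Code\": \"ABC\"}";
JsonDecoder decoder = DecoderFactory.get().jsonDecoder(schema, json);
GenericRecord record = new GenericDatumReader<GenericRecord>(schema).read(null, decoder);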
I want to execute the following query with the Java client:
{
    "suggest": {
        "my-suggestion-1": {
            "text": "sample",
            "completion": {
                "field": "suggest1",
                "size": 10
            }
        }
    }
}
I could not find the documentation for this. Please let me know the URL of the relevant document.
Also, how should I implement it? Someone, please lend me your wisdom.
Environment
Language: Java 8
Framework: Spring
Elasticsearch client (jar) versions:
org.elasticsearch: 5.1.1
org.elasticsearch.client: 5.1.1
Elasticsearch version: 5.3.1
Try something like
TermSuggestionBuilder termSuggestionBuilder = SuggestBuilders.termSuggestion("field_name").text("my suggest terms");
client.prepareSearch("my_index").setSize(0).suggest(new SuggestBuilder().addSuggestion("foo", termSuggestionBuilder)).get();
It's usually a good idea to check Elasticsearch's own tests, like CompletionSuggestSearchIT.
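Since the query in the question uses the completion suggester rather than the term suggester, a sketch against the 5.x Java API may be closer to what you want (assuming suggest1 is mapped as a completion field as in the question, client is your connected TransportClient, and my_index is a placeholder):
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.search.suggest.SuggestBuilder;
import org.elasticsearch.search.suggest.SuggestBuilders;
import org.elasticsearch.search.suggest.completion.CompletionSuggestionBuilder;

// field name, prefix text, and size mirror the JSON query above
CompletionSuggestionBuilder completionSuggestion = SuggestBuilders
        .completionSuggestion("suggest1")
        .prefix("sample")
        .size(10);
SearchResponse response = client.prepareSearch("my_index")
        .suggest(new SuggestBuilder().addSuggestion("my-suggestion-1", completionSuggestion))
        .setSize(0)
        .get();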
I have a properties file like this:
host=192.168.1.1
port=8060
host=192.168.1.2
port=8070
host=192.168.1.3
port=8080
host=192.168.1.4
port=8090
Now I want a unique URL for each host/port pair so I can pass it to another application.
Example:
HostOne : https://192.168.1.1:8060
HostTwo : https://192.168.1.2:8070
HostThree : https://192.168.1.3:8080
HostFour : https://192.168.1.4:8090
How can I get this using Java or any other library? Please help.
Thanks.
EDITED
What if I have this type of data instead?
host=192.168.1.1,8060
host=192.168.1.1,8060
host=192.168.1.1,8060
host=192.168.1.1,8060
Now is there any way to get this?
Basically, that properties file is broken. A properties file is a sequence of key/value pairs which is built into a map, so it requires the keys to be unique. I suspect that if you load this into a Properties object at the moment, you'll get just the last host/port pair.
Options:
Make this a real properties file by giving the entries unique keys (see the parsing sketch after this list), e.g.
host.1=192.168.1.1
port.1=8060
host.2=192.168.1.2
port.2=8070
...
Use a different file format (e.g. JSON)
Write your own custom parser which understands your current file format, but don't call it a "properties file", as that has a specific meaning to Java developers
Personally I'd probably go with JSON. For example, your file could be represented as:
[
{ "host": "192.168.1.1", "port": 8060 },
{ "host": "192.168.1.2", "port": 8070 },
{ "host": "192.168.1.3", "port": 8080 },
{ "host": "192.168.1.4", "port": 8090 }
]
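Whichever format you pick, the parsing is straightforward. Here is a minimal sketch for the unique-key properties variant (assuming the host.N/port.N naming from the first option and a hypothetical file called hosts.properties):
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

public class HostUrls {
    public static void main(String[] args) throws IOException {
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream("hosts.properties")) {
            props.load(in);
        }
        // Walk host.1/port.1, host.2/port.2, ... until a number is missing
        for (int i = 1; props.containsKey("host." + i); i++) {
            String url = "https://" + props.getProperty("host." + i)
                    + ":" + props.getProperty("port." + i);
            System.out.println(url); // e.g. https://192.168.1.1:8060
        }
    }
}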
I'm using Elasticsearch 1.4.3 and I'm trying to create an automated "filler" for the database.
The idea is to use this website http://beta.json-generator.com/BhxCdZ6 to generate a random set of data and push it into an Elasticsearch index.
For interfacing with Elasticsearch, I am using the Elasticsearch Java API mixed with the Elasticsearch REST API.
I managed to push one user at a time by simply copy-pasting the information, excluding the [ and ] characters, and creating a shell script that calls:
curl -XPOST 'http://localhost:9200/myindex/users/' -d '{
    "name": {
        "first": "Dickerson",
        "last": "Wood"
    }, etc...
If I copy a full block composed of 3 people and try to push the data with the same script:
curl -XPOST 'http://localhost:9200/geocon/users/' -d '[
    {
        "name": {
            "first": "Dickerson",
            "last": "Wood"
        }, etc ...
]
}'
The error returned is:
org.elasticsearch.index.mapper.MapperParsingException: Malformed content, must start with an object
How would you solve this problem? Thank you!
You are missing the closing brace wrapping the item:
[
    {
        "name": {
            "first": "Dickerson",
            "last": "Wood"
        }
    }, etc.
]
You can validate your JSON e.g. via http://jsonlint.com/.
Also, try taking a look at http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/docs-bulk.html
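Also note that even syntactically valid JSON arrays can't be POSTed to a document endpoint, since Elasticsearch expects one object per index request; to load several people at once, use the bulk API. Here is a sketch with the 1.x Java API (assuming client is an already-connected Client and people is a list holding one JSON string per generated person):
import org.elasticsearch.action.bulk.BulkRequestBuilder;
import org.elasticsearch.action.bulk.BulkResponse;

// one index request per generated person; index/type names as in the question
BulkRequestBuilder bulk = client.prepareBulk();
for (String personJson : people) {
    bulk.add(client.prepareIndex("geocon", "users").setSource(personJson));
}
BulkResponse response = bulk.execute().actionGet();
if (response.hasFailures()) {
    System.err.println(response.buildFailureMessage());
}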