Karate DSL: Convert String to Json and insert value? [duplicate] - java

I have a JSON file named production_2.json:
[
{
"v":{
"id":"rep_01564526",
"name":"tuttoverde.pgmx",
"type":"PRODUCTION_STARTED",
"ute":"CDL",
"ver":"1.0",
"status":"EXE"
},
"ts":"2020-11-19T08:00:00Z"
},
{
"v":{
"id":"rep_01564526",
"name":"tuttoverde.pgmx",
"type":"PRODUCTION_ENDED",
"ute":"CDL",
"ver":"1.0",
"status":"EXE"
},
"ts":"2020-11-19T17:00:00Z"
}
]
And I have the following Karate code that reads the file production_2.json and, for each element of the array, sends a message to a topic:
* def sendtopics =
"""
function(i){
var topic = "data." + machineID + ".Production";
var payload = productions[i];
karate.log('topic: ', topic )
karate.log('payload: ', payload )
return mqtt.sendMessage(payload, topic);
}
"""
* def productions = read('this:productions_json/production_2.json')
* def totalProductionEvents = productions.length
* def isTopicWasSent = karate.repeat(totalProductionEvents, sendtopics)
* match each isTopicWasSent == true
The function
mqtt.sendMessage(payload, topic);
is a Java method with the following signature:
public Boolean sendMessage(String payload, String topic) {
System.out.println("Publishing message: ");
System.out.println("payload " + payload);
System.out.println("topic " + topic);
return true;
}
The problem is that the value of the payload inside the JavaScript function is correct and is printed correctly, while inside the sendMessage function the value of the payload is formatted incorrectly.
For example, here is what karate.log('payload: ', payload) prints:
payload: {
"v": {
"id": "rep_01564526",
"name": "tuttoverde.pgmx",
"type": "PRODUCTION_STARTED",
"ute": "CDL",
"ver": "1.0",
"status": "EXE"
},
"ts": "2021-01-08T08:00:00Z"
}
And here instead is what is printed in the sendMessage function of the Java class:
payload {v={id=rep_01564526, name=tuttoverde.pgmx, type=PRODUCTION_STARTED, ute=CDL, ver=1.0, status=EXE, ts=2021-01-08T08:00:00Z}
I don't understand why the payload is formatted incorrectly (= instead of :) and why it is not a string. I also tried the following solution, and it doesn't work for me:
* def sendtopics =
"""
function(i){
var topic = "data." + machineID + ".Production";
var payload = productions[i];
var payload2 = JSON.stringify(payload);
return mqtt.sendMessage(payload2, topic);
}
"""
How do I convert an object inside JavaScript to a string so I can pass it to Java?

You are doing some really advanced stuff in Karate. I strongly suggest you start looking at the new version (close to release); you can find details here: https://github.com/intuit/karate/wiki/1.0-upgrade-guide
The reason is that the async and Java interop has some breaking changes, and there are new APIs you can call on the karate object in JS to do format conversions:
var temp = karate.fromString(payload);
And karate.log() should work better and not give you that odd formatting you are complaining about. (The = instead of : is just java.util.Map.toString(): Karate hands a JS object over to Java as a Map, not as a JSON string.) With the current version you can try karate.toJson() to see if that gives you the conversion you expect.
Given your advanced usage, I recommend you start using the new version and provide feedback on anything that may still be needed.
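If upgrading is not an option, a possible workaround on the Java side (just a sketch, assuming Jackson is on the classpath; any JSON library would do) is to accept the payload as the Map that Karate actually hands over and serialize it yourself before publishing:
import java.util.Map;
import com.fasterxml.jackson.databind.ObjectMapper;

public Boolean sendMessage(Map<String, Object> payload, String topic) {
    try {
        // Karate passes JS objects to Java as java.util.Map; re-serialize to JSON here.
        String json = new ObjectMapper().writeValueAsString(payload);
        System.out.println("Publishing message: ");
        System.out.println("payload " + json);
        System.out.println("topic " + topic);
        return true;
    } catch (Exception e) {
        return false;
    }
}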

Related

Python-processed Avro-formatted data sent through Apache Kafka does not yield the same output when deserialized in an Apache Camel/Java processor

I am running a Kafka broker that I push messages to via a Python program. For efficient data exchange I use the Apache Avro format. At the Kafka broker, the message is picked up by a Camel route with a processor. In this processor I want to deserialize the message and finally push the data to an InfluxDB.
The process mechanics work, but in the Camel route I do not get out the data I put in. On the Python side
I create a dictionary:
testDict = dict()
testDict['name'] = 'avroTest'
testDict['double_one'] = 1.2345
testDict['double_two'] = 1.23
testDict['double_three'] = 2.345
testDict['time_stamp'] = int(time.time() * 1000000000)  # epoch time in nanoseconds (int; Python 3 has no long)
The corresponding Avro schema on Python side looks like this:
{
"namespace": "my.namespace",
"name": "myRecord",
"type": "record",
"fields": [
{"name": "name", "type": "string"},
{"name": "double_one", "type": "double"},
{"name": "double_two", "type": "double"},
{"name": "double_three", "type": "double"},
{"name": "time_stamp", "type": "long"}
]
}
The Python code for sending the Avro-formatted message to Kafka looks like this:
def sendAvroFormattedMessage(self, dataDict: dict, topic_id: str, schemaDefinition: str) \
-> FutureRecordMetadata:
"""
Method for sending message to kafka broker in the avro binary format
:param dataDict: data dictionary containing message data
:param topic_id: the Kafka topic to send message to
:param schemaDefinition: JSON schema definition
:return: FutureRecordMetadata
"""
schema = avro.schema.parse(schemaDefinition)
writer = avro.io.DatumWriter(schema)
bytes_stream = io.BytesIO()
encoder = avro.io.BinaryEncoder(bytes_stream)
writer.write(dataDict, encoder)
raw_bytes = bytes_stream.getvalue()
messageBrokerWriterConnection = KafkaProducer(bootstrap_servers=<connectionUrl>, client_id='testLogger')
result = messageBrokerWriterConnection.send(topic=topic_id, value=raw_bytes, key='AVRO_FORMAT'.encode('UTF-8'))
return result
The message arrives as expected at the broker, is picked up by Camel and processed by the following Java code:
from(kafkaEndpoint) //
.process(exchange -> {
Long kafkaInboundTime = Long
.parseLong(exchange.getIn().getHeader("kafka.TIMESTAMP").toString());
if (exchange.getIn().getHeader("kafka.KEY") != null) {
BinaryDecoder decoder = DecoderFactory.get()
.binaryDecoder(exchange.getIn().getBody(InputStream.class), null);
SpecificDatumReader<Record> datumReader = new SpecificDatumReader<>(avroSchema);
System.out.println(datumReader.read(null, decoder).toString());
}
}) //
.to(influxdbEndpoint);
With avroSchema currently hard coded in the constructor of my class as follows:
avroSchema = SchemaBuilder.record("myRecord") //
.namespace("my.namespace") //
.fields() //
.requiredString("name") //
.requiredDouble("double_one") //
.requiredDouble("double_two") //
.requiredDouble("double_three") //
.requiredLong("time_stamp") //
.endRecord();
The output of System.out.println is
{"name": "avroTest", "double_one": 6.803527358993313E-220, "double_two": -0.9919128115125185, "double_three": -0.9775074719163893, "time_stamp": 20}
Obviously, something goes wrong, but I don't know what. Any help appreciated.
Update 1
As the Python code is running on an Intel/Window machine, Kafka (In a VM) and the Java code on Linux machines with unknown architecture, could this effect be caused by different endian-ness of the systems?
Update 1.1 Endian-ness can be excluded. Checked on both sides, both were 'little'
Update 2
As a check I changed the schema definition to string type for all fields. With this definition, values and keys are transferred correctly - Python input and Java/Camel output are the same.
Update 3
The Camel route's Kafka endpoint does not have any special options such as deserializers, etc.:
"kafka:myTopicName?brokers=host:9092&clientId=myClientID&autoOffsetReset=earliest"
I found a solution to my problem. The following Python code produces the desired output into Kafka:
def sendAvroFormattedMessage(self, dataDict: dict, topic_id: MessageBrokerQueue, schemaDefinition: str) \
-> FutureRecordMetadata:
"""
Method for sending message to kafka broker in the avro binary format
:param dataDict: data dictionary containing message data
:param topic_id: the Kafka topic to send message to
:param schemaDefinition: JSON schema definition
:return: None
"""
schema = avro.schema.parse(schemaDefinition)
bytes_writer = io.BytesIO()
encoder = BinaryEncoder(bytes_writer)
writer = DatumWriter(schema)
writer.write(dataDict, encoder)
raw_bytes = bytes_writer.getvalue()
self._messageBrokerWriterConnection = KafkaProducer(bootstrap_servers=self._connectionUrl)
try:
# NOTE: I use the 'AVRO' key to separate avro formatted messages from others
result = self._messageBrokerWriterConnection.send(topic=topic_id, value=raw_bytes, key='AVRO'.encode('UTF-8'))
except Exception as err:
print(err)
self._messageBrokerWriterConnection.flush()
Key to the solution was adding valueDeserializer=... to the endpoint definition on the Apache Camel side:
import org.apache.kafka.common.serialization.ByteArrayDeserializer;
...
TEST_QUEUE("kafka:topic_id?brokers=host:port&clientId=whatever&valueDeserializer=" + ByteArrayDeserializer.class.getName());
The Apache camel route code including the conversion to InfluxDB point can then be written like this:
@Component
public class Route_TEST_QUEUE extends RouteBuilder {
Schema avroSchema = null;
private Route_TEST_QUEUE() {
avroSchema = SchemaBuilder //
.record("ElectronCoolerCryoMessage") //
.namespace("de.gsi.fcc.applications.data.loggers.avro.messages") //
.fields() //
.requiredString("name") //
.requiredDouble("double_one") //
.requiredDouble("double_two") //
.requiredDouble("double_three") //
.requiredLong("time_stamp") //
.endRecord();
}
private String fromEndpoint = TEST_QUEUE.definitionString();
@Override
public void configure() throws Exception {
from(fromEndpoint) //
.process(messagePayload -> {
byte[] data = messagePayload.getIn().getBody(byte[].class);
BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(data, null);
SpecificDatumReader<GenericRecord> datumReader = new SpecificDatumReader<>(avroSchema);
GenericRecord record = datumReader.read(null, decoder);
try {
Point.Builder influxPoint = Point
.measurement(record.get("name").toString());
long acqStamp = 0L;
if (record.hasField("time_stamp") && (long) record.get("time_stamp") > 0L) {
acqStamp = (long) record.get("time_stamp");
} else {
acqStamp = Long.parseLong(messagePayload.getIn().getHeader("kafka.TIMESTAMP").toString());
}
influxPoint.time(acqStamp, TimeUnit.NANOSECONDS);
Map<String, Object> fieldMap = new HashMap<>();
avroSchema.getFields().stream() //
.filter(field -> !field.name().equals("keyFieldname")) //
.forEach(field -> {
Object value = record.get(field.name());
fieldMap.put(field.name().toString(), value);
});
influxPoint.fields(fieldMap);
} catch (Exception e) {
MessageLogger.logError(e);
}
}) //
.to(...InfluxEndpoint...) //
.onException(Exception.class) //
.useOriginalMessage() //
.handled(true) //
.to("stream:out");
}
}
This works for my purposes: no Confluent, only Kafka.
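A note on the root cause (my interpretation, not from the original post): without an explicit valueDeserializer, camel-kafka falls back to the String deserializer, and decoding arbitrary Avro bytes as UTF-8 is lossy, which would explain why the doubles came out garbled while an all-string schema survived intact. A minimal sketch of the effect:
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class LossyDecodeDemo {
    public static void main(String[] args) {
        // Raw bytes like these from an Avro-encoded double are not valid UTF-8.
        byte[] raw = {(byte) 0x9A, (byte) 0x99, (byte) 0x99, (byte) 0x99,
                      (byte) 0x99, (byte) 0x99, (byte) 0xF1, 0x3F};
        // A String deserializer effectively does this decode/encode round trip:
        byte[] roundTripped = new String(raw, StandardCharsets.UTF_8)
                .getBytes(StandardCharsets.UTF_8);
        // Invalid sequences get replaced with U+FFFD, so the payload is corrupted.
        System.out.println(Arrays.equals(raw, roundTripped)); // prints false
    }
}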

Unable to pass Array from Angular 2 typescript to Spring Java

I am trying to pass a string array from my TypeScript:
tmp : Array<string> = [];
So I have a function which takes in this array as a parameter input
passValues(test : Array<string>) {
........
// some method to call post method from service
}
So in service
public passingOfValues( test : Array<string> ) : Observable<Array<string>> {
let headers = new Headers({ 'Content-Type': 'application/json'} );
let options = new RequestOptions({ headers: headers });
let response = this.http.post(this.basePath + this.modulePath + '/getArrayValue', {'test' : test }, options)
.map(this.extractData)
.catch(this.handleError);
return response;
}
But I am getting errors such as "System property [org.owasp.esapi.devteam] is not set".
And I read in other posts that I have to stringify the array before passing it to the backend.
Is there a reason why I need to stringify, or can I just pass the raw array?
EDIT 1 :
including backend controller codes
public ResponseEntity<?> getArrayValues( ArrayList<String> test ) {
logger.debug("### Test if array has a size ###" + test.size());
}
Apparently size already shows 0 from here.
EDIT 2 :
While debugging, I realised that the SQL at the back end is receiving,
say,
HOME CHARACTER(20 OCTETS)
Does this make any difference?
Like passing a string into octets, or do I have to do some conversion?
Sorry if I have a lot of questions; I am also working hard on debugging and learning more about it!
Most developers prefer JSON data in the request, and it's good practice in RESTful APIs. Why?
The JSON format is {key1: value1, key2: value2, ...}
You are passing
this.http.post(this.basePath + this.modulePath + '/getArrayValue',{'test' : YOUR_ACTUAL_ARRAY})
from the front-end. The httpClient.post(url, body, options?) signature has url and body as mandatory. How can you get it in the back-end? Since you have the body only:
public ResponseEntity<?> getArrayValues(@RequestBody List<String> test) {
// codes
}
The key of the JSON passed from the front-end (test) and the name Spring binds to in the back-end should be the same. (@RequestBody itself takes no key name; if the names differ, map the key explicitly, for example with Jackson's @JsonProperty, or wrap the payload in a DTO as shown below.)
As you asked in a comment, you may have two key-value pairs, e.g. { "test": value1, "tmp": value2 }. Assume value1 and value2 are both string arrays.
this.http.post(this.basePath + this.modulePath + '/getArrayValue',{'myJson' : YOUR_JSON})
There are lots of ways to bind this (e.g. Gson, ObjectMapper, etc.). I use another way.
Create a class called TestTmpConverter
class TestTmpConverter {
    private List<String> test;
    private List<String> tmp;

    public TestTmpConverter() {} // no-argument constructor, needed by Jackson

    public List<String> getTest() { return test; }
    public List<String> getTmp() { return tmp; }
}
In controller
public ResponseEntity<?> getArrayValues(@RequestBody TestTmpConverter myJson) {
List<String> test = myJson.getTest();
List<String> tmp = myJson.getTmp();
// Do your work
}
I only showed one way. There are a lot of ways to pass data to the back-end, like @RequestParam, @PathVariable, etc. I feel you now have a sense of how you can pass the data.
For your client, put your data directly in the POST body:
public passingOfValues( test : Array<string> ) : Observable<Array<string>> {
let headers = new Headers({ 'Content-Type': 'application/json'} );
let options = new RequestOptions({ headers: headers });
let response = this.http.post(this.basePath + this.modulePath + '/getArrayValue',
test, options)
.map(this.extractData)
.catch(this.handleError);
return response;
}
On your REST service use the @RequestBody annotation:
public ResponseEntity<?> getArrayValues(@RequestBody String[] test) {
logger.debug("### Test if array has a size ### " + test.length);
}
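For completeness, a minimal self-contained sketch of the receiving side (the class name, mapping, and echoed return value here are illustrative assumptions, not from the original post):
import java.util.List;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ArrayController {

    // Binds a raw JSON array such as ["a","b"] posted by the Angular client.
    @PostMapping("/getArrayValue")
    public ResponseEntity<List<String>> getArrayValues(@RequestBody List<String> test) {
        return ResponseEntity.ok(test); // echo back, just to prove binding works
    }
}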

Iteration through json with multiple API calls for other requests

I am using Postman to iterate through a JSON array of about 40 pairs of items. I then need to take that array and run an API call for each element in it to return a set of results. Using the code here, I'm only able to pull the final element in the array. I attempted to put postman.setNextRequest in the for loop, but then I found out that no matter where it is, it always executes last.
tests["Status code is 200 (that's good!)"] = (responseCode.code === 200);
if (responseCode.code === 200) {
var jsonData = pm.response.json();
var json = [];
postman.setEnvironmentVariable("json", jsonData)
postman.setNextRequest('GetAdmins');
for (var key in jsonData ) {
if (jsonData.hasOwnProperty(key)) {
postman.setEnvironmentVariable("organizationId", jsonData[key].id)
postman.setEnvironmentVariable("orgname", jsonData[key].name)
tests[jsonData[key].name + " " + jsonData[key].id] = !!jsonData[key].name;
}
}
}
else {
postman.setNextRequest(null);
}
GetAdmins is another GET that uses {{organizationId}} in the call.
I think what i'm looking for is; what is the best way to go about running another API call on each element in the json?
Thanks in advance!
EDIT: Adding JSON output
[
{
"id": XXXXXX,
"name": "Name1"
},
{
"id": XXXXXX,
"name": "Name2"
},
{
"id": XXXXXX,
"name": "Name3"
}
]
This might work to get the data - I’ve not tried it out yet though so it might not work first time.
var jsonData = pm.response.json()
var data = _.map(jsonData, item => ({
    organizationId: item.id,
    orgName: item.name
}))
pm.environment.set('organizationData', JSON.stringify(data))
Then you have all of your organization data in a variable and you can use these to iterate over the Id’s in the next "Get Admins" request.
You would need to have some code in the Pre-request Script of the next request to access each of the IDs to iterate over in the request. You need to parse the variable like this:
var orgID = JSON.parse(pm.environment.get("organizationData"))
Then orgID[0].organizationId would be the first one in the list.
Not a complete solution for your problem but it might help you get the data.
I was able to solve this using these two guides:
Loops and dynamic variables in Postman: part 1
Loops and dynamic variables in Postman: part 2
I also had to implement the bigint fix for Java, but in Postman, which was very annoying... that can be found here:
Hacking bigint in API testing with Postman Runner Newman in CI Environment
Gist
A lot of Google plus trial and error got me up and running.
Thanks anyway for all your help, everyone!
This ended up being my final code:
GetOrgs
tests["Status code is 200 (that's good!)"] = (responseCode.code === 200);
eval(postman.getGlobalVariable("bigint_fix"));
var jsonData = JSON.parse(responseBody);
var id_list = [];
jsonData.forEach(function(list) {
var testTitle = "Org: " + list.name + " has id: " + JSON.stringify(list.id);
id_list.push(list.id);
tests[testTitle] = !!list.id;
});
postman.setEnvironmentVariable("organizationId",JSON.stringify(id_list.shift()));
postman.setEnvironmentVariable("id_list", JSON.stringify(id_list));
postman.setNextRequest("GetAdmins");
GetAdmins
eval(postman.getGlobalVariable("bigint_fix"));
var jsonData = JSON.parse(responseBody);
jsonData.forEach(function(admin) {
var testTitle = "Admin: " + admin.name + " has " + admin.orgAccess;
tests[testTitle] = !!admin.name;
});
var id_list = JSON.parse(environment.id_list);
if (id_list.length > 0) {
postman.setEnvironmentVariable("organizationId", JSON.stringify(id_list.shift());
postman.setEnvironmentVariable("id_list", JSON.stringify(id_list));
postman.setNextRequest("GetAdmins");
}
else {
postman.clearEnvironmentVariable("organizationId");
postman.clearEnvironmentVariable("id_list");
}

Create Release Notes using JIRA Rest API in HTML format in groovy

I am working on a script where I need to create release notes in HTML format using the JIRA REST API, for any project. The four fields below should appear in the release notes:
Issue Key, Module, Summary, Release Note
I am trying the code below, but it gives me only the issue key field; I need all the other fields as well, and in an HTML file. Could you please advise?
Issue 1:
Initially it was giving me the output in the format below:
NTTB-2141
NTTB-2144
NTTB-2140
But now it gives me the output as raw JSON.
The code I am trying from the Groovy console:
@Grab(group='org.codehaus.groovy.modules.http-builder', module='http-builder', version='0.7')
import groovyx.net.http.RESTClient
final String USAGE =
"Usage: -Djira.username=xxx -Djira.password=xxx -Djira.fixVersion=1.0"
String jiraUsername = 'ABCDEF'
String jiraPassword = '********'
String jiraFixVersion = '3.8.101'
println "Getting issues..."
if (!jiraUsername?.trim()) {
fail("Empty property: jira.username " + USAGE)
}
if (!jiraPassword?.trim()) {
fail("Empty property: jira.password " + USAGE)
}
if (!jiraFixVersion?.trim()) {
fail("Empty property: jira.fixVersion " + USAGE)
}
final String JIRA_SEARCH_URL = "https://jira.test.com/rest/api/latest/"
// see JIRA docs about search:
// https://docs.atlassian.com/jira/REST/latest/#idp1389824
String JQL = "project = NCCB"
JQL += " AND issuetype in standardIssueTypes()"
JQL += " AND status in (Resolved, Closed)"
JQL += " AND fixVersion = \"${jiraFixVersion}\""
def jira = new RESTClient(JIRA_SEARCH_URL)
def query = [:]
query['os_username'] = jiraUsername
query['os_password'] = jiraPassword
query['jql'] = JQL
query['startAt'] = 0
query['maxResults'] = 1000
try {
def resp = jira.get(path: "search",
contentType: "application/json",
query: query)
assert resp.status == 200
assert (resp.data instanceof net.sf.json.JSON)
resp.data.issues.each { issue ->
println issue.key
}
println "Total issues: " + resp.data.total
} catch (groovyx.net.http.HttpResponseException e) {
if (e.response.status == 400) {
// HTTP 400: Bad Request, JIRA JQL error
fail("JIRA query failed: ${e.response.data}", e)
} else {
fail("Failure HTTP status ${e.response.status}", e)
}
}
I suspect the code is right, except for that assertion:
assert (resp.data instanceof net.sf.json.JSON)
I get a groovy.json.internal.LazyMap (maybe you have changed versions of Groovy or Jira or something).
As a result, the assertion fails and Groovy tries to be helpful by giving you a comparison... but it shows you the toString() of the result, which is a huge mess of maps.
If you remove that assertion, it works for me, and I suspect it will work for you too.
Edit: huh... you cannot literally take "all" the data and print it to HTML. You will have to select the properties you need, and those depend on your Jira configuration. Here is an example with only 2 properties that should be universal:
def resp = jira.get(path: "search",
contentType: "application/json",
query: query)
assert resp.status == 200
def output = new File('issues.html')
output << "<html><body><ul>"
resp.data.issues.each { issue ->
def url = "https://yourjirainstance/browse/${issue.key}"
output << "<li><a href='${url}'>${issue.key}</a>: ${issue.fields.summary}</li>"
}
output << "</ul></body></html>"
println "Exported ${resp.data.total} issues to ${output.name}"
See here for details about what the service will give you.
If you just want an HTML dump, maybe the REST API is not what you want: you can also ask Jira to export the results of a JQL query as printable output (which will actually be HTML).
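For the full four-column table the question asks for, here is a rough self-contained Java 11 sketch of the same approach (hedged: the customfield_* IDs for "Module" and "Release Note" are hypothetical placeholders that depend entirely on your Jira configuration, Basic auth replaces the os_username query parameters, and Jackson is assumed for JSON parsing):
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class ReleaseNotes {
    public static void main(String[] args) throws Exception {
        String jql = "project = NCCB AND status in (Resolved, Closed) AND fixVersion = \"3.8.101\"";
        // customfield_10100 / customfield_10200 are placeholders for the "Module"
        // and "Release Note" custom fields; look up the real IDs in your instance.
        String url = "https://jira.test.com/rest/api/latest/search?jql="
                + URLEncoder.encode(jql, StandardCharsets.UTF_8)
                + "&fields=summary,customfield_10100,customfield_10200&maxResults=1000";
        String auth = Base64.getEncoder()
                .encodeToString("ABCDEF:password".getBytes(StandardCharsets.UTF_8));

        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("Authorization", "Basic " + auth)
                .header("Accept", "application/json")
                .build();
        String body = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString()).body();
        JsonNode root = new ObjectMapper().readTree(body);

        StringBuilder html = new StringBuilder("<html><body><table border='1'>")
                .append("<tr><th>Issue Key</th><th>Module</th><th>Summary</th><th>Release Note</th></tr>");
        for (JsonNode issue : root.get("issues")) {
            JsonNode f = issue.get("fields");
            html.append("<tr><td>").append(issue.get("key").asText())
                .append("</td><td>").append(f.path("customfield_10100").asText(""))
                .append("</td><td>").append(f.path("summary").asText(""))
                .append("</td><td>").append(f.path("customfield_10200").asText(""))
                .append("</td></tr>");
        }
        html.append("</table></body></html>");
        Files.write(Paths.get("release-notes.html"),
                html.toString().getBytes(StandardCharsets.UTF_8));
    }
}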

Plain string template query for elasticsearch through java API?

I have a template foo.mustache saved in {{ES_HOME}}/config/scripts.
POST to http://localhost:9200/forward/_search/template with the following message body returns a valid response:
{
"template": {
"file": "foo"
},
"params": {
"q": "a",
"hasfilters": false
}
}
I want to translate this to the Java API now that I've validated that all the different components work. The documentation here describes how to do it in Java:
SearchResponse sr = client.prepareSearch("forward")
.setTemplateName("foo")
.setTemplateType(ScriptService.ScriptType.FILE)
.setTemplateParams(template_params)
.get();
However, I would instead like to just send a plain string query (i.e. the contents of the message body from above) rather than building it up in Java. Is there a way to do this? I know that with normal queries I can construct it like so:
SearchRequestBuilder response = client.prepareSearch("forward")
.setQuery("""JSON_QUERY_HERE""")
I believe the setQuery() method wraps the contents in a query object, which is not what I want for my template query. If this is not possible, I will just have to go with the documented way and convert my JSON params to Map<String, Object>.
I ended up just translating my template_params to a Map<String, Object> as the documentation requires. I utilized Groovy's JsonSlurper to convert the text to an object with a pretty simple method.
import groovy.json.JsonSlurper
public static Map<String,Object> convertJsonToTemplateParam(String s) {
Object result = new JsonSlurper().parseText(s);
//Manipulate your result if you need to do any additional work here.
//I.e. Programmatically determine value of hasfilters if filters != null
return (Map<String,Object>) result;
}
And you could pass in the following as a string to this method:
{
"q": "a",
"hasfilters": true
"filters":[
{
"filter_name" : "foo.untouched",
"filters" : [ "FOO", "BAR"]
},
{
"filter_name" : "hello.untouched",
"list" : [ "WORLD"]
}
]
}
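If you would rather stay in plain Java than pull in Groovy, Jackson (assuming it is on your classpath, as it commonly is alongside the Elasticsearch client) gives the same conversion:
import java.util.Map;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

public static Map<String, Object> convertJsonToTemplateParam(String s) throws Exception {
    // Jackson deserializes a JSON object into a LinkedHashMap<String, Object>,
    // which satisfies the Map<String, Object> the template API expects.
    return new ObjectMapper().readValue(s, new TypeReference<Map<String, Object>>() {});
}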
