Pass array as value of annotation param in JavaPoet - java

Using JavaPoet I'm trying to annotate a class with an annotation that takes an array as a parameter value, i.e.
@MyCustom(param = { Bar.class, Another.class })
class Foo {
}
I use AnnotationSpec.builder and its addMember() method:
List<TypeMirror> moduleTypes = new ArrayList<>(map.keySet());
AnnotationSpec annotationSpec = AnnotationSpec.builder(MyCustom.class)
    .addMember("param", "{ $T[] } ", moduleTypes.toArray())
    .build();
builder.addAnnotation(annotationSpec);

CodeBlock has a joining collector, so you can stream the types and join them into a single block, doing something like the following (if, for instance, these were enum constants). It works for any types; just the map step would change.
AnnotationSpec.builder(MyCustom.class)
    .addMember(
        "param",
        "$L",
        moduleTypes.stream()
            .map(type -> CodeBlock.of("$T.$L", MyCustom.class, type))
            .collect(CodeBlock.joining(",", "{", "}")))
    .build()
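The joining shape that CodeBlock.joining produces (a delimiter plus `{`/`}` wrappers) can be sketched with the stdlib Collectors.joining; the type names below are plain strings standing in for the $T placeholders JavaPoet would emit:

```java
import java.util.List;
import java.util.stream.Collectors;

public class JoiningDemo {
    public static void main(String[] args) {
        // Stand-ins for the mirrored types; with JavaPoet each entry would be
        // CodeBlock.of("$T.class", type) so imports get handled for you.
        List<String> typeNames = List.of("Bar", "Another");
        String member = typeNames.stream()
                .map(name -> name + ".class")
                .collect(Collectors.joining(", ", "{ ", " }"));
        System.out.println(member); // prints: { Bar.class, Another.class }
    }
}
```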

Maybe not an optimal solution, but passing an array to an annotation in JavaPoet can be done the following way:
List<TypeMirror> moduleTypes = new ArrayList<>(map.keySet());
CodeBlock.Builder codeBuilder = CodeBlock.builder();
boolean arrayStart = true;
codeBuilder.add("{ ");
for (TypeMirror modType : moduleTypes) {
    if (!arrayStart) {
        codeBuilder.add(" , ");
    }
    arrayStart = false;
    codeBuilder.add("$T.class", modType);
}
codeBuilder.add(" }");
AnnotationSpec annotationSpec = AnnotationSpec.builder(MyCustom.class)
    .addMember("param", codeBuilder.build())
    .build();
builder.addAnnotation(annotationSpec);

Related

Merge two JSON into one JSON using Java and validate in karate feature file

Json1: {"key1":"one","key2":"two"}
Json2: {"FN":"AB","LN":"XY"}
I wish to have Json3: {"key1":"one","key2":"two","FN":"AB","LN":"XY"}
I have used the code below, but it does not work:
JSONObject mergedJSON = new JSONObject();
try {
    mergedJSON = new JSONObject(json1, JSONObject.getNames(json1));
    for (String key : JSONObject.getNames(json2)) {
        mergedJSON.put(key, json2.get(key));
    }
} catch (JSONException e) {
    throw new RuntimeException("JSON Exception" + e);
}
return mergedJSON;
}
* call defaultCOM {ID: "COM-123"}
* def defaultResponse = response.data.default
* def jMap = mergeJSON.toMap(defaultResponse)
Here the error occurs: (language: Java, type: com.intuit.karate.graal.JsMap) to Java type 'org.json.JSONObject': Unsupported target type
All I'll say is that the recommended way to merge two JSONs is given in the documentation: https://github.com/karatelabs/karate#json-transforms
* def foo = { a: 1 }
* def bar = karate.merge(foo, { b: 2 })
* match bar == { a: 1, b: 2 }
I'll also say that when you use custom Java code, you should stick to using Map or List: https://github.com/karatelabs/karate#calling-java
And if you use things like JSONObject (whatever that is), you are on your own - and please consider that not supported by Karate.
When you have to mix Java and the Karate-style JS code (and this is something you should try to avoid as far as possible) you need to be aware of some caveats: https://github.com/karatelabs/karate/wiki/1.0-upgrade-guide#js-to-java
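Since Karate hands Java code plain Maps for JSON, the merge from the question can be done on Maps directly; a minimal sketch (the MergeMaps helper is illustrative, not a Karate API):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class MergeMaps {
    // Merge b into a copy of a; keys from b win on collision.
    public static Map<String, Object> merge(Map<String, Object> a, Map<String, Object> b) {
        Map<String, Object> merged = new LinkedHashMap<>(a);
        merged.putAll(b);
        return merged;
    }

    public static void main(String[] args) {
        Map<String, Object> json1 = Map.of("key1", "one", "key2", "two");
        Map<String, Object> json2 = Map.of("FN", "AB", "LN", "XY");
        System.out.println(merge(json1, json2)); // all four keys, plain values (no arrays)
    }
}
```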
Well, if you just don't care about key collisions, this should work:
String jsons01 = "{\"key1\" :\"one\",\"key2\":\"two\"}";
String jsons02 = "{\"FN\": \"AB\",\"LN\":\"XY\"}";
JSONObject jsono01 = new JSONObject(jsons01);
JSONObject jsono02 = new JSONObject(jsons02);
JSONObject merged = new JSONObject(jsono01, JSONObject.getNames(jsono01));
for (String key : JSONObject.getNames(jsono02)) {
    merged.append(key, jsono02.get(key)); // note: append wraps the appended values in arrays
}
System.out.println(merged);
Result is: {"key1":"one","FN":["AB"],"key2":"two","LN":["XY"]}

mapstruct qualifiedByName without parameters simplify expression

I would like to set a constant on the field via a method call. I do not want to create an expression, which looks terrible; I would like to simplify this call:
@Mapping(target = "channelNotification", expression = "java(new ChannelNotification[]{ " +
    "new ChannelNotification(\"email\", 10)})")
to get something like this:
@Mapping(target = "channel", qualifiedByName = "getChannel")
Notification convert(Email emailEntity);
@Named("getChannel")
default Channel[] getChannel() { // with empty params
    return new Channel[]{ new Channel("email", 10) };
}
The source entity doesn't have a channelNotification field, and I don't need to use it. I just want to set a constant, like constant = *, but with a method call.
This is currently not possible. However, what you can do is to use a combination of constant and qualifiedByName.
e.g.
@Mapping(target = "channel", constant = "email", qualifiedByName = "getChannel")
Notification convert(Email emailEntity);
@Named("getChannel")
default Channel[] getChannel(String channelType) {
    Channel channel;
    if ("email".equals(channelType)) {
        channel = new Channel("email", 10);
    } else {
        throw new IllegalArgumentException("unknown channel type " + channelType);
    }
    return new Channel[]{ channel };
}
What is not widely known is that constant works with qualifiers in the same way as a regular Mapping#source. Since 1.4, custom mappings between String (what is in constant) and the target type are looked up in order to convert the constant value.
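A rough, hand-written sketch of the mechanism described above (this is not MapStruct's actual generated code; Channel is modeled after the question):

```java
import java.util.Arrays;

public class ConstantQualifierSketch {
    record Channel(String type, int priority) {}

    // Stands in for the @Named("getChannel") method: it receives the
    // constant "email" as its String argument.
    static Channel[] getChannel(String channelType) {
        if ("email".equals(channelType)) {
            return new Channel[]{ new Channel("email", 10) };
        }
        throw new IllegalArgumentException("unknown channel type " + channelType);
    }

    public static void main(String[] args) {
        // Roughly what the generated convert(...) ends up doing for
        // constant = "email", qualifiedByName = "getChannel"
        Channel[] channel = getChannel("email");
        System.out.println(Arrays.toString(channel));
    }
}
```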

Casting to JsonObject in Kotlin

I have the following Java code:
var jwks = ((List<Object>) keys.getList()).stream()
    .map(o -> new JsonObject((Map<String, Object>) o))
    .collect(Collectors.toList());
and would like to translate it safely into Kotlin.
When I copy the code into IntelliJ, it translates it for me as follows:
val jwks = (keys.list as List<Any?>).stream()
    .map { o: Any? -> JsonObject(o as Map<String?, Any?>?) }
    .collect(Collectors.toList())
Can I do better, or should I leave it as is?
Update
Maybe I have to provide more context. What I am trying to do is implement JWT authorization for Vert.x with Keycloak in Kotlin, following the tutorial https://vertx.io/blog/jwt-authorization-for-vert-x-with-keycloak/.
I am trying to rewrite the method
private Future<Startup> setupJwtAuth(Startup startup) {
    var jwtConfig = startup.config.getJsonObject("jwt");
    var issuer = jwtConfig.getString("issuer");
    var issuerUri = URI.create(issuer);
    // derive JWKS uri from Keycloak issuer URI
    var jwksUri = URI.create(jwtConfig.getString("jwksUri", String.format("%s://%s:%d%s",
        issuerUri.getScheme(), issuerUri.getHost(), issuerUri.getPort(),
        issuerUri.getPath() + "/protocol/openid-connect/certs")));
    var promise = Promise.<JWTAuth>promise();
    // fetch JWKS from `/certs` endpoint
    webClient.get(jwksUri.getPort(), jwksUri.getHost(), jwksUri.getPath())
        .as(BodyCodec.jsonObject())
        .send(ar -> {
            if (!ar.succeeded()) {
                startup.bootstrap.fail(String.format("Could not fetch JWKS from URI: %s", jwksUri));
                return;
            }
            var response = ar.result();
            var jwksResponse = response.body();
            var keys = jwksResponse.getJsonArray("keys");
            // Configure JWT validation options
            var jwtOptions = new JWTOptions();
            jwtOptions.setIssuer(issuer);
            // extract JWKS from keys array
            var jwks = ((List<Object>) keys.getList()).stream()
                .map(o -> new JsonObject((Map<String, Object>) o))
                .collect(Collectors.toList());
            // configure JWTAuth
            var jwtAuthOptions = new JWTAuthOptions();
            jwtAuthOptions.setJwks(jwks);
            jwtAuthOptions.setJWTOptions(jwtOptions);
            jwtAuthOptions.setPermissionsClaimKey(jwtConfig.getString("permissionClaimsKey", "realm_access/roles"));
            JWTAuth jwtAuth = JWTAuth.create(vertx, jwtAuthOptions);
            promise.complete(jwtAuth);
        });
    return promise.future().compose(auth -> {
        jwtAuth = auth;
        return Future.succeededFuture(startup);
    });
}
into Kotlin language.
You can use Kotlin's star projection which basically handles unknown generics in a type-safe way.
(keys.list as List<Map<*, *>>).map { JsonObject(it) }
Since there is no multi-step mapping, there is no need for the Stream/Sequence API.
However, if you want lazy evaluation (each element goes through the whole mapping chain before the next element does, rather than mapping all elements and then running the next map):
(keys.list as List<Map<*, *>>)
    .asSequence()
    .map { JsonObject(it) }
    .map { /* Maybe some other mapping */ }
    .filter { /* Maybe some filter */ }
    .take(5) // Or some other intermediate operation
    .toList() // Finally start the operations and collect
Edit: I forgot that the keys are String, so you can cast to List<Map<String, *>> instead :)

Foreach java 8 from list of String to String

I have a member function that retrieves all membershipIds of a member (one member might have multiple membershipIds). After retrieving the membershipIds into a List, it calls a URL like this.
This is my service:
RestRequest request = RestRequest.newBuilder()
    .url("/membership/" + membershipId + "/outlet")
    .get();
This is my controller:
@RequestMapping(value = "/favouriteStores", method = RequestMethod.GET)
public Object FavouriteStores(ModelMap modelMap, HttpSession session) throws Exception {
    String memberId = "5677a7075e3f1b998fc7483b";
    List<Membership> membershipList = memberService.getMembershipByMemberId(memberId);
    List<String> membershipIds = membershipList.stream().map(m -> m.getId()).collect(Collectors.toList());
    String membershipId = membershipIds.toString();
    Set<Outlet> outletSet = membershipService.getOutletByMembershipId(membershipId);
My problem is that it puts the whole list of membershipIds into one URL, like this:
"membership/[12345, 54321]/outlet"
It should be two URLs, like "membership/[12345]/outlet" and "membership/[54321]/outlet".
I know we can use forEach to do that in the controller, but I don't know how. Thanks for any help.
You can achieve this using the map method of Stream (plus flatMap, since getOutletByMembershipId returns a Set<Outlet> for each id):
Set<Outlet> outletSet = membershipIds.stream()
    .map(membershipService::getOutletByMembershipId)
    .flatMap(Set::stream)
    .collect(Collectors.toSet());
You can even combine your previous stream operations and omit the creation of the intermediate list:
String memberId = "5677a7075e3f1b998fc7483b";
Set<Outlet> outletSet = memberService.getMembershipByMemberId(memberId)
    .stream()
    .map(Membership::getId)
    .map(membershipService::getOutletByMembershipId)
    .flatMap(Set::stream)
    .collect(Collectors.toSet());
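The flatten-and-collect step can be sketched with a stubbed lookup; the map below stands in for membershipService.getOutletByMembershipId, and the ids and outlet names are made up:

```java
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.stream.Collectors;

public class OutletLookupDemo {
    // Stub standing in for membershipService.getOutletByMembershipId
    static final Map<String, Set<String>> OUTLETS_BY_ID = Map.of(
            "12345", Set.of("outlet-A", "outlet-B"),
            "54321", Set.of("outlet-B", "outlet-C"));

    public static void main(String[] args) {
        List<String> membershipIds = List.of("12345", "54321");
        // flatMap flattens the per-id sets into one stream before collecting,
        // so the result is a Set<String> rather than a Set<Set<String>>.
        Set<String> outlets = membershipIds.stream()
                .flatMap(id -> OUTLETS_BY_ID.get(id).stream())
                .collect(Collectors.toSet());
        System.out.println(outlets); // three distinct outlets, in no particular order
    }
}
```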

Spark - How to use SparkContext within classes?

I am building an application in Spark, and would like to use the SparkContext and/or SQLContext within methods in my classes, mostly to pull/generate data sets from files or SQL queries.
For example, I would like to create a T2P object which contains methods that gather data (and in this case need access to the SparkContext):
class T2P(mid: Int, sc: SparkContext, sqlContext: SQLContext) extends Serializable {
  def getImps(): DataFrame = {
    val imps = sc.textFile("file.txt")
      .map(line => line.split("\t"))
      .map(d => Data(d(0).toInt, d(1), d(2), d(3)))
      .toDF()
    return imps
  }
  def getX(): DataFrame = {
    val x = sqlContext.sql("SELECT a,b,c FROM table")
    return x
  }
}
//creating the T2P object
class App {
  val conf = new SparkConf().setAppName("T2P App").setMaster("local[2]")
  val sc = new SparkContext(conf)
  val sqlContext = new SQLContext(sc)
  val t2p = new T2P(0, sc, sqlContext)
}
Passing the SparkContext as an argument to the T2P class doesn't work since the SparkContext is not serializable (getting a task not serializable error when creating T2P objects). What is the best way to use the SparkContext/SQLContext inside my classes? Or perhaps is this the wrong way to design a data pull type process in Spark?
UPDATE
Realized from the comments on this post that the SparkContext was not the problem, but that I was using a method within a 'map' function, causing Spark to try to serialize the entire class. This caused the error, since SparkContext is not serializable.
def startMetricTo(userData: ((Int, String), List[(Int, String)]), startMetric: String): T2PUser = {
  //do something
}
def buildUserRollup() = {
  this.userRollup = this.userSorted.map(line => startMetricTo(line, this.startMetric))
}
This results in a 'task not serializable' exception.
I fixed this problem (with the help of the commenters and other StackOverflow users) by creating a separate MetricCalc object to store my startMetricTo() method. Then I changed the buildUserRollup() method to use this new startMetricTo(). This allows the entire MetricCalc object to be serialized without issue.
//newly created object
object MetricCalc {
  def startMetricTo(userData: ((Int, String), List[(Int, String)]), startMetric: String): T2PUser = {
    //do something
  }
}
//using the function in T2P
def buildUserRollup(startMetric: String) = {
  this.userRollup = this.userSorted.map(line => MetricCalc.startMetricTo(line, startMetric))
}
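Spark serializes task closures with plain Java serialization, so the capture problem behind the 'task not serializable' error can be reproduced without Spark at all; a minimal sketch in plain Java (all names illustrative):

```java
import java.io.ByteArrayOutputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.function.Function;

public class CaptureDemo {
    String prefix = "user-";                  // the enclosing class is NOT Serializable

    String tag(int i) { return prefix + i; }  // instance method, like the original startMetricTo

    static String staticTag(int i) { return "user-" + i; }  // like MetricCalc.startMetricTo

    // Returns true if the value survives Java serialization.
    static boolean serializes(Serializable s) {
        try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(s);
            return true;
        } catch (java.io.IOException e) {  // NotSerializableException is an IOException
            return false;
        }
    }

    public static void main(String[] args) {
        CaptureDemo demo = new CaptureDemo();
        // A bound method reference captures `demo`, so serialization fails:
        Serializable capturing = (Serializable & Function<Integer, String>) demo::tag;
        // A static method reference captures nothing, so it serializes fine:
        Serializable standalone = (Serializable & Function<Integer, String>) CaptureDemo::staticTag;
        System.out.println("capturing: " + serializes(capturing));   // capturing: false
        System.out.println("standalone: " + serializes(standalone)); // standalone: true
    }
}
```

Moving startMetricTo into a standalone object is exactly the jump from the first case to the second: the closure no longer drags the enclosing (non-serializable) class along.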
I tried several options; this is what eventually worked for me:
object SomeName extends App {
  val conf = new SparkConf()...
  val sc = new SparkContext(conf)
  implicit val sqlC = SQLContext.getOrCreate(sc)
  getDF1(sqlC)

  def getDF1(sqlCo: SQLContext): Unit = {
    val query1 = SomeQuery here
    val df1 = sqlCo.read.format("jdbc").options(Map("url" -> dbUrl, "dbtable" -> query1)).load.cache()
    //iterate through df1 and retrieve the 2nd DataFrame based on some values in the Row of the first DataFrame
    df1.foreach(x => {
      getDF2(x.getString(0), x.getDecimal(1).toString, x.getDecimal(3).doubleValue)(sqlCo)
    })
  }

  def getDF2(a: String, b: String, c: Double)(implicit sqlCont: SQLContext): Unit = {
    val query2 = Somequery
    val sqlcc = SQLContext.getOrCreate(sc)
    //val sqlcc = sqlCont //Did not work for me. Also, omitting (implicit sqlCont: SQLContext) altogether did not work
    val df2 = sqlcc.read.format("jdbc").options(Map("url" -> dbURL, "dbtable" -> query2)).load().cache()
    .
    .
    .
  }
}
Note: In the above code, if I omitted the (implicit sqlCont: SQLContext) parameter from the getDF2 method signature, it would not work. I tried several other options for passing the sqlContext from one method to the other; it always gave me a NullPointerException or a Task not serializable Exception. The good thing is that it eventually worked this way, and I could retrieve parameters from a row of DataFrame1 and use those values to load DataFrame2.
