I'm trying to use lambdas and streams in Java, but I'm quite new to it.
I get this error in IntelliJ, "target type of lambda conversion must be an interface", when I try to write a lambda expression:
List<Callable<SomeClass>> callList = prgll.stream()
.map(p -> (() -> { return p.funct(); }))  // <-- here I get the error
.collect(Collectors.toList());
Am I doing something wrong?
I suspect it's just Java's type inference not being quite smart enough. Try:
.map(p -> (Callable<SomeClass>) () -> p.funct())
Alternatively, Stream#map() is a generic method, so you can specify the type argument explicitly:
.<Callable<SomeClass>>map(p -> () -> p.funct())
or neater, use a method reference:
.<Callable<SomeClass>>map(p -> p::funct)
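Putting one of those fixes together, the whole pipeline would look something like this (a sketch reusing prgll, SomeClass, and funct() from your snippet):

List<Callable<SomeClass>> callList = prgll.stream()
        .map(p -> (Callable<SomeClass>) p::funct)
        .collect(Collectors.toList());

The cast gives the method reference its target type, so the compiler knows each element should become a Callable<SomeClass>.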
I have written a method in Scala that uses a method written in Java; the processSale() method takes a util.List<Sale> as a parameter.
But after groupByKey() I'm getting an RDD[(String, Iterable[Sale])]. I've tried to import scala.collection.JavaConverters._ and do SaleParser.processSale(a.asJava).
However, that gives me an Iterable[Sale] rather than a util.List. How is it possible to convert it into a Java util.List?
val parseSales: RDD[(String, Sale)] = rawSales
.map(sale => sale.Id -> sale)
.groupByKey()
.mapValues(a => SaleParser.processSale(???))
a.toSeq.asJava

so the complete line becomes

.mapValues(a => SaleParser.processSale(a.toSeq.asJava))

Note that if this Iterable is actually a Seq, toSeq just returns the same object.
See API doc for the complete list of conversions.
Which Java library can encode
a & to a%20%26
without using the String.replace() function?
java.net.URLEncoder -> a+%26
URIUtil.encodePath -> a%20&
UrlEscapers.urlFragmentEscaper().escape -> a%20&
org.apache.catalina.util.URLEncoder is the answer
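For reference, here is a minimal hand-rolled sketch of strict RFC 3986 percent-encoding, which produces a%20%26 for a &. This is only an illustration of what such an encoder does, not the actual implementation of the Tomcat class:

import java.nio.charset.StandardCharsets;

static String percentEncode(String s) {
    StringBuilder sb = new StringBuilder();
    for (byte raw : s.getBytes(StandardCharsets.UTF_8)) {
        int b = raw & 0xFF;
        // RFC 3986 "unreserved" characters pass through unchanged
        if ((b >= 'a' && b <= 'z') || (b >= 'A' && b <= 'Z')
                || (b >= '0' && b <= '9') || "-._~".indexOf(b) >= 0) {
            sb.append((char) b);
        } else {
            sb.append(String.format("%%%02X", b)); // e.g. ' ' -> %20, '&' -> %26
        }
    }
    return sb.toString();
}

// percentEncode("a &") returns "a%20%26"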
I am new to Apache Spark and am trying to run the word count example, but the IntelliJ editor gives me a "Cannot resolve method 'flatMap()'" error at line 47.
Edit:
This is the line where I am getting the error:
JavaRDD<String> words = lines.flatMap(s -> Arrays.asList(SPACE.split(s)).iterator());
It looks like you're using an older version of Spark that expects Iterable rather than Iterator from the flatMap() function. Try this:
JavaRDD<String> words = lines.flatMap(s -> Arrays.asList(SPACE.split(s)));
See also Spark 2.0.0 Arrays.asList not working - incompatible types
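Conversely, if you are on Spark 2.0+ and the original line still will not resolve, making the function type explicit sometimes helps the IDE. A sketch, assuming SPACE is a Pattern such as Pattern.compile(" "):

import org.apache.spark.api.java.function.FlatMapFunction;

JavaRDD<String> words = lines.flatMap(
        (FlatMapFunction<String, String>) s -> Arrays.asList(SPACE.split(s)).iterator());

In Spark 2.x, FlatMapFunction#call returns an Iterator, which is why the .iterator() call is needed.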
Stream#flatMap is used for combining multiple streams into one, so the mapper function you provide must return a Stream result.
You can try it like this:
lines.stream().flatMap(line -> Stream.of(SPACE.split(line)))
.map(word -> ...) // map each word to a JavaRDD here
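Here is a self-contained java.util.stream version of that idea, with made-up sample input (note this is plain Java streams, not Spark):

import java.util.List;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
import java.util.stream.Stream;

Pattern SPACE = Pattern.compile(" ");
List<String> words = Stream.of("hello world", "foo bar")
        .flatMap(line -> Stream.of(SPACE.split(line)))
        .collect(Collectors.toList());
// words == [hello, world, foo, bar]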
The flatMap method takes a FlatMapFunction as a parameter, which is not annotated with @FunctionalInterface. So indeed you cannot use it as a lambda.
Just build a real FlatMapFunction object as the parameter and you will be sure of it.
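For example, an explicit anonymous class would look like this (a sketch assuming the Spark 2.x signature, where call() returns an Iterator; in 1.x it returns an Iterable):

import java.util.Arrays;
import java.util.Iterator;
import org.apache.spark.api.java.function.FlatMapFunction;

JavaRDD<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
    @Override
    public Iterator<String> call(String s) {
        // split each line on spaces and emit the words one by one
        return Arrays.asList(SPACE.split(s)).iterator();
    }
});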
flatMap() is part of the Java 8 Stream API. I think you should check the Java compiler version configured in IDEA.
I tried the following code using Java 8 streams:
Arrays.asList("A", "B").stream()
.flatMap(s -> Arrays.asList("X", "Y").stream().map(s1 -> s + s1)).collect(Collectors.toList());
What I get is a List<Object>, while I would expect a List<String>. If I remove the collect and try:
Arrays.asList("A", "B").stream().flatMap(s -> Arrays.asList("X", "Y").stream().map(s1 -> s + s1));
I correctly get a Stream<String>.
Where am I wrong? Can someone help me?
Many thanks in advance.
Edit:
The problem is due to Eclipse (now using Kepler SR2 with the Java 8 patch 1.0.0.v20140317-1956). The problem does not appear when compiling with javac or, as commented by Holger, when using NetBeans.
Type inference is a new feature. Until tools and IDEs are fully developed, I recommend using explicitly typed lambdas. There were cases where Eclipse even crashed if an explicit cast was missing, but that is fixed now.
Here's a workaround:
With a typed "s1":
Arrays.asList("A", "B").stream()
.flatMap(s -> Arrays.asList("X", "Y").stream().map((String s1) -> s + s1))
.collect(Collectors.toList());
Or with an explicit generic type argument:
Arrays.asList("A", "B").stream()
.flatMap(s -> Arrays.asList("X", "Y").stream().<String>map(s1 -> s + s1))
.collect(Collectors.toList());
The same works if you put the type argument on flatMap instead of map.
But I suggest you use s::concat:
Arrays.asList("A", "B").stream()
.flatMap(s -> Arrays.asList("X", "Y").stream().map(s::concat))
.collect(Collectors.toList());
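Putting it together, the whole pipeline then compiles to a List<String> as expected:

import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

List<String> combined = Arrays.asList("A", "B").stream()
        .flatMap(s -> Arrays.asList("X", "Y").stream().map(s::concat))
        .collect(Collectors.toList());
// combined == [AX, AY, BX, BY]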