I have the following setup. This dependency is in my POM:
<dependency>
    <groupId>com.google.protobuf</groupId>
    <artifactId>protobuf-java</artifactId>
    <version>3.14.0</version>
</dependency>
I have a very simple proto file:
syntax = "proto3";
package com.ziath.genericdecoderserver;
option java_outer_classname = "DecodePackage";
message DecodeData {
    string template = 1;
    bytes image = 2;
    int32 startColumn = 3;
    int32 endColumn = 4;
}
I'm compiling the proto file with protoc 3.14.0, the win64 binary:
PS C:\Users\neilb\Documents\GitHub\GenericDecoderServer\src\main\protobuf\bin> .\protoc.exe --version
libprotoc 3.14.0
This matches the Maven dependency I'm pulling in. However, the generated file has errors on the override annotation:
@java.lang.Override
public com.ziath.genericdecoderserver.DecodePackage.DecodeData buildPartial() {
    com.ziath.genericdecoderserver.DecodePackage.DecodeData result = new com.ziath.genericdecoderserver.DecodePackage.DecodeData(this);
    result.template_ = template_;
    result.image_ = image_;
    result.startColumn_ = startColumn_;
    result.endColumn_ = endColumn_;
    onBuilt();
    return result;
}
The reported error is:
The method buildPartial() of type DecodePackage.DecodeData.Builder must override a superclass method
So this method is in the Builder class which is defined as:
public static final class Builder extends com.google.protobuf.GeneratedMessageV3.Builder<Builder> implements
    // @@protoc_insertion_point(builder_implements:com.ziath.genericdecoderserver.DecodeData)
    com.ziath.genericdecoderserver.DecodePackage.DecodeDataOrBuilder {
Eclipse is correct: the method buildPartial is not in either of the interfaces protobuf is referencing, so it looks like a version mismatch, but the versions are the same. There are scores of errors along the same lines in this generated code. Does anybody know what the problem is, or has anyone seen this before? My searches turn up nothing.
Thanks.
Cheers,
Neil
Solved it! The project was created using Spring Initializr, and for some reason that set the Java version to 1.5 in Eclipse. Java 1.5 does not allow @Override on implementations of interface methods.
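If you hit the same problem, one way to pin the language level in a Maven project (so the IDE cannot fall back to 1.5) is via the standard compiler properties in the POM. A sketch; the level 1.8 is an assumption, adjust to your JDK:

```xml
<!-- Pins the compiler source/target level for maven-compiler-plugin
     and for IDEs that import the Maven model (e.g. Eclipse m2e). -->
<properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
</properties>
```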
Related
Initial situation
I made a little test for my project today. The goal: use .jar files in a C# project as a .dll. My current .java/.jar file looks like the following.
package ws;
public class Adding
{
public int Add(int a, int b)
{
return a + b;
}
}
I successfully converted the above into a .dll with IKVM (Version: 7.5.0.2).
I now want to reference this .dll in my C# project and call the Add(int a, int b) method. I already added the reference like so:
Anyway, I am not able to call the method, because the compiler can't find the .dll reference:
using Adding; // <= Compiler Error CS0246 (Can't find the reference)
Console.WriteLine(Add(1, 2));
Does anybody know how I could achieve this? I highly appreciate any kind of help, cheers!
Edit 1: Decompiling
I've decompiled the .dll with ILSpy (version 7.2), as requested in the comments, which results in the following output.
// C:\Users\maikh\Desktop\Adding.dll
// Adding, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null
// Global type: <Module>
// Architecture: AnyCPU (64-bit preferred)
// Runtime: v4.0.30319
// Hash algorithm: SHA1
using System.Diagnostics;
using System.Reflection;
using System.Runtime.CompilerServices;
using IKVM.Attributes;
[assembly: Debuggable(true, false)]
[assembly: RuntimeCompatibility(WrapNonExceptionThrows = true)]
[assembly: AssemblyVersion("0.0.0.0")]
[module: SourceFile(null)]
[module: JavaModule(Jars = new string[] { "Adding.jar" })]
[module: PackageList(new string[] { })]
I also found some references while decompiling the .dll. I don't know if this is important to mention, but I'll provide it anyway.
// Detected TargetFramework-Id: .NETFramework,Version=v4.0
// Detected RuntimePack: Microsoft.NETCore.App
// Referenced assemblies (in metadata order):
// IKVM.Runtime, Version=7.5.0.2, Culture=neutral, PublicKeyToken=00d957d768bec828
// Assembly reference loading information:
// There were some problems during assembly reference load, see below for more information!
// Error: Could not find reference: IKVM.Runtime, Version=7.5.0.2, Culture=neutral, PublicKeyToken=00d957d768bec828
// mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089
// Assembly reference loading information:
// Info: Success - Found in Assembly List
// Assembly load log including transitive references:
// IKVM.Runtime, Version=7.5.0.2, Culture=neutral, PublicKeyToken=00d957d768bec828
// Error: Could not find reference: IKVM.Runtime, Version=7.5.0.2, Culture=neutral, PublicKeyToken=00d957d768bec828
// mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089
// Info: Success - Found in Assembly List
Edit 2: Decompiling V2
I've managed to add the missing reference IKVM.Runtime. Nevertheless, I can't find any information about the namespace, class, or method.
First of all, you are using a class as a namespace, and that is probably not correct. Your method call should probably look something like this:
var adder = new Adding();
Console.WriteLine(adder.Add(1, 2));
If that does not work, I would inspect the produced .dll to verify that it is a conforming .NET dll; that should also show the namespaces, class names, and other information. A decompiler like dotPeek or ILSpy might show the same information in a format that is easier to read.
Since your Java class is in the ws package, you should be using ws in your C# code:
using ws;
Adding adding = new Adding();
Console.WriteLine(adding.Add(1, 2));
And if you want to call the Add method statically, declare it static in Java:
package ws;
public class Adding
{
public static int Add(int a, int b)
{
return a + b;
}
}
using ws;
Console.WriteLine(Adding.Add(1, 2));
I'm currently building the code below into a JAR to register a permanent UDF on a Databricks cluster. I'm facing a NoClassDefFoundError, even though I added the required library dependencies when building the JAR with SBT. Source code: https://databricks.com/notebooks/enforcing-column-level-encryption.html
I used the following in build.sbt:
scalaVersion := "2.13.4"
libraryDependencies += "org.apache.hive" % "hive-exec" % "0.13.1"
libraryDependencies += "com.macasaet.fernet" % "fernet-java8" % "1.5.0"
Please guide me to the right libraries if anything above is wrong. The code:
import java.time.{Duration, Instant}
import com.macasaet.fernet.{Key, StringValidator, Token}
import org.apache.hadoop.hive.ql.exec.UDF

class Validator extends StringValidator {
  override def getTimeToLive(): java.time.temporal.TemporalAmount = {
    Duration.ofSeconds(Instant.MAX.getEpochSecond())
  }
}

class udfDecrypt extends UDF {
  def evaluate(inputVal: String, sparkKey: String): String = {
    if (inputVal != null && inputVal != "") {
      val keys: Key = new Key(sparkKey)
      val token = Token.fromString(inputVal)
      val validator = new Validator() {}
      val payload = token.validateAndDecrypt(keys, validator)
      payload
    } else inputVal
  }
}
Make sure the fernet-java library is installed in your cluster.
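For the registration step itself, the linked Databricks notebook registers the compiled class as a permanent Hive UDF with SQL. A sketch only; the function name and jar path below are placeholders, not values from the question:

```sql
-- Register the udfDecrypt class (default package) as a permanent UDF.
-- The jar location is a placeholder; point it at wherever your built jar lives.
CREATE OR REPLACE FUNCTION udfdecrypt AS 'udfDecrypt'
USING JAR 'dbfs:/FileStore/jars/decrypt-udf.jar';
```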
This topic is related to
Databricks SCALA UDF cannot load class when registering function
I tried installing the jar to the cluster via Libraries in the cluster config, rather than dropping it directly into DBFS as in the user guide. I then hit the issue with the validator not being found, and that question routed me here.
I added the Maven coordinates to the Libraries config, but the cluster failed to install it, with the error:
Library resolution failed because unresolved dependency: com.macasaet.fernet:fernet-java8:1.5.0: not found
Have you experienced with this?
I'm trying the example found here: https://itextpdf.com/en/products/itext-7/pdfxfa
public static void main(String[] args) throws IOException {
    XFAFlattenerProperties flattenerProperties = new XFAFlattenerProperties()
        .setPdfVersion(XFAFlattenerProperties.PDF_1_7)
        .createXmpMetaData()
        .setTagged()
        .setMetaData(
            new MetaData()
                .setAuthor("iText Samples")
                .setLanguage("EN")
                .setSubject("Showing off our flattening skills")
                .setTitle("Flattened XFA"));
    XFAFlattener xfaf = new XFAFlattener()
        .setFlattenerProperties(flattenerProperties);
    xfaf.flatten(new FileInputStream("xfaform.pdf"), new FileOutputStream("flat.pdf"));
}
and getting java.lang.NoClassDefFoundError: org.mozilla.javascript.ScriptableObject
when trying to do
XFAFlattener xfaf = new XFAFlattener();
Not using Maven or POM. I have the following JARs in classpath:
io-7.1.10.jar
kernel-7.1.10.jar
layout-7.1.10.jar
itext-licensekey-3.0.6.jar
pdfrender-1.0.0.jar
pdfxfa-2.0.5.jar
Am I missing something?
You need the org.mozilla:rhino:1.7R4 dependency: https://mvnrepository.com/artifact/org.mozilla/rhino/1.7R4
But as @Harry Coder mentioned, you should use Maven, Gradle, or any other Maven-compatible build system that downloads all the dependencies, including transitive ones, automatically for you.
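In a Maven build, the Rhino coordinates above translate to:

```xml
<dependency>
    <groupId>org.mozilla</groupId>
    <artifactId>rhino</artifactId>
    <version>1.7R4</version>
</dependency>
```

If you stay without a build tool, download the rhino jar from the link above and add it to the classpath alongside the iText jars.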
I am using Flink with Java 8. When using lambda functions with Tuples and generic types, my program fails with an exception:
/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/bin/java -Didea.launcher.port=7536 "-Didea.launcher.bin.path=/Applications/IntelliJ IDEA.app/Contents/bin" -Dfile.encoding=UTF-8 -classpath "/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/deploy.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/ext/cldrdata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/ext/dnsns.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/ext/jaccess.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/ext/jfxrt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/ext/localedata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/ext/nashorn.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/ext/sunec.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/ext/sunjce_provider.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/ext/sunpkcs11.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/ext/zipfs.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/javaws.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/jce.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/jfxswt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/jsse.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/management-agent.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/plugin.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/resources.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/
Contents/Home/jre/lib/rt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/lib/ant-javafx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/lib/dt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/lib/javafx-mx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/lib/jconsole.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/lib/packager.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/lib/sa-jdi.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/lib/tools.jar:/Users/hasan.guercan/Git/flink-java-project/target/classes:/Users/hasan.guercan/.m2/repository/org/apache/flink/flink-java/1.0.3/flink-java-1.0.3.jar:/Users/hasan.guercan/.m2/repository/org/apache/flink/flink-core/1.0.3/flink-core-1.0.3.jar:/Users/hasan.guercan/.m2/repository/org/apache/flink/flink-annotations/1.0.3/flink-annotations-1.0.3.jar:/Users/hasan.guercan/.m2/repository/com/esotericsoftware/kryo/kryo/2.24.0/kryo-2.24.0.jar:/Users/hasan.guercan/.m2/repository/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar:/Users/hasan.guercan/.m2/repository/org/objenesis/objenesis/2.1/objenesis-2.1.jar:/Users/hasan.guercan/.m2/repository/org/apache/avro/avro/1.7.6/avro-1.7.6.jar:/Users/hasan.guercan/.m2/repository/org/apache/flink/flink-shaded-hadoop2/1.0.3/flink-shaded-hadoop2-1.0.3.jar:/Users/hasan.guercan/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/Users/hasan.guercan/.m2/repository/commons-codec/commons-codec/1.4/commons-codec-1.4.jar:/Users/hasan.guercan/.m2/repository/commons-io/commons-io/2.4/commons-io-2.4.jar:/Users/hasan.guercan/.m2/repository/commons-net/commons-net/3.1/commons-net-3.1.jar:/Users/hasan.guercan/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/Users/hasan.guercan/.m2/repository/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar:/Users/hasan.guercan/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6
.1.26.jar:/Users/hasan.guercan/.m2/repository/com/sun/jersey/jersey-core/1.9/jersey-core-1.9.jar:/Users/hasan.guercan/.m2/repository/commons-el/commons-el/1.0/commons-el-1.0.jar:/Users/hasan.guercan/.m2/repository/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar:/Users/hasan.guercan/.m2/repository/com/jamesmurty/utils/java-xmlbuilder/0.4/java-xmlbuilder-0.4.jar:/Users/hasan.guercan/.m2/repository/commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/Users/hasan.guercan/.m2/repository/commons-configuration/commons-configuration/1.7/commons-configuration-1.7.jar:/Users/hasan.guercan/.m2/repository/commons-digester/commons-digester/1.8.1/commons-digester-1.8.1.jar:/Users/hasan.guercan/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.8.8/jackson-core-asl-1.8.8.jar:/Users/hasan.guercan/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.8.8/jackson-mapper-asl-1.8.8.jar:/Users/hasan.guercan/.m2/repository/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar:/Users/hasan.guercan/.m2/repository/org/xerial/snappy/snappy-java/1.0.5/snappy-java-1.0.5.jar:/Users/hasan.guercan/.m2/repository/com/jcraft/jsch/0.1.42/jsch-0.1.42.jar:/Users/hasan.guercan/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6.jar:/Users/hasan.guercan/.m2/repository/io/netty/netty/3.7.0.Final/netty-3.7.0.Final.jar:/Users/hasan.guercan/.m2/repository/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar:/Users/hasan.guercan/.m2/repository/org/tukaani/xz/1.0/xz-1.0.jar:/Users/hasan.guercan/.m2/repository/commons-beanutils/commons-beanutils-bean-collections/1.8.3/commons-beanutils-bean-collections-1.8.3.jar:/Users/hasan.guercan/.m2/repository/commons-daemon/commons-daemon/1.0.13/commons-daemon-1.0.13.jar:/Users/hasan.guercan/.m2/repository/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar:/Users/hasan.guercan/.m2/repository/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar:/Users/hasan.guercan/.m2/repository/javax/activation/activation/1.1/activat
ion-1.1.jar:/Users/hasan.guercan/.m2/repository/com/google/inject/guice/3.0/guice-3.0.jar:/Users/hasan.guercan/.m2/repository/javax/inject/javax.inject/1/javax.inject-1.jar:/Users/hasan.guercan/.m2/repository/aopalliance/aopalliance/1.0/aopalliance-1.0.jar:/Users/hasan.guercan/.m2/repository/org/apache/commons/commons-math3/3.5/commons-math3-3.5.jar:/Users/hasan.guercan/.m2/repository/org/slf4j/slf4j-api/1.7.7/slf4j-api-1.7.7.jar:/Users/hasan.guercan/.m2/repository/org/slf4j/slf4j-log4j12/1.7.7/slf4j-log4j12-1.7.7.jar:/Users/hasan.guercan/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/Users/hasan.guercan/.m2/repository/org/apache/flink/force-shading/1.0.3/force-shading-1.0.3.jar:/Users/hasan.guercan/.m2/repository/org/apache/flink/flink-streaming-java_2.10/1.0.3/flink-streaming-java_2.10-1.0.3.jar:/Users/hasan.guercan/.m2/repository/org/apache/flink/flink-runtime_2.10/1.0.3/flink-runtime_2.10-1.0.3.jar:/Users/hasan.guercan/.m2/repository/io/netty/netty-all/4.0.27.Final/netty-all-4.0.27.Final.jar:/Users/hasan.guercan/.m2/repository/org/javassist/javassist/3.18.2-GA/javassist-3.18.2-GA.jar:/Users/hasan.guercan/.m2/repository/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar:/Users/hasan.guercan/.m2/repository/com/typesafe/akka/akka-actor_2.10/2.3.7/akka-actor_2.10-2.3.7.jar:/Users/hasan.guercan/.m2/repository/com/typesafe/config/1.2.1/config-1.2.1.jar:/Users/hasan.guercan/.m2/repository/com/typesafe/akka/akka-remote_2.10/2.3.7/akka-remote_2.10-2.3.7.jar:/Users/hasan.guercan/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/Users/hasan.guercan/.m2/repository/org/uncommons/maths/uncommons-maths/1.2.2a/uncommons-maths-1.2.2a.jar:/Users/hasan.guercan/.m2/repository/com/typesafe/akka/akka-slf4j_2.10/2.3.7/akka-slf4j_2.10-2.3.7.jar:/Users/hasan.guercan/.m2/repository/org/clapper/grizzled-slf4j_2.10/1.0.2/grizzled-slf4j_2.10-1.0.2.jar:/Users/hasan.guercan/.m2/repository/com/github/scopt/scopt_2.10/3.2.0/scopt_2.10-3.2.0.jar:/User
s/hasan.guercan/.m2/repository/io/dropwizard/metrics/metrics-core/3.1.0/metrics-core-3.1.0.jar:/Users/hasan.guercan/.m2/repository/io/dropwizard/metrics/metrics-jvm/3.1.0/metrics-jvm-3.1.0.jar:/Users/hasan.guercan/.m2/repository/io/dropwizard/metrics/metrics-json/3.1.0/metrics-json-3.1.0.jar:/Users/hasan.guercan/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.4.2/jackson-databind-2.4.2.jar:/Users/hasan.guercan/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.4.0/jackson-annotations-2.4.0.jar:/Users/hasan.guercan/.m2/repository/com/fasterxml/jackson/core/jackson-core/2.4.2/jackson-core-2.4.2.jar:/Users/hasan.guercan/.m2/repository/com/twitter/chill_2.10/0.7.4/chill_2.10-0.7.4.jar:/Users/hasan.guercan/.m2/repository/com/twitter/chill-java/0.7.4/chill-java-0.7.4.jar:/Users/hasan.guercan/.m2/repository/org/apache/commons/commons-math/2.2/commons-math-2.2.jar:/Users/hasan.guercan/.m2/repository/org/apache/sling/org.apache.sling.commons.json/2.0.6/org.apache.sling.commons.json-2.0.6.jar:/Users/hasan.guercan/.m2/repository/org/apache/flink/flink-clients_2.10/1.0.3/flink-clients_2.10-1.0.3.jar:/Users/hasan.guercan/.m2/repository/org/apache/flink/flink-optimizer_2.10/1.0.3/flink-optimizer_2.10-1.0.3.jar:/Users/hasan.guercan/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/Users/hasan.guercan/.m2/repository/org/apache/commons/commons-lang3/3.0.1/commons-lang3-3.0.1.jar:/Applications/IntelliJ IDEA.app/Contents/lib/idea_rt.jar" com.intellij.rt.execution.application.AppMain org.apache.flink.quickstart.exercise2.ReplyGraph
Exception in thread "main" org.apache.flink.api.common.functions.InvalidTypesException: The return type of function 'retrieve(ReplyGraph.java:33)' could not be determined automatically, due to type erasure. You can give type information hints by using the returns(...) method on the result of the transformation call, or by letting your function implement the 'ResultTypeQueryable' interface.
at org.apache.flink.api.java.DataSet.getType(DataSet.java:178)
at org.apache.flink.api.java.DataSet.collect(DataSet.java:407)
at org.apache.flink.api.java.DataSet.print(DataSet.java:1605)
at org.apache.flink.quickstart.exercise2.ReplyGraph.retrieve(ReplyGraph.java:41)
at org.apache.flink.quickstart.exercise2.ReplyGraph.main(ReplyGraph.java:56)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: org.apache.flink.api.common.functions.InvalidTypesException: The generic type parameters of 'Tuple3' are missing.
It seems that your compiler has not stored them into the .class file.
Currently, only the Eclipse JDT compiler preserves the type information necessary to use the lambdas feature type-safely.
See the documentation for more information about how to compile jobs containing lambda expressions.
at org.apache.flink.api.java.typeutils.TypeExtractor.validateLambdaGenericParameter(TypeExtractor.java:1316)
at org.apache.flink.api.java.typeutils.TypeExtractor.validateLambdaGenericParameters(TypeExtractor.java:1302)
at org.apache.flink.api.java.typeutils.TypeExtractor.getUnaryOperatorReturnType(TypeExtractor.java:346)
at org.apache.flink.api.java.typeutils.TypeExtractor.getUnaryOperatorReturnType(TypeExtractor.java:304)
at org.apache.flink.api.java.typeutils.TypeExtractor.getMapReturnTypes(TypeExtractor.java:119)
at org.apache.flink.api.java.DataSet.map(DataSet.java:215)
at org.apache.flink.quickstart.exercise2.ReplyGraph.retrieve(ReplyGraph.java:33)
... 6 more
So it seems I have to create at least an anonymous class to solve the problem. The first code snippet is the code that leads to the described exception:
DataSet<MailEntry> filteredUserReplyMails = replyMails.filter(entryTuple -> {
    String sender = entryTuple.getField(1).toString();
    return !sender.contains("git@") && !sender.contains("jira@");
}).map(entry -> {
    MailEntry mailEntry = new MailEntry();
    mailEntry.messageId = entry.f0.replaceAll("<", "").replaceAll(">", "");
    mailEntry.sender = entry.f1;
    mailEntry.replyTo = entry.f2;
    return mailEntry;
});
The next one works when creating an anonymous class:
DataSet<MailEntry> filteredUserReplyMails = replyMails.filter(entryTuple -> {
    String sender = entryTuple.getField(1).toString();
    return !sender.contains("git@") && !sender.contains("jira@");
}).map(new MapFunction<Tuple3<String, String, String>, MailEntry>() {
    @Override
    public MailEntry map(Tuple3<String, String, String> entry) throws Exception {
        MailEntry mailEntry = new MailEntry();
        mailEntry.messageId = entry.f0.replaceAll("<", "").replaceAll(">", "");
        mailEntry.sender = entry.f1;
        mailEntry.replyTo = entry.f2;
        return mailEntry;
    }
});
Java's lambda functions are very neat. How can I solve this problem without creating an anonymous class?
Try using the returns method after map:
DataSet<MailEntry> filteredUserReplyMails = replyMails.filter(entryTuple -> {
    String sender = entryTuple.getField(1).toString();
    return !sender.contains("git@") && !sender.contains("jira@");
}).map(entry -> {
    MailEntry mailEntry = new MailEntry();
    mailEntry.messageId = entry.f0.replaceAll("<", "").replaceAll(">", "");
    mailEntry.sender = entry.f1;
    mailEntry.replyTo = entry.f2;
    return mailEntry;
}).returns(MailEntry.class);
If you want to use lambda expressions in Flink, you can't use the javac compiler; see https://ci.apache.org/projects/flink/flink-docs-release-1.4/dev/java8.html
Currently, Flink only supports jobs containing Lambda Expressions completely if they are compiled with the Eclipse JDT compiler contained in Eclipse Luna 4.4.2 (and above).
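The underlying difference can be observed with plain reflection, independent of Flink: an anonymous class records its parameterized superinterface in the class file, while a lambda's synthetic class does not, which is exactly the information Flink's TypeExtractor needs. A minimal stdlib sketch (the names here are mine, not Flink's):

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.function.Function;

public class ErasureDemo {
    // Returns the parameterized interface recorded in f's class file, if any.
    static String genericInterfaceOf(Function<?, ?> f) {
        for (Type t : f.getClass().getGenericInterfaces()) {
            if (t instanceof ParameterizedType) {
                return t.toString();
            }
        }
        return "no generic interface info";
    }

    public static void main(String[] args) {
        // Lambda: the synthetic class exposes only the raw Function interface.
        Function<String, Integer> lambda = s -> s.length();
        // Anonymous class: Function<String, Integer> survives in the class file.
        Function<String, Integer> anon = new Function<String, Integer>() {
            @Override
            public Integer apply(String s) {
                return s.length();
            }
        };
        System.out.println(genericInterfaceOf(lambda));
        System.out.println(genericInterfaceOf(anon));
    }
}
```

This is why the anonymous-class variant and the explicit `returns(...)` hint both work: each supplies the type information that the lambda's class file lacks.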
In order to make it work, you need to compile it using the Eclipse compiler. You can do it following these steps:
Modify the pom.xml file, uncommenting the following lines (or adding them to the build plugins section if they aren't present):
<plugin>
    <artifactId>maven-compiler-plugin</artifactId>
    <configuration>
        <source>1.8</source> <!-- ${java.version} -->
        <target>1.8</target>
        <compilerId>jdt</compilerId>
    </configuration>
    <dependencies>
        <dependency>
            <groupId>org.eclipse.tycho</groupId>
            <artifactId>tycho-compiler-jdt</artifactId>
            <version>0.21.0</version>
        </dependency>
    </dependencies>
</plugin>
This will ensure you use the JDT compiler instead of javac.
Modify the run configuration you use to launch the project. Go to "Edit Configurations", remove all the "Before launch" tasks (select them and click the '-' button), then add a new "Run Maven Goal" task (using the '+' button) with "compile" in the "Command line" field.
Now, clicking the Run button, you will be able to run Flink projects.
Maybe someone can explain the behaviour below. I know there were some generic type-handling changes from Java 6 to 7, but I couldn't find one that explains this.
This is happening with this library:
<dependency>
    <groupId>org.apache.felix</groupId>
    <artifactId>org.apache.felix.framework</artifactId>
    <version>3.2.2</version>
</dependency>
And the following demonstration code:
import org.apache.felix.framework.util.manifestparser.ManifestParser;
ManifestParser manifestParser = new ManifestParser(null, null, null, null);
for (Capability capability : manifestParser.getCapabilities()) {
capability.toString();
}
// where the signature of getCapabilities() is:
// public List<Capability> getCapabilities() { return m_capabilities; }
// and there are no other methods with similar signatures or names
This demo code compiles just fine with JDK 6 (x86, 1.6.0_45, 32-bit), but fails to compile with JDK 7 (x86, 1.7.0_25, 32-bit, same host):
// line number matches the for loop
java: incompatible types
required: org.apache.felix.framework.capabilityset.Capability
found: java.lang.Object
After some head scratching, I have a workaround but no explanation. The following modification to the demo code compiles with JDK 7:
ManifestParser manifestParser = new ManifestParser(null, null, null, null);
List<Capability> capabilities = manifestParser.getCapabilities();
for (Capability capability : capabilities) {
capability.toString();
}
Why is this?
See How to compile mavenized OSGi 4.3 bundle with OpenJDK 7?
Because of the OSGi classes in that Felix jar, you cannot compile against it with Java 7.
Are you sure there is no classpath problem, such as different versions of the same class on the classpath? This could happen if you have two versions of the same class: one for Java 1.4 returning a raw List and one for Java 5+ returning List<Capability>.
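The symptom ("required Capability, found Object") also matches how javac handles raw types: once a type is used raw, generics are erased from all of its members, even members whose signatures don't mention the type parameter. A minimal stdlib illustration with a hypothetical Parser class (a stand-in, not the Felix ManifestParser):

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical generic class; note the return type of getNames()
// does not even mention the type parameter T.
class Parser<T> {
    public List<String> getNames() {
        return Arrays.asList("a", "b");
    }
}

public class RawErasureDemo {
    public static void main(String[] args) {
        Parser raw = new Parser();    // raw type: generics erased on ALL members
        // for (String s : raw.getNames()) {}  // would not compile:
        //                                     // "required String, found Object"
        for (Object o : raw.getNames()) {      // only Object is available
            System.out.println(o);
        }
    }
}
```

So if JDK 7 ends up treating ManifestParser as raw (for example because something it references cannot be resolved), getCapabilities() would degrade to a raw List exactly as the error shows, while assigning to an explicit List<Capability> variable papers over it with an unchecked conversion.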