I am trying to use the WEKA library from Python 3 (the language I use at work) via python-weka-wrapper. When I try to run the M5P classifier, I get the following exception:
JavaException: no/uib/cipr/matrix/Matrix
I have seen some posts about downloading some jar files, but I am not sure how that would solve my problem. Any clue?
Thanks
Several jars were added as-is inside weka.jar instead of getting expanded. Release 0.1.3 fixes that: https://github.com/fracpete/python-weka-wrapper3/releases/tag/v0.1.3
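If you installed the wrapper through pip, upgrading should pull in the fixed release (assuming the Python 3 package is named python-weka-wrapper3, as in the link above):

pip install --upgrade python-weka-wrapper3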
I followed all the steps in https://cloud.google.com/vision/docs/ocr, and when I execute the code I get the following exception.
Full Stacktrace:
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;CLjava/lang/Object;)V
at io.grpc.Metadata$Key.validateName(Metadata.java:742)
at io.grpc.Metadata$Key.<init>(Metadata.java:750)
at io.grpc.Metadata$Key.<init>(Metadata.java:668)
at io.grpc.Metadata$AsciiKey.<init>(Metadata.java:959)
at io.grpc.Metadata$AsciiKey.<init>(Metadata.java:954)
at io.grpc.Metadata$Key.of(Metadata.java:705)
at io.grpc.Metadata$Key.of(Metadata.java:701)
at com.google.api.gax.grpc.GrpcHeaderInterceptor.<init>(GrpcHeaderInterceptor.java:60)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:228)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1500(InstantiatingGrpcChannelProvider.java:71)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:202)
at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:209)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:192)
at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:155)
at com.google.cloud.vision.v1.stub.GrpcImageAnnotatorStub.create(GrpcImageAnnotatorStub.java:117)
at com.google.cloud.vision.v1.stub.ImageAnnotatorStubSettings.createStub(ImageAnnotatorStubSettings.java:156)
at com.google.cloud.vision.v1.ImageAnnotatorClient.<init>(ImageAnnotatorClient.java:136)
at com.google.cloud.vision.v1.ImageAnnotatorClient.create(ImageAnnotatorClient.java:117)
at com.google.cloud.vision.v1.ImageAnnotatorClient.create(ImageAnnotatorClient.java:108)
I tried using different Guava (22.0 and 23.6) and HttpCore (5.0 and 4.4.8) versions than the ones already used by the Google Cloud Platform Libraries (28.2 and 4.4.12), but had no luck.
I'm using Eclipse and followed these steps: https://cloud.google.com/eclipse/docs/libraries
As you noted, this kind of error typically indicates a version mismatch. You haven't said whether you're using Maven or the native Cloud Tools for Eclipse project. I'd recommend a Maven setup so that you can take advantage of the Cloud Libraries BOM, which should eliminate these version mismatches.
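For reference, importing the BOM in a pom.xml looks roughly like this (the version number is illustrative; use the latest release):

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>libraries-bom</artifactId>
      <version>4.2.0</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

<dependencies>
  <dependency>
    <!-- No version element: the BOM pins a mutually compatible one -->
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-vision</artifactId>
  </dependency>
</dependencies>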
That stack trace indicates your classpath includes an old version of Guava, or the OCR client, or both. Without a complete project to inspect, I can't say exactly how you're getting that old version (it depends on how you've configured the project), but that is definitely what's happening.
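One quick way to confirm is to ask Maven where Guava comes from; for example:

mvn dependency:tree -Dincludes=com.google.guava

lists every dependency path that drags in a Guava version, so you can spot the stale one.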
OK, I got this running by cloning the whole java-vision project from GitHub: https://github.com/googleapis/java-vision.
However, I still had to add some extra jars, since there were classpath errors.
The Open IE example from http://nlp.stanford.edu/software/openie.shtml generates a compile-time error:
SemanticGraphCoreAnnotations.EnhancedDependenciesAnnotation cannot be resolved to a type
I am running Java 8 under Eclipse 2016. corenlp-full-2015-12-01.zip, openie.jar, and openie-models.jar are included in my Eclipse project.
The error is generated by this line:
System.out.println(sentence.get(SemanticGraphCoreAnnotations.EnhancedDependenciesAnnotation.class).toString(SemanticGraph.OutputFormat.LIST));
Thanks
If I were to guess, this sounds like a classpath issue. What happens if you remove either openie.jar or the corenlp distribution? In theory, openie.jar should contain everything you need to run the Open IE system.
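For what it's worth, if I remember right, EnhancedDependenciesAnnotation was only added around CoreNLP 3.7.0, so the corenlp-full-2015-12-01 distribution predating it would also explain the compile error. A minimal sketch against a newer CoreNLP (the class name and sentence text are illustrative):

import java.util.Properties;
import edu.stanford.nlp.ling.CoreAnnotations;
import edu.stanford.nlp.pipeline.Annotation;
import edu.stanford.nlp.pipeline.StanfordCoreNLP;
import edu.stanford.nlp.semgraph.SemanticGraph;
import edu.stanford.nlp.semgraph.SemanticGraphCoreAnnotations;
import edu.stanford.nlp.util.CoreMap;

public class OpenIECheck {
    public static void main(String[] args) {
        // Annotators needed for Open IE and enhanced dependencies
        Properties props = new Properties();
        props.setProperty("annotators", "tokenize,ssplit,pos,lemma,depparse,natlog,openie");
        StanfordCoreNLP pipeline = new StanfordCoreNLP(props);

        Annotation doc = new Annotation("Obama was born in Hawaii.");
        pipeline.annotate(doc);

        for (CoreMap sentence : doc.get(CoreAnnotations.SentencesAnnotation.class)) {
            // This is the annotation the question's line fails to resolve
            SemanticGraph graph =
                sentence.get(SemanticGraphCoreAnnotations.EnhancedDependenciesAnnotation.class);
            System.out.println(graph.toString(SemanticGraph.OutputFormat.LIST));
        }
    }
}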
I have the same error.
What does the package edu.stanford.nlp.naturalli; declaration in the first line of the code snippet mean?
I do not have much Java knowledge; I just created a Java project somewhere and included the same files as WH Sweet did.
I just found that within the CoreNLP project, in the subfolder naturalli, there is already an OpenIEDemo.java. Maybe the explanation for this would solve the initial problem.
I'm currently getting this error:
java.lang.NoSuchMethodError: org.json.JSONObject.keySet()Ljava/util/Set;
at ee.ut.cs.Parser.accessLint(Parser.java:39)
I have tried cleaning the project, to no avail.
I suspect I have an error in src/plugin/parse-htmlraw/build.xml while creating the jar file, but I'm not certain. I understand that this error means the method does not exist at runtime; yet the object is created, which means the class is there, just not that method. I decompiled the .class file in the created jar and it has the necessary methods.
Code is available at https://github.com/jaansusi/WCAGgrader
Q: What is wrong with the build that produces this error?
The problem is that even though I put the necessary class files in the jar I create, they are not linked correctly, and the class that's called in the jar can't locate methods inside the other classes. The JSONObject instance is created, but the methods inside the JSONObject class can't be found.
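A quick way to see which jar JSONObject is actually loaded from at runtime is a throwaway diagnostic line like this; if it prints a jar other than the one you built, that duplicate is what shadows your keySet():

System.out.println(org.json.JSONObject.class
        .getProtectionDomain().getCodeSource().getLocation());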
If you cannot find the problematic version, there is a possibility you are getting it transitively (especially if you are using Spring) from the following dependency:

<groupId>com.vaadin.external.google</groupId>
<artifactId>android-json</artifactId>

Excluding it worked for me.
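In case it helps, the usual path for this artifact is spring-boot-starter-test; assuming that is where it comes from in your pom, the exclusion looks like this:

<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-test</artifactId>
  <scope>test</scope>
  <exclusions>
    <exclusion>
      <groupId>com.vaadin.external.google</groupId>
      <artifactId>android-json</artifactId>
    </exclusion>
  </exclusions>
</dependency>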
An easy way to analyze dependencies is the Maven Helper plugin in IntelliJ.
Check which version you have used.
There might be a case where two different versions are on the classpath, which in turn causes this error.
Look in your local Maven repository (com/google/code/gson/gson) and see whether there are two or more versions of the json artifact. If so, delete the old one, and remember to check every other place in the project where the old version of the dependency is pulled in; changing those to the new version solved this problem for me.
I've installed Flume and Hadoop manually (I mean, not CDH) and I'm trying to run the twitter example from Cloudera.
In the apache-flume-1.5.0-SNAPSHOT-bin directory, I start the agent with the following command:
bin/flume-ng agent -c conf -f conf/twitter.conf -Dflume.root.logger=DEBUG,console -n TwitterAgent
My conf/twitter.conf file uses the logger as the sink. conf/flume-env.sh adds flume-sources-1.0-SNAPSHOT.jar, which contains the definition of the Twitter source, to the CLASSPATH. The resulting output is:
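For reference, the relevant line in conf/flume-env.sh looks something like this (the path is illustrative):

FLUME_CLASSPATH="/path/to/flume-sources-1.0-SNAPSHOT.jar"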
(...) [ERROR org.apache.flume.lifecycle.LifecycleSupervisor$MonitorRunnable.run(LifecycleSupervisor.java:253)] Unable to start EventDrivenSourceRunner: { source:com.cloudera.flume.source.TwitterSource{name:Twitter,state:IDLE} } - Exception follows. java.lang.NoSuchMethodError:
twitter4j.FilterQuery.setIncludeEntities(Z)Ltwitter4j/FilterQuery;
at com.cloudera.flume.source.TwitterSource.start(TwitterSource.java:139)
The conflict results from a FilterQuery class that is defined elsewhere in the Flume lib directory and does not contain the setIncludeEntities method. For me, the file that contains this class is twitter4j-stream-3.0.3.jar, and I cannot exclude that file from the classpath as suggested here.
I believe this experience was quite frustrating for you; for me it was for sure. The main problem is that both files, flume-sources-1.0-SNAPSHOT.jar and twitter4j-stream-3.0.3.jar, contain the same FilterQuery.class. That is why the conflict message is generated in the log file.
I am not a Java or Big Data expert, but I can offer an alternative. Download twitter4j-stream-2.2.6.jar (or a lower version) from here and replace twitter4j-stream-3.0.3.jar with it. All the 3.x.x versions use this class. After replacing it, everything should work fine, although you may get a heap error after downloading a huge number of tweets; please search for the solution, as it was resolved in the 3.x.x releases.
Edit:
Also, please don't forget to download and replace all the twitter4j files in the /usr/lib/flume-ng folder, namely twitter4j-media-support-2.2.6.jar, twitter4j-stream-2.2.6.jar, and twitter4j-core-2.2.6.jar. Any version mismatch among these files will also create problems.
As suggested in the post, search-contrib-1.0.0-jar-with-dependencies.jar can be a problematic file too.
You need to recompile flume-sources-1.0-SNAPSHOT.jar from Git: https://github.com/cloudera/cdh-twitter-example
Install Maven, then download the repository of cdh-twitter-example.
Unzip it, then execute inside (as mentioned):
$ cd flume-sources
$ mvn package
$ cd ..
This problem happened when the twitter4j version was updated from 2.2.6 to 3.x: they removed the method setIncludeEntities, and the prebuilt JAR is not up to date.
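To illustrate the mismatch: code compiled against the 2.2.6 jars calls a method that 3.x removed, so it compiles fine but fails at runtime. A sketch (the track keywords are illustrative):

import twitter4j.FilterQuery;

public class FilterQueryCheck {
    public static void main(String[] args) {
        // Compiles against the twitter4j 2.2.6 jars, where the method exists;
        // its signature matches the one in the error:
        //   setIncludeEntities(Z)Ltwitter4j/FilterQuery;
        FilterQuery query = new FilterQuery();
        query.setIncludeEntities(true);
        query.track(new String[] { "flume", "hadoop" });
        System.out.println(query);
        // Run this with twitter4j-stream-3.0.3.jar on the classpath instead
        // and the setIncludeEntities call throws NoSuchMethodError, because
        // 3.x dropped the method (entities are always included there).
    }
}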
PS: Do not download the prebuilt version; it is still the old one.
Simply rename all twitter4j-stream* jar files and rerun your Flume agent. It will work like a charm. :)
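If there are several of them, something like this does the renaming (the lib path is illustrative; double-check yours first):

cd /usr/lib/flume-ng/lib
for f in twitter4j-stream*.jar; do mv "$f" "${f}x"; done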
I had the same problem, and in the end I solved it by following these steps:
First I renamed all the jar files to .jarx: twitter4j-stream-3.0.3.jar -> twitter4j-stream-3.0.3.jarx, and so on.
This solved the error, but when it tried to establish the connection, I got a 404 error:
(Twitter Stream consumer-1[Establishing connection])
[INFO - twitter4j.internal.logging.SLF4JLogger.info(SLF4JLogger.java:83)] 404:
The URI requested is invalid or the resource requested, such as a user, does not exist.)
After reading this page (https://twittercommunity.com/t/twitter-streaming-api-not-working-with-twitter4j-and-apache-flume/66612/11) I finally solved it by downloading a new version of twitter4j (there's a link on that page).
Probably not the best solution, but it worked for me.
I am taking input from the web, which is an XML file, and converting it into JSON data using the json-lib library. I have created a user library and added the following jars to it:
json-lib-2.3-jdk15.jar
commons-collections.jar
commons-lang.jar
commons-logging.jar
commons-beanutils.jar
ezmorph-1.0.6.jar
xom-1.1.jar
But it still gives the following error:
08-04 13:58:31.642: ERROR/dalvikvm(484): Could not find class 'net.sf.json.xml.XMLSerializer$CustomElement', referenced from method net.sf.json.xml.XMLSerializer.addNameSpaceToElement
Can anyone help me out in resolving this issue?
Either you have an SDK level / JDK level conflict; I mean Dalvik can't load the bytecode of the CustomElement class of your library because it was compiled with features too recent for your SDK, such as annotations.
Or there is a conflicting json-lib library in one of your other jars or lib folders.
(The first three comments are not relevant; the $ is just the way inner classes are compiled.)
Regards,
Stéphane
Since Android already supports JSON via org.json, a different JSON library may conflict. (You can download the jar here.)
Try to use this library on Android instead of an external one.
BTW: you can also use this library in any Java code if you need it (not only on Android).
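If you go that route and still need the XML-to-JSON conversion, note that Android's bundled org.json does not include the XML helper class; with the full org.json jar on the classpath, a minimal sketch (the sample XML is illustrative) would be:

import org.json.JSONObject;
import org.json.XML;

public class XmlToJsonDemo {
    public static void main(String[] args) throws Exception {
        String xml = "<person><name>Ada</name><age>36</age></person>";
        // XML.toJSONObject comes from the full org.json jar,
        // not from Android's built-in org.json package
        JSONObject json = XML.toJSONObject(xml);
        System.out.println(json.toString(2)); // pretty-print with indent 2
    }
}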