Upgraded to Cassandra 1.1.0 and could not find CassandraStorage() - java

This issue seems to have been resolved in the latest Cassandra 1.1.2 but I shall leave it here for those still on 1.1.0...
I have just upgraded to Cassandra 1.1.0, compiling it from source, and now I cannot find the CassandraStorage() class anywhere, and the contrib directory has disappeared!
Where can I find it? This class is essential to running Pig and Hadoop against Cassandra.
Also, executing the pig_cassandra script from the examples directory produced an error saying that CassandraStorage() could not be found. Is there a way around this?
Thanks!

Pig support moved out of contrib/ and into the main tree (the pig_cassandra script you found lives under examples/), and the class is still there. Its fully qualified name is:
org.apache.cassandra.hadoop.pig.CassandraStorage
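If you want to confirm that the class is actually reachable, a quick check like the following (my own sketch, not from the original post; run it with the same classpath you pass to Pig) fails fast with ClassNotFoundException when the jar is missing:

public class FindCassandraStorage {
    public static void main(String[] args) throws Exception {
        Class<?> cs = Class.forName("org.apache.cassandra.hadoop.pig.CassandraStorage");
        // Print the jar the class was actually loaded from.
        System.out.println(cs.getProtectionDomain().getCodeSource().getLocation());
    }
}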

Related

Scala error in the logs after deploying the project on tomcat

I don't use Scala anywhere in my project, yet I'm still getting this error. I can't find the root cause.
Kafka uses Scala internally; however, ZooKeeper is no longer used for consumers, so I suggest you try a newer version of spring-kafka.

@Transactional not an annotation

I am trying to rebuild a Grails application using GGTS 3.4.0 with Groovy 2.1.8, Grails 2.3.3 and Java 1.7.
In the service files I'm getting this error message:
Groovy:class grails.transaction.Transactional is not an annotation in @grails.transaction.Transactional
I believe that one of the components listed above is causing this problem. Can someone suggest which one, and which version I need?
Incidentally, I've successfully built this in the past with different versions of the various components, but I'm not sure what they were. I do have backups, so if you can suggest where I can find details of those builds, that would help.
I would welcome any suggestions.
-mike

java.lang.VerifyError: Expecting a stackmap frame at branch target 29

Exception Details:
  Location:
    com/sonicsw/mf/comm/jms/ConnectorClient.setRequestTimeout(J)V @3: ifnonnull
  Reason:
    Expected stackmap frame at this location.
at com.sonicsw.jndi.mfcontext.MFContext.<init>(MFContext.java:101)
at com.sonicsw.jndi.mfcontext.MFContextFac
Can anyone resolve this issue? I have googled it for over a week and tried every possible alternative. I used -XX:-UseSplitVerifier, but that did not work either. With the -noverify option it works fine. The Java version is 1.7.0_51. Everything worked fine with the 7.6 Sonic libraries; we recently upgraded them from 7.6 to 2015, and we have been getting this error ever since.
I faced the same challenge when migrating my application from 1.6 to 1.7.
After a long struggle, we found a fix for this issue.
Approach 1: Use the -XX:-UseSplitVerifier JVM argument, which falls back to the older type-inference verifier that does not require stack map frames. This resolves the issue without your needing to upgrade any library files; see the example invocation below.
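For example (an illustrative command; the jar name here is hypothetical):

java -XX:-UseSplitVerifier -jar yourapp.jar

Note that this flag only exists on Java 7; it was removed in Java 8, where stack map frames are mandatory.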
Approach 2: I followed the steps below to overcome the issue.
Step 1: Identify and keep a list of external libraries consumed by your application.
Step 2: Once you have the list, remove the external libraries one at a time and plug in upgraded versions; this helps you isolate the library that is causing the issue (if you use Maven, see the sketch below).
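If the application happens to be built with Maven (an assumption on my part; other build tools have equivalents), Step 1 is quicker with the dependency plugin, which prints every external library the application consumes:

mvn dependency:tree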
In my case, the j2ee.jar and openjpa-1.2.2 jar files were the problem, and upgrading those two libraries resolved the migration issues.
It is a slow and painful process to figure out which library is causing the issue, but it works.
I hope this information is useful; it is based on my own real-world experience.

Classpath Issue

I am trying to run a map/reduce job and I am getting a java.lang.NoSuchMethodError. From some research, this error appears when my code is executed (not compiled): the correct version of the class and its methods are present at compile time, but at run time the correct method is not available. The jar causing this is Guava; I know this from the stack trace that is printed. The error is thrown when trying to execute the following line of code:
ArrayDeque<Entry<String, String>> a = Queues.newArrayDeque();
This jar is part of the Hadoop classpath because it comes with the CDH version 5.3.0 that I am using. I have tried adding the correct version of Guava to the classpath, but the error does not change. My questions are as follows:
I believe that I have correctly identified the issue. Does this seem reasonable to you? I have never come across this error before.
I believe that I need to remove the older version of Guava from the classpath and add the new one. However, I really do not know where to begin. The command that is issued to hadoop jar does not contain the older version of Guava (in the -libjars parameter), yet the jar is part of the Hadoop classpath when I issue the command "hadoop classpath". So I am assuming that there is some Hadoop config file I could edit to make this go away. Is that the correct way to go, or is there something else I need to do?
I am using Java 7, CDH 5.3.0, NetBeans 8.
TIA
At the time that I'm writing this, Hadoop has a dependency on Guava version 11.0.2. It uses the library pretty heavily in its internal implementation.
According to the Guava JavaDocs, the Queues#newArrayDeque method was added in version 12.0. If your code is compiling successfully, then that means that Guava version 12.0 or higher is available on your compilation classpath at build time, but since version 11.0.2 is supplied at runtime by Hadoop, the method doesn't exist, resulting in NoSuchMethodError.
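One way to confirm which Guava wins at runtime (my own diagnostic, not part of the original answer; the class name WhichGuava is made up) is to ask the JVM where the class came from. Run inside a Hadoop task, this should point at Hadoop's bundled Guava 11.0.2:

import com.google.common.collect.Queues;

public class WhichGuava {
    public static void main(String[] args) {
        // Print the jar that Queues was actually loaded from.
        System.out.println(Queues.class.getProtectionDomain()
                .getCodeSource().getLocation());
    }
}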
Unfortunately, there is no reliable way to swap out a different Guava version in Hadoop. Specifically, I recommend that you do not attempt to replace the Guava 11.0.2 jar that ships in the Hadoop distro. Replacing this with a different Guava version is untested, and it would risk destabilizing the cluster.
The broader problem is that Hadoop's dependencies "leak" to its clients. HADOOP-11656 is an unimplemented feature request that would isolate Hadoop's internal dependencies away from clients, so that you could more easily use common libraries like Guava at your desired version. Meanwhile, until that feature is implemented, I think your only options are to stick to Guava 11.0.2 APIs, or possibly try inlining some of the Guava code that you really want into your own project directly. The code for Queues#newArrayDeque is visible on GitHub.
public static <E> ArrayDeque<E> newArrayDeque() {
    return new ArrayDeque<E>();
}
In this particular case, it looks like it will be easy to replace your code with a direct call to the java.util.ArrayDeque constructor. Thanks to the Java 7 diamond operator, it won't even be much more verbose.
ArrayDeque<Entry<String, String>> a = new java.util.ArrayDeque<>();

Neo4j won't start on Arch Linux - "java.lang.NoSuchMethodError: org.slf4j.spi.LocationAwareLogger.log"

I'm trying to install Neo4j on an Arch Linux machine, but I have run into trouble. This is the error message I get: http://pastie.org/8646079.
I have tried following the installation instructions for Linux in the manual, and I have tried installing the package from the AUR (unofficial Arch Linux packages), but both give the same error.
I think it might be related to conflicting versions of slf4j, but I'm not really sure, so here are all the hits I get when searching for slf4j in my filesystem: http://pastie.org/8646086
If anybody knows what is wrong and how to fix it, I would be really happy!
Edit: Fixed. I uninstalled the JDK, removed /opt/java, and reinstalled. It seems that a copy of slf4j had been installed to /opt/java but not removed properly.
Perhaps you can start Neo4j with -verbose:class added to the startup script, to see which classes are loaded from where? The output should then appear in /path/to/neo4j/log/console.log.
The Neo4j installation should only load files from /path/to/neo4j/lib and /path/to/neo4j/system/lib.
If slf4j is loaded from somewhere else, we have to figure out how it gets onto the classpath for Neo4j.
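For reference, -verbose:class prints one line per loaded class in roughly this form (the path below is made up for illustration), so grepping the log for LocationAwareLogger should reveal the stray jar immediately:

[Loaded org.slf4j.spi.LocationAwareLogger from file:/opt/java/lib/slf4j-api.jar]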
