I have an Ubuntu machine with Java 11 installed. My project consists of integrating a Hadoop cluster with Apache Drill and Apache Superset. I initially got Apache Drill running with Java 11. Then, when configuring the Hadoop cluster, the documentation stated that Hadoop only supports Java 8, so I downloaded Java 8 from Oracle and changed JAVA_HOME to point to the Java 8 installation.
After that Hadoop worked fine, but going back to Drill, it no longer worked. I assume this is due to the two Java versions conflicting. I checked out guides like https://novicestuffs.wordpress.com/2017/04/25/how-to-uninstall-java-from-linux/ only to realize that following them would remove both Java 8 and Java 11 from my machine.
So, is there a way I can permanently remove Java 11 but keep Java 8 on my system?
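For reference, a quick way to confirm which runtime a given java binary actually resolves (a minimal sketch; the class name is just for illustration, and it should be compiled and run with whatever java Drill or Hadoop ends up invoking):

// VersionCheck.java - prints the version and install path of the JVM that runs it
public class VersionCheck {
    public static void main(String[] args) {
        System.out.println("java.version = " + System.getProperty("java.version"));
        System.out.println("java.home    = " + System.getProperty("java.home"));
    }
}

If both tools report the version you expect, the conflict is more likely in each tool's own startup scripts than in the system-wide installation.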
I am trying to upgrade sshd-core 0.14.0 and sshd-sftp 0.11.0 to version 2.8.0.
Unfortunately there are a lot of implementation changes, and I could not find a migration guide for versions before 2.0.0 (it would be great if someone could point me to a migration guide from 0.11.0/0.14.0 to 2.0.0).
The main problem is the removal of FileSystemView and SshFile: the previous version was file-system independent, and we relied on being able to expose '/' as the root directory on both Windows and Unix.
Now, when I run the SFTP server on Windows, the file system always redirects me to 'C:\'.
I would be extremely grateful for some ideas on how I can migrate to the new version given the file system problem.
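For reference, a minimal sketch of how the 2.x API appears to handle this, assuming sshd-core 2.8.0 on the classpath: the FileSystemView/SshFile pair is replaced by a FileSystemFactory, and VirtualFileSystemFactory pins the root a client sees to an arbitrary local directory (the path below is illustrative).

import java.nio.file.Paths;

import org.apache.sshd.common.file.virtualfs.VirtualFileSystemFactory;
import org.apache.sshd.server.SshServer;

public class VfsSketch {
    public static void main(String[] args) throws Exception {
        SshServer sshd = SshServer.setUpDefaultServer();
        sshd.setPort(2222);
        // VirtualFileSystemFactory confines every session to the given directory,
        // so SFTP clients see it as '/' regardless of the underlying OS.
        // The Windows path below is purely illustrative.
        VirtualFileSystemFactory vfs = new VirtualFileSystemFactory(Paths.get("D:/sftp-root"));
        sshd.setFileSystemFactory(vfs);
        // Host key, authenticators and the SftpSubsystemFactory from the sshd-sftp
        // module still need to be configured before sshd.start(); omitted here.
    }
}

The idea is that the virtual root hides the Windows drive letter entirely, which seems to be the closest equivalent to what FileSystemView provided before.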
I regularly use MALLET for topic modeling in the classes that I teach. Running MALLET requires users to have the Java Development Kit installed. I currently have JDK 8 update 241 installed on my main computer, and I know that MALLET works properly in this environment. That said, the JDK is now up to version 14.
Which version(s) of JDK does MALLET support?
I'm not altogether sure that you do need the JDK. They never say that on the website. The tarfile that I downloaded already includes compiled classes - you aren't expected to build it from source - so the JRE should be enough.
Strangely enough, the compiled classes in the class directory are targeted at 1.7 (bytecode version 51) whereas the pom indicates that it's supposed to target Java 1.6. So it's quite probable that by rebuilding it you could support an older version of Java.
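If you want to confirm the target level yourself, the class-file major version is stored in bytes 6-7 of each .class file (50 = Java 6, 51 = Java 7, 52 = Java 8). A small sketch (the class file to inspect is passed as an argument; pick any compiled class from the MALLET distribution):

import java.io.DataInputStream;
import java.io.FileInputStream;

public class ClassVersion {
    public static void main(String[] args) throws Exception {
        try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
            int magic = in.readInt();            // should be 0xCAFEBABE
            int minor = in.readUnsignedShort();
            int major = in.readUnsignedShort();  // 50 = Java 6, 51 = Java 7, 52 = Java 8
            System.out.printf("magic=0x%X, major=%d, minor=%d%n", magic, major, minor);
        }
    }
}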
In any case, the JDK is backwards compatible by design. Any version from 7 onwards will be able to run it (6+ if you were to rebuild it).
Running it on a newer version will benefit from the new features of the JDK, such as improvements to the garbage collector, so you may see some performance improvement there. If you are not concerned about that then it doesn't matter.
We've updated our build server (Atlassian Bamboo) to Java 8 (JDK).
Since then our integration tests have been failing because the product we start does not open any port.
We build with Maven, and as part of the integration test we start the freshly built product. The product is a REST API running on OSGi (Equinox) and Jetty.
I have tried a lot of things, but nothing gets the product to start properly in the Maven build.
When I log in to the remote machine and start the product manually, everything works fine.
Some more information:
Our build server runs as a Windows service and our product is written in plain Java.
Presumably you are affected by one or more of the issues discussed in Custom AMIs will not start anymore in Bamboo Cloud (BAM-16291), notably that Bamboo is not compatible with JDK8u60 yet:
Joda-time, one of the libraries used by Bamboo is not compatible with 8u60. We've fixed this problem, but the fix has not been rolled out yet. Known breakages include S3 interaction and CodeDeploy plugin.
Most/All participants got things working again by downgrading to JDK8u45, as also recommended in Atlassian's most recent update:
Use JDK 8u45. The latest JDKs are incompatible with some 3rd party libraries we're using.
Try to match the layout and scripts of our stock images as closely as possible. This will make it easier for us to provide help if anything goes wrong.
Choose Oracle if you have the choice between Oracle and OpenJDK flavor of JDK.
I am developing an application for data syncing between Hive and Teradata.
For this I am using Sqoop in embedded mode, i.e. I have added the Sqoop jar to the classpath and call Sqoop.runTool(..) to execute the operation.
However, Eclipse marks it as deprecated. I am using version 1.4.2 and could not find any information on this.
I'm currently using it anyway, but it would be good to know why it is deprecated and what should be used instead.
What precise class are you using? There are currently two Sqoop classes:
com.cloudera.sqoop.Sqoop - this is deprecated
org.apache.sqoop.Sqoop - this is the right one to use
Sqoop was historically developed mainly at Cloudera and was moved to Apache later (during version 1.3.0). During incubation (before version 1.4.0) we moved all the functionality from the cloudera namespace into the apache namespace. We kept classes in the cloudera namespace for backward compatibility, but marked them as deprecated.
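For illustration, switching to the Apache namespace is typically just a change of import; runTool is invoked the same way. A minimal sketch (the connect string, table and target directory are placeholders):

import org.apache.sqoop.Sqoop;

public class EmbeddedSqoopExample {
    public static void main(String[] args) {
        // Same embedded invocation as before, but via the non-deprecated class.
        String[] importArgs = {
            "import",
            "--connect", "jdbc:teradata://example-host/DATABASE=demo",  // placeholder
            "--table", "MY_TABLE",                                      // placeholder
            "--target-dir", "/user/hive/warehouse/my_table"             // placeholder
        };
        int exitCode = Sqoop.runTool(importArgs);
        System.out.println("Sqoop exit code: " + exitCode);
    }
}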
Jarcec
This question is more of an Eclipse question, but those who have worked with Hadoop may know the answer.
Last year, I wrote some Hadoop MapReduce jobs in Eclipse. To access the libraries and packages I needed to compile the code, I installed an Eclipse plug-in from the Apache website (http://wiki.apache.org/hadoop/EclipsePlugIn). Unfortunately, that plug-in isn't compatible with the latest version of Hadoop (the 2.0.0-alpha version). I was wondering whether anyone has written Hadoop jobs within the past few months and knows how I can get Eclipse to recognize the most recent Hadoop (2.0.0-alpha) packages.
The plug-in still works for Hadoop 0.20.203.0, but I was hoping to be able to use the newest version.
I'm running Eclipse on Windows. I tried right-clicking the name of my project, going to Properties, then to Java Build Path, and finally selecting Add External JARs.
I added all the JARs from the hadoop-2.0.0-alpha/share/hadoop directory and its subdirectories. At first I thought this had worked, but when I tried to use methods and constructors unique to the Hadoop 2.0.0-alpha API, the code did not compile. The libraries it recognizes are definitely more recent than those from Hadoop 0.20.203.0, but not as recent as the current alpha version of Hadoop.
Does anyone know how I can fix this problem?
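For reference, one way to see which Hadoop jars the build path is actually resolving is to print the version reported by org.apache.hadoop.util.VersionInfo, which exists in both the 0.20.x and 2.x lines. A minimal sketch:

import org.apache.hadoop.util.VersionInfo;

public class HadoopVersionCheck {
    public static void main(String[] args) {
        // Reports the version baked into whichever Hadoop common/core jar is on the classpath.
        System.out.println("Hadoop version: " + VersionInfo.getVersion());
        System.out.println("Build details : " + VersionInfo.getBuildVersion());
    }
}

If this prints something other than 2.0.0-alpha, an older Hadoop jar is probably shadowing the new ones somewhere on the build path.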