I am trying to connect to HBase using Java (in Eclipse) on the Cloudera VM, but I am getting the error below. I am able to run the same program from the command line (by packaging the program into a jar).
My Java program:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;
//import org.apache.hadoop.mapred.MapTask;

import java.io.FileWriter;
import java.io.IOException;

public class HbaseConnection {
    public static void main(String[] args) throws IOException {
        Configuration config = HBaseConfiguration.create();
        config.addResource("/usr/lib/hbase/conf/hbase-site.xml");
        HTable table = new HTable(config, "test_table");
        byte[] columnFamily = Bytes.toBytes("colf");
        byte[] idColumnName = Bytes.toBytes("id");
        byte[] groupIdColumnName = Bytes.toBytes("g_id");
        Put put = new Put(Bytes.toBytes("testkey"));
        put.add(columnFamily, idColumnName, Bytes.toBytes("test id"));
        put.add(columnFamily, groupIdColumnName, Bytes.toBytes("test group id"));
        table.put(put);
        table.close();
    }
}
I have also kept hbase-site.xml in the source folder in Eclipse.
hbase-site.xml
<property>
<name>hbase.rest.port</name>
<value>8070</value>
<description>The port for the HBase REST server.</description>
</property>
<property>
<name>hbase.cluster.distributed</name>
<value>true</value>
</property>
<property>
<name>hbase.rootdir</name>
<value>hdfs://quickstart.cloudera:8020/hbase</value>
</property>
<property>
<name>hbase.regionserver.ipc.address</name>
<value>0.0.0.0</value>
</property>
<property>
<name>hbase.master.ipc.address</name>
<value>0.0.0.0</value>
</property>
<property>
<name>hbase.thrift.info.bindAddress</name>
<value>0.0.0.0</value>
</property>
And I am getting the error below while running the program in Eclipse:
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:389)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:366)
at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:247)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:188)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:150)
at com.aig.gds.hadoop.platform.idgen.hbase.HBaseTest.main(HBaseTest.java:34)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:387)
... 5 more
Caused by: java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.hdfs.DistributedFileSystem could not be instantiated
at java.util.ServiceLoader.fail(ServiceLoader.java:224)
at java.util.ServiceLoader.access$100(ServiceLoader.java:181)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:377)
at java.util.ServiceLoader$1.next(ServiceLoader.java:445)
at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2400)
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2411)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2428)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:88)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2467)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2449)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:367)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:287)
at org.apache.hadoop.hbase.util.DynamicClassLoader.<init>(DynamicClassLoader.java:104)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:197)
at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:64)
at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:69)
at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:801)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:633)
... 10 more
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.addDeprecations([Lorg/apache/hadoop/conf/Configuration$DeprecationDelta;)V
at org.apache.hadoop.hdfs.HdfsConfiguration.addDeprecatedKeys(HdfsConfiguration.java:66)
at org.apache.hadoop.hdfs.HdfsConfiguration.<clinit>(HdfsConfiguration.java:31)
at org.apache.hadoop.hdfs.DistributedFileSystem.<clinit>(DistributedFileSystem.java:114)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at java.lang.Class.newInstance(Class.java:374)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:373)
... 26 more
Thanks in advance.
The root cause of your problem is in the stack trace:
NoSuchMethodError: org.apache.hadoop.conf.Configuration.addDeprecations
This means that your hadoop-common-* jar version is not in sync with your hadoop-hdfs-* jar version, or that you have a mix of different versions on your classpath.
Note that addDeprecations is present in Hadoop 2.3.0 and later:
https://hadoop.apache.org/docs/r2.3.0/api/org/apache/hadoop/conf/Configuration.html
but was missing in 2.2.0 and prior:
https://hadoop.apache.org/docs/r2.2.0/api/org/apache/hadoop/conf/Configuration.html
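As an illustration, one common way to avoid the mix is to pin hadoop-common and hadoop-hdfs to the same version in the project's pom.xml; the 2.3.0 below is only a placeholder, use whichever version your cluster actually ships:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.3.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.3.0</version>
</dependency>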
Related
Hi, I am a beginner with Hadoop.
I just installed Hive 2.3.7 and set up the metastore with MySQL
according to this tutorial https://www.guru99.com/hive-metastore-configuration-mysql.html
and this one https://ravi-chamarthy.medium.com/apache-hive-configuration-with-mysql-metastore-3ecb9a0df3a1.
Here is my hdfs-site.xml file
<configuration>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://localhost/metastore?createDatabaseIfNotExist=true</value>
<description>metadata is stored in a MySQL server</description>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.cj.jdbc.Driver</value>
<description>MySQL JDBC driver class</description>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hiveuser</value>
<description>user name for connecting to mysql server</description>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>hivepassword</value>
<description>password for connecting to mysql server</description>
</property>
When I executed schematool -initSchema -dbType mysql, everything was fine; it initialized the Hive 2.3.0 schema.
But when I started Hive and executed show databases; (or any other command), I got these errors:
hive> show databases;
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is
generally unnecessary.
Exception in thread "main" java.lang.IllegalAccessError: tried to access method com.google.common.collect.Iterators.emptyIterator()Lcom/google/common/collect/UnmodifiableIterator; from class org.apache.hadoop.hive.ql.exec.FetchOperator
at org.apache.hadoop.hive.ql.exec.FetchOperator.<init>(FetchOperator.java:108)
at org.apache.hadoop.hive.ql.exec.FetchTask.initialize(FetchTask.java:87)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:541)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
Note: I am using MySQL 8.0.22, mysql-connector-java.jar / mysql-connector-java-8.0.22.jar,
Ubuntu 18.04, and Hadoop 3.1.4.
What version of Hadoop are you using? This might be caused by an incompatible Hadoop version: the IllegalAccessError on Iterators.emptyIterator() is the classic symptom of a Guava version conflict, since Hive 2.3.x ships an older Guava than the one bundled with Hadoop 3.x.
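As a quick diagnostic, you can print which Guava jar the conflicting class is loaded from when run with the same classpath Hive uses; a minimal sketch:

import com.google.common.collect.Iterators;

public class GuavaCheck {
    public static void main(String[] args) {
        // Prints the jar that the conflicting Guava class was loaded from
        System.out.println(Iterators.class.getProtectionDomain().getCodeSource().getLocation());
    }
}

A commonly reported workaround for this conflict is to replace the guava-*.jar under $HIVE_HOME/lib with the newer one shipped under $HADOOP_HOME/share/hadoop/common/lib (those are the usual default locations; adjust for your installation).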
I am trying to operate Sentry in our CDH cluster using its Java client API. Since I haven't found any related documentation (there is very little useful information on its official site), I have mostly been guessing. Now an exception has come up that I cannot solve, so I am also wondering whether there is any documentation on how to use Sentry's Java API.
Below is the code I have written:
package com.playground;

import org.apache.hadoop.conf.Configuration;
import org.apache.sentry.api.service.thrift.SentryPolicyServiceClient;
import org.apache.sentry.api.service.thrift.TSentryRole;
import org.apache.sentry.service.thrift.SentryServiceClientFactory;

import java.util.Set;

public class SentryClientTest {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        conf.addResource("sentry-site.xml");
        try {
            SentryPolicyServiceClient client = SentryServiceClientFactory.create(conf);
            Set<TSentryRole> roles = client.listAllRoles("myname");
            for (TSentryRole role : roles) {
                System.out.println(role);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
sentry-site.xml is copied from /etc/sentry/sentry-site.xml; its content is like this:
<?xml version="1.0" encoding="UTF-8"?>
<!--Autogenerated by Cloudera Manager-->
<configuration>
<property>
<name>sentry.service.server.principal</name>
<value>sentry/_HOST@SOME_REALM</value>
</property>
<property>
<name>sentry.service.security.mode</name>
<value>kerberos</value>
</property>
<property>
<name>sentry.service.client.server.rpc-address</name>
<value>some-hostname</value>
</property>
<property>
<name>sentry.service.client.server.rpc-port</name>
<value>8038</value>
</property>
<property>
<name>sentry.service.client.server.rpc-addresses</name>
<value>hostname1:8038,hostname2:8038</value>
</property>
</configuration>
The exception message:
org.apache.sentry.core.common.exception.SentryUserException: GSS initiate failed
at org.apache.sentry.core.common.transport.RetryClientInvocationHandler.connect(RetryClientInvocationHandler.java:166)
at org.apache.sentry.core.common.transport.RetryClientInvocationHandler.invokeImpl(RetryClientInvocationHandler.java:90)
at org.apache.sentry.core.common.transport.SentryClientInvocationHandler.invoke(SentryClientInvocationHandler.java:41)
at com.sun.proxy.$Proxy2.listAllRoles(Unknown Source)
at com.playground.SentryClientTest.main(SentryClientTest.java:17)
Caused by: org.apache.thrift.transport.TTransportException: GSS initiate failed
at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
at org.apache.sentry.core.common.transport.SentryTransportFactory$UgiSaslClientTransport.baseOpen(SentryTransportFactory.java:183)
at org.apache.sentry.core.common.transport.SentryTransportFactory$UgiSaslClientTransport.access$100(SentryTransportFactory.java:141)
at org.apache.sentry.core.common.transport.SentryTransportFactory$UgiSaslClientTransport$1.run(SentryTransportFactory.java:168)
at org.apache.sentry.core.common.transport.SentryTransportFactory$UgiSaslClientTransport$1.run(SentryTransportFactory.java:166)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
at org.apache.sentry.core.common.transport.SentryTransportFactory$UgiSaslClientTransport.open(SentryTransportFactory.java:166)
at org.apache.sentry.core.common.transport.SentryTransportFactory.connectToServer(SentryTransportFactory.java:99)
at org.apache.sentry.core.common.transport.SentryTransportFactory.getTransport(SentryTransportFactory.java:86)
at org.apache.sentry.core.common.transport.SentryTransportPool$PoolFactory.create(SentryTransportPool.java:302)
at org.apache.sentry.core.common.transport.SentryTransportPool$PoolFactory.create(SentryTransportPool.java:271)
at org.apache.commons.pool2.BaseKeyedPooledObjectFactory.makeObject(BaseKeyedPooledObjectFactory.java:62)
at org.apache.commons.pool2.impl.GenericKeyedObjectPool.create(GenericKeyedObjectPool.java:1041)
at org.apache.commons.pool2.impl.GenericKeyedObjectPool.borrowObject(GenericKeyedObjectPool.java:380)
at org.apache.commons.pool2.impl.GenericKeyedObjectPool.borrowObject(GenericKeyedObjectPool.java:279)
at org.apache.sentry.core.common.transport.SentryTransportPool.getTransport(SentryTransportPool.java:183)
at org.apache.sentry.api.service.thrift.SentryPolicyServiceClientDefaultImpl.connect(SentryPolicyServiceClientDefaultImpl.java:100)
at org.apache.sentry.core.common.transport.RetryClientInvocationHandler.connect(RetryClientInvocationHandler.java:141)
... 4 more
I guess it has something to do with Kerberos, but I cannot figure it out. Any help is appreciated.
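For reference: with sentry.service.security.mode set to kerberos, a "GSS initiate failed" from the Thrift transport usually means the client JVM has no valid Kerberos credentials. A minimal sketch of logging in from a keytab before creating the Sentry client, assuming a placeholder principal and keytab path:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosLoginSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);
        // Placeholder principal and keytab path; adjust to your environment
        UserGroupInformation.loginUserFromKeytab("myname@SOME_REALM", "/etc/security/keytabs/myname.keytab");
        // ...then create the SentryPolicyServiceClient exactly as in the code above
    }
}

Running kinit for the same principal before starting the JVM is the usual alternative to an explicit keytab login.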
When I start HBase, the HRegionServer and HMaster on my master machine start successfully, but the HRegionServer on the slaves cannot start.
My Hadoop version is 2.8.0, HBase version is 1.2.6, and ZooKeeper version is 3.4.9,
and my hbase-site.xml is:
<property>
<name>hbase.rootdir</name>
<value>hdfs://hadoop:9000/hbase</value>
</property>
<property>
<name>hbase.cluster.distributed</name>
<value>true</value>
</property>
<property>
<name>hbase.master</name>
<value>60000</value>
</property>
<property>
<name>hbase.zookeeper.property.dataDir</name>
<value>/usr/local/hbase/zoodata</value>
</property>
<property>
<name>hbase.zookeeper.quorum</name>
<value>hadoop,slave03,slave02</value>
</property>
<property>
<name>hbase.zookeeper.property.clientPort</name>
<value>2181</value>
</property>
<property>
<name>hbase.regionserver.port</name>
<value>60020</value>
</property>
and my regionservers file is:
hadoop
slave03
slave02
Here is the error message:
2017-06-16 15:09:34,730 ERROR [main]
regionserver.HRegionServerCommandLine: Region server exiting
java.lang.RuntimeException: Failed construction of Regionserver: class org.apache.hadoop.hbase.regionserver.HRegionServer
at org.apache.hadoop.hbase.regionserver.HRegionServer.constructRegionServer(HRegionServer.java:2682)
at org.apache.hadoop.hbase.regionserver.HRegionServerCommandLine.start(HRegionServerCommandLine.java:64)
at org.apache.hadoop.hbase.regionserver.HRegionServerCommandLine.run(HRegionServerCommandLine.java:87)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:126)
at org.apache.hadoop.hbase.regionserver.HRegionServer.main(HRegionServer.java:2697)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hbase.regionserver.HRegionServer.constructRegionServer(HRegionServer.java:2680)
... 5 more
Caused by: java.io.IOException: Problem binding to slave02/119.29.83.97:60020 : Cannot assign requested address. To switch ports use the 'hbase.regionserver.port' configuration property.
at org.apache.hadoop.hbase.regionserver.RSRpcServices.<init>(RSRpcServices.java:938)
at org.apache.hadoop.hbase.regionserver.HRegionServer.createRpcServices(HRegionServer.java:647)
at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:531)
... 10 more
Caused by: java.net.BindException: Cannot assign requested address
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.apache.hadoop.hbase.ipc.RpcServer.bind(RpcServer.java:2592)
at org.apache.hadoop.hbase.ipc.RpcServer$Listener.<init>(RpcServer.java:585)
at org.apache.hadoop.hbase.ipc.RpcServer.<init>(RpcServer.java:2045)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.<init>(RSRpcServices.java:930)
... 12 more
I hope for some help, thank you so much!
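For reference: a BindException with "Cannot assign requested address" usually means the hostname being bound (slave02 here) resolves to an IP address that is not assigned to any local network interface on that machine, for example a public IP in /etc/hosts while the NIC only carries a private address. A minimal diagnostic sketch, assuming it is run on the failing slave:

import java.net.InetAddress;
import java.net.NetworkInterface;
import java.util.Collections;

public class BindAddressCheck {
    public static void main(String[] args) throws Exception {
        // What the region server hostname resolves to (hostname is a placeholder)
        System.out.println("slave02 resolves to: " + InetAddress.getByName("slave02").getHostAddress());
        // Addresses actually assigned to local interfaces on this machine
        for (NetworkInterface nic : Collections.list(NetworkInterface.getNetworkInterfaces())) {
            for (InetAddress addr : Collections.list(nic.getInetAddresses())) {
                System.out.println(nic.getName() + " -> " + addr.getHostAddress());
            }
        }
    }
}

If the resolved address does not appear among the interface addresses, fixing the /etc/hosts entry so the name maps to the local private address is the usual remedy.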
I have followed this site https://blogs.msdn.microsoft.com/arsen/2016/08/05/accessing-azure-data-lake-store-using-webhdfs-with-oauth2-from-spark-2-0-that-is-running-locally/ to connect ADLS storage with my Azure VM.
Created Azure VM and installed my application in it
Created Azure Data Lake Store and service principal
Here is my core-site.xml:-
<configuration>
<property>
<name>dfs.webhdfs.oauth2.enabled</name>
<value>true</value>
</property>
<property>
<name>dfs.webhdfs.oauth2.access.token.provider</name>
<value>org.apache.hadoop.hdfs.web.oauth2.ConfRefreshTokenBasedAccessTokenProvider</value>
</property>
<property>
<name>dfs.webhdfs.oauth2.refresh.url</name>
<value>https://login.windows.net/tenaid-id-here/oauth2/token</value>
</property>
<property>
<name>dfs.webhdfs.oauth2.client.id</name>
<value>Client id</value>
</property>
<property>
<name>dfs.webhdfs.oauth2.refresh.token.expires.ms.since.epoch</name>
<value>0</value>
</property>
<property>
<name>dfs.webhdfs.oauth2.refresh.token</name>
<value>Refresh token</value>
</property>
</configuration>
I have installed my application on the Azure VM, and I get the following error when I upload a file in my application:
2017-01-27 12:54:25.963 GMT+0000 WARN [admin-1fd467a4c41f43fe9f30ab446a5c93ac-84-b6792518109848bead029c9144603d04-libraryService.importDataFiles] LibraryImpl - Failed to write data file partID: 0 at: library/51dc056c0a634beba243120501fe70d6/545ca95c2a894f948b1f5184b013a53e/5c68d893090f471d81f3cdfc810bc4f7/b6d5ceb64bfd4d65ba4ea24d24f99e90
java.io.IOException: Mkdirs failed to create file:/clusters/myapp/library/51dc056c0a634beba243120501fe70d6/545ca95c2a894f948b1f5184b013a53e/5c68d893090f471d81f3cdfc810bc4f7/b6d5ceb64bfd4d65ba4ea24d24f99e90/data (exists=false, cwd=file:/home/palmtree/work/software/myapp-2.5-SNAPSHOT/myapp)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:450)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:435)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:909)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:890)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:787)
at parquet.hadoop.ParquetFileWriter.<init>(ParquetFileWriter.java:150)
at parquet.hadoop.ParquetWriter.<init>(ParquetWriter.java:176)
at parquet.avro.AvroParquetWriter.<init>(AvroParquetWriter.java:93)
at com.myapp.hadoop.common.PaxParquetWriterImpl.doWriteRow(PaxParquetWriterImpl.java:52)
at com.myapp.hadoop.common.PaxParquetWriterImpl.access$000(PaxParquetWriterImpl.java:19)
at com.myapp.hadoop.common.PaxParquetWriterImpl$1.run(PaxParquetWriterImpl.java:43)
at com.myapp.hadoop.common.PaxParquetWriterImpl$1.run(PaxParquetWriterImpl.java:40)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at com.myapp.hadoop.common.PaxParquetWriterImpl.writpeRow(PaxParquetWriterImpl.java:40)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.myapp.hadoop.core.DistributionManager$$anon$10.invoke(DistributionManager.scala:313)
at com.sun.proxy.$Proxy56.writeRow(Unknown Source)
at com.myapp.library.stacks.DataFileWriter.write(DataFileWriter.java:49)
at com.myapp.library.LibraryImpl.pullImportData(LibraryImpl.java:747)
at com.myapp.library.LibraryImpl.importDataFile(LibraryImpl.java:631)
at com.myapp.frontend.server.LibraryAPI.importDataFile(LibraryAPI.java:269)
at com.myapp.frontend.server.LibraryWebSocketDelegate.importDataFile(LibraryWebSocketDelegate.java:189)
at com.myapp.frontend.server.LibraryWebSocketDelegate.importDataFiles(LibraryWebSocketDelegate.java:204)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.myapp.frontend.util.PXWebSocketProtocolHandler$PXMethodHandler.call(PXWebSocketProtocolHandler.java:144)
at com.myapp.frontend.util.PXWebSocketEndpoint.performMethodCall(PXWebSocketEndpoint.java:284)
at com.myapp.frontend.util.PXWebSocketEndpoint.access$200(PXWebSocketEndpoint.java:47)
at com.myapp.frontend.util.PXWebSocketEndpoint$1.run(PXWebSocketEndpoint.java:169)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
2017-01-27 12:54:25.966 GMT+0000 WARN [admin-1fd467a4c41f43fe9f30ab446a5c93ac-84-b6792518109848bead029c9144603d04-libraryService.importDataFiles] LibraryImpl - Failed to import acquisition da73b76755c34c74a1643a324e41e156
com.myapp.iface.service.RequestFailedException
at com.myapp.library.LibraryImpl.pullImportData(LibraryImpl.java:754)
at com.myapp.library.LibraryImpl.importDataFile(LibraryImpl.java:631)
at com.myapp.frontend.server.LibraryAPI.importDataFile(LibraryAPI.java:269)
at com.myapp.frontend.server.LibraryWebSocketDelegate.importDataFile(LibraryWebSocketDelegate.java:189)
at com.myapp.frontend.server.LibraryWebSocketDelegate.importDataFiles(LibraryWebSocketDelegate.java:204)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.myapp.frontend.util.PXWebSocketProtocolHandler$PXMethodHandler.call(PXWebSocketProtocolHandler.java:144)
at com.myapp.frontend.util.PXWebSocketEndpoint.performMethodCall(PXWebSocketEndpoint.java:284)
at com.myapp.frontend.util.PXWebSocketEndpoint.access$200(PXWebSocketEndpoint.java:47)
at com.myapp.frontend.util.PXWebSocketEndpoint$1.run(PXWebSocketEndpoint.java:169)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Kindly help me solve this.
Update 1:-
I tried the following to connect my application on the Azure VM with ADLS:
Added azure-data-lake-store-sdk in lib
I have followed this Service-to-service authentication guide to create an application in Azure Active Directory.
I have also assigned the Azure AD application to the ADLS account root directory.
Root directory -> /clusters/myapp
Updated core-site.xml based on values from above documentation.
<configuration>
<property>
<name>dfs.adls.home.hostname</name>
<value>dev.azuredatalakestore.net</value>
</property>
<property>
<name>dfs.adls.home.mountpoint</name>
<value>/clusters</value>
</property>
<property>
<name>fs.adl.impl</name>
<value>org.apache.hadoop.fs.adl.AdlFileSystem</value>
</property>
<property>
<name>fs.AbstractFileSystem.adl.impl</name>
<value>org.apache.hadoop.fs.adl.Adl</value>
</property>
<property>
<name>dfs.adls.oauth2.refresh.url</name>
<value>https://login.windows.net/[tenantId]/oauth2/token</value>
</property>
<property>
<name>dfs.adls.oauth2.client.id</name>
<value>[CLIENT ID]</value>
</property>
<property>
<name>dfs.adls.oauth2.credential</name>
<value>[CLIENT KEY]</value>
</property>
<property>
<name>dfs.adls.oauth2.access.token.provider.type</name>
<value>ClientCredential</value>
</property>
<property>
<name>fs.azure.io.copyblob.retry.max.retries</name>
<value>60</value>
</property>
<property>
<name>fs.azure.io.read.tolerate.concurrent.append</name>
<value>true</value>
</property>
<property>
<name>fs.defaultFS</name>
<value>adl://dev.azuredatalakestore.net</value>
<final>true</final>
</property>
<property>
<name>fs.trash.interval</name>
<value>360</value>
</property>
</configuration>
I am getting the following error when I start my application server on the VM:
2017-02-02 07:40:27.527 GMT+0000 INFO [main] DistributionManager - Looking for class loader for distroName=adl kerberized=false
2017-02-02 07:40:28.428 GMT+0000 ERROR [main] SimpleHdfsFileSystem - Failed to initialize HDFS file storage on null as hdfs root /myapp
org.apache.hadoop.security.AccessControlException: Unauthorized
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:347)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:98)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:623)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:472)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:502)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:498)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.mkdirs(WebHdfsFileSystem.java:919)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1877)
at com.myapp.hadoop.common.HdfsFileSystem$1.run(HdfsFileSystem.java:98)
at com.myapp.hadoop.common.HdfsFileSystem$1.run(HdfsFileSystem.java:91)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at com.myapp.hadoop.common.HdfsFileSystem.__initialize(HdfsFileSystem.java:91)
at com.myapp.hadoop.common.SimpleHdfsFileSystem.initialize(SimpleHdfsFileSystem.java:40)
at com.myapp.hadoop.hdp2.HadoopDistributionImpl.initializeHdfs(HadoopDistributionImpl.java:63)
at com.myapp.hadoop.hdp2.UnsecureHadoopDistributionImpl.connectToFileSystem(UnsecureHadoopDistributionImpl.java:22)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.myapp.hadoop.core.DistributionManager$$anon$1.invoke(DistributionManager.scala:135)
at com.sun.proxy.$Proxy22.connectToFileSystem(Unknown Source)
at com.myapp.library.LibraryStorageImpl.parseSimpleAuthFileSystem(LibraryStorageImpl.scala:126)
at com.myapp.library.LibraryStorageImpl.initializeStorageWithPrefix(LibraryStorageImpl.scala:64)
at com.myapp.library.LibraryStorageImpl.initialize(LibraryStorageImpl.scala:39)
at com.myapp.library.LibraryStorageImpl.initialize(LibraryStorageImpl.scala:33)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1581)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1522)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBean(DefaultListableBeanFactory.java:274)
at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1106)
at com.myapp.container.PxBeanContext.getBean(PxBeanContext.java:156)
at com.myapp.library.streaming.files.UploadFileServiceImpl.initialize(UploadFileServiceImpl.java:49)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1581)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1522)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:609)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:469)
at com.myapp.container.PxBeanContext.startup(PxBeanContext.java:42)
at com.myapp.jetty.FrontendServer.main(FrontendServer.java:124)
2017-02-02 07:40:28.462 GMT+0000 WARN [main] server - HQ222113: On ManagementService stop, there are 1 unexpected registered MBeans: [core.acceptor.dc9ff2aa-e91a-11e6-9a51-09b76b4431e6]
2017-02-02 07:40:28.479 GMT+0000 INFO [main] server - HQ221002: HornetQ Server version 2.5.0.SNAPSHOT (Wild Hornet, 124) [7039110c-dd57-11e6-b90d-2bc6685808f5] stopped
2017-02-02 07:40:28.480 GMT+0000 ERROR [main] FrontendServer - Fatal error trying to start server
java.lang.RuntimeException: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.streaming.files.UploadFileServiceImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.LibraryStorageImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Unauthorized
at com.myapp.container.PxBeanContext.startup(PxBeanContext.java:44)
at com.myapp.jetty.FrontendServer.main(FrontendServer.java:124)
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.streaming.files.UploadFileServiceImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.LibraryStorageImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Unauthorized
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1455)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:609)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:469)
at com.myapp.container.PxBeanContext.startup(PxBeanContext.java:42)
... 1 more
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.LibraryStorageImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Unauthorized
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1455)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBean(DefaultListableBeanFactory.java:274)
at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1106)
at com.myapp.container.PxBeanContext.getBean(PxBeanContext.java:156)
at com.myapp.library.streaming.files.UploadFileServiceImpl.initialize(UploadFileServiceImpl.java:49)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1581)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1522)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
... 11 more
Caused by: java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Unauthorized
at com.myapp.hadoop.common.SimpleHdfsFileSystem.initialize(SimpleHdfsFileSystem.java:45)
at com.myapp.hadoop.hdp2.HadoopDistributionImpl.initializeHdfs(HadoopDistributionImpl.java:63)
at com.myapp.hadoop.hdp2.UnsecureHadoopDistributionImpl.connectToFileSystem(UnsecureHadoopDistributionImpl.java:22)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.myapp.hadoop.core.DistributionManager$$anon$1.invoke(DistributionManager.scala:135)
at com.sun.proxy.$Proxy22.connectToFileSystem(Unknown Source)
at com.myapp.library.LibraryStorageImpl.parseSimpleAuthFileSystem(LibraryStorageImpl.scala:126)
at com.myapp.library.LibraryStorageImpl.initializeStorageWithPrefix(LibraryStorageImpl.scala:64)
at com.myapp.library.LibraryStorageImpl.initialize(LibraryStorageImpl.scala:39)
at com.myapp.library.LibraryStorageImpl.initialize(LibraryStorageImpl.scala:33)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1581)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1522)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
... 28 more
Caused by: org.apache.hadoop.security.AccessControlException: Unauthorized
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:347)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:98)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:623)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:472)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:502)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:498)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.mkdirs(WebHdfsFileSystem.java:919)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1877)
at com.myapp.hadoop.common.HdfsFileSystem$1.run(HdfsFileSystem.java:98)
at com.myapp.hadoop.common.HdfsFileSystem$1.run(HdfsFileSystem.java:91)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at com.myapp.hadoop.common.HdfsFileSystem.__initialize(HdfsFileSystem.java:91)
at com.myapp.hadoop.common.SimpleHdfsFileSystem.initialize(SimpleHdfsFileSystem.java:40)
... 47 more
I am using the following jars in my project:
azure-data-lake-store-sdk-2.1.4.jar
commons-cli-1.2.jar
commons-configuration-1.6.jar
hadoop-auth-2.7.1.jar
hadoop-azure-datalake-3.0.0-alpha1.jar
hadoop-common-2.5-SNAPSHOT.jar
hadoop-common-2.7.1.jar
hadoop-hdfs-2.7.3.jar
hadoop-hdp2-2.5-SNAPSHOT.jar
Clarifications:-
My intention is to connect my application on the Azure VM to the Data Lake Store without needing an HDInsight cluster. Is that possible? If so, what steps do I need to follow, and what configuration needs to be present in core-site.xml?
File preview fails with AccessControlException error in ADLS
Log in to the HDInsight cluster which is associated with the Data Lake Store using the ssh command: ssh [user]@[cluster2]-ssh.azurehdinsight.net
Copy a file to the cluster using the wget command - wget http://www.sample-videos.com/csv/Sample-Spreadsheet-10-rows.csv
Create a new folder in your Data Lake Store account
Now upload the file using the put command:
hdfs dfs -put Sample-Spreadsheet-10-rows.csv adl://dev2.azuredatalakestore.net/new
View the file in the Azure Portal
Actual result: the file is uploaded and shows in the Azure Portal, but the file preview is broken and I see the error below:
AccessControlException
OPEN failed with error 0x83090aa2 (Forbidden. ACL verification failed. Either the resource does not exist or the user is not authorized to perform the requested operation.). [4f97235c-0852-44c8-a8d4-cbe190ffdb34]
How can I solve this issue?
Firstly, we really do not recommend that you use the swebhdfs path. As called out in Arsen’s blog, the adl client is much more performant. Here are directions for configuring the adl filesystem:
Hadoop Azure Data Lake Support
For your specific error, it looks like the mkdirs call is hitting the local file system, as shown by the "file:" scheme in the error message.
To solve the error, follow the steps mentioned in Arsen’s blog. After configuration, run an hdfs command against the swebhdfs path, for example:
hadoop fs -ls swebhdfs://avdatalake2.azuredatalakestore.net:443/
One more thing: Since posting that blog, Azure Data Lake now has full support for the Java SDK. Here is an article that describes how to use the Java SDK to perform basic file operations:
Get started with Azure Data Lake Store using Java
-- Cathy
The easiest way to connect to ADLS is to use the Java SDK that Cathy mentioned in her response.
Get started with Azure Data Lake Store using Java
In your example, why are you trying to connect from your Azure VM to the Data Lake Store using the Hadoop client? Using the Hadoop client is a more convoluted way to achieve the seemingly simple scenario of connecting from your app on an Azure VM to ADLS.
The Hadoop client is typically what people do to connect existing Hadoop clusters to ADLS. I have a feeling that is not what you are trying to do. Let us know if this is not the case.
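As an illustration of the SDK route, here is a minimal sketch that connects to an ADLS Gen1 account with the azure-data-lake-store-sdk and lists a directory; the account name, tenant id, client id, and key are placeholders matching the redacted values in the question, and the exact helper classes may differ slightly between SDK versions:

import com.microsoft.azure.datalake.store.ADLStoreClient;
import com.microsoft.azure.datalake.store.DirectoryEntry;
import com.microsoft.azure.datalake.store.oauth2.AccessTokenProvider;
import com.microsoft.azure.datalake.store.oauth2.ClientCredsTokenProvider;

import java.util.List;

public class AdlsListSketch {
    public static void main(String[] args) throws Exception {
        // Placeholders: fill in your own tenant id, client id, client key, and account name
        String tokenEndpoint = "https://login.windows.net/[tenantId]/oauth2/token";
        AccessTokenProvider provider =
                new ClientCredsTokenProvider(tokenEndpoint, "[CLIENT ID]", "[CLIENT KEY]");
        ADLStoreClient client = ADLStoreClient.createClient("dev.azuredatalakestore.net", provider);

        // List the mount point used in the question's core-site.xml
        List<DirectoryEntry> entries = client.enumerateDirectory("/clusters");
        for (DirectoryEntry entry : entries) {
            System.out.println(entry.fullName);
        }
    }
}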
I am using Hibernate's c3p0 connection pool:
<property name="hibernate.c3p0.acquire_increment">1</property>
<property name="hibernate.c3p0.idle_test_period">100</property>
<property name="hibernate.c3p0.max_size">10</property>
<property name="hibernate.c3p0.max_statements">10</property>
<property name="hibernate.c3p0.min_size">10</property>
<property name="hibernate.c3p0.timeout">100</property>
<property name="hibernate.connection.provider_class">org.hibernate.connection.C3P0ConnectionProvider</property>
The dependency is defined in my pom.xml, and the jar is inside the library:
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-c3p0</artifactId>
<version>4.3.6.Final</version>
</dependency>
The C3P0 connector gets picked up while starting my application:
INFO: HHH000130: Instantiating explicit connection provider: org.hibernate.connection.C3P0ConnectionProvider
Sep 10, 2014 7:39:39 PM org.hibernate.c3p0.internal.C3P0ConnectionProvider configure
INFO: HHH010002: C3P0 using driver: net.sourceforge.jtds.jdbc.Driver at URL: jdbc:jtds:sqlserver://localhost:1433/myDatabase
Sep 10, 2014 7:39:39 PM org.hibernate.c3p0.internal.C3P0ConnectionProvider configure
However, the code breaks at line 21:
3 | import org.hibernate.Session;
4 | import org.hibernate.SessionFactory;
5 | import org.hibernate.boot.registry.StandardServiceRegistryBuilder;
6 | import org.hibernate.cfg.Configuration;
...
...
19 | Configuration configuration = new Configuration().configure();
20 | StandardServiceRegistryBuilder builder = new StandardServiceRegistryBuilder().applySettings(configuration.getProperties());
21 | SessionFactory sessionFactory = configuration.buildSessionFactory(builder.build());
And this is returned:
Exception in thread "main" java.lang.NoClassDefFoundError: com/mchange/v2/c3p0/DataSources
at org.hibernate.c3p0.internal.C3P0ConnectionProvider.configure(C3P0ConnectionProvider.java:203)
at org.hibernate.boot.registry.internal.StandardServiceRegistryImpl.configureService(StandardServiceRegistryImpl.java:111)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.initializeService(AbstractServiceRegistryImpl.java:234)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.getService(AbstractServiceRegistryImpl.java:206)
at org.hibernate.engine.jdbc.internal.JdbcServicesImpl.buildJdbcConnectionAccess(JdbcServicesImpl.java:260)
at org.hibernate.engine.jdbc.internal.JdbcServicesImpl.configure(JdbcServicesImpl.java:94)
at org.hibernate.boot.registry.internal.StandardServiceRegistryImpl.configureService(StandardServiceRegistryImpl.java:111)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.initializeService(AbstractServiceRegistryImpl.java:234)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.getService(AbstractServiceRegistryImpl.java:206)
at org.hibernate.cfg.Configuration.buildTypeRegistrations(Configuration.java:1885)
at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1843)
at com.boa.ecris.test.Main.main(Main.java:21)
Caused by: java.lang.ClassNotFoundException: com.mchange.v2.c3p0.DataSources
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
I've been searching online for quite some time for deprecated jars, etc., but no luck yet.
Any ideas?
Caused by: java.lang.ClassNotFoundException: com.mchange.v2.c3p0.DataSources
The error itself says that it is not able to find the com.mchange.v2.c3p0.DataSources class in the jars that are added to your Maven dependencies.
The exception is thrown at the time when you are configuring Hibernate, i.e.,
at line 19: Configuration configuration = new Configuration().configure();
Please add c3p0-0.9.1.2.jar to the classpath, or add the dependency to pom.xml from the Maven dependency link (see the sketch below).
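A minimal sketch of the Maven coordinates for that jar, assuming the 0.9.1.x line (which is published under the old c3p0 group id; newer releases use com.mchange):

<dependency>
    <groupId>c3p0</groupId>
    <artifactId>c3p0</artifactId>
    <version>0.9.1.2</version>
</dependency>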
In my case, it worked! I hope it helps you too.