How to connect an ADLS account to an Azure VM - Java

I have followed this blog post https://blogs.msdn.microsoft.com/arsen/2016/08/05/accessing-azure-data-lake-store-using-webhdfs-with-oauth2-from-spark-2-0-that-is-running-locally/ to connect ADLS storage to my Azure VM.
Created an Azure VM and installed my application on it.
Created an Azure Data Lake Store and a service principal.
Here is my core-site.xml:
<configuration>
  <property>
    <name>dfs.webhdfs.oauth2.enabled</name>
    <value>true</value>
  </property>
  <property>
    <name>dfs.webhdfs.oauth2.access.token.provider</name>
    <value>org.apache.hadoop.hdfs.web.oauth2.ConfRefreshTokenBasedAccessTokenProvider</value>
  </property>
  <property>
    <name>dfs.webhdfs.oauth2.refresh.url</name>
    <value>https://login.windows.net/tenant-id-here/oauth2/token</value>
  </property>
  <property>
    <name>dfs.webhdfs.oauth2.client.id</name>
    <value>Client id</value>
  </property>
  <property>
    <name>dfs.webhdfs.oauth2.refresh.token.expires.ms.since.epoch</name>
    <value>0</value>
  </property>
  <property>
    <name>dfs.webhdfs.oauth2.refresh.token</name>
    <value>Refresh token</value>
  </property>
</configuration>
I installed my application on the Azure VM, and I get the following error when I upload a file through my application.
2017-01-27 12:54:25.963 GMT+0000 WARN [admin-1fd467a4c41f43fe9f30ab446a5c93ac-84-b6792518109848bead029c9144603d04-libraryService.importDataFiles] LibraryImpl - Failed to write data file partID: 0 at: library/51dc056c0a634beba243120501fe70d6/545ca95c2a894f948b1f5184b013a53e/5c68d893090f471d81f3cdfc810bc4f7/b6d5ceb64bfd4d65ba4ea24d24f99e90
java.io.IOException: Mkdirs failed to create file:/clusters/myapp/library/51dc056c0a634beba243120501fe70d6/545ca95c2a894f948b1f5184b013a53e/5c68d893090f471d81f3cdfc810bc4f7/b6d5ceb64bfd4d65ba4ea24d24f99e90/data (exists=false, cwd=file:/home/palmtree/work/software/myapp-2.5-SNAPSHOT/myapp)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:450)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:435)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:909)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:890)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:787)
at parquet.hadoop.ParquetFileWriter.<init>(ParquetFileWriter.java:150)
at parquet.hadoop.ParquetWriter.<init>(ParquetWriter.java:176)
at parquet.avro.AvroParquetWriter.<init>(AvroParquetWriter.java:93)
at com.myapp.hadoop.common.PaxParquetWriterImpl.doWriteRow(PaxParquetWriterImpl.java:52)
at com.myapp.hadoop.common.PaxParquetWriterImpl.access$000(PaxParquetWriterImpl.java:19)
at com.myapp.hadoop.common.PaxParquetWriterImpl$1.run(PaxParquetWriterImpl.java:43)
at com.myapp.hadoop.common.PaxParquetWriterImpl$1.run(PaxParquetWriterImpl.java:40)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at com.myapp.hadoop.common.PaxParquetWriterImpl.writpeRow(PaxParquetWriterImpl.java:40)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.myapp.hadoop.core.DistributionManager$$anon$10.invoke(DistributionManager.scala:313)
at com.sun.proxy.$Proxy56.writeRow(Unknown Source)
at com.myapp.library.stacks.DataFileWriter.write(DataFileWriter.java:49)
at com.myapp.library.LibraryImpl.pullImportData(LibraryImpl.java:747)
at com.myapp.library.LibraryImpl.importDataFile(LibraryImpl.java:631)
at com.myapp.frontend.server.LibraryAPI.importDataFile(LibraryAPI.java:269)
at com.myapp.frontend.server.LibraryWebSocketDelegate.importDataFile(LibraryWebSocketDelegate.java:189)
at com.myapp.frontend.server.LibraryWebSocketDelegate.importDataFiles(LibraryWebSocketDelegate.java:204)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.myapp.frontend.util.PXWebSocketProtocolHandler$PXMethodHandler.call(PXWebSocketProtocolHandler.java:144)
at com.myapp.frontend.util.PXWebSocketEndpoint.performMethodCall(PXWebSocketEndpoint.java:284)
at com.myapp.frontend.util.PXWebSocketEndpoint.access$200(PXWebSocketEndpoint.java:47)
at com.myapp.frontend.util.PXWebSocketEndpoint$1.run(PXWebSocketEndpoint.java:169)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
2017-01-27 12:54:25.966 GMT+0000 WARN [admin-1fd467a4c41f43fe9f30ab446a5c93ac-84-b6792518109848bead029c9144603d04-libraryService.importDataFiles] LibraryImpl - Failed to import acquisition da73b76755c34c74a1643a324e41e156
com.myapp.iface.service.RequestFailedException
at com.myapp.library.LibraryImpl.pullImportData(LibraryImpl.java:754)
at com.myapp.library.LibraryImpl.importDataFile(LibraryImpl.java:631)
at com.myapp.frontend.server.LibraryAPI.importDataFile(LibraryAPI.java:269)
at com.myapp.frontend.server.LibraryWebSocketDelegate.importDataFile(LibraryWebSocketDelegate.java:189)
at com.myapp.frontend.server.LibraryWebSocketDelegate.importDataFiles(LibraryWebSocketDelegate.java:204)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.myapp.frontend.util.PXWebSocketProtocolHandler$PXMethodHandler.call(PXWebSocketProtocolHandler.java:144)
at com.myapp.frontend.util.PXWebSocketEndpoint.performMethodCall(PXWebSocketEndpoint.java:284)
at com.myapp.frontend.util.PXWebSocketEndpoint.access$200(PXWebSocketEndpoint.java:47)
at com.myapp.frontend.util.PXWebSocketEndpoint$1.run(PXWebSocketEndpoint.java:169)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Kindly help me solve this.
Update 1:
I tried the following to connect my application on the Azure VM to ADLS:
Added azure-data-lake-store-sdk to lib.
I followed the Service-to-service authentication documentation to create an application in Azure Active Directory.
I also assigned the Azure AD application to the ADLS account root directory.
Root directory -> /clusters/myapp
Updated core-site.xml based on values from the above documentation:
<configuration>
  <property>
    <name>dfs.adls.home.hostname</name>
    <value>dev.azuredatalakestore.net</value>
  </property>
  <property>
    <name>dfs.adls.home.mountpoint</name>
    <value>/clusters</value>
  </property>
  <property>
    <name>fs.adl.impl</name>
    <value>org.apache.hadoop.fs.adl.AdlFileSystem</value>
  </property>
  <property>
    <name>fs.AbstractFileSystem.adl.impl</name>
    <value>org.apache.hadoop.fs.adl.Adl</value>
  </property>
  <property>
    <name>dfs.adls.oauth2.refresh.url</name>
    <value>https://login.windows.net/[tenantId]/oauth2/token</value>
  </property>
  <property>
    <name>dfs.adls.oauth2.client.id</name>
    <value>[CLIENT ID]</value>
  </property>
  <property>
    <name>dfs.adls.oauth2.credential</name>
    <value>[CLIENT KEY]</value>
  </property>
  <property>
    <name>dfs.adls.oauth2.access.token.provider.type</name>
    <value>ClientCredential</value>
  </property>
  <property>
    <name>fs.azure.io.copyblob.retry.max.retries</name>
    <value>60</value>
  </property>
  <property>
    <name>fs.azure.io.read.tolerate.concurrent.append</name>
    <value>true</value>
  </property>
  <property>
    <name>fs.defaultFS</name>
    <value>adl://dev.azuredatalakestore.net</value>
    <final>true</final>
  </property>
  <property>
    <name>fs.trash.interval</name>
    <value>360</value>
  </property>
</configuration>
I get the following error when I start my application server on the VM:
2017-02-02 07:40:27.527 GMT+0000 INFO [main] DistributionManager - Looking for class loader for distroName=adl kerberized=false
2017-02-02 07:40:28.428 GMT+0000 ERROR [main] SimpleHdfsFileSystem - Failed to initialize HDFS file storage on null as hdfs root /myapp
org.apache.hadoop.security.AccessControlException: Unauthorized
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:347)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:98)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:623)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:472)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:502)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:498)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.mkdirs(WebHdfsFileSystem.java:919)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1877)
at com.myapp.hadoop.common.HdfsFileSystem$1.run(HdfsFileSystem.java:98)
at com.myapp.hadoop.common.HdfsFileSystem$1.run(HdfsFileSystem.java:91)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at com.myapp.hadoop.common.HdfsFileSystem.__initialize(HdfsFileSystem.java:91)
at com.myapp.hadoop.common.SimpleHdfsFileSystem.initialize(SimpleHdfsFileSystem.java:40)
at com.myapp.hadoop.hdp2.HadoopDistributionImpl.initializeHdfs(HadoopDistributionImpl.java:63)
at com.myapp.hadoop.hdp2.UnsecureHadoopDistributionImpl.connectToFileSystem(UnsecureHadoopDistributionImpl.java:22)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.myapp.hadoop.core.DistributionManager$$anon$1.invoke(DistributionManager.scala:135)
at com.sun.proxy.$Proxy22.connectToFileSystem(Unknown Source)
at com.myapp.library.LibraryStorageImpl.parseSimpleAuthFileSystem(LibraryStorageImpl.scala:126)
at com.myapp.library.LibraryStorageImpl.initializeStorageWithPrefix(LibraryStorageImpl.scala:64)
at com.myapp.library.LibraryStorageImpl.initialize(LibraryStorageImpl.scala:39)
at com.myapp.library.LibraryStorageImpl.initialize(LibraryStorageImpl.scala:33)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1581)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1522)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBean(DefaultListableBeanFactory.java:274)
at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1106)
at com.myapp.container.PxBeanContext.getBean(PxBeanContext.java:156)
at com.myapp.library.streaming.files.UploadFileServiceImpl.initialize(UploadFileServiceImpl.java:49)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1581)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1522)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:609)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:469)
at com.myapp.container.PxBeanContext.startup(PxBeanContext.java:42)
at com.myapp.jetty.FrontendServer.main(FrontendServer.java:124)
2017-02-02 07:40:28.462 GMT+0000 WARN [main] server - HQ222113: On ManagementService stop, there are 1 unexpected registered MBeans: [core.acceptor.dc9ff2aa-e91a-11e6-9a51-09b76b4431e6]
2017-02-02 07:40:28.479 GMT+0000 INFO [main] server - HQ221002: HornetQ Server version 2.5.0.SNAPSHOT (Wild Hornet, 124) [7039110c-dd57-11e6-b90d-2bc6685808f5] stopped
2017-02-02 07:40:28.480 GMT+0000 ERROR [main] FrontendServer - Fatal error trying to start server
java.lang.RuntimeException: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.streaming.files.UploadFileServiceImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.LibraryStorageImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Unauthorized
at com.myapp.container.PxBeanContext.startup(PxBeanContext.java:44)
at com.myapp.jetty.FrontendServer.main(FrontendServer.java:124)
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.streaming.files.UploadFileServiceImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.LibraryStorageImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Unauthorized
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1455)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:609)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:469)
at com.myapp.container.PxBeanContext.startup(PxBeanContext.java:42)
... 1 more
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.LibraryStorageImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Unauthorized
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1455)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBean(DefaultListableBeanFactory.java:274)
at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1106)
at com.myapp.container.PxBeanContext.getBean(PxBeanContext.java:156)
at com.myapp.library.streaming.files.UploadFileServiceImpl.initialize(UploadFileServiceImpl.java:49)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1581)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1522)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
... 11 more
Caused by: java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Unauthorized
at com.myapp.hadoop.common.SimpleHdfsFileSystem.initialize(SimpleHdfsFileSystem.java:45)
at com.myapp.hadoop.hdp2.HadoopDistributionImpl.initializeHdfs(HadoopDistributionImpl.java:63)
at com.myapp.hadoop.hdp2.UnsecureHadoopDistributionImpl.connectToFileSystem(UnsecureHadoopDistributionImpl.java:22)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.myapp.hadoop.core.DistributionManager$$anon$1.invoke(DistributionManager.scala:135)
at com.sun.proxy.$Proxy22.connectToFileSystem(Unknown Source)
at com.myapp.library.LibraryStorageImpl.parseSimpleAuthFileSystem(LibraryStorageImpl.scala:126)
at com.myapp.library.LibraryStorageImpl.initializeStorageWithPrefix(LibraryStorageImpl.scala:64)
at com.myapp.library.LibraryStorageImpl.initialize(LibraryStorageImpl.scala:39)
at com.myapp.library.LibraryStorageImpl.initialize(LibraryStorageImpl.scala:33)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1581)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1522)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
... 28 more
Caused by: org.apache.hadoop.security.AccessControlException: Unauthorized
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:347)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:98)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:623)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:472)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:502)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:498)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.mkdirs(WebHdfsFileSystem.java:919)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1877)
at com.myapp.hadoop.common.HdfsFileSystem$1.run(HdfsFileSystem.java:98)
at com.myapp.hadoop.common.HdfsFileSystem$1.run(HdfsFileSystem.java:91)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at com.myapp.hadoop.common.HdfsFileSystem.__initialize(HdfsFileSystem.java:91)
at com.myapp.hadoop.common.SimpleHdfsFileSystem.initialize(SimpleHdfsFileSystem.java:40)
... 47 more
I am using the following jars in my project:
azure-data-lake-store-sdk-2.1.4.jar
commons-cli-1.2.jar
commons-configuration-1.6.jar
hadoop-auth-2.7.1.jar
hadoop-azure-datalake-3.0.0-alpha1.jar
hadoop-common-2.5-SNAPSHOT.jar
hadoop-common-2.7.1.jar
hadoop-hdfs-2.7.3.jar
hadoop-hdp2-2.5-SNAPSHOT.jar
Clarifications:
My intention is to connect my application on the Azure VM to the Data Lake Store without needing an HDInsight cluster. Is that possible? If so, what steps should I follow? What configuration needs to be present in core-site.xml?
File preview fails with AccessControlException error in ADLS
Log in to the HDInsight cluster associated with the Data Lake Store using the ssh command - ssh [user]@[cluster2]-ssh.azurehdinsight.net
Copy a file to the cluster using the wget command - wget http://www.sample-videos.com/csv/Sample-Spreadsheet-10-rows.csv
Create a new folder in your Data Lake Store account.
Now upload the file using the put command:
hdfs dfs -put Sample-Spreadsheet-10-rows.csv adl://dev2.azuredatalakestore.net/new
View the file in the Azure Portal.
Actual result: the file is uploaded and shows in the Azure Portal, but the file preview is broken and I see the below error:
AccessControlException
OPEN failed with error 0x83090aa2 (Forbidden. ACL verification failed. Either the resource does not exist or the user is not authorized to perform the requested operation.). [4f97235c-0852-44c8-a8d4-cbe190ffdb34]
How can I solve this issue?

Firstly, we really do not recommend that you use the swebhdfs path. As called out in Arsen’s blog, the adl client is much more performant. Here are directions for configuring the adl filesystem:
Hadoop Azure Data Lake Support
For your specific error, it looks like the mkdir is invoked on the local file system, as shown by the "file:" prefix in the output of the mkdir command.
To solve the error, follow the steps mentioned in Arsen’s blog. After configuration, run an hdfs command against the swebhdfs path, for example:
hadoop fs -ls swebhdfs://avdatalake2.azuredatalakestore.net:443/
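A quick way to confirm which filesystem a configuration actually resolves to is the Hadoop FileSystem API. Here is a minimal sketch, assuming the OAuth2 properties from the blog are on the classpath and using the account name from the command above; the class name is illustrative:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FsCheck {
    public static void main(String[] args) throws Exception {
        // Loads core-site.xml from the classpath, including the OAuth2 properties.
        Configuration conf = new Configuration();

        // If this prints a file:/// URI, relative paths are resolved against the
        // local disk, which is exactly the Mkdirs failure in the question.
        System.out.println("default FS = " + FileSystem.get(conf).getUri());

        // Addressing the store explicitly removes the ambiguity.
        FileSystem fs = FileSystem.get(
                URI.create("swebhdfs://avdatalake2.azuredatalakestore.net:443/"), conf);
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
    }
}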
One more thing: since that blog was posted, Azure Data Lake has gained full support for the Java SDK. Here is an article that describes how to use the Java SDK to perform basic file operations:
Get started with Azure Data Lake Store using Java
-- Cathy

The easiest way to connect to ADLS is to use the Java SDK that Cathy mentioned in her response.
Get started with Azure Data Lake Store using Java
In your example, why are you trying to connect from your Azure VM to the Data Lake Store using the Hadoop client? Using the Hadoop client is a more convoluted way to achieve the seemingly simple scenario of connecting from your app on an Azure VM to ADLS.
The Hadoop client is typically what people use to connect existing Hadoop clusters to ADLS. I have a feeling that is not what you are trying to do. Let us know if this is not the case.
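For reference, here is a minimal sketch of that approach with the azure-data-lake-store-sdk that is already in your lib folder. The store name and credential placeholders mirror the core-site.xml in the question and are assumptions, not working values:

import java.io.OutputStream;
import java.io.PrintStream;
import com.microsoft.azure.datalake.store.ADLStoreClient;
import com.microsoft.azure.datalake.store.IfExists;
import com.microsoft.azure.datalake.store.oauth2.AccessTokenProvider;
import com.microsoft.azure.datalake.store.oauth2.ClientCredsTokenProvider;

public class AdlsSdkExample {
    public static void main(String[] args) throws Exception {
        // Service-to-service (client credential) authentication; no Hadoop
        // client and no HDInsight cluster involved.
        AccessTokenProvider provider = new ClientCredsTokenProvider(
                "https://login.windows.net/[tenantId]/oauth2/token",
                "[CLIENT ID]", "[CLIENT KEY]");
        ADLStoreClient client =
                ADLStoreClient.createClient("dev.azuredatalakestore.net", provider);

        // Basic file operations against the store.
        client.createDirectory("/clusters/myapp/test");
        try (OutputStream out = client.createFile(
                "/clusters/myapp/test/hello.txt", IfExists.OVERWRITE);
             PrintStream writer = new PrintStream(out)) {
            writer.println("hello from the ADLS Java SDK");
        }
    }
}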

Related

java.lang.NoClassDefFoundError: org/apache/hadoop/hive/ql/metadata/HiveException when querying in spark-shell

I’m trying to integrate Spark (3.1.1) and a local Hive metastore (3.1.2) to use spark-sql.
I configured spark-defaults.conf according to https://spark.apache.org/docs/latest/sql-data-sources-hive-tables.html, and the Hive jar files exist in the correct path.
But an exception occurred when executing 'spark.sql("show tables").show', as below.
Any mistakes, hints, or corrections would be appreciated.
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 3.1.1
/_/
Using Scala version 2.12.10 (OpenJDK 64-Bit Server VM, Java 1.8.0_292)
Type in expressions to have them evaluated.
Type :help for more information.
scala> spark.sql("show tables").show
java.lang.NoClassDefFoundError: org/apache/hadoop/hive/ql/metadata/HiveException
at java.lang.Class.getDeclaredConstructors0(Native Method)
at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
at java.lang.Class.getConstructors(Class.java:1651)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:291)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:492)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:352)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:71)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:70)
at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$databaseExists$1(HiveExternalCatalog.scala:224)
at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
spark-defaults.conf
spark.master yarn
spark.eventLog.enabled true
spark.eventLog.dir hdfs://192.168.5.130:9000/spark
spark.history.fs.logDirectory hdfs://192.168.5.130:9000/spark
spark.history.provider org.apache.spark.deploy.history.FsHistoryProvider
spark.yarn.historyServer.address http://192.168.5.130:8188
spark.yarn.historyServer.allowTracking true
spark.sql.uris thrift://192.168.5.130:10000
spark.sql.warehouse.dir /user/hive/warehouse
spark.sql.hive.metastore.jars path
spark.sql.hive.metastore.jars.path file:///usr/local/hive/lib/*.jar
spark.sql.hive.metastore.version 3.1.2
spark.sql.hive.metastore.sharedPrefixes org.postgresql
ls /usr/local/hive/lib | grep hive
hive-accumulo-handler-3.1.2.jar
hive-beeline-3.1.2.jar
hive-classification-3.1.2.jar
hive-cli-3.1.2.jar
hive-common-3.1.2.jar
hive-contrib-3.1.2.jar
hive-druid-handler-3.1.2.jar
hive-exec-3.1.2.jar
hive-hbase-handler-3.1.2.jar
hive-hcatalog-core-3.1.2.jar
hive-hcatalog-server-extensions-3.1.2.jar
hive-hplsql-3.1.2.jar
hive-jdbc-3.1.2.jar
hive-jdbc-handler-3.1.2.jar
hive-kryo-registrator-3.1.2.jar
hive-llap-client-3.1.2.jar
hive-llap-common-3.1.2.jar
hive-llap-common-3.1.2-tests.jar
hive-llap-ext-client-3.1.2.jar
hive-llap-server-3.1.2.jar
hive-llap-tez-3.1.2.jar
hive-metastore-3.1.2.jar
hive-serde-3.1.2.jar
hive-service-3.1.2.jar
hive-service-rpc-3.1.2.jar
hive-shims-0.23-3.1.2.jar
hive-shims-3.1.2.jar
hive-shims-common-3.1.2.jar
hive-shims-scheduler-3.1.2.jar
hive-standalone-metastore-3.1.2.jar
hive-storage-api-2.7.0.jar
hive-streaming-3.1.2.jar
hive-testutils-3.1.2.jar
hive-upgrade-acid-3.1.2.jar
hive-vector-code-gen-3.1.2.jar
hive-site.xml
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:postgresql://192.168.5.130:5432/hive?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>org.postgresql.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>hdfs://192.168.5.130:9000/user/hive/warehouse</value>
  </property>
  <property>
    <name>hive.server2.enable.doAs</name>
    <value>false</value>
  </property>
  <property>
    <name>datanucleus.autoCreateSchema</name>
    <value>false</value>
  </property>
  <property>
    <name>hive.aux.jars.path</name>
    <value>file:///usr/local/hive/lib</value>
  </property>
</configuration>
After copying hive-site.xml to $SPARK_HOME/conf, a NoClassDefFoundError occurred for org/apache/commons/collections/CollectionUtils, as below.
spark.sql("show tables").show
scala> spark.sql("show tables").show
21/05/24 00:49:58 ERROR FileUtils: The jar file path file:///usr/local/hive/lib/*.jar doesn't exist
Hive Session ID = a6d63a41-e235-4d8c-a660-6f7b1a22996b
21/05/24 00:49:59 WARN ObjectStore: datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
21/05/24 00:50:01 WARN MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
21/05/24 00:50:01 WARN MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
21/05/24 00:50:01 WARN MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
21/05/24 00:50:01 WARN MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
21/05/24 00:50:01 WARN MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
21/05/24 00:50:01 WARN MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
21/05/24 00:50:02 WARN MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
21/05/24 00:50:02 WARN MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
21/05/24 00:50:02 WARN MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
21/05/24 00:50:02 WARN MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
21/05/24 00:50:02 WARN MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
21/05/24 00:50:02 WARN MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
21/05/24 00:50:02 WARN Hive: Failed to register all functions.
java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:95)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:119)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:4299)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:4367)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:4347)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:4603)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:291)
at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:274)
at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:435)
at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:375)
at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:355)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:331)
at org.apache.spark.sql.hive.client.HiveClientImpl.client(HiveClientImpl.scala:257)
at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:283)
at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:224)
at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:223)
at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:273)
at org.apache.spark.sql.hive.client.HiveClientImpl.databaseExists(HiveClientImpl.scala:384)
at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$databaseExists$1(HiveExternalCatalog.scala:224)
at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:102)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:224)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:134)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:124)
at org.apache.spark.sql.internal.SharedState.globalTempViewManager$lzycompute(SharedState.scala:154)
at org.apache.spark.sql.internal.SharedState.globalTempViewManager(SharedState.scala:152)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.$anonfun$catalog$2(HiveSessionStateBuilder.scala:60)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.globalTempViewManager$lzycompute(SessionCatalog.scala:99)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.globalTempViewManager(SessionCatalog.scala:99)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.listTables(SessionCatalog.scala:946)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.listTables(SessionCatalog.scala:932)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.listTables(SessionCatalog.scala:924)
at org.apache.spark.sql.execution.command.ShowTablesCommand.$anonfun$run$43(tables.scala:868)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.execution.command.ShowTablesCommand.run(tables.scala:868)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:228)
at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3687)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3685)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:228)
at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:615)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:610)
at $line14.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:24)
at $line14.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:28)
at $line14.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:30)
at $line14.$read$$iw$$iw$$iw$$iw$$iw.<init>(<console>:32)
at $line14.$read$$iw$$iw$$iw$$iw.<init>(<console>:34)
at $line14.$read$$iw$$iw$$iw.<init>(<console>:36)
at $line14.$read$$iw$$iw.<init>(<console>:38)
at $line14.$read$$iw.<init>(<console>:40)
at $line14.$read.<init>(<console>:42)
at $line14.$read$.<init>(<console>:46)
at $line14.$read$.<clinit>(<console>)
at $line14.$eval$.$print$lzycompute(<console>:7)
at $line14.$eval$.$print(<console>:6)
at $line14.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:745)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1021)
at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:574)
at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:41)
at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:37)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:573)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:600)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:570)
at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:894)
at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:762)
at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:464)
at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:485)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:239)
at org.apache.spark.repl.Main$.doMain(Main.scala:78)
at org.apache.spark.repl.Main$.main(Main.scala:58)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:951)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1030)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1039)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:84)
... 101 more
Caused by: MetaException(message:org/apache/commons/collections/CollectionUtils)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:84)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:93)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:8667)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:169)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:94)
... 106 more
Caused by: java.lang.NoClassDefFoundError: org/apache/commons/collections/CollectionUtils
at org.apache.hadoop.hive.metastore.ObjectStore.grantPrivileges(ObjectStore.java:5709)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:97)
at com.sun.proxy.$Proxy39.grantPrivileges(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles_core(HiveMetaStore.java:828)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles(HiveMetaStore.java:794)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:539)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:147)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:108)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:80)
... 110 more
Caused by: java.lang.ClassNotFoundException: org.apache.commons.collections.CollectionUtils
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.doLoadClass(IsolatedClientLoader.scala:247)
at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.loadClass(IsolatedClientLoader.scala:236)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 127 more
repeated logs ...
This error can be fixed by copying $SPARK_HOME/jars/commons-collections-3.2.2.jar to $HIVE_HOME/lib.
It seems your Hive conf is missing.
To connect to the Hive metastore, you need to copy the hive-site.xml file into the spark/conf directory.
Try:
cp /usr/lib/hive/conf/hive-site.xml ${SPARK_HOME}/conf/

How to resolve a ConnectException when running a jar on Hadoop?

I have written a simple MapReduce job to perform KMeans clustering on some points.
However, when running the following command on Windows 10 cmd:
hadoop jar kmeans.jar KMeansJob /input /output
I get the following error:
21/04/08 22:26:14 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
21/04/08 22:26:14 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
21/04/08 22:26:17 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/????????/.staging/job_1617909910497_0001
Exception in thread "main" java.net.ConnectException: Call From PCNAME/XXX.XXX.X.X to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:824)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:754)
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1497)
at org.apache.hadoop.ipc.Client.call(Client.java:1439)
at org.apache.hadoop.ipc.Client.call(Client.java:1349)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:796)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
at com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1649)
at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1529)
at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1526)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1526)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:327)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:237)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:106)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:70)
at org.apache.hadoop.mapreduce.JobResourceUploader.uploadResourcesInternal(JobResourceUploader.java:210)
at org.apache.hadoop.mapreduce.JobResourceUploader.uploadResources(JobResourceUploader.java:128)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:101)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:196)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1570)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1567)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1889)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1567)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1588)
at clustering.KMeansJob.run(KMeansJob.java:43)
at clustering.KMeansJob.main(KMeansJob.java:47)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:239)
at org.apache.hadoop.util.RunJar.main(RunJar.java:153)
Caused by: java.net.ConnectException: Connection refused: no further information
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:715)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:687)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:790)
at org.apache.hadoop.ipc.Client$Connection.access$3500(Client.java:411)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1554)
at org.apache.hadoop.ipc.Client.call(Client.java:1385)
... 43 more
In the above logs, I have hidden the IP the call was made from.
Running jps gives the following output:
4608 ResourceManager
13284 DataNode
7252 NameNode
10632 NodeManager
15436 Jps
What is the problem, and is there any suggestion for how to fix it?
Changing the core-site.xml configuration seems to do the job:
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
For the record, my previous configuration had a value of:
<value>hdfs://0.0.0.0:19000</value>
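As a sanity check from the Java side, the client must target the same address and port the NameNode actually listens on; a mismatch is what produces the "Connection refused" above. A minimal sketch (the class name is illustrative):

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class NameNodeCheck {
    public static void main(String[] args) throws Exception {
        // Should succeed once fs.default.name and the NameNode address agree.
        FileSystem fs = FileSystem.get(
                URI.create("hdfs://localhost:9000"), new Configuration());
        System.out.println(fs.getFileStatus(new Path("/")));
    }
}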

Camunda Process Engine configuration with MockExpressionManager for WildFly

We have Camunda installed as a module for WildFly 8.
Everything works fine, but I need a way to mock Java delegates for unit tests (with Arquillian).
As I understand it, org.camunda.bpm.engine.test.mock.Mocks can be used to provide mocked delegates.
According to the JavaDoc, I should register a MockExpressionManager in my process engine configuration.
I've found a similar config with MockExpressionManager here:
https://github.com/camunda/camunda-bpm-assert/blob/master/camunda-bpm-assert-examples/src/test/resources/camunda.cfg.xml
but the Camunda module for WildFly is configured in standalone-full.xml:
<subsystem xmlns="urn:org.camunda.bpm.jboss:1.1">
  <process-engines>
    <process-engine name="default" default="true">
      <datasource>java:jboss/datasources/ProcessEngine</datasource>
      <history-level>full</history-level>
      <configuration>org.camunda.bpm.engine.cdi.CdiJtaProcessEngineConfiguration</configuration>
      <properties>
        <property name="jobExecutorAcquisitionName">default</property>
        <property name="isAutoSchemaUpdate">true</property>
        <property name="authorizationEnabled">true</property>
        <property name="jobExecutorDeploymentAware">true</property>
        <property name="expressionManager">org.camunda.bpm.engine.test.mock.MockExpressionManager</property>
      </properties>
But this doesn't work, and on WildFly startup I see:
16:45:31,930 ERROR [org.jboss.msc.service.fail] (ServerService Thread Pool -- 61) MSC000001: Failed to start service org.camunda.bpm.platform.process-engine.default: org.jboss.msc.service.StartException in service org.camunda.bpm.platform.process-engine.default: org.camunda.bpm.engine.ProcessEngineException: Could not set value for property 'expressionManager' on class org.camunda.bpm.engine.cdi.CdiJtaProcessEngineConfiguration
at org.camunda.bpm.container.impl.jboss.service.MscManagedProcessEngineController$1.run(MscManagedProcessEngineController.java:97)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [rt.jar:1.8.0_11]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [rt.jar:1.8.0_11]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [rt.jar:1.8.0_11]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [rt.jar:1.8.0_11]
at java.lang.Thread.run(Thread.java:745) [rt.jar:1.8.0_11]
at org.jboss.threads.JBossThread.run(JBossThread.java:122) [jboss-threads-2.1.1.Final.jar:2.1.1.Final]
Caused by: org.camunda.bpm.engine.ProcessEngineException: Could not set value for property 'expressionManager' on class org.camunda.bpm.engine.cdi.CdiJtaProcessEngineConfiguration
at org.camunda.bpm.container.impl.metadata.PropertyHelper.applyProperty(PropertyHelper.java:76)
at org.camunda.bpm.container.impl.metadata.PropertyHelper.applyProperties(PropertyHelper.java:94)
at org.camunda.bpm.container.impl.jboss.service.MscManagedProcessEngineController.startProcessEngine(MscManagedProcessEngineController.java:173)
at org.camunda.bpm.container.impl.jboss.service.MscManagedProcessEngineController$2.run(MscManagedProcessEngineController.java:131)
at org.camunda.bpm.container.impl.jboss.service.MscManagedProcessEngineController$2.run(MscManagedProcessEngineController.java:129)
at org.camunda.bpm.container.impl.jboss.util.Tccl.runWithTccl(Tccl.java:53)
at org.camunda.bpm.container.impl.jboss.util.Tccl.runUnderClassloader(Tccl.java:45)
at org.camunda.bpm.container.impl.jboss.service.MscManagedProcessEngineController.startInternal(MscManagedProcessEngineController.java:129)
at org.camunda.bpm.container.impl.jboss.service.MscManagedProcessEngineController$1.run(MscManagedProcessEngineController.java:90)
... 6 more
Caused by: java.lang.IllegalArgumentException: argument type mismatch
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) [rt.jar:1.8.0_11]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) [rt.jar:1.8.0_11]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) [rt.jar:1.8.0_11]
at java.lang.reflect.Method.invoke(Method.java:483) [rt.jar:1.8.0_11]
at org.camunda.bpm.container.impl.metadata.PropertyHelper.applyProperty(PropertyHelper.java:74)
... 14 more
How do I set this MockExpressionManager in the configuration correctly?
Or maybe there are other ways to mock Java delegates?
From looking at the Camunda code, they're expecting a reference to an expression manager to be passed in, but if I had to guess, the property value is being passed in as a string (hence the "argument type mismatch"). A dirty way I can think of to handle this is to create your own subclass of CdiJtaProcessEngineConfiguration which configures the proper expression manager, deploy that JAR to their module, and wire that in; see the sketch below.
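For example, a sketch of such a subclass. This assumes expressionManager is a protected field inherited from ProcessEngineConfigurationImpl, so treat it as a starting point rather than a verified implementation:

import org.camunda.bpm.engine.cdi.CdiJtaProcessEngineConfiguration;
import org.camunda.bpm.engine.test.mock.MockExpressionManager;

// Hypothetical subclass: installs the MockExpressionManager in code instead of
// via the <property> element, which only handles String/primitive values.
public class MockedCdiJtaProcessEngineConfiguration extends CdiJtaProcessEngineConfiguration {

    public MockedCdiJtaProcessEngineConfiguration() {
        // expressionManager is assumed to be a protected field on
        // ProcessEngineConfigurationImpl.
        expressionManager = new MockExpressionManager();
    }
}

Then point the <configuration> element in standalone-full.xml at this class instead of CdiJtaProcessEngineConfiguration and drop the expressionManager <property>.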
Personally, I found their configuration to be a bit too tight for me in direct WildFly integration, and chose to configure their library in my app directly.

CXF 2.6.1 logging in JBoss - exception

I'm trying to upgrade CXF from version 2.3.1 to version 2.6.1 in my project.
The project is a web application that uses CXF as a web service consumer, running in JBoss 5.1.
The project uses slf4j logging, but the dependencies are provided since the JBoss container comes with slf4j.
I created the following file in the project:
META-INF/cxf/org.apache.cxf.Logger with a single line as its content:
org.apache.cxf.common.logging.Slf4jLogger (as per the CXF instructions)
When I run the project in JBoss (stand-alone or from Eclipse) I get the following error:
org.springframework.beans.factory.BeanCreationException: Error creating bean with name '******' defined in class path resource [applicationContext.xml]: Instantiation of bean failed; nested exception is org.springframework.beans.factory.BeanDefinitionStoreException: Factory method [public synchronized java.lang.Object org.apache.cxf.jaxws.JaxWsProxyFactoryBean.create()] threw exception; nested exception is java.lang.NoSuchMethodError: org.slf4j.spi.LocationAwareLogger.log(Lorg/slf4j/Marker;Ljava/lang/String;ILjava/lang/String;[Ljava/lang/Object;Ljava/lang/Throwable;)V
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:581)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1015)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:911)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:609)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:469)
at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:383)
at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:283)
at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:111)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:3910)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4393)
at org.jboss.web.tomcat.service.deployers.TomcatDeployment.performDeployInternal(TomcatDeployment.java:310)
at org.jboss.web.tomcat.service.deployers.TomcatDeployment.performDeploy(TomcatDeployment.java:142)
at org.jboss.web.deployers.AbstractWarDeployment.start(AbstractWarDeployment.java:461)
at org.jboss.web.deployers.WebModule.startModule(WebModule.java:118)
at org.jboss.web.deployers.WebModule.start(WebModule.java:97)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.jboss.mx.interceptor.ReflectedDispatcher.invoke(ReflectedDispatcher.java:157)
at org.jboss.mx.server.Invocation.dispatch(Invocation.java:96)
at org.jboss.mx.server.Invocation.invoke(Invocation.java:88)
at org.jboss.mx.server.AbstractMBeanInvoker.invoke(AbstractMBeanInvoker.java:264)
at org.jboss.mx.server.MBeanServerImpl.invoke(MBeanServerImpl.java:668)
at org.jboss.system.microcontainer.ServiceProxy.invoke(ServiceProxy.java:206)
It seems to have something to do with the logging, but when I look up the LocationAwareLogger class in the slf4j-api provided by JBoss, I can see this log method, which looks correct to me:
public abstract void log(Marker paramMarker, String paramString1, int paramInt, String paramString2, Throwable paramThrowable);
This exception keeps the application from starting.
I can confirm that slf4j was working before the upgrade, since all the logs were written correctly.
I tried setting slf4j to compile scope in Maven, but that causes other issues. I do not know how to continue. Can someone help with this?
Have you actually found that exact method signature, org.slf4j.spi.LocationAwareLogger.log(Lorg/slf4j/Marker;Ljava/lang/String;ILjava/lang/String;[Ljava/lang/Object;Ljava/lang/Throwable;)V, on your classpath? Note that the method you quoted takes five parameters and has no Object[] argument array, while the one CXF calls takes six. The JBoss slf4j libs are probably older than the ones a fresh CXF expects to use. Also, check your classpath for library collisions.
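As a quick check, here is a small diagnostic sketch. It assumes the six-argument log(...) overload was introduced around slf4j 1.6.x (which CXF 2.6.1 compiles against); it prints which jar the interface is actually loaded from and fails if that overload is missing:
import org.slf4j.Marker;
import org.slf4j.spi.LocationAwareLogger;

public class Slf4jVersionCheck {
    public static void main(String[] args) throws Exception {
        Class<?> lal = LocationAwareLogger.class;
        // Which slf4j-api jar actually wins on this classpath?
        System.out.println(lal.getProtectionDomain().getCodeSource().getLocation());
        // Throws NoSuchMethodException on an old API; this is the same
        // overload whose absence causes the NoSuchMethodError above.
        lal.getMethod("log", Marker.class, String.class, int.class,
                String.class, Object[].class, Throwable.class);
        System.out.println("slf4j-api has the six-argument log(...) overload");
    }
}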

Arquillian/Shrinkwrap MavenDependencyResolver behind proxy

I'm using Jenkins to execute a Maven build that includes an EJB integration test using Arquillian.
The server hosting Jenkins and running the build is behind a proxy, which shouldn't be a problem because the settings.xml contains valid proxy settings. (On my local system, with no proxy, it works just fine.)
Resolving the dependencies with Maven itself (while running mvn install) works perfectly fine, whether started manually from bash or by Jenkins, but when I use Arquillian's MavenDependencyResolver I get an exception:
Exception
2011-06-09 06:03:59,391 ERROR my.package.test.util.ArchiveUtil - Could not resolve DBUnit Dependency
org.jboss.shrinkwrap.resolver.api.ResolutionException: Unable to collect dependeny tree for a resolution
at org.jboss.shrinkwrap.resolver.impl.maven.MavenBuilderImpl.resolveAsFiles(MavenBuilderImpl.java:320)
at org.jboss.shrinkwrap.resolver.impl.maven.MavenBuilderImpl.resolveAs(MavenBuilderImpl.java:376)
at org.jboss.shrinkwrap.resolver.impl.maven.MavenBuilderImpl.resolveAs(MavenBuilderImpl.java:353)
at org.jboss.shrinkwrap.resolver.impl.maven.MavenBuilderImpl$MavenArtifactBuilderImpl.resolveAs(MavenBuilderImpl.java:450)
at my.package.test.util.ArchiveUtil.createTestArchive(ArchiveUtil.java:125)
at my.package.test.util.ArchiveUtil.<clinit>(ArchiveUtil.java:36)
at my.package.test.util.AbstractTest.createTestArchive(AbstractTest.java:46)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.jboss.arquillian.impl.client.deployment.AnnotationDeploymentScenarioGenerator.invoke(AnnotationDeploymentScenarioGenerator.java:162)
at org.jboss.arquillian.impl.client.deployment.AnnotationDeploymentScenarioGenerator.generateDeployment(AnnotationDeploymentScenarioGenerator.java:100)
at org.jboss.arquillian.impl.client.deployment.AnnotationDeploymentScenarioGenerator.generate(AnnotationDeploymentScenarioGenerator.java:55)
at org.jboss.arquillian.impl.client.deployment.DeploymentGenerator.generateDeployment(DeploymentGenerator.java:76)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.jboss.arquillian.impl.core.ObserverImpl.invoke(ObserverImpl.java:90)
at org.jboss.arquillian.impl.core.EventContextImpl.invokeObservers(EventContextImpl.java:98)
at org.jboss.arquillian.impl.core.EventContextImpl.proceed(EventContextImpl.java:80)
at org.jboss.arquillian.impl.core.ManagerImpl.fire(ManagerImpl.java:126)
at org.jboss.arquillian.impl.core.ManagerImpl.fire(ManagerImpl.java:106)
at org.jboss.arquillian.impl.core.EventImpl.fire(EventImpl.java:67)
at org.jboss.arquillian.impl.client.ContainerEventController.execute(ContainerEventController.java:68)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.jboss.arquillian.impl.core.ObserverImpl.invoke(ObserverImpl.java:90)
at org.jboss.arquillian.impl.core.EventContextImpl.invokeObservers(EventContextImpl.java:98)
at org.jboss.arquillian.impl.core.EventContextImpl.proceed(EventContextImpl.java:80)
at org.jboss.arquillian.impl.TestContextHandler.createClassContext(TestContextHandler.java:68)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.jboss.arquillian.impl.core.ObserverImpl.invoke(ObserverImpl.java:90)
at org.jboss.arquillian.impl.core.EventContextImpl.proceed(EventContextImpl.java:87)
at org.jboss.arquillian.impl.TestContextHandler.createSuiteContext(TestContextHandler.java:54)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.jboss.arquillian.impl.core.ObserverImpl.invoke(ObserverImpl.java:90)
at org.jboss.arquillian.impl.core.EventContextImpl.proceed(EventContextImpl.java:87)
at org.jboss.arquillian.impl.core.ManagerImpl.fire(ManagerImpl.java:126)
at org.jboss.arquillian.impl.core.ManagerImpl.fire(ManagerImpl.java:106)
at org.jboss.arquillian.impl.EventTestRunnerAdaptor.beforeClass(EventTestRunnerAdaptor.java:70)
at org.jboss.arquillian.junit.Arquillian$2.evaluate(Arquillian.java:170)
at org.jboss.arquillian.junit.Arquillian.multiExecute(Arquillian.java:303)
at org.jboss.arquillian.junit.Arquillian.access$300(Arquillian.java:45)
at org.jboss.arquillian.junit.Arquillian$3.evaluate(Arquillian.java:187)
at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
at org.jboss.arquillian.junit.Arquillian.run(Arquillian.java:127)
at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:59)
at org.apache.maven.surefire.suite.AbstractDirectoryTestSuite.executeTestSet(AbstractDirectoryTestSuite.java:120)
at org.apache.maven.surefire.suite.AbstractDirectoryTestSuite.execute(AbstractDirectoryTestSuite.java:103)
at org.apache.maven.surefire.Surefire.run(Surefire.java:169)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.maven.surefire.booter.SurefireBooter.runSuitesInProcess(SurefireBooter.java:350)
at org.apache.maven.surefire.booter.SurefireBooter.main(SurefireBooter.java:1021)
Caused by: org.sonatype.aether.collection.DependencyCollectionException: Failed to collect dependencies for [org.dbunit:dbunit:jar:2.4.8 ()]
at org.sonatype.aether.impl.internal.DefaultDependencyCollector.collectDependencies(DefaultDependencyCollector.java:251)
at org.sonatype.aether.impl.internal.DefaultRepositorySystem.collectDependencies(DefaultRepositorySystem.java:267)
at org.sonatype.aether.impl.internal.DefaultRepositorySystem.resolveDependencies(DefaultRepositorySystem.java:314)
at org.jboss.shrinkwrap.resolver.impl.maven.MavenRepositorySystem.resolveDependencies(MavenRepositorySystem.java:176)
at org.jboss.shrinkwrap.resolver.impl.maven.MavenBuilderImpl.resolveAsFiles(MavenBuilderImpl.java:316)
... 65 more
Caused by: org.sonatype.aether.resolution.ArtifactDescriptorException: Failed to read artifact descriptor for org.dbunit:dbunit:jar:2.4.8
at org.apache.maven.repository.internal.DefaultArtifactDescriptorReader.loadPom(DefaultArtifactDescriptorReader.java:275)
at org.apache.maven.repository.internal.DefaultArtifactDescriptorReader.readArtifactDescriptor(DefaultArtifactDescriptorReader.java:171)
at org.sonatype.aether.impl.internal.DefaultDependencyCollector.process(DefaultDependencyCollector.java:419)
at org.sonatype.aether.impl.internal.DefaultDependencyCollector.collectDependencies(DefaultDependencyCollector.java:233)
... 69 more
Caused by: org.sonatype.aether.resolution.ArtifactResolutionException: Could not transfer artifact org.dbunit:dbunit:pom:2.4.8 from/to central (http://repo1.maven.org/maven2): Error transferring file: Connection timed out
at org.sonatype.aether.impl.internal.DefaultArtifactResolver.resolveArtifacts(DefaultArtifactResolver.java:499)
at org.sonatype.aether.impl.internal.DefaultArtifactResolver.resolveArtifact(DefaultArtifactResolver.java:187)
at org.apache.maven.repository.internal.DefaultArtifactDescriptorReader.loadPom(DefaultArtifactDescriptorReader.java:260)
... 72 more
Caused by: org.sonatype.aether.transfer.ArtifactTransferException: Could not transfer artifact org.dbunit:dbunit:pom:2.4.8 from/to central (http://repo1.maven.org/maven2): Error transferring file: Connection timed out
at org.sonatype.aether.connector.wagon.WagonRepositoryConnector$4.wrap(WagonRepositoryConnector.java:934)
at org.sonatype.aether.connector.wagon.WagonRepositoryConnector$4.wrap(WagonRepositoryConnector.java:925)
at org.sonatype.aether.connector.wagon.WagonRepositoryConnector$GetTask.flush(WagonRepositoryConnector.java:681)
at org.sonatype.aether.connector.wagon.WagonRepositoryConnector$GetTask.flush(WagonRepositoryConnector.java:675)
at org.sonatype.aether.connector.wagon.WagonRepositoryConnector.get(WagonRepositoryConnector.java:420)
at org.sonatype.aether.impl.internal.DefaultArtifactResolver.resolveArtifacts(DefaultArtifactResolver.java:411)
... 74 more
Caused by: org.apache.maven.wagon.TransferFailedException: Error transferring file: Connection timed out
at org.apache.maven.wagon.providers.http.LightweightHttpWagon.resourceExists(LightweightHttpWagon.java:357)
at org.sonatype.aether.connector.wagon.WagonRepositoryConnector$GetTask.run(WagonRepositoryConnector.java:566)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Caused by: java.net.ConnectException: Connection timed out
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351)
at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213)
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
at java.net.Socket.connect(Socket.java:529)
at java.net.Socket.connect(Socket.java:478)
at sun.net.NetworkClient.doConnect(NetworkClient.java:163)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:394)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:529)
at sun.net.www.http.HttpClient.<init>(HttpClient.java:233)
at sun.net.www.http.HttpClient.New(HttpClient.java:306)
at sun.net.www.http.HttpClient.New(HttpClient.java:323)
at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:970)
at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:911)
at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:836)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1172)
at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:379)
at org.apache.maven.wagon.providers.http.LightweightHttpWagon.resourceExists(LightweightHttpWagon.java:334)
... 4 more
Java Source
// ...
try {
    ear.addAsLibrary(DependencyResolvers
            .use(MavenDependencyResolver.class)
            // 2.4.8 matches the version in the stack trace above
            .artifact("org.dbunit:dbunit:2.4.8")
            .resolveAs(JavaArchive.class).iterator().next());
} catch (Throwable t) {
    LOGGER.error("Could not resolve DBUnit Dependency", t);
}
// ...
pom.xml
<dependency>
<groupId>org.dbunit</groupId>
<artifactId>dbunit</artifactId>
<version>${dbunit.version}</version>
<scope>test</scope>
</dependency>
How can I make sure this Java code knows there is a proxy (if the missing proxy is indeed the issue)? Do I have to use environment variables or a settings.xml?
Is there another way to get this dependency with ShrinkWrap, e.g. as a jar? (I guess that would be my preferred solution.)
Environment Details
Server OS: Ubuntu 10.04
Maven 3.0.3
Jenkins 1.413 running on Tomcat 7
Arquillian 1.0.0.Alpha5
Thank you.
Daniel
Proxy support for dependency resolution is not currently implemented.
However, you can specify a path to a settings.xml, which is used to activate specific repositories mentioned directly or in active (activated) profiles.
For a comprehensive example, see:
https://github.com/shrinkwrap/resolver/blob/master/impl-maven/src/test/java/org/jboss/shrinkwrap/resolver/impl/maven/integration/ProfilesUnitTestCase.java
Basically, to summarize the settings.xml magic:
By default, ${user.home}/.m2/settings.xml is used
This can be overridden by setting the system property org.apache.maven.user-settings (or org.apache.maven.global-settings for the global settings)
You can also call MavenDependencyResolver.configureFrom(path-to-settings.xml-file) in your test, as shown below
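For example, a minimal sketch of the configureFrom approach (the settings path is hypothetical; point it at a settings.xml that contains your proxy definition):
// Inside createTestArchive(), point the resolver at the CI's settings.xml
// so its <proxies> section is honored. The path is a placeholder.
ear.addAsLibrary(DependencyResolvers
        .use(MavenDependencyResolver.class)
        .configureFrom("/var/lib/jenkins/.m2/settings.xml")
        .artifact("org.dbunit:dbunit:2.4.8")
        .resolveAs(JavaArchive.class).iterator().next());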
Cheers,
Karel
I'm running into the same problem. I've tried setting -Dhttp.proxyHost and -Dhttp.proxyPort on the JVM running the test, but still with no success. I finally had to resolve the Maven dependency via the command line, so that it was downloaded to my local repository, before the test would start working. It would seem reasonable to assume that the Maven dependency resolver, running inside the JVM, would use the JVM's HttpURLConnection class (and therefore its proxy properties) to download the library. That's apparently not the case.
