CDAP / CDAP-3563

Undocumented requirement for read perms on /hbase


    Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 3.0.6, 3.2.0
    • Component/s: CDAP
    • Labels: None
    • Release Notes: Removed faulty and unused metrics around CDAP file resource usage.

      Description

      Setting up a Kerberos-enabled cluster via Cloudera Manager results in the exception below. It appears CDAP requires read permission on the HDFS /hbase directory. Why? This requirement must have been introduced recently, and it is not documented anywhere I could find. One impact is that it adds another manual step for Cloudera Manager users, since CM sets restrictive permissions on /hbase. (A standalone repro sketch follows the trace.)

      15/09/01 03:16:02 WARN c.c.c.i.a.r.d.DistributedProgramRuntimeService: Exception getting hdfs metrics
      org.apache.hadoop.security.AccessControlException: Permission denied: user=cdap, access=READ_EXECUTE, inode="/hbase":hbase:hbase:drwx------
      	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
      	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
      	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:151)
      	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
      	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
      	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6581)
      	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPathAccess(FSNamesystem.java:6506)
      	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getListingInt(FSNamesystem.java:5043)
      	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getListing(FSNamesystem.java:5004)
      	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getListing(NameNodeRpcServer.java:868)
      	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getListing(AuthorizationProviderProxyClientProtocol.java:334)
      	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getListing(ClientNamenodeProtocolServerSideTranslatorPB.java:613)
      	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
      	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
      	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
      	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
      	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
      	at java.security.AccessController.doPrivileged(Native Method)
      	at javax.security.auth.Subject.doAs(Subject.java:415)
      	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
      	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)
      
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.7.0_67]
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) ~[na:1.7.0_67]
      	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.7.0_67]
      	at java.lang.reflect.Constructor.newInstance(Constructor.java:526) ~[na:1.7.0_67]
      	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106) ~[hadoop-common-2.6.0-cdh5.4.5.jar:na]
      	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73) ~[hadoop-common-2.6.0-cdh5.4.5.jar:na]
      	at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:1965) ~[hadoop-hdfs-2.6.0-cdh5.4.5.jar:na]
      	at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:1946) ~[hadoop-hdfs-2.6.0-cdh5.4.5.jar:na]
      	at org.apache.hadoop.hdfs.DistributedFileSystem.listStatusInternal(DistributedFileSystem.java:693) ~[hadoop-hdfs-2.6.0-cdh5.4.5.jar:na]
      	at org.apache.hadoop.hdfs.DistributedFileSystem.access$600(DistributedFileSystem.java:105) ~[hadoop-hdfs-2.6.0-cdh5.4.5.jar:na]
      	at org.apache.hadoop.hdfs.DistributedFileSystem$15.doCall(DistributedFileSystem.java:755) ~[hadoop-hdfs-2.6.0-cdh5.4.5.jar:na]
      	at org.apache.hadoop.hdfs.DistributedFileSystem$15.doCall(DistributedFileSystem.java:751) ~[hadoop-hdfs-2.6.0-cdh5.4.5.jar:na]
      	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-2.6.0-cdh5.4.5.jar:na]
      	at org.apache.hadoop.hdfs.DistributedFileSystem.listStatus(DistributedFileSystem.java:751) ~[hadoop-hdfs-2.6.0-cdh5.4.5.jar:na]
      	at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1485) ~[hadoop-common-2.6.0-cdh5.4.5.jar:na]
      	at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1525) ~[hadoop-common-2.6.0-cdh5.4.5.jar:na]
      	at co.cask.cdap.internal.app.runtime.distributed.DistributedProgramRuntimeService$ClusterResourceReporter.reportClusterStorage(DistributedProgramRuntimeService.java:565) [co.cask.cdap.cdap-app-fabric-3.1.1.jar:na]
      	at co.cask.cdap.internal.app.runtime.distributed.DistributedProgramRuntimeService$ClusterResourceReporter.reportResources(DistributedProgramRuntimeService.java:495) [co.cask.cdap.cdap-app-fabric-3.1.1.jar:na]
      	at co.cask.cdap.internal.app.runtime.AbstractResourceReporter.runOneIteration(AbstractResourceReporter.java:72) [co.cask.cdap.cdap-app-fabric-3.1.1.jar:na]
      	at com.google.common.util.concurrent.AbstractScheduledService$1$1.run(AbstractScheduledService.java:170) [com.google.guava.guava-13.0.1.jar:na]
      	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) [na:1.7.0_67]
      	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304) [na:1.7.0_67]
      	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178) [na:1.7.0_67]
      	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [na:1.7.0_67]
      	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_67]
      	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_67]
      	at java.lang.Thread.run(Thread.java:745) [na:1.7.0_67]
      Caused by: org.apache.hadoop.ipc.RemoteException: Permission denied: user=cdap, access=READ_EXECUTE, inode="/hbase":hbase:hbase:drwx------
      	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
      	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
      	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:151)
      	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
      	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
      	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6581)
      	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPathAccess(FSNamesystem.java:6506)
      	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getListingInt(FSNamesystem.java:5043)
      	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getListing(FSNamesystem.java:5004)
      	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getListing(NameNodeRpcServer.java:868)
      	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getListing(AuthorizationProviderProxyClientProtocol.java:334)
      	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getListing(ClientNamenodeProtocolServerSideTranslatorPB.java:613)
      	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
      	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
      	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
      	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
      	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
      	at java.security.AccessController.doPrivileged(Native Method)
      	at javax.security.auth.Subject.doAs(Subject.java:415)
      	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
      	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)
      
      	at org.apache.hadoop.ipc.Client.call(Client.java:1468) ~[hadoop-common-2.6.0-cdh5.4.5.jar:na]
      	at org.apache.hadoop.ipc.Client.call(Client.java:1399) ~[hadoop-common-2.6.0-cdh5.4.5.jar:na]
      	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.0-cdh5.4.5.jar:na]
      	at com.sun.proxy.$Proxy36.getListing(Unknown Source) ~[na:na]
      	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:554) ~[hadoop-hdfs-2.6.0-cdh5.4.5.jar:na]
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_67]
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_67]
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_67]
      	at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_67]
      	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.0-cdh5.4.5.jar:na]
      	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.0-cdh5.4.5.jar:na]
      	at com.sun.proxy.$Proxy37.getListing(Unknown Source) ~[na:na]
      	at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:1963) ~[hadoop-hdfs-2.6.0-cdh5.4.5.jar:na]
      	... 20 common frames omitted
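
      For context, the warning originates in ClusterResourceReporter.reportClusterStorage, which walks HDFS with FileSystem.listStatus to compute storage metrics and trips over /hbase (owner hbase:hbase, mode drwx------). Below is a minimal standalone repro of the failing call; the defensive catch is only an illustration of how a reporter could skip unreadable paths, not the actual CDAP fix (per the release note, the fix removed the faulty metric entirely):

          // Hypothetical sketch: reproduces the listing that fails for the
          // 'cdap' user, and skips unreadable paths instead of logging a
          // full stack trace on every reporting interval.
          import java.io.IOException;
          import org.apache.hadoop.conf.Configuration;
          import org.apache.hadoop.fs.FileStatus;
          import org.apache.hadoop.fs.FileSystem;
          import org.apache.hadoop.fs.Path;
          import org.apache.hadoop.security.AccessControlException;

          public class ClusterStorageProbe {
            public static void main(String[] args) throws IOException {
              FileSystem fs = FileSystem.get(new Configuration());
              try {
                // The call that triggers AccessControlException against
                // inode "/hbase":hbase:hbase:drwx------ when run as 'cdap'.
                FileStatus[] statuses = fs.listStatus(new Path("/hbase"));
                System.out.println("Entries under /hbase: " + statuses.length);
              } catch (AccessControlException e) {
                // A reporter could skip paths it cannot read rather than fail.
                System.err.println("Skipping unreadable path /hbase: " + e.getMessage());
              }
            }
          }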
      
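      Until a fixed version (3.0.6 / 3.2.0) is deployed, one manual workaround would be to grant the cdap user read/execute on /hbase via an HDFS ACL, assuming ACLs are enabled (dfs.namenode.acls.enabled=true) and the code runs as the hbase owner or an HDFS superuser. A sketch using the FileSystem ACL API, equivalent to "hdfs dfs -setfacl -m user:cdap:r-x /hbase":

          // Hypothetical workaround sketch: grant user:cdap:r-x on /hbase.
          import java.util.Collections;
          import org.apache.hadoop.conf.Configuration;
          import org.apache.hadoop.fs.FileSystem;
          import org.apache.hadoop.fs.Path;
          import org.apache.hadoop.fs.permission.AclEntry;
          import org.apache.hadoop.fs.permission.AclEntryScope;
          import org.apache.hadoop.fs.permission.AclEntryType;
          import org.apache.hadoop.fs.permission.FsAction;

          public class GrantCdapReadOnHbase {
            public static void main(String[] args) throws Exception {
              // Must run as a principal allowed to modify ACLs on /hbase.
              FileSystem fs = FileSystem.get(new Configuration());
              AclEntry entry = new AclEntry.Builder()
                  .setScope(AclEntryScope.ACCESS)
                  .setType(AclEntryType.USER)
                  .setName("cdap")
                  .setPermission(FsAction.READ_EXECUTE)
                  .build();
              // Adds user:cdap:r-x without replacing existing ACL entries.
              fs.modifyAclEntries(new Path("/hbase"), Collections.singletonList(entry));
            }
          }

      Note that widening access to /hbase has security implications on a Kerberos-enabled cluster; the fix shipped in 3.0.6 and 3.2.0 avoids the need for it.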
      



    People

    • Assignee: Albert Shau (ashau)
    • Reporter: Derek Wood (derek)
    • Votes: 0
    • Watchers: 4
