CDH: mkdir: Permission denied: user=root, access=WRITE, inode="/user":hdfs:hadoop:drwxr-xr-x in a CDH5 environment

Cause of the problem:

The environment is Hadoop 2, deployed with CDH5.
When creating a directory with hadoop fs -mkdir /user/xxx, a permission error occurs.

Prerequisite: the current user zhangsan and root have already been added to /etc/sudoers:

su root

vi /etc/sudoers
root ALL=(ALL) ALL
zhangsan ALL=(ALL) NOPASSWD:ALL

su zhangsan
[zhangsan@cdh107 ~]$ hadoop fs -ls /user
Found 7 items
drwxrwxrwx - mapred hadoop 0 2018-11-12 14:07 /user/history
drwxrwxr-t - hive hive 0 2018-11-12 14:09 /user/hive
drwxrwxr-x - hue hue 0 2018-11-12 14:09 /user/hue
drwxrwxr-x - impala impala 0 2018-11-12 14:09 /user/impala
drwxrwxr-x - oozie oozie 0 2018-11-12 14:09 /user/oozie
drwxr-x--x - spark spark 0 2018-11-12 15:14 /user/spark
drwxrwxr-x - sqoop2 sqoop 0 2018-11-12 14:08 /user/sqoop2

Creating the directory /user/wangyou throws an exception:

[zhangsan@cdh107 ~]$ hadoop fs -mkdir /user/wangyou
mkdir: Permission denied: user=zhangsan, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
[zhangsan@cdh107 ~]$ sudo hadoop fs -mkdir /user/wangyou
mkdir: Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x

Solution steps:

1. Check the owner and group of the /user directory

[zhangsan@cdh107 ~]$ hadoop fs -ls /
Found 3 items
drwxr-xr-x - hbase hbase 0 2018-11-12 14:59 /hbase
drwxrwxrwt - hdfs supergroup 0 2018-11-12 14:09 /tmp
drwxr-xr-x - hdfs supergroup 0 2018-11-12 14:09 /user

Note: as shown above, we want to create a directory under /user, but /user is owned by user hdfs and group supergroup, so the attempts as zhangsan and as root both fail. We therefore need to create /user/wangyou as a user that actually has write access to /user.
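As a quick check before changing anything (a sketch: hdfs groups is a standard HDFS subcommand, and the group memberships it prints are specific to your cluster), you can confirm which groups HDFS resolves each user to:

hdfs groups zhangsan     # groups HDFS maps the user zhangsan to
hdfs groups hdfs         # hdfs is the HDFS superuser and is typically in supergroup

Unless zhangsan belongs to a group with write access to /user, the mkdir will keep failing even under sudo, because HDFS checks the client-side user name; local root privileges mean nothing to the NameNode.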

2. Create the directory /user/wangyou as the hdfs user

[zhangsan@cdh107 ~]$ sudo -uhdfs hadoop fs -mkdir /user/wangyou
[zhangsan@cdh107 ~]$ hadoop fs -ls /user
Found 8 items
drwxr-xr-x - hdfs supergroup 0 2018-11-13 16:43 /user/wangyou
drwxrwxrwx - mapred hadoop 0 2018-11-12 14:07 /user/history
drwxrwxr-t - hive hive 0 2018-11-12 14:09 /user/hive
drwxrwxr-x - hue hue 0 2018-11-12 14:09 /user/hue
drwxrwxr-x - impala impala 0 2018-11-12 14:09 /user/impala
drwxrwxr-x - oozie oozie 0 2018-11-12 14:09 /user/oozie
drwxr-x--x - spark spark 0 2018-11-12 15:14 /user/spark
drwxrwxr-x - sqoop2 sqoop 0 2018-11-12 14:08 /user/sqoop2

3. As the hdfs user, change the ownership of the new directory to zhangsan

[zhangsan@cdh107 ~]$ sudo -uhdfs hadoop fs -chown zhangsan:zhangsan /user/wangyou
[zhangsan@cdh107 ~]$ hadoop fs -ls /user
Found 8 items
drwxr-xr-x - zhangsan zhangsan 0 2018-11-13 16:43 /user/wangyou
drwxrwxrwx - mapred hadoop 0 2018-11-12 14:07 /user/history
drwxrwxr-t - hive hive 0 2018-11-12 14:09 /user/hive
drwxrwxr-x - hue hue 0 2018-11-12 14:09 /user/hue
drwxrwxr-x - impala impala 0 2018-11-12 14:09 /user/impala
drwxrwxr-x - oozie oozie 0 2018-11-12 14:09 /user/oozie
drwxr-x--x - spark spark 0 2018-11-12 15:14 /user/spark
drwxrwxr-x - sqoop2 sqoop 0 2018-11-12 14:08 /user/sqoop2
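
With the ownership changed, zhangsan can write into the directory without sudo. A quick verification sketch (the test subdirectory name is arbitrary):

hadoop fs -mkdir /user/wangyou/test    # should now succeed as zhangsan
hadoop fs -rm -r /user/wangyou/test    # remove the test directory again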

If spark2-shell throws this error, see "Problem solved: Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x" for how to resolve it.
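One common workaround in that situation (an assumption here, not taken from the referenced article) is to prepare an HDFS home directory for the user launching the shell, using the same approach as above, e.g. for root:

sudo -u hdfs hadoop fs -mkdir /user/root             # create the home directory as the hdfs superuser
sudo -u hdfs hadoop fs -chown root:root /user/root   # hand it over to root
# Alternatively, with simple (non-Kerberos) authentication, run the shell as the hdfs user:
# HADOOP_USER_NAME=hdfs spark2-shell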

 

Original author: spark
Original article: https://www.cnblogs.com/yy3b2007com/p/9953191.html