Hadoop DataNode fails to start (All directories in dfs.data.dir are invalid)

Starting the DataNode fails, and its log shows the following exception:

java.io.IOException: All directories in dfs.datanode.data.dir are invalid: "/usr/local/hadoop-2.4.0/dfs/data" 
    at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:1900)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1873)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1772)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1812)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1988)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2012)
2014-08-26 20:02:08,720 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1

 

Cause: the data directory carried an extra group write bit (e.g. mode 775). The DataNode's storage-location check rejects a directory whose permissions are more permissive than expected, so the node exits with the error above and the cluster cannot start.
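
As a quick check, you can compare the directory's actual mode with what the DataNode expects. This is a minimal sketch: the property dfs.datanode.data.dir.perm is the real HDFS setting that controls the expected mode, but the config path below assumes a typical 2.4.0 tarball layout under /usr/local/hadoop-2.4.0.

    # Show the actual mode, owner and group of the data directory named in the error
    stat -c '%a %U:%G %n' /usr/local/hadoop-2.4.0/dfs/data
    # See whether an expected mode is configured explicitly via dfs.datanode.data.dir.perm
    grep -A 2 'dfs.datanode.data.dir.perm' /usr/local/hadoop-2.4.0/etc/hadoop/hdfs-site.xml

If the actual mode is looser than the expected one (here it had the group write bit set), the check in DataNode.checkStorageLocations fails and the process exits with status 1, as in the log above.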

Solution: recursively reset the permissions on the dfs directory to 755 (removing the group write bit):

root@VM_160_34_centos:/usr/local/hadoop-2.4.0> chmod -R 755 dfs
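
After changing the mode, the following is one way to confirm the fix and bring the DataNode back up. This is a hedged sketch: sbin/hadoop-daemon.sh and the JDK's jps tool are standard in a 2.4.0 tarball install, but the paths are assumed from the prompt above and may differ on your machine.

    # Confirm the mode is now 755 (rwxr-xr-x)
    stat -c '%a %n' /usr/local/hadoop-2.4.0/dfs/data
    # Restart the DataNode and verify it stays up
    /usr/local/hadoop-2.4.0/sbin/hadoop-daemon.sh start datanode
    jps    # a DataNode process should now be listed

Note that if the data directory is owned by a different user than the one running the DataNode, you may also need to chown the tree to that user; the original post runs everything as root, so chmod alone was sufficient there.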

 

    Original author: mjorcen
    Original source: https://www.cnblogs.com/mjorcen/p/3938126.html