HDFS permission problem
Submitting a Hadoop job from Eclipse on Windows fails with: org.apache.hadoop.security.AccessControlException: Permission denied: user=mango, access=WRITE
Description: after writing a Hadoop program in Eclipse on Windows and choosing Run on Hadoop, the following error appears:
11/10/28 16:05:53 INFO mapred.JobClient: Running job: job_201110281103_0003
11/10/28 16:05:54 INFO mapred.JobClient: map 0% reduce 0%
11/10/28 16:06:05 INFO mapred.JobClient: Task Id : attempt_201110281103_0003_m_000002_0, Status : FAILED
org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=mango, access=WRITE, inode="tmp":hadoop:supergroup:rwxr-xr-x
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
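Why it happens: when the job is submitted from Eclipse on Windows, the Hadoop client picks up the local Windows account name (mango here) as the HDFS user. That user is not in the supergroup group, and /tmp on HDFS is owned by hadoop:supergroup with mode rwxr-xr-x, so any write under it is denied. For reference, a driver of the kind launched via Run on Hadoop looks roughly like the sketch below; the class name and paths are made up for illustration and are not from the original post.
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class SampleJobDriver {
    public static void main(String[] args) throws Exception {
        // Uses the old mapred API, matching the mapred.JobClient lines in the log above.
        JobConf conf = new JobConf(SampleJobDriver.class);
        conf.setJobName("sample-job");
        // Hypothetical input/output paths; with no mapper/reducer set,
        // the identity classes are used, which is enough to hit the problem.
        FileInputFormat.setInputPaths(conf, new Path("/user/mango/input"));
        FileOutputFormat.setOutputPath(conf, new Path("/tmp/sample-out"));
        // Submitted from Windows, the job runs as user "mango"; writes under /tmp
        // (owned by hadoop:supergroup, rwxr-xr-x) are denied and the tasks fail.
        JobClient.runJob(conf);
    }
}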
Solution:
On the server, edit the Hadoop configuration file conf/hdfs-site.xml, find the dfs.permissions property, and change its value to false:
<property>
    <name>dfs.permissions</name>
    <value>false</value>
    <description>
        If "true", enable permission checking in HDFS.
        If "false", permission checking is turned off,
        but all other behavior is unchanged.
        Switching from one parameter value to the other does not change the mode,
        owner or group of files or directories.
    </description>
</property>
If this property is not in the configuration file, you can add it yourself.
After the change, it seems the Hadoop processes need to be restarted for it to take effect.
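To confirm the change took effect, you can run a small client from the Windows machine and check that a write under /tmp no longer throws AccessControlException. This check is a sketch added here, not part of the original post; the namenode address and test path are placeholders.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriteCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Hypothetical namenode address; use the value from your own core-site.xml.
        conf.set("fs.default.name", "hdfs://namenode-host:9000");
        FileSystem fs = FileSystem.get(conf);
        // Try to create a directory under /tmp, which previously failed
        // with AccessControlException for user "mango".
        Path testDir = new Path("/tmp/permission-check");
        System.out.println("mkdirs succeeded: " + fs.mkdirs(testDir));
        fs.delete(testDir, true); // clean up the test directory
        fs.close();
    }
}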
Original post: http://www.cnblogs.com/catWang/p/4014696.html