tail -n 20 /usr/local/hadoop/logs/hadoop-hadoop-datanode-hadoop-VirtualBox.log
I was getting the following error:
STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common -r 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
STARTUP_MSG: java = 1.7.0_55
************************************************************/
2014-08-24 19:26:09,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2014-08-24 19:26:14,307 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Invalid dfs.datanode.data.dir /home/hadoop/mydata/hdfs/datanode :
EPERM: Operation not permitted
    at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmodImpl(Native Method)
    at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmod(NativeIO.java:158)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:635)
    at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:468)
    at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:130)
    at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:146)
    at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:1698)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getDataDirsFromURIs(DataNode.java:1745)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1722)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1642)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1665)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1837)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1858)
2014-08-24 19:26:14,366 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.io.IOException: All directories in dfs.datanode.data.dir are invalid: "/home/hadoop/mydata/hdfs/datanode"
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getDataDirsFromURIs(DataNode.java:1754)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1722)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1642)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1665)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1837)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1858)
2014-08-24 19:26:14,403 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2014-08-24 19:26:14,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
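The key line is the EPERM thrown by NativeIO$POSIX.chmod: POSIX only lets the owner of a directory (or root) change its permissions, so the DataNode fails because the user it runs as does not own the data directory. A quick diagnosis, using the path from the log above (adjust it to your own layout):

ls -ld /home/hadoop/mydata/hdfs/datanode    # who owns the directory?
ps -ef | grep -i datanode                   # which user is the DataNode running as?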
I fixed the problem with the help of this post; basically:
1.) Stop the cluster: stop-all.sh
2.) Change the owner and permissions of the folder that the dfs.datanode.data.dir property points to (as specified in hdfs-site.xml; the check below shows how to confirm the path):
sudo chown -R hduser:hadoop /home/hadoop/mydata/hdfs/
sudo chmod -R 777 /home/hadoop/mydata/hdfs/
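If you are unsure which directory the property points to, Hadoop can print it; a quick check (assuming the hdfs command from your Hadoop installation is on the PATH):

hdfs getconf -confKey dfs.datanode.data.dir    # print the configured data directory
ls -ld /home/hadoop/mydata/hdfs/datanode       # confirm the new owner and permissions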
3.) Format the namenode (note: this wipes the filesystem metadata, so anything already stored in HDFS is lost) and start everything again:
hadoop namenode -format
start-all.sh
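To verify the fix, a couple of sanity checks (jps ships with the JDK):

jps                      # DataNode should now appear in the process list
hdfs dfsadmin -report    # the datanode should register and report capacity
tail -n 20 /usr/local/hadoop/logs/hadoop-hadoop-datanode-hadoop-VirtualBox.log   # log should end in a clean startup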