Unable to execute command start-all.sh in Hadoop
I'm following the tutorial How to install Hadoop? (the one posted by Luis Alvarado in the comments there).
I'm on Ubuntu 13.10 64-bit, and the Hadoop version is 2.2.0.

I'm a complete newcomer to Hadoop; my team and I are starting work on a Big Data project, so any help is appreciated.
I know the tutorial is based on earlier versions of Hadoop, but I managed to make it through step 11.
The output of that step is:




root@sandesh-Inspiron-1564:/home/hduser/hadoop# sudo ./bin/hadoop namenode -format
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

14/03/24 20:29:54 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = sandesh-Inspiron-1564/127.0.1.1
STARTUP_MSG: args = [-format]
STARTUP_MSG: version = 2.2.0
STARTUP_MSG: classpath = /home/hduser/hadoop/etc/hadoop:/home/hduser/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/home/hduser/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/home/hduser/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-io-2.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/home/hduser/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/home/hduser/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/home/hduser/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/home/hduser/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/home/hduser/hadoop/share/hadoop/common/lib/zookeeper-3.4.5.jar:/home/hduser/hadoop/share/hadoop/common/lib/hadoop-auth-2.2.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-logging-1.1.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/home/hduser/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/junit-4.8.2.jar:/home/hduser/hadoop/share/hadoop/common/lib/hadoop-annotations-2.2.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/home/hduser/hadoop/share/hadoop/common/lib/jets3t-0.6.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/stax-api-1.0.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/home/hduser/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/home/hduser/hadoop/share/hadoop/common/lib/jsch-0.1.42.
jar:/home/hduser/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/common/lib/jackson-core-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-lang-2.5.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/asm-3.2.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-math-2.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/xz-1.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.8.8.jar:/home/hduser/hadoop/share/hadoop/common/lib/activation-1.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/home/hduser/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/home/hduser/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/home/hduser/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/jackson-xc-1.8.8.jar:/home/hduser/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/home/hduser/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/home/hduser/hadoop/share/hadoop/common/hadoop-nfs-2.2.0.jar:/home/hduser/hadoop/share/hadoop/common/hadoop-common-2.2.0.jar:/home/hduser/hadoop/share/hadoop/common/hadoop-common-2.2.0-tests.jar:/home/hduser/hadoop/share/hadoop/hdfs:/home/hduser/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-io-2
.1.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.1.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-lang-2.5.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/home/hduser/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.2.0-tests.jar:/home/hduser/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.2.0.jar:/home/hduser/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/commons-io-2.1.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/paranamer-2.3.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/hadoop-annotations-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/j
unit-4.10.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/hamcrest-core-1.1.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/avro-1.7.4.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/snappy-java-1.0.4.1.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-site-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/commons-io-2.1.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/
home/hduser/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/junit-4.10.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.1.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0-tests.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.2.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common -r 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
STARTUP_MSG: java = 1.7.0_51
************************************************************/
14/03/24 20:29:54 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hduser/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', or link it with '-z noexecstack'.
14/03/24 20:29:54 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Formatting using clusterid: CID-f3d89333-8217-48ce-9281-44b0caed76f9
14/03/24 20:29:55 INFO namenode.HostFileManager: read includes:
HostSet(
)
14/03/24 20:29:55 INFO namenode.HostFileManager: read excludes:
HostSet(
)
14/03/24 20:29:55 INFO blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
14/03/24 20:29:55 INFO util.GSet: Computing capacity for map BlocksMap
14/03/24 20:29:55 INFO util.GSet: VM type = 64-bit
14/03/24 20:29:55 INFO util.GSet: 2.0% max memory = 889 MB
14/03/24 20:29:55 INFO util.GSet: capacity = 2^21 = 2097152 entries
14/03/24 20:29:55 INFO blockmanagement.BlockManager: dfs.block.access.token.enable=false
14/03/24 20:29:55 INFO blockmanagement.BlockManager: defaultReplication = 1
14/03/24 20:29:55 INFO blockmanagement.BlockManager: maxReplication = 512
14/03/24 20:29:55 INFO blockmanagement.BlockManager: minReplication = 1
14/03/24 20:29:55 INFO blockmanagement.BlockManager: maxReplicationStreams = 2
14/03/24 20:29:55 INFO blockmanagement.BlockManager: shouldCheckForEnoughRacks = false
14/03/24 20:29:55 INFO blockmanagement.BlockManager: replicationRecheckInterval = 3000
14/03/24 20:29:55 INFO blockmanagement.BlockManager: encryptDataTransfer = false
14/03/24 20:29:55 INFO namenode.FSNamesystem: fsOwner = root (auth:SIMPLE)
14/03/24 20:29:55 INFO namenode.FSNamesystem: supergroup = supergroup
14/03/24 20:29:55 INFO namenode.FSNamesystem: isPermissionEnabled = true
14/03/24 20:29:55 INFO namenode.FSNamesystem: HA Enabled: false
14/03/24 20:29:55 INFO namenode.FSNamesystem: Append Enabled: true
14/03/24 20:29:55 INFO util.GSet: Computing capacity for map INodeMap
14/03/24 20:29:55 INFO util.GSet: VM type = 64-bit
14/03/24 20:29:55 INFO util.GSet: 1.0% max memory = 889 MB
14/03/24 20:29:55 INFO util.GSet: capacity = 2^20 = 1048576 entries
14/03/24 20:29:55 INFO namenode.NameNode: Caching file names occuring more than 10 times
14/03/24 20:29:55 INFO namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
14/03/24 20:29:55 INFO namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
14/03/24 20:29:55 INFO namenode.FSNamesystem: dfs.namenode.safemode.extension = 30000
14/03/24 20:29:55 INFO namenode.FSNamesystem: Retry cache on namenode is enabled
14/03/24 20:29:55 INFO namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
14/03/24 20:29:55 INFO util.GSet: Computing capacity for map Namenode Retry Cache
14/03/24 20:29:55 INFO util.GSet: VM type = 64-bit
14/03/24 20:29:55 INFO util.GSet: 0.029999999329447746% max memory = 889 MB
14/03/24 20:29:55 INFO util.GSet: capacity = 2^15 = 32768 entries
Re-format filesystem in Storage Directory /app/hadoop/tmp/dfs/name ? (Y or N) y
14/03/24 20:30:01 INFO common.Storage: Storage directory /app/hadoop/tmp/dfs/name has been successfully formatted.
14/03/24 20:30:01 INFO namenode.FSImage: Saving image file /app/hadoop/tmp/dfs/name/current/fsimage.ckpt_0000000000000000000 using no compression
14/03/24 20:30:01 INFO namenode.FSImage: Image file /app/hadoop/tmp/dfs/name/current/fsimage.ckpt_0000000000000000000 of size 196 bytes saved in 0 seconds.
14/03/24 20:30:01 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with txid >= 0
14/03/24 20:30:01 INFO util.ExitUtil: Exiting with status 0
14/03/24 20:30:01 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at sandesh-Inspiron-1564/127.0.1.1
************************************************************/



This is the error I get when I run ./start-all.sh:




hduser@sandesh-Inspiron-1564:~/hadoop$ ./sbin/start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
14/03/24 20:34:57 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hduser/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', or link it with '-z noexecstack'.
localhost]
sed: -e expression #1, char 6: unknown option to `s'
Java: ssh: Could not resolve hostname Java: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
loaded: ssh: Could not resolve hostname loaded: Name or service not known
-c: Unknown cipher type 'cd'
The: ssh: Could not resolve hostname The: Name or service not known
might: ssh: Could not resolve hostname might: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
guard: ssh: Could not resolve hostname guard: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
disabled: ssh: Could not resolve hostname disabled: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
Server: ssh: Could not resolve hostname Server: Name or service not known
highly: ssh: Could not resolve hostname highly: Name or service not known
that: ssh: Could not resolve hostname that: Name or service not known
which: ssh: Could not resolve hostname which: Name or service not known
will: ssh: Could not resolve hostname will: Name or service not known
now.: ssh: Could not resolve hostname now.: Name or service not known
'-z: ssh: Could not resolve hostname '-z: Name or service not known
link: ssh: Could not resolve hostname link: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
guard.: ssh: Could not resolve hostname guard.: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
it: ssh: Could not resolve hostname it: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
try: ssh: Could not resolve hostname try: Name or service not known
',: ssh: Could not resolve hostname ',: Name or service not known
localhost: starting namenode, logging to /home/hduser/hadoop/logs/hadoop-hduser-namenode-sandesh-Inspiron-1564.out
to: ssh: connect to host to port 22: Connection refused
warning:: ssh: Could not resolve hostname warning:: Name or service not known
you: ssh: Could not resolve hostname you: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
It's: ssh: Could not resolve hostname It's: Name or service not known
or: ssh: Could not resolve hostname or: Name or service not known
recommended: ssh: Could not resolve hostname recommended: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known
'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known
localhost: starting datanode, logging to /home/hduser/hadoop/logs/hadoop-hduser-datanode-sandesh-Inspiron-1564.out
Starting secondary namenodes [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hduser/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', or link it with '-z noexecstack'.
0.0.0.0]
sed: -e expression #1, char 6: unknown option to `s'
-c: Unknown cipher type 'cd'
Java: ssh: Could not resolve hostname Java: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
Server: ssh: Could not resolve hostname Server: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ECDSA key fingerprint is 89:fb:3d:98:2c:6d:03:c1:a3:de:96:3b:39:bc:ca:b3.
Are you sure you want to continue connecting (yes/no)? loaded: ssh: Could not resolve hostname loaded: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
guard: ssh: Could not resolve hostname guard: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
The: ssh: Could not resolve hostname The: Name or service not known
disabled: ssh: Could not resolve hostname disabled: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
try: ssh: Could not resolve hostname try: Name or service not known
you: ssh: Could not resolve hostname you: Name or service not known
might: ssh: Could not resolve hostname might: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
highly: ssh: Could not resolve hostname highly: Name or service not known
It's: ssh: Could not resolve hostname It's: Name or service not known
now.: ssh: Could not resolve hostname now.: Name or service not known
that: ssh: Could not resolve hostname that: Name or service not known
which: ssh: Could not resolve hostname which: Name or service not known
guard.: ssh: Could not resolve hostname guard.: Name or service not known
link: ssh: Could not resolve hostname link: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
or: ssh: Could not resolve hostname or: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
recommended: ssh: Could not resolve hostname recommended: Name or service not known
'-z: ssh: Could not resolve hostname '-z: Name or service not known
it: ssh: Could not resolve hostname it: Name or service not known
will: ssh: Could not resolve hostname will: Name or service not known
',: ssh: Could not resolve hostname ',: Name or service not known
'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known
noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known
to: ssh: connect to host to port 22: Connection refused



Output of ./start-dfs.sh:




hduser@sandesh-Inspiron-1564:~/hadoop$ ./sbin/start-dfs.sh
14/03/24 20:57:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hduser/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', or link it with '-z noexecstack'.
localhost]
sed: -e expression #1, char 6: unknown option to `s'
-c: Unknown cipher type 'cd'
'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
now.: ssh: Could not resolve hostname now.: Name or service not known
or: ssh: Could not resolve hostname or: Name or service not known
might: ssh: Could not resolve hostname might: Name or service not known
that: ssh: Could not resolve hostname that: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
recommended: ssh: Could not resolve hostname recommended: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
it: ssh: Could not resolve hostname it: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
',: ssh: Could not resolve hostname ',: Name or service not known
link: ssh: Could not resolve hostname link: Name or service not known
you: ssh: Could not resolve hostname you: Name or service not known
disabled: ssh: Could not resolve hostname disabled: Name or service not known
The: ssh: Could not resolve hostname The: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
guard: ssh: Could not resolve hostname guard: Name or service not known
guard.: ssh: Could not resolve hostname guard.: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
will: ssh: Could not resolve hostname will: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
localhost: starting namenode, logging to /home/hduser/hadoop/logs/hadoop-hduser-namenode-sandesh-Inspiron-1564.out
to: ssh: connect to host to port 22: Connection refused
loaded: ssh: Could not resolve hostname loaded: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
which: ssh: Could not resolve hostname which: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
'-z: ssh: Could not resolve hostname '-z: Name or service not known
try: ssh: Could not resolve hostname try: Name or service not known
highly: ssh: Could not resolve hostname highly: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
Java: ssh: Could not resolve hostname Java: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
Server: ssh: Could not resolve hostname Server: Name or service not known
It's: ssh: Could not resolve hostname It's: Name or service not known
noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known
localhost: starting datanode, logging to /home/hduser/hadoop/logs/hadoop-hduser-datanode-sandesh-Inspiron-1564.out



Output of ./start-yarn.sh:




hduser@sandesh-Inspiron-1564:~/hadoop$ ./sbin/start-yarn.sh
starting yarn daemons
resourcemanager running as process 16118. Stop it first.
localhost: nodemanager running as process 16238. Stop it first.

  • The first line reads: "This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh". Did you try that?
    – terdon
    Mar 24 '14 at 15:22
  • It's giving much the same error; see the updated post.
    – pandaren
    Mar 24 '14 at 15:29
  • You are showing something completely different. Please read the error messages. Both errors tell you, in the first few lines, to use a different program than the one you are using. Try that, and if it still gives you an error, update your question to show us.
    – terdon
    Mar 24 '14 at 15:32
  • I have the same problem too; if you found a solution, please post it.
    – A J
    May 3 '14 at 6:57
  • The problem was with the SSH keys. Regenerate the key with ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa and authorize it with cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
    – pandaren
    May 7 '14 at 13:28
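
As a follow-up to the comment above, here is a minimal sketch of that passwordless-SSH setup, assuming OpenSSH is installed and you run it as the account that launches the Hadoop daemons (hduser in this question):

```shell
# Make sure the .ssh directory exists with the permissions sshd expects.
mkdir -p ~/.ssh && chmod 700 ~/.ssh

# Generate an RSA key with an empty passphrase (-P '').
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa

# Authorize the new public key for logins to this machine.
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys

# Sanity check before rerunning start-dfs.sh / start-yarn.sh:
# BatchMode forbids password prompts, so this only succeeds if
# key-based login to localhost actually works.
ssh -o BatchMode=yes localhost true \
  && echo "passwordless ssh to localhost OK" \
  || echo "key login failed; check that sshd is running"
```

If the check prints the failure message, confirm the ssh server is installed and running (sudo apt-get install openssh-server) before trying the start scripts again.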
unit-4.10.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/hamcrest-core-1.1.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/avro-1.7.4.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/snappy-java-1.0.4.1.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-site-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/commons-io-2.1.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/
home/hduser/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/junit-4.10.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.1.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0-tests.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.2.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common -r 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
STARTUP_MSG: java = 1.7.0_51
************************************************************/
14/03/24 20:29:54 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hduser/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', or link it with '-z noexecstack'.
14/03/24 20:29:54 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Formatting using clusterid: CID-f3d89333-8217-48ce-9281-44b0caed76f9
14/03/24 20:29:55 INFO namenode.HostFileManager: read includes:
HostSet(
)
14/03/24 20:29:55 INFO namenode.HostFileManager: read excludes:
HostSet(
)
14/03/24 20:29:55 INFO blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
14/03/24 20:29:55 INFO util.GSet: Computing capacity for map BlocksMap
14/03/24 20:29:55 INFO util.GSet: VM type = 64-bit
14/03/24 20:29:55 INFO util.GSet: 2.0% max memory = 889 MB
14/03/24 20:29:55 INFO util.GSet: capacity = 2^21 = 2097152 entries
14/03/24 20:29:55 INFO blockmanagement.BlockManager: dfs.block.access.token.enable=false
14/03/24 20:29:55 INFO blockmanagement.BlockManager: defaultReplication = 1
14/03/24 20:29:55 INFO blockmanagement.BlockManager: maxReplication = 512
14/03/24 20:29:55 INFO blockmanagement.BlockManager: minReplication = 1
14/03/24 20:29:55 INFO blockmanagement.BlockManager: maxReplicationStreams = 2
14/03/24 20:29:55 INFO blockmanagement.BlockManager: shouldCheckForEnoughRacks = false
14/03/24 20:29:55 INFO blockmanagement.BlockManager: replicationRecheckInterval = 3000
14/03/24 20:29:55 INFO blockmanagement.BlockManager: encryptDataTransfer = false
14/03/24 20:29:55 INFO namenode.FSNamesystem: fsOwner = root (auth:SIMPLE)
14/03/24 20:29:55 INFO namenode.FSNamesystem: supergroup = supergroup
14/03/24 20:29:55 INFO namenode.FSNamesystem: isPermissionEnabled = true
14/03/24 20:29:55 INFO namenode.FSNamesystem: HA Enabled: false
14/03/24 20:29:55 INFO namenode.FSNamesystem: Append Enabled: true
14/03/24 20:29:55 INFO util.GSet: Computing capacity for map INodeMap
14/03/24 20:29:55 INFO util.GSet: VM type = 64-bit
14/03/24 20:29:55 INFO util.GSet: 1.0% max memory = 889 MB
14/03/24 20:29:55 INFO util.GSet: capacity = 2^20 = 1048576 entries
14/03/24 20:29:55 INFO namenode.NameNode: Caching file names occuring more than 10 times
14/03/24 20:29:55 INFO namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
14/03/24 20:29:55 INFO namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
14/03/24 20:29:55 INFO namenode.FSNamesystem: dfs.namenode.safemode.extension = 30000
14/03/24 20:29:55 INFO namenode.FSNamesystem: Retry cache on namenode is enabled
14/03/24 20:29:55 INFO namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
14/03/24 20:29:55 INFO util.GSet: Computing capacity for map Namenode Retry Cache
14/03/24 20:29:55 INFO util.GSet: VM type = 64-bit
14/03/24 20:29:55 INFO util.GSet: 0.029999999329447746% max memory = 889 MB
14/03/24 20:29:55 INFO util.GSet: capacity = 2^15 = 32768 entries
Re-format filesystem in Storage Directory /app/hadoop/tmp/dfs/name ? (Y or N) y
14/03/24 20:30:01 INFO common.Storage: Storage directory /app/hadoop/tmp/dfs/name has been successfully formatted.
14/03/24 20:30:01 INFO namenode.FSImage: Saving image file /app/hadoop/tmp/dfs/name/current/fsimage.ckpt_0000000000000000000 using no compression
14/03/24 20:30:01 INFO namenode.FSImage: Image file /app/hadoop/tmp/dfs/name/current/fsimage.ckpt_0000000000000000000 of size 196 bytes saved in 0 seconds.
14/03/24 20:30:01 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with txid >= 0
14/03/24 20:30:01 INFO util.ExitUtil: Exiting with status 0
14/03/24 20:30:01 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at sandesh-Inspiron-1564/127.0.1.1
************************************************************/



Now this is the error I get when executing
./start-all.sh




hduser@sandesh-Inspiron-1564:~/hadoop$ ./sbin/start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
14/03/24 20:34:57 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hduser/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', or link it with '-z noexecstack'.
localhost]
sed: -e expression #1, char 6: unknown option to `s'
Java: ssh: Could not resolve hostname Java: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
loaded: ssh: Could not resolve hostname loaded: Name or service not known
-c: Unknown cipher type 'cd'
The: ssh: Could not resolve hostname The: Name or service not known
might: ssh: Could not resolve hostname might: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
guard: ssh: Could not resolve hostname guard: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
disabled: ssh: Could not resolve hostname disabled: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
Server: ssh: Could not resolve hostname Server: Name or service not known
highly: ssh: Could not resolve hostname highly: Name or service not known
that: ssh: Could not resolve hostname that: Name or service not known
which: ssh: Could not resolve hostname which: Name or service not known
will: ssh: Could not resolve hostname will: Name or service not known
now.: ssh: Could not resolve hostname now.: Name or service not known
'-z: ssh: Could not resolve hostname '-z: Name or service not known
link: ssh: Could not resolve hostname link: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
guard.: ssh: Could not resolve hostname guard.: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
it: ssh: Could not resolve hostname it: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
try: ssh: Could not resolve hostname try: Name or service not known
',: ssh: Could not resolve hostname ',: Name or service not known
localhost: starting namenode, logging to /home/hduser/hadoop/logs/hadoop-hduser-namenode-sandesh-Inspiron-1564.out
to: ssh: connect to host to port 22: Connection refused
warning:: ssh: Could not resolve hostname warning:: Name or service not known
you: ssh: Could not resolve hostname you: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
It's: ssh: Could not resolve hostname It's: Name or service not known
or: ssh: Could not resolve hostname or: Name or service not known
recommended: ssh: Could not resolve hostname recommended: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known
'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known
localhost: starting datanode, logging to /home/hduser/hadoop/logs/hadoop-hduser-datanode-sandesh-Inspiron-1564.out
Starting secondary namenodes [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hduser/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', or link it with '-z noexecstack'.
0.0.0.0]
sed: -e expression #1, char 6: unknown option to `s'
-c: Unknown cipher type 'cd'
Java: ssh: Could not resolve hostname Java: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
Server: ssh: Could not resolve hostname Server: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ECDSA key fingerprint is 89:fb:3d:98:2c:6d:03:c1:a3:de:96:3b:39:bc:ca:b3.
Are you sure you want to continue connecting (yes/no)? loaded: ssh: Could not resolve hostname loaded: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
guard: ssh: Could not resolve hostname guard: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
The: ssh: Could not resolve hostname The: Name or service not known
disabled: ssh: Could not resolve hostname disabled: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
try: ssh: Could not resolve hostname try: Name or service not known
you: ssh: Could not resolve hostname you: Name or service not known
might: ssh: Could not resolve hostname might: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
highly: ssh: Could not resolve hostname highly: Name or service not known
It's: ssh: Could not resolve hostname It's: Name or service not known
now.: ssh: Could not resolve hostname now.: Name or service not known
that: ssh: Could not resolve hostname that: Name or service not known
which: ssh: Could not resolve hostname which: Name or service not known
guard.: ssh: Could not resolve hostname guard.: Name or service not known
link: ssh: Could not resolve hostname link: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
or: ssh: Could not resolve hostname or: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
recommended: ssh: Could not resolve hostname recommended: Name or service not known
'-z: ssh: Could not resolve hostname '-z: Name or service not known
it: ssh: Could not resolve hostname it: Name or service not known
will: ssh: Could not resolve hostname will: Name or service not known
',: ssh: Could not resolve hostname ',: Name or service not known
'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known
noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known
to: ssh: connect to host to port 22: Connection refused
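For context on the output above (this is an editorial aside, not part of the original logs): the nonsense hostnames like "Java", "HotSpot(TM)" and "VM" appear because the JVM's stack-guard warning is printed on standard output, and the start scripts word-split that warning into a host list and try to ssh to every word. A commonly suggested workaround is to point the JVM at the native libraries so the warning is not emitted, e.g. by appending the following to etc/hadoop/hadoop-env.sh (assuming HADOOP_HOME is set to the install directory, here /home/hduser/hadoop):

```shell
# Commonly suggested workaround appended to etc/hadoop/hadoop-env.sh;
# HADOOP_HOME is assumed to point at the Hadoop install directory.
export HADOOP_COMMON_LIB_NATIVE_DIR="$HADOOP_HOME/lib/native"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
```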



Output for ./start-dfs.sh:




hduser@sandesh-Inspiron-1564:~/hadoop$ ./sbin/start-dfs.sh
14/03/24 20:57:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hduser/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', or link it with '-z noexecstack'.
localhost]
sed: -e expression #1, char 6: unknown option to `s'
-c: Unknown cipher type 'cd'
'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
now.: ssh: Could not resolve hostname now.: Name or service not known
or: ssh: Could not resolve hostname or: Name or service not known
might: ssh: Could not resolve hostname might: Name or service not known
that: ssh: Could not resolve hostname that: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
recommended: ssh: Could not resolve hostname recommended: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
it: ssh: Could not resolve hostname it: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
',: ssh: Could not resolve hostname ',: Name or service not known
link: ssh: Could not resolve hostname link: Name or service not known
you: ssh: Could not resolve hostname you: Name or service not known
disabled: ssh: Could not resolve hostname disabled: Name or service not known
The: ssh: Could not resolve hostname The: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
guard: ssh: Could not resolve hostname guard: Name or service not known
guard.: ssh: Could not resolve hostname guard.: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
will: ssh: Could not resolve hostname will: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
localhost: starting namenode, logging to /home/hduser/hadoop/logs/hadoop-hduser-namenode-sandesh-Inspiron-1564.out
to: ssh: connect to host to port 22: Connection refused
loaded: ssh: Could not resolve hostname loaded: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
which: ssh: Could not resolve hostname which: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
'-z: ssh: Could not resolve hostname '-z: Name or service not known
try: ssh: Could not resolve hostname try: Name or service not known
highly: ssh: Could not resolve hostname highly: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
Java: ssh: Could not resolve hostname Java: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
Server: ssh: Could not resolve hostname Server: Name or service not known
It's: ssh: Could not resolve hostname It's: Name or service not known
noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known
localhost: starting datanode, logging to /home/hduser/hadoop/logs/hadoop-hduser-datanode-sandesh-Inspiron-1564.out



This is the output for ./start-yarn.sh:




hduser@sandesh-Inspiron-1564:~/hadoop$ ./sbin/start-yarn.sh
starting yarn daemons
resourcemanager running as process 16118. Stop it first.
localhost: nodemanager running as process 16238. Stop it first.
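The "running as process NNNN. Stop it first." lines mean daemons from the earlier start attempts are still alive. A sketch of the usual cleanup on the cluster machine (these commands are an assumption, not from the original post) would be:

```shell
# Stop any daemons left over from earlier start attempts,
# then verify with jps that only "Jps" itself remains listed.
cd ~/hadoop
./sbin/stop-yarn.sh
./sbin/stop-dfs.sh
jps
```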
  • The first line reads: "This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh". Did you try that?
    – terdon
    Mar 24 '14 at 15:22
  • It's giving somewhat the same error; see the updated post.
    – pandaren
    Mar 24 '14 at 15:29
  • You are showing something completely different. Please read the error messages. Both errors are telling you in the first few lines to use a different program than the one you are using. Try that, and if it still gives you an error, update your question to show us.
    – terdon
    Mar 24 '14 at 15:32
  • Now me too, I have the same problem; if you have a solution, please post it.
    – A J
    May 3 '14 at 6:57
  • The problem was with the SSH keys. Use this to generate the key again: ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa and then cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
    – pandaren
    May 7 '14 at 13:28
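The fix described in the comment above, spelled out as commands (assuming the hduser account and a default ~/.ssh layout):

```shell
# Recreate a passwordless SSH key and authorize it for localhost logins,
# which the Hadoop start scripts rely on (run as the hduser account).
mkdir -p "$HOME/.ssh"
ssh-keygen -t rsa -P '' -f "$HOME/.ssh/id_rsa" -q
cat "$HOME/.ssh/id_rsa.pub" >> "$HOME/.ssh/authorized_keys"
chmod 600 "$HOME/.ssh/authorized_keys"
```

After this, `ssh localhost` should log in without a password prompt, and the start scripts can reach the local daemons.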
STARTUP_MSG: java = 1.7.0_51
************************************************************/
14/03/24 20:29:54 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hduser/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', or link it with '-z noexecstack'.
14/03/24 20:29:54 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Formatting using clusterid: CID-f3d89333-8217-48ce-9281-44b0caed76f9
14/03/24 20:29:55 INFO namenode.HostFileManager: read includes:
HostSet(
)
14/03/24 20:29:55 INFO namenode.HostFileManager: read excludes:
HostSet(
)
14/03/24 20:29:55 INFO blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
14/03/24 20:29:55 INFO util.GSet: Computing capacity for map BlocksMap
14/03/24 20:29:55 INFO util.GSet: VM type = 64-bit
14/03/24 20:29:55 INFO util.GSet: 2.0% max memory = 889 MB
14/03/24 20:29:55 INFO util.GSet: capacity = 2^21 = 2097152 entries
14/03/24 20:29:55 INFO blockmanagement.BlockManager: dfs.block.access.token.enable=false
14/03/24 20:29:55 INFO blockmanagement.BlockManager: defaultReplication = 1
14/03/24 20:29:55 INFO blockmanagement.BlockManager: maxReplication = 512
14/03/24 20:29:55 INFO blockmanagement.BlockManager: minReplication = 1
14/03/24 20:29:55 INFO blockmanagement.BlockManager: maxReplicationStreams = 2
14/03/24 20:29:55 INFO blockmanagement.BlockManager: shouldCheckForEnoughRacks = false
14/03/24 20:29:55 INFO blockmanagement.BlockManager: replicationRecheckInterval = 3000
14/03/24 20:29:55 INFO blockmanagement.BlockManager: encryptDataTransfer = false
14/03/24 20:29:55 INFO namenode.FSNamesystem: fsOwner = root (auth:SIMPLE)
14/03/24 20:29:55 INFO namenode.FSNamesystem: supergroup = supergroup
14/03/24 20:29:55 INFO namenode.FSNamesystem: isPermissionEnabled = true
14/03/24 20:29:55 INFO namenode.FSNamesystem: HA Enabled: false
14/03/24 20:29:55 INFO namenode.FSNamesystem: Append Enabled: true
14/03/24 20:29:55 INFO util.GSet: Computing capacity for map INodeMap
14/03/24 20:29:55 INFO util.GSet: VM type = 64-bit
14/03/24 20:29:55 INFO util.GSet: 1.0% max memory = 889 MB
14/03/24 20:29:55 INFO util.GSet: capacity = 2^20 = 1048576 entries
14/03/24 20:29:55 INFO namenode.NameNode: Caching file names occuring more than 10 times
14/03/24 20:29:55 INFO namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
14/03/24 20:29:55 INFO namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
14/03/24 20:29:55 INFO namenode.FSNamesystem: dfs.namenode.safemode.extension = 30000
14/03/24 20:29:55 INFO namenode.FSNamesystem: Retry cache on namenode is enabled
14/03/24 20:29:55 INFO namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
14/03/24 20:29:55 INFO util.GSet: Computing capacity for map Namenode Retry Cache
14/03/24 20:29:55 INFO util.GSet: VM type = 64-bit
14/03/24 20:29:55 INFO util.GSet: 0.029999999329447746% max memory = 889 MB
14/03/24 20:29:55 INFO util.GSet: capacity = 2^15 = 32768 entries
Re-format filesystem in Storage Directory /app/hadoop/tmp/dfs/name ? (Y or N) y
14/03/24 20:30:01 INFO common.Storage: Storage directory /app/hadoop/tmp/dfs/name has been successfully formatted.
14/03/24 20:30:01 INFO namenode.FSImage: Saving image file /app/hadoop/tmp/dfs/name/current/fsimage.ckpt_0000000000000000000 using no compression
14/03/24 20:30:01 INFO namenode.FSImage: Image file /app/hadoop/tmp/dfs/name/current/fsimage.ckpt_0000000000000000000 of size 196 bytes saved in 0 seconds.
14/03/24 20:30:01 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with txid >= 0
14/03/24 20:30:01 INFO util.ExitUtil: Exiting with status 0
14/03/24 20:30:01 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at sandesh-Inspiron-1564/127.0.1.1
************************************************************/



Now this is the error I get when executing the command ./start-all.sh:




hduser@sandesh-Inspiron-1564:~/hadoop$ ./sbin/start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
14/03/24 20:34:57 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hduser/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', or link it with '-z noexecstack'.
localhost]
sed: -e expression #1, char 6: unknown option to `s'
Java: ssh: Could not resolve hostname Java: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
loaded: ssh: Could not resolve hostname loaded: Name or service not known
-c: Unknown cipher type 'cd'
The: ssh: Could not resolve hostname The: Name or service not known
might: ssh: Could not resolve hostname might: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
guard: ssh: Could not resolve hostname guard: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
disabled: ssh: Could not resolve hostname disabled: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
Server: ssh: Could not resolve hostname Server: Name or service not known
highly: ssh: Could not resolve hostname highly: Name or service not known
that: ssh: Could not resolve hostname that: Name or service not known
which: ssh: Could not resolve hostname which: Name or service not known
will: ssh: Could not resolve hostname will: Name or service not known
now.: ssh: Could not resolve hostname now.: Name or service not known
'-z: ssh: Could not resolve hostname '-z: Name or service not known
link: ssh: Could not resolve hostname link: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
guard.: ssh: Could not resolve hostname guard.: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
it: ssh: Could not resolve hostname it: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
try: ssh: Could not resolve hostname try: Name or service not known
',: ssh: Could not resolve hostname ',: Name or service not known
localhost: starting namenode, logging to /home/hduser/hadoop/logs/hadoop-hduser-namenode-sandesh-Inspiron-1564.out
to: ssh: connect to host to port 22: Connection refused
warning:: ssh: Could not resolve hostname warning:: Name or service not known
you: ssh: Could not resolve hostname you: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
It's: ssh: Could not resolve hostname It's: Name or service not known
or: ssh: Could not resolve hostname or: Name or service not known
recommended: ssh: Could not resolve hostname recommended: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known
'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known
localhost: starting datanode, logging to /home/hduser/hadoop/logs/hadoop-hduser-datanode-sandesh-Inspiron-1564.out
Starting secondary namenodes [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hduser/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', or link it with '-z noexecstack'.
0.0.0.0]
sed: -e expression #1, char 6: unknown option to `s'
-c: Unknown cipher type 'cd'
Java: ssh: Could not resolve hostname Java: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
Server: ssh: Could not resolve hostname Server: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ECDSA key fingerprint is 89:fb:3d:98:2c:6d:03:c1:a3:de:96:3b:39:bc:ca:b3.
Are you sure you want to continue connecting (yes/no)? loaded: ssh: Could not resolve hostname loaded: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
guard: ssh: Could not resolve hostname guard: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
The: ssh: Could not resolve hostname The: Name or service not known
disabled: ssh: Could not resolve hostname disabled: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
try: ssh: Could not resolve hostname try: Name or service not known
you: ssh: Could not resolve hostname you: Name or service not known
might: ssh: Could not resolve hostname might: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
highly: ssh: Could not resolve hostname highly: Name or service not known
It's: ssh: Could not resolve hostname It's: Name or service not known
now.: ssh: Could not resolve hostname now.: Name or service not known
that: ssh: Could not resolve hostname that: Name or service not known
which: ssh: Could not resolve hostname which: Name or service not known
guard.: ssh: Could not resolve hostname guard.: Name or service not known
link: ssh: Could not resolve hostname link: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
or: ssh: Could not resolve hostname or: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
recommended: ssh: Could not resolve hostname recommended: Name or service not known
'-z: ssh: Could not resolve hostname '-z: Name or service not known
it: ssh: Could not resolve hostname it: Name or service not known
will: ssh: Could not resolve hostname will: Name or service not known
',: ssh: Could not resolve hostname ',: Name or service not known
'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known
noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known
to: ssh: connect to host to port 22: Connection refused



Output for ./start-dfs.sh:




hduser@sandesh-Inspiron-1564:~/hadoop$ ./sbin/start-dfs.sh
14/03/24 20:57:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hduser/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', or link it with '-z noexecstack'.
localhost]
sed: -e expression #1, char 6: unknown option to `s'
-c: Unknown cipher type 'cd'
'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
now.: ssh: Could not resolve hostname now.: Name or service not known
or: ssh: Could not resolve hostname or: Name or service not known
might: ssh: Could not resolve hostname might: Name or service not known
that: ssh: Could not resolve hostname that: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
recommended: ssh: Could not resolve hostname recommended: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
it: ssh: Could not resolve hostname it: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
',: ssh: Could not resolve hostname ',: Name or service not known
link: ssh: Could not resolve hostname link: Name or service not known
you: ssh: Could not resolve hostname you: Name or service not known
disabled: ssh: Could not resolve hostname disabled: Name or service not known
The: ssh: Could not resolve hostname The: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
guard: ssh: Could not resolve hostname guard: Name or service not known
guard.: ssh: Could not resolve hostname guard.: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
will: ssh: Could not resolve hostname will: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
localhost: starting namenode, logging to /home/hduser/hadoop/logs/hadoop-hduser-namenode-sandesh-Inspiron-1564.out
to: ssh: connect to host to port 22: Connection refused
loaded: ssh: Could not resolve hostname loaded: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
which: ssh: Could not resolve hostname which: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
'-z: ssh: Could not resolve hostname '-z: Name or service not known
try: ssh: Could not resolve hostname try: Name or service not known
highly: ssh: Could not resolve hostname highly: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
Java: ssh: Could not resolve hostname Java: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
Server: ssh: Could not resolve hostname Server: Name or service not known
It's: ssh: Could not resolve hostname It's: Name or service not known
noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known
localhost: starting datanode, logging to /home/hduser/hadoop/logs/hadoop-hduser-datanode-sandesh-Inspiron-1564.out



And this is the output for ./start-yarn.sh:




hduser@sandesh-Inspiron-1564:~/hadoop$ ./sbin/start-yarn.sh
starting yarn daemons
resourcemanager running as process 16118. Stop it first.
localhost: nodemanager running as process 16238. Stop it first.
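For context while reading the errors above: the long runs of "ssh: Could not resolve hostname …" lines are the words of the JVM stack-guard warning being mis-parsed as hostnames by the start scripts. A workaround often suggested for Hadoop 2.2.0 on 64-bit systems (I have not confirmed it myself) is to point the JVM at the native library directory in hadoop-env.sh; the install path below is taken from the logs above, and the variable names are the standard Hadoop environment variables:

```shell
# Sketch of the commonly suggested workaround: append these lines to
# $HADOOP_HOME/etc/hadoop/hadoop-env.sh so the JVM loads the native
# library cleanly and stops emitting the stack-guard warning that the
# start scripts split into bogus hostnames.
export HADOOP_HOME=/home/hduser/hadoop   # install path as shown in the logs above
export HADOOP_COMMON_LIB_NATIVE_DIR="$HADOOP_HOME/lib/native"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
```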










share|improve this question















GUYS I'm using this tutorial How to install Hadoop?
I mean the one made by Luis Alvarado in one of the comments ..
So I'm on Ubuntu 13.10 64bit
Hadoop version is 2.2.0



Actually I'm a total newbie on Hadoop .. Its new for me and We guys are trying to work on some Big Data related project
I count you guys.. Help me!
I know tutorial is based on earlier versions of Hadoop but I managed to make it through the 11th step!
And the output of the step is




root@sandesh-Inspiron-1564:/home/hduser/hadoop# sudo ./bin/hadoop namenode -format
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

14/03/24 20:29:54 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = sandesh-Inspiron-1564/127.0.1.1
STARTUP_MSG: args = [-format]
STARTUP_MSG: version = 2.2.0
STARTUP_MSG: classpath = /home/hduser/hadoop/etc/hadoop:/home/hduser/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/home/hduser/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/home/hduser/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-io-2.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/home/hduser/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/home/hduser/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/home/hduser/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/home/hduser/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/home/hduser/hadoop/share/hadoop/common/lib/zookeeper-3.4.5.jar:/home/hduser/hadoop/share/hadoop/common/lib/hadoop-auth-2.2.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-logging-1.1.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/home/hduser/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/junit-4.8.2.jar:/home/hduser/hadoop/share/hadoop/common/lib/hadoop-annotations-2.2.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/home/hduser/hadoop/share/hadoop/common/lib/jets3t-0.6.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/stax-api-1.0.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/home/hduser/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/home/hduser/hadoop/share/hadoop/common/lib/jsch-0.1.42.
jar:/home/hduser/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/common/lib/jackson-core-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-lang-2.5.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/asm-3.2.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-math-2.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/xz-1.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.8.8.jar:/home/hduser/hadoop/share/hadoop/common/lib/activation-1.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/home/hduser/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/home/hduser/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/home/hduser/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/jackson-xc-1.8.8.jar:/home/hduser/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/home/hduser/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/home/hduser/hadoop/share/hadoop/common/hadoop-nfs-2.2.0.jar:/home/hduser/hadoop/share/hadoop/common/hadoop-common-2.2.0.jar:/home/hduser/hadoop/share/hadoop/common/hadoop-common-2.2.0-tests.jar:/home/hduser/hadoop/share/hadoop/hdfs:/home/hduser/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-io-2
.1.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.1.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-lang-2.5.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/home/hduser/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.2.0-tests.jar:/home/hduser/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.2.0.jar:/home/hduser/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/commons-io-2.1.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/paranamer-2.3.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/hadoop-annotations-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/j
unit-4.10.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/hamcrest-core-1.1.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/avro-1.7.4.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/snappy-java-1.0.4.1.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-site-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.2.0.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/commons-io-2.1.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/
home/hduser/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/junit-4.10.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.8.8.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.1.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0-tests.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.2.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common -r 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
STARTUP_MSG: java = 1.7.0_51
************************************************************/
14/03/24 20:29:54 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hduser/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', or link it with '-z noexecstack'.
14/03/24 20:29:54 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Formatting using clusterid: CID-f3d89333-8217-48ce-9281-44b0caed76f9
14/03/24 20:29:55 INFO namenode.HostFileManager: read includes:
HostSet(
)
14/03/24 20:29:55 INFO namenode.HostFileManager: read excludes:
HostSet(
)
14/03/24 20:29:55 INFO blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
14/03/24 20:29:55 INFO util.GSet: Computing capacity for map BlocksMap
14/03/24 20:29:55 INFO util.GSet: VM type = 64-bit
14/03/24 20:29:55 INFO util.GSet: 2.0% max memory = 889 MB
14/03/24 20:29:55 INFO util.GSet: capacity = 2^21 = 2097152 entries
14/03/24 20:29:55 INFO blockmanagement.BlockManager: dfs.block.access.token.enable=false
14/03/24 20:29:55 INFO blockmanagement.BlockManager: defaultReplication = 1
14/03/24 20:29:55 INFO blockmanagement.BlockManager: maxReplication = 512
14/03/24 20:29:55 INFO blockmanagement.BlockManager: minReplication = 1
14/03/24 20:29:55 INFO blockmanagement.BlockManager: maxReplicationStreams = 2
14/03/24 20:29:55 INFO blockmanagement.BlockManager: shouldCheckForEnoughRacks = false
14/03/24 20:29:55 INFO blockmanagement.BlockManager: replicationRecheckInterval = 3000
14/03/24 20:29:55 INFO blockmanagement.BlockManager: encryptDataTransfer = false
14/03/24 20:29:55 INFO namenode.FSNamesystem: fsOwner = root (auth:SIMPLE)
14/03/24 20:29:55 INFO namenode.FSNamesystem: supergroup = supergroup
14/03/24 20:29:55 INFO namenode.FSNamesystem: isPermissionEnabled = true
14/03/24 20:29:55 INFO namenode.FSNamesystem: HA Enabled: false
14/03/24 20:29:55 INFO namenode.FSNamesystem: Append Enabled: true
14/03/24 20:29:55 INFO util.GSet: Computing capacity for map INodeMap
14/03/24 20:29:55 INFO util.GSet: VM type = 64-bit
14/03/24 20:29:55 INFO util.GSet: 1.0% max memory = 889 MB
14/03/24 20:29:55 INFO util.GSet: capacity = 2^20 = 1048576 entries
14/03/24 20:29:55 INFO namenode.NameNode: Caching file names occuring more than 10 times
14/03/24 20:29:55 INFO namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
14/03/24 20:29:55 INFO namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
14/03/24 20:29:55 INFO namenode.FSNamesystem: dfs.namenode.safemode.extension = 30000
14/03/24 20:29:55 INFO namenode.FSNamesystem: Retry cache on namenode is enabled
14/03/24 20:29:55 INFO namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
14/03/24 20:29:55 INFO util.GSet: Computing capacity for map Namenode Retry Cache
14/03/24 20:29:55 INFO util.GSet: VM type = 64-bit
14/03/24 20:29:55 INFO util.GSet: 0.029999999329447746% max memory = 889 MB
14/03/24 20:29:55 INFO util.GSet: capacity = 2^15 = 32768 entries
Re-format filesystem in Storage Directory /app/hadoop/tmp/dfs/name ? (Y or N) y
14/03/24 20:30:01 INFO common.Storage: Storage directory /app/hadoop/tmp/dfs/name has been successfully formatted.
14/03/24 20:30:01 INFO namenode.FSImage: Saving image file /app/hadoop/tmp/dfs/name/current/fsimage.ckpt_0000000000000000000 using no compression
14/03/24 20:30:01 INFO namenode.FSImage: Image file /app/hadoop/tmp/dfs/name/current/fsimage.ckpt_0000000000000000000 of size 196 bytes saved in 0 seconds.
14/03/24 20:30:01 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with txid >= 0
14/03/24 20:30:01 INFO util.ExitUtil: Exiting with status 0
14/03/24 20:30:01 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at sandesh-Inspiron-1564/127.0.1.1
************************************************************/



Now this is the error I get when executing the command
./start-all.sh




hduser@sandesh-Inspiron-1564:~/hadoop$ ./sbin/start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
14/03/24 20:34:57 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hduser/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', or link it with '-z noexecstack'.
localhost]
sed: -e expression #1, char 6: unknown option to `s'
Java: ssh: Could not resolve hostname Java: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
loaded: ssh: Could not resolve hostname loaded: Name or service not known
-c: Unknown cipher type 'cd'
The: ssh: Could not resolve hostname The: Name or service not known
might: ssh: Could not resolve hostname might: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
guard: ssh: Could not resolve hostname guard: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
disabled: ssh: Could not resolve hostname disabled: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
Server: ssh: Could not resolve hostname Server: Name or service not known
highly: ssh: Could not resolve hostname highly: Name or service not known
that: ssh: Could not resolve hostname that: Name or service not known
which: ssh: Could not resolve hostname which: Name or service not known
will: ssh: Could not resolve hostname will: Name or service not known
now.: ssh: Could not resolve hostname now.: Name or service not known
'-z: ssh: Could not resolve hostname '-z: Name or service not known
link: ssh: Could not resolve hostname link: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
guard.: ssh: Could not resolve hostname guard.: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
it: ssh: Could not resolve hostname it: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
try: ssh: Could not resolve hostname try: Name or service not known
',: ssh: Could not resolve hostname ',: Name or service not known
localhost: starting namenode, logging to /home/hduser/hadoop/logs/hadoop-hduser-namenode-sandesh-Inspiron-1564.out
to: ssh: connect to host to port 22: Connection refused
warning:: ssh: Could not resolve hostname warning:: Name or service not known
you: ssh: Could not resolve hostname you: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
It's: ssh: Could not resolve hostname It's: Name or service not known
or: ssh: Could not resolve hostname or: Name or service not known
recommended: ssh: Could not resolve hostname recommended: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known
'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known
localhost: starting datanode, logging to /home/hduser/hadoop/logs/hadoop-hduser-datanode-sandesh-Inspiron-1564.out
Starting secondary namenodes [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hduser/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', or link it with '-z noexecstack'.
0.0.0.0]
sed: -e expression #1, char 6: unknown option to `s'
-c: Unknown cipher type 'cd'
Java: ssh: Could not resolve hostname Java: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
Server: ssh: Could not resolve hostname Server: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ECDSA key fingerprint is 89:fb:3d:98:2c:6d:03:c1:a3:de:96:3b:39:bc:ca:b3.
Are you sure you want to continue connecting (yes/no)? loaded: ssh: Could not resolve hostname loaded: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
guard: ssh: Could not resolve hostname guard: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
The: ssh: Could not resolve hostname The: Name or service not known
disabled: ssh: Could not resolve hostname disabled: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
try: ssh: Could not resolve hostname try: Name or service not known
you: ssh: Could not resolve hostname you: Name or service not known
might: ssh: Could not resolve hostname might: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
highly: ssh: Could not resolve hostname highly: Name or service not known
It's: ssh: Could not resolve hostname It's: Name or service not known
now.: ssh: Could not resolve hostname now.: Name or service not known
that: ssh: Could not resolve hostname that: Name or service not known
which: ssh: Could not resolve hostname which: Name or service not known
guard.: ssh: Could not resolve hostname guard.: Name or service not known
link: ssh: Could not resolve hostname link: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
or: ssh: Could not resolve hostname or: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
recommended: ssh: Could not resolve hostname recommended: Name or service not known
'-z: ssh: Could not resolve hostname '-z: Name or service not known
it: ssh: Could not resolve hostname it: Name or service not known
will: ssh: Could not resolve hostname will: Name or service not known
',: ssh: Could not resolve hostname ',: Name or service not known
'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known
noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known
to: ssh: connect to host to port 22: Connection refused



Output for ./start-dfs.sh




hduser@sandesh-Inspiron-1564:~/hadoop$ ./sbin/start-dfs.sh
14/03/24 20:57:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hduser/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', or link it with '-z noexecstack'.
localhost]
sed: -e expression #1, char 6: unknown option to `s'
-c: Unknown cipher type 'cd'
'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
now.: ssh: Could not resolve hostname now.: Name or service not known
or: ssh: Could not resolve hostname or: Name or service not known
might: ssh: Could not resolve hostname might: Name or service not known
that: ssh: Could not resolve hostname that: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
recommended: ssh: Could not resolve hostname recommended: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
it: ssh: Could not resolve hostname it: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
',: ssh: Could not resolve hostname ',: Name or service not known
link: ssh: Could not resolve hostname link: Name or service not known
you: ssh: Could not resolve hostname you: Name or service not known
disabled: ssh: Could not resolve hostname disabled: Name or service not known
The: ssh: Could not resolve hostname The: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
guard: ssh: Could not resolve hostname guard: Name or service not known
guard.: ssh: Could not resolve hostname guard.: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
will: ssh: Could not resolve hostname will: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
localhost: starting namenode, logging to /home/hduser/hadoop/logs/hadoop-hduser-namenode-sandesh-Inspiron-1564.out
to: ssh: connect to host to port 22: Connection refused
loaded: ssh: Could not resolve hostname loaded: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
which: ssh: Could not resolve hostname which: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
'-z: ssh: Could not resolve hostname '-z: Name or service not known
try: ssh: Could not resolve hostname try: Name or service not known
highly: ssh: Could not resolve hostname highly: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
Java: ssh: Could not resolve hostname Java: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
Server: ssh: Could not resolve hostname Server: Name or service not known
It's: ssh: Could not resolve hostname It's: Name or service not known
noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known
localhost: starting datanode, logging to /home/hduser/hadoop/logs/hadoop-hduser-datanode-sandesh-Inspiron-1564.out



This output is for ./start-yarn.sh




hduser@sandesh-Inspiron-1564:~/hadoop$ ./sbin/start-yarn.sh
starting yarn daemons
resourcemanager running as process 16118. Stop it first.
localhost: nodemanager running as process 16238. Stop it first.
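The "Stop it first" messages mean YARN daemons from an earlier start attempt are still running. A sketch of clearing them before retrying (the `~/hadoop` path matches the layout in the question; this is an assumption about your install location, adjust as needed):

```shell
# Stop any daemons left over from earlier start attempts, then confirm
# with jps that no Hadoop Java processes (NameNode, DataNode,
# ResourceManager, NodeManager) remain before starting again.
~/hadoop/sbin/stop-yarn.sh
~/hadoop/sbin/stop-dfs.sh
jps
```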







networking server 13.10 ssh hadoop






edited Apr 13 '17 at 12:24









Community

1




1










asked Mar 24 '14 at 15:16









pandaren

  • The first line reads: "This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh". Did you try that?
    – terdon
    Mar 24 '14 at 15:22










  • It's giving somewhat the same error; see the updated post
    – pandaren
    Mar 24 '14 at 15:29










  • You are showing something completely different. Please read the error messages. Both errors are telling you in the first few lines to use a different program than what you are using. Try using that and if it still gives you error update your question to show us.
    – terdon
    Mar 24 '14 at 15:32










  • Me too, I have the same problem; if you got a solution please post it.
    – A J
    May 3 '14 at 6:57










  • The problem was with the SSH keys. Use this to generate the key again: ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa and then cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
    – pandaren
    May 7 '14 at 13:28
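The fix from the comment above can be sketched as a short shell sequence. The `~/.ssh` paths are the OpenSSH defaults; after running it, `ssh localhost` should log in without a password prompt:

```shell
# Regenerate the hduser key pair with an empty passphrase and authorize
# it for passwordless logins to localhost (OpenSSH default paths).
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys

# Sanity check: this should run without asking for a password.
ssh localhost 'echo ok'
```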




















2 Answers
Similar to this StackOverflow question.
I had the same problem and resolved it by adding the following line to the /etc/hosts file:

192.168.56.101 localhost hadoop

where you should replace the IP address with your own, and replace hadoop with your own hostname.
– ssoto, answered Apr 3 '14 at 9:47
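This can be sketched as a one-liner; the IP 192.168.56.101 and the name hadoop are the answer's placeholders, so substitute your machine's own address and hostname:

```shell
# Append the mapping to /etc/hosts (requires root); replace the IP and
# the hostname "hadoop" with your machine's own values.
echo "192.168.56.101 localhost hadoop" | sudo tee -a /etc/hosts

# Verify the name now resolves locally.
getent hosts hadoop
```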






    You have to test SSH by connecting to localhost from the hadoop user:

    hadoopz@ubuntu:/$ ssh localhost
    Last login: Tue Apr 1 12:02:45 2014 from localhost

    My hadoop username is "hadoopz".

    To check the step-by-step process, follow this link.
    – A J, answered Apr 9 '14 at 7:40





