1. First, install the JDK (Hadoop 2.7.x requires Java 7 or later):
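A minimal sketch, assuming a yum-based system and OpenJDK 8 — the package name is an assumption and differs on Debian/Ubuntu (`openjdk-8-jdk`):

```shell
# Assumption: CentOS/RHEL-style system with yum.
sudo yum install -y java-1.8.0-openjdk-devel

# Verify the installation and locate the Java binary;
# JAVA_HOME (needed later in hadoop-env.sh) is this path minus the trailing /bin/java.
java -version
readlink -f "$(which java)"
```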
2. Create a hadoop user and set up passwordless SSH login:
useradd hadoop
su - hadoop   # run the key-generation commands below as the hadoop user
ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys   # sshd rejects group/world-writable key files
ssh localhost   # test that passwordless login works
3. Download and install Hadoop 2.7.3:
wget http://apache.claz.org/hadoop/common/hadoop-2.7.3/hadoop-2.7.3.tar.gz
tar zxf hadoop-2.7.3.tar.gz
The configuration files live in etc/hadoop/ under the extracted directory. The key required settings are excerpted below.
core-site.xml
<configuration>
<property>
<name>hadoop.tmp.dir</name>
<value>/home/hadoop/tmp</value>
</property>
<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>
hdfs-site.xml
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
</configuration>
mapred-site.xml
<configuration>
<property>
<name>mapred.job.tracker</name>
<value>localhost:9001</value>
</property>
</configuration>
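Note that the 2.7.3 tarball does not ship a mapred-site.xml; it has to be created from the bundled template before editing. One way to do that (run from the Hadoop install directory):

```shell
# Create mapred-site.xml from the template shipped with Hadoop 2.7.3,
# then add the property block above to it.
cp etc/hadoop/mapred-site.xml.template etc/hadoop/mapred-site.xml
```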
hadoop-env.sh
export JAVA_HOME=***   # set this to your actual JAVA_HOME path
slaves
For a single node, the default single line, localhost, is fine.
4. Start Hadoop
Format the HDFS filesystem:
bin/hdfs namenode -format   # bin/hadoop namenode -format still works but is deprecated in 2.x
Start / stop Hadoop:
sbin/start-all.sh, sbin/stop-all.sh
If startup succeeded, browse to http://ip:50070 for the HDFS web UI.
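To confirm the daemons actually came up, a quick check from the shell (run from the Hadoop install directory, as the hadoop user):

```shell
# List running Java processes; expect to see NameNode, DataNode and
# SecondaryNameNode (plus ResourceManager/NodeManager if YARN was started).
jps

# Smoke-test HDFS: create a home directory and list the filesystem root.
bin/hdfs dfs -mkdir -p /user/hadoop
bin/hdfs dfs -ls /
```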