1. Extract the Spark tarball

[root@itcast01 local]# tar -zxvf spark-2.3.1-bin-hadoop2.7.tgz

2. Move the directory to /itcast
[root@itcast01 local]# mv spark-2.3.1-bin-hadoop2.7 /itcast
3. Configure environment variables
[root@itcast01 itcast]# vi /etc/profile
export SPARK_HOME=/itcast/spark-2.3.1-bin-hadoop2.7
Apply the change:
[root@itcast01 local]# source /etc/profile
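Since spark-shell is later invoked by name alone, it helps to also put Spark's bin/ and sbin/ directories on the PATH; a sketch of the full /etc/profile addition (same install path as above):

```shell
# Additions to /etc/profile: export SPARK_HOME and put Spark's
# bin/ (spark-shell, spark-submit) and sbin/ (start-all.sh) on PATH
export SPARK_HOME=/itcast/spark-2.3.1-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin
```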

4. Configure spark-env.sh

[root@itcast01 conf]# pwd
/itcast/spark-2.3.1-bin-hadoop2.7/conf

[root@itcast01 conf]# ls
fairscheduler.xml.template  log4j.properties.template  metrics.properties.template
slaves  spark-defaults.conf.template  spark-env.sh.template

spark-env.sh does not exist yet (only the .template ships with Spark); editing it with vi creates it. Alternatively, copy spark-env.sh.template to spark-env.sh first.

[root@itcast01 conf]# vi spark-env.sh

export SCALA_HOME=/itcast/scala-2.12.6
export JAVA_HOME=/usr/java/jdk1.8.0_171
export SPARK_MASTER_IP=192.168.117.100
export SPARK_WORKER_MEMORY=512m
export HADOOP_CONF_DIR=/itcast/hadoop-2.7.2/etc/hadoop

Note the last path: in Hadoop 2.x the configuration files live under etc/hadoop, not conf/.
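A side note: since Spark 2.0 the preferred name for the master bind address is SPARK_MASTER_HOST; SPARK_MASTER_IP still works in 2.3.1 but is deprecated. The equivalent line would be:

```shell
# Spark 2.x name for the master bind address (same value as SPARK_MASTER_IP above)
export SPARK_MASTER_HOST=192.168.117.100
```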

5. Edit the slaves file

[root@itcast01 conf]# vi slaves

It already contains:
localhost
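With only localhost in slaves, a single Worker runs on this same machine. For a real cluster the file lists one worker hostname per line; a sketch (itcast02 and itcast03 are hypothetical hostnames):

```shell
# Write an example slaves file (to slaves.example here, to avoid
# clobbering the real conf/slaves); one Worker hostname per line
cat > slaves.example <<'EOF'
itcast02
itcast03
EOF
cat slaves.example
```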

6. Start Hadoop first; for the steps see https://blog.csdn.net/anaitudou/article/details/80392141

7. Start Spark

[root@itcast01 spark-2.3.1-bin-hadoop2.7]# pwd
/itcast/spark-2.3.1-bin-hadoop2.7

[root@itcast01 sbin]# ./start-all.sh

(Spark's start-all.sh lives in sbin/; invoking it as ./start-all.sh from that directory avoids accidentally running Hadoop's script of the same name.)

Verify that the Master and Worker processes are up.
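One way to verify is with jps from the JDK, which lists running JVM processes; after start-all.sh a Master and a Worker should appear alongside the Hadoop daemons (PIDs illustrative):

```shell
# jps lists JVM processes; Master and Worker are Spark's standalone daemons
jps
# Expected to include, besides the Hadoop daemons, lines like:
#   2345 Master
#   2468 Worker
```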

8. Launch spark-shell

[root@itcast01 sbin]# spark-shell
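Once the shell is up, a quick smoke test confirms Spark can actually run a job; a minimal sketch (the computation is illustrative, and sc, the SparkContext, is created automatically by spark-shell):

```shell
# Feed a one-liner to spark-shell non-interactively to verify the install
spark-shell <<'EOF'
val sum = sc.parallelize(1 to 100).reduce(_ + _)
println(s"sum = $sum")   // 1+2+...+100 = 5050
EOF
```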


9. Open the Spark web UI in a browser on the local (physical) machine:

http://itcast01:4040/

Port 4040 is the application UI served by the running spark-shell; the standalone Master's web UI listens on port 8080.
