1. Prepare three machines
192.168.193.129 hadoop01
192.168.193.130 hadoop02
192.168.193.131 hadoop03
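For hostname resolution, these IP/hostname pairs typically go into /etc/hosts on each of the three nodes (a sketch based on the addresses in step 1; adjust the IPs to your own network):

```
# /etc/hosts entries on every node (IPs from step 1)
192.168.193.129 hadoop01
192.168.193.130 hadoop02
192.168.193.131 hadoop03
```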
2. Download the tarball from: https://archive.apache.org/dist/spark/spark-2.4.4/
3. Extract it into the /opt/module directory
4. Enter the configuration directory: cd spark-2.4.4-bin-hadoop2.7/conf
Copy the template files:
cp spark-env.sh.template spark-env.sh
cp slaves.template slaves
vi spark-env.sh
Add the following:
export JAVA_HOME=/opt/module/jdk1.8.0_221
export HADOOP_HOME=/opt/module/hadoop-2.7.4
export SPARK_MASTER_IP=192.168.193.129
export SPARK_WORKER_MEMORY=1g
export HADOOP_CONF_DIR=/opt/module/hadoop-2.7.4/etc/hadoop
export SPARK_DIST_CLASSPATH=$(/opt/module/hadoop-2.7.4/bin/hadoop classpath)

vi slaves (list the worker hostnames, one per line)
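The screenshot showing the slaves file did not survive; its contents are simply the hostnames on which a Spark Worker should start, one per line. Assuming hadoop02 and hadoop03 act as workers (include hadoop01 as well if the master node should also run a worker), the file would look like:

```
# A Spark Worker starts on each host listed below;
# add hadoop01 too if the master should also run a worker
hadoop02
hadoop03
```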
5. Copy the installation to the other two nodes with scp -r:
scp -r spark-2.4.4-bin-hadoop2.7/ hadoop02:/opt/module
scp -r spark-2.4.4-bin-hadoop2.7/ hadoop03:/opt/module
6. Start Spark from the sbin directory. Before doing so, start the Hadoop platform.
If your Hadoop platform is integrated with ZooKeeper, start ZooKeeper first:
./bin/zkServer.sh start (check its state with ./bin/zkServer.sh status)
Then start Hadoop:
start-dfs.sh
start-yarn.sh
Finally, on the master node, start Spark (from the sbin directory):
./start-all.sh
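To verify that the cluster came up, you can check the running daemons with jps and submit the SparkPi example bundled with the distribution. This is a sketch: the master URL assumes the SPARK_MASTER_IP from step 4 and Spark's default standalone port 7077, and the example jar name matches the 2.4.4/Hadoop 2.7 build.

```shell
# On the master, jps should list a Master process;
# on hadoop02/hadoop03, a Worker process
jps

# Smoke test: run the bundled SparkPi example on the standalone cluster
# (run from the spark-2.4.4-bin-hadoop2.7 directory)
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://192.168.193.129:7077 \
  examples/jars/spark-examples_2.11-2.4.4.jar 100
```

The master's web UI (by default at http://192.168.193.129:8080) should also show both workers registered as ALIVE.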
Downloads for the packages used above (Hadoop, JDK, etc.):
Link: https://pan.baidu.com/s/13YLV8i4GxXCzu1PNnsMbcA
Extraction code: c0l4
