Spark has several run modes, including local, standalone, YARN, and Mesos.

1. Local mode

--master local : run locally with a single worker thread

--master local[N] : run locally with N worker threads

--master local[*] : run locally with as many worker threads as the machine has logical cores

For example:

$ pyspark --master local[*]
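The three local-mode master strings can be sketched in plain Python. This is an illustrative helper, not part of Spark's API; it only mimics how the thread count is derived from the master string:

```python
import os

def local_mode_threads(master):
    """Illustrate how a local-mode master string maps to a thread count.

    "local"    -> 1 worker thread
    "local[N]" -> N worker threads
    "local[*]" -> one worker thread per logical core on this machine
    """
    if master == "local":
        return 1
    if master == "local[*]":
        return os.cpu_count()
    if master.startswith("local[") and master.endswith("]"):
        return int(master[len("local["):-1])
    raise ValueError("not a local-mode master string: " + master)

print(local_mode_threads("local"))     # 1
print(local_mode_threads("local[4]"))  # 4
```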


Web UI:

On the same machine: http://localhost:4040

From another machine: http://192.168.0.105:4040


2. Standalone mode

http://spark.apache.org/docs/latest/spark-standalone.html


(1) Fully distributed mode: how do you configure it when the master and the workers run on different machines?
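A minimal configuration sketch for the fully distributed case, assuming the master host is 192.168.0.105 and the worker hostnames below are examples (see the standalone docs linked above for the full list of options):

```shell
# conf/spark-env.sh on every node (the IP is an example):
export SPARK_MASTER_HOST=192.168.0.105

# conf/slaves (renamed conf/workers in Spark 3.1+) lists one worker host
# per line, e.g.:
#   192.168.0.106
#   192.168.0.107

# Then, on the master node, start the master and all listed workers:
#   $ start-all.sh
```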


(2) Pseudo-distributed mode: how do you configure it when the master and the worker run on the same machine?

$ start-master.sh 

starting org.apache.spark.deploy.master.Master, logging to /usr/local/share/spark/logs/spark-liuwei-org.apache.spark.deploy.master.Master-1-liuwei-pc.out


Then view the Web UI on the local machine: http://localhost:8080

From another machine: http://192.168.0.105:8080

In the Web UI you can see:

[screenshot: Spark run mode configuration]

Spark master : spark://liuwei-pc:7077

With the master up, start a worker and point it at the master (in Spark 3.1+ the script is named start-worker.sh):

$ start-slave.sh spark://liuwei-pc:7077

starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/share/spark/logs/spark-liuwei-org.apache.spark.deploy.worker.Worker-1-liuwei-pc.out
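With the master and worker both running, a shell can be attached to the cluster by passing the master URL shown in the Web UI. This assumes Spark is on the PATH; the hostname liuwei-pc comes from the log output above:

```shell
# Connect pyspark to the standalone master instead of running in local mode
pyspark --master spark://liuwei-pc:7077
```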


Now list the Java processes, and both daemons appear:

$ jps

8949 Master

9703 Jps

9629 Worker


(jps - Lists the instrumented Java Virtual Machines (JVMs) on the target system.)


Refresh the Web UI, and the newly added Worker appears:

[screenshot: Spark run mode configuration, now showing the Worker]


