SparkContext can convert a local collection into an RDD via its parallelize method.

import org.apache.spark.{SparkConf, SparkContext}

object ParallelizeDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
    conf.set("spark.master", "local")
    conf.set("spark.app.name", "spark demo")
    val sc = new SparkContext(conf)

    // Distribute the local collection across the cluster as an RDD.
    val list = List(1, 2, 3, 4, 5, 6)
    val input = sc.parallelize(list)
    val sum = input.sum()
    println(sum)  // 21.0

    sc.stop()
  }
}
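parallelize also takes an optional numSlices argument that controls how many partitions the resulting RDD is split into; when omitted, Spark picks a default based on the cluster. A minimal sketch (object and app names are hypothetical) showing the partition count:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ParallelizePartitions {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
    conf.set("spark.master", "local")
    conf.set("spark.app.name", "parallelize partitions")
    val sc = new SparkContext(conf)

    // Explicitly request 3 partitions for the RDD.
    val rdd = sc.parallelize(1 to 10, numSlices = 3)
    println(rdd.getNumPartitions)  // 3
    println(rdd.sum())             // 55.0

    sc.stop()
  }
}
```

Choosing numSlices matters because Spark runs one task per partition, so it bounds the parallelism of downstream actions such as sum().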
