Attempting to build the standalone Scala app from http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala from source.

This line:

val wordCounts = textFile.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey(_+_)

fails to compile with:

value reduceByKey is not a member of org.apache.spark.rdd.RDD[(String, Int)]


Resolution:

Import the implicit conversions from SparkContext:

import org.apache.spark.SparkContext._

These use the 'pimp my library' pattern to add methods to RDDs of specific element types (here, key-value pairs). If curious, see SparkContext:1296.
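With that import in place, a minimal version of the quick-start word count compiles and runs. This is a sketch assuming Spark 1.x (where the import is required; from Spark 1.3 onward the implicits live in the RDD companion object and the import is no longer needed); the input path "README.md" is a placeholder.

```scala
import org.apache.spark.{SparkConf, SparkContext}
// Brings rddToPairRDDFunctions into scope, which implicitly wraps an
// RDD[(K, V)] in PairRDDFunctions and thereby adds reduceByKey.
import org.apache.spark.SparkContext._

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCount").setMaster("local[*]")
    val sc = new SparkContext(conf)

    val textFile = sc.textFile("README.md")  // placeholder input path
    val wordCounts = textFile
      .flatMap(line => line.split(" "))
      .map(word => (word, 1))
      .reduceByKey(_ + _)  // compiles now, thanks to the implicit conversion

    wordCounts.take(5).foreach(println)
    sc.stop()
  }
}
```

The error is a compile-time symptom, not a runtime one: reduceByKey is defined on PairRDDFunctions, not on RDD itself, so without the implicit conversion in scope the compiler has no way to resolve the call.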
