Fixing "Exception in thread "main" java.lang.UnsupportedClassVersionError" when installing Spark

After installing Spark (spark-2.0.0-bin-hadoop2.6), running pyspark in an Ubuntu terminal failed with: Exception in thread "main" java.lang.UnsupportedClassVersionError. A search on Baidu turned up the explanation given in many blog posts: the problem is that Spark was compiled with/for Java 8, but it was being run on Java 7 or older. So I downloaded and installed Java 8 (note: it must be installed on every node), ran pyspark again in the Ubuntu terminal, and it started successfully.

For how to download and install Java 8 on Ubuntu 14, see zhuxp1's post: https://blog.csdn.net/zhuxiaoping54532/article/details/70158200
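After installing Java 8 on each node, Spark must actually pick it up. A minimal sketch of the environment settings involved, assuming Oracle Java 8 installed to /usr/lib/jvm/java-8-oracle (that installer's default path; yours may differ):

```shell
# Point the shell (and therefore Spark's launch scripts) at the Java 8 install.
# /usr/lib/jvm/java-8-oracle is an assumed path -- adjust for your nodes.
export JAVA_HOME=/usr/lib/jvm/java-8-oracle
export PATH="$JAVA_HOME/bin:$PATH"

# Confirm which JAVA_HOME is now in effect before retrying pyspark:
echo "JAVA_HOME=$JAVA_HOME"
```

Putting these two export lines into ~/.bashrc (or into conf/spark-env.sh on every node) makes the setting persistent, so pyspark does not fall back to an older system Java after a reboot.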
