skp-blogs

Development environment:

        Windows 10 + IntelliJ IDEA + JDK 1.8 + Scala 2.12.4

Steps:

  1. Write the Scala test class
    import org.apache.spark.{SparkConf, SparkContext}

    object MyTest {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
        conf.setAppName("MyTest")
        conf.setMaster("local")
        val sc = new SparkContext(conf)
        // count the log lines that contain a NullPointerException
        val input = sc.textFile("file:///F:/sparktest/catalina.out")
        val count = input.filter(_.contains("java.lang.NullPointerException")).count()
        println("NullPointerException count: " + count)
        sc.stop()
      }
    }
    
  2. Set the project output path
  3. Configure the jar artifact
  4. Write the Java driver class (it depends on the Spark jars; the simplest option is to add all the related Spark jars to the lib dependencies)
    import org.apache.spark.deploy.SparkSubmit;

    public class SubmitScalaJobToSpark {
        public static void main(String[] args) {
            String[] arg0 = new String[]{
                    "--master", "spark://node101:7077",
                    "--deploy-mode", "client",
                    "--name", "test java submit job to spark",
                    "--class", "MyTest",       // class that contains the Spark job's main method
                    "--executor-memory", "1G", // executor memory
                    "E:\\其他代码仓库\\spark\\out\\artifacts\\unnamed\\unnamed.jar", // path to the jar built in step 3
            };

            SparkSubmit.main(arg0);
        }
    }
    
  5. Run the test
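The Scala job in step 1 does nothing more than count log lines containing a marker string. The same filter-and-count logic can be sketched in plain Java (no Spark required) to make the expected result easy to verify; the sample log lines below are made up for illustration.

```java
import java.util.Arrays;
import java.util.List;

public class FilterCountSketch {
    // Count lines containing a marker string, mirroring the RDD filter/count in step 1.
    static long countMatches(List<String> lines, String marker) {
        return lines.stream().filter(l -> l.contains(marker)).count();
    }

    public static void main(String[] args) {
        List<String> log = Arrays.asList(
                "INFO server started",
                "ERROR java.lang.NullPointerException at Foo.bar",
                "WARN slow request",
                "ERROR java.lang.NullPointerException at Baz.qux");
        System.out.println("NullPointerException count: "
                + countMatches(log, "java.lang.NullPointerException"));
        // prints "NullPointerException count: 2"
    }
}
```

Spark distributes exactly this computation across partitions of the input file; running the local version on a sample of the log is a cheap way to sanity-check the count the cluster job should report.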
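The array passed to SparkSubmit.main in step 4 is simply a spark-submit command line: option/value pairs followed by the application jar as the last positional argument. A hypothetical helper like the one below makes that ordering explicit; the option names are real spark-submit flags, but the master URL and jar path are placeholders taken from this article's setup.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class SparkSubmitArgs {
    // Flatten ordered option/value pairs into a spark-submit argument array;
    // the application jar must come last, after all the options.
    static String[] build(Map<String, String> options, String appJar) {
        List<String> args = new ArrayList<>();
        for (Map.Entry<String, String> e : options.entrySet()) {
            args.add(e.getKey());
            args.add(e.getValue());
        }
        args.add(appJar);
        return args.toArray(new String[0]);
    }

    public static void main(String[] args) {
        Map<String, String> opts = new LinkedHashMap<>(); // preserves insertion order
        opts.put("--master", "spark://node101:7077");
        opts.put("--deploy-mode", "client");
        opts.put("--class", "MyTest");
        opts.put("--executor-memory", "1G");
        String[] built = build(opts, "out/artifacts/unnamed/unnamed.jar");
        System.out.println(String.join(" ", built));
    }
}
```

Note that calling SparkSubmit.main runs the submission in-process; Spark also ships org.apache.spark.launcher.SparkLauncher, which builds the same command line and launches it in a child process, and is the officially supported programmatic submission API.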