Due to Oracle licensing restrictions, the Oracle JDBC driver is not published to the Maven central repository. To use it in a Maven project, you must install the jar into your local repository manually.

2. Manual installation

The command is as follows (Maven must already be installed and configured in your environment; run it from Windows cmd/PowerShell):

mvn install:install-file -Dfile=F:\ruanjian\data-integration_v6.1\lib\ojdbc6.jar -DgroupId=com.oracle -DartifactId=ojdbc6 -Dversion=11.2.0.3 -Dpackaging=jar
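If the command succeeds, Maven places the jar under your local repository at a path derived from the coordinates. A quick way to confirm, assuming the default repository location (`~/.m2/repository`; on Windows it is `%USERPROFILE%\.m2\repository`):

```shell
# List the installed artifact (default local repository location assumed)
ls ~/.m2/repository/com/oracle/ojdbc6/11.2.0.3/
# you should see ojdbc6-11.2.0.3.jar plus a generated .pom
```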

Then reference it in your pom file:

<dependency>
    <groupId>com.oracle</groupId>
    <artifactId>ojdbc6</artifactId>
    <version>11.2.0.3</version>
</dependency>

Save the file and you are done.



==== Test code ====

package com.spark.sql

import org.apache.spark.sql.SparkSession

/**
  * Created by 92421 on 2018/4/5.
  */
object ReadOracle {
  def main(args: Array[String]): Unit = {

    val spark = SparkSession
      .builder()
      .appName("ReadOracle")
      .master("local[2]")
      .getOrCreate()

    val oracleDriverUrl = "jdbc:oracle:thin:@//192.168.137.251:1521/devdb"

    // Connection options for the EMP table
    val jdbcMap1 = Map(
      "url" -> oracleDriverUrl,
      "user" -> "scott",
      "password" -> "tiger",
      "dbtable" -> "emp",
      "driver" -> "oracle.jdbc.driver.OracleDriver")

    // Connection options for the DEPT table
    val jdbcMap2 = Map(
      "url" -> oracleDriverUrl,
      "user" -> "scott",
      "password" -> "tiger",
      "dbtable" -> "dept",
      "driver" -> "oracle.jdbc.driver.OracleDriver")

    // createOrReplaceTempView returns Unit, so there is nothing useful to assign;
    // register each table as a temporary view for Spark SQL
    spark.read.options(jdbcMap1).format("jdbc").load().createOrReplaceTempView("emp")
    spark.read.options(jdbcMap2).format("jdbc").load().createOrReplaceTempView("dept")

    spark.sql("select * from dept d join emp e on e.deptno = d.deptno where e.sal > 2000").show()

    spark.stop()
  }
}
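The same join can also be expressed with the DataFrame API instead of temporary views. A sketch under the same connection settings (it reuses the `jdbcMap1`/`jdbcMap2` options above and, like the test code, needs a live Oracle instance to run):

```scala
// Load each table as a DataFrame directly, without registering views
val emp  = spark.read.format("jdbc").options(jdbcMap1).load()
val dept = spark.read.format("jdbc").options(jdbcMap2).load()

// Join and filter with column expressions instead of a SQL string
dept.join(emp, emp("deptno") === dept("deptno"))
    .filter(emp("sal") > 2000)
    .show()
```

Both forms compile to the same logical plan; the SQL string is convenient for ad-hoc queries, while the DataFrame API gives compile-time checking of method names.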



Reading Oracle 11g tables with Spark 2.x in IDEA