Start HDFS

start-dfs.sh
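After `start-dfs.sh` returns, it is worth confirming the HDFS daemons actually came up before moving on. A minimal sketch, assuming `jps` from the JDK is on the PATH (the daemon names are the standard HDFS ones; `check_daemons` is a hypothetical helper, not part of Hadoop):

```shell
# Sketch: check jps output for the expected HDFS daemons.
check_daemons() {
  # $1: output of jps; prints one status line per expected daemon
  local procs="$1"
  local d
  for d in NameNode DataNode SecondaryNameNode; do
    if printf '%s\n' "$procs" | grep -q "$d"; then
      echo "$d running"
    else
      echo "$d MISSING"
    fi
  done
}
check_daemons "$(jps 2>/dev/null)"
```

If any daemon is reported MISSING, check the NameNode/DataNode logs before starting Spark.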
 

Start Spark

sbin/start-all.sh
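The master URL `spark://master:7077` used later comes from the standalone Master's host and port settings. A minimal `conf/spark-env.sh` sketch, assuming the hostname `master` from this post's setup (config fragment only):

```shell
# conf/spark-env.sh (config sketch; "master" is the hostname used in this post)
SPARK_MASTER_HOST=master   # hostname the standalone Master binds to
SPARK_MASTER_PORT=7077     # default standalone Master port
```

With these set, `sbin/start-all.sh` brings up the Master on `spark://master:7077` plus the Workers listed in `conf/slaves`.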
 

Add hive-site.xml, core-site.xml, and hdfs-site.xml

Copy hive-site.xml, core-site.xml, and hdfs-site.xml into /opt/module/spark-2.1.1-bin-hadoop2.7/conf
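The copy step above can be sketched as a small helper with the source directories as parameters. Only the Spark conf path appears in this post; the Hive and Hadoop conf locations below are assumptions:

```shell
# Sketch: copy the three config files into Spark's conf dir.
# Source dirs are parameters; the Hive/Hadoop locations are assumptions.
copy_confs() {
  local hive_conf="$1" hadoop_conf="$2" spark_conf="$3"
  cp "$hive_conf/hive-site.xml" "$spark_conf/"
  cp "$hadoop_conf/core-site.xml" "$hadoop_conf/hdfs-site.xml" "$spark_conf/"
}
# Example with the paths from this post (Hive/Hadoop conf dirs assumed):
# copy_confs /opt/module/hive/conf /opt/module/hadoop/etc/hadoop \
#            /opt/module/spark-2.1.1-bin-hadoop2.7/conf
```

hive-site.xml tells Spark where the Hive metastore is; core-site.xml and hdfs-site.xml let it resolve HDFS paths.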
 

Start spark-sql

bin/spark-sql --master spark://master:7077 --driver-class-path /opt/software/mysql-connector-java-5.1.43-bin.jar
Connect spark-sql to Hive:
spark-sql> show databases;
spark-sql> use default;
spark-sql> show tables;
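The same sanity checks can be scripted instead of typed at the prompt: the spark-sql CLI accepts `-f` to run a SQL file. A sketch reusing the exact launch command from above (the guard makes it a no-op outside the Spark install dir):

```shell
# Sketch: run the sanity-check queries non-interactively via -f.
cat > /tmp/check.sql <<'EOF'
show databases;
use default;
show tables;
EOF
# Same launch command as above; guarded so the sketch does nothing
# when bin/spark-sql is not present in the current directory.
if [ -x bin/spark-sql ]; then
  bin/spark-sql --master spark://master:7077 \
    --driver-class-path /opt/software/mysql-connector-java-5.1.43-bin.jar \
    -f /tmp/check.sql
fi
```

If the Hive config was picked up correctly, `show tables` lists the tables from Hive's `default` database rather than an empty in-memory catalog.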
 
 
