Detailed steps: connecting Spark in IDEA to an external Hive
Posted: 2022-08-25 18:30:00
Prerequisites:
- HDFS on the virtual machine's Hadoop cluster is running
- Hive is installed on the virtual machine, with its metastore configured to use MySQL
- The Hadoop environment is set up on Windows, and Spark SQL jobs already run locally inside IDEA
- The virtual machine's firewall is turned off
Many posts online make these steps more complicated than necessary; here is a condensed version:
1. Add the dependencies to pom.xml (the MySQL driver, the Hive dependency, and the spark-hive dependency):
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>5.1.27</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.12</artifactId>
        <version>3.0.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-exec</artifactId>
        <version>1.2.1</version>
    </dependency>
2. Copy hive-site.xml from the hive/conf directory on the virtual machine into the project's resources directory, adjusting the connection URL, username, and password to match your own MySQL setup:
    <configuration>
        <property>
            <name>hive.metastore.schema.verification</name>
            <value>false</value>
        </property>
        <property>
            <name>javax.jdo.option.ConnectionURL</name>
            <value>jdbc:mysql://master:3306/hive?createDatabaseIfNotExist=true</value>
            <description>JDBC connect string for a JDBC metastore</description>
        </property>
        <property>
            <name>javax.jdo.option.ConnectionDriverName</name>
            <value>com.mysql.jdbc.Driver</value>
            <description>Driver class name for a JDBC metastore</description>
        </property>
        <property>
            <name>javax.jdo.option.ConnectionUserName</name>
            <value>root</value>
            <description>username to use against metastore database</description>
        </property>
        <property>
            <name>javax.jdo.option.ConnectionPassword</name>
            <value>123456</value>
            <description>password to use against metastore database</description>
        </property>
    </configuration>
3. Check whether hive-site.xml was automatically copied into the project's target/classes directory. If it was not, place it there manually; otherwise Spark cannot see the metastore configuration and will only run against a local metastore. A quick way to verify is sketched below.
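A minimal sketch of my own (not from the original post; ClasspathCheck is a hypothetical name) to confirm the file is visible on the runtime classpath before creating any SparkSession:

    object ClasspathCheck {
      def main(args: Array[String]): Unit = {
        // Returns null when hive-site.xml is missing from target/classes;
        // in that case Spark silently falls back to a local embedded metastore.
        val url = getClass.getClassLoader.getResource("hive-site.xml")
        println(s"hive-site.xml on classpath: $url")
      }
    }

If this prints null, rebuild the project or copy the file into target/classes by hand.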
4. Enable Hive support: when creating the SparkSession, add enableHiveSupport():
    import org.apache.spark.sql.SparkSession

    // Create the SparkSession with Hive support enabled
    val spark: SparkSession = SparkSession
      .builder()
      .enableHiveSupport()
      .master("local[*]")
      .appName("sql")
      .getOrCreate()
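Once the session is up, a short smoke test confirms the external metastore is actually being used (a minimal sketch; default.your_table is a hypothetical name, replace it with a table that exists in your Hive):

    import org.apache.spark.sql.SparkSession

    object HiveSmokeTest {
      def main(args: Array[String]): Unit = {
        val spark: SparkSession = SparkSession
          .builder()
          .enableHiveSupport()
          .master("local[*]")
          .appName("sql")
          .getOrCreate()

        // If the connection works, this lists the databases defined in the
        // Hive metastore on the VM, not just a lone "default" from a local one.
        spark.sql("show databases").show()

        // Hypothetical table name, replace with a table that exists in your Hive.
        spark.sql("select * from default.your_table limit 10").show()

        spark.stop()
      }
    }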