Spark can't find the MySQL driver when running Exchange

  • NebulaGraph version: v3.2.0
  • Nebula Exchange version: nebula-exchange_spark_2.4-3.3.0.jar
  • Spark version: 2.4.3
  • Deployment: single machine
  • Installation method: Docker
  • Production environment: N
  • Problem description: when running a MySQL import with spark-submit, com.mysql.cj.jdbc.Driver cannot be found

Command executed:

spark-submit  --master spark://master:7077 --class com.vesoft.nebula.exchange.Exchange   /jars/nebula-exchange_spark_2.4-3.3.0.jar  -c /conf/baoxian_product.conf

Output log:

Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, 172.19.0.2, executor 0): java.lang.ClassNotFoundException: com.mysql.cj.jdbc.Driver
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.spark.sql.execution.datasources.jdbc.DriverRegistry$.register(DriverRegistry.scala:45)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:55)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:54)
	at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD.compute(JDBCRDD.scala:272)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
	at org.a...

You need to put the MySQL driver JAR under $SPARK_HOME/jars.
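For reference, placing the driver is just a file copy into Spark's classpath directory. A minimal sketch, assuming $SPARK_HOME is set and a Connector/J JAR (the filename/version below is a placeholder, pick the one matching your MySQL server) has already been downloaded:

```shell
# Copy the MySQL Connector/J JAR into Spark's jars directory.
# JARs in $SPARK_HOME/jars are on the classpath of driver and
# executor JVMs launched on this machine.
cp mysql-connector-java-8.0.28.jar "$SPARK_HOME/jars/"
```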

Hi, I have already put the MySQL driver in that directory, but the job still can't find it. It only succeeds when I add `--jars <mysql-driver-jar> --driver-class-path <mysql-driver-jar>` to the command line. Are these two options mandatory?
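For the record, the working invocation described above would look roughly like this (a sketch; the connector JAR path/filename is an assumption):

```shell
# --jars ships the listed JAR to the executors' classpath;
# --driver-class-path adds it to the driver JVM's classpath.
spark-submit \
  --master spark://master:7077 \
  --class com.vesoft.nebula.exchange.Exchange \
  --jars /jars/mysql-connector-java-8.0.28.jar \
  --driver-class-path /jars/mysql-connector-java-8.0.28.jar \
  /jars/nebula-exchange_spark_2.4-3.3.0.jar \
  -c /conf/baoxian_product.conf
```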

They are not mandatory, but then you need to put the MySQL driver into the jars directory on every node of the Spark cluster.
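Copying the driver to every node can be scripted, for example (a sketch; the hostnames, the Spark install path, and the JAR filename are placeholders, not values from this thread):

```shell
# Distribute the MySQL driver JAR to $SPARK_HOME/jars on every Spark node,
# so executors find it without --jars / --driver-class-path.
for host in master worker1 worker2; do
  scp mysql-connector-java-8.0.28.jar "$host:/opt/spark/jars/"
done
```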

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.