Exchange reports a spark-warehouse error, please help

Question template:

  • Nebula version: 2.6.1
  • Deployment: single machine
  • Installation: TAR
  • Production deployment: Y / N
  • Hardware
    • Disk (SSD recommended)
    • CPU and memory details
  • Problem description
  • Relevant meta / storage / graph logs (text form preferred, so they are searchable)
    I previously tried importing with Importer and hit a problem there; rather than wait for an answer, I stayed up overnight learning Exchange. I don't know Spark, so I am just copying the examples.
    nebula 2.6.1, exchange 2.6.1, openjdk-11, spark-3.2-bin-hadoop3.2
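
For context, Exchange is launched through spark-submit; the command used here was presumably something like the sketch below (the jar path and config file name are assumptions; the entry class comes from the stack trace that follows):

    spark-submit --master local \
        --class com.vesoft.nebula.exchange.Exchange \
        nebula-exchange-2.6.1.jar \
        -c application.conf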
21/11/24 07:04:22 INFO Exchange$: Processing Tag player
21/11/24 07:04:22 INFO Exchange$: field keys: name
21/11/24 07:04:22 INFO Exchange$: nebula keys: name
21/11/24 07:04:22 INFO Exchange$: Loading from mysql config: MySql source host: 192.168.88.52, port: 4000, database: tyc_2021, table: v_graph_vertex_address, user: tyc2021, password: Wang1980, sentence: select _id,name from tyc_2021.v_graph_vertex_address order by _id where name !='';
21/11/24 07:04:22 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir.
21/11/24 07:04:22 INFO SharedState: Warehouse path is 'file:/u03/nebula/2.6.1/spark-warehouse'.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/sources/v2/StreamWriteSupport
	at java.base/java.lang.ClassLoader.defineClass1(Native Method)
	at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1017)
	at java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:174)
	at java.base/java.net.URLClassLoader.defineClass(URLClassLoader.java:550)
	at java.base/java.net.URLClassLoader$1.run(URLClassLoader.java:458)
	at java.base/java.net.URLClassLoader$1.run(URLClassLoader.java:452)
	at java.base/java.security.AccessController.doPrivileged(Native Method)
	at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:451)
	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:589)
	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
	at java.base/java.lang.Class.forName0(Native Method)
	at java.base/java.lang.Class.forName(Class.java:398)
	at java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.nextProviderClass(ServiceLoader.java:1210)
	at java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNextService(ServiceLoader.java:1221)
	at java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNext(ServiceLoader.java:1265)
	at java.base/java.util.ServiceLoader$2.hasNext(ServiceLoader.java:1300)
	at java.base/java.util.ServiceLoader$3.hasNext(ServiceLoader.java:1385)
	at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:45)
	at scala.collection.Iterator.foreach(Iterator.scala:943)
	at scala.collection.Iterator.foreach$(Iterator.scala:943)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
	at scala.collection.TraversableLike.filterImpl(TraversableLike.scala:303)
	at scala.collection.TraversableLike.filterImpl$(TraversableLike.scala:297)
	at scala.collection.AbstractTraversable.filterImpl(Traversable.scala:108)
	at scala.collection.TraversableLike.filter(TraversableLike.scala:395)
	at scala.collection.TraversableLike.filter$(TraversableLike.scala:395)
	at scala.collection.AbstractTraversable.filter(Traversable.scala:108)
	at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:652)
	at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSourceV2(DataSource.scala:720)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:210)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:174)
	at com.vesoft.nebula.exchange.reader.MySQLReader.read(ServerBaseReader.scala:92)
	at com.vesoft.nebula.exchange.Exchange$.com$vesoft$nebula$exchange$Exchange$$createDataSource(Exchange.scala:275)
	at com.vesoft.nebula.exchange.Exchange$$anonfun$main$2.apply(Exchange.scala:134)
	at com.vesoft.nebula.exchange.Exchange$$anonfun$main$2.apply(Exchange.scala:126)
	at scala.collection.immutable.List.foreach(List.scala:431)
	at com.vesoft.nebula.exchange.Exchange$.main(Exchange.scala:126)
	at com.vesoft.nebula.exchange.Exchange.main(Exchange.scala)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.sources.v2.StreamWriteSupport
	at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:471)
	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:589)
	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
	... 53 more
21/11/24 07:04:23 INFO SparkContext: Invoking stop() from shutdown hook
21/11/24 07:04:23 INFO SparkUI: Stopped Spark web UI at http://192.168.88.52:4040
21/11/24 07:04:23 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
21/11/24 07:04:23 INFO MemoryStore: MemoryStore cleared
21/11/24 07:04:23 INFO BlockManager: BlockManager stopped
21/11/24 07:04:23 INFO BlockManagerMaster: BlockManagerMaster stopped
21/11/24 07:04:23 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
21/11/24 07:04:23 INFO SparkContext: Successfully stopped SparkContext
21/11/24 07:04:23 INFO ShutdownHookManager: Shutdown hook called

It looks like a spark-warehouse path problem. Could someone please help me solve it?

I am importing with the MySQL data source.
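
For reference, the MySQL source section of the Exchange application.conf presumably looks something like the sketch below, reconstructed from the log above (the vertex id field and the batch/partition values are assumptions; the password is masked). Note also that the sentence in the log puts ORDER BY before WHERE, which is invalid MySQL; the clause order is corrected here:

    tags: [
      {
        name: player
        type: {
          source: mysql
          sink: client
        }
        host: "192.168.88.52"
        port: 4000
        database: "tyc_2021"
        table: "v_graph_vertex_address"
        user: "tyc2021"
        password: "***"
        sentence: "select _id, name from tyc_2021.v_graph_vertex_address where name != '' order by _id"
        fields: [name]
        nebula.fields: [name]
        vertex: {
          field: _id
        }
        batch: 256
        partition: 32
      }
    ]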

That's a version mismatch. Check the note in the docs: https://docs.nebula-graph.com.cn/2.6.1/nebula-exchange/about-exchange/ex-ug-limitations/#_4
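
To spell out the mismatch: org.apache.spark.sql.sources.v2.StreamWriteSupport belongs to the Spark 2.x DataSource V2 API and was removed in Spark 3.x, so Exchange 2.6.1, which is built against Spark 2.4, throws NoClassDefFoundError under spark-3.2. Re-running under a Spark 2.4.x distribution should work; a rough sketch (jar and config names as assumed earlier; note too that Spark 2.4 targets Java 8, so openjdk-11 may also need swapping):

    wget https://archive.apache.org/dist/spark/spark-2.4.8/spark-2.4.8-bin-hadoop2.7.tgz
    tar -xzf spark-2.4.8-bin-hadoop2.7.tgz
    ./spark-2.4.8-bin-hadoop2.7/bin/spark-submit --master local \
        --class com.vesoft.nebula.exchange.Exchange \
        nebula-exchange-2.6.1.jar \
        -c application.conf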

Thanks, that was exactly the problem.

If you feel Nicole's reply solved your problem, you can mark her reply as the [Solution].