A question about using Nebula Exchange

Question template for reference:

  • Nebula version: 2.6.2
  • Deployment: standalone (single host)
  • Specific description of the problem
  • Relevant meta / storage / graph info logs (text form preferred, for searchability)
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.wrapRefArray([Ljava/lang/Object;)Lscala/collection/mutable/WrappedArray;
	at com.vesoft.nebula.exchange.config.Configs$$anon$1.<init>(Configs.scala:843)
	at com.vesoft.nebula.exchange.config.Configs$.parser(Configs.scala:842)
	at com.vesoft.nebula.exchange.Exchange$.main(Exchange.scala:62)
	at com.vesoft.nebula.exchange.Exchange.main(Exchange.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
22/02/14 21:35:35 INFO ShutdownHookManager: Shutdown hook called
22/02/14 21:35:35 INFO ShutdownHookManager: Deleting directory /tmp/spark-c6fa66f9-143a-4c3f-9e9d-bb70f1457cf5

The Spark version is 3.2.1 with the matching Scala version 2.13.0; these two versions correspond to each other.

The Exchange config files are all set up, and I'm using the downloaded Exchange 2.6.2 release (nebula-exchange-2.6.2.jar, 2022-01-06 02:50, 197663660 bytes). But running the terminal command /bin/spark-submit --master "local" --class com.vesoft.nebula.exchange.Exchange <nebula-exchange-2.6.1.jar_path> -c <csv_application.conf_path> throws the error shown above. What could be the cause?

https://docs.nebula-graph.com.cn/2.6.2/nebula-exchange/about-exchange/ex-ug-limitations/#_4

Regarding these limitations: Spark must be 2.4.x, Java must be 1.8, and Scala must be one of the three listed versions, is that right? :cry:

What Scala version are you using?

I've now changed the setup to Spark 2.4.1 and Scala 2.12.10, but the problem has become that Spark won't start. The error is as follows:

starting org.apache.spark.deploy.master.Master, logging to /datadisk/guan/spark-2.4.1/logs/spark-root-org.apache.spark.deploy.master.Master-1-kf2-data-162.out
failed to launch: nice -n 0 /datadisk/guan/spark-2.4.1/bin/spark-class org.apache.spark.deploy.master.Master --host kf2-data-162 --port 7077 --webui-port 8080 -h 192.168.0.162
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:650)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:632)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
… 7 more
full log in /datadisk/guan/spark-2.4.1/logs/spark-root-org.apache.spark.deploy.master.Master-1-kf2-data-162.out

This error means your Scala version doesn't match the methods in nebula-exchange-2.6.2.jar.
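One quick way to confirm this kind of mismatch is to check which Scala version the Spark distribution itself was built against (a sketch; it assumes spark-submit is on the PATH):

```shell
# Spark prints the Scala version it was compiled against in its version banner.
# Exchange 2.6.x releases are built for Scala 2.11/2.12; Spark 3.2.x bundles
# Scala 2.13, where Predef.wrapRefArray no longer returns WrappedArray,
# which is exactly the NoSuchMethodError in the stack trace above.
spark-submit --version 2>&1 | grep -i "scala version" || echo "spark-submit not found on PATH"
```

If this prints 2.13.x, the official Exchange 2.6.x jar cannot work on that Spark; either use a Spark built on Scala 2.12 (e.g. Spark 2.4.x) or rebuild Exchange against the matching Scala.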


Recompile nebula exchange with the same Scala version that your Spark runs on.

The git clone URL has no 2.6 branch, so I have no way to download version 2.6.

https://github.com/vesoft-inc/nebula-exchange/releases

Your Spark environment isn't configured correctly. Check the config files under spark_home/conf for wrong settings.


I only made the following changes:
# set the Java home
export JAVA_HOME=/usr/local/java/jdk1.8
# set the master IP
export SPARK_MASTER_IP=192.168.0.162
# set the master port
export SPARK_MASTER_PORT=7077
and created a new log4j.properties.

I lowered the Scala version on my side and Spark now starts, but a new error appeared:

Exception in thread "main" com.facebook.thrift.transport.TTransportException: java.net.NoRouteToHostException: No route to host (Host unreachable)
at com.facebook.thrift.transport.TSocket.open(TSocket.java:206)
at com.vesoft.nebula.client.meta.MetaClient.getClient(MetaClient.java:145)
at com.vesoft.nebula.client.meta.MetaClient.doConnect(MetaClient.java:124)
at com.vesoft.nebula.client.meta.MetaClient.connect(MetaClient.java:113)
at com.vesoft.nebula.exchange.MetaProvider.<init>(MetaProvider.scala:56)
at com.vesoft.nebula.exchange.processor.VerticesProcessor.process(VerticesProcessor.scala:110)
at com.vesoft.nebula.exchange.Exchange$$anonfun$main$2.apply(Exchange.scala:150)
at com.vesoft.nebula.exchange.Exchange$$anonfun$main$2.apply(Exchange.scala:126)
at scala.collection.immutable.List.foreach(List.scala:392)
at com.vesoft.nebula.exchange.Exchange$.main(Exchange.scala:126)
at com.vesoft.nebula.exchange.Exchange.main(Exchange.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.NoRouteToHostException: No route to host (Host unreachable)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:476)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:218)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:200)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:394)
at java.net.Socket.connect(Socket.java:606)
at com.facebook.thrift.transport.TSocket.open(TSocket.java:201)
… 22 more
22/02/15 11:35:12 INFO SparkContext: Invoking stop() from shutdown hook
22/02/15 11:35:12 INFO SparkUI: Stopped Spark web UI at http://kf2-data-162:4040
22/02/15 11:35:12 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
22/02/15 11:35:13 INFO MemoryStore: MemoryStore cleared
22/02/15 11:35:13 INFO BlockManager: BlockManager stopped
22/02/15 11:35:13 INFO BlockManagerMaster: BlockManagerMaster stopped
22/02/15 11:35:13 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
22/02/15 11:35:13 INFO SparkContext: Successfully stopped SparkContext
22/02/15 11:35:13 INFO ShutdownHookManager: Shutdown hook called
22/02/15 11:35:13 INFO ShutdownHookManager: Deleting directory /tmp/spark-c422e2b5-3344-404d-a8ba-457f43e0b0e9
22/02/15 11:35:13 INFO ShutdownHookManager: Deleting directory /tmp/spark-b67aa4f1-5911-4fae-8f43-7c6b0d59ddcc


Hello, we downloaded from the link you posted and tried to compile Exchange, but compiling Exchange 2.6.3 gave the following error:

[INFO] Scanning for projects…
Downloading: http://repo.maven.apache.org/maven2/org/sonatype/plugins/nexus-staging-maven-plugin/1.6.8/nexus-staging-maven-plugin-1.6.8.pom
[ERROR] The build could not read 1 project -> [Help 1]
[ERROR]
[ERROR] The project com.vesoft:nebula-exchange:2.6.3 (/datadisk/guan/nebula-exchange/nebula-exchange-2.6.3/nebula-exchange/pom.xml) has 1 error
[ERROR] Unresolveable build extension: Plugin org.sonatype.plugins:nexus-staging-maven-plugin:1.6.8 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.sonatype.plugins:nexus-staging-maven-plugin:jar:1.6.8: Could not transfer artifact org.sonatype.plugins:nexus-staging-maven-plugin:pom:1.6.8 from/to central (http://repo.maven.apache.org/maven2): Failed to transfer file: http://repo.maven.apache.org/maven2/org/sonatype/plugins/nexus-staging-maven-plugin/1.6.8/nexus-staging-maven-plugin-1.6.8.pom. Return code is: 501 , ReasonPhrase:HTTPS Required. -> [Help 2]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/ProjectBuildingException
[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException

Compiling Exchange 2.6.2 gave the following error:
[INFO] Scanning for projects…
Downloading: http://repo.maven.apache.org/maven2/org/sonatype/plugins/nexus-staging-maven-plugin/1.6.8/nexus-staging-maven-plugin-1.6.8.pom
[ERROR] The build could not read 1 project -> [Help 1]
[ERROR]
[ERROR] The project com.vesoft:nebula-exchange:2.6.2 (/datadisk/guan/nebula-exchange/nebula-exchange-2.6.2/nebula-exchange/pom.xml) has 1 error
[ERROR] Unresolveable build extension: Plugin org.sonatype.plugins:nexus-staging-maven-plugin:1.6.8 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.sonatype.plugins:nexus-staging-maven-plugin:jar:1.6.8: Could not transfer artifact org.sonatype.plugins:nexus-staging-maven-plugin:pom:1.6.8 from/to central (http://repo.maven.apache.org/maven2): Failed to transfer file: http://repo.maven.apache.org/maven2/org/sonatype/plugins/nexus-staging-maven-plugin/1.6.8/nexus-staging-maven-plugin-1.6.8.pom. Return code is: 501 , ReasonPhrase:HTTPS Required. -> [Help 2]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/ProjectBuildingException
[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
Both errors appeared at the final step of the build.

  1. Don't you configure spark-default.conf in your Spark environment?
  2. The "no route to host" error means your Spark environment can't reach the host of the Nebula meta service; add the hostname to the machine's hosts file.
  3. You didn't build with the command from the README; you need to skip gpg.
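On point 3, the 501 "HTTPS Required" failure itself has nothing to do with Exchange: old Maven releases resolve Central over plain HTTP (http://repo.maven.apache.org), which Maven Central now rejects. Upgrading Maven (3.5+) is the clean fix; as a sketch of a workaround that leaves any existing ~/.m2/settings.xml untouched, an HTTPS mirror can be passed in with -s (the file path and mirror id below are just examples):

```shell
# Write a one-off Maven settings file that forces HTTPS for Central.
cat > /tmp/central-https-settings.xml <<'EOF'
<settings>
  <mirrors>
    <mirror>
      <id>central-https</id>
      <mirrorOf>central</mirrorOf>
      <url>https://repo.maven.apache.org/maven2</url>
    </mirror>
  </mirrors>
</settings>
EOF
# Then build with it, skipping gpg signing as suggested above, e.g.:
#   mvn -s /tmp/central-https-settings.xml clean package \
#       -Dmaven.test.skip=true -Dgpg.skip -Dmaven.javadoc.skip=true
```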


I did build with the command from this README, but it failed at the step shown in the screenshot.

  1. If your Spark environment is already down to 2.4, you don't need to compile Exchange yourself; just use the official package.
  2. As for the build: I tried that command and it works. Your Maven version is probably too old to parse the pom file; try 3.5 or later.

Spark is now 2.4.1, Scala is 2.12.10, and Nebula is 2.6.2, but the table shown has no Exchange release corresponding to Nebula 2.6.2. I'm using Exchange 2.6.2, which the table maps to Nebula 2.6.0 and 2.6.1. Will that not work? Do I need to downgrade Nebula to 2.6.1?


But I also found a different version-compatibility table in the docs, and I don't know why the two differ.

I'm now making changes according to the second table and hope it works.

Does the following terminal output mean the import failed?

[root@kf2-data-162 bin]# ./spark-submit --master "local" --class com.vesoft.nebula.exchange.Exchange /datadisk/guan/nebula-exchange/nebula-exchange-2.6.1.jar -c /datadisk/qiu/csv_application.conf
22/02/15 15:04:46 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
22/02/15 15:04:46 INFO Configs$: DataBase Config com.vesoft.nebula.exchange.config.DataBaseConfigEntry@43a9a4af
22/02/15 15:04:46 INFO Configs$: User Config com.vesoft.nebula.exchange.config.UserConfigEntry@3161b833
22/02/15 15:04:46 INFO Configs$: Connection Config Some(Config(SimpleConfigObject({"retry":3,"timeout":3000})))
22/02/15 15:04:46 INFO Configs$: Execution Config com.vesoft.nebula.exchange.config.ExecutionConfigEntry@c757796a
22/02/15 15:04:46 INFO Configs$: Source Config File source path: file:///datadisk/qiu/data_new/Department.csv, separator: Some(,), header: Some(true)
22/02/15 15:04:46 INFO Configs$: Sink Config File source path: file:///datadisk/qiu/data_new/Department.csv, separator: Some(,), header: Some(true)
22/02/15 15:04:46 INFO Configs$: name department batch 256
22/02/15 15:04:46 INFO Configs$: Tag Config: Tag name: department, source: File source path: file:///datadisk/qiu/data_new/Department.csv, separator: Some(,), header: Some(true), sink: Nebula sink addresses: [192.168.0.145:9669], vertex field: vid, vertex policy: None, batch: 256, partition: 32.
22/02/15 15:04:46 INFO Configs$: Source Config File source path: file:///datadisk/qiu/data_new/Person.csv, separator: Some(,), header: Some(true)
22/02/15 15:04:46 INFO Configs$: Sink Config File source path: file:///datadisk/qiu/data_new/Person.csv, separator: Some(,), header: Some(true)
22/02/15 15:04:46 INFO Configs$: name person batch 256
22/02/15 15:04:46 INFO Configs$: Tag Config: Tag name: person, source: File source path: file:///datadisk/qiu/data_new/Person.csv, separator: Some(,), header: Some(true), sink: Nebula sink addresses: [192.168.0.145:9669], vertex field: vid, vertex policy: None, batch: 256, partition: 32.
22/02/15 15:04:46 INFO Configs$: Source Config File source path: file:///datadisk/qiu/data_new/Case.csv, separator: Some(,), header: Some(true)
22/02/15 15:04:46 INFO Configs$: Sink Config File source path: file:///datadisk/qiu/data_new/Case.csv, separator: Some(,), header: Some(true)
22/02/15 15:04:46 INFO Configs$: name case_ batch 256
22/02/15 15:04:46 INFO Configs$: Tag Config: Tag name: case_, source: File source path: file:///datadisk/qiu/data_new/Case.csv, separator: Some(,), header: Some(true), sink: Nebula sink addresses: [192.168.0.145:9669], vertex field: vid, vertex policy: None, batch: 256, partition: 32.
22/02/15 15:04:46 INFO Configs$: Source Config File source path: file:///datadisk/qiu/data_new/Law.csv, separator: Some(,), header: Some(true)
22/02/15 15:04:46 INFO Configs$: Sink Config File source path: file:///datadisk/qiu/data_new/Law.csv, separator: Some(,), header: Some(true)
22/02/15 15:04:46 INFO Configs$: name law batch 256
22/02/15 15:04:46 INFO Configs$: Tag Config: Tag name: law, source: File source path: file:///datadisk/qiu/data_new/Law.csv, separator: Some(,), header: Some(true), sink: Nebula sink addresses: [192.168.0.145:9669], vertex field: vid, vertex policy: None, batch: 256, partition: 32.
22/02/15 15:04:46 INFO Configs$: Source Config File source path: file:///datadisk/qiu/data_new/Power.csv, separator: Some(,), header: Some(true)
22/02/15 15:04:46 INFO Configs$: Sink Config File source path: file:///datadisk/qiu/data_new/Power.csv, separator: Some(,), header: Some(true)
22/02/15 15:04:46 INFO Configs$: name power batch 256
22/02/15 15:04:46 INFO Configs$: Tag Config: Tag name: power, source: File source path: file:///datadisk/qiu/data_new/Power.csv, separator: Some(,), header: Some(true), sink: Nebula sink addresses: [192.168.0.145:9669], vertex field: vid, vertex policy: None, batch: 256, partition: 32.
22/02/15 15:04:46 INFO Configs$: Source Config File source path: file:///datadisk/qiu/data_new/Shizhengfu.csv, separator: Some(,), header: Some(true)
22/02/15 15:04:46 INFO Configs$: Sink Config File source path: file:///datadisk/qiu/data_new/Shizhengfu.csv, separator: Some(,), header: Some(true)
22/02/15 15:04:46 INFO Configs$: name shizhengfu batch 256
22/02/15 15:04:46 INFO Configs$: Tag Config: Tag name: shizhengfu, source: File source path: file:///datadisk/qiu/data_new/Shizhengfu.csv, separator: Some(,), header: Some(true), sink: Nebula sink addresses: [192.168.0.145:9669], vertex field: vid, vertex policy: None, batch: 256, partition: 32.
22/02/15 15:04:46 INFO Configs$: Source Config File source path: file:///datadisk/qiu/data_new/Subject.csv, separator: Some(,), header: Some(true)
22/02/15 15:04:46 INFO Configs$: Sink Config File source path: file:///datadisk/qiu/data_new/Subject.csv, separator: Some(,), header: Some(true)
22/02/15 15:04:46 INFO Configs$: name subject batch 256
22/02/15 15:04:46 INFO Configs$: Tag Config: Tag name: subject, source: File source path: file:///datadisk/qiu/data_new/Subject.csv, separator: Some(,), header: Some(true), sink: Nebula sink addresses: [192.168.0.145:9669], vertex field: vid, vertex policy: None, batch: 256, partition: 32.
22/02/15 15:04:46 INFO Configs$: Source Config File source path: file:///datadisk/qiu/data_new/Edge_case_department.csv, separator: Some(,), header: Some(true)
22/02/15 15:04:46 INFO Configs$: Sink Config File source path: file:///datadisk/qiu/data_new/Edge_case_department.csv, separator: Some(,), header: Some(true)
22/02/15 15:04:46 INFO Configs$: Edge Config: Edge name: edge_case_department, source: File source path: file:///datadisk/qiu/data_new/Edge_case_department.csv, separator: Some(,), header: Some(true), sink: Nebula sink addresses: [192.168.0.145:9669], source field: department_vid, source policy: None, ranking: None, target field: case_vid, target policy: None, batch: 256, partition: 32.

The current problem should indeed be the second point you mentioned, no route to host, but after checking for a long time I still can't find where the issue is.
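"No route to host" is raised by the OS before any Nebula code runs, so the cause is almost always a firewall on the Nebula host or a wrong meta address in the Exchange config, not Exchange itself. A minimal reachability probe to run from the Spark machine (the host below is the address from this thread's config, with metad assumed co-located; 9559 is the default metad port, so adjust both to your deployment):

```shell
# Probe the nebula-metad port using bash's /dev/tcp (no extra tools needed).
META_HOST=192.168.0.145   # assumption: metad runs on the same host as graphd
META_PORT=9559            # default nebula-metad RPC port
if timeout 3 bash -c "exec 3<>/dev/tcp/${META_HOST}/${META_PORT}" 2>/dev/null; then
  echo "metad reachable"
else
  # Typical fixes: open the port in firewalld/iptables on the Nebula host,
  # correct the meta address in the Exchange config, or add the hostname
  # to /etc/hosts on the Spark machine.
  echo "metad unreachable"
fi
```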