Exchange component import fails in yarn-cluster mode: java.lang.ClassNotFoundException: com.vesoft.nebula.exchange.Exchange

Nebula version: v2.6.2
Deployment: distributed (3 nodes)
Installation method: RPM
Production environment: No
Hardware
Disk: HDD + SSD, 500 GB
CPU / memory: 48 cores / 100 GB
Problem description: Hi, a question about Exchange: when importing the same data into the same space multiple times, shouldn't the import overwrite the existing records? Why did my repeated imports accumulate instead? (A 30 GB dataset imported 3 times grew the data on every node in the cluster to roughly 30 * 3 = 90 GB.)

Nebula Exchange v2.6.3
Spark v2.4.0
CDH v6.3.2

I am using the Exchange component to import CSV data files from HDFS into Nebula. Submitting in local mode or yarn-client mode works fine, but submitting in yarn cluster mode fails. The spark-submit command is:
spark-submit --master yarn \
--deploy-mode cluster \
--driver-memory=2G \
--num-executors=10 \
--executor-memory=6G \
--executor-cores=6 \
--class com.vesoft.nebula.exchange.Exchange \
nebula-exchange-2.6.3.jar -c hdfs://nameservice1/nebula_data/app_sf30.conf

The error message is as follows (screenshot):

Take a look at the README; it explains how to use Exchange in yarn cluster mode.
The reason is that in yarn cluster mode YARN picks an arbitrary machine to run the driver, and that machine does not have the Exchange jar on it.
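For reference, the yarn-cluster invocation described there is roughly of this shape (a sketch rather than a verbatim copy of the README; adjust file names, paths and resource settings to your environment):

spark-submit --master yarn \
--deploy-mode cluster \
--files app_sf30.conf \
--conf spark.driver.extraClassPath=./ \
--conf spark.executor.extraClassPath=./ \
--class com.vesoft.nebula.exchange.Exchange \
nebula-exchange-2.6.3.jar \
-c app_sf30.conf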

Thanks. The README is fairly brief and says nothing specific about where the jar and the config file should live; by default they just go in a local directory on the server, right?

With the command you circled, just keep the jar and the config file on the local machine where you run spark-submit.

Using this command fails:

spark-submit --class com.vesoft.nebula.exchange.Exchange \
--master yarn-cluster \
--files app_sf30.conf \
nebula-exchange-2.6.3.jar \
-c app_sf30.conf

I also put the jar on HDFS and submitted with the command below; it still cannot find the main class:

spark-submit --master yarn \
--deploy-mode cluster \
--driver-memory=2G \
--num-executors=10 \
--executor-memory=6G \
--executor-cores=6 \
--class com.vesoft.nebula.exchange.Exchange \
hdfs://nameservice1/nebula_data/nebula-exchange-2.6.3.jar \
-c hdfs://nameservice1/nebula_data/app_sf30.conf 

Error message: java.lang.ClassNotFoundException: com.vesoft.nebula.exchange.Exchange
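One thing worth ruling out at this point (a generic sanity check, not something from the README): that the jar on HDFS actually exists, is non-empty, and contains the main class, e.g.

hdfs dfs -ls hdfs://nameservice1/nebula_data/nebula-exchange-2.6.3.jar
hdfs dfs -get hdfs://nameservice1/nebula_data/nebula-exchange-2.6.3.jar /tmp/ \
  && jar tf /tmp/nebula-exchange-2.6.3.jar | grep com/vesoft/nebula/exchange/Exchange.class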

Why aren't you using the command given in the README? You are missing the two --conf settings.

I used the same command and it still returns the error:

Switching to the other invocation style doesn't work either. Could you take another look? :cold_sweat:


Is the Exchange jar in the current directory where you run the command?

Yes, both the jar and the config file are in the current directory.

This is the current directory (screenshot):

If you run spark-submit from the current directory on machine A and add --conf spark.driver.extraClassPath, it looks for the jar in that current directory and loads the main class Exchange; it should not fail to find the class.

That's odd. If the location were wrong, shouldn't it report that the jar cannot be found? It keeps reporting that the class cannot be found: java.lang.ClassNotFoundException: com.vesoft.nebula.exchange.Exchange.
Let me post the full details!

Current directory:

[root@vm01 test]# ll
total 4580
-rw-rw-r-- 1 apprun apprun   14258 Aug  4 10:55 app_sf30_base.conf
-rw-rw-r-- 1 apprun apprun   13426 Aug 11 19:12 app_sf30.conf
drwxrwxr-x 2 apprun apprun     258 Aug 15 11:23 logs
-rw-rw-r-- 1 apprun apprun       0 Aug 15 16:08 nebula-exchange-2.6.3.jar

Submit command run from the current directory:

spark-submit --master yarn \
--deploy-mode cluster \
--files app_sf30.conf  \
--conf spark.driver.extraClassPath=./ \
--conf spark.executor.extraClassPath=./ \
--class com.vesoft.nebula.exchange.Exchange \
nebula-exchange-2.6.3.jar \
-c app_sf30.conf

Full log output and error details:

22/08/16 15:12:20 INFO client.RMProxy: Connecting to ResourceManager at vm01/172.20.3.31:8032
22/08/16 15:12:20 INFO yarn.Client: Requesting a new application from cluster with 3 NodeManagers
22/08/16 15:12:20 INFO conf.Configuration: resource-types.xml not found
22/08/16 15:12:20 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
22/08/16 15:12:20 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (99779 MB per container)
22/08/16 15:12:20 INFO yarn.Client: Will allocate AM container, with 1408 MB memory including 384 MB overhead
22/08/16 15:12:20 INFO yarn.Client: Setting up container launch context for our AM
22/08/16 15:12:20 INFO yarn.Client: Setting up the launch environment for our AM container
22/08/16 15:12:20 INFO yarn.Client: Preparing resources for our AM container
22/08/16 15:12:20 INFO yarn.Client: Uploading resource file:/home/apprun/test/nebula-exchange-2.6.3.jar -> hdfs://nameservice1/user/root/.sparkStaging/application_1660532348419_0017/nebula-exchange-2.6.3.jar
22/08/16 15:12:20 INFO yarn.Client: Uploading resource file:/home/apprun/test/app_sf30.conf -> hdfs://nameservice1/user/root/.sparkStaging/application_1660532348419_0017/app_sf30.conf
22/08/16 15:12:21 INFO yarn.Client: Uploading resource file:/tmp/spark-342a558b-54a7-4146-a1c3-668597cb36a5/__spark_conf__5673671080069562295.zip -> hdfs://nameservice1/user/root/.sparkStaging/application_1660532348419_0017/__spark_conf__.zip
22/08/16 15:12:21 INFO spark.SecurityManager: Changing view acls to: root
22/08/16 15:12:21 INFO spark.SecurityManager: Changing modify acls to: root
22/08/16 15:12:21 INFO spark.SecurityManager: Changing view acls groups to: 
22/08/16 15:12:21 INFO spark.SecurityManager: Changing modify acls groups to: 
22/08/16 15:12:21 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
22/08/16 15:12:21 INFO conf.HiveConf: Found configuration file file:/etc/hive/conf.cloudera.hive/hive-site.xml
22/08/16 15:12:21 INFO security.YARNHadoopDelegationTokenManager: Attempting to load user's ticket cache.
22/08/16 15:12:21 INFO yarn.Client: Submitting application application_1660532348419_0017 to ResourceManager
22/08/16 15:12:21 INFO impl.YarnClientImpl: Submitted application application_1660532348419_0017
22/08/16 15:12:22 INFO yarn.Client: Application report for application_1660532348419_0017 (state: ACCEPTED)
22/08/16 15:12:22 INFO yarn.Client: 
         client token: N/A
         diagnostics: AM container is launched, waiting for AM container to Register with RM
         ApplicationMaster host: N/A
         ApplicationMaster RPC port: -1
         queue: root.users.root
         start time: 1660633941626
         final status: UNDEFINED
         tracking URL: http://vm01:8088/proxy/application_1660532348419_0017/
         user: root
22/08/16 15:12:23 INFO yarn.Client: Application report for application_1660532348419_0017 (state: ACCEPTED)
22/08/16 15:12:24 INFO yarn.Client: Application report for application_1660532348419_0017 (state: ACCEPTED)
22/08/16 15:12:25 INFO yarn.Client: Application report for application_1660532348419_0017 (state: ACCEPTED)
22/08/16 15:12:26 INFO yarn.Client: Application report for application_1660532348419_0017 (state: ACCEPTED)
22/08/16 15:12:27 INFO yarn.Client: Application report for application_1660532348419_0017 (state: FAILED)
22/08/16 15:12:27 INFO yarn.Client: 
         client token: N/A
         diagnostics: Application application_1660532348419_0017 failed 2 times due to AM Container for appattempt_1660532348419_0017_000002 exited with  exitCode: 13
Failing this attempt.Diagnostics: [2022-08-16 15:12:27.143]Exception from container-launch.
Container id: container_1660532348419_0017_02_000001
Exit code: 13

[2022-08-16 15:12:27.145]Container exited with a non-zero exit code 13. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
22/08/16 15:12:25 INFO util.SignalUtils: Registered signal handler for TERM
22/08/16 15:12:25 INFO util.SignalUtils: Registered signal handler for HUP
22/08/16 15:12:25 INFO util.SignalUtils: Registered signal handler for INT
22/08/16 15:12:25 INFO spark.SecurityManager: Changing view acls to: yarn,root
22/08/16 15:12:25 INFO spark.SecurityManager: Changing modify acls to: yarn,root
22/08/16 15:12:25 INFO spark.SecurityManager: Changing view acls groups to: 
22/08/16 15:12:25 INFO spark.SecurityManager: Changing modify acls groups to: 
22/08/16 15:12:25 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(yarn, root); groups with view permissions: Set(); users  with modify permissions: Set(yarn, root); groups with modify permissions: Set()
22/08/16 15:12:26 INFO yarn.ApplicationMaster: ApplicationAttemptId: appattempt_1660532348419_0017_000002
22/08/16 15:12:26 INFO yarn.ApplicationMaster: Starting the user application in a separate Thread
22/08/16 15:12:26 ERROR yarn.ApplicationMaster: Uncaught exception: 
java.lang.ClassNotFoundException: com.vesoft.nebula.exchange.Exchange
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        at org.apache.spark.deploy.yarn.ApplicationMaster.startUserApplication(ApplicationMaster.scala:663)
        at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:440)
        at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:275)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:805)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:804)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:804)
        at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
22/08/16 15:12:26 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 13, (reason: Uncaught exception: java.lang.ClassNotFoundException: com.vesoft.nebula.exchange.Exchange
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        at org.apache.spark.deploy.yarn.ApplicationMaster.startUserApplication(ApplicationMaster.scala:663)
        at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:440)
        at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:275)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:805)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:804)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:804)
        at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
)
22/08/16 15:12:26 INFO yarn.ApplicationMaster: Deleting staging directory hdfs://nameservice1/user/root/.sparkStaging/application_1660532348419_0017
22/08/16 15:12:26 INFO util.ShutdownHookManager: Shutdown hook called


[2022-08-16 15:12:27.146]Container exited with a non-zero exit code 13. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
22/08/16 15:12:25 INFO util.SignalUtils: Registered signal handler for TERM
22/08/16 15:12:25 INFO util.SignalUtils: Registered signal handler for HUP
22/08/16 15:12:25 INFO util.SignalUtils: Registered signal handler for INT
22/08/16 15:12:25 INFO spark.SecurityManager: Changing view acls to: yarn,root
22/08/16 15:12:25 INFO spark.SecurityManager: Changing modify acls to: yarn,root
22/08/16 15:12:25 INFO spark.SecurityManager: Changing view acls groups to: 
22/08/16 15:12:25 INFO spark.SecurityManager: Changing modify acls groups to: 
22/08/16 15:12:25 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(yarn, root); groups with view permissions: Set(); users  with modify permissions: Set(yarn, root); groups with modify permissions: Set()
22/08/16 15:12:26 INFO yarn.ApplicationMaster: ApplicationAttemptId: appattempt_1660532348419_0017_000002
22/08/16 15:12:26 INFO yarn.ApplicationMaster: Starting the user application in a separate Thread
22/08/16 15:12:26 ERROR yarn.ApplicationMaster: Uncaught exception: 
java.lang.ClassNotFoundException: com.vesoft.nebula.exchange.Exchange
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        at org.apache.spark.deploy.yarn.ApplicationMaster.startUserApplication(ApplicationMaster.scala:663)
        at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:440)
        at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:275)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:805)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:804)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:804)
        at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
22/08/16 15:12:26 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 13, (reason: Uncaught exception: java.lang.ClassNotFoundException: com.vesoft.nebula.exchange.Exchange
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        at org.apache.spark.deploy.yarn.ApplicationMaster.startUserApplication(ApplicationMaster.scala:663)
        at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:440)
        at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:275)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:805)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:804)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:804)
        at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
)
22/08/16 15:12:26 INFO yarn.ApplicationMaster: Deleting staging directory hdfs://nameservice1/user/root/.sparkStaging/application_1660532348419_0017
22/08/16 15:12:26 INFO util.ShutdownHookManager: Shutdown hook called


For more detailed output, check the application tracking page: http://vm01:8088/cluster/app/application_1660532348419_0017 Then click on links to logs of each attempt.
. Failing the application.
         ApplicationMaster host: N/A
         ApplicationMaster RPC port: -1
         queue: root.users.root
         start time: 1660633941626
         final status: FAILED
         tracking URL: http://vm01:8088/cluster/app/application_1660532348419_0017
         user: root
22/08/16 15:12:27 ERROR yarn.Client: Application diagnostics message: Application application_1660532348419_0017 failed 2 times due to AM Container for appattempt_1660532348419_0017_000002 exited with  exitCode: 13
Failing this attempt.Diagnostics: [2022-08-16 15:12:27.143]Exception from container-launch.
Container id: container_1660532348419_0017_02_000001
Exit code: 13

[2022-08-16 15:12:27.145]Container exited with a non-zero exit code 13. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
22/08/16 15:12:25 INFO util.SignalUtils: Registered signal handler for TERM
22/08/16 15:12:25 INFO util.SignalUtils: Registered signal handler for HUP
22/08/16 15:12:25 INFO util.SignalUtils: Registered signal handler for INT
22/08/16 15:12:25 INFO spark.SecurityManager: Changing view acls to: yarn,root
22/08/16 15:12:25 INFO spark.SecurityManager: Changing modify acls to: yarn,root
22/08/16 15:12:25 INFO spark.SecurityManager: Changing view acls groups to: 
22/08/16 15:12:25 INFO spark.SecurityManager: Changing modify acls groups to: 
22/08/16 15:12:25 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(yarn, root); groups with view permissions: Set(); users  with modify permissions: Set(yarn, root); groups with modify permissions: Set()
22/08/16 15:12:26 INFO yarn.ApplicationMaster: ApplicationAttemptId: appattempt_1660532348419_0017_000002
22/08/16 15:12:26 INFO yarn.ApplicationMaster: Starting the user application in a separate Thread
22/08/16 15:12:26 ERROR yarn.ApplicationMaster: Uncaught exception: 
java.lang.ClassNotFoundException: com.vesoft.nebula.exchange.Exchange
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        at org.apache.spark.deploy.yarn.ApplicationMaster.startUserApplication(ApplicationMaster.scala:663)
        at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:440)
        at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:275)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:805)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:804)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:804)
        at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
22/08/16 15:12:26 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 13, (reason: Uncaught exception: java.lang.ClassNotFoundException: com.vesoft.nebula.exchange.Exchange
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        at org.apache.spark.deploy.yarn.ApplicationMaster.startUserApplication(ApplicationMaster.scala:663)
        at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:440)
        at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:275)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:805)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:804)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:804)
        at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
)
22/08/16 15:12:26 INFO yarn.ApplicationMaster: Deleting staging directory hdfs://nameservice1/user/root/.sparkStaging/application_1660532348419_0017
22/08/16 15:12:26 INFO util.ShutdownHookManager: Shutdown hook called


[2022-08-16 15:12:27.146]Container exited with a non-zero exit code 13. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
22/08/16 15:12:25 INFO util.SignalUtils: Registered signal handler for TERM
22/08/16 15:12:25 INFO util.SignalUtils: Registered signal handler for HUP
22/08/16 15:12:25 INFO util.SignalUtils: Registered signal handler for INT
22/08/16 15:12:25 INFO spark.SecurityManager: Changing view acls to: yarn,root
22/08/16 15:12:25 INFO spark.SecurityManager: Changing modify acls to: yarn,root
22/08/16 15:12:25 INFO spark.SecurityManager: Changing view acls groups to: 
22/08/16 15:12:25 INFO spark.SecurityManager: Changing modify acls groups to: 
22/08/16 15:12:25 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(yarn, root); groups with view permissions: Set(); users  with modify permissions: Set(yarn, root); groups with modify permissions: Set()
22/08/16 15:12:26 INFO yarn.ApplicationMaster: ApplicationAttemptId: appattempt_1660532348419_0017_000002
22/08/16 15:12:26 INFO yarn.ApplicationMaster: Starting the user application in a separate Thread
22/08/16 15:12:26 ERROR yarn.ApplicationMaster: Uncaught exception: 
java.lang.ClassNotFoundException: com.vesoft.nebula.exchange.Exchange
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        at org.apache.spark.deploy.yarn.ApplicationMaster.startUserApplication(ApplicationMaster.scala:663)
        at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:440)
        at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:275)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:805)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:804)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:804)
        at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
22/08/16 15:12:26 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 13, (reason: Uncaught exception: java.lang.ClassNotFoundException: com.vesoft.nebula.exchange.Exchange
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        at org.apache.spark.deploy.yarn.ApplicationMaster.startUserApplication(ApplicationMaster.scala:663)
        at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:440)
        at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:275)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:805)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:804)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:804)
        at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
)
22/08/16 15:12:26 INFO yarn.ApplicationMaster: Deleting staging directory hdfs://nameservice1/user/root/.sparkStaging/application_1660532348419_0017
22/08/16 15:12:26 INFO util.ShutdownHookManager: Shutdown hook called


For more detailed output, check the application tracking page: http://vm01:8088/cluster/app/application_1660532348419_0017 Then click on links to logs of each attempt.
. Failing the application.
Exception in thread "main" org.apache.spark.SparkException: Application application_1660532348419_0017 finished with failed status
        at org.apache.spark.deploy.yarn.Client.run(Client.scala:1158)
        at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1606)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:851)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:926)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:935)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
22/08/16 15:12:27 INFO util.ShutdownHookManager: Shutdown hook called
22/08/16 15:12:27 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-342a558b-54a7-4146-a1c3-668597cb36a5
22/08/16 15:12:27 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-2d98e7a1-69eb-48eb-b124-fbc758422169

If the Spark application jar is missing, Spark only reports that the main class cannot be found; it does not report that the jar is missing. You can verify this yourself by testing in local mode from any directory.
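For example, something along these lines, run from a directory that has the config file but not the jar (paths are placeholders and the exact warning text may differ by Spark version):

spark-submit --master local \
--conf spark.driver.extraClassPath=./ \
--class com.vesoft.nebula.exchange.Exchange \
nebula-exchange-2.6.3.jar \
-c app_sf30.conf

This should reproduce the same java.lang.ClassNotFoundException rather than a missing-jar error.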

I just tried the submit command above and it works, so you might want to check whether the other machines in your yarn cluster have permission to access the jar.

I ran the command above in a directory that does not contain the jar, and it reports file not exist.

How would I verify that? With this yarn-cluster submission, the jar is shipped to the cluster and each node then reads it from there, right? Is there any way to verify that? :lying_face:

The file not exist error is because your test directory has neither the jar nor the config file; that error refers to the missing config file. Look at my test result:
in a test directory that has the config file but not the jar, the error is ClassNotFoundException.

After running Exchange, open the Spark web UI, look at the System Properties under Environment to see which user the job is running as, and then confirm that user has permission to access the extraClassPath.
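You can also pull the aggregated logs of the failed attempt with the yarn CLI (application id taken from the output above); depending on your log aggregation settings this may additionally include launch_container.sh and directory.info, which show exactly what ended up in the container's working directory and on its classpath:

yarn logs -applicationId application_1660532348419_0017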

Sorry, it turns out the problem is on my side with the jar itself; its size shows as 0 here :cold_sweat:
I'll keep digging.
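A quick pre-flight check that would catch this kind of problem: verify the jar's size and that the main class is really inside it before submitting, for example

ls -l nebula-exchange-2.6.3.jar
jar tf nebula-exchange-2.6.3.jar | grep com/vesoft/nebula/exchange/Exchange.class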
