nebula-algorithm throws NoClassDefFoundError when invoked

Hi all, I updated my application.conf following the reference config on GitHub. Here it is:

{
  # 1. Spark settings (syntax error already fixed)
  spark: {
    app: {
      name: "ConnectedComponent-GangBuild"
      partitionNum: 10
    }
    master: "local[*]"  # key fix: wrap local[*] in double quotes
  }

  # 2. Data source and sink settings
  data: {
    source: nebula
    sink: nebula
    hasWeight: false
  }

  # 3. NebulaGraph connection and read/write settings
  nebula: {
    read: {
      metaAddress: "10.10.102.13:9559"
      space: "hyl"
      labels: ["user_inline"]
      weightCols: []
    }
    write: {
      graphAddress: "10.10.102.13:9669"
      metaAddress: "10.10.102.13:9559"
      user: "root"
      pswd: "nebula"
      space: "hyl"
      tag: "user"
      type: "update"
    }
  }

  # 4. Algorithm settings
  algorithm: {
    executeAlgo: "connectedcomponent"
    connectedcomponent: {
      maxIter: 20
      encodeId: true
    }
  }
}

But I still get an error:

[admin@qa1-rcspt-nebulaGraph ~]$ sudo /home/ops/spark24/spark-2.4.2-bin-hadoop2.7/bin/spark-submit   --class com.vesoft.nebula.algorithm.Main   --master local[*]   --executor-memory 8G   /tmp/nebula-algorithm-3.0.0.jar   -p /tmp/application.conf
25/11/19 15:59:45 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
log4j:WARN No appenders could be found for logger (com.vesoft.nebula.algorithm.Main$).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
25/11/19 15:59:46 INFO SparkContext: Running Spark version 2.4.2
25/11/19 15:59:46 INFO SparkContext: Submitted application: ConnectedComponent-GangBuild
25/11/19 15:59:46 INFO SecurityManager: Changing view acls to: root
25/11/19 15:59:46 INFO SecurityManager: Changing modify acls to: root
25/11/19 15:59:46 INFO SecurityManager: Changing view acls groups to: 
25/11/19 15:59:46 INFO SecurityManager: Changing modify acls groups to: 
25/11/19 15:59:46 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
25/11/19 15:59:46 INFO Utils: Successfully started service 'sparkDriver' on port 42805.
25/11/19 15:59:46 INFO SparkEnv: Registering MapOutputTracker
25/11/19 15:59:46 INFO SparkEnv: Registering BlockManagerMaster
25/11/19 15:59:46 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
25/11/19 15:59:46 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
25/11/19 15:59:46 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-836f41a4-36ec-4329-a930-48a72f686425
25/11/19 15:59:46 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
25/11/19 15:59:46 INFO SparkEnv: Registering OutputCommitCoordinator
25/11/19 15:59:46 INFO Utils: Successfully started service 'SparkUI' on port 4040.
25/11/19 15:59:46 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://qa1-rcspt-nebulaGraph:4040
25/11/19 15:59:46 INFO SparkContext: Added JAR file:/tmp/nebula-algorithm-3.0.0.jar at spark://qa1-rcspt-nebulaGraph:42805/jars/nebula-algorithm-3.0.0.jar with timestamp 1763539186707
25/11/19 15:59:46 INFO Executor: Starting executor ID driver on host localhost
25/11/19 15:59:46 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 40973.
25/11/19 15:59:46 INFO NettyBlockTransferService: Server created on qa1-rcspt-nebulaGraph:40973
25/11/19 15:59:46 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
25/11/19 15:59:46 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, qa1-rcspt-nebulaGraph, 40973, None)
25/11/19 15:59:46 INFO BlockManagerMasterEndpoint: Registering block manager qa1-rcspt-nebulaGraph:40973 with 366.3 MB RAM, BlockManagerId(driver, qa1-rcspt-nebulaGraph, 40973, None)
25/11/19 15:59:46 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, qa1-rcspt-nebulaGraph, 40973, None)
25/11/19 15:59:46 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, qa1-rcspt-nebulaGraph, 40973, None)
25/11/19 15:59:47 INFO ReadNebulaConfig$: NebulaReadConfig={space=hyl,label=user_inline,returnCols=List(),noColumn=true,partitionNum=10}
25/11/19 15:59:47 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/home/admin/spark-warehouse').
25/11/19 15:59:47 INFO SharedState: Warehouse path is 'file:/home/admin/spark-warehouse'.
25/11/19 15:59:47 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
        at com.vesoft.nebula.connector.NebulaOptions.<init>(NebulaOptions.scala:21)
        at com.vesoft.nebula.connector.NebulaDataSource.getNebulaOptions(NebulaDataSource.scala:142)
        at com.vesoft.nebula.connector.NebulaDataSource.createReader(NebulaDataSource.scala:41)
        at org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation$SourceHelpers.createReader(DataSourceV2Relation.scala:155)
        at org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation$.create(DataSourceV2Relation.scala:172)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:206)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)
        at com.vesoft.nebula.connector.connector.package$NebulaDataFrameReader.loadEdgesToDF(package.scala:172)
        at com.vesoft.nebula.algorithm.reader.NebulaReader$$anonfun$read$1.apply$mcVI$sp(DataReader.scala:52)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:158)
        at com.vesoft.nebula.algorithm.reader.NebulaReader.read(DataReader.scala:38)
        at com.vesoft.nebula.algorithm.Main$.createDataSource(Main.scala:118)
        at com.vesoft.nebula.algorithm.Main$.main(Main.scala:84)
        at com.vesoft.nebula.algorithm.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
        at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        ... 26 more
25/11/19 15:59:47 INFO SparkContext: Invoking stop() from shutdown hook
25/11/19 15:59:47 INFO SparkUI: Stopped Spark web UI at http://qa1-rcspt-nebulaGraph:4040
25/11/19 15:59:47 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
25/11/19 15:59:47 INFO MemoryStore: MemoryStore cleared
25/11/19 15:59:47 INFO BlockManager: BlockManager stopped
25/11/19 15:59:47 INFO BlockManagerMaster: BlockManagerMaster stopped
25/11/19 15:59:47 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
25/11/19 15:59:47 INFO SparkContext: Successfully stopped SparkContext
25/11/19 15:59:47 INFO ShutdownHookManager: Shutdown hook called
25/11/19 15:59:47 INFO ShutdownHookManager: Deleting directory /tmp/spark-3411fa98-aebb-4e62-9881-4226cbe75b83
25/11/19 15:59:47 INFO ShutdownHookManager: Deleting directory /tmp/spark-f8c72f2e-4b8a-493c-bfb2-019c246f6608

When I ask an AI, it keeps telling me my versions are incompatible, but I can see other users on this forum also running Spark 2.4.
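In case it helps narrow this down: the missing class `org.apache.spark.internal.Logging$class` (note the `$class` suffix) only exists in Spark builds compiled with Scala 2.11, since Scala 2.12 dropped that trait-implementation encoding. So the mismatch an AI points at would be Scala, not Spark. A rough sketch of how I could check this on my box (paths copied from my spark-submit command above; exact jar names may differ in your distribution):

```shell
# Print Spark build info; the banner includes the Scala version it was
# compiled against ("Using Scala version 2.11.x" or "2.12.x").
/home/ops/spark24/spark-2.4.2-bin-hadoop2.7/bin/spark-submit --version

# The Scala version is also encoded in the bundled core jar's name,
# e.g. spark-core_2.11-2.4.x.jar vs spark-core_2.12-2.4.2.jar.
ls /home/ops/spark24/spark-2.4.2-bin-hadoop2.7/jars/spark-core_*.jar
```

If that shows Scala 2.12 while the nebula-algorithm jar was built for Scala 2.11 (which is what the `Logging$class` lookup in the stack trace suggests), that would explain the NoClassDefFoundError even though both sides are nominally "Spark 2.4".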