Does NebulaGraph 3.8 Community Edition support exporting data with Exchange?

[root@localhost nebula-spark-exchange]# ./export.sh
26/05/11 20:19:10 WARN Utils: Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 170.20.213.21 instead (on interface enp18s0)
26/05/11 20:19:10 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Exception in thread "main" com.typesafe.config.ConfigException$Missing: /apps/nebula-spark-exchange/export.conf: 1: No configuration setting found for key 'nebula'
at com.typesafe.config.impl.SimpleConfig.findKeyOrNull(SimpleConfig.java:156)
at com.typesafe.config.impl.SimpleConfig.findOrNull(SimpleConfig.java:174)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:188)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:193)
at com.typesafe.config.impl.SimpleConfig.getObject(SimpleConfig.java:268)
at com.typesafe.config.impl.SimpleConfig.getConfig(SimpleConfig.java:274)
at com.typesafe.config.impl.SimpleConfig.getConfig(SimpleConfig.java:41)
at com.vesoft.exchange.common.config.Configs$.parse(Configs.scala:314)
at com.vesoft.nebula.exchange.Exchange$.main(Exchange.scala:43)
at com.vesoft.nebula.exchange.Exchange.main(Exchange.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:984)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:191)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:214)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1072)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1081)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
26/05/11 20:19:20 INFO ShutdownHookManager: Shutdown hook called
26/05/11 20:19:20 INFO ShutdownHookManager: Deleting directory /tmp/spark-7bf22f2b-6395-4453-83a2-ccd0814cfcbf


As noted in the documentation, the export feature is only available in the Enterprise Edition of Exchange. For other ways to export data, you can search the forum to see whether any of them fit your needs.
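For context on the error itself: Exchange configuration files are HOCON documents, and the parser requires a top-level `nebula` block; the `ConfigException$Missing ... No configuration setting found for key 'nebula'` above means that block is absent from `export.conf`. A minimal sketch of the expected top-level shape is below (all addresses, credentials, and the space name are placeholders, and the exact set of sub-keys may differ across Exchange versions):

```hocon
{
  spark: {
    app: { name: "NebulaGraph Exchange" }
  }

  # The parse fails with "No configuration setting found for key 'nebula'"
  # when this top-level block is missing.
  nebula: {
    address: {
      graph: ["127.0.0.1:9669"]   # placeholder graphd address
      meta: ["127.0.0.1:9559"]    # placeholder metad address
    }
    user: root                    # placeholder credentials
    pswd: nebula
    space: test_space             # placeholder space name
  }

  # tags/edges sections follow here; the export sinks themselves
  # are only available in the Enterprise Edition of Exchange.
  tags: []
}
```

Fixing the config file only resolves the parse error, though; it does not add export capability to the Community Edition.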