spark-connector 2.0.0 refuses to connect to the graph service: com.facebook.thrift.transport.TTransportException: java.net.ConnectException: Connection refused (Connection refused)

Question template:

  • Nebula version: 2.0
  • Deployment: distributed
  • Production deployment: Y
  • Hardware
    • Disk: SSD
  • Using spark-connector to connect and read data, with this configuration:

// a, b, c stand in for the meta addresses of my three machines
// (imports per the connector package; `spark` is an active SparkSession)
import com.vesoft.nebula.connector.{NebulaConnectionConfig, ReadNebulaConfig}
import com.vesoft.nebula.connector.connector.NebulaDataFrameReader

val config =
  NebulaConnectionConfig
    .builder()
    .withMetaAddress("a:9559,b:9559,c:9559")
    .withConenctionRetry(2) // sic: this is how the 2.0 builder method is spelled
    .withTimeout(6000)
    .build()
val nebulaReadVertexConfig: ReadNebulaConfig = ReadNebulaConfig
  .builder()
  .withSpace("IDGraph")
  .withLabel("entry")
  .withNoColumn(false)
  .withLimit(1000)
  .withPartitionNum(10)
  .build()
val vertex = spark.read.nebula(config, nebulaReadVertexConfig).loadVerticesToDF()
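Once the connection succeeds, a quick sanity check on the returned DataFrame can confirm the read worked (a sketch; assumes the `vertex` value from the snippet above):

// inspect the schema and a sample of the vertices that were read
vertex.printSchema()
println(s"vertices read: ${vertex.count()}")
vertex.show(10)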

- Error message

  • Related meta / storage / graph log info:
com.facebook.thrift.transport.TTransportException: java.net.ConnectException: Connection refused (Connection refused)
	at com.facebook.thrift.transport.TSocket.open(TSocket.java:175)
	at com.vesoft.nebula.client.meta.MetaClient.getClient(MetaClient.java:104)
	at com.vesoft.nebula.client.meta.MetaClient.doConnect(MetaClient.java:99)
	at com.vesoft.nebula.client.meta.MetaClient.connect(MetaClient.java:89)
	at com.vesoft.nebula.connector.nebula.MetaProvider.<init>(MetaProvider.scala:22)
	at com.vesoft.nebula.connector.reader.NebulaSourceReader.getSchema(NebulaSourceReader.scala:45)
	at com.vesoft.nebula.connector.reader.NebulaSourceReader.readSchema(NebulaSourceReader.scala:31)
	at org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation$.create(DataSourceV2Relation.scala:175)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:204)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)
	at com.vesoft.nebula.connector.connector.package$NebulaDataFrameReader.loadVerticesToDF(package.scala:123)
	at com.vesoft.nebula.connector.importor.NebulaImport$.readVertex(NebulaImport.scala:51)
	at com.vesoft.nebula.connector.importor.NebulaImport$.main(NebulaImport.scala:30)
	at com.vesoft.nebula.connector.importor.NebulaImport.main(NebulaImport.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:684)
Caused by: java.net.ConnectException: Connection refused (Connection refused)
	at java.net.PlainSocketImpl.socketConnect(Native Method)
	at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
	at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
	at java.net.Socket.connect(Socket.java:589)
	at com.facebook.thrift.transport.TSocket.open(TSocket.java:170)
	... 18 more

The services are all running normally; none of them have crashed.

Can anyone help me out?

About this config… are you literally using a:9559 and b, c?


That's because my company doesn't allow posting real addresses, so I used placeholder names. The addresses themselves are not the problem.

So you've confirmed that a:9559 is actually reachable, right?

The meta service can't be reached. On the machine where you run the spark-connector job, run telnet a 9559, telnet b 9559, and telnet c 9559 to see whether they connect; a small Scala probe that does the same is sketched below.
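For reference, a minimal Scala sketch of the same connectivity check, runnable from the driver machine even without telnet installed (hosts a, b, c are the placeholder meta addresses from this thread):

import java.net.{InetSocketAddress, Socket}

object MetaPortProbe {
  def main(args: Array[String]): Unit = {
    // placeholder meta addresses; replace with the real host names
    val metaHosts = Seq("a" -> 9559, "b" -> 9559, "c" -> 9559)
    metaHosts.foreach { case (host, port) =>
      val socket = new Socket()
      try {
        // same check as `telnet host port`, with a 3-second timeout
        socket.connect(new InetSocketAddress(host, port), 3000)
        println(s"$host:$port is reachable")
      } catch {
        case e: Exception => println(s"$host:$port is NOT reachable: ${e.getMessage}")
      } finally {
        socket.close()
      }
    }
  }
}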

Thanks a lot. It turned out to be caused by the shade-plugin renaming the netty package while resolving an earlier Spark dependency conflict. For now I've solved it by moving my code into the connector module and building a single fat JAR to upload.
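A minimal build.sbt sketch of that fat-JAR idea (the author actually used the Maven shade-plugin, so the coordinates and versions here are illustrative, not the exact fix):

// build.sbt — bundle the application together with the connector so the
// driver and executors see one consistent classpath, instead of a shaded
// copy with a renamed netty package
name := "nebula-import-fat"
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  // "provided": the cluster supplies Spark at runtime
  "org.apache.spark" %% "spark-sql" % "2.4.4" % "provided",
  "com.vesoft" % "nebula-spark-connector" % "2.0.0"
)

// requires the sbt-assembly plugin in project/plugins.sbt;
// resolve duplicate files when merging jars, without renaming any packages
assembly / assemblyMergeStrategy := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}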

For 2.0.0, which Spark version are you on?

Hi, my Spark is 2.2, but the connector is built against Spark 2.4.4, and I'm now running into a lot of conflicts. What should I do?

spark-submit \
--master yarn \
--name nebulaconnect \
--deploy-mode cluster \
--num-executors 50 \
--executor-memory 10g \
--executor-cores 5 \
--conf spark.driver.extraClassPath=nebula-spark-connector-2.0-SNAPSHOT.jar \
--conf spark.executor.extraClassPath=nebula-spark-connector-2.0-SNAPSHOT.jar \
--conf spark.dynamicAllocation.enabled=false \
--jars nebula-spark-connector-2.0-SNAPSHOT.jar \
--class com.vesoft.nebula.connector.importor.NebulaImport \
  original-nebula-spark-connector-2.0-SNAPSHOT.jar
diagnostics: User class threw exception: java.lang.NoSuchMethodError: org.apache.spark.sql.internal.SharedState.externalCatalog()Lorg/apache/spark/sql/catalyst/catalog/ExternalCatalog;

	at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:40)
	at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:55)
	at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:53)
	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:70)
	at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:70)
	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
	at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
	at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
	at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
	at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:78)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
	at com.vesoft.nebula.connector.importor.NebulaImport$.writeVertex(NebulaImport.scala:62)
	at com.vesoft.nebula.connector.importor.NebulaImport$.main(NebulaImport.scala:24)
	at com.vesoft.nebula.connector.importor.NebulaImport.main(NebulaImport.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:684)

I installed Spark locally on Windows since that's convenient; I haven't tried it on the server yet… Maybe I can upgrade Spark there.

Upgrade Spark. 2.2 and 2.4 are not compatible.
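A small runtime check can surface the mismatch before any Nebula call runs; a sketch, given that nebula-spark-connector 2.0.0 is built against Spark 2.4.x:

import org.apache.spark.sql.SparkSession

object SparkVersionCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("version-check").getOrCreate()
    // fail fast if the cluster's Spark does not match the connector's target
    require(spark.version.startsWith("2.4"),
      s"connector 2.0.0 targets Spark 2.4.x, but the cluster reports ${spark.version}")
    println(s"Spark version OK: ${spark.version}")
    spark.stop()
  }
}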