NebulaSparkWriterExample in nebula-spark-utils imports data without any errors, but the data cannot be found when queried

Question template:

  • nebula version: 2.0.1

  • Deployment: distributed

  • Production environment: Y

  • Specific description of the problem

  • Relevant meta / storage / graph logs: no errors

Source data:

{"src":"12345","dst":"23456","degree":34, "descr": "aaa","timep": "2020-01-01"}
{"src":"11111","dst":"22222","degree":33, "descr": "aaa","timep": "2020-01-01"}
{"src":"11111","dst":"33333","degree":32, "descr": "a\baa","timep": "2020-01-01"}
{"src":"11111","dst":"44444","degree":31, "descr": "aaa","timep": "2020-01-01"}
{"src":"22222","dst":"55555","degree":30, "descr": "a\naa","timep": "2020-01-01"}
{"src":"33333","dst":"44444","degree":29, "descr": "aaa","timep": "2020-01-01"}
{"src":"33333","dst":"55555","degree":28, "descr": "aa\ta","timep": "2020-01-01"}
{"src":"44444","dst":"22222","degree":27, "descr": "aaa","timep": "2020-01-01"}
{"src":"44444","dst":"55555","degree":26, "descr": "aaa","timep": "2020-01-01"}
{"src":"22222","dst":"66666","degree":25, "descr": "aaa","timep": "2020-01-01"}

Edge creation statement:

  CREATE EDGE friend(src string, dst string, degree int, descr string, timep date);

Query statement:

 FETCH PROP ON friend "44444" -> "55555";

Import method (the official example):

  def writeEdge(spark: SparkSession): Unit = {
    LOG.info("start to write nebula edges")
    val df = spark.read.json("src/main/resources/edge")
    df.show()
    df.persist(StorageLevel.MEMORY_AND_DISK_SER)

    // The meta/graph addresses were left blank in the post; fill in the
    // actual "host:port" values of your cluster before running.
    val config =
      NebulaConnectionConfig
        .builder()
        .withMetaAddress("")
        .withGraphAddress("")
        .build
    val nebulaWriteEdgeConfig: WriteNebulaEdgeConfig = WriteNebulaEdgeConfig
      .builder()
      .withSpace("test")
      .withEdge("friend")
      .withSrcIdField("src")
      .withDstIdField("dst")
      .withRankField("degree")
      .withSrcAsProperty(true)
      .withDstAsProperty(true)
      .withRankAsProperty(true)
      .withBatch(1000)
      .build()
    df.write.nebula(config, nebulaWriteEdgeConfig).writeEdges()
  }
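Because withRankField("degree") is set above, the connector writes each row's degree value as the edge rank (in addition to storing src/dst/degree as properties, per the withSrcAsProperty/withDstAsProperty/withRankAsProperty flags). For the first sample row, the write is roughly equivalent to the following nGQL (a sketch; the property order is assumed to match the schema):

```ngql
// degree = 34 becomes the rank (the value after @), not just a property
INSERT EDGE friend(src, dst, degree, descr, timep) VALUES
  "12345" -> "23456"@34: ("12345", "23456", 34, "aaa", date("2020-01-01"));
```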

The query statement is wrong: the import specified a rank (withRankField), but the query does not specify one, so it looks up the edge at the default rank 0, which was never written.
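Since the 44444 -> 55555 row in the source data has degree 26, that edge was written with rank 26, and the fetch needs to name that rank explicitly (a sketch of the corrected query):

```ngql
// Specify the rank after @ to match the value written from the degree field
FETCH PROP ON friend "44444" -> "55555"@26;
```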


OK, thanks for the pointer. I'll go try it. Thanks!
