Import with header reports an error

Environment

Nebula version: v2

Files prepared for import

  • example.yml
  • example.csv (with header)

example.yml

version: v2
description: example
removeTempFiles: false
clientSettings:
  retry: 3
  concurrency: 2 # number of graph clients
  channelBufferSize: 1
  space: worm
  connection:
    user: root
    password: nebula
    address: 192.168.132.128:9669
  postStart:    # Optional. Operations to run after connecting to the Nebula Graph server and before inserting data.
    commands: |
      UPDATE CONFIGS storage:wal_ttl=3600;
      UPDATE CONFIGS storage:rocksdb_column_family_options = { disable_auto_compactions = true };
      DROP SPACE IF EXISTS worm;
      CREATE SPACE IF NOT EXISTS worm(partition_num=5, replica_factor=1);
      USE worm;
    afterPeriod: 8s		# Optional. Interval between running the commands above and starting to insert data, e.g. 8s.
  preStop:       		# Optional. Operations to run before disconnecting from the Nebula Graph server.
    commands: |
      UPDATE CONFIGS storage:rocksdb_column_family_options = { disable_auto_compactions = false };
      UPDATE CONFIGS storage:wal_ttl=86400;
logPath: ./err/worm.log					# Optional. Path of the file where errors and other log messages produced during the import are written.
files:
  - path: ./host_header.csv             # Required
    failDataPath: ./err/host_header.csv # Required
    batchSize: 2                        # Optional. Number of data records batched into a single insert statement.
    inOrder: true                       # Optional. Whether to insert the data rows in the order they appear in the file.
    type: csv                           # Required. File type.
    csv:
      withHeader: true                  # Required. Whether the file has a header row.
      withLabel: false                  # Required. Whether the file has a LABEL column marking rows for insertion or deletion.
    schema:
      type: vertex                      # Required. Schema type; valid values are vertex and edge.

  - path: ./flow_header.csv
    failDataPath: ./err/flow_header.csv
    batchSize: 2
    type: csv
    csv:
      withHeader: true          
      withLabel: false       
    schema:
      type: edge
      edge:
        name: flow
        withRanking: false
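
With withHeader: true, the importer takes the tag/edge name and the property mapping from the CSV header itself (for example host.name:string and :VID(int) below), so files[0].schema only declares type: vertex; that is why the importer later logs the INFO line "files[0].schema.vertex is nil". For comparison, here is a minimal sketch of what an explicit, header-less mapping for the host file might look like, assuming the usual nebula-importer vertex/tags/props layout (this is not part of the config actually used here):

    schema:
      type: vertex
      vertex:
        tags:
          - name: host          # tag name
            props:
              - name: name      # maps a CSV column to host.name
                type: string
              - name: ip        # maps a CSV column to host.ip
                type: string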

example.csv (with header)

lite@ubuntu[±|release-v2-0-0-ga U:1 ?:3 ✗]:~/projects/nebula-importer $ head host_header.csv
host.name:string,:VID(int),host.ip:string,:IGNORE
pjqKbl,3237609346,192.249.255.130,HOST
MOfIuk,3760568082,224.37.183.18,HOST
lxepuF,3335969299,198.214.218.19,HOST
HySfKi,2603487773,155.46.18.29,HOST
AjIkEB,3757918228,223.253.72.20,HOST
KPoFRY,4010278438,239.7.254.38,HOST
bdwhmD,563240565,33.146.94.117,HOST
xjaHSD,556175612,33.38.144.252,HOST
rjdtBn,2035040284,121.76.64.28,HOST
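
In this header, :VID(int) marks the vertex ID column, host.name:string and host.ip:string map their columns onto properties of the tag host, and :IGNORE drops the last column. By analogy with the INSERT EDGE statements visible in the error log below, the statement the importer builds from the first data row should look roughly like this (a sketch, not copied from an actual log; with batchSize: 2 two rows would be batched into one statement):

INSERT VERTEX `host`(`name`, `ip`) VALUES 3237609346:("pjqKbl", "192.249.255.130");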



lite@ubuntu[±|release-v2-0-0-ga U:1 ?:3 ✗]:~/projects/nebula-importer $ head flow_header.csv
:SRC_VID(int),:DST_VID(int),flow.name:string
1095999950,4197714551,FLOW_TO
2725893716,2083094013,FLOW_TO
13736852,1426201793,FLOW_TO
2990378936,3754053821,FLOW_TO
527769486,3856232157,FLOW_TO
2983788612,2725893716,FLOW_TO
563240565,3012747447,FLOW_TO
2067364548,3611717310,FLOW_TO
2752644230,3760568082,FLOW_TO

Import error

Import command

lite @ ubuntu in ~/projects/nebula-importer |07:47:54   |release-v2-0-0-ga ✗|
$ ./nebula-importer --config example.yml
2021/04/27 07:47:55 --- START OF NEBULA IMPORTER ---
2021/04/27 07:47:55 [INFO] config.go:404: files[0].schema.vertex is nil
2021/04/27 07:47:55 [INFO] connection_pool.go:74: [nebula-clients] connection pool is initialized successfully
2021/04/27 07:47:55 [INFO] clientmgr.go:28: Create 2 Nebula Graph clients
2021/04/27 07:47:55 [INFO] reader.go:64: Start to read file(1): /home/lite/projects/nebula-importer/flow_header.csv, schema: < :SRC_VID(int),:DST_VID(int),flow.name:string >
2021/04/27 07:47:55 [INFO] reader.go:64: Start to read file(0): /home/lite/projects/nebula-importer/host_header.csv, schema: < host.name:string,:VID(int),host.ip:string >
2021/04/27 07:48:06 [ERROR] handler.go:63: Client 0 fail to execute: INSERT EDGE `flow`(`name`) VALUES  1095999950->4197714551:("FLOW_TO") , 13736852->1426201793:("FLOW_TO") ;, ErrMsg: EdgeName `flow'  is nonexistent, ErrCode: -8
2021/04/27 07:48:06 [ERROR] handler.go:63: Client 1 fail to execute: INSERT EDGE `flow`(`name`) VALUES  2725893716->2083094013:("FLOW_TO") , 2990378936->3754053821:("FLOW_TO") ;, ErrMsg:EdgeName `flow'  is nonexistent, ErrCode: -8

Did you create the tag host and the edge flow beforehand? You can connect with the console, switch to the space you are inserting into, and run SHOW TAGS and SHOW EDGES to confirm whether the tag and the edge exist. If they do exist, the server-side cache may simply not have synced yet, in which case you can retry the import.
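
For example, in nebula-console (worm is the space name from the config above):

USE worm;
SHOW TAGS;
SHOW EDGES;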

Indeed I hadn't. I'll create them with the postStart commands.
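
A minimal sketch of the amended postStart.commands, with the tag and edge schema derived from the CSV headers above (host.name:string / host.ip:string and flow.name:string); the UPDATE CONFIGS statements stay as before and are omitted here:

  postStart:
    commands: |
      DROP SPACE IF EXISTS worm;
      CREATE SPACE IF NOT EXISTS worm(partition_num=5, replica_factor=1);
      USE worm;
      CREATE TAG IF NOT EXISTS host(name string, ip string);
      CREATE EDGE IF NOT EXISTS flow(name string);
    afterPeriod: 8s

Since a newly created schema takes a short while to become usable on the server side, the 8s afterPeriod may need to be increased if the first batches still fail.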