I built the connector from the spark3.0 branch and integrated it into my own PySpark code:
df.write.format("com.vesoft.nebula.connector.NebulaDataSource") \
    .options(**prop) \
    .option('type', 'vertex') \
    .option('label', 'router') \
    .option('vidPolicy', '') \
    .option('vertexField', '_vertexId') \
    .option('batch', 1) \
    .option('writeMode', 'update') \
    .save()
This raises the error:
pyspark.sql.utils.AnalysisException: TableProvider implementation com.vesoft.nebula.connector.NebulaDataSource cannot be written with ErrorIfExists mode, please use Append or Overwrite modes instead.
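The error message itself points to the likely cause: Spark's DataFrameWriter defaults to the ErrorIfExists save mode, which this TableProvider implementation rejects, so the mode has to be declared explicitly with `.mode("append")` or `.mode("overwrite")`. A minimal sketch of the corrected call, assuming the same `prop` dict and the connector jar on the classpath (note the `.mode(...)` here only satisfies Spark's API check; whether rows are inserted or updated in Nebula should still be governed by the connector's own `writeMode` option):

```python
# Explicitly set the Spark save mode; the default ErrorIfExists is
# rejected by the NebulaDataSource TableProvider.
df.write.format("com.vesoft.nebula.connector.NebulaDataSource") \
    .mode("append") \
    .options(**prop) \
    .option('type', 'vertex') \
    .option('label', 'router') \
    .option('vidPolicy', '') \
    .option('vertexField', '_vertexId') \
    .option('batch', 1) \
    .option('writeMode', 'update') \
    .save()
```

This fragment needs a running Spark session, a reachable Nebula Graph cluster, and the connector jar, so it cannot be run standalone.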