Nebula version: 3.0.0
Spark version: 2.4.4
Reading from Hive via Spark and writing to Nebula fails. Checking the logs, I found the insert statement:
INSERT edge RemoteThreadCreation
(threadId
,occurTime
) VALUES -1309651532850055942->-1309651532850055942@0: (null, 1628424395000000)
Running it manually in the Nebula console returns: [ERROR (-1005)]: Storage Error: Out of range value.
The Hive column is of type timestamp, and the corresponding property on the RemoteThreadCreation edge in Nebula is also timestamp.
As usual, please also share the Spark Connector version number.
nebula-spark-connector: 3.0.0
That's one problem. Another is that Spark writes concurrently, so when the same vertex (or edge) appears in multiple rows, the write fails with:
ERROR: for Storage Error: More than one request trying to add/update/delete one edge/vertex at the same time
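One common workaround for the concurrency error (a sketch only, not connector-specific advice) is to deduplicate rows by edge key before writing, so that no two concurrent requests mutate the same edge. The helper below is hypothetical; in plain Python terms it looks like this:

```python
# Deduplicate edge rows by (src, dst, rank) so a concurrent batch
# never contains two mutations for the same edge key.
def dedup_edges(rows):
    """rows: iterable of (src, dst, rank, props) tuples.
    Keeps the last occurrence of each edge key."""
    seen = {}
    for src, dst, rank, props in rows:
        seen[(src, dst, rank)] = props  # later rows overwrite earlier ones
    return [(s, d, r, p) for (s, d, r), p in seen.items()]

rows = [
    (-1309651532850055942, -1309651532850055942, 0, (None, 1628424395)),
    (-1309651532850055942, -1309651532850055942, 0, (None, 1628424400)),
]
deduped = dedup_edges(rows)
# Only one mutation per edge key remains.
```

In Spark the equivalent step would be deduplicating the DataFrame on the source ID, destination ID, and rank columns before handing it to the connector.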
Nebula's timestamp only has second precision:
(root@nebula) [sf1000_2]> yield timestamp(1628424395)
+-----------------------+
| timestamp(1628424395) |
+-----------------------+
| 1628424395 |
+-----------------------+
Got 1 rows (time spent 627/10529 us)
Tue, 22 Mar 2022 15:36:00 CST
(root@nebula) [sf1000_2]> yield timestamp(1628424395000000)
+-----------------------------+
| timestamp(1628424395000000) |
+-----------------------------+
| BAD_DATA |
+-----------------------------+
Got 1 rows (time spent 473/16516 us)
Tue, 22 Mar 2022 15:36:03 CST
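So the value 1628424395000000 coming from Hive is epoch microseconds, which overflows Nebula's second-precision timestamp. Dividing it down to seconds before writing should make it acceptable. A minimal sketch in pure Python (the constant name is my own):

```python
# Hive timestamps exported this way are epoch microseconds;
# Nebula's timestamp type expects epoch seconds.
MICROS_PER_SECOND = 1_000_000

def micros_to_seconds(ts_micros):
    # Integer division drops the sub-second part.
    return ts_micros // MICROS_PER_SECOND

print(micros_to_seconds(1628424395000000))  # 1628424395, the value Nebula accepts
```

In the Spark job, applying this conversion to the occurTime column before writing avoids the "Out of range value" error.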