When running question answering with KnowledgeGraphQueryEngine, I get this error:
File "/home/kg_llm/soft/l1amaindex/lib/python3.10/site-packages/llama_index/core/llms/llm.py", line 218, in _log_template_data
    for k, v in ChainMap(prompt.kwargs, prompt_args).items()
AttributeError: 'NoneType' object has no attribute 'kwargs'
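For context, the failing line in llama_index merges the prompt's bound kwargs with the call-time kwargs via ChainMap, so this AttributeError means `_log_template_data` was reached with `prompt=None`, i.e. the engine had no text-to-graph-query prompt set. A minimal stand-alone reproduction of that failure mode (`FakePrompt` and `log_template_data` are stand-ins of mine, not llama_index APIs):

```python
from collections import ChainMap

class FakePrompt:
    """Stand-in for a llama_index prompt: just holds bound kwargs."""
    def __init__(self, kwargs):
        self.kwargs = kwargs

def log_template_data(prompt, **prompt_args):
    # Mirrors the failing line in llama_index's llm.py:
    #   for k, v in ChainMap(prompt.kwargs, prompt_args).items()
    return {k: v for k, v in ChainMap(prompt.kwargs, prompt_args).items()}

# With a real prompt object the merge works:
merged = log_template_data(FakePrompt({"schema": "demospace"}), query_str="hi")

# With prompt=None it raises exactly the reported AttributeError:
try:
    log_template_data(None, query_str="hi")
    error = ""
except AttributeError as exc:
    error = str(exc)
```

This suggests checking which prompt the engine ends up with, rather than anything in the graph itself.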
When using KnowledgeGraphRAGRetriever + RetrieverQueryEngine, I instead get:
Storage Error: The VID must be a 64-bit integer or a string fitting space vertex id length limit.
My space schema is: CREATE SPACE IF NOT EXISTS demospace (vid_type=fixed_string(32))
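With vid_type=fixed_string(32), NebulaGraph only accepts string VIDs whose UTF-8 encoding fits in 32 bytes, and it counts bytes, not characters; Chinese characters take 3 bytes each, so an entity name longer than about 10 characters overflows the limit and triggers exactly this Storage Error. A quick local check one could run over extracted entity names (the function name is mine, not a Nebula API):

```python
def fits_fixed_string_vid(vid, max_bytes=32):
    """Return True if `vid` is usable as a VID in a fixed_string(32) space:
    it must be a str whose UTF-8 encoding is between 1 and 32 bytes."""
    return isinstance(vid, str) and 0 < len(vid.encode("utf-8")) <= max_bytes

ok_short = fits_fixed_string_vid("站点A")            # 7 UTF-8 bytes, fits
bad_int = fits_fixed_string_vid(12345)               # ints invalid for a string vid_type
bad_long = fits_fixed_string_vid("航空保障案例" * 3)  # 54 UTF-8 bytes > 32
```

If the LLM extracts long Chinese entity names as keywords, any graph query built from them would hit this limit regardless of the retriever code.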
The code is roughly as follows:
space_name = "demospace"
edge_types = ["事故发生时的天气是","作战任务是","包含气象元有",
"包含站点有","包含资料有","服务接口类型包含服务接口有","服务接口类型是",
"服务接口隶属资料是","模型分类包含","模型所属分类",
"气象元隶属资料是","类型是","航线包含","航迹点属于",
"资料包含服务接口有","资料包含的气象元有","资料包含的站点有",
"隶属站网是","隶属资料是"]
# default; can be omitted if creating from an empty kg
rel_prop_names = ["name","name","name","name","name","name",
"name","name","name","name","name","name",
"name","name","name","name","name","name",
"name"]
tags, tag_prop_names = getProp()
# NOTE: the literal list below overwrites the tags returned by getProp(),
# while tag_prop_names still comes from getProp(); if the two lists end up
# with different lengths, NebulaGraphStore cannot pair them index by index.
tags = ["weather_concept", "wuqi", "专利", "书", "会议论文", "其他现象",
"地面凝结现象", "学位论文",
"新闻资讯", "服务接口", "服务接口类型",
"期刊论文", "气象元", "气象元类型", "站点", "站网",
"算法模型", "算法模型分类", "航空保障案例", "航线信息",
"航迹点信息", "行业标准", "视程障碍现象", "资料", "降水现象", "雷电现象"]  # default; can be omitted if creating from an empty kg
os.environ["NEBULA_USER"] = "root"
os.environ["NEBULA_PASSWORD"] = "123456"
os.environ["NEBULA_ADDRESS"] = "10.2.27.124:9669"
os_name = platform.system()
clear_command = 'cls' if os_name == 'Windows' else 'clear'
llm = ChatGML()
# load embedding
emb_path = '/home/kg_llm/llm/text2vec-large-chinese'
embedding_model = HuggingFaceEmbedding(emb_path)
Settings.embed_model = embedding_model
# create service
service_context = ServiceContext.from_defaults(llm=llm, embed_model=embedding_model)
graph_store = NebulaGraphStore(
space_name=space_name,
edge_types=edge_types,
rel_prop_names=rel_prop_names,
tags=tags,
tag_prop_names=tag_prop_names,
)
storage_context = StorageContext.from_defaults(graph_store=graph_store)
graph_rag_retriever = KnowledgeGraphRAGRetriever(
storage_context=storage_context,
service_context=service_context,
llm=llm,
verbose=True,
)
query_engine = RetrieverQueryEngine.from_args(
graph_rag_retriever,
service_context=service_context,
)
# NOTE: the assignment below overwrites the RetrieverQueryEngine just built;
# after this line, query_engine refers only to the KnowledgeGraphQueryEngine.
query_engine = KnowledgeGraphQueryEngine(
storage_context=storage_context,
service_context=service_context,
llm=llm,
verbose=True,
)
LONG_NEBULAGRAPH_NL2CYPHER_PROMPT_TMPL_STR = '\"'  # (full template string omitted from the paste)
T2C_PROMPT = PromptTemplate(
LONG_NEBULAGRAPH_NL2CYPHER_PROMPT_TMPL_STR,
prompt_type=PromptType.TEXT_TO_GRAPH_QUERY,
)
# NOTE: T2C_PROMPT is built but never passed to KnowledgeGraphQueryEngine
# (e.g. via its graph_query_synthesis_prompt argument), so the engine may be
# left without a text-to-graph-query prompt, matching the NoneType error above.
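One local sanity check worth running on the lists in the listing above: NebulaGraphStore pairs each entry of edge_types with the property name at the same index of rel_prop_names (and likewise tags with tag_prop_names), so each pair of lists must be equally long. A stand-alone sketch with shortened stand-in lists (the real ones both have 19 entries):

```python
# Shortened stand-ins for the question's lists; the real edge_types and
# rel_prop_names both have 19 entries.
edge_types = ["事故发生时的天气是", "作战任务是", "包含气象元有"]
rel_prop_names = ["name"] * 3

# NebulaGraphStore reads rel_prop_names[i] as the property of edge_types[i];
# a length mismatch means some edge type has no property name to read.
assert len(edge_types) == len(rel_prop_names)
edge_prop_map = dict(zip(edge_types, rel_prop_names))
```

Here every edge type maps to the "name" property, matching the all-"name" rel_prop_names list in the question.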