Using llama-index, calling KnowledgeGraphIndex.from_documents() raises an error

Using llama-index, calling the KnowledgeGraphIndex.from_documents() method raises an error. The error output is as follows:

Traceback (most recent call last):
  File "D:\code_llm\LLama-Index\lcm3.py", line 56, in <module>
    kg_index = KnowledgeGraphIndex.from_documents(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gogoLcm\.conda\envs\llama-index\Lib\site-packages\llama_index\core\indices\base.py", line 145, in from_documents
    return cls(
           ^^^^
  File "C:\Users\gogoLcm\.conda\envs\llama-index\Lib\site-packages\llama_index\core\indices\knowledge_graph\base.py", line 109, in __init__
    super().__init__(
  File "C:\Users\gogoLcm\.conda\envs\llama-index\Lib\site-packages\llama_index\core\indices\base.py", line 94, in __init__
    index_struct = self.build_index_from_nodes(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gogoLcm\.conda\envs\llama-index\Lib\site-packages\llama_index\core\indices\base.py", line 216, in build_index_from_nodes
    return self._build_index_from_nodes(nodes, **build_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gogoLcm\.conda\envs\llama-index\Lib\site-packages\llama_index\core\indices\knowledge_graph\base.py", line 215, in _build_index_from_nodes
    triplets = self._extract_triplets(
               ^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gogoLcm\.conda\envs\llama-index\Lib\site-packages\llama_index\core\indices\knowledge_graph\base.py", line 159, in _extract_triplets
    return self._llm_extract_triplets(text)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gogoLcm\.conda\envs\llama-index\Lib\site-packages\llama_index\core\indices\knowledge_graph\base.py", line 163, in _llm_extract_triplets
    response = self._llm.predict(
               ^^^^^^^^^^^^^^^^^^
  File "C:\Users\gogoLcm\.conda\envs\llama-index\Lib\site-packages\llama_index\core\instrumentation\dispatcher.py", line 230, in wrapper
    result = func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gogoLcm\.conda\envs\llama-index\Lib\site-packages\llama_index\core\llms\llm.py", line 559, in predict
    chat_response = self.chat(messages)
                    ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gogoLcm\.conda\envs\llama-index\Lib\site-packages\llama_index\core\instrumentation\dispatcher.py", line 230, in wrapper
    result = func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gogoLcm\.conda\envs\llama-index\Lib\site-packages\llama_index\core\llms\callbacks.py", line 172, in wrapped_llm_chat
    f_return_val = f(_self, messages, **kwargs)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gogoLcm\.conda\envs\llama-index\Lib\site-packages\llama_index\llms\ollama\base.py", line 261, in chat
    response = self.client.chat(
               ^^^^^^^^^^^^^^^^^
  File "C:\Users\gogoLcm\.conda\envs\llama-index\Lib\site-packages\ollama\_client.py", line 235, in chat
    return self._request_stream(
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gogoLcm\.conda\envs\llama-index\Lib\site-packages\ollama\_client.py", line 98, in _request_stream
    return self._stream(*args, **kwargs) if stream else self._request(*args, **kwargs).json()
                                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gogoLcm\.conda\envs\llama-index\Lib\site-packages\ollama\_client.py", line 74, in _request
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError
Exception ignored in: <function NebulaGraphStore.__del__ at 0x000002D6D22E2840>
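
A bare ollama._types.ResponseError like the one at the bottom hides the HTTP status and body returned by the Ollama server. A minimal debugging sketch, assuming the default server on localhost:11434 (the model name "llama3" here is a placeholder; use whatever model your llama-index Ollama LLM is configured with): calling the server directly with the same ollama client usually surfaces the real error, e.g. a 404 when the model has not been pulled yet.

import ollama

# Same client library the llama-index integration calls under the hood
# (assumes the default host; adjust if yours differs).
client = ollama.Client(host="http://localhost:11434")

# List the models actually available on the server.
print([m["name"] for m in client.list()["models"]])

try:
    # "llama3" is a placeholder model name for this sketch.
    resp = client.chat(
        model="llama3",
        messages=[{"role": "user", "content": "ping"}],
    )
    print(resp["message"]["content"])
except ollama.ResponseError as e:
    # Unlike the bare traceback above, this prints the status and body.
    print(e.status_code, e.error)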

My code:

from llama_index.core import (
    KnowledgeGraphIndex,
    SimpleDirectoryReader,
    StorageContext,
)
from llama_index.graph_stores.nebula import NebulaGraphStore

# space_name, edge_types, rel_prop_names and tags are defined
# earlier in the script (omitted here).

# Load the documents
documents = SimpleDirectoryReader(
    "data"
).load_data()
# print('>>>  documents : ', documents)

# Initialize the NebulaGraph store
graph_store = NebulaGraphStore(
    space_name=space_name,
    edge_types=edge_types,
    rel_prop_names=rel_prop_names,
    tags=tags,
)
# print('>>>  graph_store : ', graph_store)

# Create the storage context
storage_context = StorageContext.from_defaults(graph_store=graph_store)
# print('>>>  storage_context : ', storage_context)

# Build the knowledge graph index
kg_index = KnowledgeGraphIndex.from_documents(
    documents,
    storage_context=storage_context,
    max_triplets_per_chunk=10,
    include_embeddings=True,
    # space_name=space_name,
    # edge_types=edge_types,
    # rel_prop_names=rel_prop_names,
    # tags=tags,
)
print('>>>  kg_index : ', kg_index)
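
The snippet above does not show how the LLM is configured, even though the traceback dies inside the Ollama integration. For context, a minimal sketch of the usual wiring with the installed llama-index-llms-ollama package; the model name is a placeholder and must match a model already pulled on the Ollama server (e.g. via "ollama pull llama3"):

from llama_index.core import Settings
from llama_index.llms.ollama import Ollama

# "llama3" is a placeholder; it must name a model that already
# exists on the Ollama server, otherwise chat() fails with an HTTP
# error that surfaces as the ResponseError in the traceback above.
Settings.llm = Ollama(model="llama3", request_timeout=120.0)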

Versions:

llama-cloud                             0.0.11
llama-index                             0.10.58
llama-index-agent-openai                0.2.9
llama-index-cli                         0.1.13
llama-index-core                        0.10.58
llama-index-embeddings-huggingface      0.2.2
llama-index-embeddings-ollama           0.1.2
llama-index-embeddings-openai           0.1.11
llama-index-graph-stores-nebula         0.2.0
llama-index-indices-managed-llama-cloud 0.2.7
llama-index-legacy                      0.9.48
llama-index-llms-ollama                 0.2.2
llama-index-llms-openai                 0.1.27
llama-index-multi-modal-llms-openai     0.1.8
llama-index-program-openai              0.1.7
llama-index-question-gen-openai         0.1.3
llama-index-readers-file                0.1.31
llama-index-readers-llama-parse         0.1.6
llama-parse                             0.4.9
ollama                                  0.3.0

Has anyone here hit this problem while debugging? How did you solve it? Thanks!

This looks like an error on the Ollama side; have you tried switching to a different LLM service?
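
For instance, a minimal sketch of swapping in the already-installed llama-index-llms-openai integration (assumes an OPENAI_API_KEY environment variable; the model name is just an example):

import os
from llama_index.core import Settings
from llama_index.llms.openai import OpenAI

# Assumes OPENAI_API_KEY is set in the environment.
assert "OPENAI_API_KEY" in os.environ
Settings.llm = OpenAI(model="gpt-3.5-turbo")

If the index builds end to end with this, the problem is isolated to the Ollama setup rather than the KnowledgeGraphIndex code.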


Thank you for your reply. I have already solved this problem.


This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.