
[Bug]: I tried the official ragtest demo in versions 0.4.0 and 0.5.0 with GPT-4o-mini, but this issue appears and I don't know how to resolve it. #1448

Open
zhu-peiqi opened this issue Nov 26, 2024 · 2 comments
Labels: awaiting_response, bug, triage

Comments

@zhu-peiqi

Do you need to file an issue?

  • I have searched the existing issues and this bug is not already filed.
  • My model is hosted on OpenAI or Azure. If not, please look at the "model providers" issue and don't file a new one here.
  • I believe this is a legitimate bug, not just a question. If this is a question, please use the Discussions area.

Describe the bug

Traceback (most recent call last):
File "E:\anaconda3\envs\graphrag\lib\site-packages\httpx\_transports\default.py", line 72, in map_httpcore_exceptions
yield
File "E:\anaconda3\envs\graphrag\lib\site-packages\httpx\_transports\default.py", line 377, in handle_async_request
resp = await self._pool.handle_async_request(req)
File "E:\anaconda3\envs\graphrag\lib\site-packages\httpcore\_async\connection_pool.py", line 256, in handle_async_request
raise exc from None
File "E:\anaconda3\envs\graphrag\lib\site-packages\httpcore\_async\connection_pool.py", line 236, in handle_async_request
response = await connection.handle_async_request(
File "E:\anaconda3\envs\graphrag\lib\site-packages\httpcore\_async\connection.py", line 101, in handle_async_request
raise exc
File "E:\anaconda3\envs\graphrag\lib\site-packages\httpcore\_async\connection.py", line 78, in handle_async_request
stream = await self._connect(request)
File "E:\anaconda3\envs\graphrag\lib\site-packages\httpcore\_async\connection.py", line 124, in _connect
stream = await self._network_backend.connect_tcp(**kwargs)
File "E:\anaconda3\envs\graphrag\lib\site-packages\httpcore\_backends\auto.py", line 31, in connect_tcp
return await self._backend.connect_tcp(
File "E:\anaconda3\envs\graphrag\lib\site-packages\httpcore\_backends\anyio.py", line 113, in connect_tcp
with map_exceptions(exc_map):
File "E:\anaconda3\envs\graphrag\lib\contextlib.py", line 153, in __exit__
self.gen.throw(typ, value, traceback)
File "E:\anaconda3\envs\graphrag\lib\site-packages\httpcore\_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.ConnectError: All connection attempts failed

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "E:\anaconda3\envs\graphrag\lib\site-packages\openai\_base_client.py", line 1572, in _request
response = await self._client.send(
File "E:\anaconda3\envs\graphrag\lib\site-packages\httpx\_client.py", line 1674, in send
response = await self._send_handling_auth(
File "E:\anaconda3\envs\graphrag\lib\site-packages\httpx\_client.py", line 1702, in _send_handling_auth
response = await self._send_handling_redirects(
File "E:\anaconda3\envs\graphrag\lib\site-packages\httpx\_client.py", line 1739, in _send_handling_redirects
response = await self._send_single_request(request)
File "E:\anaconda3\envs\graphrag\lib\site-packages\httpx\_client.py", line 1776, in _send_single_request
response = await transport.handle_async_request(request)
File "E:\anaconda3\envs\graphrag\lib\site-packages\httpx\_transports\default.py", line 376, in handle_async_request
with map_httpcore_exceptions():
File "E:\anaconda3\envs\graphrag\lib\contextlib.py", line 153, in __exit__
self.gen.throw(typ, value, traceback)
File "E:\anaconda3\envs\graphrag\lib\site-packages\httpx\_transports\default.py", line 89, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.ConnectError: All connection attempts failed

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\index\graph\extractors\graph\graph_extractor.py", line 125, in __call__
result = await self._process_document(text, prompt_variables)
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\index\graph\extractors\graph\graph_extractor.py", line 153, in _process_document
response = await self._llm(
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\llm\openai\json_parsing_llm.py", line 34, in __call__
result = await self._delegate(input, **kwargs)
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\llm\openai\openai_token_replacing_llm.py", line 37, in __call__
return await self._delegate(input, **kwargs)
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\llm\openai\openai_history_tracking_llm.py", line 33, in __call__
output = await self._delegate(input, **kwargs)
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\llm\base\caching_llm.py", line 96, in __call__
result = await self._delegate(input, **kwargs)
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\llm\base\rate_limiting_llm.py", line 177, in __call__
result, start = await execute_with_retry()
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\llm\base\rate_limiting_llm.py", line 159, in execute_with_retry
async for attempt in retryer:
File "E:\anaconda3\envs\graphrag\lib\site-packages\tenacity\asyncio\__init__.py", line 166, in __anext__
do = await self.iter(retry_state=self.retry_state)
File "E:\anaconda3\envs\graphrag\lib\site-packages\tenacity\asyncio\__init__.py", line 153, in iter
result = await action(retry_state)
File "E:\anaconda3\envs\graphrag\lib\site-packages\tenacity\_utils.py", line 99, in inner
return call(*args, **kwargs)
File "E:\anaconda3\envs\graphrag\lib\site-packages\tenacity\__init__.py", line 418, in exc_check
raise retry_exc.reraise()
File "E:\anaconda3\envs\graphrag\lib\site-packages\tenacity\__init__.py", line 185, in reraise
raise self.last_attempt.result()
File "E:\anaconda3\envs\graphrag\lib\concurrent\futures\_base.py", line 438, in result
return self.__get_result()
File "E:\anaconda3\envs\graphrag\lib\concurrent\futures\_base.py", line 390, in __get_result
raise self._exception
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\llm\base\rate_limiting_llm.py", line 165, in execute_with_retry
return await do_attempt(), start
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\llm\base\rate_limiting_llm.py", line 147, in do_attempt
return await self._delegate(input, **kwargs)
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\llm\base\base_llm.py", line 50, in __call__
return await self._invoke(input, **kwargs)
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\llm\base\base_llm.py", line 54, in _invoke
output = await self._execute_llm(input, **kwargs)
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\llm\openai\openai_chat_llm.py", line 53, in _execute_llm
completion = await self.client.chat.completions.create(
File "E:\anaconda3\envs\graphrag\lib\site-packages\openai\resources\chat\completions.py", line 1661, in create
return await self._post(
File "E:\anaconda3\envs\graphrag\lib\site-packages\openai\_base_client.py", line 1839, in post
return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
File "E:\anaconda3\envs\graphrag\lib\site-packages\openai\_base_client.py", line 1533, in request
return await self._request(
File "E:\anaconda3\envs\graphrag\lib\site-packages\openai\_base_client.py", line 1606, in _request
raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.
22:24:40,295 graphrag.callbacks.file_workflow_callbacks INFO Entity Extraction Error details={'doc_index': 0, 'text': "his knees.\n\nThe finger pointed from the grave to him, and back again.\n\n'No, Spirit! Oh no, no!'\n\nThe finger still was there.\n\n'Spirit!' he cried, tight clutching at its robe, 'hear me! I am not the\nman I was. I will not be the man I must have been but for this\nintercourse. Why show me this, if I am past all hope?'\n\nFor the first time the hand appeared to shake.\n\n'Good Spirit,' he pursued, as down upon the ground he fell before it,\n'your nature intercedes for me, and pities me. Assure me that I yet may\nchange these shadows you have shown me by an altered life?'\n\nThe kind hand trembled.\n\n'I will honour Christmas in my heart, and try to keep it all the year. I\nwill live in the Past, the Present, and the Future. The Spirits of all\nThree shall strive within me. I will not shut out the lessons that they\nteach. Oh, tell me I may sponge away the writing on this stone!'\n\nIn his agony he caught the spectral hand. It sought to free itself, but\nhe was strong in his entreaty, and detained it. The Spirit stronger yet,\nrepulsed him.\n\nHolding up his hands in a last prayer to have his fate reversed, he saw\nan alteration in the Phantom's hood and dress. It shrunk, collapsed, and\ndwindled down into a bedpost.\n\n\nSTAVE FIVE\n\n\n[Illustration]\n\n\n\n\nTHE END OF IT\n\n\nYes! and the bedpost was his own. The bed was his own, the room was his\nown. Best and happiest of all, the Time before him was his own, to make\namends in!\n\n'I will live in the Past, the Present, and the Future!' Scrooge repeated\nas he scrambled out of bed. 'The Spirits of all Three shall strive\nwithin me. O Jacob Marley! Heaven and the Christmas Time be praised for\nthis! I say it on my knees, old Jacob; on my knees!'\n\nHe was so fluttered and so glowing with his good intentions, that his\nbroken voice would scarcely answer to his call. 
He had been sobbing\nviolently in his conflict with the Spirit, and his face was wet with\ntears.\n\n'They are not torn down,' cried Scrooge, folding one of his bed-curtains\nin his arms, 'They are not torn down, rings and all. They are here--I am\nhere--the shadows of the things that would have been may be dispelled.\nThey will be. I know they will!'\n\nHis hands were busy with his garments all this time: turning them inside\nout, putting them on upside down, tearing them, mislaying them, making\nthem parties to every kind of extravagance.\n\n'I don't know what to do!' cried Scrooge, laughing and crying in the\nsame breath, and making a perfect Laocoon of himself with his stockings.\n'I am as light as a feather, I am as happy as an angel, I am as merry as\na schoolboy, I am as giddy as a drunken man. A merry Christmas to\neverybody! A happy New Year to all the world! Hallo here! Whoop! Hallo!'\n\nHe had frisked into the sitting-room, and was now standing there,\nperfectly winded.\n\n'There's the saucepan that the gruel was in!' cried Scrooge, starting\noff again, and going round the fireplace. 'There's the door by which the\nGhost of Jacob Marley entered! There's the corner where the Ghost of\nChristmas Present sat! There's the window where I saw the wandering\nSpirits! It's all right, it's all true, it all happened. Ha, ha, ha!'\n\nReally, for a man who had been out of practice for so many years, it was\na splendid laugh, a most illustrious laugh. The father of a long, long\nline of brilliant laughs!\n\n'I don't know what day of the month it is,' said Scrooge. 'I don't know\nhow long I have been among the Spirits. I don't know anything. I'm quite\na baby. Never mind. I don't care. I'd rather be a baby. Hallo! Whoop!\nHallo here!'\n\nHe was checked in his transports by the churches ringing out the\nlustiest peals he had ever heard. Clash, clash, hammer; ding, dong,\nbell! Bell, dong, ding; hammer, clash, clash! 
Oh, glorious, glorious!\n\nRunning to the window, he opened it, and put out his head. No fog, no\nmist; clear, bright, jovial, stirring, cold; cold, piping for the blood\nto dance to; golden sunlight; heavenly sky; sweet fresh air; merry\nbells. Oh, glorious! Glorious!\n\n'What's to-day?' cried Scrooge, calling downward to a boy in Sunday\nclothes, who perhaps had loitered in to look about him.\n\n'EH?' returned the boy with all his might of wonder.\n\n'What's to-day, my fine fellow?' said Scrooge.\n\n'To-day!' replied the boy. 'Why, CHRISTMAS DAY.'\n\n'It's Christmas Day!' said Scrooge to himself. 'I haven't missed it. The\nSpirits have done it all in one night. They can do anything they like.\nOf course they can. Of course they can. Hallo, my fine fellow!'\n\n'Hallo!' returned the boy.\n\n'Do you know the poulterer's in the next street but one"}
22:24:40,445 graphrag.index.operations.cluster_graph WARNING Graph has no nodes
22:24:40,454 datashaper.workflow.workflow ERROR Error executing verb "create_base_entity_graph" in create_base_entity_graph: Columns must be same length as key
Traceback (most recent call last):
File "E:\anaconda3\envs\graphrag\lib\site-packages\datashaper\workflow\workflow.py", line 415, in _execute_verb
result = await result
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\index\workflows\v1\subflows\create_base_entity_graph.py", line 51, in create_base_entity_graph
output = await create_base_entity_graph_flow(
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\index\flows\create_base_entity_graph.py", line 77, in create_base_entity_graph
clustered = cluster_graph(
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\index\operations\cluster_graph.py", line 89, in cluster_graph
output[[level_to, to]] = pd.DataFrame(output[to].tolist(), index=output.index)
File "E:\anaconda3\envs\graphrag\lib\site-packages\pandas\core\frame.py", line 4299, in __setitem__
self._setitem_array(key, value)
File "E:\anaconda3\envs\graphrag\lib\site-packages\pandas\core\frame.py", line 4341, in _setitem_array
check_key_length(self.columns, key, value)
File "E:\anaconda3\envs\graphrag\lib\site-packages\pandas\core\indexers\utils.py", line 390, in check_key_length
raise ValueError("Columns must be same length as key")
ValueError: Columns must be same length as key
22:24:40,548 graphrag.callbacks.file_workflow_callbacks INFO Error executing verb "create_base_entity_graph" in create_base_entity_graph: Columns must be same length as key details=None
22:24:40,548 graphrag.index.run.run ERROR error running workflow create_base_entity_graph
Traceback (most recent call last):
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\index\run\run.py", line 273, in run_pipeline
result = await _process_workflow(
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\index\run\workflow.py", line 105, in _process_workflow
result = await workflow.run(context, callbacks)
File "E:\anaconda3\envs\graphrag\lib\site-packages\datashaper\workflow\workflow.py", line 369, in run
timing = await self._execute_verb(node, context, callbacks)
File "E:\anaconda3\envs\graphrag\lib\site-packages\datashaper\workflow\workflow.py", line 415, in _execute_verb
result = await result
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\index\workflows\v1\subflows\create_base_entity_graph.py", line 51, in create_base_entity_graph
output = await create_base_entity_graph_flow(
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\index\flows\create_base_entity_graph.py", line 77, in create_base_entity_graph
clustered = cluster_graph(
File "F:\Python_Project\AI\LLM\RAG\graphrag\graphrag\index\operations\cluster_graph.py", line 89, in cluster_graph
output[[level_to, to]] = pd.DataFrame(output[to].tolist(), index=output.index)
File "E:\anaconda3\envs\graphrag\lib\site-packages\pandas\core\frame.py", line 4299, in __setitem__
self._setitem_array(key, value)
File "E:\anaconda3\envs\graphrag\lib\site-packages\pandas\core\frame.py", line 4341, in _setitem_array
check_key_length(self.columns, key, value)
File "E:\anaconda3\envs\graphrag\lib\site-packages\pandas\core\indexers\utils.py", line 390, in check_key_length
raise ValueError("Columns must be same length as key")
ValueError: Columns must be same length as key
22:24:40,550 graphrag.callbacks.file_workflow_callbacks INFO Error running pipeline! details=None
22:24:40,575 graphrag.cli.index ERROR Errors occurred during the pipeline run, see logs for more details.
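For context on the second error in the logs: the `ValueError: Columns must be same length as key` is a downstream symptom of the connection failure, not a separate bug. Every LLM call failed, so no entities were extracted, the graph is empty ("Graph has no nodes"), and the two-column assignment in `cluster_graph.py` then receives a zero-column DataFrame. A minimal standalone repro of that pandas failure mode (a sketch using hypothetical column names, not the graphrag code itself):

```python
import pandas as pd

# When entity extraction produced nothing, the "communities" column is empty,
# so expanding it yields a DataFrame with zero columns while the assignment
# key expects two -- pandas then raises the error seen in the logs.
output = pd.DataFrame({"communities": []})

try:
    output[["level", "communities"]] = pd.DataFrame(
        output["communities"].tolist(), index=output.index
    )
except ValueError as err:
    print(err)  # Columns must be same length as key
```

So fixing the connectivity problem should make this error disappear as well.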

Steps to reproduce

No response

Expected Behavior

No response

GraphRAG Config Used

# Paste your config here

Logs and screenshots

No response

Additional Information

  • GraphRAG Version:
  • Operating System:
  • Python Version:
  • Related Issues:
@zhu-peiqi added the bug and triage labels Nov 26, 2024
@natoverse
Collaborator

Can you paste the config you are using (make sure to delete any API keys)? It appears that GraphRAG cannot connect to your configured LLM
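`httpcore.ConnectError: All connection attempts failed` means the TCP connection itself never succeeded, which usually points at DNS, firewall, or proxy trouble on the machine rather than a graphrag bug. A quick stdlib-only sketch to check whether the endpoint is reachable at all (the host/port are whatever your `llm.api_base` resolves to; `api.openai.com:443` is the default for `type: openai_chat`):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # False here means the failure is at the network level,
    # before graphrag or the openai client are involved.
    print(can_connect("api.openai.com", 443))
```

Note this only tests raw reachability; it deliberately bypasses any proxy configuration, so a False result combined with a working browser often indicates the Python process is not picking up your proxy settings.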

@natoverse added the awaiting_response label Nov 26, 2024
@zhu-peiqi
Author

Can you paste the config you are using (make sure to delete any API keys)? It appears that GraphRAG cannot connect to your configured LLM

Sure. My config is as follows:

### This config file contains required core defaults that must be set, along with a handful of common optional settings.
### For a full list of available settings, see https://microsoft.github.io/graphrag/config/yaml/

### LLM settings ###
## There are a number of settings to tune the threading and token limits for LLM calls - check the docs.

encoding_model: cl100k_base # this needs to be matched to your model!

llm:
  api_key: ${GRAPHRAG_API_KEY} # set this in the generated .env file
  type: openai_chat # or azure_openai_chat
  model: gpt-4o-mini
  model_supports_json: true # recommended if this is available for your model.
  # audience: "https://cognitiveservices.azure.com/.default"
  # api_base: https://<instance>.openai.azure.com
  # api_version: 2024-02-15-preview
  # organization: <organization_id>
  # deployment_name: <azure_model_deployment_name>

parallelization:
  stagger: 0.3
  # num_threads: 50

async_mode: threaded # or asyncio

embeddings:
  async_mode: threaded # or asyncio
  vector_store:
    type: lancedb
    db_uri: 'output\lancedb'
    container_name: default
    overwrite: true
  llm:
    api_key: ${GRAPHRAG_API_KEY}
    type: openai_embedding # or azure_openai_embedding
    model: text-embedding-3-small
    # api_base: https://<instance>.openai.azure.com
    # api_version: 2024-02-15-preview
    # audience: "https://cognitiveservices.azure.com/.default"
    # organization: <organization_id>
    # deployment_name: <azure_model_deployment_name>

### Input settings ###

input:
  type: file # or blob
  file_type: text # or csv
  base_dir: "input"
  file_encoding: utf-8
  file_pattern: ".*\.txt$"

chunks:
  size: 1200
  overlap: 100
  group_by_columns: [id]

### Storage settings ###
## If blob storage is specified in the following four sections,
## connection_string and container_name must be provided

cache:
  type: file # or blob
  base_dir: "cache"

reporting:
  type: file # or console, blob
  base_dir: "logs"

storage:
  type: file # or blob
  base_dir: "output"

## only turn this on if running `graphrag index` with custom settings
## we normally use `graphrag update` with the defaults
# update_index_storage:
#   type: file # or blob
#   base_dir: "update_output"

### Workflow settings ###

skip_workflows: []

entity_extraction:
  prompt: "prompts/entity_extraction.txt"
  entity_types: [organization, person, geo, event]
  max_gleanings: 1

summarize_descriptions:
  prompt: "prompts/summarize_descriptions.txt"
  max_length: 500

claim_extraction:
  enabled: false
  prompt: "prompts/claim_extraction.txt"
  description: "Any claims or facts that could be relevant to information discovery."
  max_gleanings: 1

community_reports:
  prompt: "prompts/community_report.txt"
  max_length: 2000
  max_input_length: 8000

cluster_graph:
  max_cluster_size: 10

embed_graph:
  enabled: false # if true, will generate node2vec embeddings for nodes

umap:
  enabled: false # if true, will generate UMAP embeddings for nodes

snapshots:
  graphml: false
  raw_entities: false
  top_level_nodes: false
  embeddings: false
  transient: false

### Query settings ###
## The prompt locations are required here, but each search method has a number of optional knobs that can be tuned.
## See the config docs: https://microsoft.github.io/graphrag/config/yaml/#query

local_search:
  prompt: "prompts/local_search_system_prompt.txt"

global_search:
  map_prompt: "prompts/global_search_map_system_prompt.txt"
  reduce_prompt: "prompts/global_search_reduce_system_prompt.txt"
  knowledge_prompt: "prompts/global_search_knowledge_system_prompt.txt"

drift_search:
  prompt: "prompts/drift_search_system_prompt.txt"
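Since the chat `llm` block uses plain `type: openai_chat` with no `api_base`, requests go straight to `api.openai.com`, and connection failures there are frequently caused by proxy settings the Python process does or does not see. httpx (which the openai client uses under the hood) honors the standard proxy environment variables by default, so a quick way to see what it will pick up is a stdlib sketch like this (the example proxy URL in the test is hypothetical):

```python
import os

# The proxy environment variables httpx reads when trust_env is enabled
# (the default). A stale or missing value here is a common cause of
# "All connection attempts failed".
PROXY_VARS = (
    "HTTP_PROXY", "HTTPS_PROXY", "ALL_PROXY", "NO_PROXY",
    "http_proxy", "https_proxy", "all_proxy", "no_proxy",
)

def proxy_env() -> dict:
    """Return the proxy-related environment variables that are currently set."""
    return {name: os.environ[name] for name in PROXY_VARS if name in os.environ}

if __name__ == "__main__":
    print(proxy_env() or "no proxy variables set")
```

Running this in the same environment as `graphrag index` shows whether the indexing process actually inherits the proxy configuration your browser or terminal uses.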
