[Bug]: error': {'message': 'POST predict: Post "http://127.0.0.1:36767/completion #1440

Open · 3 tasks done
LuWei6896 opened this issue Nov 25, 2024 · 0 comments
Labels: bug (Something isn't working) · triage (default label assignment; indicates a new issue that needs review by a maintainer)

LuWei6896 commented Nov 25, 2024

Do you need to file an issue?

  • I have searched the existing issues and this bug is not already filed.
  • My model is hosted on OpenAI or Azure. If not, please look at the "model providers" issue and don't file a new one here.
  • I believe this is a legitimate bug, not just a question. If this is a question, please use the Discussions area.

Describe the bug

The indexing run configured with the settings.yaml below (see "GraphRAG Config Used") repeatedly fails with "Error Invoking LLM". The error log contains entries like these:
{
"type": "error",
"data": "Error Invoking LLM",
"stack": "Traceback (most recent call last):\n File "D:\librarys\Lib\site-packages\graphrag\llm\base\base_llm.py", line 54, in _invoke\n output = await self._execute_llm(input, **kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File "D:\librarys\Lib\site-packages\graphrag\llm\openai\openai_chat_llm.py", line 53, in _execute_llm\n completion = await self.client.chat.completions.create(\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File "D:\librarys\Lib\site-packages\openai\resources\chat\completions.py", line 1661, in create\n return await self._post(\n ^^^^^^^^^^^^^^^^^\n File "D:\librarys\Lib\site-packages\openai\_base_client.py", line 1839, in post\n return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File "D:\librarys\Lib\site-packages\openai\_base_client.py", line 1533, in request\n return await self._request(\n ^^^^^^^^^^^^^^^^^^^^\n File "D:\librarys\Lib\site-packages\openai\_base_client.py", line 1634, in _request\n raise self._make_status_error_from_response(err.response) from None\nopenai.InternalServerError: Error code: 500 - {'error': {'message': 'POST predict: Post "http://127.0.0.1:36767/completion\": EOF', 'type': 'api_error', 'param': None, 'code': None}}\n",
"source": "Error code: 500 - {'error': {'message': 'POST predict: Post "http://127.0.0.1:36767/completion\": EOF', 'type': 'api_error', 'param': None, 'code': None}}",
"details": {
"input": "\n-Goal-\nGiven a text document that is potentially relevant to this activity and a list of entity types, identify all entities of those types from the text and all relationships among the identified entities.\n \n-Steps-\n1. Identify all entities. For each identified entity, extract the following information:\n- entity_name: Name of the entity, capitalized\n- entity_type: One of the following types: [organization,person,geo,event]\n- entity_description: Comprehensive description of the entity's attributes and activities\nFormat each entity as ("entity"<|><entity_name><|><entity_type><|><entity_description>)\n \n2. From the entities identified in step 1, identify all pairs of (source_entity, target_entity) that are clearly related to each other.\nFor each pair of related entities, extract the following information:\n- source_entity: name of the source entity, as identified in step 1\n- target_entity: name of the target entity, as identified in step 1\n- relationship_description: explanation as to why you think the source entity and the target entity are related to each other\n- relationship_strength: a numeric score indicating strength of the relationship between the source entity and target entity\n Format each relationship as ("relationship"<|><source_entity><|><target_entity><|><relationship_description><|><relationship_strength>)\n \n3. Return output in English as a single list of all the entities and relationships identified in steps 1 and 2. Use ## as the list delimiter.\n \n4. When finished, output <|COMPLETE|>\n \n######################\n-Examples-\n######################\nExample 1:\nEntity_types: ORGANIZATION,PERSON\nText:\nThe Verdantis's Central Institution is scheduled to meet on Monday and Thursday, with the institution planning to release its latest policy decision on Thursday at 1:30 p.m. PDT, followed by a press conference where Central Institution Chair Martin Smith will take questions. Investors expect the Market Strategy Committee to hold its benchmark interest rate steady in a range of 3.5%-3.75%.\n######################\nOutput:\n("entity"<|>CENTRAL INSTITUTION<|>ORGANIZATION<|>The Central Institution is the Federal Reserve of Verdantis, which is setting interest rates on Monday and Thursday)\n##\n("entity"<|>MARTIN SMITH<|>PERSON<|>Martin Smith is the chair of the Central Institution)\n##\n("entity"<|>MARKET STRATEGY COMMITTEE<|>ORGANIZATION<|>The Central Institution committee makes key decisions about interest rates and the growth of Verdantis's money supply)\n##\n("relationship"<|>MARTIN SMITH<|>CENTRAL INSTITUTION<|>Martin Smith is the Chair of the Central Institution and will answer questions at a press conference<|>9)\n<|COMPLETE|>\n\n######################\nExample 2:\nEntity_types: ORGANIZATION\nText:\nTechGlobal's (TG) stock skyrocketed in its opening day on the Global Exchange Thursday. But IPO experts warn that the semiconductor corporation's debut on the public markets isn't indicative of how other newly listed companies may perform.\n\nTechGlobal, a formerly public company, was taken private by Vision Holdings in 2014. 
The well-established chip designer says it powers 85% of premium smartphones.\n######################\nOutput:\n("entity"<|>TECHGLOBAL<|>ORGANIZATION<|>TechGlobal is a stock now listed on the Global Exchange which powers 85% of premium smartphones)\n##\n("entity"<|>VISION HOLDINGS<|>ORGANIZATION<|>Vision Holdings is a firm that previously owned TechGlobal)\n##\n("relationship"<|>TECHGLOBAL<|>VISION HOLDINGS<|>Vision Holdings formerly owned TechGlobal from 2014 until present<|>5)\n<|COMPLETE|>\n\n######################\nExample 3:\nEntity_types: ORGANIZATION,GEO,PERSON\nText:\nFive Aurelians jailed for 8 years in Firuzabad and widely regarded as hostages are on their way home to Aurelia.\n\nThe swap orchestrated by Quintara was finalized when $8bn of Firuzi funds were transferred to financial institutions in Krohaara, the capital of Quintara.\n\nThe exchange initiated in Firuzabad's capital, Tiruzia, led to the four men and one woman, who are also Firuzi nationals, boarding a chartered flight to Krohaara.\n\nThey were welcomed by senior Aurelian officials and are now on their way to Aurelia's capital, Cashion.\n\nThe Aurelians include 39-year-old businessman Samuel Namara, who has been held in Tiruzia's Alhamia Prison, as well as journalist Durke Bataglani, 59, and environmentalist Meggie Tazbah, 53, who also holds Bratinas nationality.\n######################\nOutput:\n("entity"<|>FIRUZABAD<|>GEO<|>Firuzabad held Aurelians as hostages)\n##\n("entity"<|>AURELIA<|>GEO<|>Country seeking to release hostages)\n##\n("entity"<|>QUINTARA<|>GEO<|>Country that negotiated a swap of money in exchange for hostages)\n##\n##\n("entity"<|>TIRUZIA<|>GEO<|>Capital of Firuzabad where the Aurelians were being held)\n##\n("entity"<|>KROHAARA<|>GEO<|>Capital city in Quintara)\n##\n("entity"<|>CASHION<|>GEO<|>Capital city in Aurelia)\n##\n("entity"<|>SAMUEL NAMARA<|>PERSON<|>Aurelian who spent time in Tiruzia's Alhamia Prison)\n##\n("entity"<|>ALHAMIA PRISON<|>GEO<|>Prison in Tiruzia)\n##\n("entity"<|>DURKE BATAGLANI<|>PERSON<|>Aurelian journalist who was held hostage)\n##\n("entity"<|>MEGGIE TAZBAH<|>PERSON<|>Bratinas national and environmentalist who was held hostage)\n##\n("relationship"<|>FIRUZABAD<|>AURELIA<|>Firuzabad negotiated a hostage exchange with Aurelia<|>2)\n##\n("relationship"<|>QUINTARA<|>AURELIA<|>Quintara brokered the hostage exchange between Firuzabad and Aurelia<|>2)\n##\n("relationship"<|>QUINTARA<|>FIRUZABAD<|>Quintara brokered the hostage exchange between Firuzabad and Aurelia<|>2)\n##\n("relationship"<|>SAMUEL NAMARA<|>ALHAMIA PRISON<|>Samuel Namara was a prisoner at Alhamia prison<|>8)\n##\n("relationship"<|>SAMUEL NAMARA<|>MEGGIE TAZBAH<|>Samuel Namara and Meggie Tazbah were exchanged in the same hostage release<|>2)\n##\n("relationship"<|>SAMUEL NAMARA<|>DURKE BATAGLANI<|>Samuel Namara and Durke Bataglani were exchanged in the same hostage release<|>2)\n##\n("relationship"<|>MEGGIE TAZBAH<|>DURKE BATAGLANI<|>Meggie Tazbah and Durke Bataglani were exchanged in the same hostage release<|>2)\n##\n("relationship"<|>SAMUEL NAMARA<|>FIRUZABAD<|>Samuel Namara was a hostage in Firuzabad<|>2)\n##\n("relationship"<|>MEGGIE TAZBAH<|>FIRUZABAD<|>Meggie Tazbah was a hostage in Firuzabad<|>2)\n##\n("relationship"<|>DURKE BATAGLANI<|>FIRUZABAD<|>Durke Bataglani was a hostage in Firuzabad<|>2)\n<|COMPLETE|>\n\n######################\n-Real Data-\n######################\nEntity_types: organization,person,geo,event\nText: 廷的衙署,有自己的想法,可不是你落珈山的跟班。\n  
偏偏观音没法在这上面纠缠,总不能说天廷不配吧?她还想再挑辟火罩的毛病,可转念一想,广目天王虽说在天廷供职,出身却是释门,她如果继续质疑,就是打自家耳光了——看来这老神仙绝对是处心积虑,要不天廷那么多有防火法宝的神祇,怎么独独去找广目借呢?\n  观音咬了咬嘴唇,一跺脚,终于说了实话:“李仙师,你这一难安排在哪儿不好,干嘛选一个叫观音禅院的地方!起贪心的还是禅院长老,这不是抹黑我吗?”\n  李长庚心里乐开花,面上却一脸无辜:“您看看舆图,玄奘一过西番国,下一站可不就在观音禅院?可是您交代的,说第九难第十难间隔不要太远。”\n  观音被这一席话噎得哑口无言,活活憋出了青颈法相。李长庚见她哑口无言,笑道:“没啥事我就去殿里啦,这一劫的揭帖还得写呢。” 观音大惊,赶紧拦住他:“老李,缓一缓,缓一缓,这揭帖暂时不能发,真的有损我的名誉啊。”\n  李长庚故作惊讶:“怎么会?这是观音禅院出的事,又不是观音大士您。” 观音急道:“哎呀,仙界什么样你还能不知道?万一被兜率宫的老君藏头去尾、添油加醋一转,就成了我观音指使偷窃袈裟了!”\n  “咳,实在不行,再出个澄清声明嘛。” 李长庚说。观音差点摔了玉净瓶:“谁会看那玩意儿!西王母当年发了多少声明说猴子在蟠桃园只偷过桃,有用吗?老李,你这篇揭帖必须撤下来,不然我去凌霄宝殿说个分明!”\n  见她开始口不择言了,李长庚不慌不忙亮出一份文书:“不劳你去凌霄殿,陛下早有批示。” 观音盯着末尾那先天太极看了一阵,气呼呼道:“我是释门中人,不懂你们玄门的暗语。” 李长庚说:“您看这个太极,阴阳二鱼首尾相衔,周转不休。什么意思呢?这是陛下教诲我等,咱们做事啊,不能顾头不顾腚。”\n  观音这才意识到,能在启明殿干了这么多年的,怎么可能是个单纯的老实人。她迅速调整了一下法相,换成合掌观音,陪着笑脸说:“之前事情多,没顾上沟通,是我不好。现在取经进入正轨,咱们流程上可以正规起来,接下来的护法方略大家一起商量着来。不过这篇揭帖真的影响太坏了,还请老李多帮帮忙。”\n  李长庚见火候差不多了,慢条斯理道:“其实嘛,倒也不是没办法补救。” 观音一听,赶紧请教。李长庚道:“\n######################\nOutput:"
}
}
{
"type": "error",
"data": "Error Invoking LLM",
"stack": "Traceback (most recent call last):\n File "D:\librarys\Lib\site-packages\graphrag\llm\base\base_llm.py", line 54, in _invoke\n output = await self._execute_llm(input, **kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File "D:\librarys\Lib\site-packages\graphrag\llm\openai\openai_chat_llm.py", line 53, in _execute_llm\n completion = await self.client.chat.completions.create(\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File "D:\librarys\Lib\site-packages\openai\resources\chat\completions.py", line 1661, in create\n return await self._post(\n ^^^^^^^^^^^^^^^^^\n File "D:\librarys\Lib\site-packages\openai\_base_client.py", line 1839, in post\n return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File "D:\librarys\Lib\site-packages\openai\_base_client.py", line 1533, in request\n return await self._request(\n ^^^^^^^^^^^^^^^^^^^^\n File "D:\librarys\Lib\site-packages\openai\_base_client.py", line 1634, in _request\n raise self._make_status_error_from_response(err.response) from None\nopenai.InternalServerError: Error code: 500 - {'error': {'message': 'health resp: Get "http://127.0.0.1:36767/health\": read tcp 127.0.0.1:56172->127.0.0.1:36767: read: connection reset by peer', 'type': 'api_error', 'param': None, 'code': None}}\n",
"source": "Error code: 500 - {'error': {'message': 'health resp: Get "http://127.0.0.1:36767/health\": read tcp 127.0.0.1:56172->127.0.0.1:36767: read: connection reset by peer', 'type': 'api_error', 'param': None, 'code': None}}",
"details": {
"input": "\n-Goal-\nGiven a text document that is potentially relevant to this activity and a list of entity types, identify all entities of those types from the text and all relationships among the identified entities.\n \n-Steps-\n1. Identify all entities. For each identified entity, extract the following information:\n- entity_name: Name of the entity, capitalized\n- entity_type: One of the following types: [organization,person,geo,event]\n- entity_description: Comprehensive description of the entity's attributes and activities\nFormat each entity as ("entity"<|><entity_name><|><entity_type><|><entity_description>)\n \n2. From the entities identified in step 1, identify all pairs of (source_entity, target_entity) that are clearly related to each other.\nFor each pair of related entities, extract the following information:\n- source_entity: name of the source entity, as identified in step 1\n- target_entity: name of the target entity, as identified in step 1\n- relationship_description: explanation as to why you think the source entity and the target entity are related to each other\n- relationship_strength: a numeric score indicating strength of the relationship between the source entity and target entity\n Format each relationship as ("relationship"<|><source_entity><|><target_entity><|><relationship_description><|><relationship_strength>)\n \n3. Return output in English as a single list of all the entities and relationships identified in steps 1 and 2. Use ## as the list delimiter.\n \n4. When finished, output <|COMPLETE|>\n \n######################\n-Examples-\n######################\nExample 1:\nEntity_types: ORGANIZATION,PERSON\nText:\nThe Verdantis's Central Institution is scheduled to meet on Monday and Thursday, with the institution planning to release its latest policy decision on Thursday at 1:30 p.m. PDT, followed by a press conference where Central Institution Chair Martin Smith will take questions. Investors expect the Market Strategy Committee to hold its benchmark interest rate steady in a range of 3.5%-3.75%.\n######################\nOutput:\n("entity"<|>CENTRAL INSTITUTION<|>ORGANIZATION<|>The Central Institution is the Federal Reserve of Verdantis, which is setting interest rates on Monday and Thursday)\n##\n("entity"<|>MARTIN SMITH<|>PERSON<|>Martin Smith is the chair of the Central Institution)\n##\n("entity"<|>MARKET STRATEGY COMMITTEE<|>ORGANIZATION<|>The Central Institution committee makes key decisions about interest rates and the growth of Verdantis's money supply)\n##\n("relationship"<|>MARTIN SMITH<|>CENTRAL INSTITUTION<|>Martin Smith is the Chair of the Central Institution and will answer questions at a press conference<|>9)\n<|COMPLETE|>\n\n######################\nExample 2:\nEntity_types: ORGANIZATION\nText:\nTechGlobal's (TG) stock skyrocketed in its opening day on the Global Exchange Thursday. But IPO experts warn that the semiconductor corporation's debut on the public markets isn't indicative of how other newly listed companies may perform.\n\nTechGlobal, a formerly public company, was taken private by Vision Holdings in 2014. 
The well-established chip designer says it powers 85% of premium smartphones.\n######################\nOutput:\n("entity"<|>TECHGLOBAL<|>ORGANIZATION<|>TechGlobal is a stock now listed on the Global Exchange which powers 85% of premium smartphones)\n##\n("entity"<|>VISION HOLDINGS<|>ORGANIZATION<|>Vision Holdings is a firm that previously owned TechGlobal)\n##\n("relationship"<|>TECHGLOBAL<|>VISION HOLDINGS<|>Vision Holdings formerly owned TechGlobal from 2014 until present<|>5)\n<|COMPLETE|>\n\n######################\nExample 3:\nEntity_types: ORGANIZATION,GEO,PERSON\nText:\nFive Aurelians jailed for 8 years in Firuzabad and widely regarded as hostages are on their way home to Aurelia.\n\nThe swap orchestrated by Quintara was finalized when $8bn of Firuzi funds were transferred to financial institutions in Krohaara, the capital of Quintara.\n\nThe exchange initiated in Firuzabad's capital, Tiruzia, led to the four men and one woman, who are also Firuzi nationals, boarding a chartered flight to Krohaara.\n\nThey were welcomed by senior Aurelian officials and are now on their way to Aurelia's capital, Cashion.\n\nThe Aurelians include 39-year-old businessman Samuel Namara, who has been held in Tiruzia's Alhamia Prison, as well as journalist Durke Bataglani, 59, and environmentalist Meggie Tazbah, 53, who also holds Bratinas nationality.\n######################\nOutput:\n("entity"<|>FIRUZABAD<|>GEO<|>Firuzabad held Aurelians as hostages)\n##\n("entity"<|>AURELIA<|>GEO<|>Country seeking to release hostages)\n##\n("entity"<|>QUINTARA<|>GEO<|>Country that negotiated a swap of money in exchange for hostages)\n##\n##\n("entity"<|>TIRUZIA<|>GEO<|>Capital of Firuzabad where the Aurelians were being held)\n##\n("entity"<|>KROHAARA<|>GEO<|>Capital city in Quintara)\n##\n("entity"<|>CASHION<|>GEO<|>Capital city in Aurelia)\n##\n("entity"<|>SAMUEL NAMARA<|>PERSON<|>Aurelian who spent time in Tiruzia's Alhamia Prison)\n##\n("entity"<|>ALHAMIA PRISON<|>GEO<|>Prison in Tiruzia)\n##\n("entity"<|>DURKE BATAGLANI<|>PERSON<|>Aurelian journalist who was held hostage)\n##\n("entity"<|>MEGGIE TAZBAH<|>PERSON<|>Bratinas national and environmentalist who was held hostage)\n##\n("relationship"<|>FIRUZABAD<|>AURELIA<|>Firuzabad negotiated a hostage exchange with Aurelia<|>2)\n##\n("relationship"<|>QUINTARA<|>AURELIA<|>Quintara brokered the hostage exchange between Firuzabad and Aurelia<|>2)\n##\n("relationship"<|>QUINTARA<|>FIRUZABAD<|>Quintara brokered the hostage exchange between Firuzabad and Aurelia<|>2)\n##\n("relationship"<|>SAMUEL NAMARA<|>ALHAMIA PRISON<|>Samuel Namara was a prisoner at Alhamia prison<|>8)\n##\n("relationship"<|>SAMUEL NAMARA<|>MEGGIE TAZBAH<|>Samuel Namara and Meggie Tazbah were exchanged in the same hostage release<|>2)\n##\n("relationship"<|>SAMUEL NAMARA<|>DURKE BATAGLANI<|>Samuel Namara and Durke Bataglani were exchanged in the same hostage release<|>2)\n##\n("relationship"<|>MEGGIE TAZBAH<|>DURKE BATAGLANI<|>Meggie Tazbah and Durke Bataglani were exchanged in the same hostage release<|>2)\n##\n("relationship"<|>SAMUEL NAMARA<|>FIRUZABAD<|>Samuel Namara was a hostage in Firuzabad<|>2)\n##\n("relationship"<|>MEGGIE TAZBAH<|>FIRUZABAD<|>Meggie Tazbah was a hostage in Firuzabad<|>2)\n##\n("relationship"<|>DURKE BATAGLANI<|>FIRUZABAD<|>Durke Bataglani was a hostage in Firuzabad<|>2)\n<|COMPLETE|>\n\n######################\n-Real Data-\n######################\nEntity_types: organization,person,geo,event\nText: 为,玄奘遇到什么劫难,可就怪不得我了。” 李长庚嘿嘿一笑,施展神通,以禅院为中心开始扫视方圆百里。\n  
扫来扫去,真让他找到一只妖精。这是一头黑熊精,正在自家洞府里闭目修炼。李长庚懒得搞化身那一套虚头,直接飘进了洞府之内。\n  这只黑熊精皮毛苦涩,形销骨立,可见修炼得十分辛苦。它忽然看到一位仙人出现在眼前,吓得赶紧下拜。李长庚亲切地把它搀起来,随口询问。黑熊精略带羞涩地说它已成精四百五十多年,如今正努力化去横骨,再熬个五十年就够成仙的资格了。\n  黑熊精一脸憧憬的神情,让李长庚不期然想起了六耳猕猴。他咳了一声,说位列仙班可没那么容易的,但若你能配合我的工作,弄一个位列仙门还是有机会的。\n  黑熊精大喜过望,翻身便拜。李长庚微微一笑,说你附耳过来,然后细细交代了一通。黑熊精听得十分仔细,连连称是。\n  安顿完之后,李长庚拂尘一摆,又降去了观音禅院,仔细安排了一番,眼见着玄奘他们进了禅院休息,这才驾鹤回了九刹山。次日一早,他刚到启明殿,观音已经气急败坏找上门来。\n  “老李!你这第十难是怎么……怎么设计的?”\n  李长庚装糊涂:“就是按锦囊方略来的呀。我这次选的叫自作自受,安排了金池长老觊觎袈裟,纵火烧禅院,孙悟空借了广目天王的辟火罩……”\n  观音板着脸道:“你这一难的设计,干嘛要用那件金澜袈裟?袈裟乃是佛祖亲赐,万一有个闪失可怎么办?” 李长庚知道她是存心找碴,一拍胸脯:“大士放心,锦斓袈裟只是假丢,我派专人看着呢,不会出问题。” 观音一计不成,又挑一刺:“还有啊,你为什么安排孙悟空去找广目天王借辟火罩?简直是画蛇添足!齐天大圣那么大能耐,至于连一把火都解决不了么?您是老资格,怎么会犯这种错误?别人会说我们这一劫渡得太假了,到时候影响了玄奘不说,连佛祖也会尴尬。”\n  李长庚淡淡道:“灵山和天廷对取经大业都很重视,都要体现出关心,这不是您说的嘛。”\n  广目天王职务在南天门,李长庚这一手安排看似多余,其实是向观音点了一下立场——我启明殿是天廷的衙署,有自己的想法,可不是你落珈山的跟班。\n  偏偏观音没法在这上面纠缠,总不能说天廷不配吧?她还想再挑辟火罩的毛病,可转念一想,广目天王虽说在\n######################\nOutput:"
}
}
{
"type": "error",
"data": "Error Invoking LLM",
"stack": "Traceback (most recent call last):\n File "D:\librarys\Lib\site-packages\graphrag\llm\base\base_llm.py", line 54, in _invoke\n output = await self._execute_llm(input, **kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File "D:\librarys\Lib\site-packages\graphrag\llm\openai\openai_chat_llm.py", line 53, in _execute_llm\n completion = await self.client.chat.completions.create(\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File "D:\librarys\Lib\site-packages\openai\resources\chat\completions.py", line 1661, in create\n return await self._post(\n ^^^^^^^^^^^^^^^^^\n File "D:\librarys\Lib\site-packages\openai\_base_client.py", line 1839, in post\n return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File "D:\librarys\Lib\site-packages\openai\_base_client.py", line 1533, in request\n return await self._request(\n ^^^^^^^^^^^^^^^^^^^^\n File "D:\librarys\Lib\site-packages\openai\_base_client.py", line 1634, in _request\n raise self._make_status_error_from_response(err.response) from None\nopenai.InternalServerError: Error code: 500 - {'error': {'message': 'health resp: Get "http://127.0.0.1:36767/health\": dial tcp 127.0.0.1:36767: connect: connection refused', 'type': 'api_error', 'param': None, 'code': None}}\n",
"source": "Error code: 500 - {'error': {'message': 'health resp: Get "http://127.0.0.1:36767/health\": dial tcp 127.0.0.1:36767: connect: connection refused', 'type': 'api_error', 'param': None, 'code': None}}",
"details": {
"input": "\n-Goal-\nGiven a text document that is potentially relevant to this activity and a list of entity types, identify all entities of those types from the text and all relationships among the identified entities.\n \n-Steps-\n1. Identify all entities. For each identified entity, extract the following information:\n- entity_name: Name of the entity, capitalized\n- entity_type: One of the following types: [organization,person,geo,event]\n- entity_description: Comprehensive description of the entity's attributes and activities\nFormat each entity as ("entity"<|><entity_name><|><entity_type><|><entity_description>)\n \n2. From the entities identified in step 1, identify all pairs of (source_entity, target_entity) that are clearly related to each other.\nFor each pair of related entities, extract the following information:\n- source_entity: name of the source entity, as identified in step 1\n- target_entity: name of the target entity, as identified in step 1\n- relationship_description: explanation as to why you think the source entity and the target entity are related to each other\n- relationship_strength: a numeric score indicating strength of the relationship between the source entity and target entity\n Format each relationship as ("relationship"<|><source_entity><|><target_entity><|><relationship_description><|><relationship_strength>)\n \n3. Return output in English as a single list of all the entities and relationships identified in steps 1 and 2. Use ## as the list delimiter.\n \n4. When finished, output <|COMPLETE|>\n \n######################\n-Examples-\n######################\nExample 1:\nEntity_types: ORGANIZATION,PERSON\nText:\nThe Verdantis's Central Institution is scheduled to meet on Monday and Thursday, with the institution planning to release its latest policy decision on Thursday at 1:30 p.m. PDT, followed by a press conference where Central Institution Chair Martin Smith will take questions. Investors expect the Market Strategy Committee to hold its benchmark interest rate steady in a range of 3.5%-3.75%.\n######################\nOutput:\n("entity"<|>CENTRAL INSTITUTION<|>ORGANIZATION<|>The Central Institution is the Federal Reserve of Verdantis, which is setting interest rates on Monday and Thursday)\n##\n("entity"<|>MARTIN SMITH<|>PERSON<|>Martin Smith is the chair of the Central Institution)\n##\n("entity"<|>MARKET STRATEGY COMMITTEE<|>ORGANIZATION<|>The Central Institution committee makes key decisions about interest rates and the growth of Verdantis's money supply)\n##\n("relationship"<|>MARTIN SMITH<|>CENTRAL INSTITUTION<|>Martin Smith is the Chair of the Central Institution and will answer questions at a press conference<|>9)\n<|COMPLETE|>\n\n######################\nExample 2:\nEntity_types: ORGANIZATION\nText:\nTechGlobal's (TG) stock skyrocketed in its opening day on the Global Exchange Thursday. But IPO experts warn that the semiconductor corporation's debut on the public markets isn't indicative of how other newly listed companies may perform.\n\nTechGlobal, a formerly public company, was taken private by Vision Holdings in 2014. 
The well-established chip designer says it powers 85% of premium smartphones.\n######################\nOutput:\n("entity"<|>TECHGLOBAL<|>ORGANIZATION<|>TechGlobal is a stock now listed on the Global Exchange which powers 85% of premium smartphones)\n##\n("entity"<|>VISION HOLDINGS<|>ORGANIZATION<|>Vision Holdings is a firm that previously owned TechGlobal)\n##\n("relationship"<|>TECHGLOBAL<|>VISION HOLDINGS<|>Vision Holdings formerly owned TechGlobal from 2014 until present<|>5)\n<|COMPLETE|>\n\n######################\nExample 3:\nEntity_types: ORGANIZATION,GEO,PERSON\nText:\nFive Aurelians jailed for 8 years in Firuzabad and widely regarded as hostages are on their way home to Aurelia.\n\nThe swap orchestrated by Quintara was finalized when $8bn of Firuzi funds were transferred to financial institutions in Krohaara, the capital of Quintara.\n\nThe exchange initiated in Firuzabad's capital, Tiruzia, led to the four men and one woman, who are also Firuzi nationals, boarding a chartered flight to Krohaara.\n\nThey were welcomed by senior Aurelian officials and are now on their way to Aurelia's capital, Cashion.\n\nThe Aurelians include 39-year-old businessman Samuel Namara, who has been held in Tiruzia's Alhamia Prison, as well as journalist Durke Bataglani, 59, and environmentalist Meggie Tazbah, 53, who also holds Bratinas nationality.\n######################\nOutput:\n("entity"<|>FIRUZABAD<|>GEO<|>Firuzabad held Aurelians as hostages)\n##\n("entity"<|>AURELIA<|>GEO<|>Country seeking to release hostages)\n##\n("entity"<|>QUINTARA<|>GEO<|>Country that negotiated a swap of money in exchange for hostages)\n##\n##\n("entity"<|>TIRUZIA<|>GEO<|>Capital of Firuzabad where the Aurelians were being held)\n##\n("entity"<|>KROHAARA<|>GEO<|>Capital city in Quintara)\n##\n("entity"<|>CASHION<|>GEO<|>Capital city in Aurelia)\n##\n("entity"<|>SAMUEL NAMARA<|>PERSON<|>Aurelian who spent time in Tiruzia's Alhamia Prison)\n##\n("entity"<|>ALHAMIA PRISON<|>GEO<|>Prison in Tiruzia)\n##\n("entity"<|>DURKE BATAGLANI<|>PERSON<|>Aurelian journalist who was held hostage)\n##\n("entity"<|>MEGGIE TAZBAH<|>PERSON<|>Bratinas national and environmentalist who was held hostage)\n##\n("relationship"<|>FIRUZABAD<|>AURELIA<|>Firuzabad negotiated a hostage exchange with Aurelia<|>2)\n##\n("relationship"<|>QUINTARA<|>AURELIA<|>Quintara brokered the hostage exchange between Firuzabad and Aurelia<|>2)\n##\n("relationship"<|>QUINTARA<|>FIRUZABAD<|>Quintara brokered the hostage exchange between Firuzabad and Aurelia<|>2)\n##\n("relationship"<|>SAMUEL NAMARA<|>ALHAMIA PRISON<|>Samuel Namara was a prisoner at Alhamia prison<|>8)\n##\n("relationship"<|>SAMUEL NAMARA<|>MEGGIE TAZBAH<|>Samuel Namara and Meggie Tazbah were exchanged in the same hostage release<|>2)\n##\n("relationship"<|>SAMUEL NAMARA<|>DURKE BATAGLANI<|>Samuel Namara and Durke Bataglani were exchanged in the same hostage release<|>2)\n##\n("relationship"<|>MEGGIE TAZBAH<|>DURKE BATAGLANI<|>Meggie Tazbah and Durke Bataglani were exchanged in the same hostage release<|>2)\n##\n("relationship"<|>SAMUEL NAMARA<|>FIRUZABAD<|>Samuel Namara was a hostage in Firuzabad<|>2)\n##\n("relationship"<|>MEGGIE TAZBAH<|>FIRUZABAD<|>Meggie Tazbah was a hostage in Firuzabad<|>2)\n##\n("relationship"<|>DURKE BATAGLANI<|>FIRUZABAD<|>Durke Bataglani was a hostage in Firuzabad<|>2)\n<|COMPLETE|>\n\n######################\n-Real Data-\n######################\nEntity_types: organization,person,geo,event\nText: 盘坐继续修持起来。\n  也就一柱香的功夫,李长庚忽有感应,缓缓睁开眼睛,只见一道带着火花的飞符“唰”地飞入殿内。他嘿嘿一笑,来了。\n  
飞符是观音所发,言辞间颇为急切:“老李,你怎么搞的?那野猪精怎么给自己加戏,主动要拜玄奘为师?” 李长庚还没回复,只见启明殿口突现霞光,原来观音已经气急败坏找上门来了。她脸色铁青,现出了千手本相,回旋舞动,可见气得不轻。\n  李长庚不待她质问,先迎上去问怎么回事?观音浮起怒容:“那头野猪精一见玄奘,立刻跪下来磕头,说是我安排的取经弟子,等师父等了许多年。玄奘联系我问有没有这事,我才知道出了这么大篓子——老李,这可和说好的不一样啊!”\n  李长庚一摊手:“方略你也是审过的,根本没这么一段。恐怕是那头野猪精听人说了取经的好处,自作主张吧?”\n  “不是老李你教的吗?” 观音不信,千手一起指过来。\n  李长庚脸色不悦:“你让玄奘直接拒了这头孽畜便是,我绝无二话。” 观音长长叹了口气:“现在这情况,不太好拒啊。”\n  “有什么不好拒?这野猪精连大士你都敢编排,直接雷劈都不多!”\n  观音“啧”了一声,一脸无奈:“老李你忘啦?玄奘身边还跟着三十九尊神仙呢。” 李长庚道:“那不正好做个见证吗?”\n  观音不知道这老神仙是真糊涂还是怎么,压低声音道:“如果我现在去高老庄,当面宣布\n  那野猪精所言不实,那几个护教伽蓝、四值功曹会怎么想?哦,他猪胆包天,是该死——但高老庄这一场劫难的方略,是观音审的,太白金星具体安排的,现在出了事故,是不是说明你们没有严格把关?是不是也要负点责任?那些家伙,自己不干活,挑起别人错处可是具足了神通。”\n  李长庚心中微微冷笑。都这时候了,观音还不忘记把黑锅朝启明殿挪一挪,指望自己跟她陪绑。他一捋胡须,稳稳道:“大士莫急,来,来,坐下我们商量一下,总会有两全之策的。”\n  观音说:“哪有心思坐下聊啊,咱俩赶紧去现场吧!”她正要催促,忽然手里的玉净瓶微微颤动。她暼了眼瓶里的水波涟漪,脸色微变,一手端起水瓶,一手拔下柳枝,另外两手冲李长庚做了个“稍等”的手势,同时一手捂耳,一手推门出去了。\n  李长庚也不急,回到案几前,慢悠悠做着前面几难的报销。过不多时,观音回来了,脸色要多古怪有多古�\n######################\nOutput:"
}
}
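
For context on the failures above: the three tracebacks show Ollama's internal llama runner on 127.0.0.1:36767 first returning EOF on /completion, then resetting the /health connection, then refusing connections entirely, which looks like the local model process crashing mid-run. The 500s can likely be reproduced outside GraphRAG by calling the OpenAI-compatible endpoint directly. A minimal sketch, assuming "localhost" is substituted for the Ollama host (masked as xxxx in the config below) and qwen2.5:7b has been pulled:

import asyncio

from openai import AsyncOpenAI  # the same async client graphrag uses in openai_chat_llm.py

async def main() -> None:
    # Point the OpenAI client at Ollama's OpenAI-compatible endpoint.
    # "localhost" is a placeholder for the host masked as "xxxx" in settings.yaml;
    # the api_key is ignored by Ollama but must be non-empty for the client.
    client = AsyncOpenAI(base_url="http://localhost:11434/v1/", api_key="ollama")
    resp = await client.chat.completions.create(
        model="qwen2.5:7b",
        messages=[{"role": "user", "content": "Reply with the single word: ok"}],
    )
    print(resp.choices[0].message.content)

asyncio.run(main())

If even a single request like this triggers the same 500/EOF, the crash is on the Ollama side (the long entity-extraction prompts are a plausible trigger); if it succeeds, the failure may only appear under GraphRAG's concurrency (num_threads: 50 in the config below against one local model).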

Steps to reproduce

No response

Expected Behavior

No response

GraphRAG Config Used

encoding_model: cl100k_base
skip_workflows: []
llm:
  api_key: "" #${GRAPHRAG_API_KEY}
  type: openai_chat # or azure_openai_chat
  model: qwen2.5:7b
  model_supports_json: true # recommended if this is available for your model.
  # audience: "https://cognitiveservices.azure.com/.default"
  #max_tokens: 4000
  request_timeout: 500.0
  api_base: http://xxxx:11434/v1/
  # api_version: v1
  # organization: <organization_id>
  # deployment_name: <azure_model_deployment_name>
  # tokens_per_minute: 150_000 # set a leaky bucket throttle
  # requests_per_minute: 10_000 # set a leaky bucket throttle
  # max_retries: 10
  # max_retry_wait: 10.0
  # sleep_on_rate_limit_recommendation: true # whether to sleep when azure suggests wait-times
  # concurrent_requests: 25 # the number of parallel inflight requests that may be made
  # temperature: 0 # temperature for sampling
  # top_p: 1 # top-p sampling
  # n: 1 # Number of completions to generate

parallelization:
  stagger: 0.3
  num_threads: 50 # the number of threads to use for parallel processing

async_mode: threaded # or asyncio

embeddings:
  ## parallelization: override the global parallelization settings for embeddings
  async_mode: threaded # or asyncio
  # target: required # or all
  # batch_size: 16 # the number of documents to send in a single request
  # batch_max_tokens: 8191 # the maximum number of tokens to send in a single request
  vector_store:
    type: lancedb
    db_uri: 'output\lancedb'
    container_name: default # A prefix for the vector store to create embedding containers. Default: 'default'.
    overwrite: true
  # vector_store: # configuration for AI Search
    # type: azure_ai_search
    # url: <ai_search_endpoint>
    # api_key: <api_key> # if not set, will attempt to use managed identity. Expects the `Search Index Data Contributor` RBAC role in this case.
    # audience: <optional> # if using managed identity, the audience to use for the token
    # overwrite: true # or false. Only applicable at index creation time
    # container_name: default # A prefix for the AzureAISearch to create indexes. Default: 'default'.
  llm:
    api_key: "" #${GRAPHRAG_API_KEY}
    type: openai_embedding # or azure_openai_embedding
    model: nomic-embed-text
    api_base: http://xxxx:11434/api/embeddings
    # api_version: 2024-02-15-preview
    # audience: "https://cognitiveservices.azure.com/.default"
    # organization: <organization_id>
    # deployment_name: <azure_model_deployment_name>
    # tokens_per_minute: 150_000 # set a leaky bucket throttle
    # requests_per_minute: 10_000 # set a leaky bucket throttle
    # max_retries: 10
    # max_retry_wait: 10.0
    # sleep_on_rate_limit_recommendation: true # whether to sleep when azure suggests wait-times
    # concurrent_requests: 25 # the number of parallel inflight requests that may be made

chunks:
  size: 1200
  overlap: 100
  group_by_columns: [id] # by default, we don't allow chunks to cross documents

input:
  type: file # or blob
  file_type: text # or csv
  base_dir: "input"
  file_encoding: utf-8
  file_pattern: ".*\\.txt$"

cache:
  type: file # or blob
  base_dir: "cache"
  # connection_string: <azure_blob_storage_connection_string>
  # container_name: <azure_blob_storage_container_name>

storage:
  type: file # or blob
  base_dir: "output"
  # connection_string: <azure_blob_storage_connection_string>
  # container_name: <azure_blob_storage_container_name>

update_index_storage: # Storage to save an updated index (for incremental indexing). Enabling this performs an incremental index run
  # type: file # or blob
  # base_dir: "update_output"
  # connection_string: <azure_blob_storage_connection_string>
  # container_name: <azure_blob_storage_container_name>

reporting:
  type: file # or console, blob
  base_dir: "logs"
  # connection_string: <azure_blob_storage_connection_string>
  # container_name: <azure_blob_storage_container_name>

entity_extraction:
  ## strategy: fully override the entity extraction strategy.
  ##   type: one of graph_intelligence, graph_intelligence_json and nltk
  ## llm: override the global llm settings for this task
  ## parallelization: override the global parallelization settings for this task
  ## async_mode: override the global async_mode settings for this task
  prompt: "prompts/entity_extraction.txt"
  entity_types: [organization,person,geo,event]
  max_gleanings: 1

summarize_descriptions:
  ## llm: override the global llm settings for this task
  ## parallelization: override the global parallelization settings for this task
  ## async_mode: override the global async_mode settings for this task
  prompt: "prompts/summarize_descriptions.txt"
  max_length: 500

claim_extraction:
  ## llm: override the global llm settings for this task
  ## parallelization: override the global parallelization settings for this task
  ## async_mode: override the global async_mode settings for this task
  # enabled: true
  prompt: "prompts/claim_extraction.txt"
  description: "Any claims or facts that could be relevant to information discovery."
  max_gleanings: 1

community_reports:
  ## llm: override the global llm settings for this task
  ## parallelization: override the global parallelization settings for this task
  ## async_mode: override the global async_mode settings for this task
  prompt: "prompts/community_report.txt"
  max_length: 2000
  max_input_length: 8000

cluster_graph:
  max_cluster_size: 10

embed_graph:
  enabled: false # if true, will generate node2vec embeddings for nodes
  # num_walks: 10
  # walk_length: 40
  # window_size: 2
  # iterations: 3
  # random_seed: 597832

umap:
  enabled: false # if true, will generate UMAP embeddings for nodes

snapshots:
  graphml: false
  raw_entities: false
  top_level_nodes: false

local_search:
  # text_unit_prop: 0.5
  # community_prop: 0.1
  # conversation_history_max_turns: 5
  # top_k_mapped_entities: 10
  # top_k_relationships: 10
  # llm_temperature: 0 # temperature for sampling
  # llm_top_p: 1 # top-p sampling
  # llm_n: 1 # Number of completions to generate
  # max_tokens: 12000

global_search:
  # llm_temperature: 0 # temperature for sampling
  # llm_top_p: 1 # top-p sampling
  # llm_n: 1 # Number of completions to generate
  # max_tokens: 12000
  # data_max_tokens: 12000
  # map_max_tokens: 1000
  # reduce_max_tokens: 2000
  # concurrency: 32
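
For completeness, the embeddings endpoint configured under embeddings.llm above can be probed the same way; a minimal sketch, assuming the masked host xxxx is replaced with the real Ollama address, nomic-embed-text has been pulled, and the /api/embeddings request shape of current Ollama releases:

import requests

# Probe the Ollama-native embeddings endpoint set as embeddings.llm.api_base above.
# "localhost" is a placeholder for the host masked as "xxxx" in settings.yaml.
resp = requests.post(
    "http://localhost:11434/api/embeddings",
    json={"model": "nomic-embed-text", "prompt": "hello world"},
    timeout=30,
)
resp.raise_for_status()
embedding = resp.json().get("embedding", [])
print(f"got a {len(embedding)}-dimensional embedding")

A non-empty vector here suggests the embedding side of the config is reachable and the indexing failures are confined to the chat endpoint.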

Logs and screenshots

No response

Additional Information

  • GraphRAG Version: 0.4.1
  • Operating System: Ubuntu 22.04
  • Python Version: Python 3.11
  • Related Issues: LLM deployed with Ollama