Need support for concurrent task execution #333
Replies: 7 comments
-
Hi @Rohith-Scalers I'm assuming this is an inquiry about Workflows and not directly related to Llama Deploy? Let me know if I'm wrong. Referring to the snippet:

```python
tasks = []
for topic in ("pirates", "teachers", "doctors"):
    tasks.append(w.run(topic=topic))
for res in await asyncio.gather(*tasks):
    print(res)
```

This way the LLM calls will go "in parallel" (using quotes because it's one thread anyway, so not really parallel), taking advantage of the async architecture. In this colab you can see how you can go from 10s to 3s using this approach.
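For completeness, a minimal, self-contained version of that example might look like the following (a sketch assuming the llama_index workflow API; `TopicWorkflow`, its `generate` step, and the `topic` parameter are illustrative stand-ins for the `w` used above, not from the original thread):

```python
import asyncio

from llama_index.core.workflow import StartEvent, StopEvent, Workflow, step


class TopicWorkflow(Workflow):
    """Hypothetical workflow standing in for the `w` used above."""

    @step
    async def generate(self, ev: StartEvent) -> StopEvent:
        # A real workflow would await an LLM call here; asyncio.sleep
        # stands in for the I/O-bound work.
        await asyncio.sleep(1)
        return StopEvent(result=f"something about {ev.topic}")


async def main() -> None:
    w = TopicWorkflow(timeout=60)
    tasks = [w.run(topic=t) for t in ("pirates", "teachers", "doctors")]
    # All three runs share the event loop, so total wall time is
    # roughly one sleep, not three.
    for res in await asyncio.gather(*tasks):
        print(res)


asyncio.run(main())
```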
-
hey, thank you @masci! Can we do the same thing with a deployed workflow using Llama Deploy? Let's say we have deployed a workflow on port 8001, can we send tasks to that workflow in parallel?
-
@Rohith-Scalers yes, Llama Deploy is designed to work that way - if that's not what you observe, it's a bug 😅. A "workflow service" is served by a uvicorn instance that handles concurrent requests; that would be the equivalent of the asyncio.gather approach above.
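From the client side, sending tasks concurrently to a deployed workflow would look roughly like this (a sketch assuming `AsyncLlamaDeployClient` mirrors the sync client's `create_session`/`run` API; the workflow name `my_workflow` and its `topic` argument are placeholders):

```python
import asyncio

from llama_deploy import AsyncLlamaDeployClient, ControlPlaneConfig


async def run_one(client: AsyncLlamaDeployClient, topic: str) -> str:
    # Each task gets its own session so the control plane can
    # schedule the runs independently of each other.
    session = await client.create_session()
    return await session.run("my_workflow", topic=topic)


async def main() -> None:
    # ControlPlaneConfig defaults to the local control plane address.
    client = AsyncLlamaDeployClient(ControlPlaneConfig())
    results = await asyncio.gather(
        *(run_one(client, t) for t in ("pirates", "teachers", "doctors"))
    )
    for res in results:
        print(res)


asyncio.run(main())
```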
-
heyy @masci the LlamaDeployClient is not async. When I tried sending tasks from two different clients to the same workflow, the control plane published the second task only after getting the results of the first task. Is there a way to make the control plane publish tasks to the workflow at the same time?
-
@Rohith-Scalers do you have a snippet of code or notebook I can use to quickly reproduce?
-
@masci you can use this manual_orchestration. I have tried 2 things: 1) sending tasks from multiple clients at the same time (they were processed in sequence); 2) using AsyncLlamaDeployClient (which threw a timeout error). Thank you for helping out :)
-
@Rohith-Scalers I wasn't able to reproduce; I can run multiple tasks and they return as soon as they finish, regardless of the order they were submitted in. I'll convert this issue into a GitHub discussion as it doesn't seem to be a bug.
-
While this code works fine for single tasks, I noticed that when I attempt to run multiple instances of this workflow (with different inputs), the tasks are not processed concurrently. For example, running two tasks one after the other results in them being processed sequentially rather than in parallel, which defeats the purpose of using an asynchronous model.
One solution I can think of is to run a workflow as a step within another workflow and set num_workers, as sketched below. I would greatly appreciate any insight or suggestions on how to achieve parallelism.
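The num_workers part of that idea would look roughly like this (a sketch of the fan-out/fan-in pattern from the llama_index workflow docs; the event names, the stand-in sleep, and the worker count are illustrative):

```python
import asyncio

from llama_index.core.workflow import (
    Context,
    Event,
    StartEvent,
    StopEvent,
    Workflow,
    step,
)


class TaskEvent(Event):
    payload: str


class ResultEvent(Event):
    result: str


class ParallelFlow(Workflow):
    @step
    async def fan_out(self, ctx: Context, ev: StartEvent) -> TaskEvent:
        # Emit one event per input; remember the count so the
        # collector knows how many results to wait for.
        await ctx.set("num_items", len(ev.items))
        for item in ev.items:
            ctx.send_event(TaskEvent(payload=item))

    @step(num_workers=4)
    async def process(self, ctx: Context, ev: TaskEvent) -> ResultEvent:
        # Up to four of these run concurrently thanks to num_workers.
        await asyncio.sleep(1)  # stand-in for an LLM call
        return ResultEvent(result=f"processed {ev.payload}")

    @step
    async def fan_in(self, ctx: Context, ev: ResultEvent) -> StopEvent | None:
        num_items = await ctx.get("num_items")
        results = ctx.collect_events(ev, [ResultEvent] * num_items)
        if results is None:
            return None  # still waiting on other workers
        return StopEvent(result=[r.result for r in results])


async def main() -> None:
    w = ParallelFlow(timeout=60)
    print(await w.run(items=["a", "b", "c", "d"]))


asyncio.run(main())
```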