Releases: run-llama/llama_deploy
v0.3.5
What's Changed
New Features 🎉
- correct `.yml` file name by @ravi03071991 in #382
- fix: error in run command by @masci in #384
- fix: fix rabbitmq import errors by @masci in #386
- Prepare 0.3.5 by @masci in #387
New Contributors
- @ravi03071991 made their first contribution in #382
Full Changelog: v0.3.4...v0.3.5
v0.3.4
v0.3.3
What's Changed
New Features 🎉
- Integrate the Solace PubSub+ broker into LlamaDeploy by @alimosaed in #345
- v0.3.3 by @logan-markewich in #377
New Contributors
- @alimosaed made their first contribution in #345
Full Changelog: v0.3.2...v0.3.3
v0.3.2
v0.3.1
What's Changed
New Features 🎉
- build(deps): bump aiohttp from 3.10.10 to 3.10.11 by @dependabot in #369
- Update release.yml by @masci in #371
- Use the new SDK within the api server by @masci in #372
- [0.3.1] vbump for llama-index-core by @logan-markewich in #373
Full Changelog: v0.3.0...v0.3.1
v0.3.0
What's Changed
New Features 🎉
- feat: add apiserver support to Python SDK by @masci in #327
- refact: make collections in apiserver models lazy by @masci in #343
- chore: deprecate old clients in favor of Client by @masci in #352
- feat: make the kafka topic configurable by @masci in #353
- feat: make `message_type` configurable in Control Plane by @masci in #356
- refact: Make topic explicit in message queue API by @masci in #358
- refact: Use the Python SDK in the CLI implementation by @masci in #365
Full Changelog: v0.2.4...v0.3.0
v0.2.4
v0.2.3
v0.2.1
v0.2.0
v0.2.0 is out now, with the main improvement being the addition of streaming support!
Now, if you have a workflow that writes to the event stream like:
```python
from llama_index.core.workflow import (
    Context,
    Event,
    StartEvent,
    StopEvent,
    Workflow,
    step,
)

class ProgressEvent(Event):
    progress: str

# create a dummy workflow
class MyWorkflow(Workflow):
    @step()
    async def run_step(self, ctx: Context, ev: StartEvent) -> StopEvent:
        # Your workflow logic here
        arg1 = str(ev.get("arg1", ""))
        result = arg1 + "_result"

        # stream events as steps run
        ctx.write_event_to_stream(
            ProgressEvent(progress="I am doing something!")
        )
        return StopEvent(result=result)
```
You can then stream the events using the client:
```python
# create a session
session = client.create_session()

# kick off run
task_id = session.run_nowait("streaming_workflow", arg1="hello_world")

# stream events -- this will yield a dict representing each event
for event in session.get_task_result_stream(task_id):
    print(event)

# get final result
result = session.get_task_result(task_id)
print(result)
# prints 'hello_world_result'
```
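Since the stream yields plain dicts rather than `Event` objects, you can filter for your custom event's fields client-side. A minimal sketch, assuming each streamed dict exposes the event's attributes (the `events` list below is illustrative stand-in data, not output from a real deployment):

```python
# Illustrative streamed event dicts, mimicking what
# get_task_result_stream() might yield for the workflow above.
events = [
    {"progress": "I am doing something!"},
    {"progress": "Still working..."},
    {"other_field": "not a progress event"},
]

# Keep only the events that carry a `progress` field,
# i.e. the ones written by ProgressEvent.
progress_messages = [e["progress"] for e in events if "progress" in e]

for msg in progress_messages:
    print(msg)
```

This keeps the client loop tolerant of other event types that may share the same stream.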