port to 2.0
TuanaCelik committed Jan 10, 2024
1 parent 459807c commit 171d07b
Showing 5 changed files with 50 additions and 41 deletions.
10 changes: 5 additions & 5 deletions README.md
Original file line number Diff line number Diff line change
@@ -17,22 +17,22 @@ pinned: false
##### A simple app to get an overview of what the Mastodon user has been posting about and their tone

This is a demo just for fun 🥳
This repo contains a Streamlit application that, given a Mastodon username, tells you what type of things they've been posting about lately, their tone, and the languages they use. It uses OpenAI's `text-davinci-003` LLM.
This repo contains a Streamlit application that, given a Mastodon username, tells you what type of things they've been posting about lately, their tone, and the languages they use. It uses OpenAI's `gpt-4` LLM.

It's been built with [Haystack](https://haystack.deepset.ai) using the [`PromptNode`](https://docs.haystack.deepset.ai/docs/prompt_node) and by creating a custom [`PromptTemplate`](https://docs.haystack.deepset.ai/docs/prompt_node#templates)
It's been built with [Haystack](https://haystack.deepset.ai) using the [`OpenAIGenerator`](https://docs.haystack.deepset.ai/v2.0/docs/openaigenerator) and by creating a [`PromptBuilder`](https://docs.haystack.deepset.ai/v2.0/docs/promptbuilder)

https://user-images.githubusercontent.com/15802862/220464834-f42c038d-54b4-4d5e-8d59-30d95143b616.mov


### Points of improvement

Since we're using a generative model here, we need to craft the prompt carefully to minimize hallucinations and similar unwanted results. For this reason, I've included examples in the `PromptTemplate` of _how_ to construct a summary. However, this still sometimes produces odd results.
Since we're using a generative model here, we need to craft the prompt carefully to minimize hallucinations and similar unwanted results. For this reason, I've included examples in the `PromptBuilder` template of _how_ to construct a summary. However, this still sometimes produces odd results.

If you try to run it yourself and find ways to make this app better, please feel free to create an issue/PR 🙌

## To learn more about the PromptNode
## To learn more about the PromptBuilder

Check out our tutorial on the PromptNode and how to create your own templates [here](https://haystack.deepset.ai/tutorials/21_customizing_promptnode)
From Haystack 2.0-Beta onwards, you can create prompt templates with Jinja. Check out the guide on creating prompts [here](https://docs.haystack.deepset.ai/v2.0/docs/promptbuilder)
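As a rough illustration of that Jinja templating (this sketch uses plain `jinja2` directly rather than Haystack's `PromptBuilder`, and the template text here is made up for the example):

```python
from jinja2 import Template

# PromptBuilder templates are ordinary Jinja: {{ variables }} are filled
# in at render time. This mimics the rendering step with a toy template.
template = Template(
    "You will be given a post stream belonging to a Mastodon profile.\n"
    "Post stream: {{ documents }}\n"
    "Summary:"
)

prompt = template.render(documents="Hello from @example!")
print(prompt)
```

In the real app, `PromptBuilder` performs this rendering inside the pipeline and passes the resulting prompt to the generator.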

## Installation and Running
To run the bare application which does _nothing_:
2 changes: 1 addition & 1 deletion app.py
@@ -54,5 +54,5 @@

if st.session_state.result:
voice = st.session_state.result
st.write(voice['results'][0])
st.write(voice[0])

5 changes: 2 additions & 3 deletions requirements.txt
@@ -1,7 +1,6 @@
safetensors==0.3.3.post1
farm-haystack==1.20.0
haystack-ai==2.0.0b4
streamlit==1.21.0
markdown
st-annotated-text
python-dotenv
mastodon-fetcher-haystack==0.0.1
mastodon-fetcher-haystack
70 changes: 40 additions & 30 deletions utils/haystack.py
@@ -1,51 +1,61 @@
import streamlit as st
from mastodon_fetcher_haystack.mastodon_fetcher import MastodonFetcher
from haystack import Pipeline
from haystack.nodes import PromptNode, PromptTemplate
from haystack.components.generators import OpenAIGenerator
from haystack.components.builders import PromptBuilder

def start_haystack(openai_key):
# Use this function to construct a pipeline
fetcher = MastodonFetcher()

mastodon_template = PromptTemplate(prompt="""You will be given a post stream belonging to a specific Mastodon profile. Answer with a summary of what they've lately been posting about and in what languages.
You may go into some detail about what topics they tend to like posting about. Please also mention their overall tone, for example: positive,
negative, political, sarcastic or something else.
Examples:
Post stream: [@deepset_ai](https://mastodon.social/@deepset_ai): Come join our Haystack server for our first Discord event tomorrow, a deepset AMA session with @rusic_milos @malte_pietsch…
[@deepset_ai](https://mastodon.social/@deepset_ai): Join us for a chat! On Thursday 25th we are hosting a 'deepset - Ask Me Anything' session on our brand new Discord. Come…
[@deepset_ai](https://mastodon.social/@deepset_ai): Curious about how you can use @OpenAI GPT3 in a Haystack pipeline? This week we released Haystack 1.7 with which we introdu…
[@deepset_ai](https://mastodon.social/@deepset_ai): So many updates from @deepset_ai today!
Summary: This user has lately been reposting posts from @deepset_ai. The topics of the posts have been around the Haystack community, NLP and GPT. They've
been posting in English, and have had a positive, informative tone.
Post stream: I've directed my team to set sharper rules on how we deal with unidentified objects.\n\nWe will inventory, improve ca…
the incursion by China’s high-altitude balloon, we enhanced radar to pick up slower objects.\n \nBy doing so, w…
I gave an update on the United States’ response to recent aerial objects.
Summary: This user has lately been posting about having sharper rules to deal with unidentified objects and an incursion by China's high-altitude
balloon. Their posts have mostly been neutral but determined in tone. They mostly post in English.
Post stream: {join(documents)}
Summary:
""")
prompt_node = PromptNode(model_name_or_path="gpt-4", default_prompt_template=mastodon_template, api_key=openai_key)
mastodon_template = """You will be given a post stream belonging to a specific Mastodon profile. Answer with a summary of what they've lately been posting about and in what languages.
You may go into some detail about what topics they tend to like posting about. Please also mention their overall tone, for example: positive,
negative, political, sarcastic or something else.
Examples:
Post stream: [@deepset_ai](https://mastodon.social/@deepset_ai): Come join our Haystack server for our first Discord event tomorrow, a deepset AMA session with @rusic_milos @malte_pietsch…
[@deepset_ai](https://mastodon.social/@deepset_ai): Join us for a chat! On Thursday 25th we are hosting a 'deepset - Ask Me Anything' session on our brand new Discord. Come…
[@deepset_ai](https://mastodon.social/@deepset_ai): Curious about how you can use @OpenAI GPT3 in a Haystack pipeline? This week we released Haystack 1.7 with which we introdu…
[@deepset_ai](https://mastodon.social/@deepset_ai): So many updates from @deepset_ai today!
Summary: This user has lately been reposting posts from @deepset_ai. The topics of the posts have been around the Haystack community, NLP and GPT. They've
been posting in English, and have had a positive, informative tone.
Post stream: I've directed my team to set sharper rules on how we deal with unidentified objects.\n\nWe will inventory, improve ca…
the incursion by China’s high-altitude balloon, we enhanced radar to pick up slower objects.\n \nBy doing so, w…
I gave an update on the United States’ response to recent aerial objects.
Summary: This user has lately been posting about having sharper rules to deal with unidentified objects and an incursion by China's high-altitude
balloon. Their posts have mostly been neutral but determined in tone. They mostly post in English.
Post stream: {{ documents }}
Summary:
"""
prompt_builder = PromptBuilder(template=mastodon_template)
llm = OpenAIGenerator(model_name="gpt-4", api_key=openai_key)

st.session_state["haystack_started"] = True

mastodon_pipeline = Pipeline()
mastodon_pipeline.add_node(component=fetcher, name="MastodonFetcher", inputs=["Query"])
mastodon_pipeline.add_node(component=prompt_node, name="PromptNode", inputs=["MastodonFetcher"])
mastodon_pipeline.add_component("fetcher", fetcher)
mastodon_pipeline.add_component("prompt_builder", prompt_builder)
mastodon_pipeline.add_component("llm", llm)


mastodon_pipeline.connect("fetcher.documents", "prompt_builder.documents")
mastodon_pipeline.connect("prompt_builder.prompt", "llm.prompt")

return mastodon_pipeline


@st.cache_data(show_spinner=True)
def query(username, _pipeline):
try:
result = _pipeline.run(query=username, params={"MastodonFetcher": {"last_k_posts": 20}})
replies = _pipeline.run(data={"fetcher": {"username": username,
"last_k_posts": 20}})
result = replies['llm']['replies']
except Exception as e:
result = ["Please make sure you are providing a correct, public Mastodon account"]
return result
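For reference, `Pipeline.run` in Haystack 2.0 returns a nested dict keyed by component name, which is what the `replies['llm']['replies']` lookup above unpacks. A minimal sketch, with a hand-built dict standing in for a real pipeline run:

```python
# Shape of a Haystack 2.0 Pipeline.run result: the outer key is the
# component name ("llm"), the inner key is that component's output
# socket ("replies"). This dict is hand-built for illustration, not
# a real LLM response.
replies = {"llm": {"replies": ["This user mostly posts about NLP in English."]}}

# utils/haystack.py returns the list of generated replies...
result = replies["llm"]["replies"]

# ...and app.py then renders the first reply with st.write(voice[0]).
print(result[0])
```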
4 changes: 2 additions & 2 deletions utils/ui.py
@@ -46,8 +46,8 @@ def sidebar():
st.markdown("---")
st.markdown(
"## How this works\n"
"This app was built with [Haystack](https://haystack.deepset.ai) using the"
" [`PromptNode`](https://docs.haystack.deepset.ai/docs/prompt_node) and custom [`PromptTemplate`](https://docs.haystack.deepset.ai/docs/prompt_node#templates).\n\n"
"This app was built with [Haystack 2.0-Beta](https://haystack.deepset.ai) using the"
" [`OpenAIGenerator`](https://docs.haystack.deepset.ai/v2.0/docs/openaigenerator) and [`PromptBuilder`](https://docs.haystack.deepset.ai/v2.0/docs/promptbuilder).\n\n"
" The source code is also on [GitHub](https://github.com/TuanaCelik/should-i-follow)"
" with instructions to run locally.\n"
"You can see how the pipeline was set up [here](https://github.com/TuanaCelik/should-i-follow/blob/main/utils/haystack.py)")
