
Adding message parameter support for OpenAI models #382

Open
Hkllopp opened this issue Jun 14, 2024 · 5 comments
Labels: enhancement (New feature or request), feature request

Comments


Hkllopp commented Jun 14, 2024

OpenAI models use the `messages` parameter for the prompt, and ScrapegraphAI maps its `prompt` argument onto this parameter when the scraper is invoked.
However, when using OpenAI models we sometimes need to supply multiple messages to better guide the response (as in this article and this documentation).

Is it possible to replace the standard ScrapegraphAI prompt by providing a `messages` argument in the `graph_config`?

Example:

```python
import os
from scrapegraphai.graphs import SmartScraperGraph

OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")

system_prompt = "Provide output in valid JSON."
user_prompt = "List me all the news article with a brief description for each one."

graph_config = {
    "llm": {
        "api_key": OPENAI_API_KEY,
        "model": "gpt-3.5-turbo",
        "response_format": {"type": "json_object"},
        "seed": 0,
        "temperature": 0,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_prompt},
        ],
    },
    "verbose": True,
}

smart_scraper_graph = SmartScraperGraph(
    prompt=user_prompt,
    # also accepts a string with the already downloaded HTML code
    source="https://perinim.github.io/projects",
    config=graph_config,
)

# TypeError: openai.resources.chat.completions.Completions.create() got multiple values for keyword argument 'messages'
```
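The TypeError presumably arises because the library already builds its own `messages` list from `prompt` and then splats the user's LLM config on top, so the OpenAI client receives the keyword twice. A minimal sketch of the Python-level mechanics (the `create` function here is a stand-in, not ScrapegraphAI's actual code):

```python
# Stand-in for openai.resources.chat.completions.Completions.create();
# the real client is not needed to reproduce the Python-level error.
def create(model, messages, **extra):
    return messages

# User-supplied LLM config that carries its own `messages` list.
user_config = {
    "messages": [{"role": "system", "content": "Provide output in valid JSON."}],
}

# Hypothetically, the library builds `messages` from `prompt` and then
# forwards the remaining config as keyword arguments -> duplicate keyword.
try:
    create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "internal prompt"}],
        **user_config,
    )
except TypeError as e:
    print(e)  # ... got multiple values for keyword argument 'messages'
```

Supporting the feature would therefore mean the library either merging the user's messages into its own list or letting them override the internal prompt, rather than forwarding both.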
VinciGit00 (Collaborator) commented:

OK, it could be an idea, but please provide a use case for the system prompt.
The output is already in JSON format.


ehecatl commented Jun 14, 2024

[Screenshot attached] Well, another example is that users who want to use Groq with llama3-70b need to mention the word "JSON" in the system-prompt message; it's mandatory.
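For context on the constraint described above: Groq's OpenAI-compatible JSON mode rejects requests unless the word "JSON" appears somewhere in the conversation, so a user-controlled system message is the natural place for it. A minimal sketch of that validation rule (the `validate_json_mode` helper is hypothetical, written here only to illustrate the constraint; the model name is the one mentioned in the comment):

```python
# Hypothetical helper illustrating Groq's JSON-mode constraint:
# with response_format json_object, the word "JSON" must appear in a message.
def validate_json_mode(payload):
    if payload.get("response_format", {}).get("type") == "json_object":
        if not any("JSON" in m.get("content", "") for m in payload["messages"]):
            raise ValueError("json_object mode requires the word 'JSON' in a message")
    return payload

payload = validate_json_mode({
    "model": "llama3-70b",
    "response_format": {"type": "json_object"},
    "messages": [
        # Without a user-supplied system message there is nowhere to put "JSON".
        {"role": "system", "content": "You are a scraper. Respond only in valid JSON."},
        {"role": "user", "content": "List the articles on the page."},
    ],
})
```

This is exactly the kind of provider-specific requirement that a fixed internal prompt cannot satisfy, which is why exposing `messages` (or at least the system message) in `graph_config` would help.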

VinciGit00 (Collaborator) commented:

Even if the output format is already JSON?


ehecatl commented Jun 17, 2024 via email

VinciGit00 (Collaborator) commented:

Give me an example, please.

@PeriniM added the enhancement (New feature or request) and feature request labels on Jun 18, 2024.
4 participants