Replies: 3 comments 1 reply
-
Hi, did you find a solution?
-
{
"role": "system",
"content": "****your prompts. Return a JSON response only. IMPORTANT: Format the output as JSON. Only return a JSON response with no other comment or text. If you return any other text than JSON, you will have failed."
}
I use this prompt and have tried it on DeepSeek and Qwen; it works well.
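Even with a prompt like this, models sometimes wrap the JSON in markdown fences or add stray commentary, so it is worth parsing the reply defensively before passing it downstream. A minimal sketch in Python (the function name and behavior here are illustrative, not part of any library):

```python
import json
import re

def extract_json(raw: str) -> dict:
    """Best-effort extraction of a JSON object from an LLM reply.

    Handles replies wrapped in ```json ... ``` fences or surrounded
    by extra commentary; raises ValueError if nothing is found.
    """
    # Strip markdown code fences if present.
    fenced = re.search(r"```(?:json)?\s*(.*?)```", raw, re.DOTALL)
    candidate = fenced.group(1) if fenced else raw
    # Fall back to the outermost {...} span.
    start, end = candidate.find("{"), candidate.rfind("}")
    if start == -1 or end == -1:
        raise ValueError("no JSON object found in reply")
    return json.loads(candidate[start : end + 1])
```

Running the LLM output through something like this before the HTTP request node would surface malformed replies early, instead of the request to GAS failing.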
-
GPT-4o supports constraining output to a JSON structure via a json_schema, though it requires constructing the schema beforehand or defining it through Pydantic's BaseModel. It's unclear whether this is supported in Dify. If it isn't, especially in UI-based workflows like Dify that encapsulate many details, I suggest:
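For context, a request that opts into schema-constrained output against an OpenAI-compatible chat API carries a `response_format` of type `json_schema`. A minimal sketch of such a payload (the schema fields and model name are illustrative assumptions; no request is actually sent here):

```python
import json

# Hypothetical schema for a structured reply; real fields depend on the task.
response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "reply",
        "strict": True,
        "schema": {
            "type": "object",
            "properties": {
                "answer": {"type": "string"},
                "confidence": {"type": "number"},
            },
            "required": ["answer", "confidence"],
            "additionalProperties": False,
        },
    },
}

payload = {
    "model": "gpt-4o",  # illustrative model name
    "messages": [{"role": "user", "content": "Summarize the report."}],
    "response_format": response_format,
}

# The payload must itself serialize to valid JSON before being sent.
serialized = json.dumps(payload)
```

With `strict` schemas, the model's reply is guaranteed to match the declared structure, which avoids prompt-only approaches that can still leak extra text.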
-
Self Checks
1. Is this request related to a challenge you're experiencing? Tell me about your story.
When I try to send the LLM output variable as JSON from an HTTP request node to GAS (Google Apps Script), I get an error saying the LLM output is not in JSON format. The JSON I wrote in the HTTP request node is correct (I checked it many times), so I think the problem is in the LLM output itself. How can I make it conform to the JSON format?
2. Additional context or comments
No response