
.Net: Empty response from model when using a prompt template with inline functions and function auto-calling set to Auto #9862

Open
crperez5 opened this issue Dec 2, 2024 · 1 comment
Labels: bug (Something isn't working), .NET (Issue or Pull requests regarding .NET code)

crperez5 commented Dec 2, 2024

I took the VectorStoreRAG sample app and made slight adjustments to connect it to my local Llama 3.2 and Qdrant instances.

My idea was to perform a vector search on Qdrant to retrieve context information from a PDF file. Additionally, the AI needs to access real-time data to compute the final response to the user. To achieve this, I added a plugin that fetches real-time price information.
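For context, PricesTablePlugin is an ordinary Semantic Kernel plugin class. The version below is a simplified stand-in (the method name, description, and returned data are illustrative, not my exact code):

using System.ComponentModel;
using Microsoft.SemanticKernel;

public sealed class PricesTablePlugin
{
    [KernelFunction, Description("Gets the current prices table with real-time price information.")]
    public Task<string> GetCurrentPricesAsync()
    {
        // Illustrative stand-in: the real implementation fetches live price data.
        return Task.FromResult("Product A: 10.00 EUR\nProduct B: 12.50 EUR");
    }
}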

Expected Behavior

I expect the model to use both plugins: the SearchPlugin, which is invoked inline from the prompt template, and the PricesTablePlugin, which should be invoked via function auto-calling. Both plugins have been added to the kernel.

Actual Behavior

The model returns an empty response (an empty string).

If the arguments are passed like this, the model returns an empty response:

arguments: new KernelArguments(new OllamaPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
})
{
    { "question", question },
},

If the function auto-calling setting is omitted, the model does return a response. It includes the context retrieved from the PDF search, but lacks the price information:

arguments: new KernelArguments()
{
    { "question", question },
},

Both plugins are registered in the following way:

kernel.Plugins.AddFromType<PricesTablePlugin>("PricesTable");
kernel.Plugins.Add(vectorStoreTextSearch.CreateWithGetTextSearchResults("SearchPlugin"));
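For completeness, the kernel is wired up against my local Ollama endpoint roughly like this (model id and endpoint reflect my local setup; the sample builds the kernel through the host's service collection, so this is a condensed sketch):

using Microsoft.SemanticKernel;

var builder = Kernel.CreateBuilder();

// Local Llama 3.2 served by Ollama on its default port (assumed).
builder.AddOllamaChatCompletion(
    modelId: "llama3.2",
    endpoint: new Uri("http://localhost:11434"));

var kernel = builder.Build();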

Full Code Example

private async Task ChatLoopAsync(CancellationToken cancellationToken)
{
    ...

    Console.WriteLine("Assistant > Press Enter with no prompt to exit.");

    kernel.Plugins.AddFromType<PricesTablePlugin>("PricesTable");
    kernel.Plugins.Add(vectorStoreTextSearch.CreateWithGetTextSearchResults("SearchPlugin"));

    while (!cancellationToken.IsCancellationRequested)
    {
        Console.WriteLine($"Assistant > What would you like to know from the loaded PDFs: ({pdfFiles})?");

        Console.Write("User > ");
        var question = Console.ReadLine();

        if (string.IsNullOrWhiteSpace(question))
        {
            appShutdownCancellationTokenSource.Cancel();
            break;
        }

        // Invoke the LLM with a template that uses the search plugin to:
        // 1. Retrieve information related to the user query from the vector store.
        // 2. Add the information to the LLM prompt.
        var response = kernel.InvokePromptStreamingAsync(
            promptTemplate: """
                Please use this information to answer the question:
                {{#with (SearchPlugin-GetTextSearchResults question)}}
                  {{#each this}}
                    Name: {{Name}}
                    Value: {{Value}}
                    Link: {{Link}}
                    -----------------
                  {{/each}}
                {{/with}}

                Include citations to the relevant information where it is referenced in the response.

                Question: {{question}}
                """,
            arguments: new KernelArguments(new OllamaPromptExecutionSettings
            {
                FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
            })
            {
                { "question", question },
            },
            templateFormat: "handlebars",
            promptTemplateFactory: new HandlebarsPromptTemplateFactory(),
            cancellationToken: cancellationToken);

        Console.Write("\nAssistant > ");

        try
        {
            await foreach (var message in response.ConfigureAwait(false))
            {
                Console.Write(message);
            }
            Console.WriteLine();
        }
        catch (Exception ex)
        {
            Console.WriteLine($"Call to LLM failed with error: {ex}");
        }
    }
}
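For reference, the Name, Value, and Link placeholders in the template come from the TextSearchResult attribute mapping on the vector store record model. My model mirrors the sample's TextSnippet type, roughly as follows (the vector dimension depends on the local embedding model, so treat this as a sketch):

using Microsoft.Extensions.VectorData;
using Microsoft.SemanticKernel.Data;

public sealed class TextSnippet<TKey>
{
    [VectorStoreRecordKey]
    public required TKey Key { get; set; }

    // Rendered as {{Value}} by GetTextSearchResults.
    [TextSearchResultValue, VectorStoreRecordData]
    public string? Text { get; set; }

    // Rendered as {{Name}}.
    [TextSearchResultName, VectorStoreRecordData]
    public string? ReferenceDescription { get; set; }

    // Rendered as {{Link}}.
    [TextSearchResultLink, VectorStoreRecordData]
    public string? ReferenceLink { get; set; }

    // Dimension must match the local embedding model (768 here is an assumption).
    [VectorStoreRecordVector(768)]
    public ReadOnlyMemory<float> TextEmbedding { get; set; }
}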

Platform Details
OS: Windows 10
IDE: Visual Studio Code
Language: C#
Source: Latest Semantic Kernel, Llama 3.2

@crperez5 added the bug label on Dec 2, 2024
@markwallace-microsoft added the .NET and triage labels on Dec 2, 2024
evchaki (Contributor) commented Dec 2, 2024

@crperez5 - thanks for logging this. @westey-m - can you please take a look?

Projects: Sprint: In Progress
Development: No branches or pull requests

4 participants