Vercel AI SDK RSC - Using tools in StreamText

I’m trying to build a simple model to query an employee in my database. I defined the following call and tool:

import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";

const result = await streamText({
  model: openai("gpt-4o-mini"),
  temperature: 0.5,
  messages: history,
  system: systemPrompt,
  tools: {
    getEmployeeInformation: employeeInformationTool,
  },
  toolChoice: "auto",
  maxToolRoundtrips: 5,
});

import { tool, jsonSchema } from "ai";

export const employeeInformationTool = tool({
  description:
    "Retrieve the information (id and characteristics) for one employee based on his name. It can be a single name (first or last) or a combination of first and last name",
  parameters: jsonSchema<{ queries: string[] }>({
    type: 'object',
    properties: {
      queries: {
        type: 'array',
        items: { type: 'string' },
      },
    },
    required: ['queries'],
  }),
  // queries is inferred to be string[] from the jsonSchema type parameter:
  execute: async ({ queries }) => {
    const res = await getEmployeesByNameForAI(queries)
    return res
  },
});
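For context, `getEmployeesByNameForAI` isn’t shown in the post; a minimal in-memory sketch of a batch lookup (the employee shape and sample data are assumptions for illustration) might look like:

```typescript
// Hypothetical batch lookup: each query is matched against first name,
// last name, or the full name. Data and types are illustrative only.
type Employee = { id: number; firstName: string; lastName: string };

const employees: Employee[] = [
  { id: 1, firstName: "Jason", lastName: "Smith" },
  { id: 2, firstName: "Bob", lastName: "Jones" },
];

export async function getEmployeesByNameForAI(
  queries: string[],
): Promise<Employee[]> {
  const lowered = queries.map((q) => q.toLowerCase());
  return employees.filter((e) =>
    lowered.some(
      (q) =>
        e.firstName.toLowerCase().includes(q) ||
        e.lastName.toLowerCase().includes(q) ||
        `${e.firstName} ${e.lastName}`.toLowerCase() === q,
    ),
  );
}
```

Because the function already accepts an array, a single tool call with `["Jason", "Bob"]` would resolve both employees in one database round trip.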

When asking a question such as “Can you get for me the details on Jason and Bob?”, I was surprised to see that it results in 2 round trips of the tool getEmployeeInformation, with the queries being [“Jason”] and then [“Bob”], rather than 1 call with an array of strings like [“Jason”, “Bob”].
Could you please help me understand how I should design either the prompt or the tool so that it results in a single call?

Thanks
Regards

Hi, @stephtriple675-gmail! Welcome to the Vercel Community :smile:

Thanks for your patience. How did you get on with this? Did you make any progress?

I’d suggest that you:

  1. Update the tool description to explicitly state that it can handle multiple names in a single call.
  2. Modify the system prompt to instruct the AI to group multiple names when using the tool.
  3. Consider updating the parameter schema to better represent the desired input format.

Let us know how you got on!

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.