Unpacking the Microsoft Agent Framework



Agent orchestration is probably the most interesting part of this first release, as it supports several different orchestration models covering different types of workflow. The simplest option is sequential orchestration: agents are called one at a time, with each agent's response used to build the prompt for the next. More complex scenarios can use concurrent orchestration, where the initial query is sent to several agents at the same time; they work in parallel, and the workflow moves on to its next phase once all the agents have responded. Many of these orchestration models are drawn directly from traditional workflow processes, much like those used by tools such as BizTalk. The remaining orchestration models are new and depend on the behavior of LLM-based agents.
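To make the distinction concrete, here is a minimal Python sketch of the two basic patterns. It is not the Agent Framework API: the Agent class, its run method, and the two orchestration functions are hypothetical stand-ins that simply show how data flows in each case.

```python
import asyncio
from dataclasses import dataclass


@dataclass
class Agent:
    # Hypothetical stand-in for an LLM-backed agent, not a framework class.
    name: str
    instructions: str

    async def run(self, prompt: str) -> str:
        # A real agent would call a model with `instructions` and `prompt`.
        await asyncio.sleep(0)  # placeholder for model latency
        return f"[{self.name}] response to: {prompt}"


async def sequential(agents: list[Agent], query: str) -> str:
    """Each agent's response becomes the prompt for the next agent."""
    result = query
    for agent in agents:
        result = await agent.run(result)
    return result


async def concurrent(agents: list[Agent], query: str) -> list[str]:
    """Every agent gets the initial query at once; the workflow moves on
    only when all of them have responded."""
    return list(await asyncio.gather(*(agent.run(query) for agent in agents)))


async def main() -> None:
    team = [Agent("research", "gather facts"), Agent("summarize", "condense findings")]
    print(await sequential(team, "What changed in this release?"))
    print(await concurrent(team, "What changed in this release?"))


if __name__ == "__main__":
    asyncio.run(main())
```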

Orchestration in a world of language models

The first new model is group chat orchestration, in which the agents in a process communicate with each other, sharing results and updating their outputs until they converge on a single response. The second, hand-off orchestration, is a more evolved version of sequential orchestration in which not only the data passed between agents is updated but also the prompts themselves, responding to changes in the context of the workflow. Finally, there’s support for what’s being called “magentic” workflow. This implements a supervisory manager agent that coordinates a subset of agents, orchestrating them as needed and bringing in humans where necessary. This last option is intended for complex problems that may not have been candidates for process automation using existing non-AI techniques.
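A rough sketch can also show what hand-off orchestration adds over the sequential pattern: the prompt is rebuilt from the accumulated context at every step, and a routing decision picks which agent, if any, takes the next turn. Again, this is illustrative only; the agent functions, the route callback, and the "triage" starting point are assumptions for the example, not framework features.

```python
import asyncio
from typing import Awaitable, Callable, Optional

# An "agent" here is just an async function from prompt to reply.
AgentFn = Callable[[str], Awaitable[str]]


async def handoff(
    agents: dict[str, AgentFn],
    route: Callable[[str], Optional[str]],  # picks the next agent, or None to stop
    query: str,
    max_steps: int = 5,
) -> str:
    context = [f"user: {query}"]
    current: Optional[str] = "triage"  # illustrative starting agent
    for _ in range(max_steps):
        if current is None:
            break
        prompt = "\n".join(context)           # the prompt evolves with the context
        reply = await agents[current](prompt)
        context.append(f"{current}: {reply}")
        current = route(reply)                # hand off to another agent, or finish
    return context[-1]


async def main() -> None:
    async def triage(prompt: str) -> str:
        return "hand off to billing"

    async def billing(prompt: str) -> str:
        return "refund issued"

    def route(reply: str) -> Optional[str]:
        return "billing" if "billing" in reply else None

    print(await handoff({"triage": triage, "billing": billing}, route, "I was double charged"))


if __name__ == "__main__":
    asyncio.run(main())
```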

These approaches are quite different from how we’ve built workflows in the past, and it’s important to experiment before deploying them. You’ll need to be careful with base prompts, ensuring that operations terminate and that if an answer or a consensus can’t be found, your agents return a suitable error message rather than simply generating a plausible output.
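One way to enforce that, sketched below under the same caveats as the earlier examples, is to cap the number of rounds in a multi-agent loop and fail loudly when the agents never converge; the ask_round and converged callbacks are placeholders for whatever your agents and consensus check actually are.

```python
from typing import Awaitable, Callable, Sequence


async def converse(
    ask_round: Callable[[Sequence[str]], Awaitable[list[str]]],  # runs one round of agent replies
    converged: Callable[[list[str]], bool],                      # e.g. all agents agree
    query: str,
    max_rounds: int = 8,
) -> str:
    transcript: list[str] = [query]
    for _ in range(max_rounds):
        replies = await ask_round(transcript)
        transcript.extend(replies)
        if converged(replies):
            return replies[-1]
    # Surface an explicit failure instead of letting a model improvise
    # a plausible-sounding answer.
    raise RuntimeError(f"No consensus after {max_rounds} rounds")
```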
