Building AI agents with the GitHub Copilot SDK
GitHub Copilot is one of the more mature AI assistants in use, having begun life as an AI-powered code-completion tool. Since then, Copilot has steadily added features, becoming a resource for coordinating and orchestrating a wide variety of development-focused agents and services. Part of that development has been making Copilot available everywhere developers work: inside the browser, in your code editor, and now, with the Copilot CLI, in your terminal.
GitHub Copilot is especially useful when combined with Microsoft services like the Work IQ MCP server, enabling you to build prompts that combine code with specifications, and more importantly, with the discussions that take place outside traditional development environments. A missed email may hold a key feature request or an important user story. Using the Copilot CLI and Work IQ to surface the requirement and convert it into code helps reduce the risk of later rework and ensures that a project is still aligned to business needs.
Having GitHub Copilot at your fingertips is surprisingly helpful, but it does require switching context from application to terminal and back again. With it embedded in Visual Studio and Visual Studio Code, that isn’t too much of a problem, but what if you wanted to take that model and implement it inside your own applications? Building an agent command-and-control loop isn’t easy, even if you’re using platforms like Semantic Kernel.
Introducing the GitHub Copilot SDK
This is where the new GitHub Copilot SDK offers a solution. If you’ve got the Copilot CLI binaries installed on your device, whether it’s Windows, macOS, or Linux, then you can use the SDK (and the CLI’s own JSON APIs) to embed the Copilot CLI in your code, giving you access to its orchestration features and to GitHub’s Model Context Protocol (MCP) registry. Having an integrated registry simplifies the process of discovery and installation for MCP servers, quickly bringing new features into your agent applications.
When your code uses the GitHub Copilot SDK, it runs the Copilot CLI as a server. This lets the CLI run headless, so you won’t see the interactions between your code and the underlying agents. You don’t need to run the CLI on every device running GitHub Copilot SDK applications; thanks to remote access, you can install it on a central server. Users will still need a GitHub Copilot license to use GitHub Copilot SDK applications, though, even if they’re working against a remote instance.
The SDK lets you treat the Copilot CLI server as a tool for executing and managing models and MCP servers instead of having to build your own, which simplifies development significantly. Running the CLI in server mode is a matter of launching it with its server option and defining the port used for prompting. You can then use the fully qualified domain name of the machine running the server, plus the defined port, as a connection string.
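As a rough sketch of that setup, the snippet below builds the launch command and the connection string an SDK client would use. The flag names (`--server`, `--port`) and the URL scheme are assumptions for illustration; check the Copilot CLI's own help output for the real options.

```python
def build_connection_string(host: str, port: int) -> str:
    """Combine the server machine's FQDN and the defined port into the
    address that SDK clients connect to."""
    return f"http://{host}:{port}"

def launch_cli_server(port: int) -> list[str]:
    """Return a command line for starting the CLI headless; in a real
    deployment you would hand this to subprocess.Popen or a service manager."""
    return ["copilot", "--server", "--port", str(port)]

# Example: a central server at copilot.internal.example.com (hypothetical host)
conn = build_connection_string("copilot.internal.example.com", 4321)
```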
Once the CLI is installed, add the SDK for your language of choice. Official support is provided for JavaScript and TypeScript via Node.js, as well as .NET, Python, and Go. The Node package is hosted on npm, the .NET package on NuGet, and the Python package on PyPI, installable with pip. For now, the Go SDK can be found in the project’s GitHub repository and can be installed using Go’s get command. It’s a fast-moving project, so be sure to check regularly for updates.
Using the GitHub Copilot SDK
Calling the SDK is simple enough. In .NET, you start by creating a new CopilotClient and adding a session with a supported model before sending an asynchronous session message to the Copilot CLI containing a prompt. The response contains the agent’s answer and can then be handled by your code.
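The client/session/prompt flow looks roughly like the following Python sketch. The class and method names here (`CopilotClient`, `create_session`, `send`) and the model identifier are illustrative stand-ins mirroring the pattern described above, not the SDK's published API; a real session would forward the prompt to the Copilot CLI server rather than echo it.

```python
from dataclasses import dataclass, field

@dataclass
class Session:
    model: str
    history: list = field(default_factory=list)

    def send(self, prompt: str) -> str:
        # A real session sends the prompt to the Copilot CLI server and
        # awaits the agent's answer; we echo here to show the data flow.
        self.history.append(prompt)
        return f"[{self.model}] response to: {prompt}"

@dataclass
class CopilotClient:
    connection: str   # FQDN-plus-port connection string for the CLI server

    def create_session(self, model: str) -> Session:
        return Session(model=model)

client = CopilotClient("http://copilot.internal.example.com:4321")
session = client.create_session(model="some-supported-model")  # placeholder id
answer = session.send("Summarize the open issues in this repository.")
```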
This is one of the simplest ways to interact with a large language model (LLM) from your code, with similar approaches used in other languages and platforms. You’ll need to write code to display or parse the data returned from the SDK, so it’s useful to build base prompts into your code that instruct the underlying GitHub Copilot LLM to return formats that can be parsed easily.
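One common version of that pattern is wrapping every user query in a base prompt that demands JSON output, then parsing the reply. Nothing below is specific to the SDK's API; the base-prompt text and response shape are illustrative assumptions.

```python
import json

# Base prompt that constrains the model to machine-readable output.
BASE_PROMPT = (
    'Answer only with a JSON object of the form '
    '{"summary": "...", "confidence": 0.0}. No prose outside the JSON.'
)

def build_prompt(user_prompt: str) -> str:
    """Prepend the formatting instructions to the user's query."""
    return f"{BASE_PROMPT}\n\n{user_prompt}"

def parse_response(raw: str) -> dict:
    """Parse the model's reply; raise ValueError on malformed JSON so the
    caller can retry the prompt or fall back to plain text."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError as err:
        raise ValueError(f"unparseable model output: {raw!r}") from err

reply = parse_response('{"summary": "Two issues remain open.", "confidence": 0.9}')
```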
The default is to wait for the Copilot CLI to return a complete response, but you can display response data as it’s returned by adding a streaming directive to the session definition. This approach is helpful if you’re building an interactive user experience or where LLM responses can take time to generate.
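The consuming side of streaming looks something like the sketch below: rather than blocking on the full reply, the code handles chunks as they arrive. How the streaming directive is expressed on the session is an assumption left to the SDK; the chunk-handling loop is the general pattern.

```python
from typing import Iterator

def stream_response(chunks: Iterator[str]) -> str:
    """Display each chunk as it arrives and return the assembled reply."""
    parts = []
    for chunk in chunks:
        print(chunk, end="", flush=True)   # update the UI incrementally
        parts.append(chunk)
    return "".join(parts)

# Stand-in for chunks arriving from a streaming session
full = stream_response(iter(["The build ", "passed on ", "all targets."]))
```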
Like Semantic Kernel and other agent development frameworks, the GitHub Copilot SDK is designed to use tools that link your code to LLMs, which can structure calls to data that Copilot will convert to natural language. For example, you could build a tool that links to a foreign exchange platform to return the exchange rate between two currencies on a specific date, which can then be used in a Copilot SDK call, adding the tool name as part of the session definition.
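For the foreign-exchange example above, a tool might look like the sketch below: a plain handler function plus the metadata the LLM uses to decide when to call it. The registration shape (name, description, handler) is an assumption; the real SDK attaches tools through the session definition.

```python
from datetime import date

def fx_rate(base: str, quote: str, on: date) -> float:
    """Handler the LLM can invoke; in production this would query a
    foreign-exchange API rather than a fixed table."""
    rates = {("USD", "EUR"): 0.92}   # stand-in data for illustration
    return rates[(base, quote)]

# Tool metadata in the general name/description/handler shape
fx_tool = {
    "name": "fx_rate",
    "description": "Exchange rate between two currencies on a given date",
    "handler": fx_rate,
}

rate = fx_tool["handler"]("USD", "EUR", date(2026, 1, 15))
```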
Using tools and MCP servers
Tools are built as handlers linked to either local code or a remote API. Working with API calls allows you to quickly ground the LLM and reduce the risks associated with token exhaustion. Once you have a tool, you can open an interactive session that lets users build their own prompts and deliver them to the grounded LLM, as part of a larger application or as a task-specific chatbot in Teams or another familiar application.
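That interactive pattern reduces to a simple loop: collect a user prompt, send it to the grounded session, show the reply. The `send` callable here is a stand-in for a tool-equipped session; in a chatbot, prompts would come from a chat UI or Teams message rather than a list.

```python
from typing import Callable, Iterable

def run_chat(prompts: Iterable[str], send: Callable[[str], str]) -> list[str]:
    """Feed each user prompt to the session's send callable, collecting replies."""
    return [send(p) for p in prompts]

replies = run_chat(
    ["What was EUR/USD on 15 Jan?", "And a week earlier?"],
    send=lambda p: f"grounded answer to: {p}",   # stand-in for a real session
)
```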
You don’t have to write your own tool; if you’re working with existing platforms and data, your SDK code can call an MCP server, like the one offered by Microsoft 365, to provide quick access to larger data sources. As the GitHub Copilot SDK builds on the capabilities of the Copilot CLI, you can start by defining a link to the relevant MCP endpoint URLs and then allow the tool to call the necessary servers as needed. An application can include links to more than one MCP server, so linking Work IQ to your GitHub repositories bridges the gap between code and the email chains that inspired it.
MCP servers can be local or remote, using HTTP for remote connections and stdio for local. You may need to include authorization tokens in your MCP server definition, as well as accepting all its tools or choosing specific tools for your GitHub Copilot SDK agent to use.
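Putting those options together, an application's MCP server definitions might look like the sketch below: one remote server over HTTP with an authorization token and an explicit tool allow-list, and one local server over stdio accepting all of its tools. The key names follow the common MCP configuration shape but are assumptions, not the SDK's exact schema, and the endpoint URL and token variable are hypothetical.

```python
mcp_servers = {
    "work-iq": {
        "type": "http",                               # remote server
        "url": "https://example.com/mcp/work-iq",     # hypothetical endpoint
        "headers": {"Authorization": "Bearer ${WORKIQ_TOKEN}"},
        "tools": ["search_mail", "find_user_story"],  # choose specific tools
    },
    "filesystem": {
        "type": "stdio",                              # local server
        "command": "mcp-server-filesystem",
        "args": ["--root", "/workspace"],
        "tools": ["*"],                               # accept all tools
    },
}

remote = [name for name, cfg in mcp_servers.items() if cfg["type"] == "http"]
```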
Other options for an SDK agent session include defining a base prompt for all queries, which helps avoid inconsistencies and provides context for interactions beyond the user prompt. Different agents in the same application can be given different base prompts and separate names for separate functions. A local MCP server might give your application access to a PC’s file system, while a remote server might be GitHub’s own, giving your application access to code and repositories.
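Two differently named agents in one application could be declared along these lines. The dictionary shape, agent names, and MCP server labels are illustrative; the SDK attaches base prompts and servers through its own session definition.

```python
agents = {
    "reviewer": {
        "base_prompt": "You review pull requests; answer with concrete diff comments.",
        "mcp_servers": ["github"],     # remote: code and repositories
    },
    "requirements": {
        "base_prompt": "You extract user stories from email threads.",
        "mcp_servers": ["work-iq"],    # remote: discussions outside the repo
    },
}

def prompt_for(agent: str, user_prompt: str) -> str:
    """Combine an agent's base prompt with the user's query."""
    return f"{agents[agent]['base_prompt']}\n\n{user_prompt}"
```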
You’re not limited to the official language support. There are community releases of the SDK for Java, Rust, C++, and Clojure, so you can work with familiar languages and frameworks. As they’re not official releases, they may not be coordinated with GitHub’s own SDKs and won’t have the same level of support.
Working in the Microsoft Agent Framework
Usefully, the Microsoft Agent Framework now supports the GitHub Copilot SDK, so you can integrate and orchestrate its agents with ones built from other tools and frameworks, such as Fabric or Azure OpenAI. This lets you build complex AI-powered applications from proven components, using Agent Framework to orchestrate workflow across multiple agents. You’re not limited to a single LLM, either. It’s possible to work with ChatGPT in one agent and Claude in another.
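A minimal sketch of that kind of orchestration, with stand-in classes rather than Agent Framework types: a sequential workflow that passes each agent's output to the next, where each agent could be backed by a different model. The model labels are placeholders, not endorsed identifiers.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Agent:
    name: str
    model: str                      # each agent can use a different LLM
    run: Callable[[str], str]

def sequential_workflow(agents: list[Agent], task: str) -> str:
    """Pass each agent's output to the next: a simple pipeline pattern."""
    result = task
    for agent in agents:
        result = agent.run(result)
    return result

pipeline = [
    Agent("drafter", "model-a", lambda t: f"draft({t})"),    # stand-in calls
    Agent("reviewer", "model-b", lambda t: f"review({t})"),
]
out = sequential_workflow(pipeline, "release notes")
```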
Tools like the GitHub Copilot SDK are a useful way to experiment with agent development, taking the workflows you’ve built inside GitHub and Visual Studio Code and turning them into their own MCP-powered applications. Once you’ve built a fleet of different single-purpose agents, you can chain them together using higher-level orchestration frameworks, thereby automating workflows that bring in information from across your business and your application development life cycle.
Original Link:https://www.infoworld.com/article/4125776/building-ai-agents-with-the-github-copilot-sdk.html
Originally Posted: Tue, 03 Feb 2026 09:00:00 +0000