At the Directions EMEA 2025 conference, Microsoft announced the public preview of their Business Central MCP Server. In case you haven’t heard, MCP (the Model Context Protocol) is an open standard for connecting AI applications to external systems; in the case of the BC MCP Server, this means that AI applications can access Business Central. I don’t want to go into the details on the BC side, as I’m sure that Microsoft will document that well. But for very understandable reasons, Microsoft officially supports only Microsoft Copilot Studio as a client, even though you may still want to use it with other clients, such as Visual Studio Code with GitHub Copilot or cagent.

The TL;DR

Here’s what you need to do for that:

  • Get the code from the BcMCPProxy sample in the BCTech GitHub repo
  • Compile it into an executable file
  • Follow the instructions in the repository to create an app registration, which you will need to authenticate against Business Central
  • Configure the proxy as an MCP Server in your client
  • Start using it!

For example, this is what it looks like in VS Code:

The details: Getting the BcMCPProxy executable

The BC MCP Proxy is a .NET 8 application, which means that you need the .NET SDK to compile it. To make that easier, I have created a pull request that adds dev container support as well as VS Code build tasks for producing self-contained executables. Until that PR is merged (if at all), you can clone my fork and open the samples/BcMCPProxy folder in a dev container. Once it has started, you can run the build task and select your operating system; in my case that is Windows-ARM64, but of course your setup may differ. You will then have an executable file that you can copy to your host machine and use in the MCP setup later.
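If you already have the .NET 8 SDK installed locally, you don’t strictly need the dev container. A minimal sketch of the build, shown here against the upstream BCTech repo (the runtime identifier and the output path are assumptions; pick the RID matching your machine, e.g. win-x64, win-arm64, linux-x64, or osx-arm64):

```shell
# Clone the repo and change into the proxy sample
git clone https://github.com/microsoft/BCTech.git
cd BCTech/samples/BcMCPProxy

# Build a self-contained, single-folder executable for your platform
dotnet publish -c Release -r win-arm64 --self-contained true

# The executable typically lands under bin/Release/net8.0/<rid>/publish/
```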

Configuring VS Code

One way to configure an MCP Server in VS Code is to use a .vscode/mcp.json file (see the official docs for all options). As explained in the repo mentioned above for Claude Desktop, you need to add some configuration options. It should look like this:

```json
{
    "servers": {
        "bc": {
            "type": "stdio",
            "command": "C:\\Users\\tobia\\deleteme\\BcMCPProxy.exe",
            "args": [
                "--TenantId",
                "<tenant-id>",
                "--ClientId",
                "<client-id>",
                "--Environment",
                "<environment>",
                "--Company",
                "<company>",
                "--ConfigurationName",
                "<configuration-name>"
            ]
        }
    }
}
```

As you can see, you need to fill in the following placeholders:

  • <tenant-id>: the ID of your Entra ID tenant;
  • <client-id>: the client ID of your app registration;
  • <environment>: the name of your BC environment where you have the MCP feature enabled and configured;
  • <company>: the name of the BC company you want to use;
  • <configuration-name>: the name of the MCP configuration.

You can then click on “Start” and use the tools in GitHub Copilot Chat within VS Code, as shown in the video above!
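If you want to smoke-test the proxy before wiring it into a client, you can talk to it directly: MCP over stdio is JSON-RPC 2.0, one message per line on the process’s stdin/stdout. The sketch below builds the two requests a session typically starts with; the protocolVersion and clientInfo values are illustrative assumptions, and a real client also sends a notifications/initialized notification between the two.

```python
import json

def initialize_request(request_id: int = 1) -> str:
    """JSON-RPC 'initialize' request that opens an MCP session."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            # Assumed values for illustration; the proxy negotiates the
            # actual protocol version in its response.
            "protocolVersion": "2025-06-18",
            "capabilities": {},
            "clientInfo": {"name": "smoke-test", "version": "0.1"},
        },
    })

def tools_list_request(request_id: int = 2) -> str:
    """JSON-RPC 'tools/list' request asking the server for its tools."""
    return json.dumps({"jsonrpc": "2.0", "id": request_id, "method": "tools/list"})
```

Piping these lines into BcMCPProxy.exe (started with the same arguments as above) should get you an initialize result followed by the tools exposed by your MCP configuration.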

Configuring cagent

Another client that works well is Docker’s cagent. The configuration is very similar; we only need to also define the LLM that we want to use. An example using an Azure AI Foundry model could look like this:

```yaml
version: "1"

agents:
  root:
    description: An agent that interacts with Microsoft Dynamics 365 Business Central
    instruction: |
      You are an agent helping the user to interact with the ERP system Microsoft Dynamics 365 Business Central.
    model: cloud-gpt-5
    toolsets:
      - type: mcp
        command: C:\Users\tobia\deleteme\BcMCPProxy.exe
        args: [
            "--TenantId",
            "<tenant-id>",
            "--ClientId",
            "<client-id>",
            "--Environment",
            "<environment>",
            "--Company",
            "<company>",
            "--ConfigurationName",
            "<configuration-name>"
        ]

models:
  cloud-gpt-5:
    provider: azure
    model: gpt-5
    base_url: <your-azure-ai-foundry-endpoint>
    provider_opts:
      azure_api_version: 2025-01-01-preview
```

The toolsets entry mirrors the setup for VS Code, while the models section defines the Azure AI Foundry model that the agent references via model: cloud-gpt-5. With this configuration, the BC MCP Server can be used in a similar way:
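To actually start the agent, save the YAML to a file and pass it to cagent. A minimal sketch; the file name is my choice, and credentials for the Azure provider are picked up from environment variables, so check cagent’s documentation for the exact variable names:

```shell
# Start an interactive chat session with the agent defined in the YAML file.
# "bc-agent.yaml" is an assumed file name; use whatever you saved it as.
cagent run bc-agent.yaml
```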

With that, you should also be able to adapt the configuration for other clients. Have fun building applications and solving problems with it!