# MEAI, Azure AI Foundry, Foundry Local and MCP Sample

Did you know .NET is one of the most versatile languages for building both frontend and backend apps? On top of that, .NET also makes it easy to build AI-infused apps. This repository provides several sample apps using Microsoft.Extensions.AI, Azure AI Foundry, Foundry Local and an MCP server/client.
## Prerequisites

## Getting started
### Build Foundry Local SDK

> **NOTE**: This is an interim solution until the official NuGet package is ready.
- Clone the Foundry Local repository.

  ```bash
  git clone https://github.com/microsoft/Foundry-Local.git foundry-local
  ```

- Build the C# SDK.

  ```bash
  cd foundry-local/sdk/cs
  dotnet restore && dotnet build
  ```

- Get the NuGet package location.

  ```bash
  # bash/zsh
  FOUNDRY_LOCAL_NUGET_PACKAGE_PATH=$(find $(pwd) -name "*.nupkg" -type f -exec dirname {} \; | sort -u)
  ```

  ```powershell
  # PowerShell
  $FOUNDRY_LOCAL_NUGET_PACKAGE_PATH = $(Get-ChildItem -Path . -Filter *.nupkg -Recurse).Directory.FullName
  ```
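To see what the bash/zsh one-liner resolves to, here is a self-contained demonstration against a throwaway directory tree (the tree layout and package name below are made up for illustration; they are not the SDK's real output):

```shell
# Build a fake SDK output tree (illustrative paths and package name only).
mkdir -p /tmp/foundry-demo/sdk/cs/src/bin/Debug
touch /tmp/foundry-demo/sdk/cs/src/bin/Debug/FoundryLocal.Sdk.0.1.0.nupkg
cd /tmp/foundry-demo

# Same command as above: find every .nupkg, keep the unique parent directories.
FOUNDRY_LOCAL_NUGET_PACKAGE_PATH=$(find $(pwd) -name "*.nupkg" -type f -exec dirname {} \; | sort -u)
echo "$FOUNDRY_LOCAL_NUGET_PACKAGE_PATH"
# → /tmp/foundry-demo/sdk/cs/src/bin/Debug
```

If the build produced packages in more than one directory, `sort -u` would leave multiple lines in the variable, so it is worth echoing the value before using it in the next section.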
### Get ready to run apps on your local machine
- Clone this repository.

  ```bash
  git clone https://github.com/devkimchi/meai-azure-ai-foundry-mcp-sample.git
  ```

- Create `nuget.config`. `{{FOUNDRY_LOCAL_NUGET_PACKAGE_PATH}}` is the value from the previous section.

  ```bash
  # bash/zsh
  cd meai-azure-ai-foundry-mcp-sample
  cat ./nuget.sample.config \
      | sed 's|/path/to/foundry-local/sdk/cs/src/bin/Debug|{{FOUNDRY_LOCAL_NUGET_PACKAGE_PATH}}|g' \
      > nuget.config
  ```

  ```powershell
  # PowerShell
  cd meai-azure-ai-foundry-mcp-sample
  $(Get-Content -Path ./nuget.sample.config) `
      -replace "/path/to/foundry-local/sdk/cs/src/bin/Debug", "{{FOUNDRY_LOCAL_NUGET_PACKAGE_PATH}}" `
      | Out-File -Path nuget.config -Force
  ```

  > **NOTE**: This step is only required until the official NuGet package is published.
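For reference, the substitution above only swaps a placeholder path for the real one, so the generated `nuget.config` should register the local build output as a package source. Assuming `nuget.sample.config` follows the standard NuGet schema, the result would look roughly like this (the `foundry-local` key name and the `nuget.org` entry are guesses, not taken from the repository):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- nuget.org stays available for all other dependencies -->
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
    <!-- local folder holding the Foundry Local SDK .nupkg built above;
         the value is whatever {{FOUNDRY_LOCAL_NUGET_PACKAGE_PATH}} resolved to -->
    <add key="foundry-local" value="/path/to/foundry-local/sdk/cs/src/bin/Debug" />
  </packageSources>
</configuration>
```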
### Run Microsoft.Extensions.AI sample app
- Make sure you have API keys for OpenAI, Azure OpenAI, GitHub Models, Anthropic and/or Google Vertex AI.

- Store those API keys in user secrets:

  ```bash
  # OpenAI
  dotnet user-secrets --project ./src/Meai.ClientApp set ConnectionStrings:openai "Endpoint=https://api.openai.com/v1;Key={{OPENAI_API_KEY}}"

  # Azure OpenAI
  dotnet user-secrets --project ./src/Meai.ClientApp set ConnectionStrings:openai "Endpoint={{AZURE_OPENAI_ENDPOINT}};Key={{AZURE_OPENAI_API_KEY}}"

  # GitHub Models
  dotnet user-secrets --project ./src/Meai.ClientApp set ConnectionStrings:openai "Endpoint=https://models.inference.ai.azure.com;Key={{GITHUB_PAT}}"

  # Anthropic
  dotnet user-secrets --project ./src/Meai.ClientApp set ConnectionStrings:anthropic "Endpoint=https://api.anthropic.com;Key={{ANTHROPIC_API_KEY}}"

  # Google Vertex AI
  dotnet user-secrets --project ./src/Meai.ClientApp set ConnectionStrings:google "Endpoint=https://generativelanguage.googleapis.com;Key={{GOOGLE_API_KEY}}"
  ```
- Choose which LLM you're going to use in `src/Meai.ClientApp/appsettings.json`:

  ```jsonc
  // To use OpenAI, Azure OpenAI or GitHub Models
  {
    "MEAI": {
      "ChatClient": "openai"
    }
  }
  ```

  ```jsonc
  // To use Anthropic
  {
    "MEAI": {
      "ChatClient": "anthropic"
    }
  }
  ```

  ```jsonc
  // To use Google Vertex AI
  {
    "MEAI": {
      "ChatClient": "google"
    }
  }
  ```
- Run the app.

  ```bash
  dotnet watch run --project ./src/Meai.ClientApp
  ```
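The user-secret values above all share the same `Endpoint={{...}};Key={{...}}` connection-string shape: two semicolon-separated `name=value` pairs. As a quick illustration of that format (plain shell string handling with a dummy key; this is not code from the repository):

```shell
# A MEAI-style connection string: two semicolon-separated name=value pairs.
conn="Endpoint=https://api.openai.com/v1;Key=dummy-key-for-illustration"

# Split out each part with plain parameter expansion.
endpoint="${conn%%;*}"            # everything before the first ';'
endpoint="${endpoint#Endpoint=}"  # drop the 'Endpoint=' name
key="${conn##*;}"                 # everything after the last ';'
key="${key#Key=}"                 # drop the 'Key=' name

echo "$endpoint"   # → https://api.openai.com/v1
echo "$key"        # → dummy-key-for-illustration
```

This is why each provider only needs one secret: the endpoint and the credential travel together in a single setting.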
### Run Azure AI Foundry sample app
- Make sure you have an API key for Azure AI Foundry.

- Store the API key in user secrets:

  ```bash
  # Azure AI Foundry
  dotnet user-secrets --project ./src/Meai.ClientApp set ConnectionStrings:foundry "Endpoint={{AZURE_AI_FOUNDRY_ENDPOINT}};Key={{AZURE_AI_FOUNDRY_API_KEY}}"
  ```
- Choose which LLM you're going to use in `src/Foundry.ClientApp/appsettings.json`:

  ```json
  {
    "MEAI": {
      "ChatClient": "foundry"
    }
  }
  ```
- Run the app.

  ```bash
  dotnet watch run --project ./src/Foundry.ClientApp
  ```
### Run Foundry Local sample app
- Make sure you have installed the Foundry Local CLI.

- Add a model to Foundry Local. You can add any model from the list shown by `foundry model ls`.

  ```bash
  foundry model download qwen2.5-0.5b
  ```
- Choose which LLM you're going to use in `src/Foundry.ClientApp/appsettings.json`:

  ```json
  {
    "MEAI": {
      "ChatClient": "local"
    }
  }
  ```
- Run the app.

  ```bash
  dotnet watch run --project ./src/Foundry.ClientApp
  ```
### Run MCP server/client sample app
- Make sure you have API keys for OpenAI, Azure OpenAI or GitHub Models.

- Store those API keys in user secrets:

  ```bash
  # OpenAI
  dotnet user-secrets --project ./src/Meai.ClientApp set ConnectionStrings:openai "Endpoint=https://api.openai.com/v1;Key={{OPENAI_API_KEY}}"

  # Azure OpenAI
  dotnet user-secrets --project ./src/Meai.ClientApp set ConnectionStrings:openai "Endpoint={{AZURE_OPENAI_ENDPOINT}};Key={{AZURE_OPENAI_API_KEY}}"

  # GitHub Models
  dotnet user-secrets --project ./src/Meai.ClientApp set ConnectionStrings:openai "Endpoint=https://models.inference.ai.azure.com;Key={{GITHUB_PAT}}"
  ```
- Run the MCP server app.

  ```bash
  dotnet run --project ./src/McpTodo.ServerApp
  ```

- Run the MCP client app in another terminal.

  ```bash
  dotnet watch run --project ./src/McpTodo.ClientApp
  ```
## Known Issues: Installing Foundry Local CLI

On Windows, if installing the Foundry Local CLI keeps failing on your machine, try this workaround:
```powershell
# Download the package and its dependency
$releaseUri = "https://github.com/microsoft/Foundry-Local/releases/download/v0.3.9267/FoundryLocal-x64-0.3.9267.43123.msix"
Invoke-WebRequest -Method Get -Uri $releaseUri -OutFile ./FoundryLocal.msix

$crtUri = "https://aka.ms/Microsoft.VCLibs.x64.14.00.Desktop.appx"
Invoke-WebRequest -Method Get -Uri $crtUri -OutFile ./VcLibs.appx

# Install the Foundry Local package
Add-AppxPackage ./FoundryLocal.msix -DependencyPath ./VcLibs.appx
```
