What is Fabric Pro-Dev MCP Server?

Microsoft Fabric Pro-Dev MCP Server enables AI agents to interact with Fabric using natural language while running locally on your development machine. Built on the Model Context Protocol (MCP), it provides development-focused tools, file system access, and extensibility for building Fabric solutions.

The server runs as a local subprocess, giving AI agents access to Fabric operations and local file system resources. It's optimized for development workflows that need direct file access, custom tools, and integration with your local development environment.

Best for:

  • Developers building Fabric solutions locally
  • Advanced scenarios requiring file system access
  • Custom development workflows and automation
  • Offline or disconnected development environments
  • Teams that want to extend the server with custom tools

Key features:

  • Local subprocess architecture — Runs on your machine, no cloud dependency
  • File system access — Read/write local configuration and data files
  • Development tooling — Tools optimized for building Fabric solutions
  • Open source — Extend and customize for your workflows
  • Offline capable — Works in disconnected development environments
  • Extensible — Add custom tools and workflows for your team

Get started: See the sections below for installation and setup, or jump to the complete documentation on GitHub.

Why use Fabric Pro-Dev MCP Server?

Fabric Pro-Dev MCP Server enables AI-assisted development workflows that go beyond simple conversation:

  • Accelerate development — Generate boilerplate code, scaffold projects, and automate repetitive tasks
  • Local-first approach — Work with local files and configurations without cloud dependencies
  • Integrated workflow — Seamlessly combine file system operations with Fabric API calls
  • Team extensibility — Add custom tools that match your team's development practices
  • Offline development — Continue working even without internet connectivity
  • Version control — Manage your MCP server configuration alongside your code

Getting started

Prerequisites

Before installing, verify you have:

  • Node.js — Version 18 or later
  • Microsoft Fabric access — A Fabric tenant and workspace
  • AI agent — VS Code with GitHub Copilot, Claude Desktop, or a custom MCP client

Quick start

  1. Install the server:

    npm install -g fabric-pro-dev-mcp-server
    
  2. Configure authentication: Set up your local credentials for Fabric API access

  3. Connect your AI agent: Configure VS Code, Claude, or your custom client to use the local server

  4. Start developing: Ask your AI agent to help with Fabric tasks
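For step 3, connecting a client usually amounts to adding an entry to its MCP configuration file. The exact file name and keys vary by client, and the environment variable shown is a hypothetical placeholder, but a stdio-based entry generally looks like this:

```json
{
  "mcpServers": {
    "fabric-pro-dev": {
      "command": "fabric-pro-dev-mcp-server",
      "args": [],
      "env": {
        "FABRIC_WORKSPACE_ID": "<your-workspace-id>"
      }
    }
  }
}
```

The "command" value assumes the global npm install from step 1 put the server binary on your PATH; consult your client's documentation for where this configuration lives.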

Full installation guide: Complete setup instructions on GitHub

What is Model Context Protocol?

Model Context Protocol (MCP) is an open standard that enables AI agents to securely access external data sources and services through a unified interface. MCP provides:

  • Standardized interface — One protocol works across all AI agents
  • Secure by design — Built-in authentication and access control
  • Typed operations — Defined schemas reduce errors
  • Discoverable capabilities — Agents explore available operations automatically

Learn more at modelcontextprotocol.io
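The "discoverable capabilities" point is concrete in the protocol: a client lists a server's tools over JSON-RPC 2.0 and gets back typed schemas it can call against. A simplified exchange (the tool shown here is hypothetical, not a documented Pro-Dev server tool):

Request:

```json
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }
```

Response (abbreviated):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "deploy_semantic_model",
        "description": "Deploy a local semantic model to a Fabric workspace",
        "inputSchema": {
          "type": "object",
          "properties": { "path": { "type": "string" } }
        }
      }
    ]
  }
}
```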

Architecture

Fabric Pro-Dev MCP Server runs as a subprocess on your local machine, translating natural language prompts into Fabric API operations and file system actions:

AI Agent ↔ Fabric Pro-Dev MCP Server (local) ↔ Fabric REST APIs
          ↕                                      ↕
     Local File System                    Local Dev Tools

Request flow:

  1. AI agent sends prompt to local server subprocess
  2. Server processes request with configured credentials
  3. Server calls Fabric API or accesses local file system
  4. Response returned through MCP protocol
  5. AI agent translates results to natural language
  6. Server terminates when AI agent session ends
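Steps 1 through 4 map onto a single tools/call round trip in the MCP protocol. A simplified sketch (tool and argument names are hypothetical):

Request from the AI agent:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "list_workspace_items",
    "arguments": { "workspaceId": "<your-workspace-id>" }
  }
}
```

Response from the server, which the agent then summarizes in natural language:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "result": {
    "content": [ { "type": "text", "text": "Found 3 items in the workspace" } ]
  }
}
```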

Documentation

Detailed documentation for Fabric Pro-Dev MCP Server is available in the project's GitHub repository.

Common development scenarios

Project scaffolding

"Create a new Fabric semantic model project in my current directory"

The AI agent uses the Pro-Dev server to generate project structure, configuration files, and starter code.

Local testing

"Validate my semantic model configuration before deploying"

The server reads local files, validates schemas, and reports potential issues.
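As a rough sketch of what such a validation pass might do, the check below parses a model definition and reports issues. The required fields here ("name", "tables") are illustrative assumptions, not the server's actual schema:

```typescript
interface ValidationIssue {
  field: string;
  message: string;
}

// Hypothetical minimal checks for a semantic model definition file.
// A real validator would check against the full schema the server defines.
function validateModelConfig(json: string): ValidationIssue[] {
  const issues: ValidationIssue[] = [];
  let model: Record<string, unknown>;
  try {
    model = JSON.parse(json);
  } catch {
    // Fail fast when the file is not even parseable.
    return [{ field: "(root)", message: "file is not valid JSON" }];
  }
  const name = model.name;
  if (typeof name !== "string" || name.length === 0) {
    issues.push({ field: "name", message: "model name is required" });
  }
  const tables = model.tables;
  if (!Array.isArray(tables) || tables.length === 0) {
    issues.push({ field: "tables", message: "at least one table is required" });
  }
  return issues;
}
```

The server would surface a non-empty issue list back to the AI agent, which reports it conversationally.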

Deployment automation

"Deploy my local semantic model to my dev workspace"

The server packages local files and uploads them to your Fabric workspace.

Extending the server

Add custom tools for your team's workflows:

  1. Define tool schema with inputs and outputs
  2. Implement tool logic in TypeScript/JavaScript
  3. Register tool in server configuration
  4. Test with your AI agent
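As a minimal, dependency-free sketch of steps 1 through 3 (the registry, tool name, and shapes below are hypothetical; a real extension would use the MCP TypeScript SDK and the server's own registration mechanism):

```typescript
// 1. Define the tool's schema: its inputs and its output shape.
interface ToolDefinition {
  name: string;
  description: string;
  inputSchema: { type: "object"; properties: Record<string, { type: string }> };
  handler: (args: Record<string, unknown>) => { text: string };
}

// 3. A tiny registry standing in for the server's tool configuration.
const tools = new Map<string, ToolDefinition>();

function registerTool(tool: ToolDefinition): void {
  tools.set(tool.name, tool);
}

// 2. Implement the tool logic and register it.
registerTool({
  name: "scaffold_semantic_model",
  description: "Generate a starter semantic model project",
  inputSchema: { type: "object", properties: { projectName: { type: "string" } } },
  handler: (args) => ({ text: `Scaffolded project '${args.projectName}'` }),
});

// 4. Exercise the tool the way an AI agent would via a tools/call request.
function callTool(name: string, args: Record<string, unknown>): { text: string } {
  const tool = tools.get(name);
  if (!tool) throw new Error(`Unknown tool: ${name}`);
  return tool.handler(args);
}
```

Keeping the schema alongside the handler lets the agent discover the tool's inputs before calling it, which is what makes typed, discoverable operations possible.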

View extension guide on GitHub