Using MCP Integrations

Model Context Protocol (MCP) integrations let your AIML application call external services, APIs, or custom logic hosted elsewhere. MCP tools are declared directly on your LLM elements, giving the model access to those external capabilities.

Defining MCP Tools

MCP tools are defined inline within the tools prop of an <llm> element. Each tool specifies its transport type and connection details.

Transport Types

AIML supports two transport types for MCP connections:

  1. SSE (Server-Sent Events): For real-time streaming connections
  2. Streamable HTTP: For standard HTTP-based MCP servers

Tool Definition Structure

<llm
  model="accounts/fireworks/models/llama-v3p1-8b-instruct"
  tools={[
    {
      type: "mcp",
      mcp: {
        transport: "sse" | "streamable-http",
        url: "https://your-mcp-server.com/endpoint",
        headers: {
          "Authorization": "Bearer token",
          // Additional headers as needed
        }
      }
    }
  ]}
>
  <!-- LLM content -->
</llm>
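
The structure above is easiest to read as a type. Here is a rough TypeScript sketch inferred from it; the names McpTool and McpTransport are illustrative, not exports of AIML:

// Illustrative shape of a tools entry; these type names only describe
// the object literal shown above, they are not part of the AIML API.
type McpTransport = "sse" | "streamable-http";

interface McpTool {
  type: "mcp";
  mcp: {
    transport: McpTransport;
    url: string;                      // MCP server endpoint
    headers?: Record<string, string>; // optional; must be static (see Limitations)
  };
}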

Examples

Example 1: Weather Service with SSE Transport

<llm 
  id="weatherAssistant"
  model="accounts/fireworks/models/llama-v3p1-8b-instruct"
  tools={[
    {
      type: "mcp",
      mcp: {
        transport: "sse",
        url: "https://mcp.weatherservice.com/stream",
        headers: {
          "X-API-Key": "your-api-key"
        }
      }
    }
  ]}
>
  <prompt>
    What's the weather like in ${location}?
  </prompt>
</llm>

Example 2: Database Query Service with Streamable HTTP

<llm
  id="dataAnalyst"
  model="claude-3"
  tools={[
    {
      type: "mcp",
      mcp: {
        transport: "streamable-http",
        url: "https://mcp.database.example.com/query"
      }
    }
  ]}
>
  <instructions>
    You are a data analyst assistant. Use the database tools to answer questions about company data.
  </instructions>
  <prompt>
    ${userQuery}
  </prompt>
</llm>

Example 3: Multiple MCP Tools

You can combine multiple MCP tools in a single LLM element:

<llm
  id="researchAssistant"
  model="accounts/fireworks/models/llama-v3p1-8b-instruct"
  tools={[
    {
      type: "mcp",
      mcp: {
        transport: "sse",
        url: "https://mcp.search.example.com/v1",
        headers: {
          "Authorization": "Bearer your-search-api-key"
        }
      }
    },
    {
      type: "mcp",
      mcp: {
        transport: "streamable-http",
        url: "https://mcp.knowledge.example.com/api"
      }
    },
    {
      type: "function",
      function: {
        name: "calculate",
        description: "Perform basic calculations",
        parameters: {
          type: "object",
          properties: {
            expression: { type: "string" }
          }
        }
      }
    }
  ]}
>
  <instructions>
    You are a research assistant with access to search, knowledge base, and calculation tools.
  </instructions>
</llm>

Common Patterns

1. Weather Service Integration

Example of integrating a weather MCP service:

<llm
  model="accounts/fireworks/models/llama-v3p1-8b-instruct"
  tools={[
    {
      type: "mcp",
      mcp: {
        transport: "sse",
        url: "https://api.weather-mcp.com/v1/stream",
        headers: {
          "Authorization": "Bearer your-api-key-here"
        }
      }
    }
  ]}
>
  <!-- LLM content -->
</llm>

2. Combining MCP with Traditional Function Tools

Mix MCP tools with local function tools for hybrid capabilities:

<llm
  model="claude-3"
  tools={[
    // MCP tool for external data
    {
      type: "mcp",
      mcp: {
        transport: "streamable-http",
        url: "https://api.external-service.com/v1"
      }
    },
    // Local function tool
    {
      type: "function",
      function: {
        name: "processData",
        description: "Process and transform data locally",
        parameters: {
          type: "object",
          properties: {
            data: { type: "array" },
            operation: { type: "string" }
          }
        }
      }
    }
  ]}
>
  <instructions>
    Use the MCP tool to fetch data and the local function to process it.
  </instructions>
</llm>
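
For reference, the function tool entries used above follow the familiar JSON Schema shape. A rough TypeScript sketch (again, the name FunctionTool is illustrative, not an AIML export):

// Illustrative shape of a function tool entry, mirroring the examples above.
interface FunctionTool {
  type: "function";
  function: {
    name: string;
    description?: string;
    parameters: Record<string, unknown>; // a JSON Schema object
  };
}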

Advanced Usage

Dynamic Tool Configuration

You can dynamically configure MCP tools based on runtime conditions:

<datamodel>
  <data 
    id="userTier"
    type="STRING"
    value="premium"
  />
</datamodel>
 
<llm
  id="assistant"
  model="accounts/fireworks/models/llama-v3p1-8b-instruct"
  tools={({state}) => {
    const tools = [];
    
    // Basic tools for all users
    tools.push({
      type: "mcp",
      mcp: {
        transport: "sse",
        url: "https://mcp.basic.example.com/api"
      }
    });
    
    // Premium tools
    if (state.userTier === "premium") {
      tools.push({
        type: "mcp",
        mcp: {
          transport: "streamable-http",
          url: "https://mcp.premium.example.com/api",
          headers: {
            "X-User-Tier": "premium"
          }
        }
      });
    }
    
    return tools;
  }}
>
  <instructions>
    Assist the user based on their subscription tier.
  </instructions>
</llm>
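
As the branching logic grows, you may prefer to assemble the array in a plain helper and keep the tools callback short. A minimal sketch, assuming the ({state}) shape shown above and the illustrative McpTool type from earlier:

// Hypothetical helper that builds the tools array from state.
// Assumes the McpTool type sketched earlier; adapt to your own state type.
function buildTools(state: { userTier: string }): McpTool[] {
  const tools: McpTool[] = [
    {
      type: "mcp",
      mcp: { transport: "sse", url: "https://mcp.basic.example.com/api" },
    },
  ];
  if (state.userTier === "premium") {
    tools.push({
      type: "mcp",
      mcp: {
        transport: "streamable-http",
        url: "https://mcp.premium.example.com/api",
        headers: { "X-User-Tier": "premium" },
      },
    });
  }
  return tools;
}

The element then becomes tools={({state}) => buildTools(state)}.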

Error Handling

MCP connections may fail or time out. The runtime handles connection errors automatically and logs them to the console; tools that fail to connect are simply not available to the LLM.

<llm
  model="accounts/fireworks/models/llama-v3p1-8b-instruct"
  tools={[
    {
      type: "mcp",
      mcp: {
        transport: "sse",
        url: "https://mcp.weather.example.com/api",
        headers: {
          "X-API-Key": "your-weather-api-key"
        }
      }
    }
  ]}
>
  <instructions>
    If the weather service is unavailable, inform the user politely and suggest checking back later.
  </instructions>
  <prompt>
    ${userRequest}
  </prompt>
</llm>

Limitations

  • Headers must be static (dynamic headers based on runtime state are not yet supported)