~30 min

Build an MCP server

Turn your persona.json into an MCP server any AI client can query.

What MCP buys you here

MCP in plain words

MCP (Model Context Protocol) is a standard that lets an AI assistant call tools provided by an external program. Think of it as a USB port for AI: any MCP-aware client (Antigravity, Gemini CLI, etc.) can plug into any MCP server, no custom integration needed. Below, you'll build a server that exposes you as a set of tools.

Once your portfolio is behind an MCP server, any MCP-aware client — Antigravity, Gemini CLI, custom Gemini agents — can ask about you and get structured answers instead of scraping a web page. Your portfolio stops being a static page and becomes an API.

Prerequisites

Set up the server project
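There is no single required layout, but the server below assumes a Node 18+ project with ES modules enabled, the MCP SDK installed, and persona.json sitting next to index.js. A minimal setup sketch (the directory name and the persona.json source path are assumptions for your machine):

```shell
mkdir mcp-persona && cd mcp-persona
npm init -y
npm pkg set type=module                  # the server uses ESM "import" syntax
npm install @modelcontextprotocol/sdk    # official MCP SDK for Node
cp /path/to/your/persona.json .          # index.js reads it from its own directory
```

If `node index.js` later fails with "Cannot use import statement outside a module", the `type: module` step is the one that was missed.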

Write the server

Create index.js with the following content. It registers four tools and wires them to a stdio transport, which is what most MCP clients speak by default.

// index.js — MCP server exposing persona.json
// Verify against latest @modelcontextprotocol/sdk docs.
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";
import { readFileSync } from "node:fs";
import { fileURLToPath } from "node:url";
import { dirname, join } from "node:path";

const __dirname = dirname(fileURLToPath(import.meta.url));
const persona = JSON.parse(
  readFileSync(join(__dirname, "persona.json"), "utf8")
);

const server = new Server(
  { name: "persona-mcp", version: "0.1.0" },
  { capabilities: { tools: {} } }
);

const tools = [
  { name: "get_skills",     description: "Return the persona's skills.",     inputSchema: { type: "object", properties: {} } },
  { name: "get_projects",   description: "Return the persona's projects.",   inputSchema: { type: "object", properties: {} } },
  { name: "get_experience", description: "Return the persona's experience.", inputSchema: { type: "object", properties: {} } },
  {
    name: "query_persona",
    description: "Answer a free-form question about the persona using keyword matching.",
    inputSchema: {
      type: "object",
      properties: { question: { type: "string" } },
      required: ["question"],
    },
  },
];

server.setRequestHandler(ListToolsRequestSchema, async () => ({ tools }));

function asText(payload) {
  return { content: [{ type: "text", text: JSON.stringify(payload, null, 2) }] };
}

function queryPersona(question) {
  const q = String(question || "").toLowerCase();
  const words = q.split(/\W+/).filter((w) => w.length > 3);
  // Dumb v1: collect every string value in persona.json and return the
  // ones that share a keyword with the question. (Splitting compact JSON
  // on sentence boundaries finds nothing — compact output has no whitespace.)
  const leaves = [];
  (function walk(value) {
    if (typeof value === "string") leaves.push(value);
    else if (value && typeof value === "object") Object.values(value).forEach(walk);
  })(persona);
  const hits = leaves
    .filter((s) => words.some((w) => s.toLowerCase().includes(w)))
    .slice(0, 5);
  return {
    matched: hits.length,
    excerpts: hits.length ? hits : leaves.slice(0, 5),
  };
}

server.setRequestHandler(CallToolRequestSchema, async (req) => {
  const { name, arguments: args } = req.params;
  switch (name) {
    case "get_skills":     return asText(persona.skills     ?? []);
    case "get_projects":   return asText(persona.projects   ?? []);
    case "get_experience": return asText(persona.experience ?? []);
    case "query_persona":  return asText(queryPersona(args?.question));
    default:
      throw new Error(`Unknown tool: ${name}`);
  }
});

const transport = new StdioServerTransport();
await server.connect(transport);
console.error("persona-mcp ready on stdio");

About the SDK shape

The MCP SDK has evolved quickly. The structure above (Server + request handlers + StdioServerTransport) reflects the public API at the time of writing. If imports fail, check node_modules/@modelcontextprotocol/sdk/dist or the SDK README for the current export paths.

Run it locally

node index.js

You should see persona-mcp ready on stdio on stderr. The process waits for an MCP client to connect over stdin/stdout — it will not respond to direct typing.
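The stdio transport speaks newline-delimited JSON-RPC, so you can poke the server by hand: paste a single-line initialize request into its stdin and watch the reply on stdout. The protocolVersion and clientInfo values here are illustrative, not required:

```json
{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"manual-test","version":"0.0.0"}}}
```

Real clients follow this with a notifications/initialized notification and then tools/list; doing the full handshake by hand gets tedious fast, which is exactly what the client configs below spare you.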

Register with your AI client

Antigravity

Open the MCP settings panel and add a stdio server entry:

{
  "mcpServers": {
    "persona": {
      "command": "node",
      "args": ["/absolute/path/to/mcp-persona/index.js"]
    }
  }
}

See the Antigravity docs for the exact config file location on your platform. Restart Antigravity — your tools should appear in the tools menu.

Gemini CLI

Gemini CLI reads the same MCP config shape from ~/.gemini/settings.json: drop the same JSON block under its mcpServers key. Restart your CLI session and the tools become available.

Test it

From your AI client, ask something like "What are this person's strongest skills?" or "Which projects used Node.js?"

The client should pick the right tool, call it, and answer using the JSON your server returned.
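If you want to see what query_persona returns without wiring up a client at all, you can exercise the same keyword-matching idea as a standalone script. The persona object here is made up for illustration:

```javascript
// Standalone sketch of the keyword matcher behind query_persona,
// run against a made-up persona object.
const persona = {
  skills: ["JavaScript", "Node.js", "building MCP servers"],
  projects: [{ name: "persona-mcp", summary: "Exposes a resume over MCP." }],
};

function queryPersona(question) {
  // Keep only words long enough to be meaningful search terms.
  const words = String(question).toLowerCase().split(/\W+/).filter((w) => w.length > 3);
  // Collect every string value in the persona, however deeply nested.
  const leaves = [];
  (function walk(value) {
    if (typeof value === "string") leaves.push(value);
    else if (value && typeof value === "object") Object.values(value).forEach(walk);
  })(persona);
  // Return the strings that share at least one keyword with the question.
  const hits = leaves.filter((s) => words.some((w) => s.toLowerCase().includes(w)));
  return { matched: hits.length, excerpts: hits.slice(0, 5) };
}

console.log(queryPersona("Do you know JavaScript?"));
// → { matched: 1, excerpts: [ 'JavaScript' ] }
```

This is the same shape of answer the MCP tool wraps in a text content block — useful for iterating on the matching logic before restarting your client each time.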

Tip

This works in any MCP client, not just the workshop's setup. The same index.js plugs into Antigravity, Gemini CLI, custom Gemini agents, and command-line MCP hosts without changes.

Where to take it next

Key takeaways
  • MCP turns a JSON file into a callable API surface for any AI client.
  • A useful server is small: a few tools, a stdio transport, no infrastructure.
  • Tools are typed inputs and outputs — keep them narrow and well-named.
  • Start dumb (keyword matching) and only add an LLM where the dumb path fails.
  • One implementation, many clients: same server works in Antigravity, Gemini CLI, and custom agents.
Go deeper after this