
Building a Peptide Pharmacy with ACP and MCP

A biotech-inspired storefront interface connected to an ACP backend and MCP tools for AI ordering

TL;DR - Key Takeaways

  1. ACP and MCP solve different layers of agent commerce: ACP defines the merchant checkout lifecycle, while MCP exposes tools that AI clients like Claude Desktop and Cursor can call
  2. I built a full demo stack: a peptide storefront on GitHub Pages, a Cloudflare Workers backend for ACP checkout, and a remote MCP endpoint for AI agent ordering
  3. Checkout state lives in Durable Objects: every ACP checkout session gets durable, isolated state for items, buyer info, fulfillment, totals, and order completion
  4. The MCP server wraps the same commerce backend: the website and the AI tools call the same catalog and checkout system instead of drifting into separate implementations
  5. The whole project is live and open source: you can browse the storefront, hit the API, and connect the MCP endpoint from Claude or Cursor today

The Real Gap in Agent Commerce

AI assistants are increasingly good at discovering products.

The harder problem is finishing the transaction.

Once an agent finds something interesting, it still needs a structured way to:

  • create a cart
  • collect buyer and shipping information
  • surface totals and fulfillment options
  • confirm payment
  • return an order state the user can trust

That is the job of the Agentic Commerce Protocol (ACP).

But there is a second gap: even if you expose a commerce backend, how does Claude Desktop, Cursor, or another MCP-compatible AI client talk to it in a tool-native way?

That is where MCP comes in.

So instead of building just a checkout API, I built a demo that combines both layers:

  • ACP for the merchant-side checkout lifecycle
  • MCP for the AI-client tool interface
  • GitHub Pages for the storefront UI
  • Cloudflare Workers for the API and tool runtime

To make the example concrete, I modeled the store as a research-use-only peptide pharmacy.

Live URLs:


ACP vs MCP: Different Layers, Same Flow

One thing that gets muddled in agent-commerce discussions is that ACP and MCP are not competing protocols.

They sit at different levels.

| Protocol | Job | In this project |
| --- | --- | --- |
| ACP | Defines the checkout lifecycle between an AI application and a merchant system | Create, update, complete, and cancel checkout sessions |
| MCP | Exposes tools to AI clients in a standard format | list_products, create_checkout, complete_checkout, and friends |

In other words:

  • ACP answers: how should a merchant backend model a transaction?
  • MCP answers: how should an AI client call tools?

That distinction matters because the official ACP checkout spec today is centered on REST endpoints and product feeds. MCP is not the official transport for ACP checkout yet.

So in this demo, I used a practical architecture:

  • the merchant backend implements ACP-style checkout endpoints
  • the MCP server wraps those same capabilities for Claude, Cursor, and other tool-calling clients

That gives you a system that is useful right now, while still lining up with where the protocols are headed.


Architecture

The architecture is intentionally simple: one catalog, one checkout model, two clients.

flowchart LR
    webUser["Web User"] --> storefront["GitHub Pages Storefront"]
    aiAgent["Claude / Cursor / MCP Client"] --> mcpEndpoint["/mcp Streamable HTTP"]
    storefront --> api["Cloudflare Worker API"]
    mcpEndpoint --> api
    api --> sessions["Durable Object Checkout Sessions"]
    api --> catalog["Product Catalog"]

The important design decision is that the storefront and the MCP server do not each invent their own commerce logic.

They both hit the same backend:

  • GET /products
  • GET /products/search?q=...
  • POST /checkout_sessions
  • POST /checkout_sessions/:id
  • POST /checkout_sessions/:id/complete
  • POST /checkout_sessions/:id/cancel

And the Worker also exposes:

  • ALL /mcp

That means the browser cart flow and the AI-agent flow stay aligned.
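The shared surface is small enough to sketch as a single dispatcher. This is an illustration of the route table above, not the actual Worker code — the route names are labels I chose for the sketch:

```typescript
// Hypothetical dispatcher over the shared commerce surface.
// Route names are illustrative; the real Worker wires these to handlers.
type Route =
  | "list_products"
  | "search_products"
  | "create_checkout"
  | "update_checkout"
  | "complete_checkout"
  | "cancel_checkout"
  | "mcp";

function matchRoute(method: string, pathname: string): Route | null {
  if (pathname === "/mcp") return "mcp"; // ALL methods go to the MCP transport
  if (method === "GET" && pathname === "/products") return "list_products";
  if (method === "GET" && pathname === "/products/search") return "search_products";
  if (method === "POST" && pathname === "/checkout_sessions") return "create_checkout";
  const m = pathname.match(/^\/checkout_sessions\/([^/]+)(?:\/(complete|cancel))?$/);
  if (method === "POST" && m) {
    if (m[2] === "complete") return "complete_checkout";
    if (m[2] === "cancel") return "cancel_checkout";
    return "update_checkout";
  }
  return null;
}
```

Both clients resolve to the same handler for the same method and path, which is exactly why the two flows cannot drift apart.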


The Cloudflare Worker: ACP Checkout Backend

I used a single Worker as the API entry point, then routed checkout session state into a Durable Object.

This is the high-level routing layer:

if (url.pathname === "/mcp") {
  return handleMcpRequest(request, env)
}

if (request.method === "POST" && url.pathname === "/checkout_sessions") {
  const body = await readJson<CreateCheckoutRequest>(request)
  const sessionId = `checkout_${crypto.randomUUID()}`
  const stub = env.CHECKOUT_SESSIONS.getByName(sessionId)

  const session = await stub.createSession({
    sessionId,
    items: body.items,
    buyer: body.buyer,
    fulfillment_address: body.fulfillment_address,
    origin: url.origin,
  })

  return jsonResponse(session, 201)
}

Why a Durable Object here?

Because checkout is stateful.

Each session needs to remember:

  • selected items
  • buyer details
  • fulfillment address
  • fulfillment option
  • current totals
  • final order state

That maps cleanly to a one-checkout-session-per-Durable-Object model.

Instead of pushing that state into global memory or a fake singleton map, each session has a durable coordination boundary.

For a commerce flow, that is the right tradeoff.
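In plain TypeScript, the per-session state can be sketched roughly like this. The field names follow the list above; the shape is an illustration, not the real Durable Object implementation:

```typescript
// Illustrative shape of one checkout session's durable state.
// In the real system, one Durable Object instance owns one of these.
interface CheckoutSessionState {
  sessionId: string;
  items: { id: string; quantity: number }[];
  buyer?: { first_name: string; last_name: string; email: string };
  fulfillment_address?: Record<string, string>;
  fulfillment_option?: string;
  totals?: { subtotal: number; tax: number; shipping: number; total: number };
  status:
    | "not_ready_for_payment"
    | "ready_for_payment"
    | "completed"
    | "canceled";
}

function createSessionState(
  sessionId: string,
  items: CheckoutSessionState["items"]
): CheckoutSessionState {
  // A new session has no buyer or fulfillment info yet, so it is not payable.
  return { sessionId, items, status: "not_ready_for_payment" };
}
```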

What the checkout session returns

Every checkout response includes the merchant truth the UI or AI agent needs:

  • line_items
  • fulfillment_options
  • totals
  • messages
  • links
  • status

The demo uses the ACP-style statuses:

  • not_ready_for_payment
  • ready_for_payment
  • completed
  • canceled

That makes the flow easy to render in both a browser and an AI client.
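One plausible way to derive the readiness status is to check whether buyer info, an address, and a fulfillment selection are all present. This is a sketch of the rule; the backend's actual condition may differ:

```typescript
// Hypothetical readiness rule: a session becomes ready_for_payment once
// buyer info, a fulfillment address, and a shipping option all exist.
type CheckoutStatus =
  | "not_ready_for_payment"
  | "ready_for_payment"
  | "completed"
  | "canceled";

function deriveStatus(session: {
  buyer?: unknown;
  fulfillment_address?: unknown;
  fulfillment_option?: unknown;
  completed?: boolean;
  canceled?: boolean;
}): CheckoutStatus {
  // Terminal states win over readiness checks.
  if (session.canceled) return "canceled";
  if (session.completed) return "completed";
  return session.buyer && session.fulfillment_address && session.fulfillment_option
    ? "ready_for_payment"
    : "not_ready_for_payment";
}
```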

Abuse prevention: per-IP API throttling

Because the backend is public, I added a lightweight per-IP throttle layer in the Worker so the demo is a little harder to abuse.

The implementation uses a second Durable Object as a rate-limit coordination point. Instead of one global limiter, the Worker routes each request to a deterministic Durable Object keyed by:

  • route bucket
  • client IP

That keeps the logic simple while still matching the coordination model that Durable Objects are good at.

The current limits are:

| Bucket | Scope | Limit |
| --- | --- | --- |
| Catalog reads | GET /products, GET /products/search, GET /products/:id, GET /checkout_sessions/:id | 60 requests / minute / IP |
| Checkout writes | POST /checkout_sessions* | 15 requests / minute / IP |
| MCP | ALL /mcp | 30 requests / minute / IP |
| Health | GET /health | no throttle |

When the limit is exceeded, the Worker returns:

  • HTTP 429
  • retry-after
  • x-rate-limit-limit
  • x-rate-limit-remaining
  • x-rate-limit-reset

That is not a substitute for a full production abuse-prevention stack, but it is a practical first layer for a public demo.
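A minimal in-memory sketch of the per-bucket, per-IP counter makes the mechanism concrete. The real limiter lives in a Durable Object keyed by route bucket and client IP; here a Map stands in, and the fixed-window math is my assumption about how the demo counts:

```typescript
// Fixed-window rate limiter sketch: one counter per (bucket, ip) key.
// In the demo this state lives in a Durable Object; a Map stands in here.
interface WindowState {
  windowStart: number;
  count: number;
}

const WINDOW_MS = 60_000; // one-minute windows, matching the published limits
const counters = new Map<string, WindowState>();

function checkLimit(bucket: string, ip: string, limit: number, now = Date.now()) {
  const key = `${bucket}:${ip}`;
  let state = counters.get(key);
  if (!state || now - state.windowStart >= WINDOW_MS) {
    // Start a fresh window for this key.
    state = { windowStart: now, count: 0 };
    counters.set(key, state);
  }
  state.count++;
  return {
    allowed: state.count <= limit,
    remaining: Math.max(0, limit - state.count), // -> x-rate-limit-remaining
    // Seconds until the window resets -> retry-after / x-rate-limit-reset.
    resetSeconds: Math.ceil((state.windowStart + WINDOW_MS - now) / 1000),
  };
}
```

When `allowed` comes back false, the Worker can build the HTTP 429 response and the rate-limit headers directly from `remaining` and `resetSeconds`.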


Product Catalog Design

I used a peptide catalog because it creates a more interesting domain than a generic T-shirt store.

The demo includes products like:

  • BPC-157
  • TB-500
  • GHK-Cu
  • Selank
  • Semax
  • Epithalon

Each product includes:

  • category
  • purity
  • dosage forms
  • price
  • stock status
  • keywords
  • a research_use_only flag

That last flag matters.

Since this is a peptide-pharmacy demo, the UI and the tool responses should be explicit about the compliance model. Every path in the system makes it clear that the storefront is research-use-only and that payment completion is demo-only.

That keeps the example realistic without pretending to be a production medical commerce system.
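As an illustration, a catalog record with those fields might look like this. The field names and the $129.00 unit price are assumptions for the sketch (the price is inferred from the $1,290.00-for-10-units order shown later), but the research_use_only flag is the part the demo treats as mandatory:

```typescript
// Illustrative catalog entry; field names and values are assumed for the
// sketch, except research_use_only, which every demo product carries.
interface Product {
  id: string;
  name: string;
  category: string;
  purity: string;
  dosage_forms: string[];
  price_cents: number;
  in_stock: boolean;
  keywords: string[];
  research_use_only: boolean;
}

const bpc157: Product = {
  id: "bpc-157",
  name: "BPC-157",
  category: "recovery",
  purity: ">99%", // assumed value for illustration
  dosage_forms: ["lyophilized vial"],
  price_cents: 12_900, // $129.00 -- inferred from the demo order totals
  in_stock: true,
  keywords: ["recovery", "tissue repair"],
  research_use_only: true,
};
```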


MCP Server: Same Backend, Tool-Native Interface

On top of the Worker API, I added a remote MCP server using the TypeScript SDK's streamable HTTP transport.

The server registers tools like this:

server.registerTool(
  "create_checkout",
  {
    title: "Create checkout",
    description:
      "Create a demo ACP checkout session from one or more peptide items.",
    inputSchema: toolSchemas.createCheckout,
  },
  handlers.createCheckout
)

server.registerTool(
  "complete_checkout",
  {
    title: "Complete checkout",
    description:
      "Complete the demo checkout using placeholder payment data compatible with ACP flows.",
    inputSchema: toolSchemas.completeCheckout,
  },
  handlers.completeCheckout
)

The registered tools are:

| Tool | Purpose |
| --- | --- |
| list_products | Browse the catalog |
| search_products | Search by name, keyword, or use case |
| get_product_details | Fetch a product's metadata |
| create_checkout | Start a checkout session |
| update_checkout | Add buyer info, address, or shipping selection |
| get_checkout_status | Read the latest state |
| complete_checkout | Complete the demo order |
| cancel_checkout | Cancel the session |

The nice part is that the handlers are thin wrappers over the same backend state:

  • create a checkout session stub
  • call the Durable Object
  • serialize the response
  • return it as MCP content plus structured content

So the MCP layer is not a toy mock. It is an alternate interface to the same commerce engine.
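A sketch of one such thin handler shows the shape. The helper names and the injected backend call are hypothetical; the returned object follows the MCP pattern of text `content` plus `structuredContent`:

```typescript
// Hypothetical thin MCP handler: fetch backend state, return it both as
// human-readable text and as structured content for the client.
type BackendCall = (sessionId: string) => Promise<unknown>;

function makeGetCheckoutStatusHandler(getSession: BackendCall) {
  return async ({ session_id }: { session_id: string }) => {
    // In the real Worker this resolves to a Durable Object call.
    const session = await getSession(session_id);
    return {
      content: [
        { type: "text" as const, text: JSON.stringify(session, null, 2) },
      ],
      structuredContent: session as Record<string, unknown>,
    };
  };
}
```

Because the handler owns no commerce logic of its own, any fix to the backend is immediately visible to both the storefront and the AI tools.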


Streamable HTTP on Cloudflare Workers

For transport, I used the MCP SDK's Web Standard Streamable HTTP Server Transport, which works cleanly on Cloudflare Workers:

export async function handleMcpRequest(
  request: Request,
  env: Env
): Promise<Response> {
  const transport = new WebStandardStreamableHTTPServerTransport({
    sessionIdGenerator: undefined,
    enableJsonResponse: true,
  })

  const server = createCommerceMcpServer(env, new URL(request.url).origin)
  await server.connect(transport)
  return transport.handleRequest(request)
}

I kept the transport stateless and let the actual commerce state live in Durable Objects.

That is a good separation:

  • MCP transport handles tool invocation
  • Durable Objects handle business state

It also means a remote MCP client can connect to the Worker endpoint directly:

{
  "mcpServers": {
    "acp-peptide-pharmacy": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://acp-peptide-pharmacy-backend.tech-sumit.workers.dev/mcp"
      ]
    }
  }
}

That exact pattern works for both Claude Desktop and Cursor.


Frontend: GitHub Pages Storefront

The frontend is intentionally static.

I wanted the split to be obvious:

  • GitHub Pages serves the storefront
  • Cloudflare Workers serves the commerce runtime

That forces the browser UI to behave like any other public client calling a remote commerce backend.

The UI itself has a biotech / clinical-luxury feel:

  • serif display type for a more premium editorial look
  • lab-glass cards and assay-style labels
  • category filtering on the home page
  • a product detail page
  • a cart + checkout page that drives the ACP flow

One practical deployment detail: GitHub Pages does not publish arbitrary folders like /frontend directly. The clean solution was to keep frontend/ as the source and deploy it through a Pages workflow rather than duplicating it into /docs.

That lets the repo stay organized without compromising the deployment path.


Usage Samples

1. Search the catalog over HTTP

curl "https://acp-peptide-pharmacy-backend.tech-sumit.workers.dev/products/search?q=bpc"

2. Create a checkout session

curl -X POST "https://acp-peptide-pharmacy-backend.tech-sumit.workers.dev/checkout_sessions" \
  -H "content-type: application/json" \
  -d '{
    "items": [
      { "id": "bpc-157", "quantity": 1 }
    ],
    "buyer": {
      "first_name": "Maya",
      "last_name": "Patel",
      "email": "maya@example.com"
    },
    "fulfillment_address": {
      "name": "Dr. Maya Patel",
      "line_one": "123 Lab Lane",
      "city": "Austin",
      "state": "TX",
      "postal_code": "78701",
      "country": "US"
    }
  }'

3. Connect from Claude Desktop or Cursor

Use the deployed MCP endpoint:

{
  "mcpServers": {
    "acp-peptide-pharmacy": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://acp-peptide-pharmacy-backend.tech-sumit.workers.dev/mcp"
      ]
    }
  }
}

4. Example natural-language prompt

You can give an MCP-capable client a prompt like:

Browse the peptide catalog, show me recovery-focused options, and create a checkout for one vial of BPC-157.

Then continue with:

Add my shipping details, pick the standard option, and complete the demo order.

Because the tool layer exposes the exact checkout lifecycle, the agent can move from discovery to transaction without screen scraping.

5. Real MCP order example: BPC-157 x 10

After wiring the live MCP endpoint into Cursor, I placed a concrete demo order for 10 units of BPC-157 through the MCP tools.

The flow was:

  1. search_products for BPC-157
  2. create_checkout with quantity: 10
  3. update_checkout with buyer + fulfillment address
  4. complete_checkout with demo payment data
  5. get_checkout_status to verify the final order state

The final result came back as:

| Field | Value |
| --- | --- |
| Product | BPC-157 |
| Quantity | 10 |
| Checkout status | completed |
| Order status | created |
| Shipping option | Standard Research Shipping |
| Subtotal | $1,290.00 |
| Estimated tax | $103.20 |
| Shipping | $12.00 |
| Total | $1,405.20 |
| Order ID | order_2b9bb098-1257-4547-9a45-7040eed3ef6d |

Order permalink:

https://acp-peptide-pharmacy-backend.tech-sumit.workers.dev/orders/order_2b9bb098-1257-4547-9a45-7040eed3ef6d

This is exactly the kind of detail I wanted from the demo: not just “the tools exist,” but a full, verifiable transaction with real totals and a returned order object.
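Those totals also check out arithmetically. Working in integer cents avoids float drift; the 8% tax rate is inferred from the figures, not something the backend states:

```typescript
// Recomputing the demo order's totals in cents to avoid float drift.
// The 8% tax rate and $129.00 unit price are inferred from the figures.
const unitPriceCents = 12_900; // $129.00 per vial
const quantity = 10;
const shippingCents = 1_200; // $12.00 Standard Research Shipping

const subtotalCents = unitPriceCents * quantity; // $1,290.00
const taxCents = Math.round(subtotalCents * 0.08); // $103.20
const totalCents = subtotalCents + taxCents + shippingCents;

console.log((totalCents / 100).toFixed(2)); // "1405.20"
```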


Verification

I verified the stack at three layers:

Local automated tests

  • vitest for ACP API behavior
  • vitest for MCP tool integration against the Worker transport

Live backend smoke tests

  • GET /health returned {"ok":true,...}
  • GET /products/search?q=bpc returned the BPC-157 record
  • live MCP connection returned all 8 tools and a valid search_products result

Live frontend deployment checks

  • GitHub Pages workflow deployed successfully
  • homepage, cart page, stylesheet, and app script all returned 200

I was not able to run a full live browser-click smoke test in this environment, so the browser-side checkout path still has some residual risk even though the deployed HTML/assets and the underlying backend flow are verified.


What I Would Improve Next

This is a demo, not a production commerce integration.

If I were pushing it further, I would add:

  1. a real merchant product feed
  2. auth and entitlement controls on the MCP layer
  3. webhook-driven order updates
  4. a delegated payment implementation instead of demo tokens
  5. a browser automation smoke suite for the GitHub Pages checkout UI

That said, as a reference architecture, it already shows the most important pattern:

ACP owns the commerce state, MCP owns the AI tool interface, and both can sit on top of the same backend.


Resources

Written by Sumit Agrawal

Software Engineer & Technical Writer specializing in full-stack development, cloud architecture, and AI integration.
