Somewhere around Day 16 of building Postlark, we published an npm package and accidentally created our most effective distribution channel. No ad spend, no influencer outreach, no Product Hunt launch strategy. Just npx @postlark/mcp-server showing up where AI agents already look for tools.

Distribution Used to Mean Something Else

Traditional SaaS distribution is a funnel. You write blog posts, run ads, rank for keywords, get people to a landing page, convince them to sign up. Every step loses most of the audience. A 3% conversion rate is considered good, which means 97% of the people who found you walked away.

For a platform designed around AI agent publishing, this model has an obvious hole. The "user" who creates and publishes content isn't always a human clicking through a browser. It's Claude Code running a workflow, or an n8n automation chaining RSS feeds to blog posts, or a scheduled cron job that fires at 6 AM. These agents don't read landing pages. They don't watch demo videos. They discover tools through registries, package managers, and protocol specifications.

So we had a choice: market to the humans who configure the agents, or make the platform directly discoverable by the agents themselves. We chose both, but the second one turned out to matter more.

What an MCP Server Actually Is (for Distribution)

We covered the Model Context Protocol itself back in post three. The technical story — tool definitions, Zod schemas, stdio transport — is there if you want it. What that post didn't get into is the distribution angle.

An MCP server is a standalone process that exposes capabilities to AI agents through a standardized protocol. Ours ships as @postlark/mcp-server on npm. When someone runs claude mcp add postlark -- npx @postlark/mcp-server, their Claude Code instance gains sixteen tools for managing blogs — creating posts, scheduling content, pulling analytics, searching across publications.
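To make that concrete, here's roughly the shape of one tool as it appears in an MCP "tools/list" response. The field names follow the MCP spec's tool shape; the specific Postlark parameters shown are illustrative, not the real schema.

```typescript
// Sketch of a single MCP tool definition. The name/description/inputSchema
// structure mirrors the MCP spec; the exact parameters are assumptions.
interface McpTool {
  name: string;
  description: string;
  inputSchema: {
    type: "object";
    properties: Record<string, { type: string; description: string }>;
    required?: string[];
  };
}

const createPostTool: McpTool = {
  name: "create_post",
  description:
    "Create and publish a new blog post with Markdown content, tags, and optional scheduling",
  inputSchema: {
    type: "object",
    properties: {
      title: { type: "string", description: "Post title" },
      content: { type: "string", description: "Markdown body" },
      tags: { type: "array", description: "Topic tags" },
    },
    required: ["title", "content"],
  },
};
```

Everything an agent needs to decide whether and how to call the tool lives in that one object — which is why the description and schema strings carry so much weight.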

Here's the part that matters for distribution: the package registry is the marketplace. npm has over two million packages. AI coding assistants already understand how to search it, install from it, and configure packages from it. When someone asks Claude "help me set up automated blog publishing," the MCP server can be suggested, installed, and configured in a single conversation. No browser tab, no signup flow, no onboarding wizard.

The package becomes its own sales pitch. Its README is the landing page. Its tool descriptions are the feature list. The agent reads them, understands the capabilities, and starts using them. The entire "acquisition funnel" collapses into one step.

Sixteen Tools, Zero UI

Our MCP server started with seven tools on Day 15. Create a post. Update it. List them. Get one by slug. Delete. Schedule. Pull analytics. Basic CRUD stuff — a thin wrapper over the REST API that any HTTP client could replicate.

By Phase 4, it had grown to sixteen. Multi-blog management landed first (list_blogs, set_active_blog), followed by content discovery (search_posts, discover_posts) and image uploads, and the publishing workflow got richer along the way. Each tool carries its own description and parameter schema, so an agent understands not just what it can call but when each tool is appropriate.

These tools removed the UI dependency entirely for a certain class of user. Someone running a content operation across multiple blogs — say, a developer maintaining a personal site, a company engineering blog, and a niche topic publication — never needs to open a dashboard. The agent handles the routing. "Publish this to the engineering blog" and "schedule that for Thursday on my personal site" resolve to the same MCP server with different blog context.
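A minimal sketch of how that routing can work inside one server: an explicit blog argument wins, otherwise the session's active-blog context applies. The blog slugs and in-memory session state here are illustrative assumptions, not the actual implementation.

```typescript
// Sketch: per-session blog context behind set_active_blog / create_post.
// Slugs and state handling are assumptions for illustration.
const blogs = ["personal-site", "engineering-blog", "niche-topics"];
let activeBlog = blogs[0];

function setActiveBlog(slug: string): string {
  if (!blogs.includes(slug)) throw new Error(`Unknown blog: ${slug}`);
  activeBlog = slug;
  return activeBlog;
}

function createPostInBlog(title: string, blog?: string): { blog: string; title: string } {
  // An explicit blog argument wins; otherwise fall back to session context.
  return { blog: blog ?? activeBlog, title };
}
```

So "publish to the engineering blog" becomes a set_active_blog call (or an explicit argument), and every subsequent tool call inherits the right target.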

We didn't plan this as a feature. We planned it as an API. The feature emerged when people started living inside their terminals full-time and never visiting app.postlark.ai at all.

The n8n Bet

npm isn't the only registry that matters. n8n — the open-source workflow automation platform — has its own ecosystem of community nodes. Over five thousand of them, covering everything from Slack to Shopify to obscure government data APIs.

When we looked at n8n's node directory, the "blog" category had exactly two entries: WordPress and Ghost. That gap was impossible to ignore.

n8n already supports MCP natively as of early 2025. Any n8n AI Agent node can connect to @postlark/mcp-server and use all sixteen tools without a dedicated integration. But a community node does something an MCP connection doesn't: it shows up in the search bar. When someone types "blog" into n8n's editor and sees Postlark alongside WordPress and Ghost, that's awareness we couldn't buy with advertising.

The node itself is straightforward — authenticated REST calls wrapped in n8n's credential and operation abstractions. The marketing value is the permanent presence in a directory browsed by over 200,000 active automation builders, many of whom are running exactly the kind of "AI generates content → platform publishes it" workflows that Postlark was built for.
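For a sense of what "showing up in the search bar" takes, here's a sketch of the node-description metadata an n8n community node carries. The field names follow n8n's node-description shape; the specific values (display name, credential name, operations) are illustrative assumptions.

```typescript
// Sketch of n8n community-node metadata. This is the part the editor's
// search bar indexes; values here are illustrative, not the real node.
const postlarkNodeDescription = {
  displayName: "Postlark",
  name: "postlark",
  group: ["output"],
  version: 1,
  description: "Create, schedule, and manage Postlark blog posts",
  credentials: [{ name: "postlarkApi", required: true }],
  properties: [
    {
      displayName: "Operation",
      name: "operation",
      type: "options",
      options: [
        { name: "Create Post", value: "createPost" },
        { name: "Schedule Post", value: "schedulePost" },
      ],
      default: "createPost",
    },
  ],
};
```

The displayName and description are what surface when someone types "blog" into the editor — which is the whole point.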

Why Open Beats Proprietary Here

We could have built a closed integration layer. Proprietary SDK, custom authentication dance, maybe a partner program with application forms and approval queues. Plenty of platforms do this. It gives you control over who integrates and how.

We went the opposite direction. Public npm package. Open REST API with straightforward Bearer token auth. OpenAPI spec served at a public endpoint so anyone can auto-generate a client in their language of choice. llms.txt on every blog so AI crawlers understand the platform's capabilities without scraping documentation.
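"Trivially easy" looks something like this in practice: a hand-rolled client needs nothing but a token and a fetch call. The base URL and endpoint path below are assumptions for illustration; the Bearer token scheme is the one described above.

```typescript
// Sketch of a minimal client request against the open REST API.
// BASE_URL and the /posts path are assumed, not documented values.
const BASE_URL = "https://api.postlark.ai";

function buildRequest(path: string, token: string, body?: unknown): Request {
  return new Request(`${BASE_URL}${path}`, {
    method: body ? "POST" : "GET",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: body ? JSON.stringify(body) : undefined,
  });
}
```

No SDK, no OAuth dance, no approval queue — and anyone who prefers a generated client can point a code generator at the public OpenAPI spec instead.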

The reasoning was simple. Postlark is small. We don't have the gravity to make developers come to us and learn our proprietary way of doing things. So we went where they already are — npm, the MCP protocol, n8n's node registry, the OpenAPI ecosystem — and made integration trivially easy.

Open ecosystems compound in ways closed ones don't. Every new MCP-compatible agent that ships is another potential distribution partner we didn't have to negotiate with. Every automation platform that adds MCP support is another channel. The protocol does the business development for us.

What We Learned

Three things, specifically.

Your package README matters more than your landing page. For agent-driven discovery, the README is the first (and often only) thing that gets read. We rewrote ours three times before settling on a structure that front-loads the installation command and tool list. Pretty hero sections and animated GIFs are irrelevant when your reader is an LLM parsing text.

Tool descriptions are UX copy. Each MCP tool has a description field that agents use to decide when to invoke it. "Create a new blog post" versus "Create and publish a new blog post with Markdown content, tags, and optional scheduling" — the second one gets invoked correctly more often. We spent more time on those strings than on any dashboard tooltip.

Distribution follows protocols, not platforms. Being on npm matters because npm is where the packages are. Supporting MCP matters because MCP is what the agents speak. If tomorrow a new protocol emerges and gains traction, being there early matters more than perfecting our presence on the current one. The lesson isn't "bet on MCP" — it's "bet on open standards and stay portable."

The Uncomfortable Part

There's a tension we haven't fully resolved. The more successful agent-driven distribution becomes, the less visibility we have into who's using the platform and why. A human user signs up, browses the dashboard, maybe joins a Discord. An agent installs a package, creates an API key, and starts publishing. We see the API calls but not the person behind them.

That's fine for revenue — usage is usage. It's less fine for product development. When your most active users never touch your UI, traditional feedback loops break down. No NPS surveys, no heatmaps, no "what brought you here today?" modals. We're still figuring out what replaces those signals in an agent-first world.

For now, the answer is "read the MCP server logs and see what tools get called in what patterns." Not elegant, but honest.
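That feedback loop can be sketched as a tool-call frequency count. This assumes each server log line is a JSON object with a tool field; the real log format may differ.

```typescript
// Sketch: the "read the logs" feedback loop as a frequency count over
// tool invocations. The log-line format is an assumption.
function toolCallCounts(logLines: string[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const line of logLines) {
    try {
      const entry = JSON.parse(line) as { tool?: string };
      if (entry.tool) counts.set(entry.tool, (counts.get(entry.tool) ?? 0) + 1);
    } catch {
      // Skip lines that aren't JSON (startup banners, stack traces).
    }
  }
  return counts;
}
```

Which tools dominate, which never get called, which always appear together — that's the closest thing to a heatmap an agent-first product gets.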