How I learned to stop worrying and love the linter.

Or, how building automation into your documentation can save you time, headaches, and the seemingly inevitable back-and-forth on grammar nits.

By early summer 2025, our documentation at Galileo had entered its “please don’t look too closely” phase. The content was good — great, even — but getting a doc shipped felt like reliving the same PR comment thread over and over again:

“Can you fix this header spacing?”
“We don’t use Oxford commas.”
“Wait, why don’t we use Oxford commas?”

Each review cycle was a tug-of-war between technical accuracy and style consistency. The irony wasn’t lost on us — we were building AI evaluation tools that could spot model hallucinations in seconds, but our docs still relied on vibes and Google Docs comments to maintain quality.

As the Senior Developer Experience Engineer, my job was to make life easier for developers — both the ones using our SDKs and the ones writing about them. So I decided to treat our documentation like any other production system: design it, measure it, and automate the boring parts.

Step 1: Docs as code

The first step was admitting what every good DevRel knows deep down — documentation is code. It just happens to compile in people’s brains instead of their terminals.

I started with a practical, example-driven style guide — short, readable, and full of copy-paste-able patterns. No philosophical debates about “voice and tone.” Just:

“Here’s how we write a how-to guide. Here’s why it works. Now go do that.”

Then we wired that style guide into the developer workflow itself.

Step 2: Let the robots yell first

Enter markdownlint-cli2 and Mintlify’s dev check. Together, they became the gentle-but-firm robot editors that never sleep and never forget a missing newline.

We wired these tools into our GitHub Actions pipeline — so every pull request ran a quick linter check and a Mintlify build test. If something broke, the bot told you immediately (and without judgment).

The markdown linter: markdownlint-cli2

This little command line gem became the backbone of our doc hygiene. I configured it with a custom rule set based on our internal style guide — things like:

  • Consistent heading hierarchy (#, ##, ### in order, no skipping levels)
  • No trailing whitespace or double spaces (Catholic school in the early 2000s left me with a double-spacing habit 😅)
  • Code blocks fenced properly
  • Brand and product name spellings enforced
  • List indentation rules enforced
  • Line length limits… but with mercy (because we’re not monsters)

Each rule had a reason, and each reason was documented.
I tuned the config to flag actual issues, not false positives. Developers shouldn’t have to debate with a linter about whether URLs count toward line length.
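
To make that concrete, here’s the general shape of such a config. It’s a simplified sketch rather than our exact file (the rule picks, the line length, and the product-name list are illustrative), but markdownlint-cli2 reads a .markdownlint-cli2.yaml at the repo root and applies it the same way locally and in CI:

# .markdownlint-cli2.yaml (illustrative rule set)
config:
  MD001: true             # heading levels increase one step at a time, no skipping
  MD009: true             # no trailing whitespace
  MD007:
    indent: 2             # consistent list indentation
  MD040: true             # every fenced code block declares a language
  MD044:
    names: ["Galileo", "GitHub", "Mintlify"]   # brand and product spellings
    code_blocks: false
  MD013:
    line_length: 120      # generous limit (the "mercy" part)
    code_blocks: false    # long code lines are fine
    tables: false         # so are wide tables
    # strict and stern stay at their defaults, so an unbreakable long URL doesn't trip the rule
globs:
  - "**/*.md"
ignores:
  - "node_modules"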

Here’s a simplified snippet of what runs in CI:

name: markdown-lint
on:
  pull_request:
    paths: ["**/*.md", "**/*.mdx"]
  push:
    branches: [main]
    paths: ["**/*.md", "**/*.mdx"]
jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: "lts/*"
      - name: Run markdownlint-cli2
        run: npx -y markdownlint-cli2 "**/*.md" "#node_modules"

Nothing fancy. Just predictable, fast, and consistent.
If a contributor ran the same command locally, they got the exact same results as CI. No “works on my machine.” No “I swear it built fine yesterday.”

We pinned actions by major version (for stability) but left minor versions flexible, so updates and security fixes flowed in automatically.

The Mintlify dev check: our canary in the coal mine

Linting catches the little stuff, but Mintlify’s --check flag catches the big ones:
broken configuration, MDX syntax errors, bad frontmatter, and OpenAPI spec mismatches that could crash the entire docs build.

We wired it into GitHub Actions right after linting:

mintlify-check:
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v4
    - uses: actions/setup-node@v4
      with:
        node-version: "lts/*"
    - name: Mintlify dev check
      run: npx -y mintlify dev --check

If a build failed, it didn’t just block the merge — it uploaded debug logs as artifacts. That meant contributors could download the failure details right from the Actions tab, see exactly what broke, and fix it without pinging a human reviewer.
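
The wiring for that is nothing exotic: capture the check’s output to a file, then upload it when the step fails. A sketch, swapping the dev check step above for a version that tees its output (the log filename is arbitrary):

    - name: Mintlify dev check
      shell: bash
      run: |
        # pipefail keeps a failing check failing the step, even though tee succeeds
        set -o pipefail
        npx -y mintlify dev --check 2>&1 | tee mintlify-debug.log
    - name: Upload debug log on failure
      if: failure()
      uses: actions/upload-artifact@v4
      with:
        name: mintlify-debug
        path: mintlify-debug.log
        if-no-files-found: ignore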

The Mintlify check became our early warning system — like a polite robot gently saying,

“Hey, before you hit deploy… maybe don’t break the entire docs site.”

Making this a local-first workflow

I wanted docs contribution to feel developer-native.
So the same commands that ran in CI were available locally, one-liner style:

npx -y markdownlint-cli2 "**/*.md" "#node_modules"
npx -y mintlify dev --check

That meant contributors could lint, validate, and preview the docs without touching the pipeline.
No guessing, no waiting for CI to yell at you from the cloud.
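
If you want the habit to be automatic rather than something you remember to do, a git pre-commit hook is one optional add-on. A minimal sketch, not something the pipeline requires:

#!/usr/bin/env bash
# .git/hooks/pre-commit  (make it executable: chmod +x .git/hooks/pre-commit)
# Runs the same checks as CI; the commit is blocked if either one fails.
set -euo pipefail

npx -y markdownlint-cli2 "**/*.md" "#node_modules"
npx -y mintlify dev --check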

When contributors fixed issues locally, they learned what the bots were checking for — and how to avoid those mistakes next time. It turned automation into education.

The tools under the hood

The stack was intentionally boring — in the best way possible.
Each tool solved one specific pain point and played nicely with the others.

  • markdownlint-cli2: Handles consistency, structure, and formatting. The guardrails.
  • act: Allows you to run GitHub Actions workflows locally — a real game changer when building new tests! (Thanks to Jim Bennett for showing me this handy tool!) There’s a quick example after this list.
  • Mintlify: Powers the docs site, and mintlify dev --check ensures nothing explodes before merge.
  • GitHub Actions: Glues everything together. Simple YAML workflows run on every PR, catching errors before humans waste time.
  • Node.js (LTS): The quiet workhorse keeping everything cross-platform.
  • Prettier (optional layer): For contributors who want to format before pushing, because some of us like our Markdown aligned just so.
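
For the act and Prettier pieces, day-to-day usage looks roughly like this (the job name below assumes the workflow shown earlier; adjust to your repo):

# Replay the PR-triggered workflows locally (requires Docker)
act pull_request

# Or target just the markdown lint job defined above
act pull_request -j lint

# Optional: let Prettier tidy Markdown before pushing
npx -y prettier --write "**/*.md"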

Every piece serves a single principle: make docs feel like part of the product, not an afterthought.

Suddenly, the review process shifted from “Can you fix your bullet formatting?” to “Let’s make this example more accurate.”

Automation didn’t make the process colder — it made it kinder. The machines handled the nitpicks, freeing humans to focus on clarity and creativity.

Step 3: Make contributing feel effortless

I built contributor templates and local validation commands that mirrored CI.
That meant any developer — not just the docs team — could run one command and know their doc would pass review:

npx -y markdownlint-cli2 "**/*.md" "#node_modules"
npx -y mintlify dev --check
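
The template itself doesn’t need to be fancy. Something along these lines, living in .github/PULL_REQUEST_TEMPLATE.md (wording illustrative), keeps every docs PR honest:

<!-- Docs checklist -->
- [ ] npx -y markdownlint-cli2 "**/*.md" "#node_modules" passes locally
- [ ] npx -y mintlify dev --check passes locally
- [ ] New pages follow the example-driven patterns from the style guide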

The result? Fewer Slack threads, fewer broken builds, and way more people actually writing documentation.

By the end of the quarter, we’d cut review times by almost half.

Step 4: Build systems that teach

As the project matured, I realized something: automation isn’t just about catching errors — it’s about teaching at scale.

Each rule, lint, and validation check became a feedback loop that made contributors better writers. They weren’t just fixing errors — they were learning our standards, fast.

We stopped chasing perfection and started building reliability.

Within a few weeks, the difference was obvious — not just in the docs, but in how people felt about contributing to them.

  • Reviews got faster. What used to take two or three rounds of back-and-forth now wrapped up cleanly in one. Reviewers focused on clarity instead of commas.
  • More people contributed. Once contributors could self-check their work, the fear of “doing it wrong” disappeared. We doubled our active contributors almost overnight.
  • Tone and structure finally matched. Every page started sounding like it belonged to the same product and team — consistent, readable, and human.
  • Automation quietly did its job. CI/CD checks ran smoothly; errors surfaced early; no one had to debug broken Markdown in production again.
  • And the vibe shifted. Review threads turned from “formatting police reports” into real conversations about improving learning flow and developer comprehension.

The docs stopped being a bottleneck. They became a bragging point — something we could show off with pride at internal demos and onboarding sessions.

New hires actually started using the docs as templates for how to structure their first PRs.
And best of all? Nobody was arguing about commas anymore.

Lessons learned

  • Automation isn’t the enemy — bad automation is. If your bots yell too much, tune them.
  • Good docs are invisible. You only notice them when they break.
  • Style guides should be opinionated, not oppressive. Rules are there to empower, not punish.
  • Developer experience starts with empathy. The easier it is to contribute, the more people will.

It’s not about the code — it’s about communication.

This project taught me that developer experience is never just about code — it’s about communication. Automation is how we scale that communication without losing our humanity.

At Galileo, documentation became more than a resource — it became part of our developer infrastructure.

And yes, automation can still be a pain in the ass.

But once it’s tuned, it’s the best coworker you’ll ever have: fast, consistent, and quietly saving you from your own typos. 😉