I documented internal Search infrastructure at Google for about five years, from 2018 to early 2023. (From 2023 until my resignation in December 2024, I worked on content strategy and operations for AI Overviews. Similar skillset, but very different output.)

For a few of those years, my primary focus was Local Search, the system behind what happens when you search for “coffee shops near me” and get a map with pins and business listings. It’s one of those features that seems simple from the outside but is genuinely complex underneath, pulling from both Google Maps and Google Search infrastructure.

Documenting Local Search was hard. A single page could require three or four SME meetings—the classic technical writer ritual where you interview an engineer, take notes, draft something, and iterate until it’s accurate. Committing to a page, learning the system, writing it up, and getting approval took a minimum of one week and often way longer.

I’ve been at Fern for nine months now, and I’ve had maybe five or six SME meetings. Total! It’s hard to come up with a neat explanation like “and that’s all because of AI!” though AI certainly has a lot to do with it.

The documentation bottleneck at Google

Local Search straddled two large organizations: Geo (Maps) and Search. They coordinated to some extent but had largely separate infrastructure and workflows, at least at the time; the two started coordinating and merging in my last year or so, and I can only imagine that process continues today. There was no AI assistance back then (pre-2023), so documenting anything in Local Search meant those three or four SME interviews just to write a single page.

I could never produce something close to publishable on my own. For almost everything I wrote, I did a lot of initial legwork myself: digging through code, piecing things together from old design docs. I'd produce a draft that was maybe 60% right, then iterate with engineers asynchronously in a Google Doc or on a CL (Googlespeak for PR). That iteration was slow. Reviews sat for days because people were busy, and internal documentation was never a priority. (I could write many pages on why that was, but that's for another time!)

This was a legacy codebase where everything affected everything else, and the knowledge lived in people’s heads. You couldn’t just read the code and understand the system—you needed context that wasn’t written down anywhere. The SME meetings weren’t optional overhead; they were the work. And at the time, I actually thought this was all pretty efficient.

What changed

I left Google in 2024 and didn’t really start using AI tools until 2025, when I was doing some freelance work and rebuilding my personal website. I started at Fern in June 2025 and walked into a massive backlog of documentation issues.

On the surface, Fern is obviously extremely different from Google, and as a result my writing process looks entirely different now.

The first difference is organizational. Fern is small—around 25 people, pretty flat structure. (My tech writing team at Google was 25 people!) Everyone’s on Slack, online often, and there’s a culture of saying almost everything in public channels. It’s easy to start a thread about a new feature, ask for clarification, or request a review. Or search Slack! Searching Slack, including customer support threads, is a legitimate goldmine of information.

Documentation also matters here in a direct way: Fern's product is external, and good docs are how customers actually implement things. It's easier for a sales engineer to send someone a docs page than to explain the same thing over chat for the tenth time in a week. People get that, and they don't have time to waste on repetitive handholding. Plus, Fern's product IS documentation sites, so its audience cares about good documentation perhaps more than a typical dev-tools audience does.

At Google, even with documentation, a lot of customer handholding was still necessary. Internal teams serving internal customers meant money wasn't directly on the line during those interactions. The customer didn't have other options: there was one way to implement X in Google Search, and one team to do it. You couldn't go to a competitor. That dynamic made some people complacent. (Not everyone. I worked with a lot of amazing people at Google. I learned to be a technical writer there and have a lot of respect for most of the teams I worked with.)

Another big thing, of course, is AI. I now use Claude extensively (Claude Code or in the browser, depending on the task) to draft content and explain technical concepts. Fern started using Devin AI internally for coding tasks, and I started using it for documentation drafts. Its writing can be verbose, but the codebase integration is a game-changer: I can ask "can you explain how this option works?" and get an answer within minutes, with a drafted PR if I want one. Fern recently launched Fern Writer, which is essentially Devin with a wrapper that understands Fern-specific components, and I use it extensively. I can throw it a code PR that I have no context on, and it produces a decent draft PR in the documentation repository that I can iterate on.

I don't often merge these PRs without changes. I'll iterate over Slack ("be less verbose," "fix those broken links," "describe how this option relates to that other option on a different page") and often push a commit or two myself. The flexibility is the point.

Teasing apart the variables

It’s tempting to say “AI killed the SME meeting,” but that’s not quite right. The organizational stuff matters too. Fern’s size, flat structure, Slack-native culture, external product where docs directly affect revenue—all of that creates an environment where information flows more freely and people are motivated to help documentation happen.

But AI is the multiplier. The “can you explain how this option works?” → instant answer → drafted PR pipeline is genuinely new. At Google, I couldn’t spin up a test environment easily. I couldn’t get a codebase-aware explanation in minutes. The information existed, but extracting it required scheduling meetings, waiting for async responses, synthesizing across multiple partial explanations.

Now I can do a lot of that extraction myself or have AI do it for me, so the documentation bottleneck has absolutely shifted.

SME meetings are dead, long live the SME meeting

I don't think SME meetings are dead. For genuinely complex, cross-cutting features, the kind of thing Local Search was full of, you probably still need humans in a room (or on a call) working through the edges together. AI can explain what code does, but it can't always explain why the code is that way, what tradeoffs were considered, or what's likely to change next quarter. For an internal audience especially, that kind of context is often exactly what readers need.

But the default has shifted. The question used to be “who do I need to talk to?” Now it’s “can I figure this out myself first?” And increasingly, the answer is yes.

I’m curious how this plays out across the industry, whether other technical writers are experiencing something similar, and what it means for the craft longer term. For now, I’m just enjoying the velocity!