How AI is changing collaboration between docs and support
Overview
Technical writers and support teams have always worked closely together, but I think the rise of AI is fundamentally changing how we collaborate, close documentation gaps and get users the information they need, when they need it. With the increasing adoption of automation and large language models (LLMs) in technical writing, we’re seeing new workflows emerge - some promising, some still rough around the edges.
This post explores how AI is enhancing documentation workflows, details some of my own experiments, and touches on the evolving relationship between support and docs - and what this all means for technical writers.
From backlogs to bots: the state of docs and support
If you work in docs, you probably know the pain of a never-ending backlog of documentation tickets — issues specifically flagged for docs updates by support teams, engineers, product managers or even your users. But there’s another, often overlooked source of documentation work: the thousands of general support tickets that hint at documentation gaps but are often never formally flagged as “docs issues.”
Support teams can be overwhelmed with tickets, but many of them could be prevented with better documentation. Technical writers can also be overwhelmed with requests and struggle to prioritize which gaps to fill first. The traditional handoff between support and docs — manual flagging, Slack messages, clarification emails — can be slow and prone to missed opportunities.
The idea of using AI to bridge this gap is both tempting and terrifying. Could AI automatically identify which support tickets indicate documentation problems? Could it draft fixes and open pull requests? Could it help us finally get ahead of the backlog?
In practice, from what I’m seeing, results are mixed.
For example, when a developer on my team asked GitHub Copilot to fix some small documentation issues based on PR comments, it ended up creating more work for us, not less. As my colleague put it: “Damn AI creates more work than it saves.”
But there are success stories too. From what I’ve read on the Write the Docs Slack, some teams have used AI agents to batch-fix well-defined issues flagged by support — typos, minor wording changes or adding notes. A human reviews good candidate tickets, runs the prompt and the agent creates a PR. This works best for “quick wins,” but more complex changes still need human expertise. I’ve also had some success with a more manual approach, which I’ll describe later.
So how can AI help support and docs teams work together more effectively? Not just to fix typos faster, but to identify patterns, surface documentation gaps and close the feedback loop between what users are asking and what we’re writing?
The traditional workflow: manual handoffs
Historically, collaboration between support and docs teams has relied on manual processes. When a support agent spotted a documentation gap — perhaps a missing troubleshooting step or unclear instructions — they would flag the issue, often by adding a comment to a shared ticketing system, tagging the docs team in Slack or email, or manually creating a documentation ticket or issue.
The technical writer would then review the flagged issue, investigate and draft an update — usually after clarifying details with the support agent. This process, while thorough, can be slow and depends a lot on individual initiative and communication.
AI workflows: from ticket to pull request
Here’s a potential AI-powered workflow that I’ve seen some teams experimenting with:
- A user or support agent opens a ticket (e.g., in Jira or Zendesk).
- AI evaluates the ticket to determine if it affects documentation.
- If yes, the AI drafts a documentation update — sometimes even opening a pull request or issue on GitHub via an integration.
- A human reviews, edits and merges (or rejects) the change.
This approach is still experimental, but it shows promise for reducing the docs backlog and surfacing those documentation gaps.
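To make the four steps above concrete, here’s a minimal Python sketch of the pipeline. The classification and drafting functions are stubs: a real pipeline would call an LLM with the ticket text and your docs corpus, and the keyword list and function names here are my own illustrative assumptions, not a real product’s API.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Ticket:
    id: int
    subject: str
    body: str

# Heuristic stand-in for the LLM classification step (step 2).
DOC_HINTS = ("how do i", "where can i find", "unclear", "missing step")

def affects_docs(ticket: Ticket) -> bool:
    """Step 2: decide whether the ticket points at a documentation gap."""
    text = f"{ticket.subject} {ticket.body}".lower()
    return any(hint in text for hint in DOC_HINTS)

def draft_update(ticket: Ticket) -> str:
    """Step 3: draft a change (a stub summary here, not real prose)."""
    return f"Docs draft for ticket #{ticket.id}: address '{ticket.subject}'"

def triage(tickets: List[Ticket]) -> Tuple[List[str], List[Ticket]]:
    """Steps 2-3: drafts for docs-related tickets; the rest pass through.
    Step 4 (human review and merge) deliberately stays outside the code."""
    drafts, rest = [], []
    for t in tickets:
        if affects_docs(t):
            drafts.append(draft_update(t))
        else:
            rest.append(t)
    return drafts, rest
```

Note that the human review step isn’t automated away: the pipeline only produces drafts, never merges.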
AI for ticket triage
Support teams often deal with huge volumes of tickets — sometimes tens of thousands a year. Not all of these are actual bugs or issues with the system itself: many are queries about how the system works. Support teams that I work with estimate that nearly half of all tickets are just questions about existing functionality that perhaps stem from a lack of knowledge about the product.
An AI that has access to your docs can help here by automatically analyzing incoming tickets, clustering them by topic and identifying which ones are documentation-related. By surfacing frequent queries and recurring pain points, AI could:
- Help flag tickets that indicate a documentation gap or unclear instructions
- Group similar questions to reveal patterns and high-impact topics
- Suggest updates or new articles to address common queries
This approach not only helps support teams respond faster, but also enables technical writers to proactively improve documentation — plugging gaps before they become a backlog.
The following diagram illustrates a possible workflow:
[Support tickets]
|
v
[AI triage & clustering]
|
v
+-----------------------------+
| |
[Docs-related?] [Not docs-related]
| |
v v
[Suggest doc updates] [Route to support/engineering]
|
v
[Tech writer review & update]
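The “AI triage & clustering” box in the diagram can be approximated even without a model. Here’s a stdlib-only sketch that groups tickets by keyword overlap in their subjects — a crude stand-in for the embedding- or LLM-based clustering a real system would use; the stopword list and threshold are arbitrary assumptions.

```python
from typing import FrozenSet, List

STOPWORDS = {"how", "do", "i", "my", "a", "the", "to", "is"}

def topic_key(subject: str) -> FrozenSet[str]:
    """Crude topic signature: the subject's content words."""
    words = {w.strip("?.,!").lower() for w in subject.split()}
    return frozenset(words - STOPWORDS)

def cluster_by_overlap(subjects: List[str], threshold: float = 0.5) -> List[List[str]]:
    """Greedy clustering: a subject joins the first cluster whose seed
    shares enough content words (Jaccard overlap), else starts a new one."""
    clusters = []  # list of (seed_key, member_subjects)
    for s in subjects:
        key = topic_key(s)
        for seed, members in clusters:
            overlap = len(key & seed) / max(len(key | seed), 1)
            if overlap >= threshold:
                members.append(s)
                break
        else:
            clusters.append((key, [s]))
    return [members for _, members in clusters]
```

Clusters with many members are exactly the “high-impact topics” worth a docs update; singletons are more likely one-off support cases.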
Experimenting with AI-powered ticket analysis
I’ve been testing a simple workflow that attempts to bridge the gap between support tickets and documentation updates.
I manually copy information from a support ticket into my LLM (which has access to a local version of our documentation website). I then ask it to evaluate whether the issue could be solved — or at least partially addressed — by updating the documentation.
The LLM analyzes the ticket against our existing docs and typically:
- Identifies whether the information exists but is unclear or hard to find
- Suggests specific sections that need updating
- Proposes new content where there are gaps
- Flags cases where the issue isn’t documentation-related at all
Once I have this evaluation, I ask it to draft the necessary updates. I then review, edit and incorporate the changes into our docs.
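For readers who want to try this, here’s roughly how the evaluation request can be assembled. The wording is illustrative — it captures the four checks listed above, but it isn’t my exact prompt, and `build_eval_prompt` is a name I’ve made up for this sketch.

```python
def build_eval_prompt(ticket_text: str, docs_excerpt: str) -> str:
    """Assemble an evaluation prompt covering the four checks:
    unclear-but-present, sections to update, missing content, not-a-docs-issue."""
    questions = (
        "1. Does the information exist but is unclear or hard to find?",
        "2. Which specific sections need updating?",
        "3. What new content would fill the gap?",
        "4. Or is this not a documentation issue at all?",
    )
    return (
        "Evaluate this support ticket against the documentation below.\n\n"
        f"Ticket:\n{ticket_text}\n\n"
        f"Documentation excerpt:\n{docs_excerpt}\n\n"
        "Answer each question:\n" + "\n".join(questions)
    )
```

Keeping the docs excerpt in the prompt (rather than relying on the model’s training data) is what makes the evaluation grounded in *your* documentation.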
What I’ve learned:
- Context is everything. The LLM needs access to your documentation.
- It’s faster than manual research. Instead of searching through dozens of pages to find relevant sections, the LLM surfaces them instantly.
- The output still needs human review. The suggestions are often a good starting point, but they require editing for tone, accuracy and completeness.
- It works best for common patterns. Tickets about “how do I…” or “where can I find…” tend to produce useful results. Complex technical issues or edge cases still need deeper investigation.
My future plans include experimenting with automating this workflow further — perhaps by integrating with our ticketing system to pull tickets directly and using APIs to create PRs automatically.
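As a rough illustration of what that integration might look like: Zendesk exposes tickets at `GET /api/v2/tickets.json`, and GitHub creates pull requests via `POST /repos/{owner}/{repo}/pulls` with `title`, `head`, `base` and `body` fields. The sketch below only builds the URL and the request payload — authentication, error handling and the actual HTTP calls are left out.

```python
import json

def zendesk_tickets_url(subdomain: str) -> str:
    """Zendesk's ticket-listing endpoint (GET, token-authenticated)."""
    return f"https://{subdomain}.zendesk.com/api/v2/tickets.json"

def github_pr_payload(title: str, head: str, base: str, body: str) -> str:
    """JSON body for GitHub's create-pull-request endpoint
    (POST /repos/{owner}/{repo}/pulls)."""
    return json.dumps({"title": title, "head": head, "base": base, "body": body})
```

Even fully automated, the resulting PR would still land in a human review queue — the loop described earlier stays intact.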
How AI enables new models of collaboration
The rise of AI isn’t just changing our workflows — it’s reshaping how support, education and documentation teams work together.
Four modes of AI-augmented technical writing
In a recent post, Fabrizio Ferri Benedetti describes four “modes” in which AI can support documentation work:
- Watercooler: AI as a conversational assistant for clarifying issues and gathering context
- Whisperer: AI as an inline suggester for repetitive formatting and fixes
- Wordsmith: AI as a drafter or reviser for creating and improving content
- Assembler: AI as an automation engine for large-scale maintenance and validation
Each mode offers different opportunities for support and docs teams to collaborate more efficiently — whether it’s clarifying issues, automating repetitive fixes or validating documentation at scale. Crucially, these modes all require human expertise to guide, review and contextualize AI output, reinforcing the idea that AI is a powerful tool, not a replacement for skilled technical writers.
From silos to unified teams
At Write the Docs Berlin 2025 last year, many talks highlighted how these AI capabilities are enabling a deeper organizational shift: support, education and documentation teams are moving from working in silos to unified, outcome-focused missions.
As Stephan Delbos described, merging content and education teams allowed them to focus on user activation and measurable business outcomes like ticket deflection and feature adoption. Automation is a key enabler here — it allows teams to share workflows, data and ownership in ways that weren’t previously practical.
Dimple Poojary showed how linking documentation workflows directly to sprint boards fosters shared ownership and keeps docs up-to-date. Diana Breza’s team went further, implementing automated testing for documentation to catch issues before users do — reducing support load and improving quality.
Docs-as-code workflows, as I discussed in my own talk, empower support and product teams to contribute directly to documentation, speeding up the feedback loop and making documentation a truly collaborative effort.
The result? A more proactive, data-driven and collaborative approach to documentation — one where support and docs teams work together to deliver better outcomes for users, enabled by AI tools that operate in the four modes that Fabrizio describes.
A strategic shift: docs as support
A recent post by Kate Tungusova, “Docs as Support: How LLMs re-wrote my documentation strategy in 2025”, highlights another key shift: documentation is no longer just for humans. Increasingly, LLMs are the primary consumers of docs, retrieving context to solve user issues before a ticket is even filed.
This changes how we write:
- Length and detail are assets, not liabilities. LLMs thrive on exhaustive, precise documentation.
- Writers become “vibe coders” (workflow/tool builders) or “product experts” (deep solution engineers). (I talked a little bit about this in a previous post on the evolving role of technical writers.)
- Troubleshooting guides are now chronological, teaching fundamentals before error correction.
The goal? Solve the customer’s issue through docs before they ever need to file a ticket.
KCS and technical writing: complementary, not competitive
Another useful framework is Knowledge Centered Service (KCS), which distinguishes between:
- Technical documentation: Proactive, foundational content that defines how products work.
- KCS articles: Reactive, ticket-driven content that addresses specific failures and edge cases.
As Kevin Kuhl argues in his post (KCS and Technical Writing), these are “distinct magisteria” — separate but complementary domains. KCS generates valuable data about user pain points, which technical writers can use to improve proactive documentation. The best results come when technical writers act as strategic partners, using KCS data to drive evidence-based improvements.
The human in the loop
Despite the promise of AI, I think it’s clear that human expertise remains essential. AI is great for well-defined, repetitive fixes. However, complex, ambiguous or high-impact changes still need a human touch. Technical writers add value as coaches, architects and strategists — not just text editors (which was never the case anyway).
There’s no harm in experimenting, but be prepared for mixed results and always keep humans in the loop.
Conclusion: The future of docs and support
AI is changing the way docs and support teams collaborate, but it’s not a silver bullet. The most effective teams will likely blend automation with human judgment, using AI to handle the grunt work — ticket triage, pattern recognition, batch fixes — while technical writers focus on strategy, architecture and high-value content.
What’s the real win? Faster ticket deflection, proactive documentation updates and a tighter feedback loop between what users are asking and what we’re writing. When support and docs teams work together — enabled by AI but guided by human expertise — we don’t just close tickets faster. We can even prevent them from being opened in the first place.
The future is evidence-based, collaborative and a little bit experimental. As technical writers, our role is evolving from reactive documenters to strategic partners in the support ecosystem. And that’s where our value has never been clearer.
References: