When NOT to Use AI for Presentations: 7 Scenarios to Build Manually Instead
2Slides Team
12 min read


AI presentation tools have real limits. Yes, we build one of those tools, and we will still call out seven specific scenarios where AI is the wrong choice in 2026: high-stakes investor pitches with live-negotiated numbers, board-level decks where every line is politically contested, keynote speeches where one custom visual defines the talk, regulated-content presentations requiring compliance review, decks presenting proprietary or confidential client data in consumer AI tools, training materials teaching skills better shown than summarized, and crisis-communication decks. For these seven, start from a blank slide or a human-designer draft. For most other decks — quarterly updates, internal reviews, sales collateral, course material, standard client reports — AI gets you 90% of the way there in 60 seconds. But knowing the 10% where AI shouldn't be used is what distinguishes a professional communicator from a prompt-pusher. Use the right tool for the job, and use judgment about which job it is.


The 7 Scenarios Where AI Is the Wrong Tool

Below are the seven cases where we recommend you close the AI tab and open a blank slide. All seven share a common trait: the cost of a subtle error is much higher than the cost of extra hours.

1. High-stakes investor pitches with live-negotiated numbers

A Series A or later investor deck is not a content problem. It is a negotiation artifact. Every valuation input, every forward revenue line, every dilution scenario is a position you will have to defend in a room with people who have read thousands of decks.

AI is great at generating a draft of a pitch structure. It is bad at knowing that your $4.2M ARR number was agreed with your CFO at 11:47pm last night after a long discussion about how to treat one customer whose revenue was reversed. It does not know that the 73% gross margin figure is a number your board chair pushed back on three times. These are human memories, and they live in the deck down to the pixel.

Better choice: Build the 8–12 "number slides" manually in PowerPoint or Google Slides, with the CFO at the keyboard. Use AI only to draft the market-sizing, competitor landscape, and team slides, then review every word.

2. Board-level decks where every line is politically contested

Board decks are a different genre from pitch decks. The audience already owns part of the company. The stakes are not "do they invest?" but "do they trust management?" Each bullet is weighed against prior quarters' bullets. A verb tense change ("we are scaling" vs. "we have scaled") can derail 40 minutes of a board meeting.

AI models do not understand the political history of your company — which metric your lead director hates seeing reported a certain way, which phrase your chair flagged last quarter. They produce plausible but politically tone-deaf copy.

Better choice: Start from last quarter's approved board deck as your template. Update in place. Let AI help only with appendix content (customer quotes, case studies) that doesn't carry political weight.

3. Keynote talks where one signature visual defines the speech

A keynote is a performance, not a document. The best keynotes have one or two signature visuals that people will screenshot and share for months afterward. Think of Hans Rosling's bubble charts, or the original iPhone reveal slide with nothing but the product silhouette.

These visuals are concepts a speaker develops over weeks of iteration. AI tools produce competent but generic imagery. They cannot invent the metaphor that makes your talk memorable, because they do not know what idea you are trying to anchor.

Better choice: Work with a designer or a strong in-house visual editor on the 3–5 signature slides. Use AI to speed up the other 25 supporting slides, then match their visual style to your hero moments.

4. Regulated-content presentations (pharma, finance, legal)

If your deck will be read by the FDA, an SEC examiner, a compliance officer, or opposing counsel, AI-generated copy is a liability. Most consumer AI tools will "helpfully" paraphrase claims in ways that can accidentally violate disclosure rules — for example, softening a required risk warning, or making a performance claim that triggers a different review pathway.

It is also extremely hard to prove, in a regulated audit, which model produced which sentence on which date. That chain of custody matters.

Better choice: Draft regulated content manually in an approved template, have it reviewed by compliance or counsel, and keep AI out of the critical slides. AI can still help with internal training content that stays inside the firewall.

5. Confidential or proprietary client data decks (in consumer AI tools)

This is less about AI being the wrong tool and more about AI being the wrong deployment. If you paste a client's confidential customer list, unreleased financials, or a prospective M&A target into a free consumer chatbot, you may be violating your NDA — regardless of whether the tool trains on your data.

For a deeper dive on this, see our companion piece on whether AI presentations are safe for confidential data.

Better choice: Use enterprise AI tools with signed DPAs and no-training guarantees, or keep the confidential slides entirely manual and let AI handle only the non-sensitive framing sections.

6. Training decks that teach skills better shown than summarized

AI is very good at summarizing a skill. It is bad at teaching one. A coding workshop, a machine-assembly training, a clinical-procedure deck — these require the creator to have done the thing recently, at scale, and to know which 3 out of 47 steps are the ones trainees actually get wrong.

An AI-generated training deck will list all 47 steps with equal weight. A human trainer knows to spend 15 minutes on step 12 and 10 seconds on step 3. That judgment shows in slide pacing, emphasis, and live demos.

Better choice: Build training decks from your own recent notes and incident logs. Use AI only for the glossary, pre-reads, and review quizzes — not the teaching sequence itself.

7. Crisis-communication decks

An outage postmortem. A product-recall briefing. A workforce-reduction all-hands. A public apology to customers. These presentations carry legal, reputational, and human weight. Every sentence is scrutinized by employees, press, regulators, and sometimes courts.

AI tools tend to produce language that is reasonable but hollow — corporate-adjacent phrasing that can read as dismissive in a moment requiring specificity. They may also hallucinate facts about timelines or causes, and hallucinated facts in a crisis deck are career-ending.

Better choice: Have the accountable executive write the first draft longhand. Route it through legal, HR, and comms. AI has no role in this workflow.


What to Do Instead: A Scenario-by-Scenario Playbook

| Scenario | Recommended primary tool | AI's role (if any) | Time budget |
| --- | --- | --- | --- |
| Investor pitch (number slides) | PowerPoint or Google Slides, manual | Draft market/team slides only | 20–40 hours |
| Board deck | Prior quarter's approved template | Appendix copy only | 8–16 hours |
| Keynote with signature visuals | Human designer + Figma or Keynote | Supporting slides only | 30–80 hours |
| Regulated pharma/finance/legal | Approved compliance template | None in critical slides | 15–50 hours |
| Confidential client data | Enterprise-licensed tool with DPA | Only with no-training guarantee | 5–20 hours |
| Skills-training deck | Creator-written from incident logs | Glossary and quizzes only | 10–30 hours |
| Crisis communication | Executive longhand draft | None | 4–24 hours (fast) |

Three patterns emerge from this table. First, AI never disappears entirely — but its scope shrinks to safe peripheries. Second, the time budget balloons for these decks because the cost of a defect is high. Third, the tool changes less than the workflow does: you are optimizing for review gates, not speed.

How do I know if my deck falls into one of these seven categories?

Ask three questions. Does a single wrong sentence cost money, trust, or legal standing? Will a named human (regulator, board member, journalist) read this slide-by-slide? Is there a specific precedent I have to match? If you answer "yes" to any of them, treat it as a manual-first deck.
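If your team triages deck requests programmatically, the three-question rule above can be sketched as a tiny helper. This is purely illustrative: the function and parameter names are our invention, not part of any 2Slides API.

```python
# A minimal sketch of the three-question triage described above.
# The function and parameter names are illustrative only.

def manual_first(wrong_sentence_is_costly: bool,
                 named_human_reads_it: bool,
                 precedent_to_match: bool) -> bool:
    """Return True if the deck should be treated as manual-first."""
    # A single "yes" is enough to pull the deck out of the AI-first lane.
    return (wrong_sentence_is_costly
            or named_human_reads_it
            or precedent_to_match)

# Routine internal review: all three answers are "no".
print(manual_first(False, False, False))  # False -> AI draft is fine
# Board deck: a named director reads it line by line.
print(manual_first(False, True, False))   # True  -> manual-first
```

The design choice worth noting is the `or`: the rule is deliberately conservative, so any single "yes" routes the deck to the manual-first workflow.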

What if I only have 24 hours and it's one of the seven?

Use the manual-first rule on the 4–6 slides that actually carry the weight. Use AI to accelerate the other 20 slides that are scaffolding — agenda, appendix, definitions, references. A hybrid approach still beats a fully manual one in a time crunch, as long as the critical slides stay human.

Is this just a disclaimer to protect 2Slides?

No. We actually lose revenue by publishing this, because we are telling you not to use our category of tool in certain cases. We publish it because honest scope-setting is how trust is built, and because our long-term position depends on users knowing when AI slide generation is the right call — which it is for most decks, most of the time. See our more optimistic take on whether AI can make professional pitch decks for the other side of this argument.


When AI Is Right (the Other 90%)

To be clear, these are the deck types where 2Slides and similar tools save you 4–8 hours per deck with no meaningful downside:

  • Quarterly internal reviews to your own team
  • Standard client-facing status reports
  • Training pre-reads and onboarding decks
  • Sales collateral generated from an existing, approved pitch
  • Conference session decks that are not keynotes
  • Course lectures, especially for repeat teaching
  • Internal workshop materials
  • Report-to-slide conversion from PDF or CSV data
  • Marketing webinar decks
  • Investor update newsletters (non-fundraising)
  • Research summary decks for stakeholders
  • Project kickoff and status decks

For these, the 60-second AI draft is the correct starting point. A human editor then spends 15–45 minutes tightening, fact-checking, and adding voice. That beats the old workflow — where a human spent 3 hours on the same deck — by roughly 4x.

What's the real productivity math?

If 90% of your decks take 45 minutes with AI (vs. 3 hours manual), and 10% take the same 20+ hours manual either way, you save about 2 hours 15 minutes on each routine deck. Over 50 decks a year, that is roughly 100 hours recovered — about two and a half full work weeks. That gain disappears if you use AI on the 10% where you shouldn't. Pick carefully.
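For readers who want to check the arithmetic, here is the calculation spelled out with the figures used in this article (50 decks a year, 90% routine at 3 hours manual vs. 45 minutes with AI, 10% high-stakes and unchanged either way). Treat it as a sketch: your own deck mix will differ.

```python
# Back-of-envelope productivity math using this article's figures.
# Assumptions: 50 decks/year, 90% routine (3 h manual vs 45 min with
# AI), 10% high-stakes decks that take the same time either way.

DECKS_PER_YEAR = 50
ROUTINE_SHARE = 0.9
MANUAL_HOURS = 3.0
AI_ASSISTED_HOURS = 0.75  # 45 minutes of human editing

routine_decks = DECKS_PER_YEAR * ROUTINE_SHARE      # 45 decks
saved_per_deck = MANUAL_HOURS - AI_ASSISTED_HOURS   # 2.25 hours
hours_recovered = routine_decks * saved_per_deck

print(f"Hours recovered per year: {hours_recovered}")  # 101.25
```

The high-stakes 10% drops out of the savings entirely, because those decks take the same hours in both workflows.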


Frequently Asked Questions

Does this mean AI presentation tools aren't ready for enterprise use?

It means AI presentation tools are ready for most enterprise use, with human oversight on the high-stakes remainder. Enterprise deployment is less about whether AI can draft slides and more about whether your team has workflows that separate the 90% from the 10%. The tool is ready. The governance often isn't.

If I use AI for a first draft of an investor deck, is that safe?

It depends on which slides. For market sizing, competitor overview, team bios, and case studies: yes, AI drafts are safe as a starting point. For the financials, valuation, use of funds, and projections: no. Those slides should be in the CFO's hands from the first pixel, because you will be defending every number in that room.

Aren't you just hedging against future AI quality issues?

A fair question. Our view is that AI presentation quality will keep improving — but the seven scenarios listed here are not about quality limits. They are about accountability limits. Even a perfect AI cannot be the author-of-record on a regulated filing or a crisis apology. Those require a named human signature, literally and figuratively.

How do I explain this policy to my team?

Use a simple rule: "If you would want a lawyer, CFO, or PR lead to read it before it ships, AI is a drafting assistant, not the author." If the deck is internal, routine, or reversible, AI is the primary tool. Teams adopt this distinction quickly because it matches how they already feel about other assistive tools, like spell-check or grammar suggestions.

Should I tell my audience an AI helped make the deck?

For the 90% use case, no more than you would announce you used spell-check. For the 10% high-stakes use case, the question is moot because AI shouldn't be the author. If regulators or investors directly ask, answer honestly: the deck was drafted with AI assistance, reviewed by humans, and finalized by a named author who takes responsibility for every claim.


The Takeaway

AI presentation tools solve a real problem — the tax of producing routine slides — and they solve it well. The seven scenarios in this article are not an indictment of that category. They are a map of where the category stops being useful and starts being dangerous. High-stakes investor pitches, politically weighted board decks, signature-visual keynotes, regulated content, confidential client data in consumer tools, skills-training material, and crisis communication all share one property: the cost of a near-miss is measured in dollars, jobs, or trust. AI is a probabilistic system, and probabilistic systems are the wrong tool when the acceptable error rate is zero.

For everything else — the other 90% of the slides you will ever make — AI is not just acceptable, it is obviously correct. A 60-second draft followed by 20 minutes of human editing will beat a 3-hour manual build every time, with no measurable quality gap. The professional skill for the next decade is not "can you use AI for slides?" It is "can you tell which slides are which?" Teams that learn this distinction fast will have a compounding advantage. Teams that either over-trust or under-use AI will lose ground to those that apply judgment to each deck.

For the 90% of decks where AI accelerates instead of limits you — try 2Slides free.

About 2Slides

Create stunning AI-powered presentations in seconds. Transform your ideas into professional slides with the 2Slides AI Agent.

Try For Free