Parallel or Perish: Rethinking Training for AI
Explore how AI is changing training workflows and how integrating learning and doing can boost efficiency
One signal 🔭
One prompt 🧠
One subtraction opportunity ➖
Created by Sam Rogers · Powered by Snap Synapse
🔭 Signal: AI Breaks the Training Workflow
For decades, orgs have successfully rolled out new tech in this sequence:
- Train the people
- Deploy the tool
- Measure adoption
That workflow works at human speeds with waterfall-style projects. For Agile projects, we've often reversed the first two steps so it's:
- Deploy the tool
- Train the people
- Measure adoption
Either way, the sequence breaks down at the inhuman speeds now required in business. AI simply moves too fast for the luxury of serialized steps: by the time step two is done, step one has changed underneath us.
The answer isn't more speed. It's a new flow.
🧠 Strategic (Human) Prompt: What if learning and doing ran in parallel?
Instead of asking: What training do we need to support this rollout?
Ask: How do we embed learning within this rollout?
Questions worth putting on the table:
- Are learning objectives as explicit as KPIs?
- Are both measured side-by-side, on the same cycle?
- Are team discoveries documented as frequently as business outcomes?
➖ Subtraction Opportunity: Sunset the Serial
Learning doesn't have to be a lagging step.
Stop separating rollout from ramp-up.
Parallel workflows subtract the waiting (and wasted) time from our adoption cycles.
🩻 Analogy of the Week: Surgery in Progress

Learning & sharing along the way.
In an operating room, learning and doing are inseparable.
The lead surgeon, anesthesiologist, nurses, and techs are all acting at once. They are each observing, adjusting, and sharing relevant discoveries in real time.
Every professional comes in highly trained, with their own role-specific instruments and dashboards: a vitals monitor, an anesthetic drip, imaging on a screen.
Each tool shows part of the picture. But none of these is the patient.
If anyone confuses their dashboard with the patient, or drifts out of sync with the team, the patient might not make it out alive.
AI adoption is the same.
The rollout, the training, the measurement: these can't run in sequence anymore. They require parallel delivery and coordination.
And dashboards are just indicators. They don’t keep the system alive.
Real resilience comes from team synchrony: trained professionals working in clear roles, in parallel, sharing discoveries as fast as the situation evolves, and always grounding decisions in the living system itself. Not just its representations.
♬ Closing Notes
The old playbook taught us to roll out tech, then catch humans up.
AI flips that on its head. The pace is inhuman, and the only way forward is to design learning and doing as one continuous, complementary flow.
The future won’t reward teams who measure adoption late.
It will reward the ones who learn and perform in sync, with discovery and delivery stitched together from the start.
Back in the 2000s, I was on an early Agile project where L&D was scrambling to stay just one sprint behind the developers. I suggested we flip it and get half a sprint ahead: embed ourselves in the design workflow for customer-facing features, then develop and ship only the features we could already explain and support. Everyone laughed. Many years later, most training still runs behind. AI just makes the cost of that lag unbearable.
👉 Hit reply with one example of (or resistance to!) learning and doing in parallel where you work, and I'll share anonymized patterns in a future issue.
Until next week,
Sam Rogers
Inhuman Workflow Designer
Snap Synapse – tools and thinking partners to fuel your AI transformation