When AI slows down expert developers and why that’s part of the journey
Blog: OpenText Blogs

For decades, software teams have operated under the assumption that "automation always accelerates." But recent controlled data suggests a more nuanced story, especially when it comes to experienced developers. In a 2025 randomized controlled trial (RCT) of seasoned open-source contributors, METR researchers found that allowing AI assistance increased task completion time by about 19% versus working without AI. Does AI actually slow down expert developers?
That finding flies in the face of intuition: developers in the study expected a 24% speedup, and afterward still believed AI had improved their throughput by 20%. In other words, AI felt helpful, but in practice it was slowing things down.
In what ways could AI slow down productivity (especially for experienced developers)?
The METR study's authors dig into possible causes of the slowdown, isolating five plausible contributing factors. A few stand out in particular:
- Context switching and verification overhead: When humans must interpret, correct, or adapt AI-generated code, the additional verification steps impose friction.
- Model suggestions misaligned with high internal standards: Experts operate under strict conventions, testing/coverage norms, documentation, or architectural constraints. AI’s output often requires rewriting or refactoring.
- Learning curves and limited tool maturity: In the study, many developers used AI tools like Cursor or Claude-based agents only for a few dozen hours. Deep fluency with prompting, chain-of-thought scaffolding, or fine-tuned models may come later.
- Partial adoption versus full immersion: Because developers had the option to use AI, they vacillated between modes, incurring context-switching costs.
Importantly, the study's authors caution that these results are a snapshot of a specific setting (experienced developers working in familiar codebases), not a blanket condemnation of AI in software. AI may still help in other contexts, e.g. for junior developers or in codebases developers know less well.
Slower now, faster later: A change-management perspective
This short-term drag in productivity is not a failure of AI, but part of the adoption curve. In organizational change theory, early phases often see regressions or productivity dips as teams absorb new tools, revise workflows, and realign roles. It’s the “cold start cost” of transformation.
In technology adoption models, we expect:
- Onboarding friction: Teams must learn how to integrate AI, update conventions, and resolve mismatches.
- Tooling iteration: Modifications such as prompt libraries, model fine-tuning, guardrails, and domain adapters emerge gradually.
- Institutionalization: Over time, AI becomes embedded in dev workflows; performance improves once feedback loops, integration, and standards settle.
In this light, an early 19% slowdown isn't a bug; it's a signal that we're in the transition phase.
But this doesn’t excuse blind rollouts. The key is targeted deployment: putting AI where it yields net gain and mitigating where it produces drag.
Putting AI in the right places—pragmatically
To manage the transition well, leadership should consider:
- Use AI for scaffolding, not core logic: Let AI help with boilerplate, test stubs, documentation, or code templates. Leave architectural reasoning, edge-case logic, and domain-specific subtleties to humans.
- Adopt mixed-mode workflows: Team members might toggle AI assistance depending on context, e.g. safe zones vs high-risk code paths.
- Curate domain-aware models: Train or fine-tune AI on your codebase’s patterns, style guides, and internal libraries, reducing “mismatch repair” overhead.
- Progressive enablement and training: Allow teams to gain experience in less critical subsystems, build trust, and scale gradually.
- Instrumentation and feedback loops: Collect metrics on time spent, pull request edits, and developer feedback. Evolve AI placements iteratively.
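The instrumentation point above can be made concrete with a small sketch. Assuming you log, per task, the completion time and whether AI assistance was allowed (the field names and sample numbers here are illustrative, not drawn from the METR dataset), you can compute the same kind of slowdown ratio the RCT reports:

```python
from statistics import mean

# Hypothetical task log: (minutes_to_complete, ai_allowed).
# Values are illustrative only, chosen to mirror a ~19% slowdown.
tasks = [
    (96, True), (110, True), (102, True), (88, True),
    (80, False), (85, False), (78, False), (90, False),
]

def slowdown_ratio(tasks):
    """Mean completion time with AI allowed divided by mean time without.

    A value above 1.0 means AI-assisted tasks took longer on average.
    """
    with_ai = [t for t, ai in tasks if ai]
    without_ai = [t for t, ai in tasks if not ai]
    return mean(with_ai) / mean(without_ai)

ratio = slowdown_ratio(tasks)
print(f"AI-assisted tasks took {ratio:.0%} of the unassisted time")
```

Tracking this ratio per team and per subsystem over time is one way to see whether the early drag is actually converting into acceleration as adoption matures.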
In short: the right AI in the right places can overcome the early drag.
Why OpenText DevOps Cloud holds an advantage
In the OpenText DevOps Cloud platform, we don’t approach AI as a monolithic “silver bullet.” We embed intelligent assistance where it has the highest expected delta:
- Context-aware code suggestions and knowledge infusion: The AI module is tightly integrated with your internal libraries, APIs, and architecture, reducing misaligned output.
- Guardrails & developer override: We build in safety and fallback paths, so devs retain ultimate control when AI output is incorrect or misaligned.
- Governance, traceability, and auditability: Every AI-assisted change is logged, traceable, and reviewable — aligning with enterprise compliance mandates.
- Incremental rollout with usage telemetry: We instrument adoption, track cost vs benefit, and guide teams through the transition with insights.
- Cross-functional alignment: We don’t just layer AI onto dev tools. We embed it into CI/CD, issue management, and observability — so intelligence percolates end-to-end.
In this way, OpenText DevOps Cloud treats AI not as a replacement, but as an accelerant—one that’s controlled, phased, and adaptive.
From drag to acceleration
The METR study's result of a 19% slowdown is eye-opening. It challenges the assumption that AI always speeds up expert developers: applied without strategy, AI can slow experts down, at least in the short term. But that doesn't negate AI's potential. Rather, it highlights that adoption is a journey, with friction, missteps, and false starts along the way.
The path forward is not “AI everywhere, right now,” but “AI in the right places, with the right discipline.” Through gradual rollout, domain alignment, feedback loops, and governance, the drag converts into acceleration.
Look for a DevOps platform that provides orchestration, intelligence, and control so teams can move past the early productivity dip and toward sustained gains—without sacrificing code quality, reliability, or developer confidence.
Download the guide to find out how experts are handling it: 9 experts on enterprise DevOps platforms.
