Challenge ideas, assumptions, and decisions by playing devil's advocate to identify weaknesses and prevent groupthink
npx skill4agent add wojons/skills devils-advocate

# Challenge a technical design decision
npm run devils-advocate:challenge -- --decision "use-microservices" --context "ecommerce-platform"
# Test a product assumption
npm run devils-advocate:test -- --assumption "users-want-mobile-app" --data "user-research.json"
# Identify weaknesses in a proposal
npm run devils-advocate:weaknesses -- --proposal "architecture-redesign.pdf" --perspective "security"
# Generate counterarguments for a business case
npm run devils-advocate:counterarguments -- --business-case "expand-to-europe.md" --stakeholder "competitor"
# Comprehensive devil's advocate review
npm run devils-advocate:review -- --document "project-plan.md" --thoroughness high

Devil's Advocate Analysis
──────────────────────────────
Subject: Microservices Architecture Proposal
Date: 2026-02-26
Analysis Duration: 45 minutes
Core Claims Identified:
1. "Microservices will improve development velocity by 40%"
2. "Team autonomy will increase with bounded contexts"
3. "System will be more resilient to failures"
4. "Scaling will be more efficient and cost-effective"
5. "Technology diversity will enable better tool selection"
Argument Analysis:
1. Claim: "Microservices improve development velocity by 40%"
Supporting Evidence: Case studies from other companies
Devil's Advocate Challenges:
- Case studies may not apply to our context (different scale, team structure)
- 40% improvement assumes optimal implementation and mature DevOps
- Velocity gains may be offset by coordination overhead
- Microservices introduce new failure modes (network, versioning, data consistency)
- Learning curve could reduce velocity initially
Alternative Explanation:
The perceived velocity improvement may come from better practices
(CI/CD, testing) that could be applied to monoliths too.
2. Claim: "Team autonomy increases with bounded contexts"
Supporting Evidence: Conway's Law, team structure diagrams
Devil's Advocate Challenges:
- Bounded contexts require clear domain boundaries (do we have them?)
- Teams need new skills (distributed systems, SRE practices)
- Increased autonomy may lead to inconsistent standards
- Cross-team coordination becomes more formal and slower
- Some domains naturally span multiple services
Alternative Perspective:
A modular monolith might provide similar autonomy benefits
without the operational complexity.
3. Claim: "System is more resilient to failures"
Supporting Evidence: Failure isolation diagrams, redundancy claims
Devil's Advocate Challenges:
- Distributed systems have MORE failure modes (network partitions, service discovery)
- Resilience requires sophisticated infrastructure (circuit breakers, retries, fallbacks)
- Debugging failures across services is harder
- Data consistency becomes a major challenge
- Cascading failures possible if dependencies not managed
Contradictory Evidence:
Many companies report increased operational complexity and failure rates
after microservices adoption without proper preparation.
Risk Assessment:
- High Risk: Data consistency across services
- Medium Risk: Operational complexity overwhelming team
- Medium Risk: Cross-team coordination overhead
- Low Risk: Technology choice limitations
Alternative Approaches Worth Considering:
1. Modular monolith with clear internal boundaries
2. Start with coarse-grained services, refine later
3. Hybrid approach: monolith for core domain, services for edge functions
4. Focus on DevOps maturity first, then evaluate service boundaries
Critical Questions Unanswered:
1. What specific metrics will measure "improved velocity"?
2. How will we handle distributed transactions?
3. What is the rollback plan if microservices don't deliver value?
4. How will team structure change to support service ownership?
5. What monitoring and observability investments are needed?
Recommendations:
1. Pilot with one non-critical service first
2. Define clear success metrics before proceeding
3. Invest in foundational capabilities (monitoring, deployment, testing)
4. Consider evolutionary architecture rather than big-bang rewrite
5. Document assumptions and revisit after 3 months
Analysis Value:
- Weaknesses identified: 8 significant, 3 critical
- Alternative perspectives generated: 5 viable alternatives
- Assumptions challenged: 12 core assumptions
- Risk areas highlighted: 3 high-risk areas needing mitigation
- Decision quality improvement: High (prevents potential costly mistake)