Insider Insights: What Tech Leaders Really Do Instead of 100% Test Automation 🎯

Shreyansh Shukla
6 min read · Nov 26, 2024


I was recently sitting in a release meeting, and the tension was palpable as we approached a critical release. That’s when one of the managers dropped the bombshell: “We need 100% test automation before we can release. No exceptions.”

A hush fell over the conference room. I could see the nods from some team members, the uncertain glances from others. But something about this sweeping demand didn’t sit right with me. It felt too… absolute.

After the meeting, I dove deep into research. The more I read, the more I realized the “automation gospel” was more myth than reality. Major tech giants like Google, Microsoft, and Amazon — companies renowned for their engineering excellence — don’t actually pursue 100% test automation. They’ve learned something crucial: not all tests are created equal, and blindly automating everything can be counterproductive.

Core Insights

The State of Testing Report 2023 crystallized what I was beginning to understand. Successful companies maintain a strategic balance between automated and manual testing. It’s not about coverage numbers, but about meaningful, intelligent testing that actually improves product quality.

Strategic Automation Focus

Automation is powerful, but it’s not a silver bullet. Some scenarios require human intuition, exploratory testing, and nuanced judgment that no script can replicate. Manual testing catches those subtle user experience issues, edge cases, and contextual problems that automated tests might miss.

What to Automate

  • Repetitive, predictable test scenarios
  • Performance and load testing
  • Regression test suites
  • Areas with consistent, well-defined inputs and expected outcomes

When to Rely on Manual Testing

  • Usability assessments
  • Exploratory testing
  • Complex user interaction scenarios
  • Evaluating subjective quality aspects

The Real Cost of Automation Obsession 💸

As noted in recent testing literature, “the test automation process can be very costly.” This cost manifests in several ways:

  1. Initial Development Costs
  • Higher-skilled developers required
  • Complex framework setup
  • Test script development time

2. Maintenance Burden

  • Test script updates for each UI change
  • Framework upgrades
  • False positive investigation
  • Technical debt accumulation

3. Hidden Costs

  • Training and onboarding
  • Infrastructure and tools
  • Documentation updates
  • Cross-team coordination

Initial Development Costs

Skilled Personnel Requirements

  1. Senior Automation Engineers
  • India: ₹20–₹35 lakhs/year
  • US: $120,000–$180,000 per year (₹99 lakhs–₹1.49 crores/year)

2. Framework Architects

  • India: ₹30–₹50 lakhs/year
  • US: $140,000–$200,000 per year (₹1.16 crores–₹1.65 crores/year)

3. DevOps Specialists for CI/CD Integration

  • India: ₹25–₹45 lakhs/year
  • US: $130,000–$190,000 per year (₹1.07 crores–₹1.57 crores/year)

Framework Setup and Training

Initial Setup Timeline

  • Framework Selection: 1–3 months
  • Initial Configuration: 1–2 months
  • CI/CD Integration: 2–4 weeks
  • Team Training: 40+ hours per team member

Ongoing Costs

  • Framework Maintenance: 10–15% of total development time
  • Training for New Team Members: ₹50,000–₹1 lakh per person
  • Tool Licenses: $10,000–$50,000 annually (₹8.3 lakhs–₹41.5 lakhs/year)

Maintenance Burden

According to the Capgemini World Quality Report, 45% of an organization’s testing budget is allocated to maintenance activities.

Key Maintenance Considerations:

  • Test Script Updates: UI changes affect 15–30% of tests per sprint.
  • Framework Upgrades: Require quarterly updates.
  • False Positives: Account for 5–15% of all test runs.
  • Annual Maintenance Cost: Estimated at 40–50% of the initial development investment.

The Strategic Automation Triangle 🎯

  1. Stability Assessment
  • Change frequency
  • Architecture stability
  • UI/UX maturity

2. Criticality Evaluation

  • Business impact
  • User visibility
  • Revenue impact
  • Security implications

3. Fragility Analysis

  • Technical complexity
  • Integration points
  • Historical defect patterns

Based on industry best practices and the referenced material, here are the three critical factors to consider when deciding what to automate:

  • Stability: How frequently does the feature change?

The more stable an application is, the less maintenance it will require and therefore the less it will cost to maintain automated test cases.

  • Criticality: What’s the business impact?

Functionality that runs very frequently, or whose failure would have a major business impact, needs to be tested every time new software is deployed.

  • Fragility: What’s the risk of defects?

This applies to functionality built on complex code, where it is considered likely that a new version could introduce defects.
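These three factors can be combined into a simple go/no-go heuristic. The sketch below is an illustrative assumption, not a formula from any of the cited reports; the 1–5 ratings and the threshold are placeholders to tune for your own backlog:

```python
def should_automate(stability: int, criticality: int, fragility: int,
                    threshold: int = 9) -> bool:
    """Score a feature on the three factors (each rated 1-5) and
    recommend automation when the combined score clears a threshold.

    stability:   5 = rarely changes, 1 = redesigned every sprint
    criticality: 5 = revenue-critical, 1 = cosmetic
    fragility:   5 = historically defect-prone, 1 = simple and solid
    """
    return stability + criticality + fragility >= threshold

# A stable, business-critical payment flow with a defect history:
print(should_automate(stability=4, criticality=5, fragility=4))  # True
# A frequently redesigned, low-impact settings page:
print(should_automate(stability=2, criticality=2, fragility=1))  # False
```

The equal weighting is deliberate naivety; teams often weight criticality higher once they have defect data to justify it.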

The Smart Automation Pyramid 📊

Instead of blindly pursuing 100% automation, follow the modern interpretation of Mike Cohn’s Test Automation Pyramid:

  1. Unit Tests (50–60%)
  • Component-level testing
  • Fast execution
  • High ROI
  • Developer-owned

2. Integration Tests (20–30%)

  • Service level
  • API testing
  • Contract testing
  • System integration

3. UI Tests (10–20%)

  • Critical user journeys
  • Cross-browser testing
  • Performance testing
  • End-to-end flows
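One way to keep a suite honest against these target bands is a small audit script. The band values come from the pyramid above; the checking logic itself is a hypothetical sketch:

```python
# Target bands from the Smart Automation Pyramid (fractions of total tests).
PYRAMID_BANDS = {
    "unit": (0.50, 0.60),
    "integration": (0.20, 0.30),
    "ui": (0.10, 0.20),
}

def check_pyramid(counts: dict) -> dict:
    """Report whether each layer of a test suite falls inside its band."""
    total = sum(counts.values())
    return {
        layer: lo <= counts.get(layer, 0) / total <= hi
        for layer, (lo, hi) in PYRAMID_BANDS.items()
    }

# A 1,000-test suite that matches the recommended shape:
print(check_pyramid({"unit": 550, "integration": 250, "ui": 200}))
```

An inverted suite (say, 500 UI tests and 200 unit tests) would fail every band, which is exactly the signal this shape check is meant to surface.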

Best Practices and Recommendations

  1. Start Small, Scale Smart
  • Begin with critical paths
  • Measure ROI early
  • Adjust based on data
  • Build maintainable frameworks

2. Invest in Infrastructure

  • Reliable test environments
  • Proper test data management
  • Monitoring and analytics
  • CI/CD integration

3. Balance Team Skills

  • Mix of manual and automation expertise
  • Regular training and knowledge sharing
  • Cross-functional collaboration
  • Culture of quality

4. Measure and Adjust

  • Track automation ROI
  • Monitor maintenance costs
  • Analyze test coverage
  • Regular framework audits

Real-World Examples: Making Smart Automation Decisions 🔍

E-commerce Checkout Flow

Let’s analyze how to approach testing an e-commerce checkout system:

Feature: Checkout Process
Components:
- Product Search and Cart addition
- Payment processing
- Inventory management
- Order confirmation
- Email notifications

Smart Automation Strategy:

✅ Automate:

  • Unit tests for price calculations
  • API tests for payment gateway integration
  • Critical path UI tests
  • Inventory deduction verification
  • Order confirmation generation

❌ Manual Testing:

  • Edge case UI scenarios
  • Visual validation of receipt layouts
  • Payment error message clarity
  • Mobile responsiveness
  • Accessibility testing

Rationale: The core business logic (payments, inventory) is stable and critical — perfect for automation. UI elements like error messages and layouts benefit more from human evaluation.
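To make the “unit tests for price calculations” item concrete, here is a minimal sketch in the pytest style. The `cart_total` function and its flat 8% tax rate are hypothetical stand-ins, not the article’s actual checkout logic:

```python
def cart_total(items, tax_rate=0.08):
    """Sum line prices (unit price * quantity) and apply a flat tax rate."""
    subtotal = sum(price * qty for price, qty in items)
    return round(subtotal * (1 + tax_rate), 2)

# Fast, deterministic, developer-owned checks -- ideal automation targets:
def test_single_item():
    assert cart_total([(10.00, 1)]) == 10.80

def test_multiple_items():
    assert cart_total([(10.00, 2), (5.00, 1)]) == 27.00

def test_empty_cart():
    assert cart_total([]) == 0.00
```

Tests like these run in milliseconds on every commit, which is why they belong at the base of the pyramid while receipt layout and error-message wording stay with a human reviewer.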

ROI Calculation Example 📊

Let’s look at a practical ROI calculation for automating a feature:

Use Case: Product Search and Checkout Function

Manual Testing Costs:
- Time per execution: 15 minutes
- Executions per sprint: 20
- Tester hourly rate: $50
- Monthly cost: $250

Automation Costs:
- Development time: 8 hours
- Monthly maintenance: 2 hours
- Developer hourly rate: $75
- Initial investment: $600
- Monthly maintenance cost: $150

Break-even Analysis:
- Monthly savings: $100
- Break-even point: 6 months

Decision: Automate ✅

  • High execution frequency
  • Stable feature
  • Clear ROI after 6 months
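The break-even arithmetic above generalizes into a small helper you can reuse for other candidate features. This is a sketch that simply restates the numbers from the use case:

```python
def break_even_months(manual_monthly: float, dev_hours: float,
                      maint_hours: float, dev_rate: float) -> float:
    """Months until automation pays back its initial development cost."""
    initial = dev_hours * dev_rate                       # one-time investment
    monthly_savings = manual_monthly - maint_hours * dev_rate
    if monthly_savings <= 0:
        return float("inf")  # maintenance eats the savings: never automate
    return initial / monthly_savings

# Numbers from the checkout use case above:
print(break_even_months(manual_monthly=250, dev_hours=8,
                        maint_hours=2, dev_rate=75))  # 6.0
```

The `inf` branch is the important one: a cheap-to-run manual test with an expensive-to-maintain script never breaks even, which is the article’s core argument in one line.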

Industry Leaders’ Test Automation Strategies

Google’s Testing Strategy

(Insights from James Whittaker’s “How Google Tests Software”)

Test Engineering Productivity (TEP)

  • Team Focus: Dedicated TEP team for testing tools and frameworks
  • Priority: Developer productivity over coverage metrics
  • Platform: Google’s TAP (Test Automation Platform)

Success Metrics

  • Release velocity (primary metric)
  • Test execution time
  • Test reliability (flake rate < 1%)

Strategic Test Selection

Manual Testing Focuses

  • User experience validation
  • Exploratory testing
  • New feature validation

Automated Testing Targets

  • Core business logic
  • Critical user journeys
  • Performance-critical components

Meta’s Risk-Based Approach

Testing Priority Matrix

High Risk + High Usage (100% Automation)

  • Example: News Feed rendering

Medium Risk + High Usage (80% Automation)

  • Example: Comment posting

Low Risk + Low Usage (30% Automation)

  • Example: Profile theme changes
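The matrix above can be encoded as a simple lookup table. The target percentages come from the article; the table-plus-default structure is an illustrative assumption:

```python
# Risk/usage -> target automation percentage (examples per the matrix above).
AUTOMATION_TARGETS = {
    ("high", "high"): 100,    # e.g. News Feed rendering
    ("medium", "high"): 80,   # e.g. comment posting
    ("low", "low"): 30,       # e.g. profile theme changes
}

def automation_target(risk: str, usage: str, default: int = 50) -> int:
    """Look up the target automation percentage for a feature."""
    return AUTOMATION_TARGETS.get((risk, usage), default)

print(automation_target("high", "high"))  # 100
print(automation_target("low", "low"))    # 30
```

Combinations the matrix does not name (say, high risk with low usage) fall through to a hypothetical 50% default, a reminder that the matrix is a prioritization aid, not a complete policy.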

E2E Testing Strategy

Instagram Story Creation

Automated Components:

  • Media upload
  • Basic filters
  • Posting mechanism

Manual Validation:

  • Filter effects quality
  • Multi-device preview
  • Creative tools UX

Microsoft’s DevDiv Testing Transformation

Testing Distribution

Modern Testing Mix

  • Unit Tests: 65%
  • Integration Tests: 20%
  • UI Automation: 10%
  • Manual Testing: 5%

Cost Structure (Azure DevOps)

Development Costs (Per Feature)

  • India: ₹3–₹5 lakhs
  • United States: $8,000–$12,000 (₹6.6–₹9.9 lakhs)

Maintenance Costs (Monthly)

  • India: ₹20,000–₹30,000
  • United States: $500–$800 (₹41,000–₹66,000)

Key Takeaways

  1. Test automation strategies vary based on product risk and usage
  2. Balance between automated and manual testing is crucial
  3. Focus on developer productivity and efficient test execution
  4. Cost and resource allocation differ by region and company

Conclusion

The pursuit of 100% test automation is often counterproductive and expensive. Leading tech companies have shown that a strategic approach to test automation, based on risk, stability, and business value, yields better results.

By following the Smart Automation Pyramid and learning from industry leaders’ experiences, organizations can build a more effective and efficient testing strategy.

References

  1. Google Testing Blog
  2. Microsoft DevOps Blog
  3. Meta Engineering Blog
  4. Agile Testing: A Practical Guide
  5. Continuous Testing Report 2023
  6. State of DevOps Report
  7. Swiggy Engineering Blog

Written by Shreyansh Shukla

Quality Engineering @Sony Pictures Networks | Learning and sharing insights on Software Quality | https://www.linkedin.com/in/shreyanshukla/