Module 6

The Test Life-Cycle

When should you write tests? When should you automate them? And who owns what? These questions have plagued testing teams since the dawn of Agile. Let us apply some logical clarity to the chaos.

[Figure: Three interlocking gears representing the System, Test, and Automation life-cycles — the three parallel life-cycles that must be synchronized for testing success]

Dear Marilyn: Our development team uses Agile sprints, but our test automation is always two sprints behind. By the time we automate a test, the feature has already changed. How do we catch up?

— Frustrated in Phoenix

Dear Frustrated: You are not behind—you are simply running the wrong race. The problem is not speed; it is synchronization. You have three parallel life-cycles that must move together, and you are treating them as if they were sequential.

Think of it this way: A symphony orchestra does not wait for the violins to finish before the cellos begin. They play together, each following the same score. Your System Development, Test Development, and Automation Development must do the same.

The Three Parallel Life-Cycles

System Development

The product team builds features. Each sprint delivers working functionality that users can interact with.

Owner: Development Team

Test Development

The test team designs test cases using high-level actions. These are human-readable specifications of what to verify.

Owner: Test Designer / Business Analyst

Automation Development

The automation team implements the low-level actions that make the tests executable against the actual system.

Owner: Automation Engineer

The Key Insight: In ABT, the Test Designer can write complete, meaningful tests using high-level actions before the automation is ready. The tests are not "waiting" for automation—they are valid specifications that can be reviewed, refined, and approved while the automation catches up.
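To make this concrete, a high-level ABT test can be written as plain rows of action names with arguments, readable and reviewable before any automation exists. This is a minimal sketch: the action names (`open_application`, `enter_order`, `check_order_total`) and the list-of-tuples representation are illustrative assumptions, not the format of any particular ABT tool, where such rows typically live in spreadsheets or test modules.

```python
# A high-level ABT test case as data: each row is an action name plus arguments.
# No automation exists yet; this is a reviewable specification.
quick_order_test = [
    ("open_application", {}),
    ("enter_order", {"customer": "ACME", "product": "Widget", "quantity": 3}),
    ("check_order_total", {"expected": "29.97"}),
    ("close_application", {}),
]

def review_test(test):
    """Render the test as a human-readable spec for review with the Product Owner."""
    lines = []
    for action, args in test:
        arg_text = ", ".join(f"{k}={v}" for k, v in args.items())
        lines.append(f"{action}({arg_text})")
    return "\n".join(lines)

print(review_test(quick_order_test))
```

Because the test is just data, it can be reviewed, refined, and approved in sprint planning while the automation engineers implement the actions in parallel.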

Agile Integration: The Sprint Timeline

Here is how the three life-cycles align within a typical two-week sprint:

Days 1-2: Sprint Planning

System: Stories are selected and refined.

Test: Test Designer identifies which Test Modules will be affected. Creates placeholder high-level actions for new functionality.

Automation: Reviews upcoming actions, estimates implementation effort.

Days 3-8: Development

System: Features are built and unit tested.

Test: Test Designer writes complete test cases using high-level actions. Tests are reviewed with Product Owner for correctness.

Automation: Implements low-level actions as features become available. Updates interface definitions as UI elements are finalized.

Days 9-10: Integration & Testing

System: Features are integrated and deployed to test environment.

Test: Automated tests are executed. Manual exploratory testing fills gaps where automation is not yet ready.

Automation: Fixes any automation issues discovered during execution. Completes any remaining action implementations.

Cross-over Test Modules

Dear Marilyn: Our sprint is building a new "Quick Order" feature, but it needs to work with the existing inventory and customer systems. How do we test these connections without duplicating tests?

— Connecting in Connecticut

Dear Connecting: You have discovered the need for what I call Cross-over Test Modules. These are tests that verify the relationships between the new sprint items and the existing application.

Think of your application as a city. Each sprint builds a new neighborhood. Cross-over tests verify that the roads connecting the new neighborhood to the existing city are properly paved and the traffic flows correctly.

Main Level Tests

Test the new feature in isolation. "Quick Order" creates orders correctly.

Interaction Tests

Test the UI elements. Buttons, forms, and navigation work as expected.

Cross-over Tests

Test the connections. "Quick Order" correctly updates inventory and customer records.

When to Create Cross-over Tests

  • New feature uses data from existing modules (orders use customer data)
  • New feature modifies shared resources (inventory levels, account balances)
  • New feature triggers workflows in other modules (notifications, approvals)
  • New feature must coexist with existing features (same screen, shared navigation)
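A cross-over test for the "Quick Order" scenario above might check the existing modules before and after the new feature runs. The sketch below simulates the application state in a plain dictionary purely so the example is self-contained; real cross-over actions would query the inventory and customer systems themselves, and all the action and field names here are hypothetical.

```python
# Simulated application state standing in for the existing inventory and
# customer modules; real low-level actions would query the application.
state = {"inventory": {"Widget": 100}, "history": {"ACME": []}}

def quick_order(customer, product, quantity):
    """The new sprint feature: creating an order touches existing modules."""
    state["inventory"][product] -= quantity                       # shared resource
    state["history"][customer].append(f"{product} x{quantity}")   # existing module

def check_inventory_level(product, expected):
    assert state["inventory"][product] == expected

def check_customer_history(customer, contains):
    assert contains in state["history"][customer]

# The cross-over test: the new feature in the middle, existing modules
# verified before and after.
check_inventory_level("Widget", expected=100)
quick_order("ACME", "Widget", 3)
check_inventory_level("Widget", expected=97)       # inventory was updated
check_customer_history("ACME", contains="Widget x3")  # customer record sees the order
print("cross-over checks passed")
```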

The Ownership Model

Dear Marilyn: Our company wants to outsource test automation to a vendor. Will ABT still work?

— Considering in Chicago

Dear Considering: ABT was designed precisely for this scenario. The three-layer architecture creates natural boundaries for ownership:

| Layer | Typical Owner | Skills Required | Can Outsource? |
|---|---|---|---|
| High-Level Actions | Business Analyst / Test Designer | Domain knowledge, test design | Rarely (requires business context) |
| Mid-Level Actions | Senior Test Engineer | Test architecture, abstraction | Sometimes (with good documentation) |
| Low-Level Actions | Automation Engineer | Programming, tool expertise | Often (technical, well-defined) |

The beauty of this model is that the vendor implementing low-level actions does not need to understand your business. They only need to know: "When someone calls click_button with argument 'Submit', find the element with ID 'btn_submit' and click it." The business logic remains safely in-house.
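This separation can be sketched as a small action registry: the vendor implements purely technical action handlers against element IDs, with no business knowledge required. The `FakeDriver`, the registry pattern, and the element IDs below are illustrative assumptions; a real framework would supply the browser driver calls.

```python
ACTIONS = {}

def action(name):
    """Register a low-level action implementation under its action name."""
    def register(fn):
        ACTIONS[name] = fn
        return fn
    return register

class FakeDriver:
    """Stand-in for a real browser driver, so the sketch is runnable."""
    def __init__(self):
        self.clicked = []
    def click_by_id(self, element_id):
        self.clicked.append(element_id)

# Technical mapping from button labels to element IDs -- the only knowledge
# the outsourced implementer needs.
ELEMENT_IDS = {"Submit": "btn_submit", "Cancel": "btn_cancel"}

@action("click_button")
def click_button(driver, label):
    driver.click_by_id(ELEMENT_IDS[label])

def run(driver, action_name, *args):
    """Dispatch an action name from a test to its low-level implementation."""
    ACTIONS[action_name](driver, *args)

driver = FakeDriver()
run(driver, "click_button", "Submit")
print(driver.clicked)  # ['btn_submit']
```

Note that nothing in `click_button` mentions orders, customers, or inventory: the business meaning of clicking Submit lives entirely in the high-level test, in-house.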

Three Outsourcing Models

Depending on your organization's needs, ABT supports three distinct outsourcing arrangements:

1. Fully Outsourced

A complete external team handles system development, test development, and automation. They work as an independent unit.

Best for: Complete product development, offshore teams with full ownership
2. Fully Integrated

Outsourced team members participate directly in your sprints. They work in your time zone (or with overlapping hours) and attend your daily standups.

Best for: Staff augmentation, when you need extra hands but want full control
3. Second Unit

Your onshore team sends work items directly to an offshore team, which completes the automation tasks overnight, ready for review the next morning.

Best for: "Follow the sun" development, maximizing 24-hour productivity

Key Principles for Life-Cycle Success

1. Tests Before Code

High-level test cases can (and should) be written before the feature is implemented. They serve as executable specifications that clarify requirements.

2. Automation Follows Design

Never automate before the test design is stable. Automating a poorly designed test just makes a bad test run faster.

3. Interface Definitions Last

Wait until the UI is stable before finalizing interface definitions. This is the most volatile part of the system and should be the last thing you lock down.
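One way to keep this volatility contained is a single interface-definition mapping from stable logical names to concrete locators, so that when the UI changes, only the mapping changes and the tests and actions are untouched. The screen names, element names, and locators below are illustrative assumptions, not any specific tool's format.

```python
# Interface definitions kept in one place: logical names -> concrete locators.
# If 'btn_submit' is renamed late in the sprint, only this mapping is updated;
# every test and action keeps using the stable logical names.
INTERFACE = {
    "order_screen": {
        "submit_button": ("id", "btn_submit"),
        "quantity_field": ("css", "input[name='qty']"),
    },
}

def locate(screen, element):
    """Resolve a logical element name to its (strategy, locator) pair."""
    return INTERFACE[screen][element]

print(locate("order_screen", "submit_button"))  # ('id', 'btn_submit')
```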

4. Run What You Can

If only 60% of your tests are automated, run that 60% automatically and the other 40% manually. Partial automation is better than no automation.
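Operationally, this is just partitioning the suite by automation status: execute the automated subset, and hand the remainder to the manual testers as an explicit worklist. A minimal sketch, with hypothetical test names:

```python
# Partition a suite by automation status: run what you can automatically,
# report the rest for manual execution.
tests = {
    "quick_order_happy_path": True,        # automated
    "quick_order_inventory_update": True,  # automated
    "quick_order_edge_cases": False,       # not yet automated
}

automated = [name for name, ready in tests.items() if ready]
manual = [name for name, ready in tests.items() if not ready]

print(f"Run automatically ({len(automated)}): {automated}")
print(f"Run manually ({len(manual)}): {manual}")
```

As the automation engineers complete more low-level actions, tests simply flip from the manual list to the automated one with no redesign.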

Module Summary

  • Three parallel life-cycles (System, Test, Automation) must be synchronized, not sequential.
  • Test design can proceed independently of automation using high-level actions.
  • Cross-over Test Modules verify connections between new sprint features and existing application components.
  • The three-layer architecture enables clear ownership boundaries and three outsourcing models: Fully Outsourced, Fully Integrated, and Second Unit.
  • In Agile, test design happens during development, not after.