1. Explain verifiable levels of test coverage in Quality On Time?

► Integrated framework for testing designed to support tight Test Execution Cycle goals with verifiable levels of test coverage
► Considers the "Paraclete Relationship": the tight bond between the test system and the system under test
--Low maintenance requirements on the test framework per change in the AUT
► Wrapped around a central relational database (ODBC)
► Works on several tools offered by major vendors
► Central point of maintenance for all test artifacts
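
A minimal sketch of the central-database idea, assuming a hypothetical ODBC data source (QOT_DSN) and a TestArtifacts table; it uses Python's pyodbc library only to illustrate keeping all test artifacts behind one relational store, and is not part of any vendor's tooling:

    import pyodbc

    def fetch_artifacts(aut_name):
        # "QOT_DSN" is an assumed ODBC data source configured on the test machine;
        # the table and column names are placeholders for the central repository.
        conn = pyodbc.connect("DSN=QOT_DSN;UID=tester;PWD=secret")
        try:
            cur = conn.cursor()
            cur.execute(
                "SELECT artifact_type, artifact_name FROM TestArtifacts WHERE aut = ?",
                aut_name,
            )
            return cur.fetchall()
        finally:
            conn.close()

    if __name__ == "__main__":
        for row in fetch_artifacts("OrderEntry"):
            print(row.artifact_type, row.artifact_name)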

2. Explain Strategic Considerations in Quality On Time?

► Before implementing a framework, the Test Organization should clarify its Strategic Goals and Objectives.
--Specific Risks Identified & Mitigated
--Measurement Requirements (Metrics)
--Timeliness Goals (Test Cycle Turnaround Time)
--Test Coverage Requirements (Code, Window/Object, Req'ts)
--Team Skill Mix Requirements (e.g.: 1:10 Technical/Business)
--Test Maintenance Requirements
► If properly framed, the framework's major requirements should follow the Strategic Goals and Objectives of the Test Organization

3. Explain Typical Framework Components in Quality On Time?

► Test Harness Structure/Directory Tree
--Framework Components Across AUTs
--AUT-specific constructs
--Release/Version-specific constructs
► Scripting Templates and Coding Standards
--Testing Hierarchy, Test Templates, Function Templates
► Configuration Management Tools, Policies, Procedures
--PVCS, ClearCase, MS SourceSafe, etc.
► Mechanisms to address peculiarities of specific technologies
--Functions for increased class support for ActiveX, etc.
► Data repository or Database
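
One way to picture the scripting templates and coding standards is a skeleton that every test script copies; the sketch below is illustrative Python, and the section names and the report_result helper are assumptions rather than the framework's actual standard:

    TEST_ID = "AUT01-TC-0042"   # ties the script back to the central repository

    def setup():
        """Drive the AUT to the state this test case needs (navigation lives here)."""
        pass

    def verify():
        """Perform the checks that constitute this test case; return True or False."""
        return True

    def restore():
        """Return the AUT to a known base state so the next script starts clean."""
        pass

    def report_result(test_id, passed):
        print(f"{test_id}: {'PASS' if passed else 'FAIL'}")

    if __name__ == "__main__":
        setup()
        try:
            report_result(TEST_ID, verify())
        finally:
            restore()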

4. Explain A Closer Look at Team Skill Mix in Quality On Time?

► Typical mix: 2:8:1 (2 SMEs; 8 Testers; 1 Developer/Guru)
--Such "Stratified Teams" are most efficient - required level of abstraction is raised for the typical test writer and coding challenges are placed on developer/guru
► SMEs should be trained in the discipline of testing
--Hierarchies and types of testing, Testability of requirements, what verifications constitute necessary and sufficient testing, etc.
► SMEs and Testers together constitute the "Test Scripters"
--Combination of Record and Function Generator (F7) Calls to start
► Developers/Gurus constitute "Test Harness Support"
--Work with scripters to write functions to streamline scripting
--Support and expand the framework infrastructure

5. Explain Number of Supported Applications and Releases in Quality On Time?

► Tool should load with full knowledge of AUT
--For a single AUT, the harness-load command is added to the tool's configuration (e.g.: TSLINIT for WinRunner, or Settings --> Options --> Environment --> Startup Test)
--Loads harness, all AUT-specific extensions, GUI Map Files, etc.
► For Teams with Multiple Projects
--Embed a User Interface step (e.g.: "create_list_dialogue") to select AUT at tool startup
► Helpful Hint: Also load knowledge of, and functions for, any needed ancillary applications (Telnet sessions, middleware apps, Rumba, etc.)
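
A hedged sketch of the startup behavior described above, written in Python rather than a vendor scripting language; the AUT names, harness paths, and the prompt used for multi-project teams are all placeholders:

    import os

    # Hypothetical AUTs and the harness directories that hold their extensions,
    # GUI maps, and ancillary-application support.
    SUPPORTED_AUTS = {"OrderEntry": "harness/order_entry",
                      "Billing": "harness/billing"}

    def load_harness(aut):
        base = SUPPORTED_AUTS[aut]
        for component in ("functions", "gui_maps", "ancillary"):
            print("loading", os.path.join(base, component))   # stand-in for real loads

    def startup(aut=None):
        if aut is None:                                        # multi-project team
            aut = input(f"Select AUT {sorted(SUPPORTED_AUTS)}: ").strip()
        load_harness(aut)

    if __name__ == "__main__":
        startup()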

6. Explain Maturity of Applications in Quality On Time?

# Mature Applications: Straightforward scripting
--AUT is stable: slight risk to script base per change in AUT
# New Applications (can expect changes in windows/objects/navigational paths)
--Considerable risk to existing script base as AUT changes
--Affects "granularity" of the test case
# If risk to the existing script base is moderate or high
--Consider a "State Navigation" component to the Framework
--Allows navigational components of a test to be consolidated for ease of maintenance
--Must understand the "Anatomy of a Test Case"
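
As a rough illustration of that "Anatomy of a Test Case" (all helper names below are placeholders, not real GUI-driver calls), note how the navigational steps at the front and back are the ones most exposed to change while the AUT is new, which is what motivates a State Navigation component:

    def test_add_customer():
        # 1. Navigate: drive the AUT from its base state to the window under test.
        open_menu("Customers")          # navigation step, fragile while the AUT is new
        open_dialog("Add Customer")     # navigation step
        # 2. Act and verify: the business substance of the test case.
        enter_field("Name", "Acme Corp")
        assert click_and_read_status("Save") == "Saved"
        # 3. Restore: navigate back to the base state for the next test.
        close_dialog("Add Customer")    # navigation step

    # Placeholder helpers standing in for real GUI-driver calls.
    def open_menu(name): print("open menu", name)
    def open_dialog(name): print("open dialog", name)
    def enter_field(field, value): print("type", value, "into", field)
    def click_and_read_status(button): print("click", button); return "Saved"
    def close_dialog(name): print("close dialog", name)

    if __name__ == "__main__":
        test_add_customer()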

7. Explain Unattended Requirements in Quality On Time?

* Determine continuum of unattended needs:
--Close monitoring of executing scripts
--Daily lab execution with periodic monitoring
--Overnight execution: little or no monitoring
* Harness robustness and error-recovery must be adequate for level of unattended need
* Typically requires strategies and negotiations with Network Security to ensure testing needs and security needs are met
* Central Reporting Log (Cycle Execution Log) can additionally be implemented for remote status monitoring
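
A small sketch of those unattended-execution points, with the log file name and the recovery routine assumed for illustration: each script runs under an error-recovery wrapper, and results are appended to a shared Cycle Execution Log so status can be watched remotely:

    import datetime

    CYCLE_LOG = "cycle_execution.log"   # in practice, a shared LAN path

    def recover():
        """Bring the AUT back to a known state after an unexpected failure."""
        print("recovering AUT to base state")

    def log_result(test_id, status):
        stamp = datetime.datetime.now().isoformat(timespec="seconds")
        with open(CYCLE_LOG, "a") as log:
            log.write(f"{stamp}\t{test_id}\t{status}\n")

    def run_cycle(tests):
        for test_id, test_fn in tests:
            try:
                test_fn()
                log_result(test_id, "PASS")
            except AssertionError:
                log_result(test_id, "FAIL")    # the AUT really misbehaved
            except Exception:
                log_result(test_id, "ERROR")   # harness or script problem
                recover()                      # keep the overnight run alive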

8. Explain Miscellaneous in Quality On Time?

* Harness and Tests should be under Configuration Management/Version Control
* Consider mechanisms to differentiate between a failed test (the AUT is really broken) and a failed test case (the test case is not implemented correctly); see the sketch after this list
* Build a little, Test a little
* Test Early/Test Often
* Consider implementing links to ancillary applications to assist in defect discovery (BoundsChecker, Norton Utilities, etc.)
* Keep it simple!
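
One hedged way to keep "the AUT is broken" separate from "the test case is wrong" (the exception name below is illustrative): verification failures raise their own exception type, while harness or scripting problems surface as ordinary errors, so the two land in different buckets in the results:

    class AutDefect(Exception):
        """Raised when a verification against the AUT fails: a real product failure."""

    def verify(condition, message):
        if not condition:
            raise AutDefect(message)

    def classify(test_fn):
        try:
            test_fn()
            return "PASS"
        except AutDefect as defect:
            return f"FAIL (AUT defect: {defect})"
        except Exception as err:
            return f"SCRIPT ERROR (fix the test case: {err})"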

9. Explain A Closer Look at Test Frameworks in Quality On Time?

# A Plethora of "Architectural Frameworks" have emerged over the past several years
--General Purpose Frameworks: "One-Size-Fits-All"
--Technology-Specific (Telephony Interfaces, Multi-Platform Applications, etc.)
# Very difficult to anticipate all requirements in a "One-Size-Fits-All" model
--Unnecessary constructs (e.g.: May contain elaborate environment constructs for single-environment projects, etc.)
--Insufficient constructs (e.g.: May lack key ActiveX support for required third-party controls, etc.)
--Success of a framework is often hard to measure

10. Explain Evaluation Criteria in Quality On Time?

* Supports Strategic QA Goals & Objectives
* Conceptual Simplicity & Streamlined Use
* Efficient and Effective Test Development, Execution, and Reporting
* Maintenance and Robustness Considerations (Scripts and Harness)
* Each Construct is Necessary, the Sum of Constructs is Sufficient
* Poised for Expansion
* Matched to Team Skill Set

11. Explain Team Skill Mix in Quality On Time?

* Business Skilled (a.k.a. "Subject Matter Expert" or "SME"): Understands business needs in-depth, spotty knowledge of technology, no coding knowledge
* Tester Skilled: Understands discipline of testing and test development, spotty knowledge of scripting/coding
* Developer Skilled: Understands software development practices. Proficient coder. Usually NOT skilled in the discipline of testing!
* Guru: Understands software development practices in-depth at the strategic and tactical level. Also understands test practices in-depth. Gets the "Full-Picture".

12. Explain Skill Mix: Impact on Framework in Quality On Time?

* Strive for the simplest scripting environment possible
--Move all complexities to the Developer/Guru
--Gated by the skills of the Developers/Gurus
* Open Architecture allows tremendous flexibility in customization
* Hide as much of the complexity of the framework as possible
--Automatically load harness components at tool load time
--Incorporate routine maintenance and special reporting needs into simple function calls
* Consider the maintainability and simplicity of the framework itself when making enhancements to it
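
A sketch of hiding framework complexity behind one-line calls for the scripters; the function names and what they wrap are assumptions, the point being that the Developer/Guru owns the messy details and the test writer only learns report_step():

    import datetime

    def _write_to_central_log(line):
        # In a real harness this might update the central database, check system
        # resources, or capture screenshots; the scripter never sees any of it.
        print(line)

    def report_step(description, passed):
        """The only reporting call a test writer needs to learn."""
        stamp = datetime.datetime.now().isoformat(timespec="seconds")
        _write_to_central_log(f"{stamp}  {'PASS' if passed else 'FAIL'}  {description}")

    # Usage inside a test script:
    report_step("Customer record saved", passed=True)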

13. Explain A Closer Look at State Navigation in Quality On Time?

* By combining setup and restore functions into a State Navigation component, we can eliminate about 30%* of the required steps in each test case
* Such a reduction affects those portions of the test cases that are most prone to rework per change in the AUT
* Can also perform routine tasks such as monitoring system resources, timestamping, and error recovery
* *Average based on six years' experience across a wide range of industries and applications
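
A hedged sketch of a State Navigation component (the state names and GUI-driver calls are placeholders): instead of every test case repeating its own setup and restore steps, tests ask to be taken to a named state, so navigation changes in the AUT are fixed in one place:

    def navigate_to(state):
        # One consolidated map of how to reach each application state.
        routes = {
            "base": [("open_window", "Main")],
            "customer_edit": [("open_window", "Main"),
                              ("open_menu", "Customers"),
                              ("open_dialog", "Edit Customer")],
        }
        for action, target in routes[state]:
            print(action, target)         # stand-in for the real GUI-driver call

    def test_change_customer_phone():
        navigate_to("customer_edit")      # setup collapsed into one call
        print("edit phone field and verify")
        navigate_to("base")               # restore collapsed into one call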

14. Explain Distributed Requirements in Quality On Time?

* Scripts must retain independence from each other
--If there are script dependencies, consider a 2-tier approach in which the "batch" tier (sometimes called a Scenario) is independent and all dependencies are handled within the batch script
* Consider concurrent impact on data
--Similar test cases require different input data to run on different machines simultaneously
--Consider implementing machine-specific (C: drive) and machine-independent (LAN drive) data sources to feed data-driven tests
* Central Reporting Log (Cycle Execution Log) can be implemented to collect remote results of all machines in the test cycle
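
A sketch of the machine-specific versus machine-independent data idea; the paths and the partitioning scheme (a "machine" column in the shared data) are assumptions: shared read-only data lives on the LAN, while each machine draws only the input rows keyed to its own host name, so the same data-driven test can run on several machines at once without collisions:

    import csv
    import socket

    SHARED_DATA = r"\\labserver\qa\data\customers.csv"   # machine-independent (LAN)
    LOCAL_DATA = r"C:\qa\data\local_inputs.csv"           # machine-specific (C: drive)

    def rows_for_this_machine(path=SHARED_DATA):
        host = socket.gethostname()
        with open(path, newline="") as f:
            # Each row carries a "machine" column naming the host meant to use it,
            # so concurrent runs never feed two machines the same input record.
            return [row for row in csv.DictReader(f) if row["machine"] == host]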