1. Give an example of a Test Plan Approval?

Test Plan Approval
Business Approval
______________________ __________
[Name/Title] Date

Testing Approval
_________________________ ______________
[Name/Title] Date

Appendices:
★ Appendix 1 User Scenario Test Suite
★ Appendix 2 Concurrency Load Testing Suite
★ Appendix 3 Data Elements from Load Test
★ Appendix 4 Test Scripts (in JavaScript; requires Webload or a text editor)
★ Appendix 5 Error Reports and Web Server Failures
★ Appendix 6 Web Monitoring Data

2. What is the list of appendices in a Load/Performance Test Plan?

Specific test case, test design, and test script information is added as the project progresses. Here are a few examples:
★ Real-World User-Level Test Suite
★ Concurrency Test Suite
★ Data Elements
★ Test Scripts
★ Error Reports
★ Web Monitoring Data

3. What are the test deliverables in a Load/Performance Test Plan?

Test Deliverables:
★ This test plan
★ Performance testing goals
★ Workload definitions
★ User scenario designs
★ Performance test designs
★ Test procedures
★ System baseline/System-under-test configurations
★ Metrics to collect
★ Tool evaluation and selection reports (first time, or as needed)
★ Test scripts/suites
★ Test run results
★ Analysis reports against the collected data
★ Performance related error reports (e.g., failed transactions)
★ Functional bug reports (e.g., data integrity problems)
★ Periodic status reports
★ Final report

4. What are Exclusions in a Load/Performance Test Plan?

Set clear expectations: state which goals will be outside the scope of this testing. For example:
★ Content accuracy or appropriateness testing is out of the scope of this plan.
★ The integration of any major third party components (for example a search engine, credit card processor, or mapping component) with the site will be tested, though the scope of the project does not include in-depth functional testing of these components.
★ Internationalization
★ Compatibility Testing

5. What is the system-under-test environment in a Load/Performance Test Plan?

★ Specifying mixes of system hardware, software, memory, network protocol, bandwidth, etc.
★ Network access variables: For example, 56K modem, 128K Cable modem, T1, etc.
★ Demographic variables: For example San Francisco, Los Angeles, Chicago, New York, Paris, London, etc.
★ ISP infrastructure variables: For example, first tier, second tier, etc.
★ Client baseline configurations
★ Computer variables
★ Browser variables
★ Server baseline configurations
★ Computer variables
★ System architecture variables and diagrams

Other questions to consider asking:
★ What is the definition of "system"?
★ How many other users are using the same resources on the system under test (SUT)?
★ Are you testing the SUT in its complete, real-world environment (with load balancers, replicated databases, etc.)?
★ Is the SUT inside or outside the firewall?
★ Is the load coming from the inside or outside of the firewall?
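As a minimal illustration (all hosts, values, and field names below are hypothetical, not taken from any actual plan), the server and client baseline configurations above could be captured as a small environment record in JavaScript, so every test run is tagged with the exact environment it executed against:

// Hypothetical sketch of a system-under-test (SUT) environment record.
// All hosts, sizes, and field names are illustrative only.
const sutEnvironment = {
  serverBaseline: {
    webServer: { host: "web01.example.com", cpuCores: 8, memoryGb: 16, os: "Linux" },
    dbServer: { host: "db01.example.com", cpuCores: 16, memoryGb: 64, os: "Linux" },
    loadBalancer: true,
    insideFirewall: true,
  },
  clientBaselines: [
    { browser: "Chrome", connection: "128K cable modem", location: "New York" },
    { browser: "Firefox", connection: "T1", location: "London" },
  ],
  loadSource: "outside-firewall", // where the generated load originates
};

// Print the snapshot so it can be attached to each test run's results.
console.log(JSON.stringify(sutEnvironment, null, 2));

Keeping this record alongside the results makes it possible to compare runs only when the SUT configuration is identical.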

6. What are Load Descriptions in a Load/Performance Test Plan?

Load Descriptions:
★ Server-based
★ Number of users and/or sessions
★ Average session time
★ Number of page views
★ Average page views per session
★ Peak period (e.g., 75% of traffic is from 11:00 AM-4:00 PM)
★ Number of hits
★ Average page size
★ Most requested pages
★ Average time spent on page
★ New users vs. returning users
★ Frequency of visits (e.g., 75% of users made one visit)
★ Demographics
★ Client information such as browser, browser version, JavaScript support, JavaScript enabled/disabled, and so on.

User-based:
★ Number of users
★ Session length
★ User activities and frequency of activities per session
★ Think/Read/Data-input time
★ Percentage by functional group
★ Percentage by human speed
★ Percentage by human patience (cancellation rates)
★ Percentage by domain expertise (speed)
★ Percentage by familiarity (speed)
★ Percentage by demographics (arrival rates)

Other questions to consider:
★ What is the definition of "workload"?
★ How do we size the workload?
★ What is the expected workload?
★ What's the mix ratio of static pages vs. code?
★ What is the definition of "increased load"?
★ What is future growth? Can it be quantified?
★ What is the definition of scalability?
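As a minimal sketch (all field names and numbers are hypothetical), the server-based and user-based parameters above can be combined into a single workload definition that the load scripts read:

// Hypothetical workload definition combining server-based and user-based parameters.
const workload = {
  serverBased: {
    concurrentSessions: 1000,
    avgSessionMinutes: 12,
    avgPageViewsPerSession: 8,
    peakPeriod: { start: "11:00", end: "16:00", shareOfTraffic: 0.75 },
    staticToDynamicRatio: 0.6, // 60% static pages, 40% scripted pages
  },
  userBased: {
    thinkTimeSeconds: { min: 3, max: 20 },
    functionalGroups: [
      { name: "browse", share: 0.7 },
      { name: "search", share: 0.2 },
      { name: "checkout", share: 0.1 },
    ],
    newVsReturning: { newUsers: 0.4, returningUsers: 0.6 },
  },
};

// Sanity check: the functional-group mix should sum to 100%.
const totalShare = workload.userBased.functionalGroups.reduce((sum, g) => sum + g.share, 0);
console.log(`Functional-group mix sums to ${(totalShare * 100).toFixed(0)}%`);

A quick sanity check such as the mix total above helps catch workload definitions whose functional-group percentages do not add up.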

7. Which tools are used in a Load/Performance Test Plan?

State the tool solutions for the project:
★ Load testing tools
★ Monitoring tools
★ Tool Options:
★ Product vs. Application Service Provider (ASP)
★ Freeware
★ Lease or rent
★ Purchase
★ Build
★ Outsourcing (testing with virtual client licensing included)

8. Define Bug Reporting and Regression Instructions?

Bug Reporting and Regression Instructions describe the bug reporting process and the fix/change regression test procedures.

9. What are the Testing Process, Status Reporting, and Final Report in a Load/Performance Test Plan?

Describe the testing and reporting procedures. For example:
★ The internal test team will execute all created scripts. These scripts will be generated and executed against the system at least three times. We will execute these scripts again after subsequent hardware, software, or other fixes are introduced.
The test team will baseline the load as follows:
The load test team will test Nile.com with 1,000 simultaneous clients/users (see the sketch at the end of this answer) and report back on the following metrics:
★ Response time for each transaction hitting the Web site.
★ Any web or database server errors as reported in the data log.
★ Round-trip time
★ Failed Web Transactions
★ Status reports will be sent to the team lead detailing:
★ Performance tests run
★ Performance metrics collected
★ Performance Errors and status
★ Number of Bugs Entered
★ Status Summary
★ Additional load testing, if needed.
★ The Final Report will include summary bug counts, overall performance assessment, and test project summary items.
Additional Information to be provided by Development Team:
★ Build Schedule
★ Acceptance test criteria
★ Deployment Plans
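To make the procedure concrete, the sketch below shows the shape of a run that drives simultaneous clients against a site and records the response-time and failed-transaction metrics listed above. It is plain Node.js (18+) rather than the Webload/JavaScript scripts referenced in the appendices, and the target URL and client count are placeholders:

// Minimal sketch of a concurrent load run: N virtual clients each issue one
// request, and we collect per-transaction response times and failures.
// Requires Node.js 18+ for built-in fetch; URL and client count are placeholders.
const TARGET_URL = "https://www.example.com/";
const CLIENTS = 50;

async function virtualClient(id) {
  const start = Date.now();
  try {
    const res = await fetch(TARGET_URL);
    return { id, ok: res.ok, status: res.status, ms: Date.now() - start };
  } catch (err) {
    return { id, ok: false, status: null, ms: Date.now() - start, error: String(err) };
  }
}

async function runLoad() {
  const results = await Promise.all(
    Array.from({ length: CLIENTS }, (_, i) => virtualClient(i))
  );
  const failed = results.filter((r) => !r.ok);
  const avgMs = results.reduce((sum, r) => sum + r.ms, 0) / results.length;
  console.log(`Clients: ${CLIENTS}`);
  console.log(`Average response time: ${avgMs.toFixed(0)} ms`);
  console.log(`Failed transactions: ${failed.length}`);
}

runLoad();

In a real run the client count would be ramped toward the 1,000 simultaneous users named in the plan, and the results would be written to the data log rather than the console.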

10. What are the Testing Process, Status Reporting, and Final Report in a Load/Performance Test Plan?

Describe the testing and reporting procedures. For example:
The internal test team will execute all created scripts. These scripts will be generated and executed against the system at least three times. We will execute these scripts again after subsequent hardware, software, or other fixes are introduced.

11. What are the other questions to consider in a Load/Performance Test Plan?

★ What is response time?
★ What is acceptable response time?
★ Which metrics should we collect?
★ What is the correlation between demand and increased load?
★ How do we determine which components are problematic?
★ How do we correlate financial implications?
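For the questions about response time and which metrics to collect, one common starting point (a sketch only; the sample data is made up) is to summarize collected response times with the average plus high percentiles, since percentiles reflect what the slower users actually experience:

// Hypothetical sketch: summarize collected response times (in ms) with the
// average and the 90th/95th percentiles.
function percentile(sortedValues, p) {
  const index = Math.min(
    sortedValues.length - 1,
    Math.max(0, Math.ceil((p / 100) * sortedValues.length) - 1)
  );
  return sortedValues[index];
}

function summarize(responseTimesMs) {
  const sorted = [...responseTimesMs].sort((a, b) => a - b);
  const avg = sorted.reduce((sum, v) => sum + v, 0) / sorted.length;
  return {
    samples: sorted.length,
    averageMs: Math.round(avg),
    p90Ms: percentile(sorted, 90),
    p95Ms: percentile(sorted, 95),
  };
}

// Made-up sample data standing in for one test run's measurements.
const sampleTimesMs = [220, 310, 180, 950, 400, 275, 1200, 330, 290, 310];
console.log(summarize(sampleTimesMs));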

12. What are the performance/capability goals of a Load/Performance Test Plan?

Identify goals:
★ Percentage of requested static pages that must meet the acceptable response time?
★ Percentage of requested scripts that must meet the acceptable response time?
★ The baseline multiplier (2x, 4x, ...) that the system must be capable of handling?
★ The spike ratio that the system must be capable of handling?
★ The peak ratio that the system must be capable of handling?
★ The burstiness ratio that the system must be capable of handling?
★ Tolerance ratio: Imposed load ± 25%?
★ Safety ratio: Imposed load x 2?
★ Spike ratio: Imposed load x 3?
★ Burstiness ratio: Imposed load x 5?
★ Increase the load by multiplying the load baseline by 1x, 2x, 3x, 4x, Nx gradually until unacceptable response time is reached.
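The last bullet describes a ramp strategy. As a minimal sketch (the baseline, threshold, and measurement step are all hypothetical stand-ins), the loop below multiplies the baseline load until the measured response time is no longer acceptable:

// Hypothetical goal-reaching ramp: multiply the baseline load (1x, 2x, 3x, ...)
// until the measured response time is no longer acceptable.
const BASELINE_USERS = 250;   // illustrative baseline load
const ACCEPTABLE_MS = 3000;   // illustrative acceptable response time
const MAX_MULTIPLIER = 8;

// Stand-in for a real measurement step (e.g., a scripted load-tool run).
// Simulated here so the sketch runs on its own.
function measureAvgResponseMs(users) {
  return 500 + users * 2; // pretend response time grows with load
}

for (let multiplier = 1; multiplier <= MAX_MULTIPLIER; multiplier++) {
  const users = BASELINE_USERS * multiplier;
  const avgMs = measureAvgResponseMs(users);
  console.log(`${multiplier}x load (${users} users): ${avgMs} ms average`);
  if (avgMs > ACCEPTABLE_MS) {
    console.log(`Unacceptable response time reached at ${multiplier}x the baseline.`);
    break;
  }
}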

13. What does it mean to be specific in a Load/Performance Test Plan?

★ Specify what tests you will run
★ Estimate how many cycles of each test you will run
★ Schedule your tests ahead of time
★ Specify by what criteria you will consider the SUT to be ready-for-test
★ Forward thinking: Determine and communicate the planned tests and how the tests are scheduled.

14. What are the Test Types and Schedules of a Load/Performance Test Plan?

Specify the test types (with definition for each) to run:
★ Acceptance test
★ Baseline test
★ 2B1 load test
★ Goal-reaching test
★ Spike test
★ Burstiness test
★ Stress test
★ Scalability test
★ Regression test
★ Benchmark test
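Combining this list with the "be specific" guidance in the previous question, a simple schedule structure (test types are from the list above; cycle counts and week numbers are made up) can record what will run, how many cycles, and when:

// Hypothetical test schedule: which test types run, how many cycles, and when.
// Cycle counts and week numbers are illustrative only.
const testSchedule = [
  { type: "Acceptance test", cycles: 1, week: 1 },
  { type: "Baseline test", cycles: 3, week: 1 },
  { type: "Goal-reaching test", cycles: 3, week: 2 },
  { type: "Spike test", cycles: 2, week: 3 },
  { type: "Stress test", cycles: 2, week: 3 },
  { type: "Regression test", cycles: 1, week: 4 },
];

// Total planned cycles helps estimate lab time and resources.
const totalCycles = testSchedule.reduce((sum, t) => sum + t.cycles, 0);
console.log(`Planned test cycles: ${totalCycles}`);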

15. What is the approach of a Load/Performance Test Plan?

A high-level description of the testing approach that enables us to cost-effectively meet the expectations stated in the Scope section.

16. What is the scope of a Load/Performance Test Plan?

★ What does this document entail?
★ What is being tested?
★ What is the overall objective of this plan?
For example:
★ To document test objectives, test requirements, test designs, test procedures, and other project management information.
★ To solicit feedback and build consensus
★ To define development and testing deliverables
★ To secure commitment and resources for the test effort

17. What do you know about our organization?

Do your homework prior to the job interview. Doing the background work will help you stand out. Find out who the main players are and whether they have been in the news recently. You're not expected to know every date and individual, but you need to have a solid understanding of the company as a whole.

18. Tell me why you are looking for another job?

On the surface, this appears to be a simple question, yet it is easy to slip up. I would suggest not mentioning money at this stage, as you may come across as purely mercenary. If you are currently employed, you can say it's about developing your career and yourself as an individual. If you are in the unfortunate position of having been downsized, stay positive and keep it brief. If you were fired, you should have a solid explanation. Whatever your circumstances, do not dwell on the drama, and remember to stay positive.

19. How do you handle stressful situations and working under pressure?

There are several ways of addressing this one. You may be the sort of person that works well under pressure; you may even thrive under pressure. Whatever the case, make sure you don't say you panic. You want to give specific examples of stressful situations and how well you dealt with them. You may also want to list a few tools you use to help you, such as to-do lists, etc. It is alright to say that you will ask for assistance when the job is more than what you can handle. It is equally acceptable to say that you work best under pressure if this is indeed the case and relevant to the particular role.

20. Regarding salary, what are your expectations?

This question is always a tricky one and a dangerous game to play in an interview. It is a common mistake to discuss salary before you have sold yourself, and like in any negotiation, knowledge is power. Do your homework and make sure you have an idea of what this job is offering. You can try asking them about the salary range. If you want to avoid the question altogether, you could say that at the moment, you are looking to advance in your career and money isn't your main motivator. If you do have a specific figure in mind and you are confident you can get it, then it may be worth going for.

21. Why should we hire you?

This is an important question that you will need to answer carefully. It is your chance to stand out and draw attention to your skills, especially those that haven't already been addressed. Saying "because I need a job" or "I'm really good" just won't cut it. Don't speculate about other candidates and their possible strengths or flaws. Make sure you focus on you. Explain why you make a good employee, why you are a good fit for the job and the company, and what you can offer. Keep it succinct and highlight your achievements.

22. Tell me, what sort of person do you not like to work with?

This is not an easy one, as you have no idea whom you would be working with. Even if you can immediately think of a long list of people you don't like to work with, take a moment to think and say that it's a difficult question because you have always gotten on fine with your colleagues.

23. Tell me what kind of decisions do you find most difficult to take?

There is no right or wrong answer here. The logic behind this type of question is that your past behavior is likely to predict what you will do in the future. What the interviewer is looking for is to understand what you find difficult.

24. Are you also applying for some other jobs?

If you are serious about changing jobs then it is likely that you are applying to other positions. It is also a way of showing that you are in demand. Be honest but don't go into too much detail; you don't want to spend a great deal of time on this. If asked about names of who you have spoken to, it is absolutely legitimate to say you prefer not to disclose that information at this stage.

25. Tell me, what are you like working in a team?

Your answer is, of course, that you are an excellent team player; there really is no other valid answer here, as you will not function in an organization as a loner. You may want to mention what type of role you tend to adopt in a team, especially if you want to emphasize key skills such as leadership. Be prepared to give specific examples in a very matter-of-fact way.

26. Why do you want this job?

This question typically follows on from the previous one. Here is where your research will come in handy. You may want to say that you want to work for a company that is X, Y, Z, (market leader, innovator, provides a vital service, whatever it may be). Put some thought into this beforehand, be specific, and link the company's values and mission statement to your own goals and career plans.

27. Who are the main competitors of our company?

This shows you really understand the industry and the main players. Think about a few and say how you think they compare (similarities and differences). This is a good opportunity to highlight what you think are the company's key strengths.

28. What would your previous co-workers say about you?

This is not the arena for full disclosure. You want to stay positive and add a few specific statements or paraphrase. Something like "Joe Blogs always mentioned how reliable and hard-working I was" is enough.

29. Who gives the assignment to test?

Our assumption is that the WG is giving the assignment, and we will be as consensus-driven as possible (with the chair and vice-chair providing advice as necessary in between opportunities for the group to confer), subject to group consensus in cases of controversy. We'll operate on an assumption of communications transparency; so, for example, we should copy the wg-uma list on answers.
We suspect that the SMART project has testing materials that haven't been shared yet. We will ask them to share this as soon as possible if they're able.

30. What is the test assignment?

The goal in this conversation is to be as concrete as possible, within time constraints.
The overall goal of having a hosted validator is to test implementations for conformance and interop against the protocol's testable assertions.
Since the spec documents to date are incomplete, we can't get 100% of the way in the time allotted for the bounty program.
So given all this, the suggestion is to develop a draft (likely incomplete) test plan that highlights wherever more information is needed from the WG.
A good model is the SAML 2.0 test plan, but it had an advantage that UMA currently doesn't have: a conformance requirements document! This is something the group should put a priority on.

31. What is a Load/Performance Test Plan?

Reference Documents
Reference information used for the development of this plan, including:
★ Business requirements
★ Technical requirements
★ Test requirements
★ Other dependencies

32. Explain the Sample Test Strategy Worksheet?

Type of Computing Environments: Web-based, Mainframe Batch
Purpose of Testing: To validate that business processes are supported by the customized application
Type of Software: Vendor-developed, Web-based
Scope of Testing: A/P, A/R, Payroll, HR, integration with existing systems
Critical Success Factors: Security, correctness, performance, ease of use, interoperability
Phases of Testing: Unit, System, UAT
Audience: Internal - Accounting, Payroll, HR, new and existing employees, personnel in interfaced systems
Tradeoffs: Scope, cost
Types of Testing: Functional, based on business processes, security policies, and use cases
Development Tools and Test Tools (e.g., GUI builders, automated capture/playback): Screen capture, test management, defect management
Business/Operational Concerns: Payroll processing must be correct (the company could be fined if it is in error); HR processing must be correct; accounting processing must be correct; performance times must be within specified limits
Risks: Application risks - security, correctness, data conversion; Project risks - lack of experience with the application and technology, no defined requirements, no defined processes, high employee turnover
Constraints: Lack of time, lack of management support, lack of experience, lack of a dedicated test environment
Assumptions: Critical need for the new application; ongoing vendor support; vendor will customize the application; vendor will fix defects
Deliverables: Final test report, defect log, baseline of correct test results for future tests

Sample Test Strategy Worksheet columns (one row per phase): Project Phase | Testing Phase | Stakeholders | Purpose/Why

33. Test Plan Outline:

1. BACKGROUND
2. INTRODUCTION
3. ASSUMPTIONS
4. TEST ITEMS
List each of the items (programs) to be tested.
5. FEATURES TO BE TESTED
List each of the features (functions or requirements) which will be tested or demonstrated by the test.
6. FEATURES NOT TO BE TESTED
Explicitly lists each feature, function, or requirement which won't be tested and why not.
7. APPROACH
Describe the data flows and test philosophy.
Simulation or Live execution, Etc.
8. ITEM PASS/FAIL CRITERIA
Blanket statement
Itemized list of expected output and tolerances
9. SUSPENSION/RESUMPTION CRITERIA
Must the test run from start to completion?
Under what circumstances may it be resumed in the middle?
Establish check-points in long tests.
10. TEST DELIVERABLES
What, besides software, will be delivered?
Test report
Test software
11. TESTING TASKS
Functional tasks (e.g., equipment setup)
Administrative tasks
12. ENVIRONMENTAL NEEDS
Security clearance
Office space & equipment
Hardware/software requirements
13. RESPONSIBILITIES
Who does the tasks in Section 11?
What does the user do?
14. STAFFING & TRAINING
15. SCHEDULE
16. RESOURCES
17. RISKS & CONTINGENCIES
18. APPROVALS

34. Test Plan Template:

Test Plan Template
(Name of the Product)
TABLE OF CONTENTS

1.0 INTRODUCTION

2.0 OBJECTIVES AND TASKS
2.1 Objectives
2.2 Tasks

3.0 SCOPE

4.0 Testing Strategy
4.1 Alpha Testing (Unit Testing)
4.2 System and Integration Testing
4.3 Performance and Stress Testing
4.4 User Acceptance Testing
4.5 Batch Testing
4.6 Automated Regression Testing
4.7 Beta Testing

5.0 Hardware Requirements

6.0 Environment Requirements
6.1 Mainframe
6.2 Workstation

7.0 Test Schedule

8.0 Control Procedures

9.0 Features to Be Tested

10.0 Features Not to Be Tested

11.0 Resources/Roles & Responsibilities

12.0 Schedules

13.0 Significantly Impacted Departments (SIDs)

14.0 Dependencies

15.0 Risks/Assumptions

16.0 Tools

17.0 Approvals

18.0 References

Appendices

35. What does a sample test plan look like?

Title

Submitted to: [Name] [Address]

Submitted by: [Name] [Address]

Document No
Contract No
Date

Approvals: [Person Name] [Person Title] [Business Name]

Table of Contents

1. Introduction
* Purpose
* Scope
2. Applicability
* Applicable Documents
* Documents
3. Program Management and Planning
* The SQA Plan
* Organization
* Tasks
4. Software Training
* SQA Personnel
* Software Developer Training Certification
5. SQA Program Requirements
* Program Resources Allocation Monitoring
* SQA Program Audits
1. Scheduled Audits
2. Unscheduled Audits
3. Audits of the SQA Organization
4. Audit Reports
* SQA Records
* SQA Status Reports
* Software Documentation
* Requirements Traceability
* Software Development Process
* Project reviews
1. Formal Reviews
2. Informal Reviews
* Tools and Techniques
* Software Configuration Management
* Release Procedures
* Change Control
* Problem Reporting
* Software Testing
1. Unit Test
2. Integration Test
3. System Testing
4. Validation Testing

Attachment 1 Coding Documentation Guidelines
Attachment 2 Testing Requirements