Test Summary Report
Project Name
Version X.X
Table of Contents
Appendix A: Test Incident Reports (TIRs)
Appendix D: Referenced Documents
Appendix F: Notes to the Author/Template Instructions
Table 1 - Test Case Summary Results
Table 2 - Test Incident Summary Results
Table 3 - <Test Category/Function> Results
Table 4 - Example Test Incident Report (TIR)
Table 5 - Incident Description
Table 6 - Incident Resolution
Table 7 - Record of Changes
Table 8 - Referenced Documents
Introduction
Instructions: Provide full identifying information for the automated system, application, or situation to which the Test Summary Report applies, including, as applicable, identification number(s), title(s)/name(s), abbreviation(s)/acronym(s), part number(s), version number(s), and release number(s). Summarize the purpose of the document, the scope of activities that resulted in its development, the intended audience for the document, and the expected evolution of the document. Also describe any security or privacy considerations associated with use of the Test Summary Report.
1.1 Overview
Instructions: Provide a brief description of the testing
process employed. Summarize what testing activities took place, including the
versions/releases of the software, environment, etc. Identify the test
functions performed, the test period(s), test location(s), and the test
participants and their roles in the testing process.
Summary Assessment
Instructions: Provide an overall assessment of the build or release tested, with a summary of the test results, including the number of test incidents summarized by impact/severity level. Include operational definitions for each of the reported impact/severity levels established for the project in the Glossary section of this document. If test results are maintained in an automated tool, the information may be exported or printed from the tool for inclusion in this document.
**ATTENTION**: Please ensure the accuracy of the numbers listed in this table. For example, the number of test cases passed plus the number of test cases failed plus the number of test cases held must match the total number of test cases reviewed.
· Test Cases Planned: Number of test cases planned to execute for this release
· Test Cases Run: Actual number of planned test cases executed
· Test Cases Reviewed: Number of executed test cases reviewed based on result
· Test Cases Passed: Actual number of reviewed test cases that met the expected result
· Test Cases Failed: Actual number of reviewed test cases that failed to meet the expected result
· Test Cases To Be Run: Number of planned test cases remaining to be executed
· Test Cases Held: Number of planned test cases on hold/not applicable/postponed at this point in time
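The consistency rule stated above (passed plus failed plus held must match reviewed) can be sketched as a quick cross-check before the table is published. This is an illustrative sketch only; the function name and the sample counts are hypothetical, not part of the template.

```python
# Illustrative sketch only: cross-checks the summary counts per the rule
# stated above. The function name and sample figures are hypothetical.
def summary_is_consistent(reviewed: int, passed: int, failed: int, held: int) -> bool:
    """Passed + failed + held must match the total number of test cases reviewed."""
    return passed + failed + held == reviewed

# Example: 45 reviewed = 40 passed + 3 failed + 2 held
print(summary_is_consistent(reviewed=45, passed=40, failed=3, held=2))  # True
```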
The following is a summary of the test case results obtained for the reported test effort. Refer to subordinate sections of this document for detailed results and explanations of any reported variances.
Table 1 - Test Case Summary Results
| Summary Assessment | Total Number of Test Cases | % of Total Planned | Comments |
| Test Cases Planned | <# test cases planned> | <% total planned> | <Comments> |
| Test Cases Run | <# test cases run> | <% total planned test cases run> | <Comments> |
| Test Cases Reviewed | <# test cases reviewed> | <% total planned test cases reviewed> | <Comments> |
| Test Cases Passed | <# test cases passed> | <% total planned test cases passed> | <Comments> |
| Test Cases Failed | <# test cases failed> | <% total planned test cases failed> | <Comments> |
| Test Cases To Be Run | <# test cases to be run> | <% total planned test cases to be run> | <Comments> |
| Test Cases Held | <# test cases held> | <% total planned test cases held> | <Comments> |
The following is a summary of the test incidents (i.e., unexpected results, problems, and/or defects) that were reported during the testing:
Table 2 - Test Incident Summary Results
| Impact/Severity Level | Total Reported | Total # Resolved | % Total Resolved | Total # Unresolved | % Total Unresolved |
| <Impact/Severity level> | <# total reported> | <# total resolved> | <% total resolved> | <# total unresolved> | <% total unresolved> |
| <Impact/Severity level> | <# total reported> | <# total resolved> | <% total resolved> | <# total unresolved> | <% total unresolved> |
| <Impact/Severity level> | <# total reported> | <# total resolved> | <% total resolved> | <# total unresolved> | <% total unresolved> |
| Combined Totals | <Combined total # reported> | <Combined total # resolved> | <Combined total % resolved> | <Combined total # unresolved> | <Combined total % unresolved> |
Instructions: Briefly describe the testing process employed for each test category (i.e., development testing, validation testing, implementation testing, and operational testing) and each test function performed (i.e., a collection of related test cases comprising a specific type of test, e.g., user acceptance testing, Section 508 testing, regression testing, system acceptance testing, ST&E, etc.). Also summarize the test results for each test category/function. As appropriate, include separate sub-sections for each test category/function performed. If test results are maintained in an automated tool, the information may be exported or printed from the tool for inclusion in this document.
2.1 <Test Category/Function>
Table 3 - <Test Category/Function> Results summarizes the test cases employed for <test category/function> and the test results obtained for each test case.
Table 3 - <Test Category/Function> Results
| Test Case/Script ID | Test Case/Script Description | Date Tested | Pass/Fail | Comments |
| <Test case/script ID> | <Test case/script description> | <MM/DD/YYYY> | <Pass/Fail> | <Comments> |
Instructions: If the test case failed, list the
corresponding TIR ID in the Comments column.
The calculated level of success for <test category/function> was <percentage of the test cases defined for the test that passed>%.
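The level-of-success figure above is a simple pass percentage over the test cases defined for the test. A minimal sketch follows; the function name and sample counts are hypothetical, not part of the template.

```python
# Illustrative sketch only: level of success as the percentage of the
# test cases defined for the test that passed. Names are hypothetical.
def level_of_success(passed: int, defined: int) -> float:
    """Return the pass percentage for a test category/function."""
    return 100.0 * passed / defined

# Example: 45 of 50 defined test cases passed
print(f"{level_of_success(45, 50):.1f}%")  # 90.0%
```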
2.2 <Test Category/Function>
Instructions: All of the information described above for <test category/function> should be replicated for each defined test category/function. The reported test categories/functions should be consistent with those defined in the corresponding Test Plan.
Variances
Instructions: Describe any variances between the testing that was planned and the testing that actually occurred. Also explain whether the number of planned tests has changed from a previous report; it is important to account for all planned tests. In addition, provide an assessment of how the test environment may differ from the operational environment and the effect of this difference on the test results.
Test Incidents
Instructions: Provide a brief description of the unexpected results, problems, or defects that occurred during the testing.
4.1
Resolved Test Incidents
Instructions: Identify all resolved test incidents and
summarize their resolutions. Reference may be made to Test Incident Reports
that describe in detail the unexpected results, problems, or defects reported
during testing, along with their documented resolutions, which may be included
as an appendix to this document. If test results are maintained in an automated
tool, the information may be exported or printed from the tool for inclusion in
this document.
4.2
Unresolved Test Incidents
Instructions: Identify all unresolved test incidents and
provide a plan of action for their resolution. Reference may be made to Test
Incident Reports that describe in detail the unexpected results, problems, or
defects reported during testing, which may be included as an appendix to this
document. If test results are maintained in an automated tool, the information
may be exported or printed from the tool for inclusion in this document.
Recommendations
Instructions: Provide any recommended improvements in the design, operation, or future testing of the business product that resulted from the testing being reported. A discussion of each recommendation and its impact on the business product may be provided. If there are no recommendations to report, simply state so.
Appendix A: Test Incident Reports (TIRs)
Instructions: Identify and describe any Test Incident Reports generated during the course of testing activities, including:
· Resolved Test Incident Reports (TIRs) - Include a completed TIR for each unexpected result, problem, or defect reported and resolved during testing.
· Unresolved Test Incident Reports - Include a completed TIR for each unexpected result, problem, or defect reported during testing that remains unresolved.
Table 4 - Example Test Incident Report (TIR)
| Category | Details |
| Test Incident ID | <Test incident ID> |
| Test Case ID | <Test case full name> |
| Test Incident Date | <MM/DD/YYYY> |
| Test Incident Time | <Test incident time> |
| Tester Name | <First name last name> |
| Tester Phone | <NNN-NNN-NNNN> |
Table 5 - Incident Description
| Category | Details |
| Error message and/or description of unexpected result, problem, or defect. For unexpected results, describe how the actual results differed from the expected results | <Error message/description of incident> |
| Test case procedure step where incident occurred, if applicable | <Test case procedure step where incident occurred> |
| Failed software (e.g., program name, screen name, etc.), if known | <Failed software> |
| Test case anomalies or special circumstances (e.g., inputs, environment, etc.) | <Test case anomalies/special circumstances> |
| Impact on testing or test item | <Impact on testing/test item> |
| Description Prepared By | <First name last name> |
| Date | <MM/DD/YYYY> |
Table 6 - Incident Resolution
| Category | Details |
| Incident Referred to | <First name last name> |
| Date | <MM/DD/YYYY> |
| Incident determined to be the result of | <Program error, data error, or environmental problem> |
| If “Program Error” has been selected, name program or module | <Program or module> |
| Impact/severity level determined to be | <High/Severe, Moderate/Serious, or Low/Insignificant> |
| Description of all resolution activities | <Description of resolution activities> |
| Resolution Prepared By | <First name last name> |
| Date | <MM/DD/YYYY> |
Record of Changes
Instructions: Provide information on how the development and distribution of the Test Summary Report will be controlled and tracked. Use the table below to provide the version number, the date of the version, the author/owner of the version, and a brief description of the reason for creating the revised version.
Table 7 - Record of Changes
| Version Number | Date | Author/Owner | Description of Change |
| <X.X> | <MM/DD/YYYY> | CMS | <Description of Change> |
| <X.X> | <MM/DD/YYYY> | CMS | <Description of Change> |
| <X.X> | <MM/DD/YYYY> | CMS | <Description of Change> |
Glossary
Instructions: Provide clear and concise definitions for terms used in this document that may be unfamiliar to readers of the document. Terms are to be listed in alphabetical order.
| Term | Acronym | Definition |
| <Term> | <Acronym> | <Definition> |
| <Term> | <Acronym> | <Definition> |
| <Term> | <Acronym> | <Definition> |
Appendix D: Referenced Documents
Instructions: Summarize the relationship of this document to other relevant documents. Provide identifying information for all documents used to arrive at and/or referenced within this document (e.g., related and/or companion documents, prerequisite documents, relevant technical documentation, etc.).
Table 8 - Referenced Documents
| Document Name | Document Location and/or URL | Issuance Date |
| <Document Name> | <Document Location and/or URL> | <MM/DD/YYYY> |
| <Document Name> | <Document Location and/or URL> | <MM/DD/YYYY> |
| <Document Name> | <Document Location and/or URL> | <MM/DD/YYYY> |
Approvals
The undersigned acknowledge that they have reviewed the Test Summary Report and agree with the information presented within this document. Changes to this Test Summary Report will be coordinated with, and approved by, the undersigned, or their designated representatives.
Instructions:
List the individuals whose signatures are desired. Examples of such individuals
are Business Owner, Project Manager (if identified), and any appropriate
stakeholders. Add additional lines for signature as necessary.
| Document Approved By | Date Approved |
| Name: <Name>, <Job Title> - <Company> | Date |
| Name: <Name>, <Job Title> - <Company> | Date |
| Name: <Name>, <Job Title> - <Company> | Date |
| Name: <Name>, <Job Title> - <Company> | Date |
Appendix F: Notes to the Author/Template Instructions
This document is a template for creating a Test Summary Report for a given investment or project. The final document should be delivered in an electronically searchable format. The Test Summary Report should stand on its own, with all elements explained and acronyms spelled out for readers/reviewers, including reviewers outside CMS who may not be familiar with CMS projects and investments.
This template was designed based on best practices and information to support CMS governance and IT processes. Use of this template is not mandatory; rather, programs are encouraged to adapt it to their needs by adding or removing sections as appropriate. Programs are also encouraged to leverage these templates as the basis for web-based system development artifacts.
This template includes instructions, boilerplate text, and fields. The author should note that:
· Each section provides instructions or describes the intent, assumptions, and context for content included in that section. Instructional text appears in blue italicized font throughout this template.
· Instructional text in each section should be replaced with information specific to the particular investment.
· Some text and tables are provided as boilerplate examples of wording and formats that may be used or modified as appropriate.
When using this template, follow these steps:
1. Place table captions and descriptions left-aligned, above the table.
2. Modify any boilerplate text, as appropriate, for your specific project.
3. Ensure all documents are compliant with Section 508 requirements.
4. Place figure captions and descriptions left-aligned, below the figure. All figures must have an associated tag providing appropriate alternative text for Section 508 compliance.
5. Delete this “Notes to the Author/Template Instructions” page and all instructions to the author before finalizing the initial draft of the document.
