
IEEE Standard for Software Reviews 1028-1997



IEEE Std 1028-1997

IEEE Standard for Software Reviews

IEEE Computer Society

Sponsored by the

Software Engineering Standards Committee

4 March 1998

SH94592

IEEE Std 1028-1997

IEEE Standard for Software Reviews

Sponsor

Software Engineering Standards Committee of the

IEEE Computer Society

Approved 9 December 1997

IEEE Standards Board

Abstract: This standard defines five types of software reviews, together with procedures required for the execution of each review type. This standard is concerned only with the reviews; it does not define procedures for determining the necessity of a review, nor does it specify the disposition of the results of the review. Review types include management reviews, technical reviews, inspections, walk-throughs, and audits.

Keywords: audit, inspection, review, walk-through

The Institute of Electrical and Electronics Engineers, Inc.
345 East 47th Street, New York, NY 10017-2394, USA

Copyright © 1998 by the Institute of Electrical and Electronics Engineers, Inc. All rights reserved. Published 1998. Printed in the United States of America.

ISBN 1-55937-987-1

No part of this publication may be reproduced in any form, in an electronic retrieval system or otherwise, without the prior written permission of the publisher.

IEEE Standards documents are developed within the IEEE Societies and the Standards Coordinating Committees of the IEEE Standards Board. Members of the committees serve voluntarily and without compensation. They are not necessarily members of the Institute. The standards developed within IEEE represent a consensus of the broad expertise on the subject within the Institute as well as those activities outside of IEEE that have expressed an interest in participating in the development of the standard.

Use of an IEEE Standard is wholly voluntary. The existence of an IEEE Standard does not imply that there are no other ways to produce, test, measure, purchase, market, or provide other goods and services related to the scope of the IEEE Standard. Furthermore, the viewpoint expressed at the time a standard is approved and issued is subject to change brought about through developments in the state of the art and comments received from users of the standard. Every IEEE Standard is subjected to review at least every five years for revision or reaffirmation. When a document is more than five years old and has not been reaffirmed, it is reasonable to conclude that its contents, although still of some value, do not wholly reflect the present state of the art. Users are cautioned to check to determine that they have the latest edition of any IEEE Standard.

Comments for revision of IEEE Standards are welcome from any interested party, regardless of membership affiliation with IEEE. Suggestions for changes in documents should be in the form of a proposed change of text, together with appropriate supporting comments.

Interpretations: Occasionally questions may arise regarding the meaning of portions of standards as they relate to specific applications. When the need for interpretations is brought to the attention of IEEE, the Institute will initiate action to prepare appropriate responses. Since IEEE Standards represent a consensus of all concerned interests, it is important to ensure that any interpretation has also received the concurrence of a balance of interests. For this reason, IEEE and the members of its societies and Standards Coordinating Committees are not able to provide an instant response to interpretation requests except in those cases where the matter has previously received formal consideration.

Comments on standards and requests for interpretations should be addressed to:

Secretary, IEEE Standards Board
445 Hoes Lane
P.O. Box 1331
Piscataway, NJ 08855-1331
USA

Note: Attention is called to the possibility that implementation of this standard may require use of subject matter covered by patent rights. By publication of this standard, no position is taken with respect to the existence or validity of any patent rights in connection therewith. The IEEE shall not be responsible for identifying all patents for which a license may be required by an IEEE standard or for conducting inquiries into the legal validity or scope of those patents that are brought to its attention.

Authorization to photocopy portions of any individual standard for internal or personal use is granted by the Institute of Electrical and Electronics Engineers, Inc., provided that the appropriate fee is paid to Copyright Clearance Center. To arrange for payment of licensing fee, please contact Copyright Clearance Center, Customer Service, 222 Rosewood Drive, Danvers, MA 01923 USA; (508) 750-8400. Permission to photocopy portions of any individual standard for educational classroom use can also be obtained through the Copyright Clearance Center.

Introduction

(This introduction is not part of IEEE Std 1028-1997, IEEE Standard for Software Reviews.)

This Introduction provides the user with the rationale and background of the reviews outlined in this standard and their relationships to other IEEE standards.

Purpose

This standard defines five types of software reviews, together with procedures required for the execution of each review type. This standard is concerned only with the reviews; it does not define procedures for determining the necessity of a review, nor does it specify the disposition of the results of the review. Review types include management reviews, technical reviews, inspections, walk-throughs, and audits.

This standard is meant to be used either in conjunction with other IEEE software engineering standards or as a stand-alone definition of software review procedures. In the latter case, local management must determine the events that precede and follow the actual software reviews.

The need for reviews is described in several other IEEE standards, as well as standards prepared by other standards-writing organizations. IEEE Std 1028-1997 is meant to support these other standards. In particular, reviews required by the following standards can be executed using the procedures described herein:

IEEE Std 730-19 [B1]a
IEEE Std 828-1990 [B2]
IEEE Std 1012-1986 [B5]
IEEE Std 1058.1-1987 [B8]
IEEE Std 1074-1995 [B10]
IEEE Std 1219-1992 [B11]
IEEE Std 1220-1994 [B12]
IEEE Std 1228-1994 [B13]

IEEE Std 1298-1992 (AS 3563.1-1991) [B14]
ISO/IEC 12207:1995 [B15]

The use of IEEE Std 1044-1993 [B7] is encouraged as part of the reporting procedures for this standard.

General application intent

This standard applies throughout the scope of any selected software life-cycle model and provides a standard against which software review plans can be prepared and assessed. Maximum benefit can be derived from this standard by planning for its application early in the project life cycle.

This standard for software reviews was written in consideration of both the software and its system operating environment. It can be used where software is the total system entity or where it is part of a larger system. Care should be taken to integrate software review activities into any total system life-cycle planning; software reviews should exist in concert with hardware and computer system reviews to the benefit of the entire system.

Reviews carried out in conformance with this standard may include both personnel internal to the project and customers or acquirers of the product, according to local procedures. Subcontractors may also be included if appropriate.

a The numbers in brackets correspond to those of the bibliography in Annex C.

Copyright © 1998 IEEE. All rights reserved.


The information obtained during software reviews (particularly inspections) may be of benefit for improving the user’s software acquisition, supply, development, operation, and maintenance processes. The use of review data for process improvement is not required by this standard, but their use is strongly encouraged.

Conformance

Conformance to this standard for a specific review type can be claimed when all mandatory actions (indicated by “shall”) are carried out as defined in this standard for the review type used. Claims for conformance should be phrased to indicate the review types used; for example, “conforming to IEEE Std 1028-1997 for inspections.”

Development procedure

This standard was developed by the Software Engineering Review Working Group. The entire standards-writing procedure was carried out via electronic mail.

Participants

At the time this standard was completed, the Software Engineering Review Working Group had the following membership:

J. Dennis Lawrence, Chair
Patricia A. Trellue, Technical Editor

Frank Ackerman
Leo Beltracchi
Ron Berlack
Antonio Bertolino
Richard J. Blauw
Audrey Brewer
James E. Cardow
Hu Cheng
Pat Daggett
Ronald Dean†
Janet Deeney*†
Claude G. Diderich
Leo G. Egan
Martin Elliot
Jon Fairclough*
Karol Fruehauf
Andrew Gabb
Tom Gilb
Jon Hagar
John Harauz
Hans-Ludwig Hausen
Michael Haux
Herb Hecht
Chuck Howell
Laura Ippolito
Rikkila Juha
George X. Kambic
Myron S. Karasik
Stanley H. Levinson
Michael S. Lines
Jordan Matejceck
Archibald McKinlay
Warren L. Persons†
Peter T. Poon
Christian Reiser
Helmut Sandmayr
Hans Schaefer*
Katsu Shintani
Mel E. Smyre
Julia Stesney
Gina To†
André Villas-Boas
Dolores Wallace
David A. Wheeler
Ron Yun
Tony Zawilski

* Principal writers
† Ballot resolution


The following persons were on the balloting committee:

Leo Beltracchi
Mordechai Ben-Menachem
H. Ronald Berlack
Audrey C. Brewer
Alan L. Bridges
Kathleen L. Briggs
David W. Burnett
Edward R. Byrne
Thomas G. Callaghan
Stuart Ross Campbell
James E. Cardow
Jaya R. Carl
Leslie Chambers
Keith Chan
John P. Chihorek
S. V. Chiyyarath
Antonio M. Cicu
Theo Clarke
Sylvain Clermont
Rosemary Coleman
Darrell Cooksey
Geoff Cozens
Thomas Crowley
Gregory T. Daich
Hillary Davidson
Bostjan K. Derganc
Sanjay Dewal
Michael P. Dewalt
Charles Droz
Robert G. Ebenau
Christof Ebert
William Eventoff
Jonathan H. Fairclough
John W. Fendrich
Jay Forster
Kirby Fortenberry
Barry L. Garner
Adel N. Ghannam
Hiranmay Ghosh
Marilyn Ginsberg-Finner
M. Joel Gittleman
John Garth Glynn
Julio Gonzalez-Sanz
Lewis Gray
Lawrence M. Gunther
Jon Hagar
John Harauz
Rob Harker
Herbert Hecht
William Hefley
Manfred Hein
Mark Henley
Umesh P. Hiriyannaiah
John W. Horch
Fabrizio Imelio
George Jackelen
Frank V. Jorgensen
Vladan V. Jovanovic
William S. Junk
George X. Kambic
David W. Kane
Myron S. Karasik
Ron S. Kenett
Judy Kerner
Robert J. Kierzyk
Motti Y. Klein
Dwayne L. Knirk
Shaye Koenig
Joan Kundig
Thomas M. Kurihara
J. Dennis Lawrence
Randal Leavitt
Stanley H. Levinson
Michael Lines
William M. Lively
Dieter Look
David Maibor
Philip P. Mak
Tomoo Matsubara
Scott D. Matthews
Patrick McCray
Sue McGrath
Bret Michael
Alan Miller
Millard Allen Mobley
James W. Moore
Mike Ottewill
Mark Paulk
David E. Peercy
Warren L. Persons
John G. Phippen
Peter T. Poon
Margaretha W. Price
Lawrence S. Przybylski
Kenneth R. Ptack
Terence P. Rout
Andrew P. Sage
Helmut Sandmayr
Stephen R. Schach
Hans Schaefer
David J. Schultz
Gregory D. Schumacher
Robert W. Shillato
Katsutoshi Shintani
Carl A. Singer
James M. Sivak
Alfred R. Sorkowitz
Donald W. Sova
Fred J. Strauss
Michael Surratt
Douglas H. Thiele
Booker Thomas
Carmen J. Trammell
Patricia A. Trellue
Richard D. Tucker
Margaret C. Updike
Theodore J. Urbanowicz
Glenn D. Venables
Dolores Wallace
David A. Wheeler
Camille S. White-Partain
Charles D. Wilson
Paul R. Work
Weider D. Yu
Peter F. Zoll


When the IEEE Standards Board approved this standard on 9 December 1997, it had the following membership:

Donald C. Loughry, Chair

Richard J. Holleman, Vice Chair

Andrew G. Salem, Secretary

Clyde R. Camp
Stephen L. Diamond
Harold E. Epstein
Donald C. Fleckenstein
Jay Forster*
Thomas F. Garrity
Donald N. Heirman
Jim Isaak
Ben C. Johnson
Lowell Johnson
Robert Kennelly
E. G. “Al” Kiener
Joseph L. Koepfinger*
Stephen R. Lambert
Lawrence V. McCall
L. Bruce McClung
Marco W. Migliaro
Louis-François Pau
Gerald H. Peterson
John W. Pope
Jose R. Ramos
Ronald H. Reimer
Ingo Rüsch
John S. Ryan
Chee Kiow Tan
Howard L. Wolfman

* Member Emeritus

Also included are the following nonvoting IEEE Standards Board liaisons:

Satish K. Aggarwal
Alan H. Cookson

Paula M. Kelty
IEEE Standards Project Editor


Contents

1. Overview
   1.1 Purpose
   1.2 Scope
   1.3 Conformance
   1.4 Organization of standard
   1.5 Application of standard

2. References

3. Definitions

4. Management reviews
   4.1 Introduction
   4.2 Responsibilities
   4.3 Input
   4.4 Entry criteria
   4.5 Procedures
   4.6 Exit criteria
   4.7 Output

5. Technical reviews
   5.1 Introduction
   5.2 Responsibilities
   5.3 Input
   5.4 Entry criteria
   5.5 Procedures
   5.6 Exit criteria
   5.7 Output

6. Inspections
   6.1 Introduction
   6.2 Responsibilities
   6.3 Input
   6.4 Entry criteria
   6.5 Procedures
   6.6 Exit criteria
   6.7 Output
   6.8 Data collection recommendations
   6.9 Improvement

7. Walk-throughs
   7.1 Introduction
   7.2 Responsibilities
   7.3 Input
   7.4 Entry criteria
   7.5 Procedures
   7.6 Exit criteria
   7.7 Output
   7.8 Data collection recommendations
   7.9 Improvement

8. Audits
   8.1 Introduction
   8.2 Responsibilities
   8.3 Input
   8.4 Entry criteria
   8.5 Procedures
   8.6 Exit criteria
   8.7 Output

Annex A (informative) Relationship of this standard to the life cycle processes of other standards
Annex B (informative) Comparison of review types
Annex C (informative) Bibliography

IEEE Standard for Software Reviews

1. Overview

1.1 Purpose

The purpose of this standard is to define systematic reviews applicable to software acquisition, supply, development, operation, and maintenance. This standard describes how to carry out a review. Other standards or local management define the context within which a review is performed, and the use made of the results of the review. Software reviews can be used in support of the objectives of project management, system engineering (for example, functional allocation between hardware and software), verification and validation, configuration management, and quality assurance. Different types of reviews reflect differences in the goals of each review type. Systematic reviews are described by their defined procedures, scope, and objectives.

1.2 Scope

This standard provides minimum acceptable requirements for systematic software reviews, where “systematic” includes the following attributes:

a) Team participation
b) Documented results of the review
c) Documented procedures for conducting the review

Reviews that do not meet the requirements of this standard are considered to be nonsystematic reviews. This standard is not intended to discourage or prohibit the use of nonsystematic reviews.

The definitions, requirements, and procedures for the following five types of reviews are included within this standard:

a) Management reviews
b) Technical reviews
c) Inspections
d) Walk-throughs
e) Audits

This standard does not establish the need to conduct specific reviews; that need is defined by other software engineering standards or by local procedures. This standard provides definitions, requirements, and procedures that are applicable to the reviews of software development products throughout the software life cycle.


Users of this standard shall specify where and when this standard applies and any intended deviations from this standard.

It is intended that this standard be used with other software engineering standards that determine the products to be reviewed, the timing of reviews, and the necessity for reviews. This standard is closely aligned with IEEE Std 1012-1986 [B5],1 but can also be used with IEEE Std 1074-1995 [B10], IEEE Std 730-19 [B1], ISO/IEC 12207:1995 [B15], and other standards. Use with other standards is described in Annex A. A useful model is to consider IEEE Std 1028-1997 as a subroutine to the other standards. Thus, if IEEE Std 1012-1986 were used to carry out the verification and validation process, the procedure in IEEE Std 1012-1986 could be followed until such time as instructions to carry out a specific review are encountered. At that point, IEEE Std 1028-1997 would be “called” to carry out the review, using the specific review type described herein. Once the review has been completed, IEEE Std 1012-1986 would be returned to for disposition of the results of the review and any additional action required by IEEE Std 1012-1986.

In this model, requirements and quality attributes for the software product are “parameter inputs” to the review and are imposed by the “caller.” When the review is finished, the review outputs are “returned” to the “caller” for action. Review outputs typically include anomaly lists and action item lists; the resolution of the anomalies and action items is the responsibility of the “caller.”
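The “subroutine” model described above can be illustrated with a short sketch. This is purely illustrative: the function names, parameters, and return structure are hypothetical and appear in no standard; the code only mirrors the call/return relationship between a “caller” standard (such as IEEE Std 1012-1986 for verification and validation) and IEEE Std 1028-1997.

```python
# Illustrative sketch of the "subroutine" model. All identifiers here are
# hypothetical; they model the relationship described in the text, not an API.

def conduct_review(review_type, software_product, requirements, quality_attributes):
    """Models IEEE Std 1028-1997: carries out one review of the given type.

    The requirements and quality attributes are "parameter inputs" imposed
    by the caller; the outputs are "returned" to the caller for action.
    """
    assert review_type in {"management review", "technical review",
                           "inspection", "walk-through", "audit"}
    anomaly_list = []   # anomalies found while examining the product
    action_items = []   # follow-up items; their resolution belongs to the caller
    # ... the review procedure for the chosen type (Clauses 4-8) runs here ...
    return {"anomalies": anomaly_list, "action items": action_items}

def verification_and_validation(software_product):
    """Models the "caller" (for example, an IEEE Std 1012-1986 V&V process)."""
    # ... the V&V procedure proceeds until an instruction to hold a review ...
    outputs = conduct_review("inspection", software_product,
                             requirements=["SRS section 3"],
                             quality_attributes=["completeness"])
    # Disposition of the returned results remains the caller's responsibility.
    return outputs

results = verification_and_validation("software design description")
```

The point of the model is the clean interface: IEEE Std 1028-1997 neither decides when a review is needed nor disposes of its results; both belong to the caller.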

1.3 Conformance

Conformance to this standard for a specific review type can be claimed when all mandatory actions (indicated by “shall”) are carried out as defined in this standard for the review type used. Claims for conformance should be phrased to indicate the review types used; for example, “conforming to IEEE Std 1028-1997 for inspections.” The word “shall” is used to express a requirement, “should,” to express a recommendation, and “may,” to express alternative or optional methods of satisfying a requirement.

1.4 Organization of standard

Clauses 4–8 of this standard provide guidance and descriptions for the five types of systematic reviews addressed by this standard. Each of these clauses contains the following information:

a) Introduction. Describes the objectives of the systematic review and provides an overview of the systematic review procedures.
b) Responsibilities. Defines the roles and responsibilities needed for the systematic review.
c) Input. Describes the requirements for input needed by the systematic review.
d) Entry criteria. Describes the criteria to be met before the systematic review can begin, including
   1) Authorization
   2) Initiating event
e) Procedures. Details the procedures for the systematic review, including
   1) Planning the review
   2) Overview of procedures
   3) Preparation
   4) Examination/evaluation/recording of results
   5) Rework/follow-up
f) Exit criteria. Describes the criteria to be met before the systematic review can be considered complete.
g) Output. Describes the minimum set of deliverables to be produced by the systematic review.

1 The numbers in brackets correspond to those of the bibliography in Annex C.


1.5 Application of standard

The procedures and terminology defined in this standard apply to software acquisition, supply, development, operation, and maintenance processes requiring systematic reviews. Systematic reviews are performed on a software product as required by other standards or local procedures.

The term “software product” is used in this standard in a very broad sense. Examples of software products include, but are not limited to, the following:

a) Anomaly reports
b) Audit reports
c) Back-up and recovery plans
d) Build procedures
e) Contingency plans
f) Contracts
g) Customer or user representative complaints
h) Disaster plans
i) Hardware performance plans
j) Inspection reports
k) Installation plans
l) Installation procedures
m) Maintenance manuals
n) Maintenance plans
o) Management review reports
p) Operations and user manuals
q) Procurement and contracting methods
r) Progress reports
s) Release notes
t) Reports and data (for example, review, audit, project status, anomaly reports, test data)
u) Request for proposal
v) Risk management plans
w) Software configuration management plans (see IEEE Std 828-1990 [B2])
x) Software design descriptions (see IEEE Std 1016-1987 [B6])
y) Software project management plans (see IEEE Std 1058-1987 [B8])
z) Software quality assurance plans (see IEEE Std 730-19 [B1])
aa) Software requirements specifications (see IEEE Std 830-1993 [B4])
ab) Software safety plans (see IEEE Std 1228-1994 [B13])
ac) Software test documentation (see IEEE Std 829-1983 [B3])
ad) Software user documentation (see IEEE Std 1063-1987 [B9])
ae) Software verification and validation plans (see IEEE Std 1012-1986 [B5])
af) Source code
ag) Standards, regulations, guidelines, and procedures
ah) System build procedures
ai) Technical review reports
aj) Vendor documents
ak) Walk-through reports

This standard permits reviews that are held by means other than physically meeting in a single location. Examples include telephone conferences, video conferences, and other means of group electronic communication. In such cases the communication means should be defined in addition to the meeting places, and all other review requirements remain applicable.


In order to make use of this standard to carry out a software review, first decide the objective of the review. Next, select an appropriate review type using the guidance in Annex B or a local procedure. Then follow the procedure described in the appropriate clause (4–8) of this standard.
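That selection step can be sketched as a simple lookup from objective to review type. The mapping below is an illustrative simplification only: the objective phrases are paraphrased from the definitions in Clause 3 of this standard, and the helper function is hypothetical; Annex B and local procedures remain the authoritative guidance.

```python
# Simplified, illustrative sketch of selecting a review type from a stated
# objective. Objective phrases are paraphrased from the Clause 3 definitions;
# this is not a substitute for the guidance in Annex B.

REVIEW_OBJECTIVES = {
    "management review": "monitor progress and evaluate the effectiveness of "
                         "management approaches",
    "technical review":  "examine suitability for intended use and identify "
                         "discrepancies from specifications and standards",
    "inspection":        "detect and identify anomalies, including errors and "
                         "deviations from standards and specifications",
    "walk-through":      "ask questions and comment on possible errors and "
                         "violations of development standards",
    "audit":             "independently assess compliance with specifications, "
                         "standards, contracts, or other criteria",
}

def select_review_type(objective_keyword):
    """Return the review types whose objective mentions the given keyword."""
    kw = objective_keyword.lower()
    return [name for name, obj in REVIEW_OBJECTIVES.items() if kw in obj]
```

For example, `select_review_type("compliance")` points at an audit, while `select_review_type("anomalies")` points at an inspection, matching the definitions in 3.2 and 3.3.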

2. References

This standard shall be used in conjunction with the following publications. If the following publications are superseded by an approved revision, the revision shall apply. (Additional standards that may be used to prepare software products that are the subject of reviews are cited in a bibliography in Annex C.)

IEEE Std 100-1996, The IEEE Standard Dictionary of Electrical and Electronics Terms, Sixth Edition.

IEEE Std 610.12-1990, IEEE Standard Glossary of Software Engineering Terminology.

3. Definitions

For purposes of this standard, the following terms and definitions apply. IEEE Std 610.12-1990 2 and IEEE Std 100-1996 should be consulted for terms not defined in this clause.

Six of the terms given here are defined in other IEEE software engineering standards. The definition of theterm “anomaly” is identical to that given in IEEE Std 1044-1993 [B7]. The terms “audit,” “inspection,”“review,” “software product,” and “walk-through” are all defined in IEEE Std 610.12-1990; however, someminor modifications have been made to those definitions to more closely match the content of this standard,as explained in the succeeding paragraph.

IEEE Std 610.12-1990 uses different terms for the object of a review: audits and reviews are defined therein in terms of “work products,” inspections are defined in terms of “development products,” and walk-throughs are defined in terms of “segment of documentation or code.” “Work products” are not defined in IEEE Std 610.12-1990. Since “software product” is defined therein, and it is desirable to use a single term in this standard, a change in terminology was made. Since software products being reviewed are not limited to those “designated for delivery to a user,” that phrase was dropped from the definition of “software product.” The definition of “inspection” has been changed considerably. No other changes to the definitions from IEEE Std 610.12-1990 were made.

3.1 anomaly: Any condition that deviates from expectations based on requirements specifications, design documents, user documents, standards, etc., or from someone’s perceptions or experiences. Anomalies may be found during, but not limited to, the review, test, analysis, compilation, or use of software products or applicable documentation.

3.2 audit: An independent examination of a software product, software process, or set of software processes to assess compliance with specifications, standards, contractual agreements, or other criteria.

3.3 inspection: A visual examination of a software product to detect and identify software anomalies, including errors and deviations from standards and specifications. Inspections are peer examinations led by impartial facilitators who are trained in inspection techniques. Determination of remedial or investigative action for an anomaly is a mandatory element of a software inspection, although the solution should not be determined in the inspection meeting.

2 Information on references can be found in Clause 2.


3.4 management review: A systematic evaluation of a software acquisition, supply, development, operation, or maintenance process performed by or on behalf of management that monitors progress, determines the status of plans and schedules, confirms requirements and their system allocation, or evaluates the effectiveness of management approaches used to achieve fitness for purpose.

3.5 review: A process or meeting during which a software product is presented to project personnel, managers, users, customers, user representatives, or other interested parties for comment or approval.

3.6 software product: (A) A complete set of computer programs, procedures, and associated documentation and data. (B) One or more of the individual items in (A).

3.7 technical review: A systematic evaluation of a software product by a team of qualified personnel that examines the suitability of the software product for its intended use and identifies discrepancies from specifications and standards. Technical reviews may also provide recommendations and examination of various alternatives.

3.8 walk-through: A static analysis technique in which a designer or programmer leads members of the development team and other interested parties through a software product, and the participants ask questions and make comments about possible errors, violation of development standards, and other problems.

4. Management reviews

4.1 Introduction

The purpose of a management review is to monitor progress, determine the status of plans and schedules, confirm requirements and their system allocation, or evaluate the effectiveness of management approaches used to achieve fitness for purpose. Management reviews support decisions about corrective actions, changes in the allocation of resources, or changes to the scope of the project.

Management reviews are carried out by, or on behalf of, the management personnel having direct responsibility for the system. Management reviews identify consistency with and deviations from plans, or adequacies and inadequacies of management procedures. This examination may require more than one meeting. The examination need not address all aspects of the product.

Examples of software products subject to management review include, but are not limited to

a) Anomaly reports
b) Audit reports
c) Back-up and recovery plans
d) Contingency plans
e) Customer or user representative complaints
f) Disaster plans
g) Hardware performance plans
h) Installation plans
i) Maintenance plans
j) Procurement and contracting methods
k) Progress reports
l) Risk management plans
m) Software configuration management plans
n) Software project management plans
o) Software quality assurance plans
p) Software safety plans


q) Software verification and validation plans
r) Technical review reports
s) Software product analyses
t) Verification and validation reports

4.2 Responsibilities

Management reviews are carried out by, or on behalf of, the management personnel having direct responsibility for the system. Technical knowledge may be necessary to conduct a successful management review. Management reviews shall be performed by the available personnel who are best qualified to evaluate the software product.

The following roles shall be established for the management review:

a) Decision maker
b) Review leader
c) Recorder
d) Management staff
e) Technical staff

The following roles may also be established for the management review:

f) Other team members
g) Customer or user representative

Individual participants may act in more than one role.

4.2.1 Decision maker

The decision maker is the person for whom the management review is conducted. The decision maker shall determine if the review objectives have been met.

4.2.2 Review leader

The review leader shall be responsible for administrative tasks pertaining to the review, shall be responsible for planning and preparation as described in 4.5.2 and 4.5.4, shall ensure that the review is conducted in an orderly manner and meets its objectives, and shall issue the review outputs as described in 4.7.

4.2.3 Recorder

The recorder shall document anomalies, action items, decisions, and recommendations made by the review team.

4.2.4 Management staff

Management staff assigned to carry out management reviews are responsible for active participation in the review. Managers responsible for the system as a whole have additional responsibilities as defined in 4.5.1.

4.2.5 Technical staff

The technical staff shall provide the information necessary for the management staff to fulfill its responsibilities.


4.2.6 Customer or user representative

The role of the customer or user representative should be determined by the review leader prior to the review.

4.3 Input

Input to the management review shall include the following:

a) A statement of objectives for the management review
b) The software product being evaluated
c) Software project management plan
d) Status, relative to plan, of the software product completed or in progress
e) Current anomalies or issues list
f) Documented review procedures

Input to the management review should also include the following:

g) Status of resources, including finance, as appropriate
h) Relevant review reports
i) Any regulations, standards, guidelines, plans, or procedures against which the software product should be evaluated
j) Anomaly categories (see IEEE Std 1044-1993 [B7])

Additional reference material may be made available by the individuals responsible for the software product when requested by the review leader.

4.4 Entry criteria

4.4.1 Authorization

The need for conducting management reviews should initially be established in the appropriate project planning documents, as listed in 4.1. Under these plans, completion of a specific software product or completion of an activity may initiate a management review. In addition to those management reviews required by a specific plan, other management reviews may be announced and held at the request of software quality management, functional management, project management, or the customer or user representative, according to local procedures.

4.4.2 Preconditions

A management review shall be conducted only when both of the following conditions have been met:

a) A statement of objectives for the review is established by the management personnel for whom the review is being carried out
b) The required review inputs are available

4.5 Procedures

4.5.1 Management preparation

Managers shall ensure that the review is performed as required by applicable standards and procedures and by requirements mandated by law, contract, or other policy. To this end, managers shall


a) Plan time and resources required for reviews, including support functions, as required in IEEE Std 1058.1-1987 [B8] or other appropriate standards
b) Provide funding and facilities required to plan, define, execute, and manage the reviews
c) Provide training and orientation on review procedures applicable to a given project
d) Ensure appropriate levels of expertise and knowledge sufficient to comprehend the software product under review
e) Ensure that planned reviews are conducted
f) Act on review team recommendations in a timely manner

4.5.2 Planning the review

The review leader shall be responsible for the following activities:

a) Identify, with appropriate management support, the review team
b) Assign specific responsibilities to the review team members
c) Schedule and announce the meeting
d) Distribute review materials to participants, allowing adequate time for their preparation
e) Set a timetable for distribution of review material, the return of comments, and forwarding of comments to the author for disposition

4.5.3 Overview of review procedures

A qualified person should present an overview session for the review team when requested by the review leader. This overview may occur as part of the review meeting (see 4.5.5) or as a separate meeting.

4.5.4 Preparation

Each review team member shall examine the software product and other review inputs prior to the review meeting. Anomalies detected during this examination should be documented and sent to the review leader. The review leader should classify anomalies to ensure that review meeting time is used most effectively. The review leader should forward the anomalies to the author of the software product for disposition.

4.5.5 Examination

The management review shall consist of one or more meetings of the review team. The meetings shall accomplish the following goals:

a) Review the objectives of the management review
b) Evaluate the software product under review against the review objectives
c) Evaluate project status, including the status of plans and schedules
d) Review anomalies identified by the review team prior to the review
e) Generate a list of action items, emphasizing risks
f) Document the meeting

The meetings should accomplish the following goals as appropriate:

g) Evaluate the risk issues that may jeopardize the success of the project
h) Confirm software requirements and their system allocation
i) Decide the course of action to be taken or recommendations for action
j) Identify other issues that should be addressed

4.5.6 Rework/follow-up

The review leader shall verify that the action items assigned in the meeting are closed.


4.6 Exit criteria

The management review shall be considered complete when the activities listed in 4.5.5 have been accomplished and the output described in 4.7 exists.

4.7 Output

The output from the management review shall be documented evidence that identifies

a) The project being reviewed
b) The review team members
c) Review objectives
d) Software product reviewed
e) Specific inputs to the review
f) Action item status (open, closed), ownership, and target date (if open) or completion date (if closed)
g) A list of anomalies identified by the review team that must be addressed for the project to meet its goals

Although this standard sets minimum requirements for the content of the documented evidence, it is left to local procedures to prescribe additional content, format requirements, and media.
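The minimum content of this documented evidence can be pictured as a simple record. The sketch below is illustrative only; the type and field names are assumptions of this example, not terms defined by the standard:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ActionItem:
    # Item f) of 4.7: status, ownership, and target or completion date.
    description: str
    owner: str
    status: str  # "open" or "closed"
    target_date: Optional[str] = None      # expected while status is "open"
    completion_date: Optional[str] = None  # expected once status is "closed"

@dataclass
class ManagementReviewEvidence:
    # Hypothetical record covering minimum items a) through g) of 4.7.
    project: str                 # a) the project being reviewed
    team_members: list           # b) the review team members
    objectives: list             # c) review objectives
    product_reviewed: str        # d) software product reviewed
    inputs: list                 # e) specific inputs to the review
    action_items: list           # f) ActionItem entries
    anomalies: list = field(default_factory=list)  # g) anomalies to address
```

Local procedures would layer additional content, format requirements, and media on top of these minimum fields.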

5. Technical reviews

5.1 Introduction

The purpose of a technical review is to evaluate a software product by a team of qualified personnel to determine its suitability for its intended use and identify discrepancies from specifications and standards. It provides management with evidence to confirm whether

a) The software product conforms to its specifications
b) The software product adheres to regulations, standards, guidelines, plans, and procedures applicable to the project
c) Changes to the software product are properly implemented and affect only those system areas identified by the change specification

Technical reviews may also provide the recommendation and examination of various alternatives, which may require more than one meeting. The examination need not address all aspects of the product.

Examples of software products subject to technical review include, but are not limited to

a) Software requirements specification
b) Software design description
c) Software test documentation
d) Software user documentation
e) Maintenance manual
f) System build procedures
g) Installation procedures
h) Release notes


5.2 Responsibilities

The following roles shall be established for the technical review:

a) Decision maker
b) Review leader
c) Recorder
d) Technical staff

The following roles may also be established for the technical review:

e) Management staff
f) Other team members
g) Customer or user representative

Individual participants may act in more than one role.

5.2.1 Decision maker

The decision maker is the person for whom the technical review is conducted. The decision maker shall determine if the review objectives have been met.

5.2.2 Review leader

The review leader shall be responsible for the review. This responsibility includes performing administrative tasks pertaining to the review, ensuring that the review is conducted in an orderly manner, and ensuring that the review meets its objectives. The review leader shall issue the review outputs as described in 5.7.

5.2.3 Recorder

The recorder shall document anomalies, action items, decisions, and recommendations made by the review team.

5.2.4 Technical staff

The technical staff shall actively participate in the review and evaluation of the software product.

5.2.5 Management staff

The management staff may participate in the technical review for the purpose of identifying issues that require management resolution.

5.2.6 Customer or user representative

The role of the customer or user representative should be determined by the review leader prior to the review.

5.3 Input

Input to the technical review shall include the following:

a) A statement of objectives for the technical review
b) The software product being examined
c) Software project management plan


d) Current anomalies or issues list for the software product
e) Documented review procedures

Input to the technical review should also include the following:

f) Relevant review reports
g) Any regulations, standards, guidelines, plans, and procedures against which the software product is to be examined
h) Anomaly categories (see IEEE Std 1044-1993 [B7])

Additional reference material may be made available by the individuals responsible for the software product when requested by the review leader.

5.4 Entry criteria

5.4.1 Authorization

The need for conducting technical reviews of a software product shall be defined by project planning documents. In addition to those technical reviews required by a specific plan, other technical reviews may be announced and held at the request of functional management, project management, software quality management, systems engineering, or software engineering according to local procedures. Technical reviews may be required to evaluate impacts of hardware anomalies or deficiencies on the software product.

5.4.2 Preconditions

A technical review shall be conducted only when both of the following conditions have been met:

a) A statement of objectives for the review is established
b) The required review inputs are available

5.5 Procedures

5.5.1 Management preparation

Managers shall ensure that the review is performed as required by applicable standards and procedures and by requirements mandated by law, contract, or other policy. To this end, managers shall

a) Plan time and resources required for reviews, including support functions, as required in IEEE Std 1058.1-1987 [B8] or other appropriate standards
b) Provide funding and facilities required to plan, define, execute, and manage the reviews
c) Provide training and orientation on review procedures applicable to a given project
d) Ensure that review team members possess appropriate levels of expertise and knowledge sufficient to comprehend the software product under review
e) Ensure that planned reviews are conducted
f) Act on review team recommendations in a timely manner

5.5.2 Planning the review

The review leader shall be responsible for the following activities:

a) Identify, with appropriate management support, the review team
b) Assign specific responsibilities to the review team members
c) Schedule and announce the meeting place


d) Distribute review materials to participants, allowing adequate time for their preparation
e) Set a timetable for distribution of review material, the return of comments, and forwarding of comments to the author for disposition

As a part of the planning procedure, the review team shall determine if alternatives are to be discussed at the review meeting. Alternatives may be discussed at the review meeting, afterwards in a separate meeting, or left to the author of the software product to resolve.

5.5.3 Overview of review procedures

A qualified person should present an overview of the review procedures for the review team when requested by the review leader. This overview may occur as a part of the review meeting (see 5.5.6) or as a separate meeting.

5.5.4 Overview of the software product

A technically qualified person should present an overview of the software product for the review team when requested by the review leader. This overview may occur either as a part of the review meeting (see 5.5.6) or as a separate meeting.

5.5.5 Preparation

Each review team member shall examine the software product and other review inputs prior to the review meeting. Anomalies detected during this examination should be documented and sent to the review leader. The review leader should classify anomalies to ensure that review meeting time is used most effectively. The review leader should forward the anomalies to the author of the software product for disposition.

The review leader shall verify that team members are prepared for the technical review. The review leader should gather individual preparation times and record the total. The review leader shall reschedule the meeting if the team members are not adequately prepared.

5.5.6 Examination

During the technical review the review team shall hold one or more meetings. The meetings shall accomplish the following goals:

a) Decide on the agenda for evaluating the software product and anomalies
b) Evaluate the software product
c) Determine if
   1) The software product is complete;
   2) The software product conforms to the regulations, standards, guidelines, plans, and procedures applicable to the project;
   3) Changes to the software product are properly implemented and affect only the specified areas;
   4) The software product is suitable for its intended use;
   5) The software product is ready for the next activity;
   6) Hardware anomalies or specification discrepancies exist
d) Identify anomalies
e) Generate a list of action items, emphasizing risks
f) Document the meeting

After the software product has been reviewed, documentation shall be generated to document the meeting, list anomalies found in the software product, and describe any recommendations to management.


When anomalies are sufficiently critical or numerous, the review leader should recommend that an additional review be applied to the modified software product. This, at a minimum, should cover product areas changed to resolve anomalies as well as side effects of those changes.

5.5.7 Rework/follow-up

The review leader shall verify that the action items assigned in the meeting are closed.

5.6 Exit criteria

A technical review shall be considered complete when the activities listed in 5.5.6 have been accomplished, and the output described in 5.7 exists.

5.7 Output

The output from the technical review shall consist of documented evidence that identifies

a) The project being reviewed
b) The review team members
c) The software product reviewed
d) Specific inputs to the review
e) Review objectives and whether they were met
f) A list of resolved and unresolved software product anomalies
g) A list of unresolved system or hardware anomalies or specification action items
h) A list of management issues
i) Action item status (open, closed), ownership, and target date (if open) or completion date (if closed)
j) Any recommendations made by the review team on how to dispose of unresolved issues and anomalies
k) Whether the software product meets the applicable regulations, standards, guidelines, plans, and procedures without deviations

Although this standard sets minimum requirements for the content of the documented evidence, it is left to local procedures to prescribe additional content, format requirements, and media.

6. Inspections

6.1 Introduction

The purpose of an inspection is to detect and identify software product anomalies. This is a systematic peer examination that

a) Verifies that the software product satisfies its specifications
b) Verifies that the software product satisfies specified quality attributes
c) Verifies that the software product conforms to applicable regulations, standards, guidelines, plans, and procedures
d) Identifies deviations from standards and specifications
e) Collects software engineering data (for example, anomaly and effort data) (optional)
f) Uses the collected software engineering data to improve the inspection process itself and its supporting documentation (for example, checklists) (optional)


Inspections consist of three to six participants. An inspection is led by an impartial facilitator who is trained in inspection techniques. Determination of remedial or investigative action for an anomaly is a mandatory element of a software inspection, although the resolution should not occur in the inspection meeting. Collection of data for the purpose of analysis and improvement of software engineering procedures (including all review procedures) is strongly recommended but is not a mandatory element of software inspections.

Examples of software products subject to inspections include, but are not limited to

a) Software requirements specification
b) Software design description
c) Source code
d) Software test documentation
e) Software user documentation
f) Maintenance manual
g) System build procedures
h) Installation procedures
i) Release notes

6.2 Responsibilities

The following roles shall be established for the inspection:

a) Inspection leader
b) Recorder
c) Reader
d) Author
e) Inspector

All participants in the review are inspectors. The author shall not act as inspection leader and should not act as reader or recorder. Other roles may be shared among the team members. Individual participants may act in more than one role.

Individuals holding management positions over any member of the inspection team shall not participate in the inspection.

6.2.1 Inspection leader

The inspection leader shall be responsible for administrative tasks pertaining to the inspection, shall be responsible for planning and preparation as described in 6.5.2 and 6.5.4, shall ensure that the inspection is conducted in an orderly manner and meets its objectives, should be responsible for collecting inspection data (if appropriate), and shall issue the inspection output as described in 6.7.

6.2.2 Recorder

The recorder shall document anomalies, action items, decisions, and recommendations made by the inspection team. The recorder should record inspection data required for process analysis. The inspection leader may be the recorder.

6.2.3 Reader

The reader shall lead the inspection team through the software product in a comprehensive and logical fashion, interpreting sections of the work (for example, generally paraphrasing groups of 1–3 lines), and highlighting important aspects.


6.2.4 Author

The author shall be responsible for the software product meeting its inspection entry criteria, for contributing to the inspection based on special understanding of the software product, and for performing any rework required to make the software product meet its inspection exit criteria.

6.2.5 Inspector

Inspectors shall identify and describe anomalies in the software product. Inspectors shall be chosen to represent different viewpoints at the meeting (for example, sponsor, requirements, design, code, safety, test, independent test, project management, quality management, and hardware engineering). Only those viewpoints pertinent to the inspection of the product should be present.

Some inspectors should be assigned specific review topics to ensure effective coverage. For example, one inspector may focus on conformance with a specific standard or standards, another on syntax, another for overall coherence. These roles should be assigned by the inspection leader when planning the inspection, as provided in 6.5.2 (b).

6.3 Input

Input to the inspection shall include the following:

a) A statement of objectives for the inspection
b) The software product to be inspected
c) Documented inspection procedure
d) Inspection reporting forms
e) Current anomalies or issues list

Input to the inspection may also include the following:

f) Inspection checklists
g) Any regulations, standards, guidelines, plans, and procedures against which the software product is to be inspected
h) Hardware product specifications
i) Hardware performance data
j) Anomaly categories (see IEEE Std 1044-1993 [B7])

Additional reference material may be made available by the individuals responsible for the software product when requested by the inspection leader.

6.4 Entry criteria

6.4.1 Authorization

Inspections shall be planned and documented in the appropriate project planning documents (for example, the overall project plan, or software verification and validation plan).

Additional inspections may be conducted during acquisition, supply, development, operation, and maintenance of the software product at the request of project management, quality management, or the author, according to local procedures.


6.4.2 Preconditions

An inspection shall be conducted only when both of the following conditions have been met:

a) A statement of objectives for the inspection is established.
b) The required inspection inputs are available.

6.4.3 Minimum entry criteria

An inspection shall not be conducted until all of the following events have occurred, unless there is a documented rationale, accepted by management, for exception from these provisions:

a) The software product that is to be inspected is complete and conforms to project standards for content and format.
b) Any automated error-detecting tools (such as spell-checkers and compilers) required for the inspection are available.
c) Prior milestones are satisfied as identified in the appropriate planning documents.
d) Required supporting documentation is available.
e) For a re-inspection, all items noted on the anomaly list that affect the software product under inspection are resolved.

6.5 Procedures

6.5.1 Management preparation

Managers shall ensure that the inspection is performed as required by applicable standards and procedures and by requirements mandated by law, contract, or other policy. To this end, managers shall

a) Plan time and resources required for inspection, including support functions, as required in IEEE Std 1058.1-1987 [B8] or other appropriate standards
b) Provide funding and facilities required to plan, define, execute, and manage the inspection
c) Provide training and orientation on inspection procedures applicable to a given project
d) Ensure that review team members possess appropriate levels of expertise and knowledge sufficient to comprehend the software product under inspection
e) Ensure that planned inspections are conducted
f) Act on inspection team recommendations in a timely manner

6.5.2 Planning the inspection

The author shall assemble the inspection materials for the inspection leader.

The inspection leader shall be responsible for the following activities:

a) Identifying, with appropriate management support, the inspection team
b) Assigning specific responsibilities to the inspection team members
c) Scheduling the meeting and selecting the meeting place
d) Distributing inspection materials to participants, and allowing adequate time for their preparation
e) Setting a timetable for distribution of inspection material and for the return of comments and forwarding of comments to the author for disposition

As a part of the planning procedure, the inspection team shall determine if alternatives are to be discussed at the inspection meeting. Alternatives may be discussed at the inspection meeting, afterwards in a separate meeting, or left to the authors of the software product to resolve.


6.5.3 Overview of inspection procedures

The author should present an overview of the software product to be inspected. This overview should be used to introduce the inspectors to the software product. The overview may be attended by other project personnel who could profit from the presentation.

Roles shall be assigned by the inspection leader. The inspection leader shall answer questions about any checklists and the role assignments and should present inspection data such as minimal preparation times and the typical number of anomalies found in past similar products.

6.5.4 Preparation

Each inspection team member shall examine the software product and other review inputs prior to the review meeting. Anomalies detected during this examination shall be documented and sent to the inspection leader. The inspection leader should classify anomalies to ensure that inspection meeting time is used effectively. The inspection leader should forward the anomalies to the author of the software product for disposition.

The inspection leader or reader shall specify a suitable order in which the software product will be inspected (such as sequential, hierarchical, data flow, control flow, bottom up, or top down). The reader shall ensure that he or she is able to present the software product at the inspection meeting.

6.5.5 Examination

The inspection meeting shall follow this agenda:

6.5.5.1 Introduce meeting

The inspection leader shall introduce the participants and describe their roles. The inspection leader shall state the purpose of the inspection and should remind the inspectors to focus their efforts toward anomaly detection, not resolution. The inspection leader should remind the inspectors to direct their remarks to the reader and to comment only on the software product, not its author. Inspectors may pose questions to the author regarding the software product. The inspection leader shall resolve any special procedural questions raised by the inspectors.

6.5.5.2 Establish preparedness

The inspection leader shall verify that inspectors are prepared for the inspection. The inspection leader shall reschedule the meeting if the inspectors are not adequately prepared. The inspection leader should gather individual preparation times and record the total in the inspection documentation.

6.5.5.3 Review general items

Anomalies referring to the software product in general (and thus not attributable to a specific instance or location) shall be presented to the inspectors and recorded.

6.5.5.4 Review software product and record anomalies

The reader shall present the software product to the inspection team. The inspection team shall examine the software product objectively and thoroughly, and the inspection leader shall focus this part of the meeting on creating the anomaly list. The recorder shall enter each anomaly, location, description, and classification on the anomaly list. IEEE Std 1044-1993 [B7] may be used to classify anomalies. During this time, the author shall answer specific questions and contribute to anomaly detection based on the author’s special understanding of the software product. If there is disagreement about an anomaly, the potential anomaly shall be logged and marked for resolution at the end of the meeting.
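As a minimal sketch of the recorder's task (the class and method names here are illustrative assumptions, not part of the standard, and the classification strings are placeholders rather than IEEE Std 1044-1993 categories):

```python
from dataclasses import dataclass, field

@dataclass
class Anomaly:
    location: str        # where in the software product, e.g., "design description 3.2"
    description: str     # what the inspector observed
    classification: str  # category per the project's anomaly classification scheme
    disputed: bool = False  # logged and marked for resolution at the end of the meeting

@dataclass
class AnomalyList:
    entries: list = field(default_factory=list)

    def record(self, location, description, classification, disputed=False):
        # Each anomaly is entered with its location, description, and classification.
        self.entries.append(Anomaly(location, description, classification, disputed))
```

Marking disagreements rather than resolving them keeps the meeting focused on detection, as 6.5.5.1 directs.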


6.5.5.5 Review the anomaly list

At the end of the inspection meeting, the inspection leader should have the anomaly list reviewed with the team to ensure its completeness and accuracy. The inspection leader should allow time to discuss every anomaly where disagreement occurred. The inspection leader should not allow the discussion to focus on resolving the anomaly but on clarifying what constitutes the anomaly.

6.5.5.6 Make exit decision

The purpose of the exit decision is to bring an unambiguous closure to the inspection meeting. The exit decision shall determine if the software product meets the inspection exit criteria and shall prescribe any appropriate rework and verification. Specifically, the inspection team shall identify the software product disposition as one of the following:

a) Accept with no or minor rework. The software product is accepted as is or with only minor rework (for example, that would require no further verification).
b) Accept with rework verification. The software product is to be accepted after the inspection leader or a designated member of the inspection team (other than the author) verifies rework.
c) Re-inspect. Schedule a re-inspection to verify rework. At a minimum, a re-inspection shall examine the software product areas changed to resolve anomalies identified in the last inspection, as well as side effects of those changes.
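The three dispositions can be captured as an enumeration. The decision rule sketched below is illustrative only; the standard leaves the actual judgment to the inspection team:

```python
from enum import Enum

class Disposition(Enum):
    ACCEPT = "accept with no or minor rework"
    ACCEPT_WITH_REWORK_VERIFICATION = "accept with rework verification"
    REINSPECT = "re-inspect"

def exit_decision(meets_exit_criteria: bool, rework_needs_verification: bool) -> Disposition:
    # Hypothetical policy: anything failing the exit criteria is re-inspected;
    # otherwise rework is either verified by a designee or accepted outright.
    if not meets_exit_criteria:
        return Disposition.REINSPECT
    if rework_needs_verification:
        return Disposition.ACCEPT_WITH_REWORK_VERIFICATION
    return Disposition.ACCEPT
```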

6.5.6 Rework/follow-up

The inspection leader shall verify that the action items assigned in the meeting are closed.

6.6 Exit criteria

An inspection shall be considered complete when the activities listed in 6.5.5 have been accomplished, and the output described in 6.7 exists.

6.7 Output

The output of the inspection shall be documented evidence that identifies

a) The project being inspected
b) The inspection team members
c) The inspection meeting duration
d) The software product inspected
e) The size of the materials inspected (for example, the number of text pages)
f) Specific inputs to the inspection
g) Inspection objectives and whether they were met
h) The anomaly list, containing each anomaly location, description, and classification
i) The inspection anomaly summary listing the number of anomalies identified by each anomaly category
j) The disposition of the software product
k) An estimate of the rework effort and rework completion date

The output of the inspection should include the following documentation:

l) The total preparation time of the inspection team


Although this standard sets minimum requirements for the content of the documented evidence, it is left to local procedures to prescribe additional content, format requirements, and media.

6.8 Data collection recommendations

Inspections should provide data for the analysis of the quality of the software product, the effectiveness of the acquisition, supply, development, operation, and maintenance processes, and the efficiency of the inspection itself. In order to maintain the effectiveness of inspections, data should not be used to evaluate the performance of individuals. To enable these analyses, anomalies that are identified at an inspection meeting should be classified in accordance with 6.8.1 through 6.8.3.

Inspection data should contain the identification of the software product, the date and time of the inspection, the inspection leader, the preparation and inspection times, the volume of the materials inspected, and the disposition of the inspected software product. The capture of this information can be used to optimize local guidance for inspections.

The management of inspection data requires a capability to store, enter, access, update, summarize, and report categorized anomalies. The frequency and types of the inspection analysis reports, and their distribution, are left to local standards and procedures.

6.8.1 Anomaly classification

Anomalies may be classified by technical type according to, for example, IEEE Std 1044-1993 [B7].

6.8.2 Anomaly classes

Anomaly classes provide evidence of nonconformance and may be categorized, for example, as

a) Missing
b) Extra (superfluous)
c) Ambiguous
d) Inconsistent
e) Improvement desirable
f) Not conforming to standards
g) Risk-prone, i.e., the review finds that, although an item was not shown to be “wrong,” the approach taken involves risks (and there are known safer alternative methods)
h) Factually incorrect
i) Not implementable (e.g., because of system constraints or time constraints)
j) Editorial

6.8.3 Anomaly ranking

Anomalies may be ranked by potential impact on the software product, for example, as

a) Major. Anomalies that would result in failure of the software product or an observable departure from specification.
b) Minor. Anomalies that deviate from relevant specifications but will not cause failure of the software product or an observable departure in performance.
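The anomaly summary required by 6.7 (the number of anomalies per category) follows directly from classifying each recorded anomaly. A sketch using invented data:

```python
from collections import Counter

# Each anomaly carries a class (6.8.2) and a rank (6.8.3); the data is invented.
anomalies = [
    ("Missing", "major"),
    ("Ambiguous", "minor"),
    ("Missing", "minor"),
    ("Editorial", "minor"),
]

by_class = Counter(cls for cls, _ in anomalies)   # counts per anomaly class
by_rank = Counter(rank for _, rank in anomalies)  # counts per ranking
```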

6.9 Improvement

Inspection data should be analyzed regularly in order to improve the inspection itself, and the software activities used to produce software products. Frequently occurring anomalies may be included in the inspection checklists or role assignments. The checklists themselves should also be inspected regularly for superfluous or misleading questions. The preparation times, meeting times, and number of participants should be analyzed to determine connections between preparation rate, meeting rate, and number and severity of anomalies found.
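The preparation-rate and meeting-rate analysis suggested here is commonly computed as material volume examined per hour. A sketch; the units are illustrative:

```python
def review_rates(pages: float, prep_hours: float, meeting_hours: float) -> dict:
    # Rate = volume examined per hour; comparing these rates against the
    # number and severity of anomalies found is the analysis suggested in 6.9.
    return {
        "preparation_rate": pages / prep_hours,
        "meeting_rate": pages / meeting_hours,
    }

rates = review_rates(pages=40, prep_hours=8, meeting_hours=2)
# preparation_rate == 5.0 pages/hour, meeting_rate == 20.0 pages/hour
```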

A “chief inspector” role should exist. The chief inspector acts as the inspection owner, and collects and feeds back data about the inspection. The chief inspector should be responsible for the proposed follow-up on the inspection itself.

7. Walk-throughs

7.1 Introduction

The purpose of a systematic walk-through is to evaluate a software product. A walk-through may be held for the purpose of educating an audience regarding a software product. The major objectives are to

a) Find anomalies
b) Improve the software product
c) Consider alternative implementations
d) Evaluate conformance to standards and specifications

Other important objectives of the walk-through include exchange of techniques and style variations and training of the participants. A walk-through may point out several deficiencies (for example, efficiency and readability problems in the software product, modularity problems in design or code, or untestable specifications).

Examples of software products subject to walk-throughs include, but are not limited to,

a) Software requirements specification
b) Software design description
c) Source code
d) Software test documentation
e) Software user documentation
f) Maintenance manual
g) System build procedures
h) Installation procedures
i) Release notes

7.2 Responsibilities

The following roles shall be established for the walk-through:

a) Walk-through leader
b) Recorder
c) Author
d) Team member

For a review to be considered a systematic walk-through, a team of at least two members shall be assembled. Roles may be shared among the team members. The walk-through leader or the author may serve as the recorder. The walk-through leader may be the author.


Individuals holding management positions over any member of the walk-through team shall not participate in the walk-through.

7.2.1 Walk-through leader

The walk-through leader shall conduct the walk-through, shall handle the administrative tasks pertaining to the walk-through (such as distributing documents and arranging the meeting), and shall ensure that the walk-through is conducted in an orderly manner. The walk-through leader shall prepare the statement of objectives to guide the team through the walk-through. The walk-through leader shall ensure that the team arrives at a decision or identified action for each discussion item, and shall issue the walk-through output as described in 7.7.

7.2.2 Recorder

The recorder shall note all decisions and identified actions arising during the walk-through meeting. In addition, the recorder should note all comments made during the walk-through that pertain to anomalies found, questions of style, omissions, contradictions, suggestions for improvement, or alternative approaches.

7.2.3 Author

The author should present the software product in the walk-through.

7.3 Input

Input to the walk-through shall include the following:

a) A statement of objectives for the walk-through
b) The software product being examined
c) Standards that are in effect for the acquisition, supply, development, operation, and/or maintenance of the software product

Input to the walk-through may also include the following:

d) Any regulations, standards, guidelines, plans, and procedures against which the software product is to be inspected
e) Anomaly categories (see IEEE Std 1044-1993 [B7])

7.4 Entry criteria

7.4.1 Authorization

The need for conducting walk-throughs shall be established in the appropriate project planning documents. Additional walk-throughs may be conducted during acquisition, supply, development, operation, and maintenance of the software product at the request of project management, quality management, or the author, according to local procedures.

7.4.2 Preconditions

A walk-through shall be conducted only when both of the following conditions have been met:

a) A statement of objectives for the review is established by the management personnel for whom the review is being carried out.
b) The required review inputs are available.


7.5 Procedures

7.5.1 Management preparation

Managers shall ensure that the walk-through is performed as required by applicable standards and procedures and by requirements mandated by law, contract, or other policy. To this end, managers shall

a) Plan time and resources required for walk-throughs, including support functions, as required in IEEE Std 1058.1-1987 [B8] or other appropriate standards
b) Provide funding and facilities required to plan, define, execute, and manage the walk-through
c) Provide training and orientation on walk-through procedures applicable to a given project
d) Ensure that walk-through team members possess appropriate levels of expertise and knowledge sufficient to comprehend the software product
e) Ensure that planned walk-throughs are conducted
f) Act on walk-through team recommendations in a timely manner

7.5.2 Planning the walk-through

The walk-through leader shall be responsible for the following activities:

a) Identifying the walk-through team
b) Scheduling the meeting and selecting the meeting place
c) Distributing necessary input materials to participants, and allowing adequate time for their preparation

7.5.3 Overview

An overview presentation should be made by the author as part of the walk-through meeting.

7.5.4 Preparation

The walk-through leader shall distribute the software product and convene a walk-through meeting. Team members shall prepare for the meeting by examining the software product and preparing a list of items for discussion in the meeting. These items should be divided into two categories: general and specific. General items apply to the whole product; specific items apply to a part of it.

Each walk-through team member shall examine the software product and other review inputs prior to the review meeting. Anomalies detected during this examination shall be documented and sent to the walk-through leader. The walk-through leader should classify anomalies to ensure that walk-through meeting time is used effectively. The walk-through leader should forward the anomalies to the author of the software product for disposition.

The author or walk-through leader shall specify a suitable order in which the software product will be inspected (such as sequential, hierarchical, data flow, control flow, bottom up, or top down).

7.5.5 Examination

The walk-through leader shall introduce the participants and describe their roles. The walk-through leader shall state the purpose of the walk-through and should remind the team members to focus their efforts toward anomaly detection, not resolution. The walk-through leader should remind the team members to comment only on the software product, not its author. Team members may pose questions to the author regarding the software product. The walk-through leader shall resolve any special procedural questions raised by the team members.


The author shall present an overview of the software product under review. This is followed by a general discussion during which team members raise their general items. After the general discussion, the author serially presents the software product in detail (hence the name “walk-through”). Team members raise their specific items when the author reaches them in the presentation. New items may be raised during the meeting. The walk-through leader coordinates discussion and guides the meeting to a decision or identified action on each item. The recorder notes all recommendations and required actions.

During the walk-through meeting,

a) The author or walk-through leader should make an overview presentation of the software product under examination
b) The walk-through leader shall coordinate a discussion of the general anomalies of concern
c) The author or walk-through leader shall present the software product, describing every portion of it
d) Team members shall raise specific anomalies as the author reaches the part of the software product to which the anomalies relate
e) The recorder shall note recommendations and actions arising out of the discussion upon each anomaly

After the walk-through meeting, the walk-through leader shall issue the walk-through output detailing anomalies, decisions, actions, and other information of interest. Minimum content requirements for the walk-through output are provided in 7.7.

7.5.6 Rework/follow-up

The walk-through leader shall verify that the action items assigned in the meeting are closed.

7.6 Exit criteria

The walk-through shall be considered complete when

a) The entire software product has been examined
b) Recommendations and required actions have been recorded
c) The walk-through output has been completed
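These exit criteria combine conjunctively: a walk-through is complete only when all three hold. A direct encoding; the parameter names are illustrative:

```python
def walkthrough_complete(product_examined: bool,
                         actions_recorded: bool,
                         output_completed: bool) -> bool:
    # All three conditions of 7.6 must hold for the walk-through to be complete.
    return product_examined and actions_recorded and output_completed
```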

7.7 Output

The output of the walk-through shall be documented evidence that identifies

a) The walk-through team members
b) The software product being examined
c) The statement of objectives that were to be accomplished during this walk-through meeting and whether they were met
d) A list of the recommendations made regarding each anomaly
e) A list of actions, due dates, and responsible people
f) Any recommendations made by the walk-through team on how to dispose of deficiencies and unresolved anomalies
g) Any proposals made by the walk-through team for follow-up walk-throughs

Although this standard sets minimum requirements for the content of the documented evidence, it is left to local procedures to prescribe additional content, format requirements, and media.


7.8 Data collection recommendations

Walk-throughs should provide data for the analysis of the quality of the software product, the effectiveness of the acquisition, supply, development, operation, and maintenance processes, and the efficiency of the walk-through itself. In order to maintain the effectiveness of walk-throughs, data should not be used to evaluate the performance of individuals. To enable these analyses, anomalies that are identified at a walk-through meeting should be classified in accordance with 7.8.1 through 7.8.3.

Walk-through data should contain the identification of the software product, the date and time of the walk-through, the walk-through leader, the preparation and walk-through times, the volume of the materials walked through, and the disposition of the software product. The capture of this information can be used to optimize local guidance for walk-throughs.

The management of walk-through data requires a capability to store, enter, access, update, summarize, and report categorized anomalies. The frequency and types of the walk-through analysis reports, and their distribution, are left to local standards and procedures.

7.8.1 Anomaly classification

Anomalies may be classified by technical type according to, for example, IEEE Std 1044-1993 [B7].

7.8.2 Anomaly classes

Anomaly classes provide evidence of nonconformance, and may be categorized, for example, as

a) Missing
b) Extra (superfluous)
c) Ambiguous
d) Inconsistent
e) Improvement desirable
f) Not conforming to standards
g) Risk-prone, i.e., the review finds that although an item was not shown to be “wrong,” the approach taken involves risks (and there are known safer alternative methods)
h) Factually incorrect
i) Not implementable (e.g., because of system constraints or time constraints)
j) Editorial

7.8.3 Anomaly ranking

Anomalies may be ranked by potential impact on the software product, for example, as

a) Major. Anomalies that would result in failure of the software product or an observable departure from specification
b) Minor. Anomalies that deviate from relevant specifications but will not cause failure of the software product or an observable departure in performance

7.9 Improvement

Walk-through data should be analyzed regularly in order to improve the walk-through itself and to improve the software activities used to produce the software product. Frequently occurring anomalies may be included in the walk-through checklists or role assignments. The checklists themselves should also be inspected regularly for superfluous or misleading questions. The preparation times, meeting times, and number of participants should be analyzed to determine connections between preparation rate, meeting rate, and number and severity of anomalies found.

8. Audits

8.1 Introduction

The purpose of a software audit is to provide an independent evaluation of conformance of software products and processes to applicable regulations, standards, guidelines, plans, and procedures.

Examples of software products subject to audit include, but are not limited to, the following:

a) Back-up and recovery plans
b) Contingency plans
c) Contracts
d) Customer or user representative complaints
e) Disaster plans
f) Hardware performance plans
g) Installation plans
h) Installation procedures
i) Maintenance plans
j) Management review reports
k) Operations and user manuals
l) Procurement and contracting methods
m) Reports and data (for example, review, audit, project status, anomaly reports, test data)
n) Request for proposal
o) Risk management plans
p) Software configuration management plans (see IEEE Std 828-1990 [B2])
q) Software design descriptions (see IEEE Std 1016-1987 [B6])
r) Source code
s) Unit development folders
t) Software project management plans (see IEEE Std 1058.1-1987 [B8])
u) Software quality assurance plans (see IEEE Std 730-19 [B1])
v) Software requirements specifications (see IEEE Std 830-1993 [B4])
w) Software safety plans (see IEEE Std 1228-1994 [B13])
x) Software test documentation (see IEEE Std 829-1983 [B3])
y) Software user documentation (see IEEE Std 1063-1987 [B9])
z) Software verification and validation plans (see IEEE Std 1012-1986 [B5])
aa) Standards, regulations, guidelines, and procedures
ab) System build procedures
ac) Technical review reports
ad) Vendor documents
ae) Walk-through reports
af) Deliverable media (such as tapes and diskettes)

The examination should begin with an overview meeting during which the auditors and audited organization examine and agree upon the arrangements for the audit.

When stipulated in the audit plan, the auditors may make recommendations. These should be reported separately.


8.2 Responsibilities

The following roles shall be established for an audit:

a) Lead auditor
b) Recorder
c) Auditor(s)
d) Initiator
e) Audited organization

The lead auditor may act as recorder. The initiator may act as lead auditor. Additional auditors should be included in the audit team; however, audits by a single person are permitted.

8.2.1 Lead auditor

The lead auditor shall be responsible for the audit. This responsibility includes administrative tasks pertaining to the audit, ensuring that the audit is conducted in an orderly manner, and ensuring that the audit meets its objectives. The lead auditor’s responsibilities include

a) Preparing the audit plan (see 8.5.2)
b) Assembling the audit team
c) Managing the audit team
d) Making decisions regarding the conduct of the audit
e) Making decisions regarding any audit observations
f) Preparing the audit report (see 8.7)
g) Reporting on the inability or apparent inability of any of the individuals involved in the audit to fulfill their responsibilities
h) Negotiating any discrepancies or inconsistencies with the initiator which could impair the ability to satisfy the exit criteria (8.6)
i) Recommending corrective actions

The lead auditor shall be free from bias and influence that could reduce their ability to make independent, objective evaluations.

8.2.2 Recorder

The recorder shall document anomalies, action items, decisions, and recommendations made by the audit team.

8.2.3 Auditor

The auditors shall examine products, as defined in the audit plan. They shall document their observations and recommend corrective actions. All auditors shall be free from bias and influences that could reduce their ability to make independent, objective evaluations, or shall identify their bias and proceed with acceptance from the initiator.

8.2.4 Initiator

The initiator shall be responsible for the following activities:

a) Decide upon the need for an audit
b) Decide upon the purpose and scope of the audit
c) Decide the software products to be audited


d) Decide the evaluation criteria, including the regulations, standards, guidelines, plans, and procedures to be used for evaluation
e) Decide upon who will carry out the audit
f) Review the audit report
g) Decide what follow-up action will be required
h) Distribute the audit report

The initiator may be a manager in the audited organization, a customer or user representative of the audited organization, or a third party.

8.2.5 Audited organization

The audited organization shall provide a liaison to the auditors and shall provide all information requested by the auditors. When the audit is completed, the audited organization should implement corrective actions and recommendations.

8.3 Input

Inputs to the audit shall be listed in the audit plan and shall include the following:

a) Purpose and scope of the audit
b) Background information about the audited organization
c) Software products to be audited
d) Evaluation criteria, including applicable regulations, standards, guidelines, plans, and procedures to be used for evaluation
e) Evaluation criteria: for example, “acceptable,” “needs improvement,” “unacceptable,” “not rated”

Inputs to the audit should also include the following:

f) Records of previous similar audits

8.4 Entry criteria

8.4.1 Authorization

An initiator decides upon the need for an audit. This decision may be prompted by a routine event, such as the arrival at a project milestone, or a non-routine event, such as the suspicion or discovery of a major non-conformance.

The initiator selects an auditing organization that can perform an independent evaluation. The initiator provides the auditors with information that defines the purpose of the audit, the software products to be audited, and the evaluation criteria. The initiator should request the auditors to make recommendations. The lead auditor produces an audit plan and the auditors prepare for the audit.

The need for an audit may be established by one or more of the following events:

a) The supplier organization decides to verify compliance with the applicable regulations, standards, guidelines, plans, and procedures (this decision may have been made when planning the project).
b) The customer organization decides to verify compliance with applicable regulations, standards, guidelines, plans, and procedures.
c) A third party, such as a regulatory agency or assessment body, decides upon the need to audit the supplier organization to verify compliance with applicable regulations, standards, guidelines, plans, and procedures.


In every case, the initiator shall authorize the audit.

8.4.2 Preconditions

An audit shall be conducted only when all of the following conditions have been met:

a) The audit has been authorized by an appropriate authority
b) A statement of objectives of the audit is established
c) The required audit inputs are available

8.5 Procedures

8.5.1 Management preparation

Managers shall ensure that the audit is performed as required by applicable standards and procedures and by requirements mandated by law, contract, or other policy. To this end, managers shall

a) Plan time and resources required for audits, including support functions, as required in IEEE Std 1058.1-1987 [B8], legal or regulatory documents, or other appropriate standards
b) Provide funding and facilities required to plan, define, execute, and manage the audits
c) Provide training and orientation on the audit procedures applicable to a given project
d) Ensure appropriate levels of expertise and knowledge sufficient to comprehend the software product being audited
e) Ensure that planned audits are conducted
f) Act on audit team recommendations in a timely manner

8.5.2 Planning the audit

The audit plan shall describe the

a) Purpose and scope of the audit
b) Audited organization, including location and management
c) Software products to be audited
d) Evaluation criteria, including applicable regulations, standards, guidelines, plans, and procedures to be used for evaluation
e) Auditor’s responsibilities
f) Examination activities (for example, interview staff, read and evaluate documents, observe tests)
g) Audit activity resource requirements
h) Audit activity schedule
i) Requirements for confidentiality (for example, company confidential, restricted information, classified information)
j) Checklists
k) Report formats
l) Report distribution
m) Required follow-up activities

Where sampling is used, a statistically valid sampling method shall be used to establish selection criteria and sample size.
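Simple random sampling without replacement is one statistically valid way to select which items to examine once a sample size is fixed. A sketch; the standard does not mandate any particular scheme, and the population data here is invented:

```python
import random

def select_sample(items, sample_size, seed=None):
    # Draw a simple random sample without replacement from the audit population.
    # A fixed seed makes the selection reproducible for the audit record.
    rng = random.Random(seed)
    return rng.sample(list(items), min(sample_size, len(items)))

population = [f"unit-{i}" for i in range(100)]  # hypothetical audit population
sample = select_sample(population, 10, seed=1)
```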

The audit plan shall be approved by the initiator. The audit plan should allow for changes based on information gathered during the audit, subject to approval by the initiator.


8.5.3 Opening meeting

An opening meeting between the audit team and audited organization shall occur at the beginning of the examination phase of the audit. The overview meeting agenda shall include

a) Purpose and scope of the audit
b) Software products being audited
c) Audit procedures and outputs
d) Expected contributions of the audited organization to the audit (for example, the number of people to be interviewed, meeting facilities)
e) Audit schedule
f) Access to facilities, information, and documents required

8.5.4 Preparation

The initiator shall notify the audited organization’s management in writing before the audit is performed, except for unannounced audits. The notification shall define the purpose and scope of the audit, identify what will be audited, identify the auditors, and identify the audit schedule. The purpose of notification is to enable the audited organization to ensure that the people and material to be examined in the audit are available.

Auditors shall prepare for the audit by studying the

a) Audit plan
b) Audited organization
c) Products to be audited
d) Applicable regulations, standards, guidelines, plans, and procedures to be used for evaluation
e) Evaluation criteria

In addition, the lead auditor shall make the necessary arrangements for

f) Team orientation and training
g) Facilities for audit interviews
h) Materials, documents, and tools required by the audit procedures
i) Examination activities

8.5.5 Examination

Examination shall consist of evidence collection and analysis with respect to the audit criteria, a closing meeting between the auditors and audited organization, and preparing an audit report.

8.5.5.1 Evidence collection

The auditors shall collect evidence of conformance and non-conformance by interviewing audited organization staff, examining documents, and witnessing processes. The auditors should attempt all the examination activities defined in the audit plan. They shall undertake additional investigative activities if they consider such activities required to define the full extent of conformance or non-conformance.

Auditors shall document all observations of non-conformance and exemplary conformance. An observation is a statement of fact made during an audit that is substantiated by objective evidence. Examples of non-conformance are

a) Applicable regulations, standards, guidelines, plans, and procedures not used at all
b) Applicable regulations, standards, guidelines, plans, and procedures not used correctly


Observations should be categorized as major or minor. An observation should be classified as major if the non-conformity will likely have a significant effect on product quality, project cost, or project schedule.

All observations shall be verified by discussing them with the audited organization before the closing audit meeting.
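The major/minor categorization stated here reduces to a simple predicate over the three stated impact areas. A direct encoding; the parameter names are illustrative:

```python
def classify_observation(affects_quality: bool,
                         affects_cost: bool,
                         affects_schedule: bool) -> str:
    # Per 8.5.5.1: major if the non-conformity will likely have a significant
    # effect on product quality, project cost, or project schedule.
    if affects_quality or affects_cost or affects_schedule:
        return "major"
    return "minor"
```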

8.5.5.2 Closing meeting

The lead auditor shall convene a closing meeting with the audited organization’s management. The closing meeting should review

a) Actual extent of implementation of the audit plan
b) Problems experienced in implementing the audit plan, if any
c) Observations made by the auditors
d) Preliminary conclusions of the auditors
e) Preliminary recommendations of the auditors
f) Overall audit assessment (for example, whether the audited organization successfully passed the audit criteria)

Comments and issues raised by the audited organization should be resolved. Agreements should be reached during the closing audit meeting and must be completed before the audit report is finalized.

8.5.5.3 Reporting

The lead auditor shall prepare the audit report, as described in 8.7. The audit report should be prepared as soon as possible after the audit. Any communication between auditors and the audited organization made between the closing meeting and the issue of the report should pass through the lead auditor.

The lead auditor shall send the audit report to the initiator. The initiator should distribute the audit report within the audited organization.

8.5.6 Follow-up

Rework, if any, shall be the responsibility of the initiator and audited organization and shall include

a) Determining what corrective action is required to remove or prevent a non-conformity
b) Initiating the corrective action

8.6 Exit criteria

An audit shall be considered complete when

a) The audit report has been submitted to the initiator
b) All of the auditing organization’s follow-up actions included in the scope of the audit have been performed, reviewed, and approved

8.7 Output

The output of the audit is the audit report. The audit report shall contain the

a) Purpose and scope of the audit
b) Audited organization, including location, liaison staff, and management
c) Identification of the software products audited


d) Applicable regulations, standards, guidelines, plans, and procedures used for evaluation
e) Evaluation criteria
f) Summary of auditor’s organization
g) Summary of examination activities
h) Summary of the planned examination activities not performed
i) Observation list, classified as major or minor
j) A summary and interpretation of the audit findings, including the key items of non-conformance
k) The type and timing of audit follow-up activities

Additionally, when stipulated by the audit plan, recommendations shall be provided to the audited organization or the initiator. Recommendations may be reported separately from results.

Although this standard sets minimum requirements for report content, it is left to local standards to prescribe additional content, report format requirements, and media.


Annex A

(informative)

Relationship of this standard to the life cycle processes of other standards

This standard may be used in conjunction with other IEEE or ISO/IEC standards. In particular, IEEE Std 730-19 [B1], IEEE Std 1012-1986 [B5], IEEE Std 1074-1995 [B10], and ISO/IEC 12207:1995 [B15] all require that software reviews take place during the software life cycle. The following table shows, for each of these standards, a possible mapping to the five review types described in the body of IEEE Std 1028-1997.

Standard | Clause | Review title | Corresponding IEEE Std 1028-1997 review type
IEEE Std 730-1989 [B1] | 3.6.2.1 | Software requirements review | Technical review
 | 3.6.2.2 | Preliminary design review | Technical review
 | 3.6.2.3 | Critical design review | Technical review
 | 3.6.2.4 | Software V&V plan review | Management review
 | 3.6.2.5 | Functional audit | Audit
 | 3.6.2.6 | Physical audit | Audit
 | 3.6.2.7 | In-process audit | Audit
 | 3.6.2.8 | Managerial reviews | Management review
 | 3.6.2.9 | Software configuration management review | Management review
 | 3.6.2.10 | Postmortem review | Management review, technical review
IEEE Std 1012-1986 [B5] | 3.5.2 | Concept documentation evaluation | Technical review
 | 3.5.3 | Software requirements traceability analysis, requirements evaluation, and interface analysis | Technical review, inspection, walk-through
 | 3.5.4 | Design traceability analysis, design evaluation, and design interface analysis | Technical review, inspection, walk-through
 | 3.5.5 | Source code traceability analysis, evaluation, and interface analysis | Technical review, inspection, walk-through
 | 3.5.5 | Source code documentation evaluation | Technical review, inspection, walk-through


Standard | Clause | Review title | Corresponding IEEE Std 1028-1997 review type
IEEE Std 1012-1986 [B5] | Appendix | Algorithm analysis | Technical review, inspection, walk-through
 | | Audit performance | Audit
 | | Configuration control audit | Audit
 | | Control flow analysis | Technical review, walk-through
 | | Database analysis | Technical review, inspection, walk-through
 | | Data flow analysis | Technical review, inspection, walk-through
 | | Design walk-through | Walk-through
 | | Feasibility study evaluation | Management review
 | | Functional audit | Audit
 | | In-process audit | Audit
 | | Operational readiness review | Management review, technical review
 | | Physical audit | Audit
 | | Requirements walk-through | Walk-through
 | | Sizing and timing analysis | Technical review, inspection, walk-through
 | | Source code walk-through | Walk-through
 | | Test evaluation | Technical review, inspection, audit
 | | Test readiness review | Management review, technical review
 | | Test walk-through | Walk-through
 | | User documentation evaluation | Technical review, audit
IEEE Std 1074-1995 [B10] | 7.1.3.2 | Plan verification and validation | All types
ISO/IEC 12207:1995 [B15] | 5.2.4.5 | Project management plan | Management review, technical review, inspection, walk-through, audit
 | 5.2.6.2 | Supplier/acquirer joint reviews | Management review, technical review, inspection, walk-through, audit


Standard | Clause | Review title | Corresponding IEEE Std 1028-1997 review type
ISO/IEC 12207:1995 [B15] | 5.3.1.3 | Development process | Management review, technical review, inspection, walk-through, audit
 | 5.3.2.2 | System requirements analysis evaluation | Technical review, inspection, walk-through
 | 5.3.3.2 | System architectural design evaluation | Technical review, inspection, walk-through
 | 5.3.4.2 | Software requirements analysis evaluation | Technical review, inspection, walk-through
 | 5.3.5.6 | Software architectural design evaluation | Technical review, inspection, walk-through
 | 5.3.6.7 | Software detailed design evaluation | Technical review, inspection, walk-through
 | 5.3.7.5 | Software code evaluation | Technical review, inspection, walk-through
 | 5.3.8.5 | Software integration evaluation | Technical review, inspection, walk-through
 | 5.3.9.3 | Software qualification testing evaluation | Technical review, inspection, walk-through
 | 5.3.10.3 | System integration evaluation | Technical review, inspection, walk-through
 | 5.3.11.2 | System qualification test evaluation | Technical review, inspection, walk-through
 | 6.1.2.3 | Document review | Management review, technical review, inspection, walk-through, audit
 | 6.6.2 | Project management reviews | Management review
 | 6.6.3 | Technical reviews | Technical review
 | 6.7 | Audit process | Audit
 | 7.1.4 | Review and evaluation | Management review, technical review
 | B.3.c | Tailoring—reviews and audits | Inspection, walk-through
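For process tooling, a mapping like the one above lends itself to a lookup table. The following is a partial, illustrative encoding of a few ISO/IEC 12207:1995 rows; the dictionary and function names are this example's own, not part of either standard:

```python
# Partial encoding of the Annex A mapping for selected ISO/IEC 12207:1995 clauses.
# Illustrative only; extend with further rows from the table as needed.
REVIEW_TYPE_MAP = {
    "5.3.1.3": {"management review", "technical review", "inspection",
                "walk-through", "audit"},                        # Development process
    "5.3.7.5": {"technical review", "inspection", "walk-through"},  # Software code evaluation
    "6.6.2": {"management review"},                              # Project management reviews
    "6.6.3": {"technical review"},                               # Technical reviews
    "6.7": {"audit"},                                            # Audit process
}

def applicable_reviews(clause):
    """IEEE Std 1028-1997 review types that can satisfy a given 12207 clause."""
    return REVIEW_TYPE_MAP.get(clause, set())
```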


Annex B

(informative)

Comparison of review types

The following comparison of the five types of reviews covers a number of salient characteristics. It is meant to be indicative of the ways in which the review types match with or differ from one another.

Objective
- Management review: Ensure progress; recommend corrective action; ensure proper allocation of resources.
- Technical review: Evaluate conformance to specifications and plans; ensure change integrity.
- Inspection: Find anomalies; verify resolution; verify product quality.
- Walk-through: Find anomalies; examine alternatives; improve product; forum for learning.
- Audit: Independently evaluate compliance with objective standards and regulations.

Decision-making
- Management review: Management team charts course of action; decisions made at the meeting or as a result of recommendations.
- Technical review: Review team requests management or technical leadership to act on recommendations.
- Inspection: Review team chooses predefined product dispositions; defects must be removed.
- Walk-through: The team agrees on changes to be made by the author.
- Audit: Audited organization, initiator, acquirer, customer, or user.

Change verification
- Management review: Leader verifies that action items are closed; change verification left to other project controls.
- Technical review: Leader verifies that action items are closed; change verification left to other project controls.
- Inspection: Leader verifies that action items are closed; change verification left to other project controls.
- Walk-through: Leader verifies that action items are closed; change verification left to other project controls.
- Audit: Responsibility of the audited organization.

Recommended group size
- Management review: Two or more people.
- Technical review: Three or more people.
- Inspection: Three to six people.
- Walk-through: Two to seven people.
- Audit: One to five people.

Group attendance
- Management review: Management, technical leadership, and peer mix.
- Technical review: Technical leadership and peer mix.
- Inspection: Peers meet with documented attendance.
- Walk-through: Technical leadership and peer mix.
- Audit: Auditors, audited organization, management and technical personnel.

Group leadership
- Management review: Usually the responsible manager.
- Technical review: Usually the lead engineer.
- Inspection: Trained facilitator.
- Walk-through: Facilitator or author.
- Audit: Lead auditor.

Volume of material
- Management review: Moderate to high, depending on the specific meeting objectives.
- Technical review: Moderate to high, depending on the specific meeting objectives.
- Inspection: Relatively low.
- Walk-through: Relatively low.
- Audit: Moderate to high, depending on the specific audit objectives.

Presenter
- Management review: Project representative.
- Technical review: Development team representative.
- Inspection: A reader.
- Walk-through: Author.
- Audit: Auditors collect and examine information provided by audited organization.

Data collection
- Management review: Not a formal project requirement. May be done locally.
- Technical review: Not a formal project requirement. May be done locally.
- Inspection: Strongly recommended.
- Walk-through: Recommended.
- Audit: As required by applicable policies, standards, or plans.

Output
- Management review: Management review documentation.
- Technical review: Technical review documentation.
- Inspection: Anomaly list, anomaly summary, inspection documentation.
- Walk-through: Anomaly list, action items, decisions, follow-up proposals.
- Audit: Formal audit report; observations, findings, deficiencies.

Formal facilitator training
- Management review: No.
- Technical review: No.
- Inspection: Yes.
- Walk-through: No.
- Audit: Yes (formal auditing training).

Defined participant roles
- Management review: Yes.
- Technical review: Yes.
- Inspection: Yes.
- Walk-through: Yes.
- Audit: Yes.

Use of defect checklists
- Management review: No.
- Technical review: No.
- Inspection: Yes.
- Walk-through: No.
- Audit: Yes.

Management participates
- Management review: Yes.
- Technical review: Optional.
- Inspection: No.
- Walk-through: No.
- Audit: Yes.

Customer or user representative participates
- Management review: Optional.
- Technical review: Optional.
- Inspection: Optional.
- Walk-through: Optional.
- Audit: Optional.
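The recommended group sizes above can be encoded for a quick planning check. The following is an illustrative sketch; the ranges are taken from the comparison above, with None standing for "no stated upper bound":

```python
# Recommended group sizes from the Annex B comparison, encoded as
# (minimum, maximum) tuples; None means no upper bound is stated.
GROUP_SIZE = {
    "management review": (2, None),
    "technical review": (3, None),
    "inspection": (3, 6),
    "walk-through": (2, 7),
    "audit": (1, 5),
}

def size_ok(review_type, participants):
    """True if the participant count falls within the recommended range."""
    low, high = GROUP_SIZE[review_type]
    return participants >= low and (high is None or participants <= high)
```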


Annex C

(informative)

Bibliography

The standards listed here may be useful in the preparation of software products that can be reviewed using the procedure documented in this standard:

[B1] IEEE Std 730-1989, IEEE Standard for Software Quality Assurance Plans.3

[B2] IEEE Std 828-1990, IEEE Standard for Software Configuration Management Plans.

[B3] IEEE Std 829-1983 (R1991), IEEE Standard for Software Test Documentation.

[B4] IEEE Std 830-1993, IEEE Recommended Practice for Software Requirements Specifications.

[B5] IEEE Std 1012-1986 (R1992), IEEE Standard for Software Verification and Validation Plans.

[B6] IEEE Std 1016-1987 (R1993), IEEE Recommended Practice for Software Design Descriptions.

[B7] IEEE Std 1044-1993, IEEE Standard Classification for Software Anomalies.

[B8] IEEE Std 1058-1987 (R1993), IEEE Standard for Software Project Management Plans.

[B9] IEEE Std 1063-1987 (R1993), IEEE Standard for Software User Documentation.

[B10] IEEE Std 1074-1995, IEEE Standard for Developing Software Life Cycle Processes.

[B11] IEEE Std 1219-1992, IEEE Standard for Software Maintenance.

[B12] IEEE Std 1220-1994, IEEE Trial-Use Standard for Application and Management of the Systems Engineering Process.

[B13] IEEE Std 1228-1994, IEEE Standard for Software Safety Plans.

[B14] IEEE Std 1298-1992 (AS 3563.1-1991), IEEE Standard for Software Quality Management System, Part 1: Requirements.

[B15] ISO/IEC 12207:1995, Information technology—Software life cycle processes.4

[B16] ISO 9001:1994, Quality systems—Model for quality assurance in design/development, production, installation and servicing.

[B17] ISO 10011-1:1990, Guidelines for auditing quality systems—Part 1: Auditing.

3 IEEE publications are available from the Institute of Electrical and Electronics Engineers, 445 Hoes Lane, P.O. Box 1331, Piscataway, NJ 08855-1331, USA.

4 ISO publications are available from the ISO Central Secretariat, Case Postale 56, 1 rue de Varembé, CH-1211, Genève 20, Switzerland/Suisse. ISO publications are also available in the United States from the Sales Department, American National Standards Institute, 11 West 42nd Street, 13th Floor, New York, NY 10036, USA.
