Automotive SPICE


Quality Management in the Automotive Industry

Automotive SPICE®

Process Reference Model

Process Assessment Model

Version 3.1

Title: Automotive SPICE Process Assessment / Reference Model
Author(s): VDA QMC Working Group 13 / Automotive SIG
Version: 3.1
Date: 2017-11-01
Status: PUBLISHED
Confidentiality: Public
Revision ID: 656

Copyright notice

This document is a revision of the Automotive SPICE process assessment model 2.5 and the process reference model 4.5, which have been developed under the Automotive SPICE initiative by consensus of the car manufacturers within the Automotive Special Interest Group (SIG), a joint special interest group of Automotive OEM, the Procurement Forum and the SPICE User Group.

It has been revised by the Working Group 13 of the Quality Management Center (QMC) in the German Association of the Automotive Industry with the representation of members of the Automotive Special Interest Group, and with the agreement of the SPICE User Group. This agreement is based on a validation of the Automotive SPICE 3.0 version regarding any ISO copyright infringement and the statements given from VDA QMC to the SPICE User Group regarding the current and future development of Automotive SPICE.

This document reproduces relevant material from:

  • ISO/IEC 33020:2015 Information technology – Process assessment – Process measurement framework for assessment of process capability

ISO/IEC 33020:2015 provides the following copyright release statement:

‘Users of this International Standard may reproduce subclauses 5.2, 5.3, 5.4 and 5.6 as part of any process assessment model or maturity model so that it can be used for its intended purpose.’

  • ISO/IEC 15504-5:2006 Information Technology – Process assessment – Part 5: An exemplar Process Assessment Model

ISO/IEC 15504-5:2006 provides the following copyright release statement:

‘Users of this part of ISO/IEC 15504 may freely reproduce the detailed descriptions contained in the exemplar assessment model as part of any tool or other material to support the performance of process assessments, so that it can be used for its intended purpose.’

Relevant material from one of the mentioned standards is incorporated under the copyright release notice.

Acknowledgement

The VDA, the VDA QMC and the Working Group 13 explicitly acknowledge the high-quality work carried out by the members of the Automotive Special Interest Group. We would like to thank all involved people who have contributed to the development and publication of Automotive SPICE®.

Derivative works

You may not alter, transform, or build upon this work without the prior consent of both the SPICE User Group and the VDA Quality Management Center. Such consent may be given provided ISO copyright is not infringed.

The detailed descriptions contained in this document may be incorporated as part of any tool or other material to support the performance of process assessments, so that this process assessment model can be used for its intended purpose, provided that any such material is not offered for sale.

All distribution of derivative works shall be made at no cost to the recipient.

Distribution

The Automotive SPICE® process assessment model may only be obtained by download from the www.automotivespice.com web site.

It is not permitted for the recipient to further distribute the document.

Change requests

Any problems or change requests should be reported through the defined mechanism at the www.automotivespice.com web site.

Trademark

Automotive SPICE® is a registered trademark of the Verband der Automobilindustrie e.V. (VDA). For further information about Automotive SPICE® visit www.automotivespice.com.

Document history

Version | Date | By | Notes
2.0 | 2005-05-04 | AutoSIG / SUG | DRAFT RELEASE, pending final editorial review
2.1 | 2005-06-24 | AutoSIG / SUG | Editorial review comments implemented; updated to reflect changes in FDIS 15504-5
2.2 | 2005-08-21 | AutoSIG / SUG | Final checks implemented: FORMAL RELEASE
2.3 | 2007-05-05 | AutoSIG / SUG | Revision following CCB: FORMAL RELEASE
2.4 | 2008-08-01 | AutoSIG / SUG | Revision following CCB: FORMAL RELEASE
2.5 | 2010-05-10 | AutoSIG / SUG | Revision following CCB: FORMAL RELEASE
3.0 | 2015-07-16 | VDA QMC WG13 | Changes: see release notes
3.1 | 2017-11-01 | VDA QMC WG13 | Changes: see www.automotivespice.com

Release notes

Version 3.0 of the process assessment model incorporates the following major changes:

Chapter 1: Editorial adaptation to the ISO/IEC 330xx series; notes regarding the combined PRM/PAM in this document

Chapter 2: Adaptation to the ISO/IEC 330xx series

Chapter 3: Text optimized for better understanding and adapted to the ISO/IEC 330xx series

Chapter 4: ENG renamed to SYS/SWE; structure of the old ENG processes changed; rework of the AS 4.5 process reference model and the AS 2.5 process performance indicators, focusing on a set of highly significant processes assessed within the automotive industry (VDA scope)

Chapter 5: Adaptation, based on AS 2.5, to the measurement framework of ISO/IEC 33020

Annex A: Conformity statement adapted to ISO/IEC 33004

Annex B: Work product characteristics modified according to the changes in chapter 4

Annex C: Updated to recent standards; introduction of specific terms used in AS 3.0

Annex D: Major concepts used for AS 3.0 added; Annex E of AS 2.5 incorporated

Annex E: Updated references to other standards

Version 3.1 of the process assessment model incorporates minor changes. Please refer to www.automotivespice.com for a detailed change log.

List of Figures

Figure 1 — Process assessment model relationship

Figure 2 — Automotive SPICE process reference model - Overview

Figure 3 — Relationship between assessment indicators and process capability

Figure 4 — Possible levels of abstraction for the term “process”

Figure 5 — Performing a process assessment for determining process capability

Figure D.1 — The “Plug-in” concept

Figure D.2 — The tip of the “V”

Figure D.3 — Element, component, unit, and item

Figure D.4 — Bidirectional traceability and consistency

Figure D.5 — Agree, summarize and communicate

Figure D.6 — Evaluation, verification criteria and compliance

Figure D.7 — Strategy and plan

List of Tables

Table 1 — Abbreviation List

Table 2 — Primary life cycle processes – ACQ process group

Table 3 — Primary life cycle processes – SPL process group

Table 4 — Primary life cycle processes – SYS process group

Table 5 — Primary life cycle processes – SWE process group

Table 6 — Supporting life cycle processes - SUP process group

Table 7 — Organizational life cycle processes - MAN process group

Table 8 — Organizational life cycle processes - PIM process group

Table 9 — Organizational life cycle processes - REU process group

Table 10 — Process capability levels according to ISO/IEC 33020

Table 11 — Process attributes according to ISO/IEC 33020

Table 12 — Rating scale according to ISO/IEC 33020

Table 13 — Rating scale percentage values according to ISO/IEC 33020

Table 14 — Refinement of rating scale according to ISO/IEC 33020

Table 15 — Refined rating scale percentage values according to ISO/IEC 33020

Table 16 — Process capability level model according to ISO/IEC 33020

Table 17 — Template for the process description

Table B.1 — Structure of WPC tables

Table B.2 — Work product characteristics

Table C.1 — Terminology

Table E.1 — Reference standards

1. Introduction

1.1. Scope

Process assessment is a disciplined evaluation of an organizational unit’s processes against a process assessment model.

The Automotive SPICE process assessment model (PAM) is intended for use when performing conformant assessments of the process capability on the development of embedded automotive systems. It was developed in accordance with the requirements of ISO/IEC 33004.

Automotive SPICE has its own process reference model (PRM), which was developed based on the Automotive SPICE process reference model 4.5. It was further developed and tailored considering the specific needs of the automotive industry. If processes beyond the scope of Automotive SPICE are needed, appropriate processes from other process reference models such as ISO/IEC 12207 or ISO/IEC 15288 may be added based on the business needs of the organization.

The PRM is incorporated in this document and is used in conjunction with the Automotive SPICE process assessment model when performing an assessment.

This Automotive SPICE process assessment model contains a set of indicators to be considered when interpreting the intent of the Automotive SPICE process reference model. These indicators may also be used when implementing a process improvement program subsequent to an assessment.

1.2. Terminology

Automotive SPICE follows the following precedence for use of terminology:

  1. ISO/IEC 33001 for assessment related terminology

  2. ISO/IEC/IEEE 24765 and ISO/IEC/IEEE 29119 terminology (as contained in Annex C)

  3. Terms introduced by Automotive SPICE (as contained in Annex C)

1.3. Abbreviations

Table 1 — Abbreviation List

AS: Automotive SPICE
BP: Base Practice
CAN: Controller Area Network
CASE: Computer-Aided Software Engineering
CCB: Change Control Board
CFP: Call For Proposals
CPU: Central Processing Unit
ECU: Electronic Control Unit
EEPROM: Electrically Erasable Programmable Read-Only Memory
GP: Generic Practice
GR: Generic Resource
IEC: International Electrotechnical Commission
IEEE: Institute of Electrical and Electronics Engineers
I/O: Input / Output
ISO: International Organization for Standardization
ITT: Invitation To Tender
LIN: Local Interconnect Network
MISRA: Motor Industry Software Reliability Association
MOST: Media Oriented Systems Transport
PA: Process Attribute
PAM: Process Assessment Model
PRM: Process Reference Model
PWM: Pulse Width Modulation
RAM: Random Access Memory
ROM: Read Only Memory
SPICE: Software Process Improvement and Capability dEtermination
SUG: SPICE User Group
USB: Universal Serial Bus
WP: Work Product
WPC: Work Product Characteristic

2. Statement of compliance

The Automotive SPICE process assessment model and process reference model are conformant with ISO/IEC 33004 and can be used as the basis for conducting an assessment of process capability.

ISO/IEC 33020 is used as an ISO/IEC 33003 compliant Measurement Framework.

A statement of compliance of the process assessment model and process reference model with the requirements of ISO/IEC 33004 is provided in Annex A.

3. Process capability determination

The concept of process capability determination by using a process assessment model is based on a two-dimensional framework. The first dimension is provided by processes defined in a process reference model (process dimension). The second dimension consists of capability levels that are further subdivided into process attributes (capability dimension). The process attributes provide the measurable characteristics of process capability.

The process assessment model selects processes from a process reference model and supplements them with indicators. These indicators support the collection of objective evidence that enables an assessor to assign ratings for processes according to the capability dimension.

The relationship is shown in Figure 1:

Figure 1 — Process assessment model relationship

3.1. Process reference model

Processes are grouped by process category and at a second level into process groups according to the type of activity they address.

There are 3 process categories: Primary life cycle processes, Organizational life cycle processes and Supporting life cycle processes.

Each process is described in terms of a purpose statement. The purpose statement contains the unique functional objectives of the process when performed in a particular environment. For each purpose statement a list of specific outcomes is associated, as a list of expected positive results of the process performance.

For the process dimension, the Automotive SPICE process reference model provides the set of processes shown in Figure 2.

Figure 2 — Automotive SPICE process reference model - Overview

3.1.1. Primary life cycle processes category

The primary life cycle processes category consists of processes that may be used by the customer when acquiring products from a supplier, and by the supplier when responding and delivering products to the customer including the engineering processes needed for specification, design, development, integration and testing.

The primary life cycle processes category consists of the following groups:

  • the Acquisition process group;

  • the Supply process group;

  • the System engineering process group;

  • the Software engineering process group.

The Acquisition process group (ACQ) consists of processes that are performed by the customer, or by the supplier when acting as a customer for its own suppliers, in order to acquire a product and/or service.

Table 2 — Primary life cycle processes – ACQ process group

ACQ.3 Contract Agreement
ACQ.4 Supplier Monitoring
ACQ.11 Technical Requirements
ACQ.12 Legal and Administrative Requirements
ACQ.13 Project Requirements
ACQ.14 Request for Proposals
ACQ.15 Supplier Qualification

The Supply process group (SPL) consists of processes performed by the supplier in order to supply a product and/or a service.

Table 3 — Primary life cycle processes – SPL process group

SPL.1 Supplier Tendering
SPL.2 Product Release

The System Engineering process group (SYS) consists of processes addressing the elicitation and management of customer and internal requirements, the definition of the system architecture and the integration and testing on the system level.

Table 4 — Primary life cycle processes – SYS process group

SYS.1 Requirements Elicitation
SYS.2 System Requirements Analysis
SYS.3 System Architectural Design
SYS.4 System Integration and Integration Test
SYS.5 System Qualification Test

The Software Engineering process group (SWE) consists of processes addressing the management of software requirements derived from the system requirements, the development of the corresponding software architecture and design as well as the implementation, integration and testing of the software.

Table 5 — Primary life cycle processes – SWE process group

SWE.1 Software Requirements Analysis
SWE.2 Software Architectural Design
SWE.3 Software Detailed Design and Unit Construction
SWE.4 Software Unit Verification
SWE.5 Software Integration and Integration Test
SWE.6 Software Qualification Test

3.1.2. Supporting life cycle processes category

The supporting life cycle processes category consists of processes that may be employed by any of the other processes at various points in the life cycle.

Table 6 — Supporting life cycle processes - SUP process group

SUP.1 Quality Assurance
SUP.2 Verification
SUP.4 Joint Review
SUP.7 Documentation
SUP.8 Configuration Management
SUP.9 Problem Resolution Management
SUP.10 Change Request Management

3.1.3. Organizational life cycle processes category

The organizational life cycle processes category consists of processes that develop process, product, and resource assets which, when used by projects in the organization, will help the organization achieve its business goals.

The organizational life cycle processes category consists of the following groups:

  • the Management process group;

  • the Process Improvement process group;

  • the Reuse process group.

The Management process group (MAN) consists of processes that may be used by anyone who manages any type of project or process within the life cycle.

Table 7 — Organizational life cycle processes - MAN process group

MAN.3 Project Management
MAN.5 Risk Management
MAN.6 Measurement

The Process Improvement process group (PIM) covers one process that contains practices to improve the processes performed in the organizational unit.

Table 8 — Organizational life cycle processes - PIM process group

PIM.3 Process Improvement

The Reuse process group (REU) covers one process to systematically exploit reuse opportunities in an organization’s reuse program.

Table 9 — Organizational life cycle processes - REU process group

REU.2 Reuse Program Management

3.2. Measurement framework

The measurement framework provides the necessary requirements and rules for the capability dimension. It defines a schema which enables an assessor to determine the capability level of a given process. These capability levels are defined as part of the measurement framework.

To enable the rating, the measurement framework provides process attributes defining a measurable property of process capability. Each process attribute is assigned to a specific capability level. The extent of achievement of a certain process attribute is represented by means of a rating based on a defined rating scale. The rules from which an assessor can derive a final capability level for a given process are represented by a process capability level model.

Automotive SPICE 3.1 uses the measurement framework defined in ISO/IEC 33020:2015.

NOTE: Text incorporated from ISO/IEC 33020 within this chapter is written in italic font and marked with a left side bar.

3.2.1. Process capability levels and process attributes

The process capability levels and process attributes are identical to those defined in ISO/IEC 33020 clause 5.2. The detailed descriptions of the capability levels and the corresponding process attributes can be found in chapter 5.

Process attributes are features of a process that can be evaluated on a scale of achievement, providing a measure of the capability of the process. They are applicable to all processes.

A capability level is a set of process attribute(s) that work together to provide a major enhancement in the capability to perform a process. Each attribute addresses a specific aspect of the capability level. The levels constitute a rational way of progressing through improvement of the capability of any process.

According to ISO/IEC 33020 there are six capability levels, incorporating nine process attributes:

Table 10 — Process capability levels according to ISO/IEC 33020

Level 0: Incomplete process
The process is not implemented, or fails to achieve its process purpose.

Level 1: Performed process
The implemented process achieves its process purpose.

Level 2: Managed process
The previously described performed process is now implemented in a managed fashion (planned, monitored and adjusted) and its work products are appropriately established, controlled and maintained.

Level 3: Established process
The previously described managed process is now implemented using a defined process that is capable of achieving its process outcomes.

Level 4: Predictable process
The previously described established process now operates predictively within defined limits to achieve its process outcomes. Quantitative management needs are identified, measurement data are collected and analyzed to identify assignable causes of variation. Corrective action is taken to address assignable causes of variation.

Level 5: Innovating process
The previously described predictable process is now continually improved to respond to organizational change.

Within this process assessment model, the determination of capability is based upon the nine process attributes (PA) defined in ISO/IEC 33020 and listed in Table 11.

Table 11 — Process attributes according to ISO/IEC 33020

Level 0: Incomplete process
(no process attributes)

Level 1: Performed process
PA 1.1 Process performance process attribute

Level 2: Managed process
PA 2.1 Performance management process attribute
PA 2.2 Work product management process attribute

Level 3: Established process
PA 3.1 Process definition process attribute
PA 3.2 Process deployment process attribute

Level 4: Predictable process
PA 4.1 Quantitative analysis process attribute
PA 4.2 Quantitative control process attribute

Level 5: Innovating process
PA 5.1 Process innovation process attribute
PA 5.2 Process innovation implementation process attribute

3.2.2. Process attribute rating

To support the rating of process attributes, the ISO/IEC 33020 measurement framework provides a defined rating scale with an option for refinement, different rating methods and different aggregation methods depending on the class of the assessment (e.g. required for organizational maturity assessments).

Rating scale

Within this process measurement framework, a process attribute is a measureable property of process capability. A process attribute rating is a judgement of the degree of achievement of the process attribute for the assessed process.

The rating scale is defined by ISO/IEC 33020 as shown in table 12.

Table 12 — Rating scale according to ISO/IEC 33020

N (Not achieved): There is little or no evidence of achievement of the defined process attribute in the assessed process.

P (Partially achieved): There is some evidence of an approach to, and some achievement of, the defined process attribute in the assessed process. Some aspects of achievement of the process attribute may be unpredictable.

L (Largely achieved): There is evidence of a systematic approach to, and significant achievement of, the defined process attribute in the assessed process. Some weaknesses related to this process attribute may exist in the assessed process.

F (Fully achieved): There is evidence of a complete and systematic approach to, and full achievement of, the defined process attribute in the assessed process. No significant weaknesses related to this process attribute exist in the assessed process.

The ordinal scale defined above shall be understood in terms of percentage achievement of a process attribute.

The corresponding percentages shall be:

Table 13 — Rating scale percentage values according to ISO/IEC 33020

N (Not achieved): 0 to ≤ 15% achievement
P (Partially achieved): > 15% to ≤ 50% achievement
L (Largely achieved): > 50% to ≤ 85% achievement
F (Fully achieved): > 85% to ≤ 100% achievement

The ordinal scale may be further refined for the measures P and L as defined below.

Table 14 — Refinement of rating scale according to ISO/IEC 33020

P- (Partially achieved -): There is some evidence of an approach to, and some achievement of, the defined process attribute in the assessed process. Many aspects of achievement of the process attribute may be unpredictable.

P+ (Partially achieved +): There is some evidence of an approach to, and some achievement of, the defined process attribute in the assessed process. Some aspects of achievement of the process attribute may be unpredictable.

L- (Largely achieved -): There is evidence of a systematic approach to, and significant achievement of, the defined process attribute in the assessed process. Many weaknesses related to this process attribute may exist in the assessed process.

L+ (Largely achieved +): There is evidence of a systematic approach to, and significant achievement of, the defined process attribute in the assessed process. Some weaknesses related to this process attribute may exist in the assessed process.

The corresponding percentages shall be:

Table 15 — Refined rating scale percentage values according to ISO/IEC 33020

P- (Partially achieved -): > 15% to ≤ 32.5% achievement
P+ (Partially achieved +): > 32.5% to ≤ 50% achievement
L- (Largely achieved -): > 50% to ≤ 67.5% achievement
L+ (Largely achieved +): > 67.5% to ≤ 85% achievement
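The percentage bands of Tables 13 and 15 lend themselves to a small lookup function. The following Python sketch is illustrative only; the function name and signature are our own and are not part of ISO/IEC 33020. It maps an achievement percentage to the base N/P/L/F scale and, on request, to the refined scale:

```python
def rate(achievement: float, refined: bool = False) -> str:
    """Map an achievement percentage (0..100) to an ISO/IEC 33020 rating.

    Base scale (Table 13): N, P, L, F.
    Refined scale (Table 15): the P and L bands are split at 32.5 %
    and 67.5 % into P-/P+ and L-/L+.
    """
    if not 0 <= achievement <= 100:
        raise ValueError("achievement must be between 0 and 100")
    if achievement <= 15:
        return "N"
    if achievement <= 50:
        if refined:
            return "P-" if achievement <= 32.5 else "P+"
        return "P"
    if achievement <= 85:
        if refined:
            return "L-" if achievement <= 67.5 else "L+"
        return "L"
    return "F"
```

For example, an achievement of 40 % rates as P on the base scale and as P+ on the refined scale.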

Rating and aggregation method

ISO/IEC 33020 provides the following definitions:

A process outcome is the observable result of successful achievement of the process purpose.

A process attribute outcome is the observable result of achievement of a specified process attribute.

Process outcomes and process attribute outcomes may be characterised as an intermediate step to providing a process attribute rating.

When performing rating, the rating method employed shall be specified relevant to the class of assessment. The following rating methods are defined.

The use of rating method may vary according to the class, scope and context of an assessment. The lead assessor shall decide which (if any) rating method to use. The selected rating method(s) shall be specified in the assessment input and referenced in the assessment report.

ISO/IEC 33020 provides the following 3 rating methods:

Rating method R1

The approach to process attribute rating shall satisfy the following conditions:

  1. Each process outcome of each process within the scope of the assessment shall be characterized for each process instance, based on validated data;

  2. Each process attribute outcome of each process attribute for each process within the scope of the assessment shall be characterised for each process instance, based on validated data;

  3. Process outcome characterisations for all assessed process instances shall be aggregated to provide a process performance attribute achievement rating;

  4. Process attribute outcome characterisations for all assessed process instances shall be aggregated to provide a process attribute achievement rating.

Rating method R2

The approach to process attribute rating shall satisfy the following conditions:

  1. Each process attribute for each process within the scope of the assessment shall be characterized for each process instance, based on validated data;

  2. Process attribute characterisations for all assessed process instances shall be aggregated to provide a process attribute achievement rating.

Rating method R3

Process attribute rating across assessed process instances shall be made without aggregation.

In principle the three rating methods defined in ISO/IEC 33020 differ in

  1. whether the rating is made only on the process attribute level (rating methods 2 and 3) or, with a greater level of detail, on both the process attribute and the process attribute outcome level (rating method 1); and

  2. the type of aggregation of ratings across the assessed process instances for each process.

If a rating is performed for both process attributes and process attribute outcomes (Rating method 1), the result will be a process performance attribute outcome rating on level 1 and a process attribute achievement rating on higher levels.

Depending on the class, scope and context of the assessment an aggregation within one process (one-dimensional, vertical aggregation), across multiple process instances (one-dimensional, horizontal aggregation) or both (two-dimensional, matrix aggregation) is performed.

ISO/IEC 33020 provides the following examples:

When performing an assessment, ratings may be summarised across one or two dimensions.

For example, when rating a

  • process attribute for a given process, one may aggregate the ratings of the associated process (attribute) outcomes – such an aggregation is performed as a vertical aggregation (one dimension);

  • process (attribute) outcome for a given process attribute across multiple process instances, one may aggregate the ratings of the associated process instances for the given process (attribute) outcome – such an aggregation is performed as a horizontal aggregation (one dimension);

  • process attribute for a given process, one may aggregate the ratings of all the process (attribute) outcomes for all the process instances – such an aggregation is performed as a matrix aggregation across the full scope of ratings (two dimensions).

The standard defines different methods for aggregation. Further information can be taken from ISO/IEC 33020.
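As an illustration of horizontal aggregation, the sketch below uses a convention of our own rather than one of the aggregation methods normatively defined in ISO/IEC 33020: it converts per-instance ordinal ratings to the midpoints of their Table 13 percentage bands, averages them, and maps the mean back to the rating scale.

```python
from statistics import mean

# Midpoints of the Table 13 percentage bands; using midpoints to turn
# ordinal ratings into numbers is our assumption, not ISO/IEC 33020's.
BAND_MIDPOINT = {"N": 7.5, "P": 32.5, "L": 67.5, "F": 92.5}


def to_rating(pct: float) -> str:
    """Map an achievement percentage back to the base rating scale (Table 13)."""
    if pct <= 15:
        return "N"
    if pct <= 50:
        return "P"
    if pct <= 85:
        return "L"
    return "F"


def aggregate_horizontal(instance_ratings: list[str]) -> str:
    """Aggregate one process attribute's ratings across process instances."""
    return to_rating(mean(BAND_MIDPOINT[r] for r in instance_ratings))
```

For example, aggregating the instance ratings L, F and L yields a mean of about 75.8 %, i.e. an L rating for the process attribute.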

3.2.3. Process capability level model

The process capability level achieved by a process shall be derived from the process attribute ratings for that process according to the process capability level model defined in Table 16.

The process capability level model defines the rules how the achievement of each level depends on the rating of the process attributes for the assessed and all lower levels.

As a general rule, the achievement of a given level requires at least a largely achieved rating of the process attributes of that level and a fully achieved rating of the process attributes of all lower levels.

Table 16 — Process capability level model according to ISO/IEC 33020

Level 1:
PA 1.1 Process Performance: Largely

Level 2:
PA 1.1 Process Performance: Fully
PA 2.1 Performance Management: Largely
PA 2.2 Work Product Management: Largely

Level 3:
PA 1.1 Process Performance: Fully
PA 2.1 Performance Management: Fully
PA 2.2 Work Product Management: Fully
PA 3.1 Process Definition: Largely
PA 3.2 Process Deployment: Largely

Level 4:
PA 1.1 Process Performance: Fully
PA 2.1 Performance Management: Fully
PA 2.2 Work Product Management: Fully
PA 3.1 Process Definition: Fully
PA 3.2 Process Deployment: Fully
PA 4.1 Quantitative Analysis: Largely
PA 4.2 Quantitative Control: Largely

Level 5:
PA 1.1 Process Performance: Fully
PA 2.1 Performance Management: Fully
PA 2.2 Work Product Management: Fully
PA 3.1 Process Definition: Fully
PA 3.2 Process Deployment: Fully
PA 4.1 Quantitative Analysis: Fully
PA 4.2 Quantitative Control: Fully
PA 5.1 Process Innovation: Largely
PA 5.2 Process Innovation Implementation: Largely
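The rules of the process capability level model can be stated compactly: a capability level is achieved when the process attributes of all lower levels are rated fully achieved and the attributes of the level itself are rated at least largely achieved. The following helper is a hypothetical sketch of that derivation; the data structures are ours, not part of the measurement framework.

```python
# Process attributes per capability level (as listed in the PAM).
LEVEL_PAS = {
    1: ["PA 1.1"],
    2: ["PA 2.1", "PA 2.2"],
    3: ["PA 3.1", "PA 3.2"],
    4: ["PA 4.1", "PA 4.2"],
    5: ["PA 5.1", "PA 5.2"],
}


def capability_level(ratings: dict[str, str]) -> int:
    """Derive the capability level from N/P/L/F process attribute ratings.

    A level is achieved if all lower-level attributes are rated F and all
    attributes of the level itself are rated at least L; unrated
    attributes are treated as N.
    """
    achieved = 0
    for level in range(1, 6):
        current = [ratings.get(pa, "N") for pa in LEVEL_PAS[level]]
        lower = [ratings.get(pa, "N")
                 for lv in range(1, level) for pa in LEVEL_PAS[lv]]
        if all(r == "F" for r in lower) and all(r in ("L", "F") for r in current):
            achieved = level
        else:
            break
    return achieved
```

For instance, ratings of F for PA 1.1 and L for PA 2.1 and PA 2.2 yield capability level 2.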

3.3. Process assessment model

The process assessment model offers indicators in order to identify whether the process outcomes and the process attribute outcomes (achievements) are present or absent in the instantiated processes of projects and organizational units. These indicators provide guidance for assessors in accumulating the necessary objective evidence to support judgments of capability. They are not intended to be regarded as a mandatory set of checklists to be followed.

In order to judge the presence or absence of process outcomes and process achievements an assessment obtains objective evidence. All such evidence comes from the examination of work products and repository content of the assessed processes, and from testimony provided by the performers and managers of the assessed processes. This evidence is mapped to the PAM indicators to allow establishing the correspondence to the relevant process outcomes and process attribute achievements.

There are two types of indicators:

  • Process performance indicators, which apply exclusively to capability Level 1. They provide an indication of the extent of fulfillment of the process outcomes

  • Process capability indicators, which apply to Capability Levels 2 to 5. They provide an indication of the extent of fulfillment of the process attribute achievements.

Assessment indicators are used to confirm that certain practices were performed, as shown by evidence collected during an assessment. All such evidence comes either from the examination of work products of the processes assessed, or from statements made by the performers and managers of the processes. The existence of base practices and work products provide evidence of the performance of the processes associated with them. Similarly, the existence of process capability indicators provides evidence of process capability.

The evidence obtained should be recorded in a form that clearly relates to an associated indicator, in order that support for the assessor’s judgment can be confirmed or verified as required by ISO/IEC 33002.
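As an illustration only (this data model is not part of the standard, and all names in it are hypothetical), the relationship described above can be sketched as evidence items mapped to indicators, which in turn address process outcomes or process attribute achievements:

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    ident: str               # e.g. a base practice "SYS.2.BP1" or a generic practice "GP 2.1.3"
    kind: str                # "BP" / "WP" (level 1) or "GP" / "GR" (levels 2 to 5)
    addresses: list[str]     # the outcomes or achievements this indicator addresses

@dataclass
class Evidence:
    source: str                                    # examined work product or recorded testimony
    indicator_ids: list[str] = field(default_factory=list)  # indicators this evidence supports

def coverage(indicators: list[Indicator], evidence: list[Evidence]) -> dict[str, bool]:
    """For each outcome/achievement, report whether any collected evidence
    supports an indicator that addresses it."""
    cited = {i for e in evidence for i in e.indicator_ids}
    supported: dict[str, bool] = {}
    for ind in indicators:
        for target in ind.addresses:
            supported[target] = supported.get(target, False) or ind.ident in cited
    return supported
```

In this sketch, an outcome with no supporting evidence simply remains uncovered; how an assessor weighs partial coverage into a rating is governed by ISO/IEC 33002 and the measurement framework, not by this structure.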

3.3.1. Process performance indicators

Types of process performance indicators are:

  • Base practices (BP)

  • Work products (WP).

Both BPs and WPs relate to one or more process outcomes. Consequently, BPs and WPs are always process-specific and not generic. BPs represent activity-oriented indicators, while WPs represent result-oriented indicators. Both are used for judging the objective evidence that an assessor is to collect and accumulate in the performance of an assessment. In that respect, BPs and WPs are alternative indicator sets the assessor can use.

The PAM offers a set of work product characteristics (WPC, see Annex B) for each WP. These are meant to serve as a good-practice and state-of-the-art knowledge guide for the assessor, so that WPs and WPCs form a quickly accessible information source during an assessment. In that respect, WPs and WPCs represent an example structure only. They are neither a strict must nor normative for organizations. Instead, the actual structure, form, and content of concrete work products and documents for the implemented processes must be defined by the project and organization, respectively. The project and/or organization ensures that the work products are appropriate for the intended purpose and needs, and in relation to the development goals.

3.3.2. Process capability indicators

Types of process capability indicators are:

  • Generic Practice (GP)

  • Generic Resource (GR)

Both GPs and GRs relate to one or more PA Achievements. In contrast to process performance indicators, however, they are of generic type, i.e. they apply to any process.

The difference between GPs and GRs is that the former represent activity-oriented indicators while the latter represent infrastructure-oriented indicators for judging objective evidence. An assessor has to collect and accumulate evidence supporting process capability indicators during an assessment. In that respect, GPs and GRs are alternative indicator sets the assessor can use.

Although level 1 capability of a process is characterized only by the measure of the extent to which the process outcomes are achieved, the measurement framework (see chapter 3.2) requires each level to reveal a process attribute and thus requires the PAM to introduce at least one process capability indicator. Therefore, the only process attribute for capability Level 1, the process performance attribute (PA 1.1), has a single generic practice (GP 1.1.1) pointing as an editorial reference to the respective process performance indicators (see Figure 3).

../_images/figure3.png

Figure 3 — Relationship between assessment indicators and process capability

3.3.3. Understanding the level of abstraction of a PAM

The term “process” can be understood at three levels of abstraction. Note that these levels are not meant to define a strict black-or-white split, nor is the aim to provide a scientific classification schema. The message is that, in practice, the term “process” is used at different abstraction levels, and that a PAM resides at the highest of them.

../_images/figure4.png

Figure 4 — Possible levels of abstraction for the term “process”

Capturing experience acquired during product development (i.e. at the DOING level) in order to share this experience with others means creating a HOW level. However, a HOW is always specific to a particular context such as a company, an organizational unit, or a product line. For example, the HOW of a project, organizational unit, or company A is potentially not applicable as-is to a project, organizational unit, or company B. However, both might be expected to adhere to the principles represented by PAM indicators for process outcomes and process attribute achievements. These indicators are at the WHAT level, while deciding on solutions for concrete templates, proceedings, and tooling is left to the HOW level.

../_images/figure5.png

Figure 5 — Performing a process assessment for determining process capability

4. Process reference model and performance indicators (Level 1)

The processes in the process dimension can be drawn from the Automotive SPICE process reference model, which is incorporated in the tables below indicated by a red bar at the left side.

Each table related to one process in the process dimension contains the process reference model (indicated by a red bar) and the process performance indicators necessary to define the process assessment model. The process performance indicators consist of base practices (indicated by a green bar) and output work products (indicated by a blue bar).

Table 17 — Template for the process description

Process reference model:

  • Process ID

  • Process name

  • Process purpose

  • Process outcomes

The individual processes are described in terms of process name, process purpose, and process outcomes to define the Automotive SPICE process reference model. Additionally, a process identifier is provided.

Process performance indicators:

  • Base practices: A set of base practices for the process providing a definition of the tasks and activities needed to accomplish the process purpose and fulfill the process outcomes.

  • Output work products: A number of output work products associated with each process.

NOTE: Refer to Annex B for the characteristics associated with each work product.

4.1. Acquisition process group (ACQ)

4.1.1. ACQ.3 Contract Agreement

4.1.2. ACQ.4 Supplier Monitoring

4.1.3. ACQ.11 Technical Requirements

4.1.4. ACQ.12 Legal and Administrative Requirements

4.1.5. ACQ.13 Project Requirements

4.1.6. ACQ.14 Request for Proposals

4.1.7. ACQ.15 Supplier Qualification

4.2. Supply process group (SPL)

4.2.1. SPL.1 Supplier Tendering

4.2.2. SPL.2 Product Release

4.3. System engineering process group (SYS)

4.3.1. SYS.1 Requirements Elicitation

4.3.2. SYS.2 System Requirements Analysis

4.3.3. SYS.3 System Architectural Design

4.3.4. SYS.4 System Integration and Integration Test

4.3.5. SYS.5 System Qualification Test

4.4. Software engineering process group (SWE)

4.4.1. SWE.1 Software Requirements Analysis

4.4.2. SWE.2 Software Architectural Design

4.4.3. SWE.3 Software Detailed Design and Unit Construction

4.4.4. SWE.4 Software Unit Verification

4.4.5. SWE.5 Software Integration and Integration Test

4.4.6. SWE.6 Software Qualification Test

4.5. Supporting process group (SUP)

4.5.1. SUP.1 Quality Assurance

4.5.2. SUP.2 Verification

4.5.3. SUP.4 Joint Review

4.5.4. SUP.7 Documentation

4.5.5. SUP.8 Configuration Management

4.5.6. SUP.9 Problem Resolution Management

4.5.7. SUP.10 Change Request Management

4.6. Management process group (MAN)

4.6.1. MAN.3 Project Management

4.6.2. MAN.5 Risk Management

4.6.3. MAN.6 Measurement

4.7. Process improvement process group (PIM)

4.7.1. PIM.3 Process Improvement

4.8. Reuse process group (REU)

4.8.1. REU.2 Reuse Program Management

5. Process capability levels and process attributes

Process capability indicators are the means of achieving the capabilities addressed by the considered process attributes. Evidence of process capability indicators supports the judgment of the degree of achievement of the process attribute.

The capability dimension of the process assessment model consists of six capability levels matching the capability levels defined in ISO/IEC 33020. The process capability indicators for the nine process attributes included in the capability dimension for capability levels 1 to 5 are described below.

Each of the process attributes in this process assessment model is identical to the process attribute defined in the process measurement framework. The generic practices address the characteristics from each process attribute. The generic resources relate to the process attribute as a whole.

Process capability level 0 does not include any type of indicators, as it reflects a non-implemented process or a process that fails to achieve even partially any of its outcomes.

NOTE: Process attribute definitions and attribute outcomes are duplicated from ISO/IEC 33020; they are set in italic font and marked with a bar at the left side.
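As a minimal sketch of the capability dimension just described (an illustration, not normative content: six levels, nine process attributes over levels 1 to 5, and no attributes or indicators at level 0), the structure can be written down as:

```python
# Capability levels and their process attributes, as summarized in this chapter.
CAPABILITY_DIMENSION = {
    0: [],                       # Incomplete process: no indicators of any type
    1: ["PA 1.1"],               # Process performance
    2: ["PA 2.1", "PA 2.2"],     # Performance management, Work product management
    3: ["PA 3.1", "PA 3.2"],     # Process definition, Process deployment
    4: ["PA 4.1", "PA 4.2"],     # Quantitative analysis, Quantitative control
    5: ["PA 5.1", "PA 5.2"],     # Process innovation, Process innovation implementation
}

def attributes_up_to(level: int) -> list[str]:
    """All process attributes relevant when assessing a given capability level,
    since each level builds on the attributes of the levels below it."""
    return [pa for lv in range(1, level + 1) for pa in CAPABILITY_DIMENSION[lv]]
```

For example, assessing level 2 involves rating PA 1.1 together with PA 2.1 and PA 2.2; the rating scale itself (N/P/L/F) is defined in ISO/IEC 33020.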

5.1. Process capability Level 0: Incomplete process

The process is not implemented, or fails to achieve its process purpose. At this level there is little or no evidence of any systematic achievement of the process purpose.

5.2. Process capability Level 1: Performed process

The implemented process achieves its process purpose. The following process attribute demonstrates the achievement of this level:

5.2.1. PA 1.1 Process performance process attribute

The process performance process attribute is a measure of the extent to which the process purpose is achieved. As a result of full achievement of this attribute:

a) The process achieves its defined outcomes.

Generic practices

GP 1.1.1 Achieve the process outcomes [ACHIEVEMENT a]

Achieve the intent of the base practices.

Produce work products that evidence the process outcomes.

Generic resources

Resources are used to achieve the intent of process-specific base practices [ACHIEVEMENT a]

5.3. Process capability Level 2: Managed process

The previously described Performed process is now implemented in a managed fashion (planned, monitored and adjusted) and its work products are appropriately established, controlled and maintained.

The following process attributes, together with the previously defined process attribute, demonstrate the achievement of this level:

5.3.1. PA 2.1 Performance management process attribute

The performance management process attribute is a measure of the extent to which the performance of the process is managed. As a result of full achievement of this process attribute:

  a) Objectives for the performance of the process are identified;

  b) Performance of the process is planned;

  c) Performance of the process is monitored;

  d) Performance of the process is adjusted to meet plans;

  e) Responsibilities and authorities for performing the process are defined, assigned and communicated;

  f) Personnel performing the process are prepared for executing their responsibilities;

  g) Resources and information necessary for performing the process are identified, made available, allocated and used;

  h) Interfaces between the involved parties are managed to ensure both effective communication and clear assignment of responsibility.

Generic practices

GP 2.1.1 Identify the objectives for the performance of the process. [ACHIEVEMENT a]

Performance objectives are identified based on process requirements.

The scope of the process performance is defined.

Assumptions and constraints are considered when identifying the performance objectives.

NOTE 1: Performance objectives may include:

  a) timely production of artifacts meeting the defined quality criteria;

  b) process cycle time or frequency;

  c) resource usage; and

  d) boundaries of the process.

NOTE 2: At minimum, process performance objectives for resources, effort and schedule should be stated.

GP 2.1.2 Plan the performance of the process to fulfill the identified objectives. [ACHIEVEMENT b]

Plan(s) for the performance of the process are developed.

The process performance cycle is defined.

Key milestones for the performance of the process are established.

Estimates for process performance attributes are determined and maintained.

Process activities and tasks are defined.

Schedule is defined and aligned with the approach to performing the process.

Process work product reviews are planned.

GP 2.1.3 Monitor the performance of the process against the plans. [ACHIEVEMENT c]

The process is performed according to the plan(s).

Process performance is monitored to ensure planned results are achieved and to identify possible deviations.

GP 2.1.4 Adjust the performance of the process. [ACHIEVEMENT d]

Process performance issues are identified.

Appropriate actions are taken when planned results and objectives are not achieved.

The plan(s) are adjusted, as necessary.

Rescheduling is performed as necessary.

GP 2.1.5 Define responsibilities and authorities for performing the process. [ACHIEVEMENT e]

Responsibilities, commitments and authorities to perform the process are defined, assigned and communicated.

Responsibilities and authorities to verify process work products are defined and assigned.

The needs for process performance experience, knowledge and skills are defined.

GP 2.1.6 Identify, prepare, and make available resources to perform the process according to plan. [ACHIEVEMENT f, g]

The human and infrastructure resources necessary for performing the process are identified, made available, allocated and used.

The individuals performing and managing the process are prepared by training, mentoring, or coaching to execute their responsibilities.

The information necessary to perform the process is identified and made available.

GP 2.1.7 Manage the interfaces between involved parties. [ACHIEVEMENT h]

The individuals and groups involved in the process performance are determined.

Responsibilities of the involved parties are assigned.

Interfaces between the involved parties are managed.

Communication is assured between the involved parties.

Communication between the involved parties is effective.

Generic resources

Human resources with identified objectives, responsibilities and authorities [ACHIEVEMENT e, f, h]

Facilities and infrastructure resources [ACHIEVEMENT g, h]

Project planning, management and control tools, including time and cost reporting [ACHIEVEMENT a, b, c, d]

Workflow management system [ACHIEVEMENT d, f, g, h]

Email and/or other communication mechanisms [ACHIEVEMENT b, c, d, f, g, h]

Information and/or experience repository [ACHIEVEMENT b, d, e]

Problem and issues management mechanisms [ACHIEVEMENT c]

5.3.2. PA 2.2 Work product management process attribute

The work product management process attribute is a measure of the extent to which the work products produced by the process are appropriately managed. As a result of full achievement of this process attribute:

  a) Requirements for the work products of the process are defined;

  b) Requirements for documentation and control of the work products are defined;

  c) Work products are appropriately identified, documented, and controlled;

  d) Work products are reviewed in accordance with planned arrangements and adjusted as necessary to meet requirements.

NOTE 1: Requirements for documentation and control of work products may include requirements for the identification of changes and revision status, approval and re-approval of work products, distribution of work products, and for making relevant versions of applicable work products available at points of use.

NOTE 2: The work products referred to in this clause are those that result from the achievement of the process purpose through the process outcomes.

Generic practices

GP 2.2.1 Define the requirements for the work products. [ACHIEVEMENT a]

The requirements for the work products to be produced are defined. Requirements may include defining contents and structure.

Quality criteria of the work products are identified.

Appropriate review and approval criteria for the work products are defined.

GP 2.2.2 Define the requirements for documentation and control of the work products. [ACHIEVEMENT b]

Requirements for the documentation and control of the work products are defined. Such requirements may include requirements for:

  a) distribution,

  b) identification of work products and their components, and

  c) traceability.

Dependencies between work products are identified and understood.

Requirements for the approval of work products to be controlled are defined.

GP 2.2.3 Identify, document and control the work products. [ACHIEVEMENT c]

The work products to be controlled are identified.

Change control is established for work products.

The work products are documented and controlled in accordance with requirements.

Versions of work products are assigned to product configurations as applicable.

The work products are made available through appropriate access mechanisms.

The revision status of the work products may readily be ascertained.

GP 2.2.4 Review and adjust work products to meet the defined requirements. [ACHIEVEMENT d]

Work products are reviewed against the defined requirements in accordance with planned arrangements.

Issues arising from work product reviews are resolved.

Generic resources

Requirement management method/toolset [ACHIEVEMENT a, b, c]

Configuration management system [ACHIEVEMENT b, c]

Documentation elaboration and support tool [ACHIEVEMENT b, c]

Document identification and control procedure [ACHIEVEMENT b, c]

Work product review methods and experiences [ACHIEVEMENT d]

Review management method/toolset [ACHIEVEMENT d]

Intranets, extranets and/or other communication mechanisms [ACHIEVEMENT b, c]

Problem and issue management mechanisms [ACHIEVEMENT d]

5.4. Process capability Level 3: Established process

The previously described Managed process is now implemented using a defined process that is capable of achieving its process outcomes.

The following process attributes, together with the previously defined process attributes, demonstrate the achievement of this level:

5.4.1. PA 3.1 Process definition process attribute

The process definition process attribute is a measure of the extent to which a standard process is maintained to support the deployment of the defined process. As a result of full achievement of this process attribute:

  a) A standard process, including appropriate tailoring guidelines, is defined and maintained that describes the fundamental elements that must be incorporated into a defined process;

  b) The sequence and interaction of the standard process with other processes is determined;

  c) Required competencies and roles for performing the process are identified as part of the standard process;

  d) Required infrastructure and work environment for performing the process are identified as part of the standard process;

  e) Suitable methods and measures for monitoring the effectiveness and suitability of the process are determined.

Generic practices

GP 3.1.1 Define and maintain the standard process that will support the deployment of the defined process. [ACHIEVEMENT a]

A standard process is developed and maintained that includes the fundamental process elements.

The standard process identifies the deployment needs and deployment context.

Guidance and/or procedures are provided to support implementation of the process as needed.

Appropriate tailoring guideline(s) are available as needed.

GP 3.1.2 Determine the sequence and interaction between processes so that they work as an integrated system of processes. [ACHIEVEMENT b]

The standard process’s sequence and interaction with other processes are determined.

Deployment of the standard process as a defined process maintains integrity of processes.

GP 3.1.3 Identify the roles and competencies, responsibilities, and authorities for performing the standard process. [ACHIEVEMENT c]

Process performance roles are identified.

Competencies for performing the process are identified.

Authorities necessary for executing responsibilities are identified.

GP 3.1.4 Identify the required infrastructure and work environment for performing the standard process. [ACHIEVEMENT d]

Process infrastructure components are identified (facilities, tools, networks, methods, etc.).

Work environment requirements are identified.

GP 3.1.5 Determine suitable methods and measures to monitor the effectiveness and suitability of the standard process. [ACHIEVEMENT e]

Methods and measures for monitoring the effectiveness and suitability of the process are determined.

Appropriate criteria and data needed to monitor the effectiveness and suitability of the process are defined.

The need to conduct internal audit and management review is established.

Process changes are implemented to maintain the standard process.

Generic resources

Process modeling methods/tools [ACHIEVEMENT a, b, c, d]

Training material and courses [ACHIEVEMENT a, b, c, d]

Resource management system [ACHIEVEMENT d]

Process infrastructure [ACHIEVEMENT a, b, d]

Audit and trend analysis tools [ACHIEVEMENT e]

Process monitoring method [ACHIEVEMENT e]

5.4.2. PA 3.2 Process deployment process attribute

The process deployment process attribute is a measure of the extent to which the standard process is deployed as a defined process to achieve its process outcomes. As a result of full achievement of this process attribute:

  a) A defined process is deployed based upon an appropriately selected and/or tailored standard process;

  b) Required roles, responsibilities and authorities for performing the defined process are assigned and communicated;

  c) Personnel performing the defined process are competent on the basis of appropriate education, training, and experience;

  d) Required resources and information necessary for performing the defined process are made available, allocated and used;

  e) Required infrastructure and work environment for performing the defined process are made available, managed and maintained;

  f) Appropriate data are collected and analysed as a basis for understanding the behaviour of the process, to demonstrate the suitability and effectiveness of the process, and to evaluate where continual improvement of the process can be made.

Generic practices

GP 3.2.1 Deploy a defined process that satisfies the context specific requirements of the use of the standard process. [ACHIEVEMENT a]

The defined process is appropriately selected and/or tailored from the standard process.

Conformance of defined process with standard process requirements is verified.

GP 3.2.2 Assign and communicate roles, responsibilities and authorities for performing the defined process. [ACHIEVEMENT b]

The roles for performing the defined process are assigned and communicated.

The responsibilities and authorities for performing the defined process are assigned and communicated.

GP 3.2.3 Ensure necessary competencies for performing the defined process. [ACHIEVEMENT c]

Appropriate competencies for assigned personnel are identified.

Suitable training is available for those deploying the defined process.

GP 3.2.4 Provide resources and information to support the performance of the defined process. [ACHIEVEMENT d]

Required human resources are made available, allocated and used.

Required information to perform the process is made available, allocated and used.

GP 3.2.5 Provide adequate process infrastructure to support the performance of the defined process. [ACHIEVEMENT e]

Required infrastructure and work environment is available.

Organizational support to effectively manage and maintain the infrastructure and work environment is available.

Infrastructure and work environment is used and maintained.

GP 3.2.6 Collect and analyze data about performance of the process to demonstrate its suitability and effectiveness. [ACHIEVEMENT f]

Data required to understand the behavior, suitability and effectiveness of the defined process are identified.

Data is collected and analyzed to understand the behavior, suitability and effectiveness of the defined process.

Results of the analysis are used to identify where continual improvement of the standard and/or defined process can be made.

NOTE 1: Data about process performance may be qualitative or quantitative.

Generic resources

Feedback mechanisms (customer, staff, other stakeholders) [ACHIEVEMENT f]

Process repository [ACHIEVEMENT a]

Resource management system [ACHIEVEMENT b, c, d]

Knowledge management system [ACHIEVEMENT a, b, d, f]

Problem and change management system [ACHIEVEMENT f]

Working environment and infrastructure [ACHIEVEMENT d, e]

Data collection analysis system [ACHIEVEMENT f]

Process assessment framework [ACHIEVEMENT f]

Audit/review system [ACHIEVEMENT f]

5.5. Process capability Level 4: Predictable process

The previously described Established process now operates predictively within defined limits to achieve its process outcomes. Quantitative management needs are identified, measurement data are collected and analysed to identify assignable causes of variation. Corrective action is taken to address assignable causes of variation.

The following process attributes, together with the previously defined process attributes, demonstrate the achievement of this level:

5.5.1. PA 4.1 Quantitative analysis process attribute

The quantitative analysis process attribute is a measure of the extent to which information needs are defined, relationships between process elements are identified and data are collected. As a result of full achievement of this process attribute:

  a) The process is aligned with quantitative business goals;

  b) Process information needs in support of relevant defined quantitative business goals are established;

  c) Process measurement objectives are derived from process information needs;

  d) Measurable relationships between process elements that contribute to the process performance are identified;

  e) Quantitative objectives for process performance in support of relevant business goals are established;

  f) Appropriate measures and frequency of measurement are identified and defined in line with process measurement objectives and quantitative objectives for process performance;

  g) Results of measurement are collected, validated and reported in order to monitor the extent to which the quantitative objectives for process performance are met.

NOTE 1: Information needs typically reflect management, technical, project, process or product needs.

Generic practices

GP 4.1.1 Identify business goals. [ACHIEVEMENT a]

Business goals are identified that are supported by the quantitatively measured process.

GP 4.1.2 Establish process information needs. [ACHIEVEMENT a, b]

Stakeholders of the identified business goals and the quantitatively measured process, and their information needs are identified, defined and agreed.

GP 4.1.3 Derive process measurement objectives from process information needs. [ACHIEVEMENT a, c]

The process measurement objectives to satisfy the established process information needs are derived.

GP 4.1.4 Identify measurable relationships between process elements. [ACHIEVEMENT a, d]

Identify the relationships between process elements, which contribute to the derived measurement objectives.

GP 4.1.5 Establish quantitative objectives. [ACHIEVEMENT a, e]

Establish quantitative objectives for the identified measurable process elements and their relationships. Agreement with process stakeholders is established.

GP 4.1.6 Identify process measures that support the achievement of the quantitative objectives. [ACHIEVEMENT a, f]

Detailed measures are defined to support monitoring, analysis and verification needs of the quantitative objectives.

Frequency of data collection is defined.

Algorithms and methods to create derived measurement results from base measures are defined, as appropriate.

Verification mechanism for base and derived measures is defined.

NOTE 1: Typically, the standard process definition is extended to include the collection of data for process measurement.

GP 4.1.7 Collect product and process measurement results through performing the defined process. [ACHIEVEMENT a, g]

Data collection mechanism is created for all identified measures.

Required data is collected within the defined frequency, and recorded.

Measurement results are analyzed, and reported to the identified stakeholders.

NOTE 2: A product measure can contribute to a process measure, e.g. the productivity of testing characterized by the number of defects found in a given timeframe in relation to the product defect rate in the field.

Generic resources

Management information (cost, time, reliability, profitability, customer benefits, risks etc.) [ACHIEVEMENT a, b, c, d, e, f]

Applicable measurement techniques [ACHIEVEMENT a, d]

Product and Process measurement tools and results databases [ACHIEVEMENT a, d, e, f, g]

Process measurement framework [ACHIEVEMENT a, d, e, f, g]

Tools for data analysis and measurement [ACHIEVEMENT a, b, c, d, e, f]

5.5.2. PA 4.2 Quantitative control process attribute

The quantitative control process attribute is a measure of the extent to which objective data are used to manage process performance that is predictable. As a result of full achievement of this process attribute:

  a) Techniques for analyzing the collected data are selected;

  b) Assignable causes of process variation are determined through analysis of the collected data;

  c) Distributions that characterize the performance of the process are established;

  d) Corrective actions are taken to address assignable causes of variation;

  e) Separate distributions are established (as necessary) for analyzing the process under the influence of assignable causes of variation.

Generic practices

GP 4.2.1 Select analysis techniques. [ACHIEVEMENT a]

Analysis methods and techniques for control of the process measurements are defined.

GP 4.2.2 Establish distributions that characterize the process performance. [ACHIEVEMENT c]

Expected distributions and corresponding control limits for measurement results are defined.

GP 4.2.3 Determine assignable causes of process variation. [ACHIEVEMENT b]

Each deviation from the defined control limits is identified and recorded.

Assignable causes of these deviations are determined by analyzing the collected data using the defined analysis techniques. All deviations and assignable causes are recorded.
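One common analysis technique that an organization might select for GP 4.2.2 and GP 4.2.3 is a control chart; the following sketch assumes control limits at the mean plus or minus three standard deviations (a conventional choice, not a requirement of the standard) and flags measurement results outside those limits as deviations whose assignable causes must then be analyzed:

```python
import statistics

def control_limits(samples: list[float]) -> tuple[float, float]:
    """Lower and upper control limits at mean +/- 3 population standard
    deviations of the baseline measurement results (GP 4.2.2)."""
    mean = statistics.fmean(samples)
    sigma = statistics.pstdev(samples)
    return mean - 3 * sigma, mean + 3 * sigma

def deviations(samples: list[float], lcl: float, ucl: float) -> list[int]:
    """Indices of measurement results outside the defined control limits;
    each such deviation is recorded and analyzed for an assignable cause (GP 4.2.3)."""
    return [i for i, x in enumerate(samples) if not lcl <= x <= ucl]
```

In practice the baseline distribution would be re-established once assignable causes are removed, so that the limits characterize only common-cause variation.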

GP 4.2.4 Identify and implement corrective actions to address assignable causes. [ACHIEVEMENT d]

Corrective actions are determined, recorded, and implemented to address assignable causes of variation.

Corrective action results are monitored and evaluated to determine their effectiveness.

GP 4.2.5 Establish separate distributions for analyzing the process. [ACHIEVEMENT e]

Separate distributions are used to quantitatively understand the variation of process performance under the influence of assignable causes.

Generic resources

Process control and analysis techniques [ACHIEVEMENT a, c]

Statistical analysis tools/applications [ACHIEVEMENT a, b, c, e]

Process control tools/applications [ACHIEVEMENT d, e]

5.6. Process capability Level 5: Innovating process

The previously described Predictable process is now continually improved to respond to change aligned with organizational goals.

The following process attributes, together with the previously defined process attributes, demonstrate the achievement of this level:

5.6.1. PA 5.1 Process innovation process attribute

The process innovation process attribute is a measure of the extent to which changes to the process are identified from investigations of innovative approaches to the definition and deployment of the process. As a result of full achievement of this process attribute:

  a) Process innovation objectives are defined that support the relevant business goals;

  b) Appropriate data are analysed to identify opportunities for innovation;

  c) Innovation opportunities derived from new technologies and process concepts are identified;

  d) An implementation strategy is established to achieve the process innovation objectives.

Generic practices

GP 5.1.1 Define the process innovation objectives for the process that support the relevant business goals. [ACHIEVEMENT a]

New business visions and goals are analyzed to give guidance for new process objectives and potential areas of process innovation.

GP 5.1.2 Analyze data of the process to identify opportunities for innovation. [ACHIEVEMENT b]

Common causes of variation in process performance are identified and analyzed to get a quantitative understanding of their impact.

Identify opportunities for innovation based on the quantitative understanding of the analyzed data.

GP 5.1.3 Analyze new technologies and process concepts to identify opportunities for innovation. [ACHIEVEMENT c]

Industry best practices, new technologies and process concepts are identified and evaluated.

Feedback on opportunities for innovation is actively sought.

Emergent risks are considered in evaluating improvement opportunities.

GP 5.1.4 Define and maintain an implementation strategy based on innovation vision and objectives. [ACHIEVEMENT d]

Commitment to innovation is demonstrated by organizational management including the process owner(s) and other relevant stakeholders.

Define and maintain an implementation strategy to achieve identified opportunities for innovation and objectives.

Based on the implementation strategy, process changes are planned and prioritized based on their impact on defined innovations.

Measures that validate the results of process changes are defined to determine the expected effectiveness of the process changes and the expected impact on defined business objectives.

Generic resources

Process improvement framework [ACHIEVEMENT a, c, d]

Process feedback and analysis system (measurement data, causal analysis results etc.) [ACHIEVEMENT b, c]

Piloting and trialing mechanism [ACHIEVEMENT c, d]

5.6.2. PA 5.2 Process innovation implementation process attribute

The process innovation implementation process attribute is a measure of the extent to which changes to the definition, management and performance of the process achieve the relevant process innovation objectives. As a result of full achievement of this process attribute:

  a. Impact of all proposed changes is assessed against the objectives of the defined process and standard process;

  b. Implementation of all agreed changes is managed to ensure that any disruption to the process performance is understood and acted upon;

  c. Effectiveness of process change on the basis of actual performance is evaluated against the defined product requirements and process objectives.

Generic practices

GP 5.2.1 Assess the impact of each proposed change against the objectives of the defined and standard process. [ACHIEVEMENT a]

Objective priorities for process innovation are established.

Specified changes are assessed against product quality and process performance requirements and goals.

Impact of changes to other defined and standard processes is considered.

GP 5.2.2 Manage the implementation of agreed changes. [ACHIEVEMENT b]

A mechanism is established for incorporating accepted changes into the defined and standard process(es) effectively and completely.

The factors that impact the effectiveness and full deployment of the process change are identified and managed, such as:

  • Economic factors (productivity, profit, growth, efficiency, quality, competition, resources, and capacity);

  • Human factors (job satisfaction, motivation, morale, conflict/cohesion, goal consensus, participation, training, span of control);

  • Management factors (skills, commitment, leadership, knowledge, ability, organizational culture and risks);

  • Technology factors (sophistication of system, technical expertise, development methodology, need of new technologies).

Training is provided to users of the process.

Process changes are effectively communicated to all affected parties.

Records of the change implementation are maintained.

GP 5.2.3 Evaluate the effectiveness of process change. [ACHIEVEMENT c]

Performance and capability of the changed process are measured and evaluated against process objectives and historical data.

A mechanism is available for documenting and reporting analysis results to management and owners of the standard and defined process(es).

Measures are analyzed to determine whether the process performance has improved with respect to common causes of variation.

Other feedback is recorded, such as opportunities for further innovation of the predictable process.
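As one possible way to operationalize this evaluation, the hypothetical helper below compares location and dispersion of a performance measure for the changed process against historical data. The metric (defect density) and the function name are illustrative assumptions; the model itself only requires that measured performance be evaluated against process objectives and historical data.

```python
import statistics

def change_effectiveness(before, after):
    """Compare mean and spread of a performance measure before and
    after a process change (illustrative sketch, not prescribed)."""
    return {
        "mean_before": statistics.mean(before),
        "mean_after": statistics.mean(after),
        "stdev_before": statistics.stdev(before),
        "stdev_after": statistics.stdev(after),
    }

# Illustrative data: defect density (defects/KLOC) per release
historical = [1.8, 2.1, 1.9, 2.4, 2.0]   # before the process change
changed = [1.2, 1.4, 1.1, 1.3, 1.2]      # after the process change

report = change_effectiveness(historical, changed)
improved = (report["mean_after"] < report["mean_before"]
            and report["stdev_after"] < report["stdev_before"])
print(improved)  # → True for this data
```

In practice such a comparison would be backed by an appropriate statistical test and judged against the defined process objectives, not raw means alone.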

Generic resources

Change management system [ACHIEVEMENT a, b, c]

Process evaluation system (impact analysis, etc.) [ACHIEVEMENT a, c]

Annex A Conformity of the process assessment and reference model

A.1 Introduction

The Automotive SPICE process assessment and process reference models meet the requirements for conformance defined in ISO/IEC 33004. The process assessment model can be used in the performance of assessments that meet the requirements of ISO/IEC 33002.

This clause serves as the statement of conformance of the process assessment and process reference models to the requirements defined in ISO/IEC 33004. [ISO/IEC 33004, 5.5 and 6.4]

For copyright reasons, each requirement is referred to only by its number. The full text of the requirements can be obtained from ISO/IEC 33004.

A.2 Conformance to the requirements for process reference models

Clause 5.3, “Requirements for process reference models”

The following information is provided in chapters 1 and 3 of this document:

  • the declaration of the domain of this process reference model;

  • the description of the relationship between this process reference model and its intended context of use; and

  • the description of the relationship between the processes defined within this process reference model.

The descriptions of the processes within the scope of this process reference model, meeting the requirements of ISO/IEC 33004 clause 5.4, are provided in chapter 4 of this document.

[ISO/IEC 33004, 5.3.1]

The relevant communities of interest, their mode of use, and the consensus achieved for this process reference model are documented in the copyright notice and the scope of this document.

[ISO/IEC 33004, 5.3.2]

The process descriptions are unique. Unique identification is provided by the name and the identifier of each process in this document.

[ISO/IEC 33004, 5.3.3]

Clause 5.4, “Process descriptions”

These requirements are met by the process descriptions in chapter 4 of this document.

[ISO/IEC 33004, 5.4]

A.3 Conformance to the requirements for process assessment models

Clause 6.1, “Introduction”

The purpose of this process assessment model is to support assessment of process capability within the automotive domain using the process measurement framework defined in ISO/IEC 33020.

[ISO/IEC 33004, 6.1]

Clause 6.2, “Process assessment model scope”

The process scope of this process assessment model is defined in the process reference model included in chapter 3.1 of this document. The Automotive SPICE process reference model satisfies the requirements of ISO/IEC 33004, clause 5, as described in Annex A.2.

The process capability scope of this process assessment model is defined in the process measurement framework specified in ISO/IEC 33020, which defines a process measurement framework for process capability satisfying the requirements of ISO/IEC 33003.

[ISO/IEC 33004, 6.2]

Clause 6.3, “Requirements for process assessment models”

The Automotive SPICE process assessment model is related to process capability.

[ISO/IEC 33004, 6.3.1]

This process assessment model incorporates the process measurement framework specified in ISO/IEC 33020, which satisfies the requirements of ISO/IEC 33003.

[ISO/IEC 33004, 6.3.2]

This process assessment model is based on the Automotive SPICE Reference Model included in this document.

This process assessment model is based on the Measurement Framework defined in ISO/IEC 33020.

[ISO/IEC 33004, 6.3.3]

The processes included in this process assessment model are identical to those specified in the Process Reference Model.

[ISO/IEC 33004, 6.3.4]

For all processes in this process assessment model, all levels defined in the process measurement framework of ISO/IEC 33020 are addressed.

[ISO/IEC 33004, 6.3.5]

This process assessment model defines

  • the selected process quality characteristic;

  • the selected process measurement framework;

  • the selected process reference model(s);

  • the selected processes from the process reference model(s)

in chapter 3 of this document.

[ISO/IEC 33004, 6.3.5 a-d]

In the capability dimension, this process assessment model addresses all of the process attributes and capability levels defined in the process measurement framework in ISO/IEC 33020. [ISO/IEC 33004, 6.3.5 e]

Clause 6.3.1, “Assessment indicators”

NOTE: Due to an error in numbering in the published version of ISO/IEC 33004 the following reference numbers are redundant to those stated above. To refer to the correct clauses from ISO/IEC 33004, the text of clause heading is additionally specified for the following three requirements.

The Automotive SPICE process assessment model provides a two-dimensional view of process capability for the processes in the process reference model, through the inclusion of assessment indicators as defined in chapter 3.3. The assessment indicators used are:

  • Base practices and output work products

[ISO/IEC 33004, 6.3.1 a, “Assessment indicators”]

  • Generic practices and Generic resources

[ISO/IEC 33004, 6.3.1 b, “Assessment indicators”]

Clause 6.3.2, “Mapping process assessment models to process reference models”

The mapping of the assessment indicators to the purpose and process outcomes of the processes in the process reference model is included in each description of the base practices in chapter 4.

The mapping of the assessment indicators to the process attributes in the process measurement framework including all of the process attribute achievements is included in each description of the generic practices in chapter 5.

Each mapping is indicated by a reference in square brackets.

[ISO/IEC 33004, 6.3.2, “Mapping process assessment models”]

Clause 6.3.3, “Expression of assessment results”

The process attributes and the process attribute ratings in this process assessment model are identical to those defined in the measurement framework. As a consequence, results of assessments based upon this process assessment model are expressed directly as a set of process attribute ratings for each process within the scope of the assessment. No form of translation or conversion is required.

[ISO/IEC 33004, 6.3.3, “Expression of assessment results”]

Annex B Work product characteristics

Work product characteristics listed in this Annex can be used when reviewing potential outputs of process implementation. The characteristics are provided as guidance for the attributes to look for, in a particular sample work product, to provide objective evidence supporting the assessment of a particular process.

A documented process and assessor judgment is needed to ensure that the process context (application domain, business purpose, development methodology, size of the organization, etc.) is considered when using this information.

Work products are defined using the schema in table B.1. Work products and their characteristics should be considered as a starting point for considering whether, given the context, they are contributing to the intended purpose of the process, not as a check-list of what every organization must have.

Table B.1 — Structure of WPC tables

Work product identifier

An identifier number for the work product which is used to reference the work product.

Work product name

Provides an example of a typical name associated with the work product characteristics. This name is provided as an identifier of the type of work product the practice or process might produce. Organizations may call these work products by different names. The name of the work product in the organization is not significant. Similarly, organizations may have several equivalent work products which contain the characteristics defined in one work product type. The formats for the work products can vary. It is up to the assessor and the organizational unit coordinator to map the actual work products produced in their organization to the examples given here.

Work product characteristics

Provides examples of the potential characteristics associated with the work product types.

The assessor may look for these in the samples provided by the organizational unit.

Work products (with the ID NN-00) are sets of characteristics that would be expected to be evident in work products of generic types as a result of achievement of an attribute. The generic work products form the basis for the classification of specific work products defined as process performance indicators.

Specific work product types are typically created by process owners and applied by process deployers in order to satisfy an outcome of a particular process purpose.

NOTE: The generic work products denoted with `*` are not used in the Automotive SPICE process assessment model but are included for completeness.

Table B.2 — Work product characteristics

WP ID

WP Name

WP Characteristics

01-00

Configuration item

• Item which is maintained under configuration control: - may include components, subsystems, libraries, test cases, compilers, data, documentation, physical media, and external interfaces • Version identification is maintained • Description of the item is available including the: - type of item - associated configuration management library, file, system - responsible owner - date when placed under configuration control - status information (i.e., development, baselined, released) - relationship to lower level configured items - identification of the change control records - identification of change history

01-03

Software item

• Integrated software consisting of: - source code - software elements - executable code - configuration files • Documentation, which: - describes and identifies source code - describes and identifies software elements - describes and identifies configuration files - describes and identifies executable code - describes software life-cycle status - describes archive and release criteria - describes compilation of software units - describes building of software item

01-50

Integrated software

• An aggregate of software items • A set of executables for a specific ECU configuration and possibly associated documentation and data

01-51

Application parameter

• Name • Description • Value domain, threshold values, characteristic curves • Owner • Means of data application (e.g. flashing interfaces) • If necessary a grouping/a categorization: - name of the category/group/file name - description • Actual value or characteristic curve applied

02-00_old

Contract

• Defines what is to be purchased or delivered • Identifies time frame for delivery or contracted service dates • Identifies any statutory requirements • Identifies monetary considerations • Identifies any warranty information • Identifies any copyright and licensing information • Identifies any customer service requirements • Identifies service level requirements • References to any performance and quality expectations/constraints/monitoring • Standards and procedures to be used • Evidence of review and approval • As appropriate to the contract the following are considered: - references to any acceptance criteria - references to any special customer needs (i.e., confidentiality requirements, security, hardware, etc.) - references to any change management and problem resolution procedures - identification of any interfaces to independent agents and subcontractors - identification of customer's role in the development and maintenance process - identification of resources to be provided by the customer

02-01_old

Commitment / agreement

• Signed off by all parties involved in the commitment/agreement • Establishes what the commitment is for • Establishes the resources required to fulfill the commitment, such as: - time - people - budget - equipment - facilities

03-00

Data

• Result of applying a measure

03-03

Benchmarking data

• Results of measurement of current performance that allow comparison against historical or target values • Relates to key goals/process/product/market need criteria and information to be benchmarked

03-04

Customer satisfaction data

• Determines levels of customer satisfaction with products and services • Mechanism to collect data on customer satisfaction: - results of field performance data - results of customer satisfaction survey - interview notes - meeting minutes from customer meetings

03-06

Process performance data

• Data comparing process performance against expected levels • Defined input and output work products available • Meeting minutes • Change records • Task completion criteria met • Quality criteria met • Resource allocation and tracking

04-00

Design

• Describes the overall product/system structure • Identifies the required product/system elements • Identifies the relationship between the elements • Consideration is given to: - any required performance characteristics - any required interfaces - any required security characteristics

04-02

Domain architecture

• Identified domain model(s) tailored from • Identified asset specifications • Definition of boundaries and relationships with other domains (Domain Interface Specification) • Identification of domain vocabulary • Identification of the domain representation standard • Provides an overview of the functions, features, capabilities and concepts in the domains

04-03

Domain model

• Must provide a clear explanation and description, on usage and properties, for reuse purposes • Identification of the management and structures used in the model

04-04_old

Software architectural design

• Describes the overall software structure • Describes the operative system including task structure • Identifies inter-task/inter-process communication • Identifies the required software elements • Identifies own developed and supplied code • Identifies the relationship and dependency between software elements • Identifies where the data (such as application parameters or variables) are stored and which measures (e.g. checksums, redundancy) are taken to prevent data corruption • Describes how variants for different model series or configurations are derived • Describes the dynamic behavior of the software (Start-up, shutdown, software update, error handling and recovery, etc.) • Describes which data is persistent and under which conditions • Consideration is given to: - any required software performance characteristics - any required software interfaces - any required security characteristics - any database design requirements

04-05_old

Software detailed design

• Provides detailed design (could be represented as a prototype, flow chart, entity relationship diagram, pseudo code, etc.) • Provides format of input/output data • Provides specification of CPU, ROM, RAM, EEPROM and Flash needs • Describes the interrupts with their priorities • Describes the tasks with cycle time and priority • Establishes required data naming conventions • Defines the format of required data structures • Defines the data fields and purpose of each required data element • Provides the specifications of the program structure

04-06_old

System architectural design

• Provides an overview of all system design • Describes the interrelationship between system elements • Describes the relationship between the system elements and the software • Specifies the design for each required system element, consideration is given to aspects such as: - memory/capacity requirements - hardware interface requirements - user interface requirements - external system interface requirements - performance requirements - command structures - security/data protection characteristics - settings for system parameters (such as application parameters or global variables) - manual operations - reusable components • Mapping of requirements to system elements • Description of the operation modes of the system components (startup, shutdown, sleep mode, diagnosis mode, etc.) • Description of the dependencies among the system components regarding the operation modes • Description of the dynamic behavior of the system and the system components

05-00

Goals

• Identifies the objective to be achieved • Identifies who is expected to achieve the goal • Identifies any incremental supporting goals • Identifies any conditions/constraints • Identifies the timeframe for achievement • Are reasonable and achievable within the resources allocated • Are current, established for current project, organization • Are optimized to support known performance criteria and plans

06-00

User documentation

• Identifies: - external documents - internal documents - current site distribution and maintenance list maintained • Documentation kept synchronized with latest product release • Addresses technical issues

06-01

Customer manual

• Takes account of: - audience and task profiles - the environment in which the information will be used - convenience to users - the range of technical facilities, including resources and the product, available for developing and delivering on-screen documentation - information characteristics - cost of delivery and maintainability • Includes information needed for operation of the system, including but not limited to: - product and version information - instructions for handling the system - initial familiarization information - long examples - structured reference material, particularly for advanced features of the software - checklists - guides to use input devices

06-02

Handling and storage guide

• Defines the tasks to perform in handling and storing products including: - providing for master copies of code and documentation - disaster recovery - addressing appropriate critical safety and security issues • Provides a description of how to store the product including: - storage environment required - the protection media to use - packing materials required - what items need to be stored - assessments to be done on stored product • Provides retrieval instructions

06-04

Training material

• Updated and available for new releases • Coverage of system, application, operations, maintenance as appropriate to the application • Course listings and availability

07-00

Measure

• Available to those with a need to know • Understood by those expected to use them • Provides value to the organization/project • Non-disruptive to the work flow • Appropriate to the process, life cycle model, organization: - is accurate - source data is validated - results are validated to ensure accuracy • Has appropriate analysis and commentary to allow meaningful interpretation by users

07-01

Customer satisfaction survey

• Identification of customer and customer information • Date requested • Target date for responses • Identification of associated hardware/software/product configuration • Ability to record feedback

07-02

Field measure

• Measures attributes of the performance of the system's operation at field locations, such as: - field defects - performance against defined service level measures - system ability to meet defined customer requirements - support time required - user complaints (may be third party users) - customer requests for help - performance trends - problem reports - enhancements requested

07-03

Personnel performance measure

• Real time measures of personnel performance or expected service level • Identifies aspects such as: - capacity - throughput - operational performance - operational service - availability

07-04

Process measure

• Measures about the process' performance: - ability to produce sufficient work products - adherence to the process - time it takes to perform process - defects related to the process • Measures the impact of process change • Measures the efficiency of the process

07-05

Project measure

• Monitors key processes and critical tasks, provides status information to the project on: - project performance against established plan - resource utilization against established plan - time schedule against established plan - process quality against quality expectations and/or criteria - product quality against quality expectations and/or criteria - highlight product performance problems, trends • Measures the results of project activities: - tasks are performed on schedule - product's development is within the resource commitments allocated • References any goals established

07-06

Quality measure

• Measures quality attributes of the work products defined: - functionality - reliability - usability - efficiency - maintainability - portability • Measures quality attributes of the "end customer" product quality and reliability *NOTE: Refer ISO/IEC 25010 for detailed information on measurement of product quality.*

07-07_old

Risk measure

• Identifies the probability of risk occurring • Identifies the impact of risk occurring • Establishes measures for each risk defined • Measures the change in the risk state

07-08

Service level measure

• Real-time measures taken while a system is operational; measures the system's performance or expected service level • Identifies aspects such as: - capacity - throughput - operational performance - operational service - service outage time - up time - job run time

08-00

Plan

As appropriate to the application and purpose: • Identifies what objectives or goals there are to be satisfied • Establishes the options and approach for satisfying the objectives, or goals • Identification of the plan owner • Includes: - the objective and scope of what is to be accomplished - assumptions made - constraints - risks - tasks to be accomplished - schedules, milestones and target dates - critical dependencies - maintenance disposition for the plan • Method/approach to accomplish plan • Identifies: - task ownership, including tasks performed by other parties (e.g. supplier, customer) • quality criteria • required work products • Includes resources to accomplish plan objectives: - time - staff (key roles and authorities e.g. sponsor) - materials/equipment - budget • Includes contingency plan for non-completed tasks • Plan is approved

08-04

Configuration management plan

• Defines or references the procedures to control changes to configuration items • Defines measurements used to determine the status of the configuration management activities • Defines configuration management audit criteria • Approved by the configuration management function • Identifies configuration library tools or mechanism • Includes management records and status reports that show the status and history of controlled items • Specifies the location and access mechanisms for the configuration management library • Storage, handling and delivery (including archival and retrieval) mechanisms specified

08-12

Project plan

• Defines: - work products to be developed - life cycle model and methodology to be used - customer requirements related to project management - project resources • Milestones and target dates: - estimates - quality criteria - processes and methods to employ - contingency actions

08-13

Quality plan

• Objectives/goal for quality • Defines the activities and tasks required to ensure quality • References related work products • References any regulatory requirements, standards, customer requirements • Identifies the expected quality criteria • Specifies the monitoring and quality checkpoints for the defined life cycle and associated activities planned • Defines the methods of assuring quality • Identifies the quality criteria for work products and process tasks • Specifies the threshold/tolerance level allowed prior to requiring corrective actions • Defines quality measurements and timing of the collection • Specifies mechanism to feed collected quality record back into process impacted by poor quality • Defines the approach to guaranteeing objectivity • Approved by the quality responsible organization/function: - identifies escalation opportunities and channels - defines the cooperation with customer and supplier QA

08-14_old

Recovery plan

• Identifies what is to be recovered: - procedures/methods to perform the recovery - schedule for recovery - time required for the recovery - critical dependencies - resources required for the recovery - list of backups maintained - staff responsible for recovery and roles assigned - special materials required - required work products - required equipment - required documentation - locations and storage of backups - contact information on who to notify about the recovery - verification procedures - cost estimation for recovery

08-16

Release plan

• Identifies the functionality to be included in each release • Identifies the associated elements required (i.e., hardware, software, documentation etc.) • Mapping of the customer requests, requirements satisfied to particular releases of the product

08-17

Reuse plan

• Defines the policy about what items to be reused • Defines standards for construction of reusable objects: - defines the attributes of reusable components - quality/reliability expectations - standard naming conventions • Defines the reuse repository (library, CASE tool, file, data base, etc.) • Identifies reusable components: - directory of component - description of components - applicability of their use - method to retrieve and use them - restrictions for modifications and usage • Method for using reusable components • Establishes goal for reusable components

08-18

Review plan

• Defines: - what to be reviewed - roles and responsibilities of reviewers - criteria for review (check-lists, requirements, standards) - expected preparation time - schedule for reviews • Identification of: - procedures for conducting review - review inputs and outputs - expertise expected at each review - review records to keep - review measurements to keep - resources, tools allocated to the review

08-19_old

Risk management plan

• Project risks identified and prioritized • Mechanism to track the risk • Threshold criteria to identify when corrective action required • Proposed ways to mitigate risks: - risk mitigator - work around - corrective actions activities/tasks - monitoring criteria - mechanisms to measure risk

08-20_old

Risk mitigation plan

• Planned risk treatment activities and tasks: - describes the specifics of the risk treatment selected for a risk or combination of risks found to be unacceptable - describes any difficulties that may be found in implementing the treatment • Treatment schedule • Treatment resources and their allocation • Responsibilities and authority: - describes who is responsible for ensuring that the treatment is being implemented and their authority • Treatment control measures: - defines the measures that will be used to evaluate the effectiveness of the risk treatment • Treatment cost • Interfaces among parties involved: - describes any coordination among stakeholders or with the project’s master plan that must occur for the treatment to be properly implemented • Environment/infrastructure: - describes any environmental or infrastructure requirements or impacts (e.g., safety or security impacts that the treatment may have) • Risk treatment plan change procedures and history

08-26

Documentation plan

• Identifies documents to be produced • Defines the documentation activities during the life cycle of the software product or service • Identifies any applicable standards and templates • Defines requirements for documents • Review and authorization practices • Distribution of the documents • Maintenance and disposal of the documents

08-27

Problem management plan

• Defines problem resolution activities including identification, recording, description and classification • Problem resolution approach: evaluation and correction of the problem • Defines problem tracking • Mechanism to collect and distribute problem resolutions

08-28

Change management plan

• Defines change management activities including identification, recording, description, analysis and implementation • Defines approach to track status of change requests • Defines verification and validation activities • Change approval and implication review

08-29

Improvement plan

• Improvement objectives derived from organizational business goals • Organizational scope • Process scope, the processes to be improved • Key roles and responsibilities • Appropriate milestones, review points and reporting mechanisms • Activities to be performed to keep all those affected by the improvement program informed of progress

08-50_old

Test specification

• Test Design Specification
• Test Case Specification
• Test Procedure Specification
• Identification of test cases for regression testing
• Additionally, for system integration:
  - identification of required system elements (hardware elements, wiring elements, settings for parameters (such as application parameters or global variables), data bases, etc.)
  - necessary sequence or ordering identified for integrating the system elements

08-51

Technology monitoring plan

*No requirements additional to Plan (Generic)*

08-52_old

Test plan

• Test Plan according to ISO/IEC/IEEE 29119-3
• Context:
  - project/test sub-process
  - test item(s)
  - test scope
  - assumptions and constraints
  - stakeholders
  - testing communication
• Test strategy:
  - identifies what needs there are to be satisfied
  - establishes the options and approach for satisfying the needs (black-box and/or white-box testing, boundary class test determination, regression testing strategy, etc.)
  - establishes the evaluation criteria against which the strategic options are evaluated
  - identifies any constraints/risks and how these will be addressed
  - test design techniques
  - test completion criteria
  - test ending criteria
  - test start, abort and re-start criteria
  - metrics to be collected
  - test data requirements
  - retesting and regression testing
  - suspension and resumption criteria
  - deviations from the Organizational Test Strategy
• Test data requirements
• Test environment requirements
• Test sub-processes
• Test deliverables
• Testing activities and estimates

09-00

Policy

• Authorized • Available to all personnel impacted by the policy • Establishes practices/rules to be adhered to

09-03

Reuse policy

• Identification of reuse requirements • Establishes the rules of reuse • Documents the reuse adoption strategy including goals and objectives • Identification of the reuse program • Identification of the name of the reuse sponsor • Identification of the reuse program participants • Identification of the reuse steering function • Identification of reuse program support functions

10-00

Process description

• A detailed description of the process/procedure, which includes:
  - tailoring of the standard process (if applicable)
  - purpose of the process
  - outcomes of the process
  - tasks and activities to be performed and ordering of tasks
  - critical dependencies between task activities
  - expected time required to execute tasks
  - input/output work products
  - links between input and output work products
• Identifies process entry and exit criteria
• Identifies internal and external interfaces to the process
• Identifies process measures
• Identifies quality expectations
• Identifies functional roles and responsibilities
• Approved by authorized personnel

11-00

Product

• Is a result/deliverable of the execution of a process, includes services, systems (software and hardware) and processed materials • Has elements that satisfy one or more aspects of a process purpose • May be represented on various media (tangible and intangible)

11-03

Product release information

• Coverage for key elements (as appropriate to the application)
• Description of what is new or changed (including features removed)
• System information and requirements
• Identification of conversion programs and instructions
• Release numbering implementation may include:
  - the major release number
  - the feature release number
  - the defect repair number
  - the alpha or beta release, and the iteration within the alpha or beta release
• Identification of the component list (version identification included):
  - hardware/software/product elements, libraries, etc.
  - associated documentation list
• New parameter information (e.g. for application parameters or global variables) and/or commands
• Backup and recovery information
• List of open known problems, faults, warning information, etc.
• Identification of verification and diagnostic procedures
• Technical support information
• Copyright and license information
• The release note may include an introduction, the environmental requirements, installation procedures, product invocation, new feature identification and a list of defect resolutions, known defects and workarounds
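The three-part release numbering scheme above (major release, feature release, defect repair number) can be illustrated with a minimal sketch. The `ReleaseNumber` class and its method names are illustrative assumptions, not part of Automotive SPICE or any mandated tooling:

```python
from dataclasses import dataclass


@dataclass(frozen=True, order=True)
class ReleaseNumber:
    """Hypothetical release identifier: major.feature.defect,
    e.g. "3.1.0". Field order makes tuple comparison match
    release precedence."""
    major: int
    feature: int
    defect: int

    @classmethod
    def parse(cls, text: str) -> "ReleaseNumber":
        major, feature, defect = (int(part) for part in text.split("."))
        return cls(major, feature, defect)

    def __str__(self) -> str:
        return f"{self.major}.{self.feature}.{self.defect}"
```

With this sketch, `ReleaseNumber.parse("3.2.0") > ReleaseNumber.parse("3.1.4")` holds, so ordering release identifiers reduces to ordinary comparison. Alpha/beta designations and iteration counters, which the characteristic also allows, are omitted here for brevity.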

11-04

Product release package

• Includes the hardware/software/product
• Includes associated release elements such as:
  - system hardware/software/product elements
  - associated customer documentation
  - application parameter definitions
  - command language definition
  - installation instructions
  - release letter

11-05_old

Software unit

• Follows established coding standards (as appropriate to the language and application):
  - commented
  - structured or optimized
  - meaningful naming conventions
  - parameter information identified
  - error codes defined
  - error messages descriptive and meaningful
  - formatting (indentation, levels)
• Follows data definition standards (as appropriate to the language and application):
  - variables defined
  - data types defined
  - classes and inheritance structures defined
  - objects defined
• Entity relationships defined
• Database layouts are defined
• File structures and blocking are defined
• Data structures are defined
• Algorithms are defined
• Functional interfaces defined

11-06

System

• All elements of the product release are included
• Any required hardware
• Integrated product
• Customer documentation
• Fully configured set of the system elements:
  - application parameters defined
  - commands defined
  - data loaded or converted

11-07

Temporary solution

• Problem identification
• Release and system information
• Temporary solution, target date for actual fix identified
• Description of the solution:
  - limitations, restriction on usage
  - additional operational requirements
  - special procedures
  - applicable releases
• Backup/recovery information
• Verification procedures
• Temporary installation instructions

12-00

Proposal

• Defines the proposed solution
• Defines the proposed schedule
• Identifies the coverage identification of initial proposal:
  - identifies the requirements that would be satisfied
  - identifies the requirements that could not be satisfied, and provides a justification of variants
• Defines the estimated price of proposed development, product, or service

12-01_old

Request for proposal

• Reference to the requirements specifications
• Identifies supplier selection criteria
• Identifies desired characteristics, such as:
  - system architecture, configuration requirements or the requirements for service (consultants, maintenance, etc.)
  - quality criteria or requirements
  - project schedule requirements
  - expected delivery/service dates
  - cost/price expectations
  - regulatory standards/requirements
• Identifies submission constraints:
  - date for resubmission of the response
  - requirements with regard to the format of response

12-03

Reuse proposal

• Identifies the project name • Identifies the project contact • Identifies the reuse goals and objectives • Identifies the list of reuse assets • Identifies the issues/risks of reusing the component including specific requirements (hardware, software, resource and other reuse components) • Identifies the person who will be approving the reuse proposal

12-04

Supplier proposal response

• Defines the supplier's proposed solution
• Defines the supplier's proposed delivery schedule
• Identifies the coverage identification of initial proposal:
  - identifies the requirements that would be satisfied
  - identifies the requirements that could not be satisfied, and provides a justification of variants
• Defines the estimated price of proposed development, product, or service

13-00

Record

• A work product stating results achieved or providing evidence of activities performed in a process
• An item that is part of a set of identifiable and retrievable data

13-01_old

Acceptance record

• Record of the receipt of the delivery • Identification of the date received • Identification of the delivered components • Records the verification of any customer acceptance criteria defined • Signed by receiving customer

13-04_old

Communication record

• All forms of interpersonal communication, including:
  - letters
  - faxes
  - e-mails
  - voice recordings
  - podcasts
  - blogs
  - videos
  - forums
  - live chats
  - wikis
  - photo protocols
  - meeting support records

13-05

Contract review record

• Scope of contract and requirements • Possible contingencies or risks • Alignment of the contract with the strategic business plan of the organization • Protection of proprietary information • Requirements which differ from those in the original documentation • Capability to meet contractual requirements • Responsibility for subcontracted work • Terminology • Customer ability to meet contractual obligations.

13-06

Delivery record

• Record of items shipped/delivered electronically to customer
• Identification of:
  - who it was sent to
  - address where delivered
  - the date delivered
• Record receipt of delivered product

13-07

Problem record

• Identifies the name of the submitter and associated contact details
• Identifies the group/person(s) responsible for providing a fix
• Includes a description of the problem
• Identifies classification of the problem (criticality, urgency, relevance, etc.)
• Identifies the status of the reported problem
• Identifies the target release(s) in which the problem will be fixed
• Identifies the expected closure date
• Identifies any closure criteria
• Identifies re-review actions

13-08

Baseline

• Identifies a state of one or a set of work products and artifacts which are consistent and complete
• Basis for next process steps
• Is unique and may not be changed

*NOTE: This should be established before a release to identify a consistent and complete delivery*
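The "unique and may not be changed" property can be sketched as a frozen snapshot of work product versions with a content-derived identifier. The `Baseline` class below is an illustrative assumption for this document, not a structure mandated by the model:

```python
import hashlib
from types import MappingProxyType


class Baseline:
    """Illustrative sketch: a baseline captures a consistent set of
    work products (name -> version) at a point in time. The captured
    set is read-only, and the identifier is derived from the content,
    so the same set always yields the same baseline ID."""

    def __init__(self, items: dict) -> None:
        # Read-only view: attempts to modify it raise TypeError.
        self._items = MappingProxyType(dict(items))
        canonical = "".join(
            f"{name}={version};" for name, version in sorted(self._items.items())
        )
        # Content-derived, hence unique per captured state.
        self.baseline_id = hashlib.sha256(canonical.encode()).hexdigest()[:12]

    @property
    def items(self):
        return self._items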

13-09

Meeting support record

• Agenda and minutes that are records that define:
  - purpose of meeting
  - attendees
  - date, place held
  - reference to previous minutes
  - what was accomplished
  - identifies issues raised
  - any open issues
  - next meeting, if any

13-10

Configuration management record

• Status of the work products/items and modifications • Identifies items under configuration control • Identifies activities performed e.g. backup, storage, archiving, handling and delivery of configured items • Supports consistency of the product

13-13

Product release approval record

• Content information of what is to be shipped or delivered
• Identification of:
  - for whom it is intended
  - the address where to deliver
  - the date released
• Record of supplier approval

13-14_old

Progress status record

• Record of the status of a plan(s) (actual against planned), such as:
  - status of actual tasks against planned tasks
  - status of actual results against established objectives/goals
  - status of actual resource allocation against planned resources
  - status of actual cost against budget estimates
  - status of actual time against planned schedule
  - status of actual quality against planned quality
• Record of any deviations from planned activities and reason why

13-15

Proposal review record

• Scope of proposal and requirements • Possible contingencies or risks • Alignment of the proposal with the strategic business plan of the organization • Protection of proprietary information • Requirements which differ from those in the original documentation • Capability to meet contractual requirements • Responsibility for subcontracted work • Terminology • Supplier ability to meet obligations • Approved

13-16_old

Change request

• Identifies purpose of change • Identifies request status (e.g., open, allocated, implemented, closed) • Identifies requester contact information • Impacted system(s) • Impact to operations of existing system(s) defined • Impact to associated documentation defined • Criticality of the request, due date

13-17

Customer request

• Identifies request purpose, such as:
  - new development
  - enhancement
  - internal customer
  - operations
  - documentation
  - informational
• Identifies request status information, such as:
  - date opened
  - current status
  - date assigned and responsible owner
  - date verified
  - date closed
• Identifies priority/severity of the request
• Identifies customer information, such as:
  - company/person initiating the request
  - contact information and details
  - system site configuration information
  - impacted system(s)
  - impact to operations of existing systems
  - criticality of the request
  - expected customer response/closure requirements
• Identifies needed requirements/standards
• Identifies information sent with request (i.e., RFPs, dumps, etc.)

13-18

Quality record

• Identifies what information to keep
• Identifies what tasks/activities/processes produce the information
• Identifies when the data was collected
• Identifies source of any associated data
• Identifies the associated quality criteria
• Identifies any associated measurements using the information
• Identifies any requirements to be adhered to in creating the record, or satisfied by the record

13-19_old

Review record

• Provides the context information about the review:
  - what was reviewed
  - lists reviewers who attended
  - status of the review
• Provides information about the coverage of the review:
  - check-lists
  - review criteria
  - requirements
  - compliance to standards
• Records information about:
  - the readiness for the review
  - preparation time spent for the review
  - time spent in the review
  - reviewers, roles and expertise
• Review findings:
  - non-conformances
  - improvement suggestions
• Identifies the required corrective actions:
  - risk identification
  - prioritized list of deviations and problems discovered
  - the actions, tasks to be performed to fix the problem
  - ownership for corrective action
  - status and target closure dates for identified problems

13-20_old

Risk action request

• Date of initiation
• Scope
• Subject
• Request originator
• Risk management process context:
  - this section may be provided once, and then referenced in subsequent action requests if no changes have occurred
  - process scope
  - stakeholder perspective
  - risk categories
  - risk thresholds
  - project objectives
  - project assumptions
  - project constraints
• Risks:
  - this section may cover one risk or many, as the user chooses
  - where all the information above applies to the whole set of risks, one action request may suffice
  - where the information varies, each request may cover the risk or risks that share common information
  - risk description(s)
  - risk probability
  - risk consequences
  - expected timing of risk
• Risk treatment alternatives:
  - alternative descriptions
  - recommended alternative(s)
  - justifications
• Risk action request disposition:
  - each request should be annotated as to whether it is accepted, rejected, or modified, and the rationale provided for whichever decision is taken

13-21

Change control record

• Used as a mechanism to control change to baselined products/products in official project release libraries
• Record of the change requested and made to a baselined product (work products, software, customer documentation, etc.):
  - identification of system, documents impacted with change
  - identification of change requester
  - identification of party responsible for the change
  - identification of status of the change
• Linkage to associated customer requests, internal change requests, etc.
• Appropriate approvals
• Duplicate requests are identified and grouped

13-22_old

Traceability record

• All requirements (customer and internal) are to be traced
• Identifies a mapping of requirements to life cycle work products
• Provides the linkage of requirements to work product decomposition (i.e., requirement → design → code → test → deliverables, etc.)
• Provides forward and backward mapping of requirements to associated work products throughout all phases of the life cycle

*NOTE: this may be included as a function of another defined work product (example: a CASE tool for design decomposition may have a mapping ability as part of its features)*
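The forward and backward mapping described above amounts to a bidirectional index between requirements and downstream work products. A minimal sketch, with illustrative names throughout (the class, methods, and the example identifiers are not taken from the model):

```python
from collections import defaultdict


class TraceabilityRecord:
    """Illustrative sketch of bidirectional traceability:
    requirement -> work products (forward) and
    work product -> requirements (backward)."""

    def __init__(self) -> None:
        self._forward = defaultdict(set)
        self._backward = defaultdict(set)

    def link(self, requirement: str, work_product: str) -> None:
        """Record one trace link; both directions stay consistent."""
        self._forward[requirement].add(work_product)
        self._backward[work_product].add(requirement)

    def trace_forward(self, requirement: str) -> set:
        return set(self._forward.get(requirement, set()))

    def trace_backward(self, work_product: str) -> set:
        return set(self._backward.get(work_product, set()))

    def untraced(self, requirements) -> set:
        """Requirements with no downstream work product at all."""
        return {r for r in requirements if not self._forward.get(r)}
```

The `untraced` helper reflects the first characteristic ("all requirements are to be traced"): a non-empty result indicates a coverage gap.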

13-24_old

Validation results

• Validation check-list • Passed items of validation • Failed items of validation • Pending items of validation • Problems identified during validation • Risk analysis • Recommendation of actions • Conclusions of validation • Signature of validation

13-25_old

Verification results

• Verification check-list • Passed items of verification • Failed items of verification • Pending items of verification • Problems identified during verification • Risk analysis • Recommendation of actions • Conclusions of verification • Signature of verification

13-50_old

Test results

• Level Test Log
• Anomaly Report
• Level Test Report (Summary):
  - test cases not passed
  - test cases not executed
  - information about the test execution (date, tester name, etc.)
Additionally, where necessary:
• Level Interim Test Status Report
• Master Test Report (Summary)

14-00

Register

• A register is a compilation of data or information captured in a defined sequence to enable:
  - an overall view of evidence of activities that have taken place
  - monitoring and analysis
  - evidence of performance of a process over time

14-01

Change history

• Historical records of all changes made to an object (document, file, software component, etc.):
  - description of change
  - version information about changed object
  - date of change
  - change requester information
  - change control record information

14-02_old

Corrective action register

• Identifies the initial problem • Identifies the ownership for completion of defined action • Defines a solution (series of actions to fix problem) • Identifies the open date and target closure date • Contains a status indicator • Indicates follow up audit actions

14-05_old

Preferred suppliers register

• Subcontractor or supplier history • List of potential subcontractor/suppliers • Qualification information • Identification of their qualifications • Past history information when it exists

14-06

Schedule

• Identifies the tasks to be performed • Identifies the expected and actual start and completion date for required tasks against progress/completion of tasks • Allows for the identification of critical tasks and task dependencies • Identifies task completion status, vs. planned date • Has a mapping to scheduled resource data *NOTE: A schedule is consistent with the work breakdown structure, see 14-09*

14-08_old

Tracking system

• Ability to record customer and process owner information
• Ability to record related system configuration information
• Ability to record information about problem or action needed:
  - date opened and target closure date
  - severity/criticality of item
  - status of any problem or actions needed
  - information about the problem or action owner
  - priority of problem resolution
• Ability to record proposed resolution or action plan
• Ability to provide management status information
• Information is available to all with a need to know
• Integrated change control system(s)/records

14-09

Work breakdown structure

• Defines tasks to be performed, and their amendments • Documents ownership for tasks • Documents critical dependencies between tasks • Documents inputs and output work products • Documents the critical dependencies between defined work products *NOTE: A work breakdown structure may be integrated into/part of the schedule, see 14-06*
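The critical dependencies a work breakdown structure documents determine which task orderings are feasible. A small sketch using Python's standard `graphlib` module; the task names and the dependency mapping are invented for illustration only:

```python
from graphlib import TopologicalSorter

# Illustrative WBS fragment: task -> set of tasks it depends on.
wbs = {
    "requirements": set(),
    "design": {"requirements"},
    "code": {"design"},
    "test": {"code", "design"},
}

# One valid execution order that respects every critical dependency;
# a cycle in the dependencies would raise graphlib.CycleError here.
order = list(TopologicalSorter(wbs).static_order())
```

Deriving an order this way also detects inconsistent dependency records: if two tasks each claim to depend on the other, no schedule exists and the sort fails, flagging the WBS for correction.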

14-11

Work product list

• Identifies: - name of work product - work product reference ID - work product revision - when updated - work product status - when approved - reference to approval source - file reference

14-50

Stakeholder groups list

• Identifies: - relevant stakeholder groups - weight/importance of each stakeholder group - representative(s) for each stakeholder group - information needs of each stakeholder group

15-00

Report

• A work product describing a situation that: - includes results and status - identifies applicable/associated information - identifies considerations/constraints - provides evidence/verification

15-01_old

Analysis report

• What was analyzed?
• Who did the analysis?
• The analysis criteria used:
  - selection criteria or prioritization scheme used
  - decision criteria
  - quality criteria
• Records the results:
  - what was decided/selected
  - reason for the selection
  - assumptions made
  - potential risks
• Aspects of correctness to analyze include:
  - completeness
  - understandability
  - testability
  - verifiability
  - feasibility
  - validity
  - consistency
  - adequacy of content

15-03

Configuration status report

• Identification of the number of items under configuration management • Identification of risks associated to configuration management • Identification of the number of configuration management items lost and reason for their loss • Identification of problem and issues related to configuration management • Identification of receiving parties • Identification of baselines made

15-05

Evaluation report

• States the purpose of evaluation
• Method used for evaluation
• Requirements used for the evaluation
• Assumptions and limitations
• Identifies the context and scope information required:
  - date of evaluation
  - parties involved
  - context details
  - evaluation instrument (check-list, tool) used
• Records the result:
  - data
  - identifies the required corrective and preventive actions
  - improvement opportunities, as appropriate

15-06

Project status report

• A report of the current status of the project
• Schedule:
  - planned progress (established objectives/goals) or completion (dates/deadlines) of tasks against actual progress of tasks
  - reasons for variance from planned progress
  - threats to continued progress
  - contingency plans to maintain progress
• Resources (human resources, infrastructure, hardware/materials, budget):
  - planned expenditure against actual expenditure
  - reasons for variance between planned and actual expenditure
  - expected future expenditure
  - contingency plans to achieve budget goals
• Quality goals:
  - actual quality measures
  - reasons for variance from planned quality measures
  - contingency plans to achieve quality goals
• Project issues:
  - issues which may affect the ability of the project to achieve its goals
  - contingency plans to overcome threats to project goals

15-07

Reuse evaluation report

• Identification of reuse opportunities • Identification of investment in Reuse • Identification of current skills and experience • Identification of reuse infrastructure • The evaluation report must represent current status in implementation of the reuse program

15-08_old

Risk analysis report

• Identifies the risks analyzed • Records the results of the analysis: - potential ways to mitigate the risk - assumptions made - constraints

15-09_old

Risk status report

• Identifies the status of an identified risk:
  - related project or activity
  - risk statement
  - condition
  - consequence
  - changes in priority
  - duration of mitigation, when started
  - risk mitigation activities in progress
  - responsibility
  - constraints

15-12

Problem status report

• Presents a summary of problem records: - by problem categories/classification • Status of problem solving: - development of solved vs. open problems

15-13

Assessment/audit report

• States the purpose of assessment
• Method used for assessment
• Requirements used for the assessment
• Assumptions and limitations
• Identifies the context and scope information required:
  - date of assessment
  - organizational unit assessed
  - sponsor information
  - assessment team
  - attendees
  - scope/coverage
  - assessees’ information
  - assessment instrument (check-list, tool) used
• Records the result:
  - data
  - identifies the required corrective actions
  - improvement opportunities

15-16

Improvement opportunity

• Identifies what the problem is
• Identifies what the cause of a problem is
• Suggests what could be done to fix the problem
• Identifies the value (expected benefit) in performing the improvement
• Identifies the penalty for not making the improvement

15-18

Process performance report

*No requirements additional to Evaluation report (Generic)*

15-21_old

Supplier evaluation report

*No requirements additional to Evaluation report (Generic)*

16-00

Repository

• Repository for components • Storage and retrieval capabilities • Ability to browse content • Listing of contents with description of attributes • Sharing and transfer of components between affected groups • Recovery of archive versions of components • Ability to report component status • Changes to components are tracked to change/user requests

16-03

Configuration management system

• Supports the configuration management strategy • Correct configuration of products • Can recreate any release or test configuration • Ability to report configuration status • Has to cover all relevant tools

16-06

Process repository

• Contains process descriptions • Supports multiple presentations of process assets

17-00

Requirement specification

• Each requirement is identified • Each requirement is unique • Each requirement is verifiable or can be assessed (see 17-50) • Includes statutory and regulatory requirements • Includes issues/requirements from (contract) reviews

17-02

Build list

• Identification of aggregates of the software application system • Identification of required system elements (application parameter settings, macro libraries, data bases, job control languages, etc.) • Necessary sequence ordering identified for compiling the software release • Input and output source libraries identified

17-03

Stakeholder requirements

• Purpose/objectives defined
• Includes issues/requirements from (contract) reviews
• Identifies any:
  - time schedule/constraints
  - required feature and functional characteristics
  - necessary performance considerations/constraints
  - necessary internal/external interface considerations/constraints
  - required system characteristics/constraints
  - human engineering considerations/constraints
  - security considerations/constraints
  - environmental considerations/constraints
  - operational considerations/constraints
  - maintenance considerations/constraints
  - installation considerations/constraints
  - support considerations/constraints
  - design constraints
  - safety/reliability considerations/constraints
  - quality requirements/expectations

17-05

Documentation requirements

• Purpose/objectives defined
• Proposed contents (coverage) defined
• Intended audience defined
• Identification of supported hardware/software/product release, system information
• Identification of associated hardware/software/product requirements and designs satisfied by document
• Identification of style, format, and media standards expected
• Definition of the intended distribution requirements
• Includes storage requirements

17-08

Interface requirements specification

• Defines relationships between two products, processes or process tasks
• Defines criteria and format for what is common to both
• Defines critical timing dependencies or sequence ordering
• Description of the physical interfaces of each system component, such as:
  - bus interfaces (CAN, MOST, LIN, FlexRay, etc.)
  - transceiver (type, manufacturer, etc.)
  - analogue interfaces
  - digital interfaces (PWM, I/O)
  - additional interfaces (IEEE, ISO, Bluetooth, USB, etc.)
• Identification of the software interfaces of software components and other software items in terms of:
  - inter-process communication mechanisms
  - bus communication mechanisms

17-11_old

Software requirements specification

• Identifies standards to be used
• Identifies any software structure considerations/constraints
• Identifies the required software elements
• Identifies the relationship between software elements
• Consideration is given to:
  - any required software performance characteristics
  - any required software interfaces
  - any required security characteristics
  - any database design requirements
  - any required error handling and recovery attributes
  - any required resource consumption characteristics

17-12_old

System requirements specification

• System requirements include: functions and capabilities of the system; business, organizational and user requirements; safety, security, human-factors engineering (ergonomics), interface, operations, and maintenance requirements; design constraints and qualification requirements
• Identifies the required system overview
• Identifies any interrelationship considerations/constraints between system elements
• Identifies any relationship considerations/constraints between the system elements and the software
• Identifies any design considerations/constraints for each required system element, including:
  - memory/capacity requirements
  - hardware interface requirements
  - user interface requirements
  - external system interface requirements
  - performance requirements
  - command structures
  - security/data protection characteristics
  - application parameter settings
  - manual operations
  - reusable components
• Describes the operation capabilities
• Describes environmental capabilities
• Documentation requirements
• Reliability requirements
• Logistical requirements
• Describes security requirements
• Diagnosis requirements

17-50

Verification criteria

• Each requirement is verifiable or can be assessed • Verification criteria define the qualitative and quantitative criteria for verification of a requirement. • Verification criteria demonstrate that a requirement can be verified within agreed constraints. (Additional Requirement to 17-00 Requirements specification)

18-00

Standard

• Identification of to whom/what they apply • Expectations for conformance are identified • Conformance to requirements can be demonstrated • Provisions for tailoring or exception to the requirements are included

18-01

Acceptance criteria

• Defines expectations for acceptance like e.g.: - interfaces - schedules - messages - documents - meetings - joint reviews

18-06

Product release criteria

• Defines expectations for product release: - release type and status - required elements of the release - product completeness including documentation - adequacy and coverage of testing - limit for open defects - change control status

18-07

Quality criteria

• Defines expectations for quality:
  - establishes what is an adequate work product (required elements, completeness expected, accuracy, etc.)
  - identifies what constitutes the completeness of the defined tasks
  - establishes life cycle transition criteria and the entry and exit requirements for each process and/or activity defined
  - establishes expected performance attributes
  - establishes product reliability attributes

18-50_old

Supplier qualification criteria

• Expectations for conformance, to be fulfilled by competent suppliers, are identified
• Links from the expectations to national/international/domain-specific standards/laws/regulations are described
• Conformance to requirements can be demonstrated by the potential suppliers or assessed by the acquiring organization
• Provisions for tailoring or exceptions to the requirements are included

19-00

Strategy

• Identifies what needs and objectives or goals there are to be satisfied • Establishes the options and approach for satisfying the needs, objectives, or goals • Establishes the evaluation criteria against which the strategic options are evaluated • Identifies any constraints/risks and how these will be addressed

19-05

Reuse strategy

• Identify the goals for reuse
• Identify the commitment for creating reusable components
• Determine which product lines and type of artifacts should be supported with reuse
• Identify system and hardware/software/product elements which can be reused within the organization
• Identify the reuse repository and tools

19-10_old

Verification strategy

• Verification methods, techniques, and tools
• Work products or processes under verification
• Degrees of independence for verification
• Schedule for performing the above activities
• Identifies what needs there are to be satisfied
• Establishes the options and approach for satisfying the needs
• Establishes the evaluation criteria against which the strategic options are evaluated
• Identifies any constraints/risks and how these will be addressed
• Verification ending criteria
• Verification start, abort and re-start criteria

19-11_old

Validation strategy

• Validation methods, techniques, and tools
• Work products under validation
• Degrees of independence for validation
• Schedule for performing the above activities
• Identifies what needs there are to be satisfied
• Establishes the options and approach for satisfying the needs
• Establishes the evaluation criteria against which the strategic options are evaluated
• Identifies any constraints/risks and how these will be addressed

20-00

Template

• Defines the attributes associated with a work product to be created as a consequence of a process execution
• Identifies technical elements typically associated with this product type
• Defines expected form and style

21-00

Work product

• Defines the attributes associated with an artifact from a process execution:
  - key elements to be represented in the work product

Annex C Terminology

Annex C lists the applicable terminology references from ISO/IEC/IEEE 24765 and ISO/IEC/IEEE 29119. It also provides terms which are specifically defined within Automotive SPICE. Some of these definitions are based on ISO/IEC/IEEE 24765.

Table C.1 — Terminology

Term

Origin

Description

Acceptance testing

ISO/IEC/IEEE 24765

Formal testing conducted to enable a user, customer, or authorized entity to determine whether to accept a system or component.

Application parameter

Automotive SPICE V3.1

An application parameter is a parameter containing data applied to the system or software functions, behavior or properties.

The notion of application parameter is expressed in two ways:

firstly, the logical specification (including name, description, unit, value domain or threshold values or characteristic curves, respectively), and,

secondly, the actual quantitative data value it receives by means of data application.

Architecture element

Automotive SPICE V3.1

Result of the decomposition of the architecture on system and software level:

  • The system is decomposed into elements of the system architecture across appropriate hierarchical levels.

  • The software is decomposed into elements of the software architecture across appropriate hierarchical levels down to the software components (the lowest level elements of the software architecture).

Baseline

ISO/IEC/IEEE 24765

A specification or product that has been formally reviewed and agreed upon, that thereafter serves as the basis for further development, and can be changed only through formal change control procedures.

Black-box testing

Automotive SPICE V3.1

Method of requirement testing where tests are developed without knowledge of the internal structure and mechanisms of the tested item.

Code review

Automotive SPICE V3.1

A check of the code by one or more qualified persons to determine its suitability for its intended use and identify discrepancies from specifications and standards.

Coding

ISO/IEC/IEEE 24765

The transforming of logic and data from design specifications (design descriptions) into programming language.

Consistency

Automotive SPICE V3.1

Consistency addresses content and semantics and ensures that work products are not in contradiction to each other. Consistency is supported by bidirectional traceability.

Defect

→ [FAULT]

Dynamic analysis

ISO/IEC/IEEE 24765

A process of evaluating a system or component based on its behavior during execution.

Element

Automotive SPICE V3.1

Elements are all structural objects on architectural and design level on the left side of the “V”. Such elements can be further decomposed into more fine-grained sub-elements of the architecture or design across appropriate hierarchical levels.

Error

ISO/IEC/IEEE 24765

The difference between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition.

Fault

ISO/IEC/IEEE 24765

A manifestation of an error in software.

Functional requirement

ISO/IEC/IEEE 24765

A statement that identifies what a product or process must accomplish to produce required behavior and/or results.

Functional specification

ISO/IEC/IEEE 24765

A document that specifies the functions that a system or component must perform. Often part of a requirements specification.

Functional testing

ISO/IEC/IEEE 24765

Testing conducted to evaluate the compliance of a system or component with specified functional requirements.

Hardware

ISO/IEC/IEEE 24765

Physical equipment used to process, store, or transmit computer programs or data.

Hardware item

Automotive SPICE V3.1

A physical representation of a hardware element.

Integration

Automotive SPICE V3.1

A process of combining items into larger items, up to an overall system.

Integrated software item

Automotive SPICE V3.1

A set of software units or items that are integrated into a larger assembly for the purpose of integration testing.

Integration testing

Automotive SPICE V3.1

Testing in which items (software items, hardware items, or system items) are combined and tested to evaluate the interaction among them.

Integrated system item

Automotive SPICE V3.1

A set of items that are integrated into a larger assembly for the purpose of integration testing.

Quality assurance

ISO/IEC/IEEE 24765

A planned and systematic pattern of all actions necessary to provide adequate confidence that an item or product conforms to established technical requirements.

Regression testing

Automotive SPICE V3.1

Selective retesting of a system or item to verify that modifications have not caused unintended effects and that the system or item still complies with its specified requirements.

Requirement

Automotive SPICE V3.1

A property or capability that must be achieved or possessed by a system, system item, product, or service to satisfy a contract, standard, specification or other formally imposed documents.

Requirements specification

Automotive SPICE V3.1

A document that specifies the requirements for a system or item. Typically included are functional requirements, performance requirements, interface requirements, design requirements, and development standards.

Software

ISO/IEC/IEEE 24765

Computer programs, procedures, and possibly associated documentation and data pertaining to the operation of a computer system.

Software component

Automotive SPICE V3.1

In Automotive SPICE V3.1 the term “software component” is used for the lowest level elements of the software architecture for which finally the detailed design is defined. A software “component” consists of one or more software “units”. → [ARCHITECTURE ELEMENT], [UNIT]

Software element

→ [ARCHITECTURE ELEMENT]

Software item

ISO/IEC/IEEE 24765

Identifiable part of a software product.

Software unit

→ [UNIT]

Static analysis

Automotive SPICE V3.1

A process of evaluating an item based on its form, structure, content, or documentation.

System

Automotive SPICE V3.1

A collection of interacting items organized to accomplish a specific function or set of functions within a specific environment.

System item

Automotive SPICE V3.1

Identifiable part of the system.

System test

ISO/IEC/IEEE 24765

Testing conducted on a complete, integrated system to evaluate the system’s compliance with its specified requirements.

Testing

Automotive SPICE V3.1

Activity in which an item (system, hardware, or software) is executed under specific conditions, and the results are recorded, summarized, and communicated.

Traceability

ISO/IEC/IEEE 24765

The degree to which a relationship can be established between two or more products of the development process, especially products having a predecessor-successor or master-subordinate relationship to one another.

Unit

Automotive SPICE V3.1

Part of a software component which is not further subdivided.

→ [SOFTWARE COMPONENT]

Unit test

Automotive SPICE V3.1

The testing of individual software units or a set of combined software units.

Validation

ISO/IEC/IEEE 29119

Validation demonstrates that the work item can be used by the users for their specific tasks.

Verification

ISO/IEC/IEEE 29119

Verification is confirmation, through the provision of objective evidence, that specified requirements have been fulfilled in a given work item.

White-box testing

Automotive SPICE V3.1

Method of testing where tests are developed based on the knowledge of the internal structure and mechanisms of the tested item.

Annex D Key concepts

The following sections describe the key concepts that have been introduced in the Automotive SPICE 3.1 PRM and PAM. They relate to the terminology described in Annex C Terminology.

D.1 The “Plug-in” concept

The following figure shows the basic principle of the “plug-in” concept. The top level comprises all system engineering processes organized in a system “V”. Depending on the product to be developed, the corresponding engineering disciplines with their domain-specific processes (e.g. hardware engineering HWE, mechanical engineering MEE, or software engineering SWE) can be added to the assessment scope. All other processes, such as management and supporting processes, are domain-independent and are therefore designed so that they can be applied to both the system level and the domain levels.

../_images/figureD.1.png

Figure D.1 — The “Plug-in” concept

All processes printed in bold are part of the Automotive SPICE 3.1 PRM/PAM, whereas the other processes (mechanical engineering and hardware engineering) are not developed under the VDA QMC mandate.

D.2 The Tip of the “V”

All engineering processes (i.e. system engineering and software engineering) have been organized according to the “V model” principle in such a way that each process on the left side corresponds to exactly one process on the right side. Therefore, the process SWE.3 “Software Detailed Design and Unit Construction” is separated from the process SWE.4 “Software Unit Verification”.

../_images/figureD.2.png

Figure D.2 — The tip of the “V”

D.3 Terms “Element”, “Component”, “Unit”, and “Item”

The following figure depicts the relationships between element, component, software unit, and item, which are used consistently in the engineering processes.

../_images/figureD.3.png

Figure D.3 — Element, component, unit, and item

An architecture consists of architectural “elements” that can be further decomposed into more fine-grained architectural sub-“elements” across appropriate hierarchical levels. The software “components” are the lowest-level “elements” of the software architecture for which finally the detailed design is defined. A software “component” consists of one or more software “units”.

“Items” on the right side of the V-model correspond to “elements” on the left side (e.g. a software “item” can be an object file, a library or an executable). This can be a 1:1 or m:n relationship, e.g. an “item” may represent more than one architectural “element”.

D.4 Traceability and consistency

Traceability and consistency are addressed by two separate base practices in the Automotive SPICE 3.1 PAM. Traceability refers to the existence of references or links between work products, thereby supporting coverage analysis, impact analysis, requirements implementation status tracking, etc. In contrast, consistency addresses content and semantics.

Furthermore, bidirectional traceability has been explicitly defined between

  • test cases and test results, and

  • change requests and work products affected by these change requests.

An overview of bidirectional traceability and consistency is depicted in the following figure.

../_images/figureD.4.png

Figure D.4 — Bidirectional traceability and consistency
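The idea of references or links between work products can be pictured as a simple bidirectional link store. The following sketch is purely illustrative; the names (`TraceStore`, `link`, `trace_from`, `trace_to`) and the example identifiers are assumptions, not part of Automotive SPICE:

```python
# Minimal sketch of a bidirectional traceability link store.
# All names and identifiers here are hypothetical illustrations.
from collections import defaultdict

class TraceStore:
    def __init__(self):
        self.forward = defaultdict(set)   # e.g. requirement -> test cases
        self.backward = defaultdict(set)  # e.g. test case -> requirements

    def link(self, source, target):
        """Record one link; both directions are kept, so traceability is bidirectional."""
        self.forward[source].add(target)
        self.backward[target].add(source)

    def trace_from(self, source):
        """Forward traceability, e.g. coverage: which test cases cover a requirement?"""
        return sorted(self.forward[source])

    def trace_to(self, target):
        """Backward traceability, e.g. impact analysis: which requirements does a test case touch?"""
        return sorted(self.backward[target])

store = TraceStore()
store.link("REQ-42", "TC-101")   # requirement -> test case
store.link("REQ-42", "TC-102")
store.link("TC-101", "TR-07")    # test case -> test result
print(store.trace_from("REQ-42"))  # ['TC-101', 'TC-102']
print(store.trace_to("TC-101"))    # ['REQ-42']
```

Keeping both directions explicit is what makes the coverage and impact analyses mentioned above cheap to answer from either end of a link.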

D.5 “Agree” and “Summarize and Communicate”

The information flow on the left side of the “V” is ensured through a base practice “Communicate agreed ‘work product x’”. The term “agreed” here means that there is a joint understanding between affected parties of what is meant by the content of the work product.

The information flow on the right side of the “V” is ensured through a base practice “Summarize and communicate results”. The term “Summarize” refers to abstracted information resulting from test executions made available to all relevant parties.

Note that these communication-oriented base practices do not necessarily require a formal approval, confirmation, or release; such formality is rather addressed by GP 2.1.7 at capability level 2. At capability level 1 the communication-oriented base practices mean that the work products (or their content) are to be disseminated to the relevant parties.

../_images/figureD.5.png

Figure D.5 — Agree, summarize and communicate

D.6 “Evaluate”, “Verification Criteria” and “Ensuring compliance”

This section describes relations, differences, and commonalities between verification, testing, evaluation, and compliance. The following Figure D.6 provides an overview.

Verification criteria are used as input for the development of the test cases or other verification measures that ensure compliance with the requirements. Verification criteria are only used in the context of the System Requirements Analysis (SYS.2) and Software Requirements Analysis (SWE.1) processes. Verification aspects which cannot be covered by testing are covered by the verification process (SUP.2).

Criteria for unit verification ensure compliance of the source code with the software detailed design and the non-functional requirements. Possible criteria for unit verification include unit test cases, unit test data, coverage goals, and coding standards and guidelines (e.g. MISRA). For unit testing, such criteria shall be defined in a unit test specification. This unit test specification may be implemented, for example, as a script in an automated test bench.
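A unit test specification implemented as a script might look like the following sketch. The unit under test (`saturate`) and its expected behavior are invented for illustration and stand in for a unit whose detailed design specifies a clamped value range; nothing here is prescribed by the standard:

```python
# Hypothetical unit test specification as an executable script.
# The unit under test and its specified range are invented examples.
import unittest

def saturate(value, lower=0, upper=100):
    """Unit under test: clamp a raw value into its (assumed) specified range."""
    return max(lower, min(upper, value))

class SaturateUnitTestSpec(unittest.TestCase):
    """Each test case encodes one criterion from the (assumed) unit test specification."""

    def test_nominal_value_passes_through(self):
        self.assertEqual(saturate(42), 42)

    def test_lower_boundary_is_clamped(self):
        self.assertEqual(saturate(-5), 0)

    def test_upper_boundary_is_clamped(self):
        self.assertEqual(saturate(250), 100)
```

In an automated test bench such a script would be run (e.g. via `python -m unittest`) together with coverage measurement against the defined coverage goals.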

../_images/figureD.6.png

Figure D.6 — Evaluation, verification criteria and compliance

Evaluation of alternative solutions is required for system and software architectures as well as for software detailed designs. The evaluation has to be done according to defined criteria. Such evaluation criteria may include quality characteristics like modularity, reliability, security, and usability, or results of make-or-buy or reuse analysis. The evaluation result including a rationale for the architecture/design selection has to be recorded.
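One common way to record such an evaluation is a weighted scoring of the alternatives against the defined criteria. The criteria, weights, scores, and alternative names below are purely illustrative assumptions, not values taken from the standard:

```python
# Illustrative weighted-scoring evaluation of alternative architectures.
# Criteria, weights, and scores are invented examples, not normative values.
criteria_weights = {"modularity": 0.4, "reliability": 0.4, "usability": 0.2}

# Score of each alternative per criterion (hypothetical 1..5 scale).
alternatives = {
    "layered architecture": {"modularity": 5, "reliability": 4, "usability": 3},
    "monolithic reuse":     {"modularity": 2, "reliability": 4, "usability": 4},
}

def weighted_score(scores, weights):
    """Sum of score * weight over all evaluation criteria."""
    return sum(scores[c] * w for c, w in weights.items())

# Rank the alternatives; the recorded result should also include a rationale.
ranked = sorted(alternatives,
                key=lambda a: weighted_score(alternatives[a], criteria_weights),
                reverse=True)
best = ranked[0]
print(f"Selected: {best} (score {weighted_score(alternatives[best], criteria_weights):.1f})")
```

Whatever the scoring scheme, the point required by the base practices is that the criteria are defined before the evaluation and that the selection rationale is recorded with the result.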

Compliance with an architectural design means that the specified integration tests are capable of proving that interfaces and relevant interactions between

  • the software units,

  • the software items and

  • the system items

fulfill the specification given by the architectural design.

D.7 The relation between “Strategy” and “Plan”

Both terms “Strategy” and “Plan” are commonly used across the following processes of the Automotive SPICE 3.1 PAM:

SYS.4

System Integration and Integration Test

SYS.5

System Qualification Test

SWE.4

Software Unit Verification

SWE.5

Software Integration and Integration Test

SWE.6

Software Qualification Test

SUP.1

Quality Assurance

SUP.8

Configuration Management

SUP.9

Problem Resolution Management

SUP.10

Change Request Management

The following figure shows the general relationship between strategy and plan in any of these processes.

../_images/figureD.7.png

Figure D.7 — Strategy and plan

Capability Level 1:

Each of these processes requires the development of a process-specific strategy. The strategy always corresponds to a process-specific “Plan”. For each process-specific “Plan” there are process-specific work product characteristics defined (e.g. “08-52 Test Plan”, “08-04 Configuration Management Plan”).

Capability Level 2 or higher:

Each process-specific “Plan” (WP 08-nn) inherits the work product characteristics represented by the Generic Plan (WP 08-00). This means that for a process-specific “Plan” both the process-specific characteristics (WP 08-nn) and the generic characteristics (WP 08-00) apply.
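This inheritance of characteristics can be pictured as ordinary class inheritance. The sketch below is only an analogy; the class names and the listed characteristics are illustrative placeholders, not the normative characteristics of WP 08-00 or WP 08-52:

```python
# Analogy for characteristic inheritance between the Generic Plan (WP 08-00)
# and a process-specific plan (e.g. WP 08-52). All listed characteristics
# are invented placeholders for illustration.
class GenericPlan:
    """Stands in for WP 08-00 Generic Plan: characteristics every plan shows at CL2+."""
    characteristics = ["objectives and scope", "resources and schedule",
                       "responsibilities", "monitoring against the plan"]

class TestPlan(GenericPlan):
    """Stands in for WP 08-52 Test Plan: specific characteristics on top of generic ones."""
    specific_characteristics = ["test items", "features to be tested",
                                "pass/fail criteria"]

    @classmethod
    def all_characteristics(cls):
        # At capability level 2 both the generic and the specific characteristics apply.
        return cls.characteristics + cls.specific_characteristics

print(TestPlan.all_characteristics())
```

The essential point the analogy captures: assessing a process-specific plan at capability level 2 or higher means checking it against both sets of characteristics, not only the process-specific ones.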

Annex E Reference standards

Annex E provides a list of reference standards and guidelines that support implementation of the Automotive SPICE PAM / PRM.

Table E.1 — Reference standards

ISO/IEC 33001:2015

Information technology – Process assessment – Concepts and terminology

ISO/IEC 33002:2015

Information technology – Process assessment – Requirements for performing process assessment

ISO/IEC 33003:2015

Information technology – Process assessment – Requirements for process measurement frameworks

ISO/IEC 33004:2015

Information technology – Process assessment – Requirements for process reference, process assessment and maturity models

ISO/IEC 33020:2015

Information technology – Process assessment – Process measurement framework for assessment of process capability

ISO/IEC 15504-5:2006

Information technology – Process assessment – Part 5: An exemplar Process Assessment Model

ISO/IEC 12207:2008

Systems and software engineering – Software life cycle processes

ISO/IEC/IEEE 29119-1:2013

Software and systems engineering – Software testing – Part 1: Concepts and definitions

ISO/IEC/IEEE 29119-3:2013

Software and systems engineering – Software testing – Part 3: Test documentation

ISO/IEC/IEEE 24765:2010

Systems and software engineering – Vocabulary

ISO/IEC 25010:2011

Systems and software engineering – Systems and software Quality Requirements and Evaluation (SQuaRE) – System and software quality models