Leslie A. (Schad) Johnson
Flight Systems
Boeing Commercial Airplane Group
P. O. Box 3707, M/S 02-JX
Seattle, Washington 98124-2207
Voice: 425-294-0707
Fax: 425-294-2299
E-mail: leslie.a.johnson@boeing.com
Abstract:
This is a practitioner's discussion of the evolution of the current practice and application
of RTCA/DO-178B[1] for software approval in the commercial world. The objectives include:
- Providing source material for the development of educational material;
- Providing the rationale behind the guidance for people new to the commercial
certification environment; and,
- Clarifying the intent and application of DO-178B.
The derivation of the software approval guidelines from the Federal Aviation
Regulations (FARs) to DO-178B is discussed to clarify its relationship to the government
regulations. An explanation of the Designated Engineering Representative (DER) system is also
provided along with a discussion of the safety process to describe the environment in which
DO-178B is used.
The evolution of the avionics industry that led eventually to DO-178B is included as part of the
background behind the rationale of DO-178B. The key aspects of each version, from the original
version to DO-178B, provide insight into the rationale for the inclusion and further development
of the content. In addition, there are special considerations in using DO-178B concerning its
current guidance for systems, and highlights of the problem areas for those from a military
culture. As the industry moves to the use of commercial off-the-shelf (COTS) components, the
incentive is greater to reconcile the differences between military standards and commercial standards.
Trustworthiness of software is an absolute concept independent of the verification process used.
This paper explores the differences and similarities between DO-178B and MIL-STD-498 affecting
the software development process.
Introduction:
The avionics industry has had to adapt quickly to the fast-changing technology of real-time
embedded software. Along with that, many have entered the commercial
avionics market and stumbled into a part of the government called the Federal Aviation
Administration (FAA), requiring certification by the FAA or their designee. For foreign markets,
avionics must also be certified by other regulatory agencies. Certification means that the
software aspects of a system must be assured to be safe and airworthy. That is, they must be
developed as defined by the software certification guidelines, to the level of rigor and
discipline required by their criticality level as determined by a functional hazard assessment.
Many long-standing members of the commercial avionics field are experiencing demands for
explanation and assistance. What comes under most scrutiny are the software guidelines for
certification, DO-178B/ED-12B. (ED-12B is the European version of the same document). Some
questions concern its intent and meaning, but most question the need to really do what it says
and the justifying rationale. The evolution of the content of DO-178B is best known by those
long in the field, but the demand is greater than their numbers. The industry group responsible
for DO-178B is now facing this dilemma. Therefore, the intention of this paper is to document
the key aspects of the environment and history of DO-178B to broaden that perspective.
Relationship to the Federal Aviation Regulations (FARs)[2] & Joint Aviation
Requirements (JARs):
With the exception of FAR 33/JAR E, the federal aviation regulations do not reference
software directly. The FAA has issued an advisory circular AC 20-115B[3] which recognizes
DO-178B as a means of evaluating software for the purposes of complying with the regulations.
The JAA recognizes ED-12B via Temporary Guidance Leaflet (TGL) No. 4. Other airworthiness
authorities have similar means of recognizing either DO-178B or ED-12B as a means of showing
compliance to the regulations. The FARs/JARs provide some very basic objectives more at the
system level and DO-178B/ED-12B expands these considerably for software. Using the regulations
for transport category airplanes as an example, the certification of airplanes and their
associated systems is partially covered under FAR/JAR 25.1301; paragraphs (a) through (d)
discuss the requirements for function and installation, with paragraph (a) introducing the
concept of intended function.
Subchapter 1309 covers the requirements for equipment, systems and installations. All systems
including real-time embedded systems must comply with this portion of the regulations. Software
approval guidance derives directly from paragraphs a) and b). Paragraph a) states that items
coming under this regulation must be designed to ensure they perform their intended functions
under any foreseeable operating condition. The last applicable paragraph (b), indicates that the
aircraft systems and associated components are to be considered separately and in relationship to
other systems.
These systems and components must be designed so that the occurrence of any failure condition
which would prevent continual safe flight and landing of the aircraft is to be extremely
improbable. In addition, they must be designed so that the occurrence of any other failure
condition which would reduce the capability of the airplane or the ability of the crew to cope
with adverse operating conditions is improbable. Industry practice in complying with this
chapter then results in a safety assessment for the aircraft and each system. These definitions
are further expanded in AC 25.1309-1A and Advisory Material-Joint (AMJ) 25.1309, which
provide background for the concept of unintended function.
For software, AC 20-115B invokes DO-178B and AMJ 2X-1 invokes ED-12B as an acceptable means of
evaluating software for any type certification (TC), supplemental type certification (STC), or
TSO. This includes FAR parts 21, 23, 25, 27, 29, and 33. For the JAA the AMJ includes JAR parts
23, 25, 27, 29, and E.
DO-178B's relationship to the regulations begins with FAR 21, the certification procedures for
all products and parts on an aircraft and top level regulation for systems on an aircraft. This
is where TSOs are introduced. They will explicitly reference DO-178B if they include software
approval. TSOs are what can be considered "off-the-shelf" systems once they have been approved
by the FAA, though only to a point.
A TSO does not mean a part is approved for installation. For example, when a TSO requires DO-178B
compliance, the airframe manufacturer may accept the software approval without further evaluation
or substantiation to the FAA. However, an installation certification is still required, which
could include the full software certification where a TSO(s) for the system did not explicitly
reference DO-178B. Additionally, if the installation approval required a DO-178B Level A
assurance and the TSOA was for a Level C assurance, then a reevaluation is required to establish
compliance to Level A requirements.
Rule changes can change the acceptability of data from a TSO'd part. New FAA regulations undergo
a process of public review and comment prior to acceptance for use. This process is called the
Notice of Proposed Rulemaking (NPRM) [4]. New rules may require evaluations that were not
required by the TSO. In any case, the part will have to be evaluated against the new regulations
to show compliance, which may require further work. This means that the TSO'd component is still
approved under the TSO but the data no longer meets the installation requirements.
The qualifications on the applicability of a TSO for a given installation can be frustrating.
It could be as simple as a commercial-off-the-shelf (COTS) part, or as difficult as the effort
required to gain FAA approval for an original submittal. Thus, it is 'buyer beware', until the
applicability for a given installation is known.
Thus, do not assume that TSO'd items are the equivalent of commercial off-the-shelf (COTS).
While TSOs are the closest analogy to COTS in commercial avionics, there are general differences,
as follows:
- TSOs have a public set of requirements - COTS are proprietary;
- TSOs require installation approval - COTS have no equivalent;
- TSOs have approved certification data - COTS has none;
- TSO development data exists and is available to certifying agencies - COTS development data
is nonexistent, must be purchased, or is simply not available;
- Where the software aspects have been approved, TSOs will have an explicit software
approval, using a specific software certification guideline - COTS has none.
- Additionally, COTS items may have extra functionality which is unused in a given
installation. This would have to be addressed in a certification program.
This indicates that getting COTS on aircraft is not straightforward; one cannot take a TSO'd
unit as proof that COTS items are easy to use in commercial avionics.
Commercial Certification Safety Process:
The FAA charter has been to foster and promote civil aviation by promulgating and enforcing
the safety regulations. Their priorities are continued operational safety, improved regulations
and policy, and certification of additional airplanes and aircraft types.
The safety process begins with aircraft level design decisions as a part of an overall airplane
safety strategy. The safety process requires an aircraft level assessment of function and
hazard. The requirements for the functions and compensation for the associated hazards are
distributed throughout the aircraft systems and architecture for each lower level of design
refinement.
Ultimately, at a system level, the functional hazard assessment systematically defines each
hazard and classifies its hazard level. A probability analysis is done with respect to the
system architecture and adjustments to the design are made, as appropriate. This is also
called a fault-tree analysis. At the same time, the system is examined for availability.
This means that the need for the system to remain operational in flight is assessed
(availability for dispatch).
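The probability analysis described above can be illustrated with a small sketch. The 1e-9 per-flight-hour target for catastrophic failure conditions reflects common 25.1309 practice, but the dual-channel architecture and the channel failure rates below are invented for this example:

```python
# Hypothetical sketch of the probability budget used in a system safety
# assessment. The 1e-9 per-flight-hour target for catastrophic failure
# conditions reflects AC 25.1309 practice; the architecture and channel
# failure rates below are invented for this example.

CATASTROPHIC_TARGET = 1e-9  # maximum allowed probability per flight hour

def and_gate(*probs):
    """Probability that ALL independent events occur (fault-tree AND)."""
    p = 1.0
    for x in probs:
        p *= x
    return p

def or_gate(*probs):
    """Upper bound on the probability that ANY event occurs (rare-event sum)."""
    return sum(probs)

# Dual-channel architecture: total loss requires both channels to fail,
# or a residual common-cause failure of both at once.
channel_a = 1e-5     # assumed per-flight-hour failure rate
channel_b = 1e-5
common_mode = 5e-10  # assumed residual common-cause contribution

total_loss = or_gate(and_gate(channel_a, channel_b), common_mode)
print(f"Total loss: {total_loss:.1e} per flight hour")
print("meets target" if total_loss < CATASTROPHIC_TARGET else "redesign needed")
```

When the computed probability exceeds the target, the design is adjusted, as the text notes, by adding redundancy, monitoring, or dissimilarity, and the analysis is repeated.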
The software criticality is next defined, based on the failure condition classification for
that part of the system. However, determining software criticality is currently not
straightforward. It may be done on a case-by-case basis via FAA ruling until the FAA
internally defines consistent practice using three documents. These are ARPs 4761 and 4754,
and RTCA/DO-TBD (SC180).
Current practice in this area is now in transition and is using SAE Aerospace Recommended
Practice (ARP) 4761[5] and 4754[6], dealing with complex systems and determination of assurance,
respectively. Both of these documents assume the release of the RTCA Special Committee (SC)
180 [7] results. Note: The Society of Automotive Engineers (SAE) and the Requirements and
Technical Concepts for Aviation (RTCA) organizations provide a forum for international technical
standards and guidelines development. SC180 will be equivalent to DO-178B, but for complex
hardware aspects. It is expected to be released in 1997.
Figure 1
The Designated Engineering Representative (DER) Role in the Safety Process:
DERs represent various technologies and play a role in the development of the safety
assessments. During the development of a system, they assess and establish appropriate
engineering processes and analyze design relative to the certification requirements.
They are required to have direct personal understanding of the development of a system and
are the onsite safety officers. DERs either directly approve or recommend approval for the
development artifacts as a representative of the FAA. Ultimately, all systems are submitted
to the FAA for approval, leading to the approval of the installations and functions of the
systems on an aircraft.
Exposure to and interaction with the FAA/certifying agencies and other experienced DERs
accomplishes training in the logic and judgment used and develops trust, rapport, and respect
for the DER candidate's ability and expertise. For software DERs, there is an additional
approval qualification for each safety criticality level, based on experience, which provides
a direct participant in the life cycle process described by DO-178. See 8110.37B[8].
DO-178 [9]:
In its infancy, software was recognized as a creative human product. In the avionics
industry, it was used to extend and modify the capabilities of mechanical and analog systems in
a manner much simpler than redesigning or modifying the hardware components. It was seen as an
aid to inexpensive modification and functional extension of an originally inflexible all-hardware
design.
During system certification in the late 1970's, it became clear that the classical statistical
methods of safety assessment were not possible for flight-critical, software-based systems.
Alternate knowledge and method(s) were necessary to establish equivalent integrity to deal with
design errors rather than component failures. It became necessary to establish a uniform,
consistent definition of criteria for substantiating evidence for the absence of critical design
errors, answering the following questions:
- How was it known that the testing was comprehensive and complete?
- How was it known that the system requirements were comprehensive and
complete?
- How was it known that the software requirements were comprehensive and
complete and interpreted the system requirement accurately?
- How do we provide proof that a design or implementation error, which may be
present, cannot produce a safety critical situation?
Verification and validation became new terms. Verification provided the proof that a system was
built to the requirements and validation established that the requirements were complete and
correct. These criteria require that the software be produced using the "best known" practice,
minimizing or removing the risk of a malfunction or failure.
This gave birth to DO-178. This was an alternative means for software design integrity from the
classical statistical method of determining system integrity. Essentially, DO-178 was developed
as the industry standard to establish that software was safe and/or did not contribute to the
system hazards. It was created to identify and document the "best known" software practices
supporting the certification of software-based equipment and systems, thus providing a basis for
software certification approval. DO-178 was written at a conceptual level. Compliance was
done by meeting its "intent." Through trial and error, rules of use were developed and an
understanding of what was "best known" in practices was slowly built.
Features of DO-178:
This document established that a system's software development rigor could vary by the
system failure condition categories (criticality). A system could be categorized as critical,
essential, or non-essential. DO-178 also established the need for a certification plan that
included software aspects and any special requirements (called special conditions).
DO-178 established the relationship of the software certification processes to the Type
Certification Approval, the TSO Authorization, and the Supplemental Type Certification and
other equivalent FARs. This has not been done explicitly in the later revisions of DO-178.
It also established the interface with system validation. This interface was covered in
Revision A with less clarity, and dropped for Revision B.
DO-178A [10]:
Its purpose was to reflect the regulatory and industry experience gained and to consider
adding additional guidance for other applications as appropriate. Further, it was to:
- Establish techniques and methods for orderly software development;
- State that the intent of the application of the techniques was to result in documented software that is traceable, testable, and maintainable and thereby meet the certification requirements. This was to assure the absence of critical software errors.
Features of DO-178A:
The content and features of DO-178A were very different from the original version. The
concept of a system's failure condition categories established classifications of software
according to the effects of malfunctions or design errors and took into consideration the
product application. These categories were defined to be critical, essential or non-essential
and the corresponding software levels were called levels 1, 2, and 3, respectively.
This allowed for a variance between the software level and the criticality category, implying a
level of assurance effort that could be adjusted according to the system design and
implementation techniques.
Software development processes were described in a more systematic and structured manner. The
verification process (requirements implementation correctness) included distinctions in effort
required by software level.
DO-178A incorporated the objective of achieving equivalent confidence in a re-certification as
was obtained originally. Additional material was added to define requirements for follow-on
certifications of a product.
Strengths and weaknesses of DO-178A soon became apparent. Literal interpretation, particularly
of diagrams, was a problem. Also, necessary certification submittal items were frequently
contended, despite the fact that they were subject to negotiation with the regulatory agencies
and could vary between agencies. Misuse included non-allowance of life cycles other than the
traditional waterfall model and non-existence of documents not identified as "required" or "made
available." Contention also arose over the required certification effort.
In general, the knowledge of why the certification requirements existed and the purpose of the
requirements failed to be understood or appreciated.
DO-178B:
In industry, avionics systems were transitioning from analog to digital, and larger, more
complex systems were being built. This phenomenon brought many people into the real-time
embedded software world with its newly evolving certification regulations. The limited
documentation, training materials, development standards, and number of experienced people
were proving insufficient to meet the demand for expertise.
DO-178B was initiated to address these problems. The purpose was to provide detailed guidelines
for the production of software for airborne systems that performs its intended function with a
level of confidence in safety that complies with airworthiness requirements. The goal was the
following:
- Develop objectives for the life cycle processes;
- Provide a description of the activities and design considerations for achieving
those objectives; and,
- Provide a description of the evidence indicating the objectives have been
satisfied.
Features of DO-178B:
The DO-178B development team was motivated to document certification practice and policy as
much as possible to lessen the increasing demand on the few experienced software certification
people.
The five failure condition categorizations used by the Joint Aviation Authorities (JAA) were
adopted. This changed the categories of critical, essential, and non-essential to catastrophic,
hazardous/severe-major, major, minor, and no effect. The purpose was to reach further harmony
between the FAA and the JAA in certification practice. This caused the software levels to change
from levels one through three to A through E, respectively.
Verification was discussed differently to clarify the difference between requirements-based
testing and structural coverage analysis. This was done to stop the practice of performing
structural testing without fully verifying the software requirements, a practice which also
drove additional testing. Analyzing functional tests for structural coverage was emphasized
as the more effective, cheaper way of performing verification.
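The distinction can be shown with a toy sketch: requirements-based tests are run first, and structural coverage analysis then checks whether those tests already exercised every branch. The function, requirement tags, and coverage bookkeeping are all invented for illustration; real programs use qualified coverage tools rather than hand instrumentation:

```python
# Toy sketch of the distinction: run requirements-based tests first, record
# which branch outcomes they exercise, then analyze for structural gaps.
# The function, requirement tags, and bookkeeping are invented for
# illustration; real programs use qualified coverage tools.

branches_hit = set()

def select_source(primary_valid, backup_valid):
    """R1: use primary when valid; R2: else use backup; R3: else report failure."""
    if primary_valid:            # decision D1
        branches_hit.add("D1-true")
        return "primary"
    branches_hit.add("D1-false")
    if backup_valid:             # decision D2
        branches_hit.add("D2-true")
        return "backup"
    branches_hit.add("D2-false")
    return "fail"

# Requirements-based tests, one per requirement:
assert select_source(True, False) == "primary"   # verifies R1
assert select_source(False, True) == "backup"    # verifies R2
assert select_source(False, False) == "fail"     # verifies R3

# Structural coverage analysis: did the requirements-based tests already
# take every branch? A gap suggests missing requirements or dead code.
all_branches = {"D1-true", "D1-false", "D2-true", "D2-false"}
uncovered = all_branches - branches_hit
print("Uncovered branches:", sorted(uncovered) or "none")
```

Here the requirements-based tests already achieve full branch coverage, so no structure-driven tests need to be added, which is exactly the cheaper ordering the text describes.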
A detailed summary of differences and emphasis is listed as follows:
- There was an attempt to bridge the gap with current practice for systems' software
considerations, including timing, coverage, and analysis.
- System verification was dropped due to a division of labor between DO-178B and the
international committee developing system guidelines.
- Additional recognition was given to new phases, with some being cradle-to-grave.
- The guideline role was obviated by the addition of Annex A, sorting the content by phase
and objective.
- Clarity was added by stating the independence requirements for control rigor per software
level.
- The JAA five failure condition categories were adopted and corresponding software levels
developed.
- The structural coverage test aspect was clarified by focusing on the "step-wise refinement"
concept of requirements decomposition. This was to gain requirement coverage first for any
tested software structure, prior to any necessary augmentation.
- The concept of transition criteria between phases was added to account for use of varied
types of life cycle models and the movement into other phases while earlier phases are not yet
complete.
- New topics were added to address new technology and concepts in Section 12. This included
the following: tools; dissimilar, option selectable, field loadable, and user modifiable
software; Commercial Off the Shelf (COTS) software; and, dead versus deactivated code.
- Traceability was added as an integral feature of software development. This included
traceability between the code and any documentation, tests, and requirements. This was to
demonstrate that no additional functionality was present (unintended function); that all
requirements had been implemented; and, that the system was fully tested.
- The quality assurance activity was augmented with a conformity review prior to production
to allow for assurance of a smooth transition to production.
- The role of the Plan for Software Aspects of Certification was expanded to include the top
level planning information and de-emphasized the Software Development Plan.
- A clear statement and recognition of the use of alternate means and product service history
was incorporated.
Challenges in using DO-178B are already occurring. They include the discovery that previously
certified systems did not necessarily use earlier versions of DO-178 correctly, resulting in
greater transition issues. Literal interpretation remains a problem.
International harmony in interpretation, application, and understanding is needed. Market
expansion requires new methods of dissemination of certification policy and practice to focus
consistency and understanding. The training practices of apprenticeship and word of mouth do
not meet the industry demand for information.
The growing number of advanced complex avionics systems require greater access and dissemination
of information and explanatory material. Perhaps a DO-178 Users Group and periodically
published "Proceedings" would be a way to extend or clarify guidance in a regular fashion.
Special Considerations:
Section 2.0
Some current practices are changing the original intent and use of DO-178B. In section
2.0, "System Aspects Relating to Software Development," a detailed informational description
was included to capture the industry methods and understanding used to assess the design
assurance required for a system's software.
Section 2.0 was included to provide an informational tie between the ARP 4761 and 4754 documents
and the SC180 document. To date, the ARP documents have been released, but the SC180 document
has not. This leaves DO-178B as the only released document put into place for certification
guidance by Advisory Circular. The result is that part of section 2.0 is being used as the
official guidance. The subsections on software level determination in relation to the failure
condition (section 2.2.3) and safety monitoring system architecture (section 2.3.3) are the
interim areas used for guidance today.
Military Transition Issues
The first comprehensive software standard equivalent to DO-178 was MIL-STD-1679. This led
to DOD-STD-2167, then -2167A[11], MIL-STD-498[12], and now ISO/IEC 12207[13]. The objectives
of these standards seem similar enough to be able to transition to commercial certification
requirements without much difficulty. Yet, while there is similarity in commercial and military
airborne software developments, there are fundamental differences in DO-178B that need to be
understood by people familiar with MIL-STD processes.
Data, for a military program, is typically provided to confirm understanding and support
maintenance by the customer of the delivered product. In the commercial arena, the customer
uses the data as compliance evidence to the certification requirements, rather than for
maintenance purposes.
To stay in scope, the focus will be only on the specific issues of DO-178B. The difficulties
that have been identified are the DO-178 requirements for evidence and rigorous verification
[14], which are stated more explicitly in DO-178B, and also the consistency of guidance and
standards.
Systematic records of accomplishing each of the objectives and guidance are necessary. A
documentation trail must exist demonstrating that the development processes not only were
carried out, but also were corrected and updated as necessary during the program life cycle.
Each document, review, analysis, and test must have evidence of critique for accuracy and
completeness, with criteria that establishes consistency and expected results. This is usually
accomplished by a checklist which is archived as part of the program certification records. The
degree of this evidence varies only by the safety criticality of the system and its software.
For the military, the accumulation of evidence for a formal acceptance is typically the job of
Quality Assurance, e.g., the functional and physical configuration audits. As a result,
engineering has not been schooled or trained to meticulously keep proof of the processes,
product, and verification in real time. The engineers have focused on the development of the
product, not the delivery. So both the degree and the timing of evidence differ in the
military environment. In addition, program durations can be from 10 to 15 years, resulting
in the software engineers moving on by the time of system delivery. This means that most
management and engineers have never been on a project from "cradle-to-grave."
The second issue is verification. The difficulty lies in DO-178B's required level of rigor for
software requirements coverage and structural coverage. This is particularly true for a flight
critical system. Evidence must be formally developed of systematic implementation,
documentation, and test or analysis showing that each requirement has been incorporated and
verified. In turn, the evidence must show that all levels of requirements can be traced to
their roots and that each is fully tested. It must also demonstrate the characteristic of
reversibility: one must be able to trace from each test back to its top level requirements.
The total results in 100% requirements coverage.
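The bidirectional ("reversible") trace check described above can be sketched in a few lines; all of the requirement and test identifiers here are hypothetical:

```python
# Minimal sketch of the bidirectional ("reversible") trace check described
# above: every requirement must reach at least one test, and every test
# must trace back to a requirement. All identifiers are hypothetical.

req_to_tests = {
    "SYS-1": ["TC-101", "TC-102"],
    "SYS-2": ["TC-103"],
    "SYS-3": [],                 # gap: a requirement with no test
}
test_to_req = {
    "TC-101": "SYS-1",
    "TC-102": "SYS-1",
    "TC-103": "SYS-2",
    "TC-999": None,              # gap: a test tracing to no requirement
}

untested = [r for r, tests in req_to_tests.items() if not tests]
orphans = [t for t, r in test_to_req.items() if r not in req_to_tests]

print("Requirements with no tests:", untested)   # → ['SYS-3']
print("Tests with no requirement:", orphans)     # → ['TC-999']
```

Requirements with no tests mean the system is not fully verified; tests with no requirement suggest unintended function, and both lists must be empty before 100% requirements coverage can be claimed.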
The rigor of verification and its evidence must exist to achieve 100% structural coverage
required for a flight critical system. Additionally, evidence must demonstrate each aspect
has been systematically verified. Finally, the data must be available to be delivered, audited,
and analyzed by the FAA or their designee for validation of completeness and correctness.
The last concern is consistency of guidance and standards. This is seen in the military
environment in the literal use of the MIL-STDs. Lack of consistency causes the human reaction
to translate guidelines or standards literally to avoid being subject to inconsistent or
alternative judgments of interpretation. Commercially, it is no better, where lack of
consistency is a persistent and growing problem.
The interpretability of DO-178B and its predecessors causes inconsistency of understanding and
implementation. This has been due to too few experienced people; a fast growing industry; and
no comprehensive formalized training. This is true for both the FAA and foreign agencies as
well as for companies seeking certification of systems and aircraft.
The weakness of commercial practice with DO-178B is the lack of consistent, comprehensive
training of the FAA engineers/DERs/foreign agencies affecting:
- the effectiveness of the individual(s) making findings; and,
- the consistency of the interpretations in the findings.
Training programs may be the answer for both the military and commercial environments to avoid
the problem of inconsistent interpretation and the results of literal interpretation.
Recommendations:
Harmonization is needed between the FAA and other regulatory agencies. This needs to be
done to foster consistency in interpretation and application of DO-178B. Training programs are
needed to promote uniformity in compliance findings by the FAA/other regulatory agencies/DERs.
New methods of dissemination of certification policy and practice are necessary to address the
workload of the regulatory people as well as meet the demand for knowledge by industry.
Rapid market expansion requires availability of information, explanatory material, and training.
Perhaps, a DO-178 Users Group and periodically published "Proceedings" would be a way to extend
or clarify guidance in a regular fashion.
Special Thanks:
Mike DeWalt, FAA, NRS Software | Fred Moyer, Rockwell Collins |
Bill Dolman, Lucas Aerospace | Ron Pehrson, Boeing Company, BCAG |
Randy Johnson, Boeing Company, DS&G | Ty Startzman, Boeing Company, DS&G |
Ed Kosowski, Rockwell Collins | Dan Veerhusen, Rockwell Collins |
Bibliography:
[1] RTCA/DO-178B, "Software Considerations in Airborne Systems and Equipment Certification," December 1, 1992.
[2] Code of Federal Regulations, Aeronautics and Space, Parts 1 to 59, Revised as of January 1, 1997.
[3] Advisory Circular No: 20-115B, "RTCA, Inc., Document RTCA/DO-178B," Initiated by: AIR-100, 1/11/93.
[4] FAA Designated Engineering Representative Standardization and Familiarization Seminar, Aircraft Certification Service, 1/95.
[5] ARP-4761, "Guidelines and Methods for Conducting the Safety Assessment Process on Civil Airborne Systems and Equipment," S-18 Committee, SAE, March 29, 1996.
[6] ARP-4754, "Certification Considerations for Highly Integrated or Complex Aircraft Systems," Systems Integration Requirements Task Group, AS-1C, ASD, SAE, June 27, 1996.
[7] RTCA/DO-TBD, "Design Assurance Guidance for Airborne Electronic Hardware," RTCA Paper #110-97/SC180-076, June 1997.
[8] 8110.37B, "Designated Engineering Representatives (DER) Guidance Handbook," U.S. Department of Transportation, Federal Aviation Administration, November 12, 1996.
[9] RTCA/DO-178, "Software Considerations in Airborne Systems and Equipment Certification," RTCA, Inc., September 13, 1982.
[10] RTCA/DO-178A, "Software Considerations in Airborne Systems and Equipment Certification," March 1985.
[11] DOD-STD-2167A, "Military Standard Defense System Software Development," AMSC No. N4327, February 29, 1988.
[12] MIL-STD-498, "Military Standard Software Development and Documentation," AMSC No. N7069, December 5, 1994.
[13] IEEE/EIA 12207-1996, "(ISO/IEC 12207) Standard for Information Technology--Software Life Cycle Processes," sponsored by the Software Engineering Standards Committee of the IEEE Computer Society, December 10, 1996.
[14] "Comparison of FAA DO-178A and DOD-STD-2167A Approaches to Software Certification," Michael P. DeWalt, NRS - Software, FAA, Seattle, Washington, February 6, 1988, published in the proceedings of the 8th Digital Avionics Conference, AIAA.
Biography:
Leslie A. (Schad) Johnson is an Associate Technical Fellow at the Boeing Commercial Airplane
Group, Flight Systems organization. She has been working in the field of software engineering
since 1978 and in software certification since 1984. She has been an active Software Designated
Engineering Representative for the FAA since 1988. She has been involved with the technical
program management and software engineering analysis at Boeing and at suppliers on most airplane
systems, particularly those which are flight critical. Leslie is an active participant in the
RTCA SC190 committee and its predecessor, SC-167, that developed and wrote DO-178B.
She obtained her Masters in Software Engineering in 1983, and her undergraduate degree, a BA in
English Education, cum laude, in 1972, both from Seattle University.