When Systems are Simulations – T&E, VV&A, or Both?

MTR 98W
MITRE TECHNICAL REPORT

When Systems are Simulations – T&E, VV&A, or Both?

June 1998

Michael Borowski
Priscilla Glasow

© 1998 The MITRE Corporation. It has been approved for public release.

Washington C3 Center
McLean, Virginia

Abstract

This paper provides a brief introduction to the use of modeling and simulation (M&S) and the importance of the credibility of that M&S when it is used to support system acquisition. A previous paper, The Relationship of VV&A to T&E (Allen et al. 1997), is briefly reviewed, followed by the introduction of a new example of M&S use: the recent development of systems that are themselves simulations. As new acquisition systems, they require traditional test and evaluation (T&E). As simulations, they require verification, validation and accreditation (VV&A) to ensure credibility. The main text of this paper examines key commonalities between these two processes. It provides insights into how T&E and VV&A requirements can be met with reduced cost and risk, to the optimal benefit of the user. A case study is provided and conclusions presented.

KEYWORDS: Test and Evaluation, Verification and Validation, Modeling and Simulation.

Section 1
Introduction

The answer is a little of both. Risk reduction is the primary purpose of both test and evaluation (T&E) and verification, validation, and accreditation (VV&A). By evaluating system performance against stated requirements, the user can gain confidence in the system produced.

Modeling and simulation (M&S) is being used as a key tool in system acquisition to reduce the time needed to field a system, the resources needed to develop and evaluate that system, and decision risk. The use of M&S can also help evaluate and improve the quality, military utility, and supportability of fielded systems. In the T&E phase of system acquisition, M&S is used to develop parameters for mission rehearsal, design tests, analyze data collected during testing, and evaluate regions of the operational envelope that are otherwise not testable.

While M&S is a useful tool for predicting, training, and planning, it is not a substitute for testing. M&S is only useful if it applies to the evaluation of the system being acquired, and if it is capable of replicating reality to the level required for the particular use. Evaluating M&S against its requirements, both system-specific and in terms of real-world representation, provides insight into M&S credibility.

This paper is not intended to address the specifics of performing VV&A. Instead, it briefly reviews earlier work done to characterize the use of M&S in system acquisition and proposes a new case of current application.

Section 2
Previous Research

The Relationship of VV&A to T&E identified four cases where M&S has traditionally been used to support system acquisition. That research illustrated that a clear overlap exists between the two processes and suggested areas where collaboration might reduce cost and risk. The dialogue created by that paper has served to promote cooperation between the testing and VV&A communities. Table 2-1 illustrates the four cases.

Table 2-1. Relationship of VV&A Plans to TEMPs

Case 1
  M&S: used for readiness, force structure, or sustainability; VV&A Plan.
  Operational system: none developed; no acquisition.
  Relationship (VVAP to TEMP): none; no TEMP.

Case 2
  M&S: used for concept definition of the operational system; VV&A Plan.
  Operational system: normal acquisition; TEMP.
  Relationship (VVAP to TEMP): VV&A precedes development.

Case 3
  M&S: supports concept definition; model updated during development and test; VV&A Plan.
  Operational system: acquisition supported and guided by M&S for performance modeling and engineering trades; TEMP indirectly influenced by VVAP.
  Relationship (VVAP to TEMP): VV&A supports development.

Case 4
  M&S: embedded in and developed as component(s) of the operational system; VV&A Plan.
  Operational system: normal acquisition; VVAP becomes part of the TEMP effort; VV&A and DT&E/OT&E tests directly support each other.
  Relationship (VVAP to TEMP): VV&A is part of development.

In Case 1, M&S are built for reasons not related to system acquisition. Since there is no system being acquired and no T&E activity, there is no defined relationship between T&E and VV&A.

In Case 2, M&S is developed to support the Concept Exploration and Program Definition phases of acquisition. M&S precedes system development, but is not updated as the system matures. Because the model loses congruence with the system being developed except for requirements, any VV&A conducted will have little relevance to T&E of the mature system.

In Case 3, M&S supports a system under development. The digital representation of the system precedes system development and is updated as the system matures. In this case, the real system and the model are distinct entities, and the VV&A of the model and the T&E of the system occur in parallel. Following the Model-Test-Model paradigm, the T&E and VV&A processes complement and support each other: the model is used to guide the system development, and the developing system's test results are used to refine the model. The developing Simulation Based Acquisition concept takes its roots in using modeling and simulation as the basis for development of new systems; it clearly follows a Case 3 scenario for VV&A and T&E comparison.
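The Model-Test-Model paradigm can be pictured as a simple calibration loop: the model predicts a performance measure, a live test event measures it, and the discrepancy drives a model update before the next cycle. The sketch below is a minimal illustration only; the linear model, parameter names, and update rule are invented for this example and are not drawn from the paper.

```python
# Minimal sketch of a Model-Test-Model loop (hypothetical model and data).
# The model predicts a performance measure; each live test result is used
# to refine the model before the next prediction cycle.

def model_predict(gain: float, stimulus: float) -> float:
    """Placeholder performance model: response = gain * stimulus."""
    return gain * stimulus

def run_live_test(stimulus: float) -> float:
    """Stand-in for a real test event; here the 'true' gain is 2.0."""
    return 2.0 * stimulus

gain = 1.0            # initial engineering estimate
learning_rate = 0.5   # how strongly each test result corrects the model

for cycle, stimulus in enumerate([1.0, 2.0, 3.0], start=1):
    predicted = model_predict(gain, stimulus)
    measured = run_live_test(stimulus)           # T&E event
    error = measured - predicted                 # prediction shortfall
    gain += learning_rate * error / stimulus     # model refinement
    print(f"cycle {cycle}: predicted={predicted:.2f} "
          f"measured={measured:.2f} updated gain={gain:.2f}")
```

Each pass tightens the congruence between model and system, which is precisely the property Case 2 loses when the model is not updated as the system matures.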
In Case 4, the model is a subset of the system: M&S are totally embedded within the operational system. This integration of the VV&A and T&E processes yields three key benefits: commonality and reuse of testing techniques, the value of conceptual modeling, and early correction of system problems. The reader is encouraged to read The Relationship of VV&A to T&E for a full discussion of each of these benefits.

Section 3
A New Example

Case 5 is the addition to that earlier work. Here, the system under test is itself a simulation. The system hardware consists solely of the computer platform(s) required to run the simulation; the system software consists of only the simulation. Table 3-1 illustrates Case 5.

Table 3-1. Acquisition of a Simulation

Case 5
  M&S: M&S is the system.
  Operational system: acquisition of M&S.
  Relationship (VVAP to TEMP): VV&A and T&E congruent.

The relationship of the T&E and VV&A processes is illustrated as roughly congruent, with T&E a subset of VV&A. A crosswalk of the two processes was conducted as part of the earlier research, and a comparison was made of the information required to support each process. The VV&A Plan format contained in the DoD VV&A Recommended Practices Guide (RPG) (Glasow, ed. 1996) was compared to the Test and Evaluation Master Plan (TEMP) format. The information requirements were found to be virtually identical, but the VV&A process includes certain activities, such as code verification and algorithm validation, that are not part of the T&E process. However, where problems are identified during T&E, there may be a need to examine the code or algorithms; hence T&E is a subset of the encompassing VV&A process. Table 3-2 illustrates this crosswalk.

Table 3-2. TEMP VV&A Crosswalk

TEMP FORMAT                                  V&V FORMAT

Part I - System Introduction                 A. Application Description and
  A. Mission Description                        M&S Approach
  B. System Threat Assessment                B. Model Description
  C. Measures of Effectiveness               C. Application M&S Requirements
  D. System Description                         and Acceptability Criteria
  E. Critical Technical Parameters

Part II - Integrated Test Program Summary    D. Model Capability
  A. Integrated Test Program Schedule        E. Model V&V Status
  B. Management

Part III - DT&E Outline                      F. Model V&V Requirements
  A. DT&E Overview                           G. Verification Plan
  B. Future DT&E

Part IV - OT&E Outline                       H. Validation Plan
  A. OT&E Overview                           I. Data Verification, Validation
  B. COIs                                       and Certification (VV&C) Plan
  C. Future OT&E                             J. Integrated Verification
  D. Live Fire T&E                              and Validation

Part V - T&E Resource Summary
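Of the VV&A activities with no direct T&E counterpart, code verification is the most mechanical. As a hypothetical illustration of code verification at its simplest, consider checking a simulation's numerical integrator against a case with a known analytic answer; the integrator and tolerance below are illustrative and come from neither JWARS nor the RPG.

```python
import math

def integrate_trapezoid(f, a, b, steps=1000):
    """Trapezoidal-rule integrator, as one might appear inside a simulation."""
    h = (b - a) / steps
    total = 0.5 * (f(a) + f(b))
    for i in range(1, steps):
        total += f(a + i * h)
    return total * h

def test_integrator_against_analytic_solution():
    # The integral of sin(x) over [0, pi] is exactly 2.
    result = integrate_trapezoid(math.sin, 0.0, math.pi)
    assert abs(result - 2.0) < 1e-4, f"integrator off by {abs(result - 2.0)}"

test_integrator_against_analytic_solution()
```

When T&E uncovers an anomaly, this is the level to which the investigation may need to descend, which is why T&E sits inside the broader VV&A envelope rather than beside it.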
Section 4
Key Commonalities

Requirements

The T&E process is founded in requirements, including Critical Technical Parameters, Critical Operational Issues, and Measures of Performance and Effectiveness. The maturity of this process provides an excellent benchmark for the evolution of the VV&A process. In the same way that the T&E process assesses operational system performance, the VV&A process assesses M&S credibility.

The DOD Generic VV&A Process, described in the RPG, begins by identifying the problem to be solved and the requirements for solving that problem. The next step is to determine the problem-solving approach. M&S is one tool for problem solving, but other tools may also be used to arrive at a solution. Given that at least part of the solution will be arrived at through M&S, general requirements for model capabilities are identified. Depending on these general requirements, the problem solver may be able to use an existing model, either "as is" or modified, or a new model may need to be developed. Once that decision is made, requirements for the specific model(s) chosen are established and the model is prepared for use.

While the T&E process is rooted in requirements definition, the VV&A process has not yet learned the lesson or the importance of requirements. Many programs attempt to avoid requirements definition or make unfounded assumptions. One assumption is that M&S is the correct tool to use where another tool might be easier or less costly for the given problem. Another is the choice of a specific model without rationale or basis in requirements. A poor choice of model may produce invalid results where the model chosen was not built to answer the particular types of questions being asked. Sometimes these problems are due to unfamiliarity with the VV&A process, although there have also been instances where suboptimal decisions were made intentionally. Such decisions often reflected a desire to maximize resources in other areas or to placate a decision maker who had already decided what tool would be used.

In a few cases, requirements have been "tailored" out of the VV&A process. "Tailoring" is a VV&A term that describes the focusing of a well-planned VV&A effort on those tasks that will provide optimal return on investment. It is the process of selecting which V&V tasks and techniques will provide the most expedient and credible results by which the model can be assessed. Requirements definition, however, is not open to negotiation or tailoring. Common sense dictates that in order to credibly assess a simulation, one must know what the simulation is supposed to do!
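The requirements discipline the Generic VV&A Process calls for can be made concrete: state the model capabilities the problem demands, then screen every candidate model against them before any is chosen. The sketch below is hypothetical; the capability names and candidate models are invented for illustration.

```python
# Hypothetical screening of candidate models against stated M&S requirements.
# Choosing a model without this step is the "poor choice" failure mode noted
# above: a model that was not built to answer the question at hand.

required_capabilities = {"theater_level", "joint_forces", "logistics_flow"}

candidate_models = {
    "ModelA": {"theater_level", "joint_forces"},                   # no logistics
    "ModelB": {"theater_level", "joint_forces", "logistics_flow"},
    "ModelC": {"engagement_level", "single_service"},              # wrong scope
}

for name, capabilities in candidate_models.items():
    missing = required_capabilities - capabilities
    verdict = "usable as is" if not missing else f"missing {sorted(missing)}"
    print(f"{name}: {verdict}")
```

Recording the verdicts also preserves the rationale for the choice, the very rationale that, as noted above, is so often absent.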
Management

The T&E process is well established and is understood by a large community of developers, testers, and managers. By comparison, the VV&A process is relatively new. The T&E process uses mature methods that provide excellent examples which VV&A would do well to emulate. For example, the TEMP requires that responsibilities for each segment of the testing community be delineated. Another example is the approval process for the TEMP and other testing documents, which requires negotiation and compromise among participating organizations before a T&E effort begins. By comparison, VV&A efforts reflect a wide variety of dissimilar and nonstandard approaches, many of which are incomplete or which unnecessarily delay the start of the VV&A process.

Identification of roles and responsibilities -- who will do what for whom, when, where, why, and for how much -- is essential before the start of any VV&A effort. Unfortunately, many programs do not learn this lesson until after they have expended large sums of time and money during the development of the simulation, losing the optimal window of opportunity. Resources are often wasted on educating the contractors who are supposed to understand and implement VV&A, adding to VV&A's reputation for being costly.

Documentation

The T&E process is characterized by clearly defined documentation. Although common reporting formats for VV&A were developed for DOD, many programs avoid committing implementation details to writing. Initial attempts at writing a VV&A plan often include large tutorials, written at considerable expense, that offer no new information despite claims of "tailoring" to meet "unique" program needs; such claims have rarely proven true. Where the T&E process details the specific information requirements and criteria for assessing the system under test, most VV&A plans to date have failed to provide the executable detail needed to perform V&V. Specific V&V tasks and techniques must be identified and linked to specific portions of the problem to ensure that those tasks are indeed necessary. Unfortunately, the combined lack of stated requirements and absence of executable V&V detail results in VV&A plans that provide only a high-level strategy, never clear direction and action.

The evolving Simulation Based Acquisition concept will employ synthetic environments and digital representations of evolving systems. This will require disciplined implementation of requirements traceability, sound management processes, and thorough documentation. With only digital models and databases, the software products will be based upon proven software development processes. The VV&A community will have to engage these practices actively to ensure that VV&A is not "assumed away" under the context of good software development processes, or replaced altogether as an "unnecessary expense" to program offices.
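The "executable detail" found missing from most VV&A plans amounts to a traceability matrix: each V&V task tied to the requirement it supports and the technique that will discharge it. A minimal, hypothetical rendering follows; the task names and requirement labels are invented.

```python
# Hypothetical V&V traceability matrix. Every task must be justified by a
# requirement: tasks with no linked requirement are candidates for tailoring
# out, and requirements with no task are coverage gaps.

vv_tasks = [
    {"task": "Verify attrition algorithm", "technique": "code inspection",
     "requirement": "REQ-01 attrition realism"},
    {"task": "Validate movement rates", "technique": "comparison to field data",
     "requirement": "REQ-02 mobility representation"},
    {"task": "Write tutorial appendix", "technique": "n/a",
     "requirement": None},   # no requirement -> no return on investment
]

requirements = {"REQ-01 attrition realism", "REQ-02 mobility representation",
                "REQ-03 logistics representation"}

covered = {t["requirement"] for t in vv_tasks if t["requirement"]}
for task in vv_tasks:
    if task["requirement"] is None:
        print(f"UNJUSTIFIED TASK: {task['task']}")
for gap in sorted(requirements - covered):
    print(f"UNCOVERED REQUIREMENT: {gap}")
```

A plan that can be checked this way provides direction and action rather than the high-level strategy criticized above.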
JWARS Case Study

The JWARS program office initiated a V&V effort in October 1997 to support the production version of JWARS. Additionally, the Joint Requirements Oversight Council (JROC) directed that a test and evaluation plan be required as part of the Operational Requirements Document (ORD). T&E planning began in December 1997, using the V&V Plan as its initial point of departure.

JWARS presents a new form of modeling and simulation use in DOD. Where M&S has historically been used to support development of weapons systems and other tangible assets, the JWARS simulation is itself the system under test. As such, the system requires some form of T&E, while its formulation as a simulation requires that JWARS also undergo V&V under DOD Instruction. The relationship of these two processes has been the subject of research conducted to support other DOD programs facing similar dilemmas. That research has been applied to the evolution of an integrated JWARS T&E/V&V strategy.

The JWARS V&V effort is distinct from the in-house quality assurance being conducted by the developer. It is independent in the sense that the V&V contractor reports to the oversight body for JWARS development, the Joint Analytic Modeling Improvement Program (JAMIP). The conduct of additional V&V beyond developer QA is evidence of the program office's commitment to provide a product that is useful and usable by the warfare analysis community.

During the first quarter of FY98, the V&V contractor developed a V&V plan, using the RPG as the primary resource. The V&V contractor also worked closely with an oversight group of recognized DOD VV&A experts who provided input and direction for the plan's development. The V&V plan is currently under Service and CINC review. It has received favorable comments from many of the key reviewers, although significant concern about supporting resources and the V&V relationship to T&E has been expressed.

JWARS is an ACAT III program; therefore, formal test and evaluation per DOD acquisition directives is not required. However, the JWARS Program Office has elected to use the Test and Evaluation Master Plan (TEMP) format to guide the development of a T&E plan. T&E planning and execution is occurring in parallel with the V&V effort and leverages the V&V Plan to ensure coordination between the two processes.

JWARS T&E differs from traditional program T&E in two significant ways. First, to support its T&E initiative, the JWARS program has recognized the need to involve both the Services' T&E and analytical agencies. The T&E agencies are the traditional sources of test and evaluation support to the Services and would naturally be sought for their expertise during this process. However, the Operational Test Directors (OTDs) at these agencies are primarily warfighters who test hardware -- platforms, weapons systems, and equipment -- which is distinctly different from "analytical software" such as JWARS. Whereas the OTDs represent the military users of hardware systems, the military analysts are the targeted user community for JWARS. The operational test of a weapons system often involves hands-on use of the system in the field by military operators. Similarly, operational test of a simulation requires use by personnel trained and experienced in that arena. For JWARS, those personnel are the analysts who are trained in the use of theater-level simulation. The analysis agencies will serve as test sites, and the OT&E agencies will provide oversight and report on the testing conducted.

A second difference is JWARS' intended use of alpha and beta testing. Although conducted at field sites and involving potential users, the primary purpose of these tests is to provide feedback to the developer. JWARS intends to use the alpha and beta testing phases both to support the developer's quality assurance program and to provide the military user community with early opportunities to become familiar with the simulation.

The JWARS V&V Plan identifies V&V techniques from the RPG; however, due to limited resources, problem domain validation will be restricted to face validation by subject matter experts. This technique, while necessary, is not sufficient for credible validation of a simulation of the magnitude and criticality of JWARS. Therefore, the T&E effort has focused on extending the validation envelope through additional test techniques that meet both V&V and T&E objectives.
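The gap between face validation and the broader envelope the T&E effort is meant to cover can be shown with a toy contrast: a subject matter expert's plausibility judgment versus a quantitative comparison of simulation output against reference data. Both the data and the 10% tolerance below are invented, not drawn from JWARS.

```python
# Toy contrast between face validation and a quantitative results check.
# Values and the 10% tolerance are illustrative only.

reference_outcomes = [100.0, 150.0, 200.0]   # e.g., historical or field data
simulated_outcomes = [ 95.0, 175.0, 190.0]   # simulation runs, same scenarios

# Face validation: an expert eyeballs the runs and renders a judgment.
expert_says_plausible = True

# Results validation: every simulated outcome within 10% of its reference.
within_tolerance = all(
    abs(sim - ref) / ref <= 0.10
    for sim, ref in zip(simulated_outcomes, reference_outcomes)
)

print(f"face validation verdict: {expert_says_plausible}")
print(f"quantitative check:      {within_tolerance}")   # False: 175 vs 150
```

The point is less the arithmetic than the evidence trail: the quantitative check is repeatable and attributable, which is what accreditation ultimately rests on.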
In their capacity as T&E/V&V oversight support to the program office, MITRE developed a crosswalk between the V&V Plan (as recommended in the RPG) and the TEMP (as described in DOD regulation). This provided important information regarding the information overlaps that exist between the two documents, and identified where existing information in the V&V Plan could be leveraged as immediate input to the T&E plan. The value of this approach is the reduction of duplicated effort, saving time and money, while ensuring that the two processes mesh and complement each other.

A Working Group Integrated Process Team (WGIPT) was established to develop the strategy, identify test activities and testers, ensure the correct conduct and documentation of test events, review test results, and provide recommendations to the JAMIP. The WGIPT consists of representatives from the Services' analysis organizations, the Services' T&E agencies, OSD, J-8, and the JWARS program office. Advisors to the WGIPT included the JWARS developer, MITRE, and representatives from the Joint Data System (JDS). An IT&E contractor will prepare test plans for the WGIPT's approval, provide periodic status briefings to the WGIPT, coordinate required memoranda of agreement, and document T&E results.

An initial concept for JWARS T&E was developed and presented to the WGIPT in March. This concept focused on testing of the Planning & Execution and Force Assessment applications of JWARS prior to IOC; Systems Effectiveness & Tradeoff Analysis and Concept & Doctrine Development and Assessment were identified for later testing. A set of proposed performance measures was also provided, with traceability, utility, and V&V highlighted as the three key performance parameters. An additional briefing described the Fielding Plan developed by J-8, which provides a detailed description of the logistic implementation for JWARS fielding at identified test sites. The Fielding Plan included designation of the level of testing in which each test site would participate. J-8 also provided a macro-level