Guide to Advanced Empirical Software Engineering (2008)
2. The Process of Simulation Modelling in Software Engineering
This chapter provides an overview of the design and implementation of simulation models. Additional information about process simulation paradigms and general introductions can also be found in (Banks et al., 2000; Cellier, 1991; Law and
Kelton, 1999). Detailed descriptions of process simulation modelling methods specialized to instances of the event-driven and continuous simulation modelling paradigms can be found in (Rus et al., 2003) and (Pfahl and Ruhe, 2002), respectively.
Any process simulation modelling endeavour consists of at least five steps (cf. Fig. 1):

1. Formulation of the Problem Statement (modelling goal)
2. Specification of the Reference Behaviour (based on observation, or hypothetical)
3. Identification of Model Concepts (physical processes, information flows, decision rules)
4. Implementation of the Executable Model (formal, executable representation)
5. Model Experimentation
The starting point of any simulation modelling project is the identification and explicit formulation of a problem statement. The problem statement defines the modelling goal and helps to focus the modelling activities. In particular, it determines the model purpose and scope.

Fig. 1 Iterative process of simulation modelling (the figure connects the artefacts Problem Statement, Reference Behaviour, Model Concepts, Executable Model, and Simulation Results through the activities Abstraction, Refinement with focus on dynamic aspects, Formalization, Experimentation, and Interpretation/Evaluation/Analysis; Verification and Validation link the artefacts across the Real World/Virtual World boundary)

For software process simulation models, Kellner et al. (1999) propose the following categories for model purpose and scope. Purpose:
(a) strategic management
(b) planning, control and operational management
(c) process improvement and technology adoption
(d) understanding
(e) training and learning

Scope:
(a) a portion of the life cycle (e.g. design phase, code inspection, some or all of testing, requirements management)
(b) a development project (e.g. single product development life cycle)
(c) multiple, concurrent projects (e.g., across a department or division)
(d) long-term product evolution (e.g. multiple, successive releases of a single product)
(e) long-term organization (e.g., strategic organizational considerations spanning successive releases of multiple products over a substantial time period)
In order to make the problem statement suitable for simulation-based problem solving, it is helpful to specify the reference behaviour. Reference behaviour captures the dynamic (i.e., time-dependent) variation of key attributes of real-world entities. The reference behaviour can be observed problematic behaviour (e.g., of quality, effort, or cost) that is to be analyzed and improved, and/or a desired behaviour that is to be achieved. The importance of the reference behaviour for the modelling process is twofold. Firstly, it helps identify important model (output) parameters and thus further focuses the subsequent modelling steps. Secondly, it is a crucial input to model validation because it allows for comparing simulation results with observed (or desired) behaviour.
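As a small illustration, a reference behaviour can be captured as a time series of an observed key attribute and compared against simulation output; all figures below are hypothetical:

```python
# Hypothetical reference behaviour: observed cumulative effort (person-days)
# per project week, against which simulation output will later be validated.
reference_effort = [0, 12, 30, 55, 85, 110, 128, 140, 147, 150]

def max_relative_deviation(simulated, observed):
    """Largest relative deviation of simulated from observed values
    (points where the observed value is zero are skipped)."""
    return max(
        abs(s - o) / o for s, o in zip(simulated, observed) if o != 0
    )

# A simulated run that tracks the reference closely:
simulated_effort = [0, 13, 29, 57, 83, 112, 126, 141, 148, 150]
print(round(max_relative_deviation(simulated_effort, reference_effort), 3))  # 0.083
```

A modeller would typically agree with domain experts on an acceptable deviation before using such a comparison for validation.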
The next step is the definition of model concepts, which entail:

- Existing process, quality, and resource models
- Implicit or explicit decision rules
- Typical observed behaviour patterns
- Organizational information flows
- Policies
Typically, model concepts take the form of quantitative or qualitative models, which are abstractions of behaviours observed in reality. They capture implicit and tacit expert knowledge, which is formalized as rules. In this step, domain experts usually play a crucial role, not only because they often have knowledge that cannot be found in documents or databases alone, but also because they can help distinguish relevant real-world information from what is irrelevant for the problem under study.
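As a sketch of how tacit expert knowledge might be formalized as a rule, consider a hypothetical policy "schedule extra rework when inspection defect density exceeds a threshold"; the threshold and effort figures are invented for illustration only:

```python
REWORK_THRESHOLD = 0.5  # defects per size unit (hypothetical policy value)

def rework_effort(defects_found, size, effort_per_defect=0.5):
    """Explicit form of a tacit decision rule: extra rework effort is
    only scheduled when the defect density exceeds the threshold."""
    density = defects_found / size
    if density > REWORK_THRESHOLD:
        return defects_found * effort_per_defect
    return 0.0

print(rework_effort(30, 50))  # density 0.6 > 0.5, so 15.0 units of rework
print(rework_effort(10, 50))  # density 0.2, so no extra rework: 0.0
```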
After the definition of model concepts, the model is implemented with the simulation tool. Consistent with the chosen modelling technique and tool, all the information, knowledge, and experience represented by the model concepts has to be transformed into a computer-executable language. The result is an executable model. Technical simulation modelling expertise is crucial in this transformation of model concepts into the formal model representation that will eventually be executed on a computer.
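To illustrate what a (very small) executable model can look like, the following sketch integrates a single level, tasks completed, over time in the style of a continuous simulation; the workforce and productivity figures are illustrative, not drawn from any real process:

```python
def simulate_completion(total_tasks=100.0, workforce=5.0,
                        productivity=0.8, dt=1.0, horizon=60):
    """Euler integration of a single level (tasks completed); the rate is
    capped so the level can never overshoot the total amount of work."""
    completed = 0.0
    trajectory = [completed]
    for _ in range(horizon):
        remaining = total_tasks - completed
        rate = min(workforce * productivity, remaining / dt)
        completed += rate * dt
        trajectory.append(completed)
    return trajectory

run = simulate_completion()
print(run[10], run[-1])  # 40.0 100.0
```

Real process simulation models add feedback loops (e.g., schedule pressure affecting productivity), but the basic level-and-rate structure is the same.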
The last step is model calibration and experimentation with the executable model, producing simulation results. Simulation experiments are performed to understand the system's behaviour. Experimentation goes hand in hand with model calibration, i.e., the adjustment of simulation model parameters until the model output corresponds to real-world data. Calibration can be done based on expert estimates or through parameter fitting against historical data. The calibration step is important to ensure that the model accurately reflects real-world behaviour, and it is required to build confidence in simulation results. After a model is calibrated, simulation experiments are performed to understand observed behaviour, to evaluate planning alternatives, or to explore improvement opportunities. At this stage, iteration between model execution and modification is likely, as variables and model structures are changed and the simulation results are compared against each other. Thus, experimentation not only produces simulation results but also validates the simulation model. Guidance on how to design simulation experiments in general can be found in (Banks et al., 2000) and (Law and Kelton, 1999), and specifically for software processes in (Wakeland et al.).

Like software development projects, simulation modelling involves verification and validation activities. In short, verification ensures that the model was built in the right way, while validation ensures that the right model was built, i.e., that it appropriately reflects real-world behaviour. Verification and validation are continuing activities throughout the modelling and simulation life cycle. They help:

- To produce simulation models that represent system behaviour closely enough to be used as a substitute for the actual system when conducting experiments
- To increase the credibility of simulation models to a level that makes them acceptable to managers and other decision makers
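Calibration by parameter fitting, as described above, can be sketched as a search for the parameter value that minimizes the deviation between model output and historical data; both the data and the toy model here are hypothetical:

```python
# Hypothetical historical data: cumulative tasks completed per week.
historical = [0.0, 4.0, 8.0, 12.0, 16.0, 20.0]

def model(productivity, workforce=5.0, weeks=5):
    """Toy executable model: linear task completion over time."""
    return [workforce * productivity * t for t in range(weeks + 1)]

def calibrate(data, grid):
    """Return the productivity value from the grid whose model output
    best fits the data (least sum of squared errors)."""
    def sse(p):
        return sum((m - d) ** 2 for m, d in zip(model(p), data))
    return min(grid, key=sse)

grid = [round(0.1 * i, 1) for i in range(1, 21)]  # candidates 0.1 .. 2.0
print(calibrate(historical, grid))  # 0.8 reproduces the data exactly
```

In practice, calibration would use a proper optimization routine and would be followed by a check against data not used for fitting.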
Verification activities check the internal correctness or appropriateness of a simulation model, i.e. they ensure that the model was constructed in the right way. In particular, verification checks whether the transformation steps defined by the simulation modelling process have been conducted correctly. For example, verification ensures that the identified model concepts have properly been implemented in the executable model. For verification activities, expert knowledge on the simulation modelling technique is a major requirement. To some extent, verification is supported by simulation modelling tools. For example, the consistency of units in model equations can be automatically checked by a tool.
Validation activities check the external correctness or appropriateness of a simulation model, i.e., they try to find out whether the right model (with regard to its purpose or application) was constructed. In particular, validation checks whether the model represents the structural and behavioural properties of the real system appropriately. For example, simulation results can be used to check the robustness or sensitivity of the model behaviour for extreme values of input data. Even though validation can be partly supported by simulation modelling tools, expert knowledge about the real-world system is needed to interpret the range of results obtained.
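A minimal sketch of such an extreme-value check, using a toy task-completion model (all figures hypothetical), asserts that the model behaves plausibly at the boundaries of its input space:

```python
def simulate_completion(total_tasks, workforce, productivity=0.8,
                        dt=1.0, horizon=200):
    """Toy executable model, used here only to illustrate the check."""
    completed = 0.0
    for _ in range(horizon):
        remaining = total_tasks - completed
        completed += min(workforce * productivity, remaining / dt) * dt
    return completed

def extreme_value_checks():
    """Validation sketch: the model must behave plausibly at the extremes."""
    return {
        # Zero workforce: nothing gets done.
        "no_workforce": simulate_completion(100.0, 0.0) == 0.0,
        # Huge workforce: work finishes but never overshoots the total.
        "huge_workforce": simulate_completion(100.0, 1e6) == 100.0,
        # No work to do: completion stays at zero.
        "no_work": simulate_completion(0.0, 5.0) == 0.0,
    }

print(extreme_value_checks())  # all three checks should hold
```

A failing check of this kind points either to a modelling defect or to an undocumented limit of the model's validity range; an expert is needed to decide which.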
The simulation literature offers several proposals for verification and validation of simulation models (Balci, 2003; Banks et al., 2000; Barlas, 1989; Forrester and
Senge, 1980; Law and Kelton, 1999; Sargent, 2003). For example, Balci (2003) proposes more than 30 different verification and validation techniques, classified as informal, static, dynamic, and formal. However, full verification and validation of simulation models, whilst desirable, are often practically impossible due to cost and time restrictions (Pidd, 2004). Typically, only a subset of the available techniques and methods for model verification and validation is used.
