Contents
Contents
Intellectual Property Rights
Foreword
1 Scope
2 References
2.1 Normative references
2.2 Informative references
3 Definitions, symbols and abbreviations
3.1 Definitions
3.2 Symbols
3.3 Abbreviations
4 An integrated framework for testing automation on a cloud infrastructure
4.1 Roles, relationships and interactions among TaaS users
4.1.1 End user services
5 End user use cases
5.1 Direct test execution use case
5.1.1 Direct test execution use case TaaS sequence diagram
5.2 Manual test design use case
5.2.1 Manual test design use case TaaS sequence diagram
5.3 Automated test design use case
5.3.1 Automated test design use case TaaS sequence diagram
6 Representation of System models in TPaaS
6.1 MDSL conceptual model
6.1.1 Test Planning Concepts
6.1.2 Test Analysis Concepts
6.1.3 Test Design Concepts
6.1.4 Test Case Concepts
6.1.5 Test Data Concepts
6.1.6 Test Derivation Concepts
6.1.7 Refined Test Design Concepts
6.1.8 Test Scheduling Concepts
6.2 Realisation as UML Profiles
6.2.1 Test Planning Concepts Implementation
6.2.3 Test Requirement Implementation
6.2.4 Test Object Implementation
6.2.5 Test Component Implementation
6.2.6 SUT Implementation
6.2.7 Test Configuration Implementation
6.2.8 Test Case Implementation
6.2.9 Precondition Implementation
6.2.10 Postcondition Implementation
6.2.11 Parameter Implementation
6.2.12 Stimulus Implementation
6.2.13 Response Implementation
6.2.14 Verdict Implementation
6.2.15 Test Design Model Implementation
6.2.16 TestData Implementation
6.2.17 DataPartition Implementation
6.2.18 TestDataValue Implementation
6.2.19 DataPool Implementation
6.2.20 Test Suite Implementation
6.2.21 Test Procedure Implementation
6.2.22 Scheduling Specification Implementation
6.3 Constraints on the MIDAS DSL
6.3.1 TestConfiguration/TestContext Constraints
6.3.2 TestCase Constraints
6.3.3 TestProcedure Constraints
6.4 MDSL Validator
6.5 TTCN-3 Generator
6.6 SCA4SAUT approach to system modelling
6.6.1 Overview of the SCA4SAUT model
6.6.2 Introduction to the SCA Assembly notation
7 Deployment of the TPaaS on the public cloud infrastructure
7.1 Integration of test methods on the TPaaS platform
7.1.1 The Database structure for the MIDAS TPaaS
7.1.2 The storage file system for MIDAS TPaaS
7.2 Implemented facilities
7.2.1 Development Environment (devenv_vm)
7.2.2 Production Environment (prodenv_multivm)
Annex A: End User Use Case Examples
A.1 Direct Execution Use Case Example: IMS Conformance testing
A.1.1 IMS as SUT
A.1.2 Test configuration
A.1.2.1 SUT architecture
A.1.3.2 Message flow scenarios
A.1.3.3 Test suite structure
A.1.4 Direct execution procedures taken within TPaaS
A.1.5 Lesson learned from direct execution use case
A.2 Manual test design example - SCM Pilot
A.2.1 SCM Pilot
A.2.2 Test configuration
A.2.3 Message flow scenarios
A.2.4 Manual execution
A.2.5 Experiences
A.3 Automated test design example - e-Health Pilot
A.3.1 e-Health Pilot
A.3.2 Test configuration
A.3.3 Message flow scenarios
A.3.4 Automated execution
A.3.5 Experiences
History
Intellectual Property Rights
IPRs essential or potentially essential to the present document may have been declared to ETSI. The information pertaining to these essential IPRs, if any, is publicly available for ETSI members and non-members, and can be found in ETSI SR 000 314: "Intellectual Property Rights (IPRs); Essential, or potentially Essential, IPRs notified to ETSI in respect of ETSI standards", which is available from the ETSI Secretariat. Latest updates are available on the ETSI Web server (http://ipr.etsi.org).
Pursuant to the ETSI IPR Policy, no investigation, including IPR searches, has been carried out by ETSI. No guarantee can be given as to the existence of other IPRs not referenced in ETSI SR 000 314 (or the updates on the ETSI Web server) which are, or may be, or may become, essential to the present document.
Foreword
This Technical Report (TR) has been produced by ETSI Technical Committee Methods for Testing and Specification (MTS).
1 Scope
The present document provides an overview of the approach taken within the EU-funded research project MIDAS to design, build and deploy an integrated framework for testing automation, available as a Test as a Service (TaaS) on a cloud infrastructure, that covers the key testing activities: test suite generation, test execution, scheduling, evaluation and test results arbitration. Although MIDAS focuses on test automation for Service Oriented Architecture (SOA), the testing methods and technologies investigated and prototyped within the project, in particular model-based test design and test suite generation, model checking of choreographies for sound interaction of test scenarios, fuzzing for security testing, usage-based testing, and probabilistic inference reasoning for test evaluation and scheduling, can be generalized to a great degree and applied not only to SOA Systems Under Test (SUT), but also to SUTs in other domains, e.g. Automotive, Telecommunications and Machine-to-Machine services.
2 References
References are either specific (identified by date of publication and/or edition number or version number) or non-specific. For specific references, only the cited version applies. For non-specific references, the latest version of the referenced document (including any amendments) applies.
Referenced documents which are not found to be publicly available in the expected location might be found at http://docbox.etsi.org/Reference.
NOTE: While any hyperlinks included in this clause were valid at the time of publication, ETSI cannot guarantee their long term validity.
2.1 Normative references
The following referenced documents are necessary for the application of the present document.
Not applicable.
2.2 Informative references
The following referenced documents are not necessary for the application of the present document but they assist the user with regard to a particular subject area.
[i.1] ETSI ES 202 951: "Methods for Testing and Specification (MTS); Model-Based Testing (MBT); Requirements for Modelling Notations".
[i.2] ETSI ES 203 119: "Methods for Testing and Specification (MTS); The Test Description Language (TDL); Specification of the Abstract Syntax and Associated Semantics".
[i.3] MIDAS Deliverable D2.2.WP2: "Architecture and specifications of the MIDAS framework and platform", 2013.
[i.4] MIDAS Deliverable D3.6.WP3: "Generating TTCN-3 from PIM UML SAUT Specifications", 2014.
[i.5] MIDAS Deliverable D6.1.WP6: "Analysis of required functionalities and available public Cloud services", 2014.
[i.6] MIDAS Deliverable D6.2.WP6: "Specification and design of the basic MIDAS platform as a service on the Cloud", 2014.
[i.7] MIDAS Deliverable D6.3.WP6: "The basic MIDAS platform and the integrated test evaluation, planning and scheduling macro-component", 2014.
[i.8] ISO/IEC 9126-1:2001: "Software engineering -- Product quality".
[i.9] ISO 9001: "Quality Management Systems", 2005.
[i.10] International Organisation for Standardisation (ISO): ISO/IEC 29119, Software Testing Standard. http://www.softwaretestingstandard.org.
[i.11] UTP_1_2 (2013). UML testing profile (UTP) version 1.2. Tech. Rep. formal/2013-04-03, Object Management Group.
[i.12] International Software Testing Qualifications Board (ISTQB): ISTQB/GTB standard glossary for testing terms. http://www.software-tester.ch/PDF-Files/CT_Glossar_DE_EN_V21.pdf.
[i.13] Object Management Group (OMG): Business Motivation Model (BMM). http://www.omg.org/spec/BMM.
[i.14] IEEE 610.12: "IEEE Standard Glossary of Software Engineering Terminology", 1990.
[i.15] NIST: "The NIST Definition of Cloud Computing", Special Publication 800-145, 2011.
[i.16] MIDAS Deliverable D2.1: "Requirements for automatically testable services and services architectures", 2013.
[i.17] ETSI TS 102 790-1: "Technical Committee for IMS Network Testing (INT); IMS specific use of Session Initiation Protocol (SIP) and Session Description Protocol (SDP); Conformance Testing; Part 1: Protocol Implementation Conformance Statement (PICS)".
[i.18] ETSI TS 102 790-2: "Core Network and Interoperability Testing (INT); IMS specific use of Session Initiation Protocol (SIP) and Session Description Protocol (SDP); Conformance Testing; (3GPP Release 10); Part 2: Test Suite Structure (TSS) and Test Purposes (TP)".
[i.19] ETSI TS 102 790-3: "Core Network and Interoperability Testing (INT); IMS specific use of Session Initiation Protocol (SIP) and Session Description Protocol (SDP); Conformance Testing; (3GPP Release 10); Part 3: Abstract Test Suite (ATS) and partial Protocol Implementation eXtra Information for Testing (PIXIT) proforma specification".
[i.20] ETSI TS 123 228: "Digital cellular telecommunications system (Phase 2+); Universal Mobile Telecommunications System (UMTS); LTE; IP Multimedia Subsystem (IMS)".
[i.21] ETSI TS 124 229: "Digital cellular telecommunications system (Phase 2+); Universal Mobile Telecommunications System (UMTS); LTE; IP multimedia call control protocol based on Session Initiation Protocol (SIP) and Session Description Protocol (SDP)".
[i.22] SCA_AM_V1_0 (2007). Service component architecture assembly model specification version 1.0. Tech. rep., OSOA.
[i.23] SCA_AM_V1_1 (2011). Service component architecture assembly model specification version 1.1. Tech. Rep. OASIS Committee Specification Draft 09 / Public Review Draft 04, OASIS.
[i.24] SoaML_1_0_1 (2012). Service Oriented Architecture Modeling Language (SoaML) Specification, Version 1.0.1. formal-12-05-10. Object Management Group.
[i.25] SOAP_1_1 (2000). Simple object access protocol (SOAP) 1.1. Tech. Rep. W3C Note 08 May 2000, World Wide Web Consortium.
[i.26] UTP_1_2 (2013). UML testing profile (UTP) - version 1.2. Tech. Rep. formal/2013-04-03, Object Management Group.
[i.27] WS-Transfer (2011). Web services transfer (WS-transfer). Tech. Rep. W3C Recommendation 13 December 2011, World Wide Web Consortium.
[i.28] WSDL_1_1 (2001). Web Services Description Language (WSDL) 1.1. Tech. Rep. W3C Note 15 March 2001, World Wide Web Consortium.
[i.29] XML_Infoset (2004). XML information set (second edition). Tech. Rep. W3C Recommendation 4 February 2004, World Wide Web Consortium.
[i.30] XPath_2_0 (2010). XML path language (XPath) 2.0 (second edition). Tech. Rep. W3C Recommendation 14 December 2010, World Wide Web Consortium.
[i.31] XSD_1_Structures (2004). XML schema part 1: Structures second edition. Tech. Rep. W3C Recommendation 28 October 2004, World Wide Web Consortium.
[i.32] MIDAS Deliverable: "SAUT Construction Model Specification Service Component Architecture for Services Architecture Under Test (SCA4SAUT) – V. 1.2", 2014.