
SPEC SFS® 2014 Run Rules Version 1.0



SPEC SFS® 2014

Run and Reporting Rules


Standard Performance Evaluation Corporation (SPEC)
7001 Heritage Village Plaza
Suite 225
Gainesville, VA 20155

Phone: 1-703-579-8460


Fax: 1-703-579-8463
E-Mail: info@spec.org

Copyright (c) 2014 by Standard Performance Evaluation Corporation (SPEC)

All rights reserved

SPEC and SFS are registered trademarks of the Standard Performance Evaluation Corporation

NFS is a registered trademark of Sun Microsystems, Inc.

Table of Contents


1 Overview
1.1 Definitions
1.2 Philosophy
1.3 Caveats
2 Results Disclosure and Usage
2.1 Fair Use of SPEC SFS® 2014 Results
2.2 Research and Academic usage of SPEC SFS® 2014
2.3 SPEC SFS® 2014 metrics
2.4 Full disclosure of benchmark configuration and results
2.5 Disclosure of Results for Electronically Equivalent Systems
2.5.1 Definition of Electronic Equivalence
3 Benchmark Software Requirements
3.1 Storage Solution Software
3.2 Benchmark Source Code Changes
4 Storage Solution Configuration and Protocol Requirements
4.1 Shared storage protocol requirements
4.2 Load Generator configuration requirements
4.3 Description of Stable Storage for SPEC SFS 2014
Example: NFS protocol definition of stable storage and its use
Example: SMB protocol definition of stable storage and its use
4.3.1 Definition of terms pertinent to stable storage
4.3.2 Stable storage further defined
4.3.3 Specifying fault-tolerance features of the SUT
4.3.4 SPEC SFS® 2014 submission form fields related to stable storage
4.3.5 Stable storage examples
5 Benchmark Execution Requirements
5.1 Valid methods for benchmark execution
5.2 Solution File System Creation and Configuration
5.3 Data Point Specification for Results Disclosure
5.4 Overall response time calculation
5.5 Benchmark Modifiable Parameters
5.5.1 BENCHMARK
5.5.2 LOAD
5.5.3 INCR_LOAD
5.5.4 NUM_RUNS
5.5.5 CLIENT_MOUNTPOINTS
5.5.6 EXEC_PATH
5.5.7 USER
5.5.8 WARMUP_TIME
5.5.9 IPV6_ENABLE
5.5.10 PRIME_MON_SCRIPT
5.5.11 PRIME_MON_ARGS
5.5.12 NETMIST_LOGS
5.5.13 PASSWORD
6 SFS Submission File and Reporting Form Rules
6.1 Submission Report Field Descriptions
6.2 Processing Elements Field Description
6.3 Memory elements field description
6.4 Solution under test diagram



1 Overview

This document specifies the guidelines for how the SPEC SFS 2014 Benchmark is to be run for measuring and publicly reporting performance results. These rules have been established by the SPEC SFS subcommittee and approved by the SPEC Open Systems Steering Committee. They ensure that results generated with this suite are meaningful, comparable to other generated results, and repeatable (with documentation covering the factors pertinent to duplicating the results).


This document provides the rules to follow for all submitted, reported, published and publicly disclosed runs of the SPEC Solution File Server (SPEC SFS 2014) Benchmark according to the norms specified and approved by the SPEC SFS subcommittee. These run rules also form the basis for determining which server hardware and software features are allowed for benchmark execution and result publication.
This document should be considered the complete guide to the benchmark and file server configuration requirements for correct execution of the benchmark. The only other documents that should be considered are clarifications or interpretations of these Run and Reporting Rules, and such interpretations should be accepted only if they originate from and are approved by the SFS subcommittee.
These Run and Reporting Rules are meant to provide the standard by which customers can compare and contrast solution performance. It is the intent of the SFS subcommittee to set a reasonable standard for benchmark execution and disclosure of results so that customers are presented with enough information about the disclosed configuration to reproduce the configuration and its corresponding results.
As a requirement of the license of the benchmark, these Run and Reporting Rules must be followed. If the user of the SPEC SFS 2014 benchmark suite does not adhere to the rules set forth herein, SPEC may choose to terminate the license with the user. Please refer to the SPEC SFS 2014 Benchmark license for complete details of the user’s responsibilities.
Per the SPEC license agreement, all results publicly disclosed must adhere to these Run and Reporting Rules.
The general philosophy behind the set of rules for benchmark execution is to ensure that benchmark results can be reproduced if desired:
1. All data published must be gathered from benchmark execution conducted according to the Run and Reporting Rules described in this chapter.

2. Benchmark execution must complete in its entirety, terminating normally without benchmark failures or benchmark error messages.

3. The complete hardware, software, and network configuration used for the benchmark execution must be published. This includes any special server hardware, client hardware, or software features.

4. Use of software features that invoke, generate, or use software designed specifically for the benchmark is not allowed. Configuration options chosen for benchmark execution should be those that would be generally recommended for the customer.

5. The entire Solution under test (SUT), including all components and services, shall be generally available within 6 weeks of the first publication of the results. If the solution was not generally available on the date tested, the generally available solution's performance must meet or exceed that of the solution tested for the initially reported performance. If the generally available solution does not meet the reported performance, the lower-performing results from the generally available solution shall be published. However, lower results are acceptable if the margin of error for the peak business metric is less than one percent (1%) and the margin of error for overall response time is less than five percent (5%); an illustrative check of these tolerances follows this list.
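To make the 1% and 5% tolerances in rule 5 concrete, the following sketch checks whether a generally available solution's results fall within them. This is a minimal illustration only, not part of the benchmark tools or this standard; the function names are invented here, and it assumes the margin of error is computed as the relative difference between the tested result and the generally available result, with only a lower business metric or a higher overall response time counting against the margin.

# Illustrative only -- not part of the SPEC SFS 2014 benchmark tools.
# Assumption: "margin of error" is the relative difference between the
# originally tested result and the generally available (GA) result.

def peak_metric_within_tolerance(tested_peak, ga_peak, tolerance=0.01):
    """True if the GA peak business metric is within 1% of the tested peak.
    A higher GA peak is always acceptable (the shortfall is negative)."""
    shortfall = (tested_peak - ga_peak) / tested_peak
    return shortfall < tolerance

def ort_within_tolerance(tested_ort, ga_ort, tolerance=0.05):
    """True if the GA overall response time is within 5% of the tested ORT.
    Lower ORT is better, so only an increase counts against the margin."""
    increase = (ga_ort - tested_ort) / tested_ort
    return increase < tolerance

# Example: GA peak 0.6% lower and ORT 3% higher than tested -> still acceptable.
tested_peak, ga_peak = 500, 497   # peak business metric (hypothetical values)
tested_ort, ga_ort = 1.00, 1.03   # overall response time in milliseconds
print(peak_metric_within_tolerance(tested_peak, ga_peak)
      and ort_within_tolerance(tested_ort, ga_ort))   # True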

Products are considered generally available if they can be ordered by ordinary customers and ship within a reasonable time frame. This time frame is a function of the product size and classification, and common practice. The availability of support and documentation for the products must coincide with the release of the products.

Hardware and software products must be generally available, as defined above, and still supported by their primary vendor.


In the disclosure, the submitting vendor must identify any SUT component that can no longer be ordered by ordinary customers.




