Network Working Group J. Dunn
Request for Comments: 3116 C. Martin
Category: Informational ANC, Inc.
June 2001
Methodology for ATM Benchmarking
Status of this Memo
This memo provides information for the Internet community. It does
not specify an Internet standard of any kind. Distribution of this
memo is unlimited.
Copyright Notice
Copyright (C) The Internet Society (2001). All Rights Reserved.
Abstract
This document discusses and defines a number of tests that may be
used to describe the performance characteristics of ATM (Asynchronous
Transfer Mode) based switching devices. In addition to defining the
tests this document also describes specific formats for reporting the
results of the tests.
This memo is a product of the Benchmarking Methodology Working Group
(BMWG) of the Internet Engineering Task Force (IETF).
Table of Contents
1. Introduction . . . . . . . . . . . . . . . . . . . . . . . . . 4
2. Background . . . . . . . . . . . . . . . . . . . . . . . . . . 5
2.1. Test Device Requirements . . . . . . . . . . . . . . . . . . 5
2.2. Systems Under Test (SUTs). . . . . . . . . . . . . . . . . . 5
2.3. Test Result Evaluation . . . . . . . . . . . . . . . . . . . 5
2.4. Requirements . . . . . . . . . . . . . . . . . . . . . . . . 5
2.5. Test Configurations for SONET. . . . . . . . . . . . . . . . 6
2.6. SUT Configuration. . . . . . . . . . . . . . . . . . . . . . 7
2.7. Frame Formats. . . . . . . . . . . . . . . . . . . . . . . . 8
2.8. Frame Sizes. . . . . . . . . . . . . . . . . . . . . . . . . 8
2.9. Verifying Received IP PDU's. . . . . . . . . . . . . . . . . 9
2.10. Modifiers . . . . . . . . . . . . . . . . . . . . . . . . . 9
2.10.1. Management IP PDU's . . . . . . . . . . . . . . . . . . . 9
2.10.2. Routing Update IP PDU's . . . . . . . . . . . . . . . . . 10
2.11. Filters . . . . . . . . . . . . . . . . . . . . . . . . . . 10
2.11.1. Filter Addresses. . . . . . . . . . . . . . . . . . . . . 11
2.12. Protocol Addresses. . . . . . . . . . . . . . . . . . . . . 12
Dunn & Martin Informational [Page 1]
RFC 3116 Methodology for ATM Benchmarking June 2001
2.13. Route Set Up. . . . . . . . . . . . . . . . . . . . . . . . 12
2.14. Bidirectional Traffic . . . . . . . . . . . . . . . . . . . 12
2.15. Single Stream Path. . . . . . . . . . . . . . . . . . . . . 12
2.16. Multi-port. . . . . . . . . . . . . . . . . . . . . . . . . 13
2.17. Multiple Protocols. . . . . . . . . . . . . . . . . . . . . 14
2.18. Multiple IP PDU Sizes . . . . . . . . . . . . . . . . . . . 14
2.19. Testing Beyond a Single SUT . . . . . . . . . . . . . . . . 14
2.20. Maximum IP PDU Rate . . . . . . . . . . . . . . . . . . . . 15
2.21. Bursty Traffic . . . . . . . . . . . . . . . . . . . . . . 15
2.22. Trial Description . . . . . . . . . . . . . . . . . . . . . 16
2.23. Trial Duration. . . . . . . . . . . . . . . . . . . . . . . 16
2.24. Address Resolution. . . . . . . . . . . . . . . . . . . . . 16
2.25. Synchronized Payload Bit Pattern. . . . . . . . . . . . . . 16
2.26. Burst Traffic Descriptors . . . . . . . . . . . . . . . . . 17
3. Performance Metrics. . . . . . . . . . . . . . . . . . . . . . 17
3.1. Physical Layer-SONET . . . . . . . . . . . . . . . . . . . . 17
3.1.1. Pointer Movements. . . . . . . . . . . . . . . . . . . . . 17
3.1.1.1. Pointer Movement Propagation . . . . . . . . . . . . . . 17
3.1.1.2. Cell Loss due to Pointer Movement. . . . . . . . . . . . 19
3.1.1.3. IP Packet Loss due to Pointer Movement . . . . . . . . . 20
3.1.2. Transport Overhead (TOH) Error Count . . . . . . . . . . . 21
3.1.2.1. TOH Error Propagation. . . . . . . . . . . . . . . . . . 21
3.1.2.2. Cell Loss due to TOH Error . . . . . . . . . . . . . . . 22
3.1.2.3. IP Packet Loss due to TOH Error. . . . . . . . . . . . . 23
3.1.3. Path Overhead (POH) Error Count. . . . . . . . . . . . . . 24
3.1.3.1. POH Error Propagation. . . . . . . . . . . . . . . . . . 24
3.1.3.2. Cell Loss due to POH Error . . . . . . . . . . . . . . . 25
3.1.3.3. IP Packet Loss due to POH Error. . . . . . . . . . . . . 26
3.2. ATM Layer. . . . . . . . . . . . . . . . . . . . . . . . . . 27
3.2.1. Two-Point Cell Delay Variation (CDV) . . . . . . . . . . . 27
3.2.1.1. Test Setup . . . . . . . . . . . . . . . . . . . . . . . 27
3.2.1.2. Two-point CDV/Steady Load/One VCC. . . . . . . . . . . . 27
3.2.1.3. Two-point CDV/Steady Load/Twelve VCCs. . . . . . . . . . 28
3.2.1.4. Two-point CDV/Steady Load/Maximum VCCs . . . . . . . . . 30
3.2.1.5. Two-point CDV/Bursty VBR Load/One VCC. . . . . . . . . . 31
3.2.1.6. Two-point CDV/Bursty VBR Load/Twelve VCCs. . . . . . . . 32
3.2.1.7. Two-point CDV/Bursty VBR Load/Maximum VCCs . . . . . . . 34
3.2.1.8. Two-point CDV/Mixed Load/Three VCC's . . . . . . . . . . 35
3.2.1.9. Two-point CDV/Mixed Load/Twelve VCCs . . . . . . . . . . 36
3.2.1.10. Two-point CDV/Mixed Load/Maximum VCCs . . . . . . . . . 38
3.2.2. Cell Error Ratio (CER) . . . . . . . . . . . . . . . . . . 39
3.2.2.1. Test Setup . . . . . . . . . . . . . . . . . . . . . . . 39
3.2.2.2. CER/Steady Load/One VCC. . . . . . . . . . . . . . . . . 40
3.2.2.3. CER/Steady Load/Twelve VCCs. . . . . . . . . . . . . . . 41
3.2.2.4. CER/Steady Load/Maximum VCCs . . . . . . . . . . . . . . 42
3.2.2.5. CER/Bursty VBR Load/One VCC. . . . . . . . . . . . . . . 43
3.2.2.6. CER/Bursty VBR Load/Twelve VCCs. . . . . . . . . . . . . 44
3.2.2.7. CER/Bursty VBR Load/Maximum VCCs . . . . . . . . . . . . 46
3.2.3. Cell Loss Ratio (CLR). . . . . . . . . . . . . . . . . . . 47
3.2.3.1. CLR/Steady Load/One VCC. . . . . . . . . . . . . . . . . 47
3.2.3.2. CLR/Steady Load/Twelve VCCs. . . . . . . . . . . . . . . 48
3.2.3.3. CLR/Steady Load/Maximum VCCs . . . . . . . . . . . . . . 49
3.2.3.4. CLR/Bursty VBR Load/One VCC. . . . . . . . . . . . . . . 51
3.2.3.5. CLR/Bursty VBR Load/Twelve VCCs. . . . . . . . . . . . . 52
3.2.3.6. CLR/Bursty VBR Load/Maximum VCCs . . . . . . . . . . . . 53
3.2.4. Cell Misinsertion Rate (CMR) . . . . . . . . . . . . . . . 54
3.2.4.1. CMR/Steady Load/One VCC. . . . . . . . . . . . . . . . . 54
3.2.4.2. CMR/Steady Load/Twelve VCCs. . . . . . . . . . . . . . . 55
3.2.4.3. CMR/Steady Load/Maximum VCCs . . . . . . . . . . . . . . 57
3.2.4.4. CMR/Bursty VBR Load/One VCC. . . . . . . . . . . . . . . 58
3.2.4.5. CMR/Bursty VBR Load/Twelve VCCs. . . . . . . . . . . . . 59
3.2.4.6. CMR/Bursty VBR Load/Maximum VCCs . . . . . . . . . . . . 60
3.2.5. CRC Error Ratio (CRC-ER) . . . . . . . . . . . . . . . . . 62
3.2.5.1. CRC-ER/Steady Load/One VCC . . . . . . . . . . . . . . . 62
3.2.5.2. CRC-ER/Steady Load/Twelve VCCs . . . . . . . . . . . . . 63
3.2.5.3. CRC-ER/Steady Load/Maximum VCCs. . . . . . . . . . . . . 64
3.2.5.4. CRC-ER/Bursty VBR Load/One VCC . . . . . . . . . . . . . 65
3.2.5.5. CRC-ER/Bursty VBR Load/Twelve VCCs . . . . . . . . . . . 66
3.2.5.6. CRC-ER/Bursty VBR Load/Maximum VCCs. . . . . . . . . . . 68
3.2.5.7. CRC-ER/Bursty UBR Load/One VCC . . . . . . . . . . . . . 69
3.2.5.8. CRC-ER/Bursty UBR Load/Twelve VCCs . . . . . . . . . . . 70
3.2.5.9. CRC-ER/Bursty UBR Load/Maximum VCCs. . . . . . . . . . . 71
3.2.5.10. CRC-ER/Bursty Mixed Load/Three VCC. . . . . . . . . . . 73
3.2.5.11. CRC-ER/Bursty Mixed Load/Twelve VCCs. . . . . . . . . . 74
3.2.5.12. CRC-ER/Bursty Mixed Load/Maximum VCCs . . . . . . . . . 75
3.2.6. Cell Transfer Delay (CTD). . . . . . . . . . . . . . . . . 76
3.2.6.1. Test Setup . . . . . . . . . . . . . . . . . . . . . . . 76
3.2.6.2. CTD/Steady Load/One VCC. . . . . . . . . . . . . . . . . 77
3.2.6.3. CTD/Steady Load/Twelve VCCs. . . . . . . . . . . . . . . 78
3.2.6.4. CTD/Steady Load/Maximum VCCs . . . . . . . . . . . . . . 79
3.2.6.5. CTD/Bursty VBR Load/One VCC. . . . . . . . . . . . . . . 81
3.2.6.6. CTD/Bursty VBR Load/Twelve VCCs. . . . . . . . . . . . . 82
3.2.6.7. CTD/Bursty VBR Load/Maximum VCCs . . . . . . . . . . . . 83
3.2.6.8. CTD/Bursty UBR Load/One VCC. . . . . . . . . . . . . . . 85
3.2.6.9. CTD/Bursty UBR Load/Twelve VCCs. . . . . . . . . . . . . 86
3.2.6.10. CTD/Bursty UBR Load/Maximum VCCs. . . . . . . . . . . . 87
3.2.6.11. CTD/Mixed Load/Three VCC's. . . . . . . . . . . . . . . 88
3.2.6.12. CTD/Mixed Load/Twelve VCCs. . . . . . . . . . . . . . . 90
3.2.6.13. CTD/Mixed Load/Maximum VCCs . . . . . . . . . . . . . . 91
3.3. ATM Adaptation Layer (AAL) Type 5 (AAL5) . . . . . . . . . . 93
3.3.1. IP Packet Loss due to AAL5 Re-assembly Errors. . . . . . . 93
3.3.2. AAL5 Re-assembly Time. . . . . . . . . . . . . . . . . . . 94
3.3.3. AAL5 CRC Error Ratio . . . . . . . . . . . . . . . . . . . 95
3.3.3.1. Test Setup . . . . . . . . . . . . . . . . . . . . . . . 95
3.3.3.2. AAL5-CRC-ER/Steady Load/One VCC. . . . . . . . . . . . . 95
3.3.3.3. AAL5-CRC-ER/Steady Load/Twelve VCCs. . . . . . . . . . . 96
3.3.3.4. AAL5-CRC-ER/Steady Load/Maximum VCCs . . . . . . . . . . 97
3.3.3.5. AAL5-CRC-ER/Bursty VBR Load/One VCC. . . . . . . . . . . 99
3.3.3.6. AAL5-CRC-ER/Bursty VBR Load/Twelve VCCs. . . . . . . . .100
3.3.3.7. AAL5-CRC-ER/Bursty VBR Load/Maximum VCCs . . . . . . . .101
3.3.3.8. AAL5-CRC-ER/Mixed Load/Three VCC's . . . . . . . . . . .102
3.3.3.9. AAL5-CRC-ER/Mixed Load/Twelve VCCs . . . . . . . . . . .104
3.3.3.10. AAL5-CRC-ER/Mixed Load/Maximum VCCs . . . . . . . . . .105
3.4. ATM Service: Signaling . . . . . . . . . . . . . . . . . . .106
3.4.1. CAC Denial Time and Connection Establishment Time. . . . .106
3.4.2. Connection Teardown Time . . . . . . . . . . . . . . . . .107
3.4.3. Crankback Time . . . . . . . . . . . . . . . . . . . . . .108
3.4.4. Route Update Response Time . . . . . . . . . . . . . . . .109
3.5. ATM Service: ILMI. . . . . . . . . . . . . . . . . . . . . .110
3.5.1. MIB Alignment Time . . . . . . . . . . . . . . . . . . . .110
3.5.2. Address Registration Time. . . . . . . . . . . . . . . . .111
4. Security Considerations . . . . . . . . . . . . . . . . . . .112
5. Notices. . . . . . . . . . . . . . . . . . . . . . . . . . . .112
6. References . . . . . . . . . . . . . . . . . . . . . . . . . .113
7. Authors' Addresses . . . . . . . . . . . . . . . . . . . . . .113
APPENDIX A . . . . . . . . . . . . . . . . . . . . . . . . . . .114
APPENDIX B . . . . . . . . . . . . . . . . . . . . . . . . . . .114
APPENDIX C . . . . . . . . . . . . . . . . . . . . . . . . . . .116
Full Copyright Statement . . . . . . . . . . . . . . . . . . . .127
1. Introduction
This document defines a specific set of tests that vendors can use to
measure and report the performance characteristics of ATM network
devices. The results of these tests will provide the user comparable
data from different vendors with which to evaluate these devices.
The methods defined in this memo are based on RFC 2544 "Benchmarking
Methodology for Network Interconnect Devices".
The document "Terminology for ATM Benchmarking" (RFC 2761), defines
many of the terms that are used in this document. The terminology
document should be consulted before attempting to make use of this
document.
The BMWG produces two major classes of documents: Benchmarking
Terminology documents and Benchmarking Methodology documents. The
Terminology documents present the benchmarks and other related terms.
The Methodology documents define the procedures required to collect
the benchmarks cited in the corresponding Terminology documents.
2. Background
2.1. Test Device Requirements
This document is based on the requirement that a test device is
available. The test device can either be off the shelf or can be
easily built with current technologies. The test device must have a
transmitting and receiving port for the interface type under test.
The test device must be configured to transmit test PDUs and to
analyze received PDUs. The test device should be able to transmit
and analyze received data at the same time.
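The requirement above can be illustrated with a minimal sketch (not part of the RFC): one thread transmits tagged test PDUs while a second thread analyzes received PDUs at the same time. The loopback queue here is a stand-in assumption for the SUT's transmit and receive ports, and all names are hypothetical.

```python
# Hypothetical sketch of a minimal test device per section 2.1: one
# thread transmits numbered test PDUs while another analyzes received
# PDUs concurrently. A loopback queue stands in for the SUT's ports.
import queue
import threading

def run_test_device(pdu_count):
    link = queue.Queue()          # stand-in for the interface under test
    received = []

    def transmit():
        for seq in range(pdu_count):
            link.put(("TEST-PDU", seq))   # transmit a tagged test PDU
        link.put(None)                    # end-of-test marker

    def analyze():
        while True:
            pdu = link.get()
            if pdu is None:
                break
            tag, seq = pdu
            if tag == "TEST-PDU":         # count only our own test PDUs
                received.append(seq)

    tx = threading.Thread(target=transmit)
    rx = threading.Thread(target=analyze)
    tx.start(); rx.start()
    tx.join(); rx.join()
    # report loss: PDUs transmitted but never analyzed
    return {"sent": pdu_count,
            "received": len(received),
            "lost": pdu_count - len(received)}

report = run_test_device(100)
```

A real test device would of course attach to physical ports and decode actual cell or frame formats; the point of the sketch is only the simultaneous transmit-and-analyze structure the section requires.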
2.2. Systems Under Test (SUTs)
There are a number of tests described in this document that do not
apply to each SUT. Vendors should perform all of the tests that can
be supported by a specific product type. It will take some time to
perform all of the recommended tests under all of the recommended
conditions.
2.3. Test Result Evaluation
Performing all of the tests in this document will result in a great
deal of data. The applicability of this data to the evaluation of a
particular SUT will depend on its expected use and the configuration
of the network in which it will be used. For example, the time
required by a switch to provide ILMI services will not be a pertinent
measurement in a network that does not use the ILMI protocol, such as
an ATM WAN. Evaluating data relevant to a particular network
installation may require considerable experience, which may not be
readily available. Finally, test selection and evaluation of test
results must be done with an understanding of generally accepted
testing practices regarding repeatability, variance and the
statistical significance of a small number of trials.
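As a hedged illustration of the repeatability point above, the sketch below (not part of the RFC) summarizes a set of repeated trial results with their mean and sample standard deviation; a large spread relative to the mean suggests more trials are needed. The trial values are invented for the example.

```python
# Summarize repeated benchmark trials: mean and sample standard
# deviation indicate repeatability across a small number of trials.
import statistics

def summarize_trials(results):
    mean = statistics.mean(results)
    stdev = statistics.stdev(results) if len(results) > 1 else 0.0
    return {"trials": len(results), "mean": mean, "stdev": stdev}

# e.g. five measurements of the same metric (values are illustrative)
summary = summarize_trials([353207, 353150, 353290, 353101, 353244])
```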
2.4. Requirements
In this document, the words that are used to define the significance
of each particular requirement are capitalized. These words are:
* "MUST" This word, or the words "REQUIRED" and "SHALL" mean that
the item is an absolute requirement of the specification.
* "SHOULD" This word or the adjective "RECOMMENDED" means that there
may exist valid reasons in particular circumstances to ignore this
item, but the full implications should be understood and the case
carefully weighed before choosing a different course.
* "MAY" This word or the adjective "OPTIONAL" means that this item
is truly optional. One vendor may choose to include the item
because a particular marketplace requires it or because it
enhances the product, for example; another vendor may omit the
same item.
An implementation is not compliant if it fails to satisfy one or more
of the MUST requirements for the protocols it implements. An
implementation that satisfies all the MUST and all the SHOULD
requirements for its protocols is said to be "unconditionally
compliant"; one that satisfies all the MUST requirements but not all
the SHOULD requirements for its protocols is said to be
"conditionally compliant".
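The three compliance levels defined above can be restated as a small decision rule. The sketch below is an illustrative reading of the text, not anything the RFC itself specifies; the function name and boolean-list representation are assumptions for the example.

```python
# Classify an implementation per the compliance definitions above:
# all MUSTs and all SHOULDs -> unconditionally compliant;
# all MUSTs but not all SHOULDs -> conditionally compliant;
# any MUST unmet -> non-compliant.
def compliance_level(must_met, should_met):
    """must_met / should_met: iterables of booleans, one per requirement."""
    if not all(must_met):
        return "non-compliant"
    if all(should_met):
        return "unconditionally compliant"
    return "conditionally compliant"
```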
2.5. Test Configurations for SONET
The test device can be connected to the SUT in a variety of
configurations depending on the test point. The following