
Performance Benchmarking of Networking Devices for SIP

This proposal aims to standardize performance benchmarking metrics for SIP networking devices, addressing inconsistencies in vendor-reported metrics and providing clarity for the operational community. The benchmarks can be used by service providers, vendors, and others to compare device performance and make deployment decisions for SIP and 3GPP IMS networks.



Presentation Transcript


  1. Proposal for New Work Item: SIP Performance Benchmarking of Networking Devices
  • draft-poretsky-sip-bench-term-02.txt
  • draft-poretsky-sip-bench-meth-00.txt
  • Co-authors: Vijay Gurbani (Lucent Technologies), Carol Davids (IIT VoIP Lab), Scott Poretsky (Reef Point Systems)
  • 67th IETF Meeting – San Diego

  2. Motivation
  • Service providers are now planning VoIP and multimedia network deployments using the IETF-developed Session Initiation Protocol (SIP).
  • The mix of SIP signaling and media functions has produced inconsistencies in vendor-reported performance metrics and has caused confusion in the operational community. (Terminology)
  • SIP allows a wide range of configuration and operational conditions that can influence performance benchmark measurements. (Methodology)

  3. More Motivation
  • Service providers can use the benchmarks to compare the performance of RFC 3261 devices: server performance can be compared against other servers, SBCs, and servers paired with SIP-aware Firewall/NATs.
  • Vendors and others can use the benchmarks to ensure performance claims are based on common terminology and methodology.
  • Benchmark metrics can be applied to make device deployment decisions for IETF SIP and 3GPP IMS networks.

  4. Scope
  • The Terminology draft defines performance benchmark metrics for black-box measurements of SIP networking devices.
  • The Methodology draft provides standardized use cases that describe how to collect those metrics.

  5. Devices vs. Systems Under Test
  • DUT: MUST be an RFC 3261-compliant device; MAY include a SIP-aware Firewall/NAT and other functionality. BMWG does not standardize compliance testing.
  • SUT: an RFC 3261-compliant device with a separate, external SIP Firewall/NAT. Anticipates the need to test the performance of a SIP-aware functional element that is not itself defined in RFC 3261.

  6. Overview of Terminology Draft
  • The terminology document distinguishes between INVITE transactions and non-INVITE transactions.
  • The document thus addresses the fact that a SIP-enabled network provides services and applications other than PSTN-replacement services.
  • The equipment, as well as the network, needs to support the total load.
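The INVITE/non-INVITE distinction above can be sketched as a trivial classifier. The method names come from RFC 3261 and its extensions (RFC 3265 for SUBSCRIBE/NOTIFY, RFC 3428 for MESSAGE); the function itself is illustrative and not part of the drafts.

```python
def transaction_class(method: str) -> str:
    """Return 'INVITE' or 'non-INVITE' for a SIP request method.

    ACK is grouped with INVITE here because (for non-2xx responses)
    it belongs to the INVITE transaction it acknowledges.
    """
    if method.upper() in ("INVITE", "ACK"):
        return "INVITE"
    return "non-INVITE"

# REGISTER, SUBSCRIBE/NOTIFY (presence), and MESSAGE (IM) are all
# non-INVITE transactions, yet they still load the device and network.
for m in ("INVITE", "REGISTER", "MESSAGE", "NOTIFY", "BYE"):
    print(m, "->", transaction_class(m))
```

The point of the distinction for benchmarking: a device's capacity is consumed by both classes of transaction, so metrics limited to call setup alone understate the total load.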

  7. Benchmarks Defined
  The following benchmarks have been defined:
  • Registration rate
  • Session rate
  • Session capacity
  • Associated media sessions: establishment rate
  • Associated media sessions: setup delay
  • Associated media sessions: disconnect delay
  • Standing associated media sessions
  • IM rate
  • Presence rate
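As a minimal sketch of how a tester might compute two of these benchmarks from timestamped events: the `Attempt` record and the function names below are assumptions for illustration only; the drafts define the metrics themselves, not this code.

```python
from typing import Optional
from dataclasses import dataclass

@dataclass
class Attempt:
    sent: float                 # time request (INVITE/REGISTER) was sent, seconds
    completed: Optional[float]  # time the final 2xx arrived, or None on failure

def rate(attempts, duration_s: float) -> float:
    """Successful transactions per second over the test duration
    (e.g. registration rate or session rate)."""
    ok = sum(1 for a in attempts if a.completed is not None)
    return ok / duration_s

def mean_setup_delay(attempts) -> float:
    """Mean establishment delay over successful attempts only
    (e.g. associated media session setup delay)."""
    delays = [a.completed - a.sent for a in attempts if a.completed is not None]
    return sum(delays) / len(delays)

attempts = [Attempt(0.0, 0.05), Attempt(0.1, 0.18), Attempt(0.2, None)]
print(rate(attempts, duration_s=1.0))   # 2 successes in 1 s -> 2.0
print(mean_setup_delay(attempts))       # mean of 0.05 and 0.08, ~0.065
```

Failed attempts are excluded from the delay average but still consume device work, which is exactly the point slide 8 makes about network-device metrics.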

  8. Complements SIPPING Work Item
  • SIPPING: the Malas draft (-05) relates to end-to-end network metrics.
  • BMWG: the Poretsky et al. terminology draft (-02) and methodology draft (-00) relate to network-device metrics.
  • A network device performs work whether a registration succeeds or fails, whether or not an attempted session is created, whether or not a disconnect is successful, and whether or not a media session is created at all.
  • A SIP-enabled network also carries signaling traffic whether or not a media session is successfully created. For example, IM, presence, and subscription services in general require network resources as well as computing resources on the device.
  • For this reason, we think that many of the BMWG metrics complement the Malas draft and can also inform that document.

  9. Methodology
  • Two forms of test topology:
    – Basic SIP performance benchmarking topology with a single DUT and Tester
    – Optional SUT topology with a Firewall/NAT between DUT and Tester when media is present
  • Test considerations: selection of SIP transport protocol; associated media; session duration; attempted sessions per second.
  • Need to complete test cases; looking for more test cases.
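A tester in either topology has to emit well-formed SIP requests over the selected transport. The sketch below builds a minimal RFC 3261 REGISTER request, as a registration-rate test would; the user, domain, and IP values are made-up placeholders, and the helper name is hypothetical, not something defined in the drafts.

```python
import uuid

def build_register(user: str, domain: str, local_ip: str, cseq: int) -> bytes:
    """Build a minimal RFC 3261 REGISTER request (UDP transport assumed)."""
    branch = "z9hG4bK" + uuid.uuid4().hex  # RFC 3261 branch magic cookie
    lines = [
        f"REGISTER sip:{domain} SIP/2.0",
        f"Via: SIP/2.0/UDP {local_ip}:5060;branch={branch}",
        "Max-Forwards: 70",
        f"From: <sip:{user}@{domain}>;tag={uuid.uuid4().hex[:8]}",
        f"To: <sip:{user}@{domain}>",
        f"Call-ID: {uuid.uuid4().hex}@{local_ip}",
        f"CSeq: {cseq} REGISTER",
        f"Contact: <sip:{user}@{local_ip}:5060>",
        "Expires: 3600",
        "Content-Length: 0",
    ]
    # SIP uses CRLF line endings and a blank line before the (empty) body.
    return ("\r\n".join(lines) + "\r\n\r\n").encode()

msg = build_register("alice", "example.com", "192.0.2.1", 1)
print(msg.decode().splitlines()[0])  # REGISTER sip:example.com SIP/2.0
```

A load generator would send such requests at a configured attempted-sessions-per-second rate and time the 2xx responses, which connects this sketch back to the test considerations above.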

  10. Next Steps for Terminology and Methodology
  • Complete the methodology
  • Incorporate comments from the mailing list
  • Propose that BMWG make this a work item

  11. BACKUP – Relevance to BMWG
  -----Original Message-----
  From: Romascanu, Dan (Dan) [mailto:dromasca@avaya.com]
  Sent: Sunday, June 25, 2006 6:00 AM
  I believe that the scope of the 'SIP Performance Metrics' draft is within the scope of what BMWG has been doing for a while, quite successfully, some say. On a more 'philosophical' plane, there is nothing that says that IETF work must strictly deal with defining the bits in the Internet protocols - see http://www.ietf.org/internet-drafts/draft-hoffman-taobis-08.txt. And in any case, measuring how a protocol, or a device implementing a protocol, behaves can also be considered 'DIRECTLY related to protocol development'.

  -----Original Message-----
  From: nahum@watson.ibm.com [mailto:nahum@watson.ibm.com]
  Sent: Friday, May 26, 2006 2:51 PM
  SPEC wants to develop and distribute common code for benchmarking, as is done with SPECweb and SPECjAppServer. That code can and should use the standardized performance definitions agreed to by SIPPING and/or BMWG.

  12. BACKUP – Industry Collaboration
  • BMWG develops a standard to benchmark SIP networking-device performance.
  • SIPPING WG develops a standard to benchmark end-to-end SIP application performance.
  • SPEC to develop industry-available test code for SIP benchmarking in accordance with the IETF's BMWG and SIPPING standards.

  -----Original Message-----
  From: Poretsky, Scott
  Sent: Thursday, June 22, 2006 8:00 PM
  To: 'Malas, Daryl'; acmorton@att.com; gonzalo.camarillo@ericsson.com; mary.barnes@nortel.com
  Cc: vkg@lucent.com; Poretsky, Scott
  Subject: RE: (BMWG/SippingWG) SIP performance metrics
  Yes Daryl. I absolutely agree. The item posted to BMWG focuses on single-DUT benchmarking of SIP performance. Your work in SIPPING is focused on end-to-end application benchmarking. It would be great (and I would even say a requirement) that the terminologies for these two work items remain consistent with each other.
