IETF 77 – Anaheim, BMWG


Presentation Transcript


  1. SIP Performance Benchmarking
     draft-ietf-bmwg-sip-bench-term-01
     draft-ietf-bmwg-sip-bench-meth-01
     March 22, 2010
     Prof. Carol Davids, Illinois Inst. of Tech.
     Dr. Vijay Gurbani, ALU
     Scott Poretsky, Allot Communications

  2. Status
     • Working Group last call in progress
     • Needs reviewers and comments
     • IIT Master's candidates are working to implement the methodology on open-source SIP servers; results will be available soon

  3. Summary of the Contents: Terminology
     The SIP Benchmarking Terminology draft provides four sets of definitions:
     • Protocol Components – defines the signaling, media and control planes; sessions with and without associated media; INVITE-initiated and non-INVITE-initiated sessions
     • Test Components – defines the parts of the test agent
     • Test Setup Parameters – defines the Session Attempt Rate, Establishment Threshold Time, and other parameters that must be recorded before entering a test cycle
     • Benchmarks – defines the seven benchmarks listed on the next slide

  4. Benchmarks
     • 3.4.1. Registration Rate – Definition: The maximum number of registrations that can be successfully completed by the DUT/SUT in a given time period.
     • 3.4.2. Session Establishment Rate – Definition: The maximum average rate at which the DUT/SUT can successfully establish sessions.
     • 3.4.3. Session Capacity – Definition: The maximum number of Established Sessions that can exist simultaneously on the DUT/SUT until Session Attempt Failure occurs.
     • 3.4.4. Session Overload Capacity – Definition: The maximum number of Established Sessions that can exist simultaneously on the DUT/SUT until it stops responding to Session Attempts.
     • 3.4.5. Session Establishment Performance – Definition: The percentage of Session Attempts that become Established Sessions over the duration of a benchmarking test.
     • 3.4.6. Session Attempt Delay – Definition: The average time, measured at the Emulated Agent, for a Session Attempt to result in an Established Session.
     • 3.4.7. IM Rate – Definition: The maximum number of IM messages completed by the DUT/SUT.
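The ratio and delay benchmarks above reduce to simple arithmetic over per-attempt records collected at the Emulated Agent. Below is a minimal Python sketch; the AttemptRecord fields and function names are illustrative and not taken from the drafts.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class AttemptRecord:
        # One Session Attempt as observed at the Emulated Agent (illustrative fields).
        sent_at: float                    # time the INVITE was sent, in seconds
        established_at: Optional[float]   # time the 200 OK was received; None on failure

    def session_establishment_performance(records: List[AttemptRecord]) -> float:
        # Percentage of Session Attempts that became Established Sessions (cf. 3.4.5).
        established = sum(1 for r in records if r.established_at is not None)
        return 100.0 * established / len(records)

    def session_attempt_delay(records: List[AttemptRecord]) -> float:
        # Average time from Session Attempt to Established Session (cf. 3.4.6),
        # computed over successful attempts only.
        delays = [r.established_at - r.sent_at
                  for r in records if r.established_at is not None]
        return sum(delays) / len(delays)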

  5. Reporting Format
     Test Setup
     SIP Transport Protocol = ____________________
     Session Attempt Rate = _____________________
     IS Media Attempt Rate = ____________________
     Total Sessions Attempted = __________________
     Media Streams Per Session = ________________
     Associated Media Protocol = _________________
     Media Packet Size = ________________________
     Media Offered Load = _______________________
     Media Session Hold Time = __________________
     Establishment Threshold Time = _______________
     Loop Detecting Option = _____________________
     Forking Option = ___________________________
     Number of endpoints request sent to = ________
     Type of forking = __________________________
     Benchmarks for IS
     Session Capacity = __________________________
     Session Overload Capacity = __________________
     Session Establishment Rate = _________________
     Session Establishment Performance = __________
     Session Attempt Delay = _____________________
     Session Disconnect Delay = __________________
     Benchmarks for NS
     IM Rate = _______________________________
     Registration Rate = _________________________
     Re-registration Rate = ________________________
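One way for a test harness to keep the report complete is to mirror the blanks above in a single record that must be filled in before a run is accepted. A minimal sketch; the class and field names simply follow the slide and are not prescribed by the drafts.

    from dataclasses import dataclass

    @dataclass
    class TestSetupReport:
        # Test setup parameters recorded for every benchmark run (names follow the slide).
        sip_transport_protocol: str        # e.g. "UDP", "TCP", "SCTP", "TLS"
        session_attempt_rate: float        # session attempts per second
        is_media_attempt_rate: float
        total_sessions_attempted: int
        media_streams_per_session: int
        associated_media_protocol: str     # e.g. "RTP"
        media_packet_size: int             # bytes
        media_offered_load: float
        media_session_hold_time: float     # seconds
        establishment_threshold_time: float
        loop_detecting_option: bool
        forking_option: bool
        endpoints_request_sent_to: int     # "Number of endpoints request sent to"
        forking_type: str                  # e.g. "parallel" or "sequential"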

  6. Summary of Changes
     • Updated and clarified some definitions and test cases based on experience implementing them in the IIT VoIP Research Lab.

  7. Next Steps
     • Reviewers and comments, please!
     • Create -02 incorporating lessons learned in lab testing as well as reviewers' comments.

  8. BACKUP

  9. Scope – DUT/SUT Signaling
     • The DUT must be RFC 3261-capable network equipment. This is referred to as the "Signaling Server".
     • This may be a Registrar, Redirect Server, Stateless Proxy, or Stateful Proxy. A DUT MAY also include B2BUA, SBC, or P-CSCF functionality.
     • The DUT MAY be a multi-port SIP-to-switched-network gateway implemented as a SIP UAC or UAS.
     • The DUT or SUT MUST NOT be end-user equipment.
     • The DUT MAY have an internal SIP ALG, firewall, and/or NAT. This is referred to as the "SIP Aware Stateful Firewall".
     • The Tester acts as multiple "Emulated Agents" that initiate (or respond to) SIP messages as session endpoints and source (or receive) "Associated Media" for established connections.
     • The Terminology draft defines SIP Control Plane performance benchmarks for black-box measurements of the SIP signaling of networking devices.
     • Stress and debug scenarios are not addressed in this work item.
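The roles above can be captured as a small topology description that the Tester uses to drive a run. This is purely an illustrative sketch (the addresses are RFC 5737 documentation values), not something defined by the drafts.

    # Illustrative topology: the Tester hosts the Emulated Agents; the DUT is the
    # Signaling Server under test and is never end-user equipment.
    topology = {
        "dut": {
            "role": "Stateful Proxy",    # or Registrar, Redirect Server, Stateless Proxy,
                                         # B2BUA, SBC, P-CSCF, SIP-aware firewall, ...
            "address": "198.51.100.10",
            "sip_port": 5060,
        },
        "emulated_agents": [
            {"role": "calling UE", "address": "198.51.100.20"},
            {"role": "called UE",  "address": "198.51.100.21"},
        ],
    }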

  10. Scope – Signaling and Media
      • Control signaling is benchmarked.
      • Media performance is not benchmarked in this work item.
      • It is RECOMMENDED that control plane benchmarks be performed with media present, but this is optional.
      • SIP INVITE requests MUST always include an SDP body.
      • The type of DUT dictates whether the associated media streams traverse the DUT or SUT. Both scenarios are within the scope of this work item.
      [Figure: two test topologies showing the Calling UE (Tester), the DUT or SUT, and the Called UE (Tester), with Associated Media either traversing the DUT/SUT or flowing directly between the Testers.]
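Since every INVITE in these tests carries an SDP body, the sending side of an Emulated Agent can be sketched in a few lines of Python. The message below is only illustrative; the addresses, tag, branch, and Call-ID are placeholder values, and the methodology draft, not this sketch, defines the required behavior.

    import socket

    SDP_BODY = (
        "v=0\r\n"
        "o=ea 1 1 IN IP4 198.51.100.20\r\n"
        "s=bench\r\n"
        "c=IN IP4 198.51.100.20\r\n"
        "t=0 0\r\n"
        "m=audio 49170 RTP/AVP 0\r\n"
    )

    INVITE = (
        "INVITE sip:called@198.51.100.21 SIP/2.0\r\n"
        "Via: SIP/2.0/UDP 198.51.100.20:5060;branch=z9hG4bK-1\r\n"
        "Max-Forwards: 70\r\n"
        "From: <sip:caller@198.51.100.20>;tag=1\r\n"
        "To: <sip:called@198.51.100.21>\r\n"
        "Call-ID: bench-1@198.51.100.20\r\n"
        "CSeq: 1 INVITE\r\n"
        "Contact: <sip:caller@198.51.100.20:5060>\r\n"
        "Content-Type: application/sdp\r\n"
        f"Content-Length: {len(SDP_BODY)}\r\n"
        "\r\n"
    ) + SDP_BODY

    # Send one Session Attempt toward the DUT/SUT over UDP (placeholder address).
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(INVITE.encode(), ("198.51.100.10", 5060))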

  11. Session Terms
      • 3.1.6. Session Attempt – Definition: A SIP session for which the Emulated Agent has sent the SIP INVITE or SUBSCRIBE/NOTIFY and has not yet received a message response from the DUT/SUT.
      • 3.1.7. Established Session – Definition: A SIP session for which the Emulated Agent, acting as the UE/UA, has received a 200 OK message from the DUT/SUT.
      • 3.1.8. INVITE-initiated Session (IS) – Definition: A session that is created by an exchange of messages in the Signaling Plane, the first of which is a SIP INVITE request.
      • 3.1.9. Non-INVITE-initiated Session (NS) – Definition: A session that is created by an exchange of messages in the Signaling Plane that does not include an initial SIP INVITE message.
      • 3.1.10. Session Attempt Failure – Definition: A Session Attempt that does not result in an Established Session.
      • 3.1.11. Standing Sessions (newly added; the definition is to be changed as shown below) – Definition: A SIP session that is currently an Established Session.
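Taken together, the terms above amount to a small per-session state machine at the Emulated Agent. A minimal sketch, assuming a hypothetical classify() helper that sees the final status code (or none) for each attempt:

    from enum import Enum, auto
    from typing import Optional

    class SessionState(Enum):
        ATTEMPTED = auto()    # INVITE sent, no response from the DUT/SUT yet (Session Attempt)
        ESTABLISHED = auto()  # 200 OK received by the Emulated Agent (Established Session)
        FAILED = auto()       # attempt ended without establishment (Session Attempt Failure)

    def classify(final_status: Optional[int]) -> SessionState:
        # Classify one session from the final status code observed at the Emulated Agent.
        if final_status is None:
            return SessionState.ATTEMPTED
        if final_status == 200:
            return SessionState.ESTABLISHED
        return SessionState.FAILED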

  12. Scope – Scenarios
      • Session Establishment performance is benchmarked.
      • Both INVITE and non-INVITE scenarios (such as IM) are addressed.
      • Different transport mechanisms -- such as UDP, TCP, SCTP, or TLS -- may be used. The transport mechanism MUST be noted as a condition of the test, since the performance of SIP devices may vary accordingly.
      • Looping and forking options are also considered, since they impact processing at SIP proxies.
      • REGISTER and INVITE requests may be challenged or remain unchallenged for authentication purposes, as this may impact the performance benchmarks. Any observable performance degradation due to authentication is considered to be of interest to the SIP community.
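Because the transport MUST be recorded as a test condition, a simple harness can parameterize every benchmark run over the transports under test and key the results accordingly. An illustrative sketch, assuming a hypothetical run_once() callback:

    # SIP performance may vary with the transport, so results are keyed by it.
    TRANSPORTS = ("UDP", "TCP", "SCTP", "TLS")

    def run_benchmarks(run_once):
        # run_once(transport) -> dict of benchmark values (hypothetical callback).
        results = {}
        for transport in TRANSPORTS:
            results[transport] = {"transport": transport, **run_once(transport)}
        return results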

  13. Scope – Overload
      • SIP overload is considered within the scope of this work item.
      • Considerations on how to handle overload are deferred to work progressing in the SIPPING working group.
      • The normal response to an overload stimulus -- sending a 503 response -- is considered inadequate.
      • Vendors are free to implement their specific overload-control behavior as the expected test outcome if it is different from the IETF recommendations. However, such behavior MUST be documented and interpreted appropriately across multiple vendor implementations. This will make it more meaningful to compare the performance of different SIP overload implementations.
      • This draft now has a dependency on the strategy of the overload work in SIPPING.
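As a rough illustration of the distinction between Session Capacity and Session Overload Capacity, a Tester can keep adding sessions until attempts start to fail and then until the DUT/SUT stops responding altogether. A hedged sketch, assuming hypothetical attempt_session() and dut_responding() probes:

    def measure_capacities(attempt_session, dut_responding):
        # attempt_session() -> True if the session was established, False otherwise.
        # dut_responding() -> False once the DUT/SUT no longer answers Session Attempts.
        # Both probes are hypothetical; real tests follow the methodology draft.
        established = 0
        session_capacity = None
        while dut_responding():
            if attempt_session():
                established += 1
            elif session_capacity is None:
                # First Session Attempt Failure: record Session Capacity (cf. 3.4.3).
                session_capacity = established
        # Sessions established when the DUT/SUT stopped responding give the
        # Session Overload Capacity (cf. 3.4.4).
        return session_capacity, established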

  14. Out-of-Scope Scenarios
      • SIP control performance benchmarking is the focus of this work item. Media performance is not benchmarked.
      • Stress and steady-state benchmarking is not considered in scope. This could be covered in an appendix if preferred.
      • Re-INVITE requests are not considered in scope.
      • Benchmarking SIP presence is not considered in scope.
      • IMS-specific scenarios are not considered, but the test cases can be applied with 3GPP-specific SIP signaling and the P-CSCF as a DUT.
      • Session disconnect is not considered in scope:
        – Only session establishment is considered for the performance benchmarks.
        – Disconnect is a lightweight transaction that releases resources for steady state.
        – It has no performance benchmark because it is dependent on INVITE.
        – Posted on SIPPING for feedback.
