Experiences in Automating the Testing of SS7 Signaling Transfer Points

Presentation Transcript


  1. Experiences in Automating the Testing of SS7 Signaling Transfer Points Tim Moors, Malathi Veeraraghavan, Zhifeng Tao, Xuan Zheng, Ramesh Badri Polytechnic University Brooklyn, NY, USA mv@poly.edu http://kunene.poly.edu/~mv ISSTA 2002 July 22-24, 2002 Via di Ripetta, Rome - Italy This project is co-sponsored by Verizon, NYSTAR and CATT.

  2. Outline • Background • Problem statement • Solution approach • Implementation — Automated SS7 Test Results Analyzer (ASTRA) • Summary

  3. Telephony & SS7 network [Figure: a quad of STPs carrying SS7 Setup messages between telephony switches that are interconnected by voice trunks, with databases (e.g., 800) attached to the SS7 network.] • A simple telephony network with two switches and two telephones: a regular phone call is placed (e.g., 1-718-260-8888 to 1-718-260-0000). • An 800 call is placed (1-800-888-8888): the database is consulted for a routing number. • The SS7 network is also used for mobility management. • The STP (Signaling Transfer Point) is a datagram router, the packet switch of the SS7 (Signaling System No. 7) network.

  4. Outline • Background • Problem statement • Solution approach • Implementation — Automated SS7 Test Results Analyzer (ASTRA) • Summary

  5. Problem statement • Automate the analysis of test results generated by interoperability tests of STPs • Test result files are too large for error-free manual analysis • Tests need to be run for every upgrade (multiple vendors supply STPs to a service provider)

  6. What does an STP implement? • An STP is the packet switch of an SS7 network • Network layer: datagram forwarding; a routing protocol; handling of failures (topology); handling of congestion (loading) • Data-link layer: complex support is needed because of the use of link-by-link error control (referred to as signaling traffic management)

  7. System under Test (SUT): STP quad interoperability testing [Figure: the quad under test (STP_A, STP_B, STP_C, STP_D) interconnected by C links (C1, C2) and D links (D1-D4), with emulated systems attached over A links: SSPs (SSP_E1-SSP_E6, SSP_W31-SSP_W36) and SCPs (SCP_E1, SCP_E2, SCP_W31, SCP_W32); monitored message exchanges include COO, COA, TFR, TFP, RSR and RSP.] • Complex system: many interacting EFSMs • Distributed: many Points of Control and Observation (PCOs), which creates a synchronization problem • "Embedded testing" or "Testing in Context": the quad • Testing real-time properties: timed events • Non-determinism: many output sequences are possible

  8. Complexity of STP testing • 30,000 test events in the test results files • Messages: routing updates, congestion notifications, traffic retrieval • Traffic load: the SS7 network is engineered for high reliability; links carry half-load under normal conditions • Monitoring: high speed (3 types of monitors, 252 monitored locations) and low speed (3 types of monitors, 36 monitored locations) • Types of tests: failure (sequential failures, simultaneous failures, STP pair isolation) and congestion • Number of tests: high-speed links, 22 test cases; low-speed links, 12 test cases, 22 test parts, 146 steps

  9. Complete problem: automate all steps of STP interoperability testing, namely Test Environment Set Up, Test Execution, Test Result Retrieval and Collection, and Test Results Analysis. ASTRA (Automated SS7 Test Results Analyzer) automates the last of these steps, test results analysis.
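
As a hedged illustration of what end-to-end automation of this pipeline could look like, here is a minimal Perl driver sketch; the stage subroutines are hypothetical placeholders for the four steps named above and are not part of ASTRA, which automates only the final, results-analysis step.

    use strict;
    use warnings;

    # Hypothetical placeholders, one per stage of the pipeline above.
    sub set_up_test_environment { print "configure the STP quad, monitors and emulated SSPs/SCPs\n" }
    sub execute_test            { print "apply the test actions (fail_link, set_load, ...)\n" }
    sub collect_test_results    { print "retrieve the monitor captures and the test action record\n" }
    sub analyze_test_results    { print "run ASTRA on the collected results\n" }

    # Drive one test case end to end; a real campaign would loop over all test cases.
    set_up_test_environment();
    execute_test();
    collect_test_results();
    analyze_test_results();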

  10. Outline • Background • Problem statement • Solution approach • Implementation — Automated SS7 Test Results Analyzer (ASTRA) • Summary

  11. Generic methodology • The set of actual events (test results) captured by monitor m in step s of test t is A_{t,s,m} • We create the set of expected behavior (EB) X_{t,s,m} at monitor m in step s of test t • Compare the actual captured events A_{t,s,m} with the created EB X_{t,s,m} to verify: messages and their associated parameters, traffic load, and timer values
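
Setting aside time windows and unmonitored links for a moment, the comparison can be summarized in set notation (a simplification of the analyzer's actual matching, which is described later):

    \[
    \mathrm{Matched}_{t,s,m} \approx A_{t,s,m} \cap X_{t,s,m}, \qquad
    \mathrm{Missing}_{t,s,m} \approx X_{t,s,m} \setminus A_{t,s,m}, \qquad
    \mathrm{Unexpected}_{t,s,m} \approx A_{t,s,m} \setminus X_{t,s,m}
    \]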

  12. Expected behavior at multiple PCOs (monitors) • Expected behavior is represented as a "program" rather than as a database because we: • cannot predict the exact sequence of events, so we use a tree structure • cannot predict the exact times of occurrence of events, so we use the times of test actions to bound the searches for events • cannot predict the exact number of occurrences of some events, so we use a while loop

  13. Expected Behavior Expression Language (EBEL) • Basic language components: • findmsg() function, which indicates the monitor • assert_load() function • foreach() and while() loops • waitfor() pseudo-event • Test actions: fail_link(), restore_link(), set_load, pause

  14. EB program example (the slide callouts mark a timed event and the findmsg() argument that specifies the PCO/monitor):
     1. foreach $s (@SSPS_E) {
     2.   $ta_fail{$s}: fail_link(A(STP_A)($s)) causes 4,...
     3.   ...
     4.   waitfor(T11,STP_A) causes 5,9,...
     5.   foreach $t (@set_1) {
     6.     findmsg(A(STP_A)($t),$t,STP_A,$t,TFR,$s)
     7.   }
     8.   ...
     9.   findmsg(D(STP_A)(STP_D),STP_D,STP_A,STP_D,TFR,$s) causes 10
    10.   while($time<$ta_restore{$s}) {
    11.     waitfor(T10,STP_D) causes 12
    12.     findmsg(D(STP_A)(STP_D),STP_A,STP_D,STP_A,RSR,$s)
    13.   }
    14.   ...
    15. foreach $s (@SSPS_E) {
    16.   $ta_restore{$s}: restore_link(A(STP_A)($s)) causes ...

  15. EB as a tree • Abstract expected events form a tree of causation. [Figure: a tree rooted at event 2, whose children are 4 and 9; node 4 leads to 5, which expands into 6a, 6b and 6c (one per element of @set_1); node 9 leads to 10.] • The abstract event tree is represented as a series of nested if statements:
    if (2) {
      if (4) {
        foreach $t (@set_1) {  /* 5 */
          6$t
        }
      }
      if (9) {
        10
      }
    }
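
To make the tree-of-causation idea concrete, here is a minimal Perl sketch; the %causes structure and the dfs() routine are illustrative assumptions, not ASTRA internals. Each abstract event records the events it causes, and a depth-first walk considers an event's consequences only after the event itself, matching the "convert to depth-first search" step on the architecture slide that follows.

    use strict;
    use warnings;

    # The causal tree from the slide: 2 causes 4 and 9; 4 causes 5, which
    # expands into 6a, 6b and 6c (one per element of @set_1); 9 causes 10.
    my %causes = (
        '2'  => ['4', '9'],
        '4'  => ['5'],
        '5'  => ['6a', '6b', '6c'],
        '9'  => ['10'],
        '6a' => [], '6b' => [], '6c' => [], '10' => [],
    );

    # Depth-first walk: an event's consequences are examined only after the event.
    sub dfs {
        my ($event) = @_;
        print "visit $event\n";             # ASTRA would search for this event here
        dfs($_) for @{ $causes{$event} };
    }

    dfs('2');                               # visits 2, 4, 5, 6a, 6b, 6c, 9, 10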

  16. Outline • Background • Problem statement • Solution approach • Implementation — Automated SS7 Test Results Analyzer (ASTRA) • Summary

  17. Architecture of ASTRA [Block diagram: the test data, the Test Action Record (TAR), network configuration files and timing files feed a Data Formatting program, which produces the Actual Event Record; the Expected Behavior is fed through a Translator; the Analyzer compares the two and produces the Parameter Observation List, the Hidden/Missing/Unexpected event lists, and the Matched Event Record (legend: data vs. program blocks).] • Translator: translate the EBEL code into a Perl program; add time values (consulting the TAR); convert to a depth-first search • Data Formatting: adjust for variations in format across the different test data; synchronize all events; filter the database to remove irrelevant information; sort the events in the database into chronological order
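
A minimal sketch of the Data Formatting steps listed above, under assumed inputs: the record layout (timestamp, monitored link, message type), the per-monitor clock offsets and the filtering of link fill-in units are illustrative, not ASTRA's actual formats.

    use strict;
    use warnings;

    # Assumed raw monitor records: [ timestamp, monitored link, message type ].
    my @raw = (
        [ 12.40, 'A(STP_A)(SSP_E1)', 'COO'  ],
        [ 12.10, 'D(STP_A)(STP_D)',  'TFR'  ],
        [ 12.90, 'A(STP_A)(SSP_E1)', 'FISU' ],   # link filler, irrelevant to the analysis
    );

    # Assumed per-monitor clock offsets, used to synchronize all events.
    my %clock_offset = ( 'A(STP_A)(SSP_E1)' => 0.05 );

    my @actual_event_record =
        sort { $a->[0] <=> $b->[0] }                        # sort into chronological order
        grep { $_->[2] ne 'FISU' }                          # filter out irrelevant information
        map  { [ $_->[0] + ($clock_offset{ $_->[1] } || 0), # synchronize the timestamps
                 $_->[1], $_->[2] ] }
        @raw;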

  18. Analyzer: matching operation. For each expected event (link, parameters, time window [t1,t2]) taken from the EB: • Was the link monitored? If not, write the event to the Hidden Event List. • Otherwise, search the Actual Event Record for the event within the time period. • If the event is found, set its Match Flag to 1 and record the match in the Matched Event Record; if not, write the event to the Missing Event List. • Finally, all actual messages still having Match Flag = 0 are written to the Unexpected Event List.
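
As a rough Perl sketch of this matching loop (an illustration of the flow above, not ASTRA's code), assume each expected event carries a link, a message type and a window [t1,t2], and each actual event carries a link, a message type, a timestamp and a match flag:

    use strict;
    use warnings;

    # Assumed shapes: expected = { link, msg, t1, t2 }; actual = { link, msg, time, match_flag }.
    sub matches {
        my ($exp, $act) = @_;
        return $exp->{link} eq $act->{link}
            && $exp->{msg}  eq $act->{msg}
            && $act->{time} >= $exp->{t1}
            && $act->{time} <= $exp->{t2};
    }

    sub analyze {
        my ($expected, $actual, $monitored) = @_;            # array refs + set of monitored links
        my (@hidden, @missing, @matched);
        for my $exp (@$expected) {
            if (!$monitored->{ $exp->{link} }) {             # was the link monitored?
                push @hidden, $exp;                          # Hidden Event List
                next;
            }
            my ($hit) = grep { !$_->{match_flag} && matches($exp, $_) } @$actual;
            if ($hit) {
                $hit->{match_flag} = 1;                      # set Match Flag to 1
                push @matched, [ $exp, $hit ];               # Matched Event Record
            } else {
                push @missing, $exp;                         # Missing Event List
            }
        }
        my @unexpected = grep { !$_->{match_flag} } @$actual;  # Match Flag still 0
        return ( \@hidden, \@missing, \@unexpected, \@matched );
    }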

  19. Architecture of ASTRA (continued) [Same block diagram as slide 17, with a Statistical Analysis & Report Generation program that consumes the Analyzer outputs (Parameter Observation List; Hidden, Missing and Unexpected event lists; Matched Event Record) and feeds a Test Results Database.] Statistical Analysis & Report Generation: • calculate the numbers of expected, missing, hidden and unexpected events • calculate timer observations • calculate load measurements • declare failure/pass/inconclusiveness of the tests
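
Continuing the previous sketch, a small hedged example of the report-generation step: the counts follow directly from the analyzer's lists, but the pass/fail/inconclusive rule shown (any missing or unexpected event fails the test, hidden events alone leave it inconclusive) is an assumption for illustration, not ASTRA's actual criterion.

    # Assumed verdict rule, applied to the lists returned by analyze() above.
    sub report {
        my ($hidden, $missing, $unexpected, $matched) = @_;
        printf "matched events:                   %d\n", scalar @$matched;
        printf "missing (expected, not found):    %d\n", scalar @$missing;
        printf "hidden (expected, unmonitored):   %d\n", scalar @$hidden;
        printf "unexpected (found, not expected): %d\n", scalar @$unexpected;
        return 'fail'         if @$missing || @$unexpected;   # assumed criterion
        return 'inconclusive' if @$hidden;                     # assumed criterion
        return 'pass';
    }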

  20. Outline • Background • Problem statement • Solution approach • Implementation — Automated SS7 Test Results Analyzer (ASTRA) • Summary

  21. Productivity of ASTRA • Size of the EB code: 5,909 lines • Number of events in the collected test results: 29,196 [Figure: roughly 30,000 events of test data, processed by ASTRA driven by roughly 6,000 lines of EB code, yield a produced database of roughly 3,000 events.]

  22. Summary • Challenging problem • Non-determinism (e.g., simultaneous link restoral): if A restores before B, a different type of message is sent than if B restores before A; if both options are not listed with an "OR" in the EB, the result is a spurious "unexpected" or "missing" event (a possible "OR" construct is sketched below) • Solutions (?) • Cover all possible scenarios by automatically creating the EB directly from an SDL specification: state-explosion problem • Allow the EB program to consult the test results to make further predictions with an "IF" statement; this approach trusts the test results. We also found the need for a "NOT" statement. • Asides: • Pay attention to monitors: "the more the better", and they do miss messages! • Automation of all steps of testing is needed, not just results analysis • Found 3 implementation bugs in the STPs tested!
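
To illustrate what an "OR" over alternative orderings might look like, here is a small hedged Perl extension of the matching sketch above; the any_of() helper is hypothetical (not an existing EBEL construct), and the message names in the example are placeholders. It accepts whichever listed alternative actually appears, so a simultaneous restoral no longer yields a spurious missing/unexpected pair.

    # Hypothetical helper: succeed if any one of the alternative expected events matches.
    # Reuses matches() from the matching sketch above.
    sub any_of {
        my ($actual, @alternatives) = @_;
        for my $exp (@alternatives) {
            my ($hit) = grep { !$_->{match_flag} && matches($exp, $_) } @$actual;
            if ($hit) { $hit->{match_flag} = 1; return $exp; }
        }
        return undef;                       # none observed: report a single missing event
    }

    # Example: after a "simultaneous" restoral, either message type is acceptable.
    my $t      = 100;                       # time of the restore_link test action (illustrative)
    my $actual = [ { link => 'D(STP_A)(STP_D)', msg => 'TFA', time => 101.2, match_flag => 0 } ];
    my $accepted = any_of($actual,
        { link => 'D(STP_A)(STP_D)', msg => 'TFA', t1 => $t, t2 => $t + 5 },
        { link => 'D(STP_A)(STP_D)', msg => 'TFR', t1 => $t, t2 => $t + 5 },
    );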
