
Automated Testing for Mobility Management Entity of Long Term Evolution System



Presentation Transcript


  1. Automated Testing for Mobility Management Entity of Long Term Evolution System 9/11/2014 Xi Chen

  2. Acknowledgement Supervisor: Prof. Jyri Hämäläinen Instructor: M.Sc. Risto Nissinen (Nokia Siemens Networks Oy)

  3. Outline • Background • Test System Overview • ATCA MME • Tektronix G35 • Robot Framework & Test Suite • Agile Methodology • Test Suite & Test Cases • Results

  4. Background The work was done at Nokia Siemens Networks Oy in Espoo. It is a study of test automation for the Mobility Management Entity (MME) of the LTE core network using the Tektronix G35 tester. The purpose is to track the performance of the ATCA hardware platform by implementing a test suite that collects MME counter statistics.

  5. High-level Overview of Test System [Diagram] Robot Framework drives both the ATCA (MME) and the Tektronix G35 through their frameworks; G35 sends network traffic to the MME, and performance data flows from both back to Robot Framework.

  6. ATCA hardware Advanced Telecom Computing Architecture (ATCA) • Open standard specification • 16 slots for computer units • 322.25 mm high, 280 mm deep

  7. ATCA hardware Modularity, scalability and flexibility • Standardized rack size & power supply • Service providers get smaller equipment & significant energy savings

  8. ATCA based MME Five key functional units: • Control Plane Processing Unit (CPPU) • Signaling & Mobility Management Unit (SMMU) • IP Director Unit (IPDU) • Marker & Charging Unit (MCHU) • Operation & Maintenance Unit (OMU)

  9. ATCA based MME CPPU: transaction-based mobility management SMMU: stores information about visiting subscribers in the visiting subscriber database (Home Subscriber Server) IPDU: load balancing & connectivity MCHU: offers the statistics function OMU: handles operation & maintenance functions

  10. Tektronix G35 – Traffic Procedure G35 generates network traffic. Define the traffic profile in G35: • Start scenario: initialization (e.g. set up eNBs) • Call scenario: periodically attach & detach subscribers • Stop scenario: do nothing in our work

  11. Tektronix G35 remote control The test suite is developed on a remote client workstation. It controls G35 remotely by invoking operations that G35 exposes through a single interface, e.g. to configure G35, start traffic, and stop traffic.
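A minimal sketch of what such remote control could look like from the client workstation, assuming an RPC-style endpoint. The actual interface and operation names exposed by G35 are not given in the slides, so the host, port and method names below are illustrative assumptions:

```python
# Sketch of the remote-control idea, NOT the real G35 API.
# The XML-RPC endpoint and all method names are illustrative assumptions.
import xmlrpc.client

g35 = xmlrpc.client.ServerProxy("http://g35-host:8000")  # hypothetical endpoint

g35.load_traffic_profile("lte_attach_detach.cfg")  # configure G35 remotely
g35.start_traffic()                                # run the call scenario
# ... test steps run here ...
g35.stop_traffic()
```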

  12. Robot framework Generic test automation framework • Open source software • Implemented in Python • Can be extended with Python, Java or other languages

  13. Architecture of a Robot Framework based test Robot Framework interacts with the System Under Test through a Test Library.

  14. Robot framework test case • Tabular syntax • Constructed with keywords • Keyword types: • Built-in keywords • Keywords imported from a test library • User keywords

  15. Robot framework test case One row is one step (executed row by row). Test cases do not need to know what happens underneath the keywords. Each keyword is a function call that accepts arguments, e.g. the user keyword Add Two Numbers.
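As an illustration of the keyword-to-function mapping, a minimal Python test library for an Add Two Numbers keyword could look as follows; the library and file names are chosen for the example, while Should Be Equal As Integers is a standard built-in keyword:

```python
# AddLibrary.py - a minimal Robot Framework test library (illustrative).
# Robot Framework maps the keyword name "Add Two Numbers"
# to the method add_two_numbers().

class AddLibrary:
    def add_two_numbers(self, a, b):
        """Return the sum of two numbers given as strings in the test data."""
        return int(a) + int(b)

# Usage in a test case (tabular syntax, one row per step):
#
#   *** Settings ***
#   Library    AddLibrary.py
#
#   *** Test Cases ***
#   Addition Works
#       ${sum}=    Add Two Numbers    2    3
#       Should Be Equal As Integers    ${sum}    5
```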

  16. Robot framework test suite All test cases are encapsulated in one test suite. A test suite has Setup & Teardown phases: • Setup: initialization actions (e.g. configure G35) • Teardown: final actions • This keeps the real tests focused in between. Each test case can also have its own Setup & Teardown.
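A sketch of how suite-level Setup & Teardown could be declared, with hypothetical keyword names standing in for the real configuration steps used in the work:

```python
# SuiteLibrary.py - illustrative setup/teardown keywords; the actual
# configuration steps from the thesis are not shown, so the bodies
# are placeholders.

class SuiteLibrary:
    def configure_g35(self):
        # e.g. connect to G35 and load the traffic profile (hypothetical)
        print("G35 configured")

    def clean_up_environment(self):
        # e.g. close connections, archive result files (hypothetical)
        print("environment cleaned")

# In the suite file, Setup & Teardown run once around all test cases:
#
#   *** Settings ***
#   Library           SuiteLibrary.py
#   Suite Setup       Configure G35
#   Suite Teardown    Clean Up Environment
```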

  17. Agile methodology - Scrum Aims at flexibility, adaptability & productivity • Assumes that requirements and schedules will change during development The development cycle is a sprint • One sprint = e.g. 2 weeks • Daily Scrum • Planning meeting • Review meeting

  18. Test suite – overview Three main phases: • Initialization: organize the directory structure for test results, configure the G35 working environment, configure the SSH feature on ATCA, set the traffic profile on G35 • Statistics Collection (ATCA & Tektronix): details are explained on the following slides • Results Generation

  19. Test cases G35 & MME preparation → Initial statistics collection → Start the traffic → Periodically collect statistics (G35 & ATCA) over the traffic period → Stop the traffic → Wait & collect statistics (G35 & ATCA) over the plus period → Final step

  20. Test suite – statistics collection The essential part of the test suite is counter statistics collection for both G35 & ATCA. Counter statistics collection for G35: • A Python function on the remote client gets the list of counters defined in G35 & accesses their values Counter statistics collection for ATCA: • The test case establishes an SSH connection to the computer units on ATCA • Sends Man Machine Language (MML) commands to get the counter values
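A sketch of the ATCA-side collection using the paramiko SSH library; the host name, credentials and the MML command string are placeholders, since the real ones are not given in the slides:

```python
# Sketch of collecting ATCA counter printouts over SSH.
# Host, credentials and the MML command are illustrative assumptions.
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("atca-omu.example", username="tester", password="secret")

# Send an MML command and read back the counter printout.
stdin, stdout, stderr = client.exec_command("ZXXX:COUNTERS;")  # hypothetical MML
printout = stdout.read().decode()
client.close()

print(printout)  # raw counter values, parsed and recorded in later steps
```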

  21. Test suite – ATCA statistics collection Computer units on ATCA: • OMU: Operation & Maintenance Unit • CPPU: Control Plane Processing Unit • MCHU: Marker & Charging Unit • SMMU: Signaling & Mobility Management Unit The test case establishes a telnet connection to ATCA → starts a remote session to the computer unit → sends MML commands
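The telnet flow above could look roughly like this in Python (telnetlib was the standard-library client at the time); the login prompts, the session-opening command and the MML counter query are illustrative assumptions:

```python
# Sketch of the telnet -> remote session -> MML flow described above.
# Prompts, credentials and commands are illustrative assumptions.
import telnetlib

tn = telnetlib.Telnet("atca.example")      # connect to ATCA
tn.read_until(b"USERNAME:")                # hypothetical login prompt
tn.write(b"tester\n")
tn.read_until(b"PASSWORD:")
tn.write(b"secret\n")

tn.write(b"ZDDS:CPPU,0;\n")                # hypothetical: open session to a unit
tn.write(b"ZXXX:COUNTERS;\n")              # hypothetical MML counter query
output = tn.read_until(b";", timeout=10).decode()
tn.close()
```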

  22. Test suite – ATCA CPU load tracking Track the CPU loads of all computer units on ATCA to make sure they stay at an acceptable level. The CPU loads are tracked simultaneously by running a Python script that uses multithreading.
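A minimal sketch of the multithreaded tracking idea, with a placeholder get_cpu_load() standing in for the real MML-based query:

```python
# Sketch: sample all units' CPU loads in parallel with one thread per unit.
import threading

UNITS = ["OMU", "CPPU", "MCHU", "SMMU"]

def get_cpu_load(unit):
    # Placeholder for the real query (e.g. an MML command over SSH/telnet).
    return 42.0

def track(unit, samples):
    samples[unit] = get_cpu_load(unit)

samples = {}
threads = [threading.Thread(target=track, args=(u, samples)) for u in UNITS]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(samples)  # e.g. {'OMU': 42.0, 'CPPU': 42.0, ...}
```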

  23. Test suite – statistics recording & graph generation Step 1: data extraction – plain text → list of strings Step 2: string formatting – each string in the list → "Start Time Period, Counter Number, Counter Name, Counter Value" → one record Step 3: recording – records are written to a *.dat file Step 4: graph generation – Gnuplot generates statistics graphs from the DAT file
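The four steps could be sketched in Python as follows; the record layout follows the format above, while the sample input, counter name and Gnuplot script name are invented for the example:

```python
# Sketch of steps 1-4: parse raw text into records, write a .dat file,
# and call Gnuplot. Sample data and file names are assumptions.
import subprocess

raw = ("2014-09-11 10:00, 1, ATTACH_REQ, 1500\n"
       "2014-09-11 10:01, 1, ATTACH_REQ, 1498\n")

records = []
for line in raw.splitlines():                      # step 1: plain text -> strings
    start, number, name, value = [f.strip() for f in line.split(",")]
    records.append((start, number, name, value))   # step 2: one record per string

with open("counters.dat", "w") as out:             # step 3: write records to *.dat
    for rec in records:
        out.write(", ".join(rec) + "\n")           # comma-separated record format

# step 4: generate a graph from the DAT file (script name assumed)
subprocess.call(["gnuplot", "plot_counters.gp"])
```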

  24. Time domain – user inputs for a test run Three user input values for the test suite run: traffic period, plus period & record duration. Traffic period: the total time spent on periodic counter collection after the traffic is started. Plus period: the time spent on counter collection after the traffic is stopped. Record duration: the length of each round of counter collection.
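How the three inputs relate, as a small sketch (the variable names and sample values are assumptions): the traffic period is divided into rounds of one record duration each:

```python
# Relation between the three user inputs (illustrative values).
traffic_period  = 600  # s, periodic collection while traffic runs
plus_period     = 120  # s, collection after traffic is stopped
record_duration = 60   # s, length of one collection round

nr_of_rounds = traffic_period // record_duration
print(nr_of_rounds)  # -> 10 rounds of collection during the traffic period
```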

  25. Time domain - all test steps [Timeline figure] G35 & MME preparation (~22 s) → initial statistics collection (~44 s) → start traffic (0 s) → periodic statistics collection of the traffic period (~nrOfRound × (44 s + wait); the round duration is user defined) → stop traffic (0 s) → last statistics collection of the plus period (~plusPeriod + 44 s) → final step, stop measuring (~22 s)

  26. Time domain - counters collection part [Timeline figure] The traffic period is divided into rounds of one duration each, and the plus period (user defined) follows the stop. Within one round: counters collection → statistics recording (data parsing, organization & writing to files) → wait until the duration is over.
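A sketch of one such round in Python, with placeholder collection and recording functions; the time-based wait pads each round out to exactly one record duration:

```python
# Sketch of the per-round loop: collect, record, then wait out the
# remainder of the user-defined record duration.
import time

def collect_counters():
    pass  # placeholder: G35 + ATCA counter collection

def record_statistics():
    pass  # placeholder: parse, organize & write to files

def run_rounds(nr_of_rounds, record_duration):
    for _ in range(nr_of_rounds):
        start = time.time()
        collect_counters()
        record_statistics()
        elapsed = time.time() - start
        if elapsed < record_duration:
            time.sleep(record_duration - elapsed)  # wait until duration is over
```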

  27. Results directory

  28. Result graph – ATCA counters

  29. Result graph – G35 counters

  30. Result graph – CPU loads of CPPU

  31. Results of time measurements • G35 & MME preparation: ~22 s total (prepare the G35 counters ~2 s; set the timer for counter measurement from MME memory ~20 s) • Start the traffic: 0 s • One round of statistics collection (G35 & ATCA): ~44 s total (Tek: ~15 s (SE), <2 s (classic); TOPTEN: ~22 s; CPPU: ~1 s; SMMU: ~0 s; MME: ~0 s; organization & writing files: ~6 s) • Stop the traffic: 0 s • Wait & collect statistics (G35 & ATCA) of the plus period: ~plusPeriod + 44 s • Final step, stop measuring counters from MME memory: ~20 s

  32. Conclusion Automated testing for the MME is found to be very important. It provides an efficient way to get a clear picture of MME performance, and it is helpful for improving the quality of the MME.

  33. Thank you! & Questions?
