
Online Music Store MSE Project Presentation III Presented by: Reshma Sawant Major Professor: Dr. Daniel Andresen




Presentation Transcript


  1. Online Music Store: MSE Project Presentation III • Presented by: Reshma Sawant • Major Professor: Dr. Daniel Andresen • 03/11/08

  2. Phase III Presentation Outline • Project Overview • Brief Review of Phases • Action Items from Phase II • Implementation/Demo • Assessment Evaluation • Project Evaluation • Lessons Learned

  3. Project Overview The objective of this project is to design and develop an Online Music Store. • Target: Public Users • Product: Media for Music • User Types: User, Administrator • Functionalities for Users: Browsing, searching, buying products, getting song recommendations, managing personal account • Functionalities for Administrator: Manage Catalog Details, Manage Orders, Manage Shopping Cart

  4. Review of Phases • Phase I: • Requirement Specifications • Phase II: • Designed Web Pages • Created Test Plan • Phase III (Current): • Coding • Testing and Analysis

  5. Action Items from Phase II • Correct multiplicities in Class Diagram • Multiplicity between ShoppingCart Class and CartItem Class should be 1..*

  6. Class Diagram

  7. Action Items from Phase II 2) Revise SLOC count and Project Duration • Included in Project Evaluation

  8. Implementation & Demo • Technologies Used: • IDE – Microsoft Visual Studio 2005 • Technology - ASP.NET 2.0 • Language – C# • Database – SQL Server 2005

  9. Assessment Evaluation • Manual Testing – used to verify the correctness of the various parts of the code

  10. Assessment Evaluation • E.g. Register Web Page for User • E.g. Edit Shopping Cart

  11. Assessment Evaluation • Performance Testing • Goals: • Determine the load the system can handle, in terms of concurrent users and requests • Determine the Response Time – the time from when a request for a Web Page is initiated until the page is completely displayed in the user’s browser • Tool Used – JMeter (http://jakarta.apache.org) • Inputs to JMeter: • Number of Users • Ramp-up Period – the time (sec) taken to start the full number of users chosen • Loop Count – how many times to repeat the test • E.g. Users = 10, Loop Count = 20, Ramp-up Period = 5 sec => 10 users are started over 5 sec, with total requests = 200 (10 × 20)
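The slide's example arithmetic can be sketched directly. The formulas below follow the standard JMeter definitions of total requests and ramp-up rate; the function names are my own, not part of the project:

```python
# JMeter load arithmetic from the slide's example (10 users, loop count 20,
# ramp-up period 5 s). Function names are mine; the formulas follow the
# standard JMeter definitions.

def total_requests(users: int, loop_count: int) -> int:
    """Each user runs the test plan loop_count times."""
    return users * loop_count

def users_started_per_sec(users: int, ramp_up_sec: float) -> float:
    """Rate at which JMeter starts users during the ramp-up period."""
    return users / ramp_up_sec

print(total_requests(10, 20))        # 200 total requests, as on the slide
print(users_started_per_sec(10, 5))  # 2.0 users started per second
```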

  12. Assessment Evaluation – Performance Testing Factors • Load Type • Peak Load – the maximum number of users and requests, loaded in a short duration (e.g. 5 sec) • Sustained Load – the maximum number of users and requests, loaded over a longer period (e.g. 5 min) • Connection • Wireless Connection at 54.0 Mbps • LAN Connection at 100.0 Mbps • Web Pages Tested • HTML Page (Login Web Page) • Database-Intensive Page (Home Page) • Business Logic Page (Shopping Cart Page)

  13. Assessment Evaluation – Performance Testing Environment Set-up • Machine Configuration • Operating System – Windows XP Professional • Memory – 1 GB RAM • Hard Disk – 100 GB • Processor – Intel Pentium M, 1.7 GHz

  14. Assessment Evaluation – Home Page [http://localhost:2416/CDShop/Default.aspx] • Peak Load at Wireless (54 Mbps) vs. LAN Connection (100 Mbps) • Note • Loop Count constant at 20,000 • Ramp-up Period of 5 sec • Users – 200, 600, 800, 1000 • Observations • Response Time increases linearly with the number of users for both Wireless and LAN • Maximum number of users handled by the system before it becomes saturated = 1000 • Response Time is lower for LAN because of its higher bandwidth

  15. Assessment Evaluation – Home Page [http://localhost:2416/CDShop/Default.aspx] • Constant Users vs. Constant Loop Count for the Wireless Connection • Chart 1: Users constant at 200, Loop Count increased up to 20,000 • Chart 2: Loop Count constant at 20,000, Users – 200, 600, 800, 1000

  16. Assessment Evaluation – Home Page [http://localhost:2416/CDShop/Default.aspx] • Observations • Response Time increases rapidly with the number of users, but only slightly when the users are kept constant and only the loop count is increased • Reason: • If the number of users is kept constant and only the loop count is increased, the number of requests/sec handled by the server stays the same for every increase in the loop count; the test simply runs for longer • If the number of users is increased while the loop count is kept constant, the requests/sec handled by the server grows with the number of users while the number of executions per user stays the same, hence the longer Response Time
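The reasoning above can be illustrated with a toy model (an assumption of mine, not a measurement from the project): the offered load on the server depends only on the number of users, while the loop count only stretches how long the run lasts:

```python
# Toy model of the slide's reasoning; the 50 ms per-request service time
# is an assumed illustrative value, not a measurement from the project.

SERVICE_TIME_SEC = 0.05  # assumed server time per request

def offered_rate(users: int, loop_count: int) -> float:
    """Requests/sec the server must handle. loop_count does not appear:
    as the slide argues, it only changes how long the test runs."""
    return users / SERVICE_TIME_SEC

def run_length_sec(loop_count: int) -> float:
    """A higher loop count means a longer run at the same request rate."""
    return loop_count * SERVICE_TIME_SEC

# Raising the loop count leaves the offered load unchanged...
assert offered_rate(200, 5_000) == offered_rate(200, 20_000)
# ...while raising the user count raises it, which drives up response time.
assert offered_rate(1_000, 20_000) > offered_rate(200, 20_000)
```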

  17. Assessment Evaluation • Comparison of Response Times of all 3 Web Pages at a Wireless Connection of 54.0 Mbps • Note • Loop Count constant at 20,000 • Ramp-up Period of 5 sec • Users – 200, 600, 800, 1000 • Observations • Response Time increases more for the Home Page than for the Login and Shopping Cart Pages • Lowest Response Time for the Login Page, as no database requests are submitted by the user • Moderate Response Time for the Shopping Cart Page, because it performs more computations • Response Time for the Shopping Cart Page is approx. 28% higher on average than for the Login Page • Response Time for the Home Page is approx. 246% higher on average than for the Login Page
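The 28% and 246% figures are relative increases in average response time over the Login Page baseline. The sketch below shows how such percentages are computed; the three average values are placeholders chosen to reproduce the slide's figures, not the project's actual measurements:

```python
# Reproducing the form of the "approx. 28% / 246% more" figures. The three
# average response times below are placeholder values, not measurements
# from the project; only the formula is the point.

def overhead_pct(page_avg_ms: float, baseline_avg_ms: float) -> float:
    """Relative increase of a page's average response time over a baseline."""
    return (page_avg_ms - baseline_avg_ms) / baseline_avg_ms * 100

login_ms = 100.0   # baseline (assumed)
cart_ms = 128.0    # yields the slide's ~28% figure
home_ms = 346.0    # yields the slide's ~246% figure

print(round(overhead_pct(cart_ms, login_ms)))   # 28
print(round(overhead_pct(home_ms, login_ms)))   # 246
```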

  18. Assessment Evaluation – Home Page [http://localhost:2416/CDShop/Default.aspx] • External Factors affecting Response Time • Varying network bandwidth • Limited system hardware resources (CPU, RAM, disks) and their configuration • The JMeter tests and the server running on the same machine

  19. Assessment Evaluation – Summary • For Peak Load • Users – 200, 600, 800, 1000 • Loop Count constant at 20,000 • Ramp-up Period = 5 sec • Response Time increases rapidly with the number of users, but only slightly when the users are kept constant and only the loop count is increased • Response Time is highest for the Home Page, intermediate for the Shopping Cart Page, and lowest for the Login Page

  20. Assessment Evaluation Login Page [http://localhost:2416/CDShop/Login.aspx] • For Sustained Load at Wireless Connection

  21. Project Evaluation • Project Duration (actual) • Phase I = 86 hours • Phase II = 140.5 hours • Phase III = 304.5 hours • Total = 531 hours • Project Duration (in Months) • Estimated at the end of Phase II = 6.5 Months • Actual = 7.5 Months

  22. Project Evaluation • Category Breakdown • Research = 38.5 hours • Design = 37 hours • Coding = 305.5 hours • Testing = 32 hours • Documentation = 118 hours • Total = 531 hours

  23. Project Evaluation • SLOC Count (Actual) – LocMetrics Tool (http://www.locmetrics.com) • C# Code (including C# auto-generated code) = 2757 • SQL Code = 540 • XML Code = 86 • CSS Code = 412 • Total = 3795 • SLOC Count (Estimated) • At the end of Phase II – 3200 (based on the prototype design in Phase I)
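As a quick sanity check, the slide's SLOC totals and the gap between the Phase II estimate and the actual count can be verified; all input numbers are taken from the slide, and only the overrun percentage is derived:

```python
# Sanity-checking the slide's SLOC totals and the estimate-vs-actual gap.
# All input numbers come from the slide; only the percentage is derived.

actual_sloc = {"C#": 2757, "SQL": 540, "XML": 86, "CSS": 412}
total_actual = sum(actual_sloc.values())
estimated = 3200  # estimate made at the end of Phase II

assert total_actual == 3795  # matches the slide's total

overrun_pct = (total_actual - estimated) / estimated * 100
print(f"SLOC overrun vs. estimate: {overrun_pct:.1f}%")  # 18.6%
```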

  24. Project Experience • Lessons Learned: • New technology • Use of various tools for design and testing – Visual Studio 2005, JMeter, LocMetrics • Working with UML and Class Diagrams • Experiencing the entire life cycle of the project – requirement gathering, design, coding, testing, and documentation • Testing applications at different levels
