Online Music Store
MSE Project Presentation III
Presented by: Reshma Sawant
Major Professor: Dr. Daniel Andresen



Presentation Transcript

Online Music Store
MSE Project Presentation III
Presented by: Reshma Sawant
Major Professor: Dr. Daniel Andresen
03/11/08


Phase III Presentation Outline

  • Project Overview

  • Brief Review of Phases

  • Action Items from Phase II

  • Implementation/Demo

  • Assessment Evaluation

  • Project Evaluation

  • Lessons Learned


Project Overview

The objective of this project is to design and develop an Online Music Store.

  • Target: Public Users

  • Product: Media for Music

  • User Types: User, Administrator

  • Functionalities for Users: Browsing, searching, buying products, getting song recommendations, managing personal account

  • Functionalities for Administrator: Manage Catalog Details, Manage Orders, Manage Shopping Cart


Review of Phases

  • Phase I:

    • Requirement Specifications

  • Phase II:

    • Designed Web Pages

    • Created Test Plan

  • Phase III (Current):

    • Coding

    • Testing and Analysis


Action Items from Phase II

  • Correct multiplicities in Class Diagram

    • Multiplicity between ShoppingCart Class and CartItem Class should be 1..*
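The corrected 1..* multiplicity says a ShoppingCart is always associated with at least one CartItem. A minimal sketch of enforcing that invariant in code (illustrative Python, since the project itself is in C#; the field and method names are hypothetical, not taken from the project):

```python
class CartItem:
    """One line item in the cart (hypothetical fields)."""
    def __init__(self, product_id, quantity=1):
        self.product_id = product_id
        self.quantity = quantity

class ShoppingCart:
    """A ShoppingCart holds 1..* CartItems: it cannot be constructed
    empty and refuses to remove its last item, mirroring the
    corrected multiplicity in the Class Diagram."""
    def __init__(self, first_item):
        self.items = [first_item]  # at least one item at all times

    def add(self, item):
        self.items.append(item)

    def remove(self, product_id):
        if len(self.items) == 1:
            raise ValueError("1..* multiplicity: cart must keep at least one item")
        self.items = [i for i in self.items if i.product_id != product_id]

cart = ShoppingCart(CartItem("CD-001"))
cart.add(CartItem("CD-002", quantity=2))
print(len(cart.items))  # 2
```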


Class Diagram


Action Items from Phase II

2) Revise SLOC count and Project Duration

  • Included in Project Evaluation


Implementation & Demo

  • Technologies Used:

    • IDE – Microsoft Visual Studio 2005

    • Technology - ASP.NET 2.0

    • Language – C#

    • Database – SQL Server 2005


Assessment Evaluation

  • Manual Testing – to ensure the correctness of various parts of the code


Assessment Evaluation

  • E.g. Register Web Page for User

  • E.g. Edit Shopping Cart


Assessment Evaluation

  • Performance Testing

  • Goal:

    • Determine load in terms of concurrent users and requests

    • Determine Response Time – the time from when a request for a Web Page is initiated until the page is completely displayed in the user's browser

    • Tool Used – JMeter (http://jakarta.apache.org)

    • Inputs to JMeter:

      • Number of Users

      • Ramp-up period – time (sec) to load the full number of users chosen

      • Loop Count - how many times to repeat the test

        • E.g. Users = 10, Loop-Count = 20, Ramp-up period = 5 sec

          => 10 Users will be loaded in 5 sec with total requests = 200 (10*20)
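The relationship between these inputs can be sketched in a few lines of Python (an illustrative model only; JMeter itself simply replays the test plan, it does not expose such a formula):

```python
def jmeter_load(users, loop_count, ramp_up_sec):
    """Total requests produced by a JMeter thread group, and the
    average thread start rate during ramp-up (illustrative model)."""
    total_requests = users * loop_count  # each user repeats the test loop_count times
    start_rate = users / ramp_up_sec     # threads started per second during ramp-up
    return total_requests, start_rate

# The example from the slide: 10 users, loop-count 20, ramp-up 5 sec
total, rate = jmeter_load(users=10, loop_count=20, ramp_up_sec=5)
print(total, rate)  # 200 total requests, 2 threads started per second
```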


Assessment Evaluation: Performance Testing Factors

  • Load Type

    • Peak Load – maximum number of users and requests loaded in short duration (e.g. 5 sec).

    • Sustained Load – maximum users and requests loaded for longer period (e.g. 5 mins).

  • Connection

    • Wireless Connection at 54.0 Mbps

    • LAN Connection at 100.0 Mbps

  • Web pages Tested

    • HTML Page (Login Web Page)

    • Database Intensive Page (Home Page)

    • Business Logic Page (Shopping Cart Page)


Assessment Evaluation: Performance Testing Environmental Set-up

  • Machine Configuration:

    • Operating System – Windows XP Professional

    • Memory – 1 GB RAM

    • Hard Disk – 100 GB

    • Processor – Intel Pentium M, 1.7 GHz



Assessment Evaluation

Home Page [http://localhost:2416/CDShop/Default.aspx]

  • Peak Load at Wireless (54 Mbps) vs. LAN Connection (100 Mbps)

  • Note:

    • Loop-Count constant at 20,000

    • Ramp-up period of 5 sec

    • Users – 200, 600, 800, 1000

  • Observations:

    • Response Time increases linearly with the number of users for both Wireless and LAN

    • Max no. of users handled by the system before it becomes saturated = 1000

    • Response Time is lower for LAN due to its higher bandwidth.



Assessment Evaluation

Home Page [http://localhost:2416/CDShop/Default.aspx]

  • Constant Users vs. Constant Loop-Count for Wireless Connection

    • Scenario 1: Users constant at 200; Loop-Count increased up to 20,000

    • Scenario 2: Loop-Count constant at 20,000; Users – 200, 600, 800, 1000



Assessment Evaluation

Home Page [http://localhost:2416/CDShop/Default.aspx]

  • Observations

    • Response Time increases rapidly with the number of users, but only slightly when the users are kept constant and only the loop-count is increased.

    • Reason:

      • If the number of users is kept constant and only the loop-count is increased, the number of requests/sec handled by the server remains constant for every increase in the loop-count.

      • If the users are increased and the loop-count is kept constant, the requests/sec handled by the server increases with the number of users while the number of executions per user remains constant, hence the longer response time.
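This reasoning can be made concrete with a toy model (an assumption for illustration, not a measurement from the project): once all threads have ramped up, the arrival rate is driven by the number of concurrent users, and the loop-count only extends how long that rate is sustained.

```python
def requests_per_sec(users, ramp_up_sec):
    """Toy model of the offered load: concurrent users drive the
    arrival rate; the loop-count does not appear in the formula."""
    return users / ramp_up_sec

# Constant users (200), growing loop-count: the rate never changes
rates_fixed_users = [requests_per_sec(200, 5) for loop_count in (5000, 10000, 20000)]
print(rates_fixed_users)   # [40.0, 40.0, 40.0]

# Growing users, constant loop-count (20,000): the rate climbs
rates_fixed_loops = [requests_per_sec(u, 5) for u in (200, 600, 800, 1000)]
print(rates_fixed_loops)   # [40.0, 120.0, 160.0, 200.0]
```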


Assessment Evaluation

  • Comparison of Response Times of all 3 Web Pages at Wireless Connection of 54.0 Mbps

  • Note:

    • Loop-Count constant at 20,000

    • Ramp-up period of 5 sec

    • Users – 200, 600, 800, 1000

  • Observations:

    • Response Time increases more for the Home Page than for the Login and Shopping Cart Pages

    • Lowest Response Time for the Login Page, as no database requests are submitted by the user

    • Moderate Response Time for the Shopping Cart Page because of its additional computations

    • Response Time for the Shopping Cart Page is approx. 28% higher on average than for the Login Page

    • Response Time for the Home Page is approx. 246% higher on average than for the Login Page
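The percentage figures follow from the standard relative-difference formula. In this Python sketch the average response times are hypothetical placeholders chosen to reproduce the reported ratios, since the slide gives only the resulting percentages:

```python
def pct_slower(page_avg_ms, baseline_avg_ms):
    """How much slower a page is than the baseline, in percent."""
    return (page_avg_ms - baseline_avg_ms) / baseline_avg_ms * 100

# Hypothetical average response times (ms), with Login as the baseline
login, cart, home = 100.0, 128.0, 346.0
print(round(pct_slower(cart, login)))  # 28  (Shopping Cart vs. Login)
print(round(pct_slower(home, login)))  # 246 (Home vs. Login)
```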



Assessment Evaluation

Home Page [http://localhost:2416/CDShop/Default.aspx]

  • External Factors affecting Response Time

    • Varying Network Bandwidth

    • Limited System Hardware Resources (CPU, RAM, Disks) and Configuration

    • JMeter Tests and Server running on the same machine



Assessment Evaluation

Summary

  • For Peak Load

    • Users – 200, 600, 800, 1000

    • Loop-Count constant at 20,000

    • Ramp-up period = 5 sec

    • Response Time increases rapidly with the number of users, but only slightly when the users are kept constant and only the loop-count is increased.

    • Response Time is highest for the Home Page, intermediate for the Shopping Cart Page, and lowest for the Login Page



Assessment Evaluation

Login Page [http://localhost:2416/CDShop/Login.aspx]

  • For Sustained Load at Wireless Connection


Project Evaluation

  • Project Duration (actual):

    • Phase I = 86 hours

    • Phase II = 140.5 hours

    • Phase III = 304.5 hours

    • Total = 531 hours

  • Project Duration (in Months):

    • Estimated at the end of Phase II = 6.5 Months

    • Actual = 7.5 Months


Project Evaluation

  • Category Breakdown:

    • Research = 38.5 hours

    • Design = 37 hours

    • Coding = 305.5 hours

    • Testing = 32 hours

    • Documentation = 118 hours

  • Total = 531 hours
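As a quick cross-check, the per-phase and per-category breakdowns above reconcile to the same 531-hour total:

```python
# Hours from the two breakdowns reported above
phases = {"Phase I": 86, "Phase II": 140.5, "Phase III": 304.5}
categories = {"Research": 38.5, "Design": 37, "Coding": 305.5,
              "Testing": 32, "Documentation": 118}

print(sum(phases.values()))      # 531.0
print(sum(categories.values()))  # 531.0
```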


Project Evaluation

  • SLOC Count (Actual) – LocMetrics Tool (http://www.locmetrics.com)

    • C# Code (including C# auto-generated code) = 2757

    • SQL Code = 540

    • XML Code = 86

    • CSS Code = 412

    • Total = 3795

  • SLOC Count (Estimated)

    • At the end of Phase II – 3200 (based on the prototype design in Phase I)
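The actual counts reconcile to the reported total, and the Phase II estimate's error can be computed the same way:

```python
# Actual SLOC per language, as reported by LocMetrics
sloc = {"C#": 2757, "SQL": 540, "XML": 86, "CSS": 412}
actual = sum(sloc.values())
estimated = 3200  # end-of-Phase-II estimate

print(actual)                                            # 3795
print(round((actual - estimated) / estimated * 100, 1))  # 18.6 (% under-estimate)
```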



Project Experience

  • Lessons Learned:

    • New technology

    • Use of various tools for designing and testing – Visual Studio 2005, JMeter, LocMetrics

    • Working with UML and Class Diagrams

    • Entire life cycle of the project – Requirement Gathering, Design, Coding, Testing, and Documentation

    • Testing applications at different levels