
NAME and the Sheffield Software Observatory

Presentation Transcript


  1. NAME and the Sheffield Software Observatory

  2. NAME • NAME, the Network of Agile Methodologies Experience, is a European Union Fifth Framework network with three major aims:

  3. Aim 1 • To bring together researchers and practitioners of Agile Methodologies for the dissemination of experiences.

  4. Aim 2 • To define a research roadmap on Agile Methodologies (AMs) for the European Union Sixth Framework Programme. • To this end we will explore avenues for research in Extreme Programming (XP), SCRUM, and other agile methodologies,

  5. Aim 2 (contd.) • ...as well as their relationships with other, well-established methodologies and techniques, such as Object-Oriented Development, Component-Based Development, Software Product Lines, Open Source Development, etc.

  6. Aim 3 • To create an experimental framework to collect, classify, and analyze already existing and new experience in the area of XP and AMs.

  7. Main partners • NAME is a partnership among the Free University of Bolzano-Bozen, CWI, Datasiel, the Polytechnic of Valencia, the Munich University of Technology, the University of Cagliari, and the University of Sheffield.

  8. Goals for today • To gather information about experiences of using XP or AMs.

  9. Sheffield Software Observatory • Purpose: to examine software development processes under laboratory conditions. • Comparative studies under realistic circumstances. • Quantitative and qualitative approaches. • Technologies and human factors in software development.

  10. Research approach • Quantitative analysis of project data: • Requirements documents, timesheets, plans, designs, tests, code, quality reviews, client feedback, etc. • Qualitative analysis of collected data: • Positivist approach – questionnaires, interviews, focus groups, evaluation reports.

  11. Software Hut • Comparative experiments. • Typically 20 teams competing to build solutions for real business clients. • Runs every year with 2nd-year students. • Typically half the teams use methodology X and the other half methodology Y. • Statistical analysis on all data collected (see the sketch below).
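As a minimal sketch of the kind of statistical analysis meant here (not the Observatory's actual scripts), the marks of two methodology groups might be compared with a nonparametric two-sample test along the following lines. The mark values are invented placeholders, not study data:

    # Minimal sketch, not the Observatory's actual analysis: compare the
    # marks of two methodology groups with a two-sample test.
    # The mark values below are invented placeholders, not study data.
    from scipy import stats

    xp_marks = [62, 70, 55, 68, 74, 60, 66, 71, 58, 65]    # e.g. teams 1-10 (XP)
    trad_marks = [59, 64, 57, 69, 61, 63, 55, 67, 60, 62]  # e.g. teams 11-20 (Trad)

    # Mann-Whitney U assumes no particular distribution, which suits
    # small samples of marks like these.
    u_stat, p_value = stats.mannwhitneyu(xp_marks, trad_marks, alternative="two-sided")
    print(f"U = {u_stat:.1f}, p = {p_value:.3f}")

A small p-value would suggest a genuine difference between treatments; with only ten teams per group, fairly large effects are needed before they become detectable.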

  12. 2001 pilot study • 3 clients, each with 5 teams of 4-5 developers. • Half the teams used XP; the rest used a traditional/UML-led approach. • Data collected and analysed. • Some evidence that XP teams produced slightly better solutions/documentation.

  13. 2001 continued • Problems with data validation. • Used the experience to improve the experiment, which was repeated in 2002. • Needed better training in XP. • Needed better data collection. • Needed better analysis.

  14. 2002 Software Hut • 20 teams, 4 clients: • Small Firms Enterprise Development Institute (SFEDI) – wanted an intranet with company processes and information on it. • Learn Direct (UFI) – a data analysis tool for studying marketing trends, etc.

  15. 2002 contd. • Dentistry Research Institute – a web-based questionnaire system for field trials. • National Cancer Screening Service (NHS) – a document handling system for archived scientific information, etc.

  16. Experiment • Half of the teams used XP, the rest a traditional approach. • Randomised experiment. • Data collected included all system documentation produced throughout the project. • 15 hours per week per person in each team. • Systems evaluated by the clients. • Processes evaluated by academics.

  17. Software Hut Project 2002 Allocation of teams to blocks and treatments:

                 Block A     Block B      Block C      Block D
      XP         5, 7, 8     2, 6         1, 9         3, 4, 10
      Trad       18, 20      12, 14, 17   11, 13, 19   15, 16

      Blocks: A) SFEDI. B) S. of Dent. C) UFI. D) Cancer S.
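Purely as an illustrative sketch (the slides do not describe the actual randomisation procedure), an allocation like the one above could be generated as follows; the block labels and round-robin dealing are assumptions for illustration:

    # Illustrative sketch only: the slides do not give the actual procedure.
    # Teams 1-10 are XP and 11-20 traditional, as in the study; each group
    # is dealt at random across the four client blocks. Note the real 2002
    # allocation had unequal block sizes, as the table above shows.
    import random

    blocks = ["A (SFEDI)", "B (S. of Dent.)", "C (UFI)", "D (Cancer S.)"]
    xp_teams = list(range(1, 11))     # teams 1-10: XP
    trad_teams = list(range(11, 21))  # teams 11-20: traditional

    def deal(teams, blocks):
        """Shuffle a treatment group and deal it round-robin over the blocks."""
        shuffled = random.sample(teams, k=len(teams))
        return {b: sorted(shuffled[i::len(blocks)]) for i, b in enumerate(blocks)}

    allocation = {"XP": deal(xp_teams, blocks), "Trad": deal(trad_teams, blocks)}
    for treatment, per_block in allocation.items():
        print(treatment, per_block)

Randomising within each treatment group keeps the two treatments balanced across clients, which is what allows block (client) effects to be separated from treatment effects.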

  18. Software Hut Project 2002

  19. Software Hut Project 2002 Marks given by the clients. Split by treatment and sorted before plotting.
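The "split by treatment and sorted before plotting" presentation used throughout these slides can be sketched as follows; the mark values are invented for illustration and matplotlib is assumed:

    # Sketch of the "split by treatment, sort, then plot" idea: sorting each
    # group's marks turns them into monotone curves that are easy to compare
    # by eye. The mark values are invented placeholders, not study data.
    import matplotlib.pyplot as plt

    xp_marks = sorted([62, 70, 55, 68, 74, 60, 66, 71, 58, 65])
    trad_marks = sorted([59, 64, 57, 69, 61, 63, 55, 67, 60, 62])

    plt.plot(xp_marks, marker="o", label="XP (teams 1-10)")
    plt.plot(trad_marks, marker="s", label="Traditional (teams 11-20)")
    plt.xlabel("Team rank within treatment")
    plt.ylabel("Client mark")
    plt.legend()
    plt.show()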

  20. Software Hut Project 2002 Marks given by the clients. Teams 1 to 10 are XP. Teams 11 to 20 are Traditional.

  21. Software Hut Project 2002 Marks given by the clients. Split by block and sorted before plotting.

  22. Software Hut Project 2002 Marks given by the clients. Split by block and sorted before plotting. A) SFEDI. B) S. of Dent. C) UFI. D) Cancer S.

  23. Software Hut Project 2002 Marks given by the lecturers. Split by treatment and sorted before plotting.

  24. Software Hut Project 2002 Marks given by the lecturers. Teams 1 to 10 are XP. Teams 11 to 20 are Traditional.

  25. Software Hut Project 2002 Marks given by the lecturers. Split by block and sorted before plotting.

  26. Software Hut Project 2002 Marks given by the lecturers. Split by block and sorted before plotting. A) SFEDI. B) S. of Dent. C) UFI. D) Cancer S.

  27. Software Hut Project 2002 Marks given by clients + lecturers. Split by treatment and sorted before plotting.

  28. Software Hut Project 2002 Marks given by clients + lecturers. Teams 1 to 10 are XP. Teams 11 to 20 are Traditional.

  29. Software Hut Project 2002 Marks given by clients + lecturers. Split by block and sorted before plotting.

  30. Software Hut Project 2002 Marks given by clients + lecturers. Split by block and sorted before plotting. A) SFEDI. B) S. of Dent. C) UFI. D) Cancer S.

  31. Software Hut Project 2002 Number of test cases. Teams 1 to 10 are XP. Teams 11 to 20 are Traditional.

  32. Software Hut Project 2002 Number of test cases. Split by block and sorted. A) SFEDI. B) S. of Dent. C) UFI. D) Cancer S.

  33. Software Hut Project 2002 Number of Requirements. Teams 1 to 10 are XP. Teams 11 to 20 are Traditional.

  34. Software Hut Project 2002 Number of Requirements. Split by block and sorted before plotting. A) SFEDI. B) S. of Dent. C) UFI. D) Cancer S.

  35. Conclusions so far • Testing was emphasised in both groups. • This probably ensured that the traditional teams' results were also good. • Incremental delivery worked well. • Pair programming is hard for some. • Pair programming is good for testing and debugging. • Test-first is hard but worth it.

  36. Conclusions contd. • XP groups spent more time on the project. • More talking – better communication? • Some practices are easier to adopt than others. • Some practices may not be so important. • More research is needed.
