
Software Development Methodologies: Models and Techniques

This article explores different methodologies and models used in software development, including the waterfall model, multi-stage development model, and agile development methodologies. It discusses the advantages and specific techniques of each approach.


Presentation Transcript


  1. Project management in SE • Methodologies and models of software processes • Peeter Normak • 26.11.2015

  2. Plan • Discussion: Home assignment. • Choice of a software development methodology. • Examples of methodologies/models. • Capability maturity models (CMM-SW and CMMI). • NASA software process improvement methodology. • Software process assessment methodology SPICE. • Reviewing project proposals. • About the examination. • Reflections

  3. Home assignment • Formulate three words of wisdom based on the article “Why Software Fails” (http://spectrum.ieee.org/computing/software/why-software-fails/). • Find at least five myths about software development. • What have you learned most from the NASA SEL experience (http://www.cs.umd.edu/~basili/publications/proceedings/P94.pdf)?

  4. SW development methodology – what is it? • A software development methodology is a framework for structuring, planning, and controlling the process of software development. • A software development methodology is a splitting of software development processes into distinct phases containing activities, with the intent of better planning and management. • A software development methodology is a way of organizing and managing the development of a piece of software. • A software development methodology is a collection of procedures, techniques, tools and documentation for supporting software developers in developing and implementing new software.

  5. The classical waterfall model

  6. The waterfall model and a two-phase model • Requirements → Design → Coding → Testing → Implementation • The advantages of a two-phase model: • gives an opportunity to interrupt a project after relatively little cost has been incurred, if completing it turns out to be unreasonable. • allows a project to be planned more adequately (because activities and costs are planned in two phases). • motivates project managers to pay more attention to project planning.

  7. The waterfall model – economic characteristics • Correcting an error made during the design phase at the implementation phase of the project is a hundred times more expensive than correcting it immediately. • Programming (coding) consumes only 15% of software development costs. • Only 60% of errors can be detected by reading the code. • The Pareto principle is applicable in software development as well: 80% of problems are caused by 20% of elements (80% of activities by 20% of requirements; 80% of costs by 20% of components; 80% of errors in 20% of components; 80% of results achieved by 20% of developers, etc.).
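The cost figures above lend themselves to a quick back-of-the-envelope calculation. The sketch below is purely illustrative: the 100× late-fix multiplier and the 60% detection rate come from the slide, while the error count and unit cost are hypothetical assumptions.

```python
# Back-of-the-envelope illustration of the slide's cost claims.
# The 100x multiplier and 60% code-reading detection rate are from
# the slide; the number of errors and the unit fix cost are made up.

IMMEDIATE_FIX_COST = 1.0       # relative cost of fixing a design error at once
LATE_FIX_MULTIPLIER = 100      # cost factor for fixing it at implementation time
READING_DETECTION_RATE = 0.60  # share of errors found by reading the code

def expected_fix_cost(n_errors: int, early_detection_rate: float) -> float:
    """Expected total cost when a fraction of errors is caught early."""
    early = n_errors * early_detection_rate * IMMEDIATE_FIX_COST
    late = (n_errors * (1 - early_detection_rate)
            * IMMEDIATE_FIX_COST * LATE_FIX_MULTIPLIER)
    return early + late

# 50 design errors: catching 60% early by code reading vs. catching none
print(expected_fix_cost(50, READING_DETECTION_RATE))  # 2030.0
print(expected_fix_cost(50, 0.0))                     # 5000.0
```

Even a modest early-detection rate cuts the expected correction cost dramatically, which is the economic argument behind reviews and early testing.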

  8. Multi-stage (iterative) development model • The general structure of the model: • The first phase: requirements and architecture/general design. • Stage 1: detailed design, coding, testing, integration and release • Stage 2: detailed design, coding, testing, integration and release • … • Stage n: detailed design, coding, testing, integration and release • Final release of software.
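The staged structure above can be sketched as a simple loop. This is an illustration only: the activity names are labels, not real build steps.

```python
# Control-flow sketch of the multi-stage (iterative) development model.
# One up-front phase, then n stages that each repeat the same activities.

STAGE_ACTIVITIES = ["detailed design", "coding", "testing",
                    "integration", "release"]

def run_project(n_stages: int) -> list[str]:
    """Return the ordered log of phases for an n-stage project."""
    log = ["requirements", "architecture/general design"]  # the first phase
    for stage in range(1, n_stages + 1):
        log += [f"stage {stage}: {a}" for a in STAGE_ACTIVITIES]
    log.append("final release")
    return log

print(run_project(2))  # first phase, 2 x 5 stage activities, final release
```

Note that every stage ends in a release: working software is produced throughout the project, not only at the end.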

  9. Discussion • List some advantages of the multi-stage development model.

  10. Advantages of the multi-stage model • Early introduction of software. • Risks are decreased. • Problems appear in the early phase. • Time for composing reports decreases (working software is the best report). • More possibilities/choices for increasing functionality. • Planning is more adequate (feedback after every stage!). • Better flexibility and efficiency of software process (changes are discussed after each stage). • Correction of errors is more effective (localisation is much easier). • Work distribution is more even throughout the project life-cycle.

  11. Agile SW development methodologies

  12. Agile development methodologies – the basics • A number of simultaneously developed methodologies appeared in the mid-1990s. • Most well-known: extreme programming (XP) and Scrum. • Most well-known person: Kent Beck, an initiator of the Manifesto for Agile Software Development (2001). • Key words: Communication, Simplicity, Feedback, Respect, Courage. • XP roles: developer, customer, coach, tracker, tester, consultant, big boss. • Scrum roles: Product Owner, Scrum Master, team member.

  13. Agile development methodologies – specifics • Specifics: • discipline, • high and broad qualification, • small development teams, • changes of requirements are possible throughout the whole project, • effective usage of human work. • See, for example, the Wikipedia article “Agile software development”: • https://en.wikipedia.org/wiki/Agile_software_development

  14. Agile development methodologies – some techniques • Based on user stories; realizing a story takes about two weeks on average. • Changing roles and tasks during development. • Every day starts with a stand-up meeting. • Minimality principle (add new functionality only if really needed). • Intense and continuous collaboration with the customer. • Agreed rules, specifications, standards. • Pair programming. • No overtime work! • See http://www.extremeprogramming.org/rules.html

  15. Agile development methodologies – some risks • Less documentation does not mean a complete absence of documents. • Not all team members are good enough at completing all tasks. • The Product Owner may not be able to represent the interests of all stakeholders adequately. • The fact that there are no project managers can cause problems in work distribution (the teams are self-organizing). • The predefined duration of sprints can cause quality problems (for example, when testing turns out to be more complex than expected).

  16. Combination of principles (on the example of Scrum and PRINCE2) • “Implementing lean is also a lean process, an iterative process of change, learn and adapt”. • Scrum is output-based (processes are flexible), PRINCE2* is process-based. • Basic processes: • Starting up a project (the only pre-project process) • Initiating a project • Directing a project • Managing stage boundaries • Controlling a stage • Managing product delivery (this is where Scrum fits in) • Closing a project. * http://www.projectsmart.co.uk/docs/prince2-introduction-ps.pdf

  17. Scrum + PRINCE2 – synergy • PRINCE2 strengths: • The Project Board can represent the interests/needs of stakeholders better than a single person (the Product Owner). • The Project Manager has the authority to coordinate project activities. • Risk management is a pre-defined PRINCE2 process, but is not regulated in Scrum. • Scrum strengths: • The description of the intended outcome evolves during project execution. • New tasks/sprints can be started without waiting for permission from the Project Board. • The Lessons Learned document is updated at the end of each sprint.

  18. Recommendations: Choice of a software development methodology • Competent usage of a method is more important than the method itself. Therefore, the introduction of a new method should be justified and should be performed only after a thorough analysis. • Experience of implementing a new methodology obtained in other institutions may prevent dramatic failures. • A methodology should be adapted to the organizational culture as well as to the skills and habits of the project team; a majority of the team should accept the (adapted) methodology. Example: Cramo. • The development and other tools sometimes depend on the software process methodology. It is suggested to use a three-level division in the usage of tools: compulsory, recommended, acceptable.

  19. Discussion • Should the term “software development method” be used instead of “software development methodology”? • See: Geambasu, C.V., I. Jianu, I. Jianu, A. Gavrila (2011). Influence factors for the choice of a software development methodology. Accounting and Management Information Systems 10 (4), 479-494. • ftp://ftp.repec.org/opt/ReDIF/RePEc/ami/articles/10_4_3.pdf

  20. Maturity models of SW development

  21. Capability Maturity Model for Software CMM-SW • For assessing the quality of an institution in software development. • 5 levels: initial, repeatable, defined, managed, optimizing (the distribution is very uneven!). • Each level has key process areas, requirements and a self-evaluation questionnaire. • A 2003 analysis in Estonia (69 respondents): no company had reached level 2! • Recommendation (Gunnar Piho): consider sub-levels of level 2: • CMM2-requirements (requirements are well managed) • CMM2-plans (CMM2-requirements + activities are planned and resourced) • CMM2-results (CMM2-plans + monitoring of activities/results).

  22. CMM-SW – 2nd level questions about SW project planning • Are estimates (e.g., size, cost, and schedule) documented for use in planning and tracking the software project? • Do the software plans document the activities to be performed and the commitments made for the software project? • Do all affected groups and individuals agree to their commitments related to the software project? • Does the project follow a written organizational policy for planning a software project? • Are adequate resources provided for planning the software project? • Are measurements used to determine the status of the activities for planning the software project (e.g., completion of milestones)? • Does the project manager review the activities for planning the software project on both a periodic and event-driven basis?
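A yes/no checklist like the one above can be turned into a tiny self-assessment script. This is only a sketch: the question texts are abbreviated, and the pass rule (every answer must be “yes”) is an assumption rather than an official CMM-SW scoring procedure.

```python
# Minimal self-assessment sketch for a yes/no checklist.
# The pass rule (all answers must be "yes") is an assumption,
# not an official CMM-SW scoring procedure.

PLANNING_QUESTIONS = [
    "Are estimates documented for planning and tracking?",
    "Do the plans document activities and commitments?",
    "Do all affected groups agree to their commitments?",
    "Is there a written organizational planning policy?",
    "Are adequate resources provided for planning?",
    "Are measurements used to track planning status?",
    "Does the project manager review planning activities?",
]

def assess(answers: dict[str, bool]) -> tuple[bool, list[str]]:
    """Return (all questions satisfied?, list of unmet questions)."""
    failed = [q for q in PLANNING_QUESTIONS if not answers.get(q, False)]
    return (not failed, failed)

answers = {q: True for q in PLANNING_QUESTIONS}
answers[PLANNING_QUESTIONS[3]] = False  # no written policy yet
ok, gaps = assess(answers)
print(ok, gaps)  # False, with the unmet policy question listed
```

An unanswered question counts as “no”, which matches the conservative spirit of maturity assessment: a practice that cannot be demonstrated is treated as absent.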

  23. Discussion • Possible shortcomings of the Capability Maturity Model for SW (and capability models in general).

  24. Capability Maturity Model Integration – CMMI • The purpose: provide guidance for improving an organization's processes and its ability to manage development (CMMI-DEV), acquisition (CMMI-ACQ), and service provision (CMMI-SVC). • CMMI is based on process models. The process areas are: • Project management • Process management • Supporting processes (analysis and quality assurance) • Capability levels 1…5. • The components of a CMMI model are grouped into three categories that reflect how they are to be interpreted: • required (specific goals and generic goals of an institution), • expected (specific practices and generic practices), • informative (details that help model users get started in thinking about how to approach goals and practices).

  25. Example: CMMI – processes for development • Project Planning (2) • Requirements Management (2) • Quantitative Project Management (4) • Risk Management (3) • Integrated Project Management (3) • Project Monitoring and Control (2) • Organizational Process Definition (3) • Organizational Process Focus (3) • Organizational Performance Management (5) • Organizational Process Performance (4) • Organizational Training (3) • Causal Analysis and Resolution (5) • Configuration Management (2) • Decision Analysis and Resolution (3) • Measurement and Analysis (2) • Process and Product Quality Assurance (2) • Supplier Agreement Management (2) • Additionally for level 3: Product Integration, Requirements Development, Technical Solution, Validation, Verification.

  26. CMMI Capability and maturity levels • Level 0 (incomplete): a process that is either not performed or partially performed. One or more of the specific goals of the process area are not satisfied. • Level 1 (performed/initial): a process that satisfies the specific goals of the process area. • Level 2 (managed): a process that is planned and executed in accordance with policy, employs skilled people having adequate resources to produce controlled outputs, involves relevant stakeholders; is monitored, controlled, reviewed, and evaluated. • Level 3 (defined): a process that is tailored from the organization's set of standard processes according to the organization’s tailoring guidelines, and contributes work products, measures, and other process-improvement information to the organizational process assets. • Level 4 (quantitatively managed, for measuring maturity only): a process that is controlled using statistical and other quantitative techniques. • Level 5 (optimizing, for measuring maturity only): process that is changed and adapted to meet relevant current and projected business objectives; focuses on continually improving the process performance through both incremental and innovative technological improvements.

  27. NASA Software Process Improvement (SPI)‏ methodology • NASA SPI methodology is iterative and relative (based on the current level). • Understanding: • Specification of objectives and possible processes, models, relations and indicators for achieving these objectives (the goal is to understand what processes can lead to the objectives). • Assessing: • Assessing the impact of applying different methods and tools (the goal is to find methods and tools that are most suitable to apply). • Packaging: • Implementation of the most suitable methods and tools in the organization’s everyday practice.

  28. NASA SPI versus CMM-SW • The most significant differences between NASA SPI and CMM-SW: • About the conception: NASA – bottom-up, CMM – top-down; • Scope: NASA – based on concrete needs, CMM – general process quality; • Assessment: NASA – relative with individual indicators, CMM – absolute with universal indicators; • Scale: NASA – continuous, CMM – 5 levels. • Several NASA SPI basic principles are realized in CMMI.

  29. SW assessment processes

  30. Software process assessment methodology SPICE • Software process assessment methodology (Software Process Improvement and Capability dEtermination): a framework for the assessment of software processes. • It: • Facilitates self-assessment • Takes into account the context of the process being assessed • Produces a process rating profile rather than a pass/fail result • Addresses the adequacy of practices relative to the process purpose • Is appropriate across all application domains and sizes of organization. • Is approved as the ISO/IEC 15504 “Information technology – Software process assessment” standard.

  31. SPICE – the initial structure • Concepts and introductory guide. • A model for process management. Defines, at a high level, the fundamental activities that are essential to software engineering. • Rating processes. Defines a framework for conducting an assessment, and sets out the basis for rating, scoring and profiling process capabilities. • Guide to conducting an assessment. • Construction, selection and use of assessment instruments and tools. • Qualification and training of assessors. • Guide for using the results of an assessment in process improvement. • Guide for use in determining supplier process capability. • Vocabulary.

  32. Standard ISO/IEC 15504 “Information technology – Software process assessment” • 1: Concepts and vocabulary (ISO/IEC 15504-1:2004) • 2: Performing an assessment (ISO/IEC 15504-2:2003) • 3: Guidance on performing an assessment (ISO/IEC 15504-3:2004) • 4: Guidance on use for process improvement and process capability determination (ISO/IEC 15504-4:2004) • 5: An exemplar Process Assessment Model (ISO/IEC 15504-5:2012) • 6: An exemplar system life cycle process assessment model (ISO/IEC 15504-6:2008) • 7: Assessment of organisational maturity (ISO/IEC 15504-7:2008) • 8: An exemplary process assessment model for IT service management (ISO/IEC 15504-8:2012) • 9: Target process profiles (ISO/IEC 15504-9:2011)

  33. Composing recommendations

  34. Reference (recommendation) • The aim: to get an opinion from a competent person representing a target group about the necessity and usefulness of the project outcome. • Choosing the person to write a recommendation: • the person should be an expert in the field, • the person should not have a conflict of interest, • the financing institution should accept the person. • Normally the content will be disclosed to the project team. E: KN Cambridge • Sometimes recommendations are form-based, dealing with some fixed aspects only. E: assess the competence of the applicant. • Problem: recommendations tend to be quite formal and are sometimes almost useless. E: Tiger Leap.

  35. Recommendations (composition of recommendations) • The text of a recommendation should be concrete and objective, and contain relevant facts. • A recommendation should contain information about the aspects that are important for the financing institution. • Describe the possible/expected benefits that the outcome of the project can bring.

  36. Reviewing project proposals

  37. Reviewing the project proposals • The aim: to get an assessment of the suitability of the project (from the point of view of the financing institution). • Some aspects (the concrete list depends on the financing institution): • To what extent does the project correspond to the priorities and aims of the financing institution? • Are there enough resources for executing the project; first of all, the human resources: are the people competent enough? • How realistic is the timetable? • How adequate is the budget; how effectively will the resources be used? • NB! The reviewer should check the correctness of all citations and calculations. E: discussion in Pärnu. • What threats and negative consequences may the project have?

  38. Recommendations (reviewing – view of a project team) • Do not be too critical of what has been done so far by other institutions/individuals (in justifying the need for the project). E: SF article – Anh. • Take into account that reviewers are normally experienced experts (do not bluff). • Take into account that a reviewer may not be an expert in all aspects of the project – the text should be clear and unambiguous. • Use preliminary (maybe in-house) reviewing.

  39. Recommendations (reviewing – view of a reviewer) • Be benevolent and constructive in reviews (rather than suspicious and grouchy). • Be careful and thorough, because the quality of reviews is one of the tools for building up an expert's reputation. • Consider the possibility of the reviewer's identity being disclosed.

  40. Example: evaluation criteria I • Evaluation criteria for training projects targeted at school teachers. • Whether and how is the project related to the national school curriculum? • Does the project contribute to the development of education in Estonia, and when can the results be expected? • Are the outcomes of the project applicable in the majority of Estonian schools (that is, are the necessary technical and human resources available)? • Is the budget realistic? • Is the timetable realistic? • Is the project team qualified enough? • What negative consequences may the project have for Estonian schools and the educational system? • Evaluation of training materials. • Practical conducting of the training. • The accuracy of the drafting of the text of the application. • Grade the project proposal on a ten-point scale.

  41. Example: evaluation criteria II • Grade 10: excellent (extremely relevant, best possible teachers, project application is thoroughly elaborated, budget is adequate). Application to be approved in full amount, without any changes. • Grade 8: very good (relevant, high-level teachers, project application is thoroughly elaborated, budget is adequate). Application to be approved, possibly with some minor changes. • Grade 6: satisfactory (relevant, competent teachers, project application is satisfactorily elaborated, budget is adequate). Application to be approved, taking into account the changes proposed by experts. • Grade 5: conditionally satisfactory (relevant, competent teachers, project application is satisfactorily elaborated). Resubmit the application, taking into account the changes proposed by experts. • Grade 4: poor (not relevant, or teachers not competent enough, or project application is unsatisfactorily elaborated). Reject the application. • Grade 2: extremely poor (not relevant, teachers are not competent enough, project application is unsatisfactorily elaborated). Reject the application; resubmission of a revised application is inadvisable. • Grade 1: out of scope (this type of training project will not be supported). • A weighted average grade is possible; financing can be made subject to the satisfaction of certain conditions.
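The slide notes that a weighted average grade is possible. A minimal sketch of such a computation follows; the criterion names and weights are hypothetical, since the slide does not fix them.

```python
# Weighted average of grades on the ten-point scale.
# The criteria and their weights are hypothetical examples;
# the evaluation scheme does not prescribe them.

def weighted_grade(grades: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average; weights are normalised, so they need not sum to 1."""
    total_weight = sum(weights[c] for c in grades)
    return sum(grades[c] * weights[c] for c in grades) / total_weight

weights = {"relevance": 0.4, "team": 0.3, "budget": 0.15, "timetable": 0.15}
grades = {"relevance": 8, "team": 10, "budget": 6, "timetable": 6}
print(round(weighted_grade(grades, weights), 2))  # 8.0
```

Normalising by the total weight lets an evaluator drop a criterion (for example, when it does not apply to a proposal) without re-tuning the remaining weights.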

  42. The examination

  43. Submission of documents – the dates • 10.12 – presentation of examination documents (group work): • Project Plan • Final Report • Analysis Document • 15.12 (the deadline) – Submission of the examination documents (group work). • 31.12 (the deadline) – Submission of a review and 3 assessments (exactly three major strengths and three major weaknesses for each assessed work) (individual work). • All documents should be sent to peeter.normak@tlu.ee

  44. The documents – general requirements • Examination work (the Project Plan, the Report, the Analysis Document) should be original, composed specifically for this course. • The text of the examination work should be well structured, relevant, focused, without superfluity. Describe the role of each student in each document. • The review and assessments should be in a single file. The name of the file should begin with the name of the author, followed by “review_assessments”. • Example: JaanTamm-review_assessments.doc.

  45. Examination work – the Project Plan (group work)

  46. Composition of a project plan • The project plan should cover all important aspects as far as possible. • A budget is compulsory even if no real money is needed for conducting the project. The budget should be based on an estimation of the necessary resources (workload, depreciation, expert advice, etc.). • If the project plan is form-based – preferably not – and an important aspect of the project does not fit into the form, this information should nevertheless be included, either in an existing section or in a newly created section.

  47. Project Plan – the structure • Base it on the slide “Structure of a project plan” in the “PM-project-planning-2015.ppt” presentation. • Do not restrict yourself to the aspects indicated explicitly in the brackets of a unit (for example – need, previous experience). • Add additional sections depending on the type of the project. Examples of aspects that can be covered: slide “Project plan – additional aspects” in “PM-project-planning-2015.ppt”. • If a section (for example, the needs analysis) of the project plan is disproportionately large, then it (or part of it) can be formed as an appendix.

  48. Project Final Report (group work)

  49. Final report – the structure • Base it on the slide “Final report – the structure (Example)” in the “PM-project-completion-2015.ppt” presentation. • The figures on this slide are indicative. • The report can have a different structure, depending on the project. • When composing the text, take into account the general quality indicators (described on the slide “Final Report – quality indicators” of the “PM-project-completion-2015.ppt” presentation). • If a section (for example, the needs analysis) of the report is disproportionately large, then it (or part of it) can be formed as an appendix.

  50. Final report – possible additional sections • Impact assessment. • Dissemination. • Exploitation. • Cooperation with other institutions. • … • Example: Final Report of the HITSA project “Development of Interaction design studies and research in Tallinn University” (file Final_Report-IDLab-2009-2012.pdf).
