
Presentation Transcript


  1. A PMA example from a Norwegian company Tor Stålhane IDI / NTNU

  2. The PMA What follows is a short summary of a one-day session with all the project participants from a large IT company:
  • The project’s timeline
  • The positive issues – affinity diagram and Ishikawa diagram
  • The negative issues – affinity diagram and Ishikawa diagram

  3. The project’s timeline [Timeline figure covering Feb Year 1 – Dec Year 2. Events shown: project established with an external project manager; low-intensity ”fumbling phase”; defined scope; started cooperation with the customer; demo version; technical pilot delivered; new project manager; new external functions responsible; new architect; new scope; formal cooperation with the customer; new, improved company-standard architecture; four iterations; new, internal functions responsible; test manager; clarification of several roles; external project manager out; external architect out; external functions responsible out.]

  4. Affinity diagram for the positive issues – part 1. Groups and items:
  • The will to change: good methods that can be reused in further projects; new and improved methods; has improved us – competence and process; willing to try new processes / methods
  • Working process: evaluation of iterations – a useful activity; good process for QA improvement
  • Use case: use cases are good; use case driven => good structure
  • New standard technology: use of new methods and technology is motivating (JSF / J-unit); new, better development environment: CC, J-unit, JSF; new technology; good technical basis (modern); much learning – new technology, JSF +++
  • New tools: Cruise Control => tool support in development; J-unit tests as a new method in the development process; tool for performance testing
  • Iterative process: iterative; iterative process gave good progress and control; incremental deliveries; iterative process positive; good testing process
  • Competence mix and organization: good mix of resources => banking experience => technology competence; not only programmers (Java + Transigo); interesting to learn about banking from the X personnel; resourceful consultants with high competence (Java ++); good mix of competence; team organized according to product
  • Testing and correction: stable testing environment; use of QC for follow-up of reported defects; good use of QC => good overview of status; base for change control; QC change project
  • Delivery focus: we have delivered; delivered to production without large delays
  (Scores shown in the diagram: 1, 2, 3, 3, 3, 2, 2, 4.)

  5. Affinity diagram for the positive issues – part 2. Groups and items:
  • Customer focus: good cooperation with the customer; the bank’s contribution – test; new concept => positive reception in the market place
  • STG: STG for testing; workshop / STG for test cases; structured reviews of UC; reviews of test cases together with the developers; structured reviews increased the quality of the documents
  • Status meetings: status meeting in the morning; morning meetings give common ground; morning meetings during testing; regular status meetings; daily status meetings during the system test period
  • Cooperation: competent testers; good cooperation with QA; good project morale; good cooperation and a fine collaborative atmosphere; competent personnel; exciting and challenging; good cooperation and communication; good cooperation in the development team; cooperation with our Swedish subsidiary; structured / knowledgeable PM; good follow-up of the project; good cooperation in the project, also during the error correction / test phase
  (Scores shown in the diagram: 6, 4, 5, 5.)

  6. Ishikawa diagram for the positive issue Status meetings. Causes noted in the diagram: structure; use the team; be focused; identify problems (hindrances); do not start long discussions; ”15 minutes”; discipline; once a day or once a week; consider the whole project; find info (who knows what); problem solving; improve spec; start early; cross-disciplinary.

  7. Affinity diagram for the negative issues – part 1. Groups and items:
  • Methods (dev): new development methods require time; more focus on unit testing; too little focus on CC – status meetings?; low focus on PMD errors (exploded for common components)
  • High level design: SAD for the system was lacking; no good SAD was made; lacked documentation of architecture decisions; bad architecture / design; need a functional lead and a technical lead – important roles
  • Low level design: needed more detailed design; too little focus on design; reuse of common components was started too late; some functionality should have been better specified and designed; use case realization was not done according to intentions
  • Start of test: unit test Java checklist not finished when turned over to system test – not used; development not finished when testing started
  • Overtime: overtime to deliver on time; delivery on time depended on a large amount of overtime
  • Estimation: some iterations were too large; too short test period in each iteration; large time pressure on developers => little unit testing; much work => time pressure possibly underestimated; pressure at the end of the project; unit testing for X was severely underestimated; much reallocation of requirements; bad requirements estimation
  • Resources: avoid part-time resources in a project; difficult to use part-time resources; too many part-time resources; too few resources for testing; late staffing of test team – needs dedicated administrative resources full time
  • Immature use of tools: little ”paid” for good ideas; ClearCase; STG; automated unit tests; lacked prototype for STG for UC
  (Scores shown in the diagram: 2, 7, 5, 4, 6, 3.)

  8. Affinity diagram for the negative issues – part 2. Groups and items:
  • Thin functional spec: lacking documentation in general, and especially for central parts; too much work planned initially; handover f-team => t-team; lacking details in spec; functional description too small / not good
  • Stakeholders: difficult to involve IT operations; too little info / resources from Kernel; the whole value chain must be represented; discover lacking functionality in the main system earlier; lacked communication with those who implemented functionality in the main system
  • Error correction: error correction started late; very long correction time during the first part of system test; too little team spirit between functions and development
  • Project discipline: development had too little focus on UC and the system spec; updating the spec during development => unclear responsibility; changes in use cases were not well handled; functional changes during development – test resources were not informed; changes were not handled in an optimal way; ”request for project changes” didn’t work; lacked decisions on questions pertaining to functionality even when the system test had started; functional spec and technical spec were out of phase
  • Development vs. test: lacking access to an application server with the same OS as the production environment; test new components in the application environment earlier; different technical platforms for development and test
  • Project scope: problems when reusing common components; the project mandate was unclear when the project started; no connection between complexity / volume and the mandate from management; the steering group didn’t understand the complexity
  (Scores shown in the diagram: 2, 3, 2, 2, 4.)

  9. Ishikawa diagram for the negative issues. Branches: Estimation; Project scope; Overtime; Resources; Thin functional spec; Development vs. test; Error correction; Project discipline; Start of test; Methods (development); High level design; Immature use of tools; Stakeholders; Low level design.

  10. Ishikawa diagram – Low-level design problems. Causes noted in the diagram: needs to go through this together; only UC; lacks a process; lacking examples; lacking documentation; lacking guidelines; lacking a system architecture; complicated to make a low-level design; thin functional spec; lacks competence; bad templates; is often not required (balancing item); must be possible to do just a part of it; focus on code; lack of resources; where can we find info?; time of delivery; little focus on this; oral info; the need is identified too late in the process; the need must be identified early in the project; we are not used to producing a low-level design; lack of business understanding; management / control.

  11. A second look We have been working with this company for a long time. It would thus be interesting to compare this PMA with information from:
  • an earlier, general PMA in the same company
  • an analysis of the errors in the company’s defect reporting system
  • a company-wide gap analysis

  12. Requirements spec problems. Issues noted in the diagram: testing; too many temporary solutions; culture; competence; who is the customer; consequences of a small, subordinate clause; flexibility; changes over time; working in the same way as always before; level of detail; application knowledge; working together with the customer; competition; understanding the customer; commitment from those who know the requirements; difficult to estimate; working methods; time pressure.

  13. Defect reports The categories development and wrong or unclear specification account for almost 87% of all correction costs. We will thus have a closer look at these two categories. We focus on errors inserted by the project in question, and have removed errors such as user misunderstandings and errors in third-party software.
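As a sketch of how such a number can be derived, the snippet below sums correction costs per defect category from a defect-report export. Everything here is illustrative: the file name, the column names (category, origin, correction_hours) and the filtering rule are assumptions, not the company’s actual reporting format.

```python
import csv
from collections import defaultdict

def cost_per_category(path):
    """Sum correction costs per defect category from a CSV export.

    Assumed (hypothetical) columns: one row per corrected defect, with
    'category' as assigned by the developer, 'origin' telling who
    inserted the defect, and 'correction_hours' as the correction cost.
    """
    costs = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # Keep only defects inserted by the project itself, i.e.
            # drop user misunderstandings and third-party errors.
            if row["origin"] != "project":
                continue
            costs[row["category"]] += float(row["correction_hours"])
    return dict(costs)

if __name__ == "__main__":
    costs = cost_per_category("defect_reports.csv")  # hypothetical file
    total = sum(costs.values())
    for cat, c in sorted(costs.items(), key=lambda kv: -kv[1]):
        print(f"{cat:30s} {c:8.1f} h  {100 * c / total:5.1f} %")
```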

  14. Development defects and Pareto – 1 [Pareto chart: defect categories ranked by their share of the correction costs.]

  15. Development defects and Pareto – 2 Five error categories are needed to account for 80% of all correction costs:
  • Logical error
  • Misc. error
  • Wrong / unclear requirements
  • Wrong interface
  • Undefined error
  Approximately 95% of the defects can be categorized as
  • wrong / missing functionality
  • wrong / missing information displayed
  This holds for most of the errors, irrespective of the categories assigned by the developers.
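The 80% figure on this slide is a standard Pareto cut: sort the categories by correction cost and accumulate until the chosen threshold is reached. A minimal sketch; the cost numbers below are made up for illustration and are not the company’s data.

```python
def pareto_cut(costs, threshold=0.8):
    """Return the smallest set of categories (taken in descending cost
    order) whose cumulative share of the total cost reaches `threshold`."""
    total = sum(costs.values())
    selected, cumulative = [], 0.0
    for category, cost in sorted(costs.items(), key=lambda kv: -kv[1]):
        selected.append(category)
        cumulative += cost
        if cumulative / total >= threshold:
            break
    return selected

# Illustrative correction costs (hours) only – not the actual data.
example = {"Logical error": 250, "Misc. error": 200,
           "Wrong / unclear requirements": 150, "Wrong interface": 120,
           "Undefined error": 100, "Other": 90}
print(pareto_cut(example))  # -> the five categories named on this slide
```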

  16. Defect descriptions – examples
  • Logical error: fails when opening a new account; a link is missing in the menu on the left-hand side.
  • Misc. error: the write command fails; the looking-glass icon is missing.
  • Wrong / unclear requirements: the account number is not reported; the radio button ADD is missing.
  • Wrong interface: the code for X is not changed; spelling error.
  • Undefined error: cannot access screen Z; missing values in a menu.

  17. Defect causes We observed during the first, general PMA that the developers were proud of their application knowledge and preferred a fuzzy requirements specification, since this allowed them to use their creativity. They get problems with requirements and testing when they add a large number of consultants
  • with excellent development and coding skills
  • but without application knowledge

  18. The role of application knowledge – 1 [Diagram: application knowledge and development knowledge feed into the path from customer requirements, through coding and testing, to the finished system.]

  19. The role of application knowledge – 2 When analyzing the error reports we saw that when application knowledge is not available,
  • test quality will suffer
  • we will get incomplete functionality and information display
  Time pressure due to bad estimation made the situation even worse for the testers.
  • Primary problems: lack of application knowledge, incomplete requirements, bad estimation
  • Secondary problems: time pressure, low-quality testing, incomplete functionality in the finished system

  20. PMA, gap analysis and error analysis
  • The problems identified by the error analysis are also identified by the PMA or the RCA.
  • The PMA focuses on low-level design, and identified functional specification and business understanding as secondary problem areas in the RCA diagram.
  • Estimation is only identified in the affinity diagram.
  • Testing only occurs as an item in the two affinity diagram groups Estimation and Resources.

  21. PMA problems The problems with the PMA are that people:
  • tend to focus on the start and end of a period – e.g. a project – and forget most of what went on in between
  • are good at identifying and collecting important data, but not so good at analyzing these data
  • put emphasis on single, concrete events over statistical data: personal experience takes precedence over numerical data

  22. Improved PMA process
  • Perform a PMA. This will give us an affinity diagram and one or more RCAs.
  • Collect error data, e.g. from the system test:
    – where in the process the errors were introduced
    – why the errors were introduced
    – the correction costs
  • Identify problem causes from the affinity diagram and the RCAs. Prioritize the problems using
    – the participants’ opinions
    – priorities based on information from the error reports
  In some sense this can also be seen as triangulation [13], but in our opinion it is something more – it helps us to focus the results of a people process by using concrete and at least partly objective information.
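One way to realize the combined prioritization in the last step is to blend normalized participant votes with normalized correction costs per problem area. The sketch below is only one possible realization: the 50/50 weighting, the normalization, and the example figures are assumptions, not part of the PMA method itself.

```python
def prioritize(votes, costs, weight=0.5):
    """Rank problem areas by a blend of normalized affinity-diagram
    votes and normalized correction costs from the error reports."""
    def normalize(d):
        total = sum(d.values())
        return {k: v / total for k, v in d.items()}

    v, c = normalize(votes), normalize(costs)
    areas = set(votes) | set(costs)
    score = {a: weight * v.get(a, 0.0) + (1 - weight) * c.get(a, 0.0)
             for a in areas}
    return sorted(score.items(), key=lambda kv: -kv[1])

# Illustrative inputs: votes per affinity group and correction hours
# per problem area from the error reports (hypothetical values).
votes = {"Estimation": 7, "Low level design": 5, "Start of test": 4}
costs = {"Estimation": 120.0, "Low level design": 260.0, "Testing": 90.0}
for area, s in prioritize(votes, costs):
    print(f"{area:20s} {s:.2f}")
```

A design note on this sketch: normalizing votes and costs separately keeps the two scales comparable, so a problem area that dominates either source still surfaces near the top of the combined ranking.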
