
Reviews and Inspections



Presentation Transcript


  1. Reviews and Inspections

  2. Types of Evaluations
  • Formal Design Reviews
    • conducted by senior personnel or outside experts
    • uncover potential problems
  • Inspections and Walkthroughs
    • done by peers
    • detect errors, adherence to standards, etc.
  • Verification (not really an FTR)
    • Unit Test
    • Integration Test
    • Usability Test

  3. Formal Reviews
  • Reviewers should be senior personnel and/or outside experts
  • Outcome:
    • approve
    • approve pending changes
    • reject
  • Review Leader should not be the Project Leader
  • Usually done at the end of a phase
    • very appropriate for SRS and Design
    • sometimes appropriate for code
  text section 8.2

  4. Sample Design Review Checklist
  • Well-structured
  • Simple
  • Efficient
  • Adequate
  • Flexible
  • Practical
  • Implementable

  5. General:
  • Does the architecture convey a clear vision of the system that can be used for further development?
  • Is the architecture structured to support likely changes?
  • Does the architecture describe the system at a high level of abstraction? (No interface or implementation details.)
  • Does the architecture cleanly decompose the system?
  • Is the architecture independent of the infrastructure used to develop the system?
  • Has maintainability been considered?
  • Is there no duplicate functionality in the architecture?
  Complete:
  • Are software requirements reflected in the software architecture?
  • Is effective modularity achieved? Are modules functionally independent?
  • Does each module/class have an understandable name?
  • Is each association well named?
  • Is each association's and aggregation's cardinality correct?
  Correct:
  • Does each association reflect a relationship that exists over the lives of the related modules/classes?
  • Does the architecture have loose coupling and good cohesion?
  www.cs.trincoll.edu/~hellis2/CPSC240/Project/DesignReviewChecklist.doc
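The "loose coupling and good cohesion" checklist item can be made concrete with a small sketch. The class and function names below are hypothetical, invented purely for illustration: the first caller depends only on a public method, while the second reaches into another module's internals, which is exactly the kind of coupling a design reviewer would flag.

```python
class SalesData:
    """A cohesive module: it owns its data and exposes one clear operation."""

    def __init__(self, totals):
        self._totals = totals  # internal representation, free to change

    def total(self):
        return sum(self._totals)


def format_report(data):
    # Loose coupling: relies only on the public total() method.
    return f"Total sales: {data.total()}"


def format_report_tight(data):
    # Tight coupling: reaches into a private attribute, so any change
    # to SalesData's internals silently breaks this caller.
    return f"Total sales: {sum(data._totals)}"


print(format_report(SalesData([10, 20, 30])))  # Total sales: 60
```

A reviewer applying the checklist would accept `format_report` and ask for `format_report_tight` to be rewritten against the public interface.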

  6. Sample Design Walkthrough
  • Does the algorithm accomplish the desired function?
  • Is the algorithm logically correct?
  • Is the interface consistent with the architectural design?
  • Is the logical complexity reasonable?
  • Have error handling and "anti-bugging" been specified?
  • Are local data structures properly defined?
  • Are structured programming constructs used throughout?
  • Is the design detail amenable to the implementation language?
  • Which operating-system or language-dependent features are used?
  • Is compound or inverse logic used?
  • Has maintainability been considered?
  stolen from Pressman
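Two of the walkthrough items above, "anti-bugging" (defensive checks on inputs) and avoiding compound or inverse logic, can be sketched in a few lines. The function and its validation rules are assumptions made up for this example, not part of the original checklist.

```python
def average(values):
    # Anti-bugging: validate inputs up front instead of letting a
    # bad argument fail somewhere deep in the computation.
    if not values:
        raise ValueError("average() requires a non-empty sequence")
    if not all(isinstance(v, (int, float)) for v in values):
        raise TypeError("average() requires numeric values")
    return sum(values) / len(values)


# Inverse/compound logic a walkthrough would question:
#     if not (x < 0 or x > 100): ...
# The equivalent positive form is easier to verify at a glance:
#     if 0 <= x <= 100: ...

print(average([2, 4, 6]))  # 4.0
```

In a walkthrough, the reviewer checks that such guards exist for every external input and that conditions are stated in their simplest positive form.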

  7. Peer Reviews
  • guided by:
    • checklists,
    • standards,
    • past problems
  • attendees:
    • review leader
    • the author
    • scribe
    • folks with domain knowledge
    • possibly an SQA team member (for standards)
  Why schedule a meeting with so many people? Why not just have two people review the item without a meeting?
  text section 8.3

  8. Inspection Process
  • pre-meeting
    • read the document ahead of time
  • meeting
    • author presents an overview
    • review team asks questions and expresses opinions
  • after meeting
    • scribe prepares a summary
    • team approves the summary
  • follow up
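The scribe's summary and the follow-up step can be modeled as a simple defect log. This is a minimal sketch, assuming a record format of my own invention (location, description, severity, resolved flag); real inspection forms vary by organization.

```python
from dataclasses import dataclass, field


@dataclass
class Finding:
    location: str       # e.g. "SRS section 3.2"
    description: str
    severity: str       # e.g. "major" or "minor"
    resolved: bool = False


@dataclass
class InspectionSummary:
    artifact: str
    findings: list = field(default_factory=list)

    def open_findings(self):
        # Follow-up step: the leader verifies every finding is
        # resolved before the inspection can be closed.
        return [f for f in self.findings if not f.resolved]


summary = InspectionSummary("Design document v1.2")
summary.findings.append(Finding("section 2", "missing error path", "major"))
print(len(summary.open_findings()))  # 1
```

The point of the structure is traceability: the meeting produces findings, the team approves the list, and follow-up is complete only when `open_findings()` is empty.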

  9. Inspection Guidelines
  • Review the product, not the person!
  • Find errors, don't try to solve them!
  • Keep records
    • Take written notes.
    • Review your earlier reviews.
  • Allocate resources and schedule time for FTRs.
    • 3 to 5 people
  • Conduct training for reviewers
  • Keep it short
    • limit debate and rebuttal
    • Set an agenda and keep to it.
    • no more than two hours of preparation
    • small portions only
      • a narrow focus increases the likelihood of finding an error
    • meeting duration less than two hours

  10. Examples
  • Case: Software Review
    • What went wrong?
  • NASA's FTR Guidelines and Sample Checklists

  11. Next…
  • Testing
    • Unit Testing
    • Integration Testing
