
Using MS Project for Execution & Control



  1. Using MS Project for Execution & Control • First, make certain your project plan is complete and final • Second, save it as a baseline • Begin entering actual information • Actual costs • Percentage complete

  2. Tracking: MS Project will track— • Task start dates • Task finish dates • Task duration • Task cost and work • Percentage of task that is complete

  3. Getting Earned Value Data Visible • You can go to View and replace the Entry table with the Earned Value table • Or, you can add the earned value columns to your existing table through the Insert Column facility • The columns are BCWP, BCWS, ACWP, CV, SV, SPI, CPI, etc. • You can also select the Tracking Gantt chart from the View Bar on the left-hand side of MS Project
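The earned value columns listed above are simple arithmetic on three inputs. A minimal Python sketch of the standard formulas behind them (variable names follow the slide's column names, not MS Project's internal API):

```python
# Earned-value arithmetic behind the MS Project columns named above.
# BCWS = budgeted cost of work scheduled (planned value)
# BCWP = budgeted cost of work performed (earned value)
# ACWP = actual cost of work performed (actual cost)

def earned_value_metrics(bcws, bcwp, acwp):
    """Return the derived earned-value indicators as a dict."""
    return {
        "CV": bcwp - acwp,    # cost variance (negative = over budget)
        "SV": bcwp - bcws,    # schedule variance (negative = behind schedule)
        "CPI": bcwp / acwp,   # cost performance index (< 1 = over budget)
        "SPI": bcwp / bcws,   # schedule performance index (< 1 = behind)
    }

# A task planned at $10,000 to date, with $8,000 of work earned
# and $9,000 actually spent, is both over budget and behind schedule:
m = earned_value_metrics(bcws=10_000, bcwp=8_000, acwp=9_000)
print(m)  # CV = -1000, SV = -2000, CPI ≈ 0.889, SPI = 0.8
```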

  4. First, set the baseline and save it • Go to the Project tab • Click Set Baseline

  5. Entering actual start & finish dates for a task • On the View Bar, click Gantt Chart • In the Task Name field, select the task to update • On the Tools menu, point to Tracking and click Update Tasks • Under Actual, type the dates in the Start and Finish boxes

  6. To enter Actual Costs • You can switch to the Cost table • Or insert the Actual Cost column into the Entry table • To allow manual entry: click File, then Options, then Schedule • Uncheck “Actual costs are always calculated by Project” • Click OK

  7. Indicating progress on a task as a percentage • In the Task Name field of the Gantt chart, double-click the task—this brings up the Task Information dialog • Select the General tab • In the Percent complete box, type a whole number between 0 and 100
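Percent complete is what drives a task's earned value. As a simplified sketch (Project also supports physical % complete and other earned-value methods), BCWP for a task is its baseline cost scaled by the reported completion:

```python
# Simplified model of how % complete feeds earned value:
# BCWP = baseline (budgeted) cost x fraction of the task reported done.
def task_bcwp(baseline_cost, percent_complete):
    if not 0 <= percent_complete <= 100:
        raise ValueError("percent complete must be a whole number 0-100")
    return baseline_cost * percent_complete / 100

print(task_bcwp(5_000, 40))  # a $5,000 task reported 40% done earns 2000.0
```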

  8. Entering actual costs for a resource assignment • On the Tools menu, click Options, then click the Calculation tab • Clear the “Actual costs are always calculated by MS Project” check box • Click OK • On the View Bar, click Task Usage • On the View menu, point to Table, and click Tracking • Drag the divider bar to the right to view the Actual Cost field • In the Actual Cost field, type the actual cost for the assignment you want to update

  9. Process Maturity and Project Closeout James R. Burns

  10. Maturity Models • Software Quality Function Deployment • Capability Maturity Model • Project Maturity Model • See pages 344-347 (approx.) of Schwalbe, 7th Edition

  11. Quality Function Deployment • Translates the “voice of the customer” into technical design requirements • Customer is King • Displays requirements in matrix diagrams • First matrix called “house of quality” • Series of connected houses

  12. The Quality House • [Figure: house-of-quality schematic] • Its numbered sections: (1) customer requirements, with importance ratings, (2) relationship matrix, (3) design characteristics, (4) competitive assessment, (5) trade-off matrix, (6) target values

  13. Competitive Assessment • Customer view on a 1 (poor) to 5 (good) scale; A and B are competing irons, X is ours; letters listed from lowest to highest rating, with importance in parentheses
  Irons well: • Presses quickly (9): B, A, X • Removes wrinkles (8): A/B, X • Doesn’t stick to fabric (6): X, B/A • Provides enough steam (8): A/B, X • Doesn’t spot fabric (6): X, A/B • Doesn’t scorch fabric (9): A, X/B
  Easy and safe to use: • Heats quickly (6): X, B, A • Automatic shut-off (3): A/B/X • Quick cool-down (3): X, A, B • Doesn’t break when dropped (5): A/B, X • Doesn’t burn when touched (5): A/B, X • Not too heavy (8): X, A, B

  14. Relationship Matrix • [Figure: relationship matrix marking + (positive) and − (negative) relationships between each customer requirement and the design characteristics: energy needed to press, weight of iron, size of soleplate, thickness of soleplate, material used in soleplate, number of holes, size of holes, flow of water from holes, time required to reach 450ºF, time to go from 450º to 100º, protective cover for soleplate, automatic shutoff; the requirements are grouped under “Irons well” and “Easy and safe to use”]

  15. Trade-off Matrix • [Figure: the roof of the house of quality, marking + and − trade-offs between pairs of the design characteristics, from energy needed to press through automatic shutoff]

  16. Objective Measures • Values listed as Iron A / Iron B / Our Iron (X); estimated impact and estimated cost are on a 0-5 scale; an asterisk marks a planned design change toward the stated target
  • Energy needed to press (ft-lb): 3 / 4 / 2; impact 3, cost 3
  • Weight of iron (lb): 1.4 / 1.2 / 1.7; impact 4, cost 3, target 1.2 *
  • Size of soleplate (in.): 8x4 / 8x4 / 9x5; impact 4, cost 3, target 8x5 *
  • Thickness of soleplate (cm): 2 / 1 / 4; impact 4, cost 3, target 3 *
  • Material used in soleplate (type): SS / MG / T; impact 5, cost 4, target SS *
  • Number of holes (ea): 27 / 27 / 35; impact 4, cost 3, target 30 *
  • Size of holes (mm): 15 / 15 / 15; impact 3, cost 3
  • Flow of water from holes (oz/s): 0.5 / 0.3 / 0.7; impact 2, cost 3
  • Time required to reach 450ºF (sec): 45 / 35 / 50; impact 5, cost 4, target 30 *
  • Time to go from 450º to 100º (sec): 500 / 350 / 600; impact 5, cost 4, target 500 *
  • Protective cover for soleplate (Y/N): N / N / N; impact 3, cost 5
  • Automatic shutoff (Y/N): Y / Y / Y; impact 0, cost 2
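House-of-quality data like the slides above is often reduced to a ranked list of design characteristics by weighting each relationship (a common QFD convention is strong = 9, medium = 3, weak = 1, though these slides mark only + and −) and multiplying by customer importance. A minimal Python sketch; the relationship strengths below are made up for illustration:

```python
# Hedged QFD sketch: rank design characteristics by summing
# (customer importance x relationship strength) down each column.
# The 9/3/1 strengths are a common convention, not from the slides.
IMPORTANCE = {"Presses quickly": 9, "Removes wrinkles": 8, "Not too heavy": 8}

# Hypothetical relationship strengths (strong = 9, medium = 3, weak = 1).
RELATIONSHIPS = {
    "Weight of iron": {"Presses quickly": 3, "Not too heavy": 9},
    "Material used in soleplate": {"Presses quickly": 9, "Removes wrinkles": 9},
}

def characteristic_scores(importance, relationships):
    """Absolute importance of each design characteristic."""
    return {
        char: sum(importance[req] * weight for req, weight in reqs.items())
        for char, reqs in relationships.items()
    }

print(characteristic_scores(IMPORTANCE, RELATIONSHIPS))
# {'Weight of iron': 99, 'Material used in soleplate': 153}
```

The highest-scoring characteristics are the ones most worth targeting for design changes, which is how the asterisked targets on the objective-measures slide would be chosen.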

  17. Capability Maturity Model • Developed in preliminary form by Watts Humphrey (published in his book Managing the Software Process, which appeared in 1989) • Refined by the SEI (Software Engineering Institute) at Carnegie Mellon University in Pittsburgh • Known as the CMM • Discussed in Schwalbe, pages 344-347 (approx.)

  18. Immature Software Organizations • Processes are ad hoc, and occasionally chaotic. • Processes are improvised by practitioners ON THE FLY. • Testing, reviews and walkthroughs usually curtailed under stress. • Quality is unpredictable.

  19. Immature Software Organizations, Cont’d • Costs and schedules are usually exceeded. • Management is reactive, usually reduced to firefighting. • Success rides on individual talent and heroic effort. • Technology benefits are lost in the noise.

  20. Mature Software Organizations • Processes are defined and documented. • Roles and responsibilities are clear. • Product and process are measured. • Projects finish on time and within budget. • Management has time to plan, monitor, and communicate.

  21. Mature Software Organizations, Cont’d • Quality, costs, and schedules are predictable • Management committed to continuous improvement. • Technology is used effectively within defined processes.

  22. Software Process Definition • Project Planning • Project Management • Software Engineering Procedures • Software standards • Software Quality Evaluation • Software Configuration management

  23. The Five Levels of Software Process Maturity • INITIAL • REPEATABLE • DEFINED • MANAGED • OPTIMIZING

  24. Five Levels

  25. Initial • Software processes are ad hoc, even chaotic • The software processes are not defined • Success depends on individual effort • The environment is not stable

  26. Initial, Continued • The benefits of software engineering practices are undermined • Planning is nonexistent or ineffective • Process capability is unpredictable because the software process is constantly changed or modified as the work progresses

  27. Repeatable • Basic project management policies and procedures are established • Cost, schedule and functionality (scope) are tracked by module and task • A process discipline is put in place to repeat earlier successes • Managing new projects is based on experience with similar projects

  28. Repeatable, Continued • Basic software management controls are installed • Estimations of cost and time to complete are based on history for similar projects • Problems are identified and documented • Software requirements are baselined (made tough to change)

  29. Repeatable, Continued • Project standards are defined • Project teams work with their customers and subcontractors to establish stable, managed working environments • Process is under the control of a project management system that is driven by performance on previous projects • A project performance database is defined and populated

  30. Defined • Software processes are documented • Software processes are standardized and integrated organization-wide • All projects use documented and approved versions of the organization’s processes for developing and maintaining software • A Software Engineering Process Group (SEPG) is created to facilitate process definition and improvement efforts

  31. Defined, Continued • Organization-wide training programs are implemented • Organization-wide standard software processes can be refined to encompass the unique characteristics of the project • A peer review process is used to enhance product quality • Process capability is stable and based on a common understanding of processes, roles, and responsibilities in a defined process

  32. Managed • Quantitative quality goals are defined • Product quality and productivity are measured and collected • Both processes and products are quantitatively understood • Both processes and products are controlled using detailed measures • A productivity and quality database is defined

  33. Managed, Continued • Projects achieve control by narrowing the variation in performance to within acceptable boundaries • Process variation is controlled by use of a strategic business plan that details which product lines to pursue • Risks associated with moving up the learning curve of a new application domain are known and carefully managed • Process capability is measured and operating within measurable limits

  34. Optimizing • Continuous process improvement is enabled by quantitative feedback • Continuous process improvement is assessed from testing innovative ideas and technologies • Weak process elements are identified and strengthened • Defect prevention is explicit

  35. Optimizing, Cont’d • Statistical evidence is available on process effectiveness • Innovations that exploit the best software engineering practices are identified • Improvement occurs from • INCREMENTAL ADVANCEMENTS IN EXISTING PROCESSES • INNOVATIONS USING NEW TECHNOLOGIES AND METHODS

  36. How are firms doing?? • Many U.S. firms have reached the highest level, OPTIMIZING • Indian firms may be doing better

  37. Organizational Project Management Maturity Model (OPM3)
  • 1. Ad-Hoc: The project management process is disorganized, and occasionally even chaotic. The organization has not defined systems and processes, and project success depends on individual effort. There are chronic cost and schedule problems.
  • 2. Abbreviated: There are some project management processes and systems in place to track cost, schedule, and scope. Project success is largely unpredictable, and cost and schedule problems are common.
  • 3. Organized: There are standardized, documented project management processes and systems that are integrated into the rest of the organization. Project success is more predictable, and cost and schedule performance is improved.
  • 4. Managed: Management collects and uses detailed measures of the effectiveness of project management. Project success is more uniform, and cost and schedule performance conforms to plan.
  • 5. Adaptive: Feedback from the project management process and from piloting innovative ideas and technologies enables continuous improvement. Project success is the norm, and cost and schedule performance is continuously improving.

  38. Enter CMMI: Capability Maturity Model Integration • In 2007, the SEI asserted that it would no longer support the old SW-CMM. • On Dec 31, 2007 all SW-CMM appraisal results were expired • The purpose of the CMMI was to focus process maturity more towards project performance • Organizations must now upgrade to the CMMI • The CMMI is vastly improved over the CMM • Emphasis is on business needs, integration and institutionalization

  39. CMMI Staged Representation - 5 Maturity Levels (process maturity increases from Level 1 to Level 5)
  • Level 1 Initial: Processes are unpredictable, poorly controlled, reactive.
  • Level 2 Managed: Processes are planned, documented, performed, monitored, and controlled at the project level. Often reactive.
  • Level 3 Defined: Processes are well characterized and understood. Processes, standards, procedures, tools, etc. are defined at the organizational (Organization X) level. Proactive.
  • Level 4 Quantitatively Managed: Processes are controlled using statistical and other quantitative techniques.
  • Level 5 Optimizing: Process performance is continually improved through incremental and innovative technological improvements.

  40. CMMI Origins • The CMMI was derived from the • SW-CMM—Capability Maturity Model for Software • EIA/IS—Electronic Industries Alliance Interim Standard • IPD-CMM—Capability Maturity Model for Integrated Product Development • CMMI architecture is open and designed to accommodate additional disciplines, like • CMMI-DEV—processes for development • CMMI-ACQ—processes for acquisition (supplier sourcing) • CMMI-SVC—processes required for services

  41. CMMI Capability Levels • Level 0: Incomplete • No goal. • Level 1: Performed • The process supports and enables achievement of the specific goals of the process area by transforming identifiable input work products to produce identifiable output work products. • Level 2: Managed • The process is institutionalized as a managed process. • Level 3: Defined • The process is institutionalized as a defined process. • Level 4: Quantitatively Managed • The process is institutionalized as a quantitatively managed process. • Level 5: Optimizing • The process is institutionalized as an optimizing process.

  42. Use of this tool has shown… • The Engineering and Construction Industries have a higher level of maturity than do the information systems and software development disciplines

  43. Completing and Terminating a Project James Burns

  44. Completing • Integration Testing • Regression methods • Final Testing • Acceptance Testing • Installation/Conversion • Training

  45. Purpose of Acceptance Testing • To get paid every dime that you are owed!! • When is the best time to write the Acceptance Test Plan? • Why? • Who dictates what those tests will consist of? • Should there be at least one test for each and every defined requirement?
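The one-test-per-requirement question can be enforced mechanically with a requirements-to-tests traceability matrix. A hypothetical sketch; the requirement IDs and test names are invented for illustration:

```python
# Hedged sketch: verify every requirement ID is exercised by at least
# one acceptance test. All IDs and names below are illustrative.
REQUIREMENTS = {"REQ-001", "REQ-002", "REQ-003"}

TEST_COVERAGE = {  # test name -> requirements it exercises
    "test_login": {"REQ-001"},
    "test_report_export": {"REQ-002", "REQ-003"},
}

def uncovered(requirements, coverage):
    """Return the requirements with no acceptance test at all."""
    covered = set().union(*coverage.values()) if coverage else set()
    return requirements - covered

print(sorted(uncovered(REQUIREMENTS, TEST_COVERAGE)))  # [] -> all covered
```

Run as part of plan review, this turns "is every requirement tested?" from a judgment call into a pass/fail check.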

  46. Final, Thorough Test • Do beta testing?? • Run some old integration tests • Devise true-to-life tests • Try to overload the system • Try to break it by entering wrong inputs, out of range values, etc. • Test user documentation as well.
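"Wrong inputs, out-of-range values" is classic boundary-value testing: probe both edges of every valid range and confirm the system rejects what lies outside. A minimal sketch against a hypothetical validator (the function and its limits are invented for illustration):

```python
# Hedged sketch of boundary / invalid-input testing against a
# hypothetical input validator; the 1-999 range is made up.
def validate_quantity(value):
    """Accept integer quantities from 1 to 999 inclusive."""
    if not isinstance(value, int) or isinstance(value, bool):
        raise TypeError("quantity must be an integer")
    if not 1 <= value <= 999:
        raise ValueError("quantity out of range")
    return value

# Probe the boundaries and the failure modes, as the slide suggests.
for good in (1, 999):
    assert validate_quantity(good) == good
for bad in (0, 1000, -5):
    try:
        validate_quantity(bad)
    except ValueError:
        pass  # rejected, as expected
    else:
        raise AssertionError(f"{bad} should have been rejected")
print("boundary checks passed")
```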

  47. Installation • Going live

  48. Training • Usually, not enough budget is set aside for training • At the mid-market level and lower, training budgets are slim • On-line, context-sensitive help is one answer

  49. Conversion • Crash (direct cutover) • Parallel • Pilot

  50. Customer Survey • Degree to which objectives were achieved? • Degree to which users accepted and endorsed the product • Overall satisfaction level • Best if done by an outside survey agency or firm
