Selection Methods

Presentation Transcript


  1. Selection Methods • Factors to consider • System component costs • Methods

  2. FACTORS • Cost: often hard to estimate • If the technology is adopted without modifications, development costs are easier to estimate • The benefit impact is still hard to know • Intangibles • Market share, better customer service • Better corporate image • Better access to data

  3. Hidden Outcomes • Organizational power shifts • Imposition of work methods • Get more done with fewer people • Big impact on how people work • Subject to technology changes

  4. Hinton & Kaye (1996) • Two capital budgeting approaches: • Capital investment: apply rigorous cost/benefit analysis • Revenue enhancement: disregard cost/benefit details • IT tends to be treated as a capital investment; maybe it shouldn't be • If it is critical to firm strategy • If it is needed to keep up with competitors

  5. IT Budgeting Techniques • FINANCIAL • Discounted cash flow • Long term: reflects the time value of money • Cost-benefit analysis • Need not include the time value of money • Payback • Quick and dirty • Usually sufficient
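A minimal Python sketch contrasting the first and last of these techniques: discounted cash flow (NPV), which reflects the time value of money, and payback, the quick-and-dirty check. The cash-flow figures and the discount rate below are hypothetical placeholders, not numbers from the slides.

```python
# Minimal sketch (hypothetical figures): NPV vs. simple payback.

def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is year 0 (usually a negative outlay)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def payback_period(cash_flows):
    """First year in which cumulative (undiscounted) cash flow turns non-negative."""
    cumulative = 0.0
    for t, cf in enumerate(cash_flows):
        cumulative += cf
        if cumulative >= 0:
            return t
    return None  # never pays back within the horizon

# Hypothetical project (in $000s): outlay now, benefits over four years.
flows = [-1000, 300, 400, 500, 600]
print(npv(0.20, flows))        # discounted: reflects the time value of money
print(payback_period(flows))   # quick and dirty: ignores discounting
```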

  6. IT Risk Factors • Project manager ability (often a hired consultant) • Experience (the firm probably won't have it in ERP) • Experience with programming • Availability of critical equipment and software • Probably not a problem in ERP • Project team completeness • Make it complete • Personnel turnover • Especially after adoption of the ERP • Project team size (make sure it is large enough) • Relative control of the project manager over the project team

  7. IS/IT Project Evaluation Technique Use (Bacon [1992]): Financial

  8. IS/IT Project Evaluation Technique Use (Bacon [1992]): Managerial

  9. IS/IT Project Evaluation Technique Use (Bacon [1992]): Developmental

  10. Cost Benefit Example • Adopt a small ERP ($3 million) • 1st year: BPR • Internal team doubles in cost in year 2 • Consulting costs double in year 2 • Year 2: more hardware • Year 3: finish implementation • ERP operation begins in year 4 • Cost of capital 20% per year

  11. Cost/Benefit Example

  12. NPV – (two benefit growth rates)

  13. Payback – 30% Benefit Growth

  14. Payback – 10% Benefit Growth
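Slides 11 through 14 refer to tables and charts that are not reproduced in this transcript. The Python sketch below shows one way such numbers could be generated: only the $3 million ERP cost, the year-by-year cost pattern, benefits starting in year 4, the 20% cost of capital, and the two growth rates (30% and 10%) come from the slides; every dollar amount and the first-year benefit level are hypothetical assumptions.

```python
# Sketch of the cost/benefit example on slides 10-14. Only the $3M ERP cost,
# the cost pattern, the 20% cost of capital, and the two benefit growth rates
# come from the slides; all dollar amounts below are hypothetical placeholders.

COST_OF_CAPITAL = 0.20

# Hypothetical net costs by year (in $ millions): year 1 BPR + license,
# year 2 doubled internal/consulting costs plus hardware, year 3 finish.
costs = {1: 3.0, 2: 4.5, 3: 2.0}

def net_flows(first_benefit, growth, horizon=8):
    """Net cash flow per year: benefits start in year 4 and grow each year."""
    flows = []
    for year in range(1, horizon + 1):
        benefit = first_benefit * (1 + growth) ** (year - 4) if year >= 4 else 0.0
        flows.append(benefit - costs.get(year, 0.0))
    return flows

def npv(rate, flows):
    """Discount each year's net flow back to the present at the cost of capital."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows, start=1))

def payback_year(flows):
    """First year in which cumulative undiscounted net flow turns non-negative."""
    cumulative = 0.0
    for year, cf in enumerate(flows, start=1):
        cumulative += cf
        if cumulative >= 0:
            return year
    return None

for growth in (0.30, 0.10):   # the two benefit growth rates on slides 12-14
    flows = net_flows(first_benefit=3.0, growth=growth)
    print(growth, round(npv(COST_OF_CAPITAL, flows), 2), payback_year(flows))
```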

  15. Value Analysis • Keen (1981) • DSS benefits are usually very nebulous • Unfair to apply cost-benefit analysis because benefit estimates are unreliable • Costs: identify as in cost-benefit analysis • Benefits: leave in subjective terms • Managerial decision: are you willing to pay this much for that set of benefits?

  16. SMART: Simple Multi-Attribute Rating Technique • develop a hierarchy, score the alternatives, weight the criteria • value = Σ (weight × score)

  17. terminology • objectives: what you want to accomplish • attributes: features of a thing • criteria: measures of things of value • tradeoffs: one alternative is better on one attribute, the other is better on another • Vendor: fast, better perceived quality, higher price • ASP: fast, but less control • In-House: slow, risky, but best fit

  18. SMART technique 1. identify the person whose utilities are to be maximized 2. identify the issue or issues 3. identify the alternatives to be evaluated 4. identify the relevant dimensions of value for evaluating the alternatives (attribute scales) 5. rank the dimensions in order of importance 6. rate the dimensions in importance, preserving ratios 7. sum the importance ratings and divide each by the total to get weights (wi) 8. measure how well each alternative does on each dimension (sij) 9. U = Σ wi sij

  19. points • in Step 4, limit the number of criteria • there are only so many things a human can keep track of at one time • 8 is plenty • if a weight is extremely low, drop that criterion

  20. methodology • Step 3 (alternatives): Vendor, ASP, In-House • Step 5: rank order the criteria • cost > quality > control • Step 6: rate the dimensions, least important = 10 • control = 10, quality = 35, cost = 50

  21. methodology • Step 7: sum, then divide by the total: cost = 50/95 = .526, quality = 35/95 = .368, control = 10/95 = .105 • SWING WEIGHTING (check): give the most important criterion 100, others proportional: cost = 100, quality = 60, control = 25, so cost = 100/185 = .541, quality = .324, control = .135 • maybe average the two: cost .53; quality .35; control .12
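A small Python sketch reproducing the weight arithmetic above; the raw ratings (50/35/10) and the swing-weighting ratings (100/60/25) are the ones on slides 20 and 21.

```python
# Weight normalization and the swing-weighting check from slides 20-21.

def normalize(raw):
    total = sum(raw.values())
    return {k: v / total for k, v in raw.items()}

ratio_weights = normalize({"cost": 50, "quality": 35, "control": 10})
swing_weights = normalize({"cost": 100, "quality": 60, "control": 25})

# "Maybe average" the two elicitations, as the slide suggests.
averaged = {k: round((ratio_weights[k] + swing_weights[k]) / 2, 2)
            for k in ratio_weights}

print(ratio_weights)   # cost ~.526, quality ~.368, control ~.105
print(swing_weights)   # cost ~.541, quality ~.324, control ~.135
print(averaged)        # cost .53, quality .35, control .12
```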

  22. methodology • purpose of swing weighting • the weight inputs are admittedly approximations • eliciting values from a different perspective gives an additional check • should yield greater accuracy

  23. scores • Step 8: score each alternative on each criterion • need as objective a scale as you can get • doesn't have to be linear • COST (lower cost scores higher): maximum feasible cost = 0, minimum available cost = 1.0 • Vendor about 0.8, ASP (outsourced) 1.0, In-House 0

  24. scores • QUALITY: Vendor excellent (1.0), ASP less (0.4), In-House good (0.8) • CONTROL: Vendor average (0.3), ASP low (0.1), In-House maximum (1.0)

  25. calculation of value U = Σ wi sij

                COST    QUALITY   CONTROL   TOTAL
     weights     .53      .35       .12
     Vendor      0.8      1.0       0.3     0.810
     ASP         1.0      0.4       0.1     0.682
     In-House    0.0      0.8       1.0     0.400

     Recommends the Vendor system.
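The same totals can be checked with a few lines of Python; the weights and scores below are the ones from slides 21 through 24, and the output should reproduce the table above (0.810, 0.682, 0.400).

```python
# U = sum(w_i * s_ij) with the averaged weights and the scores from
# slides 21-24; totals should match the table (0.810, 0.682, 0.400).

weights = {"cost": 0.53, "quality": 0.35, "control": 0.12}

scores = {
    "Vendor":   {"cost": 0.8, "quality": 1.0, "control": 0.3},
    "ASP":      {"cost": 1.0, "quality": 0.4, "control": 0.1},
    "In-House": {"cost": 0.0, "quality": 0.8, "control": 1.0},
}

utilities = {alt: sum(weights[c] * s[c] for c in weights)
             for alt, s in scores.items()}

# Highest weighted value first: the Vendor system comes out on top.
for alt, u in sorted(utilities.items(), key=lambda kv: -kv[1]):
    print(f"{alt:9s} {u:.3f}")   # Vendor 0.810, ASP 0.682, In-House 0.400
```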

  26. SMART • provides a very workable means to implement the principles of MAUT • in fact, it can be MORE accurate than MAUT (more realistic scores and tradeoffs) • identify criteria • develop weights over the criteria • identify the available alternatives, measure their scores • simple calculation
