Decision Making Manual: A Toolkit for Making Moral Decisions

Presentation Transcript


  1. Decision Making Manual: A Toolkit for Making Moral Decisions William J. Frey (UPRM) José A. Cruz-Cruz (UPRM) Chuck Huff (St. Olaf)

  2. There is an analogy between design problems and ethical problems

  3. Problem-solving in computing can be modeled on software design • The software development cycle can be presented in terms of four stages: • Problem Specification • Solution Generation • Solution Testing • Solution Implementation • Generate or create options that embody or realize ethical value or worth • We don’t find them, we make them
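
A minimal Python sketch of that four-stage cycle, purely for illustration: the stage names are taken from the slide, while the function and the example problem are assumptions added here, not part of the original toolkit.

```python
from enum import Enum, auto

class Stage(Enum):
    """The four stages named on the slide, in order."""
    PROBLEM_SPECIFICATION = auto()
    SOLUTION_GENERATION = auto()
    SOLUTION_TESTING = auto()
    SOLUTION_IMPLEMENTATION = auto()

def run_cycle(problem: str) -> None:
    # Walk one moral problem through the four stages in definition order.
    for stage in Stage:
        print(f"{stage.name.replace('_', ' ').title()}: {problem}")

run_cycle("LaRue is asked to skip environmental tests and falsify results")
```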

  4. Specify Values (Socio-Technical System Table) • An STS is a system of environments that enables and constrains the practice of business • An STS exhibits different components: • Hardware, Software, Physical Surroundings, Stakeholders (people, groups, & roles), Procedures, Laws (Criminal Law, Civil Law, Statutes & Regulations), Information Systems (collecting, storing, transferring) • Other Components: Financial Markets, Rate Structure (Power Systems), Environment, Technological Context, Supply Chain • An STS is a system whose components are interrelated and interact with each other • STSs embody values • Moral: Justice, Respect, Responsibility, Trust, Integrity • Non-Moral: Financial, Efficiency, Sustainability • STSs exhibit trajectories, i.e., coordinated paths of change
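
One way to picture the STS table is as a list of components, each carrying the values it embeds. The Python sketch below assumes that shape; the component and value categories come from the slide, while the Toysmart-flavored entries are illustrative guesses, not the original table.

```python
from dataclasses import dataclass, field

@dataclass
class STSComponent:
    name: str                      # e.g. "Stakeholders", "Laws", "Information Systems"
    description: str               # what this component looks like in the case at hand
    embedded_values: list[str] = field(default_factory=list)  # moral and non-moral values

@dataclass
class SocioTechnicalSystem:
    case: str
    components: list[STSComponent] = field(default_factory=list)

    def values(self) -> set[str]:
        # All values embodied anywhere in the system (components interact, so look across them).
        return {v for c in self.components for v in c.embedded_values}

sts = SocioTechnicalSystem(
    case="Toysmart bankruptcy",
    components=[
        STSComponent("Stakeholders", "customers, creditors, regulators", ["Privacy", "Justice"]),
        STSComponent("Information Systems", "customer profile database", ["Privacy", "Financial value"]),
        STSComponent("Laws", "bankruptcy statutes, privacy regulations", ["Justice", "Trust"]),
    ],
)
print(sorted(sts.values()))
```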

  5. Classify the problem: • Disagreement on Facts • Did the supervisor sexually harass the employee? (What happened—there are two different versions) • Disagreement on Concepts • Has the supervisor created a hostile environment? (Meaning of hostile environment?) • Conflicts • Conflict between moral values (Toysmart either honors property claims of creditors or privacy rights of customers) • Conflicts between moral and non-moral values (In order to get the chips to clients on time, LaRue has told the quality control team to skip environmental tests and falsify results) • A key value becomes vulnerable • Online activity has magnified the potential harms of cyberslander against companies like Biomatrix • Immediate, Midterm, or Remote Harms • Is it the case that Therac-25 patients are receiving radiation overdoses?

  6. Solution Generation • Don’t fall into the dilemma trap • Assumption that all ethical problems in business offer only two solution forms: do the right thing financially or do the right thing ethically • Brainstorm • Do exercises to unlock creative thought • Start with an individual list • Share your list with others while suspending criticism • Once you have a preliminary list (set a quota) refine it • Eliminate solutions that are impractical • Combine solutions (one is part of another; one is plan A, the other plan B) • Test solutions globally and quickly to trim them down to a manageable list
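
A small sketch of the refinement step above, under assumed example solutions: drop options that fail a quick global test, then chain the survivors as plan A, plan B, and so on. Nothing here beyond the refine-and-combine idea comes from the slide.

```python
def refine(solutions: list[tuple[str, bool]]) -> list[str]:
    # Keep only the options that survive a quick global test, then label them
    # as an ordered sequence of fallback plans (A, B, C, ...).
    practical = [name for name, survives in solutions if survives]
    return [f"Plan {chr(ord('A') + i)}: {name}" for i, name in enumerate(practical)]

brainstormed = [
    ("Negotiate a deadline extension with the client", True),
    ("Falsify the environmental test results", False),
    ("Run a reduced but honest test plan and disclose it", True),
    ("Refuse and escalate to senior management", True),
]
print(refine(brainstormed))
```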

  7. Refined Solution List

  8. Generic Solutions • Gather more information • Nolo Contendere • Be diplomatic. Negotiate with the different parties. Look for a “win-win” solution • Oppose. Stand up to authority. Organize opposition. Document and publicize the wrong • Exit (Get a transfer. Look for another job. Live to fight another day) • Integrate as plans A, B, C, etc. (Try one, then the other if the first doesn’t work.)

  9. Test Solutions • Develop a solution evaluation matrix • Test the ethical implications of each solution • See if the solution violates the relevant code of ethics • Carry out a global feasibility assessment of the solution • What are the situational constraints? • Will these constraints block implementation?

  10. Solution Evaluation Matrix
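
One possible shape for this matrix, inferred from the surrounding slides: one row per refined solution, one column per test. The Python sketch below assumes that layout; the solutions and verdicts filled in are illustrative only, not content from the original matrix.

```python
TESTS = ["Reversibility", "Harm/Benefits", "Publicity", "Code of Ethics", "Feasibility"]

def print_matrix(rows: dict[str, dict[str, str]]) -> None:
    # One row per refined solution, one column per test from slides 9 and 11-14.
    print(f"{'Solution':<38}" + "".join(f"{t:<16}" for t in TESTS))
    for solution, verdicts in rows.items():
        print(f"{solution:<38}" + "".join(f"{verdicts.get(t, '-'):<16}" for t in TESTS))

print_matrix({
    "Negotiate a deadline extension": {
        "Reversibility": "passes", "Harm/Benefits": "net benefit",
        "Publicity": "passes", "Code of Ethics": "no violation", "Feasibility": "check",
    },
    "Falsify the test results": {
        "Reversibility": "fails", "Harm/Benefits": "net harm",
        "Publicity": "fails", "Code of Ethics": "violation", "Feasibility": "-",
    },
})
```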

  11. Reversibility • Does the action still look good when viewed from the standpoint of key stakeholders? • The agent projects into the standpoint of those targeted by the action and views it through their eyes • Avoid the extremes of too little and too much identification with stakeholders • Go beyond your egocentric standpoint, but don’t become lost in the perspective of the other • Empathic and Advisory Projections

  12. Harm / Benefits • What are the likely harms and benefits that will follow from the action under consideration? • What is their magnitude and range? • How are they distributed? • Which alternative produces the most benefits coupled with the least harms? • Avoid too much (trying to factor in all consequences) and too little (leaving out significant consequences)

  13. Publicity Test • What are the values embedded in the action you are considering? • Is it responsible or irresponsible? Just or unfair? Respectful or disrespectful? • Would you want to be publicly associated with this action given the values it embodies? • Others would then view you as responsible, just, and respectful, or as irresponsible, unjust (biased), and disrespectful

  14. A Feasibility Test—Will it Work? • Restate your global feasibility analysis • Are there resource constraints? • Are these fixed or negotiable? • Are there technical or manufacturing constraints? • Are these fixed or negotiable? • Are there interest constraints? • Are these fixed or negotiable?

  15. Feasibility Matrix
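
A feasibility matrix of the kind slide 14 describes can be pictured as one row of constraints per solution, each constraint marked fixed or negotiable. The sketch below assumes that shape; the example constraints are invented for illustration and are not from the original matrix.

```python
from dataclasses import dataclass

@dataclass
class Constraint:
    category: str       # "Resource", "Technical/Manufacturing", or "Interest" (from slide 14)
    description: str
    negotiable: bool    # False means the constraint is fixed

def fixed_constraints(row: list[Constraint]) -> list[Constraint]:
    # Fixed constraints are the ones most likely to block implementation.
    return [c for c in row if not c.negotiable]

row = [
    Constraint("Resource", "Two weeks left before the promised ship date", negotiable=True),
    Constraint("Technical/Manufacturing", "Environmental tests take ten days to run", negotiable=False),
    Constraint("Interest", "The client insists on the original schedule", negotiable=True),
]
for c in fixed_constraints(row):
    print(f"Fixed constraint ({c.category}): {c.description}")
```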

  16. What if there are major constraints? • Try out what Weston calls the “intermediate impossible” (Practical Companion, 38) • Take your ethically, financially, and technically ideal solution • Test its feasibility. If it is lacking… • Modify it as little as possible until it becomes feasible. Then implement the “intermediate impossible.”
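
A toy sketch of that move, under assumptions: starting from the ideal solution, fold in the smallest listed modification at each step until a stand-in feasibility check passes. Weston's own discussion is prose, not an algorithm; this only illustrates the "modify as little as possible" idea, and all names and example strings here are invented.

```python
from typing import Callable

def intermediate_impossible(ideal: str, tweaks: list[str],
                            is_feasible: Callable[[str], bool]) -> str:
    # Start from the ideal solution; while it is infeasible, add one small
    # modification at a time until the feasibility check passes.
    candidate = ideal
    if is_feasible(candidate):
        return candidate
    for tweak in tweaks:
        candidate = f"{candidate}, but {tweak}"
        if is_feasible(candidate):
            return candidate
    return candidate  # still infeasible after every listed modification

solution = intermediate_impossible(
    ideal="Run the full environmental test suite before shipping",
    tweaks=["test a statistically chosen sample", "disclose the reduced scope to the client"],
    is_feasible=lambda s: "sample" in s,   # stand-in for the feasibility matrix
)
print(solution)
```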

  17. Final Considerations • Has your problem shifted? • Check over your refined solution list and your final solution. Sometimes the process moves from one problem to another. If so, re-specify your problem given what you have learned. • Have you opened all possible doors to solving your problem? • Use multiple framings and resist the dilemma trap

  18. Some Readings • Anthony Weston. (2002). A Practical Companion to Ethics, Second Edition. Oxford, UK: Oxford University Press. • Weston has several excellent suggestions for brainstorming solutions to ethical problems. He also discusses how to avoid the dilemma trap. • Huff, Frey, and Cruz. Good Computing. (Book under development with Jones and Bartlett.) • The manuscript describes the four-stage software development cycle that is used as a model here for problem-solving. • Caroline Whitbeck. (1998). Ethics in Engineering Practice and Research. Cambridge, UK: Cambridge University Press. • Whitbeck provides an illuminating discussion of the analogy between ethics and design problems.
