Philosophy Discussion Prompts on utilitarianism (Mill) and deontology (Kant)
Respond to one of the following prompts, stating clearly which prompt you are addressing:
PROMPT #1: MORAL THEORIES. Which of the three moral theories discussed in lecture — utilitarianism (Mill), deontology (Kant), and virtue ethics (Aristotle) — do you support? Explain your answer. Could these theories be combined?
PROMPT #2: TECHNOLOGICAL INSTRUMENTALISM. Kevin Kelly (founding executive editor of Wired magazine) claims that no technology is ever completely out of bounds: having a new technology is always an unalloyed good, and it is always better to have a new technology at our disposal and simply seek to manage it properly. Do you agree with Kevin Kelly? Explain your answer.
PROMPT #3: RISK ASSESSMENT. “Cost-benefit analysis is designed as a method of quantification, so it surely is better able to deal with more quantifiable aspects of the issues it confronts. But this limitation is in itself ethically neutral unless it can be shown that the quantifiable considerations systematically push decisions in a particular direction. Its detractors must show that the errors of cost-benefit analysis are systematically unjust or ineffective – for example, that it frequently helps the rich at the expense of the poor, or despoils the environment to the benefit of industry, or vice versa.” (Leonard & Zeckhauser 1983: 8). Do you agree with this position? Explain your answer.
PROMPT #4: ROBOT PERSONS. Should we allow robots to become moral agents and thus persons? Explain your answer.
PROMPT #5: MORAL RESPONSIBILITY. When there are crashes involving driverless cars, who should be held responsible? Explain your answer.
PROMPT #6: TROLLEY PROBLEM. What would you do in the Bystander at the Switch scenario? (A) Throw the switch in order to maximize well-being (five people surviving is better than one surviving). (B) Not throw the switch, because doing so would be a form of killing, and killing is inherently wrong. Explain and defend your answer.