the defensible but undesired position to which one retreats when hard pressed.</pre>
</div>
== Ethics ==
=== Aristotle ===
* in Aristotle's view, the study of ethics is essential to understanding the world around us and to finding virtue and happiness
** ''ethikē'' = ethics
** ''aretē'' = virtue or excellence
** ''phronēsis'' = practical or ethical wisdom
** ''eudaimonia'' = "good state" or happiness
* steps to become a virtuous person:
** 1. practicing righteous actions under the guidance of a teacher leads to righteous habits
** 2. righteous habits lead to good character, through which righteous actions become willful
** 3. good character leads to ''eudaimonia''
=== Ethical dilemmas ===
==== The "Trolley problem" ====
* a dilemma created by the need to sacrifice one innocent person in order to save several others (usually given as five)
* scenario:
** a runaway (out-of-control) trolley is heading down a track towards five workers (sometimes presented as five people who are tied up and unable to move)
** there is a secondary track, not in the trolley's original path, with one person on it
** an engineer who sees the situation can divert the trolley to the secondary track, thus killing the one person on it but saving the five on the original track
*** the problem is that the one person was otherwise not in danger and was not wrongfully on the track
*** is that sacrifice ethical?
* the "utilitarian" view holds that it would be ethical and morally responsible to divert the trolley as it would save more lives
** by "utilitarian" we mean a choice or action that benefits the most people, even at the expense of some others
*** i.e. "maximize utility"
* objections to the utilitarian response include:
** the engineer had no intention of harming the five, but by diverting the trolley would make a willful decision to kill the one; therefore the act would be morally objectionable
*** = deliberately harming anyone for any reason is morally wrong
*** = violating the "doctrine of double effect," which holds that deliberately intending harm, even as a means to a good end, is wrong (harm that is merely a foreseen side effect may be permissible)
* the Trolley problem shows up in other situations:
** artificial intelligence, such as driverless vehicles (a toy comparison of the two responses appears after the Three Laws below)
** Isaac Asimov explored moral and ethical dilemmas regarding artificial intelligence in his short-story collection ''I, Robot''
*** Asimov envisioned the '''Three Laws of Robotics'''
click EXPAND to read the Three Laws of Robotics
<div class="mw-collapsible mw-collapsed">
<pre>
First Law
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
Second Law
A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
Third Law
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
</pre></div>
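The contrast between the utilitarian response and the rule-based objection can be restated as two different decision policies. The Python sketch below is purely illustrative and not part of the original notes; the ''Action'' fields and the two policy functions are invented for the example and do not describe any real driverless-vehicle system.
click EXPAND to see the sketch
<div class="mw-collapsible mw-collapsed">
<pre>
# Illustrative sketch only: two toy "decision policies" for a trolley-style choice.
# The Action fields and both policies are invented for this example.
from dataclasses import dataclass


@dataclass
class Action:
    name: str             # description of the option
    deaths: int           # how many people die if this option is taken
    actively_harms: bool  # does the agent deliberately redirect harm onto someone?


def utilitarian_choice(actions):
    """Pick the option with the fewest deaths ("maximize utility")."""
    return min(actions, key=lambda a: a.deaths)


def rule_based_choice(actions):
    """Refuse options that deliberately harm someone; among the rest, minimize deaths."""
    permissible = [a for a in actions if not a.actively_harms]
    return min(permissible or actions, key=lambda a: a.deaths)


options = [
    Action("do nothing (trolley stays on the main track)", deaths=5, actively_harms=False),
    Action("divert the trolley to the secondary track", deaths=1, actively_harms=True),
]
print("utilitarian:", utilitarian_choice(options).name)  # diverts: 1 death is fewer than 5
print("rule-based: ", rule_based_choice(options).name)   # does nothing: refuses active harm
</pre></div>
The two policies disagree on exactly the same inputs, which is the dilemma in computational form: the utilitarian policy diverts the trolley, while the rule-based policy refuses to.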


== Standards/Standardization ==