Joshua Greene (psychologist)

Joshua David Greene (born 1974)[1] is an American experimental psychologist, neuroscientist, and philosopher. He is a professor of psychology at Harvard University. Most of his research and writing has been concerned with moral judgment and decision-making. His recent research focuses on fundamental issues in cognitive science.[2][3]

Joshua Greene
Joshua Greene in 2018
Born: 1974 (age 49–50)
Alma mater: Harvard University (BA); Princeton University (PhD)
Known for: Dual-process theory
Scientific career
Fields: Experimental psychology, moral psychology, neuroscience, social psychology, philosophy
Institutions: Harvard University
Thesis: The Terrible, Horrible, No Good, Very Bad Truth About Morality and What to Do About It (2002)
Doctoral advisors: David Lewis, Gilbert Harman
Website: www.joshua-greene.net

Education and career

Greene attended high school in Fort Lauderdale, Broward County, Florida.[4] He briefly attended the Wharton School of the University of Pennsylvania before transferring to Harvard University.[5] He earned a bachelor's degree in philosophy from Harvard in 1997,[6] followed by a Ph.D. in philosophy at Princeton University under the supervision of David Lewis and Gilbert Harman. Peter Singer also served on his dissertation committee. His 2002 dissertation, The Terrible, Horrible, No Good, Very Bad Truth About Morality and What to Do About It, argues against moral-realist language and in defense of non-realist utilitarianism as a better framework for resolving disagreements.[7] Greene served as a postdoctoral fellow at Princeton in the Neuroscience of Cognitive Control Laboratory before returning to Harvard in 2006 as an assistant professor. In 2011, he became the John and Ruth Hazel Associate Professor of the Social Sciences. Since 2014, he has been a professor of psychology.

Dual-process theory

Greene and colleagues have advanced a dual process theory of moral judgment, suggesting that moral judgments are determined by both automatic, emotional responses and controlled, conscious reasoning. In particular, Greene argues that the "central tension" in ethics between deontology (rights- or duty-based moral theories) and consequentialism (outcome-based theories) reflects the competing influences of these two types of processes:

Characteristically deontological judgments are preferentially supported by automatic emotional responses, while characteristically consequentialist judgments are preferentially supported by conscious reasoning and allied processes of cognitive control.[8]

In one of the first experiments to suggest a moral dual-process model,[4] Greene and colleagues showed that people making judgments about "personal" moral dilemmas (like whether to push one person in front of an oncoming trolley in order to save five others) engaged several brain regions associated with emotion that were not activated by judgments about more "impersonal" dilemmas (like whether to pull a switch to redirect a trolley from a track on which it would kill five people onto a track on which it would kill one person instead).[9] They also found that, on the "personal" dilemmas, subjects who made the intuitively unappealing choice took longer to respond than those who made the more emotionally congruent decision.

A follow-up study compared "easy" personal moral dilemmas, to which subjects responded quickly, with "hard" dilemmas (like the footbridge problem), to which they responded slowly.[10] When responding to the hard dilemmas, subjects showed increased activity in the anterior dorsolateral prefrontal cortex (DLPFC) and inferior parietal lobes (areas associated with cognitive processing), as well as in the anterior cingulate cortex (which has been implicated in detecting conflict between two competing inputs, as in the Stroop task). This comparison showed that harder problems engage different brain regions, but not that activity differs for the same problem depending on the answer given. The second part of the study addressed that question: for a given dilemma, subjects who made the utilitarian choice showed higher activity in the anterior DLPFC and the right inferior parietal lobe than subjects who made the non-utilitarian choice.

These two studies were correlational, but others have since suggested a causal impact of emotional vs. cognitive processing on deontological vs. utilitarian judgments.[11][12][13] A 2008 study[14] by Greene showed that cognitive load caused subjects to take longer to respond when they made a utilitarian moral judgment but had no effect on response time when they made a non-utilitarian judgment, suggesting that the utilitarian thought processes required extra cognitive effort.

Greene's 2008 article "The Secret Joke of Kant's Soul"[15] argues that Kantian/deontological ethics tends to be driven by emotional responses and is best understood as rationalization rather than rationalism, an attempt to justify intuitive moral judgments post hoc, though Greene acknowledges that his argument is speculative and not conclusive. Several philosophers have written critical responses.[16][17][18][19][20][21]

Moral Tribes

Drawing on dual-process theory, as well as evolutionary psychology and other neuroscience work, Greene's book Moral Tribes (2013) explores how our ethical intuitions play out in the modern world.[22]

Greene posits that humans have an instinctive, automatic tendency to cooperate with members of their own social group in tragedy-of-the-commons scenarios ("me versus us"). For example, in a cooperative investment game, people are more likely to do what is best for the group when they are under time pressure or are primed to "go with their gut"; conversely, deliberate calculation can inhibit cooperation.[23] On questions of inter-group harmony ("us versus them"), however, these automatic intuitions run into a problem, which Greene calls the "tragedy of commonsense morality": the same ingroup loyalty that sustains cooperation within a community leads to hostility between communities. In response, Greene proposes a "metamorality" based on a "common currency" that all humans can agree upon, and suggests that utilitarianism—or, as he calls it, "deep pragmatism"—is up to the task.[24]

Reception

Moral Tribes received multiple positive reviews.[25][26][27][28]

Thomas Nagel criticizes the book, arguing that Greene is too quick to conclude utilitarianism specifically from the general goal of constructing an impartial morality; Immanuel Kant and John Rawls, he notes, offer other impartial approaches to ethical questions.[24]

Robert Wright calls[29] Greene's proposal for global harmony ambitious, adding, "I like ambition!" But he also argues that people tend to see facts in a way that serves their ingroup, even when there is no disagreement about the underlying moral principles governing a dispute. "If indeed we're wired for tribalism", Wright explains, "then maybe much of the problem has less to do with differing moral visions than with the simple fact that my tribe is my tribe and your tribe is your tribe. Both Greene and Paul Bloom cite studies in which people were randomly divided into two groups and immediately favored members of their own group in allocating resources—even when they knew the assignment was random." Instead, Wright proposes that "nourishing the seeds of enlightenment indigenous to the world's tribes is a better bet than trying to convert all the tribes to utilitarianism—both more likely to succeed, and more effective if it does."

Greene's metamorality of deep pragmatism has been criticized by Steven Kraaijeveld and Hanno Sauer as resting on conflicting arguments about moral truth.[30]

In Moral Tribes, Greene argues that reasoned thought is important in moral decision-making while also acknowledging the significant role that emotions play in the process, supporting this claim with evidence from neurobiological studies. His willingness to recognize emotion-based moral reasoning has been described as a significant step toward bridging the gap between the continental and analytic schools of philosophy, since the latter tends to prioritize objective reasoning over subjective, emotional approaches.[31]

Awards and distinctions

Greene received the 2012 Stanton Prize from the Society for Philosophy and Psychology.[32]

In 2013, Greene was awarded the Roslyn Abramson Award, given annually to Harvard faculty "in recognition of his or her excellence and sensitivity in teaching undergraduates".[6]

Bibliography

  • Greene, Joshua D.; Sommerville, R. Brian; Nystrom, Leigh E.; Darley, John M.; Cohen, Jonathan D. (2001). "An fMRI investigation of emotional engagement in moral judgment". Science. 293 (5537): 2105–2108. Bibcode:2001Sci...293.2105G. doi:10.1126/science.1062872. PMID 11557895. S2CID 1437941.
  • Greene, Joshua D.; Haidt, Jonathan (2002). "How (and where) does moral judgment work?". Trends in Cognitive Sciences. 6 (12): 517–523. doi:10.1016/S1364-6613(02)02011-9. PMID 12475712. S2CID 6777806.
  • Greene, Joshua D.; Nystrom, Leigh E.; Engell, Andrew D.; Darley, John M.; Cohen, Jonathan D. (2004). "The neural bases of cognitive conflict and control in moral judgment". Neuron. 44 (2): 389–400. doi:10.1016/j.neuron.2004.09.027. hdl:10983/15961. PMID 15473975. S2CID 9061712.
  • Greene, Joshua D. (2008). "The Secret Joke of Kant's Soul". In Sinnott-Armstrong, Walter (ed.). Moral Psychology: The Neuroscience of Morality: Emotion, Brain Disorders, and Development. MIT Press. pp. 35–80. ISBN 978-0-262-19564-5.

References

  1. ^ "AUT - Úplné zobrazení záznamu". Czech National Authority Database. Retrieved May 8, 2023.
  2. ^ Cooper, Dani (August 25, 2015). "Brain turns words into complex thoughts like a computer". Australian Broadcasting Corporation.
  3. ^ Frankland, Steven M.; Greene, Joshua D. (September 15, 2015). "An architecture for encoding sentence meaning in left mid-superior temporal cortex". Proceedings of the National Academy of Sciences. 112 (37): 11732–11737. Bibcode:2015PNAS..11211732F. doi:10.1073/pnas.1421236112. PMC 4577152. PMID 26305927.
  4. ^ a b Ohlson, Kristin. "The Vexing Mental Tug-of-War Called Morality". Discover. No. July–August 2011. Retrieved September 6, 2015.
  5. ^ Greene, Joshua D. (2013). Moral Tribes: Emotion, Reason, and the Gap Between Us and Them. New York: Penguin Press. ISBN 9781101638675.
  6. ^ a b Manning, Colin (May 29, 2013). "Two named Abramson winners". Harvard Gazette. Retrieved September 6, 2015.
  7. ^ Greene, Joshua David (2002). The terrible, horrible, no good, very bad truth about morality and what to do about it (Thesis). CiteSeerX 10.1.1.174.5109. OCLC 54743074. S2CID 170676316.
  8. ^ Greene, Joshua D. (July 2014). "Beyond Point-and-Shoot Morality: Why Cognitive (Neuro)Science Matters for Ethics". Ethics. 124 (4): 695–726. doi:10.1086/675875. S2CID 9063016.
  9. ^ Greene, Joshua D.; Sommerville, R. Brian; Nystrom, Leigh E.; Darley, John M.; Cohen, Jonathan D. (September 14, 2001). "An fMRI Investigation of Emotional Engagement in Moral Judgment". Science. 293 (5537): 2105–2108. Bibcode:2001Sci...293.2105G. doi:10.1126/science.1062872. PMID 11557895. S2CID 1437941.
  10. ^ Greene, Joshua D.; Nystrom, Leigh E.; Engell, Andrew D.; Darley, John M.; Cohen, Jonathan D. (October 2004). "The Neural Bases of Cognitive Conflict and Control in Moral Judgment". Neuron. 44 (2): 389–400. doi:10.1016/j.neuron.2004.09.027. hdl:10983/15961. PMID 15473975. S2CID 9061712.
  11. ^ Mendez, Mario F; Anderson, Eric; Shapira, Jill S (December 2005). "An Investigation of Moral Judgement in Frontotemporal Dementia". Cognitive and Behavioral Neurology. 18 (4): 193–197. doi:10.1097/01.wnn.0000191292.17964.bb. PMID 16340391. S2CID 19276703.
  12. ^ Koenigs, Michael; Young, Liane; Adolphs, Ralph; Tranel, Daniel; Cushman, Fiery; Hauser, Marc; Damasio, Antonio (April 2007). "Damage to the prefrontal cortex increases utilitarian moral judgements". Nature. 446 (7138): 908–911. Bibcode:2007Natur.446..908K. doi:10.1038/nature05631. PMC 2244801. PMID 17377536.
  13. ^ Valdesolo, Piercarlo; DeSteno, David (June 2006). "Manipulations of Emotional Context Shape Moral Judgment". Psychological Science. 17 (6): 476–477. doi:10.1111/j.1467-9280.2006.01731.x. PMID 16771796. S2CID 13511311.
  14. ^ Greene, Joshua D.; Morelli, Sylvia A.; Lowenberg, Kelly; Nystrom, Leigh E.; Cohen, Jonathan D. (June 2008). "Cognitive load selectively interferes with utilitarian moral judgment". Cognition. 107 (3): 1144–1154. doi:10.1016/j.cognition.2007.11.004. PMC 2429958. PMID 18158145.
  15. ^ Greene, Joshua D. (2008). "The Secret Joke of Kant's Soul". In Sinnott-Armstrong, Walter (ed.). Moral Psychology: The Neuroscience of Morality: Emotion, Brain Disorders, and Development. MIT Press. pp. 35–80. ISBN 978-0-262-19564-5.
  16. ^ Lott, Micah (October 2016). "Moral Implications from Cognitive (Neuro)Science? No Clear Route". Ethics. 127 (1): 241–256. doi:10.1086/687337. S2CID 151940241.
  17. ^ Königs, Peter (April 3, 2018). "Two types of debunking arguments". Philosophical Psychology. 31 (3): 383–402. doi:10.1080/09515089.2018.1426100. S2CID 148678250.
  18. ^ Meyers, C. D. (May 19, 2015). "Brains, trolleys, and intuitions: Defending deontology from the Greene/Singer argument". Philosophical Psychology. 28 (4): 466–486. doi:10.1080/09515089.2013.849381. S2CID 146547149.
  19. ^ Kahane, Guy (2012). "On the Wrong Track: Process and Content in Moral Psychology". Mind & Language. 27 (5): 519–545. doi:10.1111/mila.12001. PMC 3546390. PMID 23335831.
  20. ^ Fiala, Brian. "The Secret Emptiness of Greene's Argument".
  21. ^ Kleingeld, Pauline (2014). "Debunking Confabulation: Emotions and the Significance of Empirical Psychology for Kantian Ethics". Kant on Emotion and Value. pp. 146–165. doi:10.1057/9781137276650_8. ISBN 978-1-349-44676-6.
  22. ^ Greene, Joshua (2013). Moral Tribes: Emotion, Reason, and the Gap Between Us and Them. Penguin Press. ISBN 978-1594202605.[non-primary source needed]
  23. ^ Greene, Joshua D. "Deep Pragmatism". Edge. Retrieved November 24, 2013.
  24. ^ a b Nagel, Thomas. "You Can't Learn About Morality from Brain Scans: The problem with moral psychology". New Republic. Retrieved November 24, 2013.
  25. ^ Waytz, Adam (November 2, 2013). "'Moral Tribes' by Joshua Greene". Boston Globe. Retrieved November 24, 2013.
  26. ^ "Moral Tribes: Emotion, Reason, and the Gap Between Us and Them". Kirkus Reviews. August 19, 2013. Retrieved November 24, 2013.
  27. ^ "The Brain's Way Of Dealing With 'Us' and 'Them'". Wall Street Journal. November 23, 2013.
  28. ^ Baggini, Julian (January 3, 2014). "The social animal". Financial Times.
  29. ^ Wright, Robert (October 23, 2013). "Why Can't We All Just Get Along? The Uncertain Biological Basis of Morality". The Atlantic. Retrieved November 24, 2013.
  30. ^ Kraaijeveld, Steven R.; Sauer, Hanno (July 2019). "Metamorality without Moral Truth". Neuroethics. 12 (2): 119–131. doi:10.1007/s12152-018-9378-3. S2CID 149750930.
  31. ^ Bekesi, Aron B. (2016). "The Scientific Discovery of Emotions - A Turning Point in Philosophy". Existential Analysis. 27 (1): 144–154 – via Academia.edu.
  32. ^ "Prizes". Society for Philosophy and Psychology. Retrieved September 6, 2015.