Moral patienthood

Moral patienthood[1] (also called moral patience,[2] moral patiency,[3] and moral status[4][5]) is the state of being eligible for moral consideration by a moral agent.[4] In other words, the morality of an action can depend on how it affects or relates to moral patients.

Notions of moral patienthood in non-human animals[6][7] and artificial entities[8][9] have been academically explored.

Definition

Most authors define moral patients as "beings that are appropriate objects of direct moral concern".[4] This category may, and usually does, include moral agents. For instance, Charles Taliaferro says: "A moral agent is someone who can bring about events in ways that are praiseworthy or subject to blame. A moral patient is someone who can be morally mistreated. All moral agents are moral patients, but not all moral patients (human babies, some nonhuman animals) are moral agents."[10]

Narrow usage

Some authors use the term in a narrower sense, on which moral patients are "beings who are appropriate objects of direct moral concern but are not (also) moral agents".[4] Tom Regan's The Case for Animal Rights used the term in this narrow sense.[11] This usage was shared by other authors who cited Regan, such as Nicholas Bunnin and Jiyuan Yu's Blackwell Dictionary of Western Philosophy,[11] Dinesh Wadiwel's The War Against Animals,[12] and the Encyclopedia of Population.[13] These authors did not deny that moral agents are eligible for moral consideration; they simply defined "moral patient" differently.

Relationship with moral agency

In their paper "On the Morality of Artificial Agents", Luciano Floridi and J. W. Sanders define moral agents as "all entities that can in principle qualify as sources of moral action" and, in accordance with the common usage, define moral patients as "all entities that can in principle qualify as receivers of moral action".[14] However, they note that, besides the inclusion of agents within the class of patients, other relationships between moral agency and moral patienthood are possible. Marian Quigley's Encyclopedia of Information Ethics and Security summarizes the possibilities they give:

How can we characterize the relationship between ethical agents and patients? According to Floridi and Sanders (2004), there are five logical relationships between the class of ethical agents and the class of patients: (1) agents and patients are disjoint, (2) patients can be a proper subset of agents, (3) agents and patients can intersect, (4) agents and patients can be equal, or (5) agents can be a proper subset of patients. Medical ethics, bioethics, and environmental ethics “typify” agents and patients when the patient is specified as any form of life. Animals, for example, can be moral patients but not moral agents. Also, there are ethics that typify moral agenthood to include legal entities (especially human-based entities) such as companies, agencies, and artificial agents, in addition to humans.[15]
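
Expressed set-theoretically (a minimal illustrative sketch; writing $A$ for the class of moral agents and $P$ for the class of moral patients is an assumption made here for clarity, not Floridi and Sanders' own notation), the five possibilities are:

1. $A \cap P = \emptyset$ (agents and patients are disjoint)
2. $P \subsetneq A$ (patients are a proper subset of agents)
3. $A \cap P \neq \emptyset$, with neither class contained in the other (the classes intersect)
4. $A = P$ (the classes are equal)
5. $A \subsetneq P$ (agents are a proper subset of patients, as in the common usage described above)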

Mireille Hildebrandt notes that Floridi and Sanders, in their paper, spoke of "damage" instead of "harm", and that in doing so, they "avoid the usual assumption that an entity must be sentient to count as a patient."[16]

History

In 2021, Open Philanthropy recommended a grant of $315,500 to "support research related to moral patienthood and moral weight."[17]

References

  1. ^ Haji, Ishtiyaque; Bernstein, Mark H. (November 2001). "On Moral Considerability: An Essay on Who Morally Matters". Philosophy and Phenomenological Research. 63 (3): 730. doi:10.2307/3071172. JSTOR 3071172.
  2. ^ Zhou, Xinyue; Guo, Siyuan; Huang, Rong; Ye, Weiling (2020), Wu, Shuang; Pantoja, Felipe; Krey, Nina (eds.), "Think versus Feel: Two Dimensions of Brand Anthropomorphism: An Abstract", Marketing Opportunities and Challenges in a Changing Global Marketplace, Cham: Springer International Publishing, pp. 351–352, doi:10.1007/978-3-030-39165-2_138, ISBN 978-3-030-39164-5, retrieved 2024-04-16
  3. ^ Danaher, John (March 2019). "The rise of the robots and the crisis of moral patiency". AI & Society. 34 (1): 129–136. doi:10.1007/s00146-017-0773-9. ISSN 0951-5666.
  4. ^ a b c d Audi, Robert, ed. (2015). The Cambridge Dictionary of Philosophy (3 ed.). Cambridge: Cambridge University Press. doi:10.1017/cbo9781139057509. ISBN 978-1-139-05750-9.
  5. ^ Jaworska, Agnieszka; Tannenbaum, Julie (2023), "The Grounds of Moral Status", in Zalta, Edward N.; Nodelman, Uri (eds.), The Stanford Encyclopedia of Philosophy (Spring 2023 ed.), Metaphysics Research Lab, Stanford University, retrieved 2024-04-16
  6. ^ Lan, T.; Sinhababu, N.; Carrasco, L. R. (2022). "Recognition of intrinsic values of sentient beings explains the sense of moral duty towards global nature conservation". PLoS ONE. 17 (10): e0276614. doi:10.1371/journal.pone.0276614.
  7. ^ Müller, N. D. (2022). "Kantian Moral Concern, Love, and Respect". Kantianism for Animals. The Palgrave Macmillan Animal Ethics Series. Cham: Palgrave Macmillan. doi:10.1007/978-3-031-01930-2_2.
  8. ^ Balle, S. N. (2022). "Empathic responses and moral status for social robots: an argument in favor of robot patienthood based on K. E. Løgstrup". AI & Society. 37: 535–548. doi:10.1007/s00146-021-01211-2.
  9. ^ Harris, J.; Anthis, J. R. (2021). "The Moral Consideration of Artificial Entities: A Literature Review". Science and Engineering Ethics. 27: 53. doi:10.1007/s11948-021-00331-8.
  10. ^ Taliaferro, Charles; Marty, Elsa J., eds. (2018). A Dictionary of Philosophy of Religion (2nd ed.). New York: Bloomsbury Academic. ISBN 978-1-5013-2523-6.
  11. ^ a b Bunnin, Nicholas; Yu, Jiyuan (2004). The Blackwell dictionary of Western philosophy. Malden, MA: Blackwell Pub. ISBN 978-1-4051-0679-5.
  12. ^ Wadiwel, Dinesh Joseph (2015). The war against animals. Critical animal studies. Leiden; Boston: Brill. ISBN 978-90-04-30041-5.
  13. ^ "Animal Rights | Encyclopedia.com". www.encyclopedia.com. Retrieved 2024-04-16.
  14. ^ Floridi, Luciano; Sanders, J.W. (August 2004). "On the Morality of Artificial Agents". Minds and Machines. 14 (3): 349–379. doi:10.1023/B:MIND.0000035461.63578.9d. hdl:2299/1822. ISSN 0924-6495.
  15. ^ Quigley, Marian, ed. (2008). Encyclopedia of information ethics and security. Hershey: Information Science Reference. p. 516. ISBN 978-1-59140-987-8. OCLC 85444168.
  16. ^ Duff, Antony; Green, Stuart P., eds. (2011). Philosophical foundations of criminal law. Oxford; New York: Oxford University Press. p. 523. ISBN 978-0-19-955915-2.
  17. ^ Open Philanthropy (March 2021). "Rethink Priorities — Moral Patienthood and Moral Weight Research". Retrieved December 1, 2023.