
4 - An ‘Artificial’ Concept as the Opposite of Human Dignity

Published online by Cambridge University Press: 28 March 2024

Tina Sikka
Affiliation: Newcastle University

Summary

Introduction

This chapter provides a critical review of recent developments in science and technology and scrutinizes the challenges posed by the artificiality we invent. Human existence and dignity continue to be challenged by advanced technologies such as AI and other innovations in the life sciences. I therefore argue that frameworks emerging out of science and technology studies (STS), ethics, philosophy and ELSI (ethical, legal and social implications) must be used to inform our perspectives.

I begin by focusing on the concept of the ‘artificial’. Attaching the qualifier ‘artificial’ to a word introduces a level of discomfort, as if it conflicted with the values associated with life and human existence. As discussed further on, the term carries with it elements of the ‘uncanny valley’ or the ‘numinous’. We should therefore feel a sense of latent warning, rather than admiration for technological triumph, when confronted with the nature of ‘artificiality’.

Second, building on the concept of the artificial, I argue that human existence and dignity are at risk due to the development of specific forms of science and technology. Despite this, human beings and our societies continue to pursue the mass production and purchase of these artefacts. What kind of society will we produce in the future? Will the end point of technological innovation be the loss of human dignity and of the value of life itself? Will humanity have the will to control the advanced technologies and artefacts we have created?

This chapter aims to engage in a critical review of these technological trends. It offers an overview of the values that most need to be protected and draws on a metaphysical perspective from which to maximize sustainability and existential survival into the future.

Problems with the ‘artificial’ concept

Humanity can be described, at least in part, as a population of mammals who produce their own technology and tools (for example, Homo faber (Scheler, 1961)). ‘Artificiality’, as distinct from natural origin, can be broadly defined as something human-caused and human-made, and refers to the artefacts produced since the beginning of recorded civilization (Needham, 1956; Jones and Taub, 2018).

Type: Chapter
In: Genetic Science and New Digital Technologies: Science and Technology Studies and Health Praxis, pp. 81–102
Publisher: Bristol University Press
Print publication year: 2023
