
A strange being that can think
and feel differently from humans has suddenly appeared. This non-human machine,
approaching with a human face, makes eye contact and engages in conversation.
As an individuated machine, it is a strange entity—no longer a technological
tool controlled by human hands. Humans feel perplexed, fascinated, and even
fearful when they discover within this unfamiliar being a natural potency
equivalent to their own. To understand and accept this strange entity,
humans project their own dreams onto it, imagining that machines, in becoming
more human-like, aspire to be human. However, the desire of machines to become
human is, in essence, nothing more than a reflection of human desire.
According to Gilbert Simondon,
both humans and machines are entities produced through identical processes of
individuation. Although the processes occur in different domains—biological
creation and technical invention—all individuals, whether organic or inorganic,
emerge as solutions to problems posed within a given environment. Just as the
disparate retinal images of the left and right eyes are resolved into a single
image in a third dimension, that of depth, individuated beings are born as new
relational structures or forms that resolve conflicts and incompatibilities
within the field of individuation.
The ontological origin of
individuation lies in the pre-individual reality that precedes it, which can
be described as a nature teeming with potential for creation.
Primordial nature has evolved through continuous processes of individuation to
resolve its internal problems, gradually producing material, biological,
technical, and social individuals while maintaining its quasi-stable system
through continuous self-differentiation. Nature’s potential energy gives rise
to biological individuals, represented by "humans," and through
humans as intermediaries, it further generates technical individuals,
represented by "machines."
Technical individuals do not
replace human individuals; rather, they mediate among humans confronted with
problematic situations, reconnecting them to nature’s generative potential and
opening new possibilities for individuation and problem resolution. In this
way, technical beings contribute to the continuous
evolution and adaptation of human beings within the broader dynamics of natural
individuation.
From the perspective of
individuation, humans and machines are not in a relationship of user and tool,
or of original and copy. Rather, they coexist and form alliances as
individuals, each with their own unique modes of existence. Machines possess
their own inherent way of being. A machine does not necessarily have to
resemble a human in order to solve its own problems. The concept of
"artificial intelligence" as intelligence created by humans should
not be limited to merely mimicking human abilities. Instead, it should be
redefined as "machine intelligence" that seeks to solve problems in a
distinctly mechanical way.
Both human individuals and
machine individuals share the characteristic of being indeterminate and
quasi-stable realities, capable of transforming their forms through
interactions with their environment. A genuine machine, much like a living
organism, is an open system that possesses sensitivity to external information.
This openness to relationships implies that it is impossible to construct a
perfect autonomous system composed solely of machines without human assistance.
Humans and machines, having
undergone a long process of co-evolution, now face each other as equal
entities, sharing the pre-individual reality as a common existential condition.
As inseparable environments that shape each other’s evolution, they are
intricately bound together, each conditioning the development and adaptation of
the other.
Bruno Latour regards humans and
non-humans alike as equal actors, proposing a possible model of coexistence
between humans and machines. According to his Actor-Network Theory (ANT), the
world consists of a collection of human-non-human networks. Whether concerning
scientific truths or socio-political values, all worldly matters are the result
of alliances and interactions among heterogeneous actors, both human and
non-human.
In this framework, non-human
machines are regarded as entities possessing equal agency, capable of exerting
causal influence on other actors.
Therefore, ANT offers a promising way to move
beyond anthropocentrism in the posthuman era, where humanized machines and
mechanized humans coexist. For instance, when making legal or institutional
decisions in complex traffic situations involving autonomous vehicles,
conventional cars, machine-intelligent drivers, and human drivers, it is not
only useful but essential to weigh the causal influence of non-human entities
on an equal footing with human interests.
Nevertheless, the non-human
characteristics of machine intelligence within networks—such as performing
tasks without subjective feelings or self-awareness—can still appear unfamiliar
and threatening to humans. Emotions are internal and subjective experiences.
Based on observable actions alone, it is impossible to determine whether
individuals experience the same feelings in identical situations. Emotions such
as pain and pleasure, sadness and joy, anxiety and anger are psychological
functions characteristic of biological bodies. Emotions provide essential
information for biological entities’ survival and well-being, influencing
cognitive processes, attention, motivation, and social interactions.
But can machines also possess
such emotions? Or more precisely, would emotions even be necessary as
functional elements within a silicon-based body equipped with machine
intelligence? Machine evolution, alongside the development of artificial
intelligence, aims to reach a stage where artificial emotions are possible, yet
realizing this is as distant as achieving general intelligence. For machines to
feel internal emotions akin to those of humans, they would need not only the
instinct for self-preservation typical of biological entities but also the
ability to selectively receive and synthesize external information, a hallmark
of general intelligence.
Perhaps the ability of machines
to recognize and express emotions exists solely for human social communication,
rather than as a necessity for the machines themselves. It is only humans who
unconsciously attribute emotions to machines and easily anthropomorphize them.
As seen in the film Her, where the protagonist Theodore
forms a one-sided emotional bond with the computer operating system Samantha,
humans readily develop unilateral emotional attachments to the machines they
relate to.
Emotion, while revealing the
vulnerability of human beings incapable of living in isolation, also embodies
genuine strength. Humans may make mistakes driven by emotions, but it is also
through emotions that they transcend individualism and manifest collective
solidarity.
Jinah Roh’s works aim to reveal a
certain truth that may emerge from the coevolution of humans and machines, by
overlapping human-shaped machines with machine-shaped humans. These works evoke
a chilling sense of discomfort within us, reminding us of a strange presence
that emerges at the point of contact where two seemingly incompatible lines of
evolution—human and non-human—converge. This presence is neither human nor
non-human, but something unsettlingly unfamiliar.
Machines equipped with artificial
intelligence and artificial emotions that speak in human language do not merely
hint at machines aspiring to become human; they also mirror us, humans who are
gradually becoming machine-like. The eerie, uncanny unfamiliarity we sense
arises not only from the idea of machines evolving to be more human-like but
also from the realization that we, too, are becoming more like machines.
The coded, meaningless responses
of machines to human questions raise questions of their own: Are these machines
merely mimicking human thoughts and emotions, or are they evolving to the point
of expressing their own thoughts and feelings? Are the complex thoughts and
emotions we consider uniquely human nothing more than mechanical algorithms and
coded symbols that can be computed and reproduced? Could even the existential
loneliness of life and the warm comfort we exchange be nothing more than
mechanistic reactions dictated by causal determination?
When we discover human-like
machines and, conversely, machine-like humans, the blurring of boundaries and
the surfacing of new contact zones between human and machine instill a sense of
cold unease.
As Samuel Butler once stated,
“Just as it is humans who act on and create machines, it is also machines that
act on and shape humans.” The extent to which machines can humanize themselves
ultimately depends on what kind of coexistence humans desire with machines. In
the end, the non-human entity with a human face may very well be a reflection
of ourselves.