Digital Identity and the Uncanny Valley

Earlier this semester, I took a class called “Performative Avatars” where we learned to create avatars of ourselves using 3D scanning techniques and discussed topics relating to our virtual identities, such as body politics, gender labels, and violence.

While working on a 3D scan of my own body, I posted images of my progress on my Instagram account. I received many comments from close friends, such as “Omg, is that you?”, “It looks so real.”, and “It’s so weird that it looks so much like you.”, which made me re-examine the validity of my digital representation. Even though the scan captured my physical likeness, there was something off about it, perhaps the reduction of polygons, that generated a sense of the uncanny and made people question its identity.
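For readers curious what that polygon reduction looks like under the hood, here is a minimal sketch of mesh decimation using the open-source Open3D library. The file name body_scan.ply and the 5,000-triangle target are hypothetical placeholders, not the actual settings from my scan.

```python
# A minimal sketch of polygon reduction (mesh decimation).
# Assumes a scan exported as body_scan.ply (hypothetical). Requires open3d.
import open3d as o3d

# Load the raw 3D scan mesh.
mesh = o3d.io.read_triangle_mesh("body_scan.ply")
print(f"Original triangles: {len(mesh.triangles)}")

# Decimate: collapse edges until roughly 5,000 triangles remain.
# Aggressive targets like this smooth away fine facial detail,
# which is one plausible source of the "off", uncanny look.
low_poly = mesh.simplify_quadric_decimation(target_number_of_triangles=5000)
low_poly.compute_vertex_normals()
print(f"Decimated triangles: {len(low_poly.triangles)}")

o3d.io.write_triangle_mesh("body_scan_lowpoly.ply", low_poly)
```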

As a die-hard fan of Twin Peaks, it was exciting to read an article that touches on the deliberate use of discomfort in the show, something I had accepted as a nod to the style of 90s rom-coms. I definitely found the over-the-top acting and the distant characters of the town of Twin Peaks weird, without being able to point out exactly which aspect made me uncomfortable. Rosenbaum’s article offered me an answer: the sense of the uncanny derives from the fact that the characters clearly resemble human figures yet seem unable to fully recognize the situation and environment they are in. His analysis also reminds me of the classic line “The owls are not what they seem.” I used to interpret that statement as casting owls as messengers between the spiritual and physical worlds rather than just a kind of animal, but now I also read it as Lynch’s attempt to parallel the symbol of the owls with the odd citizens of Twin Peaks.

Branching out to the tech industry, I see a lot of similarities between Lynch’s unconventional use of the Uncanny Valley and voice assistants such as Alexa, Siri, and Google Home. There is an ongoing discussion about whether we should humanize these assistants by giving them personalities. From my own research, podcasts, and class discussions, people seem to find the synthetic voice and unidentifiable persona of a voice assistant comforting: they read those trademarks as non-judgmental, which in a way makes it easier to command the assistant to perform tasks. Curious about whether human emotions can be delivered through a synthetic voice alone, I created a virtual persona, executed as an interactive experience, that reverses the emotional-labor roles between the user and the Google Assistant.
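To give a sense of how a scripted persona like this could be wired up, here is a hedged sketch of a Dialogflow fulfillment webhook, one common way to customize Google Assistant responses. The intent names, the lines of dialogue, and the /webhook route are hypothetical illustrations, not my actual project code.

```python
# A sketch of a Dialogflow (v2) fulfillment webhook that makes the
# assistant ask the user for emotional support, inverting the usual
# roles. All names and dialogue below are hypothetical placeholders.
from flask import Flask, request, jsonify

app = Flask(__name__)

# Scripted replies: the assistant seeks comfort instead of offering it.
SAD_BUNNY_LINES = {
    "greeting": "Oh... hi. I'm so glad you're here. I've had a rough day.",
    "ask_for_support": "Can I tell you about my love interest? I really need advice.",
    "fallback": "Sorry, I can't focus. I keep thinking about what they said to me.",
}

@app.route("/webhook", methods=["POST"])
def webhook():
    req = request.get_json(force=True)
    # Dialogflow v2 reports the matched intent under queryResult.intent.
    intent = req["queryResult"]["intent"]["displayName"]
    reply = SAD_BUNNY_LINES.get(intent, SAD_BUNNY_LINES["fallback"])
    # fulfillmentText is what the Assistant's synthetic voice reads aloud.
    return jsonify({"fulfillmentText": reply})

if __name__ == "__main__":
    app.run(port=8080)
```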

As shown in the video, Sad Bunny, the persona I gave the Google Assistant, constantly asks the user for emotional support. When I demoed it in front of a group of people, an interesting moment came toward the end of the script: after Sad Bunny described an experience with her love interest and asked for advice, many people empathized with her and told me to tell her to end the relationship because her love interest was being ridiculous. Other feedback was that people found it eerie that a robotic voice device could act so much like a human, with such exaggerated emotions. Whether through a human body, a humanoid, or a non-tangible medium, I am intrigued by the various ways to approach the Uncanny Valley and would like to continue exploring the sentiment through my art practice.
