pepi ng

nyc-based designer interested in the intersection of art and technology
Execution is an interactive installation that prompts users to critically examine how the autonomous weapons we create are now being used against ourselves, and how we can prevent this slide into digital dehumanization.
Process: research and brainstorming
I had the privilege of a second opportunity to showcase an installation at Grace Exhibition Space! This time, the theme of the show was "Auto-Organics: Robots for a Living Earth". In simple terms, it was about biomimicry and the reflection of the natural world in digital creations.

While researching biomimetic digital forms, I stumbled upon Boston Dynamics' Spot as well as this video, which reports on Singapore's Arduino-controlled (still-living) beetles. As I am from Singapore, I was utterly horrified by the amount of research and resources being invested in this dystopian form of technology. Though the researchers mentioned that the technology would be used for a "good cause", I struggled to see how inserting electrical components into a live beetle and controlling it via a microprocessor while it is still alive could be one. Was this even necessary?

Thus began my deep dive into biomimetic robots and their potential to be weaponized or cause harm. Here are some of the examples of biomimetic weapons I found:
Boston Dynamics' Spot used in Singapore
Airstrike drones currently employed by militaries all around the world
Ghost Robotics quadrupedal robot currently tested by the US military
Japan's cyborg cockroaches
My final sketch/ concept:
Process: Making the robot!
I made use of this YouTube tutorial, which provided the 3D models and assembly instructions for the robot. I 3D printed and assembled the parts.

Getting the movements of the robot's different limbs right was the most difficult part. I didn't have much experience with body anatomy and struggled to work out which limb should move first, and in which direction. Controlling 12 servo motors at once didn't help either!
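For anyone curious, the stepping order I eventually landed on can be sketched as a simple "one leg at a time" sequencer. This is an illustrative sketch in JavaScript rather than my actual Arduino code, and the joint names, angles, and leg order are all placeholders:

```javascript
// Illustrative creep-gait sequencer for a quadruped with 3 servos per leg
// (hip, femur, tibia) = 12 servos total. Angles and step order are
// placeholder values, not the tutorial's real calibration.
const STEP_ORDER = ["frontLeft", "backRight", "frontRight", "backLeft"];

// Returns a 12-element array of servo angles for one gait phase (0-3):
// the stepping leg lifts and swings forward; the other three stay planted.
function gaitFrame(phase) {
  const angles = new Array(12).fill(90); // neutral stance for every joint
  const leg = phase % 4;                 // which leg steps this phase
  angles[leg * 3 + 0] = 60;  // hip: swing forward
  angles[leg * 3 + 1] = 120; // femur: lift
  angles[leg * 3 + 2] = 70;  // tibia: tuck
  return angles;
}
```

Cycling the phase 0 through 3 moves one leg at a time, which keeps three feet on the ground at any moment; discovering that this is what keeps the robot statically stable was exactly the trial and error described above.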

It was a lot of trial and error with the code. I also struggled with my Arduino heating up too much, and realized just a few days before the exhibition that I should use the Adafruit 16-Channel 12-bit PWM/Servo Driver to make things more efficient and organized.
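The driver expects pulse lengths as 12-bit tick counts rather than angles, so the sketch has to convert. Here is the arithmetic, written in JavaScript for illustration; the SERVOMIN/SERVOMAX tick values are the ones Adafruit's example sketch uses, and real servos need calibration:

```javascript
// The Adafruit 16-channel driver (PCA9685 chip) divides each PWM period
// into 4096 ticks (12-bit). At the usual ~50 Hz servo frequency one
// period is 20 ms, so one tick is roughly 4.88 microseconds.
const SERVOMIN = 150; // tick count near 0 degrees (Adafruit example value)
const SERVOMAX = 600; // tick count near 180 degrees -- calibrate per servo

// Map a servo angle (0-180) to the tick count handed to the driver.
function angleToTicks(angle) {
  const a = Math.min(180, Math.max(0, angle)); // clamp to the servo's range
  return Math.round(SERVOMIN + (a / 180) * (SERVOMAX - SERVOMIN));
}
```

Offloading this PWM generation to the driver board is also what relieves the Arduino itself, since it no longer has to bit-bang twelve servo signals.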

Another struggle was the weight of the robot: much of the time, the limbs and servo motors were not strong enough to support it. I also had to constantly change batteries to keep the robot running, since I did not want to tether it to a power source and burden its movement with another wire.
Prototyping the robotic spider!
First iteration
My work environment!
Process: Other parts of the installation
I 3D modelled and printed the buildings! I originally wanted the user to interact with several buildings, but given the time crunch I was only able to create one interactive building.

I attached photoresistors to the surface of the building, and a laser to the robotic spider. The user controls the spider and uses the laser to "shoot" down the building: when the laser points at a photoresistor, the building falls.
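On the Arduino side this comes down to comparing the photoresistor's analog reading against a threshold. A sketch of the decision logic, in JavaScript with made-up numbers (real readings come from analogRead() on a 0-1023 scale and depend heavily on ambient gallery light and the laser):

```javascript
// A laser spot drives a photoresistor's reading well above the ambient
// level. Both values below are illustrative, not measured.
const AMBIENT_BASELINE = 300; // typical reading under gallery lighting
const HIT_MARGIN = 250;       // how far above baseline counts as a hit

function laserHit(reading) {
  return reading > AMBIENT_BASELINE + HIT_MARGIN;
}

// When laserHit() returns true, the Arduino releases the building's
// support so the building falls.
```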

Webcams and p5 serial communication
I made use of the p5.serialport library to link my p5.js sketch with the Arduino. The p5 sketch was projected on the wall, and within it I used two webcams: one attached to the wall facing the audience, and another mounted on the robotic spider.

While the user is controlling the movement of the robot, the wall projection shows the feed from the webcam mounted on the robotic spider, as if the user were a shooter looking down a rifle's sights, preparing to fire at their target.

However, after the building is shot, the projection switches to the camera facing the audience instead. This suggests that we are our own target: through the use of automated weapons, we are destroying no one but ourselves.
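Separated from the p5.js draw loop, the switching is a tiny state machine: show the spider's feed until a hit arrives over serial, then show the audience-facing feed. A sketch, with the camera names and the serial message format being assumptions rather than my exact code:

```javascript
// Which webcam feed the projection shows: the spider's point of view
// before the shot, the audience-facing camera after it.
const SPIDER_CAM = "spider";
const AUDIENCE_CAM = "audience";

let buildingHit = false;

// Called for each line arriving from the Arduino via p5.serialport;
// the Arduino is assumed to print "HIT" once a photoresistor is struck.
function onSerialLine(line) {
  if (line.trim() === "HIT") buildingHit = true;
}

// In draw(), image() would be given whichever capture this selects.
function activeCamera() {
  return buildingHit ? AUDIENCE_CAM : SPIDER_CAM;
}
```

Keeping the hit flag one-way (it never resets during a run) matches the piece's logic: once the building falls, the viewer is left facing themselves.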

Surrounding 3D prints
I 3D modelled and printed structures that have two halves: one half is the weapon, and the other is the organism the weapon tries to mimic. For instance:
(1) The robotic claw mimicked the eagle's claw
(2) Boston Dynamics' Spot mimicked a dog
(3) The cyborg cockroach mimicked a cockroach
(4) The airstrike drones mimicked a plane
(5) The submarine mimicked fish
Some other (disorganized) process pictures
Special thanks to: