pepi ng

nyc-based designer interested in the intersection between art and technology






I'llByte: VR Haptic Jaw
I'llByte is a Bluetooth-enabled, futuristic VR haptic jaw created to simulate chewing and make VR experiences more immersive.
Together with my teammates Julia, Eloise, Beatriz, and Grace, we achieved:

MIT Reality Hack 2023

🥇 FIRST PRIZE of Hardware Track
🥇 FIRST PRIZE Technology Horizons for Human Interfaces
🥇 Grand Finalist
Current VR technology focuses on creating an immersive experience through sight and hearing. However, no VR device to date stimulates the jaw, even though eating and food are such integral parts of the human experience. With such an obvious gap in the VR industry, we decided to take on the challenge.
What it does
We created a futuristic jaw haptic device that consists of two parts:

(1) An adjustable jaw brace
Made of aluminium wire and rubber bands, the brace can be adjusted to fit different users' facial profiles. It attaches to the harness via springs.

(2) Harness
The harness consists of a vest, two servo motors, springs, and a pouch at the back.

The pouch houses the Bluetooth microcontrollers and the physical computing elements used to control the springs.

The tension on the springs can be adjusted via the motors. This means that when a user "eats" a particular food item in the XR experience, the tension in the springs is adjusted automatically according to the texture of that item. This simulates how difficult or easy different items are to chew in an XR experience.
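The texture-to-tension mapping could be sketched as a simple lookup from food texture to a target servo angle. The texture names and angle values below are hypothetical, since the prototype's actual calibration isn't documented:

```cpp
#include <map>
#include <string>

// Hypothetical mapping from food texture to servo angle (degrees).
// Higher angles pull the springs tighter, making chewing feel harder.
int tensionAngleFor(const std::string& texture) {
    static const std::map<std::string, int> angles = {
        {"soft", 20},    // low tension: easy chewing (e.g. pudding)
        {"chewy", 60},   // medium tension (e.g. bread)
        {"crunchy", 110} // high tension: hard-to-chew items (e.g. carrot)
    };
    auto it = angles.find(texture);
    return it != angles.end() ? it->second : 0; // slack for unknown textures
}
```

Defaulting to a slack position for unrecognized textures keeps the jaw free to move if the experience sends an unexpected item.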
Side view of the jaw haptic device
Pouch at the back of the harness containing microcontroller and other physical computing elements
How we built it
I'llByte was built in Unity 2021.3.16f. When the VR controllers interact with a food item, the program sends a Bluetooth message via The Singularity, MIT's SDK. The SDK connects to the microcontroller on the back of the harness, which spins the servo motors attached to the front of the harness. As a result, the springs attached to the jaw brace are pulled, creating tension that depends on the VR experience. Three tension modes are set up for the prototype VR game, creating three different "textures" of food.
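Assuming Unity sends a single mode byte over Bluetooth for each bite (the real message format used with The Singularity isn't documented here), the microcontroller-side dispatch for the three tension modes might look like:

```cpp
#include <array>

// Hypothetical decoder for a one-byte tension-mode message ('0'..'2')
// sent from the Unity app over Bluetooth. Returns target angles in
// degrees for the two servo motors (left, right); values are illustrative.
std::array<int, 2> servoAnglesForMode(char mode) {
    switch (mode) {
        case '0': return {20, 20};   // mode 0: soft texture, low tension
        case '1': return {60, 60};   // mode 1: medium texture
        case '2': return {110, 110}; // mode 2: hard texture, high tension
        default:  return {0, 0};     // unknown message: release the springs
    }
}
```

On the actual hardware, each returned angle would be passed to the corresponding servo, which winds its spring to the target tension.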
Struggling to link Unity to the microcontroller
Challenges we ran into
One of our biggest challenges was being a team of four designers and one game developer, who joined later in the process. This had its upsides, as we were able to refine the fabrication of our prototype, but it also meant that development was very slow.

Coding detailed interactions in Unity with four designers was extremely challenging, because it requires a lot of C# and only one of us knew C#. Much of our time was also spent testing The Singularity SDK.

We had intended to use the Ultraleap so that we could use hand tracking instead of controllers, but the Quest 2 cannot connect to the Ultraleap easily. The Quest 2 does not connect well with our MacBooks either.
What's next for I’llByte: VR Haptic Jaw
  • Test and enjoy the food you create in VR cooking experiences
  • Increase the accessibility of seasonal food experiences
  • Exposure therapy for people with specific food fears
  • Mobility therapy
Future iterations can include:
  • 3D-print the components of the jaw brace in resin for greater accuracy and precision
  • Create four sizes of the device for different facial and body profiles
  • Add foam and cloth to increase user comfort
  • Use better adjustable pins to fine-tune the fit
Special thanks to:
  • My amazing teammates: Julia, Eloise, Grace and Beatriz for going through with this intense project together
  • T for providing the best guidance and help
  • Lucas and Aubrey for providing the most valuable mentorship and patience
  • MIT for this amazing opportunity 
  • Everyone else we met during the hackathon: y'all were brilliant!