4.1 Test 1 POC

Every project starts with a spark, and this one began not with a blueprint but with a simple sketch. Flipping a common cliché on its head, I wondered: what if we were to "think inside the box"?

While we are constantly pushed to "think outside the box" to prove our creativity, that sketch made me pause. Who ever stops to think about the box itself? This led to the thought: what if the box had a voice? The goal was to give life to an inanimate, overlooked object, to give it the ability to sense a person's presence and speak about its own worries of being ignored. This wasn't about home automation; it was about the existential crisis of an object.

I spent hours thinking about how to achieve this and coming up with reasons why it wouldn't work, which was getting me nowhere, so I had to tell myself to "stop thinking and start making". That became the motto that set the pace for the next eight months: "MAKE MAKE MAKE". I grabbed some cardboard and a cutting knife and got to building a low-fidelity box as a proof of concept, to see if I could actually bridge the gap between a physical form and a digital personality.

2- Box_V0.jpeg

Technical Setup

I chose TouchDesigner (TD) as the initial brain for the project. The primary reason for using TD was its native ability to process and play back high-quality audio files with almost zero latency.

In this phase, the priority was rapid iteration over portability. TouchDesigner acted as a bridge, allowing me to treat the object’s voice as a data-driven output, but the physical interaction relied on a split-hardware architecture, with multiple Arduinos in the loop.

I used an Arduino Uno to interface with a distance sensor. This board was dedicated to the constant polling of proximity data, which was then streamed into TouchDesigner via a Serial DAT.
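The firmware itself isn't reproduced here, but the heart of the Uno's job is turning an echo pulse into a distance reading before streaming it over serial. A minimal sketch of that conversion, assuming an HC-SR04-style ultrasonic sensor (the sensor model and the numbers below are illustrative, not the exact values from the POC):

```python
# Convert an ultrasonic echo pulse width (microseconds) to distance (cm).
# Sound travels ~0.0343 cm/us at room temperature; halve it because the
# pulse covers the round trip to the person and back.
def pulse_to_cm(pulse_us: float) -> float:
    return pulse_us * 0.0343 / 2.0

# An echo pulse of roughly 2915 us corresponds to about 50 cm.
print(round(pulse_to_cm(2915), 1))
```

Each reading, formatted as a plain text line, is what the Serial DAT then picks up on the TouchDesigner side.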

4- Distance Sensor_UNO.jpeg

5- Speaker_Nano.jpeg

To handle the audio output within the box, I integrated an Arduino Nano 33 IoT. This was used to manage the speaker's connection and ensure the sound originated from within the box itself, rather than from my laptop's external speakers.

3- Box_v0_sysARCH.png

I used a DAT To CHOP to listen for the sensor data coming from the Uno, which was mapped to the play/pause parameter of the Audio File In CHOP. As soon as a person crossed the distance threshold, the audio file would play and the sound would be heard through the speaker connected to the Nano 33 IoT. This allowed me to swap narratives or adjust the volume and pitch on the fly without having to re-upload code to the microcontrollers.
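Stripped of the network plumbing, the gating logic reduces to a single threshold test: incoming distance below the cutoff means someone is present, so the voice plays. A minimal sketch of that logic (the 100 cm threshold here is illustrative, not the value I actually tuned):

```python
# Presence gate: the box speaks only while someone is within the threshold.
# THRESHOLD_CM is an illustrative cutoff, not the tuned POC value.
THRESHOLD_CM = 100.0

def should_play(distance_cm: float) -> bool:
    return distance_cm < THRESHOLD_CM

print(should_play(45.0))   # person close to the box -> True
print(should_play(250.0))  # nobody nearby -> False
```

Keeping this decision in TouchDesigner rather than in firmware is what made the on-the-fly tweaking possible: changing the threshold, the audio file, or the pitch never required touching the microcontrollers.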

6- TD_Box_V0.png

The Interaction Pattern

I wrote a narrative where the box spoke in the first person. It didn't provide information; it just spoke about the fear of being "just a box" and the anxiety of being overlooked.