This project was first inspired by a beautiful work, Zero & One by Taeyoon Choi, which explains how our daily activities consist of binary representations and operations, such as finding a path through a maze or voting in a democratic political system. In Choi's view, everything can be represented with zero and one if we look at the world from a computational perspective.
A book I have been reading recently, Metaphors We Live By by George Lakoff and Mark Johnson, then reminded me that there are also many binary expressions in our languages. Lakoff and Johnson give many examples in the book, for instance, orientational metaphors:
_Happy is up; sad is down. I'm feeling up. I fell into depression.
_Conscious is up; unconscious is down. Wake up. I fell asleep.
_Foreseeable future events are up and ahead. What's coming up this week?
So I came up with this idea: build a system that connects our cognitive emotions to 1 & 0 binary logic.
Sketch:
_There are 25 input entries in total, which are expressions of human emotions summarized by Charles Darwin.
_The inputs of this system are up & down, which are read as 1 & 0.
_Inputs are connected by binary nodes, and each node has a logic gate attached.
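As a minimal illustration of one such node, two up/down inputs read as 1/0 combine into a single output. The gate choice here (XOR) is an arbitrary assumption for the example; any gate could sit at a node:

```javascript
// One binary node: a two-input logic gate sitting between two inputs.
// XOR is chosen arbitrarily here; the actual gate at each node is part
// of the system's arrangement.
const xorNode = (a, b) => a ^ b;

// One input is up (1), the other is down (0).
console.log(xorNode(1, 0)); // → 1
console.log(xorNode(1, 1)); // → 0
```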
Tangible Prototype:
_Each square has a "keyword" that can be described in metaphorical words. For example, MOOD is up/down.
_Each square also serves as a switch that can be turned on/off by flipping it up/down. Up = 1, Down = 0.
_Values at the squares are the inputs of this system. All the squares are connected by nodes. Each node has a logic gate (one or two inputs, one output).
By continuously connecting the nodes, the number of signals is reduced stage by stage until only two remain, so that a final gate produces one single output in the end.
Since the way this system "makes sense of" all these metaphorical meanings is determined by the arrangement of the logic gates, its final output can be regarded as a result, or an understanding, that the system gives back to us.
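To make the reduction concrete, here is a small sketch in plain JavaScript. The particular gates chosen per stage and the pairing scheme are illustrative assumptions, not the prototype's actual wiring:

```javascript
// Basic two-input logic gates and a one-input NOT, operating on 0/1.
const AND = (a, b) => a & b;
const OR  = (a, b) => a | b;
const XOR = (a, b) => a ^ b;
const NOT = (a) => a ^ 1;

// Reduce an array of binary inputs to a single output by repeatedly
// pairing neighbours and feeding each pair through a gate; an odd
// leftover input passes through a NOT gate. The choice of gate per
// stage is an assumption for this example.
function reduce(inputs, gates) {
  let values = inputs.slice();
  let stage = 0;
  while (values.length > 1) {
    const gate = gates[stage % gates.length];
    const next = [];
    for (let i = 0; i + 1 < values.length; i += 2) {
      next.push(gate(values[i], values[i + 1]));
    }
    if (values.length % 2 === 1) next.push(NOT(values[values.length - 1]));
    values = next;
    stage += 1;
  }
  return values[0];
}

// Example: eight up/down switches (1 = up, 0 = down).
const switches = [1, 0, 1, 1, 0, 0, 1, 0];
console.log(reduce(switches, [AND, OR, XOR])); // → 1
```

Rearranging the gates changes how the same switch settings are "understood", which is exactly why the final output reflects the system's arrangement rather than the inputs alone.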
Combining the ideas from both the sketch and the prototype, I made this demonstration of a possible input interface for the system. Here, viewers' hand gestures are captured by a webcam and analyzed in real time using TensorFlow.js, React.js, and Fingerpose.
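As a sketch of how recognized gestures could feed the system's up/down inputs, the function below maps a list of gesture candidates to a binary value. In the real interface the candidates would come from Fingerpose's gesture estimator running on TensorFlow.js hand landmarks; the labels "thumbs_up" / "thumbs_down" and the shape of each candidate (a name plus a confidence score) are assumptions for illustration:

```javascript
// Map recognized gesture candidates to a binary input.
// Each candidate is assumed to carry a name and a confidence score;
// the labels "thumbs_up" / "thumbs_down" are illustrative assumptions.
function gestureToBit(gestures) {
  if (!gestures || gestures.length === 0) return null; // nothing recognized
  // Pick the candidate with the highest confidence score.
  const best = gestures.reduce((a, b) => (b.score > a.score ? b : a));
  if (best.name === "thumbs_up") return 1;   // up → 1
  if (best.name === "thumbs_down") return 0; // down → 0
  return null; // ignore unrelated gestures
}

// Example: two candidates, the more confident one wins.
const bit = gestureToBit([
  { name: "thumbs_down", score: 6.5 },
  { name: "thumbs_up", score: 9.2 },
]);
console.log(bit); // → 1
```

Returning null for unrecognized frames lets the interface hold a switch's previous state instead of flickering between 0 and 1.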