What does it look like when our devices stop merely listening to us, and start becoming part of our conversations? How can the technology that lives closest to our bodies actively enhance our relationships with others?
We’ve been thinking about this recently, and to investigate it further we made Blush, a wearable that highlights the moments when your online interests and offline behaviors overlap.
Blush listens to everything that is said around it, and lights up when the conversation touches on a topic in our Curriculum, a feed of topics that the Lab’s members have recently researched online. (You can read a more detailed breakdown of Curriculum here.) Blush is the first of a couple of wearable experiments we’re working on here in the Lab.
Social Wearables / Augmentation
We’re particularly interested in developing wearables for more than just the wearer: devices that engage with the world around them and add to our social interactions. Blush functions as an alternative kind of punctuation in a conversation, a subtle way to include your online life in your offline interactions.
When we converse, we’re constantly sending signals beyond just the words we say to each other: our posture, eye contact, gestures, and other factors combine to add a huge amount of context to what we’re discussing. (Part of what makes a phone call so different from an in-person conversation is the lack of all that extra context that lets us know more about our counterpart’s attitude than just what he or she chooses to say.) We think of Blush as a “social wearable” because it writes to this same layer of the conversation, the layer full of second-order, contextual clues that augment what we’re saying.
Of course, extra information isn’t always good. The signal Blush provides (whether or not my colleagues have encountered a topic before) can be interpreted in many ways: this is boring, I already know this stuff; or ooh, now we’re getting to something I’m interested in! There are also obvious privacy concerns: some colleagues have playfully plumbed Blush to find out whether our Curriculum contains anything embarrassing, like “Wrecking Ball” or “beginner PHP example.”
We designed Blush to inhabit the middle ground between the (often subconscious) signals we’re always sending with our bodies and the ideas we explicitly choose to talk about with others. Blush’s role as “punctuation” is important: we did not want it to dominate conversations, or derail them by being distracting. By limiting Blush to simply lighting up, and only doing that when a notable event has occurred, we hope Blush can live comfortably in the real world, augmenting our interactions with a little bit of extra information while not bogging them down.
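One way to keep a device like this from dominating a conversation is to rate-limit it: once a topic has triggered the light, the same topic shouldn’t keep it lit for the rest of the chat. Here’s a minimal sketch of that idea in Java; the class name, cooldown value, and API are illustrative assumptions, not Blush’s actual code.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: suppress repeat triggers for a topic within a
// cooldown window, so the pendant punctuates rather than dominates.
public class TriggerGate {
    private final long cooldownMs;
    private final Map<String, Long> lastFired = new HashMap<>();

    public TriggerGate(long cooldownMs) {
        this.cooldownMs = cooldownMs;
    }

    // Returns true if the pendant should light up for this topic now.
    public boolean shouldFire(String topic, long nowMs) {
        Long last = lastFired.get(topic);
        if (last != null && nowMs - last < cooldownMs) {
            return false; // topic fired recently; stay quiet
        }
        lastFired.put(topic, nowMs);
        return true;
    }
}
```

The per-topic cooldown means a new topic can still light the pendant immediately, while a topic that keeps coming up only registers once per window.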
Blush pairs with an Android app that does the continuous speech recognition. When it hears a match with a Curriculum topic, it activates the pendant over BLE; the pendant itself is a very tiny (and, at present, messy) circuit living on the back of the excellent RFD22301 radio. I’ll post an update on this blog when there’s more to share.
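The matching step in between, where a recognized transcript chunk is checked against the Curriculum before anything is sent to the pendant, might look something like the sketch below. This is a guess at the shape of that logic, not Blush’s actual code, and the naive substring matching shown here would false-positive on short topic names.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

// Hypothetical sketch: check a transcript chunk from the speech
// recognizer against the set of Curriculum topics.
public class TopicMatcher {
    public static List<String> matchTopics(String transcript, Set<String> curriculum) {
        String lowered = transcript.toLowerCase();
        List<String> hits = new ArrayList<>();
        for (String topic : curriculum) {
            // Case-insensitive substring match; a real version would
            // want word boundaries or fuzzier matching.
            if (lowered.contains(topic.toLowerCase())) {
                hits.add(topic);
            }
        }
        return hits;
    }
}
```

On a non-empty result, the app would then write to the pendant over BLE to trigger the light.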