Interact 2016 – Insights in Self-centred Design

This week I spoke at Interact 2016 – slides and script below – in a rather intimidating lineup alongside digital and physical architects, all of whom I greatly admire. I’m very grateful to Nomensa for the opportunity – not only is every talk an immense learning opportunity in public speaking, but I value even more the work prior, for the rigorous synthesis and curation of empirical insights is a process all designers should engage in as a practice of communication and reflection. Enjoy!

Before I get started I want to take a moment to reflect back on and reiterate the theme of this year’s conference:
How will people interact with technology in the future and how do we design for it today?

Embedded within this question is a forecast of technology, but also a resulting response in the condition of design – the methodologies that must exist to enable these technological futures. But I want to take this a step further and suggest an additional layer to be regarded: that as designers we should also consider the conditions in which we design – and by this I mean the environments, dialogues and pursuits that enable designers to reflect upon the implications of technology, our designs and our participative role as both designer and user.

future of technology and design

Today I want to talk about all three of these by way of specific examples in my professional and personal life to stress the importance of formulating your own forecast of our future interactions with technology, defining the corresponding opportunities and responsibilities embedded within your role as a designer, and most importantly, integrating time and space for reflection on the implications of these interactions.

I call this talk Insights in Self-centred Design because Self-centred Design is my own hypercritical personal practice of mediating technology, design and reflection.

A forecast of technology

A key catalyst that is transforming – and will continue to transform – how we interact with technology is the Internet of Things. Though I will assume we are all familiar with this term, I also suspect we have diverse definitions of and experiences with it, so I am going to focus specifically on my experiences working in enterprise IoT.

As mentioned earlier, I work for Zebra Technologies – and for those of you who don’t know Zebra, we traditionally specialise in data capture products – from barcode printers and scanners to rugged mobile computers – but also in maturing and emerging technologies, all of which are about giving physical things a digital voice.

zebra technologies innovation and design team

The technological transformation of enterprise IoT and the resulting challenges (and opportunities) are best explained by breaking down who our users are, what their workflows are, and where – or what – their corresponding contexts or environments are.

enterprise UX - interaction design challenge

Our users are what we refer to as situationally disabled, meaning they might be under extra cognitive stress or physically limited regarding how they receive and communicate vital information while simultaneously performing another task. A nurse for example, while focusing on accuracy within patient care – the correct medication or documentation of vital stats – might also be calming a stressed patient or emotional family members.

enterprise UX healthcare

In addition, Zebra, like many other tech companies, is undergoing a transformation as the definition of a product shifts from isolated hardware to the integrated solutions enabled by the data the hardware collects. This also shifts our understanding of a ‘user’ – which might not only be a person, but also a place or thing – as we ask ‘who, what, or where is our user?’ Robots within Amazon fulfillment centers are a perfect example of non-human users whose roles, goals and limitations are equally important to understand.

enterprise ux robotics

Another example is at Leiden University Medical Centre in the Netherlands, where Zebra embedded beaconing technology throughout the facility to track and measure the door-to-balloon time of patients at critical points within the treatment process. This solution required not only patients and medical staff to be considered users, but also the technological components – cloud and wristbands – and the hospital itself as a dynamic, operating entity.

enterprise UX healthcare

Therefore within enterprise Internet of Things at Zebra, our users are situationally disabled people, places, and things.

And what is their intent, or workflow? Within enterprise UX we have two distinct categories of workflows – frictionless and exception-based. A frictionless workflow implies that a specified user is focused on a particular task, or sequence of tasks, and any associated technology is implicitly assisting in the background.

enterprise UX frictionless workflow

Whereas an exception-based workflow is when an error or problem proactively transfers the attention of the specified user – whether person, place, or object – from the original task flow to exception handling, often engaging the technology more explicitly.

enterprise UX exception-based workflow

Therefore, we have situationally disabled people, places, and things transitioning between frictionless and exception-based workflows.

What are the environments in which these workflows take place? Enterprise environments are often described as harsh due to extreme conditions – cold, noisy, exposed – all of which can cause the situational disabilities previously mentioned.

enterprise UX environments

But more importantly, they are more often than not very complex, with many layers of information that might be tangible, augmented and/or virtual.

enterprise UX environments

I love the example of an Amazon fulfillment center because it is the epitome of hidden complexity. Each and every one of the warehoused items not only contains layers of metadata, but same and similar items are also purposefully placed at a distance from each other to avoid picking the wrong item. Imagine the task of efficiently and accurately navigating this organized chaos to fulfill an order – it is more difficult than it appears at first glance!

enterprise UX environments

Which means we have situationally disabled people, places, and things transitioning between frictionless and exception-based workflows in complex, data-rich environments – not only a mouthful, but also a headful.

enterprise UX - interaction design challenge

But if we turn it into a non-enterprise-specific design challenge and ask “How might we… situationally enable people, places and things transitioning between… implicit and explicit interactions… in complex, data-rich environments” we have a more approachable and universal starting point relative to the Internet of Things.

enterprise UX - interaction design challenge

But as an interaction designer what is my role in this challenge?
What are the responsibilities of a designer?
What is the condition, or state, of design as a discipline?
What does it mean to design these implicit and explicit interactions?

The condition of design

Despite the ambiguity of who, what or where is a user at Zebra – as an Interaction Designer, I design how people interact with, through, or without technology. A deliberately infinite and inclusive definition that ultimately focuses on people and prepositions, as the technology is optional, and it is relationships that concern me.

definition of interaction design by karey helms

Because of relationships, space becomes place – or to paraphrase Paul Dourish: space refers to the physical and mechanical elements of the environment, while place refers to the ways in which space becomes vested with social meaning.

interaction design - space becomes place

Because of relationships – appearance becomes presence – inviting action, participation and dialogue, as in Availabot by the former Berg.

interaction design - appearance becomes presence

But also by definition, interactions are reciprocal, so you could also say I design how technology interacts with, through or without people – making people optional. Which is perhaps getting a bit abstract, especially considering technology is increasingly intangible, embedded, dynamic, synthetic and implicit.

definition of interaction design by karey helms

The Implicit Interaction Framework by Wendy Ju of Stanford’s Center for Design Research is a great reference for mapping implicit interactions – those that often behave in the background – against explicit interactions – those that demand our attention for direct engagement and manipulation.

Implicit Interaction Framework by Wendy Ju

In her framework she has four quadrants defined by two axes – the vertical axis being attentional demand and the horizontal axis being initiative. Meaning in the foreground are interactions that demand the user’s attention, and in the background are interactions that elude the user’s attention. Interactions initiated by the user are reactive, while interactions initiated by the system are proactive.

When you populate the quadrants it is very easy to see the most explicit interactions – direct manipulation and command interfaces – in the top left, and the most implicit interactions – ambient agents – in the bottom right.

Implicit Interaction Framework by Wendy Ju
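If it helps to see the two axes as data, here is a toy encoding in JavaScript – only the two quadrants named above are populated, and none of the wording comes from Ju’s own materials:

```javascript
// Toy encoding of the framework's two axes: initiative × attentional demand.
const quadrants = {
  'reactive:foreground': 'direct manipulation, command interfaces (most explicit)',
  'proactive:background': 'ambient agents (most implicit)',
  // The remaining two combinations exist in the framework,
  // but aren't populated in this talk.
};

function place(initiative, attention) {
  return quadrants[`${initiative}:${attention}`] || 'unnamed here';
}

console.log(place('proactive', 'background')); // → 'ambient agents (most implicit)'
```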

But where this framework becomes really interesting is when there are transitions between quadrants. In her writing and talks, Ju gives the non-technological example of an interaction between a pedestrian and a doorman. A doorman waits proactively in the attentional background near the entrance of a building. As he sees a pedestrian approaching, he proactively puts his hand on the door handle in a grand gesture, moving into the attentional foreground of the pedestrian. The pedestrian reacts by making eye contact, a signal of affirmation, and walks towards the door.

Implicit Interaction Framework by Wendy Ju

There are two aspects of this sequence of interactions that I want to highlight. The first is the emphasis on how mundane it is, and thus how representative it is of the majority of our daily interactions. Secondly, you could very easily replace the doorman or the pedestrian with technology, and very quickly you have a choreography of people and technology interacting with, through or without each other. And ultimately, this choreography – this transitioning between implicit and explicit interactions – would not be possible without first designing relationships.

But in order to design new relationships or transform old relationships, we must first understand our existing relationships.

The conditions in which we design

The conditions in which I understand my existing relationships come from a personal practice I call Self-centred Design – in which I prototype playful interventions into the mundane moments of my daily life to reflect upon my existing relationships with technology as both designer and participant.

I’m going to briefly present two Self-centred Design projects and the resulting insights from each, to hopefully inspire you to reflect upon the conditions in which you design.

The first project I want to share is called Burrito – there is no significance in the name. Burrito is a bot, intended to be a manifestation of a core relationship in my life: my marriage.

Burrito bot - Self-centered Interaction Design by Karey Helms

Every time my husband and I message each other, the emojis we send are stored in a database and based on a mutually defined algorithm, Burrito tells us weekly who is the better spouse.

Burrito bot - Self-centered Interaction Design by Karey Helms

Sounds bonkers? I agree, but bear with me. The actual idea for Burrito came about when it was suggested in a book that there are always three entities in a relationship – the two individuals, and the relationship itself – which is the epitome of something implicit that I wanted to make explicit. So I set up a bit of a janky configuration using IFTTT, WordPress and some JavaScript to set Burrito in motion.
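To give a sense of the mechanics, here is a minimal sketch in JavaScript of the kind of weekly scoring Burrito performs – the weights, message format and phrasing are hypothetical stand-ins, not our actual (and private!) algorithm:

```javascript
// Hypothetical weekly verdict: each logged message records its sender
// and the emojis it contained; the weights are mutually agreed rankings.
const weights = { '❤️': 3, '😘': 2, '🙂': 1, '🙄': -1 };

function weeklyVerdict(messages) {
  const scores = { Karey: 0, Ben: 0 };
  for (const { sender, emojis } of messages) {
    for (const emoji of emojis) {
      scores[sender] += weights[emoji] || 0; // unranked emojis count for nothing
    }
  }
  if (scores.Karey === scores.Ben) return 'A tie – equally good spouses this week.';
  return `${scores.Karey > scores.Ben ? 'Karey' : 'Ben'} is the better spouse this week.`;
}

// e.g. weeklyVerdict([{ sender: 'Ben', emojis: ['❤️', '🙂'] }])
//   → 'Ben is the better spouse this week.'
```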

Recently, we switched from Messenger to Telegram so that I can program Burrito as a ‘real’ bot, but I immediately encountered a whole new array of problems, mainly due to a decrease in emojis and an increase in stickers and GIFs. How do I – or should I say, how does Burrito – assign meaning and weight to this new medium? And how does Burrito know that a sticker is directed at one of us as recipient versus a third-party person or topic? Why isn’t Telegram smart enough to flip, or orient, stickers relative to each participant? How does Burrito know and understand the overarching context?
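For the curious, the Telegram side of a ‘real’ bot can start very small – a sketch using the node-telegram-bot-api package, with the token handling and persistence helper as assumptions, and with the sticker problem visible immediately:

```javascript
// Sketch of Burrito listening on Telegram via the node-telegram-bot-api package.
const TelegramBot = require('node-telegram-bot-api');
const bot = new TelegramBot(process.env.BURRITO_TOKEN, { polling: true });

const emojiPattern = /\p{Extended_Pictographic}/gu; // rough emoji matcher
const logEmojis = (user, emojis) => console.log(user, emojis); // stand-in for the database write

bot.on('message', (msg) => {
  if (msg.text) {
    const emojis = msg.text.match(emojiPattern) || [];
    logEmojis(msg.from.username, emojis);
  } else if (msg.sticker) {
    // And here is the new problem: a sticker arrives as an image reference,
    // not as rankable text – directed at whom, and worth what weight?
  }
});
```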

Burrito bot - Self-centered Interaction Design by Karey Helms

This project kick-started a critique of other conversational UIs – in particular the frustration of conversations on Messenger with my sister. My sister has a one-year-old and works from home, so she is often sending me voice memos, whereas I’m in an open office or at a design event, secretly sending texts yet struggling to listen to her messages. Again, why isn’t Messenger smart enough to be aware of our individual contexts so that content is both appropriately sent and received? Why can’t I receive transcripts and she listen to audio clips?

Burrito bot - Self-centered Interaction Design by Karey Helms

Not only Burrito, but also my ensuing critique of other conversational UIs, led me to ask: what are the biases?

Insights in self-centered design by karey helms

Obviously, as the creators of the algorithm, Ben and I have our own, individually and together, but there are also biases embedded within the constraints of the technology. For example, Burrito can’t decipher the implicit meaning of GIFs because that information is not technically available for Ben and me to rank… or I’m just not technically capable of extracting the appropriate metadata.

Therefore, how, when – and whether – do we expose these biases, and to what fidelity of legibility? Should Ben and I know each other’s rankings? Should we be explicitly aware of all the parameters – and the corresponding lack of them – that lead to Burrito’s spousal judgement? And if so, how could and would it inform our resulting interactions?

Which leads me to ask what, or who, has agency?

Insights in self-centered design by karey helms

Who can intervene and when? Am I ultimately in control as the programmer? Or should Ben have the agency to say when it is too much? Is opting out even an option, and when should that be defined? Is the technology actually assisting us or is it in control? I would argue the latter, as we are not having a dialogue with Burrito, but through it, and thus cannot defend or amend Burrito’s judgement.

But also, what is the periphery?

Insights in self-centered design by karey helms

What is the immediate context – the concurrent background of implicit interactions and information that, though outside of our attention, has the ability to influence where our attention is focused? The conversations with Ben and with my sister are both examples in which Burrito’s and the messaging platforms’ knowledge of context external to the conversation could allow the conversation’s content to be more appropriately and meaningfully presented.

The second project is Phygital Party Mode – again an exploration of relationships, specifically an Internet of Things exploration of my relationship with my website, and of others interacting with my website and, through it, with me.

Phygital Party Mode - Self-centered Interaction Design by Karey Helms

Essentially, when you go to the bottom of my website you can click a link to enter ‘party mode’, and my portfolio transforms into a musical instrument. It is currently disabled as I am refactoring the code from HTML5 to Google Polymer, but you can still access the original version, including cat and patriot modes…
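If you’re curious what ‘portfolio as instrument’ means mechanically, the general shape is a handful of lines of the Web Audio API – this is a hypothetical sketch of that shape, not the actual party mode code:

```javascript
// Sketch: turn a page into an instrument – every keypress plays a short tone.
const ctx = new (window.AudioContext || window.webkitAudioContext)();

document.addEventListener('keydown', (event) => {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  osc.frequency.value = 220 + (event.keyCode % 24) * 20; // key → pitch
  gain.gain.setValueAtTime(0.2, ctx.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, ctx.currentTime + 0.4); // quick decay
  osc.connect(gain);
  gain.connect(ctx.destination);
  osc.start();
  osc.stop(ctx.currentTime + 0.4);
});
```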

The project inadvertently instigated an unhealthy relationship with my website statistics, as I was constantly curious about visitors’ interactions and reactions – because for me, their appearance meant presence, and I wanted to engage in a dialogue.

So via an Arduino, I connected Party Mode to a Philips Hue light and some piezo speakers in hopes of phygitally embodying a visitor’s interaction with party mode in my flat. Again, much to my husband’s disappointment, I got it working, but the project actually became interesting when I sent the prototype link to friends, which ignited a new set of challenges and many questions.
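One half of such a pipeline – a script nudging the Hue light whenever a party-mode event arrives – can be surprisingly small, since the bridge exposes a local REST API. A sketch in Node, with the bridge IP, API username, light id and event hook all standing in as assumptions:

```javascript
// Sketch: set a Philips Hue light's colour via the bridge's local REST API.
const http = require('http');

function setLight(hueValue) {
  const body = JSON.stringify({ on: true, bri: 254, hue: hueValue }); // hue: 0–65535
  const req = http.request({
    host: process.env.BRIDGE_IP,                          // the bridge's local address
    path: `/api/${process.env.API_USER}/lights/1/state`,  // username from pairing with the bridge
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
  });
  req.end(body);
}

// Hypothetical hook: called whenever a visitor plays a note in party mode.
function onPartyModeEvent(note) {
  setLight(Math.floor(((note % 12) / 12) * 65535)); // map the note onto the colour wheel
}
```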

Phygital Party Mode - Self-centered Interaction Design by Karey Helms

The first being – how does it scale?

Insights in self-centered design by karey helms

All of a sudden I was moving from one user, myself, to multiple. How do I manage the interactions of multiple users with the technology, and through the technology, with each other? Could they create a band? What patterns could and would emerge? Can I interact back? Do they need to know there are others participating, or even that I am observing?

Also, how does it fail?

Insights in self-centered design by karey helms

As I became accustomed to the interactions in my flat as an indication of my portfolio’s health – when there are no interactions, what is that indicative of? A failure of the technology? Or a genuine lull in visitor engagement? If the former, is there a backup, or is a backup even needed?

Asking how it fails also helps define what success is – what is meaningful or important for each entity in the relationship, and therefore whether and how to preserve missed interactions, or which qualities are necessary to extract.

Which leads to, how, when and where, is it relevant?

Insights in self-centered design by karey helms

What is the appropriate time and place for these interactions to take place? If I’m not home, does party mode still play? Or is there a historical equivalent by which I can play back the day’s melody? Does time impact the fidelity of playback? Or do missed interactions remain invisible? What if my husband is working from home – is there a kill switch?

So when I look at all these Self-centred Design insights together – it is very clear to me that they, like most design insights, aren’t married to the specific projects or users – but are prevalent themes indicative of much larger trends in both technology and design.

Insights in self-centered design by karey helms

And while I might have come to many of these insights through a multitude of avenues – self-centred design not only enabled a faster path, but also provided a unique condition in which as a designer I can take responsibility for what could be the resulting choreography, and as a participant I can reflect upon what should be the resulting choreography.

So when we ask – “how might we situationally enable people, places and things transitioning between implicit and explicit interactions in complex, data-rich environments” – we can make responsive and responsible design decisions.

enterprise UX - interaction design challenge

Thank you.