
Disability & Physical Interface Design

Artful Design, Chapter 5

I am responding to Ge Wang's Artful Design, Chapter 5, specifically the relationship between interface design and the human body. Taken together, these three ideas make for a tricky balance within music technology creation:

  1. Bodies matter (Wang 210)
  2. Have your machine learning -- and the human in the loop (218)
  3. Interfaces should extend us (275)

[Image: drawing of an index finger with an eye and two ears near its base, illustrating the GUI's mental model of the user (Wang 212)]

In his essay "Myth of Myself," the philosopher Alan Watts points out the division in how we think of our "self" versus our body, as evidenced by how we talk about them: "We have what I have called the conception of ourselves as a skin-encapsulated ego... If we just refer to common speech, we are not accustomed to say, 'I am a body.' We rather say, 'I have a body.' ... We feel that our heart beats itself, and that has nothing very much to do with 'I.' In other words, we do not regard 'I, myself' as identical with our whole physical organism." It is because of this separation that people can feel fully engaged by screen-based activities even though the GUI's mental model of the user is extremely reductive.

Conceptualizing the body as the mind's interface with the outside world, I began to think about how people with physical disabilities often have a completely different relationship to physical means of self-expression. A conversation with a friend in college has stuck with me. On a long drive, we got to talking about their voice (they have cerebral palsy, which significantly affects how they speak). Having watched them repeat themself when people couldn't understand them, and occasionally be assumed intellectually disabled because of their speech, I expected them to say yes when I asked whether they would change their voice if they could. They said no: if their voice were completely different, who would they sound like? Not themself. How you express yourself is deeply tied to your sense of self, regardless of ability. If disability is central to someone's identity, they should have access to means of self-expression that also center disability in their design.

Algorithmic bias is by now well documented, though meaningful solutions are still lacking. In a way, it is the dark side of keeping the human in the machine learning loop: while an interface may be designed to extend human creativity, it can just as easily extend our underlying assumptions and limitations. Much new-instrument design has focused on the same source of nuance as traditional acoustic instruments: the fingers and hands. The mental model behind these instruments is often restricted to a typical able-bodied player. New musical interfaces are the perfect avenue for exploring modes of self-expression that use the whole body, working for people with and without disabilities. Projects like Robert Wechsler's MotionComposer, which uses larger gestural movements to generate sound, show how algorithmic composition and interface design can work specifically with disability in mind rather than in spite of it.

[Video: demonstration of the MotionComposer]