We are on the cusp of a significant trend in product design and design methodology, where the longstanding divide between physical and cognitive modes of interaction will disappear. What do I mean by this?
Consider, in general terms, the distinction between human-artifact interactions that are primarily physical and those that are primarily cognitive in nature.
Physical artifacts are those whose primary design intent is the mechanical transportation or transformation of matter. Think automobiles, appliances, and tools: the main focus of industrial design over the last century. Such products are ideally designed around human anthropometric and ergonomic principles to maximize efficiency, effectiveness and safety. Often these artifacts require significant experience to develop proficiency, or “muscle memory”, and those who achieve a high level of skill may be considered craftsmen, or even athletes.
Compare this to cognitive artifacts that essentially store and transmit information – books, radios, telephones, wristwatches, personal computers and so on. While these artifacts all intrinsically require physical interaction, that interaction is typically trivial in nature – it takes relatively limited manual dexterity to turn a page, click a mouse, or press a button. The majority of design focus is on the user interface and its visual display of information – the “heavy lifting” with these artifacts takes place in the head of the user, so design for the mind, not the body, takes precedence.
So we have largely been designing two co-existing but separate types of products: those that utilize the complex mechanics of the body to transform matter, and those that accommodate the capabilities of the mind for processing information. Products that overlap these two worlds – that is, which take advantage of complex physical interactions to drive information processing – are few and far between. The abacus, the telegraph, and perhaps texting come to mind. Musical instruments play an interesting role in this context, where technically sophisticated body control is applied to the creation of sound, but even this serves artistic or entertainment purposes rather than the analysis of information in the scientific or business sense.
But now we are starting to see the emergence of products that hint at using the mechanics of the body to interact with complex information: Jeff Han’s multi-touch interactive data wall, for example, or virtual reality simulations that allow scientists to “feel” the forces between molecules. But even these interfaces only scratch the surface of what is possible from a physical interaction point of view. Getting to this point has taken a long time because it is challenging to track and quantify the body’s many degrees of freedom of movement. But it is also the result of our divided design processes, where physical interactions and cognitive interactions have traditionally been designed by different people, with different expertise, at different times.
Moreover, such divided design processes are themselves the product of 20th-century psychology that treats humans as information-processing machines. As a consequence, we think of human activity as composed of discrete, sequential steps of thinking and then acting – I see something, then I reach for it. With that built-in division between cognitive interaction and physical interaction, it’s easy to see why we divide the world the way we do.
But there are alternative perspectives on human behavior, in particular the ecological psychology of J.J. Gibson. Gibson coined the term “affordances”, which is (mis)used and abused by interaction designers today. But affordances – the relationships between people (or other organisms) and artifacts – are just one part of a larger “perception-action” framework. In this view, perceiving or sensing information is a physical behavior in itself, not just a means to drive a subsequent physical action. Likewise, physical behavior drives perception – the two are connected, not divided, forming a perception-action loop.
The implications of this for product design are subtle but important. Taking into account the physical actions someone performs to acquire information is useful for determining what information to display and how to display it. For example, rather than designing products where all information is accessed from a single point, information may be distributed across locations, where the location itself is informative above and beyond the content. We see this emerging with augmented reality applications, where the user’s particular activity, such as walking to a particular location in a city, provides location-specific information. The format and content may be driven by variables such as the person’s posture and gait (e.g. in a hurry or browsing), direction of approach, and of course their physical characteristics such as eye height and reach.
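To make this concrete, here is a minimal sketch of how such an adaptive display might be structured. Every name, field, and threshold below is hypothetical and illustrative, not drawn from any particular AR framework:

```python
from dataclasses import dataclass

@dataclass
class UserState:
    walking_speed_m_s: float     # estimated from gait tracking
    eye_height_m: float          # drives vertical placement of content
    approach_bearing_deg: float  # compass direction the user approaches from

def choose_display(state: UserState) -> dict:
    """Pick a format and detail level from the user's physical state.

    The 1.4 m/s threshold is roughly an average walking pace; treating
    anything faster as "in a hurry" is an illustrative assumption.
    """
    hurrying = state.walking_speed_m_s > 1.4
    return {
        # A hurried passer-by gets a terse, glanceable summary;
        # someone browsing gets richer detail.
        "detail": "summary" if hurrying else "full",
        # Anchor content at this particular user's eye level.
        "anchor_height_m": state.eye_height_m,
        # Face the panel back toward the direction of approach.
        "panel_bearing_deg": (state.approach_bearing_deg + 180) % 360,
    }

if __name__ == "__main__":
    # A user walking quickly (1.8 m/s) from due east (90 degrees),
    # with an eye height of 1.65 m.
    print(choose_display(UserState(1.8, 1.65, 90.0)))
```

The point is not the code itself but the inputs: kinematic and anthropometric variables rather than clicks or keystrokes, which is exactly the shift from cognitive to physical interaction described above.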
Getting to the point where we can design systems that take advantage of complex physical interactions will require a fresh look at how the fields of kinesthetics, anthropometrics and optical flow relate to interface design. These fields will be as important to designers as information architecture and form giving are today.
As a starting point, I've been exploring the role of ergonomics in contemporary interface design, to better relate the divided fields of physical and cognitive product design.