As digital product designers, we’re thrown into complex, tense situations and asked to make sense of them: the audience, the context of use, the core functionality. User-centered methods teach us not to rely on initial instincts but on well-grounded “data,” to ward off the rapid-fire attacks of suspicious engineers and skeptical executives anxious that their dollars be applied toward something with a guaranteed, tangible ROI. Indeed, we must venture into this contest wearing a flak jacket of “data” to protect ourselves from random volleys of anxious emotion. But what does that mean, to have data?
The commonplace notion is that “data” encompasses usability-lab statistics, click traffic, and rigorous scientific formulae. In fact, there are other kinds of metrics, qualitative and quantitative alike: market share, audience growth, customer satisfaction, and NPS scores. And with ethnography, affective research, and story-based methods, it’s clear that the boundaries of what constitutes data are broadening.
Just as valuable is the data of one’s experience: the empirical, observational, and anecdotal kinds that arise from watching and listening to people in their actual context. This adds a richness, the nuances of goals and the subtleties of problems, beyond what web analytics can provide. Debra Dunn of Stanford’s d.school (the Hasso Plattner Institute of Design) says that adhering to Web analytics “makes it very difficult to take bold leaps; it is more from engaging with users, watching what they do, understanding their pain points, that you get big leaps in design.”
Another type of data that shapes design decisions is the designer’s own evolved sense of judgment, perception, and informed intuition, built over years of working with clients and projects across diverse contexts. (Before you scoff, isn’t this true for veteran surgeons, lawyers, accountants, and executives? Why not for designers?) For seasoned, mature designers, this is a vital kind of data: actual field experience that leverages past mistakes, lessons learned, and patterns identified, drawing upon that reservoir accordingly. The world’s best surgeons are no different in their use of “self-reflective” experiential data to yield superb results. Instinct, in this sense, is simply refined, natural judgment.
Digging deeper, we see that underlying this bias toward “hard,” quantifiable, lab-based data is an assumption that isolated pieces of a design solution can be proven as truth, absolute and final.
This contrasts sharply with approaching design as a holistic demonstration of an idea, iterated and evolved in cyclical fashion toward rapid learning. We need greater appreciation that data is not truth; it is merely one point in the deliberation over what is appropriate for a context, tempered by healthy skepticism. A productive approach requires a liberal interpretation of data, acknowledging multiple flavors as valid and legitimate for different phases of a project, given its various constraints and demands.
Ah, there’s the rub: interpretation. All data is subject to human interpretation, and humans, as we all know, are imperfect. As Jared Spool said at the Interaction ’09 conference, “Any piece of data can be whipped to confess to anything.” In the end, data is used either to support or rebut an argument. Design is an intensely deliberative human activity, grounded in debate (even manipulation) toward some reconciliation of viewpoints into an outcome. That is the real battlefield: ideas contested in action among business, engineering, and user experience. Data helps enable and shape the conversation toward a shared, optimal resolution; it does not conclusively finalize it. It is the peaceful coexistence of data with professional judgment and experience that makes such decision-making more effective, and perhaps even right.