We need more visionary innovation work where we shape new meaning for something that shifts perspectives and environments.
Innovation is not an abstract or new concept, but the tangible result of the human urge for knowledge and progress. Our drive to question the status quo and to be constantly on the lookout for new ideas has barely changed over the millennia. Above all, innovation simply expresses the human desire for a comfortable life in prosperity, peace, and good health. The longings and ideas of how we can make our lives easier and more substantial accompany us in our everyday thinking and acting. As individuals, we seek meaning and purpose: a reason to get out of bed in the morning, a belief that tomorrow will be a better day.
The origins of innovation lie in the big and small things of everyday life. These are things that we want to improve or adapt to our changing needs and wishes. The finiteness of resources forces us to develop new solutions, and our human imagination propels us to strive for radically new technologies. We ask how to do this or that differently and seek out the necessary knowledge. In doing so, we inevitably encounter problems and are confronted with the limits of what is currently possible. And that is the genuine starting point for the emergence of innovation: the desire for change and progress by overcoming obstacles and shifting boundaries. We’ve never been as advanced as we are today, but when I look around, I cannot help but see a world that is more unbalanced, uninspired, and unhappy. Surely with our design and innovation capabilities, we can avoid that feeling of spinning out? Designed innovations have been the cultural shapers of our human world from the start. Through thinking and making, we have designed objects, tools, systems, cities, and commodities. But today we increasingly want to innovate systems that alter, transform, recontextualize, and reinterpret their own environments. It’s no longer only about images, shapes, forms, and words; innovation now influences our social and cultural interactions across societies.
This new condition exposes the boundaries of what works through old thinking and methods geared to preserve the status quo. New choices need to be made. Humans originally evolved in a world of few choices. Prehistoric, preindustrial, and pre-digital eras required far fewer decisions than today’s all-access, always-on world of too much information. That age of information leaves innovation struggling with its central paradox: each time, we must do something that appears to be happening for the first time.
And so, by demand and by reaction, we flood our innovation process with ideas, prioritizing the user-generated or crowdsourced ones, as if quantity will somehow provide quality. We stumble on ideas by looking outside, by what we call ‘immersing ourselves in the problem space.’ Those we talk with, under the moniker of learning, are to provide us with the ideas with which we think we can solve their problems: experts in the domain, users of the device, providers of the service, management of the brand; stakeholders. For all its banner-waving of being human-centered, ironically, we water down the human-ness by filing people under anonymous labels such as ‘stakeholder,’ ‘persona,’ and ‘user.’ Every person has a very individual set of knowledge, characteristics, and wishes. Everyone has skills and valuable experience that can be helpful in a wide range of situations in life, personally and professionally. Thus, every human being has the potential to become a catalyst for innovation and to participate in its implementation. Meanwhile, we seek to empower only a select few to carry that torch.
The radius of our worldview, our depth of field if you wish, is no wider or richer than that of those we work with or for. Yet, undeservedly, most designers or innovators see themselves as the vital missing piece of the puzzle, the center point of anything and everything that wants to be of value and meaning. Without realizing it, those of us in such advisory roles often bring our own issues to our work helping others. We make assumptions and judgments based on our own experiences that often have little to do with the leader we’re trying to be, or the team we are trying to support. I’m worried that the discussion about innovation is losing its vitality and that a handful of beliefs are becoming dangerous dogmas. Two worry me the most.
The first is the belief that innovation should be user-centered, with users as the first and foremost source of insights, so that innovation processes should start from the observation of mainstream or extreme users. It’s lovely in theory, but it is more meaningful to focus on what makes us human, and less on what makes us users. Innovation should be something more than a focused pursuit of creating simpler products and then hoping this sparks emotional connections with users. Those looking for a prescribed way to implement Design-Thinking are destined to be disappointed. Innovation is a messy, opaque process that depends as much on group dynamics as on intellect or insight.
The second mistake is that we see the innovation process as more important than the outcome. There is no one “true” path to innovation, yet all too often organizations and the designers they partner with act as if there is. They lock themselves into one type of strategy and say, “This is how we innovate.” It works for a while, but eventually it catches up with them as they spiral down the narrow path of iteration, locked into a set of solutions that no longer fits the problems they need to solve. It’s a narrative of becoming square-pegged in a round-hole world.
Innovation always loses relevance this way.
Just as it will lose impact when all the responsibility is placed on the design effort. Yes, good design is often made by great designers, but such talents are few and far between; the 0.1% of those practicing the profession are the ones believed to hold the magic silver bullets. User-centered innovation is very effective for incremental innovation but fails when it comes to breakthroughs. Designing something innovative that truly fixes a human condition, supports or protects people, or understands them to such a degree that it becomes part of their identity: you’d be forgiven for thinking there is something supernatural about it all. Let’s forget about innovation as a magic process and focus on how designers and managers should best work together to deliver great-quality outputs. Just as the marketing department wouldn’t rely on a single marketing tactic, or a CFO on a single source of financing for the entire life of an organization, there is a need to build up a portfolio of innovation strategies designed for specific tasks.
It’s self-evident, but frustrating, that this is not truly understood: one size does not fit all in innovation. Different innovation problems require different approaches. There is no one method that is always good. The more disruptive an innovation is, the more emergent its process. If all companies converge on the same approach, innovation becomes less of a differentiator. The most innovative companies are those that question the existing innovation paradigms and explore new avenues. Today most innovation efforts are really exercises in tech-first iterative cycles. Without realizing it, most innovators, designers included, suffer from functionality tunnel vision, pushing frantically for the next version of things they have seen before. Innovators and designers help their technologies find a problem rather than helping problems find a solution. A more holistic approach to design-made innovation is inevitably becoming the option to explore. Innovating the known is an important half, but opening it up to more is everything. Design should always be in the service of a better life, but, unfortunately, it does not always achieve that objective. We can all think of examples of design projects, even the best-intentioned ones, which threaten to make our lives worse rather than better. I have yet to meet a designer who wants his or her work to be dysfunctional, dispiriting, demeaning, or disempowering, but sometimes the reality is that it is. Some design projects prove to be damaging because of the way in which they are applied. The computer virus was an innovation, originally designed as a self-replicating form of software that could be installed remotely without the user’s knowledge, but it was not intended to be malignant. Quite the contrary. Sadly, though, the concept proved open to abuse and spawned destructive viruses.
This exposes a blind spot: no one seemed to have considered that self-replicating software could be anything other than useful. And if the downside ever was considered, it certainly was not acted upon. There is another blind spot in plain view today, one that has us questioning what innovation is actually for. In Part 2, I expand on what it is we should call innovation, based on the blind spots surrounding the digital app economies that are currently billed as improving our lives. What I can signal here already is that there is a new path to meaningful innovation to consider, one that better fits our changing world. We’re already supplied with a surplus of ideas, yet feel desperately lacking in meaning. It’s time to innovate the way we innovate. The old ways do not apply because they are geared toward solving known problems only. This is a point that previous champions of user-centricity now also make. For example, Don Norman, a pioneer in human-centered design, recently raised issues about the limits of user-centered innovation, arguing that it cannot create breakthroughs. The developers of the Nintendo Wii didn’t get close to users; they got close to ‘interpreters’: media people, artists, designers, sociologists, retailers, suppliers, and so on. Innovating for the future is first about finding the right interpreters of that future.
Innovation might be the buzzword du jour, but it is hardly a new concept. In fact, it is as old as the human race. A Google search on what it means yields 487,000,000 results in 0.2 seconds. Lately, the word itself has become so prevalent that it’s difficult to distinguish meaningful innovation from a merely updated idea. Through design and innovation we create what we see, what we use, and what we experience. In this time of crisis and revolt against social and economic imperatives, designers and innovators can choose what they dedicate their profession to. Iterating and ideating on products or services that only aim to increase consumption at the expense of personal data will leave us looking only for what we can already predict.
We need visionary innovation work where we shape new meaning for something that shifts perspectives and environments. Innovation that restores humanity to the societies we design for. Things like economic, environmental, or social crises shouldn’t really be happening, or be left unaddressed. Certainly not with the technological capabilities we have today. This is the step change required, one that demands a different approach altogether. It requires vision. Designers stuck on a path of incremental change have become less visionary, spending all their effort getting close to consumers to find the right answer to a tech problem and then attempting to become businesspeople so as to solve the problem commercially. Those designers have lost vision.
To the best of my recollection, it wasn’t until 2010 that anyone ever called me “innovative.” But now, I hear it all the time. Often I notice a particular temperature around the statement: I sense the temperature in the room go up or down whenever I am declared to be “innovative” or an “innovator.”
So what happened? For a designer, the ability to innovate and quickly solve problems within their project work is one of the strongest values they can bring to the proverbial table. Companies are looking to hire creative and innovative specialists in the hopes of not becoming stale or falling behind more flexible and agile startups. To stay ahead of the competition, the senior managers within a brand are paired with innovation specialists or advisors, experts who have previously used creative thinking to launch successful projects. However, in labeling enterprising individuals as “innovators,” these managers may doom them to failure before they even start. As the person who is looked upon to dazzle with new creative solutions, all eyes are on every move you make. Every word you speak is heard as a directive, a solution, even a guarantee. But as real innovators know, the only way to create real change is to take risks, and with those risks often comes the potential for failure. Being under the spotlight can magnify failures and make innovators more risk-averse, influencing their ability to create the type of disruption needed for real change. When a person’s role is declared as ‘the innovator,’ he or she is going to be taking risks under the spotlight. That is daunting, even for the best and most seasoned among us.
Consciously or not, it will hold back the quality of the creative thinking that is being asked for. Although ‘innovation’ has become an increasingly popular buzzword, the overwhelming majority of people maintain a strong, innate bias against new ideas—paradoxically, even those ideas they claim to want. For a work to be truly creative, for an idea to explore its potential and reach, it has to depart from the norm and push beyond the boundaries of the known and proven; that very departure makes many people uneasy. Objective evidence shoring up the validity of a creative proposal does not motivate or condition people to accept it. If you are a practicing designer or consultant, you’ve witnessed your clients dismiss creative ideas in favor of ideas that are purely practical—tried and true.
For example, in a Cornell University social and behavioural sciences study, participants expressed a negative reaction to a running shoe equipped with nanotechnology that adjusted fabric thickness to cool the runner’s foot while reducing the friction that creates blisters. To uncover bias against creativity, the researchers used a subtle technique to measure unconscious bias—the kind to which people may not want to admit, such as racism. Results revealed that while people explicitly claimed to desire creative ideas, they actually associated creative ideas with negative words such as “vomit,” “poison,” and “agony.” These associations reveal the bias that causes subjects to reject novel, high-quality ideas for new products. These findings imply a deep irony. Uncertainty drives the search for and generation of new and creative ideas, but uncertainty also makes us less able to recognize creativity, even when we need it most. Revealing the existence and nature of a bias against creativity can help explain why people might reject creative ideas and stifle scientific advancement, even in the face of strong intentions to the contrary. The field of creativity may need to shift its current focus from identifying how to generate more creative ideas to identifying how to help innovative institutions recognize and accept creativity.
Anti-creativity bias is so subtle that people are unaware of it, which can interfere with their ability to recognize a creative idea. The next time your great idea at work elicits silence or eye rolls, you might just have experienced that anti-creativity bias. Research indicates people don’t even know what a creative idea looks like and that creativity, hailed as a positive change agent, actually makes people squirm. How is it that people say they want creativity but in reality often reject it? This sets up a tension between failure and incompetence: when people reject a creative idea, did the idea fail, or are those evaluating it incompetent? Did the originators of the idea fail in their positioning and storytelling, or did the rejection happen out of incompetence?
Speak to any designer or consultant, especially one still making their way into the discipline of innovation, and they will share the inner battle that rages on in their minds and guts: the constant questioning, the relentless weighing of whether their thoughts are good enough, competent enough, to be expressed, or whether they will be labeled as, or result in, failure. Because of being under that spotlight, carrying the weight of having the right answers and ideas, creative people are often merciless self-critics to begin with. Our identity is tied closely to our ideas, and how these are received either builds us up or tears us down, if we let it.
Let’s face it, many people have the attitude, “If it ain’t broke, don’t fix it.” I once heard a manager use that as a reason to reject insight-based creativity, to which I quipped that my role in the project was: “If it ain’t broke, perhaps I need to hit it harder.” Jokes aside, these people will rarely ever understand the need for change or why their bosses even needed to hire someone to that end. Innovative ideas can displace colleagues, potentially jeopardizing their jobs or titles, or, at the other end of the spectrum, creating more work for them. Not everyone is adaptable in the workplace, and this fear can cause distrust, or negative perceptions, of those identified as the “innovators.” We then end up with permission to do everything but the ability to do nothing.
Our practice has made Design-Thinking our go-to move: we want to make sure the right problem is defined before we unleash our expertise toward solving it. Designers like myself act a little like anthropologists, seeking to understand human needs and problems before jumping to solutions. If you’re anything like me, you delay the converging as long as the project can tolerate it, and you may even share my habit of not using the term ‘solution’ at all. Instead, I describe these as ‘possibilities.’ Whatever is labeled a solution today is tomorrow’s design problem. To condition the right expectations and mindset, using the term ‘possibility’ over ‘solution’ has proven, for me, to soften the tensions around failure and incompetence, both for myself and for the teams involved. But it has not yet helped those outside the practice of creativity and advice to let go of the need for validation of new ideas before approving them. Most people in business, if they need to embrace doing something new, need “the illusion of rationality.” For the innovation initiative to move forward, this round of rationalization is a waste of time and resources—yet one that keeps a lot of people occupied, as I learned firsthand.
Ultimately, while basic design and creative methods can be learned and, much like muscles, developed and strengthened through practice, this shift in mindset requires a different kind of leadership. It necessitates a new breed of leader who has developed and can use both sides of the brain: linear analysis for planning and executing when the decision-making information is known, and a discovery mindset for using small bets to create the data. I will wear my bias on my sleeve here because, for me, this is a role that a designer-at-heart is perfectly suited for. Our role is to embody the culture that is conducive to any playbook of innovation.
Such a culture is not only good for a company’s bottom line; it is also something that both leaders and employees value in their organizations. During projects at companies across the world, regardless of industry or market, I’ve often informally surveyed dozens of managers about whether they want to work in a way that treats innovative behaviors as the norm: creative, open-minded thinking, the exploration of ideas. I cannot think of a single instance when someone has said, “No, I don’t.”