In this era of the Fourth Industrial Revolution, new technologies are evolving faster than ever. But some experts are concerned that the pace of progress is so rapid that there isn’t enough time for us to absorb and reflect on (and consequently avoid) some of the mistakes that have caused technological development to stumble in the past.
Hilary Sutcliffe, a responsible innovation expert and the director of SocietyInside, addresses this concern in a recent World Economic Forum article in which she looks back to the early days of nanotechnology. In examining some of the issues that accompanied the introduction of nanotechnology, Sutcliffe identifies five important lessons that one can (and should) apply to all future forms of technological development, from artificial intelligence to gene editing.
Distinguish the brand from the science.
Sutcliffe’s article refers to the “tyranny of the ‘ology’”—the danger of becoming overly fixated on the brand of a particular new technology rather than on the science behind it. Brands like nanotechnology or synthetic biology, in particular, have become popular buzzwords that organizations use to attract funding and academic investment, as well as to demonstrate their commitment to innovation. However, development can be compromised when the glamor surrounding a new technology, rather than how well it works or what risks are associated with it, drives discussions.
For example, the early excitement around nanotechnologies focused on the definition of nanomaterials as having features smaller than 100 nanometers. But while this size specification was an important element of “brand nano,” it proved to be a poor predictor of how these new materials actually behaved or the hazards they presented.
Hype has consequences.
In the new technology sector, competition for funding, media attention, and public interest is fierce. As a result, what Sutcliffe calls an “economy of promises” has developed, in which scientists and businesses hugely exaggerate the potential benefits of their particular “ology” to boost their chances of accessing vital financial support and other resources.
But these overstated claims have repercussions that we should not overlook. One is the inevitable tarnishing of a technology’s reputation when it fails in the short term to live up to its hype. For example, the US National Cancer Institute’s 2004 goal of using nanotechnology to eliminate death and suffering from cancer by 2015 can’t help but feel disappointing now, even though the technology itself may eventually lead to that desired outcome.
Another repercussion concerns the delicate world of new technology regulation and legislation. Regulators have no option but to start their process based on what scientists and businesses claim their technology will deliver, but too much hype here can distract from a thorough and accurate exploration of a technology’s very real risks and hazards.
Closely associated with the issue of over-promising is the actual language involved in discussing and promoting new technologies. Naturally, we must devise new terms and metaphors when describing technologies and possibilities we haven’t seen before, as well as the problems they might solve, but we need to consider the impact of our chosen words carefully. For example, many new technologies rely on military-inspired metaphors to evoke a feeling of control, dominance over nature, or extreme scientific accuracy. Not only do such terms invite unsettling comparisons, but they also rarely reflect reality, which compounds the problem of over-promising.
Don’t start by obsessing about the backlash.
Yes, new technologies can sometimes prove controversial, but when scientists launch their ideas from a place of defensiveness and confrontation, they often spark the very problems they are trying to avoid. Society does not necessarily have a widespread fear of technology itself. Instead, it has a widespread desire for engaged and collaborative discussion about what the technology is being developed for and what problems it will help solve. While it’s certainly important to think about how to address a potential backlash, imagining that such a backlash is already occurring when it isn’t can obstruct both developmental productivity and useful, forward-thinking societal dialogue.
Weigh the risks and benefits thoughtfully.
One of the biggest challenges associated with the hype, as well as the sheer volume of information around new technologies, is that it can be difficult for us to accurately weigh the real evidence of potential benefit or possible harm. So much conflicting information and opinion surfaces when new technologies are introduced that parties on both sides of the debate often fall back on preconceived ideas, cherry-picking data to prove the point they’ve already decided on. However, as a society, it’s important that we discuss and weigh the question of acceptable benefits and risks in a thoughtful and clear-headed way, especially because reports have repeatedly shown that early warning signs of disaster are often clear (as in the case of asbestos, for example), yet systemic biases and behavioral factors hold us back from acting on them.