At the recent QCon San Francisco conference, Katherine Jarmul, privacy activist and principal data scientist at Thoughtworks, gave a talk on unravelling techno-solutionism, in which she explored the inherent bias in AI training datasets and the tendency to assume that there is a technical solution to almost any problem, and that those solutions will be beneficial for humankind. She discussed ways to identify techno-solutionism and posed questions for technologists to consider when building products.
She started by discussing how the training datasets used in AI systems acquire biases from the tags assigned by the people who label them; these taggers are often among the lowest-paid workers in the tech industry. To illustrate the point she showed a photograph of a man and a woman having a conversation, which had been tagged both as "a worker is being scolded by her boss in a stern lecture" and as "a hot, blond girl getting criticized by her boss". Nothing in the image indicated that either description was true, yet such tags flow into the datasets used to train AI systems.
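As a rough illustration of the mechanism she described, the sketch below (hypothetical data and field names, not code from the talk) shows how subjective crowd-sourced captions are typically ingested as training labels, losing any record of the annotator's framing along the way.

```python
# Minimal sketch, with hypothetical data, of how subjective crowd-sourced tags
# become training labels: once a caption is written, downstream training code
# treats it as ground truth with no record of the annotator's assumptions.
from dataclasses import dataclass

@dataclass
class LabeledImage:
    image_id: str
    caption: str       # whatever the annotator typed, subjective or not
    annotator_id: str

# Two annotators describe the same photograph very differently.
raw_annotations = [
    LabeledImage("img_001", "a worker is being scolded by her boss", "annotator_17"),
    LabeledImage("img_001", "a hot, blond girl getting criticized by her boss", "annotator_42"),
]

# A typical ingestion step keeps only (image, caption) pairs -- the point at
# which the annotator's framing silently becomes "ground truth" for the model.
training_pairs = [(a.image_id, a.caption) for a in raw_annotations]
print(training_pairs)
```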
She defined techno-solutionism as the naïve belief that any problem can be solved by applying a magic technology box, and that the application of technology will change society for the better. Techno-solutionism treats technological advancement as inherently good. She used the example of the first written formula for gunpowder, recorded in 9th-century China by alchemists searching for an elixir of life: is that technology good, neutral, or bad?
The reality is that almost any technological advance has both benefits and harms, and these are often unevenly distributed: one group may reap most or all of the benefits while another bears most or all of the harm.
She pointed out that the computer industry is one where techno-solutionism is rife, and she traced the thinking to the early mythology of Silicon Valley, and even further back to the California Mentality of the early settlers, who had an attitude of "we can overcome challenges, better ourselves, and change the terrain". In Silicon Valley, this has evolved into the belief that one good idea can change the world, and make you rich.
She quoted Joseph Weizenbaum, who built ELIZA, often regarded as the first conversational AI program, as saying that computer technology has, from the very beginning, been:
a fundamentally conservative force, that entrenched existing hierarchies and power dynamics, which otherwise might have had to be changed.
This conservatism has meant that societal change has been impeded and the benefits of technological advancement have accrued disproportionately to a small proportion of humanity.
She gave pointers on how to spot techno-solutionism in action. If you find yourself making any of these statements, think carefully about the wider impact of what you are working on:
- I’m optimizing a metric that someone made up
- Everyone agrees on how awesome everything will be
- If only we had _______ it would solve everything
- Mythology-speak: revolutionize, change, progress
- People who bring up potential issues are excluded
- I haven’t tested a non-technical solution to the problem
She then gave five specific lessons that technologists need to take into account when building products:
1) Contextualize the technology
Ask what came before this technology, what would happen if it had never been discovered, and what we would do without it.
2) Research the impact, not just the technology
Look at the potential impact of the technology in the short, medium, and long term. Look broadly to identify who and what may be affected, and explore the knock-on effects.
3) Make space for, and learn from, those who know
Identify the people, communities, and groups who are impacted and listen to them. Make sure you communicate their voices, and if you are in a position of privilege, use that privilege to let other voices be heard.
4) Recognize system change and speak it plainly
Use language wisely and with forethought. She gave the example of "revolutionizing" e-commerce being used to describe a small change in how people interact online. Exaggeration and hyperbole are often used to obfuscate the impact of change on disadvantaged communities.
5) Fight about justice, not just about architectures
She spoke about researchers fired from Google for exposing bias in the company's algorithms. Lend your voice to those who have been silenced.
She then spoke about her decision to focus on data privacy as an area where she has a passion for change and can make a difference.
She ended with a series of questions for the audience to ponder:
- What could you be doing if you weren’t building what you are now?
- What could you change if you focused on the change, not the technology?
- What if we took collective responsibility for the future of the world instead of the future of technology?