Trust in Technology
Dr Lynch explores the troubling phenomenon of fake news and why technology should be seen as part of a solution, providing checks and balances on humans intent on exploiting it.
It strikes me, as we move deeper into the era of fake news and the leader of the free world discredits long-established news organizations with a stab of his finger in the air, that trust is becoming more important than ever.
As is often the case, technology proves to be a source of both good and ill, as Teller and Oppenheimer discovered at the dawn of the atomic age. Take blockchain as an example, which The Economist dubbed "a machine for creating trust". Blockchain is a reliable public ledger that anybody can access and inspect but that no single user controls. Suddenly power is no longer in the hands of one institution but spread amongst its members: no need for a centralized stock market, land registry, currency and so on. Updates can only be made following a strict set of rules and with the agreement of participants, and it is immediately obvious if a change has been made to the information, as it would no longer match the other copies of the ledger. Blockchain harnesses advances in both cryptography and processing power to answer our need for a robust and trustworthy system.
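The tamper-evidence described above rests on a simple idea: each entry in the ledger is cryptographically chained to everything that came before it, so a quiet edit to history breaks the chain. The sketch below is a minimal illustration of that principle only, not of any real blockchain implementation; the record fields and function names are invented for the example.

```python
import hashlib
import json

def block_hash(record, prev_hash):
    """Hash a record together with the previous block's hash,
    chaining every entry to the full history before it."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_ledger(records):
    """Build a chain of {record, hash} blocks from a list of records."""
    chain, prev = [], "0" * 64  # conventional all-zero genesis hash
    for rec in records:
        h = block_hash(rec, prev)
        chain.append({"record": rec, "hash": h})
        prev = h
    return chain

def verify(chain):
    """Recompute every hash; any altered record breaks the chain."""
    prev = "0" * 64
    for block in chain:
        if block_hash(block["record"], prev) != block["hash"]:
            return False
        prev = block["hash"]
    return True

ledger = build_ledger([
    {"from": "alice", "to": "bob", "amount": 10},
    {"from": "bob", "to": "carol", "amount": 4},
])
assert verify(ledger)                     # the untouched ledger checks out
ledger[0]["record"]["amount"] = 1000      # a quiet edit to history...
assert not verify(ledger)                 # ...is immediately detectable
```

Because every participant holds a copy of the chain and can run this check independently, no single institution needs to be trusted to keep the record honest.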
From its early incarnation hosting bitcoin transactions, blockchain is now used in a broad range of transactions that we once trusted governments and banks to oversee.
Technology can also be at the root of the spread of distrust, and the most recent example of this is the proliferation of fake news. In and of itself, fake news is not a new phenomenon. The business model of many tabloid newspapers adheres to a brand of journalism that bears very little resemblance to recounting actual events. Then again, I do not consult these publications to form an opinion on a matter of grave importance, but for distraction and entertainment.
Nevertheless, what we have begun to see recently is a troubling phenomenon. The deliberate misinformation of the public in an attempt to sway a vote or curry favorable opinion is flooding new media outlets like Facebook and Twitter alongside legitimate, double-sourced pieces of information. Associated with this is the rise of technologically enabled social media echo chambers that entrench polarized views. The speed at which misinformation goes viral and reaches millions of readers is staggering and worrisome. During the last US election, it appears that the attempts to discredit the presidential candidates were carried out not only by campaign teams, but also by foreign governments with a vested interest in a particular outcome. The ability to influence an election so directly, if indeed that did happen (and we'll never know), is made much easier through technology.
Allegations have been made that Russia hacked into the servers of the Democratic Party and leaked emails that damaged Hillary Clinton's presidential bid. President Trump has alleged that tampering with the electronic voting machines in some places made Hillary the undeserving winner of the popular vote. For weeks, attempts to undermine the election results were made by both sides. In response, election officials in the Netherlands have decided to return to paper ballots over electronic voting, to ensure that the election results can be trusted. Trust in technology is waning.
These are the early canaries in the coal mine, and it is only the beginning. The more of our corporate and personal life migrates online, the more attractive a target it becomes for criminals. In cybersecurity it is now widely accepted that the biggest threat to a company is not the theft of bank accounts or personal details; it is a so-called trust attack. It is possible to infiltrate the network of a well-established (and cyber-defended) bank with a piece of code that lies dormant and undetected on the network for a long time. When the code is activated, it could, for example, alter the balance on an account by a very small amount, and do this over several months, corrupting the backups as it goes. When alerted, bank officials will have no means of establishing what the true balance of the account is, completely undermining the integrity of that, and all other, information they are holding. Counterparties can no longer trust them, and that is the end. The consequences of trust attacks may be far greater than the type of attack that cost Dido Harding her sleep, or that has put the Verizon acquisition of Yahoo in jeopardy.
What if a trust attack is perpetrated on one of our trusted sources of information, even the BBC? The BBC usually comes top in worldwide polls of "who do you trust to know what is really happening?" - except, notably, in Russia. The BBC agonizes over impartiality, giving equal air time to politicians of every stripe to avoid accusations of bias. When I was on their Board, after hearing all of the impartiality analysis, I was always pleased to see the Beeb being accused of bias in both directions on any contentious issue. If both sides are convinced you are biased against them, you can't be going too wrong! A debate for another day is whether this impartiality should apply to science, where an oncologist's opinion should carry more weight than that of a nutter proclaiming that the cure for cancer is to be found in eating pomegranate seeds with ice cream. A trust attack, however, might just put that nutter's opinion in the mouth of the experienced oncologist, and the combination of news outlet and trusted source would satisfy the public that this was not a piece of fake news. And what about the once-in-a-blue-moon day when the nutter turns out to be right?
Truth and trust appear to be two sides of the same coin. Technology has not created the fake news problem, nor is it responsible for the erosion of trust in our institutions. I remain optimistic about technology being part of a solution, providing checks and balances on the humans intent on exploiting its strengths to weaken society.
As for truth, it's a lot more mercurial and fragile than it may seem. The last sentence of this article is false.