The world needs a better way to regulate Big Tech’s unchecked power
As more of our actions and interactions are mediated through technology, those who write the rules of tech are increasingly writing the rules of society
It wasn’t all that long ago, certainly within most of our lifetimes, that digital technology seemed to be the answer to all our problems. Pick up virtually any book about the promise of tech published in the 1990s, and even into the early 2000s, and it was presented as almost inarguable that the democratizing effects of the digital revolution would bring a slew of benefits to civilization as we know it.
Today, that premise seems on far shakier ground. While there are plenty of reasons to still get excited about tech, there’s no shortage of reasons to worry. In his brilliant new book The Digital Republic: On Freedom and Democracy in the 21st Century, barrister and author Jamie Susskind questions how freedom and democracy can survive in a world full of all-powerful digital technologies.
Digital Trends: What’s the central argument that you’re making in The Digital Republic?
Jamie Susskind: The central argument is that we have a problem with the tech industry. That problem lies not with individual bad apples at the top or with particular corporations. It’s a problem of unaccountable power, stemming from a lack of proper governance.
My book tries to diagnose where that power comes from, why it’s a problem, and how we can make it more accountable in a way that preserves freedom and democracy.
The Roman Forum, widely regarded as the birthplace of the republican form of government. Getty
DT: Explain what you mean by the ‘republicanism’ referred to in the book title.
JS: It’s drawing on the ancient republican philosophy that stretches back to the Romans. This isn’t the republicanism of the modern Republican Party, nor of those who want to get rid of the monarchy in, for instance, the United Kingdom. Republicanism is a philosophy which holds that the purpose of law and politics is to reduce unaccountable power in society. For example, a republican would argue against the idea of kings, not just against a particular bad king. They wouldn’t hope for better bosses; they’d argue for employment rights. They wouldn’t complain about unpleasant slave owners; they’d fight for the abolition of slavery.
Applied to the digital context, digital republicanism says that it’s inherently problematic for an enormous amount of power to be concentrated in the hands of those who own and control digital technologies. That’s the case even if we happen to agree with how they exercise that power from time to time.
DT: Tech companies frequently face criticism, at times from both sides of the political aisle, for becoming political in some sense. But is there any way they could have avoided this? It seems inevitable. Even the broad idea of a computer interface is, in a sense, ideological, because it structures how we perceive the world. Add in the mission statements and scale of search engines, and it appears this problem was always going to arise.
JS: I think so. The central argument of my book is that digital technologies exert power, whether or not their creators consciously intend or desire it. All technologies contain rules that we have to follow when we interact with them. The rules of Twitter state that you cannot post a tweet longer than a certain length. The rules of a self-driving car may state that it won’t exceed a particular speed limit, even in an emergency.
As more and more of our actions, interactions, and transactions are mediated through technology, those who write the rules of technology are increasingly writing the rules of society. You may consider yourself an entrepreneur or an engineer or a tech executive, but you’re still performing a political function in society and should, in my view, be held accountable accordingly.
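To make Susskind’s point concrete, here is a minimal, hypothetical sketch of how such a rule can live inside software. The names (MAX_POST_LENGTH, publish_post) and the 280-character limit are invented for illustration and don’t correspond to any real platform’s API or policy.

```python
# Hypothetical sketch: a platform rule ("posts may not exceed N characters")
# expressed directly in code. The limit and names below are illustrative,
# not any real platform's actual policy or API.

MAX_POST_LENGTH = 280  # the rule, chosen unilaterally by the platform


def publish_post(text: str) -> bool:
    """Accept a post only if it complies with the platform's built-in rule."""
    if len(text) > MAX_POST_LENGTH:
        # Enforcement is automatic; the user cannot negotiate with it.
        return False
    # ...store and distribute the post...
    return True
```

However simple, the sketch illustrates the asymmetry Susskind describes: whoever writes the code sets the rule, and everyone who uses the system is bound by it.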
DT: What’s the answer to that? Engineers and executives most likely aren’t elected politicians. Should they aim for a stance of impartiality or neutrality?
JS: There’s no such thing as a neutral posture that can be adopted. That’s because neutrality itself is a choice between alternatives. For instance, if you’re neutral about the content that is posted on your social media platform, that might mean being neutral about hate speech, or rape threats, or child pornography. Another example involves Google’s autofill suggestions. Google used to have a problem with its autofill responses surfacing unpleasant suggestions: if you typed in ‘Why do Jews,’ it would come back with ‘have big noses’ or ‘own the media.’ Google’s defense was that this was neutral because it reflected the queries people had made in the past.
To me, that’s a good example of when neutrality is the same as injustice. Instead of helping to reduce the amount of discrimination in the world, Google amplified and enlarged it. As the Holocaust survivor Elie Wiesel used to say, neutrality helps the oppressor. There is no neutral posture that those who own and control digital technology can take. I think we just have to accept that there are always going to be decisions that involve priorities, trade-offs, principles, and, sometimes, prejudices.
The real question is how we manage and govern those decisions. We should govern them in the same way that we govern other unelected people in society who hold positions of social responsibility, be they doctors, lawyers, bankers, teachers, or broadcasters. These are all industries in which people hold unique positions of social responsibility, and the law imposes certain duties on them as a result.
DT: The question of neutrality has recently been raised in much of the discourse surrounding Twitter and the seemingly now-aborted Elon Musk takeover. Some have suggested that platforms such as Twitter have a bias, and that some of the problems of social media could be solved if the platforms simply intervened less.
JS: One of the long-standing themes of republican political thought is that if you adopt a position of neutrality or abstention in the social and political fray, what you’re actually doing is creating space for the strong to dominate the weak. A social media platform with no rules doesn’t give everyone equal rights to participate. It means that certain voices are going to be drowned out and certain people are going to be hounded off the platform. In the real world, the state sometimes intervenes in the lives of the people within a polity in order to redress power imbalances. Tech should be no different.
DT: There seems to be a real wave of tech skepticism at present, certainly when you compare it to, for instance, the cyber-utopianism of the 1990s, when there was the sense of a Californian ideology that could solve all our problems. Can you pinpoint when things changed?
JS: I think it’s quite clear that it happened in 2016. That year, the Remain side lost the Brexit referendum, and the Hillary Clinton campaign lost the electoral college in the United States. In both of those campaigns, claims were made by the losing side – and on behalf of the losing side – that the winning side had illicitly weaponized digital technologies.
Whether it was through micro-targeting or the harvesting of people’s data, some of those claims have withstood scrutiny in the subsequent years, while others have not. But regardless of their merit, I do regard that as a turning point. That year, the question of the power of digital technology shot right to the top of the political agenda. It has also exploded as an academic concern.
DT: What steps can we, as individuals, take to address some of the problems you outline in the book?
JS: Very few, I’m afraid. And it’s important to be honest about that. We need to get out of the mindset that if only we were a bit more tech-savvy, we might be able to better protect ourselves and our children. I believe that’s nonsense. The challenges posed by digital technology can, in the main, only be fixed at the collective level. That means through the mechanism of law. It shouldn’t be left to individuals.
DT: So what would this kind of collective action or regulatory action look like?
JS: It differs from industry to industry and from technology to technology. But in the book, I lay out a number of possibilities. Firstly, I think that powerful individuals in the tech sector should have their conduct regulated in a manner analogous to the way doctors, lawyers, and pharmacists have theirs regulated.
Secondly, I think we need a broader conception of antitrust than the one we currently have, which is focused narrowly on economic concerns. When we are assessing whether a particular merger or acquisition is good for society, we shouldn’t just take price into account; we should also take into account things like media diversity and the concentration of political and social power.
Thirdly, I would like to see ways for individuals and regulators to contest important exercises of digital power, for instance, the algorithms that distribute mortgages, jobs, housing, or loans. It’s a reasonably comprehensive legal regime that I outline in the book. Underpinning all of it is a new mechanism for involving the people in decisions about digital technology. It’s not just a matter of shifting power from tech companies to parliament, but also from parliament back to the people.
This interview has been edited for length and clarity.