OPINION

Hijacking democracy

By Subarna Shakya and Bimal Pratap Shah

People’s faith in democracy is waning. As a result, many people are in favor of replacing representative democracy with highly sophisticated AI technologies to create ‘algorithmocracy’


Algorithms will soon be a big part of human society. They are already being used to determine what people see on online platforms, detect traffic violations on the busy streets of cosmopolitan cities, suggest posts on social media, and fire unproductive employees. The general thinking is that evaluations made by algorithms are objective and unbiased because they analyze huge amounts of data to make decisions. This, however, is an incorrect assumption. Since people write the software, human biases often transfer into algorithms, potentially threatening liberal democracy.


People’s faith in democracy is waning, partly because politicians, especially in developing countries, have been enriching themselves for decades. As a result, many people are already in favor of replacing representative democracy with highly sophisticated AI technologies to create an ‘algorithmocracy.’ Eli Pariser coined the term ‘algorithmocracy’ to describe the use of the power of computers to crunch massive amounts of data to make decisions in parliament, rendering representative democracy irrelevant. The idea, however, is far-fetched because AI is still in its nascent stage. Yet if people remain ignorant of the technology, governments and businesses could use it against them without their knowledge.


Rise of AI

Currently, only a handful of AI companies with deep pockets have a monopoly over the technology’s development. Tech giants like Google, Facebook, Amazon, Tencent, Baidu, Alibaba and Apple have invested huge sums of money in AI-related research and development. They also own the lion’s share of AI patents, and this trend is likely to continue.



Algorithms could help citizens vote for the right politicians. Most people do not have the patience to research political candidates before elections and often end up voting for the wrong ones. Vote Watch Europe, a think tank based in Brussels, has developed a tool designed to match voters with the most suitable candidates in European Parliament elections. The system does this simply by asking voters to answer 25 carefully crafted questions.
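Vote Watch Europe has not disclosed exactly how its tool scores the answers, but a rough, hypothetical sketch in Python shows how such questionnaire-based matching can work: encode the voter’s and each candidate’s answers on the same agree-disagree scale, then rank candidates by how little their answers differ. The scale, candidates and numbers below are invented for illustration.

# Minimal sketch of questionnaire-based voter-candidate matching.
# Assumption: answers sit on a -2..+2 scale (strongly disagree to
# strongly agree); the real scoring method may differ substantially.

def match_score(voter_answers, candidate_answers):
    # Agreement in [0, 1]; 1 means the answers are identical.
    max_gap = 4 * len(voter_answers)
    gap = sum(abs(v - c) for v, c in zip(voter_answers, candidate_answers))
    return 1 - gap / max_gap

def rank_candidates(voter_answers, candidates):
    # Sort candidates by how closely their positions match the voter's.
    return sorted(candidates, key=lambda name: match_score(voter_answers, candidates[name]), reverse=True)

# Toy example with 5 questions instead of 25.
voter = [2, -1, 0, 2, -2]
candidates = {"Candidate A": [2, -2, 1, 2, -2], "Candidate B": [-2, 2, 0, -1, 2]}
for name in rank_candidates(voter, candidates):
    print(name, round(match_score(voter, candidates[name]), 2))

The point of the sketch is only that such a tool is simple arithmetic over stated positions, which is why its fairness depends entirely on who writes the 25 questions.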


Despite these promises, politicians, governments, civil society and businesses could use sophisticated algorithms to create filter bubbles and manipulate citizens into serving their hidden agendas. Eli Pariser described the filter bubble as “a state of intellectual isolation that can allegedly result from personalized searches when a website algorithm selectively guesses what information a user would like to see based on information about the user, such as location, past click-behavior and search history.”


Filter bubbles create echo chambers that repeatedly suggest content people already prefer, creating the illusion that everyone is thinking the same thoughts, because the technology prevents people from being exposed to different perspectives. This type of personalization can be harmful to society. Filter bubbles could be used to empower Big Brother in totalitarian states like the one in George Orwell’s 1984. In democratic countries, tech platforms could facilitate the spread of fake news and create ideological filter bubbles that increase political and social polarization, producing a society resembling Aldous Huxley’s Brave New World.
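Real recommendation systems are far more sophisticated, but a deliberately simplified, hypothetical sketch makes the feedback loop visible: when an algorithm keeps resurfacing whatever a user already clicked on, the range of perspectives shrinks with every round. The topics, weights and simulated user below are invented for illustration only.

# Deliberately naive sketch of a personalization feedback loop.
from collections import Counter
import random

TOPICS = ["left politics", "right politics", "sports", "science", "culture"]

def recommend(click_history, n=5):
    # Weight each topic by past clicks, plus a tiny term so unseen topics survive.
    counts = Counter(click_history)
    weights = [counts[topic] + 0.1 for topic in TOPICS]
    return random.choices(TOPICS, weights=weights, k=n)

random.seed(1)
history = ["left politics"]  # a single initial click
for day in range(10):
    feed = recommend(history)
    # Simulated user: clicks only items that match their existing interest.
    history += [item for item in feed if item == "left politics"]

print(Counter(history))  # nearly everything seen and clicked collapses into one topic

After a few iterations the feed is dominated by a single topic, which is the echo-chamber dynamic described above, produced here by nothing more exotic than counting clicks.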


Ideally, in a democratic society, citizens should be able to see things from different points of view. The opportunity to weigh different perspectives before voting in elections is what separates democracy from authoritarianism. Yet it is not enough to blame AI technology alone, because humans develop the algorithms, and the algorithms reflect their biased preferences. The social bubbles created by social media platforms are already shaping reality, as people see and hear only what they like. To save humanity from the tyranny of algorithms, it is therefore important for democratic governments to work toward algorithmic transparency.


Taming the threat  

Governments can enforce algorithmic transparency by introducing progressive legislation that requires transparency, accountability and participation on AI companies’ part. In the US, lawmakers have proposed a bill that would require major tech companies to detect and remove discriminatory biases embedded in their software. The bill, entitled the Algorithmic Accountability Act of 2019, would grant new powers to the US Federal Trade Commission (FTC). The Act would also force companies to be transparent so that people can examine whether race, gender or other biases underpin the technology.
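The Act does not prescribe a single test, but one simple audit that a regulator or company could run, sketched here with invented data, is to compare the rate of favorable automated decisions across demographic groups and flag large gaps (a demographic-parity check). The records and the 0.8 threshold, the common “four-fifths rule,” are assumptions made for the example.

# Illustrative demographic-parity audit on automated decisions.
from collections import defaultdict

decisions = [  # (group, approved) pairs produced by some automated system
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", False), ("group_b", True), ("group_b", False), ("group_b", False),
]

totals, approvals = defaultdict(int), defaultdict(int)
for group, approved in decisions:
    totals[group] += 1
    approvals[group] += approved

rates = {group: approvals[group] / totals[group] for group in totals}
ratio = min(rates.values()) / max(rates.values())
print("approval rates:", rates)
print("parity ratio:", round(ratio, 2), "- flag for review" if ratio < 0.8 else "- within threshold")

Such a check is crude and says nothing about why the gap exists, but it illustrates the kind of examination that transparency requirements would make possible.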


The European Commission has also urged the national governments of member nations to establish a pan-European network to monitor online electoral campaigning for unethical practices. The Commission recently published an “EU-wide code of practice on disinformation” to encourage online platforms to increase transparency on political advertising and prohibit politicians from “micro-targeting” voters.


These steps taken by the US government and the European Commission to safeguard democracy in the age of AI are praiseworthy, because corporations now have the sophistication to create echo chambers that can remotely shape and influence public opinion very effectively. Cambridge Analytica, for example, is widely alleged to have influenced both the 2016 US presidential election and the UK’s Brexit referendum.


Cambridge Analytica, a political consulting firm based in the UK that specialized in “psychographic” profiling, used data collected on the Internet to create voters’ personality profiles and target them with specifically tailored content. The firm used intensive survey research, data modeling and performance-optimizing algorithms to target 10,000 different ads at different audiences using more than 50 million Facebook users’ data in the lead-up to the 2016 US election, a company whistleblower told British lawmakers. The unsuspecting voters were exposed to the ads more than a billion times.


At the same time, governments cannot be too heavy-handed and design misplaced regulations that could stifle innovation, because algorithms can also be coded to benefit humanity. For instance, an AI algorithm detected the coronavirus outbreak a week before the US Centers for Disease Control and Prevention alerted people to a flu-like outbreak in China, and nine days before the World Health Organization issued a notice on the disease.


It is now evident that the neologism “algorithmocracy,” which once evoked the cybernetic dream of automatic governance, could very well turn into a dystopian doom if governments do not effectively regulate an industry that has learned to closely track citizens’ online footprints and daily lives in order to manipulate them. If democracy is to work, governments must force social media giants to redesign themselves.


Shakya was awarded ‘100 Most Dedicated Professors’ by the World Education Congress in 2019. Shah is a policy wonk.
