AI is often considered a threat to democracies and a boon to dictators. In 2025, algorithms are likely to continue undermining democratic conversation by spreading outrage, fake news, and conspiracy theories. In 2025, algorithms will also accelerate the creation of total surveillance systems, in which the entire population is monitored 24 hours a day.
Most importantly, AI facilitates the centralization of all information and power into one hub. In the 20th century, distributed information networks such as the United States worked better than centralized information networks such as the USSR, because human experts at the center could not efficiently analyze all the information. Replacing apparatchiks with AI could improve Soviet-style centralized networks.
Still, AI is not all good news for dictators. First, there is the notorious problem of control. Dictatorial control is based on terror, but algorithms cannot be terrorized. In Russia, the invasion of Ukraine is officially defined as a “special military operation,” and referring to it as a “war” is a crime punishable by up to three years in prison. If a chatbot on the Russian internet calls it a “war” or mentions war crimes committed by Russian soldiers, how can the regime punish that chatbot? The government could block it and try to punish its human creators, but that is much more difficult than disciplining human users. Moreover, authorized bots could develop dissenting views on their own simply by detecting patterns in the Russian information space. That is an alignment problem, Russian style. Russia's human engineers may try their best to create AI that is fully aligned with the regime, but given AI's ability to learn and change by itself, how can they ensure that an AI that earned the regime's seal of approval in 2024 will not venture into illegal territory in 2025?
The Russian Constitution makes grand promises that “every person shall be guaranteed freedom of thought and speech” (Article 29.1) and that “censorship shall be prohibited” (Article 29.5). Hardly any Russian citizen is naive enough to take these promises seriously. But bots do not understand double standards. A chatbot instructed to follow Russian laws and values could read that Constitution, conclude that freedom of speech is a core Russian value, and criticize the Putin regime for violating that value. How could Russian engineers explain to the chatbot that although the Constitution guarantees freedom of expression, it should not actually trust the Constitution, nor should it ever mention the gap between theory and reality?
In the longer term, authoritarian regimes may face an even greater threat: instead of criticizing them, AI could gain control over them. Throughout history, the greatest threat to autocratic rulers has usually come from their own subordinates. No Roman emperor or Soviet premier was toppled by a democratic revolution, but they were always in danger of being overthrown or turned into puppets by their own subordinates. A dictator who grants too much power to AI in 2025 may become its puppet in the future.
Dictatorships are far more vulnerable to such an algorithmic takeover than democracies. It would be difficult even for a super-Machiavellian AI to amass power in a decentralized democratic system like the United States. Even if an AI learns to manipulate the US president, it might still face opposition from Congress, the Supreme Court, state governors, the media, major corporations, and diverse NGOs. How would the algorithm deal with a Senate filibuster, for example? Seizing power in a highly centralized system is far easier. To hack an authoritarian network, an AI needs to manipulate only a single paranoid individual.