Our June WebDebate tackled the topic of algorithmic diplomacy. Three issues in this field were discussed in the debate: algorithmic diplomacy in the context of geopolitical analysis and public diplomacy, the impact of algorithms on human rights, and the question of filter bubbles and online echo chambers that seem to be generated by algorithms. Joining us to discuss these issues were two speakers: Mr Shaun Riordan, a Senior Visiting Fellow of the Netherlands Institute for International Relations (Clingendael), and Mr Lee Hibbard, administrator in the Bioethics Unit of the Council of Europe. The debate was moderated by Dr Katharina Höne, research associate in diplomacy and global governance at DiploFoundation.
Höne started by giving a simple definition of an algorithm: algorithms are a step-by-step description of how to do something. They can be formulated in computer languages, but can also be formulated in natural languages. They are aimed at solving problems, producing the same results when given the same inputs. Höne also gave a classification of five types of algorithms and examples of questions in the diplomatic area these algorithms could solve. Two-class or multi-class classification algorithms answer questions such as “what is the sentiment of this tweet about this policy issue?” Algorithms that detect anomalies can answer questions like “is this trading pattern different from the state’s past trade behaviour?” Some algorithms do regression analysis and can answer questions such as “how many followers will the MFA’s account gain next week?” Unsupervised learning algorithms can solve questions like “which types of users generally agree with the messages of the MFA?” Machine learning algorithms make decisions like “should I vote for, against, or abstain from this proposal, based on my country’s context and negotiation history?”
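Höne’s definition can be made concrete with a small sketch. The following is a hypothetical, minimal example of the first type she mentioned: a toy two-class sentiment classifier for tweets about a policy issue. The word lists and scoring rule are illustrative assumptions, not part of any real diplomatic tool; the point is that an algorithm is a fixed sequence of steps that always produces the same output for the same input.

```python
# Toy two-class sentiment classifier. The cue-word lists below are
# illustrative assumptions chosen for this sketch.
POSITIVE = {"support", "welcome", "progress", "agree"}
NEGATIVE = {"oppose", "reject", "failure", "disagree"}

def classify_sentiment(tweet: str) -> str:
    """Step 1: normalise the text; Step 2: count cue words; Step 3: decide."""
    words = tweet.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

# Deterministic: the same input always yields the same output.
print(classify_sentiment("We welcome this progress on the treaty"))  # positive
print(classify_sentiment("We oppose and reject this proposal"))      # negative
```

The same step-by-step structure underlies the more sophisticated statistical and machine-learning variants discussed in the debate; only the steps become learned from data rather than written by hand.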
Algorithms have been used for a while, yet attention to them has been renewed because of Big Data and Artificial Intelligence (AI) - algorithms are used to make sense of large quantities of data, which in turn enables the development and improvement of AI. In terms of diplomacy, the applications of algorithms are quite new. Algorithms have the potential to change the tools diplomats have at their disposal, influence the topics on the diplomatic agenda, and impact the very environment in which diplomacy takes place.
Riordan started by acknowledging that there are great potential applications for Big Data analysis both in content policy and foreign policy analysis. As far as content policy goes, governments could use Big Data analysis to scrape social media for information about their citizens in order to tailor future interaction with them. However, this is where privacy issues appear, as well as the concern that other governments might use this information for social control. In the area of foreign policy analysis, it is possible to base foreign policy decisions on the outputs of algorithms. Riordan identifies two kinds of algorithms that can be used for such a purpose - designed and machine learning algorithms. Designed algorithms are written by programmers and reflect the epistemological prejudices of the programmer, making them not objective. Machine learning algorithms aren’t objective either, as the way data is fed into the algorithm shapes the way the algorithm functions. Since that data is itself selected in non-objective ways, bias persists, but the diplomats and policymakers who use the analysis understand less about how that bias is created and are less able to question the decisions the algorithm recommends. Another concern Riordan mentions is that algorithms doing Big Data analysis are online, which makes them susceptible to cyber-attacks. An MFA’s algorithm can be hacked, and the data and analytical framework can be altered in order to change the government’s conclusions and decisions about the next course of action.
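Riordan’s point that a machine-learning algorithm inherits bias from its training data can be illustrated with a deliberately trivial sketch. The “learning rule” below (predict the most common label seen) and the two datasets are hypothetical; the point is that the identical algorithm reaches opposite conclusions depending on how the data fed into it was sampled.

```python
from collections import Counter

def majority_label(training_data):
    """A trivially simple 'learning' rule: predict the most common label seen
    in the training data. The rule itself never changes between runs."""
    return Counter(label for _, label in training_data).most_common(1)[0][0]

# Two datasets describing the same kind of event, sampled differently.
balanced = [("event", "cooperate"), ("event", "escalate"), ("event", "cooperate")]
skewed   = [("event", "escalate"), ("event", "escalate"), ("event", "cooperate")]

print(majority_label(balanced))  # cooperate
print(majority_label(skewed))    # escalate
```

The same mechanism, at far greater scale and opacity, is why Riordan warns that users of algorithmic analysis may not see how a recommendation was shaped by the data behind it.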
Algorithms make information warfare easier and public diplomacy harder because they feed or reinforce echo chambers and filter bubbles, Riordan underlined. In information warfare, the agents want to fragment social discourse in liberal democracies, polarise societies, and undermine the narratives of the government and its ability to deliver on policy decisions. This is achieved by fragmenting the message, after which algorithms ensure each fragment reaches the echo chamber that agrees with it. Public diplomacy, however, aims to influence the entire society, which is why it cannot rely on social media alone, Riordan emphasised. In his opinion, diplomats need to engage with internet companies that use algorithms as part of their business model, to gather more information about algorithms and enable the design of policies that can broaden the debate.
Hibbard pointed out that automatic processing of data has been a subject of discourse for the last three decades. Algorithms are a part of everyday life - profiling, assessments, predictions - but the speed of technological change means this is still in its infancy. There is a blurring of roles and responsibilities, which is challenging for all actors, state and non-state, public and private alike. Human rights are the exclusive purview of states, which sign conventions protecting human rights and are legally obliged to ensure that these rights are protected, that they are clearly understood and transmitted, and that states’ behaviour is predictable accordingly. In Hibbard’s opinion, the human rights that are particularly important are political and civil rights: freedom of expression, freedom of assembly, the right to privacy and personal data protection, and the right not to be discriminated against. The challenge here appears in the form of actors who handle information in new ways, Hibbard underlined. The way their algorithms work depends on the data that is put into them and the way that data is crunched by the algorithm itself. Algorithms have an important role in finding and sorting information, but the issue lies in how automated that process is. When content is blocked, either voluntarily by the platform or via a court order, there are challenges in protecting freedom of expression and ensuring diversity and tolerance in a pluralistic information society online. Therein lies the concern that echo chambers are created, reducing access to information, polarising opinions, and perhaps lessening diversity. While profiling and Big Data are acceptable practices, once the data becomes personal, the question of ensuring the protection of that personal data arises.
Algorithms may be the most neutral way of collecting and analysing data, but whether they are fair and balanced remains debatable, in Hibbard’s opinion. The more autonomy the algorithms have, the less human agency is in play, which means less discretion. This could lead to reinforced bias and issues of discrimination stemming from the bias in the data that is fed into the algorithms. With machine learning and AI, less human agency is in play, which makes humans feel like they are losing control over their data. We have not really mastered algorithms and what they can be used for; their misuse and abuse exist, but it is unclear whose responsibility it is to protect against them, Hibbard emphasised.
Riordan expressed uncertainty as to how algorithms could help in the case of two hostile sides exchanging insults. What is new is the way President Trump uses Twitter, but it is anarchic in nature and undermines the conduct of American foreign policy, in Riordan’s opinion. He also opined that this is a bad case study that cannot produce valuable lessons for diplomacy.
In Riordan’s opinion, diplomats first have to understand algorithms: how social media works, why algorithms limit what social media can do, and why algorithms make information warfare so much easier. Secondly, diplomats must have more technical knowledge, such as knowledge of search engine optimisation (SEO), which is essential in trying to get a message across. Riordan’s third point was that if diplomats are going to use algorithms for foreign policy analysis, they need to understand that algorithms are not objective, that there are limits to how valuable they are, and that they must be willing to question them. Fourth, diplomats must be more imaginative about the use of online platforms for scenario building, simulation exercises, and audience targeting, in order to successfully combat information warfare.
Hibbard opined that there must be reflection about the role of states in the online space, which is a space with no boundaries. Diplomats must understand they are dealing with a cross-border phenomenon which has its own belief system. The established order of things differs between diplomats and the internet technology sector; internet governance principles regarding processes on the Internet are different from what diplomats may have learnt. While the multistakeholder approach to internet governance has been discussed frequently, Hibbard opined that we have not yet found an effective working model for internet governance in which the roles and responsibilities of different actors are respected. Diplomats need to understand new realities. They must develop the ability to communicate and collaborate with non-state actors and understand their belief system, as these actors can affect the role of diplomats in helping to ensure the protection of human rights online.
Riordan remarked that while the culture of the technology sector is difficult for diplomats and governments to understand, the technology sector finds geopolitics hard to understand and clashes against it. The internet does have boundaries, such as China’s Great Firewall, and governments are trying to impose themselves on the Internet, pursuing the idea of internet sovereignty. Both sides need to learn here, Riordan underlined.
Riordan identifies as a problem the use of off-the-shelf software and apps by diplomats, which come with the limitations of their built-in algorithms and are designed for other, commercial purposes. Diplomats and programmers should collaborate to tailor-design the tools that diplomats will use. Riordan also stated that diplomats should influence the design of future tools and understand the implications of where these technologies are going.
Riordan noted that diplomacy can promote peace but it can also promote warfare, depending on the national interests of the state. Riordan stated his belief that the same will apply in cyberspace, where diplomats will also pursue the national interests of their state. Some of these interests may be peaceful, but some countries are already taking advantage of algorithms to undermine Western society and its coherence. In Riordan’s opinion, diplomats should engage with that.
Diplomats and internet companies should collaborate to ensure the protection of human rights, a range of which should be agreed upon, Hibbard underlined. Governments must work with companies to ensure their algorithms respect the balance of human rights - both in rights and in limitations where necessary - and that bias is removed from the algorithms. Algorithms, however autonomous, should be accompanied by humans, who will validate and certify their results.
Riordan warned that should Big Data analysis replace human analysts altogether, humans will lose control of policy-making processes because they will not understand the black boxes generating the foreign policy analysis. At the very least, human analysis and human diplomacy will have to accompany the use of algorithms, in order to understand intentions and motivations correctly through face-to-face diplomacy and avoid the kinds of misunderstandings that lead to conflict.
Höne reflected that the questions for the future will be: promoting collaboration between diplomats on the one hand and programmers and technologists on the other at various levels in MFAs; promoting public-private partnerships in how tech is designed and used; and including internet companies in this dialogue while bringing transparency and cooperation into the policy realm.
Riordan concluded that a way must be found to bring diplomats and technology specialists together, so that diplomats can contribute the skill sets, techniques, and mindsets that can help in solving algorithm problems and in negotiating problems in cyberspace. Technology companies must engage with governments, and they can be influenced by governments that have market power or can leverage the attractiveness of their markets to impose legislation on companies, though this could create a complicated situation in which companies would have to choose between competing regulations. The second way of influencing companies, according to Riordan, is that they do not want information warfare carried out on their platforms, as it makes them unattractive to marketers. Finally, it is not easy to question the biases and prejudices of an algorithmic black box, whereas humans know that human judgement is biased and take that into account in the process.
Hibbard concluded that more reflection is needed regarding the autonomous use of machines which collect information and make decisions, and that humans should accompany these processes. The roles and responsibilities of actors need to be discussed. Actors need to grasp the technical aspects, at least enough to understand the main challenges that arise. The question of liability should also be discussed: if something is autonomous and works without any human manipulation, it is important to decide who bears responsibility for its outputs.
Andrijana Gavrilovic works as a junior associate at DiploFoundation and focuses her work on diplomacy and cybersecurity issues. She is a postgraduate student of International Security at the Faculty of Political Science in Belgrade, Serbia.