Influence for hire. The Asia-Pacific’s online shadow economy
What’s the problem?
It’s not just nation-states that interfere in elections and manipulate political discourse. A growing range of commercial services engages in such activities, operating in a shadow online influence-for-hire economy that spans everything from content farms to high-end PR agencies. There’s growing evidence of states drawing on these commercial networks: the Oxford Internet Institute found 48 instances of states working with influence-for-hire firms in 2019–20, up from 21 in 2017–18 and nine in 2016–17.1 There’s a clear distinction between legitimate, disclosed political campaigning and government advertising campaigns, on the one hand, and covert efforts by state actors to manipulate the opinions of domestic populations or the citizens of other countries through inauthentic social media activity, on the other. Covert, inauthentic, outsourced online influence is also problematic because it degrades the quality of the public sphere in which citizens must make informed political choices.
The Asia–Pacific region contains many states in different stages of democratisation.2 Many have transitioned to democratic forms of governance from authoritarian regimes. Some have weak political institutions, limitations on independent media and fragile civil societies. The rapid rate of digital penetration in the region layered over that political context leaves populations vulnerable to online manipulation. In fragile democratic contexts, the prevalence of influence-for-hire operations and their leverage by agents of the state is particularly problematic, given the power imbalance between citizens and the state.
A surplus of cheap digital labour makes the Asia–Pacific a focus for operators in this economy, and this report examines the regional influence-for-hire marketplace using case studies of online manipulation in the Philippines, Indonesia, Taiwan and Australia. Governments and other entities in the region contract such services to target and influence their own populations in ways that aren’t transparent and that may inhibit freedom of political expression by drowning out dissenting voices. Several governments have introduced anti-fake-news legislation that has the potential to inhibit civic discourse by limiting popular political dissent or constraining the independence of the media from the state.3 These trends risk damaging the quality of civic engagement in the region’s emerging democracies.
What’s the solution?
This is a policy problem spanning government, industry and civil society, and solutions must incorporate all of those domains. Furthermore, influence-for-hire services are working in transnational online spaces that cut across legislative jurisdictions. Currently, much of the responsibility for taking action against the covert manipulation of online audiences falls to the social media companies.
It’s the companies that carry the responsibility for enforcement actions, and those actions are primarily framed around the terms of service and content moderation policies that underpin platform use. The platforms themselves are conscious of the growing marketplace for platform-manipulation services. Facebook, for example, notes this trend in its strategic threat report, The state of influence operations 2017–2020.4
Solutions must involve responsibility and transparency in how governments engage with their citizens.
The use of online advertising in political campaigning is distinct from the covert manipulation of a domestic population by a state. However, governments, civil society and industry have shared interests in an open information environment and can find alignment on the democratic values that support free—and unmanipulated—political expression. Support for democratic forms of governance remains strong in the Asia–Pacific region,5 albeit with degrees of concern about the destabilising potential of digitally mediated forms of political mobilisation and a trend towards democratic backsliding over the last decade that is constraining the space for civil society.6
The technology industry, civil society and governments should make that alignment of values the bedrock of a productive working relationship. Structures bringing these stakeholders together should reframe those relationships—which are at times adversarial—in order to find common ground. There will be no one-size-fits-all solution, given the region’s cultural diversity. Yet the Asia–Pacific contains many rapidly emerging economies that can contribute to the digital economy in creative ways. The spirit of digital entrepreneurship that drives content farm operations should be reshaped through stakeholder partnerships and engagement into more productive forms of digital labour that can contribute to a creative, diverse and distinct digital economy.
Introduction
It is already well known that the Kremlin’s covert interference in the 2016 US presidential election was outsourced to the now infamous Internet Research Agency.7
ASPI’s investigations of at-scale manipulation of the information environment by other significant state actors have also identified the use of marketing and spam networks to obfuscate state actor involvement. For example, ASPI has previously identified the use of Indonesian spam marketing networks in information operations attributed to the Chinese Government and targeting the Hong Kong protest movement in 2019.8 In 2020, ASPI also discovered the Chinese Government’s repurposing of Russian and Bangladeshi social media accounts to denigrate the movement.9 Those accounts were likely to have been hacked, stolen or on-sold in the influence-for-hire shadow economy. In May 2021, Facebook suspended networks of influence-for-hire activity run from Ukraine targeting domestic audiences and linked to individuals previously sanctioned by the US Department of the Treasury for attempted interference in the 2020 US presidential election.10
Audience engagement with, and heightened sentiment about, civic events create new business models for those motivated to influence. Australia’s 2019 federal election was targeted by financially motivated actors from Albania, Kosovo and North Macedonia.11 Those operators built large Facebook groups, used inflammatory nationalistic and Islamophobic content to drive engagement, and seeded the groups with links to off-platform content-farm websites. Each click-through from a Facebook group to the content-farm ecosystem generated advertising revenue for those running the operation. A similar operation run from Israel used the same tactics to build audiences on Facebook, again manipulating and monetising nationalistic and Islamophobic sentiment to steer audiences towards an ad-revenue-generating content-farm ecosystem of news-style websites.12 Mehreen Faruqi, Australia’s first female Muslim senator, was a target of racist vitriol among the 546,000 followers of the 10 Facebook pages in that network. These financially motivated actors demonstrate that even well-established democracies are vulnerable to manipulation through the exploitation of fissures in their social cohesion.
This report examines the influence-for-hire marketplace across the Asia–Pacific in five chapters, through case studies of online manipulation in the Philippines, Indonesia, Taiwan and Australia, and concludes with policy recommendations (pages 36–37). The authors explore the business models that support and sustain the marketplace for influence and the services that influence operators offer.
Those services are increasingly integrated into political campaigning, yet the report highlights that those same approaches are being used by states in the region to influence their domestic populations in ways that aren’t transparent and that constrict and constrain political expression. In some instances, states in the region are using commercial services as proxies to covertly influence targeted international audiences.
Download full report
The sections above are the report’s introduction only. Readers are encouraged to download the full report, which includes the full case studies and references.
Editor and project manager: Dr Jacob Wallis is Head of Program, Information Operations and Disinformation at ASPI’s International Cyber Policy Centre.
About the authors:
- Ariel Bogle is an Analyst at ASPI’s International Cyber Policy Centre.
- Albert Zhang is a Researcher at ASPI’s International Cyber Policy Centre.
- Hillary Mansour is a Research Intern at ASPI’s International Cyber Policy Centre.
- Tim Niven is a Research Scientist at Taiwan-based DoubleThink Lab.
- Elena Yi-Ching Ho was a Research Intern at ASPI’s International Cyber Policy Centre.
- Jason Liu is a Taiwan-based investigative journalist.
- Dr Jonathan Corpus Ong is Associate Professor, University of Massachusetts-Amherst and Shorenstein Center Fellow, Technology and Social Change Project, Harvard Kennedy School.
- Dr Ross Tapsell is Senior Lecturer at the College of Asia & the Pacific at Australian National University.
Acknowledgements
Thank you to Danielle Cave and Fergus Hanson for all of their work on this project. Thanks also to peer reviewers inside ASPI, including Michael Shoebridge, and to external, anonymous peer reviewers for their useful feedback on drafts of the report. Facebook Inc. provided ASPI with a grant of AU$100,000, which was put towards this report. The views reflected in the report are those of the authors only. Additional research costs were covered from ASPI ICPC’s mixed revenue base. The work of ASPI ICPC would not be possible without the support of our partners and sponsors across governments, industry and civil society.
What is ASPI?
The Australian Strategic Policy Institute was formed in 2001 as an independent, non‑partisan think tank. Its core aim is to provide the Australian Government with fresh ideas on Australia’s defence, security and strategic policy choices. ASPI is responsible for informing the public on a range of strategic issues, generating new thinking for government and harnessing strategic thinking internationally. ASPI’s sources of funding are identified in our annual report, online at www.aspi.org.au and in the acknowledgements section of individual publications. ASPI remains independent in the content of the research and in all editorial judgements.
ASPI International Cyber Policy Centre
ASPI’s International Cyber Policy Centre (ICPC) is a leading voice in global debates on cyber, emerging and critical technologies, and issues related to information and foreign interference, and focuses on the impact those issues have on broader strategic policy. The centre has a growing mixture of expertise and skills, with teams of researchers who concentrate on policy, technical analysis, information operations and disinformation, critical and emerging technologies, cyber capacity building, satellite analysis, surveillance and China-related issues.
The ICPC informs public debate in the Indo-Pacific region and supports public policy development by producing original, empirical, data-driven research. The ICPC enriches regional debates by collaborating with research institutes from around the world and by bringing leading global experts to Australia, including through fellowships. To develop capability in Australia and across the Indo-Pacific region, the ICPC has a capacity building team that conducts workshops, training programs and large-scale exercises for the public and private sectors.
We would like to thank all of those who support and contribute to the ICPC with their time, intellect and passion for the topics we work on. If you would like to support the work of the centre please contact: icpc@aspi.org.au
Important disclaimer
This publication is designed to provide accurate and authoritative information in relation to the subject matter covered. It is provided with the understanding that the publisher is not engaged in rendering any form of professional or other advice or services. No person should rely on the contents of this publication without first obtaining advice from a qualified professional.
© The Australian Strategic Policy Institute Limited 2021
This publication is subject to copyright. Except as permitted under the Copyright Act 1968, no part of it may in any form or by any means (electronic, mechanical, microcopying, photocopying, recording or otherwise) be reproduced, stored in a retrieval system or transmitted without prior written permission. Enquiries should be addressed to the publishers.
Notwithstanding the above, educational institutions (including schools, independent colleges, universities and TAFEs) are granted permission to make copies of copyrighted works strictly for educational purposes without explicit permission from ASPI and free of charge.
First published August 2021. ISSN 2209-9689 (online), ISSN 2209-9670 (print).
Cover image: Illustration by Wes Mountain. ASPI ICPC and Wes Mountain allow this image to be republished under the Creative Commons License Attribution-Share Alike. Users of the image should use the following sentence for image attribution: ‘Illustration by Wes Mountain, commissioned by the Australian Strategic Policy Institute’s International Cyber Policy Centre.’
Funding statement: This report was in part funded by Facebook Inc.
- Samantha Bradshaw, Hannah Bailey, Philip N Howard, Industrialized disinformation: 2020 global inventory of organized social media manipulation, Computational Propaganda Research Project, 2020, online. ↩︎
- Lindsey W Ford, Ryan Hass, Democracy in Asia, Brookings Institution, 22 January 2021, online. ↩︎
- Andrea Carson, Liam Fallon, Fighting fake news: a study of online misinformation regulation in the Asia Pacific, La Trobe University, January 2021, online. ↩︎
- Threat report: the state of influence operations 2017–2020, Facebook, May 2021, online. ↩︎
- Lindsey W Ford, Ryan Hass, Democracy in Asia, Brookings Institution, 22 January 2021, online. ↩︎
- V-Dem Institute, Democracy report 2021: Autocratization turns viral, 2021, online. ↩︎
- US Department of Justice, Internet Research Agency indictment, US Government, 2018, online. ↩︎
- T Uren, E Thomas, J Wallis, Tweeting through the Great Firewall: preliminary analysis of PRC-linked information operations on the Hong Kong protests, ASPI, Canberra, 3 September 2019, online. ↩︎
- J Wallis, T Uren, E Thomas, A Zhang, S Hoffman, L Li, A Pascoe, D Cave, Retweeting through the Great Firewall: a persistent and undeterred threat actor, ASPI, Canberra, 12 June 2020, online. ↩︎
- Facebook, April 2021 coordinated inauthentic behaviour report, 2021, online. ↩︎
- M Workman, S Hutcheon, ‘Facebook trolls and scammers from Kosovo are manipulating Australian users’, ABC News, 15 March 2019, online. ↩︎
- C Knaus, M McGowan, M Evershed, O Holmes, ‘Inside the hate factory: how Facebook fuels far-right profit’, The Guardian, 6 December 2019, online. ↩︎