The CCP’s increasingly sophisticated cyber-enabled influence operations
What’s the problem?
The Chinese Communist Party’s (CCP’s) embrace of large-scale online influence operations and spreading of disinformation on Western social-media platforms has escalated since the first major attribution from Silicon Valley companies in 2019. While Chinese public diplomacy may have shifted to a softer tone in 2023 after many years of wolf-warrior online rhetoric, the Chinese Government continues to conduct global covert cyber-enabled influence operations. Those operations are now more frequent, increasingly sophisticated and increasingly effective in supporting the CCP’s strategic goals. They focus on disrupting the domestic, foreign, security and defence policies of foreign countries, and most of all they target democracies.
Currently—in targeted democracies—most political leaders, policymakers, businesses, civil society groups and publics have little understanding of how the CCP engages in clandestine activities online in their countries, even though this activity is escalating and evolving quickly. The stakes are high for democracies, given the indispensability of the internet and their reliance on open online spaces, free from interference. Despite years of monitoring covert CCP cyber-enabled influence operations by social-media platforms, governments, and research institutes such as ASPI, definitive public attribution of the actors driving these activities is rare. Covert online operations, by design, are difficult to detect and attribute to state actors.
Social-media platforms and governments struggle to devote adequate resources to identifying, preventing and deterring increasing levels of malicious activity, and sometimes they don’t want to name and shame the Chinese Government for political, economic and/or commercial reasons.
But when possible, public attribution can play a larger role in deterring malicious actors. Understanding which Chinese Government entities are conducting such operations, and their underlying doctrine, is essential to constructing adequate counter-interference and deterrence strategies. The value of public attribution also goes beyond deterrence. For example, public attribution helps civil society and businesses, which are often the intended targets of online influence operations, to understand the threat landscape and build resilience against malicious activities. It’s also important that general publics are given basic information so that they’re informed about the contemporary security challenges a country is facing, and public attribution helps to provide that information.
ASPI research in this report—which included specialised data collection spanning Twitter, Facebook, Reddit, Sina Weibo and ByteDance products—reveals a previously unreported CCP cyber-enabled influence operation linked to the Spamouflage network, which is using inauthentic accounts to spread claims that the US is irresponsibly conducting cyber-espionage operations against China and other countries. As a part of this research, we geolocated some of the operators of that network to Yancheng in Jiangsu Province, and we show it’s possible that at least some of the operators behind Spamouflage are part of the Yancheng Public Security Bureau.
The CCP’s clandestine efforts to influence international public opinion rely on a very different toolkit today compared to its previous tactics of just a few years ago. CCP cyber-enabled influence operations remain part of a broader strategy to shape global public opinion and enhance China’s ‘international discourse power’. Those efforts have evolved to nudge public opinion towards positions more favourable to the CCP and to interfere in the political decision-making processes of other countries. A greater focus on covert social-media accounts allows the CCP to pursue its interests while providing a plausibly deniable cover.
Emerging technologies and China’s indigenous cybersecurity industry are also creating new capabilities for the CCP to continue operating clandestinely on Western social platforms.
Left unaddressed, the CCP’s increasing investment in cyber-enabled influence operations threatens to successfully influence the economic decision-making of political elites, destabilise social cohesion during times of crisis, sow distrust of leaders or democratic institutions and processes, fracture alliances and partnerships, and deter journalists, researchers and activists from sharing accurate information about China.
What’s the solution?
This report provides the first public empirical review of the CCP’s clandestine online networks on social-media platforms.
We outline seven key policy recommendations for governments and social-media platforms (further details are on page 39):
Social-media platforms should take advantage of the digital infrastructure they control—including the account analytics they provide—to more effectively deter cyber-enabled influence operations. To disrupt future influence operations, social-media platforms could remove access to analytics for suspicious accounts breaching platform policies, making it difficult for identified malicious actors to measure the effectiveness of their campaigns.
Social-media platforms should pursue more innovative information-sharing to combat cyber-enabled influence operations. For example, social-media platforms could share more information about the digital infrastructure involved in influence operations, without revealing personally identifiable information.
Governments should change their language in speeches and policy documents to describe social-media platforms as critical infrastructure. This would acknowledge the existing importance of those platforms in democracies and would communicate signals to malicious actors that, like cyber operations on the power grid, efforts to interfere in the information ecosystem will be met with proportionate responses.
Governments should review foreign interference legislation and consider mandating that social-media platforms disclose state-backed influence operations and other transparency reporting to increase the public’s threat awareness.
Public diplomacy should be a pillar of any counter-malign-influence strategy. Government leaders and diplomats should name and shame attributable malign cyber-enabled influence operations, and the entities (state and non-state) involved in running them, to deter those activities.
Partners and allies should strengthen intelligence diplomacy on this emerging security challenge and seek to share more intelligence with one another on such influence operations. Strong open-source intelligence skills and collection capabilities are a crucial part of investigating and attributing these operations, and the low classification of such material should make intelligence sharing easier.
Governments should support further research on influence operations and other hybrid threats. To build broader situational awareness of hybrid threats across the region, including malign influence operations, democracies should establish an Indo-Pacific hybrid threats centre.
Key findings
The CCP has developed a sophisticated, persistent capability to sustain coordinated networks of personas on social-media platforms to spread disinformation, wage public-opinion warfare and support its own diplomatic messaging, economic coercion and other levers of state power.
That capability is evolving and has expanded to push a wider range of narratives to a growing international audience with the Indo-Pacific a key target.
The CCP has used these cyber-enabled influence operations to seek to interfere in US politics, Australian politics and national security decisions, undermine the Quad and Japanese defence policies and impose costs on Australian and North American rare-earth mining companies.
CCP cyber-enabled influence operations are probably conducted, in parallel if not collectively, by multiple Chinese party-state agencies. Those agencies appear at times to collaborate with private Chinese companies. The most notable actors that are likely to be conducting such operations include the People’s Liberation Army’s Strategic Support Force (PLASSF), which conducts cyber operations as part of the PLA’s political warfare; the Ministry of State Security (MSS), which conducts covert operations for state security; the Central Propaganda Department, which oversees China’s domestic and foreign propaganda efforts; the Ministry of Public Security (MPS), which enforces China’s internet laws; and the Cyberspace Administration of China (CAC), which regulates China’s internet ecosystem. Chinese state media outlets and Ministry of Foreign Affairs (MFA) officials are also running clandestine operations that seek to amplify their own overt propaganda and influence activities.
Starting in 2021, a previously unreported CCP cyber-enabled influence operation has been disseminating narratives that the CIA and National Security Agency are ‘irresponsibly conducting cyber-espionage operations against China and other countries’. ASPI isn’t in a position to verify US intelligence agency activities. However, the means used to disseminate the counter-US narrative—the campaign appears to be partly driven by the pro-CCP coordinated inauthentic network known as Spamouflage—strongly suggest an influence operation. ASPI’s research suggests that at least some operators behind the campaign are affiliated with the MPS, or are ‘internet commentators’ hired by the CAC, which may have named this campaign ‘Operation Honey Badger’. The evidence indicates that the Chinese Government probably intended to influence Southeast Asian markets and other countries involved in the Belt and Road Initiative to support the expansion of Chinese cybersecurity companies in those regions.
Chinese cybersecurity company Qi An Xin (奇安信) appears at times to be supporting the influence operation. The company has the capacity to seed disinformation about advanced persistent threats to its clients in Southeast Asia and other countries. It’s deeply connected with Chinese intelligence, military and security services and plays an important role in China’s cybersecurity and state security strategies.
Gaming Public Opinion (26 April 2023)
How the CCP is influencing the Pacific islands information environment
What’s the problem?
The Chinese Communist Party (CCP) is conducting coordinated information operations in Pacific island countries (PICs). Those operations are designed to influence political elites, public discourse and political sentiment regarding existing partnerships with Western democracies. Our research shows how the CCP frequently seeks to capitalise on regional events, announcements and engagements to push its own narratives, many of which are aimed at undermining some of the region’s key partnerships.
This report examines three significant events and developments:
the establishment of AUKUS in 2021
the CCP’s recent efforts to sign a region-wide security agreement
the 2022 Pacific Islands Forum held in Fiji.
This research, including these three case studies, shows how the CCP uses tailored, reactive messaging in response to regional events and analyses the effectiveness of that messaging in shifting public discourse online.
This report also highlights a series of information channels used by the CCP to push narratives in support of the party’s regional objectives in the Pacific. Those information channels include Chinese state media, CCP publications and statements in local media, and publications by local journalists connected to CCP-linked groups.1
There’s growing recognition of the information operations and misinformation and disinformation being spread globally under the CCP’s directives. Although the CCP’s information operations have had little demonstrated effectiveness in shifting online public sentiment in the case studies examined in this report, they’ve previously proven to be effective in influencing public discourse and political elites in the Pacific.2 Analysing the long-term impact of these operations, so that informed policy decisions can be made by governments and by social media platforms, requires greater measurement and understanding of current operations and local sentiment.
What’s the solution?
The CCP’s presence in the information environment is expanding across the Pacific through online and social media platforms, local and China-based training opportunities, and greater television and short-wave radio programming.3 However, the impact of this growing footprint in the information environment remains largely unexplored and unaddressed by policymakers in the Pacific and in the partner countries that are frequently targeted by the CCP’s information operations.
Pacific partners, including Australia, the US, New Zealand, Japan, the UK and the European Union, need to enhance partnerships with Pacific island media outlets and online news forum managers in order to build a stronger, more resilient media industry that will be less vulnerable to disinformation and pressures exerted by the CCP. This includes further assistance in hiring, training and retaining high-quality professional journalists and media executives and providing financial support without conditions to uphold media freedom in the Pacific. Training should be offered to support online discussion forum managers sharing news content to counter the spread of disinformation and misinformation in public online groups. The data analysis in this report highlights a need for policymakers and platforms to invest more resources in countering CCP information operations in Melanesia, which is shown to have greater susceptibility to those operations.
As part of their targeted training package, Pacific island media and security institutions, such as the Pacific Fusion Centre, should receive further training on identifying disinformation and coordinated information operations to help build media resiliency. For that training to be effective, governments should fund additional research into the actors and activities affecting the Pacific islands information environment, including climate-change and election disinformation and misinformation, and foreign influence activities.
Information sharing among PICs’ media institutions would build greater regional understanding of CCP influence in the information environment and other online harms and malign activity. ASPI has also previously proposed that an Indo-Pacific hybrid threats centre would help regional governments, businesses and civil society better understand and counter those threats.4
Pacific partners, particularly Australia and the US, need to be more effective and transparent in communicating how aid delivered to the region is benefiting PICs and building people-to-people links. Locally based diplomats need to work more closely with Pacific media to contextualise information from press releases and statements and give PIC audiences a better understanding of the benefits delivered by Western governments’ assistance. This includes greater transparency on the provision of aid in the region. Doing so will debunk some of the CCP’s narratives regarding Western support and legitimacy in the region.
A number of local journalists and media contributors have connections to CCP-linked entities, such as Pacific friendship associations. The connections between friendship associations and CCP influence are described in Anne-Marie Brady, ‘Australia and its partners must bring the Pacific into the fold on Chinese interference’, The Strategist, 21 April 2022. ↩︎
Blake Johnson, Miah Hammond-Errey, Daria Impiombato, Albert Zhang, Joshua Dunne, Suppressing the truth and spreading lies: how the CCP is influencing Solomon Islands’ information environment, ASPI, Canberra. ↩︎
Richard Herr, Chinese influence in the Pacific islands: the yin and yang of soft power, ASPI, Canberra, 30 April 2019, online; Denghua Zhang, Amanda Watson, ‘China’s media strategy in the Pacific’, In Brief 2020/29, Department of Pacific Affairs, Australian National University, 26 March 2021, online; Dorothy Wickham, ‘The lesson from my trip to China? Solomon Islands not ready to deal with the giant’, The Guardian, 23 December 2019. ↩︎
Lesley Seebeck, Emily Williams, Jacob Wallis, Countering the Hydra: a proposal for an Indo-Pacific hybrid threat centre, ASPI, Canberra, 7 June 2022. ↩︎
Seeking to undermine democracy and partnerships (7 March 2023)
This report explores how the Chinese party-state’s globally focused propaganda and disinformation capabilities are evolving and increasing in sophistication. Concerningly, this emerging approach by the Chinese party-state to influence international discourse on China, including obfuscating its record of human rights violations, is largely flying under the radar of US social media platforms and Western policymakers.
In the broader context of attempts by the Chinese Communist Party (CCP) to censor speech, promote disinformation and seed the internet with its preferred narratives, we focus on a small but increasingly popular set of YouTube accounts that feature mainly female China-based ethnic-minority influencers from the troubled frontier regions of Xinjiang, Tibet and Inner Mongolia, hereafter referred to as ‘frontier influencers’ or ‘frontier accounts’.
Despite being blocked in China, YouTube is seen by the CCP as a key battlefield in its ideological contestation with the outside world, and YouTube’s use in foreign-facing propaganda efforts has intensified in recent years. Originally deployed on domestic video-sharing platforms to meet an internal propaganda need, frontier-influencer content has since been redirected towards global audiences on YouTube as part of the CCP’s evolving efforts to counter criticisms of China’s human rights problems and burnish the country’s image.
Alongside party-state media and foreign vloggers, these carefully vetted domestic vloggers are increasingly seen as another key part of Beijing’s external propaganda arsenal. Their use of a more personal style of communication and softer presentation is expected to be more convincing than traditional party-state media content, which is often inclined towards the more rigid and didactic. For the CCP, frontier influencers represent, in the words of one Chinese propaganda expert, ‘guerrillas or militia’ fighting on the flanks in ‘the international arena of public opinion’, while party-state media or the ‘regular army’ ‘charge, kill and advance on the frontlines’.
The frontier accounts we examine in this report were predominantly created in 2020–21 and feature content that closely hews to CCP narratives, but their less polished presentation has a more authentic feel that conveys a false sense of legitimacy and transparency about China’s frontier regions that party-state media struggle to achieve. For viewers, the video content appears to be the creation of the individual influencers, but is in fact what’s referred to in China as ‘professional user generated content’, or content that’s produced with the help of special influencer-management agencies known as multi-channel networks (MCNs).
For the mostly young and female Uyghur, Tibetan and other ethnic-minority influencers we examine in this report, having such an active presence on a Western social media platform is highly unusual, and ordinarily would be fraught with danger. But, as we reveal, frontier influencers are carefully vetted and considered politically reliable. The content they create is tightly circumscribed via self-censorship and oversight from their MCNs and domestic video platforms before being published on YouTube. In one key case study, we show how frontier influencers’ content was directly commissioned by the Chinese party-state.
Because YouTube is blocked in China, individual influencers based in the country aren’t able to receive advertising revenue through the platform’s Partner Program, which isn’t available there. But, through their arrangements with YouTube, MCNs have been able to monetise content for frontier influencers, as well as for hundreds of other China-based influencers on the platform. Given that many of the MCNs have publicly committed to promote CCP propaganda, this arrangement results in a troubling situation in which MCNs are able to monetise their activities, including the promotion of disinformation, via their access to YouTube’s platform.
The use of professionally supported frontier influencers also appears to be aimed at ensuring that state-backed content ranks well in search results because search-engine algorithms tend to prioritise fresh content and channels that post regularly. From the CCP’s perspective, the continuous flooding of content by party-state media, foreign influencers and professionally supported frontier influencers onto YouTube is aimed at outperforming other more critical but stale content.
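The freshness advantage described above can be illustrated with a toy ranking function. This is a hedged sketch under an assumed exponential-decay model, not YouTube's actual (proprietary) algorithm: when recency is weighted heavily, a channel posting constantly can outrank older, more relevant but ‘stale’ critical content.

```python
import math

def search_score(relevance: float, age_days: float, half_life_days: float = 7.0) -> float:
    """Toy search score: relevance discounted by exponential freshness decay.

    Illustrative model only; the relevance values and 7-day half-life are
    invented for this example, and real platform ranking is far more complex.
    """
    freshness = math.exp(-math.log(2) * age_days / half_life_days)
    return relevance * freshness

# A highly relevant but year-old critical report...
stale_critical = search_score(relevance=0.9, age_days=365)
# ...versus a less relevant influencer video posted yesterday.
fresh_influencer = search_score(relevance=0.4, age_days=1)

# Under this model, the constantly refreshed content wins.
assert fresh_influencer > stale_critical
```

The design point is that any ranking function with a strong recency term rewards sheer posting volume, which is exactly the property a state-backed content flood exploits.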
This new phenomenon reflects a continued willingness, identified in previous ASPI ICPC reports,1 by the Chinese party-state to experiment in its approach to shaping online political discourse, particularly on those topics that have the potential to disrupt its strategic objectives. By targeting online audiences on YouTube through intermediary accounts managed by MCNs, the CCP can hide its affiliation with those influencers and create the appearance of ‘independent’ and ‘authoritative’ voices supporting its narratives, including disinformation that it’s seeking to propagate globally.
This report (on page 42) makes a series of policy recommendations, including that social media platforms shouldn’t allow MCNs who are conducting propaganda and disinformation work on behalf of the Chinese party-state to monetise their activities or be recognised by the platforms as, for example, official partners or award winners. This report also recommends that social media platforms broaden their practice of labelling the accounts of state media, agencies and officials to include state-linked influencers from the People’s Republic of China.
Fergus Ryan, Ariel Bogle, Nathan Ruser, Albert Zhang, Daria Impiombato, Borrowing mouths to speak on Xinjiang, ASPI, Canberra, 7 December 2021; Fergus Ryan, Ariel Bogle, Albert Zhang, Jacob Wallis, #StopXinjiang Rumors: the CCP’s decentralised disinformation campaign, ASPI, Canberra, 2 December 2021, https://www.aspi.org.au/report/stop-xinjiang-rumors. ↩︎
Frontier influencers: the new face of China’s propaganda (20 October 2022)
The rapid escalation, in late September 2020, of the long-running conflict between Azerbaijan and Armenia has been shadowed by a battle across social media for control of the international narrative about the conflict. On Twitter, large numbers of accounts supporting both sides have been wading in on politicised hashtags linked to the conflict. Our findings indicate large-scale coordinated activity. While much of this behaviour is likely to be authentic, our analysis has also found a significant amount of suspicious and potentially inauthentic behaviour.
The goal of this research piece is to observe and document some of the early dynamics of the information battle playing out in parallel to the conflict on the ground and create a basis for further, more comprehensive research. This report is in no way intended to undermine the legitimacy of authentic social media conversations and debate taking place on all sides of the conflict.
Snapshot of a shadow war (8 October 2020)
How foreign affairs and defence agencies use Facebook
What’s the problem?
For defence and diplomacy agencies, digital media—and social media specifically—have become an unavoidable aspect of operations, communications and strategic international engagement, but governments don’t always understand or appreciate how those media are used.
While the Department of Foreign Affairs and Trade (DFAT) and the Department of Defence (DoD) both use social media, including accounts managed by diplomatic posts overseas and by units of the ADF, both departments can improve how they reach and engage online. It’s important to note, however, that their use cases and audiences are different. DFAT’s audience is primarily international and varies by geographical location. Defence has a more local audience and focus.
Online engagement depends less on the content itself than on the strength of the ties between the senders or sharers of content and its recipients. For both departments, improving those online ties is vital as they seek to influence.
What’s the solution?
The Australian Government should use social media far more strategically to engage international audiences—particularly in the diplomatic and defence portfolios. Both DFAT and Defence should review outdated digital strategies, cross-promote more content and demonstrate transparency and accountability by articulating and publishing social media policies.
Both departments should create more opportunities for training and for sharing the skills and experiences of public diplomacy staff. They should also refrain from relying solely on engagement metrics as measures of success (that is, as a measure of the ability or achievement of individuals, usually senior staff or heads of mission).
Instead, by changing the emphasis from the producers of social media content to the audiences that interact with it, the engagement data can be usefully regarded as a proxy for attention and interest. This can tell us what kinds of audiences (mostly by location) are engaged, and what types of content they do and don’t engage with. This information indicates the (limited) utility of social media; this should guide online engagement policy.
This report also highlights and recognises the value of social media for the defence community—especially as a means of providing information and support for currently serving personnel and their families—by supporting the use of Facebook for those purposes by all defence units.
DFAT should remove the direction for all Australian heads of mission overseas to be active on social media. While this presence is indeed useful and boosts the number of global government accounts, if our ambassadors aren’t interested in resourcing those accounts, the result can be sterile social media accounts that don’t engage and that struggle to connect with publics online. Instead, both departments should encourage those who are interested in and skilled at digital diplomacy to use openness, warmth and personality to engage.
Introduction: the global rise of Facebook
This report examines DFAT’s and the DoD’s use of one social media platform—Facebook—and evaluates current practices to identify how, where and for what purposes Facebook has impact.
The focus on Facebook reflects the platform’s global reach and its popularity as an everyday, essential medium for accessing and sharing information. With notable exceptions (such as China), Facebook is so popular in many places (such as some Southeast Asian countries) that it’s often roughly synonymous with ‘the internet’. This is a symptom of the platform’s ubiquity and utility as well as a consequence of Facebook’s heavily promoted services, including the Free Basics internet access service, which provides limited online access via a Facebook application.1
In order to generate lessons learnt, this report makes comparisons between Australian Government pages and their counterparts in the US, the UK, New Zealand and Canada. The analysis of Facebook use for diplomatic purposes is based on 2016–17 data extracted from Facebook pages of the diplomatic missions of eight ‘publisher’ nations (the five that are the subject of this report, as well as India, Israel and Japan) in 23 ‘host nations’.2 More recent data couldn’t be used because access is no longer available, but a review of the pages suggests that the analysis stemming from the data extracted during that period remains relevant.
The underlying design of Facebook deeply influences and limits its use by publishers and users. The Facebook newsfeed—the most commonly used feature for getting regularly updated information—prioritises posts from accounts that are either closely associated through a history of user activity, including liking, sharing, commenting and messaging, or are boosted through paid promotion.
One of the main consequences is that the more a Facebook user interacts with content that they prefer, the more likely they are to receive that type of material in their newsfeeds, which they’re in turn more likely to interact with and so on. Successful content has emotional appeal, or is useful, and comes from a Facebook page that’s been frequented by the user or been shared with a close member of a user’s Facebook network of friends. As this cycle continues, Facebook ‘gets to know its users better and better’.3
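The self-reinforcing cycle described above can be sketched as a minimal simulation. This is an illustrative model with invented page names and weights, not Facebook's actual ranking: each interaction raises a user's affinity for a page, which raises that page's position in the feed, which in turn invites further interaction.

```python
def rank_feed(affinity: dict, posts: list) -> list:
    """Order posts by the user's accumulated affinity for each page."""
    return sorted(posts, key=lambda page: affinity.get(page, 0.0), reverse=True)

def interact(affinity: dict, page: str, weight: float = 1.0) -> None:
    """Liking, sharing or commenting increases affinity for that page."""
    affinity[page] = affinity.get(page, 0.0) + weight

affinity = {}
pages = ["embassy_page", "news_page", "friend_page"]

# The user repeatedly engages with one page...
for _ in range(3):
    interact(affinity, "embassy_page")
interact(affinity, "news_page")

# ...so that page now dominates the top of the feed,
# making further interaction with it more likely (the feedback loop).
print(rank_feed(affinity, pages))  # ['embassy_page', 'news_page', 'friend_page']
```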
In other words, it isn’t enough to make engaging (meaning fun, compelling or relevant) content. Online engagement is dependent on the strength of the ties between the senders or sharers and the recipients of the content, at least as much and very probably more than the nature of the content. Understanding this is vital for governments as they seek to influence online.
But, as a social media network, Facebook brings with it complications for public diplomacy and defence social media strategies. For example, Facebook’s utility is limited by its underlying algorithm architecture and the habits and preferences of individual Facebook users, which are influenced by in-country patterns of social media usage and internet access. These issues need to be factored into departmental communications policies and social media strategies.
Online content, classified
Facebook posts can be classified into four types, according to their apparent function or purpose: outward-facing publicity (including propaganda), inward-facing publicity, engagement, and diplomacy of the public.4 The categories often overlap: content may be both inward- and outward-facing, for example. An analysis of these four types of content can be very useful for creating a strategy for effective DFAT and DoD Facebook use.
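A first pass at the four-way classification above could be automated with a simple keyword heuristic. This is a toy sketch: the keyword lists are invented for illustration, and a real analysis (as in this report) would rely on manual coding or a trained classifier.

```python
# Hypothetical keyword lists mapping to the four functional categories;
# categories may overlap, so a post can receive more than one tag.
KEYWORDS = {
    "outward_publicity": ["visa", "scholarship", "event", "job"],
    "inward_publicity": ["minister", "budget", "announcement"],
    "engagement": ["comment", "tell us", "what do you think"],
    "diplomacy_of_public": ["petition", "community", "your views"],
}

def classify(post_text: str) -> list:
    """Return every category whose keywords appear in the post text."""
    text = post_text.lower()
    return [category for category, words in KEYWORDS.items()
            if any(word in text for word in words)]

print(classify("Apply now: scholarship event for local students"))
# ['outward_publicity']
```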
1. Outward publicity
Outward-facing publicity is the most common. It’s characterised by its evident target being the broader public of the country in which it’s posted, or a section of that public, such as overseas students, potential immigrants or, less commonly, large expatriate populations. It therefore uses the language of the local population and locally popular themes and topics. Content varies but usually involves the provision of information, publicity for events, branding exercises or the posting of trivia (such as pictures of koalas). Posts can also be warm and personal and include one of the internet’s maligned features—cuteness.
The most popular Facebook post recorded during this research displays many of those features. It’s a video of two American embassy ‘diplokids’ playing the Indian national anthem on the occasion of India’s Independence Day.5 It’s been viewed 2.53 million times and shared more than 125,000 times (as of January 2020).
Many popular posts are practical and transactional, such as information about employment, scholarships, funding opportunities and visa applications. The US Embassy in Mexico, for example, published a series of videos outlining the procedures for various visa classes. The Australian Consulate in Hong Kong published a sequence of posts targeting Australian citizens in the lead-up to the 2016 Australian federal election with information about how to vote, and—taking advantage of Facebook’s potential to target specific audiences—paid to promote them.
Posts announcing employment opportunities at the embassy or consulate for locally engaged staff are consistently among the most popular, especially in small and developing countries. These posts can serve as more than mere job ads. One such post, on the American Facebook page in Iraq, prompted an enquiry via the comment feed from a potential applicant who feared he might be too old to apply. The American page administrator replied, assuring the applicant that his application would be welcome and reiterating American policies against age-based discrimination. The reply promoted US values and demonstrated respect for an older Iraqi man, and in turn inspired several positive comments in the thread.
Other popular outward-facing promotional posts include commemorations on significant memorial days and on the occasion of tragedies such as natural disasters. Out of respect, noting these days of significance on Facebook should be considered obligatory, as it largely appears to be. Posts announcing support in the aftermath of disasters are often very well received (as indicated by numbers of shares and supportive comments) and suggest that Facebook can play a useful role in promoting aid and relief efforts. For example, the Australian Embassy in Fiji posted about assistance efforts after Tropical Cyclone Winston in 2016; those posts had engagement figures in the thousands (the mean engagement figure for 2016 was 29).6
Facebook posts promoting military activity elicited significant support in other contexts. US Facebook posts in support of Iraqi soldiers serving as part of the American-led coalition against Daesh, for example, were widely shared and commented on, almost entirely positively.
How important are ambassadors and consuls-general as proponents of outward-facing publicity? The research suggests that they’re significant assets where they’re personable and relatable and embrace the community and nation where they’re posted. Speaking the local language, either proficiently or with evident effort, is a major asset. While posts are typically published in the local language (often as well as in English), publishing videos of heads of mission speaking the language seems to have additional audience appeal. One of the few Australian Facebook pages that increased its levels of engagement from 2016 to 2017 was that of the Embassy in Paris. Australia’s Ambassador to France, Brendan Berne, a fluent French speaker, features in a number of posted videos, including media appearances and official speeches.
In one popular video post, Ambassador Berne introduced changes in Australian law to legalise same-sex marriage and then popped the question to his unsuspecting partner, Thomas.7 This was acknowledged as unorthodox but was a calculated risk that paid off, increasing the profile of the Ambassador and thereby providing him with further platforms, including popular mainstream broadcast media, on which to promote the bilateral relationship.
Former US Consul-General in Hong Kong, Clifford Hart, exemplified how the personal can empower public diplomacy, to the extent that he was known as Clifford Baby (or ‘Clifford BB’).8 His very popular farewell video post featured Hart reflecting in Cantonese on his favourite places and dishes in Hong Kong. The video also uses catchphrases from Stephen Chow (an iconic actor in Hong Kong), which, while meaningless for those unfamiliar with his work, carried immense appeal for Hong Kongers.
2. Inward-facing publicity
Inward-facing publicity is related to outward-facing publicity but has an internal focus, appealing to smaller audiences—perhaps the local diplomatic or government community, or (even more internally) colleagues in Barton, Foggy Bottom or Whitehall.4
This content frequently features a staged, formulaic photo of ‘distinguished guests’ at an official event.
Anecdotally, it’s been made clear to me on a number of occasions that this type of content is regarded as important, to the extent that hours can be spent on its production—the text carefully drafted and often escalated up the chain for approval.
Although these posts have limited appeal, they have a specific value that isn’t evident in their typically low engagement metrics.9 They’re important for the people featured in the photo and at the event, both as a record and acknowledgement of their participation and as an indication of their status by highlighting their access. However, the limited broader appeal of the posts suggests that the resources devoted to them should be minimised.
Other types of posts are evidently not (or poorly) targeted at a broader local public. These posts are characterised by the negligible use of local language or cultural connections and an overt emphasis on topics and themes that are of minimal interest to local target populations and more aligned to internal or specialised interests.
Common examples include key messages from governments about matters that are perhaps of global significance and represent core national values or positions on international matters (such as an opinion on certain environmental or human rights issues) but do not, according to the engagement data, resonate locally. These types of posts do no harm and are probably useful as records of, and advocacy for, important international issues. However, if they’re resource intensive, they present a poor return on investment.
One example of content that’s, probably inadvertently, inward-facing is a series of podcasts produced by the Australian Embassy in South Korea using the time of very senior diplomatic officials and promoted on the Embassy’s Facebook page. The podcasts featured interviews in English with significant Australians, including senior government figures. The low engagement metrics on Facebook (and the modest listening figures via Soundcloud) are unsurprising: in a saturated media market it’s difficult to imagine the appeal of podcasts in English featuring guests who (although esteemed and accomplished) are of marginal interest to a Korean audience.
The podcasts weren’t an evidently effective way of engaging with a Korean audience and, after 28 episodes over 18 months, were concluded at the end of 2017. While the series is characterised here as unsuccessful, creativity and bravery in public diplomacy should be supported. The idea of using podcasts has value and could be adopted elsewhere, perhaps targeting specific audiences such as potential international students or investors and promoted via a more professionally oriented platform, such as LinkedIn. The obvious lesson of the South Korean experiment is that such efforts are more likely to have impact if they’re planned to connect with and target local audiences as well as conveying Australian views and expertise.
Analysis for this report reveals that both outward- and inward-facing publicity posts by DFAT and Defence vary greatly in the engagement rates they enjoy. It’s difficult to see a pattern, and most successful posts are probably a result of good luck, good management and additional localised idiosyncrasies. But the general sense is that audiences largely pay attention to content that’s useful and relevant for them, not necessarily what’s most important to the authors of the content.
3. Engagement
Engagement posts are far less common than publicity posts. This is a bit surprising, as social media has been lauded as a site for interaction, discussion and debate and for making connections.
Some recent scholarship has concluded that diplomats aren’t taking advantage of this potential because of ingrained, institutionalised resistance based on norms of information control and risk aversion.10 This report outlines another entrenched and probable factor: because its algorithm favours content from close ties or paid promotion, Facebook often isn’t a very good platform for two-way engagement.
There are, however, some excellent examples of how Facebook has been used by Australian diplomats to facilitate a limited yet effective type of engagement through photo competitions. One, in Timor-Leste, invited photographs that characterised and shared affection for that country, thereby demonstrating ‘relational empathy’.11 Another, in the Australian Office in Taipei, invited Taiwanese in Australia to submit photographs of their travels and experiences, resulting in Taiwanese participating in a kind of networked conversation with other Taiwanese about their positive experiences in Australia, via an Australian diplomatic Facebook page. These types of photo-based campaigns could be replicated elsewhere.
Both of these competitions take advantage of a key function of social media—the ability to share images and tag friends—to increase the reach of their content. This turns Facebook users into micro-influencers, quite powerful at a smaller scale, distributing and personally endorsing content in their networks. An obvious advantage is that the content is provided and driven by users, not government officials. The fact that the content providers are from the local community also makes the content itself likely to have local references and appeal.
4. The audience, themselves
The last type of content present on these Facebook pages isn’t authored by the account holders (the diplomats) but by the Facebook users themselves. Usually, this appears in the comments, which can easily veer off onto tangents (some malicious, but some benign, even useful). The US Embassy in Mexico, for example, posts information about visa applications that can prompt reams of comments asking for advice about people’s precise circumstances. Many of the requests are answered by other Facebook users, who are able to offer specific advice.
Examples like this underscore the key lesson about Facebook for public diplomacy: social media users are often active audiences and participants who make choices about what content they respond to and how they respond to it based upon how relevant, useful and appealing they find it. This fundamental conclusion is a core lesson for DFAT and similar agencies.
Engagement—by the numbers
Ranking nations according to metrics fuels the spurious idea that those nations are in competition with each other for attention in the digital space. In fact, diplomacy per se is in competition with the practically limitless amount of material published from all manner of sources, much of it antithetical to the aim of international amity, and all diplomats could benefit from learning from each other’s experiences. Rather than treating them as a measure of success, engagement metrics can be a useful means of approximating audience size and attention.
On average, the data (in Figures 1–4) indicates that the Facebook audience for the 23 US official diplomatic accounts reviewed is far larger than the others’, but also relatively passive. Australia’s audience, in comparison, is more active and engaged. Note, however, that all the figures below are global averages and vary considerably by location (again suggesting that a global ranking is unhelpful). The variations between locations (see Table 1) offer important insights into which types of content are useful and which audiences are more active and engaged, and consequently more valuable.
All the following data is based on the Facebook pages of official diplomatic posts (embassies, consulates and similar offices).12 They’re typically managed by diplomatic staff who are often not public diplomacy specialists and are usually on a 3–4-year posting, often with considerable input from locally engaged staff.
Figure 1 is based on the numbers of page likes (people who have ‘liked’ a Facebook page) in the host country where an embassy or consulate is located. Figures 2–4 are based on the levels of engagement (reactions, comments, shares) with the content that those embassies and consulates posted on their Facebook pages.
Figure 1: Facebook page likes, January–February 2018 (total, users located in host country)
Note: This data is no longer downloadable from Facebook’s application programming interface due to restrictions introduced by Facebook in 2019. This is one of the ways Facebook has limited public access to data. For example, until early 2018, it was possible to extract data about the location (based on their Facebook profile) of Facebook page followers, making it feasible to analyse the percentage of followers who were located in the host country (that’s the figure used here) or who were located elsewhere, either based in the home country (probably mostly expats) or in a third country. This includes followers who are suspected to be bogus, either paid to follow through click farms or fake accounts attempting to appear real. See D Spry, ‘Facebook diplomacy, click farms and finding “friends” in strange places’, The Strategist, 7 September 2017, online.
Figure 1 is the total for all of the embassies and consulates counted (a list of them is included in Table 1). Figure 2 is the average figure per embassy or consulate.
Figure 2: Average engagement per Facebook page, January–February 2018
The large number of the US Facebook page likes/followers highlighted above results in a relatively high level of engagements per post but not more engagements per user. In the latter category, Australia leads; the US runs last.
Figure 3: Average engagement per Facebook post, January–February 2018
Figure 4: Average engagement per Facebook user, January–February 2018
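The three averages reported in Figures 2–4 are derived from the same raw counts in different ways. As a minimal sketch of that arithmetic (the per-page records below are illustrative placeholders, not the report’s actual dataset):

```python
# Sketch of the three engagement averages behind Figures 2-4.
# Each record holds total engagements (reactions + comments + shares),
# posts published in the period, and page likes. Values are invented.
pages = [
    {"engagements": 5800, "posts": 40, "users": 120_000},
    {"engagements": 900,  "posts": 25, "users": 8_000},
    {"engagements": 2300, "posts": 60, "users": 45_000},
]

total_engagements = sum(p["engagements"] for p in pages)
total_posts = sum(p["posts"] for p in pages)
total_users = sum(p["users"] for p in pages)

avg_per_page = total_engagements / len(pages)    # Figure 2: per page
avg_per_post = total_engagements / total_posts   # Figure 3: per post
avg_per_user = total_engagements / total_users   # Figure 4: per user

print(f"per page: {avg_per_page:.0f}")
print(f"per post: {avg_per_post:.1f}")
print(f"per user: {avg_per_user:.4f}")
```

A large audience (many users) with modest total engagement yields a high per-page figure but a low per-user figure, which is how the US can lead on one measure and run last on another.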
Table 1 shows Facebook reach (the percentage of a country’s total Facebook users who are following an embassy or consulate Facebook page) for 23 countries. As per Figure 1 (and see endnote 11), these figures include only those Facebook users who are located (according to their profile) in the country where the embassy or consulate is based (for example, followers of the Australian Embassy in Dili who are based in Timor-Leste). The figures in Table 1 are the average figures for the five nations and can vary considerably. For example, for Timor-Leste the average for all five embassies is 10.495% but for Australia it’s considerably higher (approximately 35% when last checked; this is one of the few embassy Facebook pages that demonstrates significant growth).
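The reach figure described above is a simple ratio: host-country followers of a page divided by the host country’s total Facebook users. A minimal sketch (the numbers are illustrative, not the report’s data):

```python
def facebook_reach(host_country_followers: int, country_facebook_users: int) -> float:
    """Percentage of a country's Facebook users who follow a given page."""
    return 100 * host_country_followers / country_facebook_users

# Illustrative: a page with 90,000 local followers in a market of
# 350,000 Facebook users reaches about 25.7% of that market.
print(f"{facebook_reach(90_000, 350_000):.1f}%")
```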
Table 1 also demonstrates the correlations between Facebook reach and per capita GDP, population size and median age (see the appendix for the methodology). Publics in countries that are closer to, or more strategically intertwined with, the posting nation are also more likely to follow embassy and consulate Facebook pages (for Australia, Timor-Leste; for the US, Mexico and Iraq). An important finding of this research for Australian officials is that Facebook appears to be most useful for public diplomacy in developing countries that are small, young and geographically close to Australia.
Table 1: Facebook reach across 23 countries via a selection of indicators
The metrics vary by orders of magnitude: in Timor-Leste (on average) a Facebook page will be followed by about 10% of the population who have Facebook accounts; in Myanmar, it’s about 2%; in Taiwan and New Zealand, it’s about 1 in 1,000; in the UK and Canada, it’s about 1 in 10,000. In other words, on average, a Facebook page in Timor-Leste is close to a thousand times more likely to have a local follower than one in the UK or Canada.
For Australian diplomatic posts, the contrast is even starker: in Timor-Leste, around 26% of the local Facebook population follow the Facebook page of the Australian Embassy in Dili; the equivalent in the UK is 0.01%; in Canada, 0.005%. Australia’s Facebook page in Timor-Leste is around 5,000 times more likely to have a local follower than in Canada.
The temptation is to see this as a measure of the performance of Australia’s staff in Dili, Ottawa and London. That temptation should be resisted—there are, as Table 1 suggests, demographic factors (age, size, wealth) to consider when seeking reasons for the large variations in Facebook reach.
These demographic correlations suggest that Facebook diplomacy’s ‘success’ (or, I would suggest, ‘relevance’) isn’t necessarily the result of the public diplomacy staff’s skills and endeavours but more likely a product of external factors: the popularity of Facebook as a means of accessing information among younger populations; a lack of competing sources of information in smaller countries (with smaller media industries); and the funnelling of users onto the Facebook platform in those countries (including Timor-Leste and Cambodia) where Facebook’s Free Basics service provides free but limited internet access.
This implies that, while a Facebook page may be an effective, even a primary, public diplomacy tool in some places, it won’t always be in others: therefore, resources and strategy can be adjusted accordingly. For example, it suggests that the Australian embassies in Dili, Port Moresby and other high-ranking Facebook locations should be supported and encouraged to use Facebook (as they appear to be successfully doing). The high commissions in London, Ottawa and similar locations should maintain a presence but not prioritise Facebook as a means of public diplomacy, as it isn’t an efficient communication channel.
Limitations of using Facebook for diplomacy
However, if these numbers look small enough to question the point of having a Facebook page in some locations at all, it gets worse: average posts prompt engagement from between 1 in 100 and 1 in 1,000 followers. This means that in the UK, for example, the reaction rate is about 1 in 1 million active Facebook users. While reaction rates don’t equate to reach (reach figures aren’t obtainable), they’re indicative of attention and interest, and also contribute to the organic (non-paid) spread of the content.
This is likely to get worse. Changes to the Facebook algorithm since 2014 have made it more difficult to reach large audiences unless content is promoted through paid boosts. This is reflected in the engagement metrics falling or flattening year-on-year in most locations, with a few exceptions.
Therefore, the argument for an active Facebook page shouldn’t rest on the average engagement metrics alone. Facebook posts, as long as they’re prepared using minimal resources, are low risk, low investment and usually low reward. But some posts are quite valuable, even in locations where there’s usually little engagement, potentially serving as an economical means to exert influence with small, but repeated, effects. An examination of the types of posts and the levels of engagement they receive offers some insights.
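The ‘one in a million’ reaction rate quoted above follows from multiplying two of the ratios already discussed: the share of a country’s Facebook users who follow a page, and the share of followers who react to an average post. As a quick illustration, using the averages cited for the UK:

```python
# Chained ratio behind the 'one in a million' figure.
reach = 1 / 10_000          # UK-style reach: ~1 follower per 10,000 local users
engagement_rate = 1 / 100   # optimistic end: 1 in 100 followers reacts to a post

reactions_per_user = reach * engagement_rate
print(f"about 1 reaction per {1 / reactions_per_user:,.0f} Facebook users")
```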
Defence’s use of social media
A review of available defence organisations’ policies and associated commentary outlines three general areas of social media use:
personal use by personnel, whether or not on deployment or active duty, and their families
professional use by personnel in matters relating to their employment, such as networking and communication for the purposes of professional development and knowledge sharing
official use by personnel acting as representatives of the defence force and in pursuit of the defence force’s aims.
The first type—personal use—prompts concern among military forces for its potential to endanger military personnel and operations, or to damage the reputation of defence organisations. Those risks aren’t confined to official Facebook pages and are as likely to occur elsewhere; infringements are already covered under existing policies (such as preventing harassment and promoting operational and personal security). Posting on social media may bring infractions to light, meaning that they can be addressed, but also increases the risk of exposing the offending content to a wider audience before it can be deleted and the infraction contained.
The UK and US defence forces are especially active in promoting responsible social media use, including by publishing guidelines for personnel.
These concerns are counterbalanced by the capacity for social media to act as a means for military families and friends to stay in touch with loved ones while they’re on deployment. Also, as some American studies suggest, social media are especially beneficial for military spouses who form support networks based on their shared experiences and concerns.13
The second type of use—professional but unofficial use—is evidenced in limited ways on Facebook.
One example is the Facebook page for The Cove,14 a website set up for the purposes of promoting research for military professionals.
The third type, official use, is the focus of this report. The defence forces of the Five Eyes nations all operate numerous Facebook pages. In the case of the US, each branch of the armed services has at least hundreds (US Air Force), if not thousands (US Army), of Facebook pages.15 The pages representing each of the main branches have millions of followers, while pages at the level of operational units (regiments, battalions and the like) vary in size accordingly.
Unsurprisingly, the Facebook pages of the branches of the US military have followers (page likes) an order of magnitude larger than in other nations (Figure 5).
Figure 5: US main military Facebook page likes, March 2018
The militaries of the other Five Eyes nations have comparable numbers of page followers, although the British Army has a significantly larger cohort (Figure 6).
Figure 6: Main military Facebook page likes, non-US, March 2018
Quantitative analysis of the defence forces’ Facebook pages indicates that they receive considerably more attention and engagement than their diplomatic counterparts. The average Australian diplomatic Facebook page is followed by about 0.02% of the Facebook population in the host country (the notable exceptions are Timor-Leste, 26%, and Papua New Guinea, 7%). The larger defence force pages are followed by a larger portion of the Australian Facebook population: Defence Jobs Australia (3.3%) and the Australian Army (2.4%).
The raw numbers are similarly stark. Defence Jobs Australia has close to half a million followers, the Australian Army more than 360,000, the RAAF more than 280,000 and the RAN more than 120,000. Those numbers increase daily.
The combined figure of the page likes of the ADF Facebook pages analysed for this report is 1.45 million, or close to 10% of the Australian Facebook population (although of course many Facebook users can follow multiple pages and some may come from overseas).
In comparison, major news programs have about 1.5–2 million Facebook followers, and the ABC News Facebook page has close to 4 million. News and magazine pages are the leading Facebook pages for engagement, averaging about 100,000 engagements per page per week; Defence pages averaged 45,000 in total. The Australian Army page alone received 12,500 engagements on average per week—comparable to the music industry average and above education, department stores and politics.16
Other nations’ pages are similarly popular. These figures suggest that Facebook is valuable for defence forces as a means of communicating to their publics. They also suggest that those publics are paying attention to these pages.
Why? Partly, the answer lies in the content posted on the pages and the ways that publics engage with it. Defence department Facebook pages differ from their diplomatic counterparts in important ways—chief among them is the nature of their audiences, which appear more domestic and more closely engaged. Partly, this arises out of the large numbers of current and former personnel and their friends and families. Also, in many democracies, publics have greater levels of emotional connection—trust,17 nostalgia, admiration—with militaries than with other parts of government (including foreign affairs agencies).
Official use of these Facebook pages includes a number of related functions. The main ones are:
publicity, firstly in the sense of promoting the defence force’s values, achievements and legacies, as well as information for potential recruits, and secondly in the sense of maintaining the openness and transparency that (within the parameters of operational and personal security) are expected from defence forces of democratic nations
information sharing with the defence force’s broader community of interest, including family and friends of serving personnel and veterans as well as other stakeholders (such as people residing near bases or training areas), and including sharing details about exercises and deployments
commemorations, including notifications and memorials for service personnel who have died on deployment or exercises, celebrations and thanks for retiring senior service personnel, and days of significance, either national (such as Anzac Day) or specific to the defence force.
This report’s analysis suggests that Facebook performs each of those functions usefully and in ways other forms of media would find difficult. User engagement varies considerably across the Facebook pages analysed. Some general observations include the following:
Levels of engagement are generally higher than for public diplomacy pages. In particular, defence content is shared more and attracts more comments.
Content on smaller Facebook pages (such as regiment, brigade or group pages) has a higher level of engagement per capita, suggesting a smaller but more engaged user community.
Comments appear to be positive and supportive: they express admiration for defence personnel, thanks for service (especially for those who died on duty), patriotism and nostalgia.
Military hardware in use has considerable appeal—cinematographic and otherwise.
Defence forces are highly regarded for their service (the ‘trust factor’) as well as their embodiment of national identity.
Members of defence forces, and their families and loved ones, use defence Facebook pages to express and share emotions, including, commonly, pride and admiration.
Some important posts—including notices about mental health—attract less engagement because those topics are sensitive and Facebook is public. This is an example of how Facebook users are conscious of their online personas and tend to portray themselves cautiously. It isn’t an argument against the value of those posts, which are useful opportunities for defence forces to raise awareness of important issues and available support services.
In action and in memoriam: ADF pages
The ADF Facebook pages attracting the highest engagement fall into two main categories: accounts of activities undertaken by ADF personnel (including community undertakings, training, exercises, deployments and military action) and commemorations of days of significance, the loss of military lives, or both.
The most important commemorative day on the Australian calendar, Anzac Day, is also the dominant topic on Defence Facebook pages, appearing in the top five most engaged posts of all the larger pages.
An exception is the Chief of the Defence Force’s Facebook page, where the most popular posts are those commemorating the return to Australia of fallen Vietnam War veterans and the 20th anniversary of the loss of 18 Army personnel during a Black Hawk helicopter collision in 1996.
On the smaller, unit-level Facebook pages, in addition to Anzac Day, popular posts commemorate important battles in the history of the unit, such as Long Tan in the Vietnam War and Kapyong in the Korean War. Other popular Facebook posts noted Australia Day, Mothers’ Day, Fathers’ Day and Christmas, sometimes connecting them to personnel currently serving overseas.
The popularity of commemorative posts suggests that Facebook facilitates support for ADF personnel and traditions in a public, shareable forum. Anzac Day’s popularity among the larger Facebook pages implies that those pages enjoy widespread popularity, whereas attention to unit-specific commemorations in the smaller pages indicates their importance to those with closer ties to those units, including veterans and their families.
Some posts feature videos of ADF personnel using impressive military equipment. These have evident appeal for military aficionados and, according to the Defence Jobs Australia Facebook page metrics, for potential recruits.
Another popular type of post outlines current actions taken by the ADF. Examples of this type include HMAS Darwin’s seizure, under UN sanctions, of illicit weapons heading to Somalia; assistance provided by HMAS Canberra to Fiji following Cyclone Winston; and Operation OKRA: Strike Vision, involving F/A-18A Hornets destroying facilities operated by Daesh in central Iraq.
Other examples of popular Facebook pages featuring the ADF in action include graduations (the Australian Defence Force Academy), promotions and—especially at the unit level—posts showing personnel assisting local communities and charities.
Five-Eyes defence forces
Commemorations and actions are top posts in other defence forces’ Facebook pages. The US defence forces’ pages, in particular, are notable for their popular displays of military hardware as well as being sites of public, patriotic support for troops.
The most popular post on the US Army Facebook page, on the anniversary of the 6 June 1944 D-Day landings in Normandy, exemplifies this combination of patriotism and military memorialisation. The comments on this post further indicate the commemoration’s personal significance for veterans’ families.
These US Facebook pages demonstrate the significance of the military services and suggest how deeply they’re embedded in American culture, in family histories, national identity and popular culture. Popular UK posts similarly suggest the link between military service, family legacies, history and nationalism—in this case sometimes represented by the British royal family.
Although similar themes are evident in all defence force Facebook pages, some examples of popular content from UK, Canadian and New Zealand pages offer small but significant contrasts with Australian pages.
For example, a New Zealand Defence Force video of a ceremony at the Menin Gate memorial in Ypres, Belgium, featuring personnel performing the haka was shared more than 30,000 times,18 and the most popular New Zealand Navy Facebook post was a link to a news report on the first sailor to get a moko (a full-face traditional Maori tattoo; Figure 7).19 The popularity of these posts reflects support for Maori culture as an intrinsic and valued part of New Zealand and its defence forces.
Figure 7: New Zealand Defence Force personnel perform a haka at Menin Gate, Belgium
Source: 25 April 2017, online.
Popular Canadian Facebook posts also showcase diversity and personality. The Canadian Army’s most popular post pays tribute to an indigenous veteran, Sergeant Francis Pegahmagabow of Wasauksing First Nation, a highly decorated World War I scout and sniper.20 Other popular content includes videos of deployed personnel in a snowball fight in Poland,21 a light-sabre fight marking Star Wars Day (#MayTheFourthBeWithYou),22 a warning against venturing onto military property while chasing Pokémon23 (see cover image) and personnel wearing red stilettos to support domestic violence survivors (Figure 8).24
Figure 8: Members of 3rd Canadian Division taking part in the #WalkaMileInHerShoes fundraiser in downtown Edmonton
Source: 3rd Canadian Division, ‘Members of 3rd Canadian Division are taking part in the #WalkaMileInHerShoes fundraiser in downtown Edmonton’, Facebook, 21 September 2017, online.
Defence recruitment
The relative popularity of defence recruitment sites indicates the value of Facebook for promoting military careers. This use of Facebook differs from the pages of the main defence force branches or at unit level, as it’s more akin to advertising and promotion and less like a community site: more bulletin board than discussion board. It’s likely that many of these posts have been promoted through paid boosts and advertising, which is a common and reasonable use of marketing budgets (Figure 9).
Figure 9: Defence force recruitment page likes, March 2018
Generally, the recruitment pages’ content appears to have similar appeal to the main pages. For example, the most popular posts on the Defence Force Australia page are a 360-degree view of a boat drop from the amphibious ship HMAS Canberra (the second most popular post across all Australian defence Facebook pages) and an Anzac Day 2016 post.
The recruitment Facebook pages are also notable for the high number of posts by Facebook users. Between 20% and 30% of the posts on the Defence Force Australia, RAF and UK Royal Navy recruitment Facebook pages are by users. Many of these user posts are genuine requests about positions and recruitment procedures.
Defence social media policy and strategy
The ADF’s social media guidelines, policies and strategy documents are not public. The last publicly available external review of Defence’s use of social media was released in 2011.
This aversion to openness contrasts with the approach of DFAT, which has published its public diplomacy25 and digital media26 strategies; the Canadian defence force, which has published its social media strategy;27 the UK defence force, which has published social media guidelines;28 and the various US forces, which have each published numerous policy and guideline documents.29
The Canadian social media guidelines go so far as to promote transparency and accountability as ‘principles of participation’, aimed at meeting community standards of trust and confidence.
It’s unclear why the ADF doesn’t operate on similar principles.
Conclusion and recommendations
Facebook pages provide opportunities for defence forces to communicate to publics and, at least as importantly, for publics to express their gratitude, admiration and affection to defence forces.
In contrast, diplomatic Facebook pages are targeted at, and receive attention from, foreign publics. Compared with defence pages, diplomatic Facebook pages receive far less attention, but the levels vary. In countries that are smaller, younger, poorer and closer to Australia (such as Timor-Leste and Papua New Guinea), the data shows that Facebook is an important means to inform and engage with general publics. Communications strategy should therefore prioritise Facebook in those countries by training personnel, allocating funds to content production and paying heed to the levels and nature of engagement by publics. Elsewhere, such as in Canada and the UK, Facebook is far less important and should be deprioritised in, but not eliminated from, public diplomacy strategies.
The strengths and limitations of Facebook’s usefulness are determined by its algorithm, which prioritises audiences’ pre-existing connections and optimises content that appeals to their needs and desires. It’s essential therefore that Defence and DFAT prioritise those audiences when determining if, when and how to make use of Facebook.
This report argues for a measured, more strategic use of social media. Specific solutions are as follows.
For diplomacy
Review the digital media strategy to account for the location-based variability of Facebook’s usefulness and prioritise resources accordingly.
Encourage diplomatic missions to develop, implement and review localised social media plans using the experience and expertise of locally engaged staff (providing training where required), and redefine the role of Australia-based staff to strategic oversight and governance.
Remove the direction for all heads of mission to be active on social media; encourage those who are active on Facebook to use openness, warmth and personality to create relational empathy.
Create opportunities for training and sharing the skills and experiences of public diplomacy staff.
For defence
Demonstrate and promote transparency and accountability by publishing social media policies.
Recognise the value of social media for the Defence community, especially as a means of providing information and support for currently serving personnel and their families, by supporting the use of Facebook for those purposes by all defence units.
Continue Defence’s impressive work using Facebook as a platform for the community to express support for personnel and veterans, and maintain the dignified, sombre tone of the memorial content.
For diplomacy and defence
Consider cross-promoting content. Defence pages reach the large national audience that diplomacy increasingly needs. Diplomatic Facebook pages—in some locations—provide opportunities for the ADF to promote its actions and values to international audiences, acting as a useful vector for strategic communication.
Refrain from using engagement metrics as success measures for diplomats; use them as proxies for public attention in order to gauge how the value of Facebook varies according to audience type and location.
Prioritise audiences’ use of social media when developing strategies, creating content and allocating resources.
Appendix: Methodology
This research focused exclusively on Facebook. While other social network platforms, especially Twitter, are also relevant, they lie outside the scope of this report.
The research used digital media research methods to gather and analyse large amounts of data on Facebook users’ engagement with online content, using engagement metrics (likes, comments and shares) to identify which posts received more than average attention.
This enabled analysis of Facebook users’ interests based on either the content (what types of posts received the most attention) or the users (who was engaging with content). In turn, this suggested how social media are used and therefore how they can be useful.
The analysis of Facebook use for diplomatic purposes is based on 2016–17 data extracted from Facebook pages of the diplomatic missions of eight ‘publisher’ nations (the five that are the subject of this report, as well as India, Israel and Japan) in 23 ‘host’ nations.30 Restrictions imposed by Facebook in 2019 (and before 2018 data was extracted) mean this form of research isn’t currently replicable. The database used in this research is therefore unique; it’s available from the author.
Unlike the defence Facebook pages, the data for the diplomatic pages includes the locations of the Facebook users who followed the Facebook pages of the diplomatic missions. This data is no longer available because of restrictions introduced by Facebook in early 2018, before the analysis of the defence Facebook pages was undertaken.
This report is based on data obtained through the Facebook application programming interface (API): post and comment content (text, and links to images and video), as well as engagement data (reactions, including likes, as well as comments and shares). Analysis followed a two-stage, mixed-methods approach. First, quantitative data analysis identified trends and outliers. Second, identified outliers (such as high-performing pages and posts) were treated as key case studies and their content was considered more closely using methods based on qualitative media studies.
The analysis of the Facebook pages was contextualised and informed by an examination of publicly available policy and strategy documents as well as background discussion with several currently serving or former defence and diplomatic personnel from Australia and elsewhere. An important note: the engagement metrics are not, and shouldn’t be, considered as indicators of the ‘success’ of a particular Facebook page. Instead, they were used here as indicators of attention, and therefore as a means of assessing what content a specific page’s audience was more interested in and how it made use of that content.
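The two-stage approach described above can be sketched in code. This is an illustrative reconstruction, not the report’s actual analysis pipeline: the field names, sample figures and the outlier threshold (mean plus two standard deviations) are all assumptions.

```python
# Illustrative sketch of the report's two-stage method (assumed field
# names, figures and threshold; the report's actual dataset and code
# are not public).
from statistics import mean, stdev

def total_engagement(post):
    """Stage 1 metric: sum likes, comments and shares into one attention score."""
    return post["likes"] + post["comments"] + post["shares"]

def find_outliers(posts, z_threshold=2.0):
    """Flag posts whose engagement sits well above the page average.

    Flagged posts become the case studies for Stage 2, a qualitative
    close reading of their content.
    """
    scores = [total_engagement(p) for p in posts]
    mu, sigma = mean(scores), stdev(scores)
    return [p for p, s in zip(posts, scores) if s > mu + z_threshold * sigma]

# Eight ordinary posts plus one viral outlier (hypothetical numbers).
posts = [{"id": i, "likes": 100 + 5 * i, "comments": 10, "shares": 5}
         for i in range(1, 9)]
posts.append({"id": 9, "likes": 30000, "comments": 1200, "shares": 9000})

print([p["id"] for p in find_outliers(posts)])  # only the viral post stands out
```

Consistent with the note above, the engagement score here is a proxy for attention, not a measure of a page’s ‘success’; the flagged post would then be examined qualitatively.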
Acknowledgements
The author would like to thank the members of the Australian and international defence and diplomatic communities for their informal advice and support, as well as for their dedication and professionalism. Any errors and all findings, conclusions and opinions contained herein are my responsibility.
What is ASPI?
The Australian Strategic Policy Institute was formed in 2001 as an independent, non‑partisan think tank. Its core aim is to provide the Australian Government with fresh ideas on Australia’s defence, security and strategic policy choices. ASPI is responsible for informing the public on a range of strategic issues, generating new thinking for government and harnessing strategic thinking internationally.
ASPI International Cyber Policy Centre
ASPI’s International Cyber Policy Centre (ICPC) is a leading voice in global debates on cyber and emerging technologies and their impact on broader strategic policy. The ICPC informs public debate and supports sound public policy by producing original empirical research, bringing together researchers with diverse expertise, often working together in teams. To develop capability in Australia and our region, the ICPC has a capacity building team that conducts workshops, training programs and large-scale exercises both in Australia and overseas for both the public and private sectors. The ICPC enriches the national debate on cyber and strategic policy by running an international visits program that brings leading experts to Australia.
Important disclaimer
This publication is designed to provide accurate and authoritative information in relation to the subject matter covered. It is provided with the understanding that the publisher is not engaged in rendering any form of professional or other advice or services. No person should rely on the contents of this publication without first obtaining advice from a qualified professional.
This publication is subject to copyright. Except as permitted under the Copyright Act 1968, no part of it may in any form or by any means (electronic, mechanical, microcopying, photocopying, recording or otherwise) be reproduced, stored in a retrieval system or transmitted without prior written permission. Enquiries should be addressed to the publishers. Notwithstanding the above, educational institutions (including schools, independent colleges, universities and TAFEs) are granted permission to make copies of copyrighted works strictly for educational purposes without explicit permission from ASPI and free of charge.
First published May 2020.
ISSN 2209-9689 (online) ISSN 2209-9670 (print)
L Mirani, ‘Millions of Facebook users have no idea they’re using the internet’, Quartz, 9 February 2015, online. See also Facebook, ‘Where we’ve launched’. ↩︎
D Spry, ‘Facebook diplomacy: a data-driven, user-focussed approach to Facebook use by diplomatic missions’, Media International Australia, 168(1):62–80. ↩︎
‘The inquiry: How powerful is Facebook’s algorithm?’, BBC World Service, 24 April 2017, online. ↩︎
Winning hearts and likes (published online 2 June 2020).
Forty years ago, in his seminal book Amusing Ourselves to Death, American author Neil Postman warned that we had entered a brave new world in which people were enslaved by television and other technology-driven entertainment. In his telling, the threat of subjugation comes not from the oppressive arm of authoritarian regimes and their concentration camps but from our own willing submission and surrender.
“Big Brother does not watch us, by his choice. We watch him, by ours,” Postman wrote in 1985.
“There is no need for wardens or gates or Ministries of Truth. When a population becomes distracted by trivia, when cultural life is redefined as a perpetual round of entertainments, when serious public conversation becomes a form of baby-talk, when, in short, people become an audience and their public business a vaudeville act, then a nation finds itself at risk; culture-death is a clear possibility.”
Postman could have been writing about TikTok today. He was mostly thinking about mass media with a commercial imperative: people would be enslaved to superficial consumerism. But add a technologically advanced authoritarian power with platforms that – unlike terrestrial TV – are essentially borderless and can reach around the globe, and you have George Orwell’s Big Brother fused with Aldous Huxley’s cultural and spiritual entropy.
Addictive digital entertainment can be corrosive even without a malign puppeteer. But with an entity such as the Chinese Communist Party fiddling with the algorithms, it could be catastrophic.
Just in 2025, we have seen much of the Western world so spellbound by TikTok that the thought of living without it brought on the anguish normally reserved for the impact of conflict. “TikTok refugees” became a description, as though they had been displaced like Jews fleeing Europe or Yazidis escaping Islamic State.
Postman noted that we were innately prepared to “resist a prison when the gates begin to close around us … But what if there are no cries of anguish to be heard? Who is prepared to take arms against a sea of amusements?”
The cries of anguish were depressingly muted as TikTok built up a following in Western countries that now means four in 10 Americans aged under 30 get their “news” from TikTok, according to a recent survey by the Pew Research Center.
When a ban was flagged, the cries came from those who couldn’t bear to give up the platform and from free speech absolutists who believed any rules amounted to government overreach. If our most popular radio stations had been based in Germany in the late 1930s, the Soviet Union during the Cold War or Syria during the ISIS caliphate, our leaders would have protected the public, regardless of popularity and notwithstanding that it would constitute government intervention in the so-called free market of ideas.
In fact, the market isn’t free because powerful actors can manipulate the information landscape.
Billionaire Elon Musk gives free-speech advocates a bad name by posting not just different opinions but promoting false content on issues such as Ukraine on his platform X. But more sinister is a platform such as TikTok, which is headquartered in authoritarian China and ultimately at the control of the CCP, with algorithms that have been demonstrated to manipulate audiences by privileging posts that serve Beijing’s strategic interests and downgrading content that does not.
Despite such threats, we have no clear framework to protect ourselves from powerful information platforms, including the newest generative artificial intelligence models such as DeepSeek, which will be increasingly available – and, thanks to their affordability, attractive – despite operating under Chinese government control. As a US court declared in upholding the congressional ban on TikTok, giving a foreign power a vector to shape and influence people’s thinking was a constraint on free speech, not an enabler of it.
Freedoms of speech and expression are core democratic principles but they need active protection. This means the involvement of governments.
US Vice-President JD Vance told the Munich Security Conference that Donald Trump represented a “new sheriff in town” who would defend free speech and “will fight to defend your right to offer it in the public square, agree or disagree”. It was an admirable echo of the line attributed to Evelyn Beatrice Hall, summarising Voltaire’s principle: “I disapprove of what you say, but I will defend to the death your right to say it.” But just as we have regulators for financial and other markets, we need regulation of our information markets.
By all means, speech should be as free as possible. Awful mustn’t equal unlawful, to borrow ASIO boss Mike Burgess’s phrase. Speech that hurts the feelings of others or advocates unpopular views cannot be the threshold for censorship. Such lazy and faint-hearted policymaking creates only a more brittle society. But that doesn’t mean we should make ourselves fish in a barrel for malign foreign powers.
Anarchy is not freedom. Governments need to brave the minefield that is modern information technology. If a platform poses risks that cannot be avoided, as with TikTok, it should be banned.
Other platforms that sit within democratic nations’ jurisdictions should be subjected to risk mitigations such as content moderation to deter and punish criminal activity. X, Facebook, Instagram and YouTube can be used as avenues for information operations, as shown by Russia buying advertisements on Facebook or CCP-backed trolls posting on X and YouTube, or be used as vectors for organised crime. Even the most ardent free-speech advocates would agree that drug trafficking, child abuse or joining a terrorist group are illegal offline and therefore should be illegal online.
No marketplace remains free and fair when governments overregulate or abdicate responsibility.
The once-free markets of trade and investment have been eroded by China to such an extent that just this week Trump issued a foreign investment policy to protect American “critical technology, critical infrastructure, personal data, and other sensitive areas” from “foreign adversaries such as the PRC”, including by making “foreign investment subject to appropriate security provisions”.
A key principle of the new presidential policy is that “investment at all costs is not always in the national interest”.
In other words, security measures and rules keep American critical infrastructure free.
While it has not yet gained much media attention, it is among the most important economic security policies ever taken to counter Beijing’s objective to “systematically direct and facilitate investment in United States companies and assets to obtain cutting-edge technologies, intellectual property and leverage in strategic industries”, and all of America’s allies and democratic partners should publicly support it and implement it domestically.
We like to think that technologies are neutral mediums that are only vehicles for improvement. As Postman wrote, this belief often rises to the status of an ideology or faith.
“All that is required to make it stick is a population that devoutly believes in the inevitability of progress,” he wrote. “And in this sense … history is moving us toward some preordained paradise and that technology is the force behind that movement.”
Science and technology have of course delivered extraordinary improvements to our health, our economic productivity, our access to information and our ability to connect with other people regardless of geography – provided we engage with them wisely. We should not become entirely cynical about technology; rather, we must maintain control over it and ensure it serves our interests.
The ultimate solution is knowledge and participation. As Postman concluded, the answer must be found in “how we watch”. With no discussion on how to use technology, there has been no “public understanding of what information is and how it gives direction to a culture”.
Postman wrote that “no medium is excessively dangerous if its users understand what its dangers are”. He insisted we were “in a race between education and disaster”.
‘Amusing ourselves to death’ in age of TikTok (published 1 March 2025).
With its recent halt on implementing a legally mandated ban on TikTok, the United States is learning the hard way that when it comes to Chinese technology, an ounce of prevention is worth a pound of cure.
The US and like-minded democracies should no longer permit any social media platforms with direct ties to authoritarian governments with political censorship regimes to operate without restriction.
For years, technology and national security analysts have sketched out scenarios of what might happen if a democratic population were to become dependent on a Chinese-owned technology. Once such a technology becomes embedded in people’s daily lives and livelihoods, removing it stirs up a host of domestic political controversies, making it politically untenable to mitigate the national security risks.
That is exactly what has happened with TikTok. Around 170 million Americans—about half the country’s population and an even higher percentage of those using social media—use the short video app, owned by Chinese tech giant ByteDance. Millions of Americans have become dependent on their TikTok followings, built up over years, for their income or to promote their businesses. Tens of millions more use TikTok as a key source of information, community, and entertainment.
In classic American fashion, those users have refused to go gentle into that good night. As a law banning TikTok was set to go into effect on 19 January, many users downloaded the Chinese social media app RedNote, which isn’t just Chinese-owned—it is Chinese itself, based in Shanghai and subject to all Chinese national security and intelligence laws. Self-styled ‘TikTok refugees’ said they moved to RedNote to express their disregard for US government concern about the risks presented by Chinese companies. Overnight, RedNote, which presents even clearer security risks than TikTok, became the top download on the Apple app store in the US.
TikTok called on US President Donald Trump to offer a reprieve, and he did. On his first day in office, Trump signed an executive order authorising a 75-day extension on the law taking effect.
But it’s unclear what will happen next; we will have to see how the Trump administration navigates this issue. The law mandates either a forced divestiture or a ban. A previous US effort to force the sale of TikTok failed when the Chinese government issued new rules requiring Chinese companies to obtain a license for such a sale. Beijing did not grant ByteDance a license, effectively blocking the sale. Discussions are now reportedly underway for the sale of a 50 percent stake in TikTok to a US company, but that would not fulfill the law’s requirements.
This situation demonstrates the need to act early to inhibit the widespread adoption of social media platforms tied to authoritarian governments, such as Russia and China, that implement sweeping surveillance, censorship and manipulation of public opinion.
Western governments had all the information they needed about the risks of social media apps operating under authoritarian systems when TikTok took off in 2018—the year it became one of the world’s most downloaded apps. That was the time to act—the same time action was being taken to prevent Huawei from dominating the 5G telecommunications sector. The question now is whether we learn from our failures. While it’s too late to prevent TikTok from becoming a beloved American online space, it’s not too late to prevent the widespread adoption of similarly problematic apps. RedNote, for example, remains untouched, as do a host of other Chinese platforms.
The main argument against a sweeping ban on problematic foreign-owned apps is that this would infringe on free speech. But the opposite is true—as the US Court of Appeals essentially found. A social media platform under the sway of a foreign government obsessed with censorship and surveillance is an impediment to free speech. Democratic governments should act to preserve free speech by preventing these platforms from dominating online spaces.
Trade experts and economists understand that free markets don’t just happen naturally; creating and preserving a free market requires a strong government hand. There must be laws against unfair market behavior, mechanisms to bring cases against potential violators, means to investigate those claims, and strong enforcement. Sometimes the biggest violators are governments themselves.
In the same way, a free speech environment doesn’t happen naturally. There must be laws and practices in place to protect it. Put another way, it sometimes takes a strong government hand to create and preserve a free market for speech. As with free markets, sometimes the biggest violators of free speech are governments. And just as the public in a democracy has the ultimate power to vote out its own government for violating freedoms, protecting the public from foreign regimes and their intelligence services is the job of democratic governments.
The Chinese government has no right to censor or manipulate information on US soil. The Trump administration should act as soon as possible to ensure that no other social media companies linked to authoritarian governments can again play host to America’s virtual public square.
Democracies should learn the TikTok lesson and restrict risky apps from day one (published 31 January 2025).
What if the most popular apps on our phones were quietly undermining national security? Australians tend to focus on visible threats, but a blind spot remains in the digital landscape: the hidden risks posed by platforms such as TikTok and RedNote (Xiaohongshu). These apps are more than just harmless entertainment; they’re tools in a global battle for data and influence. And we, as a society, remain largely unaware.
TikTok, RedNote and similar platforms have embedded themselves deeply into daily life. Their algorithms delight us with engaging content, fostering a sense of connection and entertainment. But this convenience comes at a cost. Few stop to question what’s behind these apps: who owns them, where our data goes, what it might say about us, and how it might be used. In fact, these platforms, owned by companies who must obey authoritarian governments, present profound risks to our privacy and national security.
Digital risks are invisible and complex and, for most, our understanding is limited. While most Australians grasp the tangible dangers of terrorism or cyberattacks, the concept of apps and data collection being weaponised for disinformation and influence campaigns feels abstract. This gap in understanding is compounded by the prioritisation of convenience over caution. Governments and experts have sounded alarms, conducted enquiries and in extreme cases implemented total bans—as seen with TikTok in the US—but their warnings often fail to resonate amid the noise of daily life. As a result, we remain unprepared for the evolving tactics of malign actors who exploit these vulnerabilities.
Platforms such as TikTok and RedNote collect vast amounts of user data—from location and device details to browsing habits. In the wrong hands, this data can be used to map social networks, identify vulnerabilities or inform targeted disinformation campaigns. Algorithms don’t just show users what they like; they also shape what users believe. Through curated content, adversaries can subtly influence societal narratives, amplify divisions or undermine trust in democratic institutions. Beyond individual users, these platforms could act as backdoors into sensitive areas, through officials’ use of them (despite rules against it) or business executives sharing trade secrets on them.
Australia must address the vulnerabilities on these apps, particularly as the nation strengthens partnerships under such initiatives as AUKUS. Demonstrating robust digital hygiene and security practices will be essential to maintaining credibility and trust among allies.
The enactment of the Protecting Americans from Foreign Adversary Controlled Applications Act has prompted an exodus of users from TikTok, driving them to seek alternative platforms—though Donald Trump has given the app’s owner some indication of a reprieve.
Many TikTok users have turned to RedNote, which has rapidly gained traction as a replacement. Unlike TikTok, which operates a US subsidiary and is banned within China, RedNote is fully Chinese-owned and operates freely within China, creating a level of commingling and data exposure that was not present with TikTok. This raises even greater concerns about privacy and national security. While banning RedNote might seem like a straightforward solution, it does not address the core issue: the lack of public awareness and education about the risks inherent in these platforms. Without understanding how their data is collected, stored, and potentially exploited, users will continue to migrate to similar platforms, perpetuating the cycle of vulnerability. This underscores the urgent need for widespread digital literacy and education.
Recent legislation aimed at protecting children from social media platforms, such as the minimum-age requirements introduced by the Australian government, is a step in the right direction. However, this approach could be endlessly repetitive: new platforms and workarounds could quickly emerge to bypass regulations. The question remains: can the government effectively manage implementation of such policies in a fast-evolving digital landscape? And if we are applying policies to protect children, what about defence force personnel using these free applications? They could inadvertently expose national-security information. A consistent, security-first approach to app usage should be considered across all demographics, especially those with access to critical data.
Governments must take the lead by implementing stricter regulations and launching public awareness campaigns. Comprehensive digital literacy programs should be as common as public-awareness campaigns on physical health or road safety, equipping Australians to recognise and mitigate digital threats. They should know where their data is stored, understand they should resist letting apps know their location, and consider potential consequences. Digital security is no longer a niche concern; it is a core component of modern citizenship.
The hidden risks we scroll past each day are not just a matter of personal privacy but of national security. As Australians, we must shift our mindset and take these threats seriously. By recognising the vulnerabilities embedded in our digital habits, we can build a more secure and resilient society. Because when it comes to national security, ignorance is no longer bliss.
The hidden risks we scroll past: the problem with TikTok—and RedNote (published 21 January 2025).
Law enforcement and social media platforms must implement real-time data sharing to stop online extremism before it leads to violence. Using appropriate safeguards, we can achieve this without raising concerns about creating a surveillance state.
Social media companies have vast behavioural data, but their reluctance to share it with authorities means we’re left scrambling after an attack occurs. The resulting delay facilitates radicalisation and puts lives at risk. Rather than reacting to attacks, we should aim to prevent harm through a coordinated, data-driven approach. The current system is failing. Speed matters. Privacy concerns are valid, but when the stakes are this high, we need to ask: how many more lives are we willing to risk?
Extremist groups exploit unregulated online spaces to recruit, radicalise and incite violence. By the time we detect it, it’s often too late. We’ve seen the deadly consequences: shootings, terrorism and violence facilitated through social media. Social media companies like to claim they are neutral platforms, but they control the algorithms that amplify content, creating an environment where radical ideas can thrive.
Take the Christchurch mosque shootings in 2019 for example. The shooter posted his manifesto on Facebook and 8chan (an online message-board) before killing 51 people. Although Facebook moved quickly to remove his manifesto, the content spread to thousands. But his interactions with extremist groups and violent posts could have been flagged long before the attack. If they had then been shared immediately with law enforcement, authorities could have detected his extremist behaviour early and intervened.
Social media platforms must be more proactive in identifying extremist content and sharing it with authorities immediately. Delayed intervention leaves room for radicalisation. This is compounded by algorithms that prioritise content likely to generate engagement—likes, shares and comments. Extreme content, which often elicits strong emotional reactions, is amplified. Conspiracy theories, such as QAnon, spread widely on online platforms, drawing users deeper into radical echo chambers.
This isn’t about mass surveillance—it’s about content moderation. This approach should build on existing moderation systems. Authorities should only be alerted when certain thresholds of suspicious activity are crossed, much as financial institutions report suspicious transactions. For example, if activity suggests a user is being recruited by a terrorist group, or if the user shares plans for violence, social media companies should have the ability—and in fact the responsibility—to flag this behaviour to authorities.
Of course, automated content detection can result in misjudgements. This is where human content moderators within social media companies could play a role: once an automated system flags potentially harmful activity, it could trigger a review by an employee who would assess whether the flagged behaviour meets a threshold for real-time sharing with law enforcement. If the content is likely to incite violence or indicate a credible threat, the moderator could initiate real-time data sharing with authorities for possible intervention.
This verification process could be among the safeguards in place to ensure that only high-risk, potentially harmful activities are flagged, protecting the privacy of those who don’t present a threat and preventing concerns arising about the government creating a surveillance state. Shared data would follow appropriate legal channels, ensuring transparency and accountability.
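The triage logic described above, automated scoring followed by a human-review threshold and a higher escalation threshold, can be sketched in code. Everything here is illustrative: the signal names, weights and thresholds are invented for the example, and a real system would tune them against platform data and legal requirements.

```python
from dataclasses import dataclass, field

# Hypothetical risk signals and weights; names and values are invented
# purely to illustrate the threshold idea described in the text.
SIGNAL_WEIGHTS = {
    "violent_threat": 5,
    "recruitment_contact": 3,
    "extremist_content_share": 2,
}

REVIEW_THRESHOLD = 5      # queue for human moderator review
ESCALATION_THRESHOLD = 8  # priority review; moderator may notify authorities

@dataclass
class UserActivity:
    user_id: str
    signals: list[str] = field(default_factory=list)

    def risk_score(self) -> int:
        return sum(SIGNAL_WEIGHTS.get(s, 0) for s in self.signals)

def triage(activity: UserActivity) -> str:
    """Route flagged activity to the right queue.

    Nothing is shared with authorities automatically: even the priority
    path only queues the case for a human reviewer, mirroring the
    safeguard described above.
    """
    score = activity.risk_score()
    if score >= ESCALATION_THRESHOLD:
        return "human_review_priority"
    if score >= REVIEW_THRESHOLD:
        return "human_review"
    return "no_action"

# A user both sharing violent plans and in contact with recruiters
case = UserActivity("u123", ["violent_threat", "recruitment_contact"])
print(triage(case))  # human_review_priority (score 8)
```

The point of the two-tier threshold is that automation only routes cases; the decision to share anything with law enforcement stays with a human reviewer.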
The costs of implementing real-time data-sharing systems are manageable. Social media platforms already use automated systems for content moderation, which could be adapted to flag extremist behaviour without imposing significant human resource costs. Shared financial responsibility between social media companies and law enforcement could also help. Law enforcement agencies could receive funding to process flagged data, while tech companies would have to pay for technology needed to detect extremist activity. We can manage implementation costs and focus resources where they’re most needed by prioritising high-risk platforms and upscaling the system over time.
A limitation is that Australia could not impose this mechanism on platform operators that had no presence in the country. But the larger platforms’ operators, such as Meta, X and Snap, do.
Our current reactive approach isn’t working. We need real-time data sharing between tech companies and law enforcement to intercept threats before they escalate. Lives are at stake, and we can’t afford to wait for the next tragedy.
To pre-empt extremist violence, we need real-time social media data sharing (2024-11-28)
Despite a push for openness and transparency in communicating Australia's National Defence posture, one group the Australian government is failing to engage is its own citizens, especially its youth.
Communication is integral to AUKUS's resilience and success. As Australia's youth will be the generation asked to provide for the national defence when AUKUS comes to fruition, it stands to reason that they must understand its value.
A multi-pronged and well-funded approach from the government is therefore needed for effective engagement with them. This approach must focus on social media presence, outreach to youth organisations and schools, and improved access to AUKUS-related information.
As the agreement moves forward, all three partners must improve their messaging, particularly regarding Pillar II—advanced capabilities. Disjointed messaging between them that fails to account for each country’s socio-political environment risks losing public support and poses a threat to AUKUS’s survival.
Explanations of AUKUS can't rely wholly on defence aspects but must include non-traditional security facets as well. The environment is among the topics Australian youth care most about. So, regarding nuclear submarines, Australian government officials must be prepared to discuss plans for disposal of nuclear waste and to address fears of a naval Chernobyl. Further discussion is needed on what the youth role in the economy will be under AUKUS and on the impacts of Pillars I and II on Australia's economy and market. Telling young people how the agreement benefits their daily lives now and into the future will go a long way towards maintaining support.
The Office for Youth recently launched the Engage! Strategy, designed to improve young people’s involvement in government. The government can tailor elements of this strategy to specific needs of AUKUS messaging. Specifically, younger populations are increasingly getting news from non-traditional media sources. They are less likely to look for an official government statement, so the government must meet them in spaces they frequent. In fact, 68.8 percent of young Australians want the government to engage them on social media platforms.
To meet this demand for engagement, the government must be creative in its social media presence.
One example of innovative presence is NATO's #ProtectTheFuture campaign, in which NATO experts played popular video games on Twitch with streamers from alliance countries. In partnering with Twitch streamer and YouTuber ZeRoyalViking to discuss NATO, cybersecurity and how video games can teach digital safety practices, they reached more than 40,000 people.
Australian government officials should do similar collaborations surrounding AUKUS with YouTube and Twitch streamers from Australia. These could be focused on explaining AUKUS or take a thematic slant aligned with the two pillars.
Keeping a finger on current trends and viral content also plays a critical role in the social media space. Part of the reason for the success of the NATO campaign was that the platform, games and streamers connected with what viewers were interested in at the time.
A comparable phenomenon can be seen in Kamala Harris's presidential campaign and its social media account @KamalaHQ. What brought Harris's campaign to the forefront for American youth was singer-songwriter Charli XCX tweeting that 'kamala IS brat'. This tweet, based on the pop-culture trend of Brat Summer, spread through Harris's marketing campaign, in turn reinvigorating youth voters.
Social media engagement isn’t a panacea, however. Youth organisations, especially at schools across the country, are also key.
Whether it’s by sending AUKUS experts to speak at organisation meetings or hosting online webinars, there’s room for engagement through connecting with those who have intersecting interests. This could look like hosting a Q&A panel or trivia night with student political organisations. Outreach should also engage science and technology organisations and vocational institutions to discuss job opportunities that will become available due to AUKUS.
The final area the Australian government must harness for youth engagement with AUKUS is a singular, dominant digital presence. Creating a first point of contact online ensures that information on AUKUS is accessible for Australian youth who wish to learn more.
A common way to link information is through services such as Linktree or with a website. Currently, the AUKUS partnership has neither. By linking sources from all three governments to associated social media accounts, the public will see trusted, verified sources to turn to alongside traditional media avenues. Starting this process now would create a solid foundation for addressing issues or misinformation that may arise in the future.
Youth engagement with AUKUS is vital to its long-term success. Through ongoing messaging campaigns, Australia must continue to convince its citizens on why AUKUS matters, how it affects them and what it changes about the Australian way of life—because the fact of the matter is that, without Australia’s younger populace on board, literally and figuratively, the future of AUKUS is uncertain.
Australia needs to engage its youth population around AUKUS (2024-10-03)
Early misinformation identified a mentally ill man who stabbed 14 people in Sydney on April 13, killing six, as a Muslim or Jewish extremist. These falsehoods highlight the commonplace way in which Muslim and Jewish communities are scapegoated in times of crisis. To improve social cohesion in Australia, we must do more to prevent such instances of xenophobic and religious stereotyping.
Soon after the horrific events at Bondi Junction in Sydney's east, social media sites including X, Facebook, Instagram, TikTok and Reddit were the main conduits for the rapid spread of misinformation. X accounts with large followings, and even a British television presenter, initially alleged the attacker was an Islamist terrorist. Some social media users also suggested the attacker was anti-Semitic, speculating that an area with a high Jewish population was deliberately targeted and that the attack was somehow connected to the Israel-Hamas war.
Others speculated online that the attacker was Jewish. Pro-Kremlin Russian-Australian Simeon Boikov was one key social media figure who amplified this narrative. Boikov is being sheltered by the Russian Consulate in Sydney to avoid arrest for assault. Within a matter of hours, his false claims had reached hundreds of thousands on X and Telegram and were even repeated by a national news outlet.
While the attacker may in fact have been acting on feelings of misogyny, misinformation about his motivation and his supposed background as either Muslim or Jewish quickly spread far and wide. Some social media figures who spread this misinformation apologised for their incorrect assumptions, but many other posts and comments remain online.
This misinformation is deeply problematic and harmful to Muslim and Jewish Australians. It is not uncommon for religious minorities to face retaliation after bouts of violent crime and extremism, or in response to wider geopolitical events abroad. In the first seven weeks of the Israel-Hamas war last year, there was a thirteen-fold increase in reports of Islamophobia made to the Islamophobia Register Australia. In October and November of last year, the Executive Council of Australian Jewry documented 662 anti-Semitic attacks in Australia.
As made clear in the 2024 threat assessment of the Australian Security Intelligence Organisation, religiously motivated violent extremism remains a real threat in Australia. But in no circumstance should faith communities at large be vilified and harassed for the acts of violent criminals and extremists. Religiously and racially motivated hate is antithetical to Australia’s aim of furthering social cohesion and the safety of all Australian citizens.
As a society we need to be better prepared to prevent the deterioration of social cohesion in response to violent criminal and extremist activities. We saw similar threats to social cohesion emerge in the aftermath of a knife attack at an Assyrian church in Wakeley, Sydney, two days after the Bondi Junction attack. The Wakeley stabbing has similarly led to fears that Islamophobia may increase at the community level.
Regardless of background, all Australians should call out racial or religious hate as it arises. A good example of solidarity was the #illridewithyou campaign aimed at protecting Muslim Australians on public transport from potential backlash emerging from the 2014 Lindt cafe siege. This grassroots campaign on social media demonstrates the role that even ordinary Australians can play in contributing to a safer and more socially cohesive society.
Greater regulation of social media is also needed to prevent the spread of misinformation during times of crisis. A bill before the federal parliament would fine platforms for failure to comply with industry standards and codes of conduct in regulating misinformation and disinformation they carry.
The eSafety Commissioner's work in this space is also an important step towards tackling the proliferation of terrorist and violent extremist material and activity online. X will appear in court in May to determine whether it breached the law by failing to comply with the commissioner's notice to remove online footage of the Wakeley stabbing.
The online regulator has also recently issued legal notices to Google, Meta, X, WhatsApp, Telegram and Reddit to answer a series of detailed questions about how each is protecting Australians from terrorist and violent extremist material.
People will continue to turn to social media for communication and for accessing information in times of crisis. But as the two recent stabbings in Sydney show, more must be done to resist the widespread proliferation of misinformation during times of heightened social tension. Increasing the regulation of social media and encouraging ordinary citizens to act in opposition to racially and religiously based misinformation are straightforward strategies that will go towards building a more cohesive Australian society.
Recent stabbings highlight danger of online misinformation (2024-04-26)
Facebook is king in Papua New Guinea (PNG), but its reign may soon be over. This week Communications Minister Sam Basil, a regular Facebook user himself, announced that PNG would shut down the social media site for a month so that his department can research how the network is being used. While there are mixed signals about whether the ban is a certainty or a proposal under consideration, the intention is disturbing, and the enforcement of such a ban would set a dangerous precedent in our region.
Such a move—which would put PNG alongside China, Iran and North Korea—would completely upend the country's interconnected and diverse digital ecosystem that's relied on by the public, businesses and civil society.
Over the last five years, use of Facebook has grown more than fivefold, from 136,000 users to an estimated 730,000 active users today. Overall, total internet penetration in PNG is still low, hovering at around 11% of the 8 million population, but these numbers are growing quickly. An additional 110,000 active social media users jumped online in 2017–2018, and Facebook itself increased its PNG user base by 18% over the same period. The overwhelming majority of users, 92% to be exact, access the social network from a mobile phone.
Google searches show the extent of the country's Facebook engagement. After 'PNG', the second most googled search term by Papua New Guineans is 'Facebook', with the fifth being 'Facebook login'.
According to local media, the PNG government has said the Facebook shutdown ‘will allow information to be collected to identify users that hide behind fake accounts, users that upload pornographic images, users that post false and misleading information on Facebook to be filtered and removed’.
The government's concerns are all legitimate, and most countries face a similar set of issues. But it's important to keep in mind that fake accounts, pornography and misleading information are a problem for most networks—including YouTube, Reddit and Instagram—all of which have small user bases in PNG. And if pornography is the problem, the PNG government should start a conversation with Twitter: a wide range of PNG accounts appear to be using the microblogging site to push out pornography domestically and internationally.
This may shock some, but the signs have always been there. Over the past decade the government has threatened to block political blogs, announced a 'monitoring committee' tasked with identifying citizens who express views the government believes are 'subversive', and introduced vague regulations that civil society groups claim protect politicians from criticism. Under the country's overly broad ICT laws, charges can be laid, for example, if someone is judged to be causing annoyance, inconvenience or needless anxiety to another person via the 'improper use of ICT services'.
While the government has claimed the ban will be temporary, what if this doesn’t end up being the case? Even if there are current intentions to bring the network back online later this year, it’s just as easy, once political battles have been won, to put a ban extension on the table.
At the end of the day, it’s likely the PNG government’s skirmish with Facebook has more to do with reining in political debate than anything else. Only time will tell if this is an empty threat or the government really will flick the off switch. In the meantime, there are four issues that policymakers, industry and civil society must consider:
The PNG government can enact a ban on Facebook
Unfortunately, enacting such a ban isn't difficult. The PNG government may not have a well-resourced public service, including on ICT and cyber issues, but it only has to ask the country's telecommunications and internet service providers to block both www.facebook.com and the Facebook Messenger application in order to impose this ban.
This is very bad for the PNG economy
Partial or full internet takedowns can cost a country hundreds of millions of dollars. Banning Facebook will make it almost impossible for most PNG businesses to easily reach their customers. It will also cut off isolated communities from local civil society groups and disrupt the communication channels of a host of local and provincial governments. The PNG government's promise to look into creating a homegrown alternative social network is very unlikely to get off the ground. It would be expensive, resource-intensive and would require third-party assistance to gain any traction. The hardest part? Getting the public to actually use it.
This is also bad for Australia
With APEC around the corner (for more details, head to the PNG government's official APEC Facebook page), this sharp turn into cyber censorship is a setback for all Australian organisations with an interest in PNG. This is also a blow for the Australian government, which has invested significant public resources in both PNG and in its cyber diplomacy. With one of our most important bilateral partners threatening such blatant cyber censorship, it shows that there's a lot of hard and important work that DFAT must do close to home to convince our neighbours that a free and open cyber space is in their national interests. It's vital for the Australian government to link up with industry to encourage the PNG government away from cyber censorship that will be detrimental to both the economy and to PNG's hosting of APEC.
It’s terrible for PNG’s place in the world
A Facebook ban will of course stifle public debate and make it difficult for both local and regional media to report on the country. It will also make it difficult for the world to get a good glimpse into PNG and for Papua New Guineans overseas to connect back home. Intentionally advancing its own online isolation—in a world where it already struggles to attract international attention—is the very last thing that PNG needs.
PNG to push out Facebook, taking a sharp turn into cyber censorship (2018-05-30)
Last month, ASPI and the Australian Department of Defence co-chaired the inaugural Northeast Asia Defence and Security Forum in Sydney.
It was a wide-ranging discussion, with a particular focus on how all parties could engage each other and build trust in order to prevent more serious threats emerging—see the full report here and media coverage here and here.
Security implications of modern communication technologies in Northeast Asia (2013-10-22)
Welcome back for the second round-up of news and articles in the defence and strategy world, coming to you from Jakarta.
Being in Indonesia, I’ve been naturally thinking a lot about reform of the Indonesian military and Australian military engagement. Evan Laksmana’s 2011 paper on American military assistance and defence reform in Indonesia identifies how limits in the design of military-military engagement can seriously hinder long-lasting reform. And there are lessons here for Australia.
Social media is becoming an inextricable part of modern warfare. It’s also now part of a developing area of intelligence analysis called Dynamic Twitter Network Analysis which uses data from Twitter and other social media outlets to gauge public opinion in zones of insecurity and instability. And as the conflict continues between Israel and Hamas, both on the ground and (bizarrely) in the Twitterverse, here’s an Atlantic article that looks at whether this is a violation of Twitter’s terms of use.
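The basic idea behind such network analysis, inferring who is being amplified from interaction data, can be illustrated with a toy example. The handles and edges below are invented purely for illustration; real work would ingest platform API data and apply richer graph metrics than a simple count.

```python
from collections import Counter

# Toy retweet edges as (retweeter, original_author) pairs.
# All handles and data are invented for this sketch.
retweets = [
    ("a", "influencer1"), ("b", "influencer1"), ("c", "influencer1"),
    ("a", "influencer2"), ("d", "influencer2"),
    ("e", "b"),
]

# Counting incoming retweets approximates who is being amplified most,
# a crude stand-in for in-degree centrality in a retweet network.
amplification = Counter(author for _, author in retweets)
most_amplified, count = amplification.most_common(1)[0]
print(most_amplified, count)  # influencer1 3
```

Analysts layer timing, sentiment and community-detection techniques on top of this kind of graph to estimate opinion dynamics in contested information spaces.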
This New York Times article on the demographics of the US electorate contains some statistics on the views of Americans on the relative merits of capitalism and socialism which might be surprising—although they mirror the Lowy Institute’s findings that ‘just 60% of Australians say democracy is preferable to any other kind of government, and only 39% of 18 to 29 year olds’.
From the people who brought you Infinity Journal, a free peer-reviewed online journal on strategy, here’s the new issue of the Journal of Military Operations. You’ll need to sign up to view their articles but if the quality is anything like Infinity, it will be well worth the effort.
Sticking with a journal theme, the Australian Defence Force Journal has released its latest issue (PDF), including an article from Strategist contributor Albert Palazzo as well as pieces on Japanese subs for Australia, ANZUS, Australian influence in the South Pacific and UAVs.
The new issue of the Kokoda Foundation’s Security Challenges has articles on ballistic missile defence and China’s multilateral engagement so get your hands on a hard copy unless you can wait until it’s available online. Previous issues are available online and contain articles by most of the well-known names in Australian strategy discussions.
Natalie Sambhi is an analyst at ASPI and editor of The Strategist.
Today on Stop the World, David Wroe speaks with Casey Mock and Sasha Fegan from the US-based Center for Humane Technology. The CHT is at the forefront of efforts to ensure technology makes our lives better, and strengthens rather than divides our communities. They also produce the podcast, Your Undivided Attention—one of the world’s most popular forums for deep and serious conversations about the impact of technology on society. David, Casey and Sasha discuss the tragic case of 14-year-old Sewell Setzer, who took his life after forming an intimate attachment to an online chatbot. They also talk about persuasive technologies that influence users at deep emotional and even unconscious levels, disinformation, the increasingly polluted information landscape, deepfakes, the pros and cons of age verification for social media and Australia’s approach to these challenges.
Warning: this episode discusses mental health and suicide, which some listeners might find distressing. If you need someone to talk to, help is available through a range of services, including Lifeline on 13 11 14 and Beyond Blue on 1300 22 46 36.
Stop the World: Artificial intimacy, persuasive technologies, and how bots can manipulate us (2024-11-29)