In partnership with the Global Internet Forum to Counter Terrorism (GIFCT), the Tech Against Terrorism project was officially launched in Jakarta on 6-7 November, hosted at Conclave, a co-working space. The first day focused on open dialogue with representatives from Indonesian startups and the GIFCT, with presentations in the morning and practical workshops in the afternoon. The second day focused on responses from government representatives, along with presentations on declassified threat assessments.
The key issues raised during the two-day event centred on the need to better understand what terrorism is, the importance of cross-sector collaboration to match expertise with capability, and the challenge of distinguishing what is ‘true’ from what is not, particularly with the rise of ‘fake news’.
In the morning session, an introduction by a representative of Facebook stated that “our platform will not be a useful environment for terrorism, or bad actors in total”, emphasising that “the company is driven by a strong sense of doing the right thing, to build communities, and to make sure these communities are operating in a safe environment”. A representative of Google agreed, emphasising the need for collaboration: “we recognise that we are not experts in this area, so are keen to hear from those who are”. An academic from the Indonesian Consortium for Religious Studies (ICRS) presented on the potential negatives of social media platforms, citing an increase in “agitation, hate speech, bullying, alternative facts, fake news, and hoaxes”, and emphasised the need for university-government collaboration.
In the discussion of Terms of Service, one audience member suggested that a key challenge was in making the “changes physical to its users”, as most users scroll straight past the guidelines instead of actually reading them. Most users see the document as a formality, rather than one that genuinely evolves in response to its external context. One challenge tech companies should address is how to make the document more ‘physical’: encouraging users to comply before violations occur, and minimising the possibility of violations ever being made.
A key request made of the Tech Against Terrorism team, particularly through our Knowledge Sharing Platform, was to compile a table of different intermediary laws on content regulation, outlining the ‘grey areas’ in legal distinctions globally. One audience recommendation on transparency reporting was to offer as much clarity as possible, including greater disclosure around the nature of government requests. This would improve community understanding of exactly what kind of access these governments are seeking.
In the panel focusing on tech company responses, a representative from Facebook emphasised that “physical distance is less of an issue” now, as actors can be “more fluid in where they carry out activities”. As such, she emphasised the need to build collaborative networks, much like those terrorist groups operate online, but utilised for good. An important question was then raised: attacks, particularly smaller-scale ones, are likely to become far more common, and society may become desensitised as a consequence. How do we then deal with this?
One suggested measure was counter-narratives, presented by a representative of Love Frankie. The presentation focused on the narratives themselves and on how information travels. In the example of IS content, information dissemination could be broken down into “active members – disseminators – sympathisers – potential recruits”. However, a problem faced by technology companies, as well as those who create counter-narratives, is targeting: how do you know whom to target? At a more macro level, how do you change someone’s behaviour? A presentation by Google/YouTube emphasised the importance of investing more in counter-speech, citing its ‘Global Voices’ fund as a successful example.
A presentation by representatives of the Australian government agreed, emphasising the importance of empowering local communities to craft positive narratives, and arguing that communications are required to undermine the “emotional and ideological appeal of extremist narratives”. They suggested that one measure of success would be the ability to pull target audiences away from extremist ideology and drive individuals towards a “more moderate, inclusive ideology consistent with Australian values and rule of law”. The target audience must be understood, with communications tailored accordingly.
Previous workshops have emphasised the need to address definitional challenges, especially around the nature of terrorism. The representative from ICRS disagreed, arguing that it was counter-productive to continue down the path of “trying to define who our enemies are”, a discussion that will never be resolved. Instead, he called on researchers and tech companies to look at indicators: vulnerability factors grounded in socio-economic reality that can assist in combating terrorism.
An audience question further explored the concept of radicalisation, with the panel accepting the need to further engage women, especially mothers, in countering radicalisation. The representative from Facebook agreed, noting that while a lot of work is done with NGOs and students, the strong role of women in preventing radicalisation needs to be further recognised. The theme continued into the second day, with a representative of the Indonesian Government discussing ‘self-radicalisation’ and the related challenges of tracing such individuals, particularly where encrypted platforms are used.
We want to thank Conclave for hosting us, and to all of our participants for their recommendations and insights.