Tanzania began regulating online content in 2015 with the Cybercrimes Act, which delineates what constitutes prohibited online content. Tanzania’s regulatory framework has recently been consolidated by the 2020 Online Content Regulations, which replace the 2018 Regulations, expand the scope of prohibited online content, and further reduce the deadline for platforms to remove prohibited content from twelve hours to two hours.
This entry covers:
- Tanzania’s regulatory framework
- Relevant national bodies
- Key takeaways for tech companies
- 2020 Online Content Regulations
- Cybercrimes Act of 2015
Tech Against Terrorism’s analysis and commentary
Risks for digital rights
As with many of the online regulations analysed in the Online Regulation Series Handbook, the 2020 Regulations, and the 2018 Regulations they supersede, have been criticised by digital rights advocates and legal experts for the risk they pose to users’ digital rights.
Vague definitions of prohibited content categories, a short removal deadline, and compulsory automated moderation solutions – in particular filtering tools – are all key trends in online regulation that have been criticised for promoting an overabundance of caution, whereby platforms over-remove and block legal or non-violent online content in order to comply with stringent legal requirements. According to AccessNow’s analysis of the 2020 Regulations, this risk of over-removal in turn endangers freedom of access to information; in AccessNow’s assessment, the 2020 Regulations cause online users to limit the information they share online (Tay, 2020).
With a removal deadline of only two hours, the 2020 Regulations are amongst the most severe online regulations globally, barely allowing tech companies time to detect and remove prohibited content. Elsewhere, the most stringent legal requirements to act on prohibited content at speed are usually limited to responses to requests from competent authorities and/or to certain categories of content, such as terrorist content. The one-hour removal deadline in the EU Regulation on addressing the dissemination of terrorist content online, for instance, applies only to requests to remove terrorist content sent by competent authorities; similarly, Indonesia and Turkey require platforms to remove prohibited content within two and four hours respectively when the request comes from a competent authority. The 2020 Regulations, however, turn the short removal deadline into a blanket policy for all actioning of prohibited content as well as for proactive monitoring: beside responding to removal requests from competent authorities within two hours, tech companies must detect potentially prohibited content uploaded by users, assess whether it is in fact prohibited under the Regulations, and notify users that their content is prohibited so that they can remove it or, if necessary, remove the content unilaterally. A two-hour removal requirement is nearly impossible for most tech companies to comply with, particularly for smaller and emerging platforms, and therefore risks platforms removing non-prohibited content in order to comply with the law.
The 2020 Regulations also prohibit the use or distribution of tools to access prohibited content, including Virtual Private Networks (VPNs). As raised in our analysis of Ministerial Regulation 5 in Indonesia, which includes a similar provision, VPNs are also used by ordinary users to protect themselves online. Prohibiting users from using VPNs for legal activities thus represents a significant risk to online security and privacy, in particular for political dissidents and marginalised communities whose safety depends on being able to maintain their anonymity online.
Online content and respect for “cultural sensitivities”
Beside requiring platforms to remove prohibited content, the 2020 Regulations also require them to account for the “trends and cultural sensitivities of the general public” (Aloys & Associate (2020), Online Content Regulations 2020) when moderating online content. This provision is reminiscent of similar legal provisions and legislative proposals in the Global South concerning tech companies’ lack of understanding of local contexts and languages when moderating non-English content. Mauritius, for instance, justified its 2021 amendments to the Information and Communications Technology Act by criticising tech platforms’ perceived lack of compliance with local laws and their insufficient moderation capacity in the local creole language, while in Nigeria the government’s decision to ban Twitter is said to have been underpinned by a “growing consensus within the government calling on Twitter to establish a local presence in order to grasp local context” (Abubakar Idris and Yinka Adegoke (2021), Inside Nigeria’s decision to ban Twitter, Rest of World). However, as they stand, the 2020 Regulations provide no information on what accounting for cultural sensitivities means in practice, nor on what practical steps are expected of platforms in this regard.