When Tech Meets Tyranny

“Online platforms that do not regulate anything create a lot of security issues for the users. The cases of doxing of Myanmar human rights defenders are quite high; this is not just about freedom but physical security of people and their lives.”

These words from Lana, a leading voice at Article 19, cut to the heart of a critical dilemma in our interconnected world: the relationship between the rapidly evolving tech sector and the repressive governments increasingly seeking control of digital spaces. As the head of civil society engagement with technology in the Asia-Pacific region, Lana brings a unique background—spanning the corporate, multilateral, and non-profit spheres—to the human rights implications of tech operations in authoritarian states like Myanmar, Vietnam, and China. Her insights, drawn from a 2024 report by Article 19, reveal an often unseen and ironic battle for freedom of expression and privacy playing out across digital frontiers.

The digital space, with its promise of free information flows, is increasingly under siege in numerous Asian countries. Authorities, far from merely enacting laws to limit expression, are increasingly outsourcing the enforcement of these oppressive mandates directly to private tech companies. In Myanmar, for instance, since the February 2021 coup, the junta has introduced a cybersecurity law that compels telecommunications and ICT companies to comply with its requirements.

As Lana explains, a critical nuance lies in the military's strategy: it's not simply removing content or arresting people directly. Instead, it leverages legislation to shift the burden of censorship and surveillance onto the tech sector. This creates an impossible bind for tech companies, particularly those with a physical presence and employees in the country. The reality of internet shutdowns, exacerbated by conflicts and natural disasters, further complicates the issue, moving beyond mere content moderation to a complete denial of access.

The case of Telenor, the Norwegian telecom giant, serves as an illustration. Faced with demands to hand over sensitive user data to the military post-coup, Telenor chose to exit Myanmar, citing ethical reasons. Yet the extent of the data handed over during its departure remains a murky question, raising concerns about the privacy and safety of millions.

Content moderation by major tech platforms, moreover, is intricate and often contradictory. While companies like Meta (Facebook) have publicly taken a stance against the Myanmar junta (such as suspending content removal requests from regime agencies and blocking pro-military profiles), their actions are not without criticism. Lana notes that while Meta's products were blocked by the junta for allowing pro-democracy messaging, the State Administration Council (SAC) ironically continued to use Facebook for its own press conferences to spread its propaganda.

The effectiveness and ethics of content moderation are consistently questioned. Lana highlights the significant challenges platforms face due to linguistic diversity, particularly in countries like Myanmar, which alone has over 100 languages. This makes it incredibly difficult to identify hate speech or misinformation, especially when only “a small and very tiny group of people knows the Rakhine language, and is able to identify those specifics of hateful speech.” The contrasting approaches of platforms like Telegram, which Lana believes operates with virtually no moderation, and Meta, with its evolving and often inconsistent community standards, indicate the erratic nature of current industry practice. She suggests that content moderation policies can appear driven more by “public opinion in Western countries and public pressure” than by objective ethical principles.

The absence of comprehensive privacy and data protection laws in countries like Myanmar, Cambodia, and Laos leaves users vulnerable. This legislative gap allows “certain companies to take advantage of this, as that means money - more ads, for example” by collecting excessive user data, she says, urging platforms to collect only the data they need so as not to put users, especially human rights defenders, at physical risk.

Despite the challenges, civil society plays an essential role in advocating for digital rights and holding tech companies accountable. Lana calls for a human rights-centric approach, urging companies to conduct comprehensive human rights due diligence and impact assessments before deploying services. Transparency and responsiveness are key: “We want the private sector to be more transparent and responsive when things are flagged by civil society, and solve those issues together, without a fear of being given to their authorities.” Building trusting relationships, she argues, is essential.

A single, universal global policy for content moderation might be impractical given diverse cultural contexts. Yet she stresses the importance of the UN Charter as a guiding framework: “Platforms have to have principles which are compliant with international human rights standards: right to information, right to access, right for freedom of expression.”

Interestingly, tech companies have demonstrated a capacity to push back against oppressive government demands, particularly when facing reputational risk that could impact their profits and user base. When cases of human rights violations, like the spread of hate speech against the Rohingya or the detention of activists, are amplified and widely known, companies begin to feel compelled to act. For Lana, however, this willingness is often inconsistent: “We always have to consider that it is the private sector, it's a business, and always goes for profits and is then about other policies.”

Ultimately, Lana's message is one of advocacy and the power of collective action. Despite the complexities, she firmly believes that “we should not stop and do nothing as a civil society, just because it's complicated!” This responsibility lies not just with governments or corporations, but with users themselves. “We need more people to understand the importance of the private sector, especially the tech sector, as we are the users of those services, and we want to have services which are safe for us, not only through the digital security side but also from our freedom’s side.” This call to action serves as a reminder that the fight for digital rights is a shared endeavor, requiring sustained pressure and collaboration to build a truly safe and free online world.