The New Divide: Who Gets to Shape the Digital World, and Who Gets Shaped by It?

Tags:
Product Management, Innovation
Ahmad Karmi
June 8, 2025

Rethinking the Digital Divide

The concept of the "digital divide" has traditionally focused on disparities in access to digital technologies and the internet. While this framing remains relevant, it is no longer sufficient in an era characterized by artificial intelligence, algorithmic governance, and the global reach of digital platforms. The divide today is not only about who is online, but also about who possesses the power to shape the digital world, and who is passively shaped by it. As digital systems increasingly influence domains such as welfare distribution and cultural narratives, it becomes essential to scrutinize the asymmetries present in their design, deployment, and governance.

This article analyzes the shifting nature of digital inequality through the interdisciplinary perspectives of global development and technology anthropology. The central argument is that digital power remains concentrated among a limited set of geopolitical and corporate actors, often marginalizing the cultures, voices, and rights of those most affected by technological change. The discussion advocates for a reframing of digital development as a participatory, inclusive, and justice-oriented process.

Architecture of Digital Power: Centralization and Control

Digital infrastructure and governance are not neutral phenomena. The ownership of cloud servers, the dominance of platform ecosystems, and the establishment of standards in artificial intelligence are highly concentrated in the Global North, particularly within American and Chinese technology conglomerates. Amazon Web Services (AWS), Microsoft Azure, and Google Cloud together account for the majority of the global cloud infrastructure market. At the same time, Silicon Valley's ideology of rapid, disruptive innovation continues to shape the ethos of digital development worldwide.

This architecture of power is further reinforced by geopolitical control over internet infrastructure, including subsea cables, data centers, and satellite systems, as well as regulatory frameworks that privilege Western standards. The European Union's General Data Protection Regulation (GDPR) serves as a de facto global benchmark for privacy, yet its norms are often exported without sufficient adaptation to local contexts.

Such centralization prompts several critical questions. Who defines ethical artificial intelligence? Whose values are embedded in content moderation algorithms? Who determines what data is collected, stored, and monetized, and who benefits from the resulting data economy? These questions highlight the need for greater transparency and inclusivity in the governance of digital systems.

Global Subsea Cable Map from TeleGeography
https://www.submarinecablemap.com/

Designed Elsewhere, Deployed Everywhere

Technological systems are frequently designed in one region and deployed in another with limited cultural translation or local oversight. This transnational design process often results in systems that do not align with the lived realities of the communities they affect.

For example, algorithmic systems in social welfare programs have been implemented in countries such as India to detect fraud or prioritize benefits. These systems often operate with opaque logic and can have adverse consequences for marginalized populations. Biometric systems have excluded citizens from accessing rations due to recognition errors, while algorithmic scoring has reinforced existing caste and class biases.

In several African countries, facial recognition systems imported from China or the West have demonstrated high error rates for dark-skinned individuals. This issue has been well documented but remains insufficiently addressed. Furthermore, the extraction of training data for artificial intelligence from populations in the Global South, frequently without informed consent or compensation, reflects historical patterns of resource exploitation.

MIT Media Lab’s “Gender Shades” research
https://www.media.mit.edu/projects/gender-shades/overview/

Epistemic Inequality: Who Defines Digital Knowledge?

Epistemic inequality, understood as the unequal recognition and amplification of different forms of knowledge, constitutes a subtle yet pervasive form of digital asymmetry. Global platforms and machine learning systems often marginalize non-Western languages, cultural expressions, and epistemologies.

Large language models such as GPT-4 are trained primarily on English-language data. As a result, these systems encode and replicate Anglo-centric worldviews, frequently omitting or misrepresenting indigenous knowledge systems, regional dialects, and context-specific histories. This dynamic shapes not only how communities are represented, but also how digital systems model and reproduce human knowledge and behavior.

The concept of Digital Humanism posits that technology should reflect the full spectrum of human diversity, not only in demographic terms, but also ontologically. This requires a reconsideration of what constitutes valid knowledge, whose perspectives are prioritized in design processes, and how digital tools can elevate rather than homogenize human culture.

Participation or Tokenism? Inclusion in the Age of Platform Hegemony

Many technology companies and development agencies emphasize inclusion as a core value, yet in practice, this often amounts to token representation. Genuine participation requires co-design, shared governance, and the redistribution of power, rather than simply inviting communities to test pre-built tools.

The proliferation of "technology for good" initiatives often suffers from what anthropologist Lilly Irani terms "entrepreneurial citizenship." Such programs frame marginalized populations as users, consumers, or micro-entrepreneurs, rather than as citizens with rights and agency. This framing can result in interventions that depoliticize structural issues and shift responsibility for change onto individuals.

Nevertheless, examples of participatory alternatives exist. In Brazil, the Portão Digital initiative engaged favela residents in co-creating civic technology solutions. In Kenya, local developers are building platforms in indigenous languages to preserve cultural heritage. These efforts demonstrate the potential of community-driven innovation when provided with appropriate resources and autonomy.

Data Colonialism and the New Extractivism

Data has often been described as the new oil, but a more accurate analogy may be colonial resource extraction. Data colonialism refers to the appropriation of individuals' and communities' data by powerful actors without adequate compensation, consent, or governance.

Applications designed for use in the Global South frequently include surveillance features or data-sharing defaults that would be unacceptable in the West. Development projects that collect biometric or behavioral data under the guise of aid or efficiency often replicate historical patterns of exploitation.

Moreover, the commodification of data for artificial intelligence training, ranging from medical images to social media content, generates value that seldom returns to the communities from which it originates. This asymmetry is not only economic but also epistemological: communities are studied, categorized, and optimized, but rarely invited to define their own digital futures.

Toward Digital Development Justice

Bridging the new digital divide requires moving beyond access and inclusion to focus on justice. This shift entails several key actions:

  • Redesigning governance: Digital systems should be subject to democratic oversight, particularly in development contexts. Local communities must play a meaningful role in designing, implementing, and regulating digital interventions.
  • Supporting local capacity: Investments should be directed toward indigenous technology ecosystems that reflect local values, languages, and needs. Capacity-building should extend beyond technical training to include digital rights education, policy advocacy, and critical media literacy.
  • Decolonizing design: Critical design methodologies should be employed to surface underlying assumptions, challenge dominant paradigms, and invite pluralistic worldviews. Co-creation should become a standard practice.
  • Rebalancing epistemic power: Alternative knowledge systems must be promoted in data collection, artificial intelligence training, and content curation. This includes funding research in and by the Global South and supporting multilingual, multimodal platforms.

Reimagining the Digital Future

The future of digital development should not be determined solely by market logic or geopolitical interests. Instead, it must be co-created by the communities it aims to serve. Digital systems are not neutral tools; they are socio-technical artifacts that embody power, politics, and possibility.

Society stands at a pivotal juncture. One trajectory leads to further extraction, surveillance, and inequality. The alternative offers an opportunity to reclaim digital space as a commons, a domain for shared knowledge, participatory governance, and cultural flourishing.

To achieve this, it is essential to prioritize justice in the design, governance, and deployment of digital systems. Ultimately, those who shape the digital world will determine who has the opportunity to thrive within it.
