Technology and ethics

“Data literacy is about building informed, active citizens”

Irene Mwendwa is the executive director of the civil-society organisation Pollicy, an Africa-centred feminist collective working at the intersection of data, design and technology. In this interview, she outlines her vision of ethical tech and talks about her experiences in the fight against disinformation and for the empowerment of Africans, especially women, so that they can defend themselves against unjust policies.
Disinformation and fake news are rampant online, and digital media literacy is more important than ever. Photo: picture alliance/Hans Lucas/Martin Bertrand

Irene Mwendwa interviewed by Milena Kaplan

What challenges related to disinformation and artificial intelligence (AI) are Kenya and the Global South currently facing?

In Kenya and much of the Global South, disinformation spreads quickly – especially on platforms like Facebook or TikTok, where algorithms reward the most sensational content. With low digital literacy and limited fact-checking in local languages like Kiswahili or Sheng, false narratives often go unchecked, sometimes even fuelling ethnic tensions or influencing elections. On the AI front, we’re dealing with tools built on biased datasets that don’t reflect African realities, leading to things like facial recognition technology that struggles with darker skin tones. Add to that weak AI regulations and extractive data practices (like gig workers in Nairobi being underpaid to train global AI systems), and it’s clear that we’re not just facing a tech problem, but a justice problem.

Why is data literacy so crucial for limiting the impact of disinformation?

Data literacy gives people the power to pause, question and verify, rather than just click “share”. It helps communities understand how data can be twisted or misused – whether it’s spotting a deepfake or calling out skewed statistics. More importantly, it builds confidence to challenge platforms or policies that aren’t working in people’s favour. We’ve seen this in action through Pollicy’s initiatives such as our Digital Safe Tea card game, which makes these concepts accessible and engaging, especially for youth and community groups.

Why is it often difficult to promote data literacy on a broad scale – especially in the Global South?

It’s hard because the odds are stacked against the people who need it most. Many schools don’t have proper internet access or up-to-date curricula, and in rural areas, girls in particular may not have much exposure to digital tools. Then there’s the language gap – so many resources are in English or French, leaving out huge swathes of the population. There’s also a deeper distrust of institutions, rooted in a history of exploitation.

Are there any success stories where data literacy has made a real difference?

Absolutely. One standout is Pollicy’s Afrofeminist Internet Scorecard, which gave African women from seven different countries and LGBTQ+ communities a way to rate how digital platforms treat them, leading to real policy conversations in countries like Uganda and Kenya. Another exciting initiative is the Voice Data Literacy Training Program, which focuses on capacity-building through a hands-on course designed to help young people gain essential data skills. This free course teaches learners how to collect and analyse data, manage it effectively and visualise it using tools like Microsoft Excel and Google Sheets. It’s particularly impactful in enabling youth to prepare professional policy briefs and communicate research findings clearly to decision-makers. To support the learning process, participants get access to real-world datasets.

How does all of this relate to civic participation and inclusive engagement?

At its core, data literacy is about building informed, active citizens. When people understand how data shapes their lives – from how they’re scored for loans to what content they see online – they can speak up, organise and demand better. This kind of engagement is essential for inclusive governance. Projects like Pollicy’s Dear Tech Diary show the power of bringing everyday voices into conversations about technology and accountability. 


What role can development organisations – such as UNDP (UN Development Programme) – play in promoting inclusive digital ecosystems and ensuring that no one is left behind in the age of AI?

Organisations like UNDP can make a real difference by funding grassroots innovation and trusting local experts. Pollicy’s work is a perfect example of what happens when communities take the lead, participating in the co-creation of resources like the Africa Data Governance Knowledge Hub that are tailored to real needs. Beyond funding, development organisations should ensure that women, youth and rural communities are not only included in policymaking, but at the centre of it. They also have the power to push for global standards that put people first, not just profits or technical efficiency.

How should AI systems be trained to reflect diverse realities, and what role do data quality and representation play?

It starts with who gets to shape the data. Instead of scraping content without consent, why not involve communities in creating and vetting datasets? That’s something Pollicy has emphasised in our work with gig workers and digital workers, through our Fair Digital Kazi Manifesto. Quality is more important than quantity: an AI system trained on 1,000 diverse, well-documented examples is better than one trained on 10,000 random ones. Localisation is also key: AI that understands Kiswahili slang or regional nuances will always outperform generic global models. And let’s not forget oversight. Civil society needs a seat at the table to ensure fairness and transparency.

The upcoming Hamburg Declaration on Responsible AI for the SDGs, launched by the UNDP and the German Federal Ministry for Economic Cooperation and Development (BMZ), is bringing together policymakers, civil-society representatives, researchers and industry leaders to ensure AI is deployed in a fair, inclusive and sustainable manner. The Declaration will be adopted at the Hamburg Sustainability Conference in June 2025, to which you have been invited as a speaker. What is your perspective on the Declaration in relation to ethical and sustainable AI?

The values behind the Hamburg Declaration – human dignity, inclusivity, sustainability – fit well with Pollicy’s feminist vision for ethical tech. But these kinds of declarations only matter if they actually lead to change on the ground. This means putting the voices of the Global South at the centre of rulemaking, creating accountability for past data exploitation and developing ways to measure impact through an intersectional lens, not just tech performance. Pollicy’s approach to rethinking power structures and centring care offers a practical way to make these principles a reality.

Links
pollicy.org
bmz-digital.global/en/hsc/

Irene Mwendwa is a lawyer and executive director of the civil-society organisation Pollicy. She advises governments, civil-society actors and multilateral organisations on digital inclusion, elections and technology policy.
info@pollicy.org 

Irene Mwendwa’s colleague Maureen Kasuku has contributed to this interview.
