
South African app uses AI to combat gender-based violence after family tragedy


The brutal killing of a pregnant 19-year-old relative in 2020 near Cape Town drove Leonora Tima to create Grit, a digital platform offering support, evidence collection, and AI-driven guidance for survivors of abuse. The free app, developed with African communities, now has 13,000 users and processed 10,000 help requests in September alone.

The tragedy that sparked change

Tima's relative, nine months pregnant, was murdered and abandoned beside a Cape Town highway in 2020. The case went unreported by the media; such violence is so common in South Africa that it barely registers as news. The killer remains at large.

"Her death was treated as normal," Tima, a development-sector professional, told The Meta Times. "That silence became the impetus for Grit."

How Grit works: Three core features

1. Emergency recording and response: A single button triggers a 20-second audio recording and alerts a private call center. Operators assess the situation and dispatch local responders if needed. Tima notes misuse has been minimal: "People test us; we test the tech."

2. The Vault: A secure, encrypted digital space for storing photos, screenshots, or voice notes as dated evidence. "Abusers often destroy phones," Tima explains. "This protects critical proof for legal cases."

3. Zuzi, the AI chatbot: Launching this month, Zuzi acts as a non-judgmental "aunt figure": a warm, trustworthy confidant co-designed with community input. During trials, even perpetrators sought help for anger management, while male survivors used it to discuss their experiences. "AI removes fear of judgment," Tima says.

"We asked: Should Zuzi sound like a lawyer? A social worker? People wanted an aunt: someone warm they could trust."

Leonora Tima, Grit founder

Global attention, and caution

Grit's model has drawn international praise, including at October's Feminist Foreign Policy Conference in Paris, where 31 nations pledged to prioritize gender-based violence (GBV) solutions. Yet experts urge caution.

Lisa Vetten, a GBV specialist, warns AI chatbots risk oversimplifying complex trauma: "They can't replace human empathy. Survivors need to rebuild trust with people, not algorithms." She cites cases where bots gave incorrect legal advice, compounding victims' distress.

Lyric Thompson of the Feminist Foreign Policy Collaborative adds that gender biases in AI, which is often built by predominantly male, Western teams, can "entrench misogyny" if unchecked. A 2018 World Economic Forum report found just 22% of AI professionals were women.

"AI reflects historic data centered on white men. We need creators who are women of color, from the global south, from marginalized backgrounds. Only then can tech serve those who need it most."

Leonora Tima

South Africa's GBV crisis by the numbers

The country's femicide rate is five times the global average, per UN Women. Police data shows seven women murdered daily between 2015 and 2020. Tima's team surveyed 800 township residents, finding widespread distrust of police and fear of retaliation; some women even faced defamation suits after naming abusers on social media.

Funded by Mozilla, the Gates Foundation, and the Patrick McGovern Foundation, Grit aims to bridge gaps in a broken system. "Existing structures fail victims twice," Tima says. "First by allowing violence, then by silencing them."

The road ahead

While AI's role in GBV response remains debated, Tima insists the focus must shift to who designs the technology. "This isn't just about engineering," she says. "It's about representation."

As Grit expands, its model, a blend of African-led innovation, community co-design, and cautious AI integration, offers a potential blueprint for regions grappling with similar crises.
