Verified News Network: Building trust in Indigenous communities with AI

Project: Indian Country Chat Bot

Newsroom size: 10–20

Solution: An AI-powered chatbot that combats misinformation and supports Native communities with accurate, trustworthy information


In an era of widespread misinformation, Verified News Network (VNN) embarked on an ambitious project: to combat the spread of false information within Indigenous communities in northeast Oklahoma, USA. 

The problem: A flood of misinformation

For seven years, VNN has served Indigenous communities, witnessing firsthand the damaging impact of misinformation and disinformation. Brittany Harlow Tidwell, VNN Co-founder, says, "We found that one of the biggest issues and challenges facing our community members is misinformation and disinformation, particularly relating to their communities." 

This problem is exacerbated by historical exploitation, in which information, resources, and even land have been extracted from Indigenous communities. "This is a centuries-long issue in Indigenous communities," explains Harlow Tidwell. That history adds a further layer of complexity to building trust and introducing new technologies. 

Furthermore, misinformation often serves to benefit external interests, with Harlow Tidwell noting, "Even really on the state level there is a lot of misinformation being weaponised to benefit the state and harm our tribes here in Oklahoma." 

The rise of AI presented a threat, but also an opportunity: to create a tool that could counter this tide of false narratives.

Building the solution: The Indian Country Chatbot

VNN conceived the "Indian Country Chatbot," an AI tool designed to provide accurate information sourced exclusively from Indigenous entities. These sources included tribal governments, Native-owned companies, and Native-led non-profit organisations. 

The team prioritised cultural respect and data sovereignty throughout the development process. As Co-founder Kelly Tidwell emphasises, "Everything we learned is closed source. Additionally, we got explicit permission from the different people that we got the data from." 

This commitment to ethical data handling, while making the process more challenging, was paramount. "It made our job a lot harder," Harlow Tidwell admits, "but it was very important to the entire process of being culturally respectful."

Instead of building a chatbot platform from scratch, the team opted for chatbot.com due to its speed, affordability, and closed library functionality. However, the development was not without its hurdles. A significant challenge arose from the limited number of contributors willing to share data for AI training, as many Indigenous organisations lacked AI policies or were outright resistant to the technology. This necessitated a shift towards original content creation, with the team developing "fact sheets" to train the chatbot. 

The team consisted of the two co-founders, alongside two reporters who assisted with research and writing, and a part-time web developer. Project management was facilitated through Coda, an online operations platform. 

Tidwell describes their process: "Whenever an unknown chat came in, we would load it into Coda to find the answers. Then we'd go to our reporters, who would point out any misspellings or areas needing cleanup. Once proofed, we'd plug it back into the chatbot for training."

This iterative approach ensured accuracy and quality.
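The review loop described above can be sketched in code. This is a purely illustrative model of the workflow, not chatbot.com's or Coda's actual API: the class and method names below (`ReviewQueue`, `FactSheet`, and so on) are hypothetical, and stand in for the real manual steps of logging an unknown chat, having a reporter proofread the answer, and feeding the proofed fact sheet back for training.

```python
from dataclasses import dataclass, field


@dataclass
class FactSheet:
    """One question-and-answer pair destined for chatbot training."""
    question: str
    answer: str
    proofed: bool = False


@dataclass
class ReviewQueue:
    """Hypothetical model of VNN's iterative review process."""
    pending: list = field(default_factory=list)
    training_set: list = field(default_factory=list)

    def log_unknown_chat(self, question: str, draft_answer: str) -> None:
        # Step 1: an unanswered chat is logged for research
        # (VNN used Coda for this stage).
        self.pending.append(FactSheet(question, draft_answer))

    def proofread(self, index: int, corrected_answer: str) -> None:
        # Step 2: a reporter corrects misspellings and cleans up the draft.
        sheet = self.pending[index]
        sheet.answer = corrected_answer
        sheet.proofed = True

    def promote_proofed(self) -> int:
        # Step 3: proofed fact sheets are moved into the training set
        # (i.e. "plugged back into the chatbot for training").
        ready = [s for s in self.pending if s.proofed]
        self.training_set.extend(ready)
        self.pending = [s for s in self.pending if not s.proofed]
        return len(ready)
```

The key design point the sketch captures is that nothing reaches the training set without a human proofreading pass, which is how the team kept quality control in the loop.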

The opportunities: Fostering dialogue and understanding

The Indian Country Chatbot, despite its initial challenges, has opened up new avenues for engagement and learning. The team sees opportunities to further develop the chatbot, focusing on community feedback sessions to understand local needs and concerns regarding AI. Harlow Tidwell states, "The next step is really to have some community feedback sessions to learn more about our communities." 

Another critical opportunity lies in collaborating with academic institutions. While some initial engagement occurred, further dialogue and the development of robust data-sharing policies are necessary to secure broader acceptance and contributions from the academic community. Ultimately, the chatbot can serve as a catalyst for awareness and education about AI within tribal communities. 

As Tidwell notes, "working on tribal communities raises awareness on the nature of AI and chatbots, which is critical because awareness is the big driver."

Lessons for newsrooms

VNN's journey offers valuable lessons for newsrooms looking to implement AI solutions, especially when collaborating with marginalised communities. 

  • Integrate community throughout development: Don't just inform the community. Actively prioritise and integrate community perspectives across the entire AI solution building process. Involvement helps ensure the tool meets their needs.

  • Value and correct assumptions: Understand that even with prior experience, your assumptions about a community's acceptance of new technology may be incorrect. Be open to having initial assumptions proven wrong and value that learning experience.

  • Address historical context and concerns: When collaborating with marginalised communities, you must cultivate a deeper understanding of their historical concerns. Be prepared to address resistance related to issues like the environmental impact of AI or data privacy, which can become significant sticking points.

Explore Previous Grantees' Journeys

Find our 2024 Innovation Challenge grantees, their journeys and the outcomes here. This grantmaking programme enabled 35 news organisations around the world to experiment and implement solutions to enhance and improve journalistic systems and processes using AI technologies.


The JournalismAI Innovation Challenge is organised by the JournalismAI team at Polis – the journalism think-tank at the London School of Economics and Political Science – and is powered by the Google News Initiative.