VERA Files: Using AI to create SEEK, a GenAI fact-checking search engine
Project: SEEK
Newsroom size: 21 - 50
Solution: A chatbot search assistant that uses VERA Files’ archive to answer fact-checking queries and track misinformation trends.
VERA Files, a Philippine media non-profit organisation, identified several bottlenecks in its fact-checking process. The first was monitoring misinformation. The second was creating a taxonomy of fact-checks, since it is a digital-only organisation. The last was incentivising users to engage with its database of over 5,400 fact-checks and counting.
From the many possible use cases, Celine Samson, Head of Online Verification at VERA Files, and her team decided to use AI to increase the number of people engaging with their fact-check database.
“We decided to do this because we already have the dataset available to train the AI on. Also, at the moment we're really on a mission to increase audience engagement with VERA Files,” said Samson.
The Problem: Engaging users to explore their database
Samson, an alumna of the 2022 JournalismAI Academy, said one of the reasons they decided to participate in the Innovation Challenge was that the team was now primed for it. It had just welcomed two developers to the organisation, in addition to its existing reporters and editors.
“I also learned from the Academy that apart from just having journalists, you actually do need people who know how to carry out the tech part of things,” she added.
Apart from this, AI was the next option after they had tried many other approaches to achieve their distribution and engagement goals.
“We've already tried so many other approaches to try to engage with our audience. We've had town halls and focus groups with readers. We've tried to optimise how we engage with people on social media, for example. But we also wanted to try and see what AI could do to help us with achieving these goals,” said Samson.
This combination of factors led them to create SEEK, a chatbot search assistant that draws on VERA Files’ archive to answer users’ fact-checking queries and questions about misinformation trends. It offers “Quick Answer” and “Think Deeper” settings that let users customise the level of detail they’d like in responses.
Building the solution: Aiding users in quicker fact-checks, enabling discovery
SEEK was inspired by other newsrooms’ and fact-checking organisations’ experiments in this space, said Samson.
“One of our main inspirations was Fátima of Aos Fatos. We actually talked to their director for innovation, Bruno Favero, in preparing for SEEK. We asked questions like: how did they develop Fátima? What were their considerations? How do they make sure that the answers Fátima comes up with are accurate?” They also drew inspiration from Encyclopedia Britannica’s chatbot and the San Francisco Chronicle’s Kamala Harris news assistant.
For user research, the team spoke with teachers, students, and disinformation researchers, all of whom had experience using AI systems and exhibited varying degrees of trust in them. They found that most users employed AI to launch research or to check grammar.
“The overwhelming feeling when we do our fact-checking training is sometimes people tend to get fed up especially when they have to keep fact-checking their parents, for example, who keep on falling for scams. That also became a target market I would say, the ‘fed up family fact-checkers’, how do we make their lives easier?” she explained.
Key priorities identified by users for SEEK's development included speed, clarity, transparency, accuracy, and localised language. Users also indicated they would not use the tool if the interface was too complex, if it was paid, if data breaches occurred, or if it failed to answer their questions.
In further research, they also found that most Filipinos were already using AI chatbots, and concluded that the learning curve for their product would not be too steep.
Technology stack
The technology underlying SEEK combines LangChain for the chatbot with custom-built features, including a retrieval-augmented generation (RAG) system. This came with its own challenges.
“None of us had experience building an AI tool from scratch. Our developers had some experience working with LangChain, but not in the capacity of creating a search assistant or a chatbot. It was something they had to learn from the bottom up. Also, we used RAG. So that was something that the tech team had to learn, and then they cascaded it to the rest of the team so that we were all on the same page.”
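The RAG pattern the team describes — retrieving relevant fact-checks from the archive, then grounding the model’s answer in them — can be sketched in plain Python. This is an illustrative sketch, not VERA Files’ actual implementation: the toy archive, the bag-of-words similarity, and the `build_prompt` helper are hypothetical stand-ins. A production system like SEEK would use dense vector embeddings and an LLM call, typically composed via a framework such as LangChain.

```python
from collections import Counter
import math

# Toy archive standing in for VERA Files' fact-check database (hypothetical entries).
ARCHIVE = [
    {"title": "Fact-check: Viral text scam promises cash aid", "verdict": "False"},
    {"title": "Fact-check: Claim about election results misleads", "verdict": "Misleading"},
    {"title": "Fact-check: Health claim about miracle cure", "verdict": "False"},
]

def tokenize(text):
    return [w.lower().strip(".,:?") for w in text.split()]

def cosine_sim(a, b):
    # Bag-of-words cosine similarity; a real RAG system would use dense embeddings.
    ca, cb = Counter(tokenize(a)), Counter(tokenize(b))
    dot = sum(ca[t] * cb[t] for t in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, k=2):
    # Rank archived fact-checks by similarity to the query; keep the top k.
    ranked = sorted(ARCHIVE, key=lambda d: cosine_sim(query, d["title"]), reverse=True)
    return ranked[:k]

def build_prompt(query, mode="quick"):
    # "quick" vs "deep" mirrors SEEK's Quick Answer / Think Deeper settings.
    docs = retrieve(query)
    context = "\n".join(f"- {d['title']} (verdict: {d['verdict']})" for d in docs)
    detail = "Answer in one sentence." if mode == "quick" else "Explain the reasoning in detail."
    return f"Using only these fact-checks:\n{context}\n{detail}\nQuestion: {query}"

print(build_prompt("Is the cash aid text message a scam?"))
```

The key design point is that the model only sees archived fact-checks as context, which is how a RAG system keeps answers anchored to the organisation’s verified reporting rather than the model’s own training data.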
Samson also shared that building SEEK alongside the team’s full-time duties proved challenging. In the end, she had to prioritise and re-prioritise tasks to reach their goals, both for the search assistant and for their non-grant-related work.
Impact on the organisation
The impact on VERA Files as an organisation has been positive, shared Samson, with the entire team “proud and excited” to test the tool. It also led them to work substantially on their AI policy.
“Some members were surprised and didn’t know you could use AI that way for journalism and then to see something that was created by VERA Files itself not by a partner, that really impressed them. Another impact was we already had a skeleton of an AI policy but creating SEEK also actually led us to review the AI policy to make it fit the direction we want to go in the future,” explained Samson.
SEEK was beta tested in September 2025, involving 77 people and receiving 39 feedback evaluations. It scored high in question comprehension, answer quality, natural language performance, and information satisfaction. Users particularly appreciated the design, citation of sources, and the "quick answers" versus "think deeper" settings, shared Samson.
Future iterations of SEEK would likely involve using a larger dataset. They would also explore partnerships with mainstream news organisations.
Lessons for the newsroom
Prioritise the problem, not the tool: “Don't put the cart before the horse," says Samson. Identify specific journalistic problems first, and then determine if AI is the most appropriate solution, rather than simply looking for a way to use AI.
Resist AI pressure and understand limitations: Don't feel forced to adopt AI just because it's a popular topic. It's crucial to understand what AI cannot do and ensure you use it appropriately to achieve a concrete goal.
Conduct deep user research: Cast a wide net in defining and detailing your target groups to thoroughly validate your product idea. For example, VERA Files included teachers, researchers, and students to ensure their work met a broad set of community needs.
Explore Previous Grantees’ Journeys
Find our 2024 Innovation Challenge grantees, their journeys and the outcomes here. This grantmaking programme enabled 35 news organisations around the world to experiment and implement solutions to enhance and improve journalistic systems and processes using AI technologies.
The JournalismAI Innovation Challenge is organised by the JournalismAI team at Polis – the journalism think-tank at the London School of Economics and Political Science – and is supported by the Google News Initiative.
