Bamidele Farinre, PRECIOUS finalist and biomedical scientist

From PRECIOUS Finalist to Shaping UK AI Policy: Bamidele Farinre’s Full-Circle Journey

Bamidele Farinre’s story is one of recognition turned into influence. A finalist in the 2023 PRECIOUS Awards Leadership category, sponsored by The Open University, and again a finalist in 2025 for Outstanding Woman in STEM, she has used those moments of visibility as a springboard into wider advocacy. Today, her work is helping to shape conversations around one of the most urgent issues of our time: how to ensure artificial intelligence is fair, inclusive and accountable.

As a Chartered Biomedical Scientist and Agile practitioner, Bamidele’s work brings together science, leadership and problem-solving. Her professional journey has taken her from frontline healthcare to policy influence, with each stage deepening her understanding of how technology can either reinforce inequality or help dismantle it.

Through her submission, "Ensuring AI Equity for a Democratic UK Foundation," she has contributed to a new 2026/27 project under the *APPG on Diversity and Inclusion in STEM, focused on AI equity, biased algorithms and gendered harms.

A PRECIOUS Journey

For Bamidele, this is more than a policy development. It is a full-circle moment.

“My journey with the PRECIOUS Awards has been a powerful catalyst for my advocacy,” she says. “In 2023, I was honoured as a finalist in the Leadership category, which validated my ‘No Ceiling’ approach to guiding diverse teams. Building on that momentum, I was named a finalist in the 2025 PRECIOUS Awards for Outstanding Woman in STEM. This recognition bridged the gap between my technical expertise in biomedical science and my commitment to AI equity.”

That link between tech know-how and doing right by people is at the core of her story. AI is usually talked about in terms of innovation, speed and efficiency, but Bamidele is asking the harder questions that matter: who gains from it, who gets left out, and who gets hurt when these systems are built on limited data and shaky assumptions?

The Problem with Biased AI

Her biomedical science background gives Bamidele a practical understanding of how AI bias shows up in real life. When AI systems are trained on limited data, they may not work well for people with different skin tones, health conditions or gender-related markers. In healthcare, recruitment and public services, that can lead to unfair decisions and missed opportunities. For Bamidele, these are not one-off mistakes, but signs of a deeper design problem.

“These ‘data deserts’ lead to biased outputs that create real-world harms for all underrepresented groups in clinical decision-making,” she explains. “We must ensure that the ‘invisible ceilings’ of previous generations are not hard-coded into our automated systems.”

That language of invisible ceilings is important. It reflects the way Bamidele sees AI: not as a neutral tool, but as a system that can silently reproduce exclusion unless it is designed with care, accountability and diversity in mind. Her APPG submission was shaped by that concern, but also by a broader vision of what fair technology should look like.

She believes her idea got noticed because it didn’t just talk about the problem; it pushed for real accountability built into the system. Instead of treating fairness in AI as a nice-to-have, she made the case that it’s something democracy actually depends on.

“I believe my proposal resonated because it moved beyond abstract theory into the realm of structural accountability,” she says. “I addressed the ‘hidden curriculum’ of technology: how algorithms can unintentionally gatekeep access to essential services like healthcare, jobs and justice.”

That shift in thinking matters. When technology shapes who gets a job, who gets healthcare, who gets stopped by security and whose voice reaches the people in charge, bias stops being just a tech problem. It becomes a social problem, a legal one and a moral one too. Bamidele’s work on the APPG project has helped move the conversation on from asking whether AI can be fair, to working out how fairness has to be built in from day one.

The project itself will tackle some of the most pressing systemic harms in the AI landscape, including algorithmic glass ceilings in recruitment, diagnostic bias in healthcare and facial recognition inaccuracies. These issues affect many underserved communities, but Bamidele is particularly concerned about the way they impact people whose identities are already underrepresented in data and policy.

“When the data does not see the full breadth of our population, the system cannot serve us,” she says. “We are moving the conversation away from viewing these as ‘edge cases’ and toward recognising them as fundamental design flaws.”

Why Perspective Matters

Her perspective is shaped not only by professional experience, but also by identity and migration. As a Nigerian-born, UK-raised Black woman, she describes her leadership as being informed by a dual lens.

“My heritage gives me a global perspective on access, while my UK career has taught me how to dismantle institutional barriers from within,” she says. “I don’t view AI governance as a purely technical challenge; I view it as a civil rights frontier.”

It is a powerful reminder of why her voice matters right now. In the wake of International Women’s Day, Bamidele argues that the conversation must move beyond celebration and into structural change. Representation, in her view, is not about symbolism. It is about shaping systems that reflect the diversity of the society they serve.

A Vision for Fairer Policy

“Policy is only as robust as the perspectives shaping it,” she says. “If UK AI policy is designed by a homogenous group, it will inevitably contain blind spots that leave millions behind.”

Her advice to others in the PRECIOUS community who want to influence policy is equally clear and grounded in encouragement.

“Your excellence is your entry ticket, but your perspective is your innovation,” she says. “Do not wait for an invitation to influence policy; your lived experience is a form of expertise that the government and technological sectors desperately need.”

That message feels especially relevant in a year when AI is shaping more and more of everyday life. Bamidele’s journey shows that influence does not begin only in Parliament or boardrooms. It can begin with professional excellence, with community insight, and with the courage to name what is broken.

From PRECIOUS finalist to contributor to UK AI policy, Bamidele Farinre is showing what becomes possible when women of colour in STEM are given the space not just to participate, but to lead. Her story is a reminder that the future of technology should not be left to chance. It should be built with fairness, accountability and inclusion at its core.

*The APPG on Diversity and Inclusion in STEM is a UK parliamentary group that works to improve the inclusion and progression of people from diverse backgrounds in science, technology, engineering and maths.

Follow Bamidele on Instagram
