About us

UnBias was an EPSRC research project (EP/N02785X/1) funded under the Trust, Identity, Privacy and Security (TIPS) call. The work is now being continued in the follow-on project called ReEnTrust.

The project was a collaboration between researchers from the University of Nottingham, the University of Oxford and the University of Edinburgh.

From September 2016 to December 2018, the EPSRC-funded project “UnBias: Emancipating Users Against Algorithmic Biases for a Trusted Digital Economy” examined the user experience of algorithm-driven internet services and the process of algorithm design. A large part of this work was a series of user group studies to understand the concerns and perspectives of citizens. UnBias produced policy recommendations, ethical guidelines and a ‘fairness toolkit’ co-produced with young people and other stakeholders. The toolkit includes educational materials and resources to support young people’s understanding of online environments and to raise awareness among online providers about the concerns and rights of young internet users. The project is relevant to young people, and to society as a whole, in ensuring that trust and transparency are not missing from the internet.

For more information about the project, see Our Mission. To get to know the people behind the project, read about the UnBias Research Team.

Awards

  • Emerald Real Impact Awards 2018 – ‘Highly Commended’
  • Shortlisted for the University of Nottingham Knowledge Exchange & Impact Awards 2018 in:
    – Policy Impact
    – Expert Commentator
  • EPSRC IAA Impact Exploration Award 2018
  • RCUK Digital Economy Theme ‘Telling Tales of Engagement’ Award 2017

Follow-on project

As part of the TIPS2 round of EPSRC funding, the UnBias team is continuing its work in this space through ReEnTrust, a project on Rebuilding and Enhancing Trust in Algorithms.
