For two years the UnBias project has been examining the user experience of algorithm-driven internet platforms and seeking answers to important questions such as:
Are algorithms ever neutral?
How might algorithmic systems produce unexpected outcomes that systematically disadvantage individuals, groups or communities?
How can we make sure that algorithmic processes operate in our best interests?
On October 1st 2018 we held our UnBias project showcase event. Seventy attendees from the fields of research, policy, law, industry and education came together at the Digital Catapult in London to hear and discuss our key project findings. We also highlighted our practical outputs, such as our policy guidelines and our Fairness Toolkit.
The event also included a lively panel debate and engaging presentations from external speakers about the social consequences of algorithmic biases and how they might be addressed. We ended the day by announcing plans for our follow-on project, ReEnTrust, which will identify mechanisms to rebuild and enhance trust in algorithmic systems.
The Nottingham UnBias team have finished running our Youth Juries, and we are delighted to announce the launch of a report, ‘Youth Juries – what we learned from you’, for the children and young people who participated. If you were one of these people – thank you for taking part! We hope that you enjoy reading the report. We would also like to thank our Youth Advisory Group for the thoughtful and constructive insights that helped to design and shape the report.
For a shorter summary of what we did in the Youth Juries and what the team discovered, please keep reading…
Our new video animation explains what algorithms are, how they shape our online browsing and how they can create risks of bias. It also describes how the UnBias project seeks to promote a future Internet that is free and fair for all. Watch it here!
Earlier this year the UnBias team ran its first Ethicon. An Ethicon is a new kind of event developed by members of the Human Centred Computing theme at Oxford. It works as a twist on the traditional hackathon: it is geared towards foregrounding ethical issues alongside design issues in the completion of a task.
In an Ethicon, teams work together to carry out a competitive design task. In addition to thinking about the technical features of their design, they are required to address the social and ethical implications of the particular technology involved. They are challenged to identify novel and creative solutions that embed ethical considerations into their design. Teams are interdisciplinary so that they can share expertise and learn from each other in a fun environment. They are then assessed by a panel of experts who judge the technical quality of their work alongside how well they have worked together to identify and address ethical concerns.
June was a month of conferences and workshops for UnBias. The 3rd UnBias project meeting on June 1st, hosted by our Edinburgh partners this time, was quickly followed by the Ethicomp and EuroDIG conferences which both took place from June 5th to 8th.
The workshop took place on February 3rd 2017 at the Digital Catapult centre in London, UK. It brought together participants from academia, education, NGOs and enterprises to discuss fairness in relation to algorithmic practice and design. At the heart of the discussion were four case studies highlighting fake news, personalisation, gaming the system, and transparency.
On February 3rd a group of twenty-five stakeholders joined us at the Digital Catapult in London for our first discussion workshop.
The User Engagement workpackage of the project focuses on gathering together professionals from industry, academia, education, NGOs and research institutes in order to discuss societal and ethical issues surrounding the design, development and use of algorithms on the internet. We aim to create a space where these stakeholders can come together and discuss their various concerns and perspectives. This includes drawing out differences of opinion. For example, participants from industry often view algorithms as proprietary and commercially sensitive, whereas those from NGOs frequently call for greater transparency in algorithmic design. It is important for us to draw out these kinds of varying perspectives and understand in detail the reasoning that lies behind them. Then, combined with the outcomes of the other project workpackages, we can identify points of resolution and produce outputs that seek to advance responsibility on algorithm-driven internet platforms.
The UnBias team is pleased to announce the launch of a ground-breaking report that articulates the voice of children and young people, and their relationship to the internet and digital technologies.
This report is titled ‘The Internet On Our Own Terms: How Children and Young People Deliberated about their Digital Rights’ and describes the work carried out since April 2015, in which young people aged between 12 and 17 gathered together in the cities of Leeds, London and Nottingham to participate in a series of jury-styled focus groups designed to ‘put the internet on trial’. In total, nine juries took place, involving 108 young people (approximately 12 participants per jury).
The first youth jury sessions of the UnBias project took place last weekend and were highly interesting and thought-provoking. Despite the cold and rainy weather, we had a great turnout, with nearly 30 young people choosing to attend. Our youth jurors mostly ranged in age from 13 to 18 and took part in two interactive activities.