On February 10th and 11th, UnBias participated in the 2017 Explorers Fair Expo at the Nottingham Broadway cinema to engage with parents, children and citizens of any age in discussing the ways in which algorithms affect our lives.
On February 3rd a group of twenty five stakeholders joined us at the Digital Catapult in London for our first discussion workshop.
The User Engagement workpackage of the project focuses on gathering together professionals from industry, academia, education, NGOs and research institutes in order to discuss societal and ethical issues surrounding the design, development and use of algorithms on the internet. We aim to create a space where these stakeholders can come together and discuss their various concerns and perspectives. This includes finding differences of opinion. For example, participants from industry often view algorithms as proprietary and commercially sensitive, whereas those from NGOs frequently call for greater transparency in algorithmic design. It is important for us to draw out these kinds of varying perspectives and understand in detail the reasoning that lies behind them. Then, combined with the outcomes of the other project workpackages, we can identify points of resolution and produce outputs that seek to advance responsibility on algorithm-driven internet platforms.
As part of the Explorers Fair Expo at the Nottingham Broadway cinema UnBias will run public engagement activities on Friday 10th and Saturday 11th of February 2017. All ages welcome.
Our program for the event is as follows:
Friday 10th February, 9.45 – 15.15
Drop in activity: Interacting with different web browsers & search engines – Do you care? E. Pérez-Vallejos, UoN
Hands-on exercises comparing results when using different browsers and/or search engines, inviting participants to enquire into and discuss their online preferences and/or concerns regarding algorithms, filtering systems, fairness and possible recommendations.
Saturday 11th February
12.45 – 13.15
Talk: Who is in charge? You or the algorithm? A. Koene, UoN
Looking for an answer to just about any question? Just look it up online. All the world’s information is available through search engines, social networks, news recommenders etc. Ever wondered how these systems select which information is relevant for you?
13.45 – 15.00
“UnBias” Youth Juries: A youth-led discussion about algorithm fairness. M. Cano, L. Dowthwaite, V. Portillo, UoN
Youth-led focus groups using different scenarios to prompt discussion about particular aspects of how the internet works (with a focus on algorithmic fairness when interacting with automated systems), giving participants the chance to share their views and express their concerns.
Aims of stakeholder workshops
Our UnBias stakeholder workshops bring together individuals from a range of professional backgrounds who are likely to have differing perspectives on issues of fairness in relation to algorithmic practices and algorithmic design. The workshops are opportunities to share perspectives and seek answers to key project questions.
The workshop discussions will be summarised in written reports and will be used to inform other activities in the project. This includes the production of policy recommendations and the development of a fairness toolkit consisting of three co-designed tools: 1) a consciousness raising tool for young internet users to help them understand online environments; 2) an empowerment tool to help users navigate through online environments; 3) an empathy tool for online providers and other stakeholders to help them understand the concerns and rights of (young) internet users.
The case studies
We have prepared four case studies concerning key current debates around algorithmic fairness. These relate to: 1) gaming the system – anti-Semitic autocomplete and search results; 2) news recommendation and fake news; 3) personalisation algorithms; 4) algorithmic transparency.
The case studies will help to frame discussion in the first stakeholder workshop on February 3rd 2017. Participants will be divided into four discussion groups with each group focusing on a particular case study and questions arising from it. There will then be an opportunity for open debate on these issues. You might like to read through the case studies in advance of the workshop and take a little time to reflect on the questions for consideration put forward at the end of each one. If you have a particular preference to discuss a certain case study in the workshop please let us know and we will do our best to assign you to that group.
To aid discussion we also suggest the following definitions for key terms:
Bias – unjustified and/or unintended deviation in the distribution of algorithm outputs, with respect to one or more of its parameter dimensions.
Discrimination (should relate to legal definitions regarding protected categories) – unequal treatment of persons on the basis of ‘protected characteristics’ such as age, sexual identity or orientation, marital status, pregnancy, disability, race (including colour, nationality, ethnic or national origin), religion (or lack of religion). This includes situations where a ‘protected characteristic’ is indirectly inferred via proxy categories.
Fairness – a context dependent evaluation of the algorithm processes and/or outcomes against socio-cultural values. Typical examples might include evaluating: the disparity between best and worst outcomes; the sum-total of outcomes; worst case scenarios.
Transparency – the ability to see into the workings of the algorithm (and the relevant data) in order to know how the algorithm outputs are determined. This does not have to require publication of the source code, but might instead be more effectively achieved by a schematic diagram of the algorithm’s decision steps.
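The three example fairness evaluations listed above (best–worst disparity, sum-total of outcomes, worst case) can be made concrete with a small sketch. This is purely illustrative and not part of any UnBias tool; the outcome scores and algorithm names below are invented for the example:

```python
# Illustrative sketch of the three example fairness evaluations.
# Outcomes are hypothetical per-user benefit scores (higher is better).

def disparity(outcomes):
    """Gap between the best and worst outcome across users."""
    return max(outcomes) - min(outcomes)

def sum_total(outcomes):
    """Aggregate welfare: the sum of all users' outcomes."""
    return sum(outcomes)

def worst_case(outcomes):
    """The worst outcome any user receives."""
    return min(outcomes)

# Two hypothetical algorithms distributing outcomes over three users.
algorithm_a = [90, 80, 20]   # higher total, but one user fares badly
algorithm_b = [60, 60, 50]   # lower total, more even distribution

for name, outcomes in (("A", algorithm_a), ("B", algorithm_b)):
    print(name, disparity(outcomes), sum_total(outcomes), worst_case(outcomes))
```

Note how the evaluations can disagree: algorithm A scores better on the sum-total criterion (190 vs 170), while algorithm B scores better on disparity (10 vs 70) and on the worst case (50 vs 20) — which is exactly why fairness is described above as a context-dependent evaluation against socio-cultural values.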
Privacy/confidentiality and data protection
All the workshops will be audio recorded and transcribed. This is to facilitate our analysis and to ensure that we capture all the detail of what is discussed. We will remove or pseudonymise the names of participating individuals and organisations as well as other potentially identifying details. We will not reveal the identities of any participants (except at the workshops themselves) unless we are given explicit permission to do so. We will also ask all participants to observe the Chatham House Rule – meaning that views expressed can be reported back elsewhere but that individual names and affiliations cannot.
The UnBias team is pleased to announce the launch of a ground-breaking report that articulates the voice of children and young people, and their relationship to the internet and digital technologies.
This report is titled ‘The Internet on Our Own Terms: How Children and Young People Deliberated about their Digital Rights’ and describes the work carried out since April 2015 in which young people aged between 12 and 17 gathered together in the cities of Leeds, London and Nottingham to participate in a series of jury-styled focus groups designed to ‘put the internet on trial’. In total, nine juries took place, involving 108 young people – approximately 12 participants per jury.
The first youth jury sessions of the UnBias project took place last weekend and were highly interesting and thought provoking. Despite the cold and rainy weather, we had a great turnout with nearly 30 young people choosing to attend. Our youth jurors mostly ranged in age from 13 to 18 and took part in two interactive activities.