In light of recent revelations about Cambridge Analytica and the associated breaches of trust concerning Facebook and personal data, ISOC UK and the Horizon Digital Economy Research institute held a panel discussion on “Multi Sided Trust for Multi Sided Platforms”. The panel brought together representatives from different sectors to discuss trust on the Internet, focusing on consumer-to-business trust: how users come to trust the online services offered to them. Such services include, but are not limited to, online shopping, social media, online banking and search engines.
Hundreds of people attended this free event that offered a very diverse and fun programme with lots of hands-on activities and demonstrations on different topics: life sciences, astronomy, chemistry, physics, psychology, natural history, engineering, etc.
As part of the ESRC Festival of Science the UnBias research team will run two Youth Juries on Saturday 11th of November 2017 at the Nottingham Broadway cinema.
Saturday 11th November
10:00 – 12:00 and 14:00-16:00
UnBias Youth Jury: Who is in charge? You or the algorithm? A youth-led discussion about algorithm fairness.
The UnBias Youth Juries are interactive, participative events that allow young people to reflect on, understand and have a say about how the Internet works. Young people will be asked to consider, debate and share ideas about recommender systems such as Amazon’s and search engines such as Google or DuckDuckGo.
The UnBias Youth Jury will be highly interactive and it will showcase short video clips and scenarios as a way of sparking debate. It will be fun and engaging, and will allow a space for everyone to share their opinions and experiences.
During the event, participants will be invited to become part of a ‘jury’ that will reflect and offer advice on:
• Algorithms’ fairness and their relevance to the participants
• Filtering information from the Internet – how is it, or how can it be, done?
• How participants would like to manage their personal identity
• Youth-led educational tools and policy recommendations
• Ways of further engaging with young people in thinking about and acting upon algorithmic bias.
Participants will be asked to complete a short survey at the beginning and end of each ‘jury’ session.
The event will last 2 hours in total, with time allowed for refreshment breaks.
You will get the chance to add your voice to a high-profile campaign on digital rights, while contributing to the development of educational material that will be made available to participants, educators and guardians through the UnBias project (http://unbias.wp.horizon.ac.uk/). Your participation will also contribute to policy recommendations to be presented to the UK minister for internet safety and security.
For more information and registration email Elvira.Perez@Nottingham.ac.uk
Members of the UnBias team and the Digital Wildfire project from the Universities of Nottingham and Oxford were delighted to participate in Mozilla Festival (MozFest), which took place over the weekend of 28th-29th October 2017. The festival saw thousands of members of the general public, of all ages and nationalities, pass through the doors of Ravensbourne College to engage in a festival that aimed to promote a healthy internet and a web for all. Issues of digital inclusion, web literacy and privacy and security were some of the key topics that were discussed at the event.
How do you take care on the Internet? What are the dangers of online fake news and filter bubbles? What are appropriate punishments for hate speech and trolling?
These are questions we asked members of the public during the Curiosity Carnival at the University of Oxford on September 30th. The Curiosity Carnival formed part of European Researchers’ Night, celebrated in cities across Europe. Oxford ran a city-wide programme of activities across its universities, libraries, gardens and woods to give members of the public a chance to find out about real research projects and meet the people who conduct them.
On September 14th the US ACM organised a panel on Algorithmic Transparency and Accountability in Washington DC to discuss the importance of the Statement on Algorithmic Transparency and Accountability and opportunities for cooperation between academia, government and industry around these principles. Ansgar took part in the panel, representing the IEEE Global Initiative on Ethical Considerations for Artificial Intelligence and Autonomous Systems, its P7000 series of standards activities, and UnBias.
It is our great pleasure to welcome you to the 2nd UnBias stakeholder workshop this June 19th (2017) at the Wellcome Collection in London, UK.
In this workshop we will build on the outcomes of the previous workshop, moving from the exploration of issues to a focus on solutions.
Aims of stakeholder workshops
Our UnBias stakeholder workshops bring together individuals from a range of professional backgrounds who are likely to have differing perspectives on issues of fairness in relation to algorithmic practices and algorithmic design. The workshops are opportunities to share perspectives and seek answers to key project questions such as:
What constitutes a fair algorithm?
What kinds of (legal and ethical) responsibilities do internet companies have to ensure their algorithms produce results that are fair and without bias?
What factors might serve to enhance users’ awareness of, and trust in, the role of algorithms in their online experience?
How might concepts of fairness be built into algorithmic design?
The workshop discussions will be summarised in written reports and will be used to inform other activities in the project. This includes the production of policy recommendations and the development of a fairness toolkit consisting of three co-designed tools: 1) a consciousness-raising tool for young internet users to help them understand online environments; 2) an empowerment tool to help users navigate online environments; 3) an empathy tool for online providers and other stakeholders to help them understand the concerns and rights of (young) internet users.
Structure of the 2nd stakeholder workshop
The workshop will consist of two parts.
In the first part we will present a challenge: choosing which of four possible algorithms is most fair for a limited-resource allocation task. We will do this under two transparency conditions: 1. when only observations of the outcomes are known; 2. when the rationale behind the algorithm is known. We will conclude this part with a discussion of the reasoning behind our algorithm choices.
Having primed participants with some of the challenges of designing fair algorithmic decision systems, the second part will explore ideas and frameworks for an ‘empathy’ tool to help algorithmic system designers identify possible sources of bias in their system designs.
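To give a flavour of the kind of challenge involved, here is a minimal, purely illustrative sketch (not the workshop’s actual algorithms or task): two toy policies for allocating a fixed number of slots among applicants, compared on a crude min/max evenness score. The applicant names, the “self-reported need” input and the scoring function are all assumptions made for the example.

```python
# Illustrative sketch only: two hypothetical allocation policies for a
# limited-resource task, compared on a simple fairness score.

def allocate_equal(applicants, slots):
    """Give every applicant the same share, ignoring individual need."""
    share = slots / len(applicants)
    return {name: share for name in applicants}

def allocate_by_need(applicants, slots):
    """Split slots in proportion to each applicant's stated need."""
    total_need = sum(applicants.values())
    return {name: slots * need / total_need for name, need in applicants.items()}

def min_max_ratio(allocation):
    """Crude evenness score: 1.0 means a perfectly even split; lower is more uneven."""
    values = list(allocation.values())
    return min(values) / max(values)

# Toy data: applicant -> self-reported need (an assumption for illustration).
applicants = {"A": 1, "B": 2, "C": 5}
for policy in (allocate_equal, allocate_by_need):
    result = policy(applicants, slots=10)
    print(policy.__name__, result, round(min_max_ratio(result), 2))
```

Note that each policy can be defended as “fair” on different grounds (equal treatment versus proportionality to need), and that seeing only the outcomes, versus also knowing the rationale, can change which policy observers judge to be fairer — which is exactly the tension the two transparency conditions are designed to surface.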
Workshop schedule:
12:00 – 13:00 Lunch/informal networking
13:00 – 13:15 Brief introduction with update about the UnBias project & outline of the workshop
Privacy/confidentiality and data protection
All the workshops will be audio recorded and transcribed. This is to facilitate our analysis and ensure that we capture all the detail of what is discussed. We will remove or pseudonymise the names of participating individuals and organisations as well as other potentially identifying details. We will not reveal the identities of any participants (except at the workshops themselves) unless we are given explicit permission to do so. We will also ask all participants to observe the Chatham House Rule, meaning that views expressed can be reported elsewhere but that individual names and affiliations cannot.
On February 10th and 11th, UnBias participated in the 2017 Explorers Fair Expo at the Nottingham Broadway cinema to engage with parents, children and citizens of all ages in discussing the ways in which algorithms affect our lives.
On February 3rd a group of twenty-five stakeholders joined us at the Digital Catapult in London for our first discussion workshop.
The User Engagement workpackage of the project focuses on gathering together professionals from industry, academia, education, NGOs and research institutes in order to discuss societal and ethical issues surrounding the design, development and use of algorithms on the internet. We aim to create a space where these stakeholders can come together and discuss their various concerns and perspectives. This includes finding differences of opinion. For example, participants from industry often view algorithms as proprietary and commercially sensitive, whereas those from NGOs frequently call for greater transparency in algorithmic design. It is important for us to draw out these kinds of varying perspectives and understand in detail the reasoning that lies behind them. Then, combined with the outcomes of the other project workpackages, we can identify points of resolution and produce outputs that seek to advance responsibility on algorithm-driven internet platforms.
As part of the Explorers Fair Expo at the Nottingham Broadway cinema UnBias will run public engagement activities on Friday 10th and Saturday 11th of February 2017. All ages welcome.
Our program for the event is as follows:
Friday 10th February, 9:45 – 15:15
Drop in activity: Interacting with different web browsers & search engines – Do you care? E. Pérez-Vallejos, UoN
Hands-on exercises comparing the results returned by different browsers and/or search engines. Participants will be invited to enquire about and discuss their online preferences and/or concerns regarding algorithms, filtering systems, fairness and possible recommendations.
Saturday 11th February
12:45 – 13:15 Talk: Who is in charge? You or the algorithm? A. Koene, UoN
Looking for an answer to just about any question? Just look it up online. All the world’s information is available through search engines, social networks, news recommenders etc. Ever wondered how these systems select which information is relevant for you?
13:45 – 15:00 “UnBias” Youth Juries: A youth-led discussion about algorithm fairness. M. Cano, L. Dowthwaite, V. Portillo, UoN
Youth-led focus groups using different scenarios to prompt discussion about particular aspects of how the internet works (with a focus on algorithm fairness when interacting with automated systems), giving participants the chance to share their views and express their concerns.
Emancipating Users Against Algorithmic Biases for a Trusted Digital Economy