The UnBias team are delighted to announce the launch of their new ‘Facilitator Booklet’, designed especially to help group leaders and others deliver workshops using the UnBias Awareness Cards.
As part of our ongoing collaboration with the UK England chapter of the Internet Society (ISOC-UK England), UnBias will run a workshop on:
Algorithmic awareness building for User Trust in online platforms
Time: Friday, November 30th 2018, 18:00 to 21:00 (UTC), London
Place: Cloudflare offices, 25 Lavington Street, Southwark, London
OUR FUTURE INTERNET: FROM BIAS TO TRUST
DIGITAL CATAPULT: OCTOBER 1ST 10.30 AM TO 5.00 PM
On October 1st the UnBias project team will be showcasing the outcomes of our work. We are looking forward to welcoming an audience of 70 stakeholders from research, law, policy, education and industry.
In addition to reporting on our major findings, we will also highlight key outputs such as policy guidelines and demonstrate our fairness toolkit. This engaging and interactive event will also include presentations from external speakers and opportunities for networking. Furthermore, we will announce plans for our follow-on project, ReEnTrust, which will identify mechanisms to rebuild and enhance trust in algorithmic systems.
The Fairness Toolkit has been developed for UnBias by Giles Lane and his team at Proboscis, with input from young people and stakeholders. It is one of our project outputs, aiming to raise awareness and stimulate a public civic dialogue about how algorithms shape online experiences, and to prompt reflection on possible changes to address issues of online unfairness. The tools are not just for critical thinking, but for civic thinking – supporting a more collective approach to imagining the future, in contrast to the individual atomising effect that such technologies often cause.
The toolkit contains the following elements:
2. Awareness Cards
5. Value Perception Worksheets
All components of the Toolkit are freely available to download and print from our site under a Creative Commons license (CC BY-NC-SA 4.0).
Download the complete UnBias Fairness Toolkit (zip archive, 18 MB)
Demonstrations of the toolkit will be given at the V&A Digital Design Weekend, London, September 22nd.
On the weekend of June 30th and July 1st, the UnBias team hosted a two-day hackathon at Codebase in Edinburgh, with support from local outfit Product Forge, whose experience in organizing such events is unrivalled in Scotland.
The hackathon challenge was formulated as follows:
“Artificial Intelligence shapes digital services that have become central to our everyday lives. Online platforms leverage the power of AI to monetize our attention, with often unethical side-effects: our privacy is routinely breached, our perception of the world is seriously distorted, and we are left with unhealthy addictions to our screens and devices. The deep asymmetry of power between users and service providers, the opacity and unaccountability of the algorithms driving these services, and their exploitation by trolls, bullies and propagandists are serious threats to our well-being in the digital era.
In response to the growing importance of algorithmic products in international trade, regional and international trade negotiations at the WTO and elsewhere are currently seeking to set down new rules regarding issues such as Intellectual Property and algorithmic transparency.
To help avoid trade-negotiation outcomes that inadvertently block algorithmic accountability, Ansgar is supporting Sanya Reid Smith of the Third World Network in her efforts to brief trade negotiators on the causes and consequences of algorithmic bias and on the current status of regulatory and standards initiatives to address these issues.
I was very pleased to present UnBias’ data at two great recent UK events addressing children’s safety, wellbeing and rights: the NSPCC annual conference, ‘How safe are our Children? Growing up online’, 20th–21st June in London, and the launch of the ‘Children, Rights and Childhood’ event on 22nd June in Birmingham.
We are pleased to announce that UnBias won one of the three 2017 RCUK Digital Economy Theme ‘Telling Tales of Engagement’ awards. The evaluation process for this award considered both the impact of our previous work and a proposed new activity to “tell the story” of our research.
Our submission was titled “building and engaging with multi-stakeholder panels for developing policy recommendations”, highlighting the importance to our research of engaging with our stakeholder panel and with organizations that are shaping the policy and governance space for algorithmic systems.
In light of the recent revelations about Cambridge Analytica and the breaches of trust regarding Facebook and personal data, ISOC UK and the Horizon Digital Economy Research institute held a panel discussion on “Multi Sided Trust for Multi Sided Platforms”. The panel brought together representatives from different sectors to discuss trust on the Internet, focusing on consumer-to-business trust: how users trust the online services offered to them. Such services include, but are not limited to, online shopping, social media, online banking and search engines.
Earlier this year the UnBias team ran its first Ethical Hackathon. These are a new kind of event developed by members of the Human Centred Computing theme at Oxford. They work as a twist on the traditional hackathon: by building on principles of responsible innovation, our ethical hackathons are geared towards foregrounding ethical issues alongside design ones in the completion of a task.
In an ethical hackathon, teams work together to carry out a competitive design task. In addition to thinking about the technical features of a design, they are required to address the social and ethical implications of the particular technology involved. They are challenged to identify novel and creative solutions that embed ethical considerations into their design. Teams are interdisciplinary so that they can share expertise and learn from each other in a fun environment. They are then assessed by a panel of experts, who judge the technical quality of their work alongside how well they have worked together to identify and address ethical concerns.