The Nottingham UnBias team have finished running our Youth Juries, and we are delighted to announce the launch of a report, ‘Youth Juries: what we learned from you’, written for the children and young people who participated. If you were one of them, thank you for taking part! We hope that you enjoy reading the report. We would also like to thank our Youth Advisory Group for the thoughtful and constructive insights that helped to design and shape the report.
For a shorter summary of what we did in the Youth Juries and what the team discovered, please keep reading…
Continue reading Exciting News for our Youth Jurors!
OUR FUTURE INTERNET: FROM BIAS TO TRUST
DIGITAL CATAPULT: OCTOBER 1ST, 10.30 AM TO 5.00 PM
On October 1st the UnBias project team will be showcasing the outcomes of our work. We are looking forward to welcoming an audience of 70 stakeholders from research, law, policy, education and industry.
In addition to reporting on our major findings we will also highlight key outputs such as policy guidelines and demonstrate our exciting fairness toolkit. This engaging and interactive event will also include presentations from external speakers and opportunities for networking. Furthermore, we will announce plans for our follow-on project, ReEnTrust, which will identify mechanisms to rebuild and enhance trust in algorithmic systems.
Continue reading Looking forward to the UnBias Showcase! October 1st 2018, London
The Fairness Toolkit has been developed for UnBias by Giles Lane and his team at Proboscis, with input from young people and stakeholders. It is one of our project outputs, aiming to promote awareness, to stimulate a public civic dialogue about how algorithms shape online experiences, and to encourage reflection on possible changes to address issues of online unfairness. The tools are not just for critical thinking, but for civic thinking – supporting a more collective approach to imagining the future, in contrast to the atomising, individualising effect that such technologies often have.
The toolkit contains the following elements:
2. Awareness Cards
5. Value Perception Worksheets
All components of the Toolkit are freely available to download and print from our site under a Creative Commons licence (CC BY-NC-SA 4.0).
Demonstrations of the toolkit will be given at the V&A Digital Design Weekend in London on September 22nd.
More information is available on the Fairness Toolkit and TrustScapes pages.
Continue reading UnBias Fairness Toolkit
In response to the growing importance of algorithmic products in international trade, regional and international trade negotiations at the WTO and elsewhere are currently seeking to set down new rules regarding issues such as Intellectual Property and algorithmic transparency.
To avoid trade-negotiation outcomes that inadvertently block algorithmic accountability, Ansgar is supporting Sanya Reid Smith of Third World Network in her efforts to brief trade negotiators on the causes and consequences of algorithmic bias, and on the current status of regulatory and standards initiatives to address these issues.
Continue reading Briefing to RCEP eCommerce trade negotiators
On June 21st 2018, the KAIST Institute for Artificial Intelligence, Fourth Industrial Revolution Center in Korea hosted a public forum, “Taming Artificial Intelligence: Engineering, Ethics, and Policy”, to discuss the ethics of artificial intelligence technology development as well as policy making around the world.
Continue reading KAIST workshop on Taming AI: Engineering, Ethics and Policy
The Nottingham UnBias Team has been working with children aged 3–13 years, helping them learn how the internet works and reflect on issues of personal information and online filter bubbles by creating their own “data gardens”. We attended a Family Discovery Day on June 16th at the University of Nottingham and the STEM Festival at Bluecoat Beechdale Academy on Saturday June 23rd.
Continue reading How to create your own Data Garden!
We are pleased to announce that UnBias won one of the three 2017 RCUK Digital Economy Theme ‘Telling Tales of Engagement’ awards. The evaluation process for this award considered both the impact of our previous work and a proposed new activity to “tell the story” of our research.
Our submission, titled “Building and engaging with multi-stakeholder panels for developing policy recommendations”, highlighted how important engagement with our stakeholder panel – and with organizations shaping the policy and governance space for algorithmic systems – has been to our research.
Continue reading RCUK Digital Economy Theme ‘Telling Tales of Engagement’ award for UnBias
Our new video animation explains what algorithms are, how they shape our online browsing and how they can create risks of bias. It also describes how the UnBias project seeks to promote a future Internet that is free and fair for all. Watch it here!
In the wake of the revelations about Cambridge Analytica and the breaches of trust regarding Facebook and personal data, ISOC UK and the Horizon Digital Economy Research institute held a panel discussion on “Multi Sided Trust for Multi Sided Platforms”. The panel brought together representatives from different sectors to discuss trust on the Internet, focusing on consumer-to-business trust: how users trust the online services offered to them. Such services include, but are not limited to, online shopping, social media, online banking and search engines.
Continue reading ISOC UK / Horizon DER panel for Multi Sided Trust on Multi Sided Platforms
‘Do you take care on the internet? Does it take care of you?’
We were very pleased to take part in Science in the Park 2018 last Saturday 10th March, which was held at the beautiful Wollaton Hall in Nottingham and organised by the British Science Association.
Hundreds of people attended this free event that offered a very diverse and fun programme with lots of hands-on activities and demonstrations on different topics: life sciences, astronomy, chemistry, physics, psychology, natural history, engineering, etc.
Continue reading UnBias at SCIENCE IN THE PARK!