UnBias Hackathon

On the weekend of June 30th and July 1st, the UnBias team hosted a two-day hackathon at CodeBase in Edinburgh, with support from local outfit Product Forge, whose experience organizing such events is unrivalled in Scotland.

The hackathon challenge was formulated as follows:

“Artificial Intelligence shapes digital services that have become central to our everyday lives. Online platforms leverage the power of AI to monetize our attention, with often unethical side-effects: our privacy is routinely breached, our perception of the world is seriously distorted, and we are left with unhealthy addictions to our screens and devices. The deep asymmetry of power between users and service providers, the opacity and unaccountability of the algorithms driving these services, and their exploitation by trolls, bullies and propagandists are serious threats to our well-being in the digital era.
This hackathon invites participants to build tools to empower users in their online lives. The tools might address any relevant problem in this space, including (but not limited to) filter bubbles and fake news, biased and unaccountable algorithms, or the profit-driven metrics that guide these AI-powered services.”

Participants at work during the UnBias hackathon

Nine teams entered the competition, with seven reaching the end of the two-day event and presenting their work to the judging panel. There were some very creative ideas, exploring such diverse issues as the addictive design of Facebook, toxicity of the political discourse on Twitter, and the GDPR compliance of Web sites.

The winning projects were the following:

First prize

Prolight: Mock-up of a tool to detect hiring discrimination

Prolight, a tool to identify discriminatory hiring practices. A company (e.g. a large chain of coffee shops) could use the service to analyze past hiring or promotion decisions made by middle managers, identifying and flagging problematic cases for deeper investigation.
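The write-up does not say how Prolight actually scores past decisions. As one illustration only, a minimal sketch of an adverse-impact check (the "four-fifths" rule of thumb from fair-hiring practice) over hypothetical hiring records might look like this; the data, function names, and threshold are all assumptions, not the team's implementation:

```python
from collections import Counter

def selection_rates(decisions):
    """Compute per-group selection rates from (group, hired) records."""
    totals, hires = Counter(), Counter()
    for group, hired in decisions:
        totals[group] += 1
        if hired:
            hires[group] += 1
    return {g: hires[g] / totals[g] for g in totals}

def adverse_impact_flags(decisions, threshold=0.8):
    """Flag groups whose selection rate falls below `threshold` times
    the best-off group's rate (the 'four-fifths' rule of thumb)."""
    rates = selection_rates(decisions)
    best = max(rates.values())
    return {g: rate / best < threshold for g, rate in rates.items()}

# Hypothetical past hiring decisions: (group, was_hired)
decisions = [("A", True)] * 6 + [("A", False)] * 4 + \
            [("B", True)] * 3 + [("B", False)] * 7

print(adverse_impact_flags(decisions))
# group A is hired at rate 0.6, group B at 0.3; 0.3/0.6 = 0.5 < 0.8,
# so group B is flagged for deeper investigation
```

A real tool would of course need far richer data (roles, qualifications, time periods) and statistical significance testing before flagging anything; this sketch only shows the basic disparity comparison.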

Second Prize

Demo video

A browser plugin that removes several addictive features of the Facebook Web interface. The plugin suppresses certain notifications and interrupts the "infinite scrolling" of the Facebook news feed, in order to help users get back to work.

Third Prize

Demo using the Guardian web site

A tool to automatically scrape Web sites and investigate their GDPR compliance, checking their tracking behavior, the availability of a privacy policy, and contact information regarding personal data protection.
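The report gives no detail on how the winning scraper performed these checks. As a rough illustration, a compliance checker could parse a fetched page and look for simple textual signals; the sketch below, using only Python's standard library, is a hypothetical reconstruction (the keyword lists and function names are assumptions), and checking actual tracking behavior would additionally require inspecting cookies and third-party requests, which is omitted here:

```python
from html.parser import HTMLParser

# Hypothetical keyword lists; a real checker would be far more thorough.
PRIVACY_HINTS = ("privacy policy", "privacy notice")
CONTACT_HINTS = ("data protection officer", "dpo@", "privacy@")

class PageTextParser(HTMLParser):
    """Collect the visible text of a page for simple keyword checks."""
    def __init__(self):
        super().__init__()
        self.text = []

    def handle_data(self, data):
        self.text.append(data.lower())

def gdpr_signals(html):
    """Return crude GDPR-related signals found in a page's HTML:
    does it mention a privacy policy, and data-protection contact info?"""
    parser = PageTextParser()
    parser.feed(html)
    page = " ".join(parser.text)
    return {
        "privacy_policy": any(h in page for h in PRIVACY_HINTS),
        "contact_info": any(h in page for h in CONTACT_HINTS),
    }

sample = ("<html><body><a href='/privacy'>Privacy Policy</a>"
          "<p>Questions? Email dpo@example.com</p></body></html>")
print(gdpr_signals(sample))
# {'privacy_policy': True, 'contact_info': True}
```

Feeding the checker pages fetched from a crawl would then produce a per-site compliance summary of the kind the demo showed against the Guardian website.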

These three projects received shopping vouchers, kindly provided by our sponsor CapitalOne.

We also awarded a special prize, funded by the 5Rights Foundation, to a project that addressed the needs of children.

Demo of the role-playing game

The award-winning project was a role-playing game embedded in the browser, designed to help young people navigate the perils of online life by letting their avatar make poor decisions, such as giving away personal data or signing up for dodgy newsletters.

Our team of mentors
The judging panel
Engaging talks

We are grateful for the financial support from SICSA, which made this event a great success.

Age-Appropriate Design Code – call for evidence by the ICO

5Rights report: ‘Digital Childhood: Addressing Childhood Development Milestones in the Digital Environment’

On May 25th 2018 the Data Protection Act 2018 (DPA2018) took effect in the UK, supporting and supplementing the implementation of the EU General Data Protection Regulation (GDPR).

An important requirement of the DPA2018, going beyond the GDPR, is the inclusion of an Age Appropriate Design Code (section 123 of the DPA2018), which provides guidance on the design standards that the Information Commissioner’s Office (ICO) will expect providers of online ‘Information Society Services’ (ISS) likely to be accessed by children to meet.

The ICO is responsible for drafting the Code and has issued a call for evidence as the first stage of the consultation process.



I was very pleased to present UnBias’ data at two great recent UK events addressing children’s safety, wellbeing and rights: the NSPCC annual conference, ‘How safe are our Children? Growing up online’, held 20th–21st June in London, and the launch of the ‘Children, Rights and Childhood’ event on 22nd June in Birmingham.


On 16th April the House of Lords Select Committee on Artificial Intelligence published a report called ‘AI in the UK: ready, willing and able?’. The report is based on an inquiry conducted to consider the economic, ethical and social implications of advances in artificial intelligence. UnBias team member Ansgar Koene submitted written evidence based on the combined work of the UnBias investigations and our involvement with the development of the IEEE P7003 Standard for Algorithmic Bias Considerations.


Emancipating Users Against Algorithmic Biases for a Trusted Digital Economy