P7003 Use Case submission form

To help communicate the nature and properties of the use cases, we encourage the use of the templated form below. For additional clarity, we provide an example, as well as a text-based version of the template.

Use case template:

  • Title of use case:
  • Contributor of the use case (name, affiliation, contact):
  • Area: education | transportation | real-estate | manufacturing | finance | retail | healthcare | e-commerce | telecommunications | law | public policy | government | non-profit | others (please specify)
  • Discrimination: race | gender | age | … | others (please specify)
  • Status of the use case: in operation | completed | research & development
  • Relevant stakeholders: end-users | private sector organizations | public sector organizations | national/local government | others (please specify)
  • Problem description:
  • Why it matters (from P7003’s viewpoint):
  • Action taken:
  • Relevant regulations and/or standards (in force, coming):
  • Further information (URLs, etc.):

Example use case

  • Title of use case: Beauty contest judging algorithm biased to favour white participants
  • Contributor of the use case (name, affiliation, contact): Ansgar Koene, University of Nottingham, ansgar.koene@nottingham.ac.uk
  • Area: others (entertainment)
  • Discrimination: race
  • Status of the use case: completed
  • Relevant stakeholders: end-users, private sector organizations (e.g. advertisers)
  • Problem description: An attempt to provide an objective (culturally neutral, racially neutral) judgement of female beauty by using algorithms, trained on crowd-sourced data, to judge the beauty of participants. Roughly 6,000 people from more than 100 countries participated by submitting photos. Out of 44 winners, nearly all were white, a handful were Asian, and only one had dark skin.
  • Why it matters (from P7003’s viewpoint): The concept that the use of algorithms can transform judgement about inherently subjective matters (in this case human beauty) into objective (culturally and racially neutral) judgements is deeply flawed. The data required to establish the judgement criteria was probably culturally and racially biased.
  • Action taken: The Beauty.AI contest appears to have been discontinued.
  • Relevant regulations and/or standards (in force, coming): None
  • Further information (URLs, etc.):
    http://beauty.ai/
    https://www.theguardian.com/technology/2016/sep/08/artificial-intelligence-beauty-contest-doesnt-like-black-people

