Invitation for stakeholder engagement

We would like to invite you to contribute to our ongoing research study by taking part in a small number of stakeholder engagement workshops. These workshops will involve professionals from various groups and will explore the implications of algorithm-mediated interactions on online platforms. They provide an opportunity for relevant stakeholders to put forward their perspectives and discuss the ways in which algorithms shape online behaviours, in particular in relation to access and the dissemination of information to users. The workshops will lead to the production of reports and policy recommendations as well as the design of a ‘fairness toolkit’ for users, online providers and other stakeholders.

The workshops will be audio recorded and transcribed. We might quote extracts from them in our project publications but we will take great care to anonymise all our data. This means that, unless you explicitly request otherwise, your taking part will be kept confidential.

The rest of this invitation provides further information about the project and the stakeholder workshops. You can also contact ansgar.koene@nottingham.ac.uk or helena.webb@cs.ox.ac.uk if you have any queries.

 

What is the UnBias project about?

The UnBias project seeks to promote fairness online. We live in an age of ubiquitous online data collection, analysis and processing. News feeds, search engine results and product recommendations increasingly use personalisation algorithms to determine the information we see when browsing online. Whilst this can help us to cut through the mountains of available information and find the items most relevant to us, how can we be sure that these algorithms are operating in our best interests? Are algorithms ever ‘neutral’, and how can we judge the trustworthiness and fairness of systems that rely heavily on algorithms?

Our project investigates the user experience of algorithm-driven internet services and the processes of algorithm design. We focus in particular on the perspectives of young people and carry out activities that 1) support user understanding of online environments, 2) raise awareness among online providers about the concerns and rights of internet users, and 3) generate debate about the ‘fair’ operation of algorithms in modern life.

What are the stakeholder engagement workshops about?

As part of our project we are running a series of stakeholder engagement workshops. We invite stakeholders from academia, education, government/regulatory oversight organizations, civil society organizations, media, industry and entrepreneurs to join us in exploring the implications of algorithm-mediated interactions on online platforms, especially in relation to access and dissemination of information to users. The workshops will take place over a two-year period (2016–2018) and will seek to identify relevant perspectives and concerns as well as provide feedback on our project activities. They will also produce:

  1. a set of policy and design recommendations for enhanced transparency and global fairness in information control algorithms;
  2. a ‘fairness toolkit’ consisting of three co-designed tools
    1. a consciousness raising tool for young internet users to help them understand online environments;
    2. an empowerment tool to help users navigate through online environments;
    3. an empathy tool for online providers and other stakeholders to help them understand the concerns and rights of young internet users.

 

What does participation as stakeholder involve?

We invite each participant to take part in a series of workshops, likely to be between four and six events over the two-year period. We understand that it might be difficult for the same individual to attend each time, so we hope that each participating organisation will instead send a representative to each workshop. This will help us to ensure continuity across events. Participants will be asked to:

  • participate fully in the workshop discussions. These will be audio recorded but we will take care to anonymise potentially identifying details (such as individual names and names of organisations) in all project outputs and publications. We also understand that participants may not be able to divulge commercially sensitive or confidential information.
  • complete brief questionnaires to provide feedback on summary reports after each workshop (reports will be no more than 4 pages long). This will help us to ensure we represent all views accurately.
  • contribute to the production of policy and design recommendations and the co-design of the ‘fairness toolkit’ through input provided both within the workshop sessions and after sessions, as necessary.

The workshops form a central part of our project. We hope that our participants will find them interesting and enjoy having an opportunity to put forward their professional perspectives and to shape our policy recommendations and other outputs. The workshops also provide an opportunity for participants to network with others in relevant fields.

Where, when and how will the workshop discussions take place?

Where: The workshops will generally take place in the UK. We will select locations that best suit the majority of participants, and expect that they will often be held in London and other large UK cities. It may be possible for stakeholders to participate through tele-conferencing, and we are also exploring the possibility of co-locating some workshops with existing events that particular stakeholder groups are likely to attend, e.g. a session at EuroDIG for Internet Governance related stakeholders.

When: The first set of workshops will take place in January 2017, with the exact date to be determined in coordination with the participants. We expect subsequent workshops will take place at intervals of approximately 4 months, placing the second set of workshops in May or early June 2017.

How: We are seeking participants from a range of stakeholder groups – academia, education, government/regulatory oversight organizations, civil society organizations, media, industry and entrepreneurs. In the first round of workshops we will run separate events for each stakeholder group so that we can explore the particular priorities of each group in turn. We anticipate that these workshops will have up to 12 participants. Some later workshops will likely be larger, as we will combine some or all stakeholder groups to allow participants from different fields to engage with each other. The specific plans for this will be developed in response to feedback following the first set of workshops.

Each workshop will address key topics relating to the implications of algorithm-mediated interactions on online platforms – for example: policy and practice, technical matters, and understandings of fairness. No formal preparation will be necessary, but in case participants would like time to think through the issues involved beforehand, we will email round the main topics to be discussed at least a week before each event takes place.

Each workshop will be a half-day event, typically taking place over three hours in the afternoon. A sample workshop schedule is:

  • 5 min Welcome
  • 15 min Updates on UnBias project work and preliminary findings
  • 5 min Introduction to the key workshop topic
  • 10 min Questionnaire / written individual views regarding the key topic
  • 15 min Coffee break / assignment into subgroups for the first round of discussion
  • 30 min Subgroup-based discussion
  • 5 min Break to gather outcomes
  • 10 min Reporting on outcomes of the subgroup discussions
  • 20 min Combined discussion reflecting on outcomes from the subgroups
  • 15 min Coffee break
  • 50 min Open discussion of issues raised by participants
  • Post-workshop drinks at a local pub

 

Privacy/confidentiality and data protection

All the workshops will be audio recorded and transcribed. This is in order to facilitate our analysis and ensure that we capture the full detail of what is discussed. We will use quotations from the discussions in our project reports and other outputs, but will take great care to anonymise them. We will remove or pseudonymise the names of participating individuals and organisations as well as other potentially identifying details. We will not reveal the identities of any participants (except at the workshops themselves) unless we are given explicit permission to do so. We will also ask all participants to observe the Chatham House Rule – meaning that views expressed can be reported back elsewhere, but individual names and affiliations cannot.

We take data protection very seriously. All our data (audio recordings and de-identified transcripts) will be encrypted and stored securely in compliance with the data protection policies of the University of Nottingham and University of Oxford. The data will only be handled by researchers working on the UnBias project.

Emancipating Users Against Algorithmic Biases for a Trusted Digital Economy
