Tag Archives: IEEE

USACM Panel on Algorithm Transparency and Accountability

USACM, the ACM U.S. Public Policy Council, will be hosting a panel event on “Algorithmic Transparency and Accountability.” The event will provide a forum for discussion among stakeholders and leading computer scientists about the growing impact of algorithmic decision-making on our society and the technical underpinnings of algorithmic models.

Panelists will discuss the importance of the Statement on Algorithmic Transparency and Accountability and the opportunities for cooperation between academia, government and industry around these principles.

AMOIA workshop at ACM Web Science 2017

AMOIA (Algorithm Mediated Online Information Access) – user trust, transparency, control and responsibility

This Web Science 2017 workshop, delivered by the UnBias project, will be an interactive audience discussion on the role of algorithms in mediating access to information online and the issues of trust, transparency, control and responsibility that this raises.

The workshop will consist of two parts. The first half will feature talks from the UnBias project and related work by invited speakers. The talks by the UnBias team will contrast the concerns and recommendations that were raised by teen-aged ‘digital natives’ in our Youth Juries deliberations and user observation studies with the perspectives and suggestions from our stakeholder engagement discussions with industry, regulators and civil-society organizations. The second half will be an interactive discussion with the workshop participants based on case studies. Key questions and outcomes from this discussion will be put online for WebSci’17 conference participants to refer to and discuss/comment on during the rest of the conference.

We will focus on the following case studies:

  • Case Study 1: The role of recommender algorithms in hoaxes and fake news on the Web
  • Case Study 2: Business models that shape AMOIA – how can web science boost Corporate Social Responsibility / Responsible Research and Innovation?
  • Case Study 3: Unintended algorithmic discrimination on the web – routes towards detection and prevention

The UnBias project investigates the user experience of algorithm-driven services and the processes of algorithm design. We focus on the interests of a wide range of stakeholders and carry out activities that 1) support user understanding of algorithm-mediated information environments, 2) raise awareness among providers of ‘smart’ systems about the concerns and rights of users, and 3) generate debate about the ‘fair’ operation of algorithms in modern life. This EPSRC-funded project will provide policy recommendations, ethical guidelines and a ‘fairness toolkit’ co-produced with stakeholders.

The workshop will be a half-day event.

Programme

9:00 – 9:10   Introduction
9:10 – 9:30   Observations from the Youth Juries deliberations with young people, by Elvira Perez (University of Nottingham)
9:30 – 9:50   Insights from user observation studies, by Helena Webb (University of Oxford)
9:50 – 10:10  Insights from discussions with industry, regulatory and civil-society stakeholders, by Ansgar Koene (University of Nottingham)
10:10 – 10:30 “Platforms: Do we trust them?”, by Rene Arnold
10:30 – 10:50 “IEEE Global Initiative for Ethical Considerations in Artificial Intelligence and Autonomous Systems”, by John Havens
10:50 – 11:10 Break
11:10 – 11:50 Discussion of case study 1
11:50 – 12:30 Discussion of case study 2
12:30 – 12:50 Break
12:50 – 13:30 Discussion of case study 3
13:30 – 14:00 Summary of outcomes

Key dates

Workshop registration deadline: 18 June 2017
Workshop date: 25 June 2017
Conference dates: 26-28 June 2017

The first IEEE P7003™ Working Group meeting

The IEEE Standards Association (IEEE-SA) invites your participation in the IEEE P7003™ Standard for Algorithmic Bias Considerations Working Group.

Why get involved: 

The goal of this standards project is to describe specific methodologies that help users certify how they addressed and eliminated issues of negative bias in the creation of their algorithms. “Negative bias” refers to the use of overly subjective or uninformed data sets, or of information known to be inconsistent with legislation concerning certain protected characteristics (such as race, gender or sexuality); it also covers bias against groups not explicitly protected by legislation but whose treatment nonetheless diminishes stakeholder or user wellbeing and for which there are good reasons to consider it inappropriate.
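The standard project described above does not prescribe a single test, but one widely used statistical screen for this kind of bias is the “four-fifths rule”, which compares decision rates across groups defined by a protected characteristic. The sketch below is purely illustrative and not part of P7003; the group data and the loan-approval scenario are invented for the example.

```python
# Hypothetical illustration of the "four-fifths rule" disparate impact screen.
# A selection-rate ratio below 0.8 between two groups is conventionally
# flagged for further review; it is a screen, not proof of bias.

def selection_rate(outcomes):
    """Fraction of positive decisions (1s) in a list of 0/1 outcomes."""
    return sum(outcomes) / len(outcomes)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of the lower group selection rate to the higher one."""
    low, high = sorted([selection_rate(group_a), selection_rate(group_b)])
    return low / high

# Invented loan-approval decisions (1 = approved) for two demographic groups:
group_a = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]   # 80% approved
group_b = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0]   # 40% approved

ratio = disparate_impact_ratio(group_a, group_b)
print(f"disparate impact ratio: {ratio:.2f}")  # prints 0.50
if ratio < 0.8:
    print("flag: potential negative bias; review the data and the model")
```

A certification process of the kind the working group envisions would document which screens like this were run, on which characteristics, and how flagged results were resolved.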

Who should participate:

Programmers, manufacturers, researchers or other stakeholders involved in creating an algorithm, along with any stakeholders defined as end users of the algorithm and any non-users affected by its use, including but not limited to customers, citizens and website visitors.

How to Participate:

If you wish to participate in the IEEE P7003™ Working Group, please contact the Working Group Chair, Ansgar Koene.

Meeting Information:

The first IEEE P7003™ Working Group meeting will be held online via WebEx on Friday, 5 May 2017, from 9:00 AM – 11:00 AM (EST).

REGISTER FOR MEETING

If you cannot attend the meeting and want to be added to the distribution list please fill out this form.

IEEE Standard for Algorithm Bias Considerations

As part of our stakeholder engagement work towards the development of algorithm design and regulation recommendations, UnBias is engaging with the IEEE Global Initiative for Ethical Considerations in Artificial Intelligence and Autonomous Systems to develop an IEEE Standard for Algorithm Bias Considerations, designated P7003. The P7003 working group is chaired by Ansgar Koene and will have its first web meeting on 5 May 2017.


The Human Standard: Why Ethical Considerations Should Drive Technological Design Webinar

The IEEE Standards Association (IEEE-SA) Corporate Membership Program invites you to join an exclusive webinar.


Follow this link to register

18 April 2017, 12:00 PM – 1:00 PM EDT

In the age of autonomous and intelligent machines, it is more important than ever to help technologists and organizations be cognizant of the ethical implications of the products, services or systems they are building and how they are being built before making them available to the general public. While established Codes of Ethics provide instrumental guidance for employee behavior, new values-centric methodologies are needed to complement these codes to address the growing use of algorithms and personalization in the marketplace.

Key insights from the Working Group Chairs of three IEEE-SA projects will be presented. The IEEE Global Initiative provided the input and recommendations that led to the creation of Working Groups for these IEEE-SA standards projects:

IEEE P7001™: Transparency of Autonomous Systems

IEEE P7003™: Algorithmic Bias Considerations

Speakers will provide their perspectives on why it is important for business leaders to increase due diligence around ethical considerations for what they create. This focus is not just about avoiding unintended consequences, but also about increasing innovation by better aligning with customer and end-user values.

Speakers

Kay Firth-Butterfield
Executive Committee Vice-Chair, The IEEE Global Initiative for Ethical Considerations in Artificial Intelligence and Autonomous Systems, Executive Director, AI Austin
SEE FULL BIO

John C. Havens
Executive Director, The IEEE Global Initiative for Ethical Considerations In Artificial Intelligence and Autonomous Systems
SEE FULL BIO

Konstantinos Karachalios
Managing Director, IEEE Standards Association
SEE FULL BIO

Ansgar Koene
Senior Research Fellow at Horizon Digital Economy Research institute, University of Nottingham. Co-Investigator on the UnBias project and Policy Impact lead for Horizon.
SEE FULL BIO

Alan Winfield
Professor, Bristol Robotics Laboratory, University of the West of England; Visiting Professor, University of York
SEE FULL BIO