Tag Archives: Policy

EuroDIG 2017

About

EuroDIG 2017 will take place in Tallinn on 6–7 June and will be hosted by the Ministry of Foreign Affairs of the Republic of Estonia. EuroDIG is not a conference; it is a year-round dialogue on politics and digitisation across the whole European continent, culminating in an annual event. More about EuroDIG.

Pre- and side-events

A number of pre- and side-events will enrich the EuroDIG programme. European organisations will hold meetings on day zero, 5 June, and the European Commission will open the High Level Group on Internet Governance meeting on 8 June to the public.

Participate!

Our slogan is “Always open, always inclusive and never too late to get involved!”

The Org Teams did their best to prepare the ground for in-depth multistakeholder discussion, and our Estonian host, the Ministry of Foreign Affairs, worked hard to give you a warm welcome!

Now it is up to you to engage in the discussion – the floor is always open! A first opportunity will be the open mic session, the first session after the welcome.

We would like to hear from YOU: How am I affected by Internet governance?

No chance to travel to Tallinn?

No problem! We are in Estonia, the most advanced country in Europe when it comes to digital futures! For all workshops and plenary sessions we provide video streaming (passive watching), WebEx (active remote participation) and transcription. Transcripts and videos will be posted on the EuroDIG wiki after the event. Please connect via the links provided in the programme.

UnBias at EuroDIG

UnBias is contributing to EuroDIG 2017 by running a Flash session on “Accountability and Regulation of Algorithms” and by serving on the organizing team for the Plenary session “Internet in the ‘post-truth’ era?”.

Looking forward to seeing you there!

The first IEEE P7003™ Working Group meeting

IEEE Standards Association (IEEE-SA) invites your participation in the IEEE P7003™, Standard for Algorithmic Bias Considerations Working Group.

Why get involved: 

The goal of this Standard Project is to describe specific methodologies that can help users certify how they worked to address and eliminate issues of negative bias in the creation of their algorithms. “Negative bias” refers to the use of overly subjective or uninformed data sets or information known to be inconsistent with legislation concerning certain protected characteristics (such as race, gender and sexuality), or to bias against groups not necessarily protected explicitly by legislation but whose treatment diminishes stakeholder or user wellbeing and for which there are good reasons to consider it inappropriate.

Who should participate:

Programmers, manufacturers, researchers and other stakeholders involved in creating an algorithm, along with any stakeholders defined as end users of the algorithm and any non-users affected by its use, including but not limited to customers, citizens and website visitors.

How to Participate:

If you wish to participate in the IEEE P7003™ Working Group, please contact the Working Group Chair, Ansgar Koene.

Meeting Information:

The first IEEE P7003™ Working Group meeting will be held online via WebEx on Friday, 5 May, from 9:00 AM to 11:00 AM (EST).

REGISTER FOR MEETING

If you cannot attend the meeting and want to be added to the distribution list please fill out this form.

Publication of 1st WP4 workshop report

We are pleased to announce that the report summarizing the outcomes of the first UnBias project stakeholder engagement workshop is now available for public dissemination.

The workshop took place on February 3rd 2017 at the Digital Catapult centre in London, UK. It brought together participants from academia, education, NGOs and enterprises to discuss fairness in relation to algorithmic practice and design. At the heart of the discussion were four case studies highlighting fake news, personalisation, gaming the system, and transparency.

Continue reading Publication of 1st WP4 workshop report

When AI goes to War: public opinion, modern conflict, and autonomous weapons

The weaponisation of artificial intelligence presents one of the greatest ethical and technological challenges of the 21st century and has been described as the third revolution in warfare, after the inventions of gunpowder and nuclear weapons. Despite the vital importance of this development for modern society, for legal and ethical practice, and for technological research, there has been little systematic study of public opinion on this critical issue. Our interdisciplinary project, sponsored by CHERISH Digital Economy, addresses this gap. Our objective is to analyse what factors determine public attitudes towards the use of fully autonomous weapons.
 
To do this, we will produce a series of plausible but fictitious scenarios that will be presented to young adults (18–25 years old) taking part in focus groups. The scenarios will contain dilemmas to stimulate discussion. The aim of these focus groups is not simply to find out what young people think and feel about fully autonomous weapons, but to discover what shapes their thinking: how they come to define certain scenarios as problematic; how they attempt to work together to think through solutions to these problems; the extent to which they are prepared to change their minds in response to discussion with peers or exposure to new information; and how they translate their ideas into practical policy recommendations. Our working hypothesis is that society is equipped by neither biological evolution nor contemporary human culture to make informed evaluations about the ethical implications of using autonomous agency to fight our wars for us.
 
This workshop is designed to bring together researchers interested in ethics, virtual/augmented reality, modern warfare, public opinion, artificial intelligence and robotics. Speakers will be asked to share their research in a 20-minute presentation, with the goal of contributing to an edited volume based on the workshop, in which Routledge Publishing and Rowman Books have expressed interest, and with the aim of developing a proposal for a major funding bid for the next stage of this project.
 
Bursaries are available to support the travel costs incurred in attending this workshop. Please direct your inquiries to Elvira Perez Elvira.Perez@nottingham.ac.uk.
Workshop organised by:
 
Eugene Miakinkov, Lecturer in War and Society, Swansea University
Elvira Perez, Senior Research Fellow, University of Nottingham
Rob Wortham, PhD Researcher, University of Bath
 
Workshop Sponsored by: CHERISH Digital Economy

Internet Society – European Chapters meeting

Agenda and Details

Wednesday 22 February

12:00 Lunch at the venue

Welcome and introductions, Frederic Donck

Introduction to trust, based on 2016 ISOC report (and discussion), Richard Hill

Editorial responsibility for online content – platform neutrality, recommender systems and the problem of ‘fake news’ (and discussion), Ansgar Koene

Future Internet Scenarios, Konstantinos Komaitis

17:30 Day 1 ends

19:00 Dinner (On Canal Boat leaving from Oosterdok in front of the hotel)

Thursday 23 February

9:00 Day starts

Collaborative security introduction, Olaf Kolkman

Real-life examples of collaborative security in action, Andrei Robachevsky

User-Trust, with regard to longevity and security of IoT devices (and discussion), Jonas Jacek

Round table on current issues related to user trust in Europe

12:30 Working lunch at the venue

Search ranking technologies (and discussion), Brandt Dainow

ISOC-NL presentation

Way forward: meeting on next steps, concrete actions for ISOC and chapters

16:00 Day 2 ends

Internet Society UK England – User Trust Webinar

In preparation for the European Chapters meeting (22–23 February 2017) we will hold a 90-minute webinar / conference call on Tuesday 14 February 2017 from 6pm to collect input from participants about the ways in which ISOC UK can and should engage with the theme of User Trust.

In June 2016 ISOC published a working paper, “A policy framework for an open and trusted Internet”, outlining the four interrelated dimensions to be considered when developing policies for the internet. http://www.internetsociety.org/doc/policy-framework-open-and-trusted-internet

The aim of the European Chapters meeting is to build on this and identify specific areas related to User Trust that ISOC should prioritise and focus on when engaging with policy makers to build a trusted Internet.

The specific discussions around User Trust that have been proposed for the meeting are:

  • Ethical data handling
  • Privacy
  • Data breaches
  • Examples of collaborative security in action
  • Internet of Things – implications for security, privacy, control (who controls which aspects of the device: user vs. service provider), liability in case of problems, and longevity (e.g. devices embedded in infrastructure)
  • Digital Literacy – the need for people to understand basic aspects of how the internet and digital services work in order to: improve cybersecurity; be able to give informed consent to personal data usage; understand the implications of proposed legislation (e.g. the snoopers’ charter); …
  • User generated content moderation – how to approach the issues related to fake news and editorial responsibility
  • An overview of the situation in Russia

Other areas of User Trust that might be especially relevant for ISOC UK could be:

  • Government surveillance powers (implications of, and legal challenges to, the Investigatory Powers Act)
  • The impact of nation-first, anti-globalization movement (Brexit)
  • Governance of the platform economy (e.g. Uber, Deliveroo), i.e. their classification as ‘tech’ companies to avoid regulation

Which areas should we prioritise? The chapters meeting is only one and a half days long, so time is limited.

Looking beyond the European Chapters meeting, what kind of follow-up activities should ISOC UK pursue, e.g. digital literacy 101 for parliamentarians?

Topic: Internet Society UK and User Trust – Webinar
Time: Feb 14, 2017 6:00 PM London