All posts by Ansgar Koene

News, algorithm bias and editorial responsibility

In an almost suspiciously conspiracy-like fashion, the official launch of UnBias at the start of September was immediately accompanied by a series of news articles providing examples of problems with algorithms that make recommendations or control the flow of information. Cases include the unintentional racial bias in a machine-learning-based beauty contest algorithm that was meant to remove the bias of human judges; a series of embarrassing news recommendations in the Facebook Trending Topics feed, the result of an attempt to avoid (the appearance of) bias by getting rid of human editors; and the controversy over Facebook’s automated editorial decision to remove the Pulitzer Prize-winning “napalm girl” photograph because the image was identified as containing nudity. My view of these events? “Facebook’s algorithms give it more editorial responsibility – not less” (published today in The Conversation).


Introducing: UnBias


In an age of ubiquitous data collecting, analysis and processing, how can citizens judge the trustworthiness and fairness of systems that heavily rely on algorithms? News feeds, search engine results and product recommendations increasingly use personalization algorithms to help us cut through the mountains of available information and find those bits that are most relevant, but how can we know if the information we get really is the best match for our interests?

Continue reading Introducing: UnBias

Welcome to the UnBias project

Dear visitor,

Thank you for your interest in the “UnBias: Emancipating Users Against Algorithmic Biases for a Trusted Digital Economy” project. What you find here is a temporary welcome message: the site is still under construction and will gain more content in the coming weeks to better introduce our work.

The official start date for this project is September 1st 2016, so please bear with us as we get things ready.

Sincerely,

the UnBias team