SHERPA at the Unfreezing Freedom Symposium

Do you ever hesitate to click on a link because you think “that might be remembered, and that could come back to me later”? This question was put forward by SHERPA partner Tijmen Schep at the Unfreezing Freedom Symposium in Rotterdam last week. The goal of the event was to put the rise of data-driven chilling effects on the political agenda. Increasingly, data brokers use machine learning algorithms to analyze our personal data in order to create predictions and judgments about us. As citizens gain insight into these dubious practices, they may try to protect themselves by resorting to self-censorship. If you know that your tweets are monitored and analyzed, you might think twice about posting anything too critical. In the long run, the rise of the reputation economy could lead to chilling effects on democracy.

The event went beyond simply saying that citizens need to become “more aware” of how their data is used. The irony is that increased awareness can itself become part of the problem: the more people realise they are being watched, the more likely they are to censor themselves.

The event’s goal of putting this issue on the agenda appeared to be met: the symposium sold out and drew a largely political audience, including the King’s Commissioner of the province of South Holland, who officially opened the event. It’s another indication that issues around algorithms and data are being taken more seriously. By sheer coincidence, the theme was continued by the Dutch prime minister, who on the very same day proclaimed that all judgemental algorithms used by the government should become more transparent to citizens.

Visitors to the symposium could also play with an installation called “Survival of the most algorithmically attractive”, a machine-learning-enabled camera that algorithmically judges the people it sees. In this case, it decides which of two visitors is more employable based on their attractiveness, ethnicity, gender, age and expressed emotion. The installation is based on software being developed for the SHERPA project, which will also be used by students of the Hogeschool Rotterdam in an upcoming design challenge exploring issues around algorithmic fairness.

© Photo: Erno Wientjens.
