
Michal Kosinski: making our post-privacy world a habitable place

How giving up some privacy could lead to better health or cheaper car insurance

Discussions on privacy were top of the agenda at one of the biggest technology trade fairs in Europe, CeBIT 2017 in Hanover, Germany. Indeed, social media are full of the many aspects of the lives of those who share their views with as little as pressing a ‘like’ button on Facebook. Invited speaker at CeBIT 2017, Michal Kosinski, an assistant professor of organisational behaviour at Stanford Graduate School of Business, California, USA, shares an update on his latest work on predicting future behaviour from psychological traits inferred from the trail of digital crumbs, including pictures, that we leave across the internet.

His latest work has huge implications for privacy. He believes human faces, available from pictures found on social media networks, can be used as a proxy for people’s hormonal levels, genes, developmental history and culture. “It is pretty creepy that our face gives up so much personal information.” He adds: “I would argue sometimes it is worth giving up some privacy in return for a longer life and better health.”

In this context, regulators do not work in a vacuum, yet they cannot guarantee absolute privacy. Kosinski explains that it is an illusion for people to believe they can retain control over their own data. “The sooner as a society and policy makers, we stop worrying about winning some battles in a privacy war, and the sooner we accept, ‘OK we’ve lost this war,’ and we move towards organising society and culture, and technology and law, in such a way that we make the post-privacy world a habitable place, the better for everyone.”

In this exclusive podcast, Kosinski also discusses the constant struggle between top-down governance and bottom-up self-organisation, which leads to a constant trade-off in terms of privacy in our society. He gives an insightful example: the likes of Facebook, with their algorithms, would be uniquely placed to match people with the right job, or to detect suicides before they happen. However, this possibility raises questions about how much invasion of people’s privacy is socially acceptable, even if it could solve some of our society’s problems.

Finally, Kosinski gives another example where people’s privacy has been invaded for the purpose of changing their behaviour. Specifically, he refers to the car insurance industry, which has added sensors to cars to monitor drivers’ behaviour, thus breaching their privacy in exchange for lower premiums.

For further insights into Kosinski’s work, listen to the speech he gave at the event; it offers plenty of food for thought.

Sabine Louët
