In the wake of high-profile data breaches, can anything be done to protect our digital personas? Discover how new innovations are giving privacy back to the people.
How much of yourself are you willing to give away?
We're talking about your data. It's big business – and a matter of much debate.
Collectively, our data has the power to change the world. From autonomous cars to medicine, the information we share can be a divining rod for progress and innovation.
But often it's exploited, whether through security breaches, social engineering or surveillance. The reverberations of the Cambridge Analytica scandal, in which 50 million Facebook profiles were harvested to sway political votes, can still be felt. As individuals we have a right to privacy, and it's a right we should protect fiercely.
According to Risk Based Security, a market research company, 2019 was the worst year to date for data breaches, with the number increasing by 54% in a year. At stake is our most vital information, from our medical history to our passport details.
In the first half of 2019, 3.2 billion private records were exposed, and the business sector was responsible for 84.6% of these. The message is clear: businesses need to do more to protect our data.
But data is often our entry fee to a world of subsidized shopping, same-day delivery and on-demand taxi rides. "At the moment, if you want to use the service you provide your data. You don't think that your data in itself is worth that much. Of course, the aggregation [of it] is actually quite scary," said Michael Kopp, head of research at HERE.
Largely, once we've ticked the digital box, we don't give our data another thought. So what's the problem with giving our information away?
“Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say." - famed American whistleblower Edward Snowden
We all have different tolerances. Some people are prepared to reveal a lot in exchange for incentives. Greg Williams, editor-in-chief of Wired writes: “Some people may be willing to trade reduced rent from a landlord in return for surrendering intimate personal data about the way they live; meanwhile, billions of people every day still input data to Facebook, despite the relentless waves of scandal about the way that information is used."
Much of our data is anonymized, which is designed to protect our identities while still giving powerful insights into our needs and interests. The profiling is so effective, in fact, that advertisers often know what we want before we do: Target famously predicted a teenager's pregnancy based on her purchasing behavior.
But just how anonymous is that data? Researchers from Imperial College London and Belgium's Université Catholique de Louvain created an algorithm to expose a potential security flaw in anonymized data sets. With as few as 15 attributes, such as gender, marital status and ZIP code, they could identify 99.98% of Americans.
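The mechanism behind the study is simple to illustrate. The toy sketch below is not the researchers' actual model; the data and attribute names are invented. It just shows how quickly a combination of seemingly harmless attributes becomes unique, and a unique record can be matched back to a person by anyone who knows those attributes about them.

```python
from collections import Counter

# Invented toy data: six "anonymized" records with common demographic fields.
records = [
    {"gender": "F", "marital_status": "single",  "zip": "10001", "birth_year": 1985},
    {"gender": "M", "marital_status": "married", "zip": "10001", "birth_year": 1985},
    {"gender": "F", "marital_status": "single",  "zip": "94103", "birth_year": 1990},
    {"gender": "F", "marital_status": "married", "zip": "94103", "birth_year": 1990},
    {"gender": "M", "marital_status": "single",  "zip": "60614", "birth_year": 1978},
    {"gender": "M", "marital_status": "single",  "zip": "60614", "birth_year": 1978},
]

def unique_fraction(rows, attributes):
    """Fraction of rows whose combination of the given attributes is unique."""
    combos = Counter(tuple(row[a] for a in attributes) for row in rows)
    unique = sum(1 for row in rows if combos[tuple(row[a] for a in attributes)] == 1)
    return unique / len(rows)

# ZIP code alone singles out nobody in this toy data...
print(unique_fraction(records, ["zip"]))  # → 0.0

# ...but just three attributes together single out four of the six people.
print(unique_fraction(records, ["gender", "marital_status", "zip"]))
```

Scale the same effect up to 15 attributes and a national population and you get the study's headline figure.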
“The goal of anonymization is so we can use data to benefit society," said Yves-Alexandre de Montjoye, one of the researchers who worked on the study. “This is extremely important but should not and does not have to happen at the expense of people's privacy."
The blockchain revolution
Kopp believes that in some cases, you can identify someone with even fewer data points.
“With four places a vehicle has gone, you know who it is, in 80 percent of the cases," he said in a discussion at CES with Axios chief technology correspondent Ina Fried and HERE's chief technology officer Giovanni Lanfranchi.
Getting access to data that's useful but not personal is a challenge, as Kopp describes: “With AI and other machine learning techniques you have a powerful tool that can be used to re-identify people, but then blockchain can prevent it. It's essentially an arms race going on."
Asked by Fried how he sees privacy in the context of AI and future cities, Lanfranchi had three points: “First, a society where all the data is private by design. If I provide data, I know what the purpose is. Second, it's very decentralized and personalized. Third, it provides transparency for the entire community.
“We really believe there should be a maturation of the community, to have clear attention on privacy for the people." Fried responded:
“I want less cars on the road. I want more efficiency. Less carbon. I want to share a piece of my data to help make that possible. But I don't want to share all my data. And I want to know how it's being used."
The neutral server launched by HERE allows, for the first time, safe, secure and non-discriminatory access to car sensor data for third-party service providers while upholding privacy regulations. Daimler is the first car manufacturer to use the service, allowing it to offer third parties access to data points from millions of cars.
It works together with a dedicated consent management system, protected with blockchain technology and built around GDPR requirements: parties that want access to the data have to ask for the consent of individual drivers whenever the data contains personal information. The drivers, in turn, are free to take part or not, or to change their mind at any time.
“As a user, I want to be able to understand where my data is used at any point in time," said Lanfranchi. “I want to have the right to be forgotten. The consent management we have is based on blockchain and is a technology that allows trust among parties who do not necessarily trust each other."
For example, the neutral server lets an end user knowingly release their odometer data to an insurance company for usage-based insurance purposes. This is a known interaction, in which the individual user is identified directly to the insurance provider in order to benefit from the offer.
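The consent flow Lanfranchi describes, granular, revocable and auditable, can be caricatured in a few lines. HERE's actual consent management system is not public, so the class and field names below are hypothetical; the sketch only illustrates the two properties the article highlights: entries are hash-chained so the record is tamper-evident, and the latest entry wins so a driver can withdraw consent at any time.

```python
import hashlib
import json

class ConsentLedger:
    """Hypothetical tamper-evident consent log (illustrative names only)."""

    def __init__(self):
        self.entries = []

    def record(self, driver_id, data_type, recipient, granted):
        """Append a grant or revocation, chained to the previous entry's hash."""
        entry = {
            "driver": driver_id,
            "data": data_type,
            "recipient": recipient,
            "granted": granted,
            # Each entry commits to the one before it, blockchain-style.
            "prev": self.entries[-1]["hash"] if self.entries else None,
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self.entries.append(entry)

    def has_consent(self, driver_id, data_type, recipient):
        """Latest entry wins, so drivers can change their mind at any time."""
        for entry in reversed(self.entries):
            if (entry["driver"], entry["data"], entry["recipient"]) == (
                    driver_id, data_type, recipient):
                return entry["granted"]
        return False  # no consent on record means no access

# The article's odometer example: grant access to an insurer, then revoke it.
ledger = ConsentLedger()
ledger.record("driver-42", "odometer", "acme-insurance", granted=True)
print(ledger.has_consent("driver-42", "odometer", "acme-insurance"))  # → True
ledger.record("driver-42", "odometer", "acme-insurance", granted=False)
print(ledger.has_consent("driver-42", "odometer", "acme-insurance"))  # → False
```

Because each entry embeds the hash of its predecessor, altering an old grant or quietly deleting a revocation breaks the chain, which is what lets parties "who do not necessarily trust each other" rely on the same record.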
“With our neutral server we can offer car manufacturers like Daimler a trusted, safe and secure distribution channel to share their car sensor data that is compliant with European legislation and privacy regulations," Lanfranchi went on to say.
Similarly, DECODE, a data-democratization trial in Amsterdam and Barcelona, gives people the right to decide who to share their information with and what it will be used for. The aim is to improve services throughout the cities while protecting people's identities. Citizens can sign political petitions without revealing sensitive personal information, share noise and pollution sensor data anonymously and use social networking sites with greater control over how their profiles are used. You can read more about the Barcelona project here.
Back in 2009, Mashable founder Pete Cashmore offered a bleak outlook: “privacy is dead, and social media hold the smoking gun". A decade on, are we about to turn a corner on data privacy?