Introduction and location data
The personalisation revolution is in full swing. Mass advertising is dead… almost. There may still be plenty of commercials aimed at anyone who happens to be watching TV, but increasingly we all have our own digital channels.
From the 'personalised experience' you get on retail websites to the 'suggested posts' on Facebook and the adverts for something you once searched for in Google tracking you around the internet, the web is getting savvy to what individuals want. Personal data is being collected at an alarming rate, and with wearables on the rise and the Internet of Things coming, it's a movement that's only just begun.
By 2021 there will be twice as many smartphone contracts as there are today, reaching 6.3 billion. Ericsson recently reported that the IoT will overtake mobile phones by 2018, with 16 billion connected devices forecast to join the IoT by the end of 2021.
"We are facing a perfect storm," says Ross Woodham, Director for Legal Affairs and Privacy at Cogeco Peer 1. "With new technologies being developed, data being generated by personal devices at an exponential rate, and international security a global concern, it is no wonder legislators are struggling to get to grips with new issues and moral dilemmas."
Pervasive computing and wearables
From phones to wearables, computing is everywhere. It's pervasive. It's personal. It's the 'Quantified Self' where technology collects data on increasingly personal aspects of a person's life. "Whether they know it or not, consumers have long been 'selling' their data and we expect this trend to not only continue, but to grow," says Ashley Winton, Partner and UK head of data protection and privacy at law firm Paul Hastings LLP and Chairman of the UK Data Protection Forum.
There are now even devices, such as Livia and Bellabeat, that track women's daily activity, sleep and menstrual cycles. That is highly personal data, but potentially also highly useful to third parties.
No one is suggesting that either of those wearables sells data on to advertisers, but wearables in general are marketed on convenience and lifestyle, and their low initial cost often has a hidden explanation. "Their cost will be at least partly offset by the monetisation of personal data – the end-user will be offered a strictly limited interface to what the device does, and that interface will not expose or control what the device does with the data it collects and generates," says Robin Wilton, Technical Outreach Director for Identity and Privacy at the Internet Society.
Explicit consent
All of these trends are already visible in the first wave of connected consumer devices – the buyer gets a shiny device and a shiny free app, but loses control over some very intimate personal data. That bargain is not spelled out clearly enough, according to some.
"The more 'seamless' the service, the less awareness the user has of what is going on … it perpetuates a situation in which the user makes privacy decisions based on an unrepresentative subset of relevant information," says Wilton. "The imbalance of power between the user and the device/service provider persists, and there is no real notion of informed, explicit consent."
There are also currently no sanctions if that personal data is lost. "These device companies will likely be protected from liability in a breach by their own privacy policies, and interestingly, while the FTC still has jurisdiction if there is a major incident, it remains to be seen if people would be angry if their data of this sort is breached," says Robert Stroud, Director on ISACA's board and Principal Analyst at Forrester Research.
What about location data?
Part of personalisation is about a phone app or a wearable knowing where you are, which adds valuable contextualisation. Think Google Maps telling you about nearby restaurants, Facebook showing you which of your friends are nearby, or a smartwatch tracking you on a run or walk. The same applies to a laptop opened in a coffee shop.
Such location data is highly revealing about social networks, identity and predicted behaviour, as Professor Sandy Pentland at MIT demonstrated five years ago. More recent research at Columbia University found that location data makes users highly linkable across different services.
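To see why location traces are so linkable, consider that most people's movements are dominated by a handful of places (home, work), which act as a fingerprint. The sketch below is a hypothetical illustration of that idea, not the Columbia researchers' actual method: it coarsens each trace to its most-visited grid cells and links two accounts when those cells largely overlap. All coordinates and the `linkable` threshold are invented for the example.

```python
from collections import Counter

def top_cells(trace, k=3):
    """Reduce a location trace (lat, lon points) to its k most-visited
    ~1km grid cells -- a coarse 'home/work' fingerprint."""
    cells = [(round(lat, 2), round(lon, 2)) for lat, lon in trace]
    return set(cell for cell, _ in Counter(cells).most_common(k))

def linkable(trace_a, trace_b, k=3, threshold=0.5):
    """Treat two accounts as linkable when the Jaccard overlap of
    their top visited cells meets the (illustrative) threshold."""
    a, b = top_cells(trace_a, k), top_cells(trace_b, k)
    return len(a & b) / max(len(a | b), 1) >= threshold

# The same person seen on two different services: the traces differ
# in detail, but home and work dominate both.
service_1 = [(51.5074, -0.1278)] * 40 + [(51.5160, -0.0920)] * 30
service_2 = [(51.5071, -0.1281)] * 25 + [(51.5162, -0.0923)] * 35

print(linkable(service_1, service_2))  # -> True
```

Even this toy version links the two accounts, because rounding to two decimal places puts both traces in the same pair of cells; real attacks use far richer spatio-temporal matching.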
"The point about linking across contexts is important," says Wilton. "Individuals' sense of control and self-determination is closely linked to whether information they disclose in one context shows up in another context – without their intervention or awareness." The more it does, the less sense of privacy the user has, and the more their autonomy is undermined, according to Wilton.
Generation of inferences
'Personalised analytics' doesn't come without risks. "Profiling, analytics and big data all allow the generation of inferences, which affect individuals even if they were not the source of the original data," says Wilton. "Inference data is not currently well regulated – if at all – and yet it can often have at least as much privacy impact as the data from which it was derived."
For the upcoming era of wearable smart devices in medical care, this could be crucial – the immensely personal data doctors collect as they track pacemakers and monitor the vital signs (and even GPS position) of hospitalised patients would be vulnerable.
Although wearables categorise themselves as 'personal entertainment device monitors', as they become more prevalent in healthcare their data could be used beyond the initial reason for collection. "These devices could be used in a corporate wellness program, and leveraging the personal data may allow the company to impose sanctions on those employees that are not active enough," says Stroud.
Public datasets
Data is a jigsaw, and the more pieces floating around, the more likely it is that so-called anonymous data can be pieced together. What's made that easier is the emergence of public datasets.
"Right now medical data has specific requirements for anonymised data, but with the proliferation of public datasets it could be difficult to prove that you can't put that data back together," says Stroud. So if you can use the customer data without the personal information, you should. "Also developing are micro-retention policies, which are not graduated by server or dataset, but field-level retention strategies," he adds.
What should we do about all this personal data?
That's simple: minimisation, metadata, context and consent all need to be reformed. Wilton argues that fewer personal details should be asked for and disclosed, that metadata should be restricted to the layers of the internet where it is functionally essential, and that any disclosed data should be tightly bound to a clear context, with clear constraints and obligations on what can be done with it.
He also thinks that users should be given a simple record of consent, independent of the service provider's record of the transaction. "Service providers need an incentive to shift from the status quo," he says. "One way to encourage that is for users to make it clear that respect for privacy has a dividend for the service provider, in terms of reputation and customer satisfaction." In other words, users need to complain loudly and often, and things will change.
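Wilton's "simple record of consent, independent of the service provider" could be as little as a small structured receipt the user keeps, plus a hash they can later use to show what was agreed. The sketch below is one possible shape for such a receipt; the field names and the example service are illustrative inventions, not any standard's actual schema.

```python
# A minimal sketch of a user-held consent receipt. Field names and the
# service are hypothetical; the point is that the user keeps their own
# record, bound to a purpose and an expiry, outside the provider's systems.
import hashlib
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentReceipt:
    service: str            # who is collecting the data
    purpose: str            # the context the disclosure is bound to
    data_categories: tuple  # what is disclosed, and nothing more
    expires: str            # consent is not open-ended
    issued: str

    def fingerprint(self) -> str:
        """Stable digest the user can store to later prove what was agreed."""
        blob = json.dumps(asdict(self), sort_keys=True).encode()
        return hashlib.sha256(blob).hexdigest()

receipt = ConsentReceipt(
    service="example-wearable.com",   # hypothetical service
    purpose="step counting and sleep tracking only",
    data_categories=("step_count", "sleep_intervals"),
    expires="2017-12-31T00:00:00Z",
    issued=datetime.now(timezone.utc).isoformat(),
)
print(receipt.fingerprint())  # 64-hex-character digest the user keeps
```

Because the receipt and its digest live with the user, a provider cannot quietly rewrite the terms of the original disclosure without the mismatch being detectable.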
Increasing control
To some extent, this is already happening. Some even see a cultural shift within the tech industry toward giving people control over how businesses use their data. For instance, you can run a 'privacy check-up' on your Google and Facebook accounts.
"Businesses are giving consumers a larger number of controls over how their personal data is used, and I would argue that this increased control increases the intangible asset value of that data," says Winton, who expects that as technology develops it will hand over more control to users.
"It is only with more online controls and a better audit trail around use of those controls that businesses will be able to entrench control of personal data in the hands of the consumer, and better cope with the prospective General Data Protection Regulation," he adds.
The lesson for businesses worried about compliance is simple: collect less personal data, and rather than worry about where to safely store that data, give it back to the owner.