Apple’s CEO Tim Cook has joined the chorus of voices warning that data itself is being weaponized against people and societies — arguing that the trade in digital data has exploded into a “data industrial complex”.
Cook did not namecheck the adtech elephants in the room: Google, Facebook and other background data brokers that profit from privacy-hostile business models. But his target was clear.
“Our own information — from the everyday to the deeply personal — is being weaponized against us with military efficiency,” warned Cook. “These scraps of data, each one harmless enough on its own, are carefully assembled, synthesized, traded and sold.
“Taken to the extreme this process creates an enduring digital profile and lets companies know you better than you may know yourself. Your profile is then run through algorithms that can serve up increasingly extreme content, pounding our harmless preferences into hardened convictions.”
“We shouldn’t sugarcoat the consequences. This is surveillance,” he added.
Cook was giving the keynote speech at the 40th International Conference of Data Protection and Privacy Commissioners (ICDPPC), which is being held in Brussels this year, right inside the European Parliament’s Hemicycle.
“Artificial intelligence is one area I think a lot about,” he told an audience of international data protection experts and policy wonks, which included the inventor of the World Wide Web itself, Sir Tim Berners-Lee, another keynote speaker at the event.
“At its core this technology promises to learn from people individually to benefit us all. But advancing AI by collecting huge personal profiles is laziness, not efficiency,” Cook continued.
“For artificial intelligence to be truly smart it must respect human values — including privacy. If we get this wrong, the dangers are profound. We can achieve both great artificial intelligence and great privacy standards. It is not only a possibility — it is a responsibility.”
That sense of responsibility is why Apple puts human values at the heart of its engineering, Cook said.
In the speech, which we previewed yesterday, he also laid out a positive vision for technology’s “potential for good” — when combined with “good policy and political will”.
“We should celebrate the transformative work of the European institutions tasked with the successful implementation of the GDPR. We also celebrate the new steps taken, not only here in Europe but around the world — in Singapore, Japan, Brazil, New Zealand. In many more nations regulators are asking tough questions — and crafting effective reform.
“It is time for the rest of the world, including my home country, to follow your lead.”
Cook said Apple is “in full support of a comprehensive, federal privacy law in the United States” — making the company’s clearest statement yet of support for robust domestic privacy laws, and earning himself a burst of applause from assembled delegates in the process.
Cook argued for a US privacy law to prioritize four things:
- data minimization — “the right to have personal data minimized”, saying companies should “challenge themselves” to de-identify customer data or not collect it in the first place
- transparency — “the right to knowledge”, saying users should “always know what data is being collected and what it is being collected for”, calling that “the only way to empower users to decide what collection is legitimate and what isn’t”. “Anything less is a sham,” he added
- the right to access — saying companies should recognize that “data belongs to users”, and it should be made easy for users to get a copy of, correct and delete their personal data
- the right to security — saying “security is foundational to trust and all other privacy rights”
“We see vividly, painfully how technology can harm, rather than help,” he continued, arguing that platforms can “magnify our worst human tendencies… deepen divisions, incite violence and even undermine our shared sense of what is true or false”.
“This crisis is real. Those of us who believe in technology’s potential for good must not shrink from this moment,” he added, saying the company hopes “to work with you as partners” and that “our missions are closely aligned”.
He also took a sideswipe at tech industry efforts to defang privacy laws — saying that some companies will “endorse reform in public and then resist and undermine it behind closed doors”.
“They may say to you our companies can never achieve technology’s true potential if there were strengthened privacy regulations. But this notion isn’t just wrong, it is destructive — technology’s potential is and always must be rooted in the faith people have in it. In the optimism and the creativity that stirs the hearts of individuals. In its promise and capacity to make the world a better place.”
“It’s time to face facts,” he added. “We will never achieve technology’s true potential without the full faith and confidence of the people who use it.”
Opening the conference before Cook took to the stage, Europe’s data protection supervisor Giovanni Buttarelli argued that digitization is driving a new generational shift in the respect for privacy — saying there is an urgent need for regulators and indeed societies to agree on and establish “a sustainable ethics for a digitised society”.
“The so-called ‘privacy paradox’ is not that people have conflicting desires to hide and to expose. The paradox is that we have not yet learned how to navigate the new possibilities and vulnerabilities opened up by rapid digitization,” Buttarelli argued.
“To cultivate a sustainable digital ethics, we need to look, objectively, at how those technologies have affected people in good ways and bad. We need a critical understanding of the ethics informing decisions by companies, governments and regulators whenever they develop and deploy new technologies.”
The EU’s data protection supervisor told an audience largely made up of data protection regulators and policy wonks that laws that merely set a minimum standard are not enough, including the freshly minted GDPR.
“We need to ask whether our moral compass has been suspended in the drive for scale and innovation,” he said. “At this tipping point for our digital society, it is time to develop a clear and sustainable moral code.”
“We do not have a[n ethical] consensus in Europe, and we certainly do not have one at a global level. But we urgently need one,” he added.
“Not everything that is legally compliant and technically feasible is morally sustainable,” Buttarelli continued, pointing out that “privacy has too easily been reduced to a marketing slogan. But ethics cannot be reduced to a slogan.”
“For us as data protection authorities, I believe that ethics is among our most pressing strategic challenges,” he added.
“We have to be able to understand technology, and to articulate a coherent ethical framework. Otherwise how can we perform our mission to safeguard human rights in the digital age?”