Major Internet platforms will be required to open up their algorithms to regulatory oversight under proposals European lawmakers are set to introduce next month.
In a speech today, Commission EVP Margrethe Vestager suggested algorithmic accountability will be a key plank of the forthcoming digital legislative package, with draft rules incoming that will require platforms to explain how their recommendation systems work and to offer users more control over them.
“The rules we’re preparing would give all digital services a duty to cooperate with regulators. And the biggest platforms would have to provide more information on the way their algorithms work, when regulators ask for it,” she said, adding that platforms will also “have to give regulators and researchers access to the data they hold — including ad archives”.
While social media platforms like Facebook have set up ad archives ahead of any regulatory requirement to do so, third-party researchers continue to complain about how the information is structured and how (in)accessible it is to independent study.
More information for users around ad targeting is another planned requirement, along with greater reporting requirements for platforms to explain content moderation decisions, per Vestager — who also gave a preview of what’s coming down the pipe in the Digital Services Act and Digital Markets Act in another speech earlier this week.
Regional lawmakers are responding to concerns that 'black box' algorithms can have damaging effects on individuals and societies, flowing from how they process data and order and rank information, with risks such as discrimination, amplification of bias and abusive targeting of vulnerable individuals and groups.
The Commission has said it’s working on binding transparency rules with the aim of forcing tech giants to take more responsibility for the content their platforms amplify and monetize. The devil will be in both the detail of the requirements and how effectively they are enforced, but a draft of the plan is due in a month or so.
“One of the main goals of the Digital Services Act that we’ll put forward in December will be to protect our democracy, by making sure that platforms are transparent about the way these algorithms work – and make those platforms more accountable for the decisions they make,” said Vestager in a speech today at an event organized by not-for-profit research advocacy group AlgorithmWatch.
“The proposals that we’re working on would mean platforms have to tell users how their recommender systems decide which content to show – so it’s easier for us to judge whether to trust the picture of the world that they give us or not.”
Under the planned rules the most powerful Internet platforms — so-called ‘gatekeepers’ in EU parlance — will have to provide regular reports on “the content moderation tools they use, and the accuracy and results of those tools”, as Vestager put it.
There will also be specific disclosure requirements for ad targeting that go beyond the current fuzzy disclosures that platforms like Facebook may already offer (in Facebook’s case via the ‘why am I seeing this ad?’ menu).
“Better information” will have to be provided, she said, such as platforms telling users “who placed a certain ad, and why it’s been targeted at us”. The overarching aim will be to ensure users of such platforms have “a better idea of who’s trying to influence us — and a better chance of spotting when algorithms are discriminating against us,” she added.
Today a coalition of 46 civil society organizations led by AlgorithmWatch urged the Commission to make sure transparency requirements in the forthcoming legislation are “meaningful” — calling for it to introduce “comprehensive data access frameworks” that provide watchdogs with the tools they need to hold platforms accountable, as well as to enable journalists, academics, and civil society to “challenge and scrutinize power”.
The group’s recommendations call for binding disclosure obligations based on the technical functionalities of dominant platforms; a single EU institution “with a clear legal mandate to enable access to data and to enforce transparency obligations”; and provisions to ensure data collection complies with EU data protection rules.
Another way to rebalance the power asymmetry between data-mining platform giants and the individuals who they track, profile and target could involve requirements to let users switch off algorithmic feeds entirely if they wish — opting out of the possibility of data-driven discrimination or manipulation. But it remains to be seen whether EU lawmakers will go that far in the forthcoming legislative proposals.
The only hint Vestager offered on this front was to say that the planned rules “will also give more power to users — so algorithms don’t have the last word about what we get to see, and what we don’t get to see”.
Platforms will also have to give users “the ability to influence the choices that recommender systems make on our behalf”, she added.
In further remarks she confirmed there will be more detailed reporting requirements for digital services around content moderation and takedowns, saying platforms will have to tell users when they take content down and give them “effective rights to challenge that removal”.

While there is widespread public support across the bloc for rebooting the rules of play for digital giants, there are also strongly held views that regulation should not impinge on online freedom of expression, such as by encouraging platforms to shrink their regulatory risk by applying upload filters or removing controversial content without a valid reason.
The proposals will need the support of EU Member States, via the European Council, and of elected representatives in the European Parliament.
The latter has already voted in support of tighter rules on ad targeting. MEPs also urged the Commission to reject the use of upload filters or any form of ex-ante content control for harmful or illegal content, saying the final decision on whether content is legal or not should be taken by an independent judiciary.
Simultaneously, the Commission is working on shaping rules specifically for applications that use artificial intelligence, but that legislative package is not due until next year.
Vestager confirmed that will be introduced early in 2021 with the aim of creating “an AI ecosystem of trust”.