In the wake of Facebook CEO Mark Zuckerberg’s closed-door meeting with European Parliament heavyweights just prior to the activation of the bloc’s General Data Protection Regulation (GDPR), New Europe’s Irene Kostaki caught up with European Data Protection Supervisor Giovanni Buttarelli for an exclusive interview at the institution’s premises in Brussels.
What did you think about Zuckerberg’s hearing? Do you think Facebook will immediately comply with the GDPR?
That is a question to be forwarded to Mr. Zuckerberg. I appreciate that he came to the European Parliament and made himself available to answer questions. As President [Antonio] Tajani said, we had a first meeting and we are now waiting for action. I’ve seen a shorter version of what he told the Congressional hearing in Washington DC, where he basically said he ‘shares European values; there has been a mistake; I feel accountable; all the responsibility is on us; we apologise; etc’.
Our reaction is that ‘sorry’ is not enough. We have to understand what happened and, secondly, apply the outcome on a broader scale. My perception is that, while waiting for the final conclusions of the investigations that are ongoing in many countries …before the summer break, we have to evaluate the enforcement of the GDPR and the role that both Cambridge Analytica and Facebook have played.
This was not a data leak or a data breach… nor an isolated case. Nor was it a violation of contractual clauses. It didn’t come out of the blue. This was known to all of the relevant players from the beginning. But this is just the tip of the iceberg. There’s another case, revealed last week, in which data on 3 million people was again secretly collected through a personality test. At least, this is the first known figure.
Facebook announced that it is immediately suspending 200 apps. The Cambridge Analytica case was already known to them … when they appeared before the data protection authorities in the UK and Ireland. It was suggested that the data had been deleted, but that was not the case at all. [Zuckerberg] said he trusted the emails from Cambridge Analytica. Facebook’s conduct has fallen somewhere between what we call passive negligence, tolerance, or even complicity, by underestimating the impact of a lack of trust. The Cambridge Analytica case is not a standalone scandal; it is likely to produce long-term repercussions.
So, will the tech giants comply?
The GDPR implementation will require more than just maintenance, as we need to simplify things. I consider myself a better-than-average, or maybe even an expert, user, and I find it difficult to read all these privacy notices, not only Facebook’s, because they’re drafted in hard-to-understand legalese and crafted in such a way that they defend data controllers instead of protecting a person’s ability to be in control of their own data.
But one way or another, compliance will come. Some inconsistencies, let’s say, could be acceptable or even understandable in certain companies with more than 2 billion users in the world. The approach of the tech giants to the GDPR will also differ, with some even challenging it. Others don’t believe we will succeed when it comes to actually enforcing the rules.
When it comes to implementing the GDPR, enforcing the rules will make a difference. Are the Data Protection Authorities across the Member States ready? And what about administrative fines? To your knowledge, which of the Member States is ready to check on who complies with the GDPR and, therefore, impose administrative fines if necessary?
The countries have put in a lot of effort and at least two-thirds of them will adopt the regulations, but the adoption of national provisions is not essential because the framework regulation is self-executing and does not allow for delays. Some of these measures are important to equip the data protection authorities with more resources.
What is essential now is that the Data Protection Authorities (DPAs) be accountable, selective, accessible, predictable, and communicative in adapting to new technologies and services, and in cases of serious infringement.
Concerning your role as the European Data Protection Supervisor, the General Data Protection Regulation changes your role due to the intergovernmental status of the EDPS. Some might say that this makes you yet another Directorate-General of the European Commission.
No, definitely not; we are and will be more independent, regardless of the GDPR. The EU decided to deal with the task of interoperability. We have a lot of large-scale ID database systems that have been built under different data frameworks…and there is overlap when it comes to the publication of data. But a merger of the systems would take years.
Making this data accessible is a challenge, and therefore we have been asked to be the supervisor of Europol, Eurojust, the European and Mediterranean Plant Protection Organization, and the European Free Trade Association.
The EDPS is the only body that can supervise in this case, and we supervise the Directorates-General of the European Commission, which means we should be out of the equation so we can continue to build on the cooperation of institutions.
We will remain the key advisor of the Commission and we have a Memorandum of Understanding for confidential access, at an early stage, to help them make good decisions. We continue to give third party advice in the European Court of Justice, but we are not the appeals body for national decisions.
The European Council and Parliament have agreed on the GDPR for the EU institutions and bodies, and they will all now more or less follow the same approach as the EDPS, with the same powers and duties. One novelty is that we will now be able to impose fines.
Regarding autonomous technologies and the prohibition on solely automated decision-making – is this provision going to impinge on things like Artificial Intelligence (AI), autonomous driving, and autonomous tech in general?
No, exactly the opposite. The GDPR provides for the right not to be subjected to a decision made entirely by a machine, with some exceptions. You have hundreds of examples where automation can lead to discrimination and exclusion, so you have the right not to be forced into it. The principle can be easily harmonised with the sustainable development of AI.
As a former prosecutor in Italy – do you think EU and the Member States have struck the right balance between data protection and lawful access to data by law enforcement authorities? How do you see the recent e-Evidence proposal by the European Commission?
We will issue an opinion on the US CLOUD Act and its specifics. As a judge with a background in criminal matters, I have to say that my colleagues and law enforcement officials require easy access when they have a legal warrant. They should be well trained and have access to all the same tools that the criminals have, and their actions should be very intuitive and aggressive. That means the interference has to be intense, but not all data should be collected beforehand. Selectivity is key to efficiency.
Privacy as a competition issue – is it time to break up data monopolies?
…This is exactly our mantra. The GDPR alone cannot solve all the problems, even if all tech giants fully comply. We will have a balance of power between users, on the one hand, and the data controllers on the other. The question is more about the digital dividend: how it is concentrated in the hands of a few and how that is influencing the proper development of sustainable principles. The big platforms are forcing publishers into problematic ‘take it or leave it’ scenarios.
We would like to interact more with other enforcement authorities, and this is the reason why we proposed opening a new chapter on fake news regulation, with a view to upgrading the very proactive and dynamic synergy that exists between elements that include anti-trust campaigns, consumer protection, audio-visual material, electoral campaigns, as well as intellectual property rights and transparency.
Finally, can you share any insights about the privacy conference in October?
Don’t miss the event. We aim to make this the event of the decade. Regardless of the success of the conference, we have already succeeded in opening an international debate on digital ethics. We should go beyond legal rules and move towards the interaction of legal rules with values.
Values for the new society of the fourth industrial revolution will be defined now. If we do not do it now, it will be very, very late. In a machine-to-machine world of robotics, AI, and machine learning…it is essential that those now designing the future start considering what is morally acceptable, not only what is legally sustainable.