
‘GDPR still leaves room for companies to manipulate personal data’

Submitted by Geoffroy Didier MEP on 14 Sep 2018 – 17:51

GDPR does not prohibit the use of data; it obliges companies to inform users about the collection and use of their data. The regulation was conceived with an honourable goal: to give users control over their personal data. In practice, says Geoffroy Didier MEP, it has had the reverse effect. This, in his opinion, is the first and biggest problem with the regulation

Recently, our inboxes were bombarded with privacy updates related to the General Data Protection Regulation (GDPR), which took effect on 25 May 2018. The flood of emails aimed to inform us about the use of our personal data, a legal requirement under GDPR.

The Cambridge Analytica scandal took the debate about data regulation to a different level. GDPR will alter the data landscape, going some way towards curbing the power of the internet giants, but this is only the beginning. How much of an impact will the current regulation have on how tech platforms store and use personal data? Can GDPR fully protect us from abusive data collection and misuse?

Lack of responsibility

GDPR was conceived with an honourable goal: to give users control over their personal data. In practice it has had the reverse effect. Instead of making companies more accountable, GDPR placed the full responsibility on users. This, in my opinion, is the first and biggest problem with the regulation.

GDPR doesn’t prohibit the use of data, but forces companies to inform users about the collection and use of their data. Companies that collect personal data have to provide a variety of notifications designed to inform the user about who is collecting their data, what it is being collected for, how it will be used, to which third parties it may be provided and how long it will be stored.
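As an illustration only, the notifications listed above amount to a small structured record that every data collector must be able to produce. The sketch below uses hypothetical class and field names, not any real compliance library:

```python
from dataclasses import dataclass

# Hypothetical sketch: the disclosure items GDPR requires a collector
# to provide, modelled as one record per collection activity.
@dataclass
class DataCollectionNotice:
    controller: str              # who is collecting the data
    purpose: str                 # what it is being collected for
    uses: list[str]              # how it will be used
    third_parties: list[str]     # to whom it may be provided
    retention_days: int          # how long it will be stored

notice = DataCollectionNotice(
    controller="ExampleCo Ltd",
    purpose="newsletter personalisation",
    uses=["email targeting"],
    third_parties=["analytics-provider.example"],
    retention_days=365,
)
# Each field corresponds to one of the notifications described above.
print(notice.controller, notice.retention_days)
```

Listing the items this way also makes the article's complaint concrete: the information exists in structured form on the company's side, yet reaches users as prose buried in consent emails.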

As a result, GDPR arrived in the shape of email consent forms full of incomprehensible information and links to other sites with terms and conditions. This made it practically impossible for users to get clear information about the use of their data, or to choose when, and in which applications, they want to share it and under which circumstances they do not.

For instance, I spent countless hours trying to separate the Google and Facebook applications I use, and for which I consent to the use of my data, from those I do not use, where I want all information about me deleted. Each time I was redirected to another link without any concrete result. Finally, I gave up.

Unfortunately, this is an illustration of the fact that GDPR still leaves room for companies to manipulate personal data while avoiding responsibility for their actions.

Data harvesting and data mining

At present, personal data is the most precious asset, especially for tech firms. Here lies another problem: GDPR doesn’t protect personal data from interference by algorithms and use by advertisers.

This problem mainly relates to tech firms. To have access to a newspaper, a customer needs to buy it. The internet and the digital world, by contrast, are available free of charge. Users do not pay to access Facebook, Google, Snapchat or YouTube. Tech firms generate their revenues from advertising.

As a result, they create complicated algorithms and applications (so-called "persuasive technology") designed to push users into sharing their views, preferences and tastes, from which detailed behavioural profiles and psychological portraits of users can be built. It is the intentional cultivation and harvesting of data through persuasive technology, to be sold on to advertisers.

All they need is an algorithm to identify behavioural correlations. Meanwhile, those complicated algorithms, the machine learning tools that power everything from our social media news feeds to the products suggested by search engines, remain opaque and unregulated. They fall outside the scope of GDPR. Moreover, they are protected as intellectual property, making them trade secrets.
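The point about behavioural correlations is less exotic than it sounds. A minimal sketch, using made-up click counts and a plain Pearson correlation rather than any real platform's algorithm, shows how easily behavioural similarity falls out of harvested interaction data:

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length interaction vectors."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical clicks per topic: politics, sport, travel, cooking
alice = [9, 1, 4, 0]
bob   = [8, 2, 5, 1]
carol = [0, 7, 1, 6]

print(round(pearson(alice, bob), 2))    # close to +1: similar behaviour
print(round(pearson(alice, carol), 2))  # close to -1: opposite behaviour
```

A score near +1 marks two users as behaviourally alike, so an ad shown to one can be targeted at the other. Real systems are vastly more complex, but the underlying logic, and the reason the raw interaction data is so valuable, is this simple.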

A clear definition of personal data

GDPR protects personal data such as travel records, religious affiliations, biometric data, web search results and credit card numbers. But what about the personal data generated by the persuasive technologies described above, aimed at targeting consumers for marketing or political campaigning purposes? Unfortunately, GDPR doesn’t provide a miracle solution to this problem.

A big step forward, but not enough to catch up with fast-developing technologies

There is enough vagueness in GDPR to ensure that companies will still be tweaking their compliance well beyond the deadline. The simple fact is that the legislation has failed to keep pace with the speed of technological change. GDPR makes no direct reference to data harvesting or data mining. It does not outlaw the use of personal data for marketing or political purposes. Finally, it allows companies to claim a legitimate interest, usually commercial, in not disclosing data. GDPR is a huge step forward for data regulation, but certainly not a miracle solution. It puts the data protection debate on a new level, and it is for us to take it further.