Facebook After the NHS Trusts Leak: What Can We Learn From That?

Facebook has been almost synonymous with data violations since as early as 2006. Nearly two decades after its launch, Meta has turned Facebook, along with its other services and acquisitions, into a user-data-mining adtech operation.

Although the company always finds words to rationalize its thirst for personal data, data protection regulations coming into effect are bringing turbulent skies over Meta's scraping algorithms.

Europe’s General Data Protection Regulation (GDPR) is at the forefront of Meta’s current troubles. Following an April 13th, 2023 binding decision by the European Data Protection Board (EDPB), Meta was hit with an unprecedented fine of 1.2 billion euros over the transfer of EU citizens’ data to the US. The GDPR puts strict boundaries on gathering, using, and storing EU citizens’ data to ensure its safety and prevent misuse. Furthermore, it prohibits transferring such data outside the EU unless adequate safeguards or specific exceptions apply.

Facebook was quick to deny any wrongdoing, stating: “We are appealing these decisions and will immediately seek a stay with the courts who can pause the implementation deadlines, given the harm that these orders would cause, including to the millions of people who use Facebook every day.”

Although the prominent social network uses privacy as a catchword (its “The Future Is Private” slogan is but a publicity stunt), it continuously fails to live up to the promise.

Furthermore, Facebook was embroiled in the notorious Cambridge Analytica data scandal and agreed to a $725 million settlement over its role in user data gathering, misuse, and manipulation.

One would think a billion-euro fine and an order to suspend its European data transfers would deter Meta from further violations. But a month later, the social network was again in deep water, caught collecting personally identifiable information from 20 NHS trusts.

On May 27th, 2023, the Guardian ran an article illuminating how the Meta Pixel, a tracking tool embedded in NHS trust websites, collected a range of information: keyword searches, page views, and even prescriptions and doctor appointments.

What’s worse, it matched this data with the user’s IP address (a personally identifiable piece of information that can be used to build an online profile) and sent it to Meta’s US servers. In other words, more than 22 million UK residents had their most intimate medical details shared with a US third party without giving consent.

The Meta Pixel is a covert snippet of JavaScript that loads an invisible tracking pixel on the websites where it is embedded, in this case, NHS trust sites. The script follows the user around the web, monitoring their online activity. Facebook’s position is that the trusts themselves shared this data, and that it somehow slipped through the filters meant to weed out information prohibited from collection. That statement is hard to believe, considering an ex-Google engineer had demonstrated half a year earlier how Meta uses the Pixel to mine user data, and the company was aware of it.
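To make the mechanism concrete, here is a simplified, hypothetical sketch of how such a tracking pixel works. This is not Meta’s actual code; the `tracker.example` host, the function name, and the parameter names are all invented for illustration.

```javascript
// Simplified, hypothetical sketch of a tracking pixel.
// NOT Meta's actual code; host and parameter names are invented.
function buildBeaconUrl(trackerHost, pageUrl, searchQuery, visitorId) {
  // Everything the visitor does on the page is packed into URL parameters.
  const params = new URLSearchParams({
    dl: pageUrl,      // the page being viewed (e.g. an appointments page)
    q: searchQuery,   // a keyword search typed on the site
    vid: visitorId,   // identifier tying events to one visitor
  });
  return `https://${trackerHost}/collect?${params.toString()}`;
}

// In the browser, the script would then request this URL as an invisible
// 1x1 image; the request carries the parameters above, plus the visitor's
// IP address, straight to the tracker's servers:
//   new Image().src = buildBeaconUrl("tracker.example", location.href, query, id);
```

Because the data travels as an ordinary image request, most visitors have no visible indication anything was sent at all; intercepting exactly this kind of beacon is what tracker-blocking browser extensions do.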

There’s little stopping Meta from boosting ad revenue through exploitative marketing practices, since it regularly pays fines instead of changing its business model. But there’s a lesson to be taken away, one of personal responsibility.

Billions jumped on the social network hype train, sharing personal details without thinking twice. However, there’s a qualitative difference between sharing cat pictures and doctor’s prescriptions. Although NHS website visitors were unaware their online activities were being spied upon, many failed to protect their personal data from third-party surveillance or unauthorized access. Verifying a site’s privacy policy is essential whenever it handles the most sensitive data, like medical records. Sadly, only three of the twenty affected NHS trusts mentioned Meta or Facebook in their privacy statements; sometimes the user is simply left in the dark.

It’s hard to say exactly how Facebook uses the gathered data. It always cites personalized ad targeting as the primary reason, but using medical records for advertising raises questions of an entirely different kind, starting with business ethics.

So far, entrusting personal information to Facebook has been a double-edged sword: the social network’s comfort and entertainment come with privacy violations and data misuse. Ex-Facebook VP for user growth Chamath Palihapitiya warned that the company uses data for psychological manipulation to keep users logged in and hooked. Where all the UK NHS trust data ends up remains an open question, but if we were to guess, the answer wouldn’t look transparent.

One way or another, this is but the latest example of two decades of data mining, this time aimed at the most sensitive personal information. It’s as good a time as any to limit one’s Facebook use, at least until the company’s numerous data-violation fines translate into real change.
