Mental health websites in Europe found sharing user data for ads


Research by a privacy rights advocacy group has found popular mental health websites in the EU are sharing users’ sensitive personal data with advertisers.

Europeans going online to seek support with mental health issues are having sensitive health data tracked and passed to third parties, according to Privacy International’s findings, including depression websites passing users’ answers and test results directly to third parties for ad targeting purposes.

The charity used the open source Webxray tool to analyze the data gathering habits of 136 popular mental health web pages in France, Germany and the UK, as well as looking at a small sub-set of online depression tests (the top three Google search results for the phrase per country).

It has compiled its findings into a report called Your mental health for sale.

“Our findings show that many mental health websites don’t take the privacy of their visitors as seriously as they should,” Privacy International writes. “This research also shows that some mental health websites treat the personal data of their visitors as a commodity, while failing to meet their obligations under European data protection and privacy laws.”

Under Europe’s General Data Protection Regulation (GDPR), there are strict rules governing the processing of health data — which is classified as special category personal data.

If consent is being used as the legal basis to gather this type of data, the standard that must be met is “explicit” consent from the user.

In practice that might mean a pop-up before you take a depression test which asks whether you’d like to share your mental health data with a laundry list of advertisers so they can use it to sell you stuff when you’re feeling low, while also offering a clear, penalty-free ‘hell no’ option that still lets you take the test.

Safe to say, such unvarnished consent screens are as rare as hen’s teeth on the modern Internet.

But, in Europe, beefed-up privacy laws are now being used to challenge the systemic abuses of the ‘data industrial complex’ and help individuals enforce their rights against a behavior-tracking adtech industry that regulators have warned is out of control.

Among Privacy International’s key findings are that —

  • 76.04% of the mental health web pages contained third-party trackers for marketing purposes
  • Google trackers are almost impossible to avoid: 87.80% of the web pages in France had a Google tracker, 84.09% in Germany and 92.16% in the UK
  • Facebook is the second most common third-party tracker after Google, with 48.78% of the French web pages analysed sharing data with Facebook; 22.73% in Germany; and 49.02% in the UK
  • Amazon Marketing Services were also used by many of the mental health web pages analysed (24.39% in France; 13.64% in Germany; and 11.76% in the UK)
  • Depression-related web pages used a large number of third-party tracking cookies which were placed before users were able to express (or deny) consent. On average, PI found the mental health web pages placed 44.49 cookies in France; 7.82 in Germany; and 12.24 in the UK
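Percentages like these can be derived straightforwardly from per-page scan results. The sketch below shows one hypothetical way to compute a tracker’s prevalence across a set of scanned pages; the record structure and domain names are illustrative, not webxray’s actual output format:

```python
# Hypothetical sketch: computing tracker prevalence from webxray-style
# scan results. The data layout here is an assumption for illustration.

def tracker_share(pages, tracker_domain):
    """Percentage of scanned pages that contacted a given third-party domain."""
    hits = sum(1 for page in pages if tracker_domain in page["third_party_domains"])
    return round(100 * hits / len(pages), 2)

# Toy sample standing in for real scan results
scan = [
    {"url": "https://depression-test.example.fr",
     "third_party_domains": {"google-analytics.com", "facebook.com"}},
    {"url": "https://mh-info.example.fr",
     "third_party_domains": {"google-analytics.com"}},
    {"url": "https://clinic.example.fr",
     "third_party_domains": set()},
]

print(tracker_share(scan, "google-analytics.com"))  # 66.67
print(tracker_share(scan, "facebook.com"))          # 33.33
```

Run over 136 real pages grouped by country, the same counting logic would yield figures of the kind reported above.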

European law around consent as a legal basis for processing (general) personal data — including for dropping tracking cookies — requires it to be informed, specific and freely given. This means websites that wish to gather user data must clearly state what data they intend to collect for what purpose, and do so before doing it, providing visitors with a free choice to accept or decline the tracking.
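In code terms, that standard means no tracking cookie may be set until the visitor has made an affirmative choice. A minimal sketch of consent-gated cookie placement (all names are illustrative; a real site would use its web framework’s cookie API):

```python
# Minimal sketch of consent-gated cookie placement.
# Cookie names and values are illustrative assumptions.

TRACKING_COOKIES = {"ad_id": "abc123", "analytics_id": "xyz789"}

def cookies_to_set(consent_choice):
    """Return only the cookies the visitor's choice permits.

    consent_choice is None until the visitor answers the banner:
    tracking must stay off by default, not merely until they
    click 'decline'.
    """
    if consent_choice == "accept":
        return dict(TRACKING_COOKIES)
    # "decline", or no answer yet: strictly necessary cookies only
    return {}
```

The key design point, and where PI found sites falling short, is the default branch: before any choice is expressed, the function returns nothing, rather than dropping trackers and asking afterwards.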

Dropping tracking cookies without even asking clearly falls foul of that legal standard. And it falls even further foul when you consider that the personal data being handled by these mental health websites is highly sensitive, special category health data.

“It is exceedingly difficult for people to seek mental health information and, for example, take a depression test without countless third parties watching,” said Privacy International technologist Eliot Bendinelli in a statement. “All website providers have a responsibility to protect the privacy of their users and comply with existing laws, but this is particularly the case for websites that share unusually granular or sensitive data with third parties. Such is the case for mental health websites.”

Additionally, the group’s analysis found some of the trackers embedded on mental health websites are used to enable a programmatic advertising practice known as Real Time Bidding (RTB). 

This is important because RTB is subject to multiple complaints under GDPR.

These complaints argue that the systematic, high velocity trading of personal data is inherently insecure, with no way for people’s information to be secured after it’s shared with the hundreds or even thousands of entities involved in the programmatic chain, because there’s no way to control it once it’s been passed on. And, therefore, that RTB fails to comply with the GDPR’s requirement that personal data be processed securely.

Complaints are being considered by regulators across multiple Member States. But this summer the UK’s data watchdog, the ICO, essentially signalled it is in agreement with the crux of the argument, putting the adtech industry on watch in an update report warning that behavioral advertising is out of control and instructing the industry to reform.

However the regulator also said it would give players “an appropriate period of time to adjust their practices”, rather than wade in with a decision and banhammers to enforce the law now.

The ICO’s decision to opt for an implied threat of future enforcement to push for reform of non-compliant adtech practices, rather than taking immediate action to end privacy breaches, drew criticism from privacy campaigners.

And that decision does look problematic now, given Privacy International’s findings suggest sensitive mental health data is being sucked up into bid requests and circulated at scale without adequate security, where it could pose a serious risk to individuals’ rights and freedoms.

Privacy International says it found “numerous” mental health websites including trackers from known data brokers and AdTech companies — some of which engage in programmatic advertising. It also found some depression test websites (namely: netdoktor.de, passeportsante.net and doctissimo.fr, out of those it looked at) are using programmatic advertising with RTB.

“The findings of this study are part of a broader, much more systemic problem: The ways in which companies exploit people’s data to target ads with ever more precision is fundamentally broken,” adds Bendinelli. “We’re hopeful that the UK regulator is currently probing the AdTech industry and the many ways it uses special category data in ways that are neither transparent nor fair and often lack a clear legal basis.”

We’ve reached out to the ICO with questions.

We also asked the Internet Advertising Bureau Europe what steps it is taking to encourage reform of RTB to bring the system into compliance with EU privacy law. At the time of writing the industry association had not responded.

The IAB recently released a new version of what it refers to as a “transparency and consent management framework”, intended for websites to embed in order to collect visitors’ consent to the processing of their data, including for ad targeting purposes (legally so, the IAB contends).

However critics argue this is just another dose of business-as-usual ‘compliance theatre’ from the adtech industry, with users offered only phoney choices since they have no real control over how their personal data gets used or where it ends up.

Earlier this year Google’s lead privacy regulator in Europe, the Irish DPC, opened a formal investigation into the company’s processing of personal data in the context of its online Ad Exchange — also as a result of a RTB complaint filed in Ireland.

The DPC said it will look at each stage of an ad transaction to establish whether the ad exchange is processing personal data in compliance with GDPR — including looking at the lawful basis for processing; the principles of transparency and data minimisation; and its data retention practices.

The outcome of that investigation remains to be seen. (Fresh fuel was poured on the fire just today, with the complainant submitting new evidence of their personal data being shared in a way they allege infringes the GDPR.)

Increased regulatory attention on adtech practices is certainly highlighting plenty of legally questionable and ethically dubious stuff — like embedded tracking infrastructure that’s taking liberal notes on people’s mental health condition for ad targeting purposes. And it’s clear that EU regulators have a lot more work to do to deliver on the promise of GDPR.





Facebook could face billions in potential damages as court rules facial recognition lawsuit can proceed


Facebook is facing exposure to billions of dollars in potential damages as a federal appeals court on Thursday rejected Facebook’s arguments to halt a class action lawsuit claiming it illegally collected and stored the biometric data of millions of users.

The class action lawsuit has been working its way through the courts since 2015, when Illinois Facebook users sued the company for alleged violations of the state’s Biometric Information Privacy Act by automatically collecting and identifying people in photographs posted to the service.

Now, thanks to a unanimous decision from the 9th U.S. Circuit Court of Appeals in San Francisco, the lawsuit can proceed.

The most significant language from the decision from the circuit court seems to be this:

 We conclude that the development of a face template using facial-recognition technology without consent (as alleged here) invades an individual’s private affairs and concrete interests. Similar conduct is actionable at common law.

The American Civil Liberties Union came out in favor of the court’s ruling.

“This decision is a strong recognition of the dangers of unfettered use of face surveillance technology,” said Nathan Freed Wessler, staff attorney with the ACLU Speech, Privacy, and Technology Project, in a statement. “The capability to instantaneously identify and track people based on their faces raises chilling potential for privacy violations at an unprecedented scale. Both corporations and the government are now on notice that this technology poses unique risks to people’s privacy and safety.”

As April Glaser noted in Slate, Facebook may already have the world’s largest database of faces, and that’s something that should concern regulators and privacy advocates.

“Facebook wants to be able to certify identity in a variety of areas of life just as it has been trying to corner the market on identity verification on the web,” Siva Vaidhyanathan told Slate in an interview. “The payoff for Facebook is to have a bigger and broader sense of everybody’s preferences, both individually and collectively. That helps it not only target ads but target and develop services, too.”

That could apply to facial recognition technologies as well. Facebook, thankfully, doesn’t sell its facial recognition data to other parties, but it does allow companies to use its data to target certain populations. It also allows people to use its information for research and to develop new services that could target Facebook’s billion-strong population of users.

As our own Josh Constine noted in an article about the company’s planned cryptocurrency wallet, the developer community poses as much of a risk to how Facebook’s products and services are used and abused as Facebook itself.

Facebook has said that it plans to appeal the decision. “We have always disclosed our use of face recognition technology and that people can turn it on or off at any time,” a spokesman said in an email to Reuters.

Now the lawsuit will go back, for a possible trial, to U.S. District Judge James Donato in San Francisco, who approved the class action last April.

Under the privacy law in Illinois, negligent violations are subject to damages of up to $1,000 each, and intentional violations of privacy are subject to up to $5,000 in penalties. For the potential 7 million Facebook users that could be included in the lawsuit, those figures could amount to real money.
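The “billions” in the headline is simple multiplication: statutory damages per user times the potential class size. A back-of-the-envelope sketch, assuming every class member counted as one violation:

```python
# Back-of-the-envelope exposure under BIPA's statutory damages,
# assuming one violation per potential class member.
class_size = 7_000_000            # potential class members per the article
negligent = 1_000 * class_size    # $1,000 per negligent violation
intentional = 5_000 * class_size  # $5,000 per intentional violation

print(f"${negligent:,} to ${intentional:,}")
# $7,000,000,000 to $35,000,000,000
```

That is a theoretical ceiling, not a prediction: actual exposure would depend on how the class is certified and how violations are counted.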

“BIPA’s innovative protections for biometric information are now enforceable in federal court,” added Rebecca Glenberg, senior staff attorney at the ACLU of Illinois. “If a corporation violates a statute by taking your personal information without your consent, you do not have to wait until your data is stolen or misused to go to court. As our General Assembly understood when it enacted BIPA, a strong enforcement mechanism is crucial to hold companies accountable when they violate our privacy laws. Corporations that misuse Illinoisans’ sensitive biometric data now do so at their own peril.”

These civil damages could come on top of the penalty Facebook has already agreed to pay the U.S. government for violating its agreement with the Federal Trade Commission over its handling of private user data: a potential $5 billion payout, one of the largest penalties ever levied against a U.S. technology company, which is still subject to approval by the Justice Department.

