
The Sale of Americans’ Mental Health Data

Millions of Americans have been using health-tracking apps for the last few years, and since the pandemic, apps for mental health issues like depression and anxiety have proliferated. We are used to our private medical information in traditional medical settings being protected by HIPAA (the Health Insurance Portability and Accountability Act). Unfortunately, HIPAA wasn’t designed for the modern digital world and its new technologies. Most apps and devices, including health, mental health, and biometric trackers, don’t fall under HIPAA rules, meaning these companies can sell your private health data to third parties, with or without your consent.

A new research report published by Duke University’s Technology Policy Lab reveals that data brokers are selling huge datasets full of identifiable personal information—including psychiatric diagnoses and medication prescriptions, as well as many other identifiers, such as age, gender, ethnicity, religion, number of children, marital status, net worth, credit score, home ownership, profession, and date of birth—all matched with names, addresses, and phone numbers of individuals.

Data brokers are selling massive lists of psychiatric diagnoses, prescriptions, hospitalizations, and even lab results, all linked to identifiable contact information.

Researcher Joanne Kim began by searching for data brokers online. She contacted 37 of them by email or via a form on their websites (Kim identified herself as a researcher in the initial contact). None of those she contacted via email responded; some of those she contacted via form referred her to other data brokers. In total, 26 responded in some way (including some automated responses). Ultimately, only 10 data brokers had sustained contact with Kim by call or virtual meeting, so those were included in the study.

The 10 most engaged data brokers asked about the purpose of the purchase and the intended use cases for the data. However, after receiving that information (verbally or in writing) from the author, those companies did not appear to apply any additional client-management controls, and nothing in their emails or phone calls indicated that they had conducted separate background checks to corroborate the author’s (non-deceptive) statements.

The data brokers reported varying conditions for selling the data:

  • One emphasized that the requested data on individuals’ mental health conditions was “extremely restricted” and that its team would need more information on intended use cases, yet it continued to send a sample of aggregated, deidentified data counts.
  • After confirming that the author was not part of a marketing entity, one sales representative said that as long as the author did not contact the individuals in the dataset, the author could use the data freely.
  • One implied it may have fully identified patient data but said it was unable to share this individual-level data due to HIPAA compliance concerns. Instead, the sales representative offered to aggregate the data of interest in a deidentified form.
  • One was most willing to sell data on depressed and anxious individuals at the author’s budget price of $2,500 and stated no apparent restrictive data-use limitations post-purchase.
  • Another advertised highly sensitive mental health data to the author, including names and postal addresses of individuals with depression, bipolar disorder, anxiety issues, panic disorder, cancer, PTSD, OCD, and personality disorders, as well as individuals who have had strokes, along with data on those people’s races and ethnicities.
  • Two data brokers mentioned nondisclosure agreements (NDAs) in their communications, and one indicated that signing an NDA was a prerequisite for obtaining access to information on the data it sells.
  • One often made unsolicited calls to the author’s personal cell phone; if the author was delayed in responding to an email from the data broker, the frequency of calls seemed to increase.

Conclusions

The author concludes that additional research is critical as more depressed and anxious individuals use personal devices and software-based health-tracking applications (which are not protected by HIPAA), often unknowingly putting their sensitive mental health data at risk. The report finds that the industry appears to lack a set of best practices for handling individuals’ mental health data, particularly in the areas of privacy and buyer vetting, and that there are data brokers that advertise and are willing and able to sell data concerning Americans’ highly sensitive mental health information.

This research concludes by highlighting that the largely unregulated and black-box nature of the data broker industry, its buying and selling of sensitive mental health data, and the lack of clear consumer privacy protections in the U.S. necessitate a comprehensive federal privacy law or, at the very least, an expansion of HIPAA’s privacy protections alongside bans on the sale of mental health data on the open market.

[Link]