
New York Passes Bill to Ban Addictive Social Media for Children

New York lawmakers this week passed a bill that bans internet companies from exploiting personal data and implementing “addictive” algorithms that are designed to keep children ‘hooked’ on social media.

As part of an ongoing effort to curb technology’s role in fueling a mental health crisis in youth, New York’s governor’s office is also supporting a ban on the use of smartphones in schools, which will be debated by educational departments, healthcare professionals, parents and lawmakers over the next few months.

The Stop Addictive Feeds Exploitation for Kids Act will require social media companies to restrict key addictive features on their platforms for users under 18 in New York. Once the bill is signed into law, the Attorney General’s Office will devise specific enforcement rules and regulations, and the measures will take effect 180 days after those enforcement details are finalized. Technology companies will face fines of up to $5,000 per violation of the youth data privacy and addictive algorithm ban.

A second bill, called the New York Child Data Protection Act, would prohibit all online sites from collecting, using, sharing, or selling personal data of anyone under the age of 18, unless they receive informed consent or unless doing so is strictly necessary for the purpose of the website. For users under 13, that informed consent must come from a parent.

National Online Privacy

A federal proposal, the American Privacy Rights Act, aims to set nationwide standards for how companies like Meta, TikTok, Google and others can gather, use and sell user data, requiring them to collect only the amount necessary to provide products and services. That bill would transform how social media companies and online search engines use consumers’ personal data in a push to give Americans more control.

“Historic Step” Forward in New York

New York is making a serious push to improve youth mental health and “create a safer digital environment for young people.” According to NY Attorney General Letitia James, “Our children are enduring a mental health crisis, and social media is fueling the fire and profiting from the epidemic.” The legislation, she added, targets “the addictive features that have made social media so insidious and anxiety-producing.” State Senator Andrew Gounardes, D-Brooklyn, stated that “New York is sending a clear message to Big Tech: your profits are not more important than our kids’ privacy and wellbeing.” He noted that the bill he championed overcame substantial lobbying and opposition from the tech industry.

[Link]

What Teen Girls Think About TikTok, Instagram, and How Social Media Impacts Their Lives

A new report by Common Sense Media shows that nearly half (45%) of girls who use TikTok say they feel “addicted” to the platform or use it more than intended.

Today, a new research report by Common Sense Media reveals what teen girls think about TikTok and Instagram, and describes the impact that these and other social media platforms have on their lives. According to the report, Teens and Mental Health: How Girls Really Feel About Social Media, nearly half (45%) of girls who use TikTok say they feel “addicted” to the platform or use it more than intended at least weekly. Among girls with moderate to severe depressive symptoms, roughly seven in 10 who use Instagram (75%) and TikTok (69%) say they come across problematic suicide-related content at least monthly on these platforms.

A survey of over 1,300 adolescent girls across the country sought to better understand how the most popular social media platforms and design features impact their lives today. Among the report’s key findings, adolescent girls spend over two hours daily on TikTok, YouTube, and Snapchat, and more than 90 minutes on Instagram and messaging apps. When asked about platform design features, the majority of girls believe that features like location sharing, public accounts, endless scrolling, and appearance filters have an effect on them, but they’re split on whether those effects are positive or negative. Girls were most likely to say that location sharing (45%) and public accounts (33%) had a mostly negative effect on them, compared to other features. In contrast, they were most likely to say that video recommendations (49%) and private messaging (45%) had a mostly positive impact on them.

Other key findings

  1. Nearly four in 10 (38%) girls surveyed report symptoms of depression, and among these girls, social media has an outsize impact, for better and for worse.
  2. Girls who are struggling socially offline are three to four times as likely as other girls to report daily negative social experiences online, but they’re also more likely to reap the benefits of the digital world.
  3. Seven out of 10 adolescent girls of color who use TikTok (72%) or Instagram (71%) report encountering positive or identity-affirming content related to race at least monthly on these platforms, but nearly half report exposure to racist content or language on TikTok (47%) or Instagram (48%) at least monthly.
  4. Across platforms, LGBTQ+ adolescent respondents are roughly twice as likely as non-LGBTQ+ adolescents to encounter hate speech related to sexual or gender identity, but also more likely to find a connection. More than one in three LGBTQ+ young people (35%) who use TikTok say they have this experience daily or more on the platform, as do 31% of LGBTQ+ users of messaging apps, 27% of Instagram users, 25% of Snapchat users, and 19% of YouTube users.
  5. Girls have mixed experiences related to body image when they use social media. Roughly one in three girls who use TikTok (31%), Instagram (32%), and Snapchat (28%) say they feel bad about their body at least weekly when using these platforms, while nearly twice as many say they feel good or accepting of their bodies at least weekly while using TikTok (60%), Instagram (57%), and Snapchat (59%).
  6. The majority of girls who use Instagram (58%) and Snapchat (57%) say they’ve been contacted by a stranger on these platforms in ways that make them uncomfortable. These experiences were less common, though still frequent, on other platforms, with nearly half of TikTok (46%) and messaging app (48%) users having been contacted by strangers on these platforms.

Common Sense Media also announced today that the organization is launching the “Healthy Young Minds” campaign, a multiyear initiative focused on building public understanding of the youth mental health crisis, spotlighting solutions, and catalyzing momentum for industry and policy change. Town halls are scheduled for New York City, Arizona, Los Angeles, Indianapolis, Florida, Massachusetts, London, and Brussels, with more locations to be announced shortly. Further research and digital well-being resources for educators will be released in the coming year.

To learn more about Common Sense Media, the survey, or the educational materials available, see:

Source: Common Sense
Link to Survey: Teens and Mental Health: How Girls Really Feel About Social Media
Report Infographic
Curriculum and classroom resources


The Sale of Americans’ Mental Health Data

Millions of Americans have been using health tracking apps for the last few years, and since the pandemic, numerous apps for mental health issues like depression and anxiety have proliferated. We are used to having our private medical information protected by HIPAA (the Health Insurance Portability and Accountability Act) in traditional medical settings. Unfortunately, HIPAA wasn’t designed for the modern digital world and its new technologies. Most apps, including health, mental health, and biometric tracking apps and devices, don’t fall under HIPAA rules, meaning that these companies can sell your private health data to third parties, with or without your consent.

A new research report published by Duke University’s Technology Policy Lab reveals that data brokers are selling huge datasets full of identifiable personal information—including psychiatric diagnoses and medication prescriptions, as well as many other identifiers, such as age, gender, ethnicity, religion, number of children, marital status, net worth, credit score, home ownership, profession, and date of birth—all matched with names, addresses, and phone numbers of individuals.

Data brokers are selling massive lists of psychiatric diagnoses, prescriptions, hospitalizations, and even lab results, all linked to identifiable contact information.

Researcher Joanne Kim began by searching for data brokers online. She contacted 37 of them by email or via a form on their websites, identifying herself as a researcher in the initial contact. None of those she contacted via email responded; some of those she contacted via form referred her to other data brokers. A total of 26 responded in some way, including some automated responses. Ultimately, only 10 data brokers sustained contact with Kim by call or virtual meeting, so those were included in the study.

The 10 most engaged data brokers asked about the purpose of the purchase and the intended use cases for the data; however, after receiving that information (verbally or in writing) from the author, those companies did not appear to have additional controls for client management, and there was no indication in emails and phone calls that they had conducted separate background checks to corroborate the author’s (non-deceptive) statements.

The data brokers’ reported conditions and practices for selling data:

  • One emphasized that the requested data on individuals’ mental health conditions was “extremely restricted” and that their team would need more information on intended use cases, yet continued to send a sample of aggregated, deidentified data counts.
  • After confirming that the author was not part of a marketing entity, one sales representative said that as long as the author did not contact the individuals in the dataset, the author could use the data freely.
  • One implied they may have fully identified patient data, but said they were unable to share this individual-level data due to HIPAA compliance concerns. Instead, the sales representative offered to aggregate the data of interest in a deidentified form.
  • One was most willing to sell data on depressed and anxious individuals at the author’s budget price of $2,500 and imposed no apparent restrictions on data use post-purchase.
  • Another advertised highly sensitive mental health data to the author, including names and postal addresses of individuals with depression, bipolar disorder, anxiety issues, panic disorder, cancer, PTSD, OCD, and personality disorder, as well as individuals who have had strokes, along with data on those people’s races and ethnicities.
  • Two data brokers mentioned nondisclosure agreements (NDAs) in their communications, and one indicated that signing an NDA was a prerequisite for obtaining access to information on the data it sells.
  • One often made unsolicited calls to the author’s personal cell; if the author was delayed in responding to an email from the data broker, the frequency of calls seemed to increase.

Conclusions

The author concludes that additional research is critical as more depressed and anxious individuals use personal devices and software-based health-tracking applications (which are not protected by HIPAA), often unknowingly putting their sensitive mental health data at risk. The report finds that the industry appears to lack a set of best practices for handling individuals’ mental health data, particularly in the areas of privacy and buyer vetting, and that there are data brokers that advertise, and are willing and able to sell, data concerning Americans’ highly sensitive mental health information.

This research concludes by highlighting that the largely unregulated and black-box nature of the data broker industry, its buying and selling of sensitive mental health data, and the lack of clear consumer privacy protections in the U.S. necessitate a comprehensive federal privacy law or, at the very least, an expansion of HIPAA’s privacy protections alongside bans on the sale of mental health data on the open market.

[Link]