A checkup on privacy risks posed by digital wellness benefits
In the U.S., employers are the main source of health coverage and, increasingly, of benefits that encourage “wellness.” Many of these benefits come in digital form, like meditation apps, virtual therapy or wearables that track our steps, heart rate or stress levels.
But with that web of digital benefits comes privacy concerns, according to a new report from the nonprofit Data & Society, titled “Wellness Capitalism: Employee Health, the Benefits Maze, and Worker Control.”
Marketplace’s Meghan McCarty Carino delved into the report with its co-authors, senior researcher Tamara Nopper and research analyst Eve Zelickson, both with Data & Society’s labor futures team.
The following is an edited transcript of their conversation.
Eve Zelickson: A lot of these third-party apps have two things in common — data collection with little transparency and heavy use of nascent, overhyped technologies like [artificial intelligence] chatbots and predictive algorithms. So, you know, one Fortune 500 company might have over 10 different health and wellness benefits. And all of these vendors have their own referral practices, their own data-sharing processes and their own privacy policies. Workers don’t necessarily know how their data is being collected, stored or shared. And one thing I wanted to note: this isn’t included in our primer, but it happened just a few weeks ago, and it’s very relevant. It was revealed that the National Eating Disorders Association’s AI chatbot was giving people pretty horrendous and dangerous advice, encouraging them to count calories and lose weight. I bring this up to illustrate that these aren’t unfounded concerns. They’re already happening. And we’ve seen very similar AI chatbots implemented to do similar things. So we see companies handing over these very sensitive, high-stakes situations to chatbots, especially proprietary chatbots, where there’s little transparency around input data, testing and what it means to actually do a good job in these situations.
Meghan McCarty Carino: Tamara, how do these benefits operate at times as a form of worker surveillance?
Tamara Nopper: Well, there are several ways it can happen. One is that it gives the employer — but also all these different companies and entities — a window into your life, things like financial health, right? Does your family have problems? It could involve, like, counseling for your children. We also emphasize the role of criminalization. People have been criminally prosecuted over matters associated with a medical or health issue: things like HIV criminalization, substance abuse or the current wave of criminalizing abortions or people not carrying a pregnancy to term. So if you have things like family planning or fertility apps, and they’re offered to you as a benefit, what does that open you up to in terms of the employer being more aware of your life, your activities and, in a lot of cases, your family members’ activities?
McCarty Carino: And many of these digital health tools are not covered by HIPAA, right?
Nopper: Right. HIPAA was really designed for an older-school model of how health and wellness care is delivered. Increasingly, medical professionals and health care delivery more broadly involve digital tools. But with employee wellness gaining traction, and with the growth of the wellness industry, you have all these companies that would not be considered “covered entities.” At the heart of HIPAA’s protections is the Privacy Rule and the question of which entities are actually covered by that rule. Legally, a lot of these companies are not required to maintain certain privacy policies under HIPAA. So a lot of this data collection and lack of privacy is happening out in the open. That’s actually what characterizes employee wellness and the rise of digital tools, and it’s part of what people are navigating in this digitized world. We sometimes think of health information as always being consistently protected, but it really has to do with who is getting that information, what direction it flows and the purpose for which it’s being used.
McCarty Carino: These services [involve] some of the most sensitive parts of our lives — mental health, reproductive health. Do either of you know what happens with that data?
Zelickson: I mean, that is the question. You know, these companies don’t necessarily want to talk to researchers and share proprietary information about their innovative technologies. So we don’t know exactly what happens to the data. And it’s made even more complex by privacy policies and terms of service that provide no transparency or clarity on exactly where this information is going.
McCarty Carino: Are the benefits that we’re talking about mostly optional? Can workers just opt out of these if they’re worried about privacy?
Nopper: You have many programs where people can decide whether to participate, and participation isn’t tied to the cost of their health insurance. But you also have people sometimes being charged a penalty fee for not signing up for a plan, which raises really serious questions about whether these programs are compulsory in some ways. Another thing we wanted to talk about: when people have critiqued employee wellness, they’ve often critiqued lifestyle discrimination, and rightfully so. Lifestyle discrimination is the idea that you’re being discriminated against for how you live your life outside of work, and a lot of these tools do give employers and these companies insight into how you actually act outside the workplace. But we also talked about apps that promote themselves as team-building apps, where people’s scores and activities are seen by a wide array of people and put up on a board, almost, right? What does it mean to be seen as maybe not a good team player, even if you’re doing your job well?
Related links: More insight from Meghan McCarty Carino
We recently discussed what happens with consumer data from all these health apps that aren’t covered by HIPAA, the Health Insurance Portability and Accountability Act.
A research team at Duke University found data brokers selling information about users’ health concerns and diagnoses, sometimes even tied to names and addresses.
It’s not all that expensive to buy, and it’s all completely legal.