
Lyra Health Ex-Therapists Warn Of Ethical Conflicts

Ariel Davis for BuzzFeed News

Feeling stressed and overwhelmed last January, Daniel Rojas decided to take advantage of a benefit Starbucks often touts for its employees around the country: free therapy through Lyra Health, a mental health startup that provides counseling services for some of the biggest companies in the world.

Rojas, a 25-year-old shift supervisor in Buffalo, New York, had been dealing with gender dysphoria and body image problems, two issues he says compound each other “like a snake eating its tail.” So Rojas jumped at the coffee giant’s offer of 20 free counseling sessions from Lyra, a Silicon Valley darling cofounded and led by former Facebook CFO David Ebersman.

But four sessions in, Rojas, who uses he/they pronouns, felt frustrated with the progress of their treatment. He said he had to constantly re-explain things he’d gone over in previous sessions, reliving the same traumas each time. So they decided to end treatment with that counselor and find another one on Lyra’s platform.

When they attempted to find someone else, though, they said a Lyra rep told them in a video call that their issues were too advanced for the company’s care. The rep suggested he seek long-term treatment elsewhere and left him to figure it out on his own.

“I work really hard at Starbucks and I want to get every benefit I possibly can,” Rojas said. “I felt alienated. I felt like I was being cheated.”

Starbucks did not respond to multiple requests for comment on Rojas’s situation, and Lyra declined to address it.

Starbucks bills its Lyra benefit as “mental healthcare for a wide range of needs, from mild to complex.” But Rojas’s experience reveals one way patients can feel underserved by a startup aiming to be a model for “modern mental healthcare.” In interviews with BuzzFeed News, 18 users, therapists, and former Lyra employees voiced concerns about some of the company’s business practices, including its productivity-based bonus structure for therapists and its use of patient data. Some of the people who spoke to BuzzFeed News for this story did so on the condition of anonymity because they feared repercussions from their employers or former employers.

Lyra — whose juggernaut slate of corporate clients also includes Google, Facebook parent Meta, and Morgan Stanley — is one of the leaders in a wave of startups focusing on mental health, applying Silicon Valley’s data-obsessed ethos to the discipline of therapy. Tech giants like Facebook and Google often weather criticism for taking liberties with people’s personal information, but the business model behind startups such as Lyra has received less scrutiny. The company, which has raised $700 million in funding to date, generates revenue through deals with high-profile companies, using anonymized patient data to prove it provides worthwhile benefits.

Better access to therapy, of course, is a good thing. Lyra’s supporters cite good wages for therapists, a well-built software platform, and the awareness the company has brought to people who might not have otherwise sought therapy. Other mental health companies, including Ginger, Modern Health, and Cerebral, have also become workplace staples, especially during the global pandemic. (BuzzFeed has a relationship with Ginger to offer mental health benefits to employees.)

As more people entrust this burgeoning class of therapy apps with their well-being, the tech industry’s growth-at-all-costs outlook may not translate well to a field as delicate as mental health. Lyra’s prominence raises questions about whether a high-flying Silicon Valley startup’s need to justify its reported $4.6 billion valuation conflicts with its ability to provide quality mental health services.

Lyra spokesperson Dyani Vanderhorst said in a statement, “Our approach makes it easy for millions of people to access high-quality mental healthcare. As demand accelerates, we remain committed to delivering clinically proven, outcomes-based mental healthcare for employees and their dependents across all facets of mental health.”

“It Can Get Dicey In Terms Of Ethics”

Ebersman founded Lyra Health seven years ago in Burlingame, California, about 20 miles south of San Francisco. The former Facebook executive, who was the finance chief at Genentech before arriving at Mark Zuckerberg’s social network, said he decided to start Lyra after a difficult experience finding care for a family member. (Lyra declined to make Ebersman available for an interview.)

The startup employs its own therapists while also tapping into a network of contractors. When a company hires Lyra to be an Employee Assistance Program (EAP), its employees are typically given a set number of free sessions per year to see a counselor. The original plan was to offer users unlimited therapy sessions, two former early employees said, though that policy was later changed. The clinicians on Lyra’s platform specialize in evidence-based “blended care” therapy, a mix of in-person or live-video sessions and digital lessons and other content. After employees use all of their free sessions, they can continue seeing their Lyra therapist by paying out of pocket or through health insurance.

When it comes to clinical work, the company puts an emphasis on efficiency. The startup’s in-house therapists are entitled to bonuses based on productivity, two former Lyra staff therapists told BuzzFeed News. Productivity is measured against a range of goals, including improvement in patients’ symptoms over time, as reported in patient surveys.

One of the former therapists, Megha Reddy, said the bonus model can push therapists into “churning out” patients quickly. Reddy, who worked at Lyra until 2019, said the system can encourage questionable behavior, and could incentivize therapists to not see a patient for more than a certain number of sessions.

“This isn’t an assembly line. This is actually people,” Reddy said. “You can’t just throw people in and expect them to see results.”

Vanderhorst, the Lyra spokesperson, didn’t answer specific questions about the bonus system or what changes may have been made to it, but said in a statement, “We take great care in creating a supportive and dynamic work experience for our providers as well as offering them fair compensation.”

As a part-time employee working 20 hours a week at Lyra, Reddy said she was expected to see 12 to 20 patients a week with the goal of having a whole new slate of patients every six to 10 weeks. The financial incentives create the potential for abuse, she said. Her discomfort with the bonus system was her main reason for leaving Lyra.

“It can get dicey in terms of ethics,” Reddy said. “You’re not going to dictate to me when a patient is supposed to feel better based on numbers. That’s going to be based on the patient and my discretion.”

Vanderhorst said providers are the ones who determine how many sessions a patient needs.

Arthur Caplan, head of the Division of Medical Ethics at the NYU Grossman School of Medicine, said a bonus system like the one used by Lyra makes him “nervous.” “It could be a conflict of interest,” he said. “Turnover as a measure of success is certainly dubious in mental healthcare.”

Facebook, Google, and Morgan Stanley declined to comment on Lyra’s bonus structure, and Starbucks did not respond to multiple requests for comment.

Other mental health startups have also reportedly incentivized productivity from therapists. In December, Forbes reported that Cerebral had reclassified salaried therapists as contractors, making access to medical, vision, and dental benefits contingent on meeting quotas. “This was done so that our best and most productive therapists have the opportunity to earn more,” CEO Kyle Robertson said in response. Cerebral did not respond to a request for comment.

But while other apps engage in similar practices when it comes to data policies and productivity incentives, Lyra Health bears some of the responsibility because it was a pioneer in the space, two former employees said. “We set the tone,” said one of them. “We basically started an industry.”

Outcome Health

Ebersman has said he wants to bring some of Facebook’s data-centric approach to mental health. “One of the things that’s so magical about Facebook is how the experience is completely personalized,” Ebersman said when Lyra launched in 2015. “And that is generally absent from your experience in healthcare.”

To collect data on the progress of treatment, Lyra periodically sends patients “outcomes surveys.” The questionnaires ask patients to rate, on a scale of 0 to 3, the intensity of symptoms like anxiety or irritability over the last two weeks, according to surveys viewed by BuzzFeed News. The surveys, which use clinically accepted and standardized questions, are optional. But patients may feel compelled to complete them because the automated emails look like they are coming from their therapist.

Clinicians can use the data to help shape their treatment, but there’s another reason Lyra pushes the surveys: The company shares aggregated and anonymized data about patient outcomes with employers to illustrate the effectiveness of its services.

In one version of the survey viewed by BuzzFeed News that is hosted on research.net, a disclosure that explains how Lyra shares aggregated and anonymous outcomes data with employers appears on page three of five. Another version of the survey accessed through Google’s internal Lyra portal and viewed by BuzzFeed News does not explicitly say that outcomes data will be shared. Instead, it reads: “Your responses are confidential and are not shared with the employer sponsoring your Lyra benefit.” Lyra declined to answer questions about how it currently discloses to patients that it shares outcomes data with employers.

Google and Starbucks confirmed they receive data from Lyra in order to judge the service’s value to employees. “Google does not access the medical records of people using Lyra Health, and we have no special access,” Google spokesperson Jennifer Rodstrom said in a statement. Facebook and Morgan Stanley declined to comment.

Outcomes data is so central to Lyra’s philosophy that the company’s previous name was Outcome Health, according to an internal document viewed by BuzzFeed News. The name was changed to Lyra Health prior to the company’s launch.

“The bottom line is, this is a business. So the bottom line is money,” said one former Lyra employee who worked on the company’s clinical team. “And how can you get money? By data. By saying, ‘Look how successful we are. Please invest in us.’”

BuzzFeed News spoke to seven current and former Google, Facebook, and Starbucks employees who saw Lyra therapists and were upset about the sharing of outcomes data. One former Facebook employee, who worked on privacy initiatives at the tech giant, was concerned the data could be abused even if aggregated and anonymized. “I understand that employers want to measure the efficacy of their programs,” the former employee said, but it’s “completely inappropriate” to share such sensitive data.

Aside from the disclosure on some surveys, Lyra has laid out its data practices in a privacy policy, a more than 5,000-word document that lives at the bottom of its website. The company says the data sharing complies with the Health Insurance Portability and Accountability Act of 1996 (HIPAA), which regulates the use of health information. The company’s HIPAA notice, also found at the bottom of its website, says Lyra shares patient data “to support our business operations.”

Vanderhorst said new users must acknowledge both the privacy policy and HIPAA notice while setting up their accounts.

Still, some patients were unaware of the data sharing. Of the seven current and former Google, Facebook, and Starbucks employees who spoke to BuzzFeed News, all but one said they did not know the data from these surveys could be shared with employers in any form. “It’s shocking to me,” said a former Google employee, who said she didn’t remember a data disclosure while filling out the surveys. “I had no idea they were doing that.”

Lyra defended how it communicates its privacy practices to patients. “Lyra follows all U.S. regulations regarding privacy,” Vanderhorst said in a statement. “Our privacy policy is standard format and provides detailed information about our practices.”

Jennifer King, privacy and data policy fellow at the Stanford University Institute for Human-Centered Artificial Intelligence, said there’s the legal process of gathering consent, and then there’s “the moral question” of making sure people fully understand. The added layer of sharing information with an employer makes it even more problematic, she said. “People tend to feel somewhat better with aggregation, but the workplace is different.”

Lyra isn’t the only company in the mental health space facing questions about what it’s doing with anonymous user data. Loris.ai, the partner company to the nonprofit Crisis Text Line, is contending with criticism after Politico reported that it uses anonymous but sensitive data drawn from conversations on the text-based suicide hotline for business purposes.

Some Lyra therapists were not aware Lyra shares outcomes data with employers, either. BuzzFeed News interviewed eight current and former Lyra therapists, and six of them said they did not know about the data sharing. The therapists said meaningful consent from patients is crucial, even though their names are not attached to the data.

Some patients and therapists didn’t mind the data being shared anonymously, since it might be valuable for a company to know if its workforce is depressed or riddled with anxiety. But one former Lyra therapist said patients should get to choose what they want shared. “They should be able to select whether they’re willing for their outcomes to be reported,” she said.

Data collection was a key issue for some therapists during the early days of the company, according to three former Lyra employees. They said concerns about data sharing made it difficult to recruit therapists to work with Lyra when the company was getting started. When company leadership was told about those hesitations, they were dismissive of the concerns, the former employees said.

“Lyra has tremendous respect for the clinical knowledge, experience, and expertise of our providers,” Vanderhorst said in a statement. “Provider recruitment and retention are essential to the care we provide members and the success of our organization.”

The company also has a history of its clinicians feeling overlooked, two former employees said. While engineering and data teams were valued for their input, people on the clinical team were treated like “second-class citizens,” one of the former employees said. That employee said the culture was instilled as Ebersman began to bring in people who used to work at Facebook. Lyra did not address these allegations, and Facebook declined to comment.

“A Big Brother Kind Of Approach”

Chelsey Glasson, a former Google and Facebook employee, has recently sounded the alarm about EAPs like Lyra and the potential conflict of interest that can arise when your employer pays for your therapist. In an October op-ed for Insider, she called for more transparency in the relationship between third-party mental health providers and employers. Glasson, who is suing Google over alleged pregnancy discrimination, had sought session notes from her Lyra therapist as part of the lawsuit. Google then demanded and received the notes as well. After that, Glasson said, her therapist called and said she was no longer comfortable seeing Glasson.

Google declined to comment. Glasson’s former therapist didn’t respond to multiple requests for comment. In Lyra’s privacy policy, the company says it can use personal information to “comply with our legal obligations.”

“It’s all inappropriate and unethical,” Glasson said of Lyra’s business practices. “People have no idea this is happening.”

Glasson, who is based in Seattle, filed a complaint against her therapist, and the situation is now under investigation by the Washington State Department of Health, according to emails viewed by BuzzFeed News.

After consulting with Glasson, Washington State Sen. Karen Keiser sent a letter in November to the state’s secretary of health about the “potential conflict” between employees and employers that participate in EAPs, according to a copy of the letter viewed by BuzzFeed News. Then, in December, Keiser pre-filed legislation that aims to give workers more rights when it comes to EAPs. The bill, called SB 5564, would prohibit employers from disciplining workers based on their decision to see — or not see — a therapist through an EAP. It would also make it illegal for an employer to obtain individually identifiable information about an employee. A state Senate committee discussed the bill at a hearing last month.

“Our huge technology companies don’t hold personal privacy with the same regard that I think they should,” Keiser told BuzzFeed News. “They’ve been data mining personal privacy information for years and for their own bottom line. But when they use it on their employees, that’s a whole different thing. It’s really a big brother kind of approach.”

Lyra’s policies have at least some people wary about seeking therapy through their employers. After Glasson’s experience with her therapist was reported by the New York Times in July, some Google workers became less likely to use the EAP services provided by Lyra, said the Alphabet Workers Union, which represents workers for Google and its parent company. Google declined to comment.

“I was surprised when I heard about her story,” said a former Google employee. “It really shed a lot of light on the relationship that the counselor has with the company.” ●

Katharine Schwab contributed reporting.