When healthcare companies like Ginger.io share our information with countless members of the company, what happens to our privacy?

Adrianna Nine
5 min read · Feb 11, 2020


There’s a new Silicon Valley startup in town, and its name is Ginger.io. Ginger bills itself as “the leading behavioral health system,” delivered conveniently via mobile app. It’s only available to people whose employer contracts with Ginger as a benefit, and its approach is a little unorthodox (though the startup probably wouldn’t belong in Silicon Valley if it weren’t). Ginger’s problem, however, is not that it operates outside the box of traditional therapy — it’s that patients who come to Ginger for mental health services are stripped of their privacy, perhaps indefinitely.

When a new patient signs up for Ginger, they’re automatically assigned an “emotional coach”: a professional training to become a licensed therapist, who is ostensibly there to offer light emotional assistance, like stress-management strategies or goal tracking. Ginger’s whole shtick is that emotional coaching is available 24/7, which means that if you message your coach outside of their shift, you’ll be assigned a new coach — and so on, every time you message outside the current coach’s hours. So while coaching is an option at all hours of the day, every day of the week, you’re dealing with a handful of people with different personalities, different ways of thinking, and likely different strategies.

(Side note: Yes, Ginger’s emotional coaching option is only available via chat. When I signed up with the app — which only works on smartphones, so prepare to explain your whole emotional journey with your thumbs — I had hoped for video or even phone assistance, but was told coaching was text-based only.)

Let’s say that emotional coaching isn’t enough. The patient who’s just signed up for Ginger requires a little extra support, like the kind only a licensed therapist, psychologist, or psychiatrist can offer. Ginger offers this as well, so long as the patient’s employer sponsors Ginger therapy sessions; if they don’t, sessions are $119 each, and Ginger is not considered in-network with most major health insurance carriers. The patient is left with the same problem that plagues traditional therapy services: find someone in-network or prepare to pay a steep price for mental wellness. And if the patient’s employer does sponsor Ginger therapy sessions, it’s typically only a handful of them — which means that once a bond is formed between patient and therapist, the patient faces a nasty decision: pay out of pocket to preserve the valuable relationship they’ve built, or restart the entire process of seeking care.

Luckily, Ginger’s therapy sessions are conducted via video at appointed times, similar to those offered by other (more private) telehealth platforms like Lyra and LiveHealth. But in order to sign up for a session with an actual therapist, the patient has to spill their business to Ginger support, which is also text-based. Before the patient can be matched with a therapist, support wants to know why emotional coaching isn’t sufficient for the patient and what the patient is seeking help with — information many people would like to keep private, especially when the person on the other side of the chat doesn’t have a name and likely changes several times throughout the conversation.

After support receives this information, they assign the patient a therapist and session time based on availability. Because little attention is paid to the type of therapy a given Ginger therapist practices — or the type the patient would most benefit from — it’s a crapshoot; the therapist might be a good match for the patient, or they might not. If after a couple of sessions the patient decides the therapist is not a good match for them (as happens all the time in the traditional therapy world), the patient has to explain why to support, who then consults Ginger’s “team of clinicians” in search of a solution. Ginger does not disclose who these clinicians are, how many of them sit on said team, or what the team’s main purpose is.

This is where the journey splits off. If the patient shares my experience, they’re told to stick it out with the therapist who isn’t a good match in the hope that the therapist will gradually build the appropriate skills to help the patient out. If the patient gets lucky, they’re matched with a new therapist, who hopefully is a better fit.

I’d like to ask if you’ve been keeping count of everyone the patient has spilled to at this point, but the problem is that keeping count is impossible under Ginger’s healthcare model. It is impossible to know how many support agents have read the patient’s business; how many emotional coaches have been assigned to the patient’s text-based chat regarding their concerns; and how many clinicians have discussed the patient’s reasons for seeking mental health services before arbitrarily assigning a therapist.

Oh, and Ginger’s intake form — you know, the one you fill out when you’re meeting with a new therapist, which asks how “down, depressed, or hopeless” you’ve been feeling and whether you experience thoughts of self-harm — is a Google form with a destination undisclosed to patients. Who knows who’s receiving and reading that, or where it’s stored.

Ginger.io claims to maintain confidentiality much the way traditional mental healthcare services do, by requiring that patients sign a privacy statement explaining that Ginger.io does not sell patient data, will only release session notes to law enforcement under subpoena, and so on. One concerning section of this agreement states that Ginger may “disclose health information about you to your family members or friends if [Ginger obtains] your verbal or written agreement or if [Ginger] can infer from the circumstances, based on [Ginger’s] professional judgment that you would not object” — which is fishy for the average person and incredibly dangerous for people who may be in undisclosed abusive relationships. But the concern regarding privacy at Ginger isn’t necessarily focused on how information may find its way outside of the company — it’s about the distribution of a patient’s private information within it.

When a patient has no way of knowing who at a healthcare practice knows the details of their mental health concerns, it means an undisclosed number of people are wandering around the world with knowledge of, and access to, that patient’s most private struggles, habits, and thoughts. It also means confidentiality — and any subsequent breach thereof — is nearly impossible to track. For many, therapy itself is a scary thing to seek out; they’ve never shared their mental health matters or deepest life concerns with anyone, and opening up to a professional about them is a soul-shaking feat. But even for those who are well-versed in attending therapy, Ginger’s model is unsettling. Depending on a patient’s career, romantic and family relationships, and other factors, this lack of privacy can be uncomfortable at best and life-ruining at worst.

Don’t get me wrong: the traditional United States mental healthcare system is not perfect. It’s costly, and many people (even those who enjoy insurance coverage) have a difficult time finding a provider who is accepting new patients. But spreading a patient’s mental health details to numerous undisclosed employees within a so-called mental health practice in Silicon Valley is not the solution — and if healthcare continues down this path, it will in fact be another problem to fix.


Adrianna Nine

Tech & science writer who scribbles about social activism and mental health in her free time.