The panic ignited by the leaked draft of the Supreme Court opinion that would overturn Roe v. Wade prompted many people to delete their period- and menstruation-tracking applications, and left millions more wondering whether they should.
Two of the most widely used period-tracking apps, Flo and Glow, faced backlash recently for failing to properly protect sensitive user data. The companies say they have since worked to bolster security and create more transparency around their privacy-protection policies. However, had those lapses occurred after a potential overturn of Roe, the exposed data could have served as incriminating evidence against users.
What’s more, these health apps typically aren’t covered by HIPAA. Volunteering intimate data without federal protections in place, under complicated privacy policies, can be overwhelming and potentially harmful for users.
“I do think that this is a kind of public reckoning [for] period-tracking apps or others in the health space to think about their practices and the ways they communicate them to users,” said Marika Cifor, Ph.D., assistant professor at the University of Washington’s Information School and adjunct faculty in gender, women and sexuality studies.
“This also speaks to the ways in which we lack data-protection laws and certain kinds of privacy and transparency measures that we should [be able to] expect,” Cifor said. “There are all sorts of other health data that we could imagine might be used for purposes beyond what the user intended.”
It’s not just about health tech
Though the spotlight is on health tech apps, the conversation and its implications reach beyond this sector into enterprises across industries.
“Although they may not be selling that [health] data … many third parties can match and mingle data points to paint a holistic picture of that consumer. All companies have a part to play in data protection across industries,” said Walter Harrison, founder of Tapestri, a company that pays consumers for their consent to share their data.
Though Tapestri isn’t in the healthcare tech realm, it’s based in Illinois, a state with protected abortion access that borders places like Missouri, Indiana, Kentucky and Wisconsin — all of which have restricted access or have trigger laws that will activate to ban abortion if Roe is overturned. Since the leaked draft opinion, Harrison said his company has received two requests to purchase health data about users near the Illinois state borders.
“On the border of Missouri or Indiana, that’s clearly where a lot of Planned Parenthood places may be. We’ve already had clients reach out trying to purchase this data and we’ve declined 100%,” Harrison said. “We decided that we’re not going to monetize anything in healthcare.”
What data reveals post-Roe v. Wade ruling
If Roe v. Wade is overturned, trigger laws in 13 states will immediately go into effect, criminalizing both those who have abortions and those who provide them; 26 states in all are certain or likely to ban the procedure.
Period- and other health-tracking apps can store data that reveals, for instance, that a user who reported menstruating regularly suddenly logged a missed period, that their location tracked to a clinic, and that they began tracking a period again a month later. Such a pattern could point to a health concern, a miscarriage or an abortion. If abortion becomes illegal while the U.S. still lacks policy restricting companies from selling this type of data to third parties, patterns like these could be used to investigate and prosecute individuals.
In fact, Jamf, a company that helps security teams manage and protect devices, data and applications for end users, recently released a study on app permissions. The company analyzed the metadata within a sample of nearly 100,000 popular apps across the iOS App Store and found that 44% of health and fitness apps request permission to access your camera and 19% ask to access your microphone.
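Analyses like Jamf’s boil down to scanning app metadata for permission requests and computing the share of apps in a category that make them. A minimal sketch of that kind of tally, using invented records modeled on real iOS Info.plist usage keys (the app list and numbers here are illustrative, not Jamf’s data):

```python
# Toy sketch of a permission-metadata analysis: count what fraction of
# apps in a category request a given permission. All records below are
# invented for illustration; the keys are real iOS Info.plist usage keys.
apps = [
    {"category": "health_fitness", "permissions": ["NSCameraUsageDescription"]},
    {"category": "health_fitness", "permissions": []},
    {"category": "games",          "permissions": ["NSMicrophoneUsageDescription"]},
    {"category": "health_fitness", "permissions": ["NSCameraUsageDescription",
                                                   "NSMicrophoneUsageDescription"]},
]

def share_requesting(apps, category, key):
    """Fraction of apps in `category` whose metadata requests `key`."""
    in_cat = [a for a in apps if a["category"] == category]
    hits = [a for a in in_cat if key in a["permissions"]]
    return len(hits) / len(in_cat)

# Share of health & fitness apps requesting camera access in this toy sample.
print(share_requesting(apps, "health_fitness", "NSCameraUsageDescription"))
```

Run at scale over real App Store metadata, the same counting yields figures like the 44% and 19% Jamf reports.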
“I think that enterprises need to step up and help not only better protect their own organizations, but help people realize the risks they face with all of the devices that we carry around every day. One that’s going to continue to be front and center is application risk,” said Michael Covington, vice president of product and strategy at Jamf.
“What a lot of people don’t realize is how much data these devices are collecting on them. There have been some good studies over the last few years on how user location tracking is achieved on mobile devices, whether you grant the application access to your location data or not,” Covington said. “I think that’s one of the most alarming aspects, especially when you look at it in the context of Roe v. Wade and how that data can be misused against an individual.”
Though app developers may state that the data they sell to third parties is anonymized, data privacy experts argue that data aggregated from multiple sources can paint an accurate picture of an individual’s habits and practices. And, in the case of Roe’s potential overturn, that picture could become incriminating, or at least suggest incriminating evidence.
“This idea of companies saying, ‘Oh, we’re not sharing any individually identifiable data,’ isn’t accurate. One of the important things about data privacy is that privacy is contextual. The company doesn’t have the ability to say, ‘we’re not sharing anything identifiable,’” said Os Keyes. Keyes is a Ph.D. candidate at the University of Washington’s department of human-centered design and engineering, where they research big data, engineering and design, data ethics, medical AI, gender and sexuality, information science, race and equity.
“Whether you’re really identifiable or not is about the data individual companies are sharing and about what happens when someone collates all of that together,” said Keyes. “What picture of the person can you work out from putting all the things together? All you need to do is be able to stitch existing datasets together.”
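The “stitching” Keyes describes is known as a linkage attack: two datasets that look harmless on their own become identifying once joined on shared quasi-identifiers like ZIP code and birth year. A toy illustration (every name and record below is invented):

```python
# Toy linkage attack: a "de-identified" health export and a data-broker
# marketing list re-identify a person once joined on shared fields.
# All records are invented for illustration.

# Dataset A: health-app export with no names attached.
health_records = [
    {"zip": "62701", "birth_year": 1990, "event": "missed period logged"},
    {"zip": "62701", "birth_year": 1985, "event": "regular cycle"},
    {"zip": "60601", "birth_year": 1990, "event": "regular cycle"},
]

# Dataset B: broker list with names but no health data.
broker_records = [
    {"name": "Jane Doe", "zip": "62701", "birth_year": 1990},
    {"name": "John Roe", "zip": "60601", "birth_year": 1990},
]

def link(health, broker):
    """Join the two datasets on the quasi-identifiers (zip, birth_year)."""
    matches = []
    for h in health:
        for b in broker:
            if (h["zip"], h["birth_year"]) == (b["zip"], b["birth_year"]):
                matches.append({"name": b["name"], "event": h["event"]})
    return matches

for m in link(health_records, broker_records):
    print(m)  # each health event is now tied to a named individual
```

Neither dataset alone names who logged a missed period; the join does, which is why “we only share de-identified data” is a weak guarantee.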
The current landscape of period-tracking apps
Though more consumers are deleting these apps, which experts agree is warranted in some cases, better understanding their privacy policies is just as important. VentureBeat asked Keyes to review privacy policies from several of the top period-tracking apps on the market, including Clue, Flo, SpotOn (developed by Planned Parenthood Federation of America) and Glow. Combined, these apps have hundreds of millions of active users. Overall, Keyes found that while some offered transparency, others severely lacked it:
- Glow/Glowing: “Their policy is so generic it’s essentially a cover-your-ass, not actual transparency … Even the assurance that the data is deidentified seems at best fishy. If we’ve learned one thing over the last few years, it’s that whether data is deidentified depends on context and on what else it can be combined with. They explicitly list that they sell their data to third-party advertisers — you know, people whose whole schtick is ‘collecting and recombining and identifying data as much as possible’ — well, that’s worrisome. At least they’re kind enough to provide a link through which one can ask them if they would please not sell my data to ad companies in particular.”
- Clue: “Clue, I’ve got to give a big thumbs-up. Their policy is clear, transparent and their data protection approach is pretty locked down because they’re purposefully in the EU [so governed by GDPR]. They’re also very careful to emphasize that they don’t take tracking data lightly — which is rare — even in policies that recognize data sensitivity, there’s usually some excuse-making. The cause of all of this and of them … getting it, seems to quite simply be that their business model isn’t based on data …. That’s where every other company on this list has gone wrong and one of the reasons their policies are so bad: Once you’ve decided your customers are the product, selling them out isn’t a matter of the principle, it’s a matter of the price.”
VentureBeat reached out to SpotOn, Glow, Flo and Clue for comment (Glow and SpotOn did not respond to our request).
Clue responded, reiterating its adherence to GDPR, under which it is governed, and referenced a published statement from its co-CEOs.
Alternatives exist. A recent Consumer Reports article recommends three apps, “Drip, Euki and Periodical — which all store data locally and don’t allow third-party tracking.” The common theme: none of the three is a U.S.-based company, and all are governed by stronger privacy laws like GDPR.
Where does data privacy protection responsibility fall?
Location data, browser history, website searches, gender data, purchases and more can be — and often are — sold to third-party companies, particularly when an app is free to download. Though there is no price tag, by using the app and agreeing to its terms, a user consents to the sale of this information. And even when a policy promises privacy protections, the data isn’t always as secure as advertised.
So, who does the responsibility fall to? It’s complicated. Some say users should continue to carry the burden of understanding the risks of the apps they choose to use. Others think companies should treat privacy as a foundational pillar of operations. And some experts think the U.S. government should take the lead.
“My take is that it is similar to the Hippocratic oath that a physician would take, which is to do no harm to the consumer,” said Brian Mandelbaum, CEO of Klover, a leading fintech and data platform/app that collects and uses consented zero-party data. “The way that you do that most efficiently is by being abundantly transparent at the first interaction with the consumer about the data that you’re collecting and the usage of that data.”
“Companies that handle our sensitive information or intimate information need to have duties of loyalty and an antidiscrimination commitment. So if you handle my intimate information, then you can’t use it in a way that’s adverse to my interest and in yours,” said Danielle Citron, professor of privacy law at the University of Virginia and author of the upcoming book The Fight for Privacy: Protecting Dignity, Identity and Love in the Digital Age.
“So what’s wrong with GDPR? Its commitments are so thin,” said Citron. “They’re procedural. They’re more than we have, but we need substantive commitments: loyalty, to care, to confidentiality, to antidiscrimination. That’s why I frame internet privacy as a civil right. One that all of us deserve, but also that recognizes the structural discrimination.”
Look at privacy as an asset
If enterprises can begin to embrace privacy protection as an asset rather than as a hurdle to clear or hoop to jump through, they can win users over and even use privacy as a main marketing point, said Estelle Masse, the Europe legislative manager and global data protection lead at Access Now, a data privacy advocacy organization that defends the digital rights of users worldwide.
For companies that aren’t sure what to do or feel like they’re struggling with privacy, Masse advises collecting only the data you need and understanding that privacy doesn’t get better the more you talk about it; it gets better the more you act on it.
“You can yell it louder and louder, [but] if you don’t change the way you operate and if you don’t change your privacy controls and tools, this is not going to make it any more true. Take the time to demonstrate that you’ve understood the mistakes and make the changes necessary,” she said. “If you’re a business, people will trust you more for that. Privacy is actually a commercial advantage. Companies need to move beyond thinking it’s an annoying compliance checklist. It can be a competitive advantage for you and build trust for your users.”
Investors like Lu Zhang of Fusion Fund are paying close attention to what may unfold next.
“Regulation always comes last. It can take many years for a regulation to be enforced. Regulations also have a difficult time covering all possible use cases,” she said. “So, we are focused on investing in the technology companies that are providing solutions to healthcare companies on how to manage their data. I’d like to reinforce the point that companies should have a detailed data strategy and data policy. They need to understand the potential implications and sensitivities to what data they have access to.”
Time will tell if regulation will unfold first or if the tech landscape will see more companies begin to shift toward embracing privacy as a strength, like Apple with its recent privacy-focused commercials.
As for Roe v. Wade and the shifts it may bring, the Supreme Court is expected to release its final decision any day. Since the leak, legislation has been drafted by Sen. Elizabeth Warren (D, Massachusetts) with the aim of preventing the sale of U.S. citizens’ location and health data if Roe is overturned.