The majority of health apps share user data and lack transparency around the practice, according to a recent BMJ study.
Researchers from the University of Toronto analyzed 24 of the highest-rated medicine management Android apps from the U.S., Canada, U.K., and Australia to determine whether and how user data is shared, and to assess the privacy risks to app users, including both consumers and clinicians.
They simulated real-world, in-depth use of these apps using four dummy user profiles, in response to a recent report that Australia’s most popular medical appointment booking app, HealthEngine, routinely shared users’ private medical data with personal injury law firms as part of a mutual referral contract.
“Mobile health apps are a booming market targeted at both patients and health professionals,” the report authors wrote. “These apps claim to offer tailored and cost-effective health promotion, but they pose unprecedented risk to consumers’ privacy given their ability to collect user data, including sensitive information.”
“Health app developers routinely, and legally, share consumer data with third-parties in exchange for services that enhance the user’s experience… or to monetize the app,” they added. “Little transparency exists around third-party data sharing, and health apps routinely fail to provide privacy assurances, despite collecting and transmitting multiple forms of personal and identifying information.”
The study found that 19 out of the 24 apps, or 79 percent, shared user data, with 55 unique entities (owned by 46 parent companies), including developers, parent companies, and third-party service providers, receiving or processing user app data.
About 67 percent of these entities provided services related to the collection and analysis of user data, such as advertising or analytics, which the researchers suggested increased risks to user privacy.
Most apps (71 percent) transmitted user data outside of the app, including users’ device names, operating system versions, browsing behavior, and email addresses. Of the 104 detected transmissions, the researchers found that 94 percent were encrypted and 6 percent occurred in clear text.
Thirteen percent of the sampled apps leaked at least one type of user data in clear text, while 58 percent transmitted only encrypted user data over HTTPS and 29 percent did not transmit any user data during the traffic analysis.
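The encrypted-versus-clear-text distinction above comes down to whether an observed transmission used HTTPS or plain HTTP. A minimal sketch of that classification step (not the researchers’ actual tooling; the URLs are hypothetical) might look like this:

```python
# Illustrative sketch: classify captured app transmissions as encrypted
# (HTTPS) or clear text (HTTP), the distinction behind the figures above.
from urllib.parse import urlparse

def classify_transmissions(urls):
    """Split observed request URLs into encrypted vs. clear-text lists."""
    encrypted = [u for u in urls if urlparse(u).scheme == "https"]
    clear_text = [u for u in urls if urlparse(u).scheme == "http"]
    return encrypted, clear_text

# Hypothetical traffic captured from a dummy user profile
observed = [
    "https://analytics.example.com/collect",
    "https://ads.example.net/track",
    "http://legacy.example.org/log",  # clear-text leak
]
enc, clear = classify_transmissions(observed)
print(f"{len(enc)/len(observed):.0%} encrypted, {len(clear)/len(observed):.0%} clear text")
# → 67% encrypted, 33% clear text
```

In practice a traffic analysis would inspect captured packets rather than URL strings, but the encrypted/clear-text split is the same.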
“Many types of user data are unique and identifying, or potentially identifiable when aggregated,” the researchers wrote. “A few apps shared sensitive data such as a user’s drug list and location that could potentially be transmitted among a mobile ecosystem of companies seeking to commercialize these data.”
“After implementation of the GDPR, developers disclosed additional data sharing relations within privacy policies, including for two additional apps that had not transmitted any user data during the traffic analysis,” they added.
The researchers also performed a network analysis, which found that first and third parties received a median of three unique transmissions of user data, while third parties “advertised the ability to share user data with 216 ‘fourth-parties.’”
Further, within the same network, entities had access to a median of three unique transmissions of user data. Meanwhile, “several companies occupied central positions within the network with the ability to aggregate and re-identify user data.”
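The “median of three unique transmissions” figure can be understood as a simple per-entity count over the data-sharing network. A toy sketch (entity names and data flows are entirely hypothetical, not from the study) of that calculation:

```python
# Illustrative sketch of the network-analysis statistic described above:
# count unique user-data types flowing to each entity, then take the median.
from statistics import median

# Hypothetical mapping: entity -> set of unique data types observed flowing to it
received = {
    "AdNetworkA": {"device_name", "os_version", "email"},
    "AnalyticsB": {"browsing_behavior", "os_version", "device_name"},
    "DeveloperC": {"email", "drug_list", "location"},
}
counts = [len(data_types) for data_types in received.values()]
print(f"median unique transmissions per entity: {median(counts)}")
# → median unique transmissions per entity: 3
```

The study’s actual analysis was built from observed traffic across all 24 apps; the point of the sketch is only how a per-entity median is derived from such a mapping.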
The sampled apps requested an average of four (out of a possible 10) “dangerous” permissions, defined as access to “data or resources that involve the user’s private information or stored data or can affect the operation of other apps.”
The most common request, made by about 79 percent of the apps, was permission to read or write device storage; 46 percent asked to view wi-fi connections, 29 percent requested to read the accounts listed on the device, and 29 percent wanted phone status data, including the phone number, network information, and whether a user is on a call.
Another 25 percent wanted access to approximate or precise location.
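An audit like the one described above can be approximated by checking an app’s manifest against Android’s dangerous-permission list. A minimal sketch, assuming a partial subset of that list and a hypothetical manifest fragment:

```python
# Illustrative sketch: flag "dangerous" permission requests declared in an
# AndroidManifest.xml. DANGEROUS is a partial, assumed subset of Android's
# dangerous-permission list, matching the categories named in the article.
import xml.etree.ElementTree as ET

DANGEROUS = {
    "android.permission.READ_EXTERNAL_STORAGE",
    "android.permission.WRITE_EXTERNAL_STORAGE",
    "android.permission.GET_ACCOUNTS",
    "android.permission.READ_PHONE_STATE",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.ACCESS_COARSE_LOCATION",
}

# ElementTree expands the android: prefix to the full namespace URI
ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

def dangerous_permissions(manifest_xml):
    """Return the dangerous permissions declared in a manifest string."""
    root = ET.fromstring(manifest_xml)
    requested = [
        elem.get(f"{ANDROID_NS}name") for elem in root.iter("uses-permission")
    ]
    return [p for p in requested if p in DANGEROUS]

# Hypothetical manifest fragment
manifest = """<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-permission android:name="android.permission.INTERNET"/>
  <uses-permission android:name="android.permission.READ_PHONE_STATE"/>
  <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
</manifest>"""
print(dangerous_permissions(manifest))
# → ['android.permission.READ_PHONE_STATE', 'android.permission.ACCESS_FINE_LOCATION']
```

Note that viewing wi-fi connections (`ACCESS_WIFI_STATE`) is a normal-level permission on Android, which is why it is omitted from the dangerous set here.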
“The collection and commercialization of app users’ data continues to be a legitimate business practice,” the researchers wrote. “The lack of transparency, inadequate efforts to secure users’ consent, and dominance of companies who use these data for the purposes of marketing, suggests that this practice is not for the benefit of the consumer.”
“The presence of trackers for advertising and analytics uses additional data and processing time and could increase the app’s vulnerability to security breaches,” they added. “Clinicians should be conscious about the choices they make in relation to their app use and, when recommending apps to consumers, explain the potential for loss of personal privacy as part of informed consent. Privacy regulators should consider that loss of privacy is not a fair cost for the use of digital health services.”