Artificial intelligence poses new threat to privacy of health data


Study says current laws need strengthening in order to keep up with artificial intelligence

Advances in artificial intelligence have created new threats to the privacy of people’s health data, a new University of California, Berkeley, study shows.

Led by UC Berkeley engineer Anil Aswani, the study suggests current laws and regulations are nowhere near sufficient to keep an individual’s health status private in the face of AI development. The research has been published in the JAMA Network Open journal.

The findings show that by using artificial intelligence, it is possible to identify individuals by learning daily patterns in step data. Such data is collected by activity trackers, smartwatches and smartphones, and can be correlated to demographic data.

For India, the findings are significant: just 100 days ago, the country embarked on a massive health plan that involves collecting data at a huge scale. Use of personal fitness devices is also rising, but that data is a blip compared to the health data generated by PMJAY, which covers 500 million people.

The mining of two years’ worth of data covering more than 15,000 Americans led to the conclusion that the privacy standards associated with 1996’s HIPAA (Health Insurance Portability and Accountability Act) legislation need to be revisited and reworked.

“We wanted to use NHANES (the National Health and Nutrition Examination Survey) to look at privacy questions because this data is representative of the diverse population in the U.S.,” said Aswani. “The results point out a major problem. If you strip all the identifying information, it doesn’t protect you as much as you’d think. Someone else can come back and put it all back together if they have the right kind of information.”
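The linkage attack Aswani describes can be illustrated with a minimal sketch. The data, names, and matching rule below are all invented for illustration; the study itself used machine learning on NHANES step data, not this toy nearest-match approach. The idea is the same: a "de-identified" activity record can act as a fingerprint that matches it back to a named record.

```python
deidentified = {  # record_id -> daily step counts, with names stripped
    "rec1": [8200, 7900, 10150, 6400, 9000],
    "rec2": [3100, 2800, 3500, 2900, 3300],
    "rec3": [12000, 11800, 12500, 11900, 12200],
}

named = {  # name -> step counts gathered separately, e.g. from a phone app
    "Alice": [8210, 7890, 10160, 6390, 9010],
    "Bob":   [3090, 2810, 3490, 2910, 3290],
    "Carol": [11990, 11810, 12510, 11890, 12210],
}

def distance(a, b):
    """Sum of absolute daily differences between two step series."""
    return sum(abs(x - y) for x, y in zip(a, b))

def reidentify(deidentified, named):
    """Link each anonymous record to the closest named step series."""
    links = {}
    for rec_id, steps in deidentified.items():
        best = min(named, key=lambda n: distance(steps, named[n]))
        links[rec_id] = best
    return links

print(reidentify(deidentified, named))
# e.g. {'rec1': 'Alice', 'rec2': 'Bob', 'rec3': 'Carol'}
```

Because daily activity patterns are highly individual, even noisy copies of the same person's data tend to be far closer to each other than to anyone else's, which is why stripping names alone offers little protection.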

Aswani sketches an alarming scenario.

“In principle, you could imagine Facebook gathering step data from the app on your smartphone, then buying health care data from another company and matching the two,” he added. “Now they would have health care data that’s matched to names, and they could either start selling advertising based on that or they could sell the data to others.”

According to Aswani, the problem isn’t with the devices, but with how the information the devices capture can be misused and potentially sold on the open market.

“I’m not saying we should abandon these devices,” he said. “But we need to be very careful about how we are using this data. We need to protect the information. If we can do that, it’s a net positive.”

Though the study specifically looked at step data, the results suggest a broader threat to the privacy of health data.

Aswani said that as advances in AI make it easier for companies to gain access to health data, the temptation for companies to use it in illegal or unethical ways will increase. Employers, mortgage lenders, credit card companies and others could potentially use AI to discriminate based on pregnancy or disability status, for instance.