Some state privacy laws, such as Washington State’s My Health My Data Act, could throw a curveball into the use of certain consumer information for artificial intelligence and machine learning endeavors, said regulatory attorney Adam Greene of the law firm Davis Wright Tremaine.
“If you’ve got consumer health data that falls outside of HIPAA, then you have to be very sensitive to these state laws,” he said in an interview with Information Security Media Group during the HIMSS 2025 conference in Las Vegas, Nevada.
For example, “Washington State’s MHMD Act prohibits taking non-health data and using that data to draw health conclusions and essentially create health data without the permission of individuals,” he said.
“And in AI development, that could mean that you might not be able to take a dataset and apply AI to it to create new health inferences from that data without first getting the authorizations of individuals whose data you have, which could make the entire project unfeasible,” he said.
In this audio interview with Information Security Media Group, Greene also discussed:
- The current lack of federal guidance on the use of HIPAA-protected health information for AI and ML, and what that means for regulated organizations;
- Privacy considerations involving the use of patient data for AI and ML under other federal health regulations such as 42 CFR Part 2, which pertains to substance use disorder records;
- Avoiding privacy mistakes involving the use of health information for AI projects;
- Uncertainty involving the federal government’s current stance on a variety of other health data privacy and security matters, as well as AI.
Greene specializes in health information privacy and security laws, including applying those laws to new technologies such as AI and ML. He formerly served as senior health information technology and privacy specialist at the U.S. Department of Health and Human Services’ Office for Civil Rights, where he played a significant role in administering and enforcing the HIPAA privacy, security and breach notification rules.