Governance & Risk Management, Operational Technology (OT), Video
Privacy Expert Chiara Rustici on Laws Governing Autonomous Robots, Embedded AI
As embedded artificial intelligence moves from labs into real environments, organizations face growing liability risks. From border patrol robots to healthcare automation, leaders must understand how AI governance, product liability, data protection and security laws apply, said Chiara Rustici, chief privacy officer, AI governance and data protection officer, and independent analyst.
Liability doesn’t hinge on creating new laws for novel technology; it hinges on applying existing legal frameworks with precision, she said. AI governance, data protection, cybersecurity and product safety rules already overlap. The challenge lies in mapping them to specific use cases, anticipated misuse and shifting risk profiles as deployments evolve.
“People think this is a technological breakthrough that needs a novel legal framework. That’s the wrong quest,” Rustici said. “We already have lots of frameworks that keep applying and accumulating to novel engineering uses.”
In this video interview with Information Security Media Group, Rustici also discussed:
- How AI governance, data laws and product liability overlap for embedded AI;
- Why civilian and military use cases trigger different liability models;
- What CISOs should do to assess risk as use cases change.
Rustici is an independent academic and data regulation analyst. She is a former research grant recipient from Italy’s CNR, a former teaching fellow in jurisprudence at the University of Genoa, and a former researcher at the universities of Milan and Edinburgh. She previously chaired the BCS Law Specialists Group and is now part of the BCS governance team on the PPP committee. In 2019, Rustici was recognized in the inaugural DPO200 list compiled by the Swiss-based GDPR Institut as one of the “individuals who have made significant contributions to the privacy and security sectors.”

