Shift privacy to the left: Achieve AI-powered API Privacy using Open Source, presented at DevSecCon 2019

by Gianluca Brigandi

Summary: I’ll begin with a brief survey of today’s privacy landscape: how it affects the software development industry now, and how it might in the future. Of particular interest are requirements imposed by recent regulations such as the GDPR and the CCPA, which require all data processors to pay closer attention to how they treat, store, and disseminate customers’ sensitive personal data.

Next, I’ll introduce the Privacy-by-Design (PbD) approach. With PbD, we aim to “shift privacy to the left” in the software development life cycle, much as the DevSecOps philosophy does for security.

I’ll explore the challenges organizations face in the new regulation-heavy climate, particularly around addressing privacy concerns in legacy software, which may have been written before privacy regulations became a significant factor.

Moving on, I’ll share what AI, more specifically deep neural networks, can bring to the table in assisting with a thorough review of applications to make sure they do not harbor privacy risks. Likewise, for all new development, I’ll explore how AI can be harnessed to help ensure that privacy principles are implemented successfully.

I’ll then explore a reference application, specifically its dataflows, through which leakages of sensitive data not allowed by a privacy policy defined in a compliance context might occur.

Next, I will introduce an open source project (PrivAPI) that uses deep learning, built mainly on top of Keras and TensorFlow, to detect sensitive data leakages, specifically within RESTful API communication.

I’ll drill down into PrivAPI’s core architecture and design principles, as well as the use cases it supports. I’ll explain how it can be integrated into the SDLC as well as into production environments.

Finally, I’ll provide a live demo of PrivAPI, covering its detection capabilities with real-world API communication.
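To make the detection problem concrete: the task is to flag fields in a RESTful request body whose values look like sensitive personal data. PrivAPI approaches this with a trained deep neural network; the sketch below is only a simplified, hypothetical illustration of the problem using plain regex heuristics. The patterns, field names, and the `scan_payload` function are assumptions for illustration, not part of PrivAPI.

```python
import json
import re

# Illustrative regex heuristics for a few common sensitive-data shapes.
# A real detector (as in PrivAPI) would use a trained model rather than
# hand-written patterns, which miss obfuscated or novel formats.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_payload(body: str) -> list:
    """Return (field, category) pairs for JSON values that look sensitive."""
    findings = []
    for field, value in json.loads(body).items():
        for category, pattern in PATTERNS.items():
            if isinstance(value, str) and pattern.search(value):
                findings.append((field, category))
    return findings

request_body = '{"user": "alice", "contact": "alice@example.com", "tax_id": "123-45-6789"}'
print(scan_payload(request_body))
# → [('contact', 'email'), ('tax_id', 'ssn')]
```

A heuristic scanner like this could sit in an API gateway or a CI test, but it illustrates exactly why the talk turns to deep learning: fixed patterns cannot generalize to the many ways sensitive data appears in real-world API traffic.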