Homomorphic Encryption & Machine Learning: New Business Models
Introduction to Homomorphic Encryption, use cases and impact on Machine Learning projects
One of the main obstacles to AI projects is data privacy. You might identify the best use case for your company and then realize that your business project depends on data you are not allowed to use, because you cannot comply with existing data privacy regulations (which exist for good reasons). This situation hinders our ability to leverage AI in real-life business applications.
Indeed, most Machine Learning systems are fed with data that are highly sensitive and personal (customer data, health records, CCTV footage, etc.). After several projects in this industry, I can assure you that concerns over privacy and legal issues are a serious barrier to the development of new AI solutions and business models.
Moreover, most companies are unaware that they could create new revenue streams from the data at their disposal while preserving its privacy. A technology called Homomorphic Encryption might help improve the situation.
In this article, I will explain what Homomorphic Encryption (HE) is, the current limits of this technology, and how it can help create new business models while preserving data privacy.
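Before diving in, here is the core idea in miniature: a homomorphic scheme lets you compute on ciphertexts so that the result, once decrypted, matches the computation on the plaintexts. The sketch below implements the classic Paillier cryptosystem (additively homomorphic) with tiny, deliberately insecure demo primes of my own choosing, purely to show the property. Real deployments use primes of roughly 1024 bits or more, and production HE libraries rely on different, more capable schemes.

```python
import math
import random

# Toy Paillier cryptosystem: additively homomorphic.
# The primes below are tiny and chosen for illustration only --
# this is NOT secure and not what production libraries use.
p, q = 17, 19
n = p * q
n2 = n * n
g = n + 1                      # standard choice of generator
lam = math.lcm(p - 1, q - 1)   # Carmichael's lambda(n)

def L(x):
    """The L function from Paillier's paper: L(x) = (x - 1) / n."""
    return (x - 1) // n

# Precomputed decryption constant.
mu = pow(L(pow(g, lam, n2)), -1, n)

def encrypt(m):
    """Encrypt plaintext m (0 <= m < n) with fresh randomness r."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """Recover the plaintext from ciphertext c."""
    return (L(pow(c, lam, n2)) * mu) % n

# Homomorphic property: multiplying ciphertexts adds the plaintexts.
c1, c2 = encrypt(15), encrypt(27)
print(decrypt((c1 * c2) % n2))  # 42 == 15 + 27
```

The key line is the last one: the party holding only ciphertexts (say, a cloud provider) multiplies them and never sees 15, 27, or 42; only the key holder can decrypt the sum. Schemes used for machine learning extend this idea to multiplication and more general circuits, which is where the real complexity (and cost) lies.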
Today’s issues
Let’s start with some common ML issues. Machine learning models are difficult to share across users: distributing the model itself is often not feasible, as running it requires significant computational power on every endpoint.
Most companies therefore rely on cloud computing and consume ML models through cloud APIs. This solves the computational-power issue but exposes the data to the cloud provider.
In some European countries, having a US cloud provider will prevent you from doing business with public organizations and state-owned companies (1).
Moreover, current data privacy regulations such as the GDPR (2) in Europe limit the use of AI solutions. For instance, if you work in the financial services industry, your organization handles a lot of personally identifiable…