Artificial intelligence is not only the future of enterprise computing. It’s also the future of chipmaking giant Intel Corp.
Intel’s growing data center business recently ranked second only to its PC chip division, and in 2019 the company generated about $3.5 billion from AI-related products. Through acquisitions such as Nervana Systems, a startup known for silicon architecture that can process data sets at lightning speed, and Movidius Ltd., which provides technology for running computer vision models inside “internet of things” devices, Intel has been building out its expertise in AI.
What does this evolving role look like for the 52-year-old semiconductor giant? While Intel develops its AI practice, it is also looking closely at the challenges of securing data and, perhaps more significantly, at how AI can help usher in a new era of industry collaboration for the good of society.
Casimir Wierzynski, Intel’s senior director of AI products, spoke with John Furrier, host of theCUBE, during the RSA Conference in San Francisco. They discussed the use of technologies that maintain data privacy while training models, industry collaboration to build new solutions, the need to recruit experts from a range of disciplines, and the opportunity to improve society through broader access to relevant information.
Wierzynski is one of the leading voices in the movement for privacy-preserving machine learning, or PPML. PPML aims to combine complementary technologies so that machine learning can use data for training models without compromising the privacy of the personal information inside the data set.
One of the privacy-preserving techniques being closely examined and used by Wierzynski and his team at Intel is homomorphic encryption, a public/private key cryptography scheme designed to let applications perform computations on encrypted data without exposing the data itself.
Security often breaks down precisely when data must be decrypted in order to be used. Homomorphic encryption allows the data to remain encrypted even while it is being processed to train models.
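To make the idea concrete, here is a minimal sketch of the Paillier cryptosystem, a classic additively homomorphic scheme. This toy example is not from the article and is illustrative only: it uses tiny primes (real deployments use keys of 2048 bits or more), and modern HE libraries implement far richer schemes. The key property it demonstrates is that multiplying two ciphertexts yields a ciphertext of the *sum* of the plaintexts, so arithmetic happens without ever decrypting.

```python
import math
import random

# Toy Paillier cryptosystem (additively homomorphic).
# WARNING: tiny illustrative primes -- not secure in any way.
p, q = 1009, 1013
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)      # Carmichael function of n
mu = pow(lam, -1, n)              # modular inverse; valid because g = n + 1

def encrypt(m: int) -> int:
    """Encrypt m (0 <= m < n) with fresh randomness r."""
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """Recover the plaintext using the private values lam and mu."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

c1, c2 = encrypt(42), encrypt(17)
c_sum = (c1 * c2) % n2            # multiply ciphertexts -> add plaintexts
print(decrypt(c_sum))             # prints 59
```

Only the holder of the private key material (`lam`, `mu`) can reveal the result; whoever performed the multiplication learned nothing about the underlying values.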
Google and Microsoft get involved
The ability of homomorphic encryption to preserve privacy while still feeding the AI engine has proved attractive enough to draw interest from Intel, Google LLC, and Microsoft Corp. Recently, Microsoft released an open-source homomorphic encryption library, Microsoft SEAL, for use by developers.
“This is one of the leading techniques in this area,” Wierzynski said. “There are ways of doing math on the data while it stays encrypted, and the result that comes out is still encrypted. Only the actual owner of the data can reveal the answer.”
Intel has created its own open-source tool for the technology, called HE Transformer. Released in December 2018, the tool provides a homomorphic encryption backend for nGraph, Intel’s neural network compiler.
There are also early signs that the homomorphic encryption bandwagon is gathering momentum as new firms enter the market. In February, Enveil, a homomorphic encryption company founded by a former National Security Agency senior researcher, announced $10 million in Series A funding. Another homomorphic encryption startup, Duality Inc., announced its own funding round in December, led by Intel Capital.
Improving patient healthcare
The Intel senior director has been actively pursuing a “big tent” approach for homomorphic encryption, aware that the more data that can be shared securely, the better artificial intelligence and machine learning will work. A prime example can be found in the healthcare sector.
Currently, strict laws govern the use of patient data, which is carefully guarded by hospitals and health organizations around the world. If this information could be combined and shared more widely, the reasoning goes, AI-driven analysis would deliver better diagnostic results for patients.
“If you can expand the availability of data, it will always help machine-learning systems, so we’re trying to unlock the data silos that exist across countries and organizations,” Wierzynski said. “There are a lot of great ideas, such as federated learning, where you could somehow decentralize the machine-learning process so you can still respect privacy but still have statistical power.”
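The federated learning idea Wierzynski mentions can be sketched in a few lines. In this hypothetical setup (not from the article), three “hospitals” each hold private data drawn from the same underlying relationship; each trains locally and shares only its model parameter, which a coordinator averages — the raw data never leaves any site. This is a bare-bones version of federated averaging on a one-parameter linear model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical: three sites each hold private (x, y) samples where
# y is roughly 3 * x plus a little noise. The true slope is 3.0.
clients = []
for _ in range(3):
    x = rng.normal(size=50)
    y = 3.0 * x + rng.normal(scale=0.1, size=50)
    clients.append((x, y))

w = 0.0                               # shared model parameter
for _ in range(100):                  # communication rounds
    local_ws = []
    for x, y in clients:              # each site trains on its OWN data
        w_local = w
        for _ in range(5):            # a few local gradient-descent steps
            grad = 2.0 * np.mean((w_local * x - y) * x)
            w_local -= 0.1 * grad
        local_ws.append(w_local)      # only the parameter is shared
    w = float(np.mean(local_ws))      # coordinator averages parameters

# w converges toward 3.0 without any site exposing its raw data
print(w)
```

The statistical power of all three data sets is pooled, yet each site reveals only a model update, not patient-level records; in practice those updates are further protected with techniques such as secure aggregation or homomorphic encryption.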
The big tent also includes many different constituencies. The machine-learning field is collaborative by nature, which is why tech powers such as Google and Microsoft are keeping an open mind about how to share resources and make data sharing happen.
“Almost every company will have its own secret sauce, the things it wants to keep proprietary, but companies need to engage this broader community of researchers,” Wierzynski said. “It will require a whole range of experts — policy experts, applied mathematicians, linguists, and neuroscientists.”
Homomorphic encryption is a compute-intensive exercise, an area where Intel has built expertise over the past half-century. The company is bringing that strength to the data privacy space with a solution that, if widely adopted, could have a profound impact on major economic sectors around the world, including healthcare, finance, and retail.
Intel believes that AI computation on encrypted data doesn’t have to be a zero-sum game. Privacy can be protected and models can be trained, using an approach that will likely only improve over time.