Apple Explores Differential Privacy
The tech giant has come up with a method to study the behavior of groups without compromising the privacy of any individual.
One of the key takeaways from Apple's Worldwide Developers Conference, held on Monday, was the tech titan's current work on privacy. Like any large company, Apple needs to know as much as possible about its customers in order to make worthwhile improvements to its software and services. At the same time, the world's most valuable company has taken a strong stance on keeping users' privacy intact. So the Silicon Valley giant has adopted a technique dubbed "differential privacy" to kill two birds with one stone.
During the conference, held in San Francisco, Craig Federighi, Apple's senior vice president of software engineering, gave an in-depth view of the company's approach to privacy. He assured the audience that the Cupertino, Calif., firm does not build user profiles, that it still provides end-to-end encryption for FaceTime and iMessage, and that it strives to keep private conversations off Apple's servers and confined to personal devices. However, Federighi also made an important point: in today's world of machine learning and big-data analysis, user information is essential for writing good software. The company has therefore adopted a method that it says ensures a win-win for both parties, the company and its consumers.
In simpler terms, differential privacy allows the comprehensive study of a group while strictly limiting what can be learned about any individual in it. Federighi explained that the method draws on research in data analytics and statistics, using noise injection, hashing, and subsampling to enable crowdsourced learning while keeping each individual user's information strictly private. He added, "Apple has been doing some super important work in this area to enable differential privacy to be deployed at scale." The new technique will be integrated into Apple's latest iOS 10.
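To give a flavor of how noise injection works, the sketch below implements "randomized response," a classic local differential-privacy technique. This is an illustrative example only, not Apple's actual implementation; the function names and the probability parameter are the author's own. Each user randomizes their answer before reporting it, so any single report is deniable, yet the true population rate can still be estimated from many reports.

```python
import random

def randomized_response(truth: bool, p: float = 0.75) -> bool:
    """Report the true answer with probability p; otherwise flip a coin.

    The randomness gives each user plausible deniability: seeing one
    report tells you little about that user's true answer.
    """
    if random.random() < p:
        return truth
    return random.random() < 0.5

def estimate_rate(reports: list, p: float = 0.75) -> float:
    """Invert the known noise model to recover the group-level rate."""
    observed = sum(reports) / len(reports)
    # observed = p * true_rate + (1 - p) * 0.5  ->  solve for true_rate
    return (observed - (1 - p) * 0.5) / p

# Simulate 100,000 users, 30% of whom have some sensitive attribute.
random.seed(42)
true_rate = 0.30
reports = [randomized_response(random.random() < true_rate)
           for _ in range(100_000)]
print(round(estimate_rate(reports), 2))  # close to 0.30
```

The aggregate estimate converges on the real group statistic even though no individual report is trustworthy on its own, which is the essence of crowdsourced learning under differential privacy.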
Apple can study a particular group, their likes and dislikes, what they say or do, but it cannot pin the same down for any individual, making it virtually impossible for intelligence agencies or hackers to extract one user's information. The tech giant backed the approach with the work of Aaron Roth, a computer science professor at the University of Pennsylvania, compiled in a book called "The Algorithmic Foundations of Differential Privacy." In the book, Roth presents a mathematical guarantee that, under differential privacy, no one can learn about an individual from the published results. Referring to the tech giant's work in the field, Roth stated simply that the US tech behemoth is "doing it right."
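The mathematical guarantee Roth refers to can be stated in one inequality. In the standard formulation (from the differential-privacy literature generally, not from Apple's announcement), a randomized mechanism $M$ is $\varepsilon$-differentially private if, for any two datasets $D$ and $D'$ that differ in a single person's record, and for every possible set of outputs $S$:

```latex
\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S]
```

Intuitively, whether or not any one person's data is included barely changes what an observer can see, so the output reveals group-level patterns while protecting each individual.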
After the classic showdown between the world's most powerful government and Apple Inc., a large audience is waiting to scrutinize the company's approach to privacy. In line with its prior claims, the tech giant's latest work underscores the iPhone maker's resolve to bolster the security of its devices and give its consumers a safe and secure user experience.