4. A culture of trust…and differentiated privacy
“The potential of big data hinges on one thing: trust,” explained Penny Pritzker, secretary of the U.S. Department of Commerce. “Trust can be promoted by informing users of how you plan to use their data and by setting training standards for employees; consumers must also share in that trust through their online behaviors and by becoming informed about their online data management.”
Cynthia Dwork of Microsoft Research explained how a relatively new definition of privacy, called differential privacy, can help counter some of the security risks of online data.
Over the past five years, a new approach to privacy-preserving data analysis has borne fruit, said Dwork. This approach, differential privacy, comes with a formally defined privacy guarantee, and the data-analysis techniques built on it are rigorously proved to satisfy that guarantee.
“Roughly speaking, this ensures that (almost, and quantifiably) no risk is incurred by joining a statistical database,” she said.
For more information on differential privacy, read Dwork’s paper.
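The guarantee Dwork describes is usually achieved by adding calibrated random noise to query results. A minimal sketch of the standard Laplace mechanism for a counting query is below; the dataset, predicate, and epsilon value are hypothetical, chosen only for illustration:

```python
import random

def laplace_noise(scale):
    # Laplace(0, b) noise, sampled as the difference of two
    # independent exponential draws with mean b.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_count(records, predicate, epsilon):
    """Differentially private count.

    A counting query has sensitivity 1: adding or removing one
    person's record changes the true count by at most 1, so Laplace
    noise with scale 1/epsilon yields an epsilon-DP answer.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical data set: ages of survey respondents.
ages = [23, 35, 41, 29, 52, 37, 44, 61, 30, 28]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5)
```

Because the noise is symmetric around zero, repeated noisy counts cluster around the true answer, yet any single release reveals (almost, and quantifiably) nothing about whether one individual's record is in the database.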
One way differential privacy is pursued is through the methodology of modern cryptography: first, propose a definition of what you would like the cryptosystem to achieve; next, develop an algorithm that satisfies that definition; when the definition is refined, and the algorithm thereby broken, develop another, stronger algorithm for the new definition; and so the cycle continues.
Modern cryptography is a privacy-enhancing technology because it “uses techniques that extract information from data without actually seeing it, keeping individual data sets private,” explained Shafi Goldwasser, professor at MIT CSAIL.
According to Goldwasser, there is a lot of math involved, but higher education institutions can use modern cryptography to analyze information about students and programs, chiefly by sharing data sets with other institutions.
“Parties learn only the output of the function, but nothing else about the others’ inputs,” she said.
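The property Goldwasser describes, learning a joint result without exposing any party's input, can be illustrated with additive secret sharing, one of the simplest building blocks of secure multiparty computation. This is a sketch under assumed conditions (honest parties, a toy three-institution sum); the input values are hypothetical:

```python
import random

PRIME = 2**61 - 1  # all arithmetic is done modulo a large prime

def share(secret, n_parties):
    """Split a secret into n additive shares that sum to it mod PRIME.
    Any n-1 shares together are uniformly random and reveal nothing."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

# Hypothetical private counts held by three institutions.
inputs = [120, 340, 95]
all_shares = [share(x, 3) for x in inputs]

# Party j sums the j-th share of every input; no party ever
# sees a raw input, only uniformly random shares.
partial_sums = [sum(col) % PRIME for col in zip(*all_shares)]

# Combining the partial sums reveals only the total, nothing else.
total = sum(partial_sums) % PRIME
```

Each institution learns the combined total (the output of the function) while its own count stays hidden, which is exactly the guarantee quoted above.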
“By developing stronger cryptosystems, privacy loss can be managed,” said Dwork.
5. Privacy vs. the illusion of privacy
“Ultimately what needs to happen is an acceptance of the illusion of privacy,” said Manolis Kellis, associate professor at MIT CSAIL.
Every time you take a drink from a glass, explained Kellis, you leave behind your DNA, which reveals your personal information. In theory, if someone wanted to take that DNA from your glass they could.
“But you don’t hide your DNA by never touching things,” he continued. “Just like with Big Data and privacy, it’s out there and that’s the risk everyone takes by functioning in society. It’s the policies and laws that are implemented, security logs, and the information on how your data is used, that protects you.”
To assume a data breach won’t happen is an illusion; planning for one, through knowledge of current policies and of cutting-edge technology tools that help mitigate a breach, is one way to protect privacy in a real sense, said Kellis.