November 11, 2014
Google is re-evaluating its privacy standards with a new open source tool designed to maintain the confidentiality of individuals whose data appears in large data sets. The ongoing project, known as RAPPOR, builds on a 1960s survey technique that disrupts the correlation between a given data point and the individual behind it. The project aims to preserve the privacy and identity of the individual, which is often vulnerable in the hands of companies today.
According to The Wall Street Journal, participants are asked to take part in a game of heads or tails. Depending on the outcome of the coin flip, participants follow one of two rules.
“If the coin turns up heads, they should answer yes regardless of the true answer,” notes the article. “If it comes up tails, they should answer truthfully.” Any “yes” recorded for a yes-or-no question can therefore belong to one of two groups: those who answered truthfully and those who were instructed to answer “yes” by the rules. Because researchers know how often the coin forces a “yes,” they can adjust their calculations to recover accurate aggregate statistics without knowing any individual’s true answer.
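The coin-flip protocol and the adjustment described above can be sketched in a few lines of Python. This is a minimal illustration of the classic randomized-response idea, not Google’s actual RAPPOR implementation; the function names and the simulated 30% “true yes” rate are invented for the example.

```python
import random

def randomized_response(true_answer: bool) -> bool:
    """One respondent's answer under the coin-flip protocol."""
    # Heads: answer "yes" regardless of the truth.
    if random.random() < 0.5:
        return True
    # Tails: answer truthfully.
    return true_answer

def estimate_true_yes_rate(responses) -> float:
    """Recover the population's true "yes" rate from noisy responses.

    Observed yes rate = 0.5 (forced yes on heads)
                      + 0.5 * true rate (truthful on tails),
    so: true rate = 2 * observed - 1.
    """
    observed = sum(responses) / len(responses)
    return max(0.0, 2 * observed - 1)

# Demo: simulate 100,000 respondents where 30% truly answer "yes".
random.seed(0)
truths = [random.random() < 0.3 for _ in range(100_000)]
responses = [randomized_response(t) for t in truths]
print(estimate_true_yes_rate(responses))  # close to 0.3
```

No single response reveals anything certain about the respondent, yet the aggregate estimate converges on the true rate as the number of participants grows.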
Joseph Lorenzo Hall of the Center for Democracy and Technology explains that this randomized-response technique was originally used to collect data about sensitive subjects, such as sexually transmitted infections. “Respondents in some cases weren’t convinced that survey researchers could keep responses confidential and anonymous,” and so the technique was developed so that participants would feel comfortable answering honestly.
Google is experimenting with the technique to evaluate how many people are using its software.
For now, Google has yet to announce any further plans for RAPPOR, nor has there been word on whether other companies will adopt the technique moving forward.