GAO Report Looks at Privacy Concerns of Facial Recognition
August 3, 2015
U.S. Senator Al Franken (D-Minnesota) highlighted the findings of a just-released GAO (Government Accountability Office) report focusing on the privacy implications of facial recognition technology. The report details concerns about the practices of companies that collect, use, and store massive amounts of personal information. Franken, chair of the Judiciary Subcommittee on Privacy, Technology and the Law, pointed to the report’s findings as further proof that federal standards are needed.
An earlier government report found that in-car navigation companies should give consumers more information about how they use and share location data, and Franken has long been concerned with the impact of such technologies on consumer privacy. According to his website, the senator convened a hearing on protecting mobile privacy in May 2011, at which experts testified about “the benefits and dangers of using location data.”
“Over the past several years, we’ve seen tremendous growth in the use of facial recognition technologies, and it has profound implications for consumer privacy,” said Franken. “Facial recognition tracks you in the real world — from cameras stationed on street corners and in shopping centers, and through photographs taken by friends and strangers alike.”
According to TechCrunch, “Companies like Facebook and Google use face-recognition technology to tag you in photos, for example, but as you could imagine, this technology could be used for nefarious things if it’s in the wrong hands (like ‘The Terminator,’ with the wrong intentions).”
Although the GAO report does not make specific recommendations, it does suggest that Congress examine “strengthening the consumer privacy framework” to reflect changes in evolving technologies. “No federal privacy law expressly regulates commercial uses of facial recognition technology,” notes the GAO.