This week, Google's chief internet evangelist, Vint Cerf, spoke on privacy at an FTC event, suggesting that privacy might not necessarily be sustainable, especially with regard to apps and social networks:
“Our social behavior is also quite damaging with regard to privacy,” Cerf says. He gives an example of how a person could be exposed doing something they wanted to keep secret by being tagged in the background of a stranger's photo, a photo they never expected to be caught in. “The technology that we use today has far outraced our social intuition, our headlights. ... [There's a] need to develop social conventions that are more respectful of people’s privacy.”
“We are gonna live through situations where some people get embarrassed, some people end up going to jail, some other people have other problems as a consequence of some of these experiences,” Cerf said. More respectful privacy conventions will likely develop as we move forward, he says, but for now, “This is something we're gonna have to live through. I don't think it’s easy to dictate this.” – “Google’s chief internet evangelist says ‘privacy may actually be an anomaly’”, ReadWriteWeb
We’ve all probably experienced this lack of privacy on a social network – perhaps you were tagged in a photo from an event you didn’t realize would be public, or perhaps information was shared that wasn’t exactly for public consumption. Unfortunately, this kind of thing is becoming more commonplace, especially as privacy policies are often changed without keeping consumers informed of changes that might affect their public-facing personas.
App privacy, especially when apps ask for information in order to function, is also becoming a common issue, particularly given the glut of apps available, the number of apps consumers download to their devices, and the lack of oversight of app privacy. Most people actually expect quite a bit of privacy from their apps, even though, for the most part, that privacy is merely perceived. According to a recent study:
“46 percent, for example, believe that carriers should not store location information for any length of time at all, while 59 percent believe data on a phone is "about as private" as data on a personal computer — which isn't necessarily the case depending on how a phone is loaded up.”
Is app privacy an illusion – an “anomaly”, as Mr. Cerf suggests in his talk referenced above? After all, we give data to our favorite social networking sites, which then use that data to find friends, events, and organizations for us to continue to interact with. When we use a certain large search engine along with its peripheral services, we are essentially giving it the “key to the castle” with how much data we allow it to see and use. When we go shopping, the pair of boots we liked will show up in a pop-up ad during our future Web browsing. This sounds potentially intrusive when written down in black and white, but in reality, it is expected as part of the overall customization and personalization of the services we use every day – both web-based and app-based.
Even though most people realize that many apps and social networking services gather information like location, names, usernames, and other data, users still value their privacy highly and want control over how that data is collected, used, and shared. To make privacy a reality and not just an “anomaly”, apps need to reasonably disclose what information they plan to collect, with clearly written options that help users make thoughtful choices about what they want to share. While information gathering is the standard, it doesn’t have to be overly intrusive; developers should respect users, with safety measures in place that protect sensitive data.
Is there really a problem, or are a few consumers just overreacting? A recent study from the Privacy Rights Clearinghouse strongly suggests the concern is justified:
• Many apps send data in the clear (unencrypted) without user knowledge.
• Many apps connect to several third-party sites without user knowledge.
• Unencrypted connections potentially expose sensitive and embarrassing data to everyone on a network.
• Nearly three-fourths, or 72%, of the apps we assessed presented medium (32%) to high (40%) risk regarding personal privacy.
• The apps that presented the lowest privacy risk to users were paid apps, primarily because they don't rely solely on advertising to make money, which means the data is less likely to be made available to other parties.
The Electronic Frontier Foundation (EFF) has a very thoughtful “Mobile User Privacy Bill of Rights” that includes these concrete, practical recommendations that developers can take into account to guard user privacy:
• Anonymizing and obfuscation: Wherever possible, information should be hashed, obfuscated, or otherwise anonymized. A "find friends" feature, for example, could match email addresses even if it only uploaded hashes of the address book.
• Secure data transit: TLS connections should be the default for transferring any personally identifiable information, and must be the default for sensitive information.
• Secure data storage: Developers should retain information only for the duration necessary to provide their service, and the information they store should be properly encrypted.
• Internal security: Companies should provide security not just against external attackers, but against the threat of employees abusing their power to view sensitive information.
• Penetration testing: Remember Schneier's Law: "Anyone, from the most clueless amateur to the best cryptographer, can create an algorithm that he himself can't break." Security systems should be independently tested and verified before they are compromised.
• Do Not Track: One way for users to effectively indicate their privacy preferences is through a Do Not Track (DNT) setting at the operating system (OS) level.
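The first of the EFF's recommendations above can be sketched in a few lines of Python. This is a minimal illustration, not a production design – `hash_email` and `find_friends` are hypothetical names, and a real deployment would also have to account for the fact that the space of plausible email addresses is small enough that plain hashes can be brute-forced:

```python
import hashlib

def hash_email(email: str) -> str:
    """Normalize and hash an address so the raw address never leaves the device."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def find_friends(local_contacts, server_hashes):
    """Match the user's contacts against the service's user list using only hashes."""
    return [addr for addr in local_contacts if hash_email(addr) in server_hashes]

# The server stores only hashes of registered users' addresses (hypothetical data).
registered = {hash_email("alice@example.com"), hash_email("bob@example.com")}
contacts = ["Alice@Example.com ", "carol@example.com"]
print(find_friends(contacts, registered))  # ['Alice@Example.com ']
```

The key property is that the "find friends" feature still works, but neither side ever transmits or stores a contact's address in the clear.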
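The secure-transit recommendation can also be enforced in code rather than left to policy. A minimal sketch, assuming the app routes its networking through a single helper (`fetch_securely` is a hypothetical name): Python's standard-library `ssl.create_default_context()` turns on certificate verification and hostname checking by default, and the helper simply refuses plaintext HTTP for anything carrying user data.

```python
import ssl
import urllib.request

# create_default_context() enables certificate verification and hostname
# checking, so user data is never sent over an unverified channel.
context = ssl.create_default_context()

def fetch_securely(url: str) -> bytes:
    """Fetch a URL, refusing unencrypted transport outright."""
    if not url.startswith("https://"):
        raise ValueError("refusing to send user data over an unencrypted connection")
    with urllib.request.urlopen(url, context=context) as resp:
        return resp.read()
```

Centralizing transport in one helper like this makes "TLS by default" a property of the codebase instead of a convention each developer must remember.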
“To truly test AppPrivacy, I decided to make my ‘HappyFunTimes’ app as obnoxious as possible, so I checked the box that said I wanted to send marketing messages to my users’ contacts. ‘Warning!’ a pop-up box read. ‘If you are going to access the user’s contacts database and use it for marketing purposes, you must have their permission first. Also, you should gain consent from any contact you plan to send marketing messages to.’” – “How coders should make their apps more privacy-friendly”, BusinessWeek.com
How concerned are you about privacy – both for yourself and for your users? How are you providing privacy boundaries in the apps you are building? Share with us in the comments below.