Apps and privacy practices are not two things that you hear linked together very often, and with good reason: the app development ecosystem has no good framework for integrating the two in a transparent, user-friendly process. In a previous article, we talked about the problem of privacy, and how a recent Federal Trade Commission survey of over 400 downloaded apps found a disturbing number of glaring privacy violations within the apps themselves. In this article, we’re going to dig deeper into the problem of app privacy, especially in apps aimed at children.
The problem of privacy
As recounted in the previous article, transparency in privacy practices – especially around data collection and usage – is a problem, particularly given how fast the app market is growing. As of this writing, there are several hundred thousand apps available for download from a variety of app stores, and while the marketplace is growing at breakneck speed, little time or effort has been devoted to creating any kind of meaningful industry-wide privacy guidelines.
The FTC has repeatedly called for app industry best practices around privacy. These would include app designs that encourage a more private experience by default, information about how data is used, and transparency about what kind of data is going to be collected, especially in apps primarily aimed at children. While these rough guidelines would certainly go far in giving people greater confidence in the app industry overall, they are easier said than done when it comes to implementation.
As I personally have seen this last week, it is far too easy for children to purchase goods and services within an app itself. In fact, I found that the barrier to purchase was set ridiculously low, while attempting to set any kind of restrictions – parental controls, account management, etc. – was decidedly frustrating. I’m definitely not a novice user, either, so one can only imagine the confusion this causes most people when trying to set boundaries within apps for young children. The FTC study showed clearly that very few apps have any sort of restrictions in place in the context of in-app purchases; on the contrary, it’s just a matter of a few clicks and boom! You get a wonderful surprise on your credit card bill at the end of the month.
What kind of information is being shared?
Private data is collected, used, and transmitted within apps on a regular basis, often without disclosure and sometimes inappropriately – including in apps for kids. In the FTC survey, more than four hundred different apps for kids were actually downloaded and put through their paces. Staff reviewed disclosures, links on promotional pages, app developer websites, and any information from within the app itself, as well as interactive features, the ability to make in-app purchases, and how the app collected and used information.
Overall, their findings were shocking. Only 16% of the four hundred apps reviewed provided any sort of links to privacy documents or other disclosures before download. This kind of information needs to be available prior to downloading, since once an app is downloaded, it’s already paid for and the information is on its way to third parties.
Out of the four hundred apps reviewed, only 20% contained any sort of privacy-related disclosures on the app’s promotion page within the app store, the developer website, or within the app itself. And what about these privacy policies? Most were unreadable legal documents that were difficult to decipher and filled with information that made no sense to anyone without years of legal training. Many of them also lacked the most basic of information, lost in the legalese: what information would be collected, the reason behind collecting that information, and what parties will use that information.
59% of apps in the FTC survey were found to actively transmit user information to the app developer or another third party. The most common piece of information shared was the user’s device ID. This bit of data was disseminated to ad networks, developers, analytics companies, or other third-party entities.
What are device IDs, and why are they important? They are short strings of data that uniquely identify specific mobile devices. Most electronic devices have multiple device IDs, and each string contains different identifiers. These seemingly innocuous bits of data can tell companies, developers, and other third parties information such as the device model, carrier, what OS is running and which version, and any language settings. They can also divulge more personal data, such as the user’s name, phone number, email address, list of friends on multiple social networks, and geographic location.
The problem with collecting device IDs comes when they are linked together from multiple apps. This data can be associated with other data sharing the same device ID and used to compile complete profiles about individuals. The FTC survey found that some companies collect device IDs through apps, which can then be linked to other information about the same user gathered through other apps downloaded on the same device. An app might transmit just one piece of information, but every time an app gives out this information, it’s also giving out a unique device ID, which means that any third party that receives it can add it to data previously collected through other apps running on the same device – and that’s where it really starts to get murky as far as user privacy. Basically, it’s a slippery slope that unscrupulous companies and individuals can use to construct a frighteningly complete profile of an individual – without their knowledge or permission.
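To make the linkage problem concrete, here is a minimal Python sketch of how a third party could merge records received from different apps that share one device ID. All app names, fields, and values here are hypothetical, invented purely for illustration:

```python
from collections import defaultdict

# Each record represents what a single app might transmit to a third
# party alongside the device's unique ID. (Hypothetical data.)
records = [
    {"device_id": "a1b2-c3d4", "app": "DrawPad Kids", "location": "Boston, MA"},
    {"device_id": "a1b2-c3d4", "app": "PuzzleTime", "email": "parent@example.com"},
    {"device_id": "a1b2-c3d4", "app": "StoryBooks", "age_range": "6-8"},
]

def build_profiles(records):
    """Merge every record that shares a device ID into one profile."""
    profiles = defaultdict(dict)
    for record in records:
        device_id = record["device_id"]
        # Fold each app's fields into the profile keyed by device ID.
        profiles[device_id].update(
            {k: v for k, v in record.items() if k != "device_id"}
        )
    return dict(profiles)

profiles = build_profiles(records)
# One device ID now ties together a location, an email address, and an
# age range, even though no single app transmitted all three.
print(profiles["a1b2-c3d4"])
```

No single app in this sketch leaked a complete profile; the device ID is the join key that lets a third party stitch the fragments together, which is exactly why transmitting it alongside any other data matters.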
In-app advertising and purchases
When’s the last time you downloaded an app that actively disclosed that it contained advertising? Probably never. Most apps do not disclose that they contain any sort of ads, and the disclosures that do exist have been found to be misleading: the apps actually feature ads, some of which are meant for mature audiences.
Content in ads and data collection from those ads should be fully disclosed before downloading, especially in apps aimed at children. The same goes for in-app purchases. Many apps, especially gaming apps, encourage users to purchase additional game content and pieces through in-app purchases. There’s nothing inherently wrong with this practice. However, the ability to make these purchases is often not disclosed, especially in kids’ apps, which means that parents often get a fun surprise on their credit card bill at the end of the month. There is also little clear disclosure of the how’s and why’s of in-app purchases: why should I buy this? Is this a recurring charge? What kind of authorization is needed? Is there a refund process?
As many parents have found out the hard way, the barrier to in-app purchasing is set way too low within apps. Apps marketed as free are some of the worst offenders, since in-app purchases are how they make their money. Better privacy disclosures for in-app purchases must be adopted as the app ecosystem continues to grow.
Many apps these days have links to social media services so users can share what they’re doing: drawings, high scores, images, etc. For apps aimed at kids, this can be problematic: kids don’t always use common sense on social networks, and they could potentially give out identifying information to people who don’t have their best interests in mind. This kind of interactivity is rarely disclosed before download in today’s marketplace. It’s difficult to overemphasize how important it is for apps and app developers to put a thoughtful framework in place around social networking, especially for young children, who are too easily exploited.
Obviously, as we’ve talked about in this and in a previous article on app privacy, significant discrepancies exist between privacy disclosures and what actually happens once an app is downloaded. Most apps offer little to no privacy boundaries, and the pathway to figuring out how information is collected and transmitted is filled with obstacles. This problem is especially troubling in apps aimed at children, since these apps are potentially using information belonging to minors. Developers can be the heroes in this scenario, and they are the ones who can actively work to make it better:
- By minimizing any potential risks to personal information within the app itself
- By giving users straightforward choices about possible data mining practices
- By giving users the information they need about how their data is collected, used, and shared, both within the app and without
How do you feel about the current state of privacy and apps? Not enough, too much, or somewhere in between? What would you do to make apps safer for kids, especially? Share with us in the comments section.