COPPA – Title XIII, the Children’s Online Privacy Protection Act – is a US federal law enforced by the Federal Trade Commission (FTC). It was originally written in 1998 and had not been updated to account for the latest wave of digital kids’ content in the form of apps and other mobile media aimed at children under 13. The latest revision, completed last year, became the ‘law of the land’ in the USA as of July 1st, 2013.
App developers with a focus on children’s content, and consumers (caregivers, teachers and other professionals who use apps with children), need to be aware of what COPPA does and does not require for protecting children from unwanted privacy violations within app content. So what does COPPA say, in a nutshell? According to the FTC’s website:
Companies whose apps collect, store or transmit this information, as well as other personal information previously covered by the rule like a child’s name or address, must get parents’ consent before collecting the information. In addition, companies must also ensure that any third party receiving the information can keep it secure and confidential, as well as abiding by new rules affecting how the information is stored and retained.
The rule applies not only to information collected by the apps themselves, but also to any information collected by third parties like advertising networks within the apps.
COPPA compliance is required of all app developers whose apps are aimed at children under 13, so this is an important law for anyone publishing children’s content to consider. Even if you think your app or ebook is compliant or is not gathering identifying personal information from children, you should definitely read this if you are responsible for publishing digital media for kids. A good resource for book publishers is “What Every Kids’ E-Book Publisher Needs To Know About COPPA” at e-booksandkids.com. For app developers in general, and parents looking for a guarantee of COPPA compliance, see MomsWithApps.com’s “KNOW what’s inside program,” which has an iconic badge for approved developers to show as “a signal to parents that your app is designed with privacy in mind.” It tells consumers that as app developers:
“We care about parents and kids enjoying technology together. We have committed to tell you how our apps work, so you know what to expect.”
A nice summary of COPPA for parents is also available on the blog of Everloop, an online social media site for kids, written by CEO Hilary DeCesar – including this useful note:
You can also continue to utilize safety controls across all devices to allow customization of your family cyber experience, including turning off location services on social sites and apps that children use. Setting limits for social media use should be similar to setting parental controls and limits on time spent watching TV or video games.
And nothing is more important than having frank discussions with kids about the potential dangers of oversharing information – such as addresses, passwords, locations, or any other identifiable information – with peers or others they ‘chat with’ online, whether or not on COPPA-compliant sites.
While I salute Congress for updating this very outdated set of protections, it is also a good reminder to parents and others who share digital content with kids that these protections are very basic and not nearly as ‘protective’ of children as we would be ourselves. COPPA compliance does not protect kids from advertising, in-app links or purchases, nor does it provide any educational assurances about quality claims made by content creators. Caregivers still need to be active participants in their children’s selection and use of media in all forms for true protection. The teaching needed for kids to transition to making decisions about their personal information themselves is also critical.
These rules do not provide any protection for young people aged 13-18, who are still very vulnerable to exploitation by commercial interests, in addition to having still-developing long-term decision-making skills – something that can greatly impact the choice to share or not share personal information. There are also no current rules to address the fact that young people aged 18-25 are among the most vulnerable to something called ‘coercive monetization’ in apps, described well in this quote from Gamasutra’s blog post, “The Top F2P Monetization Tricks”:
Note that while monetizing those under 18 runs the risk of charge backs, those between the age of 18 and 25 are still in the process of brain development and are considered legal adults. It seems unlikely that anyone in this age range, having been anointed with adulthood, is going to appeal to a credit card company for relief by saying they are still developmentally immature. Thus this group is a vulnerable population with no legal protection, making them the ideal target audience for these methods. Not coincidentally, this age range of consumer is also highly desired by credit card companies.
Like the common advice during air travel safety presentations to put your own oxygen mask on before assisting your child, we must first get educated as parents and educators and then begin the process of helping our children become savvy consumers of digital media. It’s truly an unlimited buffet, but a lot of the content aimed at children and young adults is not especially nutritious and some of it may even be toxic.
How do you stay up-to-date about digital media for your kids? What advice would you give parents new to supervising their child’s digital content consumption?