Navigating Kids' Data Privacy: Essential Guidelines for Developers and Brands
Learn about the evolving landscape of children's data privacy laws, including COPPA, GDPR, and AADC, and how to create compliant digital experiences for youth.
KidAware for Developers - Compliance with Children's Data Privacy Laws
Added on 10/01/2024

Speaker 1: This module is presented to you by SuperAwesome, the world's leading kid tech company. SuperAwesome's mission is to make the Internet safer for children by helping brands and developers make their engagement with kids compliant and responsible. SuperAwesome builds tech to that end for advertisers, creators, and developers, including our Kids Web Services product, which enables developers to manage parental verification and consent. Today, we will provide an overview of the kids' digital landscape and the relevant rules and regulations that apply. Understanding your responsibilities is vital in order to safely navigate the space and reach your intended audience. Should you wish to learn more, KidAware offers full training courses, which go into much more detail. You can request instructor-led training or find more materials on kidaware.com. We know that most companies are well-intentioned and want to make cool products and engage child audiences. Even if you don't intend to have a service for kids, you may unintentionally find yourself in a situation where kids are present and regulation applies. It's important to give an honest assessment of your audience and continue to do so as your products and services evolve. During this presentation, we will discuss the kids' data privacy landscape, as well as how to create compliant experiences for youth audiences. Please remember, we are not your lawyers, and this module does not contain legal advice. It provides a general introduction to the kids' data privacy landscape only and should not be construed as guidance to be applied to any specific factual situation. Any questions on how the laws and regulations referred to in this module may apply to your company or your company's products should be directed to your legal counsel. Let's start off by talking about the children's data privacy landscape. The approach to children's data protection has evolved considerably and is still continuing to change.
Twenty years ago, the only children's privacy regulation was the Children's Online Privacy Protection Act, also known as COPPA. COPPA seeks to put parents in control of their children's information. Under COPPA, if your service is directed to kids, or you have actual knowledge that kids are there, you must get parental consent before collecting their personal information. Parental consent must be verified and must be obtained by an approved method enumerated by the Federal Trade Commission (FTC). This step adds friction to your service, because kids have to stop to find a willing parent, have them prove their adult status, and give consent. COPPA is currently being reviewed by the FTC and is expected to be strengthened. Eighteen years later, the European Union passed the General Data Protection Regulation, or the GDPR, which builds on COPPA. It recognizes that some types of data may be personal only when processed for a certain purpose, and it allows for other legal bases besides consent. Like COPPA, when it comes to kids, the GDPR's child-specific provisions (often called GDPR-K) require parental consent before the collection of PII for certain purposes. The law's principles-based approach is different from a parent-centered paradigm like COPPA, and instead requires companies to respect certain rights of the child regardless of parental involvement. Countries around the world, such as Brazil, China, and South Korea, have followed suit, passing similar regulations. US states have as well, California being the most prominent example. All of these varying laws have created a patchwork of requirements that companies must follow if they want to engage with children responsibly. Doing so can be complicated, since the laws can sometimes be at odds with each other. Then, most recently, we have the UK's Age Appropriate Design Code (AADC). It's not technically a new law, but rather a code built on top of the GDPR, so it has the force of law. The effects of the AADC are being felt on a global scale.
Companies are realizing they will have to comply with it across Europe, and politicians and regulators around the world are talking about emulating it. The AADC takes concepts from the GDPR one step further and looks at the features we design using personal data. It requires operators to ensure those features are in the child's best interest. We'll discuss what that means a little bit later. As new laws have emerged to protect kids' privacy, we've seen more and more developers coming into their scope. COPPA initially only applied to child-directed sites, or where the operator had actual knowledge that children were accessing the service. This was a fairly narrow definition, and it worked for a while. But the Federal Trade Commission has cracked down heavily on sites that primarily target adults, as we saw when it fined TikTok and YouTube for allegedly violating COPPA by illegally collecting personal information from children without their parents' consent. The GDPR dramatically expands the universe of regulated experiences. If a service is accessible by kids, it can be considered a service offered to children. As a result, many casual games like Angry Birds and Candy Crush had to start thinking about how to protect their younger audiences. The AADC takes it further still. The ICO has clearly stated that the GDPR's "offered to a child" means any service likely to be accessed by children, where "likely" means more likely than not. The other trend is that more children are being covered by new data privacy laws, because the threshold age under which you should obtain parental consent has been on the rise. COPPA and Brazil's privacy law set the threshold at under 13, but newer laws keep raising the age of consent. China's PIPL sets the age at 14. The GDPR defaults to 16, with the ability for individual countries to lower it to as young as 13. For example, Germany and Ireland are at 16, France is at 15, Spain is at 14, and so forth.
California's new privacy law, the CCPA, adopted this approach and implemented protections for kids ages 13 to 15 as well. The AADC applies to all kids under the age of 18, and says that even if you're engaging with kids over the age of consent, you still have to take their best interests into account. So how have operators reacted to these trends? Well, their approach has been shifting. Historically, platforms tried to keep kids off their services, but the kids came anyway. So they are having to figure out how to give kids access and embrace solutions for them specifically. Companies should embed privacy-by-design principles early in the design phase to ensure compliance with these laws. This is much easier than retrofitting a product down the road. The AADC came into force this past September, and it has required companies to rethink a lot of their design choices and features with kids in mind. It is why we saw lots of changes from big platforms like Google and Facebook: setting accounts to private by default, turning notifications off at night, implementing bedtime reminders, and so on. The code sets out 15 standards that anyone should follow if they are engaging with kids or their service is likely to be accessed by anyone under 18. Its paramount principle is to always act in the best interests of the child. The AADC raises the bar, focusing on the child's own rights. The code also sets out several design recommendations, like high privacy by default and removing manipulative mechanics like autoplay and autoscroll. To touch on some challenges in particular for developers to consider: parental consent alone doesn't make you compliant here. You should also think about the child's best interests when considering certain features that may cause harm. The code requires you to think about appropriate default settings for different ages (kids, tweens, teens) and requires certain things, like geolocation, to always be off by default.
The code prohibits certain practices, like inappropriately nudging children to lower their privacy settings. You have to design features with kids' best interests in mind. For example, rewarding users for playing longer without an easy way to pause, or not allowing users to save their progress, are likely to be considered inappropriate. Instead, the code asks providers to consider issues such as the need for screen breaks and general user welfare not directly related to privacy. There are also other practices under intense scrutiny by regulators that you should be aware of, like the use of targeted advertising, loot boxes, and recommendation algorithms. These developments don't stop in the UK. The AADC is being copied around the world. For example, the Irish Fundamentals reinforce the code and add a greater focus on enabling the child to exercise their own rights. Meanwhile, the new Dutch Code for Children's Rights pinpoints the risks of in-app purchasing offers in children's games and imposes some guardrails on how these are presented. US politicians have taken notice and are increasingly agitating for similar concepts, such as the best interests of the child and high privacy by default, to be factored into regulatory action by the FTC or to form part of new legislation. Let's go through some of the current compliance requirements and discuss what they may mean for developers. How your service is classified under kids' data privacy laws determines how you should treat visitors. First, you need to identify the age of your audience in order to implement the proper protections. If your content is appealing to kids, it is likely to be child-directed. COPPA sets out a list of factors to help you decide whether your service may be attractive to kids, but it's a subjective decision. If you decide it is directed to kids, it means you must treat all visitors as children.
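The "appropriate defaults for different ages" idea can be sketched in code. This is a hypothetical illustration of high privacy by default, not text from the AADC itself: the band boundaries and setting names are assumptions for the example.

```python
# Hypothetical age-banded defaults illustrating "high privacy by default".
# Band boundaries (under 10 / under 13 / under 18) are assumptions.
DEFAULTS = {
    "kids":   {"profile_private": True, "geolocation": False, "night_notifications": False, "autoplay": False},
    "tweens": {"profile_private": True, "geolocation": False, "night_notifications": False, "autoplay": False},
    "teens":  {"profile_private": True, "geolocation": False, "night_notifications": False, "autoplay": True},
}

def default_settings(age: int) -> dict:
    """Return a copy of the defaults for the user's age band.
    Geolocation stays off by default for every band."""
    band = "kids" if age < 10 else "tweens" if age < 13 else "teens"
    return dict(DEFAULTS[band])
```

The point of the sketch is that privacy-reducing settings are something the user (or parent) opts into later, never something the product ships enabled.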
No data collection from those under the age of digital consent, and don't drive them to e-commerce or social media destinations. If your site or app's target audience is not kids, but it still attracts a sizable under-13 audience, under COPPA it's referred to as mixed audience. Under GDPR-K, it's likely to be considered accessible to children. For mixed-audience sites, you can segregate kids with an age gate. Note that you do not have to block under-13 users; instead, you can provide them with an experience that does not collect PII, or you can obtain verified parental consent to allow data collection. You want to be clear about your intent and show you know how to distinguish between kids and adults. Your age gate must be neutral. This means no tip-off language that hints to a child what the age threshold is, which would make it easy to circumvent. The kid side should be appropriate for kids: no tracking, no hard-sell messages. The parent side should have a grown-up feel and not be enticing to kids. It can have links to retail, prices, social media, and trackers. An important principle under the GDPR and many emerging laws is that you should always minimize personal data collection. All technical identifiers, such as IP addresses and mobile identifiers, are likely to be considered personal information under kids' data privacy laws. User-generated content or input text, even if optional, presents an opportunity for children to provide their personal information. If you have chat or content upload, you may wish to consider moderating it to filter out PII. There are systems available which may enable user-generated content features without the need to obtain parental consent. Platforms such as PopJam and Roblox, which filter free-text chat, use such moderation technology partners. If you do have features that require you to collect personal information, you may need to obtain verifiable parental consent, also known as VPC. That can be challenging, but it's getting easier all the time.
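The neutral age gate described above might be sketched as follows. This is a hypothetical illustration, not an actual SuperAwesome implementation: it collects a full date of birth in neutral language, computes the age, and routes under-threshold users to a no-PII experience rather than blocking them. The under-13 threshold here is an assumption; the correct age varies by jurisdiction.

```python
from datetime import date
from typing import Optional

# Assumption: a COPPA-style under-13 threshold. Under the GDPR this ranges
# from 13 to 16 depending on the member state.
DIGITAL_CONSENT_AGE = 13

def age_on(birth: date, today: date) -> int:
    """Completed years of age as of `today`."""
    return today.year - birth.year - ((today.month, today.day) < (birth.month, birth.day))

def route_user(birth: date, today: Optional[date] = None) -> str:
    """Route a visitor after a neutral age gate. The prompt shown to the user
    should just ask for a date of birth, never hinting at the threshold."""
    today = today or date.today()
    if age_on(birth, today) < DIGITAL_CONSENT_AGE:
        # Under-threshold users are not blocked: they get a no-PII experience,
        # or a verified-parental-consent flow if data collection is needed.
        return "kid-no-pii"
    return "standard"
```

Note that asking for a full birth date, rather than "are you over 13?", is part of keeping the gate neutral and harder to circumvent.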
SuperAwesome has a number of tools and solutions for developers to make that as frictionless as possible. Make sure you are transparent with your users. The privacy policy should be in clear language that even non-lawyers can understand. It must also address kids' data specifically. Consider also that certain laws may require that your privacy practices be digestible by young audiences. For these instances, we suggest using simple language and considering just-in-time privacy notices or bite-sized explanations. It's important to remember that you are responsible for any data collection on your service, including that from third-party partners. You need to make sure that any partners or plugins you include are fully capable of respecting kids' privacy laws, that they understand you have a child audience, and that you have the right contractual assurances on any data they may collect from your users. Consider a safe harbor seal. The FTC has a program called Safe Harbor, under which an approved organization will audit your processes to ensure COPPA compliance. If you get certified, you have some legal protections and can demonstrate to consumers, regulators, and activists that you are taking kids' privacy seriously. Obtaining VPC is required under a number of privacy laws today. This concept was invented by the Federal Trade Commission and was picked up by the GDPR and other data privacy laws globally, including Brazil's. The idea is that you get consent from the parent and must do something to verify it's the parent, and not a child merely impersonating a parent to gain access. COPPA enumerates several methods to verify parent identity. A few examples are taking a small credit card charge; verifying a parent's social security number, or a similar identification number in another country, by checking it against a national database; or getting a signed consent form scanned or mailed back. The point is to prove that the person providing consent is an adult.
These methods are imperfect, but this is what regulators are accepting. There is innovation in this space, however, and we expect new methods may emerge over the next few years. You should be reviewing your products for any feature that collects data or allows PII to be shared. Here are some examples of features that may require you to obtain verifiable parental consent. Most features that use photos are considered to collect personal information, thus requiring consent. Allowing kids to share pictures or videos with camera filters, for example. Even images that don't contain children may be PII, because the file may contain metadata with personally identifiable details such as geolocation; location data as granular as street and city is considered PII and thus requires consent under the laws. Augmented reality also often makes use of geolocation, which then may require consent. Next, an e-mail address is considered personal information. Collecting it for marketing purposes requires consent unless it falls under specific exceptions. If you want to host a competition like a sweepstakes or contest, you usually need VPC to have kids submit personal information for entries and to be able to follow up with them to distribute prizes. Further, if you want kids to submit drawings, stories, or videos, those could all contain PII as well. Lastly, if you have a feature to help users find your store or product that collects geolocation, you'll need to get consent first as well. Text chat, if not pre-moderated, may allow kids to disclose PII, thus potentially requiring VPC. Voice chat is also likely to require VPC. Not only is a child's voice considered PII, but minors can reveal PII through the chat, and technology does not yet exist to moderate this in real time. Advertising that would make use of profiles or target based on the behavior or interests of specific users requires consent.
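To make the pre-moderation idea concrete, here is a minimal sketch of a PII filter for text chat. It is purely illustrative: the regexes below catch only obvious e-mail addresses, phone-number-like digit runs, and simple street addresses, whereas the moderation technology partners mentioned above use far more sophisticated detection (names, misspellings, leetspeak, context).

```python
import re

# Hypothetical pre-moderation patterns; real systems are much more thorough.
PII_PATTERNS = [
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),    # e-mail addresses
    re.compile(r"\b(?:\+?\d[\s-]?){7,15}\b"),  # phone-number-like digit runs
    re.compile(r"\b\d{1,5}\s+\w+\s+(?:st|street|ave|avenue|rd|road)\b", re.I),  # street addresses
]

def redact_pii(message: str) -> str:
    """Replace anything that looks like PII before the message is shown."""
    for pattern in PII_PATTERNS:
        message = pattern.sub("[removed]", message)
    return message
```

A filter like this runs before a message is displayed, so the PII is never disclosed to other users, which is the property that may remove the need for VPC for the chat feature.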
In games that match players with similar interests or skill sets, there is often profiling and tracking in ways that trigger a requirement for parental consent. Finally, push notifications may require consent in some instances. Here is an example of the flow that a child and a parent would typically go through when obtaining consent and going through verification. This is the way our KWS solution works. We recommend that you allow gameplay at the outset that doesn't require data collection. We call this progressive permissioning, and it means you allow the child to explore as much of your experience as possible. Only when you get to a feature that requires PII do you trigger the consent flow. This is the stage at which you're first talking to parents. The child provides a parent's e-mail and you send the parent a notice explaining your proposed data collection practices and requesting their consent. Once the consent has been given, you would offer them a choice of verification methods. The objective is to maximize the number of parents who can find a method that works for them. Making the decision to implement a consent mechanism helps future-proof your product. If you're a developer, you're getting squeezed by two forces. On one side, players expect amazing experiences with social features and multiplayer cross-play, and kids don't expect their experience to be any different. On the other side, you have legislation evolving to cover more young people and deepen compliance requirements within your product. An optimized consent management platform can help developers balance these two dynamics. Developers are happy because it makes the implementation process easy. Parents are happy because the process is clear and easy. Kids are happy because they gain access to experiences more often and seamlessly. Regulators are happy because developers are compliant.
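The progressive-permissioning flow described above can be sketched as a small state machine. This is a hypothetical illustration of the pattern, not the KWS API: the feature names and states are assumptions for the example.

```python
from enum import Enum, auto

class Consent(Enum):
    NOT_REQUESTED = auto()
    PENDING_PARENT = auto()  # notice sent to the parent, awaiting verification
    GRANTED = auto()

# Hypothetical map of which features touch personal information.
FEATURES_REQUIRING_PII = {"chat", "photo_upload", "push_notifications"}

class Session:
    def __init__(self):
        self.consent = Consent.NOT_REQUESTED

    def request_feature(self, feature: str) -> str:
        """Progressive permissioning: free play until a PII feature is hit."""
        if feature not in FEATURES_REQUIRING_PII:
            return "allowed"  # core gameplay collects no PII
        if self.consent is Consent.GRANTED:
            return "allowed"
        if self.consent is Consent.NOT_REQUESTED:
            # First PII feature hit: ask the child for a parent's e-mail,
            # send the notice, then offer the parent verification choices.
            self.consent = Consent.PENDING_PARENT
        return "blocked-awaiting-consent"
```

The design choice is that the consent flow is triggered lazily, at the first PII feature, so a child who never touches chat or uploads never has to involve a parent at all.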
The metaverse will be a place where people of varying ages meet, play games, and take part in events just like they do in the physical world. It's important that developers have tools to ensure that they're able to deliver audience-appropriate experiences that are compliant with relevant privacy laws. By making KWS parent verification free, we hope to enable more developers to create safer digital experiences while empowering parents to make the choices that are right for their families. Thank you for taking the time to learn more about the kids' digital landscape. If you have any questions, please feel free to reach out to Paul Nunn or Nigel Mbandla.
