A Call for a Better Adult Verification System

“How many of you guys have a cell phone?” [most hands go up]

“How many of you guys have an account on Instagram or Snapchat?” [nearly the same number of hands go up]

This gymnasium of 4th through 6th graders in Northwest Kansas was just another example of our failed Adult Verification System. “Age gating,” the industry term for asking users their age when they sign up for an account on a website that collects personally identifiable information (aka every social media app on the planet), is horribly broken.

The intent of age gating is to comply with COPPA, a federal law that restricts the collection of personally identifiable information from children 12 and under. More precisely, age gating lets social media apps avoid very expensive COPPA compliance: if they can claim no knowledge of users under 13, they don’t have to maintain a system whereby parents can access all of the information collected about their children.

See, creating an account on a social media site is not about capabilities or gaining parental permission. (see When Should I Allow My Child to Get a Social Media Account?) Creating an account on a website or app that collects personal information is about age, like driving a car or voting or buying alcohol.

The minimum age to create an account is 13. But the system for verifying users is virtually non-existent.

An Age-Old Cycle of Blame

You can’t tell me that app manufacturers don’t know they have tons of underage users. It’s rampant, and they definitely profit from those users and their data.

Whenever I do assemblies with elementary-aged kids, lots of them have questions about apps that they aren’t old enough to use.

And when I talk to parent groups, most of the parents readily admit that their 9- or 10-year-old has an account on Instagram, Snapchat, Twitter, or another app they aren’t old enough for.

When questioned about this, most app manufacturers simply blame parents. They cite quotes from research like this:

“Our data show that many parents knowingly allow their children to lie about their age — in fact, often help them to do so — in order to gain access to age-restricted sites” — danah boyd

Source

The problem with app developers blaming parents? The Federal Trade Commission doesn’t make age verification the responsibility of parents; it’s the website or app’s responsibility:

If you operate a commercial Web site or an online service directed to children under 13 that collects personal information from children or if you operate a general audience Web site and have actual knowledge that you are collecting personal information from children, you must comply with the Children’s Online Privacy Protection Act.

Source

COPPA requires those sites and services to notify parents directly and get their approval before they collect, use, or disclose a child’s personal information. Personal information in the world of COPPA includes a kid’s name, address, phone number or email address; their physical whereabouts; photos, videos and audio recordings of the child, and persistent identifiers, like IP addresses, that can be used to track a child’s activities over time and across different websites and online services.

Source

See, in regards to COPPA compliance from a legal perspective, the burden isn’t on parents. (Though we all agree they should play a part in compliance.) It’s on the operators of the sites which collect personally identifiable information.

We Need a Better Age Verification System


Most apps deploy a simple script on their account creation page, and that script is the entirety of their Age Verification System. The industry calls it age gating because the app only checks once: it’s a gate. Provide a birthdate more than 13 years in the past one time and it’ll never ask again.

Technically speaking, we’re talking about a few lines of code that ask for a user’s birthdate, and that’s it. If you tell Instagram or Facebook or Snapchat that you are under 13, it won’t let you create an account. But standard age gating doesn’t prevent the same IP address, or even the same browser session, from submitting the exact same information with a different birthdate. Deploying a blacklist isn’t that hard: simply blacklist the IP address or device ID to prevent it from changing the birthdate to get past the gate. If app developers wrote a script that told users they weren’t old enough and then prevented them from submitting the form again for 24 or 48 hours, most would give up.
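To show how little code this would take, here is a minimal Python sketch of an age gate with a 24-hour cooldown blacklist. Everything here is hypothetical — the `AgeGate` class and the in-memory dictionary standing in for a real blacklist store are my own illustration, not any operator’s actual system.

```python
from datetime import date, datetime, timedelta

COOLDOWN = timedelta(hours=24)  # lockout window after a failed attempt


class AgeGate:
    """Hypothetical age gate: one failed attempt locks that IP out for COOLDOWN."""

    def __init__(self):
        self._blacklist = {}  # ip -> datetime when the lockout expires

    def attempt_signup(self, ip, birthdate, today=None):
        today = today or date.today()
        now = datetime.now()

        # Reject retries from a recently failed IP, even with a "better" birthdate.
        expiry = self._blacklist.get(ip)
        if expiry and now < expiry:
            return "locked_out"

        # Compute age, accounting for whether the birthday has passed this year.
        age = today.year - birthdate.year - (
            (today.month, today.day) < (birthdate.month, birthdate.day)
        )
        if age < 13:
            self._blacklist[ip] = now + COOLDOWN
            return "too_young"
        return "account_created"
```

A real deployment would persist the blacklist server-side and key it on device IDs as well as IP addresses, but the gate itself really is just a birthdate check plus a lockout: admit being 10, and resubmitting the same form claiming to be 16 no longer works.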

[box type=”note” size=”large” style=”rounded” border=”full”]Most of these same sites use a similar blacklisting technology to prevent spammers and bots from creating hundreds of accounts from the same device or IP address. So the technology already exists and is deployed, just not applied to age verification.[/box]

[box type=”note” size=”large” style=”rounded”]Did you know Instagram and Twitter don’t even ask users for an age? How is this COPPA compliant? These operators make no attempt to keep underage users off their applications.[/box]

A Call to Develop On-Going Age Verification Versus Age Gating

Rather than simple age gating, which has done nothing to keep underage users off apps intended for older teens and adults, I would like to see a developer create an on-going age verification system: one that not only prevents underage users from creating accounts but also flags on-going underage activity.

For instance, many app and device manufacturers are already using biometrics to identify users, most notably Facebook with facial recognition on images and Apple with fingerprints. (see Biometric Technology Takes Off) So the technology to identify users by their physical biometrics is available, whether they unlock their phone with a finger or post a photo of themselves on Instagram, but no effort is put forth by these companies to verify that users are of legal age to use their services. If a 10-year-old creates an account on Instagram claiming she is 16, then later tags herself in a selfie with her mom, whom she also tags in the photo, why doesn’t Instagram’s API correlate that data with the tagged party’s Facebook account, where the mom correctly lists the child’s age as 10? It’s not that the technology doesn’t exist; it’s that Instagram, a Facebook company, has no will to use it.

Similarly, we know the technology exists for apps to flag content based on geolocation, keywords, or message content. If an app developer were truly interested in on-going age verification, the technology exists to limit the collection of personally identifiable information to those 13 and older. The will to do so does not.
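As an illustration of how unexotic even a first pass at keyword-based flagging would be, here is a hypothetical Python sketch that scans user-generated text for self-reported ages under 13. The patterns and function name are my own illustrative assumptions, far from exhaustive; a real system would combine many signals, but the point is that the flagging technology itself is trivial.

```python
import re

# First-person phrases that suggest a self-reported age (illustrative, not exhaustive).
UNDERAGE_PATTERNS = [
    re.compile(r"\bi(?:'m| am)\s+(\d{1,2})\b", re.IGNORECASE),       # "I'm 10"
    re.compile(r"\bturning\s+(\d{1,2})\b", re.IGNORECASE),           # "turning 11"
    re.compile(r"\b(\d{1,2})(?:st|nd|rd|th)\s+birthday\b", re.IGNORECASE),  # "9th birthday"
]


def flag_underage_signals(text):
    """Return any self-reported ages under 13 found in `text`."""
    ages = []
    for pattern in UNDERAGE_PATTERNS:
        for match in pattern.finditer(text):
            age = int(match.group(1))
            if age < 13:
                ages.append(age)
    return ages
```

Running this over bios, captions, and comments would surface accounts worth a human review, exactly the kind of automatic flag operators already build for spam but decline to build for age compliance.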

Instead, most app developers take a passive approach, ignoring [and profiting from] underage users and only dealing with them when reported. It’s not treated as a legal compliance issue; it’s treated as a community management issue.

[box type=”note” size=”large” style=”rounded” border=”full”]I’ve made the same argument about Snapchat’s lack of will to prevent the creation and distribution of child pornography. Could they not deploy a technology that detected nudity and prevented someone under 18 years of age from sending or receiving illicit images? Of course they could. They don’t have the will to do so because ultimately the illegal behavior is driving usage and making them rich![/box]

A Call on the Federal Trade Commission to Force Operators to Comply with COPPA

14.    Will the amended COPPA Rule prevent children from lying about their age to register for general audience sites or online services whose terms of service prohibit their participation?

No.  COPPA covers operators of general audience websites or online services only where such operators have actual knowledge that a child under age 13 is the person providing personal information.  The Rule does not require operators to ask the age of visitors.  However, an operator of a general audience site or service that chooses to screen its users for age in a neutral fashion may rely on the age information its users enter, even if that age information is not accurate.  In some circumstances, this may mean that children are able to register on a site or service in violation of the operator’s Terms of Service.  If, however, the operator later determines that a particular user is a child under age 13, COPPA’s notice and parental consent requirements will be triggered.

Source, emphasis mine

The Children’s Online Privacy Protection Act is enforced by the Federal Trade Commission. The will of social media operators to comply with COPPA is not there; therefore it falls to the FTC to force operators to comply.

I’m calling on the Federal Trade Commission to do two things:

  1. The law allows fines of up to $16,000 per violation, depending on the level of egregiousness. It’s time the FTC held social media operators to account. Operators are knowingly allowing underage users to create accounts and use their services. They are choosing not to deploy existing technology to automatically flag and remove underage users. If the will to comply does not exist, fine them.
  2. I’m calling on the Federal Trade Commission to force large social media operators (those with over 1,000,000 user accounts) to deploy technology that actively verifies user age, automatically flagging potential underage accounts for removal or for compliance with parental notification rules.

What Can You Do?

If you are like me and believe that social media operators should actively comply with the intent and letter of the Children’s Online Privacy Protection Act (COPPA), then I’m asking you to consider the following:

  1. Raise awareness of the intent & boundaries of COPPA among the parents and children in your life. Ask them to read COPPA.org or this article for parents from the Federal Trade Commission. Have a discussion about not just the rules… but also why it’s important. 
  2. Report underage accounts to social media operators. (see this post for links to social media operators’ reporting systems)
  3. Report offending operators to the Federal Trade Commission. If you have reported an underage user and the operator has not removed the account, report it to the FTC for investigation.
  4. Raise awareness among parents and other adults by sharing this post on social media sites.
