ESRB proposes facial recognition as new form of age verification
Update: This story has been updated with quotes from the ESRB regarding its proposed facial verification technology.
The ESRB has submitted a new proposal to the FTC calling for approval of facial recognition technology as a form of age verification. The video game ratings body believes the technology can help verify that anyone attempting to buy a mature-rated game is a legal adult.
Known as Privacy-Protective Facial Age Estimation, the proposal (first spotted by GamesIndustry) was made in collaboration with digital identity firm Yoti and Epic Games subsidiary SuperAwesome.
Though the ESRB has made an effort to outline a game's content on physical boxes, age verification for digital game purchases is incredibly easy to bypass: all one has to do is enter a birthdate indicating they are over 18, often without any real follow-up.
In its proposal, the ESRB said the Age Estimation "offers parents an easy way to provide VPC [verifiable parental consent] through a quick process, without needing to provide extensive personal information, in line with data minimization principles." It also noted that this verification method would be especially useful for guardians without government ID.
To buy a game, a user would have to take a picture of their face right at that moment to submit for verification. Photos that don't meet the required quality will be rejected, while accepted photos are submitted to Yoti's backend for age estimation. After the verification, the photo is said to be "immediately, permanently deleted."
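The flow described above can be sketched as a simple pipeline. This is a hypothetical illustration only: the function names, the quality threshold, and the mock age estimate are assumptions, not Yoti's actual API. The 25-year threshold is the one the ESRB cites later in its statement.

```python
# Hypothetical sketch of the verification flow described in the ESRB proposal.
# All names and thresholds here are illustrative assumptions, not Yoti's real API.

MIN_QUALITY = 0.8  # assumed photo-quality threshold for accepting a capture
CONSENT_AGE = 25   # the ESRB's stated threshold for confirming parental consent

def check_quality(photo: dict) -> bool:
    """Reject blurry or poorly lit photos before estimation (assumed check)."""
    return photo.get("quality", 0.0) >= MIN_QUALITY

def estimate_age(photo: dict) -> float:
    """Stand-in for Yoti's backend age estimation; returns a mock estimate."""
    return photo.get("mock_estimated_age", 0.0)

def verify_consent(photo: dict) -> bool:
    """Run the full flow: quality gate, age estimate, then delete the photo."""
    try:
        if not check_quality(photo):
            return False
        return estimate_age(photo) >= CONSENT_AGE
    finally:
        # The proposal says photos are "immediately, permanently deleted"
        # after verification -- modeled here by clearing the dict either way.
        photo.clear()

sample = {"quality": 0.95, "mock_estimated_age": 30.0}
print(verify_consent(sample))  # True
print(sample)                  # {} -- photo data wiped after verification
```

The `finally` block mirrors the deletion promise in the proposal: the photo is discarded whether verification succeeds, fails, or errors out.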
Beyond being generally easier than pulling an ID out of one's wallet (assuming one has an ID at all), it's hard to see much substantial benefit to facial verification. The software can fail for arbitrary reasons, and having it gate purchases may end up causing more trouble than it's worth.
Convenience aside, facial recognition carries several risks. On phones with Face ID, someone else (such as a police officer) could unlock the device with little effort. The software has also historically struggled to identify people of color, so its bias (and that of its developers) may do more harm than good.
The ESRB did acknowledge that there were potential risks in the software, but the ratings board argued that "parents remain in control of what information they choose to share. [...] Any risk is easily outweighed by the benefits to consumers and businesses of using this method."
Update: A spokesperson for the ESRB released a statement to Game Developer disputing the earlier characterization of the technology, stressing that the purpose of its Facial Age Estimation is not "to authorize the use of this technology with children."
The statement continues by noting that the facial scans are handled in accordance with U.S. law and "[confirm] if the person who is verifying consent is 25 years or older." That specific age threshold was chosen by the ESRB "to prevent teenagers or older-looking children from pretending to be a parent."
It concluded its statement by repeating that "any images and data used for this process are never stored, used for AI training, used for marketing, or shared with anyone."