Videogame ratings are far from perfect, and the poor Entertainment Software Rating Board is tasked with assigning moral values to the hundreds and now thousands of games that hit the market each year. It has apparently given up the soft touch of its human board of reviewers and turned part of the operation over to the world of machines via a digital questionnaire, according to the NY Times:
Faced with an explosion in the number of games being released online, the board plans to announce on Monday that the main evaluation of hundreds of games each year will be based not on direct human judgment but instead on a detailed digital questionnaire meant to gauge every subtle nuance of violence, sexuality, profanity, drug use, gambling and bodily function that could possibly offend anyone.
It’s not all bad, Seth Schiesel argues. In fact, it’s just a consequence of modern-day commerce:
And yet in this digital age it is inevitable perhaps that a group that is paid to sort creative entertainment endeavors into neat and tidy categories based on content would eventually start outsourcing its mission to computers. After all, major companies, including banks, credit-rating agencies, Amazon.com, Netflix, and Google, have made it their business to reduce consumer behavior to an algorithm. For video games, the publishers’ answers to the questionnaire will determine the rating.
CORRECTION: The original headline implied that robots were playing videogames. That was incorrect. We are sorry.