How Do We Keep The Digital Playground Safe In 2022?
In 2022, there are more people online than ever before. We are constantly bombarded with new technologies and ideas that have the power to change how we live our lives. How do we keep this digital playground safe from abuse? And what could society achieve if it were free from social barriers or government interference?
Nick Woodford, Anzu’s Content Manager and Copywriter, and Brenna Schaaf, Kidas’ Director of Marketing, collaborated on this piece.
Although video games have always been popular among children, the sheer number of titles available, the ease with which they can be obtained, and the growing number of immersive, engaging free-to-play titles have made gaming an extremely popular pastime in which children not only play but also socialize, create, learn, and in some cases earn money. According to recent research, 93% of boys aged 8-11 and 79% of girls aged 12-15 said they had played a video game in the previous month.
With so many young players flocking to gaming platforms, game developers must ensure that this space remains safe for children while still being enjoyable for adults. But what should they be aware of, and how can they make their games safe places for players worldwide? Below, we cover some of the key areas of focus, including in-game monetization issues, cyberbullying prevention, and the importance of working with the right partners.
Keep your communication channels safe
When we think about toxic communication in games, we often imagine someone yelling obscenities over a headset at the player who has just killed them in-game. However, game developers should also keep an eye out for bad actors in other channels, particularly if their games are aimed at a younger audience.
Internal chat systems, where players communicate through text messages, are another channel where negative communication can occur. Many online games, like Roblox, include built-in safety features such as AI-powered technology that monitors conversations in real time. Developers that lack the capacity to build their own safety measures can rely on third parties; for example, ProtectMe by Kidas uses context-understanding technology to monitor chat and identify instances of cyberbullying, predation, and toxic gaming behavior.
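To make the integration concrete, here is a minimal sketch of how a developer might route outgoing chat messages through a third-party moderation check before delivering them to other players. The `ModerationClient` interface and its `classify` method are hypothetical placeholders for illustration, not the actual Kidas or Roblox API.

```typescript
// Minimal sketch: pass chat through a moderation check before delivery.
// ModerationClient and classify() are hypothetical placeholders, not a
// real Kidas or platform API.

interface ModerationResult {
  flagged: boolean;
  categories: string[]; // e.g. ["bullying", "predation"]
}

interface ModerationClient {
  classify(message: string, senderId: string): Promise<ModerationResult>;
}

async function deliverChatMessage(
  client: ModerationClient,
  senderId: string,
  recipientIds: string[],
  message: string,
  send: (recipientId: string, text: string) => void,
  alertModerators: (senderId: string, categories: string[]) => void
): Promise<void> {
  const result = await client.classify(message, senderId);
  if (result.flagged) {
    // Hold the message and surface it for human review instead of
    // delivering it to other players.
    alertModerators(senderId, result.categories);
    return;
  }
  for (const id of recipientIds) {
    send(id, message);
  }
}
```

The key design choice is that flagged messages are held and escalated rather than silently dropped, so human moderators can review context before taking action against an account.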
Choosing the right monetization method
Cyberbullying is an ongoing problem for game producers. With the introduction of new technologies and new ways to interact with games, creators must pay close attention to their platforms to avoid issues. One area where this is becoming more prevalent is in-game purchasing, where children are singled out for not having the same skins as their peers or for wearing only the free ‘default’ option. In one case, a teacher at a UK primary school described how one of his pupils was ridiculed because he didn’t have a premium Fortnite skin, and how he “begged his parents for [money] to get a skin since no one would play with him.”
Paid DLC, in which players must spend money to gain access to additional levels and content, can also lead to children being excluded from the game, since they are unable to reach the same areas as their classmates.
Loot boxes have been widely condemned by both parents and gamers, though for different reasons. Studies suggest that this kind of in-game monetization can encourage gambling-like spending habits. To curb the practice, several countries have introduced rules to protect both children and adults from this form of monetization. China led the way by compelling game developers to disclose the odds of loot box contents, and in light of legal and public scrutiny, gaming platforms and major publishers in the United States have adopted the same approach.
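To illustrate what odds disclosure can look like in practice, here is a small sketch that converts a game’s internal drop weights into the player-facing probabilities a disclosure rule would require. The item names and weights are invented for the example.

```typescript
// Sketch: turn internal loot-box drop weights into disclosed probabilities.
// Items and weights below are invented for illustration only.

const dropWeights: Record<string, number> = {
  "Common skin": 70,
  "Rare skin": 25,
  "Legendary skin": 5,
};

function disclosedOdds(weights: Record<string, number>): Record<string, string> {
  // Normalize each weight against the total to get a percentage.
  const total = Object.values(weights).reduce((sum, w) => sum + w, 0);
  const odds: Record<string, string> = {};
  for (const [item, weight] of Object.entries(weights)) {
    odds[item] = `${((weight / total) * 100).toFixed(1)}%`;
  }
  return odds;
}

console.log(disclosedOdds(dropWeights));
// { "Common skin": "70.0%", "Rare skin": "25.0%", "Legendary skin": "5.0%" }
```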
So, what can we take away from this? Game producers should carefully evaluate the best ways to monetize their games, and once those mechanics are live, they should monitor how they are used and the effect they have on players’ experience. Anzu helps creators integrate blended in-game ads that complement gameplay and add realism, enabling them to monetize their games without sacrificing the user experience.
Getting rid of offensive material
Rating a game used to be simple, and it let parents understand what themes, stories, and material their children would be exposed to.
This process has become more difficult with the proliferation of user-generated content in open-world multiplayer experiences like Fortnite, Minecraft, and Roblox, where players can jump between multiple games on the same platform. However, all of these platforms have strict criteria for vetting and verifying the experiences created on them to ensure they are appropriate for their target audiences. Developers producing content for these worlds should make sure it complies with all applicable rules and that no part of the game could harm or negatively affect users under the age of 18.
The same is true for game creators working on other platforms. With the vast number of games now available, it can be daunting, if not impossible, for parents to vet every game their children play. Developers have a responsibility to ensure that their titles accurately convey what players can expect and who they are appropriate for, especially if they expect their games to appeal to a younger audience.
Considering the future
As gaming grows more popular and children spend more time online, we all have a responsibility to safeguard future generations from harmful and inappropriate behavior and content. One way for game creators to do this is to work with the right partners, who understand these challenges and can help with any concerns or roadblocks that arise.
This article was co-written by Anzu and Kidas. Anzu is an award-winning in-game advertising platform that enables game producers to safely and securely monetize their products by displaying blended in-game ads that sit in the background of games, complementing gameplay and boosting the realism of the experience. Kidas is a service that alerts parents when their children are exposed to bullying, online predators, sexual material, hate speech, and other harmful behaviors in games. Its technology can be integrated into practically any mobile or PC game.