How EA’s Positive Play Initiative Wants to Move From Banning Toxic Players to Creating Positive Experiences

We’ve known for years that online gaming can be a minefield of toxicity and bullying, especially for women. Moderation tools have existed for about as long, but only in the last few years have major game companies really recognized their responsibility, and their power, not only to curb this behavior but to actively create positive spaces.

Just last month, we saw Riot Games and Ubisoft partner on such a project, and Xbox recently began publishing data on its moderation efforts as well. But one company that has been publicly pursuing this strategy for several years now is EA, through its Positive Play program.

The Positive Play program is led by EA’s Chief Experience Officer, Chris Bruzzo. He has been with the company for eight and a half years, six of them as EA’s Chief Marketing Officer, before moving into this newly created role. A conversation between him and the current CMO, David Tinson, is what sparked Positive Play at EA.

“David and I really appreciated the need to involve the community in this, and saw the toxicity in games and some of the really difficult things happening as the social communities in and around games rapidly grew. We had been talking for years about things that needed to be addressed,” says Bruzzo. “And a few years ago [in 2019], we held a summit at E3 where we started talking about how game companies and everyone else, players and stakeholders alike, have a joint responsibility to address hateful and harmful behavior in games.”

Pitching Positive Play

EA’s Building Healthy Communities Summit featured content creators from 20 countries, EA employees, and third-party experts on online communities and toxicity. It included lectures and roundtable discussions, as well as opportunities to give feedback on how to address the issues raised.

Bruzzo says the summit and the feedback that followed made it clear that women in particular had a “pervasively bad experience” with social gaming, often reporting harassment and bullying the moment they revealed their gender or were heard on voice chat. But the response to the summit convinced him that EA was in a position to do something about it.

He sought out Rachel Franklin, the former head of Maxis who had joined Meta (then Facebook) in 2016 as its head of social VR. Bruzzo notes that she, unfortunately, had even more relevant experience on the subject.

“If you want to find a more toxic environment than a gaming community, go to a social VR community,” says Bruzzo. “Not only does it have the same amount of toxicity, but my avatar can instantly appear and get right in your avatar’s face, which creates a whole other level of feeling unsafe.”

With Franklin leading as EA’s SVP of Positive Play, the group got to work. The first result was the Positive Play Charter, effectively a summary of the do’s and don’ts of social play in EA’s games. Its pillars include treating others with respect, keeping things fair, sharing clean content, and following local laws; players who don’t follow these rules risk action against their EA Accounts. It may sound rudimentary, but Bruzzo says it gave EA a framework from which it could both strengthen its mitigation of bad behavior and begin building more positive experiences.

Moderation Army

On the moderation side, Bruzzo says EA is working to make it easier for players to report issues in its games, and is increasingly using and improving AI tools to identify patterns of bad behavior and flag them automatically. Of course, EA can’t rely entirely on AI; real humans still need to review the exceptional and outlier cases and make the appropriate calls.

Bruzzo cites player names as an example of how AI is helping. Player names are one of the most common sources of toxicity EA encounters, he says. It’s easy to train an AI to ban certain bad words, but players who want to behave badly use symbols and other tricks to get around the filter. With AI, EA is getting better at identifying and preventing these workarounds. Last summer the team ran an AI check on 30 million Apex Legends club names and removed 145,000 that were in violation, he says. No human team could do that.
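EA hasn’t published how its system works, but the basic idea of catching symbol-substitution workarounds can be sketched with a simple normalization step before matching. Everything below (the substitution table, the placeholder word list, the function names) is a hypothetical illustration, not EA’s actual filter:

```python
import re
import unicodedata

# Hypothetical table of common symbol/digit stand-ins for letters.
LEET_MAP = str.maketrans({
    "0": "o", "1": "i", "3": "e", "4": "a",
    "5": "s", "7": "t", "@": "a", "$": "s", "!": "i",
})

# Placeholder blocklist; a real system would be far larger and context-aware.
BANNED_WORDS = {"badword"}

def normalize(name: str) -> str:
    """Strip accents, lowercase, map leetspeak, drop non-letters."""
    name = unicodedata.normalize("NFKD", name)
    name = "".join(c for c in name if not unicodedata.combining(c))
    name = name.lower().translate(LEET_MAP)
    return re.sub(r"[^a-z]", "", name)

def is_violation(name: str) -> bool:
    normalized = normalize(name)
    return any(word in normalized for word in BANNED_WORDS)

print(is_violation("B@dw0rd_99"))    # the symbols no longer hide the word
print(is_violation("FriendlyClub"))
```

A rules-based pass like this catches the easy tricks; the machine-learning component Bruzzo describes would sit on top, learning new evasion patterns from reports rather than relying on a fixed substitution table.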

And it’s not just names. Since the launch of the Positive Play initiative, Bruzzo says, there has been a significant drop in hateful content across EA’s platforms.

The moment your expression begins to infringe on someone else’s ability to feel safe…that’s the moment you lose your ability to do so.

“One of the reasons we’re better positioned than social media platforms [is because] we are not a social media platform,” he says. “We are a community of people coming together to have fun. So this isn’t really a platform for all political discourse. The moment your expression starts to infringe on someone else’s ability to feel safe and included, on the fairness of the environment, and on everyone’s ability to enjoy themselves, that’s the moment your ability to do that goes away. It’s a community of people, players having fun together, and that gives us a really big advantage: we have very clear parameters, so we can measure the results and make real progress in reducing disruptive behavior.”

That covers text, but what about voice chat? It’s notoriously difficult to moderate what people say to each other over voice without running afoul of privacy laws around recorded conversations.

Bruzzo admits it’s difficult. He says EA gets considerable support from platform holders like Steam, Microsoft, Sony, and Epic wherever voice chat is hosted, since they can bring their own toolsets to the table. But for now, unfortunately, the best solution is for players to block or mute toxic voices, or remove themselves from the situation.

“For voice, the most important and effective thing anyone can do today is make it easy for players to turn off voice,” he says. “It’s the best we can do.”

Another way EA is working to make its games less toxic might seem a little unrelated: it is actively banning cheaters.

“One of the root causes of a large percentage of toxicity is players feeling the environment is unfair: if your game has bugs or cheaters, especially if you don’t have good anti-cheat in competitive games, or if your anti-cheat lags behind,” says Bruzzo. “They feel they can’t compete fairly. But you love this game, you’ve put a lot of time and energy into it, and it’s very upsetting. So one of the best ways to reduce toxicity in a game is to prioritize dealing with cheaters.”

Good Game

One thing Bruzzo is keen to stress is that while removing toxicity is important, promoting positivity is just as important. And he isn’t working from scratch. Gaming misbehavior is public and memorable, but the majority of gaming sessions aren’t harmful. At worst they’re neutral, and often they’re already positive without any extra help from EA.

“In less than 1% of our game sessions does a player report another player,” he says. “Now, with hundreds of millions of people playing our games, that’s still massive, and we feel… the future of entertainment is interactive, so we have to work on this now. But it’s important to remember that in 99 out of 100 sessions, players don’t end up having to report inappropriate behavior.”

So far in 2022, the most common text comment between players is actually ‘gg’.

“And another thing we saw in Apex Legends the other day is that the most common text comment between players so far in 2022 is actually ‘gg’. It’s not ‘I hate you.’ It’s not profanity or trash talk. It’s ‘good game.’ And in fact, ‘thank you’ has been used over a billion times in Apex Legends in 2022 alone.

“So in the end, I’d cast my vote for humanity after all. That alone gives me hope.”

It is this positive spirit that Bruzzo hopes to foster going forward. I asked him what success would look like in 10 years if EA’s Positive Play initiative stayed on track.

“I hope we’ve moved on from our biggest issue being trying to get rid of hateful or harmful content. Instead, we’ll be discussing how to design games to be as inclusive as possible. I think 10 years from now we’ll see games with adaptive controls, different onboarding, and different servers for different playstyles. You’ll see players create things, not just cosmetics but actual playable objects in games, and all of that will benefit from the work we’re doing to create positive content, positive play environments, and strong, positive social communities.”

Rebekah Valentine is a news reporter for IGN. You can find her on Twitter @duckvalentine.
