

EA Hackathon

Enhancing interaction between gamers by reducing instances of harassment

Challenge

I had the opportunity to participate in a 24-hour hackathon for EA Games, where we were challenged to develop a solution that enhances interactions between gamers. I worked in a team with a data scientist, two web developers, and one other UX designer.

Solution

We chose in-game harassment as our problem space, focusing on EA's popular game Apex Legends. Our solution informs players when a user is returning from a ban that resulted from repeat harassment, and highlights players' agency to mute voice chat should they want to avoid potential re-offenders.

Team

2 x UX designers

2 x Web developers

1 x Data scientist

Duration

24 hours

Role

UX Designer

UI Designer

Project Manager

Gamer Interaction

The problem space

Online gaming is a medium where player interaction is ubiquitous. 

But what does it mean when over 70% of users still experience harassment while playing games online? How can we help decrease instances of player harassment in order to maintain a more enjoyable environment?

EA maintains many systems that address in-game player harassment, so what kind of intervention can fit into their system while making an impact worth the effort in a 24-hour time frame?

What are we aiming for?

User goals, business goals

To the player, fewer instances of harassment mean a healthier environment, and a healthier environment lends itself to more time spent having fun. To the business, fewer instances of harassment mean fewer interruptions to the player experience.

We can measure the success of these goals by looking for a statistically significant decrease in the number of reported instances of in-game player harassment. This may also be reflected in an increase in player retention.

Narrowing the scope

Ideating our intervention

Our exploration of the problem space focused on identifying where users have the highest number of negative experiences, looking specifically for areas of intervention where a solution could be realized within a short time frame.

 

As we explored the moments where users interact during online gaming, harassment emerged as a recurring issue affecting a wide range of players. Digging deeper, we found that instances of harassment drop dramatically when offenders are visibly flagged, and drop further when user agency is introduced to the flagging system.

Moderation graph. Source: "Do You Care Who Flagged This Post? Effects of Moderator Visibility on Bystander Behavior", Journal of Computer-Mediated Communication, Volume 26, Issue 5, September 2021

We decided to focus our efforts here and find a solution that would lower player harassment. The easiest place to do this would be an environment with one or more vectors where harassment can occur, from which we could pick one to focus on. We chose the widely successful PC game Apex Legends, a high-intensity, team-based game where communication is key; our chosen vector was voice chat.

Target acquired

Applying our solution

Our solution was two-pronged. First, introduce flagging by the system: users who had already been banned by EA's existing moderation system for harassment are flagged upon their return to the game, indicating to other players that they are returning from a ban that resulted from harassment.

Second, highlight players' existing agency by reminding them of their ability to mute other players.

We decided not to also let players flag each other, as this could easily be exploited by either party. The aim is to highlight players' existing agency without allowing them to take part in direct moderation.
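
To make the logic concrete, here is a rough sketch (in TypeScript) of how a lobby or client service might decide whether to show the flag. The player record shape, field names, and the 30-day visibility window are assumptions made for illustration only; they are not EA's actual moderation data model.

// Sketch of the flagging rule. The data shapes and the 30-day window are
// assumptions for illustration; EA's real moderation model is not public.
interface BanRecord {
  reason: "harassment" | "cheating" | "other";
  endedAt: Date;
}

interface PlayerProfile {
  username: string;
  banHistory: BanRecord[];
}

// How long the "returning from a ban" flag stays visible after the ban ends.
const FLAG_WINDOW_DAYS = 30;

// Show the warning symbol only for players returning from a harassment ban.
// Players never flag each other; the only input is existing moderation data.
function shouldShowHarassmentFlag(player: PlayerProfile, now: Date = new Date()): boolean {
  return player.banHistory.some((ban) => {
    if (ban.reason !== "harassment") return false;
    const daysSinceBanEnded = (now.getTime() - ban.endedAt.getTime()) / (1000 * 60 * 60 * 24);
    return daysSinceBanEnded >= 0 && daysSinceBanEnded <= FLAG_WINDOW_DAYS;
  });
}

The check reads only from the existing moderation record, which keeps the intervention consistent with the decision above: players are reminded of their ability to mute, but are never given a way to flag each other.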

Matchmaking lobby - first instance of player interaction

We want to elicit a "what's that?" reaction, the feeling you get when you want to mouse over something on screen. The flag should be easily visible but not distract from gameplay. It should be visible both in the game lobby and once the round starts, so that the information remains consistent.

In-game

Recreating EA's design system, we designed a warning symbol that displays next to flagged usernames in-game and in the lobby, plus a pop-up text box that appears when the user hovers over the symbol, explaining its meaning and outlining the mute button below to highlight the action the user can take.

In-game player menu

Results

New achievement

Our solution is a success if there is a statistically significant decrease in reported cases of in-game harassment. A high degree of success would also see a decrease in player bans resulting from harassment. A/B testing, paired with player surveys, would be used to measure the success of this solution.
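
As a rough illustration of how that comparison could be evaluated, the sketch below runs a one-sided two-proportion z-test on hypothetical A/B groups (flag shown vs. not shown). All counts, thresholds, and variable names here are placeholders, not real EA data or EA's actual evaluation method.

// One-sided two-proportion z-test: did the treatment group (flag + mute
// reminder) report significantly fewer harassment cases than the control?
// All counts below are made-up placeholders.
function oneSidedZTest(
  reportsControl: number, playersControl: number,     // flag not shown
  reportsTreatment: number, playersTreatment: number,  // flag shown
): number {
  const p1 = reportsControl / playersControl;
  const p2 = reportsTreatment / playersTreatment;
  const pooled = (reportsControl + reportsTreatment) / (playersControl + playersTreatment);
  const standardError = Math.sqrt(pooled * (1 - pooled) * (1 / playersControl + 1 / playersTreatment));
  return (p1 - p2) / standardError; // z > 1.645 is significant at the 5% level (one-sided)
}

// Example with made-up numbers: 400 reports among 10,000 control players
// vs. 330 reports among 10,000 treatment players.
const z = oneSidedZTest(400, 10_000, 330, 10_000);
console.log(z > 1.645 ? "statistically significant decrease" : "not significant");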
