As subscription MMO City Of Heroes launches its Mission Architect expansion, allowing user-generated stories and quests, Paragon Studios' Morrissey discusses the logistics of policing the system.

Joe Morrissey, Blogger

April 16, 2009

Am I allowed to say penis in this article?  If not, how would you stop me?  Maybe I could get past your profanity filter by spelling it peenis. Or I could get creative and simply write in euphemisms about my third leg, my love tool, or my joy stick. Seriously, if I really wanted to say penis, how would you stop me?

If you're planning on making a game with user-created content, you're going to have this conversation... a lot.

For those of you not keeping track, City of Heroes, the first massively multiplayer game set in the comic book genre, has just unveiled a new game system: Mission Architect.

In short, the system lets players create their own stories and then share those stories. These stories are rated by the players and the best ones garner prestige and in-game rewards.

This system is tied directly into the lore of the game through a company called Architect Entertainment, which offers virtual world experiences.

This world-within-a-world format allows players to take their in-game characters and walk them into a virtual environment where they can create and play their own adventures.

Even though the Mission Architect feature has done many things right, there is one question that always, and I mean ALWAYS, comes up:

How do you avoid the beef stick?

When this conundrum presents itself, like it did for us, the quick answer is to simply throw money at it. Have Customer Service vet all content before it goes live. As exorbitantly pricey and time-consuming as this process would be, there are actually companies out there that do it.

In these situations, the games are usually those that deliver content to children, where turnaround time isn't an issue, and there aren't a lot of people submitting material. However, in any other situation this isn't a viable option.

From the instant Mission Architect launches, players will outpace any Customer Service department. Besides, it's not exactly something you can hand off to an overseas outsourcing company. Much of what makes content inappropriate is cultural and difficult to teach to non-native speakers.

So, if hand-vetting everything is off the table, where does that leave you? Well, below are some potential solutions we came up with in an attempt to avoid the problem of inappropriate player-created content.

Language Filters: A Good Place to Start

Have them.  Have a lot of them. And have fifty-thousand variations for the same word. If your game has any kind of chat system in place, odds are you already have something like this. Even if it comes down to a huge text file filled with words you hope your grandmother doesn't know and secretly enjoy adding to the list anyway.
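
For what it's worth, even a naive filter catches a surprising amount if it normalizes the obvious evasions before checking the list. The sketch below is purely illustrative; the substitution table and word list are placeholders, not our actual data:

```python
import re

# Purely illustrative word-filter sketch, not the actual City of Heroes code.
# It normalizes common evasions (leet-speak substitutions, spacing tricks,
# stretched letters) before checking against a ban list.
SUBSTITUTIONS = {"0": "o", "1": "i", "3": "e", "4": "a", "5": "s", "@": "a", "$": "s"}
BANNED_WORDS = {"penis"}  # in practice: that huge, ever-growing text file

def normalize(text: str) -> str:
    text = text.lower()
    text = "".join(SUBSTITUTIONS.get(ch, ch) for ch in text)
    text = re.sub(r"[^a-z]", "", text)     # drop spaces and punctuation tricks
    return re.sub(r"(.)\1+", r"\1", text)  # collapse repeats: "peenis" -> "penis"

def contains_banned(text: str) -> bool:
    flattened = normalize(text)
    # Substring matching catches embedded words, but it also causes classic
    # false positives (the "Scunthorpe problem"), so real filters get fancier.
    return any(word in flattened for word in BANNED_WORDS)
```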

Develop a Player Policing System

Allow your players to flag content as inappropriate. This is a pretty common system. If someone reads something he doesn't like, he can hit a big red button that reports the content and potentially removes it from the system right then and there. That's the high-level idea; unfortunately, the devil is in the details.

You have to decide how draconian you want to be. The more hardcore you are, the fewer people who will see inappropriate content, but you expose yourself to potential grief voting. Grief voting is when a player flags perfectly acceptable content as inappropriate just because it's fun.

If it only takes a single vote to eliminate content from the game, clicking that button becomes the game for a lot of players. A system that harsh makes it really easy for griefers to have a lot of fun, and you don't want perfectly good content getting pulled because someone's a jerk.

Auto Ban: It's a Treatment, Not a Cure

Auto Banning is when content is pulled without it ever being seen by any member of your team. This happens when content is flagged so quickly by players that it has to be bad.  Even if it's not, it's safer to just pull it for review. Once it's reviewed, CS can determine the real outcome of the content.

Auto Banning is not a flawless system. Any system not monitored by humans is going to get out of control. It opens up the potential for grief voting, which will bring down player submissions, overload your CS department, and eventually kill the feature.

Instead, use auto ban on a curve. The more content is played and voted "acceptable", the higher the threshold needs to be for something to get banned for inappropriateness. This pulls bad content quickly at the beginning of the curve, but makes good content "grief limited" at the end of the curve.
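
As a rough sketch of what "on a curve" can mean in practice, the number of flags needed to auto-pull an arc can scale with how much positive play it has already survived. The numbers and names below are illustrative assumptions, not our actual tuning:

```python
# Illustrative only: the thresholds and scaling factor are made-up numbers,
# not Mission Architect's real tuning values.
BASE_THRESHOLD = 3          # flags needed to pull brand-new, unproven content
FLAGS_PER_GOOD_PLAY = 0.1   # every "acceptable" play raises the bar a little
MAX_THRESHOLD = 50          # popular content can still be pulled, just not easily

def ban_threshold(acceptable_plays: int) -> int:
    """More positive plays means more flags are required before an auto-pull."""
    scaled = BASE_THRESHOLD + int(acceptable_plays * FLAGS_PER_GOOD_PLAY)
    return min(MAX_THRESHOLD, scaled)

def should_auto_pull(inappropriate_flags: int, acceptable_plays: int) -> bool:
    """Pull the arc for CS review once flags cross the moving threshold."""
    return inappropriate_flags >= ban_threshold(acceptable_plays)
```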

Appeals: I Object to Your Objection

Auto Ban is good for removing bad content quickly. However, since no one ever sees the content but the creator and the person flagging it, you can't really trust that the creator is getting a fair shake.

To that end, Mission Architect allows the player to appeal a ban. When a player is told that his mission has been banned, he's given two options: he can delete the mission or he can appeal to Customer Service.

This does two things. It gives good content a chance to survive. Also, if the content is actually bad and is appealed, well, that's like a criminal showing up to a police station with a bag of bank money in his hands complaining of a paper cut.

Odds are, the offender will get the warning and try to find a different way to offend people that doesn't get him banned. Either way, this takes more of the burden off the CS department.

Datamine: Watch Lists Aren't Just for Terrorists

In MMOs you datamine. That's the only real way to know what's going on with your game once it has gone live. So for Architect, we wanted to track as much as we could. We needed to see what content was getting flagged.

Who is flagging it? Who is being flagged? Has the person ever been banned? If so, how often and for how long? What's a person's overall rating? Has he ever flagged content as inappropriate and been wrong? How often and what content? That last bit helps track down the grief voters.

Tracking all of this helps root out the few bad apples for our Customer Service team. From here they can keep tabs on just about anyone they want. This is the ideal situation because Customer Service is seeing the content sooner than normal, allowing it to get pulled before others have to flag it as inappropriate.
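
For concreteness, the per-player record those questions imply might look something like the sketch below; the field names are illustrative, not our actual schema:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical shape of a per-player moderation record; field names are
# illustrative, not the actual Mission Architect data layout.
@dataclass
class PlayerModerationRecord:
    flags_submitted: int = 0   # how often this player reports others
    flags_upheld: int = 0      # reports CS agreed with
    times_flagged: int = 0     # how often this player's content is reported
    bans: List[Tuple[str, int]] = field(default_factory=list)  # (date, days)
    overall_rating: float = 0.0

    def false_flag_rate(self) -> float:
        """A high rate of rejected reports is a strong grief-voter signal."""
        if self.flags_submitted == 0:
            return 0.0
        return 1.0 - (self.flags_upheld / self.flags_submitted)
```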

Customer Service is Your Friend

Early on, we brought Customer Service (CS) into the design discussion for Mission Architect. We knew that one of the biggest hits to this system was going to be the amount of complaint/griefing tickets being generated. We also knew that we didn't have a team of hundreds of Customer Service employees to go through each and every story with a fine-tooth comb.

So, the first hurdle was to determine how we were going to limit the number of tickets CS actually saw on a day-to-day basis. That's when we decided to turn the players into the first line of defense against inappropriate content. Only after content reached a certain threshold would CS be notified.

However, even that had the potential of flooding the system with too many tickets. Continuing to work with CS, we came up with the "pull and republish" approach.

If a player hits the threshold of inappropriate content and their story gets pulled, we alert the player and give them the chance to republish. If the arc gets pulled again, the player can attempt a third publish, but this time it goes straight to CS for approval before it is published. That's when the ticket shows up.

This creates two opportunities for the player to submit appropriate content before CS is notified. The belief is that the majority of content will get pulled or rewritten before it reaches the final 'submit to CS' stage. This method worked for both development and customer service.
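
Read as a workflow, the escalation is a tiny state machine: the first two pulls can be fixed and republished by the player alone, and only the third publish attempt is held for CS approval. Here is a sketch, with the states and the two-strike rule inferred from the description above rather than lifted from real code:

```python
from enum import Enum, auto

# Sketch of the "pull and republish" escalation; states and the two-strike
# rule are inferred from the description above, not copied from real code.
class ArcState(Enum):
    PUBLISHED = auto()           # live and playable
    PULLED = auto()              # player may edit and republish on their own
    AWAITING_CS_REVIEW = auto()  # third publish attempt; the CS ticket fires here

def next_publish_state(pull_count: int) -> ArcState:
    """Where a republish attempt lands, given how many times the arc was pulled."""
    if pull_count >= 2:
        return ArcState.AWAITING_CS_REVIEW
    return ArcState.PUBLISHED
```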

The next step, once we had the actual design in place, was the implementation. We needed to make sure that the different parts of our game talked to the CS management tools and could email the player directly if need be.

This took development and CS into areas not often touched within our game: account management systems and support tools. Eventually it got ironed out so that CS had the commands and the tools they needed to monitor and police the system as a whole.

The real challenge for CS, and thus for us in development, is in attempting to predict how players are going to break the system, putting data mining and other search features in place to see where it's happening, and then acting quickly to stop it. In the end, even with all of the improvements, this feature is a large additional burden on CS.

Thankfully, development and support truly believe that it's worthwhile and are willing to take on the extra workload. I think the reason why is a combination of it being a genuinely different idea for us and the fact that, at every step of the way, the people who are going to be supporting it have had a say in how the feature got developed.

Potential Pitfalls

As of this writing, the cracks in the system haven't really started to show yet. That being said, this system (any system) will not be enough to keep the trouser snake from emerging. Below are some potential pitfalls:

  • If no one flags content as inappropriate, nothing gets pulled. This allows for the posting of completely inappropriate content that everyone sees. Not really how we want our IP to be remembered.

  • Euphemisms: a mild term standing in for a harsher or distasteful one. You can't fight against this. You can't filter it. Our language is constantly evolving, and phrases that were innocuous one day are suddenly taboo the next because of a shared, cultural context. Catching any of this rests solely on the shoulders of the players and Customer Service.

  • I saved the best for last: ASCII art, using otherwise innocent characters to create images. Because you can't stop the 8==D.

In Closing

This article focused mainly on how we use the stick in an attempt to stop the player from creating inappropriate content. Punishing the player for being bad is only half of the process. We must also reward the player for being good.

In our next article, we'll cover how we use the carrot in an attempt to get players to create the awesome content we are hoping for.

Stay tuned.

About the Author(s)

Joe Morrissey

Blogger

Joe Morrissey, Senior Game Designer, has a decade of experience writing and designing games for companies such as Blizzard Entertainment, Backbone Entertainment, Cryptic Studios and Paragon Studios. He's had the pleasure of working on such great games as Diablo II and the Diablo II expansion as a game designer and lead writer. Currently at Paragon Studios, Joe is working as a senior designer on City of Heroes and City of Villains. He has played a vital role in spearheading the Mission Architect system, the first system of its kind in any MMO.
