On good, evil, and reputation

2007, October 7th 7:33 PM

For some reason Fable and KOTOR's reputation systems keep coming up among my group of friends. I personally think those were an interesting try, but ultimately unsuccessful. The problem I have with them is that they come really close, but every once in a while the good/evil system kind of . . . breaks down.

The best example I have is from the beginning of Fable. My friend was playing at the time. There's a wife who tells you to go find her husband. If you find him and come back to tell her where he is, she'll give you a coin. Yay. So you go find him, and he's making out with another girl. He says if you don't tell anyone, he'll give you a coin.

Now, first off, the game is making the assumption that the husband is the bad guy. Second, the game is making the assumption that the husband doesn't deserve his privacy, because he's being the bad guy. I'd argue with both of those (or, at least, argue with either being an obvious assumption), but that's a moral position.

The big issue is that the game assumes your actions are motivated by the results for the participants, and not for cold hard cash. Well, my friend wanted the cash. So he promised he wouldn't tell the wife, and got a coin, and then promptly went and ratted the guy out, and got another coin. Amusingly, grabbing the first coin caused him to become more evil, but the second caused him to become more good – despite the fact that he was playing "I don't care who I hurt, I just want cashola".

The developer can't solve this problem. The only possible "solution" is to quiz the player, at every step, on why they're doing the thing that they're currently doing. And, as entertaining as the idea is, that's not really a solution.

The best fix for this is convenient, because it's actually practical, more realistic, and more interesting all at once. Reputation shouldn't be a global number. It's not a single axis, where priests love you on one side and Hitler loves you on the other. Reputation is a personal thing and a group thing, but not a global thing. Luckily, computers are now powerful enough that we can manage this without much trouble at all.

What should have happened is surprisingly simple. Let's assume, for the sake of argument, that the husband is a jerk and that the wife is really in the right. You promise the wife that you'll help her out, and she thinks slightly more of you. You promise the husband that you'll keep his secret, and he thinks a lot more of you. Now you go back and rat him out, and the wife thinks a lot more of you (I presume you don't mention "oh yeah, I took his money first"), but once he finds out – well, the husband hates you.

Now add in communication. The husband probably talks to his friends. He might mention "that kid who ratted me out, after taking my money!" Now they won't like you as much. Now, the husband's friends are probably a bunch of lowlifes, since like attracts like. So now you've got all the scum disliking you. No problem here, right?

The wife is thankful that you helped out. She'll probably mention that to her friends. So now her friends like you a bit (and, incidentally, like the husband quite a lot less – we may as well be thorough here). No problem here either.

But when the two groups meet up and compare notes, if they in fact ever do, they might realize that, okay, so you did tell the truth – but you're also not to be trusted. And neither side is going to appreciate that – the wife's or the husband's. That's the proper penalty, and it's simply not a single axis.

This isn't even particularly hard to orchestrate. You can simulate it pretty easily by sending "knowledge packets" between characters. Husband knows "kid cannot be trusted", and that knowledge has a chance of duplicating itself every time someone with it interacts with someone else. Wife knows "kid helped me out", and that also has a chance of duplicating on each interaction. Give both of those infopackets some effect on how people think of you ("cannot be trusted" might actually be a good thing in some circles!) and then add in some broad strokes of who-talks-to-who – which can be very simple and vague, to the point of "people in this village tend to talk to each other and people tend to talk to their spouses" and still be effective – and you've got a system where things like that actually can hurt you later on.
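
Here's a minimal sketch of how I'd picture that, in Python for concreteness. Every class name, spread chance, and opinion number below is invented purely for illustration:

```python
import random

# Minimal sketch of the knowledge-packet idea. All names and numbers
# here are made up; a real game would tune all of them.

class Packet:
    def __init__(self, text, opinion_effect, spread_chance):
        self.text = text                      # e.g. "kid cannot be trusted"
        self.opinion_effect = opinion_effect  # how this news colors opinions of you
        self.spread_chance = spread_chance    # chance of duplicating per chat

class Character:
    def __init__(self, name):
        self.name = name
        self.knowledge = set()

    def opinion_of_player(self):
        # Sum the effects of everything this character has heard about you.
        return sum(p.opinion_effect for p in self.knowledge)

    def chat_with(self, other):
        # Each interaction gives every packet a chance to duplicate itself.
        for packet in list(self.knowledge):
            if random.random() < packet.spread_chance:
                other.knowledge.add(packet)

# The Fable scenario: husband and wife start out knowing different things.
untrustworthy = Packet("kid cannot be trusted", opinion_effect=-2, spread_chance=0.3)
helpful = Packet("kid helped me out", opinion_effect=+1, spread_chance=0.2)

husband, wife, villager = Character("husband"), Character("wife"), Character("villager")
husband.knowledge.add(untrustworthy)
wife.knowledge.add(helpful)

# Broad strokes of who-talks-to-who: people in a village chat now and then.
for _ in range(10):
    husband.chat_with(villager)
    wife.chat_with(villager)

# Prints -2, -1, 0, or +1 depending on which rumors happened to spread.
print(villager.name, villager.opinion_of_player())
```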

Throw in a few ways to assemble text and you're golden. Player talks to NPC, asks for favor, NPC checks its databases and sees it doesn't like player. Why not? Randomly pick one of its reasons, weighted based on importance. "Well, I'd like to help you out, but I heard about {what you did to Jeffrey off in Smalltown} . . . and I just don't think I can trust you, can I?"
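
As a rough sketch of that lookup (the grievance list, weights, and wording below are all invented), the weighted random pick is the whole trick:

```python
import random

# Sketch: the NPC doesn't like you, so it picks one of its grievances
# at random, weighted by importance, and splices the description into
# a dialogue template. All of the data here is made up.

def explain_refusal(grievances):
    """grievances: list of (description, importance) pairs."""
    descriptions = [desc for desc, _ in grievances]
    weights = [importance for _, importance in grievances]
    reason = random.choices(descriptions, weights=weights, k=1)[0]
    return (f"Well, I'd like to help you out, but I heard about {reason}"
            " . . . and I just don't think I can trust you, can I?")

npc_grievances = [
    ("what you did to Jeffrey off in Smalltown", 5),  # important, picked often
    ("that unpaid bar tab down at the tavern", 1),    # minor, picked rarely
]
print(explain_refusal(npc_grievances))
```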

And remember, "kid can't be trusted" might endear you to certain groups. You might have honorable thieves, sure, but it's entirely possible you could have dishonorable thieves too, who slap you on the back and buy you a drink if you tell the story of How I Got Two Coins Instead Of Just One (right before spiking your drink and stealing your wallet). You can code this into them without too much trouble – just set it up so that "this person is low on trustworthiness to me" is a good thing. I imagine you'd have several major axes of opinion – "trustworthiness", "helpfulness", "power" – and certain people could easily react differently to each axis. In fact, there's probably a fascinating research paper in figuring out which major axes people actually judge each other on – in real life, the phrase "he's a nice guy, but he's just not very bright" shows up quite often, and that's just a simple example.
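
A quick sketch of that, too. The axes are the ones above, and the scores and weights are made up; the point is just that every NPC applies its own weights to the same underlying numbers:

```python
# Sketch of per-NPC opinion axes. The player's scores and each NPC's
# weights are invented; the idea is that the same reputation data can
# read completely differently to different people.

PLAYER_AXES = {"trustworthiness": -3, "helpfulness": +2, "power": +1}

# An upstanding villager values trust and helpfulness...
VILLAGER_WEIGHTS = {"trustworthiness": 1.0, "helpfulness": 1.0, "power": 0.2}
# ...while a dishonorable thief treats low trustworthiness as a plus.
THIEF_WEIGHTS = {"trustworthiness": -0.5, "helpfulness": 0.1, "power": 1.0}

def overall_opinion(axes, weights):
    return sum(score * weights.get(axis, 0.0) for axis, score in axes.items())

print("villager:", overall_opinion(PLAYER_AXES, VILLAGER_WEIGHTS))  # -0.8: dislikes you
print("thief:", overall_opinion(PLAYER_AXES, THIEF_WEIGHTS))        # +2.7: buys you a drink
```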

Is that cool? I think it's cool. And it neatly avoids trying to divide every action into "good" and "evil" – your actions will be important, not based on the nature of the action, but based on how the characters around you regard that action.

  • Ming

    2007, October 15th 12:46 PM

    There was a proto-game (experimental game) where the characters communicated reputations to each other and to your character. There was a monster who ran around the world and ate some of the little guys. If one of the little guys saw this, they would run away and tell other little guys. So, when the monster came to an *informed* little guy, that little guy would run away.

    In any case, that was what was intended in the game. And I sort of saw it happening. But unfortunately, the game was flawed in many other ways that distracted from the major research in this area of reputation and communication. I tried to get around the flaws, but it was ultimately too difficult to play the game because the nature of communications between the little guys could cause the game to go into an unknown and unrecoverable state.

    A further flaw was that other actors in the game were independently controlled and communicated with each other when they weren't on screen. Thus, important changes in relationships were happening outside of your knowledge. This may or may not have been the cause of the game going into a state where you could not complete the quests.

    Also, the amount of information to be tagged and acted on intelligently may grow very large for any sort of game. While this may become a storage issue on some smaller systems like the DS, the main problem is that it is difficult for the developers to design distinguishable intelligences for many AI agents. It is also difficult for the designer to coordinate the AI agents into quest-like, game-like goals for the player.

    These problems are not insurmountable. And I believe that the concept itself is sound and has promise. However, this area of gameplay is still a wilderness, and the first one to really tackle it will have to deal with clearing the land to find solutions to problems that have not been encountered before.

    It would be fun to do this, but expensive as well.

    -Ming

  • Zorba

    2007, October 15th 8:11 PM

    I seem to remember there was an issue with Neverwinter Nights where two or three factions would randomly end up at war with each other just due to running into each other in the woods. When this happened, it would generally happen long, long before the player reached that location, and you'd arrive at a city to find it completely desolate and abandoned due to the war.

    So, yeah, there's more than a few issues with this :)

    I'm really thinking about it – as a first approach, at least – as simply a replacement for "reputation" or "notoriety", so that it wouldn't destroy the game design; it would only cause issues if you were being "evil". It would be awesome as a large-scale politics simulator, but yes, it would be more than a little difficult to design.

    Definitely something to attempt first as a small game – I wouldn't want to try building a AAA game around this one.

  • Bejoscha

    2007, October 23rd 11:37 PM

    Letting things happen "in the black box" is okay for games, but letting them happen "without the programmers controlling it" is very, very dangerous. I just think of your "emergent behavior" posting. Just giving information exchange a "chance" is not a good solution, IMHO. It leads to situations where you can do something one day and, playing the same game the same way the next day, you simply can't. Though it might be realistic, it's frustrating. Somehow, games in the past "played" most realistically when the "reactions" of the other NPCs were "predicted" by the programmers and then designed with care. However, this of course limits the possibilities…
    As for the amount of information: keeping things simple is important. There might be some cases where a "special" piece of knowledge is important enough to travel around, but most often it will be enough to have one number for each character representing how much he likes you. This single number can interact with the numbers of other characters. It is not always important WHY somebody likes/hates you, just that he does.

  • Zorba

    2007, October 24th 10:57 AM

    Really, whether that counts as "realism" or "frustrating" depends entirely on the player. There are virtually always random effects in games – doing the exact same sequence of moves won't work twice in a row. This takes that to a slightly further extreme, so that people's opinions of you have a somewhat random element, but if it's done well it shouldn't be a huge random effect – an individual NPC may not have heard of the same set of your actions, but it'll have a similar semi-randomly-picked view.

    In many ways this is actually simpler than having a single number per character for how much they like you. The problem with that approach is that you need some way of modifying those numbers properly, and that means for every single major action the player can do, you have to sit down and figure out who's going to hear about it and how much they're going to care. It leads to potential bugs and inconsistencies, since every single number is hardcoded, and makes game-world manipulations difficult and bug-prone. (Imagine the case of adding a trade route between two villages that previously didn't communicate directly – the people in those villages should hear about your actions now when they didn't before, but there's a good chance things will get missed; there's a sketch of this below.)

    While it's a more complex process by far, it means that the human data input can be far simpler and still result in a more interesting simulation.
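
    To make the trade-route case concrete, here's a rough sketch; every village name in it is invented. Treat who-talks-to-who as a graph, and adding a trade route becomes a single new edge that propagation handles on its own:

    ```python
    # Rough sketch: who-talks-to-who as a graph. All village names are
    # made up; the point is that a trade route is just one new edge.

    COMM_GRAPH = {
        "village_a": ["village_b"],
        "village_b": ["village_a"],
        "village_c": [],  # no trade route yet, so no gossip reaches it
    }

    def can_hear_about_you(source, graph):
        # Simple reachability: everyone who can eventually hear news
        # that starts spreading at `source`.
        seen, frontier = {source}, [source]
        while frontier:
            place = frontier.pop()
            for neighbor in graph.get(place, []):
                if neighbor not in seen:
                    seen.add(neighbor)
                    frontier.append(neighbor)
        return seen

    print(can_hear_about_you("village_a", COMM_GRAPH))
    # village_a and village_b only

    # Adding the trade route is two lines of data, not a rebalancing
    # pass over hardcoded reputation numbers.
    COMM_GRAPH["village_b"].append("village_c")
    COMM_GRAPH["village_c"].append("village_b")

    print(can_hear_about_you("village_a", COMM_GRAPH))
    # now includes village_c
    ```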

  • Bejoscha

    2007, October 24th 10:27 PM

    Hmm. Agreed. It seems it just is what it is: complex. And that might be the reason why it isn't common yet… *g*
