Wednesday, March 7, 2012

Ability Mod Inflation

Brendan over at Untimately referred me to this post when I was talking about ability checks and saving throws (and possibly mixing the two) over at Roger the GS' post about Vancian Magic (I've a post on that matter as well, but all in due time).  While I still plan on doing an extensive writeup on ability checks and saving throws and merging the two, it will have to wait until I've collected my thoughts and sifted through the labyrinth that has become my notes.  This time I'm going to talk about ability mod inflation, my experience with it, and possible solutions I've tried.



The original discussion on the Dragon's Foot thread was about replacing saving throws with ability checks, such as a save vs. Dex to avoid a falling boulder.  The problem many of the replies highlighted was that this greatly rewards high ability scores and punishes low ones.  In a game that prides itself on randomized ability scores, that's quite a problem.  It also encourages what is commonly called ability mod inflation.

Ability mod inflation can be roughly defined as follows: with each successive generation of D&D, the qualifiers for getting an ability modifier have decreased.  Originally, ability scores served as prime requisites to determine what class you could play, and gave you an experience bonus if your stat was 13 or 15, depending on the class (as I recall).  In the earliest few editions after that, you might have needed a 15 to get a +1 to reaction or missile weapons, or perhaps a morale bonus or increased resistance to spells.  Later, in the transition to AD&D and 2nd edition, it became possible to get a modifier greater than 1.  With a Dexterity of 15 you had a +1 to armor class (well, -1 with descending AC); with a 16 the modifier increased to +2, and with a 17 or 18 it became +3 and +4 respectively.  There was also a missile attack modifier that scaled at a slightly slower rate, but if you were a traditional sword-and-board character, what mattered most was the AC bonus.  Constitution progressed at a similar rate, well, provided you were a warrior class to get past the +2 bonus.  A higher Intelligence was necessary to learn spells and governed the maximum number of spells per level, while Wisdom granted bonus spells and a magical defense (saving throw) bonus that scaled at the same rate as Dex's AC or Con's HP modifiers.
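To make that scaling concrete, here's the Dexterity AC progression described above as a quick lookup sketch (Python, purely illustrative; it covers only the scores mentioned here):

```python
# AD&D/2e Dexterity AC adjustment as described above.  With descending
# AC, a "bonus" actually subtracts from your armor class.
DEX_AC_ADJUSTMENT = {15: -1, 16: -2, 17: -3, 18: -4}

def dex_ac_adjustment(dex):
    # Scores outside 15-18 are treated as "no adjustment" here; the
    # actual tables covered a wider range than this sketch does.
    return DEX_AC_ADJUSTMENT.get(dex, 0)
```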

Strength was the oddball in that edition: you needed a 16 for a +1 to damage, a 17 for +1 to hit and damage, and an 18 could provide a wide range of modifiers depending on the percentile dice rolled.  I played so many fighting classes back in the day that I knew the strength modifiers by heart (I didn't have to look them up to verify like the others), which goes to show how big an impact ability mod inflation had on AD&D.  If you didn't get an 18 in your strength score you were probably better off playing a non-fighting class, because an 18 strength, regardless of the percentiles, is massively better than any other strength score.  The same could be said for other scores, but not nearly as much as strength.  Sure, if your INT or WIS wasn't high enough you would be plagued with spell failures or wasted scrolls.  It also dictated the maximum spell level you could cast in your career, but even a 9 INT wizard could still cast 4th level spells, and a cleric with a Wisdom of 13 or greater no longer suffered spell failure.  The only things strength mattered for below 16 were that you might have a somewhat decent chance to open a door (40% at best) and an incredibly slim chance of bending bars.  There were of course weight limits, but I can count on one hand the number of people I have known who have given that much weight to encumbrance rules.  I am not sorry for that pun.

Finally you get to 3rd edition, where every ability score follows a uniform progression for ability modifiers.  The formula is (ability score - 10) / 2, rounded down: 12-13 = +1, 14-15 = +2, 16-17 = +3, 18-19 = +4, 20-21 = +5.  The obvious flaw with this system was that an odd number was no different than an even number; a 15 was no better than a 14 except in very small nuances (carrying capacity again, and possibly feat requirements, a la INT 13 for Combat Expertise).  You may also notice that 19+ is included in that list, indicating any PC (not just demihumans) could transcend the mythical 3-18 boundaries of ability scores.  Every 4 levels you could assign an ability point wherever you wanted.  As someone who played a 17 strength fighter for half a year in AD&D, let me tell you, this offer was one I could not refuse.  The problem with this, of course, was that you no longer had a defined ceiling for maximum human achievement.  You might have great stats at the get-go, but someone higher level than you would be better (this isn't necessarily a bad thing); it also meant that someone with lower stats could potentially improve upon them, which is a good thing, except it never happened.  In almost all cases that ability point went into your casting stat if you were a caster, Dex or Int if you were a skill-based class like a rogue, and strength of course for the fighter so he could hit more often and harder.  I'm sure you can think of times where the point might go somewhere else, such as rounding up to an even number for an increased modifier, increasing one's Con for durability, or satisfying feat requirements (a bit of a tax if you ask me), but the fact of the matter is that any other choice was non-optimal.  Worrying about optimization is the last thing you want at the very heart of your game system.  There were other issues (silly stat-boost templates that weren't worth the levels you were giving up, stat-boosting magic items being almost mandatory, everyone walking around like a magical glowing Christmas tree) but we don't need to delve that far.
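Since 3e finally gave modifiers a clean closed form, it reduces to a one-liner; here's a minimal sketch (Python; the function name is mine):

```python
def ability_mod_3e(score):
    """3rd edition ability modifier: (score - 10) / 2, rounded down."""
    return (score - 10) // 2

# Sanity check against the progression above:
# 12 -> +1, 15 -> +2, 17 -> +3, 18 -> +4, 20 -> +5
for score in (12, 15, 17, 18, 20):
    print(score, ability_mod_3e(score))
```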

To sum up the last few paragraphs: the proliferation of ability modifiers, and the system's built-in incentive to make characters with higher stats (and thus higher modifiers) strictly better, leads to stat inflation, where if the stat in question is not up to a certain tier it is forgettable at best and unusable at worst.  That goes strictly against the original spirit of the rules, where one always had a chance of succeeding (no matter how small) and your ability score was a representation of how decent a shot you had.  Having a 9 was not sub-optimal to the point of being useless; it was just average, and indicated you would have a tough time attempting something above average.  Having an 18 was by no means necessary or even encouraged; it meant that you could do some amazing things, even though those very things might not even be within the scope of your career, but most importantly it did not devalue the assets of your companions.  That's what ability scores are there for: to define a particular aspect of your character and to let you know what they are and are not capable of, not as some reward/punishment mechanic built into the very fabric of the system.  I think it has a lot to do with why people prefer OD&D to AD&D, and why 3rd edition and especially 4th edition turned a lot of people off.  Too much damn focus on the ability modifiers, and an expectation baked into the game balance that you would have at least a +2 (15) in one edition and a +4 (18 or higher) in another (hello, hyper-inflation!)

My, this post has grown quite large.  I'll try to keep the next bit brief.  Here are a few attempts I made in the past to address the problem of chasing after bigger and bigger ability modifiers.  My first attempt was to eliminate modifiers entirely and use raw ability scores instead.  One of my major gripes with 3e was that odd numbers were rather pointless; the difference between a 14 and a 15 was near non-existent.  In an ability-score-only world, that's an extra point you're adding to every roll.  As you can imagine there are a number of problems with this approach, but for me they did not become apparent until after trying it out for a few sessions.
  • The major problem was that what mattered most in your success was your ability score and not the dice, de-emphasizing chance and deflating tension quite readily.  This is even more exaggerated if you're using 3d6, which has drastically reduced variability compared to the d20 (see the sketch after this list).
  • Damage is another problematic area if you are used to adding an ability mod for a weapon type.
  • Initiative could be problematic due to both 1 and 2 above.  This becomes exacerbated if you are using an initiative system where one can get more than one action in a round.
  • Saving throws are especially bad if you are allowing more than one ability to influence a save, say Wisdom affecting saves against magic.  This can be alleviated by simply using the higher stat if two apply.
  • It doesn't exactly fix the problem, as you still want a higher stat instead of a higher mod.  In fact, mods might be a way to discourage chasing high stats, which brings me to my next attempt.
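To put some numbers behind that first bullet, here's a quick simulation sketch (Python, purely illustrative) showing how much flatter 3d6 is than a d20.  Against a full 3-18 score (a 15-point spread) added to every roll, the dice barely matter:

```python
import random

def roll(n, sides):
    return sum(random.randint(1, sides) for _ in range(n))

TRIALS = 100_000
d20 = sorted(roll(1, 20) for _ in range(TRIALS))
three_d6 = sorted(roll(3, 6) for _ in range(TRIALS))

def middle_90(rolls):
    """The range covering the middle ~90% of results."""
    return rolls[int(TRIALS * 0.05)], rolls[int(TRIALS * 0.95)]

print("d20 middle 90%:", middle_90(d20))       # roughly (2, 20)
print("3d6 middle 90%:", middle_90(three_d6))  # roughly (6, 15)
```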
Diminishing returns is used in a lot of games where stat growth is expected and the player may allocate points as they please.  Some systems have it so that a higher stat costs exponentially more than a lower stat.  Others let you place a point anywhere you want, but the more points you put in, the less you get back: diminishing returns.  Demon's/Dark Souls is a good example of this; any stat past 30, excluding Vitality, was rather pointless, as you saw smaller and smaller growth until it was almost negligible.  The design actually encouraged versatility and branching out, which is incredibly helpful in a game based around single-player exploration that is particularly lethal to one-trick ponies (which reminds me, I need to get around to writing that Demon's/Dark Souls post).  If we were to adapt diminishing returns to 3-18 ability scores, it might look a little something like this:

  • 11 = +1; 12 = +2; 13 = +3;
  • 14-15 = +4; 16-18 = +5
This system encourages people not to push past 13 if you are using something like point buy.  If you are rolling 3d6, it means that rolling past the standard deviation isn't excessively better than being within the average range (10-14).  It still retains the problem that 15, 17, and 18 are no better than their counterparts.  I haven't actually had a chance to try this system out aside from a single test, so its strengths and weaknesses are not immediately apparent to me.  Interestingly enough, Labyrinth Lord does the opposite: each increment in an ability modifier covers fewer scores than the last (13-15, 16-17, 18).  That may have something to do with the mindset that an 18 is special, which I imagine was born out of stat inflation (18s have been important in every edition since AD&D).
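For reference, the proposed table reduces to a tiny lookup; here's a minimal sketch (Python; the handling of scores of 10 or below is my assumption, since the table above starts at 11):

```python
def ability_mod_diminishing(score):
    """Diminishing-returns modifier per the table above (3-18 scores)."""
    if score <= 10:
        return 0           # assumption: no modifier below 11
    if score <= 13:
        return score - 10  # 11 = +1, 12 = +2, 13 = +3
    if score <= 15:
        return 4           # 14-15 = +4
    return 5               # 16-18 = +5
```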

The method I currently use is that your ability modifier is merely your ability score divided by 3 (rounded to the nearest integer, or, for easier math, treat your score as the nearest multiple of 3).  It was chosen for two reasons.  One, it's very easy to remember: no complex formula to memorize or table to look up.  The other thought was that if everyone has an ability modifier (and a respectable one at that, 9 = +3 vs 15 = +5), then from a player's mindset you would be less obsessed with attaining the exponential growth of a high score, and hopefully not as disappointed if you didn't roll a 15+.  It does, however, mean that even someone with a pitiful score of 6 or less still has an ability modifier of 1 or 2.  The one boon is that it makes off-the-cuff ability checks very simple to adjudicate: roll a d6 and get under your ability modifier, so someone with an 18 can still fail 1/6 of the time.  It's certainly not perfect, but it has performed decently so far.  Lately, though, I have been considering trying out diminishing returns or, barring that, going back to the original fix to the no-stat-mods problem, where any stat of 15 or greater provided a +1 modifier to one thing and that's it.
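Here's that method as a minimal sketch (Python; the function names are mine):

```python
import random

def ability_mod_div3(score):
    """Ability modifier = score / 3, rounded to the nearest integer."""
    return round(score / 3)

def off_the_cuff_check(score):
    """Roll a d6 and get under your modifier.  An 18 (mod 6) succeeds
    on 1-5 and still fails on a 6, i.e. 1/6 of the time."""
    return random.randint(1, 6) < ability_mod_div3(score)
```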

If anyone has other suggestions I'd love to hear them (and try them out!)
