Happy 15th anniversary to Diablo II, one of the greatest games of its time! Diablo II is a member of that rare class of game whose innovations seem so obvious in retrospect that they’re taken for granted. It’s treated as a success of iteration, understanding what worked about the first Diablo (and its clones) and expanding on it successfully. But that’s unfair to Diablo II, a game that reformed and defined one of the most essential components of the role-playing genre: character customization.
Here’s the great innovation of Diablo II: every character class is equally complex. The original Diablo had included three classes: the Warrior, a melee fighter who mostly just attacked but could cast a spell every few minutes; the Rogue, a ranged fighter who could cast a spell once per minute; and the Sorcerer, who had dozens of spells to use and had to manage all of them in different ways. Even though Diablo was otherwise the same game regardless of your class choice, playing as the Warrior required monumentally less management than playing as the Sorcerer.
This was in keeping with the rest of RPG history. In almost every game prior, fighters hit things in exactly the same way every time, thieves got a few quirks like backstabbing, and mages or clerics had dozens of different spells to choose from. For most of RPG history, though, this didn’t matter, because these games were party-based. The player still has all the complexity of spellcasting choices, even if any single character does not, because the party includes multiple casters. But in the late 1990s, the genre shifted from party-based games like Might & Magic or Wizardry to single-character games like The Elder Scrolls, Fallout or, perhaps the most popular of them all, Diablo.
Here’s the problem: how can you encourage people to try a variety of playstyles when one character class is significantly more complicated than the next? I liked the simplicity of the fighter, so I was rarely motivated to try the more complicated wizard — my Daggerfall characters almost always traded away all spellcasting in order to be extraordinarily strong melee fighters. And I certainly had friends who believed that picking the magic-users was the only way to play these games, because they had options other characters didn’t.
Diablo II changed all this by giving every character class a roughly equivalent skill tree. “Magic” and “skill” weren’t treated as totally different mechanics, where the former has dozens of options and the latter maybe one or two. Casting classes like the Sorceress or the Necromancer gain their spells via the same skill tree that melee fighters like Barbarians and Paladins use to get their special moves. A Charged Bolt that fires lightning across the ground is, functionally speaking, equivalent to Zeal, an attack that hits multiple enemies with a single click. Both are acquired and triggered in the same fashion. Both use mana, which has to be maintained. And both exist on skill trees of equivalent complexity.
In the short term, the motivation for this is simple: how can you have a multiplayer-focused game where every character is equally interesting to play? That goes beyond character classes filling different roles. The drive to make the multiplayer balanced in terms of time and effort is also a major reason why the distinction between skills and magic is eliminated in Diablo II. In most games, skills were tied to discrete combat or rest states — for example, in the deliberately old-fashioned Pillars of Eternity, fighters have skills usable in each combat that recharge only after the battle, while mages recover spells only after resting.
Restrictions like those, however, don’t work so well in multiplayer. Thus Diablo II’s reliance on mana as the only restriction on skill use let every character push forward through its combat and challenges at the player’s own pace. The main gateway to progress in Diablo II became inventory, whether space for items to carry back into town or the number of potions on hand.
Diablo II wasn’t necessarily the first game to make any individual one of these changes to the RPG model. But the specific combination of these models proved both durable and popular, eventually taking over other subgenres. Most obviously, Blizzard’s next RPG, World of Warcraft, adapted several of Diablo II’s customization ideas, right down to the three-part character specialization screens. And at its core, WoW tried to make certain that every one of its classes was equally interesting to play, whether tanking as a Warrior, stabbing as a Rogue, or calling on demonic power as a Warlock. (Healers have always struggled to be as interesting as the other classes, leading other MMORPGs, like Guild Wars 2, to eliminate them entirely.)
The slower-paced combat of the massively multiplayer RPG as compared to the action RPG led World of Warcraft to add a third major element to skill use: cooldown times. It’s not just the amount of mana but also time restrictions that prevent players from gaining too much of an advantage by overusing certain skills. This combination of cooldowns, equivalent skills, and mana (or equivalent) charging has become the default model of most role-playing games.
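None of these games publish their internals, but the shared model described here — every skill gated by a mana cost, with WoW layering a per-skill cooldown on top — can be sketched in a few lines. All class, skill names, and numbers below are illustrative, not actual game values:

```python
class Skill:
    """A skill in the Diablo II / WoW mold: gated by a mana cost
    and, in the post-WoW model, a per-skill cooldown timer."""

    def __init__(self, name, mana_cost, cooldown):
        self.name = name
        self.mana_cost = mana_cost  # mana spent per use
        self.cooldown = cooldown    # seconds before the skill can be reused
        self.ready_at = 0.0         # game-clock time when usable again

    def try_cast(self, character, now):
        """Attempt to use the skill at game time `now`; both gates must pass."""
        if now < self.ready_at:
            return False            # blocked: still on cooldown
        if character.mana < self.mana_cost:
            return False            # blocked: not enough mana
        character.mana -= self.mana_cost
        self.ready_at = now + self.cooldown
        return True


class Character:
    def __init__(self, mana):
        self.mana = mana


# A melee "skill" and a "spell" are mechanically identical here,
# which is exactly the equivalence Diablo II introduced:
zeal = Skill("Zeal", mana_cost=2, cooldown=0.0)          # Diablo II style: mana only
bolt = Skill("Charged Bolt", mana_cost=3, cooldown=1.5)  # WoW style: mana + cooldown

hero = Character(mana=5)
print(zeal.try_cast(hero, now=0.0))  # True: mana drops from 5 to 3
print(bolt.try_cast(hero, now=0.0))  # True: mana drops from 3 to 0
print(bolt.try_cast(hero, now=1.0))  # False: cooldown hasn't elapsed yet
```

The point of the sketch is that once both a sword swing and a lightning spell pass through the same two checks, the old fighter/mage complexity gap disappears by construction.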
Consider BioWare’s recent games, which ask players, as one of their earliest decisions, to choose which class they’ll play. A Vanguard in Mass Effect will have entirely different weapon and skill options than an Adept, but those options will be a roughly equivalent combination of a few regularly-used skills with general combat prowess. Nominal throwback games like Divinity: Original Sin still include enough skills to put fighters on relatively equal footing with magic-users. Even games that seem to deliberately reject the idea, like Skyrim, still allow any character build to engage in as much or as little complexity as the player wants, with only a change in focus.
Travel up the family tree, and this all leads back to Diablo II, whose character customization drove the entire genre forward.