For most of human history, artistic expression was a one-time action. Ancient sculptors couldn’t take back a stroke of their chisel, and paint didn’t come off when it hit the canvas. There was a simple but beautiful synergy between the weight of those brief artistic moments and the focused, passionate energy that inspired them.
Humans are still expressive creatures, but today we’re afforded much more time and space to craft works of art. We can re-take a digital photo or edit it later, film a certain scene until we get it just right or even re-post a tweet to make it sound a little snappier. I’m no artist, but from a broad perspective, I think it’s fair to say that this trend makes art less emotional and more intellectual than it used to be. If nothing else, it makes art more malleable; where paintings and sculptures were once altered by little more than the elements and time, films and music can now last forever on a hard drive, going under the knife of an editor years later. (I’m looking at you, George Lucas.)
We’ve been heading in that direction for decades, with a different flavor of technology leading the way at any given time. Recent events have convinced me, though, that video games–still questioned by some as an art form at all–are spearheading the trend more than anything else. And while the effect on art is nothing new, I’m not sure it’s a good thing for video games.
The most recent, eye-catching example in the industry, of course, involves the ending of Mass Effect 3. Released barely a month ago to incredible critical acclaim, the game quickly drew the ire of many fans for failing to provide a conclusive ending to a trilogy into which millions of people had each invested well over 100 hours. Developer BioWare was criticized for allegedly exploiting fans’ expectations and caving to pressure from publisher Electronic Arts to sell downloadable content in the future. Caught between overwhelming negative pressure from the public and a respectable confidence in its original artistic vision, BioWare was in a bad place. Eventually the studio capitulated, announcing that an extended ending would arrive via DLC this summer.
It’s a landmark moment in triple-A games. Because games are an interactive medium, their artistic message inherently varies from player to player. But aside from a handful of other cases–the minor surgery on Fallout 3’s ending comes to mind–we’ve never seen developers take such a direct, top-down approach to altering an artistic product after it’s been released. To go back to my earlier analogy, that would be like Michelangelo climbing back up into the Sistine Chapel a decade after finishing it because a new pope believed that some of the fig leaves were a little too revealing.
To be honest, I was happy to see news of BioWare’s plans to extend Mass Effect 3’s ending. I care too much about that story to leave it where it stands. But even so, I think it sets a dangerous precedent. What if developers start assuming that it’s okay to sell incomplete experiences? If they can get away with “fixing” them later, what’s to stop them from going so far as to sell us the true ending of a game? We’ve already seen visions of that nightmare in games like Asura’s Wrath and Assassin’s Creed II, which explicitly and intentionally excluded plot-critical portions of their stories and later sold them to consumers who were already emotionally invested.
I’ll be the first to admit that downloadable content and patches, used correctly, give developers valuable tools to expand and improve a product after its release. But they shouldn’t be considered an acceptable fallback for an incomplete or inadequate single-player experience. I can hardly imagine the fallout if something similar happened in some of gaming’s more sacred series, like Zelda or Halo.