Nothing is immune to the passage of time. This is especially true in the world of gaming, a world of change and advancement at a blistering pace. Look over the horizon, and you’ll see new games and developers and ideas destined to replace the old. Blink, and one generation of consoles is preparing to give way to the next.
So, what video game trends were lost during this modern era? What patterns phased out as the Wii, PlayStation 3 and Xbox 360 burned through six to seven years of their momentous life cycles, lying now on the precipice of giving way to the Wii U, PlayStation 4 and Xbox 720/Durango/8/Infinity?
We thought of 13. Some, we’re sad to see go. Others sting to be reminded about. But their scope can lie anywhere: a genre; a subject matter; patterns of philosophy, behavior or design.
So have a look. Reminisce or revel. If you think there’s something we missed or have something to say, stick around and let us know in the comments below.
Even a decade ago, anyone who had played a strategy game or a deep RPG was familiar with the feeling of ramification, that acute for-every-action-a-possible-reaction state of mind. Only in this generation, with games like Mass Effect, Fallout 3 and Heavy Rain, has the idea of player choice so thoroughly infiltrated the mainstream – and often to the tune of moral ambiguity. So often now, experiences that were once full of mindless escapism have been replaced by existential crises – What devil have I become, murdering this innocent villager in his sleep and stealing his gold? – and the navigation around those crises has turned into a focal point of gameplay.
Don’t get us wrong, though. We love the way more games have become preoccupied with the human condition; replaced Schwarzenegger with Cicero, engines of war with wheels of dialogue and decision. The depth and complexity and boundless storytelling potential such experiences offer are redefining the way games can impact our lives – and changing the perception of art as we know it.
Most commonly appearing in the form of split-screen, offline multiplayer in the developer-supported sense has gone the way of the steam locomotive and propeller plane in recent years.
Even as gaming continues to grow as a communal experience – today’s bestsellers boast online firefights among dozens of players, and Nintendo, Microsoft and Sony all trumpet socially attuned online networks – the amount of multiplayer time we actually spend face-to-face has diminished to the point that split-screen games are a bona fide rarity. That intimate competitive spirit fostered by Goldeneye, Mario Kart, and so many more during split screen’s golden age has now been twisted into a music/dance/3D/party/fitness-game amalgamation of frivolity that occasionally delivers a Dance Central or Kinect Sports, but more often than not settles for Michael Jackson: The Experience.
The final chapter hasn’t been written yet, though; Smart Glass and the Wii U GamePad’s emphasis on multi-screen gaming will present new offline opportunities for games of all creeds. Whether developers will choose to utilize them is unclear – but we’re holding out hope. We can’t afford to lose sight of the fact that great gaming can also be social gaming, without a connection bar or speck of bandwidth at hand.
For most gamers old enough to remember the late ’90s, GoldenEye 64 is rooted firmly in the Pantheon of First-Person Shooters (see also: Offline Multiplayer). But it’s because of this revered reputation that we rarely mention what would be a fatal flaw today: the painfully cumbersome escort mission with Severnaya computer programmer/obtuse AI companion Natalya Simonova. Recreated with hilarious authenticity in a YouTube parody earlier this year, this banal branch of GoldenEye’s “Facility” level was a testament to the immersion-breaking, pace-killing power of what frequently comes across as a broken babysitting simulation.
Even though later eras would see escort missions improve – their pièce de résistance was arguably 2001’s Ico, named for its protagonist, who guides a twisted Queen’s daughter, Yorda, through a breathtaking ancient-castle escape – the term is such a taboo today that developers still tread carefully around the association.
Theoretically, The Last of Us is an escort-mission game. But assuming all goes well when it releases next year, most writers will never call it that unless first pointing to its distinguishing features: Joel and Ellie’s charismatic bonding, gameplay that calls for meticulous measuring of combat and navigational choices and, from all appearances, an AI companion in Ellie who behaves with an extraordinary lifelike realism.
That hasn’t kept Naughty Dog from distancing itself from the phrase, though. Connotation trumps category for anything no one wants to be a part of.
As humans, we live in an increasingly wireless world. As gamers, the days of tripping over plastic-coated copper are all but a distant memory. It’s impossible to walk into a used electronics store and miss them: wired controllers, wired microphones, wired headsets, DSL internet cables – remnants of a once-disconnected infrastructure, reaching out with its primitive unkempt tendrils.
Is there an inherent gameplay advantage to wielding a cord-free gamepad, the tactical edge its box will readily advertise? Unless you’re trying to sneak in rounds of Call of Duty during sessions of cooking or Brazilian Jiu-Jitsu, probably not. But in the new dawning of voice commands, motion sensing, and Wii U GamePad multitasking, we’re thankful that this one departure has preceded them. Playing from our awkwardly arranged furniture is an added bonus.
RTS Console Futility
It may have taken until its twilight years, but this generation of hardware finally buried the notion that consoles are graveyards for the real-time strategy genre, a place its titles would go to rot in gaming H-E-double upside-down analog sticks. Sure, there were hiccups along the way: Kingdom Under Fire: Circle of Doom might have doomed the franchise; the Battlestations series could never bring all hands on deck; and 2008’s Supreme Commander was a showcase of the supreme PC/console quality dissonance.
But we’re only so scrutinizing because the bar has concurrently been rising. Command & Conquer 3: Tiberium Wars assured the C&C franchise’s console foothold with a stellar outing in 2007, all without sacrificing its hardcore RTS appeal. Shortly thereafter, games like Tom Clancy’s EndWar and Halo Wars, slightly inferior in quality, successfully brought key elements of the genre to the mainstream and showed that even casual gamers could overcome the gamepad’s functionality challenges.
Gazing towards the future, the console version of Firaxis’ XCOM: Enemy Unknown looks to be every bit the preeminent strategy experience the original 1994 PC game was. And with next-gen developers having access to control schemes like the Wii U GamePad and Smart Glass, a genre such as the RTS, so predicated on constant hands-on control, couldn’t have a more promising outlook.
Second only to the factory-fresh, plasticky aroma of a newly unwrapped box, sifting through a hefty instruction manual used to be one of those reward-center-satisfying rituals of buying a brand-new game. They were ubiquitous: full-color, 30-40 page handbooks, sometimes with intricate graphical art, welcoming us to our new purchase with story introductions, character bios, and helpful tips and tricks.
Now, we’re lucky to get a black-and-white insert describing the game’s controls alongside a 12-digit code for an online pass (Limit: one-time use only). We get that publishers want to save money and a few trees; both have their vital functions. We also get that many gamers prefer to skip to, you know, the actual game, never bothering with the contents of a box aside from its disc. That being said, finding a classic manual in today’s era of in-game tutorials and digitally embedded control guides is rare to a discomforting degree. An outlier worth slowing down to enjoy.
The penetrating shriek of a 10-year-old’s affliction. Acoustic fingernails tearing through the chalkboard of cyberspace; piercing through your speaker set in a burst of boundless profanity, slurs, and overt hatred; occasionally threatening permanent loss of hearing and mental stability.
It’s hard to imagine, but even the worst of what game chat has to offer evokes a certain level of nostalgia – perhaps because the occasional pleasant interaction made it all worthwhile, and, well, both are hard to find these days.
It seems like it happened overnight. This generation’s Xbox Live managed to augment its interface with cross-game party chat, squad-based chat in shooters became all the rage, and responsive public microphones soon disappeared. The airwaves went aphonic.
Have we all become selfish, infatuated with kills in lieu of teamwork and communication? Is it just another example of society growing more reclusive? Who knows? We just miss getting a heads up on the enemy right behind us before we’re assassinated in plain sight of our squad.
Memories are fickle things. They lose their freshness, they lose their relevance, and more often than not the brain finds no need to employ them going forward. They become obsolete.
Memory cards have proven no different. If the writing wasn’t on the wall when the current generation launched with 20GB-hard-drive Xbox 360s and 60GB PlayStation 3s, it became a vivid mural when those numbers ballooned to 120, 250, and now 320.
Sure, these blocky reservoirs of mission checkpoints, custom settings and DLC downloads still eke out a respectable existence on the Nintendo Wii (they don’t look like they’ll be discarded for the Wii U, either) and in the USB ports of Microsoft and Sony’s smaller hard-drive console variants, but anyone buying a new Xbox 360 or PS3 today will find the low-gig options to be near nonexistent. The memory card decline isn’t likely to be staved off in the next generation, either, as storage and streaming continue to grow more efficient.
And that’s the way it has to be: We download more, we update more, and if there’s ever a need to take our saves on the road, we’ll likely be sending them to the cloud more, too.
World War II
For the first half of the aughts, World War II games were the menacing Panzer tank in a publisher’s army, the tip of the spear that rolled out onto the market at a relentless rate, more often than not guaranteeing a sales victory. Today, however, they’ve been reduced to a flimsy German motorcycle sidecar – and Indiana Jones isn’t driving.
It was a confluence of oversaturation calamities that conspired against World War II’s gaming dominance. Gamers grew tired of storming the beaches of Normandy in a developer’s Saving Private Ryan du jour; the European theater, picturesque though it was, ran dry of fertile battlegrounds; and the rising tide of moral ambiguity (see: Moral Clarity) saw Hitler, Hirohito, and even Mussolini fall out of fashion as stock archenemies.
Once Call of Duty – after years as gaming’s standard-bearer for the Second World War – finally discovered the magic formula that was Modern Warfare, there was no going back. Call of Duty: World at War was a tantalizing glimpse into what could have been – a World War II revival in the Pacific theater – but with developer Treyarch now fixated on the future in Black Ops II, its locus at D-Day +30,000 more or less epitomizes the genre as a whole.
Once omnipresent titans of physical media and trade-in shakedowns, outlets like GameStop are seeing their stocks trade at 5-year lows, while Game in the UK would consider that a blessing – having gone into administration this March and lost the business of Nintendo, EA, Capcom and others.
The culprit? Sloppy stickers. Digital media. Countermeasures against used games from big-name companies aren’t doing it any favors either, but according to the NPD’s first report on digital revenue, sales of the digital format (which includes DLC, subscriptions, and mobile/social network games) rose 10% in the first three months of 2012 compared to the same period in 2011. Not coincidentally, 2011 saw an 8% plunge in new physical game sales and an 11% dive in hardware. The spread of everything from Steam to digital console storefronts to casual gamers flocking to the iPhone has carved a schism into transactions that once took place almost exclusively in-store – in-person.
Yes, they’re still moving enough units to keep the lights on, and they’ll likely even enjoy semi-relevance throughout the coming decade, but the current generation has painted a macabre picture for the brick-and-mortar video game retailer.
3D has a special place on our list, if not in our hearts: it’s the only trend to have been born and summarily killed in the same generation; a Rebel Without a Cause – and one too many flaws.
We’re not too far removed from 3D’s explosive revival in the film industry – the period around 2007-2009 when monolithic budgets were thrown at stereoscopic feature films, culminating when James Cameron’s Avatar went on to gross the highest box-office revenue of all time.
But unlike the relatively passive experience of watching a film, games are an exercise in constant adjustment. They continuously generate dynamic visual cues, demand continual cognitive refocusing, and reward laser-like reflexes. Simply training the brain to properly handle an analog-stick camera can take hours for someone who’s never played a game before.
So, not surprisingly, when many in the gaming industry tried to ride film’s coattails with heavy 3D investments, the push never took off. There are some breathtaking visual experiences to behold in games like Motorstorm: Pacific Rift or even Call of Duty: Black Ops, but as our Jason Weismann chronicled last year, many gamers have found that it’s not worth the eyestrain (not to mention the wallet strain for compatible tech). The industry has witnessed a drastic drawdown in 3D marketing this year, and – blame it if you must on our jaded vision after playing James Cameron’s Avatar: The Game – we don’t see a resurgence any time soon.
Mainstream Media Vilification
For a while it seemed impossible – the Mass Effect “Se-xbox” segment on Fox News set a high bar in 2008 – but the vilification lust that’s permeated the mainstream media’s coverage of video games since the days of Doom and Mortal Kombat has tapered off in recent years. We still see the ignorant rushes to judgment, the good old-fashioned parenting scare stories after the “Animal Crackers Might Cause Cancer” headlines run dry. But it’s been beyond recent memory since we’ve had a genuine debate about games as the scourge of the younger generation, the one medium – more so than movies, TV, music, or literature – that dragoons our youth into a violent, sociopathic lifestyle.
And quite frankly, that’s because no one really goes there anymore. The most recent chance to venture outside those confines of logic – 2011’s tragic terrorist attacks in Norway, where the shooter, Anders Behring Breivik, was found to be an avid World of Warcraft and Call of Duty player – witnessed an amazingly minute backlash against video game violence. The focus, mercifully, remained on mourning the 77 deceased victims and seeking justice in a court of law – but imagine if such a thing had happened in, say, 2005, before publicity chaser Jack Thompson was disbarred by the state of Florida.
There’s a larger point to be made here: Video games are growing in our collective cultural consciousness; the individuals covering them for national news outlets stand a good chance of being gamers themselves. At this rate, it won’t be long before we update those “gamer” stock photos showing eight-year-olds playing PlayStation in a prison-cell-sized room, pupils dilated and drool reaching the floor. The sepia may still be in vogue, but stereotypes aren’t.
Can a game many regard as the best of its genre also sow the seeds of that genre’s decline?
Resident Evil 4 redefined survival horror in 2005 in ways that garnered universal acclaim: groundbreakingly gory and realistic graphics, a heated pace to combat that demanded the aiming precision and twitchy reflexes of an FPS veteran, and interactive cutscenes of an equally dynamic nature.
But as it turned out, these shifts set the genre teetering on a slippery slope, and just as critics and gamers swooned over RE 4’s offerings, so too did developers, piling on enough weight to push it over the edge.
Countless examples chart how the dissonance continued from there, but we know this: The genre that was built on a foundation of isolated terror, combat avoidance, and a distinctly stylized Japanese aesthetic is now centered upon their polar opposites – co-op collaboration, flashy action-packed fight scenes and a stark, gritty, often visceral artistic template.
Change is often applauded in the video game industry, and rightfully so. But it’s getting hard to argue that true survival horror – or, at the very least, its golden age – hasn’t crept away from us.
It’s hard to define progress, but there’s no doubt that the things we leave behind are an integral part of it. Where many industries and enterprises are unfortunately equating dying trends with dying times, gaming still carries itself with that classic escapist optimism – the sense that, though never perfect, the best is just an insight or innovation away.
Our list isn’t always pretty – much needed remedies go hand-in-hand with unfortunate casualties – but from World War II to escort missions to memory cards, the trends that fizzled in this generation are a veritable record of how far we’ve come. They also warn that what we think we know at the dawn of a new generation could be out-of-style, 3D-style, by dusk.
Ranters, what’s your take on our 13-fold list? Is there anything to add or take away? What trends do you think we’ll see disappear in the next generation?
Follow me on Twitter @Brian_Sipple.