...When It's DONE
by Mike Zupan



What’s your idea of an optimal gaming experience?

Granted, this is a dangerous question. It often propels what's meant to be a thoughtful discussion into heated debates about graphics and gameplay mechanics. One thing that people tend to overlook in these 'debates', however, is a far more basic necessity: first and foremost, a game has to work. Having grown up in the glam-tastic '80s, I remember what it was like to grab a cartridge, jam it in a console, push the power button and play for hours on end. I'm not going to sit here and pretend like I never encountered an issue - after all, blowing into NES cartridges was practically a staple of my childhood - but my paranoia-fueled brain was never sick with concerns over game-breaking bugs. When I bought a game, it worked. No fuss, no muss.

But today? We have to endure the rigmarole of firmware updates. As most of you are well aware, most games require a day-one update to - and these are just a few examples - correct issues we never would have noticed in the first place, enable multiplayer support, or hammer out game-breaking bugs. Furthermore, if consumers manage to stumble upon something in the post-launch window, developers are conveniently able to address those concerns through another patch. It's nice to know that if a bug manages to sneak through quality control, I can (supposedly) rest assured that developers will be working 'round the clock until I'm able to experience their game the way they intended. Of course, because nothing is sacred when money is involved, this once consumer-friendly feature is now little more than a shadow of its former self.

What does this mean for us as consumers, exactly? It means that publishers are taking advantage of what they perceive as consumer complacency. Now, once a game is deemed 'good enough' by the powers that be, it's whisked away in its unfinished state to a pressing plant. The discs are then packaged, shipped, and held in stock rooms until release. All the while, the developers continue working to ensure their product is 100% complete in time for launch.

I think most of us would agree that actions speak louder than words, so what's the logical conclusion here? Simple - publishers couldn't care less if you're spending your hard-earned money on an incomplete product. They believe that as long as they have everything fixed in time for launch, most of the gaming community won't even care... and to be fair, it's not like we've proven them wrong. Despite the outcry from message board crusaders, there aren't many people actually speaking with their wallets. How can we expect publishers to listen to our concerns while we're still throwing money at them?

I know what some of you might be saying. "It's a digital future anyway, so who cares if there's a day-one update? All that matters is that the game works." Of course, the caveat is that not every game actually works at launch. There shouldn't be any precedent that allows publishers to kick a game out the door for competitive reasons rather than logical ones, because when they do, they're gambling with our money.

And trust me, you've probably already been affected by what I'm talking about - multiple times, at that.

We don't have to look too far back to see what can happen when a game's release is dictated by a deadline. Yeah, you know where this is going - Battlefield 4. Single player campaign saves were corrupting, network issues were rampant, and while the experience has drastically improved since, people are STILL reporting problems to this day. Is there any question that this game wasn't ready, and that the parties involved knew it? It's unacceptable by any measure, but what's worse is how they've been tripping over themselves in the media as a result of their irresponsibility.

Some months ago, DICE stated on the official 'Battlelog' that, "Resolving the launch issues is our #1 priority. In fact, we are so serious that we have the entire team working to stabilize the game and we will not move on to other projects until we are sure that Battlefield 4 meets – and exceeds – your expectations. It is the right thing to do." Technically, the 'right thing' would have been to stop selling DLC and pull the game from shelves until it was fixed, but I digress.

Fast forward to February, and EA's chief creative officer Rich Hilleman - in an interview with Nathan Grayson of Rock, Paper, Shotgun - sang an entirely different tune. "Battlefield 4 has been an exceedingly successful product on both consoles and PC. From a sales perspective, from a gameplay perspective." He went on: "I don't think most of my customers are willing to say - 'it's a bad product, I wish I didn't buy it.' That's not the conversation we're having now." I don't know about the vast majority, but I've had that conversation... with LOTS of people. He continued: "We did things wrong. We know that. We're gonna fix those things. We're gonna try to be smart about what customers want in the future."

There's so much wrong with his response, my head's still spinning. I mean, money aside, how can BF4 be spun as an 'exceedingly successful product'? Regardless of where you stand today, I think it's fair to say it had one of the worst launches in recent memory. And as far as 'trying' to appease the community... well, allow me to counter that quote with another - "Do. Or do not. There is no try." People just want their games to work. It doesn't get any simpler than that.

Another way this 'patch it later' attitude has affected the community is the entire next-gen console lineup. The Wii U required a firmware update to activate most of its key features, while the PS4 and Xbox One were loaded to the brim with promises - the promise you'll have this feature, the promise you'll have that feature... as long as you're willing to spend $400 to $500 up front. The PS4 and Xbox One - despite the fantastic gaming experiences they provide - clearly weren't ready to be released. So, why were they? Well, if you recall, Microsoft had a yearlong head start over Sony last generation, and I don't think either party was willing to risk a similar disparity this time around. So, once the consoles were in a playable state, they were kicked out the door. The end result? PS4 owners are dealing with broken Share functionality to this day, and despite how far the Xbox One has come, it's still paying the price for its lackluster reveal in 2013. I don't want to veer into hyperbole here, but early adopters have essentially paid for the privilege of beta testing next-gen consoles.

But that's peanuts compared to what this means for us over the long term. To put it bluntly, I think we're witnessing the death of video game preservation as we know it, and that scares me. I know, I know - some of you have a tendency to play a game and trade it in just as fast. I've been there, done that... and have almost always regretted it. I wish I still had my NES, SNES, Genesis, N64 - and the list goes on. But you know what? I could go to my local retro gaming shop today and buy those consoles with their respective games. Once I got it all home and set up, all I'd have to do is grab a cartridge, jam it in a console, push the power button and play for hours on end. Unfortunately, that convenience simply won't be possible with the games of today.

Batman: Arkham Origins was released with a bug that could corrupt save files. Skyrim - much like any other title from Bethesda - suffered from broken quests, texture down-scaling, and massive load times after extended play. The Fable franchise also had its share of frustrating glitches and broken quests. Hell, even The Legend of Zelda: Skyward Sword had a bug that made game progression impossible. These are just a few notable titles off the top of my head, but a Google search for 'game-breaking bugs' will reveal much, much more.

Of course, most of these issues were later resolved with updates, but here's the rub - what happens when we decide to revisit these games in 20-30 years? The content on the discs themselves is incomplete, so when we inevitably come across a game- or immersion-breaking bug, we're going to be screwed. After all, the servers for our console(s) of choice won't be around forever, and when they disappear, so will the opportunity to download a much-needed patch. Worse yet, if you ever have to reformat your console, you can kiss all the functionality that came after day one goodbye.

This is why we need to fight to ensure that developers and publishers don't release a game until it's ready; otherwise, we're just paying full price today for a wasted investment tomorrow. Technology may have brought us to a point where games can be more fulfilling than feature-length films, but when we can't even trust that a product is ready at the time of release, it's clear the industry has lost sight of pretty much everything. Strip all the variables away, and gaming is just as valid a form of entertainment as music or film. Could you imagine if a couple of songs on an album had been cut off, only to find an apology in the booklet that says, "Sorry, we couldn't finish the songs as we intended because we couldn't meet the deadline. Here's a digital code to redeem the completed tracks in two weeks"? What if you went to the movies and saw a similar message from the director on a title card in place of what should have been the final moments of the film? Would you stand for it? Of course not.

Because of the ever-changing nature of technology, it's easy to rationalize anti-consumer policies by saying, "That's just the way things are." But the gaming community proved a year ago that nothing is ever 'just the way it is', because we pretty much forced Microsoft - and Sony, even if you're not aware of that - to change their stance on DRM. The more we complain and the more we refuse to spend money on products that don't deserve it, the more we can make things happen.





Comments:

#1
Turismo
A few buddies and I got burned by the Battlefield 4 bug that wiped all progress in the single player campaign. One of their game patches wiped out players' progress - don't they test this stuff? I guess not. I'm never buying a Battlefield game again.
#2
Corleth the Fey
I got about 3 missions into Battlefield 4's single player campaign. Then I was hit by the bug. No way am I replaying that below-average 'bullet point on the back of the box' campaign for what should be a multiplayer-only title, only to probably lose progress again. If it had been the adventures of Sweetwater, Haggard and the awesome B Company, I would have jumped straight back in. How in a million hells DICE thinks the sub-par COD gibberish they've spat out in the last two games is in any way worthwhile still beggars belief.
#3
D1NO
I never played the single player in Battlefield. And as bad as it was in the first 3-4 months after launch, the game is now really enjoyable online. 32 vs 32 is holding up. -cheers
#4
Beardybrave
I'm a bit torn when it comes to the patching culture in modern games development, tbh. I absolutely agree that BF4 was pretty much the worst launch in many, many years, and their attitude since has ensured that they won't see the contents of my wallet for quite some time. But on the other hand, gamer behaviour - the exploitation of the most ridiculously unpredictable bugs to gain an advantage - is what dictates a great many of the patches we see. It's telling that even Nintendo, famous for their polish and their zero-tolerance attitude to unfinished software, were forced to introduce patching to fix one of MK7's tracks. I guess my point is that while it's obvious that BF4 and the two new consoles were released unfinished, the line between unfinished and fit for use is one whose defining limits are very blurred indeed. /Edited for clarity.
