It's the New Year, and that means it's time for resolutions. You know, drop some weight, quit smoking, get organized - just like you resolve to do every year, right?

It's also a good time to reflect, maybe mull over some things you learned during the year and how you can apply those lessons in the coming one.

How about the year for games? Just as predicted, 2008 was a high-volume year, and although a recent rash of layoffs and economic struggles for the games biz proved that we're not "recession-proof," video games as an industry and as an entertainment medium still made big strides forward.

And what have we learned? New Year's resolutions tend to last until about mid-January, but looking back on the year in games, why not pick out some resolutions we'd like to see ourselves - and the industry - actually keep?

New Year's Resolutions For The Game Industry

No More Holiday Avalanches: Once the holiday season came around, there were so many great titles to choose from that we had no idea what to buy. That's part of why it's been so hard to have a decisive opinion on the year-end lineup of titles - even reviewers and critics couldn't thoroughly keep pace with the barrage, and it was harder than ever to be the kind of core gamer who plays every title. There were just too many of them coming out at once.

Why on earth would the industry's publishers sync their calendars to launch their most competitive titles all at once? Well, the marketing wisdom goes something like this: Roll out a great game in a period of little competition, and consumers ask themselves whether or not they should buy it. Roll out a great game in a crowded, competitive season and consumers are more likely to ask themselves which title they should buy.

That's how movies do it, which is why you tend to see your summer action flicks, your spring romantic comedies, your fall kids' movies or your holiday dramas come out in packs. The thinking is that more crowded release schedules draw consumers' attention to the space, and actually seem to increase the chance that someone will spend money on something.

But this season's rapidly encroaching economic constraints meant that only the "sure things" earned money - players knew what to expect from Gears 2 or CoD5, so that's what they bought. Filling out the rest of the November NPD charts were the same mainstay Nintendo titles for Wii that more casual players have bought regularly throughout the year. While slightly more divergent, critically acclaimed titles like Left 4 Dead, Fable II and Fallout 3 still made the top 20, they likely could have sold even better in a period where they were allowed to be the main event. And forget the titles that were any smaller than that - literally, do you even remember them?

Publishers plan release schedules months, often years, in advance, and no one could have predicted that the recession would hit so quickly and so hard. But this year, we hope they'll have a little more faith in their own titles, and in the audience's appetite for innovation, and resolve to release at least a few of their gems when they can stand on their own, when audiences can pay proper attention to them, and when consumer wallets aren't already bled to death.

Less Is More: Even before the economic downturn set in, the industry was beginning to feel a serious squeeze from just how large development budgets have gotten today. As the industry's grown in value, it's also become much more competitive, and major companies feel the need to sink millions into long, long dev cycles just to keep up. And profitability's hard - analyst group EEDAR recently suggested that only 20 percent of games are profitable. Yes, that means that as much as 80 percent of games lose money.

So what can the games biz do? Well, we've heard companies that would never before breathe the word "microtransactions" start grudgingly letting it slip. We're seeing fewer hours on our big-budget game discs thanks to the promise of less-expensive DLC down the road, since publishers are hoping to develop games with "long tail" revenue potential.

We've seen DRM get much stricter - even though industry organizations like the ESA and the PC Gaming Alliance have yet to quantify with data just how much game piracy hurts sales, all they know is that it does - and that's a reason to try and staunch the bleeding. Game publishers are starting to turn a sharper eye toward used game sales and trade-ins, too, trying to figure out how they can stem the leak in revenue - at the consumer's expense, if you listen to GameStop CEO Dan DeMatteo.

These are all fairly viable strategies (though, while the games biz is at it, it could work a bit more on those DRM solutions) - but what about making better games with smaller budgets?

That's right - bigger isn't always better. Every year, we see browser-based one-man shows or small-team titles like Daniel Benmergui's I Wish I Were The Moon that awe and inspire, while two of the year's biggest mainstream successes - Jon Blow's Braid and 2D Boy's World of Goo - were indie titles whose budgets were likely a fraction of Cutting Edge FPS Vol. 29's. What about the idea that clever design, artful animation and a dash of genuine spirit can sell just as well as a big-budget game - and at a much lower cost?

Of course, we'll always love our AAA blockbusters. But wouldn't it be great if we could see the industry's major leading companies also try a new paradigm, and possibly see more profits in the process? Skim the bloat - it'd be a win-win.

Value Your Talent: You know the author of a book. You know the director of a film, and its major players. You know the names of the members of your favorite band, and if you're a music buff, you might even know who inspired them. So why do we still have no real idea who makes our games?

Sure, we're often familiar with whoever's name is attached to the project as director. You'll see the publisher's producer quoted constantly on a title you're looking forward to (sometimes with unintentionally hilarious results) - even if that guy's never laid a finger on the game itself.

Game teams are often enormous, and it's unlikely that the audience will be familiar with every hard worker who made a successful project happen. But marketing departments are apparently so terrified to let "the message" get out of their control that it's very rare that the press is able to speak to a real, live designer - which means the audience probably never knows who that is.

Why is this a problem? Other than the fact that individuals deserve credit for what they do, it may result in an under-valuation of games themselves; treating games strictly as software products and not as the product of the creative and technical expertise of their development leadership means that many audiences will never see games as anything other than - well, software products. Not art, not experiences, not human interaction, just shiny things in boxes that tend to sell units.

This is challenging even for an audience that's used to understanding and enjoying video games - when's the last time it occurred to you that a video game was made by human beings? But it's an even bigger problem for the future viability of the medium. For one thing, we want mainstream audiences to be able to accept that games are as valuable as any other entertainment medium. For another, talent is miserable when it's simply an interchangeable pair of hands assigned to get a product to ship, and the game reflects that. So please, game companies, let your titles benefit from the input of individuals who feel valued.

New Year's Resolutions For Gamers

Ditch The Pessimism: Ever notice that gamers seem to be much more enthusiastic about the things they hate than the things they like? Ever see a new game announcement where half of the comments say, "I hope this doesn't suck"? And those are just the optimists.

Of course, no one should ever say that audiences have no right to criticize games, dislike them or discuss the reasons why something didn't work for them. And there are probably some logical reasons why the game community on the internet seems to be so vitriolic and negative so much of the time. The industry hype cycle starts early - big promises and long waits for expensive new titles lead to expectations that can't possibly be met, for one thing, so perhaps gamers can be forgiven for feeling cynical much of the time. The anonymity of the internet deserves some of the blame, too - it's a lot easier online to build an echo chamber for the kind of ill conduct we can't easily get away with in real life, which could mean that the way people behave online isn't a true reflection of their sentiment around video games at all.

And many of our major titles are, to be fair, depressing - they feature dark environments, they ask you to kill things, or they feature narrative themes designed to prompt brooding and gut-wrenching as a shortcut to artistic integrity.

But let's work with the theory that gaming is a medium in its adolescence - that one day, we'll see a greater diversity of genres, an inspiring range of experiences, and true social acceptance of the worthiness of gaming. If that's true, then right now, we the passionate enthusiasts are the early adopters. We're on the forefront. Do we want to be tearing the medium down for its shortcomings, or cheering on its successes? Think about it.

Choose To Engage: Our expectations of games today have been somewhat shaped by games yesterday. We tend to enter a game with half of our expectations built from prior titles, and half of them built by early buzz, whether it's positive or negative. We are promised an Enriching Experience, and with controller in hand, we sit and wait for it to be delivered. When the way we push buttons fails to yield an intellectual or emotional connection, we blame the game.

But maybe part of enjoying a game is wanting to enjoy it. In our demand for interactivity, it seems it's very possible to lose track of the fact that "interactivity" requires input from the player. Each of us is certainly entitled to his own taste - but who knows what we might be missing when we wait for the game to make that emotional connection rather than making it ourselves?

For example, long cutscenes have fallen out of favor, so anytime we see one, we become frustrated. We want to play games, not watch them, comes the old refrain, and so we tap-tap-tap a button, hoping to skip. But sometimes, just trying to sit still for a minute can be an enlightening experience. Sometimes cutscenes are useless filler - but sometimes they're exposition that offers the player an opportunity to relate the events of the game to their own emotions and experiences. Never know unless you try, right?

It's akin to reading a novel - if you skip pages just to get to the kinds of scenes that you find most interesting, you might be missing a learning experience, a moment that could unexpectedly touch your heart, or a chance to learn more about the book's characters that would add meaning to its main plot.

Of course, cutscenes are just one example. In general, as players, we could definitely benefit from a more open mind and a willingness to try and engage with the game world - an in-game choice, after all, is really just a set of ones and zeroes unless you are the one who decides it means something to you. The game can't do it all for you.

Embrace Innovation: There's a reason sequels sell so well - people can rely on them. There's little risk, as a purchaser, in something you know is going to be just more of something you already know you like.

And again, everyone's entitled to personal taste, of course. Still, much of the audience spends a lot of time clamoring for something "different" - but would we recognize it if it were right in front of our faces?

Different doesn't always mean "better," of course. There are plenty of attempts at innovating on design that end up poorly executed. There are plenty of earnest efforts to create compelling characters that just turn out to be annoying. But interestingly, it seems many of the titles that try the hardest to strike out in a new direction are the ones that make the least splash - they're the sad gems buried in the holiday rush, or the cult favorites that no one else seems to "get."

Could it be we're still fastened to our old ways of thinking, evaluating games in the dated "five categories" of a much simpler time? One lesson from 2008 was how stunningly diverse the catalog of available titles was - and how divided many critics were, and how hard it was for reviewers to arrive at a consensus on whether or not a game was "good." Games are not simple anymore, period.

Maybe the successes of 2009 will be even less easily defined, as the "formula" for success becomes less and less relevant and less and less predictable. We'll no longer be able to evaluate games against a list of pre-defined genre traits, marking them positively or negatively based on how well they adhere to the expectations set by their predecessors.

In other words, we'll have to cultivate our appetite for innovation, and work on regarding experimentation as a positive to be studied, not a negative to be quashed just because it doesn't adhere to expectations. We might enjoy games more that way, and even learn more about them.

http://kotaku.com/5121437/new-years-...-game-industry