The Digital Foundry blog has an article about measuring an important but often nebulous aspect of console gameplay: input lag. Using a video camera and a custom input monitor made by console modder Ben Heck, and after calibrating for display lag, they tested a variety of games to an accuracy of one video frame in order to determine the latency between pressing a button and seeing its effect on the screen. Quoting:
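The frame-counting arithmetic this boils down to is straightforward. A rough Python sketch of that calculation, assuming a 60fps capture rate and a 33ms display lag; the function name and figures here are illustrative, not Digital Foundry's actual tooling:

# Hypothetical sketch of the frame-counting method described above: film the
# input monitor and the screen together, note the capture frame on which the
# button registers and the frame on which the game reacts, then subtract the
# display's own lag (measured separately during calibration).

CAPTURE_FPS = 60          # assumed capture rate; one frame ~= 16.7 ms of resolution
DISPLAY_LAG_MS = 33.0     # assumed display lag, calibrated out beforehand

def input_latency_ms(button_press_frame: int, onscreen_response_frame: int) -> float:
    """Game-side latency between a button press and its visible effect."""
    frames_elapsed = onscreen_response_frame - button_press_frame
    total_ms = frames_elapsed * 1000.0 / CAPTURE_FPS
    return total_ms - DISPLAY_LAG_MS

# Example: the response appears 10 capture frames after the press,
# which works out to roughly the 133ms average lag quoted below.
print(input_latency_ms(120, 130))

The one-frame accuracy mentioned above corresponds to the capture rate, so at 60fps each measurement carries roughly a 17ms margin of error.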
"If a proven methodology can be put into place, games reviewers can better inform their readers, but more importantly developers can benefit in helping to eliminate unwanted lag from their code. ... It's fair to say that players today have become conditioned to what the truly hardcore PC gamers would consider to be almost unacceptably high levels of latency to the point where cloud gaming services such as OnLive and Gaikai rely heavily upon it. The average videogame runs at 30fps, and appears to have an average lag in the region of 133ms. On top of that is additional delay from the display itself, bringing the overall latency to around 166ms. Assuming that the most ultra-PC gaming set-up has a latency less than one third of that, this is good news for cloud gaming in that there's a good 80ms or so window for game video to be transmitted from client to server.

http://hardware.slashdot.org/story/0...-Console-Games