Rant the first:
In theory, my laptop has enough processing power to play back HD video. It doesn't have much headroom (it only has a 1.5GHz Pentium M and embedded Intel graphics), but it can manage it, and it can do so glitch-free.
In practice, it can't. Or at least it can't when using the BBC's iPlayer, which appears to impose a 5x overhead on video playback. And the video is full of tearing (a problem that was solved well over a decade ago).
I'm not exaggerating the overhead either: playing a 720p 3Mbps H.264 stream in a WMV container takes 10% CPU on a 2.4GHz Core 2 Quad processor. Playing a 720p 3Mbps H.264 iPlayer stream uses over 50% CPU on the same processor.
Rant the second:
'Tis E3, and so the Internet is full of shiny trailers for shiny games. A few years ago these would have been available for download in any of a number of well-supported formats. They would have been compressed *once* (or maybe twice, with the intermediate being a codec designed for production use), and so would have had few compression artefacts. They may even have used multi-pass compression to eke out a little more quality. The resulting files would have been reasonably large for the time, and would have taken a while to download, but would have been worth it.
That was then. Now YouTube exists, with the end result that it is impossible to get a first-generation copy. You have to play it in YouTube's Flash player, which eats processor and gives you virtually no control over the playback. It is possible to download videos from YouTube (although they make it hard to do so), but what you get has been passed through yet another codec, most likely running with "one size fits all" settings, resulting in spectacular failures at times. This assumes that the source media was any good in the first place, and wasn't an nth-generation copy that has been reuploaded a dozen times (with a few banners added and removed along the way).
Oh, and there's tearing. Is it really *that* hard to make vsync work?