All I want to do is set up two monitors, one running at 1280x960@85Hz and the other at 1152x864@75Hz (though I'll settle for 1024x768 on the other). Now, neither monitor advertises these resolutions as supported, so in Windows I have to go to the Monitor properties and untick the box that says "Only display settings supported by this monitor". Fair enough; it stops people doing stupid things.
On Ubuntu, which is supposed to be nice and friendly and easy... I have to open a shell, use cvt to generate a modeline, use xrandr to tell X about that modeline, then use xrandr *again* to tell X which outputs can use that modeline (this is getting too close to manually editing X config files for my liking, and that's never gone well for me). Then, because I want to test the resolution first to make sure I remembered it right, I open the Display properties window and pick the resolution. It asks me some mumbo-jumbo about setting the virtual desktop size in a config file, then tells me I have to log out to apply the change!
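For anyone trying the same dance, the sequence looks roughly like this. The mode name and timing numbers come from what cvt prints on your machine, and `VGA-1` is just an example output name (run plain `xrandr` to see yours):

```shell
# 1. Generate a CVT modeline for the resolution and refresh rate you want
cvt 1280 960 85

# 2. Register that modeline with X. Paste everything cvt printed after
#    the word "Modeline" -- the name in quotes plus the timing numbers.
#    (The "..." stands for cvt's actual numbers; don't type it literally.)
xrandr --newmode "1280x960_85.00" ...

# 3. Attach the new mode to the output driving that monitor
#    (VGA-1 is an assumed name; check `xrandr` for the real one)
xrandr --addmode VGA-1 "1280x960_85.00"

# 4. Switch the output to the new mode
xrandr --output VGA-1 --mode "1280x960_85.00"
```

Note that all of this is lost on logout unless you script it or bake it into an X config file, which is exactly the problem.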
OK, so I log out, wait for the Ubuntu LiveCD to auto-login... and it's still running at the old resolution. I open the Display properties... and the resolution I added is gone! At this point I gave up, because I'm only using the LiveCD to run a SMART test on a hard disk (the Windows smartctl doesn't like the controller it's on).
Come on, Linux, join the 21st Century! I've been able to do this easily in other operating systems for over a decade.
Edit: Oh, and running the BBC's Flash-based iPlayer in fullscreen mode has horrendous tearing, along with some clicks and pops from the sound. From a quick Google search it looks like if I spend quite some time trial-and-erroring various settings I *might* fix this... or I could fire up the Windows-based laptop, which works perfectly.
(Credit where credit's due: Ubuntu detected, and got at least partially working, the sound, graphics, Bluetooth, the SATA controller, the IDE controller, an ethernet card and the 802.11g dongle without needing any hacking or driver hunting)