[…] People look at these games and are like: “oh my god, look at how intensive this is. It must look beautiful.” I’ve even seen screenshots of Crysis. It looks nice, but not that great.
My theory is that all these games are just poorly coded. Look at HL2, every Blizzard game ever made, Supreme Commander, and some others. These games can be gorgeous at high resolutions and still look great at lower resolutions. These companies take pride in their product and it shows. I wonder if this has anything to do with the decline in PC gaming.
That’s definitely part of it. Console gaming competes with PC gaming more now that consoles have advanced so far, especially with Xbox Live.
To play cutting-edge PC games, you need to have a gaming computer, which generally requires a continuous upgrade cycle with an average cost of $500-1000 every year (depending on whether you want to play on “high” settings).
When you want to play a game, you have to spend a bit of time installing it, then (usually) a lot of time downloading and installing patches and updates. Eventually you launch the game and sit through the ridiculously long disc-copy-protection check that masquerades as a “Loading” box (penalizing the people who legally bought the game). Then the game actually starts loading, then you sit through a million company logo screens that you can’t skip quickly, then you configure everything (possibly messing with drivers if necessary), and then you finally start a game… after sitting through an intro sequence that you also can’t skip. Oh, and all of this has to happen on Windows, with all of the wonderful annoyances that go with that.
The entire gaming industry is dysfunctional, but PC gaming shows the worst of it.
Spending $300 every 3-5 years on a console, with far less hassle, is much more attractive to most people.
(Except me, since the only types of games I like are RTS, FPS, and SimCity-type construction games… all of which suck on consoles.)
Sometimes I wish I liked stupid sports games and RPGs more.