bza a very bad gay

Joined: 24 Jul 2010 Location: A cave in a swamp somewhere
Posted: Fri Nov 22, 2013 4:58 am
I built my PC a couple of years ago for $600 and it ran The Witcher 2 just fine (before I upgraded the GPU) at 1080p and mostly 60 fps. The big performance drain is always anti-aliasing, so I kept that off and left everything else at whatever it auto-detected. The upgrade was ~$170 for a Radeon 7870 fancy-whatever edition on sale, and afterwards I ran the game with everything absolutely maxed. Not so sure I'll need to upgrade again this gen unless I end up getting a super high-res display. I'm pretty dang curious about how this Mantle thing is gonna pan out.
Maybe it's just the games I play, but I really haven't had to do any sort of fiddling lately at all. Most devs seem to be pretty good with autodetection, and I really can't be assed to run third-party AA injectors. Warframe, Outlast, Path of Exile, and even ArmA 3 straight up had settings I didn't need to change at all on first launch. There's also this AMD/Raptr thing that optimizes game settings based on other people's configs, and I think Nvidia has one too! The age of futzing around before playing every PC game is hopefully over soon.
But... my TV is a POS and the only console I have is a Wii, where the worst of my problems was just setting the correct resolution. I totally understand why people buy consoles and I don't think it's dumb, but for real, I've had more fun with random weird free/cheap indie games on PC than with most big titles in the past year or two. The PS4 and Xbone really don't interest me at all, and unless either gets a seriously killer exclusive lineup I'm sticking to Wii/3DS/PC mustard race. Maybe I'll become an idort and get a Vita if they get cheaper.