
Started by Wula, November 04, 2011, 07:54:54 PM


drewdle

I'm of two minds about this.

First is that the PS3, arguably the most powerful console of the current generation, has what amounts to a GeForce 7800-class GPU (the RSX processor). The Xbox 360 makes do with less, and let's not even talk about the Wii. I recently picked up a 360, hooked it up to my 23" 1080p computer monitor, and was disappointed by the graphics quality (which looked way better on a low-res tube TV, go figure). However, yes, all the hardware is compatible, so more time is spent playing games and less on jiggery-pokery: drivers, updates, tweaking, etc. It's plug-it-in-and-go.

Second, though, is that a year ago I had what would be called a "decent" gaming rig: a budget HP with 4 GB of RAM, a three-core AMD processor, and an overclocked GTS 450 card. On the same monitor mentioned above, it rocked. Most games could be played at 1080p with all the important sliders cranked and almost no frame drops. The detail, especially the anti-aliasing and textures, was vastly superior to what the 360 puts out. So a PC allows for constant hardware upgrades, which lets developers keep pushing the envelope; any game released for both PCs and consoles is almost certainly watered down in the console port for this reason. However, at the end of the day, you need a Windows license, anti-virus software because of Windows, twice the cash or more for the hardware outlay, and some patience if you're going to get everything set up properly.

I'd say, after playing in both camps, I prefer PC gaming, for the reasons above and most of Pat The Fox's post. That said, I can see the benefits of either approach.