Dee Yun: (contact-deleteme[at]-deleteme-direman [dot] com) 2014-02-03 02:03:21
Xbox One GPU Usage
So that's the culprit: 10 percent of the Xbox One's GPU is perma-dedicated to Kinect functionality. The Xbone's multitasking and Kinect features ARE pretty freaking sweet (when they work cleanly), but when I play games, I want my console concentrating on that singular job at hand. I don't want processing cycles sitting idle waiting for face/voice/gesture recognition during games that don't employ face/voice/gesture recognition, not if it means sacrificing visual fidelity.
The reason I'm joining the throng of internet graphics whiners is that they've got a point, even if many of them don't necessarily grasp what it is. It's not about MOAR PIXELZ; it's about standards. Contemporary consumer televisions display at 1080p, and our games should meet that criterion, no more, no less. Upscaling from 720p is not an acceptable alternative.
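For the pixel-counters, here's the quick math on why upscaled 720p can't fake the real thing (a back-of-the-envelope sketch, nothing more):

```python
# Native 1080p vs. native 720p, by raw pixel count.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels per frame
pixels_720p = 1280 * 720     #   921,600 pixels per frame

ratio = pixels_1080p / pixels_720p
print(ratio)  # 2.25 -- a native 1080p frame renders 2.25x the pixels
```

An upscaler still only has 921,600 pixels of actual rendered detail to stretch across the screen; the other 1.25x is interpolated guesswork.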
Forbes reports a rumor that Microsoft is mulling over giving developers access to more of the GPU's horsepower. Yes, please. I still prefer the Xbox Live ecosystem over PlayStation Network, but with Assassin's Creed 4, Call of Duty: Ghosts, Battlefield 4, and now Tomb Raider: Definitive Edition, I've opted for the PS4 versions because they simply look crisper. I hear a lot of Microsoft apologists claiming that you can't tell the difference during gameplay, but that's poppycock twaddle hooey. I can readily identify which version is running at a glance.
Of course, Battlefield 4 on the PlayStation 4 STILL keeps crashing AND corrupting its save file, with all the teeth-gnashing vexation of Error Code CE-34878-0. You should probably look into that, Sony. Minor graphical superiority doesn't even COME CLOSE to making up for a console that behaves like the lead role in Memento.