The new Far Cry patch does indeed seem to be a decent showcase for Shader Model 3.0's potential. The GeForce 6800 cards gain up to about 10% in average frame rates when using the SM3.0 code path. No, the differences aren't going to convert the masses into NVIDIA fanboys overnight, but they do show NVIDIA wasn't kidding about Shader Model 3.0 offering some advantages.

This patch also seems to have cleaned up some problems with the GeForce 6800 code path in the game. Even with Shader Model 2.0, GeForce 6800 performance is way up. Image quality also seems to be quite a bit better than it was on the GeForce 6800 with Far Cry version 1.1. My eye is accustomed to seeing the game run on a Radeon 9800, and the GeForce 6800 cards looked just fine to me with the new patch. I played through a few levels of the game, including some with lots of pixel-shader-laden effects on the walls and floors, and I didn't notice any corner cutting or blocky shading or texturing. Of course, some of these changes may be attributable to newer NVIDIA drivers, but the effect is the same.

So what does all of this tell us about the eternal question: "Should I fork over my cash for a GeForce 6800 or a Radeon X800?" I'm not sure, exactly. PC games seem to be approaching the way the console world works, where publishers cut deals to publish exclusive or enhanced games for a given platform. In this case, Ubisoft worked with NVIDIA to make one of the best games of the past six months run smoothly on the GeForce 6800. That's spectacular, especially because the game still runs very well on Radeon cards.