 Tomb Raider Video Card Performance Review - tech
(hx) 03:10 PM EDT - Mar,20 2013
The chaps over at HardOCP have posted their Tomb Raider Video Card Performance and IQ Review. Here's a taster:
Tomb Raider's image quality and detail are outstanding. The Crystal Dynamics engine makes good use of DX11 features like tessellation, ambient occlusion, and depth of field to display lusher, richer environments. Tessellation smoothed out characters and added depth to the objects we encountered and the environment around us. We are glad to see tessellation back in games; most games released lately aren't using it. It was the talk of the town when DX11 launched, and some games made it a showpiece, but it was largely shoved under the rug in a lot of 2012's new releases. We are finally seeing new games use this technology, and we are thrilled to see it applied to both characters and environments. The PC platform has the GPU performance for it, and we want to see it used.
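To give a rough sense of what tessellation buys, here is a minimal Python sketch (an illustration of the general idea, not Crystal Dynamics' actual pipeline): each subdivision pass splits every triangle into four, which is the kind of extra vertex density that smoothing and displacement can then work with.

```python
# Minimal sketch (not the game's implementation): midpoint subdivision as a
# stand-in for what hardware tessellation does, i.e. split each triangle into
# smaller ones so smoothing/displacement has more vertices to work with.

def subdivide(vertices, triangles):
    """Split every triangle into four by inserting edge midpoints."""
    verts = list(vertices)
    midpoint_cache = {}          # edge (i, j) -> index of its midpoint vertex

    def midpoint(i, j):
        key = (min(i, j), max(i, j))
        if key not in midpoint_cache:
            (x0, y0, z0), (x1, y1, z1) = verts[i], verts[j]
            verts.append(((x0 + x1) / 2, (y0 + y1) / 2, (z0 + z1) / 2))
            midpoint_cache[key] = len(verts) - 1
        return midpoint_cache[key]

    new_tris = []
    for a, b, c in triangles:
        ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
        new_tris += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return verts, new_tris

# One flat triangle: each subdivision level multiplies the triangle count by 4.
v, t = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)], [(0, 1, 2)]
for level in range(3):
    v, t = subdivide(v, t)
    print(f"level {level + 1}: {len(t)} triangles, {len(v)} vertices")
```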

The shadows were another part of the game that added depth and detail: high-resolution shadows, SSAO, and real-time global illumination all working together. The most disappointing thing in Tomb Raider is the performance cost of enabling SSAA at higher resolutions. Image quality improves greatly with 2X and 4X SSAA, which makes it all the more disappointing how much single GPUs struggled with it.
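As a back-of-the-envelope illustration of why SSAA gets so costly at higher resolutions, here is a rough Python sketch. The assumption that "NX SSAA" shades roughly N samples per displayed pixel, and the example resolutions, are ours for illustration, not figures from the review.

```python
# Rough sketch of the SSAA cost scaling, assuming NX SSAA shades about
# N samples per displayed pixel (the game's exact internal resolution
# is not documented here).

def shaded_samples(width, height, ssaa_factor):
    return width * height * ssaa_factor

for res in [(1920, 1080), (2560, 1600)]:
    base = shaded_samples(*res, 1)
    for factor in (2, 4):
        cost = shaded_samples(*res, factor)
        print(f"{res[0]}x{res[1]} {factor}X SSAA: "
              f"{cost / 1e6:.1f}M samples ({cost / base:.0f}x the no-AA load)")
```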

TressFX was the graphics option that made the image quality stand out from other games. For the first time we saw real-time, individually rendered hair that reacts to movement, jumping, or being hit, as well as to weather effects like wind and rain. With hair quality set to normal, Lara's ponytail is one clump of hair stuck together; enabling TressFX gives her thousands of individually rendered strands, each of which reacts to her movements and the other variables. There are still bugs to fix in TressFX, but it looks like a very good start for this technology in games we can buy. Strands of hair have been shown off in demos for many years but never truly delivered in a game.

There are some problems: the hair clips through Lara, her clothes, and her weapons, it occasionally fails to render, and some of the shadows cast on her face are overbearing. The physics also needs improvement; many times the hair would blow around a little too wildly, the strands would be a little too lively and bounce too much, and the clipping through things took away from the immersion. Performance with TressFX improved after AMD and NVIDIA released updated drivers. NVIDIA also fixed the bug that blurred Lara's hair when 2X SSAA was enabled at higher resolutions, and with 4X SSAA at 1080p.
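For a feel of the kind of simulation involved, here is a toy Python sketch. It is emphatically not AMD's TressFX code: it treats a single strand as a chain of Verlet-integrated particles with distance constraints, and the damping constant below is exactly the sort of knob that separates natural motion from strands that look "too lively".

```python
# Toy sketch, not AMD's TressFX: one hair strand as a chain of particles
# integrated with Verlet + distance constraints. DAMPING is the kind of
# tuning value that decides whether strands settle or bounce around wildly.
import math

SEGMENTS, REST_LEN, GRAVITY, DT, DAMPING = 10, 0.05, -9.8, 1 / 60, 0.98

# Particle i starts REST_LEN below particle i-1; particle 0 is pinned to the head.
pos = [(0.0, -i * REST_LEN) for i in range(SEGMENTS)]
prev = list(pos)

def step(root, wind=0.0):
    global pos, prev
    new = [root]                              # root follows the character's head
    for i in range(1, SEGMENTS):              # Verlet integration per particle
        x, y = pos[i]
        px, py = prev[i]
        vx, vy = (x - px) * DAMPING, (y - py) * DAMPING
        new.append((x + vx + wind * DT * DT, y + vy + GRAVITY * DT * DT))
    prev, pos = pos, new
    for _ in range(4):                        # relax distance constraints
        for i in range(1, SEGMENTS):
            (x0, y0), (x1, y1) = pos[i - 1], pos[i]
            dx, dy = x1 - x0, y1 - y0
            d = math.hypot(dx, dy) or 1e-9
            corr = (d - REST_LEN) / d
            if i == 1:                        # don't move the pinned root
                pos[i] = (x1 - dx * corr, y1 - dy * corr)
            else:
                pos[i - 1] = (x0 + 0.5 * dx * corr, y0 + 0.5 * dy * corr)
                pos[i] = (x1 - 0.5 * dx * corr, y1 - 0.5 * dy * corr)

# Swing the head side to side, add a wind gust, and print where the tip ends up.
for frame in range(120):
    step(root=(0.2 * math.sin(frame * DT * 2.0), 0.0), wind=3.0)
print("strand tip after 2 seconds:", pos[-1])
```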

This is just the first rendition of the technology, and hopefully it will improve over time. It looks good, but the physical behavior really needs work. Still, it is an excellent first attempt, and we are finally getting what tech demos have teased for so long: good-looking hair in games. It sure took long enough to get this kind of technology into games!

last 10 comments:
Stumpus(03:44 PM EDT - Mar,20 2013 )
Amazing that my GeForce GTX 660 Ti doesn't match their figures in the tests. At 1920x1080 I get higher fps than they do with FULL settings, x16, and more of the card's own enhancements enabled than a horse can shit.
Their processor is shit compared to my i5 3570K (value for money), SSAO on Ultra. We all know the i5 is the better processor for gaming, as the i7's hyper-threading goes unused by games; it's a waste of money for gaming with barely any performance increase. Motherboard and RAM all factor into the equation too.

Tom(12:23 PM EDT - Mar,21 2013 )
i5 3570 vs i5 2500K... the 2500K is better in both value and performance. As for a 600-series video card: I have a 560 and it doesn't have a problem with anything I've thrown at it at 1920x1080. I could overclock my 2500K to 4.8GHz if I wanted to. Ask on almost any game site which is better and people will tell you the 2500K is the better processor. Games do support hyper-threading; it just doesn't translate directly into more fps.

Stumpus(02:43 PM EDT - Mar,21 2013 )
Except I have the 3570K, which has hit 5GHz on air. The features are what I want with my Maximus V Gene: PCIe 3.0 and more features than a horse can shit.
The 560 is a mile behind the 660 Ti in gaming: http://www.tomshardware.com/reviews/geforce-gtx-660-geforce-gtx-650-benchmark,3297-6.html
I'm talking maximum settings too, not just running 1920x1080 on medium. For a start, TressFX costs me nothing in performance, compared to other people moaning about jerkiness, stutter, or crashes. It flows perfectly with everything on ultra; x16, x32, SSAO, whatever, it doesn't matter.
Also, I have the board for future-proofing and instant one-touch overclocking to 4.8GHz with my Arctic Cooler i30. The 2500K is a good CPU and the 3570K is about 10-15% faster, but you need to upgrade your card; any online comparison shows that.

gx-x(03:21 PM EDT - Mar,21 2013 )
I suggest you stop reading Tom's HW starting today. They do very bad testing: they take scores from cards they already benchmarked and drop them into the reviews of cards they're testing now. You can bet your ass the 560 Ti and 570 are running on year-old drivers in most of the tests there.
Also, judging by their scores the 660 is about equal to the 660 Ti, which is far from reality.
The 660 Ti is an excellent card and at least 30% faster than the 560 Ti, but it also costs twice as much.

PS. Any S1155 board based on a P or Z chipset is as future-proof as any Z7x board. Also, an Ivy Bridge i5 has no advantages over a Sandy Bridge i5. Actually, it has a better APU, but who gives a shit about the APU anyway. IB is all about the APU, just another Intel "tock" before the new "tick" (and 1155 boards will not be "future proof" anymore when that "tick" happens, trust me).

Stumpus(05:36 PM EDT - Mar,21 2013 )
I wasn't reading Tom's Hardware as such; it was just one of the first comparison sites I found. The point, regardless of cost, is that the 560 is toiling now when it comes to playing at higher settings.
For your information, the 560 Ti costs the same as or more than a 660 Ti:
http://www.amazon.co.uk/s/ref=nb_sb_ss_i_4_3?url=search-alias%3Daps&field-keywords=560+gtx+ti&sprefix=560%2Caps%2C225

660ti
http://www.amazon.co.uk/s/ref=nb_sb_ss_i_0_10?url=search-alias%3Daps&field-keywords=660+gtx+ti&sprefix=660+gtx+ti%2Caps%2C197&rh=i%3Aaps%2Ck%3A660+gtx+ti

Costs twice as much? Don't think so....

Also, RAM and many other factors govern the performance difference between Ivy and Sandy; you can't just wave a wand and make a sweeping statement. It doesn't apply to my rig or anyone else's, since people do their own custom builds: SSD, SATA 3, etc., all with differing performance, and so it goes on.
As it stands, I beat the test rig they used for Tomb Raider with a 660 Ti, and that's a fact.

gx-x(11:42 AM EDT - Mar,22 2013 )
Here where I live, a 560 Ti 1GB is ~150 EUR and a 660 Ti is ~300 EUR. On Newegg it's $200 vs $300. Screw Amazon. :) (You wouldn't believe the import prices on most of this stuff; resellers are making 50% over import costs in most cases.)

And you are wrong about IB vs SB. The APU is different; everything else is a negligible difference. RAM? I did some benchmarks (Crysis Warhead, Crysis 2, Far Cry 2, RE6) with DDR3-1333 CL5 and DDR3-1600 CL5 and there was no difference in performance. It's a waste of money on an Intel platform; AMD does see a difference due to its different architecture.
Oh, and keep in mind I'm talking about gaming performance, not SuperPi and other e-peen software that I don't really care about.

PS. For what it's worth, I also beat their test machine with my 560 Ti. Tom has old figures, and HardOCP stopped being a serious review site three years ago. Their results are always sub-par for some reason?!

Tom(12:27 PM EDT - Mar,22 2013 )
Stumpus, you are the kind of guy Nvidia loves: senseless video card upgrades just so you can say you have it. "Oh look, here's a review I agree with, so I'll go buy it." I don't see myself upgrading my 560 any time soon; nothing I play has stressed the card, been choppy, or anything.

What games are toiling on a 560, Stumpus? I have no games with stutter or anything; they play fluidly at 1920x1080, which is the max of my current LED monitor. I remember someone telling me that Metro 2033 with DX11 would lag my video card. Guess what, I didn't have that problem; FPS was ~55-60 and it didn't bother me a bit. I paid $200 for my 560 about two years ago, so it wasn't the best or anything; I just made sure to get a GTX and one that could be overclocked.

I'll let ya know, Stumpus, if/when I upgrade. So far my i5 2500K, 16GB, Win7 on an SSD, and my lowly GTX 560 are more than cutting it.

gx-x(12:54 PM EDT - Mar,22 2013 )
Ugh, Metro 2033 with those DX11 options enabled, DX11 AA... is pretty choppy judging by reviews all over the net. I remember playing it with an ATI 5850, and later a GTX 260; all was fine if I used normal settings, but enabling the DX11 fog would cut fps from 40 to 10. Literally. Yes, that was on older cards, but the 5850 is still a very good card. I haven't tried Metro 2033 on the GTX 460 or 560 Ti (no real interest in the game), but maybe I will... meh, probably won't :P

Tom(03:27 PM EDT - Mar,22 2013 )
I should mention, Stumpus, that while I do turn the quality up to max in all my games, I typically turn off AA because AA is bullshit anyhow. That's likely why I don't lag in many games. At 1920x1080 with everything else maxed, I'm hard pressed to see jaggedness in any game I play. Seeing as I play a lot of FPS games, I'm usually moving and don't have time to notice.

Good point though, gx-x. I remember arguing here or somewhere about Metro 2033 and DX11 and my hardware, and how it was supposedly impossible to get the fps I was getting at max quality. If I remember right, it was simply because I had AA off. I'd have to check; it's still sitting on my disk.
