Gameguru Mania Updated:05:47 PM EDT Apr,17

(c) 1997-2015 Gameguru Mania
 Gameguru Mania News - Nov,07 2007 - tech 
NVIDIA ForceWare 163.75 WHQL Drivers - tech
(hx) 04:21 AM EST - Nov,07 2007 - Post a comment
NVIDIA has released new ForceWare WHQL drivers for its various GeForce video chips (6x, 7x and 8x series). They are available for Windows XP 32-bit, Windows Server 2003 x64 Edition/Windows XP Professional x64 Edition, Windows Vista 32-bit, and Windows Vista 64-bit. The new drivers add support for the GeForce 7150, 7100, and 7050, improve compatibility for Half-Life 2: Episode 2, and add NVIDIA SLI profiles for Half-Life 2: Episode 2, Portal, Clive Barker's Jericho, SEGA Rally Revo, NHL 08, and European Street Racing.
 Gameguru Mania News - Nov,04 2007 - tech
TechNews - Fake Wii :Vii first open box video shot - tech
(hx) 06:18 PM EST - Nov,04 2007 - Post a comment
 Gameguru Mania News - Nov,02 2007 - tech
TechNews - HD-DVD Player For Less Than $100 - tech
(hx) 09:59 PM EDT - Nov,02 2007 - Post a comment
 Gameguru Mania News - Oct,31 2007 - tech
TechNews - ForceWare 169.04 BETA (TimeShift) - tech
(hx) 08:30 PM EDT - Oct,31 2007 - Post a comment / read (4)
 Gameguru Mania News - Oct,30 2007 - tech
Crysis 32 vs 64 bit - the real story? - tech
(hx) 09:26 PM EDT - Oct,30 2007 - Post a comment
YouGamers has posted a brief comparison of how well the Crysis demo runs in 32-bit and 64-bit modes:
To begin with, we took the following system and ran the CPU and GPU benchmarks in 32 bit and 64 bit mode: Windows Vista Ultimate x64, Intel Core 2 Quad Q6600 @ 3GHz , 4GB system RAM, NVIDIA GeForce 8800 GTX 768MB, ForceWare 163.69 drivers.

We took two extremes to see what the results would churn up: 800 x 600, running in DX9 with all detail settings at Low, and 1680 x 1050, running DX10 with all detail settings on Very High. Each benchmark batch file was run 5 times, with all of the collated results averaged. The results are in the image to the right.

Before we all jump to any conclusions though, bear in mind that this is a demo and not the final retail game. We could genuinely see noticeable gains from the 64-bit version, but if we assume that the demo code is pretty close to the launch version, then not all of us are going to be so lucky. Please let us know what you find out: try running the CPU and GPU benchmarks across a range of settings in 32-bit and 64-bit, and then post your findings!
 Gameguru Mania News - Oct,29 2007 - tech
TechNews - Core 2 Extreme QX9650 Tested - tech
(hx) 10:11 PM EDT - Oct,29 2007 - Post a comment / read (2)
NVidia 8800 GT vs. Unreal Tournament 3 - tech
(hx) 06:39 PM EDT - Oct,29 2007 - Post a comment
As I mentioned earlier today, NVIDIA's much-anticipated 8800 GT has arrived, and many reviews are using the Unreal Tournament 3 Beta Demo as one of their benchmarks (big thanks, BeyondUnreal). TechReport's verdict: all of these cards can play the UT3 demo reasonably well at this resolution, the 8800 GT included.

TechReport: We tested the UT3 demo by playing a deathmatch against some bots and recording frame rates during 60-second gameplay sessions using FRAPS. This method has the advantage of duplicating real gameplay, but it comes at the expense of precise repeatability. We believe five sample sessions are sufficient to get reasonably consistent and trustworthy results. In addition to average frame rates, we've included the low frame rates, because those tend to reflect the user experience in performance-critical situations. In order to diminish the effect of outliers, we've reported the median of the five low frame rates we encountered.

Because the Unreal engine doesn't support multisampled antialiasing, we tested without AA. Instead, we just cranked up the resolution to 2560x1600 and turned up the demo's quality sliders to the max. I also disabled the demo's 62 FPS frame rate cap before testing.
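TechReport's reporting scheme (average FPS across the five sessions, plus the median of the per-session lows to damp outliers) can be sketched in a few lines; the function name and the session numbers below are invented for illustration:

```python
# Sketch of the reporting method described above: mean of per-session
# average FPS, and the MEDIAN of per-session low FPS so that a single
# outlier hitch doesn't dominate. Session numbers are made up.
from statistics import mean, median

def summarize_sessions(sessions):
    """sessions: list of per-session (avg_fps, low_fps) tuples."""
    avg_fps = mean(s[0] for s in sessions)
    low_fps = median(s[1] for s in sessions)  # median damps outliers
    return avg_fps, low_fps

# Five hypothetical 60-second FRAPS sessions:
sessions = [(62.0, 41), (58.5, 12), (60.2, 39), (61.1, 43), (59.7, 40)]
avg, low = summarize_sessions(sessions)
# The single 12-FPS hitch does not drag the reported low down to 12.
```

With these numbers the reported low is 40 FPS rather than 12, which is exactly the effect the median is meant to have.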
Crysis DX10 Features in Windows XP - tech
(hx) 12:04 PM EDT - Oct,29 2007 - Post a comment / read (2)
DX 10 Features in Windows XP - a MUST SEE thread on the Crysis Forums offers a tip on improving the visuals in the Crysis demo under DirectX 9, saying the game can be coaxed into looking more like it does under DirectX 10, without using Vista!
If you tweak the configuration files in CVarGroups by copying and pasting the "very high" settings (1st paragraph) IN PLACE of the "high" settings (last paragraph) the game will load the highest possible settings even though the drop-down menus display "high." The difference between "high" settings and the tweaked settings is immense: shadows are deeper, more realistic; the leaves have better reflective properties, better textures; the colours are better; and the level of detail is simply stunning.

With these settings I'm running the game between 15-25 FPS at 1440x900 and (wait for it) 8x AA, and it looks PERFECT. Best of all, this is in XP. So I'm happy. I have XP and I'm playing the game at settings higher than DX9 allows (strictly speaking). Give this a try if your rig can handle it.
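The paste-over the forum post describes could be scripted. This is only a sketch under the post's own assumption that each CVarGroups config file keeps the "very high" settings in the first blank-line-separated paragraph and the "high" settings in the last; the function names are my own and the layout is not verified against the demo's actual files:

```python
# Hedged sketch of the forum tweak: overwrite the last paragraph
# ("high" settings) of a CVarGroups cfg file with the first paragraph
# ("very high" settings). File layout assumed from the forum post.
from pathlib import Path

def force_very_high(cfg_text: str) -> str:
    blocks = [b for b in cfg_text.split("\n\n") if b.strip()]
    if len(blocks) < 2:
        return cfg_text          # nothing to swap
    blocks[-1] = blocks[0]       # paste "very high" over "high"
    return "\n\n".join(blocks)

def patch_file(path: Path) -> None:
    # Keep a backup before touching the original file.
    path.with_suffix(path.suffix + ".bak").write_text(path.read_text())
    path.write_text(force_very_high(path.read_text()))
```

Keeping a `.bak` copy matters here, since the in-game menu will still read "high" and the only way back is restoring the original file.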
Nvidia's GeForce 8800 GT Tested - tech
(hx) 11:47 AM EDT - Oct,29 2007 - Post a comment / read (3)
NVIDIA today is set to launch its newest midrange graphics card, the GeForce 8800 GT (65nm), previously known by its codename G92. NVIDIA guidance states that the new card will sell at retail in the $199 to $249 price range.
The GeForce 8800 GT sports a 100 MHz speed bump over the 8800 GTS, and comes factory clocked at 600 MHz. The 600 MHz clock speed of the 8800 GT is actually 15 MHz higher than the 8800 GTX's default GPU clock, which is set at 575 MHz. The 8800 GT's clock speed also comes within striking distance of the GeForce 8800 Ultra's 612 MHz GPU clock speed. The NVIDIA GeForce 8800 GT features 112 stream processors, 16 fewer than the 128 stream processors found on the ultra high-end 8800 GTX and 16 more than the 96 stream processors found on NVIDIA's 8800 GTS. The stream processors of the 8800 GT come clocked at 1500 MHz, the same speed as the stream processors of the GeForce 8800 Ultra. Comparatively, the GeForce 8800 GTX comes with its stream processors clocked at 1350 MHz while the 8800 GTS' stream processors are clocked at 1200 MHz.

The GDDR3 memory of the GeForce 8800 GT comes clocked at 900 MHz -- equal to the memory frequency of the GeForce 8800 GTX. However, the 8800 GT falls short of the 8800 Ultra's memory speed, which is 1080 MHz (2160 MHz effective). NVIDIA guidance states that the GeForce 8800 GT supports the new PCIe 2.0 bus standard. The PCI-Express Special Interest Group claims that the new bus standard yields improvements in bandwidth. High-definition video fans will be glad to hear that the GeForce 8800 GT comes integrated with support for NVIDIA's 2nd generation PureVideo HD engine, which allows H.264 video decoding to be offloaded from the processor onto the video card. HDCP support is also present on all reference designs.
The first reviews can be found on AnandTech, Boot Daily, Chile Hardware, Elite Bastards, FPS Lab, Guru3D (8800 GT SLI tested), Hot Hardware, Neoseeker, TechPowerUp, TechReport, THG, Tweaktown, VR-Zone:
In the end, NVIDIA has taken its time, but offers an exceptional card with the GeForce 8800 GT. For about $230 it comes within 3% of a GeForce 8800 GTX that costs twice as much, the only downsides being a slightly lower memory capacity (512 MB instead of 768 MB) and memory bandwidth 10% lower than that of a GeForce 8800 GTS. In games, however, the GeForce 8800 GT usually outperforms the latter (in its 320 MB version) by 30% without filters, and the GTX is only 12% faster. The transition to the 65 nm process also means many more G92 chips per wafer, even though the chip boasts 754 million transistors. It's small, it consumes less power than a GeForce 8800 GTS 320 MB, and the 8800 GT is also quite quiet despite its single-slot cooling system.
 Gameguru Mania News - Oct,28 2007 - tech
Crysis Demo Performance Analysis - tech
(hx) 05:55 AM EDT - Oct,28 2007 - Post a comment / read (4)
Both PC Perspective and TweakTown have Crysis demo performance analyses:
The Crysis settings system works by only offering resolutions the computer can handle. What's frustrating is that none of the cards here today let us run at our native 2560 x 1600 resolution with any detail settings. That said, given the Ultra was only cracking 50 FPS at 1920 x 1200, we wouldn't expect big numbers at 2560 x 1600. Also, if their system were flawless it wouldn't let you choose past 800 x 600 for the 8600 GTS and HD 2600 XT. While I don't doubt there are people out there with an 8600 GTS saying "Nah, this game runs great at 1280 x 1024 and high settings," it's clear that these people have no idea what smooth performance looks like.

Looking at the performance of the 8800 GTX and 8800 GTS 640MB cards compared to one another, we found some interesting inflections. For instance, at 1024x768 without AA, the 8800 GTS 640MB system outperformed the GTX system. Yes, I know the testing process wasn't exactly the same, so we have to make some broader generalizations, but it would appear that the quad-core CPU that Jeremy used was in fact a factor in overall performance, as the Crytek developers indicated. Another interesting note that came from this weekend: NVIDIA acknowledged that SLI scaling in Crysis is somewhat crippled for the time being. A new driver is going to be released that will help with it, but they say Crytek has a couple of fixes of its own to make before multi-GPU performance is right, so again, we might want to wait for the final retail version of the software to really get into the multi-GPU capabilities of the engine.
 Gameguru Mania News - Oct,26 2007 - tech
TechNews - ForceWare 169.01 Crysis Drivers - tech
(hx) 09:22 PM EDT - Oct,26 2007 - Post a comment / read (4)
 Gameguru Mania News - Oct,24 2007 - tech
TechNews - 8800GT has 9 possible variations - tech
(hx) 09:38 PM EDT - Oct,24 2007 - Post a comment / read (2)
 Gameguru Mania News - Oct,23 2007 - tech
Hellgate London Demo GPU Performance - tech
(hx) 05:48 AM EDT - Oct,23 2007 - Post a comment
AMDZone benchmarked the Hellgate London beta on 12 different video cards to find out how it performs under DirectX 9. Here's a bit:
With 12 cards you have a lot of numbers to look at. What is clear again is that the GeForce 8800s dominate; each is a good deal over 100 frames per second. Next up is the Radeon X1900XTX. The card has been out a while, but 73.7 frames per second is pretty damn impressive. Following it are the 7900s, with the 1950 Pro not far behind; each manages to be well above 50 FPS and quite playable at 1280x1024. Then there is a bit of a drop-off to the 8600s, the 7600GT, and the 2600XT, which range from the low 30s to the low 40s in FPS. At the bottom of the barrel is a once-proud warrior, the heat source that is the 6800GT, which narrowly beats out the budget X1300SE. If you have a 6800GT and you are not running SLI then you really need to be thinking about upgrading.
 Gameguru Mania News - Oct,22 2007 - tech
TechNews - Windows 7 core details - tech
(hx) 09:42 PM EDT - Oct,22 2007 - Post a comment / read (1)
 Gameguru Mania News - Oct,20 2007 - tech
TechNews - TVLinks Shut Down - tech
(hx) 07:16 AM EDT - Oct,20 2007 - Post a comment / read (2)
 Gameguru Mania News - Oct,18 2007 - tech
TechNews - Enable 4GB of RAM with Vista - tech
(hx) 07:44 AM EDT - Oct,18 2007 - Post a comment
 Gameguru Mania News - Oct,17 2007 - tech
UT3 - CPU & High End GPU Analysis - tech
(hx) 10:18 AM EDT - Oct,17 2007 - Post a comment
AnandTech has published a valuable article called "Unreal Tournament 3 CPU & High End GPU Analysis: Next-Gen Gaming Explored". Here's a taster:
We don't often look at single-core performance given how cheap dual-core CPUs are today, but it's important to look at where we've come from over the past couple of years. Going from one to two cores gives us an impressive 60% increase in performance on average; when we look back at our first dual-core processor review, none of our gaming tests showed any performance increase from one to two cores. From 0 to 60% in two years isn't bad at all. The performance improvement from two to four cores isn't anywhere near as impressive, but still reasonable. In our first two tests we see a 9% increase, and the third one gives us a 20% boost, for an average 13% jump in performance. If 3D games follow the same trend that we've seen over the past two years, it'll be another two years before we really see significant performance increases from quad-core processors. If in 2009 we hardly bother with dual-core chips because quad-core is so prevalent, you'll hear no complaining from us.

Quad-core gaming is still years away from being relevant (much less a requirement), but the industry has come a tremendous distance in an honestly very short period of time. We're more likely to have multi-threaded games these days than 64-bit versions of those titles, mostly thanks to the multi-core architecture in both the Xbox 360 and PlayStation 3. Like it or not, much of PC gaming development is being driven by consoles; the numbers are simply higher on that side of the fence (even though the games themselves look better on this side).
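AnandTech's percentages can be sanity-checked with a line of arithmetic; the compounded one-to-four-core figure at the end is my own extrapolation from their two numbers, not something stated in the article:

```python
# Averaging the per-test 2-to-4 core gains quoted above (9%, 9%, 20%).
gains_2_to_4 = [0.09, 0.09, 0.20]
avg_2_to_4 = sum(gains_2_to_4) / len(gains_2_to_4)  # ~0.127, the ~13% quoted

# Compounding the ~60% one-to-two step with the ~13% two-to-four step
# gives the implied one-to-four-core scaling (my extrapolation):
one_to_four = (1 + 0.60) * (1 + avg_2_to_4)         # ~1.80x overall
```

So even with the weak quad-core step included, a 2007 quad-core system delivers roughly 1.8x the UT3 performance of a single core, nearly all of it from the first extra core.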
 Gameguru Mania News - Oct,16 2007 - tech
TechNews - 4TB hard drives by 2011 - tech
(hx) 07:13 AM EDT - Oct,16 2007 - Post a comment
 Gameguru Mania News - Oct,15 2007 - tech
Call of Duty 4 Demo Performance - tech
(hx) 09:23 PM EDT - Oct,15 2007 - Post a comment
The chaps over at FiringSquad gathered a dozen different GPUs ranging from the GeForce 7900 GT and Radeon X1950 Pro all the way up to the 8800 Ultra to see how today's latest high-end cards perform with Call of Duty 4. They have also included SLI/CrossFire results as well. Here's an excerpt:
The demo also gives us a preview of what kind of performance we can expect from the game, and here we saw the GeForce 8800 cards reign supreme, particularly the GeForce 8800 GTX and Ultra. At 1600x1200 with 0xAA/16xAF the GeForce 8800 GTS 640MB trailed the GTX by 19%, with that gap increasing as you crank up the screen resolution. Under the increased demands of 4xAA/16xAF, the GTX pulls further away from the GTS 640MB. Here the GeForce 8800 GTS 640MB also pulls away from its 320MB counterpart. The GeForce 8800 GTS 320MB just doesn't have enough memory to run the game at the settings we chose at high resolutions of 1920x1200 with 4xAA/16xAF. And what about AMD's Radeon HD 2900 XT?

At worst, the Radeon HD 2900 XT ran about 9% slower than the GeForce 8800 GTS 320MB. This occurred at 1600x1200 with 0xAA/16xAF. As you increase screen resolution that gap narrows; by 2560x1600 the card pulled even with the GeForce 8800 GTS 640MB. With AA enabled, the 2900 XT runs faster overall than the GeForce 8800 GTS 320MB and performs anywhere from 12-15% slower than the GeForce 8800 GTS 640MB, although keep in mind that at 2560x1600 with 4xAA/16xAF the cards are only separated by 2.7 FPS in our testing. That's close enough to call it even, particularly considering the variability in our benchmark runs.

Since the Radeon HD 2900 XT is priced to compete with the GeForce 8800 GTS 640MB, some may consider this a decent showing for the 2900 XT, but we think the card is being held up a little by its driver. The X1950 XTX performs awfully close to the Radeon HD 2900 XT. CrossFire scaling needs a little more work as well.

If the rumors are true, the GeForce 8800 GTS 320MB will be replaced shortly by NVIDIA's upcoming G92 GPU. Some leaked documents suggest this card will be outfitted with more stream processors and memory than the GTS 320MB, as well as higher clock speeds: 600MHz on the graphics core (100MHz higher than the GTS today) and 900MHz memory, also 100MHz higher than the GeForce 8800 GTS. The card is expected to utilize a 256-bit memory interface versus the 320-bit interface of today's cards, but the extra stream processors, memory, and higher speeds should be enough to offset the difference.
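Whether the extra speed offsets the narrower bus is easy to put in numbers. This sketch just multiplies the figures quoted above (GDDR3 transfers twice per memory clock); the helper name is my own:

```python
# Back-of-envelope memory bandwidth from bus width and memory clock.
# GDDR3 is double data rate, so effective clock = 2x the quoted MHz.
def bandwidth_gb_s(bus_bits: int, mem_mhz: int) -> float:
    return bus_bits / 8 * (mem_mhz * 2) * 1e6 / 1e9  # bytes/s -> GB/s

gts = bandwidth_gb_s(320, 800)  # 8800 GTS: 320-bit @ 800 MHz -> 64.0 GB/s
g92 = bandwidth_gb_s(256, 900)  # rumored G92: 256-bit @ 900 MHz -> 57.6 GB/s
# Despite the faster memory, the narrower bus gives up about 10% of
# bandwidth, which the extra stream processors and clocks must offset.
```

That roughly 10% deficit is the same figure the 8800 GT reviews above quote against the GeForce 8800 GTS.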
Tim Sweeney On Your UT3/GoW PC - tech
(hx) 10:32 AM EDT - Oct,15 2007 - Post a comment
The Inquirer put Tim Sweeney to the task of giving some general recommendations on what Unreal Tournament 3 and Gears of War PC will need to fly.  When asked about the amount of video memory, here is what Tim stated:
In Unreal Tournament 3 and Gears of War for PC, there is a significant gain in having 512MB of video memory rather than 256MB. So, first and foremost, get at least a 512MB card if you can afford it. If you haven't maxed out your budget, then go for the maximum single-card performance that doesn't require extreme cooling, e.g. buy an entry-level GeForce 8800 over a GeForce 8600.