 Gameguru Mania News - May,07 2006 - tech 
Sunday Tech Reading - Merom against Yonah - tech
(hx) 07:30 AM EDT - May,07 2006 - Post a comment
 Gameguru Mania News - May,05 2006 - tech
The AGEIA-Havok War..(cough.)..Debate! - tech
(hx) 08:08 PM EDT - May,05 2006 - Post a comment / read (12)
Earlier this week, Havok sounded off on physics usage in Ghost Recon: Advanced Warfighter, specifically as it relates to AGEIA's PhysX card. In particular, Havok cited user reports of lower frame rates once PhysX was enabled in the game. In this article AGEIA responds to the FPS reports and clarifies how physics is used in GRAW and on game consoles. Here's an excerpt:
AGEIA: Consider first that PhysX is focused on enabling an entirely new experience in gaming in which literally thousands of objects of different types can move, collide and interact. Up to now, this has not been possible. The goal of PhysX is to enable this new class of complex physics algorithms to be processed within a more than acceptable frame rate. Of course, depending on the game, the PhysX implementation and the rendering capabilities of the system, there may be occasional momentary frame rate impact, but on average, we don't expect this to be significant.

It's unclear how this individual measured frame rate impact; however, we aren't keeping our heads in the sand. We appreciate feedback from the gamer community, and based partly on comments like the one above, we have identified an area in our driver where fine tuning positively impacts frame rate. We made an adjustment quickly and delivered it in a new driver (2.4.3), which is available for download at ageia.com today. That's the beauty of the PhysX solution. A powerful processor is in place now, and a flexible software solution is there to continue improving the PhysX experience for our customers. Buy a PhysX accelerator today and it keeps getting better.
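Since part of the disagreement comes down to how frame-rate impact is measured in the first place, here is a minimal, generic sketch of one way a user might log average frame times with a feature toggled on and off. It is purely illustrative C++ and is not tied to AGEIA's driver, GRAW, or any particular benchmark tool; the frame-loop placeholder and sample count are assumptions.

// Minimal, generic frame-time logger (illustrative only; not AGEIA's or GRAW's code).
// Run once with the feature under test enabled and once with it disabled,
// then compare the averages.
#include <chrono>
#include <cstdio>

int main()
{
    using clock = std::chrono::steady_clock;
    const int kFrames = 1000;      // assumed sample size
    double totalMs = 0.0;

    for (int i = 0; i < kFrames; ++i)
    {
        auto start = clock::now();
        // RenderAndSimulateOneFrame();  // hypothetical placeholder for the game's frame
        auto end = clock::now();
        totalMs += std::chrono::duration<double, std::milli>(end - start).count();
    }

    double avgMs = totalMs / kFrames;
    std::printf("average frame time: %.2f ms (%.1f FPS)\n", avgMs, 1000.0 / avgMs);
    return 0;
}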
TechNews RoundUp - Free AIM phone service - tech
(hx) 09:38 AM EDT - May,05 2006 - Post a comment
 Gameguru Mania News - May,04 2006 - tech
TechNews RoundUp - BFG Tech's PhysX PPU - tech
(hx) 12:00 PM EDT - May,04 2006 - Post a comment
 Gameguru Mania News - May,03 2006 - tech
TechNews RoundUp - PhysX cards with a game - tech
(hx) 08:07 PM EDT - May,03 2006 - Post a comment / read (3)
Havok Sounds Off On Ghost Recon AGEIA Physics - tech
(hx) 06:27 PM EDT - May,03 2006 - Post a comment / read (5)
Game physics software engine company Havok has decided to go on the offensive and take on some claims about the newly released PC version of Ghost Recon Advanced Warfighter, which is the first game to support the AGEIA physics processor. Here are some statements Havok sent over:
  • Havok Physics (on the CPU) is used for all game-play physics in both the multiplayer and single-player PC versions of the game. All persistent collidable objects in the game are simulated using Havok software technology running on the CPU.
  • Havok's logo is on the GRAW PC box, substantiating Havok's use in the game (confirmed by Ubisoft marketing). Havok was also used in recent GRAW releases including the Xbox 360, Xbox, and PS2 SKUs.
  • AGEIA Novodex is said to be used in the single-player GRAW version for added PPU-accelerated effects - at most, AGEIA appears to be used for particle effects - and in no way affects game-play outcome. AGEIA is NOT used in any way in any GRAW SKU other than the PC.
  • From our inspection, differential effects in the GRAW PC game when using the PPU are not significantly obvious - but where they can be observed, additional particles do not appear in volumes greater than hundreds of particles (a range that is typically well within the domain of the CPU/GPU for particles). These observed particle effects are also only particles and not apparently persistent rigid bodies. They pass through the environment after a short time (seconds) at most. User comments back this up: "…to be honest it looks exactly the same with the PPU as it does without it, the only difference is you get the extra blocks/debris; the strange thing is these extra blocks/debris seem to appear unrealistically out of nowhere when you shoot things like the wall, floor etc. It really is like they've just been tacked on just to say *this game supports PhysX*."
  • Consumer reports from users who have already purchased the PPU and GRAW indicate that the PPU "actually slows down the game" in moments when effects are generated that are unique to the PPU. The effects described above appear to be the cause of the slowdown - our observations here using a DELL/PPU confirm this. Also see here. One user comment states: "10-16 FPS slower with hardware PPU, I guess I need another GPU (SLI) to help render the added debris and effects I get from using the PPU. The price of PC gaming just went up again :-(. I can't believe that I have to disable the hardware PhysX card I just paid 200 quid for so that I can play GRAW at an acceptable FPS; to be honest I just feel like giving up on PC gaming these days."
  • AGEIA appears to imply, and consumers conjecture, that the PPU is generating so many objects that the GPU cannot handle the load. Multiple direct tests on the game using NVIDIA and ATI GPUs indicate the GPU has room to spare; in fact, if the PPU is factored out of the game, the particle content generated by the PPU can easily be drawn at full game speed by the GPU. So the introduction of the PPU most certainly appears to be the cause of the slowdown in this case. NVIDIA specifically can technically verify that the GPU is not the cause of the slowdown.
We should stress that Havok is supportive of efforts like GRAW, and Ubisoft specifically is a valued and strong business partner. More generally, Havok is a strong supporter of the PC development community, with over 38 titles shipped to date on the PC using Havok technology. Havok is very enthusiastic about the prospect of additional acceleration for physics in PC games - specifically coming from multi-core CPUs and GPUs, both dual configurations and cutting-edge GPUs targeting both graphics and "GP-GPU" applications.
ATI's 'Chuck' on HDR with AA in GRAW - tech
(hx) 06:28 AM EDT - May,03 2006 - Post a comment / read (6)
When the demo of Ghost Recon: Advanced Warfighter for the PC became available last week, one of the first and loudest complaints was that the game didn't allow for the use of anti-aliasing in any shape or form, leaving users with jaggies galore wherever they cared to look in the game. Thoughts immediately turned to recent developments with Elder Scrolls IV: Oblivion and ATI's special driver that allowed High Dynamic Range lighting and anti-aliasing to be used together in that title. The question on everybody's lips was: could ATI pull off the same trick in Ghost Recon: Advanced Warfighter? Elite Bastards asked the mysterious driver developer known only as 'Chuck'. Here's the quote, directly from the man himself, in full:
The rendering path in G.R.A.W is very different from most games in that it appears to make extensive use of multiple render targets (MRTs). (This is where one draw operation can write different values to different surfaces.)

The DX9 spec doesn't allow multi-sample AA when using MRTs, and our hardware requires that all of the destination surfaces either have AA or not. This means that in order to get AA in G.R.A.W. we'd need to have lots of AA surfaces and perform a ton of AA resolves. The end result would be slow and require much more texture memory. It's not 100% impossible, and I'm not giving up on the possibility, but there is no playable solution right now.
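For readers wondering what "multiple render targets" means in practice, here is a minimal, hypothetical Direct3D 9 sketch of the kind of setup Chuck describes: two render-target surfaces bound at once, so that a single draw operation writes different values to different surfaces. The resolution, formats and overall structure are illustrative assumptions rather than GRAW's actual renderer; note that the targets are created without multisampling, which reflects the DX9 restriction he mentions.

// Minimal, hypothetical D3D9 MRT setup (not GRAW's actual code).
// Assumes an already-initialized IDirect3DDevice9* and a 1024x768 frame.
#include <d3d9.h>

void DrawSceneWithMRT(IDirect3DDevice9* device)
{
    IDirect3DTexture9* rt0 = NULL;   // e.g. lighting accumulation
    IDirect3DTexture9* rt1 = NULL;   // e.g. per-pixel depth or normals

    // Render targets are created with D3DUSAGE_RENDERTARGET and are
    // single-sampled; the DX9 spec does not allow multi-sample AA
    // while multiple render targets are bound.
    device->CreateTexture(1024, 768, 1, D3DUSAGE_RENDERTARGET,
                          D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &rt0, NULL);
    device->CreateTexture(1024, 768, 1, D3DUSAGE_RENDERTARGET,
                          D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &rt1, NULL);

    IDirect3DSurface9* surf0 = NULL;
    IDirect3DSurface9* surf1 = NULL;
    rt0->GetSurfaceLevel(0, &surf0);
    rt1->GetSurfaceLevel(0, &surf1);

    // Bind both surfaces; a pixel shader can now write COLOR0 and COLOR1
    // in a single pass - "one draw operation writing different values
    // to different surfaces".
    device->SetRenderTarget(0, surf0);
    device->SetRenderTarget(1, surf1);

    // ... issue the scene's draw calls here ...

    // Unbind the extra target and release resources when done.
    device->SetRenderTarget(1, NULL);
    surf0->Release();  surf1->Release();
    rt0->Release();    rt1->Release();
}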
 Gameguru Mania News - May,02 2006 - tech
TechNews RoundUp - Mozilla Firefox 1.5.0.3 - tech
(hx) 08:30 PM EDT - May,02 2006 - Post a comment / read (1)
 Gameguru Mania News - May,01 2006 - tech
Tech News Round-up - Most Fuel-Efficient Cars - tech
(hx) 07:32 PM EDT - May,01 2006 - Post a comment / read (8)
 Gameguru Mania News - Apr,29 2006 - tech
TechNews RoundUp - Conroe in July 2006 - tech
(hx) 09:41 AM EDT - Apr,29 2006 - Post a comment / read (7)
The Valve store has 1/4 scale headcrab plushies with posable hooking legs for $24.95.
 Gameguru Mania News - Apr,28 2006 - tech
TechNews RoundUp - Vista Trips Up Dual Booting? - tech
(hx) 08:00 AM EDT - Apr,28 2006 - Post a comment
 Gameguru Mania News - Apr,27 2006 - tech
Thursday Tech Madness - GeForce 7950 GX2 - tech
(hx) 07:14 AM EDT - Apr,27 2006 - Post a comment
 Gameguru Mania News - Apr,26 2006 - tech
TechNews Roundup - MS WGA Becomes Nagware - tech
(hx) 07:56 AM EDT - Apr,26 2006 - Post a comment / read (12)
 Gameguru Mania News - Apr,25 2006 - tech
Oblivion Athlon 64 CPU Performance - tech
(hx) 03:02 PM EDT - Apr,25 2006 - Post a comment / read (2)
Wondering if The Elder Scrolls IV: Oblivion truly takes advantage of dual-core processors? And if so, by how much? Does L2 cache size play a role in performance, and what clock speeds yield the biggest performance improvements? The chaps over at Firing Squad checked how 10 different Athlon 64, FX, and X2 processors perform in comparison to one another. Here's a taster:
Oblivion is definitely capable of taking advantage of the latest dual-core processors from AMD. At low resolution/detail settings, we saw performance improvements of over 15% in some cases. At the same time, however, keep in mind that once you crank up the graphics settings in the game, you shift the load from your CPU to your graphics card - once you're running at 1280x1024 or 1600x1200 with HDR lighting, you're probably not going to see much of a difference in performance regardless of what processor you have installed in your system. In our testing on the previous three pages you saw the Athlon 64 3500+ hanging with the latest and greatest AMD processors: the Athlon 64 FX-60, Athlon 64 X2 4800+, and Athlon 64 FX-57. Oblivion also ran faster with the AMD CPUs with 1MB L2 caches. Here the performance difference wasn't as significant, but we still saw a nice gain of about 3-5% at 800x600.

If you're in the market to upgrade your CPU for Oblivion but don't want to spend a lot of money, obviously the 3500+ delivered an awesome price/performance ratio, particularly in light of the high-res results. If you can afford to spend a little more, around $330 (a little over $100 more than the 3500+), AMD's Opteron 165 CPU is a great value. The Opteron 165 is a dual-core CPU and runs at just 1.8GHz, 400MHz slower than the 3500+, but ships with 1MB of L2 cache per core and is known for being an excellent overclocker, often running at speeds in excess of 2.3/2.4GHz on air cooling. The most remarkable part is that these chips used to sell for just under $300, making them an absolute steal! AMD has wised up and raised the price on these parts, but with Opteron 165 CPUs selling online at right around $320-$330 it's still a terrific value, especially once you factor in its overclocking potential and consider the fact that X2 3800+ CPUs ship at higher clock speeds (2.0GHz) but with only 512KB of L2 cache per core. You can easily make up the 200MHz clock speed difference with the Opteron 165, but there's no way you can ever drop 1MB of cache into an X2 3800+.
Tuesday Tech Madness - 17-inch MacBook Pro - tech
(hx) 06:13 AM EDT - Apr,25 2006 - Post a comment / read (4)
 Gameguru Mania News - Apr,23 2006 - tech
Running Elder Scrolls IV: Oblivion On Older Cards - tech
(hx) 03:19 PM EDT - Apr,23 2006 - Post a comment / read (2)
Oldblivion is a third-party program (download) that allows pre-DirectX 9 graphics cards to run Elder Scrolls IV: Oblivion from Bethesda. This means that Oblivion can run on cards such as the Geforce 3, Geforce 4, Radeon 9200 and so on. Obviously this is not supported by Bethesda Softworks, so download and use at your own risk. The screenshots shown were taken using Oldblivion on a PC with a Geforce 3 standard edition graphics card, a 1.5 GHz AMD Athlon processor and 512 MB of RAM, according to the Oldblivion site.

Supported cards: Geforce 3 series, Geforce 4 series, Geforce FX 5200 series, Geforce PCX 5300, Geforce FX 5500, Geforce FX 5700 series, Radeon 9550, Radeon 9200 series, Radeon 9000 series, Radeon 8500 series, SiS 760 (working with some issues), Geforce FX 5100 GO (working with some issues).
 Gameguru Mania News - Apr,22 2006 - tech
Tech Madness - Windows Live Mail with 2GB - tech
(hx) 08:56 AM EDT - Apr,22 2006 - Post a comment / read (2)
 Gameguru Mania News - Apr,21 2006 - tech
Friday Tech Reading - 750GB desktop HDD - tech
(hx) 05:37 AM EDT - Apr,21 2006 - Post a comment / read (4)
 Gameguru Mania News - Apr,20 2006 - tech
Morning Tech Reading - ATI vs. NVIDIA in Oblivion - tech
(hx) 06:19 AM EDT - Apr,20 2006 - Post a comment / read (4)
 Gameguru Mania News - Apr,19 2006 - tech
Wednesday Tech Reading - 1080p HDTV Explained - tech
(hx) 08:39 AM EDT - Apr,19 2006 - Post a comment / read (2)