(c) 1998-2025 Gameguru Mania
Gameguru Mania News - Dec,15 2024
ARC B580 vs RTX 4060 - Test in 10 Games - tech
(hx) 12:48 PM CET - Dec,15 2024
Intel ARC B580 12GB vs GeForce RTX 4060 8GB l 1080p
last 10 comments:
Sabot
(05:09 PM CET - Dec,15 2024 )
Are we looking at the same cards here?
https://gpu.userbenchmark.com/Compare/Nvidia-RTX-4060-Ti-vs-Intel-Arc-B580/4149vsm2371910
Because that thing has nothing to shout about.
Why 1080p? I play over 90% of mine at 1440, inc Cyberpunk, and ETS2 at over 100/120fps with everything maxed.
heretic
(06:22 PM CET - Dec,15 2024 )
https://www.digitaltrends.com/computing/intel-arc-b580-vs-nvidia-rtx-4060/#dt-heading-specs-and-pricing
Tom
(06:39 PM CET - Dec,15 2024 )
Sabot> Are we looking at the same cards here [...]
Because everything looks good at 1080 stats-wise... this is how marketing works. Granted, per the Steam survey results, at least half the gaming population on Steam is still playing at 1080p... For a non-serious gamer, or someone into The Sims or other lesser games, I'm sure the Intel card would suit.
gx-x
(07:15 PM CET - Dec,15 2024 )
"NVIDIA GeForce RTX 4060: The starting price is approximately €329 as per recent announcements"
...yeah. Also, Intel's card scales better from 1080p to 1440p than any other brand's cards. Look it up. Also, you can be sure that Intel's newer drivers will only improve the card. I haven't seen AMD or nVidia improve anything with new drivers for over a decade.
Sabot
(07:18 AM CET - Dec,16 2024 )
Tom> because everything looks good at 1080 stats wise.. this is how marketing works. [...]
Of course. The carrot to buy the most expensive cards (the "at 1080… you can play
[email protected]
,000fps with this card" - they never *asked* which Doom though :wink: - spiel) when you should've just stayed with your 10-year-old GPU.
At least I put in the homework before I built my PC.
Hence why who gives a fuck about a £600 AMD X3D CPU if you're playing at 1080... Yeah, it's a batshit world out there. Who's kidding who?
:lol:
Sabot
(07:32 AM CET - Dec,16 2024 )
gx-x> "NVIDIA GeForce RTX 4060: The starting price is approximately €329 as per recent announcements" [...]
Well, so DLSS, FG and FSR don't count?… The whole point of DLSS/FSR is to have this card perform like a card or two above it, or to improve the quality of games in general. Scaling.
That's a monumental difference from years ago.
BTW, you can scale all you want. It's how it looks on *your* monitor that counts to you individually.
Sabot
(07:39 AM CET - Dec,16 2024 )
heretic>
https://www.digitaltrends.com/computing/intel-arc-b580-vs-nvidia-rtx-4060/#dt-heading-specs-and-pricing
Pricing is a biggie. It's about £130 cheaper than my Ti. But it's a budget card, and with budget comes support, and that is reliant on sales…
Sabot
(07:54 AM CET - Dec,16 2024 )
heretic>
https://www.digitaltrends.com/computing/intel-arc-b580-vs-nvidia-rtx-4060/#dt-heading-specs-and-pricing
What about FreeSync or G-Sync?
I use it a lot, and by god does it improve detail in games as well as being milky smooth.
People buy what they want or don’t research and that’s how the world goes round.
In the 70s it was fondue sets and soda streams. I see they are back around in shops today :lol:
gx-x
(08:46 AM CET - Dec,16 2024 )
Sabot> Well so DLSS,FG,FSR doesn’t count….the whole point of DLSS/FSR is to have this card perform like it was a card or two above it [...]
We are not talking about the same thing. I am talking about going up from 1080p to 1440p: Arc loses less performance than the others. You are talking about smearing technology that also introduces something akin to input lag and should be avoided whenever possible, including XeSS.
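gx-x's scaling claim is easy to state as arithmetic: 1440p pushes about 1.78x the pixels of 1080p, and the question is what fraction of its 1080p frame rate a card keeps after that jump. A quick sketch, with the frame-rate figures invented purely for illustration (not taken from any real benchmark):

```python
# Sketch of the "scaling" point: how much of its 1080p frame rate a card
# retains at 1440p. The fps numbers below are made up for illustration.

def retention(fps_1080p: float, fps_1440p: float) -> float:
    """Fraction of 1080p performance retained at 1440p."""
    return fps_1440p / fps_1080p

# 1440p renders ~1.78x the pixels of 1080p, so some drop is expected:
pixel_ratio = (2560 * 1440) / (1920 * 1080)
print(round(pixel_ratio, 2))  # 1.78

# Hypothetical cards: A keeps 75% of its frame rate, B only 65%.
print(round(retention(100, 75), 2))    # 0.75
print(round(retention(110, 71.5), 2))  # 0.65
```

A card that retains a higher fraction "scales better" in this sense, even if its absolute frame rates are lower.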
Sabot
(01:04 PM CET - Dec,16 2024 )
gx-x> We are not talking about the same thing. I am talking about going up from 1080p to 1440p, Arc looses less performance than others [...]
Even the world's most powerful GPU, the 4090, has to use DLSS in Cyberpunk.
Remember, it's subjective.
The point of it is I'm playing Cyberpunk with maximum settings at 1440 at 120fps using DLSS. Sometimes you see the DLSS at work, but what do you want?
To be forced to say "your card doesn't support this game, you need to upgrade"? Like the consumerism days? Course you don't!
Prior to DLSS and FSR we would be playing at the lowest resolution possible, with shit nonexistent detail and 40fps if you were lucky, because you came in at the bucket-end entry for your GPU.
I don't care as long as I can enjoy my gameplay, which I do. I can enhance any game with the Nvidia control panel, as well as using (or not using) DLSS on 'ultra performance' or 'balanced' or whatever.
Using FreeSync negates lag completely. My monitor is 165Hz at 1440 and below, and I use custom profiles for my games.
The Arc will live or die by what it provides the gamer in its support and sales.
Everyone is after a piece of the pie. I will have this 4060 Ti for a few years yet; by then the prices above will have dropped and I will be buying into something like a 4080.
My expectations are reasonable :wink:
On the subject of drivers and support:
My EVGA GTX 1060 SC 6GB just had its last official driver after nearly *9 YEARS* of full support on my Win10 machine.
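The 'ultra performance'/'balanced' presets mentioned above map to fixed per-axis internal render scales; the commonly documented DLSS factors are roughly Quality 0.667, Balanced 0.58, Performance 0.5, Ultra Performance 0.333. A minimal sketch of the resolution the GPU actually renders before upscaling, treating those factors as approximate:

```python
# Internal render resolution for DLSS presets at a 1440p output.
# Scale factors are the commonly cited per-axis values; treat them
# as approximate rather than exact.

DLSS_SCALES = {
    "quality": 0.667,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Resolution actually rendered before the upscale to (out_w, out_h)."""
    s = DLSS_SCALES[preset]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(2560, 1440, "performance"))  # (1280, 720)
print(internal_resolution(2560, 1440, "quality"))      # (1708, 960)
```

So 'performance' at 1440p is rendering a quarter of the output pixels, which is where the frame-rate headroom comes from.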
gx-x
(02:18 PM CET - Dec,16 2024 )
Using FreeSync or G-Sync does not and cannot negate the lag introduced by frame generation, be it FSR or whatever nVidia calls it, because the frames that are generated are not seen as out of sync. They are literally "predictions" that take time to complete and be inserted between the engine output and the GPU's output to the monitor.
I am guessing you are talking about downscaling (or, as the money-grabbers like to call it, "upscalers"), where the game is rendered at a lower resolution than your native resolution and then upscaled to native using "AI", with the most preposterous claims about being "the way it's meant to be played" or some similar bullshit. The way it's meant to be played is at your monitor's native resolution. Since my hardware can't run 1440p well in all games, I am still using a 1080p monitor and don't use "upscaling" ;)
As for Cyberpunk, it runs at 70-80fps at high on my PC, no RT. It looks just fine to me. Works well too; the game is slow-paced anyway.
edit:
The talking points should be about the new Intel, not whether someone likes fake frames and upscaling.
My point was, since it's a part of my daily job, that Intel added performance to their GPU lineup with each driver release. They've fixed almost all of the problems they've had, and so on, and I don't see that changing for their new GPU lineup. That's something AMD and nVidia cannot do anymore, since there is no room left for it, plus the general laziness that came with the duopoly.
And yes, Intel does support VRR aka "FreeSync".
PS. I am not personally interested in Intel's GPU offering since I already have a faster GPU. But I do applaud a third market player, because at times all you can get for $250 is an nV 1650, an AMD 6500, or some disaster like that.
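gx-x's latency objection comes down to frame timing: an interpolated frame can only be shown once the next real frame exists, so the newer real frame is held back for roughly one real frame-time, plus the cost of generating the in-between frame. A rough back-of-envelope model; the 3 ms generation cost is an assumed figure, not a measurement:

```python
# Rough model of latency added by frame interpolation: the newer real
# frame is held back ~one real frame-time so the generated frame can be
# shown in between, plus the time to generate it. gen_cost_ms is an
# invented illustrative value, not a measured one.

def added_latency_ms(real_fps: float, gen_cost_ms: float) -> float:
    frame_time_ms = 1000.0 / real_fps
    return frame_time_ms + gen_cost_ms

# At 60 real fps (so ~120 fps displayed with generation on):
print(round(added_latency_ms(60, 3.0), 1))  # 19.7
```

VRR only changes when a finished frame is scanned out; under this model it cannot recover the hold-back time, which is the point being made above.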
Sabot
(05:26 PM CET - Dec,16 2024 )
As I said, you're talking 'subjective'. No, I don't use DLSS on all my games; my GPU and CPU are plenty powerful enough to run with FreeSync and without rendering tricks on.
As I've said, and you blindly missed: even the 4090 has to use DLSS to get a smooth ride in Cyberpunk!
Yeah, you have a bee in your bonnet lol. Get used to it, it's called 'I'm saving money' and not being taken for a ride.
You're fixated on rendering, upscaling, downscaling, which AMD does as well, so you can get that into your quotes. Nvidia indeed...
I guess I play games; you play the 'cut my nose off to spite my face' game.
Definitely a generational thing, all this.
To think back 35 years we never had or bothered about an FPS counter; we played by the Mk1 eyeball and never complained :roll:
I've played 1080 for 15 years and only last year did I build my newest system with a 1440 monitor, because I deserved it.
It's not some crime to be able to play Cyberpunk on an entry/mid-level PC and be satisfied that it chucks Cyberpunk out at 120fps with everything on max. It's not boasting either, since everybody with an RTX 2060 upwards is chucking DLSS to get more LIFE out of their cards, to make them last longer.
The WHOLE point, which you conveniently miss.
I take it you own the Arc, since you're on the 'my cake tastes better than your cake' warpath?
Listen, I'm 61; I've been through the whole thing, from the very, very beginning of graphics cards that cost a small fortune to the entry of the first 3D/3DFX.
Every single year I was spending £2,500+ at Xmas on a NEW build because it was superseded at incredible pace.
I'm fucking glad to leave it to the nobs to finance the 'I must have the best or I hate Nvidia' crowd. I know which side my bread is buttered and I know when to call it a day.
I buy what is value, what is going to last me, and not what spites my face.
:santa: Merry Christmas
gx-x
(08:09 PM CET - Dec,16 2024 )
Sabot> As i said, your talking 'subjective' [...]
You think I can't afford better shit? Dude... I just don't care. I play games a couple of times a month these days. Few are worth playing; even fewer are worth spending a bunch of money on.
Merry Christmas to you too!
Related news:
RTX 5060 Ti 8GB: Even Slower Than The Arc B580! - tech (May 06 2025)
We Were Accused of Forgery: 'Fake' Arc B580 Benchmarks - tech (Apr 09 2025)
9800X3D vs. R5 5600, Old PC vs. New PC: Intel Arc B580 Re-Review! - tech (Jan 10 2025)
Arc B580 Overhead Issue, Ryzen 5 3600, 5600, R7 5700X3D & R5 7600: CPU-Limited Testing - tech (Jan 04 2025)
Intel Arc B580 Overhead Issue! Upgraders Beware - tech (Jan 03 2025)
The Intel ARC B580 is Broken...on Older Systems - tech (Jan 03 2025)
Intel Arc B580 Review, The Best Value GPU! - tech (Dec 12 2024)
Intel Arc Battlemage B580 & B570 GPU Specs, Price, & Release Date - tech (Dec 04 2024)
Intel Selling Offices, B580 GPU Leak, 5090 Imminent, & AMD 9950X3D Rumors - tech (Nov 27 2024)