Well, it's here - our full NVIDIA GeForce RTX 3080 Founder's Edition review. NVIDIA's RTX 30 series debut, which we put to the test at 4K and 1440p across a wide range of games. And we get it to trace a few rays too. The results? Well, that headline should give you an idea of how it stacks up against the RTX 2080 and even the RTX 2080 Ti.
A preview.
Any review or deep dive into a new piece of tech, especially a graphics card, is an ambient occlusionary tale full of whimsical numbers, charts, and talk about frames, games, and names. Like Ampere, Turing, and GDDR6X. Which we'll get to – that is, if you can resist the temptation to scroll. Go on, resist.
The RTX 3080 story is nuanced, varied, but always impressive. A true crank-up-the-details 4K powerhouse, the sort of generational leap you'd expect after three or four years, not two.
Okay, okay. For the impatient ones out there, the RTX 3080 is around 36% faster than the RTX 2080 Ti (the previous $2,000 AUD flagship from NVIDIA) when it comes to 4K game performance. And it's 50-60% faster when you factor in ray tracing. It blows the RTX 2080 SUPER, the 3080's like-for-like comparison in terms of price point and naming, away.
Ahem. Back to the show.
Our Full NVIDIA GeForce RTX 3080 Founder's Edition Review
Posted 04:27am 17/9/20
Posted 07:28am 17/9/20
The 3080 is up to about twice as fast as the 2080, more likely about 80% faster. It's up to 40% faster than the 2080 Ti, and much cheaper.
It's worth noting that lots of benchmarks have improved across the board since the RTX 20-series launch, as CPU and PCIe bus bottlenecks have actually started to matter across these generations. Be careful, because you may get more bang for your buck from a CPU/motherboard upgrade while sitting on, say, a 2070, if you've ignored that side for a few generations.
I've decided to sit this launch out and wait for a Ti or a 3090 partner card. There's just too much hype and real value here, which means stocks will be low and street prices will be spectacular for a while.
Posted 09:11am 17/9/20
I assume this means it's just not powerful enough to do it? Is there anything available yet that can do it, or is that still a future hardware thing?
Posted 09:23am 17/9/20
Doesn't bother me too much if I don't snag one as I'm waiting on the Zen 3 announcement before deciding on a system to build, but still...
Posted 10:06am 17/9/20
I'll expand on that to make it clearer, though.
Posted 11:21am 17/9/20
Back of the envelope calc (RTX 3080 FE TDP 320W vs RTX 2080 FE TDP 225W):
~100 more watts used for 10 hours a day is an extra ~1kWh per day; at 30c per kWh, that's over $100 a year more on a power bill than the 2080.
Maybe it doesn't get anywhere near those numbers unless you're crypto mining or running Crysis the whole time, but it still seems like a huge jump for a single generation.
I'd be interested in seeing some kind of deep dive into how many kilowatt-hours an average PC versus the new-gen games consoles might use in a year of average use.
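For anyone who wants to poke at the assumptions, here's that same back-of-the-envelope maths as a quick Python sketch. The 10 hours a day at full TDP and 30c per kWh are just the guesses above, not measured figures:

```python
# Rough running-cost comparison, RTX 3080 FE vs RTX 2080 FE.
# Assumes both cards sit at their full TDP for 10 hours a day and
# electricity costs 30c per kWh -- illustrative numbers only.

TDP_3080_W = 320
TDP_2080_W = 225
HOURS_PER_DAY = 10
PRICE_PER_KWH = 0.30  # AUD

extra_watts = TDP_3080_W - TDP_2080_W                      # ~95 W
extra_kwh_per_day = extra_watts * HOURS_PER_DAY / 1000      # ~0.95 kWh/day
extra_cost_per_year = extra_kwh_per_day * 365 * PRICE_PER_KWH

print(f"Extra energy: {extra_kwh_per_day:.2f} kWh/day")
print(f"Extra cost:   ${extra_cost_per_year:.0f}/year")
# -> roughly 0.95 kWh/day and ~$104/year if it really ran flat out that long
```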
Posted 01:31pm 17/9/20
Maybe?
Posted 02:39pm 17/9/20
We're talking about a market segment that commonly straps multiple graphics cards together and installs complex aftermarket liquid cooling alongside delicate electronics, just so the magic smoke stays on the inside.
Posted 06:05pm 17/9/20
It's the magnitude of the TDP increase that I found to be remarkable. The 1080 Founder's was 180W, the 2080 was 225W, and the 3080 is 320W.
There's obviously a huge performance leap too, but I can't think of any comparable leap between annual computer hardware iterations that made a... 42%? jump in power consumption. Historically, I thought it had only been the "Titan"-style superclocked versions that punched that high, never the flagship products, but I could be wrong.
The 'maybe' in my previous post was just me having no idea about the kind of numbers a 2080 and 3080 would run at under average load, but I still figured that with those max values it was feasible the real-world difference could be in the ballpark of an extra 1kWh a day for a typical user. And if just upgrading a GPU can add $100 a year to an energy bill, that seems worth talking about.
I'm curious whether it's mostly just because it's early days for their 8nm fabrication and the TDP of the revisions will level out quickly as it matures, or whether we might continue to see similar jumps in power consumption going forward. Also, whether it means it's going to take longer than usual to get 3080s into laptops.
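If anyone wants to sanity-check that 42% figure, here's a tiny sketch of the generation-on-generation jumps using the Founder's Edition TDPs quoted above:

```python
# Generation-on-generation TDP jumps for the xx80 Founder's Edition cards,
# using the figures quoted above (1080: 180W, 2080: 225W, 3080: 320W).

tdp = {"GTX 1080": 180, "RTX 2080": 225, "RTX 3080": 320}

cards = list(tdp.items())
for (prev_name, prev_w), (cur_name, cur_w) in zip(cards, cards[1:]):
    jump = (cur_w / prev_w - 1) * 100
    print(f"{prev_name} -> {cur_name}: +{jump:.0f}%")

# GTX 1080 -> RTX 2080: +25%
# RTX 2080 -> RTX 3080: +42%
```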
Posted 11:36am 18/9/20
I've ordered a Model# RC-DP38K 8Ware Ultra 8K DisplayPort v1.4 Cable 3M from PC Case Gear because, yes, I am future-proofing my setup. I currently use a KOGAN 50" TV screen for my monitor (and so I can watch TV without leaving my chair); I can't afford an 8K TV at the moment (some of them cost more than a CAR 8( ). I'm so getting:
Corsair AX1600i Digital Titanium Modular 1600W Power Supply
MSI GeForce RTX 3090 Gaming X Trio 24GB
What screws me up is literally what the difference is between the
MSI GeForce RTX 3090 Gaming Trio 24GB
MSI GeForce RTX 3090 Gaming X Trio 24GB
They are identical, I tell you:
https://www.msi.com/Graphics-card/GeForce-RTX-3090-GAMING-X-TRIO-24G/Specification
https://www.msi.com/Graphics-card/GeForce-RTX-3090-GAMING-TRIO-24G/Specification
So what gives?
Posted 05:10pm 18/9/20
Even if you're going SLI, 1600W is probably 600 more watts than you'll need; 1200W if you want to feel extra safe. I recently saw a YouTube video of a Ryzen 3950X and 2080 Ti running stable on a 450W PSU. That's still lower than anyone would recommend, but 1600W is just a waste.
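For a rough sense of where that comes from, here's a quick sketch of a single-3090 power budget. The per-component numbers are ballpark guesses for illustration, not measurements; even adding a second ~350W card for SLI keeps you comfortably under 1200W:

```python
# Ballpark peak power budget for a single-GPU RTX 3090 build.
# All component figures are rough guesses for illustration only.

peak_draw_watts = {
    "RTX 3090":              350,  # board power limit
    "CPU (e.g. 3950X)":      150,  # boosting hard
    "motherboard + RAM":      60,
    "storage + fans + misc":  50,
}

total = sum(peak_draw_watts.values())
generous_psu = total * 1.5  # ~50% headroom on top of peak draw

print(f"Estimated peak draw: ~{total} W")
print(f"PSU with big headroom: ~{generous_psu:.0f} W")
# -> ~610 W peak; a ~900 W PSU is already generous, so 1600 W is overkill
```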
Posted 12:10pm 22/9/20