The graphics card arms race continues, with AMD today revealing a brand new, dual-GPU, ultra-enthusiast graphics card: the Radeon R9 295X2.
Coming just two weeks after Nvidia's CEO showcased that company's own dual-GPU card on stage at its GPU Technology Conference, AMD has stepped up to show that it too has something for only the most enthusiastic of gamers. Weighing in at a total of 5,632 stream processors across the two GPUs, the 295X2 is one heck of a beast. If that wasn't enough, AMD has also seen fit to strap a closed-loop water cooler to the card, just to ensure no house fires are started.
Under the brushed aluminium shell you'll find a pair of AMD's Hawaii XT GPUs, with 128 ROPs and 8GB of GDDR5 to boot, split between two 512-bit memory buses. If that wasn't enough to send you into a dizzy spin, the card also clocks higher than the R9 290X, at 1,018MHz, making it the first dual-GPU card to run faster than its single-GPU brethren.
PC Gamer were able to take one for a quick spin, spitting out BioShock Infinite at 58FPS at a 3840 x 2160 resolution. At a lower res of 2560 x 1600, the dual-GPU card topped out at 100FPS in the same test, giving it a rather considerable lead on what Nvidia currently offers. Of course, the GPU arms race isn't static, and with the competition planning on releasing the Titan Z soon enough, the race for craziest graphics card performance may once again reignite.
AMD's R9 295X2 will be retailing at USD$1,500, so expect the usual Australia tax to hike that price up closer to the $2,000 mark. Eager? Let us know your thoughts in the comments.
Posted 12:10pm 09/4/14
For any other reason though, buying 2 separate cards provides better performance per dollar, and plenty of flexibility (add a 3rd or sell one off as you require).
Posted 06:37pm 09/4/14
Source?
Basic electrical maths: a 290 is a 250 watt GPU stock, so a 295 would run at 500 watts under load. A PCI-e 8+8+6 pin layout, plus the slot itself, gives you 450 watts of power handling.
Knowing this, how can a 295 be faster than 2x 290's?
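The arithmetic behind that comment can be sketched out like this (the per-connector figures are the standard PCIe ratings; the function name and the 8+8+6 layout are just illustrative, not the 295X2's actual board design):

```python
# Rough PCIe power-budget check, following the reasoning above.
# Standard PCIe ratings: 75 W from the slot, 75 W per 6-pin
# connector, 150 W per 8-pin connector.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_power_budget(eight_pins: int, six_pins: int) -> int:
    """Total rated power available to the card, in watts."""
    return SLOT_W + eight_pins * EIGHT_PIN_W + six_pins * SIX_PIN_W

# The hypothetical 8+8+6 layout from the comment:
print(board_power_budget(eight_pins=2, six_pins=1))  # 450

# Two stock 250 W GPUs would want around:
print(2 * 250)  # 500
```

On those standard ratings the commenter's sums check out: 500 W of demand against 450 W of rated supply, which is the gap the thread is arguing about.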
Posted 11:19pm 09/4/14
Lol, here I am thinking 8+8+6 isn't even enough and they're only using 8+8? In before house fire.
Posted 04:38pm 10/4/14
If you are talking about BTC, the hashrate is way too high now for something like this; it would actually cost you money to mine, because the power usage to mining return wouldn't work out.
but F*** yeah, finally a card out that I would be willing to upgrade my 5970 OC from.
Posted 05:25pm 10/4/14
Actually as we enter the 4K resolution era, we're going to need cards like this as the norm.
Posted 07:10pm 10/4/14
I was referring to Litecoin. Yes, BTC is long gone for any card. About 6 months ago I worked out my 900W system would need to draw 10W (yes, 10 watts) to make me only $30 every 3 months if mining BTC.
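The back-of-envelope calculation these mining comments are doing is just daily revenue minus daily electricity cost. Here's a minimal sketch of it; all the input figures below are placeholders for illustration, not real market or hardware numbers:

```python
# Rough mining-profitability check: coin revenue per day
# versus electricity cost per day. Placeholder inputs only.

def daily_profit(revenue_per_day: float,
                 power_draw_w: float,
                 electricity_per_kwh: float) -> float:
    """Net profit per day in dollars (negative = losing money)."""
    cost_per_day = (power_draw_w / 1000) * 24 * electricity_per_kwh
    return revenue_per_day - cost_per_day

# e.g. a hypothetical 900 W rig earning $0.33/day at $0.25/kWh:
print(round(daily_profit(0.33, 900, 0.25), 2))  # -5.07
```

With numbers like those, the rig burns about $5.40 of power a day against $0.33 of coin, which is exactly the kind of result that killed GPU mining of BTC.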
I'm a strong disbeliever in this. An immense amount of man-hours goes into 'cheap tricks' to give you awesome visuals at a small computational price. If devs could go balls to the wall on a game, we'd need a supercomputer to run it.
AA is a great example. When it first came out, it was basically supersampling (raising the resolution internally on the card, doing the rendering, then compressing it down to the standard resolution), and all jaggies would be eliminated, all the while sharpening the textures and the image in general. This was computationally expensive, so MSAA, TXAA, FXAA (etc) were all devised to cut various corners to give you what's visually close to proper AA, but uses much less power.
Since then, supersampling hasn't commonly been found in games. However, as we move into the future, where cards are becoming orders of magnitude more powerful, the disparity between top-end and 3-4 year old cards grows larger. As a result, devs are starting to include supersampling in their options so high-end users can still put their card under full load and get something out of it, while at the same time the game scales down for low/mid-end users with no more extra work from the dev than if they had to make adjustments for post-processing, textures, shadows, etc.
Edit: I forgot to add just above: supersampling is now making its way back into games, such as Arma 3 and BF4, both of which have a slider to adjust your rendering resolution from a default of 100% (let's say 1080p) to 200% (which, with a 1080p screen, is actually 4K resolution internal to the card).
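The render-scale slider described above boils down to two steps: render at scale × the output resolution, then downsample back to screen size. A minimal sketch of both (the box-filter averaging here stands in for whatever resolve filter a real engine uses; function names are illustrative):

```python
# Sketch of a render-scale slider: compute the internal render
# resolution, then downsample the oversized image back to screen
# size by averaging blocks of pixels (a simple box filter).

def internal_resolution(width: int, height: int, scale_pct: int):
    """Resolution the card actually renders at for a given slider %."""
    return (width * scale_pct // 100, height * scale_pct // 100)

def box_downsample(pixels, factor):
    """Average factor x factor blocks of a 2D grid of grey values."""
    out = []
    for y in range(0, len(pixels), factor):
        row = []
        for x in range(0, len(pixels[0]), factor):
            block = [pixels[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A 200% slider on a 1080p screen renders internally at 4K:
print(internal_resolution(1920, 1080, 200))  # (3840, 2160)
```

Averaging the oversized image back down is what smooths the jaggies: each screen pixel ends up as a blend of several rendered samples instead of a single hard-edged one.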
With this all in mind, I think what holds us back is the fact that not everyone upgrades straight away. If we 'all' had these high end beasts, the devs would certainly cater for it - but until then it's extra work for no monetary return as we're most likely going to buy it anyways.
Posted 07:27pm 10/4/14
But we struggle enough with input lag on big TVs so we're a long way off.