Post by Dan @ 12:26pm 21/10/13 | 20 Comments
In addition to the ambitious G-Sync tech, the other big announcement to come out of Nvidia at their Montreal event over the weekend was an October 28th, 2013 launch date for ShadowPlay, the company's solution for GeForce owners who want to record video output from PC games.

ShadowPlay differs from long-popular PC recording applications like Fraps and OBS in that it uses the onboard H.264 encoder on recent GeForce cards (the GTX 600 and 700 series) to record full-frame video with very little performance overhead. While activated, it automatically records all gameplay, keeping the last "ten to twenty minutes" in a temporary buffer that players can save to disk on demand.
ShadowPlay leverages the H.264 hardware encoder found on GeForce GTX 600 and 700 Series graphics cards to record 1920x1080 video at 60 frames per second. All DirectX 9 and newer games are supported. In comparison to software solutions that hammer the CPU, ShadowPlay's hardware approach has an approximate 5-10% performance impact when using the max-quality 50 Mbps recording mode, and by saving to automatically-encoded and compressed H.264 .mp4 files, ShadowPlay avoids the disk-thrashing, humongous, multi-gigabyte files associated with other gameplay recording applications.
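Nvidia hasn't detailed how the "shadow" buffer is implemented, but conceptually it's just a bounded queue of encoded video chunks that drops the oldest data as new footage arrives and flushes to disk on demand. A minimal sketch of the idea in Python (the class name, chunk length, buffer size and output path are illustrative assumptions, not ShadowPlay's actual internals):

from collections import deque
from datetime import datetime

class ShadowBuffer:
    """Keep only the most recent few minutes of encoded video in memory."""

    def __init__(self, max_seconds=1200, chunk_seconds=2):
        # Roughly 20 minutes of 2-second H.264 chunks; both values are illustrative.
        self.chunks = deque(maxlen=max_seconds // chunk_seconds)

    def push(self, encoded_chunk: bytes):
        # Once the deque is full, the oldest chunk is silently discarded.
        self.chunks.append(encoded_chunk)

    def save(self, path=None):
        # "Save to disk on demand": flush whatever is currently buffered.
        path = path or datetime.now().strftime("shadow_%Y%m%d_%H%M%S.h264")
        with open(path, "wb") as f:
            for chunk in self.chunks:
                f.write(chunk)
        return path

The real application muxes to .mp4 and pulls its chunks from the GPU encoder rather than the CPU, but a drop-oldest queue like this is why the last ten to twenty minutes are always available without the disk filling up.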

When Twitch streaming arrives in a future GeForce Experience release, this minimal performance impact will ensure competitive multiplayer matches aren't compromised by high CPU usage or hard disk thrashing.

If you prefer to save every single moment, enable Manual Mode with the rebindable Alt + F9 hotkey, which acts like traditional gameplay recorders and saves your entire session to disk. On Windows 7, recordings cap out at 4GB per file due to OS limitations, but on Windows 8 and 8.1 file size is limited only by available hard disk space, enabling hours of footage to be recorded to a single file.
This time next week, ShadowPlay will be available as part of the GeForce Experience application that comes bundled with Nvidia's GeForce drivers on Windows PCs. Check out the trailer below for a quick overview, and head over to the GeForce blog for a longer explanation.




Latest Comments
ph33x
Posted 12:33pm 21/10/13
Another awesome feature. My FRAPS runs about 850mbps across the LAN to a server; on-the-fly compression will be great.

-----------------------

Not sure if it was mentioned, but the GTX 780 Ti GPU has also been announced. It's slated to have 5GB of GDDR5 over a 320-bit interface. Pretty much a "Titan LE".

There have also been hints that non-reference PCBs are being allowed by nVidia, and a Classified edition is in the works!

Coming mid-November, with reviewers getting them shortly.
BladeRunner
Posted 01:20pm 21/10/13
Will it still work for people without 600 & 700 nvidia cards?
ph33x
Posted 01:35pm 21/10/13
Will it still work for people without 600 & 700 nvidia cards?

No. You'll need to use a software recorder such as FRAPS. The hardware encoder was only built into GTX 6xx and 7xx cards.
trog
Posted 01:38pm 21/10/13
Man this is awesome!
BladeRunner
Posted 01:41pm 21/10/13
Eorl and I will just have to wait til next upgrade.
slamma
Posted 02:23pm 21/10/13
Does this include the 680 and 780 M's? Cheers
ph33x
Posted 02:39pm 21/10/13
Does this include the 680 and 780 M's? Cheers

These GPUs would support it.

600's: GTX 650, GTX 650M, GTX 650 Ti, GTX 660, GTX 660 Ti, GTX 670, GTX 680, GTX 680M, GTX 680MX, GTX 690

700's: GTX 760, GTX 760M, GTX 765M, GTX 770, GTX 770M, GTX 780, GTX 780M, GTX TITAN

I believe the H.264 hardware encoder was built into all GeForce 6xx and 7xx models, but these are all the GTX (mid and high-end) models.
Dan
Posted 04:42pm 21/10/13
Yeah, my understanding is that it's built into the Kepler architecture, so if your card is packing Kepler, then you're good.

Some of the 600 series mobile cards were running on the older Fermi tech, which doesn't have the onboard H.264 encoder, so they won't support ShadowPlay.
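If you want to script the check Dan describes, the reported GPU name is enough for a rough guess. Here's a quick sketch using the nvidia-smi utility that ships with the GeForce driver; the model list is lifted from ph33x's post above and is an assumption rather than an official support matrix (name matching won't reliably catch Fermi-based 600-series mobile parts):

import subprocess

# GTX 600/700 series models listed earlier in the thread (not exhaustive).
SUPPORTED_MODELS = {
    "GTX 650", "GTX 650M", "GTX 650 TI", "GTX 660", "GTX 660 TI",
    "GTX 670", "GTX 680", "GTX 680M", "GTX 680MX", "GTX 690",
    "GTX 760", "GTX 760M", "GTX 765M", "GTX 770", "GTX 770M",
    "GTX 780", "GTX 780M", "GTX TITAN",
}

def likely_supports_shadowplay():
    # Ask the driver for the installed GPU name(s), e.g. "GeForce GTX 780".
    names = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=name", "--format=csv,noheader"],
        text=True,
    ).upper()
    return any(model in names for model in SUPPORTED_MODELS)

if __name__ == "__main__":
    print("ShadowPlay-capable (probably):", likely_supports_shadowplay())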
copuis
Posted 04:55pm 21/10/13
Another awesome feature. My FRAPS runs about 850mbps



anyone else read that as my FAPS run about 850mbps

masturbational beats per second
ph33x
Posted 04:58pm 21/10/13
masturbational beats per second

Mega bats per second.
copuis
Posted 05:00pm 21/10/13
Mega bats per second.


that fits better at that rate
Eorl
Posted 05:06pm 21/10/13
Eorl and I will just have to wait til next upgrade.

Slowly the 570 is being cast out of all the hip and cool features.
Tollaz0r!
Posted 05:15pm 21/10/13
So Nvidia are doing all they can to capture the PC market; they need to, given they're not in the PS4/Xboned.

So far they are doing well. They still need to do something to convince people that Nvidia cards are/will be better than ATI's offerings on PC, as people at the moment seem to favour ATI based on the idea that games will be better optimised for ATI cards due to them being used in the PS4/Xboned.
ph33x
Posted 06:41pm 21/10/13
The problem is we're moving into 4K now, so we're going to see lots of tests tailored to suit each card. As I see it, nVidia is winning on core power, but AMD wins with its 512-bit memory bus.

E.g.: the best way to make the new AMD card look better in every single game test is to test with Eyefinity or 4K "but with AA turned on" so the VRAM usage skyrockets. Because NV cards use a 256/320/384-bit memory bus and the AMD card uses a 512-bit memory bus, the AMD card will always come out on top due to higher memory bandwidth. But that's only once they actually hit the memory bandwidth limit. If you've ever run out of VRAM bandwidth (common on cards with double the RAM of the reference design) you'll know it sucks balls bigtime.
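For anyone who wants the arithmetic behind the bus-width argument: peak memory bandwidth is just the bus width in bytes multiplied by the effective memory data rate. The clocks below are approximate launch-era numbers used purely for illustration:

def peak_bandwidth_gb_s(bus_width_bits, data_rate_gt_s):
    # Bytes moved per transfer, times transfers per second, gives GB/s.
    return (bus_width_bits / 8) * data_rate_gt_s

# Approximate 2013 figures, for illustration only.
print(peak_bandwidth_gb_s(384, 6.0))  # ~288 GB/s for a 384-bit card at 6 GT/s
print(peak_bandwidth_gb_s(512, 5.0))  # ~320 GB/s for a 512-bit card at 5 GT/s

So a wider bus can win on raw bandwidth even at a lower memory clock, which only starts to matter once the workload (4K, Eyefinity, heavy AA) is actually bandwidth-bound, as argued above.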

The test wasn't faked or wrong, it's just a strawman test to begin with. I can see this being thrown around left, right and centre by people who will never push video memory to that level. BOTH cards were producing 30-40fps (which most people will agree is unplayable anyway), and since the memory is the bottleneck, there would be huge input lag evident as well. Moot test, but AMD is faster on it.

Which will then bring in further biased and subjective arguments: who honestly uses 4K? Who honestly uses 120Hz? Who honestly needs G-Sync or ShadowPlay "when FRAPS will do everything just as good"? Then you have the arguments about "Mantle", "game packages", "you can't see past 60Hz", "drivers" - the list goes on and on.

In another few years, when we're all back on the same monitor standards, things will calm down. Until then the unbiased people will be after the 'best graphics solution to run on monitor X or Y'. I think that's how I'll ask the question from now on: "What games, and on what screen?", as the product of the two can require an immensely different amount of horsepower.

I kinda went back and forth adding stuff to this post so it's probably a dog's breakfast by now.
Khel
Posted 07:03pm 21/10/13
I don't get the whole thing about how games are going to be more optimised for ATI now. Maybe I'm just flat out wrong, but I would have thought that the APIs being used would abstract the hardware layer anyway; that's at least the case for DirectX, and I think the XBone at least is using DirectX, isn't it? And Sony have made a big deal about how their new API is much easier to use, so I'd imagine they're doing something similar. So it's not like it's going to be coded at the hardware level to work specifically on ATI cards; it should work just as well on any card that has the same feature set.

And afaik things like Mantle aren't part of the API on the consoles anyway? I dunno, maybe I'm way off base, it just seems to me like people are overstating the importance of the next-gen consoles having ATI in them. I mean I'm sure financially it's a big win for ATI, but gone are the days when people really coded games down at the hardware level; they'd be coding to feature sets rather than specific hardware. There might be some tricks they can exploit due to the hardware to squeeze extra performance out, but generally you don't see that happening until years into a console's lifespan, definitely not at the start, because you can already do impressive stuff at the start the traditional way. And by that stage, PC graphics hardware is going to be multiple generations ahead, so it will be a moot point.
ph33x
Posted 07:23pm 21/10/13
^ Pretty much. Mantle ain't coming out on the Xbox at least, I'm 100% on that. MS don't want them gnawing into Windows sales, which is the one reason why a fair few gamers dual-boot.

Some FYI on the Mantle API: you're right about DirectX abstracting all hardware into a neat bundle. The argument is that Mantle does the same for AMD cards only, and there is a lot of optimization which apparently (and it could well be true) reduces the overheads that DirectX has. A really, really dumb example would be that DX can make 60 calls to the card per second, where Mantle can do 120 in the same time, increasing performance/fps/visuals/etc. The API typically talks to the driver anyway. I guess you could say they're cutting out one of the two middlemen, most likely the driver, as the driver has hardware access and that's what they tout Mantle to have. There wasn't a great deal of technical stuff about the API at the time I was checking it out.
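A back-of-the-envelope version of that 60-vs-120-calls example (the per-call costs and call count here are made-up round numbers to show the shape of the argument, not measured DirectX or Mantle figures):

# Hypothetical CPU cost per draw call, in microseconds.
thick_api_cost_us = 10.0   # assumed cost through a heavier API/driver path
thin_api_cost_us = 4.0     # assumed cost through a thinner, closer-to-metal path

draw_calls_per_frame = 1000
frame_budget_ms = 1000 / 60  # about 16.7 ms per frame at 60 fps

for label, cost_us in (("thick API", thick_api_cost_us), ("thin API", thin_api_cost_us)):
    cpu_ms = draw_calls_per_frame * cost_us / 1000
    print(f"{label}: {cpu_ms:.1f} ms of CPU per frame "
          f"({cpu_ms / frame_budget_ms:.0%} of a 60 fps budget)")

Halving or better the per-call overhead frees CPU time the game can spend on more draw calls or higher framerates, which is the whole Mantle pitch.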

If there was poor optimization at the API level, you'd see a lack of optimization on the card as well. If both cards are pushing 99% all the time with DX in a benchmark (which they do), then you have their max performance under any API. So yeah, a little confusing. It'll be a full-blown API and not a card-specific feature set, but still, it's another API devs will need to learn, and unfortunately they'll need to learn it before we know what is faster. Apparently DICE went to AMD and asked for it.

As an nVidia user, I find all of this inconsequential. The GK110 chip isn't really clocked anywhere near its limits. If anything, nVidia really sandbagged on that one. Also, the K40 Tesla with the GK180 GPU is coming out soon, and will reveal the new architecture from their end.
Tollaz0r!
Posted 07:54pm 21/10/13
Khel, it isn't about what is correct or not. It is what people perceive to be correct. People have already said it here and elsewhere that they intend to get ATI cards because of the PS4/Xboned. :(
Eorl
Posted 10:44pm 21/10/13
I've only stated that I plan to wait UNTIL the Xbone/PS4 come out to see whether there is a difference at all (something I doubt, because if there was, Nvidia would swiftly try to one-up the increase). The idea that AMD users may benefit isn't something to laugh at though; it is a genuine technical occurrence that could be possible due to the next-gen consoles utilising AMD hardware.

Though, that isn't to say we will see any real difference. As Khel and Pheex have pointed out, the amount of abstracting from APIs means we probably won't see that difference show up in PC games, so really it's just best to go with whatever has the better money-for-power ratio.

I can see Mantle being handy; a lot of people in the industry have said some good things about it and are eager to see its implementation. Of course, we haven't even seen any actual data from AMD stating what benefits come from Mantle's usage; hell, not even DICE have really said much, and they are the ones using it first. We might see a big shift with Linux/SteamOS gaming and the need for better APIs, but even then Mantle isn't Linux-ready and I'm not sure when it would be.

Also, I really doubt 4K will be a "thing", at least not for some time. Hell, the next-gen launch titles (I know, launch titles) can barely run at 1080p/60FPS. For 4K to be a thing we'd need those consoles to push people to get 4K TVs, just like HD became a thing because the Xbox 360 and PS3 started pushing it.
Dan
Posted 10:23am 22/10/13
I can confidently say that 4K will not "be a thing" for games on the PS4 and XB1. The consoles can playback video at that resolution and that's about it.

I will eat my hat if a single game comes out on either console with 4K support that is anything more than a tech demo or indie title with extremely simple geometry.
ravn0s
Posted 11:50am 22/10/13
i can see 4k becoming popular on PC in the next few years. it probably won't happen on console until the ps5 and xbone2.

i read recently that oculus are hoping to have a 4k screen for the rift. damn, that would be good.