Concluding their Gamescom festivities for the newly introduced GeForce RTX 20-series, NVIDIA revealed a bit more this evening about the hardware, its features, and its expected performance. Tonight NVIDIA is announcing new Ansel RTX features in GeForce Experience, as well as some game performance metrics pitting the GeForce RTX 2080 against the GeForce GTX 1080. After recent hands-on demos featuring real-time raytracing, NVIDIA is offering some numbers for out-of-the-box and Deep Learning Super Sampling (DLSS) performance in traditionally rendered games.

NVIDIA RTX Support for Games
As of August 20, 2018
Game                             Real-Time Raytracing    Deep Learning Super Sampling (DLSS)
Ark: Survival Evolved            -                       Yes
Assetto Corsa Competizione       Yes                     -
Atomic Heart                     Yes                     Yes
Battlefield V                    Yes                     -
Control                          Yes                     -
Dauntless                        -                       Yes
Enlisted                         Yes                     -
Final Fantasy XV                 -                       Yes
Fractured Lands                  -                       Yes
Hitman 2                         -                       Yes
Islands of Nyne                  -                       Yes
Justice                          Yes                     Yes
JX3                              Yes                     Yes
MechWarrior 5: Mercenaries       Yes                     Yes
Metro Exodus                     Yes                     -
PlayerUnknown's Battlegrounds    -                       Yes
ProjectDH                        Yes                     -
Remnant: From the Ashes          -                       Yes
Serious Sam 4: Planet Badass     -                       Yes
Shadow of the Tomb Raider        Yes                     -
The Forge Arena                  -                       Yes
We Happy Few                     -                       Yes

Starting with NVIDIA's DLSS – and real-time raytracing, for that matter – the list of supported games was already known; what NVIDIA is disclosing today are some face-value 4K performance comparisons. For DLSS, all we can say for now is that it uses tensor core-accelerated neural network inferencing to generate what NVIDIA says will be high-quality, supersampling-like anti-aliasing. As for further technical detail, this is a project NVIDIA has been working on for a while, and they have published blogs and papers with more information on some of the processes involved. At any rate, the provided metrics are sparse on settings and details, and notably the measurements include several games rendered in HDR (though HDR shouldn't have a performance impact).
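To make the appeal of that approach concrete, below is a minimal back-of-the-envelope cost model. It assumes – and this is purely our assumption, as NVIDIA has not disclosed the actual pipeline – that DLSS renders at a reduced internal resolution and then pays a fixed tensor core inference cost to produce the final 4K frame. Every resolution and cost figure in it is invented for illustration:

```python
# Back-of-the-envelope DLSS cost model. Every number here is an invented
# assumption for illustration; NVIDIA has not disclosed the actual pipeline.

NATIVE_4K_PIXELS = 3840 * 2160       # pixels shaded when rendering native 4K
INTERNAL_PIXELS = 2560 * 1440        # assumed reduced internal render resolution
SHADE_NS_PER_PIXEL = 1.6             # assumed shading cost, nanoseconds per pixel
TAA_MS = 1.5                         # assumed cost of a TAA resolve pass, ms
INFERENCE_MS = 2.0                   # assumed fixed tensor core inference cost, ms

def frame_ms(pixels_shaded, post_process_ms):
    """Frame time = per-pixel shading cost plus a fixed post-process cost."""
    return pixels_shaded * SHADE_NS_PER_PIXEL * 1e-6 + post_process_ms

native_taa = frame_ms(NATIVE_4K_PIXELS, TAA_MS)    # shade 4K, then resolve TAA
dlss = frame_ms(INTERNAL_PIXELS, INFERENCE_MS)     # shade 1440p, infer the 4K frame

print(f"native 4K + TAA: {native_taa:.2f} ms/frame ({1000 / native_taa:.1f} fps)")
print(f"DLSS (assumed) : {dlss:.2f} ms/frame ({1000 / dlss:.1f} fps)")
```

The point of the sketch is simply that if the per-pixel shading savings exceed the fixed inference cost, framerates go up; whether the real pipeline works this way remains to be confirmed.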

Otherwise, NVIDIA presented a non-interactive Epic Infiltrator 4K demo that was later displayed on the show floor, comparing Temporal Anti-Aliasing (TAA) to DLSS; the latter provided near-identical-or-better image quality on average, but at a lower performance cost, directly improving framerates. To be perfectly honest, I spent the entire floor time talking with NVIDIA engineers and driver/software developers, so I have no pictures of the floor demo (not that anything less than a direct screenshot would really do it justice). Ultimately, the matter of DLSS is somewhat nuanced, and there isn't much we can add at the moment.

Overall, the idea is that even in traditionally rasterized games without DLSS, the GeForce RTX 2080 brings around 50% higher performance than the GeForce GTX 1080 under 4K HDR 60Hz conditions. Because this excludes real-time raytracing and DLSS, it is tantamount to 'out of the box' performance. That said, no graphics settings or driver details accompanied these disclosed framerates, so I wouldn't suggest reading into the numbers and bar charts one way or another.
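For a sense of what that claim would mean in practice, here is a trivial worked example with an invented baseline framerate (NVIDIA did not publish per-game figures alongside the chart):

```python
# Hypothetical illustration of the claimed ~50% uplift; the baseline is invented.
gtx_1080_fps = 40.0                  # assumed GTX 1080 result in some 4K HDR title
rtx_2080_fps = gtx_1080_fps * 1.5    # NVIDIA's claimed ~50% generational uplift
print(f"GTX 1080: {gtx_1080_fps:.0f} fps -> RTX 2080: {rtx_2080_fps:.0f} fps")
# A 40 fps title would land right at the 60 fps target implied by '4K 60Hz'.
```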

Lastly, NVIDIA announced several new features, filters, and supported games for GeForce Experience's Ansel screenshot feature. Relating to GeForce RTX, one of the features is Ansel RT for supported ray-traced games, where a screenshot can be taken with a very high number of rays; that is unsuitable for real-time rendering, but not an issue for a static image.
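The reason a static capture can afford ray counts that real-time rendering cannot is basic Monte Carlo behavior: noise falls off with the square root of the sample count, and a screenshot has no frame budget. A small illustrative sketch, with an invented noise model and numbers:

```python
import numpy as np

rng = np.random.default_rng(0)
TRUE_RADIANCE = 0.6        # assumed ground-truth value for one pixel
NOISE_SIGMA = 0.25         # assumed per-ray noise

# Average progressively more rays per pixel (spp) and watch the error shrink
# as 1/sqrt(spp) -- trivial for an offline screenshot, hopeless at 60 fps.
for spp in (1, 4, 64, 4096):
    samples = TRUE_RADIANCE + rng.normal(0.0, NOISE_SIGMA, size=spp)
    print(f"{spp:5d} spp: estimate {samples.mean():.4f}, "
          f"expected noise ±{NOISE_SIGMA / np.sqrt(spp):.4f}")
```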

Ansel RTX also leverages a concept similar to tensor core-accelerated DLSS with its 'AI Up-Res' super resolution feature, which also works in games that have not integrated the Ansel SDK.

In terms of GeForce RTX performance, this is more or less a teaser of things to come. But as always with unreleased hardware, judgement should be reserved until objective measurements and further details arrive. We will have much more to say when the time comes.

Comments

  • Yojimbo - Thursday, August 23, 2018

    You're still willfully ignorant of the fact that the GTX 280 cost more than the RTX 2080 and the GTX 780 cost the same...
  • eddman - Thursday, August 23, 2018

    ... and you are ignoring the fact that NVIDIA cut its price by $150, down to $500, a mere month after launch, because it was overpriced for the offered performance compared to the 4870.
  • Yojimbo - Thursday, August 23, 2018

    So what? They still launched it at the high price. If AMD comes out with a card in a month that's competitive with the RTX 2080 and decides to price it at $400, then NVIDIA will again be forced to cut their prices. But AMD won't be able to do that. NVIDIA lost money in 4 out of 5 quarters between mid-2008 and mid-2009.

    https://www.macrotrends.net/stocks/charts/NVDA/nvi...

    Look at the quarterly earnings per share (second graph). You can see why they were trying to price things higher, and how it hurt them to have to cut their prices. Of course, AMD was also losing money over that time period, and the recession didn't help. AMD was also losing more and more ground to Intel's Core processors on the CPU side at the time, and I am guessing they may have had heavy debt payments from their acquisition of ATI a couple of years prior. Why AMD decided to introduce cards at low prices at this time, I don't know. Perhaps they had a different idea of the real price/demand curve for graphics cards than NVIDIA did, or maybe they thought they could hurt NVIDIA through financial attrition even though they themselves were bleeding money. I really have no idea.
  • eddman - Thursday, August 23, 2018

    As you mentioned, the recession was in effect. AMD priced their cards quite low, yes, but I doubt it was low enough to lose them money; they were not in a situation to willingly sell cards at a loss. I could be wrong. The same goes for NVIDIA.

    Intel's figures also took a nosedive in that period, so it seems the recession was the main factor.
  • Yojimbo - Thursday, August 23, 2018

    Oh it definitely was low enough to lose them money. The same for 2007 and 2006. You can look at their financial results. Go to page 60 of this financial document: http://quarterlyearnings.amd.com/static-files/2b12...

    Look at the Graphics segment operating income: they lost $6 million in 2006, lost $39 million in 2007, and made $12 million in 2008. Making $12 million is pretty much breaking even. There was no recession in 2006 or most of 2007. I think NVIDIA only lost money in 2008 out of those years, probably a mix of the recession and NVIDIA switching to an expensive new architecture (Tesla).

    Note that for AMD, the Graphics segment "includes graphics, video and multimedia products and related revenue as well as revenue from royalties received in connection with the sale of game console systems that incorporate our graphics technology". They were probably making money on consoles, as the Xbox 360 was pretty popular, so they probably still lost money on PC graphics cards in 2008.

    Also notice that in 2007 and 2008, AMD was bleeding money in their CPU business. Why they decided to fight so heavily on price that they lost money in GPUs as well, I don't know. Either they willingly sold at a loss, or they thought NVIDIA couldn't or wouldn't cut prices and that AMD would therefore gain market share, or they miscalculated the price/demand curve, or they would have lost money anyway even if they hadn't cut prices. I think the last one is very unlikely; as history has shown, people still would have bought almost as many graphics cards even if they all cost $50 more.

    Anyway, it wasn't good for them, because they had to start cutting their R&D expenses, and then they fell behind NVIDIA. So when people say we need more competition to drive down prices: they had their time of lower prices years ago, at the expense of said competition today. The price cuts of yesteryear (along with AMD's awful CPU performance) cost AMD the ability to compete, until, with Fury and Vega, they were forced to build such expensive cards just to try to compete that they couldn't gain much market share even by cutting prices. The cards were so expensive to make that they could only cut the prices so much. At times they have given up the high end of the market completely.
  • eddman - Monday, August 27, 2018

    You said AMD was losing money in that period, and now we see that in 2008, when the 4870/4850 launched in June, they actually made money. So yes, the cards DID make them money, even at such low prices. They were still recovering from the ATI purchase; making even that amount of money in 2008, in that financial situation and that time period, is not bad for such low-priced cards.
  • Yojimbo - Thursday, August 23, 2018

    besides...

    "Perhaps the most notable thing about the GeForce 7800 GTX at launch was the $600 USD price tag. Previous generations had drawn the line for flagship pricing at $400, so back in 2005 $600 really was considered to be pretty staggering."

    Maybe your example is a bad one. The 8800 GTX wasn't a big jump in price only because NVIDIA had already pushed the price up to new territory with the 7800 GTX.

    Let's look at what happened after the 8800 GTX. The 9800 GTX launched at $350.

    In any case, things were not as stable back then as they are now, so it's not a good comparison to be looking at generational price differences. But if we look at the 200 series onward, prices have gone up and down from generation to generation, but there hasn't really been a trend one way or the other.
  • eddman - Thursday, August 23, 2018

    It doesn't matter if the 7800 GTX was already more expensive than the prior cards (and I did consider it to be quite overpriced back then). If technological advancements and better performance are the driving factors for a price increase, then how come the 8800 GTX wasn't even MORE expensive, considering it massively outperformed the 7800 GTX and introduced CUDA?

    The 9800 GTX is a terrible example. It launched only 2 months before the GTX 280 and 260, simply to act as a lower-tier, cheaper card below those two. NVIDIA never released true low-end Tesla-based cards.

    The 2080 Ti is the most expensive generational flagship launch card of the past 18 years at $1000. The second most expensive is the 7800 GTX at $767 adjusted for 2018; the 2080 Ti costs about 30% more (a rough sketch of the adjustment follows the graph link below).

    I made this graph last year, so prices are adjusted for 2017 dollar value: https://i.imgur.com/ZZnTS5V.png
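    For reference, the adjustment is just a ratio of CPI values. A rough sketch using approximate BLS CPI-U annual averages; exact results shift a bit with the index and month chosen, which is why they land near, but not exactly on, the $767 figure above:

    ```python
    # Rough sketch of the inflation adjustment, using approximate BLS CPI-U
    # annual averages; exact results shift with the index and month chosen.
    CPI = {2005: 195.3, 2008: 215.3, 2018: 251.1}   # approximate annual averages

    def to_2018_dollars(price, year):
        return price * CPI[2018] / CPI[year]

    print(f"7800 GTX: $599 in 2005 -> ${to_2018_dollars(599, 2005):.0f} in 2018 dollars")
    print(f"GTX 280:  $649 in 2008 -> ${to_2018_dollars(649, 2008):.0f} in 2018 dollars")
    ```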
  • Yojimbo - Thursday, August 23, 2018

    "If technological advancements and better performance are the driving factors for a price increase, then how come 8800 GTX wasn't even MORE expensive,"

    I think you can answer that for yourself if you just stop and think a second. The answer isn't hard to reach.

    "9800 GTX is a terrible example."

    All of the examples from back then are terrible. NVIDIA and ATI/AMD were coming out with cards all the time back then. Besides, there's no reason they need to introduce a card at a low price just because they will come out with something better and more expensive later. They can cut the price, like GPU manufacturers used to do all the time in those days.

    "2080 Ti is the most expensive generational flagship launch card in the past 18 years at $1000"

    No. The Titan X launched at $1200. What was the major difference between the Titan X and the 1080 Ti, other than the time of introduction and 1 GB of RAM? It also launched close to the smaller chips, just like the 2080 Ti has. What you want to call the flagship products (Ti versions) have always launched later; the 2080 Ti is launching alongside the 2080 this time.

    Also, compare the die size and the percentage of cores cut off the full die for the GTX 780 and the RTX 2080 Ti (see the quick comparison below), and it's obviously the wrong comparison to make. The RTX 2080 Ti is a huge chip; it is not proper to compare it to the GTX 780. What has happened is that people are buying more and more powerful GPUs as the GPU has become more and more important to the gaming experience relative to the other components. NVIDIA has therefore introduced a new class of product with huge die sizes, priced higher. They introduced that back in the Maxwell generation: at first they released these chips as prosumer cards, only later coming out with a "Ti"-branded version at a reduced price. This time they skipped the Titan and released the "Ti"-branded card at the architecture launch. The proper comparison to the GTX 780, the GTX 980, or the GTX 1080 is the RTX 2080.
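    To put rough numbers on the die-size point, using die sizes and CUDA core counts as publicly listed on spec sheets (treat the figures as approximate):

    ```python
    # Die sizes and shader counts as publicly listed (approximate figures):
    #   GK110 (GTX 780): 561 mm^2, 2304 of 2880 CUDA cores enabled
    #   TU102 (RTX 2080 Ti): 754 mm^2, 4352 of 4608 CUDA cores enabled
    cards = {
        "GTX 780 (GK110)": (561, 2304, 2880),
        "RTX 2080 Ti (TU102)": (754, 4352, 4608),
    }
    for name, (die_mm2, enabled, full) in cards.items():
        print(f"{name}: {die_mm2} mm^2 die, {enabled / full:.0%} of the full die enabled")
    ```

    The 2080 Ti is a far bigger chip with far less of the die disabled, which is the sense in which it is not the GTX 780's successor.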
  • eddman - Thursday, August 23, 2018

    How about you answer why the 8800 GTX wasn't more expensive, instead of dodging the question.

    No, only a few are bad examples, like the one you came up with, and I clearly explained why.

    Titans are not mainstream cards and do not fit into the regular GeForce lineup.

    Excuses. It's not my fault they named it the 2080 Ti, which makes it a direct replacement for the 1080 Ti. It is overpriced, simple as that. I really don't understand how you, as a consumer, could defend a corporation's pricing behaviour.
