83 Comments


  • milkywayer - Thursday, January 9, 2020 - link

    Here's hoping it doesn't automatically jack up prices on their already overpriced mobile chips. Even now it's crazy seeing that the newer, faster AMD 4000 mobile CPU laptops are cheaper than the Intel CPU ones. I for one can't wait to stop sending $500 cheques to every time I buy an XPS or a MacBook.
  • milkywayer - Thursday, January 9, 2020 - link

    $500 cheques to Intel per premium ultrabook purchase.**

    Goddamnit AnandTech, the UN passed a resolution this week making an edit button a requirement on discussion threads. Do you guys follow?
  • Adonisds - Thursday, January 9, 2020 - link

    That, and the ability to receive notifications when someone replies; it's all determined by the UN folks.
  • travsb1984 - Thursday, January 9, 2020 - link

    Who cares about some silly UN resolution? Are you saying you prefer to be ruled by foreign overlords? If you want to change something from a previous statement, just post another reply... It's kind of silly to expect someone to go back up a thread and find something you edited, wouldn't you agree?
  • yannigr2 - Thursday, January 9, 2020 - link

    The edit button can have a time limit, for example 15 minutes. It's not rocket science.
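    (A timed edit window really is that simple; a minimal sketch in Python, with all names hypothetical:)

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical policy: a comment stays editable for 15 minutes after posting.
EDIT_WINDOW = timedelta(minutes=15)

def can_edit(posted_at: datetime, now: Optional[datetime] = None) -> bool:
    """Return True while the comment is still inside its edit window."""
    now = now or datetime.now(timezone.utc)
    return now - posted_at <= EDIT_WINDOW
```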
  • milkywayer - Thursday, January 9, 2020 - link

    I offered in another post to implement it for free for Anand Tech. No one reached out :(
  • xTRICKYxx - Friday, January 10, 2020 - link

    This has been requested well before Anand left. Pure laziness.
  • Slash3 - Friday, January 10, 2020 - link

    It's being moved to the same comment system as Tom's Hardware.

    Which, it should be noted, is just as awful.
  • jimbo2779 - Friday, January 10, 2020 - link

    I'd say it is worse. This section is light and performs the functionality it needs to without complicating matters. The Tom's one is just awful to use.

    There is something to be said for simplicity in design.
  • Alexvrb - Sunday, January 12, 2020 - link

    The one at THG is definitely garbage. It is, however, directly tied to the article comments on the forum, so you can just use the forum instead. I just wish they'd put the direct link in every article, so you don't have to trudge there yourself.
  • olde94 - Friday, January 10, 2020 - link

    I saw it and smiled at the generosity in this community.
  • s.yu - Friday, January 10, 2020 - link

    This.
  • evernessince - Thursday, January 9, 2020 - link

    Makes sense to me; people should be able to edit what they said and/or retract it.
  • Cullinaire - Thursday, January 9, 2020 - link

    That's not a right, merely a privilege given in certain places.
    If you want to clarify yourself here make another post.
  • evernessince - Thursday, January 9, 2020 - link

    Having control over what you say isn't a right? Where do you live, China?
  • Cullinaire - Thursday, January 9, 2020 - link

    How is being able to backtrack for comments considered a good given right essential to life? You post something, you take responsibility for it. Where do you live, in some fantasy world?
  • Cullinaire - Thursday, January 9, 2020 - link

    See I just made some typos and you're gonna latch onto them since you don't have an actual rebuttal. That's fine, I accept that as my victory lol.
  • milkywayer - Thursday, January 9, 2020 - link

    If we had the said edit option, you could fix your typos so he couldn't latch onto 'em. Point proven.
  • travsb1984 - Friday, January 10, 2020 - link

    You should start by NOT caring about members of the grammar police... They are the scummiest of the scum. Do that and life will be much easier.
  • Retycint - Thursday, January 9, 2020 - link

    This is an online discussion forum. Nobody cares if you backtrack your statements or sneakily edit them. I fail to see what kind of "responsibility" one needs to take for one's words here, unless it is something illegal or constitutes harassment.
  • tipoo - Thursday, January 9, 2020 - link

    If AMD really did solve the battery life gap, I'd love to see a move to AMD CPUs in the next MacBooks. Unless Tiger Lake has another core count increase up its sleeve for the same wattages (i.e. 8 cores at 15W), AMD should hold a clear multicore advantage at the same wattage.

    All depends on that battery life thing though.
  • extide - Thursday, January 9, 2020 - link

    Smart. Intel needs to get some mindshare back as they have been getting plowed by AMD lately. This is cool to see them actually showing it off and stuff. Hopefully it doesn't suck.
  • travsb1984 - Thursday, January 9, 2020 - link

    They're already marketing it for "power efficiency", I think we all know what that means...
  • Smell This - Thursday, January 9, 2020 - link

    **"...most surprising product at the show has been for me ... And this year, it looks like that honor is going to Intel."**
    ___________________________________________

    I, personally, would have gone with the sex stuff, but whatever floats yer boat, Bubba.
  • BigDragon - Thursday, January 9, 2020 - link

    Warframe is a 7 year old game. Intel has some serious problems if they're unable to maintain a smooth frame rate at only 1080p in such an old game. All the marketing materials in the world won't improve performance. Intel has a lot of work ahead of them.
  • III-V - Thursday, January 9, 2020 - link

    At least their situation can be improved, unlike your abortive reading comprehension skills:

    "but as this is early hardware on early software, it’s clearly not in a state where Intel is even focusing on performance."
  • drexnx - Thursday, January 9, 2020 - link

    but why would you bother to show this off, if it's in such a negative light?
  • jaggedcow - Thursday, January 9, 2020 - link

    "since Intel is getting in front of any dev leaks by announcing it to the public now"
  • imaheadcase - Thursday, January 9, 2020 - link

    Because unlike the comments being posted here, it's Intel's first GPU that is not mobile, and it's also not trying to compete with AMD. It's literally a first-run GPU to see what they can do.
  • 29a - Thursday, January 9, 2020 - link

    No it's not; the i740 existed in the '90s. It was one of the first AGP cards, if not the first.
  • zmatt - Thursday, January 9, 2020 - link

    Hell, you can even read Anand's review from 1998 still. https://www.anandtech.com/show/204
  • Tomatotech - Thursday, January 9, 2020 - link

    Oh wow that was a wild read. I was going to say they tested it with Doom, but no, they tested it on Win-95 with Quake 2 at 640x480, getting a whopping 21 frames per second.

    Top performer was the Voodoo 2 GPU, with a whole 12MB of on-card RAM, getting 48fps at 800x600. Comparing that with the struggles a 2080 Ti has with ray tracing at 4K, some framerates never change...
  • Foeketijn - Friday, January 10, 2020 - link

    Thank you for finding that article. That's from before I even knew about AnandTech. The CPU world has got a bit more interesting lately; maybe the GPU world will be revived as well.
  • Dolda2000 - Thursday, January 9, 2020 - link

    Frankly, I'm just glad they're actually showing off things still in development instead of keeping everything under complete wraps until it's completely ready for final release. That should be encouraged, not met with "oh wow, look how much this half-finished project sucks right now".
  • DasGanon - Thursday, January 9, 2020 - link

    It's not 7 years old; the closed beta was 7 years ago. It's a GaaS title that has had a lot of changes since then (and depending on game mode, even an i7-4790K & GTX 970 has some hiccups: never below 60fps, but it's not consistent).

    Is it a great demo? No, because Warframe is, weirdly, one of the best-optimized games ever, and WF only very recently dropped XP support.
  • BigDragon - Thursday, January 9, 2020 - link

    You're right -- Warframe has had a lot of changes. DE recently made a big change to the lighting engine that could impact performance. The game is extremely well optimized; integrated graphics on potato PCs can run Warframe buttery smooth. Instead, Intel is trying to build hype with https://youtu.be/gXVRSCQ9zGo . I'm unwilling to accept Intel's excuse that immature hardware and drivers cannot handle Warframe.
  • Cullinaire - Thursday, January 9, 2020 - link

    It's optimized... for AMD and nvidia.
  • Spunjji - Friday, January 10, 2020 - link

    That's not how any of this works.

    It's possible to tailor a game for one specific architecture by implementing features that play to that architecture's strengths - e.g. Hairworks, TressFX, overdoing tessellation, etc. That will make it run well for that vendor or product at the expense of everyone else's. The rest of the optimization done for specific hardware is done at the driver level by the vendor, though.

    There's not really any way that Warframe itself could be "optimized" for both AMD and Nvidia but not Intel. It's Intel's job to make existing 6-year-old games run well on their hardware, not the developers'.
  • Spunjji - Friday, January 10, 2020 - link

    That's what I thought, too. Warframe runs at 60fps in 1080p with medium settings on lightly optimized 2400G systems. I get that this is only an early demo... but it's a terrible one.
  • AshlayW - Sunday, January 12, 2020 - link

    To be fair, it has received some major graphics overhauls. But yeah, it's not demanding; my 3400G does just fine.
  • Zizo007 - Thursday, January 9, 2020 - link

    It's good that Intel is in the GPU business now. Competition will bring down Nvidia's outrageous prices.
    But Intel GPUs might have many bugs at launch, as it's their first dGPU ever.
  • Hurn - Thursday, January 9, 2020 - link

    Not their first dGPU.
    The i740 was available both on an AGP card and soldered onto Intel motherboards.
    In 1998-1999.
  • Retycint - Thursday, January 9, 2020 - link

    Nvidia prices are hardly outrageous, given that Nvidia has actually been improving gen after gen in terms of power/efficiency, unlike its CPU counterpart Intel. Things like RTX may be an expensive gimmick now, but it pushes the envelope for the gaming industry and is only going to be a good thing in the long run
  • Spunjji - Friday, January 10, 2020 - link

    They increased the price of their high-end GPU by 70% in a generation where performance increased by ~30%. I'd definitely call that outrageous.

    RTX isn't worth that extra 40% in a first-gen product with so little software support lined up.
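    (A quick sanity check on that math, as a minimal sketch; the +70% and +30% figures are the commenter's estimates, not official numbers:)

```python
def value_change(price_mult: float, perf_mult: float) -> float:
    """Fractional change in performance-per-dollar when price and
    performance are both multiplied by the given factors."""
    return perf_mult / price_mult - 1.0

# Commenter's figures: price x1.70, performance x1.30.
drop = value_change(1.70, 1.30)
print(f"{drop:+.1%}")  # about -23.5%, i.e. perf-per-dollar got ~24% worse
```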
  • FANBOI2000 - Friday, January 10, 2020 - link

    And AMD weren't any cheaper. I know that was down to mining, but I think it all adds up to a new normal now. We paid up, so they don't have a great incentive to keep prices low. If NVIDIA have the fastest cards, people will pay. I don't like it; I look at £250 cards and think they should be closer to £150, but even AMD seem in on the game now. They could easily clean up with a £30 drop on some of their cards but chose not to. You'd think they might have, given the success of Ryzen on the desktop. But conversely, Intel has dropped prices there, so AMD CPUs don't seem quite the bargains they were. It seems the opposite with graphics, with AMD increasing its prices to come closer to NVIDIA. That suggests to me that if AMD had the cards to really beat NVIDIA, they would be charging just as much for them as Nvidia does. So: the new normal.
  • sarafino - Thursday, January 9, 2020 - link

    Nvidia pricing won't budge unless Intel manages to come out with something that can actually compete in the high-end. It's going to take quite a while just for them to get their graphics cards and drivers performing at that kind of level.
  • Yojimbo - Friday, January 10, 2020 - link

    NVIDIA and AMD both offer a range of cards at a range of prices. I mean, AMD's products currently compete with NVIDIA's all the way up to the $450 mark. NVIDIA is better at building GPUs, so they can make more money selling a card at that price. You don't want the same-performance card to be sold for the same price with less profit, because then the company will have less opportunity and incentive to invest in future performance gains. The real bad situation for consumers would be if innovation and performance tail off; then not only will you be paying more money per gain in experience, but you'd be getting less of a gain in experience overall.

    Buy what you can afford. If prices do actually go up then it sets you back a few months in terms of price/performance ratio, and they only go up to what people are willing to pay. Competition should bring down prices somewhat but there's nothing outrageous about the prices unless you think it's outrageous what other people are willing to pay. If no one was willing to pay 600 dollars for a card then NVIDIA wouldn't make a card for that and developers wouldn't bother putting features in their games that require a video card that costs that much.
  • Spunjji - Friday, January 10, 2020 - link

    People were willing to pay $600 - a small minority of them, but still. $350 is where most of the market is at.

    $1199 is an objectively outrageous price for a high-end consumer graphics card, and it is outrageous that people are willing to pay that.

    As a result, most developers haven't yet bothered putting RTX-exclusive features in their games. The most popular card by far is the 2060, which can't really *do* RTX in any meaningful sense.
  • Yojimbo - Friday, January 10, 2020 - link

    Your opinion is that it is outrageous. However, it is their money. It's not that developers "haven't bothered" to put RT features in their games; it's that it takes time and money, since it's a new technology. But what in the world does that have to do with the discussion, except possibly to form an argument against the idea that NVIDIA is charging "outrageous prices"? Raytracing is an innovation. Because of NVIDIA's push, you will eventually have raytracing at least 2 years earlier than otherwise. That's more valuable than 6 months of price/performance lag. The 2060 can do RTX. There's nothing that says you need to run your games at 80 FPS; 40 FPS is just fine. But it's the choice of the individual purchaser.
  • Korguz - Friday, January 10, 2020 - link

    "There's nothing that says you need to run your games at 80 FPS. 40 FPS is just fine. But, it's the choice of the individual purchaser." Tell that to those who keep saying that for games the Intel CPUs are the ones to get because of the higher clocks, and who want 100+ fps. Say 40 fps is fine to them and they would laugh at you. Ray tracing on current RTX hardware isn't fast enough to use unless you go high-end 2070 or 2080, because the performance hit is too much for those who want a minimum of 80 fps, and more than 100+ most of the time. And those 2070s and 2080s are priced way too high; I think the 2070s start at around $800 CDN here. Way too expensive.
  • FANBOI2000 - Friday, January 10, 2020 - link

    We might see RT 2 years early, but that is a moot point if you can't afford it. Developers will write software for the average PC, and the high end will be less of a consideration. Certainly, those gamers who can justify over 1000 for a graphics card will be hardcore gamers, but I don't see them being as profitable to developers as the middle ground. So developers write for the middle ground, because you get diminishing returns spending time and money wowing people with graphics few will be able to see, simply because they don't have the budget for the high-end cards required for access.
  • Kangal - Friday, January 10, 2020 - link

    I don't know if this qualifies as "competition".
    It's struggling to keep up with an RX 550, yet it's likely drawing more power (60W?), generating more heat, and most definitely has fewer features (codecs, etc.). And based on what we know, it will likely cost more too.

    And that's the RX 550, the joke of modern GPUs. There's practically zero reason to get an RX 550 already when you can instead get the (slightly) superior GT 1030 for cheaper. Heck, I think Intel's "free" UHD 630 is more impressive than this. And instead of all of those, I'd much prefer to have a "free" Navi-8 APU when they eventually launch in 12 months' time... or step up to a proper dGPU like a GTX 1650 or better.

    I don't think it was a good idea for Intel to showcase this. It's like inviting friends over to your house and presenting a pigsty!
  • Alistair - Friday, January 10, 2020 - link

    I bought a GT 1030 for a customer; I've never seen such a bad, slow card before.

    You're looking at the 1650 Super minimum for a real video card; there isn't anything worth buying under $150.
  • Alistair - Friday, January 10, 2020 - link

    The 1650 Super is 4x faster, for a few extra bucks.
  • FANBOI2000 - Friday, January 10, 2020 - link

    The proper 1030 with GDDR5 is OK for 720p gaming. That is damning with faint praise, as it is Xbox 360 levels, but it is OK for some games in a general-purpose PC. I agree the 1650 Super is where you want to be if gaming at 1080p is going to feature, or if you are going for esports competitively or FPS, but the 1030 still has a place, especially second hand, although sometimes the 9-series can offer better value there. I've seen a 980 and a 1030 both go for £100 on eBay, and I know which one I'd rather have.
  • FANBOI2000 - Friday, January 10, 2020 - link

    Will it? NVIDIA are all about machine learning now, and I think that is where Intel will really be working. How much cash is there really in gaming cards, especially when you consider there are two well-known brands in that sector and you aren't known for such abilities? It doesn't even seem like AMD is going for a heavy price war, based on their recent releases.

    I'm not a hardcore gamer; 1080p on high settings will placate me, so I am sticking with my 1060 6GB, as it will be good enough at least until, say, 6 months into the new consoles. I don't think the consoles will drive things much at first, as developers will still be writing games for the Xbox S/X just as they did with the 360. It may well mean the status quo is maintained for a while yet; maybe it is the new normal, given AMD's recent pricing.
  • thestryker - Thursday, January 9, 2020 - link

    If the leaks regarding the desktop part are accurate, it is just a desktop version of the exact same thing in Tiger Lake, EUs and all. I suspect this is just an easy way for Intel to design a working video card and get something out for vendors to work with. I wouldn't be particularly surprised if whatever this is never even makes it to consumers.

    I really do hope they're able to deliver though as with the price increases in the video card industry there's plenty of room for Intel to come in and undercut the others without taking a loss doing so. If they're able to do that it may put some pressure on AMD and Nvidia to return to reality with pricing.
  • tipoo - Thursday, January 9, 2020 - link

    Yeah, sounds like a way to work on the Xe architecture without having to switch out to a Tiger Lake system. If it's not largely for developers, maybe some bundle in OEM parts.
  • Uroshima - Friday, January 10, 2020 - link

    How many got the weird idea that the demoed DG1 is nothing other than a Tiger Lake with the CPU fused off? After all, it is the LP version, it runs below 75W, and they ran some feeble demos...

    And let's not forget that LP was added later in the game, next to the "imba" versions.

    My really nasty thought is that LP is Gen12 graphics using some Xe technology that turned out better than the beefier Xe versions (like they overcomplicated things, and the GPUs run hot and have a bunch of power-hungry features that do not help in mundane GPU tasks).

    That would lead this year to Tiger Lake APUs, some heavy-duty GPUs for professionals, and the real "gamer grade" discrete graphics in, like... 2-3 years?
  • bill44 - Thursday, January 9, 2020 - link

    4x DP? Not 1x HDMI + 3x DP?
  • phoenix_rizzen - Thursday, January 9, 2020 - link

    Zoom in on the pic in the gallery and you can see it's definitely 4x DP.
  • bill44 - Thursday, January 9, 2020 - link

    I did. 1x HDMI + 3x DP.
    https://cdrinfo.com/d7/content/first-look-intels-d...
  • phoenix_rizzen - Friday, January 10, 2020 - link

    You're right. That's a better angle. The pics in the gallery, even when zoomed in, showed a squared off right side to the far left port, making it look like DP as well.

    This pic shows the far left port is HDMI.
  • Slash3 - Friday, January 10, 2020 - link

    I'm always amazed that CDRInfo still exists.
  • Duncan Macdonald - Thursday, January 9, 2020 - link

    How much is this demo an attempt to divert attention from AMD and the kicking that AMD is giving Intel on the CPU front?
  • alufan - Thursday, January 9, 2020 - link

    Give that man a banana. I hope Intel comes good and we have a three-way split and competition for GPUs, but I doubt it will happen for many moons; Nvidia and AMD have too much of a lead. However, this entry-level laptop GPU is managing to make the front page on many sites simply because it's Intel, and yes, it pushes other news aside. What other possible reason could there be to split the "presentation" over 2 days for what is essentially a lab product ages away from completion, let alone release? Intel's old dirty tricks book again.
  • sarafino - Thursday, January 9, 2020 - link

    Considering they were willing to throw together a 5 GHz overclocked Xeon with an industrial chiller to interfere with previous AMD presentations, it's not all that hard to believe they might be attempting something similar here. Based on the impressions given by Steve over at GamersNexus, the DG1 is still in a pretty rough state, with input latency pushing 100ms. Given how polished Intel products usually are by the time they're shown at events like this, the DG1 seems very early and unfinished for something like CES.
  • FANBOI2000 - Friday, January 10, 2020 - link

    PC Pro magazine, via their podcast, were quite negative about Intel. They said Intel just waffled on for over an hour and had nothing really to show, so they focused instead on reporting on AMD. They were a bit cynical about power consumption with the mobile chips; so am I.

    So yes, and that seems pretty much where Intel have been since they super-cooled that CPU a year back. Or auctioned off that CPU they were only able to offer a 12-month guarantee on (because evidently they don't expect it to last long overclocked to death, binning or no binning). AMD got a name for being less than honest in the past, and I'm starting to get the same feeling here.

    With Intel it is more jumping around and waving their hands to try to get attention for not very interesting things. So: a graphics card that doesn't really work, but look how nice they've made the casework. And semi-interesting things to do with Optane. Semi-interesting, but not as interesting as PCIe 4, where all AMD has to do is provide it and the "partners" do all the work releasing the hardware to use it.
  • Kevin G - Thursday, January 9, 2020 - link

    Obviously it's slightly different using the Xe design, but in terms of fundamental design and principles, how does this differ from the VCA-2 cards Intel currently sells? Just that these have the display outputs actually soldered to the PCB?
  • Tomatotech - Thursday, January 9, 2020 - link

    That’s a nice little case, mITX I presume. What is it?
  • Tomatotech - Friday, January 10, 2020 - link

    Nobody has any idea what that case is? I do like the look of it.
  • FANBOI2000 - Saturday, January 11, 2020 - link

    Looks a bit like the new NUC Elements case from Intel.
  • lmcd - Thursday, January 9, 2020 - link

    I just hope Intel holds onto the consumer-available GPU virtualization functionality they've included in their iGPUs.
  • bigsnyder - Friday, January 10, 2020 - link

    Will it be able to run Crysis?
  • eastcoast_pete - Friday, January 10, 2020 - link

    Will it be able to pull Intel out of its crisis?
  • UltraWide - Friday, January 10, 2020 - link

    Enough with the previews! Just release the damn product!! lol
  • eastcoast_pete - Friday, January 10, 2020 - link

    Of course this is mainly done to show that Intel's talk about dedicated graphics is not just that - talk. I for one see a market for a dGPU that is basically a faster version of their current top iGPU (in the G71 or whatever it's called), further boosted by actual video RAM. A card like that, with 4 GB of GDDR6, HDMI 2.1, an on-board ASIC for 10bit HDR decoding and streaming and all for under $100 could give some desktops another 2-3 years of usable life. Now, this being Intel, they'll probably manage to screw that up, somehow. They usually do.
  • spkay31 - Friday, January 10, 2020 - link

    Seeing is believing? Seeing early availability developer only stuff at CES with no concrete information about product availability and pricing is only believing if you have blind faith in Intel. They've been having so many problems just getting 10nm node ramped up I don't expect this to be making much of an impact on graphics market before the next CES.
  • sftech - Friday, January 10, 2020 - link

    Intel thinks there's a client market for GPUs? They're doubting the game streaming revolution?
  • FANBOI2000 - Friday, January 10, 2020 - link

    I think completion is a good thing; at the moment AMD have had APUs all to themselves, and that will make them lazy. Already most of the stack has no graphics, and based on what I have seen we aren't going to get any great discounts from AMD when it comes to dGPUs. If Intel could hit 3200G performance, it would be notable and change the narrative quite a bit. I'm all over AMD on the desktop, and if I am a fanboy it would be for AMD, but chips like the 9400F are still good CPUs at a fair price. I build systems for people as a paying hobby, and that CPU still has a place against AMD. Imagine this as an APU with 3200G levels of performance.

    Anyway, I'm drifting. At the moment I place this higher up my cynicism scale than I do AMD's mobile claims. I think AMD is saying little about battery life for a reason, and I think Intel is showing us empty boxes for all it is worth at the moment. I imagine we are probably at least a year away from Intel being ready for market. And I also can't see them really focusing on gaming. If you look at Intel, it isn't all about CPUs; they have a lot of other IP they can trade off, like Optane, so they will be looking at compute, machine learning, things like that. But any completion is still a good thing, and I am happy for AMD even if I have some doubts about the power draw of their new mobile chips. If I am wrong then I am very wrong, because I have argued that Intel would be more robust in mobile and server. While they have taken a complete beating in server, again it isn't all about the CPU, and server will take some time to filter through for AMD, I think. Companies will just bin all their Intel stuff overnight? I think with mobile, though, if AMD are being straight, that could be very disruptive. I can't remember a time when I ever considered AMD mobile chips. Maybe in some SFF ten years ago, where power consumption was less of an issue.
  • FANBOI2000 - Friday, January 10, 2020 - link

    Completion? Competition, I think I mean; I should have proofread before posting. And I think AMD entering the mobile space as they claim will be very disruptive, simply because of turnover and the business model (for business). Laptops get replaced often, and often they are leased, so there's a bit less risk in giving a department AMD laptops than there is in swapping out servers in what could be a highly Intel-based ecosystem. AMD, if they are being straight, will be first to market, and as long as the cores and the graphics make a compelling argument to the business, they will hammer Intel quickly. I think it will take longer in the consumer space, because consumers have only known disappointing budget AMD laptops, so a £800 AMD-based system will be a hard sell.
  • AshlayW - Sunday, January 12, 2020 - link

    My 3400G runs Warframe at 1080p max without much issue...
  • SharonTTurner - Monday, January 13, 2020 - link

    If the 15W chip can TDP up to 25W, can the 28W chip do something similar?
