Taking on the Dark Lord, Mobile Style

There have been a few recent product launches, with more to come in the near future, from AMD, Intel, and NVIDIA. On the CPU side we have Intel’s Ivy Bridge and AMD’s Trinity, both arguably more important for laptop users than for desktops—and in the case of Trinity, it’s currently laptops only! Both tout improved performance relative to the last-generation Sandy Bridge and Llano offerings, and in our testing both appear to deliver. Besides the CPU/APU updates, NVIDIA has also launched their Kepler GK107 for laptops, and we’re starting to see hardware in house; AMD likewise has Southern Islands available, but we haven’t had a chance to test any of those parts in laptops just yet. With all this new hardware available, there’s also new software going around; one of the latest time sinks is Blizzard’s Diablo III, and that raises a question in the minds of many laptop owners: is my laptop sufficient to repel the forces of Hell yet again? That’s what we’re here to investigate.

Before we get to the benchmarks, let’s get a few things out of the way. First, Diablo III, for all its newness, is not a particularly demanding game when it comes to graphics. Coming from the same company as World of WarCraft and StarCraft II, that shouldn’t be too surprising: Blizzard has generally done a good job at ensuring their games will run on the widest array of hardware possible. What that means is cutting edge technologies like DirectX 11 aren’t part of the game plan; in fact, just like StarCraft II and World of WarCraft (note: I'm not counting the DX11 update that came out with Cataclysm), DirectX 10 isn’t in the picture either. Diablo III is a DirectX 9 title, and there should be plenty of GPUs that can handle the game at low to moderate detail settings.

The second thing to bring up is the design of the game itself. In a first person shooter, your input is generally linked to the frame rate of the game. If the frame rate drops below 30 FPS, things can get choppy, and many even consider 60 FPS to be the minimum desired frame rate. Other types of games may not be so demanding—strategy games like Civilization V and the Total War series, for instance, can be played even with frame rates in the teens. One of the reasons is that in those two titles, mouse updates happen at the screen refresh rate (typically 60Hz), so you don’t feel like the mouse cursor is constantly lagging behind your input. We wouldn’t necessarily recommend <20 FPS as enjoyable for such games, but it can be tolerable. Diablo III takes a similar approach, and as a game played from a top-down isometric viewpoint, 30 FPS certainly isn’t required; I have personally played through entire sections at frame rates in the low to mid teens (in the course of testing for this article), so it can be done. Is it enjoyable, though? That’s a different matter; I’d say 30 FPS is still the desirable minimum, and 20 FPS is the bare minimum you need in order to not feel like the game is laggy. Certain parts of the game (e.g. interacting with your inventory) also feel substantially worse at lower frame rates.

Finally, there’s the problem of repeatability in our benchmarks. Like its predecessors, Diablo III randomizes most levels and areas, so finding a section of the game you can benchmark and compare results between systems and test runs is going to be a bit difficult. You could use a portion of the game that’s not randomized (e.g. a town) to get around this issue, but then the frame rates may be higher than what you’d experience in the wilderness slaying beasties. What’s more, all games are hosted on Blizzard’s Battle.net servers, which means even when you’re the only player in a game, lag is still a potential issue. We had problems crop up a few times during testing where lag appeared to be compromising gameplay, and in such cases we retested until we felt the results were representative of the hardware, but there’s still plenty of potential for variance. Ultimately, we settled on testing an early section of the game in New Tristram and in the Old Ruins; the former gives us a 100% repeatable sequence but with no combat or monsters (and Internet lag is still a potential concern), while the latter gives us an area that is largely the same each time with some combat. We’ll be reporting average frame rates as well as providing some FRAPS run charts to give an overall indication of the gaming experience.
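For readers curious how the average frame rates fall out of a FRAPS log, here is a minimal sketch (this is not our actual tooling; the filename and column layout are assumptions based on FRAPS’ standard “frametimes” CSV output, which records one cumulative millisecond timestamp per frame):

```python
# Minimal sketch: compute average FPS from a FRAPS "frametimes" CSV.
# Assumes the file has a header row and cumulative timestamps in
# milliseconds in the second column ("Frame, Time (ms)").
import csv

def average_fps(path):
    times_ms = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            times_ms.append(float(row[1]))
    frames = len(times_ms) - 1  # intervals between consecutive timestamps
    elapsed_s = (times_ms[-1] - times_ms[0]) / 1000.0
    return frames / elapsed_s

# e.g. average_fps("frametimes 2012-05-26.csv")
```

The same per-frame timestamps drive the run charts: plotting the instantaneous frame rate (1000ms divided by each frame-to-frame delta) shows dips and stutters that a single average hides.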

And one last disclaimer: I haven’t actually played through most of Diablo III. Given what I’ve seen so far, it would appear that most areas will not be significantly more taxing later in the game than they are early in the game, but that may be incorrect. If we find that later areas (and combat sequences) are substantially more demanding, we’ll revisit this subject—or if you’ve done some informal testing (e.g. using FRAPS or some other frame rate utility while playing) and you know of an area that is more stressful on hardware, let us know. And with that out of the way, let’s move on to our graphics settings and some image quality comparisons.

Update: Quite a few people have pointed out that later levels (e.g. Act IV), and even more so higher difficulty levels (Hell), are significantly more demanding than the early going. That's not too surprising, but unfortunately I don't have a way of testing later areas in the game other than to play the game through to that point. If performance scales equally across all GPUs, it sounds like you can expect Act IV on Hell to run at half the performance of what I've shown in the charts. Give me a few weeks and I'll see if I can get to that point in the game and provide some additional results from the later stages.

Diablo III Graphics Settings and Image Quality
Comments

  • JarredWalton - Sunday, May 27, 2012 - link

    And funny enough, after additional investigation, the issue isn't throttling on the Acer but rather a higher clock on the GT 630M compared to the GT 540M. NVIDIA's updated specs page for the 630M lists 800MHz as the clock, but oddly their control panel is only reporting 475MHz on the ASUS. According to GPU-Z's Sensors tab, however, it really is running an ~800MHz core clock (1600MHz shaders), which accounts for the higher performance compared to the 672MHz GT 540M. I've updated the text in the article to explain this.
  • ananduser - Saturday, May 26, 2012 - link

    Suddenly those fancy, expensive ultrabooks (Apple or otherwise) seem like extremely poor deals for tech enthusiasts. Then again, they were always aimed at bloggers.
  • DanNeely - Saturday, May 26, 2012 - link

    ... and enthusiasts who want an ultra portable that's a PC not a fondleslab, and which is faster than an atom.
  • ananduser - Sunday, May 27, 2012 - link

    No offense, but enthusiasts (in the real sense of the word) are always more extreme than your average MBA-wielding blogger. If they wanted something light they would spare no expense and would have gone with a Vaio Z, or some crazy Japanese Fujitsu that is lithium made, or a modded Sony UX. PC hardware enthusiasm has nothing to do with Apple commodities that try to be as "safe" as possible.
  • Impulses - Monday, May 28, 2012 - link

    Suddenly? They were never marketed as gaming rigs, most don't even have dGPUs, and Diablo 3 isn't even one of the five most demanding games this year. I dunno what you're getting at; ultrabooks are still great for the purpose they're meant for. Can you get just as much done with an uglier/thicker/heavier $700 laptop? Sure, you might even get a dedicated GPU to go along with it... They're serving entirely different markets, though.
  • ananduser - Monday, May 28, 2012 - link

    Which is why I mentioned tech enthusiasts in my original comment. There's nothing that I dispute from your enumeration.
  • futurepastnow - Saturday, May 26, 2012 - link

    Or, perhaps I should say, a concern. You increase the detail setting and the resolution together.

    What about 1366x768 at high detail? Or 1920x1080 at low detail?
  • JarredWalton - Saturday, May 26, 2012 - link

    I have to stick to a subset of the possible resolution/detail settings or I'd be testing a single game 24/7 for a week. I've already spent probably 20 hours benchmarking Diablo III, and let me tell you: running the same three minute sequence at least a dozen times per laptop gets to be mighty damn tedious. I did run tests at some other settings, which I believe I commented on, but here's a bit more detail.

    For example, on the N56VM, 1080p with all settings maxed but Shadow Quality set to Low results in performance of 20.1 FPS/18.5 FPS for our test sequences -- so that one setting boosted performance by over 50% compared to having all settings at High/Max. What's more, I also tested at max detail 1080p but with Shadow Quality set to Off, and the scores are 27.1/24.8 -- another 35% improvement over Low shadows. Everything else combined (e.g. 1080p but all other settings at low) only accounts for probably 20%. I could test that as well if you really want, but I have other things to do right now.
  • futurepastnow - Saturday, May 26, 2012 - link

    I'm mostly thinking that a large majority of laptops sold, even now, have 1366x768 displays. It looks like all of the non-Intel laptops handle playable framerates with low detail at that resolution, so I'm curious how that performance falls as the detail goes up.

    In particular, can Llano and Trinity handle high detail at 1366x768? They are (or will be) sold in budget laptops that won't get high-res screens.

    However, I understand the time constraints you're working under. Thanks for the comparison, anyway.
  • kyuu - Saturday, May 26, 2012 - link

    I agree with this. I understand time constraints, but honestly, the paradigm that's being followed here (and with a lot of reviews) is simply not representative of real-world usage. It's not the case that people play with low details at low resolutions and high details at high resolutions. *Especially* when you're dealing with laptops. Generally, you're going to have the resolution at the display's native resolution, and going to work with the settings from there.

    In any case, the article is still appreciated, and it's possible, at least, to make an educated guess at how the game will run at various resolutions and settings based on the presented info. Definitely going to grab myself a nice Trinity-powered laptop soon as one meeting my desired specs comes out.

    Also, yet again we see that HD4000 does not match Llano, let alone exceed it, as I've seen some people spreading around.
