How Call of Duty: Modern Warfare 2019 Plays on Different Graphics Cards
Call of Duty has rebooted one of its most influential entries with 2019's Call of Duty: Modern Warfare, promising the classic gunplay the franchise is known for along with all the improvements possible in a modern release.
| Component | Minimum requirement |
| --- | --- |
| CPU | Intel Core i3-4340 or AMD FX-6300 |
| GPU | Nvidia GeForce GTX 670 / GTX 1650 or AMD Radeon HD 7950 (DirectX 12.0 compatible) |
On PC, its minimum requirements are not much higher than those of other 2019 releases, but what does that mean in practice? Can you play it on an integrated GPU? How does Nvidia's still-popular 10-series perform? Let's find out!
| GPU | Settings | Resolution | Framerate |
| --- | --- | --- | --- |
| GTX 1050 | Lowest | 1080p at 70% resolution scale (1344x756) | 60 fps |
| GTX 1060 | Max | 1080p | 70-80 fps |
| GTX 1060 | Low + medium textures, decals and AA | 1080p | 80 fps |
| GTX 1060 | Low + medium textures, decals and AA | 1440p | 70 fps |
| GTX 1080 | Max | 1080p | 90 fps |
| GTX 1080 | Low + high textures, decals and AA | 1080p | 100 fps |
| GTX 1080 | Low + high textures, decals and AA | 4K | 60 fps |
The Settings Screen
The settings screen for Call of Duty: Modern Warfare is pretty straightforward, with sections for details and textures, shadows and lighting, and post-processing effects, and each individual option is clearly labelled from lowest to highest.
Like most current games, Call of Duty: Modern Warfare includes a versatile Render Resolution scaler that allows the resolution of 3D elements to be lowered and then resampled back into the window's native resolution without affecting the UI.
Sadly, the resolution scaler only goes down to 66%, which is a tad higher than the floor other modern games allow and an obstacle for older or cheaper GPUs. But as we are about to see, this game is not very friendly to entry-level hardware in several ways.
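As a quick sketch of the arithmetic involved (the helper function here is hypothetical, not anything from the game's code), the render resolution at a given scale works out like this:

```python
def render_resolution(native_w: int, native_h: int, scale_pct: float) -> tuple[int, int]:
    """Compute the 3D render resolution for a given resolution-scale percentage.

    The UI stays at the native resolution; only the 3D scene is rendered
    smaller and resampled back up.
    """
    return round(native_w * scale_pct / 100), round(native_h * scale_pct / 100)

# 70% of 1080p, the setting used in the GTX 1050 test:
print(render_resolution(1920, 1080, 70))   # (1344, 756)

# The scaler's 66% floor at 1080p:
print(render_resolution(1920, 1080, 66))   # (1267, 713)
```

So even at the scaler's minimum, a 1080p window still renders roughly 1267x713 worth of pixels, which is noticeably more work than the 720p that many entry-level GPUs are comfortable with.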
AMD's Vega integrated GPUs, such as the Vega 8 or Vega 11, are usually great cheap options for 720p or light 1080p gaming.
Strangely, the current version of Modern Warfare seems to have an issue with these chips. The game initially appears to start fine, but trying to enter any actual game scene crashes it with a DirectX error.
I tested the Ryzen 3 3200G + Vega 8 and the Ryzen 5 3400G + Vega 11 with drivers 19.9.2 and 19.10.2, but the problem persisted.
Intel integrated graphics sadly did not fare much better. I tried the game on a Dell XPS 13 with a Core i7-8550U CPU and Intel UHD 620 graphics, a machine I have used for plenty of games. The game did not start at all, throwing an incompatibility error message on launch.
Call of Duty: Modern Warfare on GTX 1050
Since the game seems to require a dedicated GPU, I decided to begin with the GTX 1050 2GB, one of Nvidia's entry-level gaming GPUs of the 10-series. The humble 1050 is usually one of the cheapest ways to get gaming on older titles, or newer ones at lower settings, and Modern Warfare is no exception.
I paired this with a Ryzen 5 3400G and 16 GB of DDR4 RAM. With the lowest settings the game allows, the resolution set to 1080p and the scaler at 70% (a render resolution of 1344x756), the game maintained its 60 fps cap even in the most challenging moments.
For testing, I used the Ground War multiplayer mode, since it tends to have the biggest maps, the highest player counts and plenty of vehicles raining down constant destruction.
At the lowest settings, the game is not exactly easy on the eyes, and the lack of anti-aliasing and decent lighting is obvious, but a lot of people playing competitive games like this one are more interested in function than form. If that is the case for you as well, the GTX 1050 will not disappoint.
The game does like its memory, using almost all of the 2 GB of VRAM and over 10 GB of system RAM even at the lowest settings. Let's see how an upgrade fares.
Call of Duty: Modern Warfare on GTX 1060
We now jump to the more powerful GTX 1060 6GB. While it is starting to show its age, a 1060 with 6 GB of VRAM is still a pretty inexpensive way to get decent settings in modern games at 1080p.
Modern Warfare is another example of that. Going from one extreme to the other, the game can now max out every setting (except ray tracing) at 1080p while keeping 60 fps at a minimum, and often more.
Of course, the visual difference is quite striking, particularly in the overall lighting, the anti-aliasing and the full 1080p resolution. Some might prefer to prioritize competitive responsiveness over visual settings, and for them, two interesting options emerge.
The first is to prioritize framerate. By lowering all the settings but leaving textures at medium and bullet decals enabled, we can squeeze out the most performance while keeping some visual flair. This lands closer to 80 fps on average, which works great on a 90Hz (or faster) monitor with any sort of adaptive sync.
Another option is to use a higher resolution. I bumped the game to 1440p at the same settings and was happy to see that the GPU was used to its max while keeping a nice 60 fps or more even at the worst moments.
Given the nature of the game, I would lean towards higher framerates, but that depends on your monitor supporting high and variable refresh rates; if yours does not, focus on settings or resolution instead.
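To make the framerate trade-off concrete, here is a tiny sketch (a hypothetical helper, just for illustration) of how much less time each frame spends on screen as fps climbs, which is part of why higher framerates feel more responsive:

```python
def frame_time_ms(fps: float) -> float:
    """Time each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

for fps in (60, 80, 100):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 60 fps -> 16.7 ms per frame
# 80 fps -> 12.5 ms per frame
# 100 fps -> 10.0 ms per frame
```

Going from 60 to 80 fps shaves roughly 4 ms off every frame, a small but real reduction in how stale the image you react to can be.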
Call of Duty: Modern Warfare on GTX 1080
Given the previous results, it is no surprise that a GTX 1080 can max out the game at 1080p and still deliver between 80 and 90 fps in its most intense moments, making it a great choice for both graphics and framerate.
Let's explore options from here. Since the computational power is there, we can raise the resolution to 1440p at the same settings and expect 60-70 fps in most scenarios. If you are looking for a balance between framerate and resolution, this might be your best bet.
When I attempted to play at full 4K, the GTX 1080 was not as comfortable, sticking to a still-respectable average of 45 fps at maximum settings.
If we drop down to the lowest settings, except for medium shadows, high textures, bullet decals enabled, and SMAA anti-aliasing, then we can keep around 55-60 fps. The game still looks decent enough, and as I have mentioned in the past, many people prefer higher framerates over better details.
Call of Duty: Modern Warfare plays well on budget graphics cards and excels on mid-range models. You can get a smooth 60 fps even on a GTX 1050 at low resolution and settings, and image quality improves dramatically as you move up to a GTX 1060. A GTX 1080 can even play at 4K with some compromises. However, the game currently does not work with integrated graphics at all, so you will need a PC with a discrete GPU.
Call of Duty: Modern Warfare PC settings guide, system requirements, benchmarks, ray tracing, and more
Call of Duty: Modern Warfare is a surprising reclaiming of the franchise's former glory, marrying modern gameplay and visuals with Modern Warfare's iconic story and settings. These modern visuals can look great on almost any machine, with budget PCs putting out extremely playable framerates even at maxed out settings, though not including ray tracing. Turning on the ray tracing effects with RTX hardware isn't too rough, but GTX graphics cards will struggle mightily.
A notable hardware story with Call of Duty is its odd performance on AMD hardware. Radeon graphics cards put up a good fight against their Nvidia competitors, but the Ryzen processors don't look so hot. It's not that you can't play Modern Warfare on a Ryzen CPU, but the latest Ryzen 9 3900X flagship performs worse than Intel's last-gen Core i5-8400 at 1080p, and even the Core i3-8100 jumps ahead at 1440p and 4K. It's not all smooth sailing for Intel, either, as Hyper-Threading appears to hurt performance. That might explain the Ryzen results as well; the game engine doesn't seem to deal well with high thread counts.
A word on our sponsor
As our partner for these detailed performance analyses, MSI provided the hardware we needed to test Call of Duty: Modern Warfare (2019) on a bunch of different AMD and Nvidia GPUs, multiple CPUs, and several laptops. See below for the full details, along with our Performance Analysis 101 article. Thanks, MSI!
Features are fairly standard for an Activision title these days. Resolutions of all shapes and sizes are supported, including ultrawide and multi-monitor setups. In testing, 16:9, 21:9, and even 32:9 all look correct, with the latter two giving a much wider field of view by default. The intro movies are also in 21:9, which is a nice touch. Nearly everything else is as expected for a AAA game of this caliber: controller support, a healthy number of graphics settings to play with, and the usual FOV and framerate limits.
CoD:MW does limit the HUD removal options, however. You can't simply turn the HUD off; instead you have to play in Realism mode. In the singleplayer campaign, that's a higher difficulty, while in multiplayer you're left with a small unremovable reticle in the center of the screen and a tiny killstreak pager that beeps once your perks are available. Realism also heavily buffs weapon damage and lowers defensive stats, making the game much less forgiving. If this style of play is your cup of tea, great! Otherwise, you're stuck with an always-on HUD. There's also Ansel support for capturing images, though only on Nvidia GPUs.
Call of Duty: Modern Warfare system requirements
Infinity Ward built Modern Warfare to be played on modern hardware, though you can certainly get by with much less. Here's the developer's minimum and recommended specs:
Minimum PC specifications:
- OS: Windows 7 64-bit (SP1) or Windows 10 64-bit
- CPU: Intel Core i3-4340 or AMD FX-6300
- Memory: 8GB
- GPU: Nvidia GTX 670/GTX 1650 or AMD HD 7950
- DirectX 12.0 compatible
- Storage: 175GB
Recommended PC specifications:
- OS: Windows 10 64-bit
- CPU: Intel Core i5-2500K or AMD Ryzen 5 1600X
- Memory: 12GB
- GPU: Nvidia GTX 970/GTX 1660 or AMD R9 390/RX 580
- DirectX 12.0 compatible
- Storage: 175GB
Modern Warfare's minimum spec is a bit higher than other games', suggesting current-gen graphics cards. And that's overlooking the mind-boggling 175GB of storage space for the game itself. That size makes even Red Dead Redemption 2 look pint-sized, so I would advise setting aside an afternoon just for the download. The game originally launched at 112GB, but two updates have already pushed that to 141GB, and future updates are planned.
The recommended specs aren't much higher, though the fact that a 2nd Gen Intel chip is paired against a Ryzen 5 suggests Infinity Ward knew AMD's CPUs would struggle a bit. Based on our testing and a little bit of number-comparing for the older parts, the minimum spec should have no problem breaking 60fps at 1080p min, while recommended specs would net somewhere around 60 fps at 1080p max.
Call of Duty: Modern Warfare settings overview
Modern Warfare doesn't provide any graphics presets for you to choose from. Instead, it gives you 16 individual graphics quality settings to play around with, and most of these hardly affect performance. Lower-end cards like a GTX 1650 take a hit to performance when going from minimum to maximum, but modern midrange and high-end cards go from extremely playable to slightly less extremely playable at 1080p. Unless you need to make every bit of VRAM count, I'd suggest aiming for maxed out (or nearly maxed out) settings; the game looks significantly better for not much of a hit to performance.
There are two exceptions to the above. First, AMD cards with only 4GB of VRAM failed to run Modern Warfare at maximum quality: they simply crashed to desktop. Updated drivers or a game patch may fix this, especially as the 2GB GTX 1050 didn't completely fail to run.
Second, DirectX Raytracing (DXR) is available on Nvidia cards, as Modern Warfare has joined the ranks of games with RTX support. But don't expect Control levels of graphical improvement; ray tracing in Modern Warfare is almost unnoticeable. Shadows and light casting look better with ray tracing, but not by a significant margin, and after Control, it's odd not to have accurate reflections. For roughly 30 percent lower performance you get very little visual improvement, so unless you really need every setting maxed for cool points, leaving ray tracing off isn't a bad idea.
That goes double if you have a GTX graphics card with 6GB or more VRAM. You can technically enable ray tracing, but the GTX 10-series hardware sees framerates cut to one third their non-ray traced levels, while GTX 16-series hardware doesn't quite drop by half.
Call of Duty: Modern Warfare graphics card benchmarks
For testing CoD, we used the singleplayer campaign on the first mission. It was reasonably demanding and, unlike multiplayer, it's possible to roam around without repeatedly dying. Performance can be lower or higher in other maps and game modes, but our test at least gives a baseline level of performance.
For these tests, we only used the minimum (all settings to low) and maximum (all settings to high except DXR) options for 1080p, and full maximum quality at 1440p and 4K. We also tested with DirectX raytracing shadows enabled on many of the GPUs, using the same settings otherwise.
Our GPU testbed is the same as usual, an overclocked Core i7-8700K with high-end storage and memory. This is to reduce the impact of other components, though we'll test CPUs and laptops below.
At 1080p minimum, Modern Warfare doesn't really look like Call of Duty. The flat, lifeless colors and low-resolution textures can look blocky and ugly, particularly on some of the effects like fires and lights. The good news is that every graphics card we tested is very playable. Even the RX 560 and GTX 1050 average 60 fps, with occasional dips only going down to 50. Anything faster won't have any problems.
Going up the ladder presents a very familiar graph, with most cards exactly where you'd expect. And in a bit of a win for AMD, the normally lackluster Vega 56 and 64 beat out the GTX 1070 by a solid margin, though the RTX 2060 manages to match them. Meanwhile, every card in the top 10 is pretty much in the same places they take in almost every other hardware test.
Ray tracing (DXR) with everything else turned down doesn't really make much sense, but it does show that it's the DXR calculations causing the massive drop in performance. The 2060 drops from 180 fps to 'only' 128 fps, and the particle effects and textures still look awful.
The GTX 1070 meanwhile plummets from 143 fps without DXR to just 40 fps with DXR. Yikes. The newer Turing architecture GTX 1660 meanwhile goes from 137 fps without DXR to 83 fps with DXR. That concurrent FP + INT hardware in Turing certainly can be useful.
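To put those architecture differences in perspective, the relative cost of enabling DXR can be worked out directly from the fps figures quoted above (the helper function is a quick illustrative sketch, not anything from our benchmark tooling):

```python
def dxr_cost(fps_off: float, fps_on: float) -> float:
    """Percentage of performance lost when enabling DXR."""
    return (fps_off - fps_on) / fps_off * 100

# Figures from the 1080p minimum-quality DXR tests above:
print(f"RTX 2060: {dxr_cost(180, 128):.0f}% slower")   # RTX 2060: 29% slower
print(f"GTX 1070: {dxr_cost(143, 40):.0f}% slower")    # GTX 1070: 72% slower
print(f"GTX 1660: {dxr_cost(137, 83):.0f}% slower")    # GTX 1660: 39% slower
```

Pascal loses nearly three quarters of its performance, while Turing parts, with and without RT cores, stay in the 29-39% range.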
Cranking graphics settings up to maximum, the story largely remains the same. All cards take about a 25% dip in performance, and the only changes are with cards that were already within a handful of frames of each other. The GTX 1050 drops to around console framerates, and anything slower could be deemed unplayable. AMD's 4GB cards also failed to run and aren't present in these charts.
No cards make surprising rises or falls, and curiously, all cards drop at almost the same rate. That means, at least with our overclocked Core i7-8700K test CPU, we're almost purely GPU limited, even with the RTX 2080 Ti. The game also looks much nicer, especially in night missions like "Going Dark" that depend on shadows and reflections to truly stand out.
If you have an RTX GPU, enabling DXR is certainly still possible. Even the RTX 2060 stays above 60 fps on minimums. The GTX 1660 cards also do surprisingly well—again, thanks to the concurrent FP and INT hardware in Nvidia's Turing architecture. A GTX 1080 Ti meanwhile can only just barely average 60 fps, with stutters from minimum fps dropping into the 30s. If you want to try ray tracing, in other words, you really need a Turing GPU.
Now the culling begins, with anything short of an RX 580 dropping below 60fps and only cards above the 2060 Super managing to stay above 100fps. For singleplayer experiences, I would say 30-40fps is playable, but in online titles, where refresh rate, framerate, and input lag all affect how well you play, I would avoid dropping below 60 fps at all costs. Keep your 1060s at 1080p, lest your K/D ratio drop to unforgivable lows.
The high-end graphics cards mostly stay in the same positions as 1080p max, with the 1080 Ti and RX 5700 XT swapping spots. Also note that not even a 2080 Super can max out a 144Hz 1440p display, and even the 2080 Ti dips below that on minimums.
DXR naturally makes things even more demanding. For improved shadows that you may or may not notice in the heat of battle, the RTX 2060 goes from buttery smooth to I-can't-believe-it's-not-butter. None of the GTX cards are even worth discussing at 1440p, but the 2070 Super and above keep minimums above 60 as well. But the most competitive gamers are better off with a 144Hz or even 240Hz monitor running at 1080p.
The pinnacle of PC gaming, 4K ultra (or maximum, since there is no ultra preset) is surprisingly forgiving to our GPUs. While every card below the GTX 1070 is now borderline unplayable for a multiplayer shooter, everything from the RTX 2070 and up can hit the coveted 60 fps.
For AMD, the Radeon VII and RX 5700 XT trade places, finally favoring the older but theoretically more powerful GPU. Both sit right on the 60 fps threshold, along with the GTX 1080 Ti. Only the RTX 2070 Super and beyond consistently reach 60 fps, with the 2080 Ti delivering an impressive 94 fps average. At $499 for the cheapest card that manages it, 4K 60 fps doesn't come cheap, but there's still a wide field of viable cards for peak performance.
Or you can enable DXR and drop the 2080 Ti to the point where it waffles around the 60 fps mark, depending on the scene, while every other GPU falls shy of 60. And no, you can't use SLI to boost performance with DXR, sorry.
Call of Duty: Modern Warfare CPU benchmarks
Now to address the elephant in the room. For some reason, Ryzen takes a huge performance hit in Call of Duty. The Ryzen 5 3600 and Ryzen 9 3900X fall well behind their usual Intel rivals. The previous generation Ryzen 5 2600 looks even worse. At 1080p, the R9 3900X has trouble keeping up with the i5-8400, and the story only gets worse moving into higher resolutions. Here are the graphs below:
At 1080p min and max, the two Ryzen 3000 series chips we tested fall 30 frames behind the i5-8400, with the older Ryzen 5 2600 trailing them by an additional 30 fps at minimum quality. Intel's products all line up in order, with the overclock on our i7-8700K barely making any difference versus stock. Only the 4-core/4-thread i3-8100 really falls off the pace.
DXR drops performance across the board, with a much heavier GPU load, but the Ryzen CPUs all continue to trail the Core i5. The faster Intel CPUs (i.e., not the i3) with ray tracing perform almost as well as the Ryzen chips without it.
When we move up to 1440p or 4K, there's some interesting moving around in the ranks. The i5-8400 jumps up to second place, surpassing both i7-8700Ks. The i3-8100 also makes a surge, joining the rest of Intel handily above the Ryzen family, though minimum fps is still lower at 1440p.
This seeming superiority of the slower i5-8400 processors doesn't necessarily indicate an Intel preference, but rather that the game engine doesn't quite know what to do with higher thread count processors. Having no Hyper-Threading on the i5-8400 overcomes even a 1000MHz deficit relative to the overclocked 8700K. Even though we live in a time where real-time ray tracing exists, we still don't have nice multi-core support.
Call of Duty: Modern Warfare laptop benchmarks
The three laptops MSI provided feature an RTX 2080, an RTX 2070 Max-Q, and an RTX 2060, respectively. Based on general rules of thumb for laptop GPUs, these shouldn't perform too terribly in Modern Warfare, especially since we're limited to 1080p, though DXR could be a bit of a problem on the lower-spec models.