Since so many graphics card vendors have been claiming that their latest generation of cards supports 4K, we decided to put those claims to the test and see just how well they actually handle 4K (4096 x 2160 resolution). That led us to borrow a 36-inch 4K monitor, the FDH3601, from our friends over at EIZO. This monitor served as our test bed for 4K testing, helping us determine which card delivered the best overall 4K experience.
The two graphics cards that we tested were the NVIDIA GeForce GTX 680 and the AMD Radeon HD 7970. Both were reference cards. We also tested the XFX Radeon HD 7970, a reference-clocked card with a non-reference cooling solution, to see how well the non-reference design compared to the reference design. Unfortunately, due to the time constraints of this article, we were unable to test any non-reference GTX 680 4GB cards. If you guys are a little disappointed, we share the sentiment; the omission is partly due to the fact that GTX 680s are still in short supply and that some board vendors are extremely stingy.
We will be pitting the GTX 680 against the HD 7970 in the 4K desktop experience, 4K photo experience and 4K gaming experience. We will also be evaluating 4K gaming performance on both cards while monitoring memory usage to see whether or not the GTX 680's 2GB of GDDR5 is enough for 3D graphics at 4K. Our setup consists of an Intel Core i7 3960X, a Gigabyte X79-UD3, 16GB of Kingston 1600MHz DDR3 and a Thermaltake 1475W Tough Power Gold PSU, all running the latest WHQL drivers from both NVIDIA and AMD. We are therefore confident that our setup will not in any way limit the ability of either graphics card to properly process and display 4K.
For connectivity, we were forced to use two DVI cables (provided by EIZO) to connect the GTX 680 to the EIZO FDH3601. This monitor, as we showed in our preview, has a few drawbacks. One of the major ones is that it only accepts two DisplayPort cables or two DVI cables, with each cable driving one half of the panel rather than a single cable driving the whole monitor, which is a little disappointing. We would have preferred a single DisplayPort 1.2 cable delivering the full 4096 x 2160 resolution.
The monitor itself weighs about 60lbs and was neither easy nor fun to carry; we do not recommend moving it anywhere once you've got it installed. The nice thing about the monitor is that it offers quite a bit of setup flexibility: you can tilt the display up and down, adjust its height, and rotate it from the base using the built-in Lazy Susan-style swivel. For the sake of comparison, we set it up next to our ASUS 23.6″ 1080P monitor.
4K Video Playback
For our testing suite, the first thing we tested was 4K video playback. Ordinarily this task would require going over to YouTube, finding a 4K video or two to play back, and then hoping that any lag or stuttering isn't the fault of Flash, the browser, or the internet connection. To remove these confounding variables from our testing, we needed our own 4K footage. Unfortunately, during our testing we were not able to acquire TimeScapes in 4K to use as a test file. Thankfully, though, we do have a 4K video production studio and were able to source our own footage, using a local 4K video file for the testing.
Based upon our experience, the GTX 680 and HD 7970 both performed very well with the 4K video file. Because the GTX 680's drivers didn't fully support the EIZO FDH3601's dual-input configuration, we had to minimize the VLC window and then expand the video to fit the screen in order to get the full 4K video experience. The video ran well in both cases, and we'd consider 4K video playback a tie between the two.
Photoshop CS6 and Desktop
The second application in which we tested the two cards against each other was Adobe's Photoshop CS6. For this, we simply installed CS6 and then swapped cards to see which one gave us the better performance and user experience. In terms of performance, the difference between the cards was not noticeable at all. The disappointment we encountered while testing CS6 was that the NVIDIA GTX 680 did not properly support full-screen applications: because it detected the monitor as two displays, it would only allow an application to be maximized across one half of the screen.
We experienced this across the board in Windows when using the NVIDIA GTX 680; the taskbar, for example, only covered half of the display. With the XFX AMD HD 7970, we simply enabled Eyefinity and the system treated the EIZO 4K display as one large display rather than a combination of two. Because of this, the entire Windows and Photoshop experience was much more enjoyable and natural with the AMD HD 7970 than with the NVIDIA GTX 680. Without a doubt, in this scenario the AMD card won against the NVIDIA card, purely on the strength of its better user experience.
Although performance was virtually the same, the experience on the HD 7970 easily makes it the better choice in this arena. The gap only widened when we tried to play some games on the EIZO 4K monitor to see how each card performs with 3D graphics at 4K.
4K Video Games – Battlefield 3
Battlefield 3 would be our most intensive application, since we would be running it at maximum settings to evaluate how the graphics cards handled nearly 9 megapixels of 3D graphics. In our tests, we also ran the game at the monitor's maximum resolution. We wanted to test other games, but Battlefield 3 was the only recent title that we could get to work well on the GTX 680, and even then only with the help of a third-party scaling application. We also played DiRT 3 on the HD 7970, but since the 680 couldn't run it full screen we decided to stick with Battlefield 3.
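To put "nearly 9 megapixels" in perspective, the arithmetic below uses only the resolutions already quoted in this article: the 4096 x 2160 panel and the 1920 x 1080 monitor we set up beside it.

```python
# 4K (4096 x 2160) versus the 1080P panel we set up alongside it (1920 x 1080).
four_k_pixels = 4096 * 2160    # 8,847,360 pixels, i.e. ~8.8 megapixels
full_hd_pixels = 1920 * 1080   # 2,073,600 pixels

print(four_k_pixels)                              # 8847360
print(round(four_k_pixels / full_hd_pixels, 2))   # 4.27
```

In other words, each frame at this resolution carries well over four times the pixels of a 1080P frame, which is why maximum settings here are such a heavy load.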
In Battlefield 3, we wanted to test the claim that the GTX 680's 2GB of RAM isn't enough for certain applications and that 4GB cards are necessary. To test this claim, as well as to measure overall performance, we ran both cards at both the Ultra and High presets and logged their memory usage via GPU-Z. This let us monitor FPS as well as peak memory usage in each scenario.
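GPU-Z can write its sensor readings out to a comma-separated log file, and the peak value for a run can be pulled out with a short script. The sketch below is only illustrative: the column name `Memory Used [MB]` and the exact log layout are assumptions on our part, so check the headers in your own log before using it.

```python
import csv

def peak_memory_used(log_path, column="Memory Used [MB]"):
    """Return the highest value seen in the given sensor-log column.

    Assumes a comma-separated log with a header row; the column name
    is hypothetical and may differ between GPU-Z versions.
    """
    peak = 0.0
    with open(log_path, newline="") as f:
        # skipinitialspace handles logs that put a space after each comma
        for row in csv.DictReader(f, skipinitialspace=True):
            try:
                peak = max(peak, float(row[column]))
            except (KeyError, TypeError, ValueError):
                continue  # skip malformed rows or a missing column
    return peak
```

Pointed at a log captured during a benchmark run, this reduces thousands of samples to the single peak VRAM figure.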
Running Battlefield 3 at High settings at 4K resolution, the HD 7970 achieved an average of 33 FPS with a minimum of 26 and a maximum of 45. At this resolution and these settings there was absolutely no noticeable lag, and the game played like butter. The GTX 680 didn't do much worse, with an average frame rate of 30 FPS, a maximum of 45 and a minimum of 18. It also played smoothly for the most part, but there were a few instances where a short bout of lag was noticeable.
At Ultra settings, the gap between the GTX 680 and the HD 7970 actually narrowed on paper, with the glaring exception of the minimum frame rate. The HD 7970 averaged 22.95 FPS while the GTX 680 managed 21.77 FPS. Admittedly, the difference is only about 5%, but it is still a performance edge. The story is essentially the same for the maximum FPS, with the HD 7970 edging out the GTX 680 at 39 FPS to the 680's 37 FPS. The real shocker, though, is the minimum FPS, which is what affects visible lag and the overall gaming experience the most. The HD 7970 had a minimum of 14 FPS while the GTX 680's was nearly half that, at 8 FPS. As a result, the Battlefield 3 experience on the XFX HD 7970 was simply better than on the NVIDIA GTX 680.
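The roughly 5% figure above comes straight from the two averages:

```python
def pct_edge(a, b):
    """Percent frame-rate advantage of a over b."""
    return (a - b) / b * 100

hd7970_avg, gtx680_avg = 22.95, 21.77  # Ultra-preset averages measured above
print(round(pct_edge(hd7970_avg, gtx680_avg), 1))  # 5.4
```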
An interesting chart that we created may explain the huge disparity in minimum FPS between the GTX 680 and HD 7970, and we believe it comes down to onboard RAM. Looking at the graph we've made, you can see that the two cards use about the same amount of RAM on the High preset, but the gap is much wider on Ultra. On the High preset, the GTX 680 uses 1436 MB while the HD 7970 uses 1471 MB, a negligible difference, but a difference nonetheless. With the Ultra preset, however, we noticed two things. According to GPU-Z, the HD 7970 uses 2342 MB while the GTX 680 uses 2043 MB, which would normally lead one to conclude that the HD 7970 is simply the more memory-hungry card. But there is one factor many will overlook: the GTX 680 has 2048 MB of installed RAM while the HD 7970 has 3072 MB, which means the GTX 680 essentially exhausted all of its GDDR5 memory when trying to run Battlefield 3 at 4K on Ultra settings.
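The same GPU-Z readings show just how little headroom each card had left at Ultra (all values in MB, as measured above):

```python
# VRAM headroom at the Ultra preset, from the GPU-Z readings in this article.
cards = {
    "GTX 680": {"installed": 2048, "used": 2043},
    "HD 7970": {"installed": 3072, "used": 2342},
}
for name, c in cards.items():
    headroom = c["installed"] - c["used"]
    print(f"{name}: {headroom} MB of VRAM free")
# GTX 680: 5 MB of VRAM free
# HD 7970: 730 MB of VRAM free
```

Five megabytes of slack is effectively nothing, which fits the GTX 680's collapse to an 8 FPS minimum.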
We do want to note, however, that the menus and heads-up display in Battlefield 3 were fairly broken on both NVIDIA and AMD. Many of the controls and HUD icons throughout the game were barely functional or horribly incorrect on the HD 7970, while the GTX 680's controls, although not broken, were horribly shrunken and difficult to use.
Since the XFX HD 7970 came with a non-reference cooler, we were able to determine that it ran much quieter and cooler than the reference card we had previously used. The reference card from AMD hit 70C under full load and idled at 36C: admittedly not all that bad, but fairly loud. The XFX HD 7970, with the same clocks but a non-reference cooler, hit 60C under the same load and idled at 30C, a solid 10C drop while still being quieter than the reference design. If you plan on getting a reference-clocked HD 7970, we highly recommend you check out the XFX card based on this comparison alone, not to mention how good it looks.
Overall, based on our tests and experiences with EIZO's 4K monitor, the NVIDIA GTX 680 and the AMD HD 7970, the performance differential between the two cards is relatively small. The experience on the AMD HD 7970, however, is better simply because of Eyefinity and the extra 1GB of RAM. To get a quality gaming experience with a good frame rate at 4K on Ultra settings, you would need two HD 7970s in CrossFire.
We never thought we would be able to max out 2GB of VRAM on a graphics card, but take this monitor and a game like Battlefield 3 running at Ultra settings, and you realize that it is indeed possible. Truthfully, very few people are going to be driving 9 million pixels at Ultra settings in a video game, so for 99% of gamers out there 2GB is going to be more than enough. Anyone selling you on 4GB is essentially selling you a 4K solution, since we did actually run out of VRAM on the GTX 680 at Ultra settings in 4K.
Hopefully we've been able to answer a few of your questions about 4K and address some others along the way. Based upon everything we've seen with these cards, we are really looking forward to what both graphics card makers have to offer in the future, as neither card was able to successfully cope with Battlefield 3 at Ultra settings. Perhaps GK110, with added GPU performance and memory bandwidth, will edge out the HD 7970, and perhaps NVIDIA will improve the GeForce multi-monitor experience to reach the level of AMD's Eyefinity.
In the meantime, though, there are tradeoffs to consider with either company. AMD's drivers no longer come out on a monthly basis and have historically been difficult to deal with. As it stands now, at lower resolutions the GTX 680 will beat out the HD 7970, but as you move up in resolution the HD 7970 begins to pull away. NVIDIA certainly needs to address this as we move towards a world where 4K slowly becomes the new 1080P and computer users begin to demand higher resolutions.
Original Author: Anshel Sag
This news article is part of our extensive archive on tech news from the past 10 years. For up-to-date stories, we recommend a visit to our PC News section on the front page.