The rumors are turning into a pre-launch mountain of news. According to popular technology websites, on Monday, February 18th at 12:01AM EST Nvidia will lift the NDA on the GeForce GTX Titan, the “world’s most powerful graphics card.”
Our sources point to other dates that week, such as Tuesday, or even February 25th/26th, but we believe that is the standard tactic to find out who is leaking. Politics and sneaky tactics aside, this product is one to watch. Let’s move on to the specs, shall we?
Introducing the GeForce GTX Titan
As you know by now, the GeForce GTX Titan is not the GTX 780; that name is (currently) reserved for a product refresh coming later this year, which in turn will be followed by the Maxwell-based GeForce 800 series. The name Titan was deliberately put “out of sequence” to underline the importance of the product, and to keep it from looking outdated when the GTX 780 steps onto the stage.
The GK110 GPU represents the heart of this card, featuring 14 SMX clusters (1 is disabled for yield reasons) for a grand total of 2,688 CUDA cores. This positions the product between the two shipping products based on the GK110 core: the GTX Titan sits below the K20X, which likewise ships with 14 SMX clusters active (2,688 CUDA cores), and above the regular K20, which ships with 13 SMX clusters enabled (2,496 CUDA cores). The GTX Titan has 224 Texture Mapping Units (TMUs) and 48 Raster Operation Units (ROPs).
The chip itself runs at a reference clock of 875 MHz, though the two select vendors (ASUS and EVGA) have the option of custom-clocking their parts. For example, the ASUS GTX Titan allegedly comes clocked at an impressive 915 MHz. We did not receive any information on a Turbo mode for the card, nor on its power consumption, but you can expect the power consumption to be significantly higher than the Tesla K20X’s, given the increased clocks. Furthermore, the GTX Titan comes with 6GB of GDDR5 memory. You’ve guessed it right: this means the GK110 has all twelve 32-bit interfaces active, for a grand total of 384-bit. Given that the memory is clocked at 1.5GHz QDR, the end result is an effective “6GHz”.
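The bus-width and effective-clock arithmetic above is easy to verify. A minimal sketch, using the rumored figures rather than any confirmed NVIDIA spec:

```python
# Sanity check of the rumored memory configuration (not a confirmed spec).
interfaces = 12                      # all twelve 32-bit memory controllers active
bus_width_bits = interfaces * 32     # aggregate memory bus width
base_clock_ghz = 1.5                 # GDDR5 memory clock
effective_ghz = base_clock_ghz * 4   # QDR: four transfers per clock cycle

print(bus_width_bits)   # 384
print(effective_ghz)    # 6.0
```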
The board itself follows the design language introduced by GeForce GTX 690, so you can expect rich use of magnesium alloy to make the product look special. Images we saw showed uncanny resemblance to the GTX 690, as the shroud covers the card from front and back. However, the new branding is very much present. If you are interested in color coding, the board is metallic silver / grey with a black PCB, so good luck in finding the matching components.
Performance – Faster or slower than a Dual-GPU GTX 690? HD 7990?
If you decide to use this card for GPGPU workloads, for example password cracking, it should be able to beat the K20 and K20X without breaking a sweat. The company crippled Double Precision, but you can expect great Single Precision performance: 2,688 CUDA cores at 875 MHz, at two FLOPs per core per clock, works out to around 4.7 TFLOPS SP from a single chip. Double Precision follows the Kepler GeForce tradition of 1/24th of Single Precision performance. Yes, 4.7 TFLOPS SP and 196 GFLOPS DP, nicely protecting the Tesla K20/K20X and the upcoming Quadro K6000 products.
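The throughput arithmetic above can be sketched as follows; the core count and clock are the rumored figures, and the two-FLOPs-per-core-per-clock factor assumes one fused multiply-add per cycle:

```python
# Theoretical throughput from the rumored specs (not confirmed by NVIDIA).
cuda_cores = 2688
clock_ghz = 0.875              # rumored 875 MHz reference clock
flops_per_core = 2             # one fused multiply-add (FMA) per clock

sp_gflops = cuda_cores * flops_per_core * clock_ghz
dp_gflops = sp_gflops / 24     # Kepler GeForce runs FP64 at 1/24 the FP32 rate

print(round(sp_gflops))        # 4704, i.e. roughly 4.7 TFLOPS SP
print(round(dp_gflops))        # 196 GFLOPS DP
```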
The chip commands a pixel fillrate of 49 GPixel/s and a texel fillrate of 196 GTexel/s (once more, the texel fillrate is numerically identical to the FP64 Double Precision figure), while the 384-bit bus at six billion transfers per second is enough for an amazingly high 288.4GB/s.
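As a cross-check, the fillrate and bandwidth figures derive the same way from the rumored specs. Note that 48 ROPs at the 875 MHz reference clock actually work out to 42 GPixel/s, so the quoted 49 GPixel/s figure may assume a higher boost clock:

```python
# Fillrate and bandwidth from the rumored specs (not confirmed by NVIDIA).
tmus = 224
rops = 48
core_clock_ghz = 0.875
bus_width_bits = 384
effective_mem_gtps = 6.008          # 6008 MT/s, the usual GDDR5 "6 GHz" figure

texel_fill = tmus * core_clock_ghz              # GTexel/s
pixel_fill = rops * core_clock_ghz              # GPixel/s at reference clock
bandwidth_gbs = bus_width_bits / 8 * effective_mem_gtps

print(round(texel_fill))            # 196
print(round(pixel_fill))            # 42 (vs. the quoted 49)
print(round(bandwidth_gbs, 1))      # 288.4
```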
Preliminary results show that in 3DMark Fire Strike (Extreme), the GTX Titan scores 4,870. According to our own testing, this puts the part some 500 points (10%) behind two GeForce GTX 680s in SLI mode. For comparison, a single GTX 680 achieves around 3,000 points, almost 1,900 points fewer than the GTX Titan. Bear in mind this is still not enough to beat a single GeForce GTX 690.
You can expect in-depth reviews from the usual suspects, all of whom allegedly received three cards for SLI testing. Given that a single card will go for only about 10% less than a GTX 690, we’re talking about some serious Benjamins. The boards, at least at the beginning, will only be available through ASUS and EVGA in their respective markets.
But don’t expect this card to become a mainstream product: allegedly, fewer than 10,000 will be made. In any case, we’ll know more next week. Or are all those rumors on the Internet false, and do the add-in-board vendors not know what they are talking about (even though they sell NV GPU hardware as their lifeline)? Only time will tell.
Original Author: Theo Valich
This news article is part of our extensive Archive of tech news from the past 10 years. For up-to-date coverage, we recommend a visit to the PC News section on our frontpage.