Nvidia GeForce 8800 GTX

The world’s most powerful graphics card, and the first to support DirectX 10.
by Md. Sadat Hussein

The next-generation game consoles Xbox 360 and PlayStation 3 are out now, and the Nintendo Wii is coming soon. So what about PC gaming? Well, with Windows Vista and DirectX 10, PC gaming is going to be even better than the consoles! Nvidia has just released its first DirectX 10 graphics card: the Nvidia GeForce 8800 GTX. According to a wide range of benchmark results, this is currently the world’s most powerful graphics card, beating both the Nvidia GeForce 7950 GX2 and the ATI Radeon X1950 XTX! Since this is a DirectX 10 graphics card, let’s first look at what DirectX 10 is.

DirectX 10

DirectX 10 is an evolutionary step over the current DirectX 9.0c. It introduces a new unified shader architecture with Shader Model 4.0. One of the new things in DirectX 10 is geometry processing on the GPU: until now the system CPU was in charge of this task, but in DirectX 10 it can be handled by the graphics processor. Geometry processing smooths curved surfaces, giving better quality in character animation, facial expressions and hair. DirectX 10 also provides far more resources to the GPU, improving 3D performance. In DirectX 9.0c, pixel and vertex shaders were separate; DirectX 10 cards have unified shader processors, meaning that any shader processor can handle either pixel or vertex workloads. This will improve game performance dramatically. The key differences in GPU resources between DirectX 9 and DirectX 10 are shown in the table below.

Resources            | DirectX 9     | DirectX 10
Temporary Registers  | 32            | 4,096
Constant Registers   | 256           | 16 x 4,096
Textures             | 16            | 128
Render Targets       | 4             | 8
Maximum Texture Size | 4,096 x 4,096 | 8,192 x 8,192

There are several other new features in DirectX 10, but in summary the goal of this new programming model is to reduce the system CPU’s role in 3D graphics: it tries to avoid using the CPU as much as possible.
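To make that programming model a little more concrete, here is a minimal C++ sketch of the Direct3D 10 entry point: creating a hardware device, which only a DirectX 10 class GPU such as the GeForce 8800 GTX can provide. It assumes the DirectX SDK headers and libraries are installed, compresses error handling to a single check, and is an illustration rather than production code.

// Minimal sketch: creating a Direct3D 10 device, the entry point to the
// DirectX 10 programming model described above. Assumes the DirectX SDK
// is installed; a real application would also create a swap chain and
// render targets.
#include <windows.h>
#include <d3d10.h>
#include <cstdio>

#pragma comment(lib, "d3d10.lib")

int main()
{
    ID3D10Device *device = NULL;

    // Ask for a hardware device; only a DirectX 10 class GPU can supply one.
    HRESULT hr = D3D10CreateDevice(
        NULL,                        // default adapter
        D3D10_DRIVER_TYPE_HARDWARE,  // hardware acceleration
        NULL,                        // no software rasterizer module
        0,                           // no creation flags
        D3D10_SDK_VERSION,
        &device);

    if (FAILED(hr)) {
        std::printf("No DirectX 10 capable GPU/driver found.\n");
        return 1;
    }

    // Geometry shaders are a first-class stage in Direct3D 10: they are
    // created from compiled bytecode through the same device interface as
    // vertex and pixel shaders (CreateVertexShader / CreatePixelShader /
    // CreateGeometryShader).
    std::printf("Direct3D 10 device created.\n");

    device->Release();
    return 0;
}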

Nvidia GeForce 8800 GTX

Staying close to the DX10 specification, the GeForce 8800 GTX fully complies with the DirectX 10 standard, including Shader Model 4.0, the new data storage and transfer specifications, geometry shaders and stream output.

Nvidia said: "The GeForce 8800 design team realized that extreme amounts of

hardware-based shading horsepower would be necessary for high-end DirectX 10 3D games. While DirectX 10 specifies a unified instruction set, it does not demand a unified GPU shader design, but Nvidia GeForce 8800 engineers believed a unified GPU shader architecture made most sense to allow effective DirectX 10 shader program load-balancing, efficient GPU power utilization and significantly improved GPU architectural efficiency."

Architecture of Nvidia GeForce 8800 GTX

Instead of the graphics chip having separate shader engines for each task, for instance separate pixel shader and vertex shader units, the GPU now has one big engine that can be programmed on the fly according to the task at hand: pixel shading, vertex shading, geometry and physics. This new architecture also makes it easier to add new shader types in the future.

The reason behind the unified architecture is that in some situations the GPU was using all the shader engines of one type (pixel shader engines, for example), and even queuing tasks for them, while the engines of another type sat idle.

So Shader Model 4.0 allows any shader engine to run any type of shader work: pixel, vertex, geometry and physics.
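The following is a purely illustrative C++ sketch of that load-balancing idea, with no relation to Nvidia’s actual hardware scheduler: a single pool of generic shader units pulls vertex, pixel and geometry jobs from one queue, so no unit type sits idle while another is backed up. All names and numbers in it are made up for the example.

// Illustrative only (not Nvidia's scheduler): a unified pool of shader
// units takes whichever job is pending, regardless of its type.
#include <queue>
#include <cstdio>

enum WorkType { VERTEX, PIXEL, GEOMETRY };

struct ShaderJob {
    WorkType type;   // in a unified design the scheduler can ignore this
    int cycles;      // hypothetical cost of the job
};

int main()
{
    // A frame that is heavily pixel-bound: in a fixed split, vertex units
    // would idle while pixel units queued up work.
    std::queue<ShaderJob> jobs;
    jobs.push({VERTEX,   2});
    jobs.push({PIXEL,   10});
    jobs.push({PIXEL,   10});
    jobs.push({PIXEL,   10});
    jobs.push({GEOMETRY, 3});

    const int unifiedUnits = 4;              // every unit can run any job
    int unitBusyUntil[unifiedUnits] = {0, 0, 0, 0};

    // Unified scheduling: each job goes to whichever unit frees up first,
    // whether it is vertex, pixel or geometry work.
    while (!jobs.empty()) {
        ShaderJob job = jobs.front();
        jobs.pop();

        int best = 0;
        for (int u = 1; u < unifiedUnits; ++u)
            if (unitBusyUntil[u] < unitBusyUntil[best])
                best = u;

        unitBusyUntil[best] += job.cycles;
    }

    int finish = 0;
    for (int u = 0; u < unifiedUnits; ++u)
        if (unitBusyUntil[u] > finish)
            finish = unitBusyUntil[u];

    std::printf("Unified pool finishes the frame in %d cycles.\n", finish);
    return 0;
}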

Features of Nvidia GeForce 8800 GTX

• GeForce 8800 GTX has 128 shader engines (streaming processors, arranged as 8 clusters of 16) and a 384-bit memory interface (six 64-bit partitions); see the sketch after this list for the arithmetic.

• Support for thousands of independent, simultaneous threads, a technology Nvidia calls GigaThread.

• 16x anti-aliasing and 128-bit floating-point precision HDR (High Dynamic Range); GeForce 6 and 7 use a 64-bit precision engine. Nvidia calls these two technologies “Lumenex”.

• Nvidia calls its physics simulation technology “Quantum Effects”. Physics simulation improves the realism of effects like smoke, fire and explosions.

• Hardware-based features to enhance high-definition video quality, which Nvidia calls “PureVideo HD”. Two new features arrive with the GeForce 8 series: HD noise reduction and HD edge enhancement. The GeForce 8 series also supports HDCP decryption, so protected HD DVD and Blu-ray titles can be played on PCs. Of course, GeForce 8 keeps all the enhancements of previous Nvidia chips, such as de-interlacing, high-quality scaling, inverse telecine and bad edit correction.

• Recommended for resolutions starting at 1600 x 1200.
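As promised above, here is a quick sketch of the arithmetic behind those figures. The cluster count and memory bus partitioning come from the feature list; the 1,800 MHz effective GDDR3 memory clock used for the bandwidth estimate is the commonly cited launch spec and is an assumption added here, not something stated in the article.

// Quick arithmetic behind the quoted configuration. The memory clock is an
// assumed launch-spec figure, not taken from the article text.
#include <cstdio>

int main()
{
    const int spPerCluster     = 16;  // streaming processors per cluster
    const int clusters         = 8;
    const int bitsPerPartition = 64;  // memory bus bits per partition
    const int partitions       = 6;

    const int streamingProcessors = spPerCluster * clusters;        // 128
    const int memoryBusBits       = bitsPerPartition * partitions;  // 384

    // Bandwidth = bus width in bytes x effective memory clock.
    const double effectiveMemClockMHz = 1800.0;  // assumed GDDR3 launch clock
    const double bandwidthGBs =
        (memoryBusBits / 8.0) * effectiveMemClockMHz * 1e6 / 1e9;

    std::printf("Streaming processors:  %d\n", streamingProcessors);
    std::printf("Memory interface:      %d-bit\n", memoryBusBits);
    std::printf("Peak memory bandwidth: %.1f GB/s\n", bandwidthGBs);
    return 0;
}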





