GeForce FX Series

Overview
Nvidia’s GeForce FX series is the fifth generation of the GeForce line. With GeForce 3, Nvidia introduced programmable shader units into its 3D rendering pipeline, coinciding with Microsoft’s release of DirectX 8.0; the GeForce 4 Ti was an optimized version of the GeForce 3. With real-time 3D graphics technology continually advancing, the release of DirectX 9.0 brought a further refinement of the programmable pipeline with the arrival of Shader Model 2.0. The GeForce FX series is Nvidia’s first generation of Shader Model 2 hardware.
The Dawn demo was released by Nvidia to showcase the pixel and vertex shader effects of the GeForce FX series.
The FX features DDR, DDR2, GDDR-2, or GDDR-3 memory; a 130 nm fabrication process; and Shader Model 2.0/2.0a-compliant vertex and pixel shaders. The FX series is fully compliant with DirectX 9.0b. The GeForce FX also included an improved VPE (Video Processing Engine), first deployed in the GeForce4 MX. Its main upgrade was per-pixel video deinterlacing, a feature first offered in ATI’s Radeon but seeing little use until the maturation of Microsoft’s DirectX Video Acceleration and VMR (video mixing renderer) APIs. Among other features was an improved anisotropic filtering algorithm which, unlike that of the competing Radeon 9700/9800 series, was not angle-dependent and offered better quality, at some cost to performance. Though Nvidia reduced the filtering quality in the drivers for a while, the company eventually raised the quality again, and this feature remains one of the high points of the GeForce FX family.
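The degree of anisotropic filtering applied to a pixel is derived from how far that pixel’s footprint stretches in texture space. The sketch below shows the standard approximation from screen-space texture-coordinate derivatives (the form used in the EXT_texture_filter_anisotropic specification); an angle-invariant implementation applies the full ratio regardless of the footprint’s orientation, while an angle-dependent one caps it for footprints rotated away from the texture axes. This illustrates the general math only, not either vendor’s actual hardware.

```python
import math

def anisotropy_ratio(dudx, dvdx, dudy, dvdy):
    """Approximate degree of anisotropy of a pixel's texture footprint.

    Inputs are the screen-space derivatives of the texture coordinates
    (u, v). This axis-aligned approximation mirrors the one in the
    EXT_texture_filter_anisotropic spec, not any specific GPU circuit.
    """
    px = math.hypot(dudx, dvdx)      # footprint extent along screen x
    py = math.hypot(dudy, dvdy)      # footprint extent along screen y
    major, minor = max(px, py), min(px, py)
    return major / max(minor, 1e-9)  # probes taken along the major axis
```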
Hardware based on the NV30 project did not launch until near the end of 2002, several months after ATI had released its competing DirectX 9 architecture. The delay was partly caused by Nvidia’s transition to a 130 nm manufacturing process, which encountered unexpected difficulties. Nvidia had ambitiously selected TSMC’s then state-of-the-art (but unproven) low-k dielectric 130 nm process node. After sample silicon wafers exhibited abnormally high defect rates and poor circuit performance, Nvidia was forced to re-tool the NV30 for a conventional (FSG) 130 nm process node.
Marketing
While it is the fifth major revision in the series of GeForce graphics cards, it was not marketed as a GeForce 5. The FX (“effects”) in the name was chosen to emphasize the power of the latest design’s major improvements and new features, and to distinguish the FX series as something greater than a revision of earlier designs. The FX in the name was also used to market the fact that the GeForce FX was the first GPU to be a combined effort of the previously acquired 3dfx Interactive engineers and Nvidia’s own engineers.
The advertising campaign for the GeForce FX featured the Dawn fairy demo, the work of several veterans of the computer-animated film Final Fantasy: The Spirits Within. Nvidia touted it as “The Dawn of Cinematic Computing”, while critics noted that it was the strongest case yet of using sex appeal to sell graphics cards.
The initial version of the GeForce FX (the 5800) was one of the first cards to come equipped with a large dual-slot cooling solution. Called “Flow FX”, the cooler was very large compared to ATI’s small, single-slot cooler on the 9700 series, and its blower fan was also very loud. The card was jokingly referred to as the ‘Dustbuster’.
The Way It’s Meant to Be Played
Nvidia debuted a new campaign to motivate developers to optimize their titles for Nvidia hardware at the Game Developers Conference (GDC) in 2002. In exchange for prominently displaying the Nvidia logo on the outside of the game packaging, Nvidia offered free access to a state-of-the-art test lab in Eastern Europe that tested against 500 different PC configurations for compatibility. Developers also had extensive access to Nvidia engineers, who helped produce code optimized for Nvidia products.
Overall performance
GeForce FX 5900
When the FX 5800 finally launched, hardware analysts discovered through testing that the NV30 was not a match for the Radeon 9700’s R300 GPU. The most significant performance deficit occurred when the pixel shading hardware rendered Shader Model 2 effects at FP32 precision. Additionally, the 5800 had roughly a 30% memory bandwidth deficit compared to the Radeon 9700 Pro, caused by its comparatively narrow 128-bit memory bus. Nvidia used GDDR-2 because it supported much higher clock rates than the original DDR specification; even this memory could not clock fast enough to make up for the bandwidth of a 256-bit bus, however.
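The bandwidth gap follows directly from bus width and memory data rate. Below is a back-of-the-envelope sketch in Python; the clock figures are the commonly cited retail specifications, supplied here as assumptions for illustration since the article does not state them.

```python
# Peak memory bandwidth = bus width in bytes x effective data rate.
def peak_bandwidth_gbs(bus_width_bits, effective_mts):
    """Theoretical peak bandwidth in GB/s (1 GB = 10**9 bytes)."""
    return bus_width_bits / 8 * effective_mts * 1e6 / 1e9

# Assumed retail specs, for illustration:
fx5800u = peak_bandwidth_gbs(128, 1000)  # FX 5800 Ultra: 128-bit GDDR-2, 500 MHz (1000 MT/s)
fx5800  = peak_bandwidth_gbs(128, 800)   # FX 5800: 128-bit GDDR-2, 400 MHz (800 MT/s)
r9700p  = peak_bandwidth_gbs(256, 620)   # Radeon 9700 Pro: 256-bit DDR, 310 MHz (620 MT/s)

print(f"{fx5800u:.1f} / {fx5800:.1f} / {r9700p:.2f} GB/s")  # 16.0 / 12.8 / 19.84
print(f"Ultra deficit: {1 - fx5800u / r9700p:.0%}")         # ~19%
print(f"non-Ultra deficit: {1 - fx5800 / r9700p:.0%}")      # ~35%
```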
NV30’s GPU architecture is a 4×2 design capable of rendering 8 Z pixels, 8 stencil operations, 8 textures, and 8 shader operations per clock. It can only draw 4 color pixels per clock, however, which puts its pixel fill rate at half that of R300. In games making heavy use of stencil shadowing, NV30 did benefit from its 8 operations-per-clock capability, because the engine performs a Z-only pass; this was not a typical rendering scenario, however.
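In rough numbers the trade-off looks as follows; the core clocks are the usual retail specifications, assumed here for illustration.

```python
def throughput_mops(ops_per_clock, core_mhz):
    """Theoretical throughput in mega-operations per second."""
    return ops_per_clock * core_mhz

nv30_color  = throughput_mops(4, 500)  # NV30 color pixels/clock at 500 MHz
nv30_z_only = throughput_mops(8, 500)  # NV30 Z/stencil-only pass
r300_color  = throughput_mops(8, 325)  # R300 (Radeon 9700 Pro) at 325 MHz

print(nv30_color, nv30_z_only, r300_color)  # 2000 4000 2600
```

Despite the higher clock, NV30’s color fill rate trails R300; only in a Z-only pass does its per-clock advantage pay off.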
Pixel shader performance
With regard to the Direct3D 9.0 Shader Model 2.0a capabilities of the NV3x series and the related marketing claim of “cinematic effects” capabilities, the actual performance was quite poor. Several factors combined to hamper how well NV3x could perform these calculations.
Firstly, the chips were designed for use with a mixed-precision programming model. A 64-bit “FP16” mode (four 16-bit floating-point components) would be used where high-precision math was unnecessary to maintain the desired image quality, while a 128-bit “FP32” mode (four 32-bit components) would be used where accuracy was more important. However, because ATI’s R300 GPU gained nothing from lower-precision FP16 while performing significantly better on shaders overall, and because optimizing shader code for the lower precision took extra effort, NV3x hardware was often left computing at full precision full-time, hampering its performance.
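The cost of precision is easy to demonstrate in software. Here is a minimal sketch using NumPy’s float16 and float32 types as stand-ins for the FP16 and FP32 shader register formats; the accumulation workload is invented for illustration.

```python
import numpy as np

# Accumulate many small contributions, as a long shader might when
# layering lighting terms. FP16's 10-bit mantissa eventually cannot
# represent the increment at all, while FP32 stays accurate.
step = np.float16(1e-3)
acc16, acc32 = np.float16(0.0), np.float32(0.0)
for _ in range(4096):
    acc16 = np.float16(acc16 + step)
    acc32 = np.float32(acc32 + np.float32(step))

print(acc16)  # 4.0     -- stalls once FP16 value spacing exceeds 2*step
print(acc32)  # ~4.0977 -- essentially the exact sum of the 4096 steps
```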
The NV3x chips also used a processor architecture that relied heavily on the effectiveness of the video card driver’s shader compiler. Proper instruction ordering and instruction composition of shader code could dramatically boost the chip’s efficiency. Compiler development is a long and difficult task, and this was a major challenge that Nvidia tried to overcome during most of NV3x’s lifetime. Nvidia released several guidelines for creating GeForce FX-optimized code and worked with Microsoft to create a special shader profile called “Shader Model 2.0a”. Somewhat controversially, Nvidia would also rewrite game shader code and force the game to use its version instead of what the developer had written.
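A toy model illustrates why ordering mattered. Suppose, purely as an assumption for illustration (these are not NV3x’s actual issue rules), that the hardware can issue one texture fetch and one arithmetic operation together in a single cycle, but not two operations of the same kind:

```python
def cycles(instrs):
    """Greedy in-order issue: pair an adjacent TEX with an ALU op."""
    n, i = 0, 0
    while i < len(instrs):
        if i + 1 < len(instrs) and {instrs[i], instrs[i + 1]} == {"TEX", "ALU"}:
            i += 2  # the mixed pair dual-issues in a single cycle
        else:
            i += 1  # an unpaired op occupies a whole cycle
        n += 1
    return n

clustered   = ["TEX", "TEX", "TEX", "ALU", "ALU", "ALU"]
interleaved = ["TEX", "ALU", "TEX", "ALU", "TEX", "ALU"]
print(cycles(clustered), cycles(interleaved))  # 5 vs 3 cycles, same work
```

The same six instructions cost 5 cycles clustered but only 3 interleaved; a good compiler performs this kind of reordering automatically, which is why driver-side compiler work dominated NV3x’s lifetime.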
Valve Software’s presentation
In late 2003, the GeForce FX series became known for poor performance with DirectX 9 Shader Model 2 vertex and pixel shaders, thanks to a very vocal presentation by the popular game developer Valve Software. Early indications of potentially poor pixel shader performance had come from synthetic benchmarks such as 3DMark03. Valve presented detailed performance statistics for their game Half-Life 2 running on Nvidia NV30-based hardware.
Valve revealed a significant performance gap of approximately 80–120% between the GeForce FX 5900 Ultra and the ATI Radeon 9800 Pro. In game levels using Shader Model 2.0, Nvidia’s top-of-the-line FX 5900 Ultra performed about as fast as ATI’s mainstream Radeon 9600 Pro, which cost approximately one third as much as the Nvidia card. Valve had initially planned to support partial floating-point precision (FP16) to optimize for NV3x, but realized that this was too time-intensive and thus too costly; ATI’s cards did not benefit from FP16 mode, so all of the work would have been solely for Nvidia’s NV3x cards. When Half-Life 2 was released a year later, Valve opted to make all GeForce FX hardware default to the game’s DirectX 8 shader code in order to extract adequate performance from the Nvidia cards. This naturally resulted in a reduction of visual quality.
Questionable tactics
Nvidia has historically been known for impressive OpenGL driver performance and quality, and the FX series maintained this. However, with regard to image quality in both Direct3D and OpenGL, the company began aggressively employing optimization techniques not seen before. It started with filtering optimizations that changed how trilinear filtering operated on game textures, reducing accuracy and thus visual quality. Anisotropic filtering also saw dramatic tweaks that restricted it to as few textures as possible to save memory bandwidth and fill rate. Such texture-filtering tweaks can often be spotted in games as a shimmering phenomenon on floor textures as the player moves through the environment, often signifying poor transitions between mipmaps. Changing the driver settings to “High Quality” can alleviate this at the cost of performance.
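The trilinear “tweak” amounts to shrinking the blend zone between adjacent mipmap levels, an approach commonly nicknamed “brilinear”. The following sketch contrasts the two blend curves; the band width and placement are assumed values for illustration, as the actual driver thresholds were never published.

```python
def trilinear_blend(lod):
    """True trilinear: the blend weight between mip levels floor(lod)
    and floor(lod)+1 varies smoothly over the whole interval."""
    return lod - int(lod)

def brilinear_blend(lod, band=0.25):
    """Reduced-transition filtering: cheaper pure bilinear over most of
    each level, blending only in a narrow band near the transition.
    The band width is an assumed value, not Nvidia's actual threshold."""
    f = lod - int(lod)
    lo, hi = 0.5 - band / 2.0, 0.5 + band / 2.0
    if f <= lo:
        return 0.0             # sample only the nearer mip level
    if f >= hi:
        return 1.0             # sample only the farther mip level
    return (f - lo) / band     # steep blend across the narrow band
```

Outside the band only one mip level is sampled, halving texture fetches there; the abrupt change in blend slope is what shows up as visible banding or shimmering at mip transitions.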
Nvidia also replaced pixel shader code in software with GeForce FX-optimized versions of lower accuracy. These “tweaks” were especially noticed in benchmark software from Futuremark. In 3DMark03, Nvidia was found to have gone to extremes to limit scene complexity, swapping out shaders in the driver and applying aggressive hacks that prevented parts of the scene from rendering at all. Side-by-side analysis of screenshots in games and 3DMark03 showed noticeable differences between what a Radeon 9800/9700 displayed and what the FX series produced. Nvidia also publicly attacked the usefulness of these programs and the techniques used within them in order to undermine their influence upon consumers. It should be noted, however, that ATI also created a software profile for 3DMark03. Application-specific optimizations are typical practice for fixing bugs and enhancing performance, but they become controversial when they significantly degrade visual quality to gain performance. In response, Futuremark began updating its software and screening driver releases for these optimizations.
Hardware refreshes and diversification
See also: Comparison of Nvidia graphics processing units
GeForce FX 5500-SX
Nvidia’s only initial release, the GeForce FX 5800, was intended as a high-end part. There were no GeForce FX products for the other segments of the market. The GeForce 4 MX continued in its role as the budget video card and the older GeForce 4 Ti cards filled in the mid-range.
In April 2003, Nvidia introduced the mid-range GeForce FX 5600 and budget GeForce FX 5200 models to address the other market segments. Each had an “Ultra” variant and a slower, cheaper non-Ultra variant. With conventional single-slot cooling and a mid-range price tag, the 5600 Ultra had respectable performance but failed to measure up to its direct competitor, the Radeon 9600 Pro. The GeForce FX 5600 parts did not even advance performance over the GeForce 4 Ti chips they were designed to replace. Likewise, the entry-level FX 5200 did not perform as well as the DirectX 7.0 generation GeForce 4 MX440, despite possessing a notably superior feature set. The FX 5200 was also outperformed by the older Radeon 9000.
In May 2003, Nvidia launched a new top-end model, the GeForce FX 5900 Ultra. This card, based on the heavily revised NV35 GPU, fixed many of the shortcomings of the 5800, which had been discontinued. While the 5800 used fast but hot and expensive GDDR-2 on a 128-bit memory bus, the 5900 moved to slower and cheaper DDR SDRAM on a wider 256-bit bus. The 5900 Ultra performed somewhat better than the Radeon 9800 Pro in games not making heavy use of Shader Model 2, and had a quieter cooling system than the 5800.
In October 2003, Nvidia released a more potent mid-range card, the GeForce FX 5700, using the new NV36 core with technology from NV35. The FX 5700 was ahead of the Radeon 9600 Pro and XT in games with light use of Shader Model 2. In December 2003, Nvidia launched the 5900 XT, a graphics card intended for the mid-range segment. It was similar to the 5900, but clocked lower and equipped with slower memory. It managed to defeat the Radeon 9600 XT more soundly, but was still behind in a few shader-heavy scenarios.
The final GeForce FX model released was the 5950 Ultra, which was a 5900 Ultra with higher clock speeds. The board was fairly competitive with the Radeon 9800 XT, again as long as pixel shaders were lightly used.
Windows Vista and GeForce FX PCI cards
Windows Vista requires a DirectX 9-compliant 3D accelerator to display the full Windows Aero user interface. During pre-release testing of Vista and upon launch of the operating system, the video card options for owners of computers without AGP or PCIe slots were limited almost exclusively to PCI cards based on the Nvidia NV34 core. This included cards such as GeForce FX 5200 and 5500 PCI. Since then, both ATI and Nvidia have launched a number of DirectX 9 PCI cards utilizing newer architectures.
Discontinued support
Nvidia has ceased driver support for the GeForce FX series.
Final drivers include:
Windows 9x and Windows Me: 81.98, released on December 21, 2005
Windows 2000, 32-bit Windows XP, and Media Center Edition: 175.19, released on July 9, 2008
Note that the 175.19 driver is known to break Windows Remote Desktop (RDP); the last version before the problem is 174.74. The issue was apparently fixed in 177.83, although that version is not available for GeForce FX cards. Also worth noting, 163.75 is the last known driver that correctly handles adjustment of the video overlay color properties for the GeForce FX series; subsequent WHQL drivers either do not handle the whole range of possible video overlay adjustments (169.21) or have no effect on them (175.xx).
See also
Comparison of Nvidia graphics processing units
GeForce 4 Series
GeForce 6 Series
External links
Nvidia: Cinematic Computing for Every User
ForceWare 81.98 drivers, Final Windows 9x/ME driver release
GeForce 175.19 drivers, Final Windows XP driver release
Museum of Interesting Tech article with pictures and specifications for the FX 5800
Driver Downloads
laptopvideo2go.com, an archive of drivers and modified .INF files for the GeForce FX series