nVIDIA's GeForce 6600 is the company's first
natively PCI Express GPU; no more AGP-to-PCI Express HSI
bridges here, folks. From now on, videocards should start taking more advantage of
the bandwidth that's provided by PCI Express x16 slots. There
are rumors that nVIDIA will soon come out with a PCI Express-to-AGP HSI chip
so it can bring the NV43 into the AGP world, but details are still
scarce. The NV43 core is built on IBM's 0.11 micron process manufacturing technology, and
contains 146 million transistors. The NV43 has eight pixel rendering and three vertex pipelines,
backed by a 128-bit memory controller. These numbers are exactly
half those of the 6800GT/Ultra GPU.
While not as
quick as the NV40 GPU, the NV43 supports all the latest features, including DirectX 9.0c
and Shader Model 3.0 (SM 3.0). True, not many games
incorporate these features at the moment, but it's only a matter of
time. Other than that, the NV43 retains all
the same features as the higher end NV40 GPU. If you'd like to learn a bit more about the
technology behind the card, make sure you read PCstats' nVIDIA GeForce 6800 Ultra review!
Compared to the GeForce 6600GT, which is clocked at 500 MHz
core, 1 GHz memory, the regular GeForce 6600 is clocked considerably slower in
both areas. With the Albatron GeForce 6600, the core runs along at 306 MHz, while
the memory is set to 500 MHz. As usual we started with the core first and
here's how things went.
Going up slowly a few MHz at a time, things were actually quite uneventful.
The 350 MHz, 360 MHz and 370 MHz marks passed by quite easily, as
did 400 MHz! In the end we were able to hit a maximum speed of 437 MHz, not
as high as the stock 6600GT, but definitely very nice for the price.
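Before moving on to the memory results, it's worth noting how a memory chip's nanosecond rating maps to its rated speed, since that sets expectations for overclocking headroom. A minimal sketch of the arithmetic (the function name is ours, for illustration):

```python
# Rough rule of thumb: a DRAM chip's ns rating gives its rated clock,
# and DDR memory transfers data twice per clock cycle.
def rated_ddr_speed(ns_rating):
    """Effective DDR data rate (MHz) for a chip rated at ns_rating nanoseconds."""
    clock_mhz = 1000 / ns_rating   # e.g. 5 ns -> 200 MHz actual clock
    return clock_mhz * 2           # DDR doubles the effective rate

print(rated_ddr_speed(5))  # → 400.0
```

By this arithmetic, 5ns chips are rated for roughly 400 MHz effective, so memory already set to 500 MHz at stock is running beyond its rated spec, which helps explain the limited headroom we saw.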
With the 5ns Hynix TSOP-II DDR memory the Albatron GeForce 6600 uses, we
didn't know what to expect from memory overclocking. Unfortunately, the memory didn't seem to like
overclocking very much at all, so the story pretty much ends there. The maximum
speed the RAM would run at was a lowly 536 MHz, which was a bit
disappointing. We were pleased to see that the core and memory
played nicely together though. With some previous nVIDIA-based videocards, we had problems
getting the two to run at maximum overclock simultaneously. Next up, the