Video Card History (1996 to the present)
Author: Darren Paschedag Date: 11/9/03
Introduction
When 3dfx released their first card in October of 1996, it hit the computer world like a right cross to the face. That card was called Voodoo. It opened the door for video game development, accelerating 3D graphics like nothing before it. The Voodoo performed its magic via a pass-through connection to the user's regular 2D video card - a practice known as piggybacking. A few months later a new card was introduced, the Voodoo Rush, which combined both 3D and 2D functions on one card. However, it ran significantly slower than the regular Voodoo. This, combined with driver issues, caused the Rush to be seen as a flop by the community.
In all races, there must be competitors. ATI and NVIDIA both had cards out shortly afterward to compete with 3dfx: ATI had the Rage, and NVIDIA dubbed theirs the Riva 128. This was long before they took the 3D giant 3dfx completely out of the race, though, and both companies were but tiny blips on the radar at the time. To counter the new competition, 3dfx released the Voodoo2 in March of 1998. It was a vast improvement over the Voodoo, with a 90 MHz core clock and a whopping 12 MB of video memory. The Voodoo2 could produce resolutions up to 1024 x 768 and had a blistering 3.6 GB/s of memory bandwidth - top of the line back then. As before, a combined 2D/3D card, the Voodoo Banshee, came out after the Voodoo2, and like the Voodoo Rush it was a waste of money due to performance issues. Incidentally, the Voodoo2 was also a piggybacker, which caused image quality issues for many.
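For readers curious where bandwidth figures like that come from, here is a minimal sketch of the usual arithmetic: peak bandwidth is simply bus width times memory clock. The 128-bit bus and 100 MHz clock below are illustrative placeholders, not Voodoo2 specifications.

```python
# Peak memory bandwidth for cards of this era: bytes moved per clock
# times clocks per second. The bus width and clock used here are
# illustrative placeholders, not actual Voodoo2 specifications.

def peak_bandwidth_gb_per_s(bus_width_bits: int, mem_clock_mhz: float) -> float:
    """Theoretical peak bandwidth in GB/s."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * mem_clock_mhz * 1_000_000 / 1_000_000_000

# Example: a 128-bit bus at 100 MHz moves 16 bytes per clock.
print(peak_bandwidth_gb_per_s(128, 100))  # -> 1.6
```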
In March of 1999, 3dfx came out with the Voodoo3. This time the line was split into tiers to cover different consumer needs (sound familiar?). The Voodoo3 2000 was the low-end budget card, offering a 143 MHz core clock. On the next rung was the Voodoo3 3000, which offered a 166 MHz clock speed. At the top was the 3500 version, which featured a TV-out port and a 183 MHz clock speed. All of these cards were offered in PCI and AGP versions (a new concept at the time, also shared with an ATI card called the 3D Rage Pro).
Like many underdogs, the competing companies started catching up to the hardware giant. NVIDIA released a card around the same time as the Voodoo3, called the TNT2. The TNT2 was the successor to the TNT, and upped the ante from 8 million to 10.5 million transistors - a huge jump in complexity. It also offered 32-bit color support and digital flat-panel support. The Voodoo3 barely beat the TNT2 in pure FPS, but the TNT2 had much higher visual quality, so people started checking out the competition. It didn't cripple 3dfx, but it let them know that they had better have something groundbreaking in their next release. ATI, possibly the cleverest (or maybe luckiest) of the three companies, was content to sit in the corner and watch NVIDIA and 3dfx battle it out. ATI still released new cards - they weren't spectacular, but by no means were they horrible. The cards were just enough to keep them in the race. ATI's strategy seemed to be to lie in wait for their time to strike, which wouldn't come until later.
In October of 1999, NVIDIA dealt the final blow to 3dfx with the introduction of the GeForce 256. As 3dfx didn't have anything to immediately combat the new card with, they took the blow right to the face. The revolutionary GeForce 256 brought much to the table, including four pixel pipelines at 120 MHz, DDR RAM support, and many other new features. 3dfx had two highly anticipated cards, delayed long past the original schedule - the Voodoo4 and Voodoo5. Once these cards were finally released they were well received by 3dfx die-hards, but they came far too late to do damage to NVIDIA. The long-term strategy behind the new cards was fairly weak, as the improvements came mainly from adding more GPUs rather than from real chipset improvements. This made the cards about twice as big as the previous models, with nearly the same increase in price. The Voodoo5 outperformed the GeForce 256 by a modest amount, but with that hefty price tag it didn't make it very far at all. Sadly, the last card 3dfx constructed was the Voodoo5 6000, which was rarely seen at all. This is rather hard to believe considering that it was one of the biggest graphics cards I have ever seen. The Voodoo5 6000 was equipped with 4 GPUs (that's right, 4) and 128 MB of memory. This card was seen mostly in high-end workstations, though, and never really made it to the consumer market.
Needless to say, 3dfx was defeated and taken out of the race. At the time, they had one last product up their sleeve, called Rampage. Rampage was an amazing new chipset that would have pushed 3dfx far ahead of the game, had the company not been bought out by NVIDIA in December of 2000. This meant that NVIDIA had the Rampage project in their hands, and it was rumored that Rampage technology was put to use in their NV30 (GeForce FX) series of cards.
ATI was still trying to be a player during this time, and released a card called the Rage Fury MAXX. Like the 3dfx cards of the era, this card made its performance gains through the use of multiple GPUs. Using two Rage 128 Pro processors in parallel, the card carried a fairly high price tag. Disappointingly, the Rage Fury MAXX just barely kept up with the TNT2, to say nothing of the GeForce. In the spring following the release of the GeForce 256, NVIDIA released its successor - the GeForce2 GTS. The GTS was more than just an overclocked GeForce 256: it nearly doubled the pixel fill rate and added multi-texturing to each pipeline. Surprisingly, just as the GeForce2 GTS started hitting the shelves, NVIDIA followed up with the GeForce2 MX. This chipset cut two of the pixel pipelines and took the fill rate down to 350 megapixels per second - a move that was questionable to many. However, two important features were added to the GeForce2 MX that made up for the performance losses. One was TwinView, which allowed for dual-monitor setups. More importantly, though, the GeForce2 MX added firmware support for the Apple Macintosh. Apple later named the GeForce2 MX the high-end graphics card for the new Power Macintosh G4 - a big win for NVIDIA's pocketbook.
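The fill-rate figures quoted here follow directly from pipeline count and core clock - one pixel per pipeline per clock on these fixed-function chips. A minimal sketch of that arithmetic follows; note that the 175 MHz GeForce2 MX clock used below is an assumption chosen to reproduce the 350 megapixel figure, not a number given in this article.

```python
# Theoretical pixel fill rate for fixed-function GPUs of this era:
# one pixel per pipeline per clock.

def pixel_fill_rate_mpixels(pipelines: int, core_clock_mhz: float) -> float:
    """Peak fill rate in megapixels per second."""
    return pipelines * core_clock_mhz

# GeForce 256: four pipelines at 120 MHz (figures from this article).
print(pixel_fill_rate_mpixels(4, 120))   # -> 480.0

# GeForce2 MX: two pipelines; a 175 MHz core clock (assumed here)
# reproduces the 350 megapixel/s figure quoted above.
print(pixel_fill_rate_mpixels(2, 175))   # -> 350.0
```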
As time went on, ATI and NVIDIA battled between themselves, releasing card after card. The cards released during this period were significantly faster than their predecessors, but nothing truly groundbreaking came to market. The game would not get too exciting until later, when NVIDIA boldly went out on a limb and announced “The Cinematic Gaming Experience” that would come with their next generation of cards. During this time NVIDIA was busy with the Xbox, which undoubtedly influenced the direction of the research and development surrounding their new chipset, codenamed NV30. The NV30 boasted 128 MB of DDR2 memory and a 0.13-micron chip design, rather than the 0.15-micron design that ATI used on their 9700. ATI, being the competitors that they are, answered with the Radeon 9700 Pro around the time the GeForce FX (NV30) was scheduled to be released. When the new Radeon hit the shelves, people were stunned - it was quite possibly the best card to hit the market in a long time. But where was the GeForce FX? It was delayed - unsurprisingly, because of a lack of manpower. Fastsilicon.com was at the launch of the GeForce FX, and it was a very interesting event. The GeForce FX was so hyped that when it came out it couldn't help but be a disappointment. The frame rate was there, but unfortunately it just couldn't match the picture quality of the Radeon 9700. The cooling solution, teasingly called the Dust Buster or Leaf Blower by consumers, also left a lot to be desired. This monster of a fan took up a PCI slot and ran at two speeds depending on GPU usage. The higher speed was much louder than most consumers could tolerate, and it definitely hurt sales.
ATI took a huge lead with the Radeon 9700, and has extended that lead with the 9800. Both NVIDIA and ATI have since made slight revisions to their chipsets, but the market is currently in a holding pattern. Both companies have been accused of cheating on benchmarks to varying degrees, and the market on the whole seems ready for the next big thing to come down the pipeline.
Conclusion
With that crash course through recent video card history, we are left with only one more thing to talk about - the future. What will the NVIDIA NV40 and the ATI R420 bring us? Will the NV40 finally close the gap between the two companies, and possibly pull NVIDIA back ahead of the game? Or will the R420 vanquish the only real competition it has in one clean stroke? Only time and message board rumors will tell. Just keep reading, watch what happens, and make an informed decision as to what takes a seat in your AGP slot in the coming months.