Tweak3D - Your Freakin' Tweakin' Source!
NVIDIA GeForce 256 Preview (Page 1/10)

Posted: October 11, 1999
Written by: Dan "Tweak Monkey" Kennedy


By now, nobody should be unfamiliar with NVIDIA or its products. In fact, long-time Tweak3D readers may remember that it was the original Riva 128 Tweak Guide that started this site. Over the last two years, we have brought you numerous TNT and TNT2 reviews, along with several articles about the company. At last, here is the latest: an official NVIDIA GeForce 256 preview.

Company Background

I know, you're already begging for me to get to the specs and features of this new chip from NVIDIA, but first you should consider the company behind the product. Although NVIDIA was founded in 1993, it had a rather slow start and consequently did not make a huge impact in the 3D hardware world until the Riva 128 was released in 1997. That fine product was touted by many as a powerful chip, and it did a great job compared to others in its price range, including 3dfx's much-loved Voodoo Graphics chipset. A year later, NVIDIA again shocked the world with the incredible TNT chipset. TNT, or TwiN Texel, offered advances that few had ever seen: it combined wonderful visual quality with an amazing feature set, as well as great performance. The only thing to kill the TNT chipset was the TNT2, released a year after that, which basically stretched TNT to offer even better performance. Even with NVIDIA's immense success, the company remains rather small, with a staff of 300 hand-picked individuals.

What's in a Name?

If you've read the speculation (rumors, unofficial previews, etc.), you were probably surprised to see the name of this product. So, for future reference, keep in mind that NVIDIA's new chipset is called GeForce 256. Also keep in mind that this is pronounced JEE-Force, not G-E-Force or GEE-Force. The name was derived from two terms: one being "g-force", and the other being "geometry". The 256 at the end of the name represents the 256-bit architecture that I will explain later.

As for the name NV10? Well, that's just NVIDIA's usual internal naming scheme. For example, the first product NVIDIA made was the NV1. As NVIDIA's co-founder, president and CEO Jen-Hsun Huang put it, "It sucked". The Riva 128 was the project known as NV3, the Riva 128ZX was NV3.5, and so on. Now, on to more important topics...

What the Industry Needs

To be honest, I used to think that the computer graphics industry did not need much improvement. Few people would complain about the state of computers and the impact the graphics industry has made on modern games. Thanks to higher resolutions, frame rates, rendering techniques, and filtering, games are looking more and more like real life, or better. Nearly photorealistic worlds are being created for our imaginations and game developers to indulge in.

However, there are areas that are still lacking, as I learned upon analyzing the industry carefully. The most glaring problem is that only those with the best and most expensive computers can enjoy these luxuries. 32-bit color, high resolutions, and high frame rates are only myths to most people, who have to play Quake3 at 640x480x16bpp just to maintain a consistent frame rate. And although the graphics in most of these games are good, they lack realism. Detail aside, lighting is the area that needs the most work. Luckily, lighting techniques have improved vastly over the last few years, but once again, it's a luxury only a few gamers can enjoy. Will the NVIDIA GeForce 256 fix these issues? Read on to find out, as I explore the features and unique aspects of this chip.


Finally, the NDA has been lifted and I can post benchmarks of the GeForce 256. And guess what? My board is DDR, not SDR! So, why didn't I just write a whole review? Well, first of all, most of the information I would include is already here... why write another massive 10-page article? Also, I don't consider this a final product yet, and therefore it shouldn't be reviewed. Sure, it may be stable, but I want to wait until more games can utilize the card before I write a review.

Test System

  • Pentium II 450 MHz CPU
  • Abit BX6 motherboard
  • 128 MB SDRAM
  • Sound Blaster Live Value
  • Windows 98
  • NVIDIA GeForce 256 Reference Board w/ 32 MB DDR SGRAM
  • VSYNC Disabled
  • Default clock speeds: 120 MHz core / 300 MHz (effective DDR) memory
  • Using 3.48 Reference Drivers
When I have time, I will post comparisons with other video cards. Until then, I recommend checking the Review Index and searching for specific cards if you want to compare performance.

Quake 2 - Default Settings

           800x600   1024x768   1280x960   1600x1200
Demo1        125.3      116.1       87.3        58.4
Crusher       57.3       56.1       54.0        47.0

Q3 Test - "Normal" Setting

           800x600   1024x768   1280x960   1600x1200
Demo1         75.5       73.4       55.4        38.2
Demo2         67.3       67.2       59.5        44.9

Q3 Test - "High Quality" Setting

           800x600   1024x768   1280x960   1600x1200
Demo1         73.8       63.8       28.7        22.0
Demo2         66.8       59.1       28.2        22.6

Doesn't something seem wrong here? The 1280x960 and 1600x1200 High Quality benchmarks are extremely low, and the scaling doesn't fit the pattern of the other results. The culprit? I believe the problem is memory capacity or bandwidth. I played with 3D ExerciZer and set the texture memory to 46.8 MB in a scene; the frame rate hovered around 60 FPS. When I upped the texture memory to an even 48 MB, the frame rate dropped to 3 FPS. Yes, 3 FPS! WTF!? Want to comment on this? Please do so in the message board; I'm interested to read what people think. Hopefully we'll have this all figured out soon. =)
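Some back-of-the-envelope math supports the memory theory. The sketch below is a rough illustration only: the buffer layout (double-buffered 32-bit framebuffer plus a 4-byte depth/stencil buffer) and the idea that overflow textures spill across the AGP bus are my assumptions, not figures from NVIDIA.

```python
# Estimate how much of a 32 MB card is left for textures at 1280x960,
# once the front, back, and depth/stencil buffers are accounted for.
def vram_budget_mb(width, height, bytes_per_pixel=4, buffers=3, card_mb=32):
    # buffers = front + back + depth/stencil, each at bytes_per_pixel
    framebuffer_mb = width * height * bytes_per_pixel * buffers / (1024 * 1024)
    return card_mb - framebuffer_mb

print(round(vram_budget_mb(1280, 960), 1))  # -> 17.9 (MB left for textures)
```

With well under 48 MB free on the card, that texture load would have to spill into system memory over AGP, which could explain a sudden cliff from 60 FPS down to 3 FPS.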

Q3 Test - Very High Quality

In addition to "High Quality", the following settings were used to simulate "very high quality":
  • r_lodBias -2 // don't use lower detail models
  • r_subdivisions 1 // lots more triangles in curves
  • r_lodCurveError 10000 // don't drop curve rows for a long time
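In config-file form, the same tweaks look like this (dropping them into an autoexec.cfg in the baseq3 folder is the usual convention; the file placement is my suggestion, not something from the settings above):

```
// simulate "very high quality"
seta r_lodBias "-2"           // don't use lower detail models
seta r_subdivisions "1"       // lots more triangles in curves
seta r_lodCurveError "10000"  // don't drop curve rows for a long time
```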
John Carmack recommended that users try these settings to slow down a system and/or boost visual quality. Did T&L help with the extra detail? Check the benchmarks and decide for yourself:

           800x600   1024x768   1280x960   1600x1200
Demo1         63.6       56.6       28.2        21.8
Demo2         58.3       50.3       27.0        22.7

When you compare the results, it looks like lower resolutions take a proportionally larger performance hit from the extra detail. That makes sense: at 1280x960 and above the card is already fill-rate or bandwidth limited, so the additional geometry costs relatively little.


I have been drooling over the GeForce 256 since I first saw it in action at NVIDIA's HQ. These benchmarks prove that the card has potential... for sure. First, consider that the drivers I used are not fully optimized. Remember the good ol' Riva 128 days? Even after the ICD left beta, performance almost always increased when new drivers were released. The same went for TNT/TNT2 cards; NVIDIA is good at tweaking drivers. Next, consider that the GeForce 256 has barely been tapped (no pun intended) by developers. In the future, more and more games will boost performance with T&L and the other features that make the GeForce 256 stand out as one of the coolest video cards to date.


The 3.48 Reference Drivers (and hopefully all future GeForce 256 drivers) include the option to enable/disable VSYNC. And better yet, they include an overclocking utility. YES!!! Finally!

The core speed is a mere 120 MHz, which is fine considering the card renders 4 pixels per clock. But come on, we're all tweak monkeys at one time or another, and we're all going to try to overclock a GeForce 256. So... how does it do? And how about the sweet DDR SGRAM that works out to an effective 300 MHz? I will have overclocking information and results online later this week.
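For perspective, here's the arithmetic on those stock clocks. Treat it as a rough sketch: the 128-bit memory bus width is an assumption on my part, and the fill rate figure is the theoretical single-texture peak, not a measured number.

```python
# Theoretical peak throughput at stock GeForce 256 DDR clocks.
core_mhz = 120            # core clock
pixels_per_clock = 4      # four pixel pipelines
mem_mhz_effective = 300   # DDR effective memory clock
bus_bits = 128            # assumed memory bus width

fill_rate_mpix = core_mhz * pixels_per_clock                      # Mpixels/s
bandwidth_gbs = mem_mhz_effective * 1e6 * (bus_bits // 8) / 1e9   # GB/s

print(fill_rate_mpix, bandwidth_gbs)  # -> 480 4.8
```

In other words, a theoretical 480 Mpixels/s of fill rate fed by roughly 4.8 GB/s of memory bandwidth, which hints at why the memory clock matters so much for overclocking.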

Stay tuned for more GeForce 256 information and benchmarks in the next few days. Continue reading my GeForce 256 Preview by clicking the Next Page link below.

Next Page
