News

Early GPUs like the GeForce 256 or original ATI Radeon had fixed-function pixel and vertex pipelines that were not programmable. This changed over time, beginning with the introduction of ...
The GeForce 256 had 32MB of DDR graphics memory on a 128-bit memory bus, four pixel pipelines, four TMUs, and four ROPs. The GPU core itself operated at 120MHz, while the memory ran at 150MHz.
Now, two and a half decades later, Nvidia is celebrating the GeForce 256's 25th birthday with a video and green popcorn. Here, we look back at what made this true classic of the PC era tick.
The "256" in the name is in reference to the 256-bit QuadPipe Rendering Engine employed within, consisting of four 64-bit pixel pipelines on the NV10 chip running at 120 MHz, resulting in a 480 ...
I salute you, GeForce 256. We truly didn't know how good we had it.
The original release, which lines up with today’s date, was for the GeForce 256 SDR, or single data rate. Later in 1999, Nvidia followed up with the GeForce 256 DDR, or double data rate.
Nvidia is rightfully marking this milestone anniversary. From the GeForce 256 to the RTX 4090, we’ve witnessed remarkable advancements. However, the initial launch of the GeForce 256, specifically the ...
On August 31, 1999, nVIDIA (as the name was then spelled) announced the GeForce 256 graphics card, which was released on October 11 of that year. Advertised as the "world's first GPU", the video card ...
Like the ATi competition, these cards went back to using DDR memory, but with a 256-bit wide interface. In the case of the top-end FX 5950, the memory was clocked at 475MHz (950MHz effective) and ...
Based on a 150nm process node, it had a core clock of 325MHz and 128MB of DDR memory across a 256-bit interface. It could support resolutions of up to 2,048 × 1,536.