Did Shader Model 3.0 Help the 6800 Ultra?

In this video we put Shader Model 3.0 support on the GeForce 6800 Ultra to the test, and see if it really made NVIDIA's high-end 2004 offerings more "future-proof".

Comments

It reminds me of the Voodoo cards:
One card has truly awe-inspiring performance but skimps on features, whereas its competitors offer less raw power but a richer feature set, and those cards were, in the end, more future-proof.

loganiushere

Such nostalgia! I'm so glad I found your channel. Makes me miss the good old days of playing games like Far Cry, Doom and HL2 on my Athlon XP 2800+ PC with an AGP 6600 GT. PC gaming hasn't really felt as special to me since then, so being able to relive those times through your videos is ace. :)

Also, I remember an issue with Halo where cards like my 6600 with SM3 wouldn't display pixel shader effects like bump mapping and specular, but an X800 I picked up for a later system did. Maybe there was only support up to SM2 in Halo since it was made with the SM1.1 GeForce 3 GPU in the Xbox? Or maybe I just had some weird driver issue at the time.

Also, (if you're still reading this) maybe you should make a video about the arrival of the 8800 GTX and stream processors? I remember how much of a big deal it was at the time, and when I bought my 8800 GTX it was like a night and day difference compared to what I'd seen from traditional pipeline-based GPUs. One of the biggest differences was in Oblivion, where most cards before it would have a massive disparity between indoor and outdoor framerates, but the 8800 GTX had the raw power to stabilise this. :)

Shuttersound
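
On the Halo question above: a Direct3D 9 game typically decides which shader path to use by querying the device caps at startup, so effects quietly vanish when the reported (or config-overridden) pixel shader version falls below a path's requirement. Here is a minimal sketch of such a check, not Halo's actual code; it assumes a Windows build linked against d3d9.lib:

    // Minimal sketch: gating render paths on the pixel shader version
    // the driver reports through the D3D9 caps structure.
    #include <d3d9.h>
    #include <cstdio>

    int main()
    {
        IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;

        D3DCAPS9 caps = {};
        d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

        // PixelShaderVersion packs major/minor; the D3DPS_VERSION macro
        // builds a value that can be compared directly.
        if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
            std::printf("SM3.0 path available\n");
        else if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
            std::printf("falling back to the SM2.0 path\n");
        else
            std::printf("fixed-function / SM1.x rendering only\n");

        d3d->Release();
        return 0;
    }

A driver bug, or a profile forcing a lower shader version, would make a check like this skip the fancier path entirely, which could explain the missing bump mapping and specular effects.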

Some ATi owner disliked this video.

I love watching your content, sometimes over and over again, for nostalgia and for its accurate information, like this one that reminded me of that Far Cry console variable that enabled HDR lighting.

MFG
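
On that Far Cry console variable: the 1.3 patch exposed HDR rendering on SM3.0 hardware through the in-game console. As best as it can be reconstructed here (the exact name and value are from memory, so treat them as illustrative), enabling it looked like:

    \r_hdrrendering 7

where higher values traded more performance for a stronger effect, and the mode stayed unavailable on SM2.0-only cards such as the X800 series.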

I've always wondered about the longevity SM3 gave the GF6/7, and this video tackles it perfectly. I've got a lot to learn from you :)

hblankpc

I have a cheap old phone I bought as a backup, a Sony E4g, and it's crappy. It has a quad-core MediaTek chip with a low-end Mali-T760 MP2 GPU that has a theoretical compute power of only 48 GFLOPS, yet it can do DX11 and SM5.0, while the GeForce 6800 Ultra discussed in this video has 54 GFLOPS. That realisation blows my mind.

SirDimpls
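
For anyone curious where theoretical figures like those come from: they are essentially ALU count × FLOPs per ALU per clock × clock speed. A toy C++ illustration, counting only one 4-wide multiply-add per pixel pipeline (a simplifying assumption, which is why it lands near, rather than exactly on, the quoted 54 GFLOPS):

    #include <cstdio>

    int main()
    {
        // 6800 Ultra: 16 pixel pipelines at 400 MHz. Assume each pipe
        // retires one 4-wide MAD per clock = 4 lanes * 2 FLOPs = 8 FLOPs.
        // Vendor-quoted figures fold in additional ALUs, hence ~54 GFLOPS.
        const double pipes = 16.0;
        const double flops_per_pipe_per_clock = 8.0;
        const double clock_ghz = 0.4;

        const double gflops = pipes * flops_per_pipe_per_clock * clock_ghz;
        std::printf("theoretical throughput: %.1f GFLOPS\n", gflops); // 51.2
        return 0;
    }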

I upgraded from a GeForce FX 5700 to an X1950 XT in 2007... I did nothing but play Oblivion until the end of that year. Good times!

sinizzl

Cool throwback. I remember the discussion, although I didn't really care at that moment as I was still rocking a GeForce4 Ti 4200, though I did buy a 6600 GT (AGP) later in 2004. Keep your content coming, BTW! It is always nice to watch deep dives into this retro stuff.

fabiolorefice

I remember being 12 years old in 2005 and buying Black & White 2, only to be confronted with the error "This game requires pixel shader 1.1 to run, please upgrade". The sales rep at PC World had told us our new Celeron D PC could play new games, but that was clearly a load of crap. So I went on eBay and bought the cheapest graphics card I could find, which was a Radeon 9250 256 MB. The game now ran! I was playing Oblivion at the time, and even to my 12-year-old self it felt pretty choppy, so I eventually got a Radeon X1650 Pro 512 MB and upgraded to 1 GB of DDR RAM. Both games looked amazing with SM3 enabled, and I knew I could do better, so the next year, with Crysis on the horizon, I saved up my paper-round money over the entire year and built myself a PC with an AMD Athlon X2 4200, an X1950 Pro and 2 GB of RAM, as well as a kick-ass 22" widescreen at 1680x1050!

It was pretty awesome, but when I eventually got my hands on an 8800 GT in late 2008 I was blown away. That card was phenomenal. I feel as though I missed out on the pre-2005 generation of graphics cards, but my early experiences with SM3 and the struggles of peasant gaming on a Celeron D continue to humble me to this day!

synixx

I owned an X850 XT at the time. It was infuriating to see all those games being purely and simply incompatible with an otherwise really fast card. The worst thing was that I was unable to upgrade for a while, meaning I kept that useless thing until 2008, when I got a then brand-new 8800 GT.

SteelSkin

Reminds me of the old ATi Rage Pro chip. It had the CIF 3D interface, which wasn't widely used. The only company that released a working CIF 3D patch was Eidos, for Tomb Raider Gold, and that patch works very nicely with the old ATi Rage Pro and a period-correct CIF-enabled driver.

retropcscotland

The GF6 cards were fine, and the same goes for the GF7 series. That was a phase when both companies had cards that were reasonable choices, after the disaster that was the GeForce FX.
The GF8, on the other hand, changed a lot, with an insane performance gap and the 8800 GT being the "minimum requirement" for many games even almost a decade later.

HappyBeezerStudios

Well, since I kept my 6800 GS until early 2008, it paid off. When my first Xbox 360 3RL'd I was devastated, but since Gears of War and Test Drive Unlimited (my favorite games on the Xbox 360 at the time) had PC versions available, that kept me going for the year it took me to buy a new 360 (and an 8800 GT). Even though the 6800 GS ran those like crap, at least it did run them.

GraveUypo

When it comes to PC gaming, I am, I admit, a bit behind the curve; most of the games I tend to play are not the latest and greatest. I had a Radeon X800 Pro that my roommate had bought brand new, and on day one he installed his own cooler (an AMD Athlon 64 X2 cooler he modded to fit) because he saw the card hit 80 degrees C almost instantly in a game. Once he put the modded cooler on, at full load and overclocked, it barely got 5 degrees C over ambient temperature.

When I got that card, I ran it for a good, long time. Most of the games I played back then needed SM 2.0 or older, and the X800 was a huge upgrade over the Radeon 9800 Pro I had before it (also modded and HEAVILY overclocked). After the X800 Pro paired with a dual-core Athlon 64, I upgraded power supplies and ran a factory-overclocked X1950 Pro for a little while, before jumping to a quad-core Athlon II with a Radeon HD 7770 (my first and only new video card ever); that same machine was later upgraded to a Phenom II X6 1045T and a Radeon HD 7850, both heavily overclocked. I have to say I have a love for ATi/AMD cards, to the point that my current rig has the first Nvidia card I have owned since a GeForce4 Ti 4200 that was BIOS-modded to a 4500se. At the time, the heavily overclocked GTX 1070 I am now running was the best bang-for-the-buck card to pair with my new-to-me AMD Ryzen 5 1600X.

Carstuff

Brings me back... at that time I think I only had the ATi X700. Great vids!

playingwithtrainerspcmt

I played Oblivion on a 9800 non-Pro at 800x600. Pretty sure I disabled HDR to get better framerates. At least I didn't have to use the "Oldblivion" mod like GeForce FX 5900 owners :P

soylentgreenb

I had a PNY 6800 GS with the pixel and vertex pipelines unlocked with RivaTuner; it ran on par with the 6800 GT for less money. Great video BTW.

trajanaugustus

There were games released in 2010-2013 that still had DirectX 9 support, like Crysis 2, Far Cry 3 and Metro: Last Light. I suspect a lot of people used their 6800s for a long time. I remember having an AGP 6800 GS that I unlocked to a 6800 Ultra.

JamesSmith-swnk

Awesome channel :) Just subscribed. I hope that you grow as much as RetroGamingHD and Budget Builds!

auroraattardcoleiro

Hardware that pioneers a new feature set is seldom actually fit to handle it.
For every exception like the Radeon 9700 or GeForce 8800, there are the GeForce 3, GeForce FX, Radeon HD 2000, Radeon HD 5000 and GeForce RTX 20 series, which couldn't handle their new stuff.
In fact, since the GeForce 8800 we haven't had a single GPU feature pioneer that could reliably deal with its cool new thing - and that was 18 years ago.

wrmusic

I was still rocking my 9700 Pro AIW at that time. I held onto it until replacing it with a 7950 GT, which is perhaps the longest I ever kept a video card. I only did that because I was able to use an ASRock 939Dual-SATA2 board, which could facilitate the upgrade path; the video card was the last piece. Though I was not exactly playing all the AAA titles as they came out, I have generally waited for sales, and since then I have stayed a bit behind the curve in graphics performance.

wishusknight