CyberGenX Posted November 24, 2002 So, how many of you out there noticed the new (old) tag Nvidia is sporting? It looks like the last 3DFX project, a hybrid with GeForce technology. I can't wait to get this card. Opinions please...
sapiens74 Posted November 24, 2002 I'll buy one only if it comes in a 256MB version
Immortal Posted November 24, 2002 OK, I saw this over at MegaGames: Quote: As far as speculation about the name is concerned, nVidia chose the FX as a tribute to 3DFX, whose former engineers played a large part in the production of the new card, but they will include a 5 in the model number. The 5 will indicate that the card is the fifth in the GF range. The first card to be released, although not officially announced, will be called the GeForce FX 5800 and will cost a mere USD 399, the same as the GF4 Ti4600 did upon release. If you think that's high-end, then consider the revved-up version, the GeForce FX 5800 Ultra, which will set its lucky owner back USD 499. nVidia with a Hint of 3DFX!
Admiral LSD Posted November 24, 2002 The GFFX requires an external power source, doesn't it? It's not hard to see where the 3Dfx influence comes in, is it? Seriously though, with a USD $399 minimum price tag (and a two-year lead before the technology becomes even remotely useful) they can forget it. Anyone who buys one of those at that price is a retard.
JP- Posted November 24, 2002 Does that mean I'm a retard for buying a 9700 Pro?
CyberGenX Posted November 25, 2002 Honestly, I never liked ATI cards other than for TV tuner/capture. Back in the day it was 3DFX, Nvidia and Matrox. It seems as though ATI is always plagued with some sort of problem or incompatibility with hardware/software, etc. It's not good when you have the fastest GPU (currently) and the slowest driver/compatibility timetables. I was a die-hard 3DFX fan, and with this newer GeForce coming out, I couldn't even ponder switching yet again. Nvidia should just say f*ck it and try to purchase ATI.
sapiens74 Posted November 25, 2002 Quote: Honestly, I never liked ATI cards other than for TV tuner/capture. Back in the day it was 3DFX, Nvidia and Matrox. It seems as though ATI is always plagued with some sort of problem or incompatibility with hardware/software, etc. It's not good when you have the fastest GPU (currently) and the slowest driver/compatibility timetables. I was a die-hard 3DFX fan, and with this newer GeForce coming out, I couldn't even ponder switching yet again. Nvidia should just say f*ck it and try to purchase ATI. This coming from an AMD/VIA fan?
tweaked Posted November 25, 2002 Quote: This coming from an AMD/VIA fan? I've got no problem with AMD... but I had NO IDEA there was such a thing as a VIA fan... :x I am going to wait for the 6-month refresh of the GeForce FX (NV35?); the refresh product is worth waiting an additional 6 months for. Hopefully with a 256-bit memory bus and 256 MB of memory. I wonder how Nvidia will handle that with their crossbar memory controller... 32-bit x 8 or 64-bit x 4? I want to see 256-bit memory with true DDR2 performance (4 reads per clock, not 2 like the GeForce FX uses)... 256-bit, 256 MB @ 2.0 GHz (effective). Memory bandwidth problems would be a NON-ISSUE for some time to come. I am too lazy to do the math, but I know that equals god d@mn fast. Don't get me wrong... the GeForce FX is very nice... but for $500+ I am gonna wait a little longer. I wanna play Doom 3 at 1600 x 1200, all bells and whistles on, with 60+ fps. I think (hope!) NV35 will do this.
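The math tweaked skipped is quick to do. A short sketch, using the clock and bus-width figures floating around this thread (rumored numbers, not official specs):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective transfer rate).
# The figures below are the rumored specs discussed in this thread,
# not official numbers from Nvidia.

def peak_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak bandwidth in GB/s for a given bus width and effective data rate."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

# GeForce FX as rumored: 128-bit bus, 1000 MHz effective DDR2
print(peak_bandwidth_gb_s(128, 1000))   # 16.0 GB/s

# tweaked's wished-for NV35: 256-bit bus @ 2.0 GHz effective
print(peak_bandwidth_gb_s(256, 2000))   # 64.0 GB/s
```

So the wished-for part would have four times the raw bandwidth of the rumored GeForce FX, which is indeed god d@mn fast for 2002.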
tweaked Posted November 25, 2002 Actually, I am kinda hoping Nvidia purchases AMD...
CyberGenX Posted November 25, 2002 No way man, Macs rule all the way!!!! :x
Brian Frank Posted November 27, 2002 Quote: Does that mean I'm a retard for buying a 9700 Pro? Yes. j/k It's nice to have the fastest thing on the block, but I have a really hard time justifying much over $200 for any single component. GFFX sounds cool, but the cooling requirements seem to be getting a little nuts. I haven't really read up on it, but I swear I've seen something about it needing a cooling solution like Abit's OTES cooler. If that's the case, that's gonna kinda suck for people who need every last available expansion slot. I hope the cooling market gets some better solutions out there, as we're really gonna need them.
ConQueso Posted December 4, 2002 Quote: GFFX sounds cool, but the cooling requirements seem to be getting a little nuts. I haven't really read up on it, but I swear I've seen something about it needing a cooling solution like Abit's OTES cooler. If that's the case, that's gonna kinda suck for people who need every last available expansion slot. Ya, it's definitely beefy. Click for more pics and AnandTech's GeForce FX preview. Oh, and on the topic of 3dfx, here is the avatar I use on other sites. Too bad this one doesn't have 'em. (It's an animated GIF; you may have to wait a few seconds for the 3dfx bit.)
Brian Frank Posted December 5, 2002 Good god! That is huge! I usually don't have enough PCI cards to necessitate using every PCI slot, and PCI 1 is usually left open... but still, the loss of a PCI slot isn't terribly attractive. I hope nVidia will do something to deal with the heat better.
adamvjackson Posted December 5, 2002 Generally, it's best to leave PCI 1 unpopulated anyway, as the IRQ can be shared with the AGP slot.
ConQueso Posted December 6, 2002 This cooling does have an advantage: the GPU won't put as much heat into the case. Overclockers, rejoice!
Ali Posted December 12, 2002 Quote: Good god! That is huge! I usually don't have enough PCI cards to necessitate using every PCI slot, and PCI 1 is usually left open... but still, the loss of a PCI slot isn't terribly attractive. I hope nVidia will do something to deal with the heat better. I'm waiting to see how they justify the ATX case and motherboard form factors!!!! I love ATI's All-in-Wonder series. Recently ATI has been good with drivers and support. Besides, I can call them anytime during the business day without paying a long-distance charge!!! I have problems with AMD. Somehow any AMD Athlon older than a year needs a bigger heatsink and fan than what it already has. In my experience they just overheat by themselves. I'm happy with my P4 for now.
Ali Posted December 12, 2002 Quote: This cooling does have an advantage: the GPU won't put as much heat into the case. Overclockers, rejoice! The memory clock is 1000 MHz and the GPU clock is 500 MHz. Question: why don't they just use a Pentium 3 CPU instead? It's going to be the same!!! Maybe even cheaper!!!
sflesch23 Posted December 12, 2002 Quote: Quote: This cooling does have an advantage: the GPU won't put as much heat into the case. Overclockers, rejoice! The memory clock is 1000 MHz and the GPU clock is 500 MHz. Question: why don't they just use a Pentium 3 CPU instead? It's going to be the same!!! Maybe even cheaper!!! Why would they do something as dumb as that? ;(
Ali Posted December 12, 2002 Quote: Why would they do something as dumb as that? ;( Because it's going to end up the same way anyways. They want faster GPUs just like they needed faster CPUs. Why not just modify the product that is already there instead of building a totally new one!!! Why not? That would be a revolution!
sflesch23 Posted December 12, 2002 Quote: Quote: Why would they do something as dumb as that? ;( Because it's going to end up the same way anyways. They want faster GPUs just like they needed faster CPUs. Why not just modify the product that is already there instead of building a totally new one!!! Why not? That would be a revolution! My question still stands: if they wanted a fast CPU, why would they use an overrated Chipzilla chip?
adamvjackson Posted December 12, 2002 CPUs and GPUs are completely different architectures, and they have different instruction sets. That's why.
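To put some substance behind that: a GPU's core job is applying the same small operation to millions of independent pixels in parallel, which is a very different problem from the branchy, serial code a Pentium 3 is built for. A toy sketch of the kind of per-pixel work a GPU parallelizes (the shading function and numbers are purely illustrative, not how real GPU hardware is programmed):

```python
# Toy illustration of per-pixel GPU work. Every pixel is independent,
# so a GPU applies this operation to thousands of pixels at once in
# hardware, while a general-purpose CPU would grind through them serially.

def shade_pixel(r, g, b, light):
    """Apply a simple lighting factor to one RGB pixel (channels 0-255)."""
    return (min(255, int(r * light)),
            min(255, int(g * light)),
            min(255, int(b * light)))

# tweaked's target above: 1600 x 1200 at 60 fps.
width, height, fps = 1600, 1200, 60
pixels_per_second = width * height * fps
print(pixels_per_second)               # 115200000 shading ops per second

print(shade_pixel(200, 100, 50, 0.5))  # (100, 50, 25)
```

At over a hundred million shading operations per second for just one simple effect, it is clear why a serial CPU, however fast, is the wrong tool for the job.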
sflesch23 Posted December 13, 2002 Quote: CPUs and GPUs are completely different architectures, and they have different instruction sets. That's why. I knew that already.
Ali Posted December 14, 2002 How do you keep the thing from falling out of your AGP slot?