Forum: Geek Forum Topic: Summoner started by: whtdrgn_2 Posted by whtdrgn_2 on Apr. 17 2001,12:43
Hello everyone... I haven't had any posts lately because I have been so busy learning to hate Java file streams... that and my new addiction: Volition's new 3D RPG, Summoner. My only problem is this. I have a K6-2 500 with 392 MB of RAM and a 3dfx Voodoo 3 3500 TV with 16 MB of RAM. I really want a new video card but I am afraid of what to get (I write code, I don't design video cards). To tell the truth, Summoner is the first game for the PC that I have really played since Civilization. I want to get a Radeon card, for two reasons. One, under Linux the new XFree86 4.0.3 says they are the best supported, and two, I can get it with 64 MB of DDR RAM. What I want to know is, do these cards hold up under Windows, or should I hold out for a GeForce 3? I have always had a thing for ATI, and their chips are very well supported under Linux... but to play Summoner I need it to rock under Windblows. Just looking for some expert Geek opinion... ------------------ Posted by Vigilante on Apr. 17 2001,13:10
The Radeon is a solid card. It's outperformed by the various GeForce 2s under Windows, but you'll surely notice an improvement over the Voodoo3. Don't try to play Tribes 2 with it, though. Posted by DuSTman on Apr. 17 2001,14:01
I have the AIW Radeon. In short, it rocks, and I'd pick one over a GF2 any day.

The principal architectural difference between the GF2 and the Radeon is that the GF2 has four pixel pipelines with two texture lookup units per pipeline, whereas the Radeon has two pixel pipelines with three texture lookups per pipeline. Using bilinear filtering and only two textures, the GF2 will be speedier than the Radeon per clock cycle (although not 2x: the benefit of each additional pixel pipeline is less than the benefit of the one before it, due to memory bottlenecking). However, that third texture lookup means you can achieve some extra effects on the Radeon with comparatively little slowdown. Trilinear filtering, for example, needs the value of a texture from two separate MIP maps to determine a pixel's eventual colour. Which is fine if you happen to have a texture unit spare, as you do on the Radeon (assuming one lookup is being used for lightmaps), but without the extra texture lookup, trilinear filtering slaughters the framerate on the GF2.

There are other benefits of Radeons, such as support for all the current methods of bump mapping and the ability to do compressed textures (the GF2's S3TC implementation is a little fuX0red). Anisotropic filtering is nice and doesn't really cost a great deal of performance. Then there are the HyperZ technologies, priority buffers, and vertex shaders (there are some little bugs in this; I'm not sure whether they're silicon or driver problems, though).

On the down side, the quality of 16-bit colour on the Radeon isn't great, sometimes even looking a bit like parchment if a lot of multitexturing/alpha blending is going on, but 32-bit is free from those problems. New drivers seem to be released every two to three weeks. I'm sure there's still stuff I haven't mentioned, but a Radeon is my recommendation. Posted by L33T_h4x0r_d00d on Apr. 17 2001,14:16
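The trilinear filtering point above can be sketched in code: take a bilinear sample from each of the two MIP levels nearest the desired level of detail, then blend them by the fractional LOD. That second per-pixel lookup is exactly the cost the post is talking about. This is an illustrative software sketch (all function names are made up, textures are single-channel grids), not a description of how the hardware actually wires it:

```python
# Sketch of trilinear filtering: bilinear samples from two adjacent
# MIP levels, blended by the fractional LOD. Illustrative only.

def bilinear_sample(mip, u, v):
    """Bilinear filter: weighted average of the 4 nearest texels
    in one MIP level. `mip` is a 2D list of values, u/v in [0, 1]."""
    h, w = len(mip), len(mip[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = mip[y0][x0] * (1 - fx) + mip[y0][x1] * fx
    bot = mip[y1][x0] * (1 - fx) + mip[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def trilinear_sample(mips, u, v, lod):
    """Trilinear: blend bilinear results from the two MIP levels
    around `lod`. Note the TWO bilinear lookups per pixel; this is
    the extra texture unit the Radeon has to spare and the GF2 doesn't."""
    lo = min(int(lod), len(mips) - 1)
    hi = min(lo + 1, len(mips) - 1)
    f = lod - int(lod)
    return (bilinear_sample(mips[lo], u, v) * (1 - f)
            + bilinear_sample(mips[hi], u, v) * f)
```

With only one free texture unit, the two `bilinear_sample` calls have to be serialized into a second pass, which is where the GF2's framerate hit comes from.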
From personal experience, looking at the same game on two virtually identical machines, with the exception of the video card: a GeForce 2 GTS kicked the Radeon's ass. This was running on two PIII 750 machines with 640 megs of RAM. There was only one game that came close on the Radeon, and I think it was Q3. You could get the GeForce 2 Ultra and blow the GTS card away, much less a GeForce 3. ------------------ Posted by whtdrgn_2 on Apr. 18 2001,14:15
So it looks like I should spend the 250 bones and get the 64 MB Radeon then... Cool, thanks guys... When it comes to 3D and all that I am left behind... I am a DB programmer and don't really care about bilinear filtering or whatever the hell you call it... Now if you wanna talk about table cache, buffer cache, and write cache settings for MySQL, then give me a holler... Oh yeah, and I can only do AGP 1x on my motherboard; will the bottleneck be so bad I won't notice a difference? ------------------ Posted by DuSTman on Apr. 18 2001,15:08
Err... that's not good news. The Radeon NEEDS at least an AGP 2x interface. Same with the GeForce 2 line, I think.
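For a rough sense of what the AGP modes mean in bandwidth terms, a back-of-the-envelope sketch. AGP uses a 32-bit bus at a nominal 66 MHz, transferring once, twice, or four times per clock for 1x/2x/4x; the clock is really 66.67 MHz, which is why 1x is usually quoted as ~266 MB/s rather than the 264 this rounds to:

```python
# Back-of-the-envelope peak AGP bandwidth. Nominal figures only;
# sustained real-world throughput is lower.

BUS_WIDTH_BYTES = 4   # 32-bit bus
CLOCK_MHZ = 66        # nominal; spec clock is 66.67 MHz

def agp_bandwidth_mb_s(mode):
    """Peak bandwidth in MB/s for AGP 1x/2x/4x (mode = 1, 2, or 4):
    the bus transfers `mode` times per clock cycle."""
    return BUS_WIDTH_BYTES * CLOCK_MHZ * mode

for mode in (1, 2, 4):
    print(f"AGP {mode}x: ~{agp_bandwidth_mb_s(mode)} MB/s peak")
```

So 1x really does halve the peak transfer rate versus 2x, which matters most when textures spill out of the card's local memory and have to stream over the bus.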
Posted by aventari on Apr. 22 2001,01:27
quote: Naw, you can get a 64 meg DDR Radeon for 165 bucks. If it were me, I'd buy a 32 meg GF2 GTS. 134 bucks. The 64 meg is 200, which isn't worth it IMHO. ------------------ Posted by DuSTman on Apr. 22 2001,09:13
While we're on the subject of video cards, I'm going to make a bold statement: the GeForce 3 is not good enough. The main thrust of video card competition is nVidia vs ATI. Don't get me wrong, the GeForce 3 is the most advanced chip there is at the moment, but the fact is that many features nVidia is just now adding to the GF3, the Radeon 1 already had (vertex shaders, anisotropic filtering, etc). This means that ATI are not going to have a very hard time catching up with the GeForce 3's level of complexity, and this in turn gives ATI a development time advantage on this cycle: they'll be able to equal the GeForce 3's feature set and also add some extra stuff to it. Rumour has it they're working on putting a 3dfx-style HSR algorithm on chip. Rumour also has it that the R200 is 99% ready, and so are the drivers. The GeForce 3 is a very nice chip, but I don't think it's good enough to win this next round of the chip wars. Posted by CatKnight on Apr. 22 2001,13:11
Nor is it worth the price. I think it's mainly for developers who want to program new games with the nfiniteFX engine. Gamers won't notice a big difference. BTW, I have a GeForce 256, which I am upgrading to an Ultra 64 soon. Posted by askheaves on Apr. 22 2001,14:24
Dude. The GeForce3 has one of the coolest features ever... totally programmable pixel shaders. OOOoohhhhhhh. Basically, you can manually program effects: shading, anti-aliasing, specular effects... whatever you want. You're not limited by the effects they could hardwire into the thing. The thing isn't a whole lot faster than the GeForce2, but it's so much more expandable. It's a giant leap that ATI is gonna be hard pressed to match. Posted by DuSTman on Apr. 22 2001,20:21
quote: Programmable pixel shaders are a fairly awesome feature, yes, but the majority of effects you can accomplish using them are theoretically possible using simple multitexturing techniques (though not quite as neatly, and often somewhat less efficiently).

In some beta drivers that ATI released, there was an option you could add to the registry which exposed an experiment by the driver writers: the idea was to try to emulate DX8 programmable pixel shaders using multitexturing techniques. I played about with it in the DX SDK pixel shader assembler. It didn't quite support all of the language, but it did support the vast majority of it.

The Radeon was designed basically from a pre-release of the D3D8 spec, and many of the major architectural improvements that went into the Radeon had this in mind. Accordingly, the Radeon's rasterisers are actually programmable units; although not fully up to the DirectX spec, they allow greater flexibility to the driver writers. Basically all they need to do is spit-and-polish the design for the D3D8 pixel shader language, expose the functionality in the driver, and have done. It's an exciting feature, but it's not going to be a difficult one for ATI to add. Posted by Sithiee on Apr. 22 2001,22:27
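The point about emulating a simple pixel shader with multitexturing can be illustrated in miniature: a shader program like "base colour, modulated by a lightmap, plus a glow map" is just a chain of fixed-function combine stages, each feeding its result to the next. This sketch works on a single colour channel; the stage names and structure are made up for illustration and don't correspond to any real driver API:

```python
# Illustrative sketch: a tiny "pixel shader" (base * lightmap + glow)
# expressed as cascaded fixed-function multitexture combine stages.
# Values are single colour channels in [0, 1].

def combine_stage(op, a, b):
    """One fixed-function combine stage: the kind of hardwired op
    multitexturing hardware offers per texture unit."""
    if op == "modulate":
        return a * b
    if op == "add":
        return min(a + b, 1.0)   # saturate, as the hardware would
    raise ValueError(f"unknown combine op: {op}")

def run_stages(stages, initial):
    """Feed each stage's result into the next, like cascaded texture
    units. A programmable shader replaces this fixed menu of ops
    with an arbitrary per-pixel program."""
    result = initial
    for op, operand in stages:
        result = combine_stage(op, result, operand)
    return result

# Equivalent of the shader "base * lightmap + glow", per channel:
base, lightmap, glow = 0.8, 0.5, 0.2
color = run_stages([("modulate", lightmap), ("add", glow)], base)
```

The limitation is visible in the sketch: each stage only gets a fixed menu of ops and one operand, so effects that need arbitrary arithmetic or texture-address dependencies fall outside what the emulation can express, which is presumably why the driver experiment covered "the vast majority" of the language rather than all of it.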
ATI blows goats, nvidia wins. suck it up, doofus. :P
Posted by DuSTman on Apr. 23 2001,04:55
Well, when you put it like that.