Out of the top three, the 9600 is the best. Your card is most likely better because of its architecture. For example, when the 2900XT was released, everyone thought it would easily beat the 8800GTX - the 2900XT had more of, well, everything... However, the 8800GTX's architecture was so good that it was still faster than the 2900XT - and required less power at the same time. EDIT: About the screenshots, I will try to get one up tomorrow.
Save up for the 9600GT, you won't regret it. Maybe the price will drop by then too. I just checked your sig and you have a Halo Edition Xbox. That's pretty badass! I want both the Halo Edition Xbox and Xbox 360, but I got mine at launch and they didn't have the Halo editions back then.
Lol, thanks. I love the Halo Edition Xbox too. I love it even more since it's softmodded with XBMC 360 on it and a 200GB hard drive. I got my PS2 about a month after launch and, well, it lasted about 2 years, which was pretty good, and then died. Since then I haven't gotten anything at launch except games. I'll probably just save up for the 9600. I want to be able to laugh in the face of my friend who bought the 8800GT when it came out for a good 300+.
I also noticed that my mobo, the P5K-E WiFi/AP Edition, does not support PCIe 2.0. Can that feature be added in? Will I see a significant drop in performance when running a 9600GT on it?
I have nothing against overclocking GPUs, but you must NEVER fiddle with voltmods. Yes, they do get you higher overclocks, but I don't know a single person who's done it without ruining their graphics card. That's not to say it can't be done safely, but if that's how often it goes wrong, it's too risky. A voltmod-ruined GPU probably won't be covered under warranty anyway, and you'd need aftermarket cooling to keep up with that overclock as well. Core2kid: The 9600 is the better option. I don't think you can just add in PCIe 2.0 - it's a chipset feature, not something you can bolt on - but you won't see a difference with that card, I guarantee it. You do realise the 8800GT is still significantly better than the 9600?
Yeah, I noticed that, but at the resolutions I'll be running, it's not that much different from the 8800GT. Everyone here thinks it's a better buy than the 8800GT, though. My friend spent 300+ and I'll be spending around 150, so I win. lol.
The HD3850 and 9600GT are directly competing products. While I've no real objection to anyone going with the 9600, I personally think the 3850 would be the better option.
In tests, though, the 9600GT outperforms the 3850 by a bit, and they're both the same price. Any other opinions on the 9600GT vs the 3850?
Not really, that only applies when AA and AF are turned up, and the ATI card does a much better job of those anyway. If you're not using AF, the 9600GT gets beaten by the 3850 pretty reliably, which is why I recommend it. The 3850 is also easily overclockable.
As of now I'm not in for any new video cards. I should be getting a new one around June-July. I'll see what Newegg has to offer me then.
The industry will probably have completely changed by then, so we'll see what's out there when the time comes.
sam, i don't think that's true. ATI cards tremble with AA. i'm pretty sure it's the 3870 and 9600GT that are trading blows, not the 3850. even at 1680x1050 they're equal, but with AA the 9600GT pulls away. BUT the ATI cards have taken a price cut, making them even more pleasing, esp. for the price (and with a 3850 clocking to 3870 speeds, amazing for the price). it all depends on what res you run and what your price range is. imo i'd go for the 3850, because it can clock to 3870 speeds, which is very comparable with the 9600GT (unless you want HDMI), then the 8800GT.
Indeed, when you enable AA the 9600GT comes out on top. What I said is that ATI do a better job of rendering AA - the frame rate suffers a lot more because the cards are actually rendering AA properly, rather than the cut-down version nvidia employ. Since the ATI cards come with a DVI-to-HDMI adapter and support the standard through their DVI ports (HDCP included), I fail to see why HDMI is something that goes in nvidia's favour, except for the fact that you can't have two devices that use DVI and one that uses HDMI all plugged in at the same time. I don't even know if that's possible with the software anyway...
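To picture why "proper" AA costs more frame rate, here's a toy Python sketch (nothing to do with either vendor's actual hardware - just the general multisampling idea): estimate how much of a pixel sits below a diagonal edge by testing a grid of subpixel sample points. More samples get closer to the true coverage, which means smoother edges, but also more work per pixel - hence the bigger frame-rate hit for higher-quality AA.

```python
def edge_coverage(samples_per_axis):
    """Estimate the fraction of a unit pixel lying below the edge y = x,
    using an n x n grid of subpixel sample points (toy multisampling)."""
    n = samples_per_axis
    covered = 0
    for i in range(n):
        for j in range(n):
            # sample at the centre of each subpixel cell
            x = (i + 0.5) / n
            y = (j + 0.5) / n
            if y < x:  # this sample falls below the edge
                covered += 1
    return covered / (n * n)

# True coverage is 0.5; more samples = better estimate, more work.
print(edge_coverage(2))   # 4 samples  -> 0.25 (coarse)
print(edge_coverage(4))   # 16 samples -> 0.375
print(edge_coverage(32))  # 1024 samples -> 0.484375 (close to 0.5)
```

A cheaper, more approximate AA scheme is like stopping at the low sample count: the blended edge colour is further from the truth, but you did a fraction of the work.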
on another thought, i hope the 48XX series is CrossFireX-compatible with the 38XX series, because they're just evolutionary cards, not revolutionary. it's the same core (well, close enough) with double the number of stream processors and a speed bump from GDDR5 (which is rated to go to 2500MHz, 5000 effective) :O
Is that actually true about the 4800 series, though? I've yet to see anything that convinces me it isn't all made up. As for the DVI-to-HDMI converters, both Sapphire and PowerColor shipped them, and I daresay they're two of the more popular brands. I doubt you'd get them with the OEM cards, but hey, buying OEM cards comes with more issues than just a lack of HDMI converters... On the AA quality, image coming shortly. Okay, here: ATI no AA, ATI with AA, nvidia no AA, nvidia with AA. Look at the window frame towards the top on the left-hand side. It's only properly anti-aliased in the second frame. While nvidia have tried, it looks very similar to no AA at all. This is for an X1950 and 7900, but the same general rule applies for the current gen. In some games, nvidia do render AA properly, and in those games the frame-rate drop is what you'd expect - but when there's next to no frame-rate loss, this is why.
From Any Questions Answered: "Q. (08.44pm) Why does nvidia's method of anti-aliasing have less of an impact on frame rate than ATI's? A. NVidia's anti-aliasing calculations cause a lower frame rate hit than ATI's since their calculations are more approximate (lower edge smoothing quality)" Crysis doesn't anti-alias much, but it does AA some of the vegetation. http://resources.vr-zone.com/Firefox/Radeon HD3000/Crysis_3870_new.png http://resources.vr-zone.com/Firefox/Radeon HD3000/Crysis_8800gt_new.png Take a close look at those. Look carefully at the edges of the trees and you'll see that the edges are smoother on the Radeon. Unfortunately the only other game image quality tested for the cards is Gears of War, which doesn't support AA.
are you taking the piss, mate...? there is no difference, bar the placement of some leaves. you cannot tell the difference in AA from those two pics. the trees are but a few pixels at most, and telling the difference is near impossible. and as you can see, on anything near or far (bar the trees, as you claim) there is no difference in AA.