
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. omegaman7

    omegaman7 Senior member

  2. sammorris

    sammorris Senior member

    I'll be honest, 2GB per GPU at 1920x1200 is still a bit of a waste. 1.5GB is actually sufficient for 2560x1600, even with 4xAA, for now.
     
  3. omegaman7

    omegaman7 Senior member

    So you think that 2GB at 2560x1600 is enough?
     
  4. harvrdguy

    harvrdguy Regular member

    Hmmmm - so Sam says 2 gigs per GPU is plenty for 2560x1600 even with 4xAA - nice to know. Your $389 2-gig 5850 looks pretty good, Kevin.

    Say guys, I should know this but I don't: is the 5850 significantly less powerful than the 5870, or is it just a matter of the clocks? And can it be OC'd to equal a 5870?
     
  5. shaffaaf

    shaffaaf Regular member

    Yep, the 5850s are BEAST overclockers that do get to 5870 FPS. But the 5870s are beasts for OCing as well, and then there is the whole extra heat thing.

    I've been looking at the prices of the 5 series, and I gotta say the 5830 at like £165 seems a steal now. Complete U-turn from what I was saying before. And what makes it better is that it's a BEAST OCer once again, and it's quite a lot less than the 5850's price. Surely the 5850 should drop. Here's hoping the GTX465 makes it drop, but I doubt it, seeing as it doesn't even hit 5850 speeds - just above the GTX275 in fact.

    And how is it that ATI can drop the price of the 5830 so much yet increase the 5850? :(
     
  6. Red_Maw

    Red_Maw Regular member

    I believe it means that if you're the second person to buy that card you also get a lifetime warranty.
     
  7. harvrdguy

    harvrdguy Regular member

    Haha. Now THAT makes a lot more sense than what I was thinking! :D
     
  8. sammorris

    sammorris Senior member

    2GB per GPU at 2560x1600 is plenty for now. Any game that needs more than 2GB is likely to need more performance than an HD5-series GPU setup can offer, since that would be at least 40% more memory than a current title uses (and 40% more memory implies much more than a 40% increase in demand on the GPU itself).
    The HD5850 is lacking in processing cores compared to the 5870, in addition to the reduced clock speed, but it doesn't have that much of an impact on performance: an HD5850 clocked to 5870 speeds isn't very far behind at all. I imagine that's something that will only show up with newer titles once the cards get a bit older.
    The HD5830's stable price (i.e. the price at which cards can regularly be found) is about £180-£190 at the moment. This makes it a good deal, especially when you use the performance figures from less biased benchmarks. What is interesting is that the site that showed the HD5830 in the most positive light (and in the performance area I would expect given its original price) also shows a considerable bias in favour of the GTX400 series, with the highest GTX400 performance of all the sites I've seen (GameGPU).
    Ultimately the HD5830 is a 'cheap through inefficiency' card. It performs quite well for the price it's offered at, but it's bigger and more power-hungry than its performance should really suggest. For that reason I'm glad its price has been cut more substantially, below the £200 mark.
    As for the increasing price of the HD5850: it's such an amazing card compared to the competition that sales are still very high even at the new price. UK retailers are regularly selling out of the big brands, even though supply, at least, is no longer a major issue.
    The GTX470, going by the more realistic benchmarks, does not best the HD5850 by any substantial margin, yet it still costs at least £280 for the cheapest brands. Any price premium at all, given the drawbacks of the GTX400 architecture, is still unacceptable in my opinion.
    The GTX465 doesn't look like it's going to do much to alter the market position; it will surely sit in the same position as the HD5830 - excess power draw for reduced performance - but how much it will cost for that performance will be interesting to see. Given the choice of the name GTX465 over 460 or 455, it shouldn't in theory be much different from a 470 in performance (indeed, the card is identical to a 470, just slightly crippled), whereas the 5830 is quite a substantial step down from the 5850.
    Were nvidia to get this right, it may well redeem the GTX400 architecture, as it is common knowledge that the bulk of graphics card sales are in the lower-end sector - which is why nvidia are still around: they're standing on the success of their older architectures for their cheaper products :p
     
  9. harvrdguy

    harvrdguy Regular member

    Wow, nice analysis Sam, as always.

    So while the 5850 is lacking in processing cores - 1440 compared to 1600 for the 5870 (10% less) - performance in today's games doesn't seem to suffer much at all. That's very interesting. The 5830, on the other hand, with 1120 processing cores (30% less than the 5870), is significantly less powerful, yet at the same time an energy hog.

    So it looks like the 5850 that Kevin linked to in post 6161 above, at $389, is indeed a really good value. I see the cheapest newegg 2 gig 5870 at $519.

    So comparing 5850 to 5870, both at 2 gigs, we're looking at 90% of the processing cores, which at equal clocks doesn't seem to make much difference in performance on today's games, like what - maybe 95-97% of the performance? - at only 75% of the cost.
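
    As a rough back-of-the-envelope sketch of that comparison, using only the figures quoted in this thread ($389 vs $519, 1440 vs 1600 cores; the ~95% performance number is just the guess above, not a benchmark):

    Code:
    # Rough HD5850 vs HD5870 value comparison using the numbers quoted in this thread.
    # All figures are the thread's own estimates, not measured benchmarks.
    price_5850, price_5870 = 389.0, 519.0        # USD, 2GB cards as quoted above
    cores_5850, cores_5870 = 1440, 1600          # shader core counts
    perf_5850_vs_5870 = 0.95                     # guessed relative performance at equal clocks

    core_ratio = cores_5850 / cores_5870         # ~0.90
    price_ratio = price_5850 / price_5870        # ~0.75
    value_ratio = perf_5850_vs_5870 / price_ratio   # relative performance per dollar

    print(f"cores: {core_ratio:.0%}, price: {price_ratio:.0%}, "
          f"perf per dollar vs 5870: {value_ratio:.2f}x")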

    And Jeff and Red both can vouch for the effectiveness of crossfire on the 1 gig 5850!

    Rich
     
  10. sammorris

    sammorris Senior member

    Don't forget the 5850 uses lower clocks, 725/4000 vs 850/4800. A 5970 2GB shares the 725/4000 clocks of a 5850, despite using the full 1600 shader cores of the 5870. Meanwhile the 5970 4GB uses 900/4800, but is 70% more expensive than the standard 5970 2GB.
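
    To put those cores and clocks together, a crude theoretical-throughput sketch (cores × core clock, relative to the HD5870) looks like this. It assumes perfect scaling for the dual-GPU 5970s and ignores memory bandwidth entirely, so treat it as a ceiling rather than a benchmark:

    Code:
    # Theoretical shader throughput = cores x core clock (MHz), relative to the HD5870.
    # Core counts and clocks are the ones quoted in this thread; real performance also
    # depends on memory bandwidth, drivers and the game, and dual-GPU cards never
    # scale perfectly, so this is only an upper bound.
    cards = {
        "HD5850":     (1440, 725),
        "HD5870":     (1600, 850),
        "HD5970 2GB": (2 * 1600, 725),   # two full GPUs at 5850 clocks
        "HD5970 4GB": (2 * 1600, 900),   # the pre-overclocked limited edition
    }
    baseline = 1600 * 850
    for name, (cores, clock) in cards.items():
        print(f"{name}: {cores * clock / baseline:.2f}x HD5870 (theoretical)")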
     
  11. Red_Maw

    Red_Maw Regular member

    Actually I haven't gotten around to being able to afford a second 5850 yet. Spent that money on my WC stuff.
     
  12. harvrdguy

    harvrdguy Regular member

    Red, what!!!!!!!!! (I should talk, lol)

    Wow, 70% more for 4 gigs versus 2. But with raised clocks.

    So might some of that be for the extra heavy-duty cooling? Or is it just a big premium for the rabid gamers with money to burn? Anyway, that's bound to come down, don't you think?
     
  13. omegaman7

    omegaman7 Senior member

    As I'm sure some of you have gathered, I'm no genius. At least not anymore :p
    So crossfire could be asking for more trouble than it's worth. At least with a 5970, the two GPUs have been "inner-crossfired", if you will ;) It makes most software/OSes believe that it's one SUPER GPU :D Correct me if I'm wrong on this, Sam. As I'm sure you will LOL!
    And besides that, I believe a 5970 takes less power than two 5850s. I don't want to buy two 5850s only to discover that my PSU can't take it. I do have other hardware not included in my sig ;) I prefer to have more than enough power rather than barely enough...

    I just like to relax and play an awesome game every now and again. Arkham Asylum seems to have some strange kind of bug: when I look around, strange artifacts are produced. Kind of like a Rubik's Cube - imagine spinning two rows at a time, but one row is slightly out of sync. That's what I'm dealing with.

    On another note, I got Windows 7 reinstalled, and everything seems to be back to normal. Though I still have a high percentage of software and games to reinstall. I plan on putting off networking my two PCs until I'm certain everything is hunky dory ;) I've gotten pretty good at the reinstallation game - I'll probably be done inside a couple of hours.
     
  14. harvrdguy

    harvrdguy Regular member

    Sam will clarify - but Kevin, I think that inner-crossfired isn't exactly the case.

    The 5970, as far as I know, appears to the outside world EXACTLY the same as two 5850s crossfired together. And with the later Catalysts, crossfire is working fairly well these days - again, Sam will give some pointers on Windows 7.

    There is also the advantage of the two cards, in that you don't have to have fast pci-e slots - you could even put one of the cards on a 4x slot, and lose only about 5% of your performance. The 5970, on the other hand, I believe will suffer if you knock it down from 16x to 8x speed - for example if you have something else on your motherboard that doesn't let the graphics card have all 16 lanes of pci-e.

    But power requirement-wise, you may have a point. Or maybe not. I don't see why two 5850s would use more power than a 5970 - fans or turbines don't use any real power. I guess I should look up the specs on the 5850, but I have to turn off the computer right now. But if the 5970 comes in just under 300 watts, then I would expect the 5850 to come in under 150 watts.
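
    (A trivial sanity check of that, using only the ballpark wattages guessed in this post rather than datasheet figures:)

    Code:
    # Ballpark board-power comparison from the figures guessed above (not official TDPs).
    power_5970 = 295           # watts, "just under 300" as estimated above
    power_5850 = 150           # watts per card, "under 150" as estimated above
    dual_5850 = 2 * power_5850
    print(f"2x HD5850 ~{dual_5850} W vs HD5970 ~{power_5970} W "
          f"(difference ~{dual_5850 - power_5970} W)")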

    Anyway, let's see how Sam clarifies this discussion, but I think you would be fine with two 2-gig 5850s in crossfire - your motherboard does have two full-size pci-e graphics card slots, right? Also, Jeff is running crossfire - let's see what he says. I thought Red was too, but the water cooling left him broke, lol.

    Rich
     
  15. omegaman7

    omegaman7 Senior member

  16. sammorris

    sammorris Senior member

    Rich: The 4GB 5970 is a limited edition card, as far as I can see, so I'm eager to get my mitts on one as soon as possible. Due to the increasing rarity of the cards, the price will go up if anything, not down. I have pre-ordered one for £840 (effectively $1035), on the premise that they will soon rise to the common price of £950 ($1173).
    The extra price is because it's two 2GB 5870s on one card; the 2GB 5870s themselves are worth £385 each here ($475), so the 4GB card can justifiably be $950 before accounting for any of its benefits. Amalgamating two GPUs on one card typically carries a premium (the HD5970, for example, typically costs around £30 more than two 5850s here), so that brings it up to at least $1000. Then consider the fact that the cooler is worth $50 by itself, and that you're buying a pre-overclocked card.
    Sadly, the extreme price of the card is actually pretty justified.
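
    Putting those pieces of the price together (all figures are the ones quoted in this post; the exchange rate is simply the ~1.23 $/£ implied by the £840/$1035 pre-order):

    Code:
    # Rough build-up of the 4GB HD5970's price from the figures quoted above.
    gbp_to_usd = 1035 / 840                # implied by the £840 / $1035 pre-order quote
    two_5870_2gb = 2 * 475                 # two 2GB HD5870s at ~$475 each
    dual_card_premium = 30 * gbp_to_usd    # ~£30 typical premium for a dual-GPU board
    cooler = 50                            # estimated value of the upgraded cooler
    estimate = two_5870_2gb + dual_card_premium + cooler
    print(f"component estimate: ~${estimate:.0f} "
          f"(vs ~$1035 pre-order / ~$1173 expected street price)")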

    Omega: No program can see the HD5970 as one GPU; it still exists as two GPUs in every sense, with all of the drawbacks, except for needing two slots of course. Additionally, since CF technology is not used the same way with dual cards, there are rare occasions when games simply aren't compatible. Fortunately, disabling Catalyst AI removes the link between the two cards and, the vast majority of the time, acts as if you've turned crossfire off.
    An HD5970 uses slightly less power than two HD5850s but there's very little in it. The main advantage of the 5970 over two 5850s is being a bit more efficient at idle.
    What you're describing in Arkham Asylum sounds like tearing, which is the reason VSync was invented. Try using it :p

    The HD5970 does not add two 5850s in series lol. Dual graphics technology relies on splitting a load between two GPUs. Since you can't split any particular job in half, the amount of work each GPU has to do is always different to the other, which means scaling is rarely 100%.
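
    One way to picture why that split is rarely 100% efficient: the frame is only finished when the busier GPU finishes, so any imbalance wastes part of the second GPU. A toy model (the split values are purely illustrative):

    Code:
    # Toy model of dual-GPU scaling: the frame is done when the busier GPU finishes,
    # so an uneven split wastes part of the second GPU. Split values are illustrative.
    def dual_gpu_speedup(work_share_gpu0: float) -> float:
        """Speedup over one GPU when GPU0 gets `work_share_gpu0` of the frame's work."""
        busiest = max(work_share_gpu0, 1.0 - work_share_gpu0)
        return 1.0 / busiest               # 0.50 -> 2.00x, 0.55 -> ~1.82x

    for share in (0.50, 0.55, 0.60):
        print(f"split {share:.0%}/{1 - share:.0%}: {dual_gpu_speedup(share):.2f}x")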
     
  17. omegaman7

    omegaman7 Senior member

    Thanks for that explanation. Silly me... I briefly forgot about the benefits of being dual/quad core. If Windows still used single-core processors, lockups would be inevitable. Games benefit in the same way, utilizing multiple GPUs and sharing the load. If one GPU stalls on a process, the other is there backing it up, so to speak ;) Though I doubt they stall, eh.

    LOL! Vertical Sync. You know, had I put a little thought into the definition of both those words, I might not have needed to ask. Boy do I feel silly LOL! Thanks for setting me straight.
     
  18. sammorris

    sammorris Senior member

    No, it doesn't really work that way: the GPUs have to be combined at the driver level so the game only perceives one device to send load to. The point is, the driver software still has to manage the load splitting. This is one of the reasons you need a slightly faster CPU for dual-graphics systems.
     
  19. omegaman7

    omegaman7 Senior member

    Ahhh! Well put. I think I have an understanding here then. Drivers certainly require capable CPUs ;) Especially when the drivers are coded for particular types of CPUs, e.g. particular instruction sets like SSE. I dealt with that a while back - Microsoft Silverlight requires a particular instruction set... SSE, if memory serves :p
     
  20. sammorris

    sammorris Senior member

    GameGPU have removed all the pointless GPUs from the tests, making the bottom end more tricky to calculate from now on :p

    As per usual, dual-GPU scaling is assumed at 80%, i.e. 1.8x, 2.52x, 3.24x.

    Need for Speed: World Online (AA excluded)
    Minimal: Radeon X1800XL/HD2900 series/HD3690/HD4650/HD5570 or above, Geforce 7800GT/8600GTS/9500GT/GT220 or above
    Reduced: Radeon HD3850/HD4700 series/HD5670 or above, Geforce 8800GS/9600GSO G92/GT240 or above
    Moderate: Radeon HD4850/HD5770 or above, Geforce 8800 Ultra/9800GTX/GTS250 or above
    Good: Radeon HD4850X2/HD5830 or above, Geforce GTX280/GTX275 or above
    Optimal: Radeon HD4890CF/HD5830CF or above, Geforce GTX295/GTX480 or above
    Extreme: Radeon HD5970 4GB/HD5830 TriCF or above, Geforce GTX285 Tri-SLI/GTX480 SLI or above

    Split Second: Velocity (AA excluded) - 30fps frame limit enforced by game engine
    Minimal: Radeon HD2900XT/HD3850/HD4670/HD5570 or above, Geforce 8800GS/9600GSO G92/GT240 or above
    Reduced: Radeon HD4770/HD4850/HD5750 or above, Geforce 8800 Ultra/9800GTX/GTS250 or above
    Moderate: Radeon HD4850X2/HD4770CF/HD5830 or above, Geforce GTX285 or above
    Good M30: Radeon HD4850X2/HD4770CF/HD5830 or above, Geforce GTX260 SLI/GTX470 or above
    Optimal M30: Radeon HD4870X2/HD5870 or above, Geforce GTX260-216 SLI/GTX470 or above
    Extreme M30: Radeon HD5870CF or above, Geforce GTX295 QSLI/GTX470 SLI or above

    Blur
    Minimal: Radeon X1800 series/HD2900 series/HD3650/HD4550/HD5570 or above, Geforce 7800 series/8600GT/9500GT/GT220 or above
    Reduced: Radeon X1950XT-X/HD2900 series/HD3850/HD4670/HD5570 or above, Geforce 8800GS/9600GSO G92/GT220 or above
    Moderate: Radeon HD3870/HD4700 series/HD5700 series or above, Geforce 8800GT/9800GT/GT240 or above
    Good: Radeon HD4860/HD5770 or above, Geforce GTX260 or above
    Optimal: Radeon HD4870X2/HD5850 or above, Geforce GTX260 SLI/GTX470 or above
    Extreme: Radeon HD5850 Tri-CF or above, Geforce GTX285 QSLI/GTX470 Tri-SLI or above

    Alpha Protocol (AA excluded)
    Minimal: Radeon X1800 series/HD2900 series/HD3650/HD4650/HD5570 or above, Geforce 7600GT/8600GT/9500GT/GT220 or above
    Reduced: Radeon HD2900 Pro/HD3850/HD4670/HD5570 or above, Geforce 8800GS/9600 series/GT220 or above
    Moderate: Radeon HD3870/HD4700 series/HD5700 series or above, Geforce 8800GS/9600GSO G92/GT240 or above
    Good: Radeon HD4850/HD5770 or above, Geforce 8800GTS G92/9800GT/GTS250 or above
    Optimal: Radeon HD4850X2/HD5830 or above, Geforce GTX260 or above
    Extreme: Radeon HD5970 or above, Geforce GTX275 SLI or above


    Looking good for the 2x HD5850 users at 1920x1200: nothing they can't handle at the Optimal setting here. 3/4 games also pass with the HD5870CF combo (albeit one only when overclocked to the 4GB HD5970 standard). Blur seems to scale quite strictly with resolution, but since all Extreme preset results are extrapolated due to GameGPU not owning a 30" monitor, the real-world result may not be as bad (or, who knows, could be worse). The estimation of 80% dual scaling may also play a part; if the scaling is closer to 100%, the 4GB 5970 may pull off Extreme for Blur as well.
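
    For reference, the assumed multipliers above can be applied as a quick estimator like this (the 80% figure and the 1.8x/2.52x/3.24x steps are the assumptions stated in this post, not measurements, and the single-GPU frame rate is a made-up input):

    Code:
    # Estimate multi-GPU fps from a single-GPU result using the scaling assumed above.
    # The multipliers are this post's assumptions (80% dual scaling), not measured data.
    ASSUMED_SCALING = {1: 1.0, 2: 1.8, 3: 2.52, 4: 3.24}

    def estimated_fps(single_gpu_fps: float, gpu_count: int) -> float:
        return single_gpu_fps * ASSUMED_SCALING[gpu_count]

    single = 40.0    # hypothetical single-GPU result
    for n in (1, 2, 3, 4):
        print(f"{n} GPU(s): ~{estimated_fps(single, n):.0f} fps "
              f"(or ~{single * n:.0f} fps with perfect scaling)")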
     
