
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
I read a couple of reviews on Newegg saying that the 480 doesn't run as hot as people claim. Or at least with proper case cooling, they do just fine ;) I'll never know, though. By the time I can afford another card, whether it's Nvidia or ATI, there will likely be something else out...
     
  2. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
Just quickly: you've been doing the exact opposite for Nvidia. Jeff and I can see the GTX400 series for the pile of wasted silicon it is. If you disagree, you have no basis to criticise the 4GB HD5970 for its extra power consumption or price.

    More importantly though, there is no logic to using that argument on the 4GB HD5970. No pretentious fanboyism here, just look at the details.
    HD4870 original retail £160, HD4870X2 original retail £380 -> X2 card is 18% more expensive than two cards -> benefits of double memory per GPU, benefits of single card, same cooler used, noisier
    HD5850 current retail £230, HD5970 current retail £520 -> X2 card is 13% more expensive than two cards -> no double memory benefit, benefits of single card, same cooler used, noisier
    HD5870 current retail £320, 4GB HD5970 expected retail £750 -> X2 card is 17% more expensive than two cards -> benefits of double memory per GPU, benefits of single card, benefits of pre-overclock, better cooler used, quieter.
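    Those premium percentages fall straight out of the arithmetic; here's a quick sketch to check them (prices in GBP from the lines above; the function name is my own):

    ```python
    def x2_premium(single_price, dual_price):
        """Percentage by which a dual-GPU card exceeds the price of two single cards."""
        return (dual_price / (2 * single_price) - 1) * 100

    # Prices quoted above (GBP)
    print(f"HD4870X2 premium:   {x2_premium(160, 380):.1f}%")  # ~18.8%
    print(f"HD5970 premium:     {x2_premium(230, 520):.1f}%")  # ~13.0%
    print(f"4GB HD5970 premium: {x2_premium(320, 750):.1f}%")  # ~17.2%
    ```

    The exact values round to the 18/13/17% quoted above.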

    On paper the 4GB HD5970 is better value by comparison than the HD5970 and HD4870X2 that have gone before it. Therefore by your logic the HD4870X2 and HD5970 are duds too. They certainly weren't.

    The 4GB HD5970 is no everyman's card. Aside from the huge price tag placing it outside the reach of most gamers, the 4GB of total memory only benefits 30" monitor users and Eyefinity owners. As a 30" monitor user, I can clearly see scenarios where 1GB of video memory is limiting; there are at least three or four occasions where this is currently true. This situation will only get worse.
    Crysis Warhead on max detail uses about 1.4GB or so. That doesn't leave much headroom for the GTX480. Plenty left on the 4GB HD5970, though. Crossfire scaling is no problem; it achieves the same 77-78% in Bad Company 2 as normal cards. The substantial lead it has over the pair of 480s in the Bad Company 2 benchmark (13%) will apply to other games, and either place it on top performance-wise or at least greatly reduce the bias against it.
    On top of this, what about Eyefinity users? Sure, there may not be a big pack of titles yet that need more than 1GB per GPU at 4.096 megapixels. But what about 5.292? 6.912? Heaven forbid, 12.288 or 13.824! I'm sure there are games out there that two slightly overclocked 5870 GPUs could take on at such gargantuan resolutions, without the frame rate dropping anywhere near the levels you get when you run out of video memory.
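    For reference, those megapixel figures line up with plausible 30"/Eyefinity monitor layouts. A quick sketch (the layouts are my guesses; only the arithmetic is certain):

    ```python
    def megapixels(screens, width, height):
        """Total resolution of a multi-monitor layout, in megapixels."""
        return screens * width * height / 1e6

    # Hypothetical layouts matching the figures quoted above
    layouts = [
        ("1x 2560x1600 (30\")",        1, 2560, 1600),  # 4.096 MP
        ("3x 1680x1050",               3, 1680, 1050),  # 5.292 MP
        ("3x 1920x1200",               3, 1920, 1200),  # 6.912 MP
        ("3x 2560x1600",               3, 2560, 1600),  # 12.288 MP
        ("6x 1920x1200 (Eyefinity 6)", 6, 1920, 1200),  # 13.824 MP
    ]
    for name, n, w, h in layouts:
        print(f"{name}: {megapixels(n, w, h):.3f} MP")
    ```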
    Two GTX480s for £900, using four PCIe connectors and having a maximum power output of 710/532W, running at 95ºC at 95% of their maximum fan speed.
    Or, an HD5970 4GB for £750 or so, using two PCIe connectors, a maximum power output of 670/488W, running at 65ºC at 84% of its maximum fan speed, a maximum that is itself half that of a normal graphics card. This is Bad Company 2, and the thing is running quiet. It's pulling the same amount of power above the GTX480 as the GTX480 is above the GTX470, and it runs 30ºC cooler than its rivals without blowing your ears off.

    The fact that this card is £750-ish is almost immaterial; it's the best gaming graphics card there has ever been, and that's the end of it. They could charge what they like. But instead, they've priced it competitively. No matter how ridiculous £750 for one graphics card sounds, considering all of the above, and the competition, frankly, it's almost good value for money. Sad but true.
    A dud? I lol'ed. If it ran as hot and as loud as the GTX480, I'd still disagree, and put it down to the card's sheer ability. The thing's a monster. I fully intend to get my mitts on one. As for it being hard to find, I'm doing my best to get in on the pre-orders. I've spoken to Scan, who say they will let me know as soon as it's available to pre-order, and I will shortly be doing the same with other stores as well.
     
  3. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Agreed 100%. Excellent thrashing! lol
     
    Last edited: Apr 26, 2010
  4. sammorris

    sammorris Senior member

    As for the comment about these being hard to find, people seem to keep citing Sapphire as saying the cards are a limited edition. Through various research, I can't find one quote that states the Sapphire card will be a limited edition, only the Asus and XFX. The Asus makes sense, as the last ROG card they made was also a limited edition, and the XFX card states it's a limited edition in the title.

    Remember that the last ROG card Asus made was a limited edition of 1000, not just because it was exclusive and expensive, but also because it was a technological experiment. Dual GTX285s on the same card was never an official product from Nvidia, just another of Asus' many experiments, such as the HD3850X3. The HD5970 4GB is an official ATI product: ATI themselves have set the MSRP and given it its own codename. Got to love ATI's sense of humour: their midrange product is given the name of one of the tallest trees in the world, the high-end product gets the name of a smaller tree, their flagship dual card the name of a poisonous plant, and their ultra-high-end dual-GPU offering, a small garden flower!

    Although they may not make many more than 1000 units, I expect the Sapphire card to be the most numerous, because the 4GB HD5970 is worthy of a full product. The MARS295, at the time, was laughable, as it was a whopping 60% more expensive than the two cards it combined (albeit with extra memory), and this on top of the fact that the GTX285 was already absurdly overpriced.
    The 4GB HD5970 carries a much more modest 20-30% premium over the two cards it replaces (also with the additional memory), and the single card it clones is a well-priced offering already. Couple this with the disappointment that was the GTX400 series, and it makes pretty good sense for this card to be popular. I imagine the number of people who would, given availability, buy two GTX480s in SLI is very high, well over 1000: apart from Nvidia fanboys being Nvidia fanboys, that amount of graphical horsepower, and the dual-graphics prestige alone, convinces numerous people to buy stuff like that. So, as a direct competitor coming in at a lower price, running at half the heat and a fraction of the noise, having more memory, and still leaving the other 16x slot free for expansion, making the 4GB 5970 a limited edition across the board would be a missed opportunity.

    Make no mistake, I'd still much rather single graphics cards be sufficiently powerful to be able to do away with multi graphics altogether, but look what happened to the last card that tried to do that...
    With the ever-increasing pile of games that require hugely powerful GPUs to run, I don't think CF/SLI based products are going anywhere any time soon. Look at how much more popular SLI and Crossfire have become in the last 3 years. I suspect Crysis has had a lot to do with that.
     
    Last edited: Apr 26, 2010
  5. sammorris

    sammorris Senior member

    Grand Theft Auto 4: Episodes from Liberty City (AA excluded)
    Medium Textures, 60 View Distance, Otherwise Maximum
    Multi-GPU assumes 1.80x, 2.52x, 3.24x scaling

    Minimal: Radeon HD2900XT/HD3800 series/HD4670/HD5570 or above, Geforce 8800 series/9600GSO G92/GT220 or above
    Reduced: Radeon HD3850X2/HD4750CF/HD4770/HD4830CF/HD4850/HD5670CF/HD5750 or above, Geforce 8800GT/9800GT/GTS250 or above
    Moderate: Radeon HD4850X2/HD5770CF/HD5850 or above, Geforce 8800GTS G92 SLI/GTX280/GTX275/GTX470 or above
    Good: Radeon HD5830CF/HD5970 or above, Geforce GTX260-216 SLI/GTX295/GTX470 SLI/GTX480 or above
    Optimal: Radeon HD5850 Tri-CF/HD5970QCF or above, Geforce GTX280 Tri-SLI/GTX295 QSLI/GTX470 SLI or above
    Extreme: Geforce GTX470 OC QSLI / GTX480 QSLI

    Very High Textures, 20 View Distance, Otherwise Maximum

    Minimal: Radeon HD5570 or equivalent, Geforce GT240 or equivalent
    Reduced: Radeon HD5670CF/HD5750 or equivalent, Geforce GTS250 or equivalent
    Moderate: Radeon HD4860CF/HD5770CF/HD5850 or equivalent, Geforce GTX260 SLI/GTX295/GTX470
    Good: Radeon HD5830CF/HD5970 or equivalent, Geforce GTX275 SLI (not 280 or 295)/GTX470 SLI
    Optimal: Radeon HD5850Tri-CF or equivalent, Geforce GTX280 QSLI/GTX295 QSLI/GTX470 Tri-SLI/GTX480 SLI
    Extreme: Geforce GTX480 OC QSLI
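    The 1.80x/2.52x/3.24x multi-GPU scaling assumption above can be applied to a single-card result like so (a sketch; the 40 fps baseline is an invented example):

    ```python
    # Scaling factors assumed in the lists above for 2-, 3- and 4-GPU setups
    SCALING = {1: 1.00, 2: 1.80, 3: 2.52, 4: 3.24}

    def effective_fps(single_gpu_fps, gpu_count):
        """Estimated frame rate for a multi-GPU setup under the assumptions above."""
        return single_gpu_fps * SCALING[gpu_count]

    # Hypothetical card managing 40 fps on its own
    for n in sorted(SCALING):
        per_gpu = SCALING[n] / n * 100
        print(f"{n} GPU(s): {effective_fps(40, n):5.1f} fps ({per_gpu:.0f}% efficiency per GPU)")
    ```

    Note the 2-GPU factor of 1.80x is an 80% gain from the second card, in line with the 77-78% Bad Company 2 Crossfire scaling mentioned earlier in the thread.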

    CPU Requirement
    Maximal Settings for 1GB applied (Very High Textures, 1920x1080, 63 view distance)
    Limited to M27, A41
    Clock speeds based on Yorkfield Architecture
    M10: Single core 3.25GHz, Dual core 1.6GHz, Tri-core 1.35GHz, Quad core 1.3GHz
    M15: Single core 4.25GHz, Dual core 2.35GHz, Tri-core 2GHz, Quad core 1.95GHz
    M20: Dual core 3.25GHz, Tri-core 2.7GHz, Quad core 2.6GHz
    M25: Dual core 4.25GHz, Tri-core 3.95GHz, Quad core 3.7GHz
     
    Last edited: Apr 26, 2010
  6. andmill11

    andmill11 Regular member

    Joined:
    Sep 23, 2007
    Messages:
    802
    Likes Received:
    0
    Trophy Points:
    26
    It sounds like an Audigy SE is the way to go. I remember I had issues with one before and returned it, precisely because some of the features were locked out.
     
  7. sammorris

    sammorris Senior member

    The daniel_k drivers don't work for all systems. They are completely unusable for me in Windows 7, sound becomes corrupted and garbled as soon as they are installed.
     
  8. Estuansis

    Estuansis Active member

    That's actually pretty rare AFAIK. It seems to depend on your card. The very old Audigy SEs might need an older driver package, but I'm not entirely sure. I've installed the Daniel K drivers on 5 different cards and I've never seen an issue...
     
  9. sammorris

    sammorris Senior member

    Tested on two different Audigy SEs, one from 2006 and the other from 2010.
     
  10. omegaman7

    omegaman7 Senior member

    From what I've been reading, the Nvidia 480 doesn't necessarily run hot. If one has good cooling, it does not. Or perhaps the hot ones simply need their blocks reseated. I've heard that watercooling is a sweet spot for those. There are overclockers running barely over 80C. I imagine if I had four 120mm fans in the side of my HAF932, two 480s would do quite nicely. I'll probably never know. Those cards are way too power hungry for my liking LOL! The idea of needing a 1kW PSU does not sound appealing. Now, their next revision could be a good thing. Depends on how ATI vs Nvidia are doing in a few months, when I can afford to upgrade ;)
     
    Last edited: Apr 27, 2010
  11. sammorris

    sammorris Senior member

    In bad cases they run hot; in good cases they run warm, same as any high-end graphics card. The difference is, hot for a typical card like an HD4870 is 87ºC, and warm is 70ºC or so. Hot for a GTX480 is 96ºC, and warm is 85ºC or thereabouts. If Nvidia cards were built to withstand such temperatures, it would be no issue; all of the ones in the past, however, have not been. With an ATI card, if you need emergency extra cooling, you can turn the fan speed right up. Typically an HD5870 may run at 70ºC with a fan speed of 2200rpm; turn it up to its maximum of 5000rpm and it will run in the 40s, definitely no problem there. The GTX480 in many scenarios already runs at nearly 90ºC with the fan speed at 3000rpm or higher. The max fan speed of the card is 4000rpm, which is likely to get you down to the high 70s or low 80s. In a cramped environment, or in SLI, maximum fan speed is still going to be pushing 85-88ºC. That leaves very little breathing room.
    One HD5970, which is two high-end DX11 GPUs, uses less power at load than one GTX480. That should sum it up really.
     
  12. omegaman7

    omegaman7 Senior member

    Power requirements are definitely the major turn-off LOL!

    I have no doubt I could cool the beast, but the power requirement is absurd...
     
    Last edited: Apr 27, 2010
  13. sammorris

    sammorris Senior member

    There's not much reason to use 270W to run a GTX480 when 170W to run an HD5870 does an almost identical job!
     
  14. andmill11

    andmill11 Regular member

  15. sammorris

    sammorris Senior member

    Not unless in a quad-threaded game. There aren't very many of those yet.
     
  16. omegaman7

    omegaman7 Senior member

    LOL! I love it. Quad cores aren't even fully supported, and they're releasing yet another monster: 6-core behemoths LOL! And a 12-core in the not-so-distant future ;)
     
  17. andmill11

    andmill11 Regular member

    It was either going to be a new processor or a JTAG'd Xbox. I guess I will go with the Xbox.

    I am still having issues with bad company 2 as well :(

    I can play as long as I want it seems if the game doesn't have punkbuster.

    (I asked about the processor because BC2 supports it)
     
    Last edited: Apr 27, 2010
  18. omegaman7

    omegaman7 Senior member

    Hey Sam. Remember once upon a time when I said "I don't believe the Samsung 2433BW would support 2 simultaneous video signals"? Well... now I'm thinking it can. I just looked once again at its capabilities, and it does have the ability to switch from D-sub to DVI. Tonight I'll find out if I'm in hog heaven :D I just lost my other display. It was actually my mother's; she was ready for it. I'm bordering on tears welling up LOL! Now I REALLY want a larger display :p
     
  19. sammorris

    sammorris Senior member

    Eh? All monitors that have two inputs can do this...
     
  20. omegaman7

    omegaman7 Senior member

    But you more or less just shrugged last time. See... the Asus monitor actually listed this on Newegg's site. The Samsung did not. However, now they list it. Convenient, eh? LOL!
     
