
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Just to show that not all recent games are hideously demanding, here's something I may never play (but will read reviews of just in case), a recent addition to GameGPU's test page:

    Superstars V8 Next Challenge
    Minimal: Radeon HD3650/HD4650/X1800/HD2900/HD4700/HD5500 series and above, Geforce GT220/7800/8600/9500 series and above
    Reduced: Radeon HD2900 Pro/HD4670/HD3800 series/HD4700 series/HD5500 series and above, Geforce GT220/8800GS/9600GSO (G92)/9800 series and above
    Moderate: Radeon HD3870/HD4700 series/HD5700 series and above, Geforce GT240/8800GT/9800GT and above
    Good: Radeon HD4850/HD5770 and above, Geforce GTS250 and above
    Optimal: Radeon HD4870X2/HD5800 series and above, Geforce GTX275/GTX280 and above
    Extreme: Radeon HD5970 and above, Geforce GTX275 SLI and above

    There, that's some hope for those who want to play recent games at high graphics without spending several hundred pounds on graphics hardware. It's worth noting that a fluid 60fps minimum can be achieved at 1680x1050 using an overclocked HD4890, or assuming crossfire scaling, two HD4850s. An HD5850 of course will be achieving more like 70 minimum, and will be able to pull off 60 minimum (just) at 1920x1200.
     
  2. harvrdguy

    harvrdguy Regular member

    Joined:
    Sep 4, 2007
    Messages:
    1,193
    Likes Received:
    0
    Trophy Points:
    46
    Yeah, sorry about that Schwaber, it was just an old discussion about my ancient P4 netburst being half as fast, at identical clocks, as modern core 2 duo, (never mind nehalem which is even faster.)

    Oh wait - schwaber - do you have a p4 too, or is shaff thinking of me? (If you do have a p4, then sorry for my confusion - I thought you had core 2 duo and that I was the only one still using netburst. But I DO play Left 4 Dead 1 and 2 - 2 is more of a drain on the p4, and I can only run 1440x900, whereas left 4 dead 1 I can run 1920x1200. Your GTS250 will be much faster than my agp 3850.)
    No, Shaffaaf, I can't get to 2560 as per above - except on old games like Far Cry, or Half Life 2.

    What's your processor Schwaber?

    Oh. Well, then I see you and I are still on netburst. As I recall, Pentium D - pushing past Pentium 4 - offered the only true dual core version of netburst - two truly independent cores, no hyperthreading between cores like my P4. (Although, as I mentioned, at identical clocks the throughput of each core is only half that of the new core 2 duo architecture, and maybe only 35-40% of the nehalem architecture like Sam's i5.)

    Although your dual core should handle L4D fine, when I run Left 4 Dead I assume my hyperthreading somehow helps game performance, so I keep it turned on - for example, I found that 3DMark06 DOES benefit from hyperthreading, giving me about 300 points more than when I have it turned off. By the way, my 3DMark06 score is now 6,169; a few months ago it was only 5,162 - so it jumped slightly over 1,000 points when I put in the new 670 P4 at 4GHz, with twice the L2 cache at 2MB, replacing the 3.2GHz (slightly overclocked to 3.36GHz).

    So Schwaber, what 3DMark06 score do you get with your Pentium D and your GTS250? I think you can download the basic test for free.

    Speaking of 3DMark06, wow Sam, 18,909 running your i5 at stock clocks. I still remember when you had the best 3DMark06 score at 7,000. So ultimately, what score do you think you can reach - 25,000? - Oh, there it is, a day later: 24,806, overclocking from 2.67 to 3.6GHz. I knew it! Hahaha!

    I agree with Sam, Schwaber, the 1440x900 looks really nice on Left 4 Dead 2!

    And to schwaber and omega and sam - according to Miles, the entire company at Valve all went together to see Zombieland when it first came out, lol. I don't know if the zombieland guys were inspired by Left 4 Dead original - I don't know how long the film had been in production - but you're all correct - quite similar!!

    Boozer jumped in - been a long time - not a gamer. I thought, boozer, that you mentioned you were going to run left 4 dead on your laptop!

    Omega's on GTA IV again. Sam wants to see how his new i5 handles it. I tried it with my new CPU and it was ungodly AWFUL!! The extra CPU power hardly did anything - still around 10 fps. I don't know how I could stand it 6 months ago!

    Jeff and Shaff on the new Battlefield BC. Those screens look awesome, shaff - SNOW!

    Jeff, as much as I like Battlefield 2 with Dragon Valley and similar maps - especially Dam under Construction - you had mentioned Battlefield Vietnam one time - do you still play that at all?

    (I've been gone for a while, just like boozer. I'm suiting up with a tie every day now - got a new real estate listing, and hopefully others will be coming - might have a spare $3,000 to put toward a new build within about 6 months. I'll probably stay with the spedo case and just add a bunch of fans.

    But I'm halfway wondering if I should wait for ATI's future 6000-series architecture, which will be sure to crush Crysis at 2560x1600. When are we going to see anything like that, summer of 2011?)


    - Rich
     
  3. sammorris

    sammorris Senior member

    Here's a very vague 'mhz for mhz' comparison

    Core i5/i7: 3.0Ghz
    Core 2 Quad (45nm): 3.6Ghz
    Core 2 Quad (65nm): 4.15Ghz
    Phenom II: 4.05Ghz
    Phenom: 4.5Ghz
    Athlon64: 4.6Ghz
    Pentium 4 (Prescott): 7.5Ghz
    Pentium 4 (Northwood): 7.3Ghz
    Athlon XP: 7.8Ghz
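    Taken at face value, the list can be folded into a quick per-clock calculator: divide 3.0GHz by an architecture's equivalent clock to get its rough throughput per MHz relative to an i5/i7. A minimal sketch of that arithmetic (the function names are my own, purely illustrative):

```python
# Rough per-clock efficiency implied by the 'mhz for mhz' list above:
# each entry is the clock (GHz) needed to match a 3.0GHz Core i5/i7.
equivalent_clock = {
    "Core i5/i7": 3.0,
    "Core 2 Quad (45nm)": 3.6,
    "Core 2 Quad (65nm)": 4.15,
    "Phenom II": 4.05,
    "Phenom": 4.5,
    "Athlon64": 4.6,
    "Pentium 4 (Prescott)": 7.5,
    "Pentium 4 (Northwood)": 7.3,
    "Athlon XP": 7.8,
}

def relative_ipc(arch):
    """Per-clock throughput relative to Core i5/i7 (= 1.0)."""
    return 3.0 / equivalent_clock[arch]

def i5_equivalent_ghz(arch, actual_ghz):
    """Clock an i5/i7 would need to roughly match this CPU."""
    return actual_ghz * relative_ipc(arch)

# e.g. Rich's 4GHz Prescott P4 is very roughly an i5 running at:
print(round(i5_equivalent_ghz("Pentium 4 (Prescott)", 4.0), 2))  # 1.6
```

    Which is why even a modest modern dual core walks away from an overclocked Netburst chip.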

    As for Left 4 Dead, changing the resolution has no effect whatsoever on the CPU. If lag can be cured by cutting the resolution, it's a graphics limitation, not a CPU one. For reference, a GTS250 is roughly 100-110% faster than a PCIe HD3850. Not sure how much AGP affects the performance.
    The Pentium D itself did not have hyperthreading - only the Pentium Extreme Edition paired two cores with HT for four threads. Either way, the Pentium D is not a fast CPU: while it's dual core, it's still the same Netburst architecture as the P4 Prescott. It also depends which CPU he has; the Smithfield (D800 series) was far worse than the Presler (D900 series).
    Left 4 Dead 2 makes good use of multiple CPU cores - I see 60-70% CPU usage regularly with it (though that is because I have the graphics power to back it up). Thus, even a Pentium D will be twice as fast as your P4, as long as the graphics card can keep up. At 1920x1200, an HD3850 won't, but at 1440x900 it probably will for the most part, at least with Netburst CPUs.
    Hyperthreading often helps with multi-threaded CPU tasks, and the 3DMark06 CPU tests are among them. However, the graphics tests, much like some real games, are single-threaded and will likely take no notice. Also be advised that HyperThreading reduces performance in some games, sometimes substantially - this is one area where the i5 750, which lacks it, beats its Core i7 peers. (Of course, you can disable HT.)
    GTA4 is even more demanding of quad-core CPUs: it used 70-80% of my Q9550 when I ran it the first time. I will have to try again with the i5; I haven't done so yet.
    As far as future graphics are concerned, right now there are two big things on the horizon: the GTX480 from nvidia (at long last), which will be press-launched (but who's to say about availability? Probably as bad as the HD5870 when first released, i.e. long waiting lists - it's worth mentioning that in the UK at least, there is still a 6-week to 2-month waiting list for an HD5970), and an HD5890 from ATI, intended to rival said GTX480. As far as an HD6000 series is concerned, nothing for a long time - I would hazard a guess at around Spring/Summer 2011, but there is absolutely nothing about any plans whatsoever at this stage; it's a long way off.
    Fortunately, while lots of games are stepping up to Crysis-esque hardware requirements - Shattered Horizon, STALKER Call of Pripyat, Cryostasis and Battlefield: Bad Company 2 being recent examples - few seem intent on pushing the boundaries that much further, which is good. It is my hope that Crysis 2 does not extend the situation as far as its predecessor once did, as the graphics card market is finally starting to catch up with the demands of games.

    A year and a bit ago, using a 30" resolution even on medium-high settings was a pipe dream for several games. With the DX11 architecture, albeit with large and hugely costly crossfire/SLI setups, it's finally doable in a lot of cases. If you can live without AA (which, at 2560x1600, a lot of people can - needless to say, I still want it) and can wait for them to arrive, plop two HD5970s in and play Crysis Warhead maxed out on a 30" monitor; it won't lag very much, if at all.

    The caveat here is that such demanding games finally show their true colours when you have enormous amounts of graphics power. Crysis Warhead on max requires at least 8GB of RAM, ideally 12GB if you're using Vista instead of Windows 7. It also demands a massive data rate from your hard disk, so if it's anything but a gaming-grade F1/Caviar Black/F3 drive that's defragmented and relatively empty, it's SSD time. Further to this, you also need stackloads of CPU power: anything short of a 3.6GHz i5/i7 is going to see you with lag spikes in Warhead once you max it out. This all goes for other games to lesser extents (except GTA4, where the CPU and RAM issues still apply, just not the hard disk). It really is an 'only as good as the weakest link' scenario. For the first time in a while, having good components throughout the system - power (for running 4 GPUs!), CPU, RAM, storage and cooling - is essential to play games the way you want. Doesn't seem fair, does it? :p


    Oh by the way Rich, did I not show you these?
    [3DMark Vantage result screenshots]

    The Vantage CPU speed isn't entirely correct. 3920mhz is the basic clock speed of the CPU with the standard multiplier. However, thanks to turboboost, you can permanently raise the ratio from 20 to 21, giving 4116mhz, which is where I am at, and have been since a few hours after building the system. I could probably push further since I haven't had any crashes at all, but am happy where I am for now.
     
    Last edited: Feb 24, 2010
  4. sammorris

    sammorris Senior member

    Shattered Horizon performance comparisons
    Not totally unbiased, but a rough idea of how different GPUs compare in a demanding environment.

    High End (6.6x)

    GTX260: 85%
    GTX260-216: 95%
    HD4870: 100%
    GTX280: 105%
    HD4890: 110%
    GTX275: 115%
    GTX285: 120%
    HD5850: 160%
    HD5870: 180%

    Mid-range (4.7x)

    GT 240: 70%
    HD3870: 70%
    8800GT: 75%
    9800GT: 75%
    HD4830: 85%
    8800GTS G92: 85%
    8800GTX: 85%
    HD4750: 90%
    9800GTX: 90%
    GTS250: 100%
    HD5750: 105%
    HD4770: 110%
    HD4850: 115%
    HD4860: 115%
    HD4870: 140%

    Low-end (2.2x)

    HD3690: 70%
    9600GSO G94: 70%
    GT 220: 70%
    HD2900 Pro: 85%
    HD2900XT: 100%
    HD4670: 100%
    8800GS: 105%
    9600GSO: 105%
    8800GTS G80: 105%
    HD5570: 105%
    9600GT: 120%
    HD3850: 120%
    HD5670: 125%
    GT 240: 150%

    Bottom-end (1x)

    HD3450: 10%
    8400GS: 10%
    G210: 20%
    8500GT: 30%
    9400GT: 40%
    HD2400XT: 40%
    HD3470: 70%
    HD4450: 70%
    HD4470: 70%
    HD2600XT: 70%
    HD5450: 85%
    HD4550: 85%
    8600GT: 100%
    HD3650: 115%
    8600GTS: 115%
    9500GT: 115%
    HD4650: 140%
    HD3690: 155%
    GT 220: 155%
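    The tier multipliers in the headings (6.6x, 4.7x, 2.2x, 1x) let the four lists be collapsed onto one absolute scale: multiply the in-tier percentage by the tier multiplier. Cards that appear in two tiers line up almost exactly, which is presumably the point. A quick sketch of the arithmetic (function name is illustrative, not from the post):

```python
# absolute score = tier multiplier x in-tier percentage / 100
def absolute_score(tier_multiplier, percent):
    return tier_multiplier * percent / 100.0

# HD4870: 100% of high end (6.6x) vs 140% of mid-range (4.7x)
print(round(absolute_score(6.6, 100), 2))  # 6.6
print(round(absolute_score(4.7, 140), 2))  # 6.58

# GT 240: 70% of mid-range (4.7x) vs 150% of low-end (2.2x)
print(round(absolute_score(4.7, 70), 2))   # 3.29
print(round(absolute_score(2.2, 150), 2))  # 3.3
```

    The small residuals are just rounding in the published percentages.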
     
    Last edited: Feb 25, 2010
  5. shaffaaf

    shaffaaf Regular member

    Joined:
    Jan 5, 2008
    Messages:
    2,572
    Likes Received:
    4
    Trophy Points:
    46
    3690? never heard of that.


    BTW, the HD5830 was released today - good card, horrible price... (again), though this may be due to price gouging for the first month of sale.

    Also, GTX275-level speed isn't too impressive, seeing as a 4890 will do that with 800 SPs versus the 1120 of the 5830.

    If they took this to about £150-£160 it would be very decent. TBH I think most of the cards need a £20 drop as well. Come on Fermi, hurry the F up.

    So the 5850 cost £220 before, now they are going up to £250, and this will sit at £200.

    That's just facepalm material, ATI.

    The whole point of this card was to sit in between the 5770 and 5850, which it does, but then they go and price it at the level of the 5850, when it should be in the price range between the two. Even then, HD4890 performance is pretty poor IMO for a card which should be better.


    If anything, the 5870 and the 5770 are looking much better now. And hell, two 5770s would be A LOT better.
     
    Last edited: Feb 25, 2010
  6. alaneis

    alaneis Guest

    edited by ddp
     
    Last edited by a moderator: Feb 25, 2010
  7. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    Ummm...you can get the heck out of here! LOL! Doesn't binkie mod this thread?
     
  8. ddp

    ddp Moderator Staff Member

    Joined:
    Oct 15, 2004
    Messages:
    39,170
    Likes Received:
    137
    Trophy Points:
    143
    alaneis, lightning struck!!! post edited.
     
  9. sammorris

    sammorris Senior member

    Spotted the HD5830 out on Scan. It's really early days yet, so that price will be inflated - it is, after all, higher than we'd expect.
    It seems to hit GTX275 speeds in certain areas, yet far higher in other titles. Overall, I'd say the HD5830 is better than a GTX275, so it only has to compete with the GTX285 and HD5850 on price. Even at £215 it's still far cheaper than both of them. Remember, this card's position is meant to be between the HD4890 and HD5850, which places it as a direct rival to the GTX285.

    From bit tech:
    Fallout 3
    22" AA: 17% behind HD4890, 7% behind HD5850, 12% above HD5770, 10% above GTX285, 49% above GTX260-216
    24": 8% behind HD5850, 5% behind HD4890, 1% above GTX285, 7% above HD5770, 61% above GTX260-216
    24" AA: 14% behind HD5850, 1% behind HD4890, even with GTX285, 10% above HD5770, 24% above GTX260-216
    30": 17% behind HD5850, 3% behind HD4890, 14% behind GTX285, 5% ahead of GTX260-216, 7% above HD5770
    30" AA: 27% behind HD5850, 13% behind HD4890, 22% behind GTX285, even with GTX260-216, 10% ahead of HD5770
    Overall Scores:
    GTX260-216: 81
    HD5770: 92
    HD5830: 100
    GTX285: 107
    HD4890: 109
    HD5850: 118

    STALKER Clear Sky
    17": 20% ahead of HD5770, 13% ahead of GTX260-216, even with HD4890, 10% behind GTX285, 22% behind HD5850
    22": 26% ahead of GTX260-216, 23% ahead of HD5770, 14% ahead of HD4890, 4% ahead of GTX285, 20% behind HD5850
    24": 25% ahead of GTX260-216, 12% ahead of GTX285, 11% ahead of HD5770, 5% behind HD4890, 16% behind HD5850
    30": 47% ahead of GTX260-216, 18% ahead of GTX285, 17% ahead of HD5770, 11% ahead of HD4890, 15% behind HD5850
    Overall Scores:
    GTX260-216: 79
    HD5770: 85
    GTX285: 95
    HD4890: 96
    HD5830: 100
    HD5850: 122

    I was going to continue before seeing how dismally the HD5830 compares to other cards (in Dawn of War 2, it is vastly inferior to the HD4870 and GTX260). Out of curiosity, I checked GameGPU to see if they've run a test with the new card, and they have:
    Napoleon Total War (2xAA used in all tests)
    15"
    HD5830: 16% behind HD5850, 14% above HD4890, 17% above GTX285, 25% above GTX275, 34% above HD4870, 35% above GTX280, 46% above HD5770
    17"
    HD5830: 17% behind HD5850, 16% above HD4890, 18% above GTX285, 26% above GTX275, 31% above HD4870, 31% above GTX280, 48% above HD5770
    22"
    HD5830: 16% behind HD5850, 12% above GTX285, 16% above HD4890, 19% above GTX275, 24% above GTX280, 30% above HD4870, 50% above HD5770
    24" 16:9/HDTV
    HD5830: 18% behind HD5850, 10% above HD4890, 11% above GTX285, 14% above GTX275, 26% above GTX280, 29% above HD4870, 44% above HD5770

    Everything where it should be. Of course, that's just one benchmark. I'll await some others and scan through some other sites.

    As far as value goes, it's not great. That much is clear. However, that goes almost exclusively for the UK, with the £215 pre-orders. In the US the HD5830 is $240, which I think is a fair price. The HD4890 was $200 before it was axed, and can now no longer be bought.
    The HD5770 is $160, so 66% of the price, and offers typically 55-60% of the performance at best, so that's fine.
    The HD5850 is $300, so 125% of the price, and offers typically 120% of the performance, so that's also fine.

    Whether you like it or not, the HD5830 is suitably priced for how it performs in the GameGPU test. If more games are like that, $240 is the perfect price for the HD5830. If it performs as Bit-tech suggest, we have a bit of a lemon on our hands. Still, could be worse. After all, the GTX285 is an astonishing $390, 62% more expensive than the HD5830 which can take it on, and often beat it.
    The biggest issue here is UK pricing. Right now, UK prices typically run at around 75-77% of the US dollar figure. Thus, a $300 product costs £225-£230.
    With the HD5850 at $300, that's £225-£230. The HD5850 tends to go for a bit more than that here, but stock is no issue in the US, whereas supply is still a little tight in the UK. Look hard and you can get one for £240, which isn't terrible.
    The HD5870 is $400, which is £300-£310. Again, you can find them for that if you're lucky, but it's usually about £10 more.
    Meanwhile the HD5830 is $240, which should be £180-£185. So far £205-£215 seems to be the going rate. Hopefully that will drop after the first batch are sold.
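    Sam's 75-77% rule of thumb is easy to apply to any US launch price. A small sketch (helper name and card list are illustrative; the ratio is from the post above):

```python
# UK sticker price tends to land at roughly 75-77% of the US dollar
# figure (VAT and import margin baked in), per the rule of thumb above.
def expected_uk_price(usd, low=0.75, high=0.77):
    """Return the expected (low, high) UK price band in pounds."""
    return (usd * low, usd * high)

for card, usd in [("HD5850", 300), ("HD5870", 400), ("HD5830", 240)]:
    lo, hi = expected_uk_price(usd)
    print(f"{card}: ${usd} -> £{lo:.0f}-£{hi:.0f}")
# HD5850: $300 -> £225-£231
# HD5870: $400 -> £300-£308
# HD5830: $240 -> £180-£185
```

    Which makes the £205-£215 going rate for the HD5830 look like a £20-£30 early-adopter premium.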
     
  10. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    The price rise for the 5850 has only been slight here. The XFX only increased by $15, and you can still find it at $309 at the lowest. If the price drops when GT300 comes out, I'm buying two immediately.

    I don't think it makes the 5850 any less desirable - only the 5830 a terrible deal. Drop that another $50 and we'll talk.
     
    Last edited: Feb 25, 2010
  11. sammorris

    sammorris Senior member

    As I say, I think it's due to the far better stocks of the HD5850 in the US. The 5800s are only imported in very small batches to the UK, whereas in the US there are large stockpiles.
    I disagree that the HD5830 is a terrible deal. It performs exactly as it should for what it costs - having checked HardOCP, I can verify this. Not sure about the bit-tech test, but presumably something is wrong with some of their results.
    I'd imagine the GTX400 cards will produce a cascading drop in prices once released, but to cut the cost of the midrange stuff like the HD5770 and HD5830, smaller cards will have to come out from nvidia, like a GTS450. If they keep to the previous trend, the GTX480 will be a $500+ HD5870-beater, the GTX470 will be a $400-ish HD5870 rival, and the GTS450, if it appears, should be an HD5850 competitor.
     
  12. shaffaaf

    shaffaaf Regular member

    I mean, fair enough if bit-tech's results are a bit off, but the fact that it sometimes falls below even the 4890, with 1120 SPs, is horrible. And its price... even today I saw 5850s for £10 more, at £225. But ATI have said they are increasing the price of the 5850 over the 5830, instead of the other way around, leaving a gap at the £150-£160 mark.

    This 5 series hasn't been great IMO, and it's all nvidia's fault TBH, with no competition.
     
  13. omegaman7

    omegaman7 Senior member

    I wonder how the GTX 295 fares in those tests. Not that I'm ready to shell out $550, LOL!
     
  14. sammorris

    sammorris Senior member

    Napoleon Total War
    Single graphics cards
    Minimal: Radeon HD4700 series or above, HD5700 series or above, Geforce 8800GTS 512MB/9800GTX/GTS250 or above
    Reduced: Radeon HD4890/HD5800 series or above, Geforce GTX285 or above
    Moderate: Radeon HD5870 or above
    Good: N/A
    Optimal: N/A
    Extreme: N/A

    Dual graphics cards (assumes 80% scaling)
    Minimal: HD3800 series/HD4670/HD2900 Pro Crossfire or above, 8800GTS 640MB/8800GS/9600GSO (G92)
    Reduced: HD4700 series/HD5700 series Crossfire or above, GTS250 SLI or above
    Moderate: HD4870X2 or above, GTX285 SLI or above
    Good: GTX480 SLI (?)
    Optimal: N/A
    Extreme: N/A

    Triple graphics cards (assumes 150% scaling)
    Minimal: HD3690/HD4670/HD2900 Pro TriCF or above
    Reduced: HD3850/HD5670/HD4700 series TriCF or above
    Moderate: HD4770/HD4850/HD5750 TriCF or above, GTX260 Tri-SLI or above
    Good: HD5850 TriCF or above
    Optimal: N/A
    Extreme: N/A

    Quad graphics cards (assumes 220% scaling)
    Minimal: HD2900GT/HD3650/HD4650 QuadCF or above
    Reduced: HD2900XT/HD3850/HD4670 QuadCF or above
    Moderate: HD4700 series/HD5700 series QuadCF or above
    Good: HD4890/HD5830 QuadCF or above
    Optimal: HD5870 QuadCF or above
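    The scaling assumptions in the lists above (80% extra for dual, 150% for triple, 220% for quad) translate into simple performance multipliers over a single card. A small sketch of the implied arithmetic (function names are illustrative):

```python
# A multi-GPU setup with `extra_scaling` gain delivers (1 + gain) times
# the performance of one card, per the assumptions in the lists above.
def effective_multiplier(extra_scaling):
    return 1.0 + extra_scaling

def single_card_needed(target, extra_scaling):
    """Per-card performance needed (in single-card units) for the
    multi-GPU setup to reach `target`."""
    return target / effective_multiplier(extra_scaling)

print(effective_multiplier(0.8))   # dual:   1.8x one card
print(effective_multiplier(1.5))   # triple: 2.5x
print(effective_multiplier(2.2))   # quad:   3.2x

# e.g. to match one top-end card (call it 1.0) with a dual setup,
# each card only needs about 56% of its performance:
print(round(single_card_needed(1.0, 0.8), 2))  # 0.56
```

    That diminishing per-card return is why the quad lists reach so far down the product stack.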





    I disagree about the HD5800 series being poor value. You have to get a sense of perspective. When the HD4870 and HD4850 first turned up, you gained around 50% and 25% extra performance over the 8800GT for around 50% and 20% extra cost respectively.
    The HD5870 is 50% more performance than the GTX285 for 10% extra cost; the HD5850 is 40% more performance than the GTX275 for 10% extra cost; the HD5830 is 35% more performance than the GTX260 for 35% extra cost. The latter isn't as good as the others, but considering that in the US it's actually only 20-25%, it's still a reasonable deal.
    The HD5770 is 15% more performance than the GTS250 for 15% extra cost; the HD5750 is 70% more performance than the GT240 for 40% extra cost; the HD5670 is similar performance to the GT240 for similar cost. The HD5570 is the only incorrectly priced card.
     
  15. shaffaaf

    shaffaaf Regular member

    Compared to the jump from nvidia's 7 series to the 8 series, and then ATI's 3 series to 4 series, these have been poor performance-wise. And you know as well as I do that stock of nvidia GTX200 parts has been abysmal since about September - of course their prices are going to be high, but that doesn't mean ATI has to stop being competitive with itself. Intel has done it for years now.

    The fact that ATI are selling inferior products for a hell of a lot more money is not good - not for fanboys, not for non-fanboys, not for anyone who wants a decent GPU at a decent price.

    I hate that ATI are taking advantage, the way nvidia did back when the 2900 was a piss-poor joke of a GPU. I mean, I like ATI cards, and will probably upgrade to an ATI pair (or single) - I personally haven't had an nvidia card since the 7800GT (though my brother has a 9800GT that I bought for him) - but this is ridiculous, and I can't wait for Fermi to bring GPU prices down. Hell, I might get Fermi.

    With the 5 series, ATI have left a sour taste in my mouth.

    PS: any chance of the Q9550? :p
     
  16. sammorris

    sammorris Senior member

    7 to 8 was a substantial jump, but you can't really compare the gap between the HD3 series and the HD4 series as the HD3 series was a very poor performer, topping out a full 30% behind nvidia's best efforts of a year beforehand.
    Ultimately you have to see it as an economic situation. We shouldn't have to pay anywhere near as much as we do for the current HD5 series, but yet they are priced well given the circumstances. ATI are selling features and efficiency for the extra over their predecessors, because they can.
    To say ATI are taking advantage is true, but this is nowhere near as bad as the gouging from nvidia. The 8800 Ultra was £100 more than the 8800GTX for less than 10% extra performance, simply because ATI couldn't compete back then. ATI are hardly doing that now. Compared to disabling opposition cards in games using code switches, mild price gouging still makes ATI the 'good guy'.
    Fermi isn't going to make up for nvidia's corporate attitude, and you know as well as I do that they're going to be price-gouged to hell if they're competitive. If the GTX480 beats the HD5870 by anything more than an insignificant margin, expect it to cost just a tad less than an HD5970, simply because they can.
    As for the Q9550, maybe. I can't afford a graphics card for this system at the moment (well, not the one I want), so the CPU is sitting unused. I might be tempted to swap you the E5200 with a little extra on top, but I really wanted a quad-core CPU for that system, and the Q9550 E0 is ideal since it can then be tested for overclocking in a single-GPU environment.
     
  17. omegaman7

    omegaman7 Senior member

    Oh man! Fermi Technology sounds interesting.
    http://www.nvidia.com/object/fermi_architecture.html
    I wonder how much an entry level GPU is gonna cost though...
    If it can smoke the 295, something tells me I won't even be lucky enough to dream about it, much less purchase one :p
     
  18. shaffaaf

    shaffaaf Regular member

    Well, let me know if and when you want to. March 5th BC2 should be here, so I'll let you know how the final version benches.

    I agree with that - I'm just so pissed off with the 5830 that it's translating elsewhere, lol.

    Just found out why the 5830 is performing more like a 5770 than a 5850: it's down to a paltry 16 ROPs... the same as the 5770, half that of the 5850, which will and does lead to poor AA/AF results.

    The more I read about this card, the more I dislike it, lol. Hexus, Anand and bit-tech don't seem to like it much either. Well, okay: good card, poor price.
     
  19. sammorris

    sammorris Senior member

    Omega: You have to bear in mind that everything Fermi stands for furthers the performance of GPU computing applications, and not gaming performance. While the GTX480 will be a powerful card in itself, the optimisations of the new technology go toward better CUDA performance, not higher frame rates in games.
    Shaff: The HD5830 does seem a little bit more cut back than usual, but I suppose it is after all the entry level card of the 5800 series. It's worth noting that there are some bugs with the drivers used in initial tests of it, so things could well improve from here.
     
