Video Card Thread (Mostly GeCube X1950XT)

Discussion in 'PC hardware help' started by Waymon3X6, Jun 28, 2007.

  1. sammorris (Senior member)

    He probably means HDCP, which means it won't be able to play HD DVD or Blu-ray discs on the PC. That's it really; if you want HDCP you need to go with a GeForce 8000 or Radeon HD 2000 series card.
    As for your CPU, I couldn't say. My XP 3000+ with an X800 Pro didn't fare too well; it was only when I got my X2 4200+ that HD became a reality. I would imagine an X800 Pro should be powerful enough for HD playback, so I'm guessing it was the CPU...
    That "minimal CPU usage" claim is BS; WMV-HD is the most demanding of all HD codecs on my PC. The DivX, H.264 and x264 codecs run OK, but with WMV-HD it's working flat out.
    As for Rainbow Six Vegas, my X1900XT bests all the AGP cards available, and I have to play it on object dynamic lighting, not full, or turn the resolution right the way down to 1024x768 for it to be playable. I don't think anything short of an 8800 can max Vegas.

    Move away from the MHz idea. The lower reading is because you're looking at it in 2D mode; the cards underclock themselves at the Windows desktop to run cooler, and only bring up the high clock speeds when rendering 3D.
    MHz isn't really a good way of determining performance: the 9600 Pro had a clock speed in the same ballpark as a 7800GTX, but it was a fraction of the speed. It's all down to architecture, hence how my 3GHz Core 2 Duo minces a 3GHz Pentium 4, or even a Pentium D.
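
    To put rough numbers on that, here's a minimal sketch; the specs are approximate launch figures from memory, so treat them as illustrative only. Multiply the core clock by the number of pixel pipelines and the 7800GTX is miles ahead despite a similar MHz figure:

    ```python
    # Rough pixel fill-rate comparison: clock speed alone vs clock x pipelines.
    # Specs are approximate launch figures from memory, for illustration only.
    cards = {
        "Radeon 9600 Pro": {"core_mhz": 400, "pixel_pipes": 4},
        "GeForce 7800 GTX": {"core_mhz": 430, "pixel_pipes": 16},  # 16 ROPs
    }

    for name, spec in cards.items():
        gpix = spec["core_mhz"] * spec["pixel_pipes"] / 1000  # Gpixels/s
        print(f"{name}: {spec['core_mhz']}MHz core, ~{gpix:.1f} Gpixel/s")
    ```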

    No such site exists, because there are too many variables; you can only make an estimate based on your knowledge of the technology in the field.
     
  2. DInc (Regular member)

    OK, but it shows both the current clock and the requested clock in the Overdrive menu.
    The current one is obviously what it's running at right now; the requested one, I guess, would be the speed in 3D mode.
    But that one only goes up to 702MHz, while the Core Requested reading shows the correct speed.

    But ahm, it's weird that a card like the 7950 couldn't play RSV at higher settings, not necessarily maxed out with the resolution and all.
    Because RSV, although it looks really good, is from, what, mid 2006? And the 7950, I just read, is from September '06.
    Of course I know it's AGP as well...


    About one of those "test websites":

    I really like systemrequirementslab.com.
    But when it checks your video card, it only detects how much RAM the card has,
    and based on that it says whether your "video department" is good enough to run the specific game.
    I mean, ARGH, that's such a beginner's mistake, right?

    They should just make it check the speed or other details of your GPU and see if it can run the game.
    I don't know if that's not possible or they just don't know how, but they should make it possible. :)
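
    Something like this is what I mean; a hypothetical sketch of the check (the model names, tier numbers and thresholds are all made up for illustration):

    ```python
    # Hypothetical sketch: rate a card by GPU model tier, not just by VRAM.
    # Tier values are invented for illustration, not real benchmark data.
    GPU_TIERS = {
        "GeForce 7300 LE": 1,
        "Radeon X1950 XT": 6,
        "GeForce 8800 GTS": 8,
    }

    def meets_requirement(gpu: str, vram_mb: int,
                          min_tier: int, min_vram_mb: int) -> bool:
        """Pass only if BOTH the model tier and the VRAM qualify."""
        tier = GPU_TIERS.get(gpu, 0)  # unknown cards fail the tier check
        return tier >= min_tier and vram_mb >= min_vram_mb

    # A 512MB 7300 LE fails where a 320MB 8800 GTS passes:
    print(meets_requirement("GeForce 7300 LE", 512, 5, 256))   # False
    print(meets_requirement("GeForce 8800 GTS", 320, 5, 256))  # True
    ```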
     
  3. sammorris (Senior member)

    Yes, but you have to bear in mind the 'performance ambition' coded into top titles like Vegas. Games are designed so that they can't run on max settings on ANY hardware available at the time of release. This means that when hardware does come along that can do that, people play the game again at its top settings, extending the shelf life of the game. It's sneaky, but it's been going on for quite a while.

    As for systemrequirementslab, you're correct: a 7300LE with 512MB of RAM (half of which is probably coming from your system RAM) is not going to be a match for an 8800GTS with only 320MB.

    Game manufacturers usually use a hierarchy system, i.e. "Radeon X700 series or better", though this usually means that any higher number will work - so an X1300 should - which is a bit misleading, because some X1300s are slower than an X700.
     
  4. DInc (Regular member)

    Yeah, it can be confusing, but I bet they don't do it on purpose.
    The GPUs just happen to get the model numbers they get, in a certain order, right...

    But I know what you mean about games not even being able to run maxed out at release.
    I've been thinking about that, and actually found it messed up.
    But I also find it a good thing: like you said, shelf life; the game also gets the chance to look good for a longer time.

    That's especially handy for me, since almost 10 years ago I already thought games would look the way they do today, or even better.
    I mean, I actually thought games back then, like Grand Prix 2, would look great, and that I just didn't have a good enough computer yet.
    I actually had a vision, 10 years ago, of the realistic 3D games of today, or those still to come.
    I guess 3D MOVIES and TRAILERS misled me, but they also inspired my fantasies lol.

    But you have to know that I'm really into quality.
    I mean, I'm really bothered by any flaws, even though I've loosened up about things over time.
    Blurriness and jagged lines are just the worst things in graphics.
    The same goes for blurry audio or any other flaws like stutters, which also means the graphics performance needs to be stutter-free.

    Etc. etc...

    BUT, I guess I could just try the card; I've seen a store with the exact same price (as the GeCube), and even the shipping costs are the same.
    If it's too disappointing I guess I could just return it.
    And maybe then hold off on gaming and concentrate on my music career lol.
    Damn, I'm gonna live on the streetz...
     
    Last edited: Aug 6, 2007
  5. sammorris (Senior member)

    Hehe, you sound like me: an eye for detail, and any blemishes aren't satisfactory. Put it this way: even at 2560x1600 in games, if I can turn anti-aliasing on, I do!
     
  6. DInc (Regular member)

    Hehe yeah, if it were all about quality, me too.
    But it's about performance too; you won't see me smiling at shiny, sparkly, super-tight images running at 1 frame per 10 seconds lol.

    When you're playing a hectic FPS, where you have to concentrate almost every second anyway, you won't notice some flaws.
    I'm rather afraid of getting killed any second, you know, "battlefield awareness".
    But still, good images make the experience more realistic, and the flaws are quite disappointing to look at.

    I think putting the resolution one or two steps up smooths out the rough edges more than putting AA one step up.
    You know, for performance's sake...

    Oh well, if I get rich and famous in the next 10 years, I'll also buy YOU a monster computer to your own specs lol.
    I don't care about money really; the pain is just that you need it :p .
     
  7. sammorris (Senior member)

    Hmm, upping the resolution by two scales has more of a hit on performance than using anti-aliasing.
    For example:

    Battlefield 2142 run on an ATI X1900XT:

    1024x768: 70fps
    1024x768 with AA and AF: 55fps
    1600x1200 with AA and AF: 33.6fps

    AA and AF caused a 21% drop in performance at 1024x768.
    Once they were on, upping the resolution by two levels caused a drop of 39%.
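
    The percentages follow straight from those frame rates; here's the same arithmetic as a quick snippet, using the figures above:

    ```python
    # Recompute the quoted performance drops from the frame rates above.
    def pct_drop(before: float, after: float) -> float:
        return (before - after) / before * 100

    print(f"AA/AF at 1024x768: {pct_drop(70, 55):.0f}% drop")                  # 21%
    print(f"1024x768 -> 1600x1200, AA/AF on: {pct_drop(55, 33.6):.0f}% drop")  # 39%
    ```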
     
  8. Waymon3X6 (Regular member)

    Wow, you guys wrote a lot since the last time I was here!

    Anyway, before, when I was searching for a good AGP card, I found the Gainward Bliss 7800+ (the plus is important) and that's what I was going to get for a while, until I found the GeCube, which was clocked faster, so I thought it would be a faster card... Guess not...

    Well, on Newegg everyone complained about the cooling, blah blah blah, but after I did that cooling mod, my temps right now are 35.63C core and 48C ambient. (This was even without the fan, since it broke.)

    But yeah, I would go with the Gainward... I guess you could try the XFX card, but, well, do what you want... (I'm still a Gainward fan from a previous card I had.)
     
  9. sammorris (Senior member)

    Now you see the problem I have when I go away for a week!
    Google Mail - Inbox (37), and about 30 of them'll be Afterdawn thread updates!
     
  10. DInc (Regular member)

    I also like to use combinations of different settings.

    But Waymon:

    Even though the cooler doesn't seem right for a card like this, it's not that bad.
    So yeah, you're right that the complaints aren't needed, except that something's up at GeCube with how the coolers are fitted.
    It's either the mounting pressure, which you can fix with that mod we both did, or else the cooler isn't mounted straight or flat, like I found.

    People just don't know this is the problem.
    But on the other hand, they have a right to complain, because a product should work out of the box.

    I wouldn't be so sure about Gainward, since that's again one of those brands I'd never heard of.
    I'd rather go with Sapphire, BFG, EVGA, XFX, things like that.
    But hey, if it works, that's always good.

    Be glad you don't have too much e-mail; my box has over 600 in it right now.
    That used to be around 1000 when I took a break from the computer, or from being online anyway.

    Oh, and not to forget ASUS, which my 6800XT was made by.
    I was pretty pleased with it, except that it did something which is a long story again.
    And I also got it late... but that's because I hadn't dug into the details of GPUs yet.
     
    Last edited: Aug 6, 2007
  11. sammorris (Senior member)

    Surely most of those are spam?
     
  12. DInc (Regular member)

    I just saw that I actually have 1670 messages; the other numbers I mentioned were unread ones :S . *help... me...*

    But ahm, not much spam, no; a lot of subscriptions and also just saved personal messages.
    And communications with whatever, stores, other things.

    I do get the typical spam things, like Rolex replicas, get laid tonight, your new 2007 car, your free tickets, a free trip to Oprah (LMAO).
    I've put most of it into "unwanted", but sometimes there's less of it, and sometimes more.
    Now it seems to be less again, just one to three a week :p .
     
  13. sammorris (Senior member)

    Archived, I must have tens of thousands of messages, but in the inbox, only a couple.
     
  14. DInc (Regular member)

    Hey, what do you mean, archived?
     
  15. sammorris (Senior member)

    Google Mail allows you to archive (store) up to 2GB of e-mail for reference, if you ever want to look at it later. You only need to keep in your inbox what you'll look at soon, and since the mail search is a Google search, it's quick and easy to find what you're looking for.
     
  16. MaccerM (Regular member)

    Me and my Mistakes!

    I recently bought one of these GeCube X1950XT cards and I thought I'd share my experience, to hopefully help some peeps make a more informed decision.
    My previous rig was a venerable old Socket 478 P4 3.2 with 2GB of DDR400 and an XFX 7800GS. It would play FEAR quite sweetly at 1024x768 maxed, and 3DMark06 gave me 3308 with the CPU @ 3.44GHz and the GPU at 490MHz vs 440MHz stock. Not bad for barebones that were 3 years old, but BF2 frame rates at 1280x1024 were around 25 on big maps, and the stutters meant I was getting creamed every time 10-plus people were on my screen. CM DiRT was even worse: 13-18fps @ 800x600 on medium.
    Anyway, I saw the X1950XT and thought: this is the upgrade to make it last just a little longer...
    Got the card, and my first subsequent purchase was a new PSU. GeCube recommends 450W+, but even a mid-range 500W is not enough; you need a decent 700W+.
    New PSU installed, and 3DMark06 is up to 4022. Great, a 20% increase in the bag, it was worth it! No.
    Real game performance was pretty much the same: FEAR would run at 1280x1024, but BF2 was the same, and in fact DiRT was slightly worse!
    Having bought the card second-hand, I had no real way to return it without it costing me, so I embarked on my most ludicrous upgrade path ever (as you can see from my sig).
    Now, with a Core 2, my 3DMark is up to 5700, BF2 runs maxed at 60-90fps, and I get the 30-odd fps I needed in DiRT. However, playing both these games is a 'side panel off' affair, as I too have overheating problems with this card, even after the tape mod. The engineers at GeCube definitely had their thumbs up their bums when putting this cooler on!

    The Moral of my Story:

    If you want the fastest AGP card, this is it. However, if you actually want it to run like the fastest, you'll need a top-end Athlon 64 or a Core 2 (the latter is pretty pointless - buying a new motherboard with AGP!? Just get a PCIe card!).
    If you buy this, you will also need to be prepared to buy and fit an aftermarket cooler (I think I'll go for the Thermalright HR-03 and an extra pack of ramsinks for the VRMs and the bridge chip). You'll also have to spend on case airflow and probably a new PSU.
    It is a great card, but a great deal of effort to get going, and if you have an old P4 or a lower-end Athlon, I'd suggest an Nvidia 7950 - much less painful!
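
    In hindsight, the pattern is the classic bottleneck: your frame rate is capped by whichever of the CPU and GPU is slower. A toy sketch of that (the fps ceilings below are made-up illustrations, not measurements from my system):

    ```python
    # Toy bottleneck model: frame rate is capped by the slower of CPU and GPU.
    # All the fps ceilings below are made up purely for illustration.
    def effective_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
        return min(cpu_fps_cap, gpu_fps_cap)

    p4_cap, core2_cap = 28.0, 90.0      # hypothetical CPU frame-rate ceilings
    gs7800_cap, x1950_cap = 25.0, 85.0  # hypothetical GPU frame-rate ceilings

    print(effective_fps(p4_cap, gs7800_cap))    # 25.0: GPU-limited
    print(effective_fps(p4_cap, x1950_cap))     # 28.0: new GPU, barely faster
    print(effective_fps(core2_cap, x1950_cap))  # 85.0: CPU upgrade unlocks it
    ```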
     
  17. Waymon3X6 (Regular member)

    Too bad I've got a Socket 478 mobo, so no C2D for me... :(

    Well, is it the processor then? I don't think my usage goes to 100% when I play BF2, though...
     
  18. sammorris (Senior member)

    You obviously had a bad power supply; you can run an X1950XT on a 350W unit if it's good enough.
    A 500W PSU can run a quad core and an 8800GTX; a 600W PSU can run an overclocked quad core and two 8800GTXs in SLI. You just have to get one that's made properly, and actually puts out as many watts as it says it does.
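
    As a rough sanity check, you can just add up ballpark component draws and leave some headroom; the wattages below are illustrative estimates for 2007-era parts, not measured figures:

    ```python
    # Ballpark PSU sizing: sum estimated component draws, keep ~15% headroom.
    # All wattages are rough illustrative estimates, not measurements.
    system_draw_watts = {
        "X1950XT under load": 110,
        "CPU (P4/Athlon 64)": 90,
        "motherboard + RAM": 50,
        "drives + fans": 40,
    }

    load = sum(system_draw_watts.values())
    print(f"Estimated load: {load}W")                          # 290W
    print(f"OK on a quality 350W unit: {load <= 350 * 0.85}")  # True
    ```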
     
  19. Waymon3X6 (Regular member)

    Ran 3DMark06 again and got 3871... Is that underperforming for my system?

    SM2 Score: 1631
    HDR/SM3 Score: 1881
    CPU Score: 927
     
  20. sammorris (Senior member)

    Well, with that graphics card I would expect about 5000, but then my CPU score is 2500, so I suppose that's not too bad.
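
    For reference, your sub-scores do reproduce that 3871 under the overall-score weighting Futuremark published for 3DMark06, if I've remembered it right: the graphics score is the average of the SM2 and HDR/SM3 scores, combined with the CPU score in a weighted harmonic mean.

    ```python
    # 3DMark06 overall score from its sub-scores, using the weighting
    # Futuremark published (as I recall it): a weighted harmonic mean.
    sm2, hdr_sm3, cpu = 1631, 1881, 927

    gs = (sm2 + hdr_sm3) / 2                      # graphics score
    overall = 2.5 / ((1.7 / gs + 0.3 / cpu) / 2)  # weighted harmonic combination
    print(round(overall))  # 3871, matching the reported total
    ```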
     
