Hmmm, that RemoveWAT tool sounds handy. Well, I don't know - maybe those other updates, the security things, are good. Maybe some of them even help game performance - but I doubt it. It certainly sounds like a pain to have it detect a mobo change and demand a new license - and then the inevitable phone call to MS to get it all cleared up. LOL

Hey guys, I am super close to taking the plunge and getting a 7970. Now, I know you said GTX 670, Sam. And others say 7950 if AMD. But here's a thought. I don't have an SLI-certified rig for now - but I do have a CF rig, with full 16 lanes of PCI-E simultaneously for two cards, though I know PCI-E lanes don't matter that much. BUT - just suppose I put a GTX 670 or a 7970 in, and on some particular title, however unlikely, I am still somewhat GPU bound - this is of course after overclocking the CPU up to 3.6. So - unlikely as it would be - IF I ended up slightly GPU bound, say on BF3, well then another 7970 would solve that, wouldn't it? That option would not be available if I opted for a GTX 670.

I guess a GTX 670 is $400, same as a 7950, but the 7970 is $480. I'm not too concerned over the $80 - I can eat rice for a while and save $80. Let me ask - for 30" gaming, is there any single title on which a 7970 beats a GTX 670, or do you have to get to Eyefinity with 6 million pixels before you ever beat it? Rich
The 7970 is better than the 7950, and if you're not concerned about the $80, go with the better card. The GTX 670 is a very good card too, so if you prefer NVIDIA then go that way.

The update you should hide and avoid is KB971033, even if you are running legit OSes. I've had that update pooch good copies of Windows, and that is why MS doesn't install it automatically now; you normally have to review the updates and check the box for that one if you want it installed. If an update corrupts your machine, MS won't be able to fix it - you would need to restore an image or re-install. MS can only fix activation issues on a machine that isn't corrupted and doesn't have a physical problem, to address your earlier concern. RemoveWAT is only one way to cheat, and an old one at that; there are better ways to do the same thing.
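(Side note: if anyone would rather script the hiding than click through the Windows Update GUI, here's a minimal sketch using the Windows Update Agent COM API from Python. It assumes Windows 7, an elevated prompt, and the pywin32 package; the ProgID and properties are the standard WUA ones, but treat this as a sketch, not a tested tool - right-clicking the update and choosing "Hide update" does the same thing.)

```python
# Minimal sketch: hide KB971033 programmatically via the Windows Update
# Agent COM API, instead of right-clicking "Hide update" in the GUI.
# Assumes Windows 7, an elevated prompt, and the pywin32 package.
import win32com.client

KB_TO_HIDE = "971033"  # the WAT update discussed above

session = win32com.client.Dispatch("Microsoft.Update.Session")
searcher = session.CreateUpdateSearcher()
result = searcher.Search("IsInstalled=0 and IsHidden=0")

for i in range(result.Updates.Count):
    update = result.Updates.Item(i)
    kb_ids = [update.KBArticleIDs.Item(j)
              for j in range(update.KBArticleIDs.Count)]
    if KB_TO_HIDE in kb_ids:
        update.IsHidden = True  # same effect as "Hide update" in the GUI
        print("Hidden:", update.Title)
```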
Hey gang - Movies - well, I am starting to change my mind about some things. I am looking at CF like a lot of you have been running for a long time - Sam and Jeff, for example - and my Q9450 is not SLI-certified, so that obviously points to an AMD solution. Let me see if you guys think I am on the right track. I haven't done a lot of personal research (I should, lol) but I'm hearing some things. If the GTX 670 or GTX 680 is better than the 7970 for everything except Eyefinity, and that's what I'm hearing, then I assume that while it might be better, the difference is probably not more than 10%, right? For sure it can't be better than a 7950 crossfire solution, which I would guess would be at least 50-60% more powerful for 30" gaming. Do you guys agree?

So I'm looking at EITHER $800 for a CPU upgrade or $800 for a GPU upgrade. But I am already GPU bottlenecked. So my thought is to put the CPU upgrade off until next year and load up with crossfire, if that would allow me to play BF3 at ultra settings, 2560x1600. I would take the GPU upgrade in two steps: start with one card, work with it for a while, and make sure I still have a GPU bottleneck on the most stressful titles. (For example, I could go through the single-player of BF3 like I already did with the 8800 GTX. I would set it to ultra and log GPU load versus CPU load. Hopefully I'll see GPU load at 100% and CPU something significantly less than that - then I would add the second card to eliminate the GPU bottleneck.) After that, the bottleneck would of course shift over to the CPU, but if I can pull 30 fps in BF3 at ultra, I'll move on to strengthening the CPU next year.

What do you guys think? (And for my needs, am I right - the 7950 is just about the same?) Rich
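(Rich's log-GPU-vs-CPU idea is simple to automate; below is a rough sketch of such a logger. psutil and the nvidia-smi query are real tools, but the filename and one-second interval are arbitrary choices, and an AMD card would need a different utilization source, such as GPU-Z's log-to-file.)

```python
# Rough sketch of the bottleneck logger Rich describes: sample CPU and GPU
# utilization once per second to a CSV while the game runs.
# Assumes Python 3 with psutil installed; the GPU query uses nvidia-smi,
# so an AMD card would need another source (e.g. GPU-Z's log-to-file).
import csv
import subprocess
import time

import psutil

def gpu_load_percent():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"])
    return int(out.decode().split()[0])

with open("bottleneck_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time_s", "cpu_pct", "gpu_pct"])
    start = time.time()
    while True:  # stop with Ctrl+C when the play session ends
        cpu = psutil.cpu_percent(interval=1.0)  # average across all cores
        writer.writerow([round(time.time() - start, 1), cpu,
                         gpu_load_percent()])
        f.flush()
```

A sustained run of GPU near 100% with CPU well below it is the GPU-bound signature Rich is looking for.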
You could use your Q9450 with a GTX 690, with which you wouldn't need SLI, and you should have plenty of GPU punch. The card is steep in price, but it would perform better than two lesser cards at about the same price for the pair. And if you want to upgrade your MB/CPU later you could always do so and keep using this card, or even add another if you wanted to play with SLI. The Radeon HD 7970 should perform better spec-wise than the GTX 680, but you could be right that the NVIDIA performs better, maybe? At 2560x1600, resolution won't be an issue for any of these GPUs, so you should be fine. Since you are a gamer I would stick with the higher-end cards and wouldn't go with a 7950 - go with the 680, 690, or 7970 - and I think NVIDIA is a little stronger with its CUDA cores versus AMD's stream processors. Well, those are my thoughts; Sam or Russ may have some good ideas that differ or that I didn't bring up here.

EVGA 04G-P4-2690-KR GeForce GTX 690 4GB 512-bit GDDR5 PCI Express 3.0 x16 HDCP Ready SLI Support Video Card
XFX Double D FX-797A-TDBC Radeon HD 7970 Black Edition 3GB 384-bit GDDR5 PCI Express 3.0 x16 HDCP Ready CrossFireX Support Video Card
Movies, that is an interesting idea - I was seriously thinking of a GTX 690 once I learned that the SLI compatibility issue wouldn't come up. BUT - the very scary idea surfaced that the card might not even boot on my rig. I chased a thread on Tom's Hardware and then posted a question - and that's when the issue of compatibility with older Core 2 platforms surfaced. Then I dug through all the Newegg reviews, and one guy on a newer i7 platform mentioned that he wasn't able to get the card to boot initially, until he made some BIOS adjustments. So not only is the card expensive at about $1000 or more, but IF IT WON'T BOOT on the Core 2 Quad I own, I'm out a $150 restocking fee. On the other hand, I know the 7970 will run, because I found a guy with a Q9550 posting his 7970 3DMark scores. And so I can start with a small $400 chunk and take it in little steps.

I have read some interesting things about the overclocking abilities of the 7950 - it's actually a bit of a cooler-running card than the 7970, and pulls less power, about 32 watts less under load when clocked the same as a 7970. It's volted slightly lower, but you can move that up to 7970 specs, and then it really overclocks like crazy. The reviewer said he thought it was better balanced architecturally - whatever he meant by that - but I'm thinking that as a slightly trimmer version of the 7970, it might be better for my 4-megapixel 30" gaming needs, since 6-megapixel Eyefinity is not what I want to do. (Normally Eyefinity is very widescreen, 5760x1200, but I saw an interesting custom Eyefinity setup on YouTube: three Samsung 1920x1080 monitors set in portrait mode with the bezels removed and a mere 6 mm gap between monitors - that's 3240x1920 - and it looks good on YouTube. He's running it on 7970 CF playing BF3.)

Plus, my Toughpower 750 PSU would more easily handle two of those in CF, pulling a combined total under load of 64 watts less - closer to 220 watts each instead of 250 watts for the 7970 - and I might overclock way past the reviewer. So if the 7970 were getting close to pulling 300 watts on a major overclock, two of them at 600 watts might start to push my 750 PSU, whereas at the same extreme clocks I should see a 60-watt savings from the 7950s. And lastly, there is the money savings - it's $80 cheaper, which is $160 less for the two of them. (I'm referring to the dual-fan Sapphire at $399 with what looks like a good non-reference cooler.) Rich
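(For what it's worth, here's the PSU arithmetic from that post written out; every card wattage is the rough figure quoted above, and the rest-of-system number is an outright guess, so take the headroom values as illustration only.)

```python
# Back-of-the-envelope PSU check for the CF options in the post above.
# Card wattages are the rough figures Rich quotes, not measurements, and
# REST_OF_SYSTEM is a guess covering the CPU, board, drives and fans.
PSU_WATTS = 750
REST_OF_SYSTEM = 180  # assumed: an overclocked quad-core system, minus GPUs

def headroom(card_watts, n_cards=2):
    return PSU_WATTS - (card_watts * n_cards + REST_OF_SYSTEM)

for label, watts in [("7950 stock", 220), ("7970 stock", 252),
                     ("7950 big OC", 270), ("7970 big OC", 300)]:
    print(f"2x {label}: ~{watts * 2 + REST_OF_SYSTEM} W total, "
          f"{headroom(watts):+d} W headroom")
```

On these guesses a stock pair of either card fits, but two heavily overclocked 7970s overshoot the 750 W mark, which is exactly Rich's worry.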
I've dug around some and found some of the complaints, but they seem to be user issues more than true compatibility issues. The one issue that could be due to compatibility is related to the PCI Express 3.0 slot, but that is a backward-compatible spec so it shouldn't be an issue. However, that doesn't mean it can't be a problem, and really it would be more of a driver problem, which is fixable. Most problems I found were with people OC'n, which makes sense. Plus I've looked at the specs from the manufacturer, EVGA, and the card IS PCI Express 2.0 to 3.0 compatible, so that won't be a problem for you - your rig is OK for use with it. SPEC

Running SLI or Crossfire will not give you a big boost in performance, as you run the video cards in a master/slave configuration, which means the second card will only handle about a quarter of the load regardless of having two 16x slots or not. So you will pay more for less. But you will gain power over just having one card, so if you wanted to get one 7970 now and another later when you get a Crossfire rig, that would be OK as well. Keep in mind that OC'n two cards is much more difficult too. That BF3 setup was pretty nice with the two cards, but I wonder how well it worked with just one of the 7970s - probably close to the same. My card is older, so like you I need to upgrade too, and I haven't played with the 7970 yet. As to heat, you typically always have more heat with more power, which is why the 50 runs cooler than the 70 - unless you OC the 50, of course. Hope that helps some, Stevo
Dual-GPU cards still rely on Crossfire and SLI scaling respectively. They are two GPUs on a single PCB, not a single GPU with two cores. The scaling issues will be lessened by the shorter electrical and mechanical linkage between the chips, but to say Crossfire/SLI doesn't apply is wrong. As far as actual SLI compatibility motherboard-wise, most modern boards should be able to use a dual-GPU card problem-free, even if only equipped with a single slot.

Movies, you're almost a decade behind the times. Dual-GPU configurations haven't used master/slave configs for several years - it hasn't been used for 6 or 7 generations of hardware now. The last TRUE master/slave cards were the X1800/X1900 series. Nvidia was even quicker to the draw than that, never actually having had master/slave SLI. Master/slave Crossfire/SLI is positively ancient (centuries in computer terms) and an outdated idea. Both of my video cards are identical; neither one is a master or a slave.

Also, I HAVE NO CLUE where you keep getting these convenient performance numbers you never seem to want to show. I can show you some real performance numbers on a wide variety of games. With a second card my framerates are typically 70-90% higher. Both cards normally hit 80-90% usage per card, and sometimes close to 100%. Far from "a quarter" of the work. ALSO, slot bandwidth matters very little. Most cards will perform identically at as low as 4x, even the very highest-end models. I have numbers and PROOF for this too.

As for OCing two cards being much more difficult - not really. I would know, as I own two video cards and they are overclocked. The only trick is to link the clocks for the two cards. Otherwise, it's exactly the same as OCing anything else: turn the clocks up till they won't test stable, back them down until they do. The only catch is that you're limited by your lowest-clocking card. Again, linking their clocks eliminates the guesswork. When the benchmarks test stable, both cards are stable. It's not any more difficult AT ALL.

Battlefield 3 gets almost perfect 100% scaling. The framerate would be nearly double with a second card. I have proof, both from my own testing and from THOUSANDS of other people. I only emphasize my points as such because I swear we've visited these exact subjects two or three times now...
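(To put the scaling percentages being argued about here in fps terms, a trivial worked example; the 40 fps single-card baseline is invented purely for illustration.)

```python
# What the quoted scaling figures mean in raw fps terms; the 40 fps
# single-card baseline is a made-up example value.
def dual_card_fps(single_fps, scaling):
    """scaling = 1.0 is a perfect 2x; 0.9 means +90% over one card."""
    return single_fps * (1 + scaling)

for s in (0.25, 0.70, 0.90, 1.00):
    print(f"{s:.0%} scaling: 40 fps -> {dual_card_fps(40, s):.0f} fps")
```

At the 25% figure Stevo quotes, a second card buys 10 fps; at Jeff's 90%, it buys 36.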
You're wrong, Jeff! But I'm off on the trade-off - it turns out it's as high as 50% now, so they can share the load equally. There IS a master/slave relationship!
That information is outdated. Crossfire/SLI now uses a tiling method instead of rendering two halves of the screen. The cards don't SHARE a load; they are given two separate loads. The master/slave relationship explained there is being used as a demonstration tool - notice it doesn't mention which hardware generation is being used for its example. YES, one card is going to have a bias simply because you have the monitor plugged into it, but that's as far as the bias goes. Properly coded SLI/Crossfire has both cards at or near 100% load, typically 80-90%. Both cards calculate separately, and almost all inter-communication between the two is done by the CPU and the memory subsystem.

Also, Wikipedia is not the most reliable source for hardware information. Notably, their CPU/GPU lists are missing large chunks, including entire families of hardware.

I will also mention that Nvidia and AMD have taken two entirely different approaches to dual-card, further separating current tech from the master/slave configs of old. SLI is hardware-based and Crossfire is software-based, thus the need for Crossfire Application Profiles. Whatever the relationship may be, master/slave does not exist anymore as it once did. I'm making the distinction because master/slave was an entirely different way of making the hardware connection. It would imply one card has special circuitry allowing it to control the link between the two, making it effectively a "master" card. But that is not the case anymore; both cards are identical.

I'm not trying to argue, I'm trying to set things straight. You might have been misled, or misinterpreted something, or simply understood it differently than I did. I'm all for a bit of rousing hardware debate as long as we can keep it sensible. In this case, the facts are ever so slightly different from what you've read (or seem to be explaining, to my biased ears).
It isn't being used as a demo; that was the actual usage a few years back. Originally it was 25% in 2004; the spec changed to 50%, but even in 2007 systems were still using 25% - I built thousands of them back then. It recently changed to independent control, as you say, which makes sense now that CPUs are tasked harder - which is what Sam has complained about with his game bottlenecks. I haven't played much with dual/quad-card configurations since it was such a poor way to go, but now that they have taken a better approach I can see the benefit, as long as you have a powerful CPU to handle the extra burden. It would still be better to get the fastest card you can for gaming and then, down the road when they come down in price, get a second of the same card to improve your performance even more. That would be the path I would go... Thanks Jeff, Stevo
I couldn't agree more. Unless you actually need two cards right away to make the upgrade worthwhile (as was the case with myself), one higher-performance card is far and away the superior option. And, as you said, price drops make Crossfire a more viable option later.
Hey Stevo and Jeff and Kevin, wow, that turned into a nice little discussion! LOL

Stevo, thanks for the information. On the specifics of crossfire, I think we have no choice but to sit back and defer to Jeff's expertise. Along with Sam, he's the man on crossfire. He's accumulated a great deal of personal experience over the last several years, and he knows his hardware - especially if it is hardware that he is running and working with every day. (Not only hardware - the way Jeff digs in and discovers the patches and mods to tweak a particular game to max eye-candy perfection is inspiring.)

Jeff, I keep thinking of you every time I think of crossfire 7950 instead of 7970. You did the same thing - which family was it, 6000, 5000, 4000? It's been a while, but I think I recall that you upgraded more recently, is that right? You have had spectacular results with 2.3-megapixel gaming, running everything at full settings, while I have watched from the sidelines for three years with my 4-megapixel requirement. But I wouldn't trade my big 30" Dell for anything - it's the best thing Sam ever suckered me into. At 4 megapixels, though, I have had to forget about running anything with max settings unless I was willing to remain financially on the bleeding edge of technology, a step or two behind Sam. I watched his frustration with quad CF on the 4000 family, and I thought "Go Sam go!" He went through a lot running those two 4870x2 cards - on the verge of quitting just before he figured it out. Whew! Better him than me! That was painful to watch. At that time he said it would be 3 more generations before we could play Crysis at Very High, and he was right.

Thank god Asus hardware can be quirky at times, and that the P5E acted up and reset the SATA drive spec in the BIOS, lol, so at the beginning of 2010, after only two weeks of use, it wouldn't boot up for Miles the modeler/animator. They shipped him a 1366 i7 instead. The Sonata sat in his garage for 9 months until he said to me one day, "Hey, I've got this computer that doesn't work - you want to see if you can get it running? If so, you can use it for a while." It's been almost two years. That Q9450 and 8800 GTX churned out 13,500 3DMark06 points, way above the 6,000 I was getting from my P4 and 3850, and moved me into some serious gaming - but not serious like you.

So here I am, ready to move past the GPU bottleneck. I want to join you guys out there on the BF3 battleground, but you have raved about ultra textures, so that's what I want too, lol. I never thought I would go for anything but the Cadillac of a particular family, meaning the 7970, but more and more I am thinking that the 7950 is a slightly trimmer card, better suited to my "mid-range" 4-megapixel 30" needs - halfway between 2.3 megapixels on 24" gaming and 6-7 megapixels for Eyefinity. At identical clocks it pulls 32 watts less under load, 220 vs 252, about 12% less, which matches the 12% fewer stream processors. I have the sneaking suspicion that, as a cooler chip without "the baggage" of those extra processors, it might actually overclock better than the 7970. The review suggested as much. At identical clocks it seems to deliver only about 2-3% lower fps on demanding titles, and I could probably make that up with a slightly higher clock - equal the 7970's performance while still using less wattage. What I am saying is that the extra processors on the 7970 are not really needed for my mid-range 4-megapixel load, and only get in the way by adding wattage and heat.
How's that for a radical theory!! Hahahaha Rich
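(A quick back-of-the-envelope check on that theory, using only the numbers Rich quotes; the 925 MHz figure is the 7970's reference clock, and fps is assumed to scale roughly linearly with core clock, which is an approximation at best.)

```python
# Sanity check of Rich's 7950-vs-7970 theory, using only the numbers from
# the post. Assumes fps scales roughly linearly with core clock when GPU
# bound, which is only an approximation.
BASE_CLOCK = 925            # MHz, the 7970's reference clock
FPS_DEFICIT = 0.03          # 7950 ~3% slower at identical clocks (per post)
P_7950, P_7970 = 220, 252   # W under load at identical clocks (per post)

matching_clock = BASE_CLOCK / (1 - FPS_DEFICIT)
print(f"7950 clock needed to match a {BASE_CLOCK} MHz 7970: "
      f"~{matching_clock:.0f} MHz")
print(f"Power saved per card at identical clocks: {P_7970 - P_7950} W")
print(f"Saved across a CF pair: {2 * (P_7970 - P_7950)} W")
```

So a roughly 3% clock bump - about 954 MHz against a 925 MHz 7970 - would notionally close the gap while keeping the ~32 W per-card savings, though real scaling is rarely perfectly linear.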
Much like some other 'debates' we've had, I suspect the experience Stevo is drawing on is a personal one rather than a factual one. We know from experience how crossfire works because we use it. Stevo is going by his beliefs, which, as we know, supersede proven evidence. I'm not really getting into the debate; it's been a long and difficult month of work, thus I haven't been around to post often, though it does amuse me to see the heated arguments flare up even in my absence, which goes some way to dispelling the commonly held belief that I'm the catalyst for all the negative conversations around here. To add some worth to this post in a succinct manner, here are some facts:

1. The GTX 690, as a dual-GPU card, does not require an SLI-licensed chipset/motherboard to operate. Being a modern GeForce, however, it is potentially liable to conflict with older LGA775-based boards, which Radeons are not.

2. The GTX 690 is overpriced. At $1100 where you can find one (1-month-plus waiting list at the moment), it's fully twice the cost of the GTX 680, which is itself massively overpriced compared to the near-identical GTX 670, a comparative bargain.

3. Crossfire scaling, in all properly supported titles, is 85-95%. That's a given nowadays; if you get less, it's a bug, and it may or may not be fixed, depending on the title. SLI scaling falls into the same region, but toward the lower end rather than the higher end - CF typically takes a 5-10% lead in scaling. However, SLI is more reliable (fewer games bug out with low/no scaling) and has a shorter lead time (typically nil to 3 weeks, versus 2 to 24 weeks with crossfire).

4. Power consumption, performance per watt, etc. are out the window this gen - both camps are basically on an even footing here, so that argument is, to the delight of nvidia fans throughout the world, history.

5. This statement still holds true: it is better to buy one single-GPU card that is the performance-equal or performance-approximate of an existing dual-GPU configuration, even if it costs more. Single GPUs are simply better, when they can provide enough power.

6. The new generation of hardware improves things, but it does not allow, nor will future generations for some time allow, every modern title to be maxed out at 2560x1600.

7. PCI Express bandwidth remains a non-issue for any modern graphics card (dual-GPU cards excluded) all the way down to 4x inclusive. Do not try to operate a dual-GPU card on a 4x slot, or a single card in a 1x-operative slot, but anything else goes. There were no tangible effects proven when running two HD5970s or HD6990s in a dual-8x system - providing 4x per GPU - nor when testing four HD5870s in 4x slots each.

8. PCI Express 2 & 3 backwards compatibility will not cause any problems, other than the potential stumbling block with nvidia cards and old chipsets already stated.

9. RemoveWAT may be old, and perhaps there are better methods - but it's reliable, requires no install or registry edits, and works with one simple click. It's as easy as it gets.
Shouldn't take much to sucker anybody into a wonderful monitor like the Dell IPS monitors! I love my 24" Dell, but I sure do want the 30" now. I honestly wouldn't mind a monitor with even more resolution, and larger. Of course, if it were available now (and it probably is), it would be crazy expensive.
The best Dell monitor out there really is the U2711 - the pixel pitch is superb and it's half the price of the 30" despite having 90% of the pixels and being 80% of the size.

Omega: there are a few UHD monitors out there. The IBM T221, at 3840x2400 and 22" (yep, that pixel pitch), dates back to 2002, the early days of LCDs. If you think we're making progress with LCDs, these make you think again! Around $20,000 when new, they go refurbished on eBay for $1000-$1200 ish, but you need a special video adapter to run them as they use a proprietary interface format - they're also limited to a 48Hz refresh rate and have very low contrast, so they're no gaming monitors. On the more contemporary side, the EIZO FDH3601 was released last year: at 36" and 4096x2160 it's near-panoramic (256:135, or about 16:8.4), it takes DVI (well, two dual-link ones to be precise) and runs the full 60Hz. Current pricing is around the $30,000-$35,000 mark.
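(The figures in that comparison are easy to verify; here's the arithmetic, using the published panel specs - the T221's diagonal is taken as 22.2".)

```python
# The pixel-count and pixel-pitch arithmetic behind the comparison above.
import math

def ppi(w_px, h_px, diag_in):
    return math.hypot(w_px, h_px) / diag_in

monitors = [
    ("Dell U2711, 27\"", 2560, 1440, 27.0),
    ("Dell 30\"",        2560, 1600, 30.0),
    ("IBM T221, 22\"",   3840, 2400, 22.2),
]
for name, w, h, d in monitors:
    print(f"{name}: {w * h / 1e6:.1f} MP, {ppi(w, h, d):.0f} ppi")
```

That works out to roughly 3.7 vs 4.1 megapixels (about 90%) and 109 vs 101 ppi for the U2711 against the 30", with the T221 at an absurd ~204 ppi.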
Yes, at half the price, it's a better bargain - if one prefers the 16:9 aspect ratio, that is. I prefer 16:10 myself.
Which one are you drooling over, the one at $30,000???? Kevin, you've been saying 30" forever. Just go get one. I paid $850 for mine. The way you fry your cards and then just go out and toss $350 at another, with one day's thought, like it was 50 cents - you have the money, I know! Even Jeff wants a 30" - he just won't admit it. He'll probably be the first on Eyefinity - he'll just go straight to the 6-7 megapixels. Look Jeff, this is you on BF3.

Sam, that earlier fact layout was indeed succinct and brilliant. I see that you agree that I run a sizable risk that the enormous and expensive monster wouldn't even boot on my LGA775 - and besides, they aren't even available. (I prefer to let you take those kinds of risks, lol.) I particularly enjoyed this part: "..........though it does amuse me to see the heated arguments flare up even in my absence, which goes some way to dispelling the commonly held belief that I'm the catalyst for all the negative conversations around here." Hahahahaha. Have you and Russ and Stevo been slugging it out? Where have I been - sounds like fun. (I bet Shaff and DDP mix it up on this thread too!)

----------------------

But Sam, you didn't comment on my radical theory: that for my "mid-range mere 4-megapixel needs" the full 7970 is a tad bloated and power-hungry, not to mention costly, and I suspect that with a 7950 I can equal 7970 performance full well - EVERY BIT IDENTICAL FPS - while consuming less power. Still just a theory. One must be modest about these kinds of things, but e=mc squared comes to mind. Rich