The same can be said for Crossfire, unfortunately :S Most stuff runs wonderfully with Crossfire, but other stuff actually gets negative scaling, GUI issues, or something of that nature. Luckily most of the issues come from older games that don't need Crossfire at all. My cards were worth the price at the time, as I was able to sell both 5850s and make up the difference easily. The cards are no doubt powerful, though admittedly 1GB of memory has become a limitation in some titles... It's getting to be time for me to upgrade, but nothing really sticks out to me as a spectacular deal. A pair of second-hand 6970s might be attainable...
Sam, the HD-4670 I have is the DDR3 version, not one of the crappy DDR2 ones, and there is a big difference in performance. Still, the GTX-550's 970MHz core is 370MHz faster, and its effective memory speed is more than 4 times that of the HD-4670. With 320 stream processors the HD-4670 is a decent enough card, but the GTX-550 should easily be twice as fast or more. I doubt very much that 3DMark is weighted towards Nvidia though. I mean, why would it be? The rivalry is good for the industry, and the benefits ultimately land in hard-core gamers' hands months before introduction, and we're off to the next tournament! I think they would be nuts to change anything about the formula they use so successfully! JMHO! Best Regards, Russ
I can prove about a thousand times over that 3DMark is extremely Nvidia-biased... 3DMark scores are basically meaningless when comparing AMD vs Nvidia. My 8800GTS gets consistently higher scores than my individual 6850s, and anyone who has gamed in the last several years can tell you which card is faster... It also got better scores than my HD4870s: about 16,000 for the 8800GTS vs about 12,500 for a single 4870. Again, the AMD card is far superior in real-world gaming. It's basically a widely known fact of PC hardware performance that every 3DMark ever made is heavily biased.
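To put numbers on that mismatch, here's a tiny sketch. The 3DMark scores are the ones quoted above; the `avg_fps` figures are purely hypothetical placeholders (not measurements), used only to illustrate how a synthetic-benchmark lead and a real-world deficit can coexist:

```python
# 3DMark scores as quoted above; avg_fps values are HYPOTHETICAL
# placeholders, not measurements, just to illustrate the mismatch.
cards = {
    "8800GTS": {"3dmark": 16000, "avg_fps": 35},
    "HD4870":  {"3dmark": 12500, "avg_fps": 55},
}

score_ratio = cards["8800GTS"]["3dmark"] / cards["HD4870"]["3dmark"]
fps_ratio   = cards["8800GTS"]["avg_fps"] / cards["HD4870"]["avg_fps"]

# The 8800GTS "wins" the synthetic by 28% while losing the (hypothetical)
# real-world comparison, which is exactly the complaint being made.
print(f"synthetic lead: {score_ratio:.2f}x, real-world: {fps_ratio:.2f}x")
```

The point is only that the two ratios can point in opposite directions, so a synthetic score by itself doesn't settle which card games faster.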
The world of marketing & benchmarks. You're proving my point, the very one you've argued against many times! LOL
Yes, but my benchmark results come directly from home users who have proven the differences. My numbers are from my own machines.
Bla-bla-bla. You've said experience isn't important, and now, when it suits you, it is. Again, LOL. There are stats, and even a few benchmarks, that if used properly mean something, but like most marketing these days they are abused or even lied about. And you are supporting that WHEN IT SUITS YOU. I just love it!
Not really, because 3DMark is what's known as a 'synthetic' benchmark, i.e. an application designed solely for benchmark testing, instead of a real-world application that we monitor to produce benchmark results, which is what we use. Russ: it was worst with the GeForce FX series. They were by far the worst performers of that generation, yet in 3DMark, and only in 3DMark, they were streets ahead. That's been the case ever since, really, e.g. 3DMark Vantage had PhysX (an Nvidia-owned feature) on by default, which ran an additional test to give you about a 20% score boost versus AMD.
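As a sketch of how that kind of default can skew a composite score (the sub-test numbers are made up and the plain-average composite is an assumption for illustration, not Futuremark's actual formula):

```python
# Hypothetical sub-test scores and a toy composite, just to show the
# mechanism: an extra vendor-only sub-test inflates the overall score.

def composite(scores):
    """Toy composite: plain average of sub-test scores (NOT 3DMark's formula)."""
    return sum(scores) / len(scores)

common = [5000, 5200, 4800]            # sub-tests both vendors can run

score_without_physx = composite(common)           # e.g. an AMD card
score_with_physx    = composite(common + [9000])  # same tests + a PhysX test

boost = score_with_physx / score_without_physx - 1
print(f"{boost:.0%} boost from the extra test alone")
```

With these illustrative numbers the extra sub-test alone lifts the composite by 20%, without the cards differing at all on the tests they both run, which is the shape of the complaint about Vantage's PhysX default.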
leaving opinion out of this, let me try and translate. moovieman said before that with intel vs amd cpus, benchmarks mean nothing and it's just marketing... somehow that means the same is true for nvidia vs amd gpus, and you twist facts the way you want to support your story. remember, when judging the difference in benchmarks, ignore the whole biased vs unbiased argument, don't take that into account, it's all just marketing, all just a hoax, and if we go much further, IT'S TURTLES ALL THE WAY DOWN!
Assuming the science is sound and carried out by unbiased professionals, I'll take science over anecdotal evidence. Stevo's method is the opposite of that. Simple as. The argument arose because 3DMark was described as a BS benchmark, which it is, really. Dismissing all benchmarks because of that is a bit like saying that because people have released biased or flawed scientific findings in the past, all science is false.
forget pi calculations and stress tests, my benchmark from here on out will be "does it seem fast when i power on the pc and surf www.grannytranny.com?" i will ignore network speeds and only take into account how fast it feels. humans are flawed; humans interpret science; therefore science is flawed.
GTX 550 Ti SLI is a much better value than the similarly performing GTX 570, but the Radeon 7850 is a similar performer at about the same price as dual GTX 550 Tis (it can actually be somewhat cheaper) and uses far less power. I wouldn't suggest buying a GTX 570. Heck, I'd take the 480 over the 570 strictly because of the memory and price advantage if I had to choose between them. Newegg has had a 480 for about $200 for a while now, and although they use a lot of power at stock, they can undervolt like crazy, and with an aftermarket cooler they can overclock incredibly well (the 7850 can too, but a good one doesn't need an aftermarket cooler because it uses much less power). My point is that there are better solutions than GTX 550 Ti SLI for about the same performance and price.
Of those options I think I'd take the HD 7850, or perhaps the GTX 660, if it's out? I haven't checked its performance, but either way a modern card would be far better for the job, even if not necessarily as cheap. SLI, of course, can be used just as an experiment.
Even if SLI/CF is wanted, GTX 560 SEs or Radeon 7770s would be better choices than the 550 Ti, because they offer greater performance within the 550 Ti's minimum price range.
True, but you have to consider the caveat with dual-graphics plans: they take into account a card the buyer already owns. In this case I believe Russ already owns the 550?
Changing the subject, has anyone played with the new Android 4 OSes, and if so, which one is better? Jelly Bean? Ice Cream Sandwich?... I'm probably going to buy my daughter a tablet for her birthday, and I haven't played with them for a couple of years now, so I'm out of tune with the new rigs. Generally speaking, I like the ASUS, Google/ASUS, and Samsung 10" models. The Google one has a couple of drawbacks, like no rear camera or SD slot, but the OS sounds sweet, so I'm torn on that one.
The Nexus 7 is good value, but if you want a 10" tablet then look at the Samsung Galaxy Tab 2. It depends on what you're using it for, IMO: obviously a 10" tablet is better for media/films, but you could happily read/browse etc. on a 7" one. The Galaxy Tab probably doesn't run stock ICS (i.e. Samsung will have tweaked it a bit), and even then I don't have much experience with the Android tablet UI (there are some stylistic/UI differences, obviously; tablets don't have hardware buttons, etc.). JB is supposedly a lot quicker than ICS (there are significant internal software changes/additions), but I haven't had a chance to install it on my phone and try it out. No doubt the Galaxy Tab 2 will eventually get an update to JB, but Samsung aren't the quickest (unless you wanted to update it yourself, which is already possible). Hope that helps anyway.
Blazorthon, I understand what you are saying, but I already have a 970MHz GTX-550 Ti, so the cost would be much lower. Even so, the HD-7850 is still $70 or more higher, and the HD-7850 I would actually buy for myself would make it $105 more. For me, heat is a big concern, and I know the Gigabyte card I want will be a perfect match for the one I have. The highest temp recorded with it to date is 64C, and the idle temps are in the low-to-mid 30s. I see no reason to expect less from a second card, and they are very quiet cards, even at 100% fan speed. For my needs, 2x SLI is the perfect solution. Best Regards, Russ
You're the only kiddie I see, DDP. LOL! Just teasing. At least you guys aren't dealing with a faulty DSL line; at least that's what I think it is. The new modem/router acts very similarly to the old one: the DSL light drops out intermittently. Just once I wish I could get a decent ISP. I rarely had trouble with my cable provider in Minnesota, but the upload speed was crap...