
The Official PC building thread - 4th Edition

Discussion in 'Building a new PC' started by ddp, Sep 13, 2010.

  1. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Woo had my 22nd birthday yesterday. Was a blast!
     
  2. DXR88

    DXR88 Regular member

    Joined:
    May 29, 2007
    Messages:
    714
    Likes Received:
    2
    Trophy Points:
    28
    good work, you made it to Level 22; only 2 more levels before you get your discounted car insurance perk.
     
  3. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    Puhh! My discount wasn't much at all, but my credit has been pretty bad since I was 21. It's slowly getting better though. $36 a month for liability isn't too bad :p
     
  4. theonejrs

    theonejrs Senior member

    Joined:
    Aug 3, 2005
    Messages:
    7,895
    Likes Received:
    1
    Trophy Points:
    116
    DXR88,

    Like a sink temporarily holds water, a CPU heat sink temporarily holds heat for the fan to remove via its airflow.

    Best Regards,
    Russ
     
  5. theonejrs

    theonejrs Senior member

    Joined:
    Aug 3, 2005
    Messages:
    7,895
    Likes Received:
    1
    Trophy Points:
    116
    Sam,

    Phenom II Furnaces? Other than the original 940 140w chip, there is no such thing!

    CPUz at 4.0GHz: [screenshot]

    Temps at 4.0GHz: [screenshot]

    The hottest it's ever been was 56C, running IBT with a Freezer 64. My Phenom II 955 Quad core was run at 3.8GHz, and never exceeded 53C running IBT. Both idled in the high 20s. It runs even cooler with the 990XA motherboard.

    I would hardly call that a Phenom II furnace. In fact, the "hottest CPU temps I've ever seen" award goes to the Core i7 LGA 1366 Gulftowns, which seem to idle in the high 70s to low 80s when overclocked.

    I've never had any heat issues with any of the 4 AMDs I've owned, and I still haven't had to clean out the radiator of my CoolIt ALC, and it's been over a year since I bought it. Case design also has a lot to do with how cool any CPU runs, Intel or AMD. Some cases sacrifice good airflow for looks, and some are just poorly designed to begin with!

    BTW, I noticed you mentioned how Thermaltake test their fans about 10 times further from the pickups than, say, Scythe does! I still think I owe you a fan though. I believe it was called a "ThunderBlade!" The only thing Thermaltake got right with the "ThunderBlade" was the name! :)

    Best Regards,
    Russ
     
    Last edited: Mar 23, 2012
  6. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    To be fair, the early i7s were 130W chips, and they're just as old as the 140W Phenom IIs anyway, despite the considerable performance difference. Beyond that, when I say furnace I mean the amount of heat they have to dissipate, not the internal die temperature; since AMD place their sensors in a different location to Intel, die readings are never a good way of testing how much heat a CPU is putting out. The only reason the old i7s got hot was cheap stock coolers.
     
  7. theonejrs

    theonejrs Senior member

    Joined:
    Aug 3, 2005
    Messages:
    7,895
    Likes Received:
    1
    Trophy Points:
    116
    Sam,

    The fact remains that you need water cooling for the Gulftowns, and you don't for the Phenom IIs. I was amazed that I could OC to 4GHz using a Freezer 64, and idle was in the low to mid 30s. Maxed out with a 10-pass IBT run, it only reached 56C with the 1090T at 4.016GHz, while the norm for the Gulftowns is water cooling. I totally discount people who claim idle temps as low as 14C-17C, because that's well below ambient; 58F-64F would be a very uncomfortable room temp!
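
    As a sanity check on those figures, a quick sketch (Python purely for illustration) of the standard Celsius-to-Fahrenheit conversion shows that 14C-17C idle claims would indeed imply an ambient of roughly 57F-63F:

```python
def c_to_f(c):
    """Standard Celsius-to-Fahrenheit conversion: F = C * 9/5 + 32."""
    return c * 9 / 5 + 32

# Claimed idle temps of 14C-17C would imply a room at or below:
low, high = c_to_f(14), c_to_f(17)
print(f"{low:.1f}F - {high:.1f}F")  # 57.2F - 62.6F
```

    An idle temperature can't normally sit below ambient on air or basic water cooling, which is the point being made.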

    How do you know that the placement of AMD's heat sensor is better or worse than Intel's? At best it's a guess on your part. I've never overheated my CPU on air cooling, even at 4.0GHz, while the current Gulftowns still need a much larger radiator to effectively control the heat. If performance were such a big deal to begin with, we would all be buying Intels. I agree with Steve, the "hype" isn't worth the cost.

    Furthermore, the custom software I use for my AutoCAD/turbine work is optimized for my computer. Using all 6 cores, it takes minutes instead of hours to come up with the right metallurgical properties needed for the specs of the design, where it previously used only 2 cores! I still get the same money, but it only takes me a fraction of the time, which is fine with my employers. The sooner they have the needed info, the faster the problem gets dealt with.

    The question becomes, would it be worth it to switch to an Intel? The answer is a resounding "No!" It wouldn't be cost effective at all! That's according to the software engineers who write for Intel chips as well as AMD's. The big difference, according to them, is between Intel's Turbo Boost and AMD's Core Performance Boost: it's much more efficient on the AMD, and consistently runs the cores at a higher percentage of CPU speed gain than Intel's does. For the majority of functions we are talking milliseconds here, so what do you save at the end of the day, 5-10 minutes? It's not worth the cost to change!

    I may even buy one of these.

    http://www.newegg.com/Product/Product.aspx?Item=N82E16819103960

    They overclock very well, and would narrow the already tight gap between Intel's performance and the 3.6GHz 8150 Zambezi 8-core. I've always said, "How fast is fast?" Here is a practical example of just how insignificant the speed difference really is. On a GIGABYTE GA-990XA-UD3 AM3+ (which I have), it would be just about perfect for my purposes!

    I have to say that the HD 4670 has performed very well, but I'm pretty much video limited at this point. Video card prices on the high end seem to be getting more reasonable. One thing I did learn was to expand my horizons a little more. I should have bought the next motherboard up in price, because I'm now considering SLI. The Gigabyte GA-990FXA-UD3 AM3+ has two x16 PCIe slots, while I only have one on the XA, so I'm stuck with 8x SLI or CrossFire! :( For another $15, I could have had 16x SLI or CrossFire. Hey, you live, you learn!

    Best Regards,
    Russ
     
  8. Mr-Movies

    Mr-Movies Active member

    Joined:
    Nov 9, 2002
    Messages:
    1,225
    Likes Received:
    0
    Trophy Points:
    66
    Regardless of where the sensors are, the internal case temp does not get to furnace levels. My 140W runs cool and does not get excessively warm. Also, the more cores you have, the more potential heat you will have, whether it is Intel or AMD. A new Intel quad may be a slight bit cooler than a quad AMD, but it really is a moot point, and not the exaggerated point you would like to make it out to be. Thinner substrates (masks) and lower power consumption are always better, or at least most of the time, and Intels are better for power consumption. As for the considerable performance difference, you can live in your benchmark world, or exaggerated sites' world as I've stated before; I just don't see the great performance of your beloved Intels!

    That is exactly what I'll buy for a performance machine and I'll save big bucks doing so.
     
    Last edited: Mar 23, 2012
  9. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Because, Russ, the same power output, using the same heatsink (just with a different mounting bracket), produces wildly different temperature figures, way beyond the scope of variations in thermal paste application. AMD CPUs read lower than Intels; that's just how it is.

    Also, saying you need water cooling for Gulftown CPUs is wildly inaccurate: not only are they shipped with air coolers, but the i7 970 only comes with the basic budget heatsink Intel provide with their standard i5 CPUs! It's a terrible cooler, I admit, but the fact that it works with the 130W hex-core i7 970 (remember, 970, NOT 975) means that the statement that you need water for Gulftown is completely false.

    I wouldn't worry about 16x SLI/Crossfire. I ran HD4870X2 Quad CrossfireX on 8x Crossfire, and that's four GPUs that are each almost twice as powerful as your HD4670, and I couldn't tell the difference between that and the 16x Crossfire board it replaced.
    It's a red herring apart from a few rare anomalies.


    You know this was a tongue-in-cheek expression, since furnaces get to 2000 degrees plus...
     
    Last edited: Mar 23, 2012
  10. DXR88

    DXR88 Regular member

    Joined:
    May 29, 2007
    Messages:
    714
    Likes Received:
    2
    Trophy Points:
    28
    Save your money; the performance gain from going from the newer Phenom IIs to the FX is marginal at best.

    The 8-core FX is just a 4-module x 2 setup, so despite the claim, it's still effectively just 4 cores.
     
  11. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    AMD vs Intel aside, the FX CPUs are still dreadful. Stick with the Phenom II X6, that's the best AMD CPU for now.
     
  12. Mr-Movies

    Mr-Movies Active member

    Joined:
    Nov 9, 2002
    Messages:
    1,225
    Likes Received:
    0
    Trophy Points:
    66
    Do you have an FX CPU? One nice thing is that they support much higher clock rates natively, without OC'ing. That, with more core power, would make one believe they perform better. Since I don't have one yet, you could be right, even though it just doesn't add up.

    And your furnace references are exaggerated, on purpose, regardless of your poor tongue-in-cheek argument. The bottom line is they do not get excessively hot; at best a bit warm under load.
     
  13. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    And it's the same story for the i5/i7: if you don't own it, you can't apply real-world experience. Still, real-world experience does not really work with CPUs, because 'perceived' performance depends too much on software health, hard disk speed, memory, fragmentation and so on. The benchmark is numerical testing, hence my using it in arguments.
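
    The "numerical test" idea can be shown with a minimal sketch (Python, purely illustrative): time a fixed, deterministic workload, so that the only variable between two machines is the CPU itself rather than disks, RAM or software health:

```python
import time

def fixed_workload(n=200_000):
    # A deterministic arithmetic loop: identical work on every run and every
    # machine, so elapsed time reflects CPU speed rather than disk or memory.
    total = 0
    for i in range(n):
        total += i * i
    return total

start = time.perf_counter()
result = fixed_workload()
elapsed = time.perf_counter() - start
print(f"workload finished in {elapsed:.4f}s")
```

    Comparing two CPUs then means comparing elapsed times for the same workload, which is exactly what benchmark suites do at larger scale.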

    The 'furnace' argument was, as usual, perceived as a slant against AMD - it refers to all 100W+ CPUs, but since nobody here owns any 100W+ intels, I didn't mention it. Woe betide anyone that doesn't explicitly mention in every sentence that a negative attribute can be applied to either brand.
     
    Last edited: Mar 23, 2012
  14. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    I'll attest to the AMD 940 being a bit of a furnace :p I had to do some special modding with my last case to get that sucker cool. It was my first x264 encoder, so naturally its full potential was being used 24/7. My 965, by comparison, is a dream! I don't see nearly the same temps. But the 940 taught me to use proper cooling ;) The Arctic Freezer did a pretty good job with the 940, certainly better than the stock cooler, which is what it's using today. I ended up selling that system dirt cheap to a buddy. Adding a 140mm fan to the top of the case is the smartest mod I've ever done. It dropped the temps considerably under load. Rather cramped case...

    I feel bad for the FX processor. There are certain circumstances that make it the smart choice. For people who were expecting a performance freak, bought an AM3+ board early, AND do a fair amount of encoding, it is a logical choice. Why build a completely new system? They've already got the base of their system. The FX shines in x264 encoding, certainly compared to my current processor.

    Please don't misunderstand though. The Phenom II X6 looks more tempting to me. I have no AM3+ board, just a lowly AM3. If the price could drop to under $150 for the 1090T, I'd probably scoop it up, because that processor could keep me content for a very long time. Depending on advancements, of course ;)
     
  15. Mr-Movies

    Mr-Movies Active member

    Joined:
    Nov 9, 2002
    Messages:
    1,225
    Likes Received:
    0
    Trophy Points:
    66
    But I do, and I am! You talked me into delving into the Intel world, which I should do from time to time, but I sure don't see it the way you spew it to be! I'll be the first to slam the FX processors as well when I get one and it doesn't perform, if indeed that happens, and it may.
     
  16. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    I would be careful overclocking the FX, Steve. The VRMs will be taxed, no doubt. Make sure the board can take it ;)
     
  17. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    But a dual-core laptop i5 does not count as a quad-core desktop i7, any more than a dual-core laptop Turion would count as a Bulldozer FX-series chip, so stop using that as a basis to pass falsehoods off as facts.
     
  18. Mr-Movies

    Mr-Movies Active member

    Joined:
    Nov 9, 2002
    Messages:
    1,225
    Likes Received:
    0
    Trophy Points:
    66
    You just ignore things you don't want to believe. I said before that I compared an i5 laptop with AMD desktops and older Turion X2 notebooks, as well as using my friend's i7 2600 to compare. Also, the two manufacturers take different approaches: Intel's path tends to be high clock cycles to gain performance, and AMD's approach is more cores, so trying to compare cores is a joke, because it doesn't address performance for the dollar, or top CPU against top CPU.

    My new Intel i5 doesn't have much on my old Turion X2. Both are dual core, which you should like, except the Intel is supposed to be like a quad with their 4-thread nonsense. I forget the exact numbers, but even though the Intel performed better, it certainly wasn't the slam dunk one might have thought, or that is spewed by the Intel lovers.

    When I tested the 2600, it was very close to the same performance as one of my old AMD quads, so again, no slam dunk there either. My friend who has the Intel 2600 and an i7 notebook feels the same way; he is just not impressed by Intel and will go back to AMD. Like me, his boss influenced him into believing that Intel was the way to go, and like me he was disappointed. My friend doesn't use the intensive programs that I do and didn't do any extensive testing as I did, so his point of view is more of an impression; mine is not.

    Like I said before, if you want to pay too much for perceived performance and you are happy with that, then knock yourself out, but I'm not buying your argument like I did before, now that I have experienced Intel performance.
     
  19. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Surely that's the other way round as AMD have used higher clock speeds than Intel for their equivalent products for 6 years now...
     
  20. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Let's try and clear this one up once and for all in a civilised manner.

    As I understand it, your promotion of AMD over Intel derives from the fact that you did not notice a tangible performance benefit between a Phenom II and a Core i7 2600K? The answer to that is, you shouldn't.
    CPUs today are sufficiently powerful that they are never the source of noticeable delay in the day-to-day running of a system. We are now totally bound by memory (capacity only, not speed, and only up to the threshold of necessity - usually 8GB at worst, 4GB for most) and, more importantly, primary disk I/O.
    Since going SSD, my £50 2009 Pentium dual-core E5200 loads programs very similarly to my 4GHz i5 750 quad core, owing to the SSD and sufficient memory. Compare that to a system running a Q9550 and a mechanical disk drive, which was slower than the E5200 SSD system in practice for general usage.
    When you are still bound by these limits, either because you use a mechanical drive or because you don't have enough RAM, the performance inconsistency created is so vast it eliminates all possibility of comparison between two CPUs.

    The only means to assess how good a CPU is these days is to use a numerical test: either "how much of this can my CPU do every minute?" or "how long does it take my CPU to run this test?"
    What you should generally find is that in most single-, dual- and four-threaded applications, the i7 2600K will be, arithmetically, about 60% faster than a Phenom II at 3.4GHz. In an application that can fully use 6 threads against a 3.3GHz X6 1100T, the Intel will retain the lead, but it will drop to around 10%. This allows for situations like x264 video, where AMD have an innate advantage, to occasionally slip ahead by a few %.
    Sadly AMD have discontinued the Phenom II X6 in light of Bulldozer, which is a little short-sighted given their own recognition of the failings of Bulldozer, but if memory serves me correctly you could get the 1100T for about $250. On that basis then, if you sidegraded from an 1100T to an i7 2600K expecting a big performance benefit, and used 6-threaded applications, you'd be disappointed. The machine wouldn't necessarily be any faster at the desktop, nor would it do any of that hard multi-threaded processing any quicker either.
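
    The shrinking lead can be sketched with toy arithmetic (illustrative numbers only, taken from the ~60% per-thread figure above, not benchmark measurements):

```python
# Illustrative per-thread throughputs in arbitrary units (assumed, NOT measured):
# the i7 2600K is taken as ~60% faster per thread, with 4 cores vs the X6's 6.
I7_PER_THREAD = 1.6
X6_PER_THREAD = 1.0

def throughput(per_thread, threads, cores):
    """Total throughput when a workload runs `threads` threads on a `cores`-core chip."""
    return per_thread * min(threads, cores)

# 4-threaded workload: the X6's two extra cores sit idle.
lead_4t = throughput(I7_PER_THREAD, 4, 4) / throughput(X6_PER_THREAD, 4, 6)
# 6-threaded workload: the X6 brings all six cores to bear.
lead_6t = throughput(I7_PER_THREAD, 6, 4) / throughput(X6_PER_THREAD, 6, 6)
print(f"4 threads: i7 ahead by {(lead_4t - 1) * 100:.0f}%")  # 60%
print(f"6 threads: i7 ahead by {(lead_6t - 1) * 100:.0f}%")  # ~7%
```

    The toy model lands near the ~10% six-thread figure quoted above; the remaining difference comes from effects like Hyper-Threading and turbo behaviour that a flat per-core multiplication ignores.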

    Where it all comes into play is the lesser-optimised applications. In an ideal world, all software would fully utilise 8 threads, and the FX-8150 would be the undisputed king of desktop computing for a sensible price. Fact is though, I can probably only name one bit of software I have occasion to use that does useful work with that many cores. This leaves the situation where you're reliant on 4, 2 or usually only 1 core to do the work concerned. For that, the i7 is worth most of its price. Not all. Why not? Because the i5 is so much cheaper, and lacking HT does not cause it much angst in a single-threaded environment. The Phenom II X6 falls way behind, and Bulldozer even further still.

    Do you buy your CPU for:

    - Perceived desktop performance ("how fast it feels to use"): Buy EITHER brand (No difference)
    - Single-threaded application performance ("good performance in all software"): Buy INTEL
    - True Multi-threaded application performance ("the best performance with the best software"): Buy AMD

    Power consumption with the i7 and Phenom II is a minor issue as they're not that far apart. The i7s pull around 90W and the Phenom IIs around 110W, despite both CPUs being rated higher (TDP != real world power consumption!) The only real offenders are the original i7 and Gulftown (130W and 150W respectively) and Bulldozer (140W).

    This all therefore raises the following conclusion:

    If anyone ever says to you that either brand, 'feels faster' than the other at the desktop, and they are similarly rated competing CPUs, it is a placebo. There is no truth in this.
    If anyone ever says to you that either brand, 'is faster' than the other, in general, that is a CONDITIONAL. It can be true, it can be false, depending on the application, depending on the number of threads being used.

    If you don't really know what sort of software you will end up using, or use a lot of software including some relatively basic applications, the Intel will be the better natural choice, because it will always provide the best performance. I do stress however that the i5 2500K is likely to be the better buy, because it is so much cheaper and does a very large proportion of the work.
    If you know what you want with software, have some well-written software you use religiously that's multi-threaded (applies to many people here, esp. video encoders) then yes, AMD is still a good choice.
    Is bulldozer better than Phenom II in that case? Rarely, even in this area because the lack of per-thread performance of bulldozer relatively nullifies the extra two cores, but sometimes. Fact is though, you're now stuck with the inferior offering of bulldozer post-discontinuation of the Phenom II X6, and that's enough to raise some doubts.

    So, I want no more:

    "Intels are worse for your money than AMDs" - that is an opinion, and can never be used as a rebuttal to a factual argument.
    Likewise, "AMDs are worse for your money than Intels" falls under the same ruling. It too is an opinion.


    Now, where were we? :)
     
