Shaff, take the defective one? Are you suggesting that they'll replace my current monitor and include a return label for the existing one? Essentially, I pay nothing? It'd be my luck the next one has dead pixels or something else.
At the time of purchase, they told me 3 dead pixels was the policy for my 2407WFP. It might have changed from series to series or over time (I purchased in late 2006).
I can guarantee you it isn't! It depends on screen size, but I know for a fact that 2-3 dead pixels is normally considered acceptable by most manufacturers, and that can go as high as 6-7 pixels. I haven't dealt with Dell in a while, but I know they had a bad pixel policy years ago, so I would think they still do. It is pretty tough for manufacturers to produce flat panels without some bad pixels here and there. If they didn't accept a few bad pixels, flat panel prices would skyrocket, because they would reject way too many panels. This is why they have bad pixel policies. Here is Dell's Bad Pixel Policy.
Perhaps that is why they charge so much: to enforce a "zero dead pixel" policy. It seems like I've seen that somewhere. Given what the masses use the UltraSharp line for, it wouldn't surprise me. Photo/video editors need perfect monitors for their trade, which is a big reason why I got it. I love Photoshop, but the best way to experience it is with a monitor that has excellent viewing angles, a good refresh rate, etc. Top of the line. I didn't buy a $500 SD port; that was just a plus. Since everything else works perfectly, it'd be risky to trade for another. I do blame myself for the malfunction; it makes perfect sense to me. Imagine a flash drive barely connected to a port, like 1/100th of an inch. That isn't very much, but it could wreak havoc on both the drive and the port. Thankfully, my SD card suffered no ill effects. I'm very grateful for that. It was $60 USD, after all.
So with UltraSharp you could get stuck with up to 5 dark dead pixels, pixels that are truly dead (stuck off, not full on). There is a way to massage dead pixels and possibly get them back, in case you ever need to try that.
Yah, that's unfortunate. Thankfully I have zero, which is why I wouldn't trade it for anything, unless something serious happens in the next 3 yrs. I may just make the next monitor something in the same class. Not sure yet; I have many plans. My Samsung had a dark pixel that corrected itself, or maybe it was while I was cleaning it one day. I had to rub fairly hard to clean it, and I just noticed the pixel was gone one day. Perhaps that is why they shrug off the stuck/dead pixels: a good percentage of them are correctable.
Estuansis, one of the big differences between the 790X and the newer-tech 990X is that the 990X seems to be more seamless, with better throughput on the AM3+ boards. The tech of the 790X is getting close to 3 years old now, and a lot of improvements have been made in the latest motherboards and CPUs.

I know this kid down the street from me built an Athlon X2 7850 dual core on a 790X like mine. He just bought an Athlon II X4 640 Propus quad for it. This is the third generation of the 3.0GHz "Little Quad that Could", and the difference between it and my old 630 is like night and day. All the newest ones have better memory cache performance and C3 stepping. By the numbers, it benchmarked right at 10% better in the 790X than the 2.8GHz 630 did at 3.0GHz. The 640 also overclocks much better than the 630 did, and the newest 640s can hit 4.0GHz pretty easily. Later in the week, we are going to test it in my motherboard and see what the 640 can do on the 990X. From looking at the numbers at 3.0GHz, I would guess about a 13 to 15% improvement stock, and maybe more.

I'm pleased with the choice of CAS 7 1333MHz 1.5v DDR3 @ 7-7-7-21 in mine. It works well with the 600MHz increase in North Bridge frequency, with the HT link speed at 2400MHz for a 5000MT/s HyperTransport on a 4000MT/s CPU. Works for me! My 1090T will run at 4.2GHz at less than 1.40v, but I'm not planning on stressing it a lot, as it will never be used at that speed anyway. I did run an 8+GB 1080p file through DVDRB/CCE 2-pass at 4.2GHz, knocked 7 minutes off of my previously recorded time, and it didn't shut down or exceed 43C. Needless to say, I'm pretty pleased with it.

I'll report on how the 3.0GHz 640 quad does on my motherboard, but from what I'm seeing, if you are planning on staying with AMD, a socket AM3+ motherboard and DDR3 will give you a very decent boost in performance with any socket AM3 CPU. It's a very worthwhile upgrade, and you are ready for Bulldozer. Can't beat that with a stick!
Best Regards, Russ
Heard it through the grapevine from an undisclosed friend in the industry that some places have received Bulldozers for testing. Also, it seems they actually don't suck. He told me a bit more, but the basic idea he gave me is that their price will make them very competitive. He wasn't at liberty to tell me any more than that, though.
Omega: I have never had an issue with the SD slot or the USB hub on any of the three Ultrasharps that I've owned. Are you positive that your monitor wasn't a refurb? As for the EZRX drive, it should work fine with the controller your EZRS came with, but only with that controller. Whether it works in your board's SATA ports is probably down to luck; it's certainly not guaranteed to work. Jeff: I'll believe it when I see it. The Phenom II CPUs are competitive for their price, but that's just it. They aren't competitive overall, because the only AMD CPUs you can buy are cheap ones, a result of their lack of performance compared to i5s and i7s.
From the way he made it sound, they are comparable to the i7/i5 clock-for-clock, or at least in the same ballpark. Again, he wouldn't give me numbers, so I only have hints to work with here. Just let it be known that they look very promising. Remember, even without matching or beating the i7, AMD still has room to make huge strides. Importantly, if it even gets close to the i7, the price alone makes them a solid buy. The general consensus, though, says to wait for the upcoming post-release revisions they will be making. I have a feeling it's going to parallel the Phenom II, i.e. a great product in its own right, but not as fast overall. Also consider that they are releasing it as an octo-core from the get-go, so in heavily multi-threaded applications (where most software is leaning these days) they will have a distinct advantage. Of course, such an advantage is moot if the actual per-clock, per-core performance is not as good. So I'm in the same camp as you, Sam. I'll believe it when I see it.
I have considerable doubts. Comparable to the i5/i7 dollar-for-dollar would make a lot more sense. Remember, the highest-end quad core is being released with a default clock speed in excess of 4GHz. If they were comparable to Sandy Bridge CPUs clock-for-clock, that thing would decimate Intel's current lineup, and the price they're asking for it is tiny. Seems too good to be true to me.
Read my edit. Also, IIRC the number of cores works a bit differently for Bulldozer, so we really don't know what that 4+ GHz clock really means. Who knows, maybe overclocking like nobody's business helps.
The 8-core CPUs will be where Bulldozer wins. No doubt, as long as the Bulldozer architecture makes any improvement at all on Phenom II, the 8-core chips will be hot stuff against Intel's hex-cores. At the moment I'm dubious about the Phenom II X6, because it's basically no faster than Intel's best quad cores, so in intensive CPU work the extra cores are redundant. With 8 cores, though, it'll only be going up against CPUs like the 970 and 980X/990X, and that will be interesting because of the considerable cost advantage.
Well, from what I've read so far, the basic idea is that Bulldozer uses 2 conventional cores for each module, and in turn each module takes the place of a single traditional core. The question for me is: is their 8-core CPU made from 8 modules, or is it 4 modules comprising 8 cores? Ahh, nvm, I see that the 8-core models will have 8MB of cache. So if you go by 2MB of cache per module, as laid out on Wikipedia, the 8-core version should be considered their quad core. I see it this way because you can't simplify them any further, i.e. due to the way the architecture works, a single core cannot exist on its own; you need a pair of cores to make a module. Also consider that AMD have always been competitive in price at least. So if you compare by price alone, you are still looking at stiff competition for Intel. $250+ usually buys a fairly high-end CPU as far as the end consumer (like myself) is concerned.
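To put that back-of-the-envelope arithmetic in one place, here's a quick sketch. It assumes the 2MB-of-L2-per-module figure from Wikipedia mentioned above and the 2-cores-per-module design; actual Bulldozer cache sizes could differ at release.

```python
# Rough sketch of the module/core arithmetic above.
# Assumptions (not confirmed specs): 2 MB of cache per Bulldozer module,
# and 2 integer cores per module.
total_cache_mb = 8        # advertised cache on the 8-core part
cache_per_module_mb = 2
cores_per_module = 2

modules = total_cache_mb // cache_per_module_mb   # 8 / 2 = 4 modules
cores = modules * cores_per_module                # 4 * 2 = 8 cores

print(f"{modules} modules, {cores} cores")
```

By this count the "8-core" part is 4 modules, which is why I'd call it their quad core: the module, not the core, is the smallest unit you can't split.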
Right, this is a suspicion that I had, having read about the architecture, but it wasn't made abundantly clear. Marketing them as they are takes a leaf out of the nvidia intentional mismarketing book! I'm curious how they can have two cores attached as a module with sequential processing.
Right, I agree that is a bit of mismarketing. I'm fairly curious as well. If it turns out to be a solid product (i.e. doesn't suck for the price), I will most certainly be buying one.
I think we're looking for a general ballpark type thing more than an actual performance match. Looking at prices tells me it won't perform as well as the i7, or maybe as well as the i7 but not as well as Sandy Bridge. Though consider that Intel have basically had free rein on pricing so far, given the lack of competition from AMD. If AMD can make another product worth competing with, Intel might be forced to drop some prices. The way I see it, everything is skewed in Intel's favor, including pricing as an indicator. So whether or not you can use prices to estimate Bulldozer is beyond me, because Intel might be forced to drop them to compete. A lot of different things could happen. Even given the general consensus that Bulldozer won't be as fast as Sandy Bridge or the i7, AMD could still very well pull off a hard-won, if short-lived, victory after all is said and done. Also remember they plan to make some revisions quite soon after release, so it could potentially be even better. AMD would be fools to reveal their plans now, and they know that. It's all a corporate game of cat and mouse. We'll have to wait till it releases to have any clue.